Down to Earth: Nature's Role in American History
Smithfield operates a nearly one-million-square-foot plant in Tar Heel, North Carolina. The factory can process 32,000 hogs into pork in a single day. The concentration of hog farms and meatpacking plants in North Carolina over the last 20 years is no accident. Anti-union sentiment, low wages, and lax environmental regulations, some of which were advanced by none other than Wendell Murphy himself (who served as a state senator in the 1980s and early 1990s), account for the trend. By the 1990s, North Carolina was home to almost twice as many hogs as people, with 10 million creatures crammed into the state.49
Huge confinement barns stocking thousands and thousands of animals have posed a new and formidable set of environmental problems. Chief among these is the question of what to do with the millions of tons of animal waste. By some estimates, in the United States today there is something approaching 130 times more animal waste than human waste produced each year. That amounts to approximately five tons of manure for every citizen.50
Back in the days when manure was integrated into the crop and nutrient cycle, animal waste posed few problems. All that changed with the advent of mega-farms and feedlots for producing livestock. As one environmentalist has explained, “The problem is that nature never intended for 80,000 hogs to shit in the same place.”51
To save on labor costs, pig manure is simply flushed away with hoses into holes in the floor. From there it is channeled into giant lagoons. In 1995, approximately 25 million gallons of hog waste from a 12,000-animal “barn” in North Carolina—more than twice the amount of oil involved in the notorious 1989 Exxon Valdez incident—spilled out of one such lagoon. The waste eventually flowed into the New River, where it annihilated virtually all aquatic life in a 17-mile stretch, in effect forcing the public to bear the high ecological costs associated with factory-style animal farms.52
Together vast amounts of hog and chicken waste, laden with such nutrients as nitrogen and phosphorus, have wended their way into coastal waters. In 1991, North Carolina’s Pamlico Sound was the scene of a fish kill so massive that bulldozers had to be called in to bury the dead creatures. Although the exact cause of this massive die-off and other more limited fish kills in Chesapeake Bay is not known, some scientists suspect that hog and chicken manure is the culprit. The animal waste, they surmise, sets off algae blooms that deplete oxygen from the water and put stress on fish populations. The algae also help to feed a microscopic organism named Pfiesteria—dubbed “the cell from hell”—which can release a toxin lethal to fish. Manure, once a vital and integral aspect of farm life, has disappeared from the barnyard into the nation’s waters, where it has become one of the most serious environmental dilemmas of our time.53
CONCLUSION
This chapter should not be taken as an anti-meat manifesto but as a plea for understanding the historical roots of modern meat-eating and its ecological and social impacts. Once farm cattle fed on grass and hay; pigs ate garbage, including waste produced in some of America’s largest cities; and chickens trailed cattle around the barnyard pecking grass seeds out of the dung they left behind. The animals converted matter unsuitable for human consumption into a much-valued source of protein. The rise of factory-style livestock production, especially in the decades after World War II, transformed these farm animals into eating machines requiring large quantities of corn and soybeans. Livestock guzzled energy and water, shedding their old role as garbage collectors and assuming a new one as waste producers—ironically, surpassing industrial enterprises as the nation’s leading source of water pollution. Old MacDonald is turning over in his grave.
13
AMERICA IN BLACK AND GREEN
Compared to Henry Ford, Thomas Edison, and the other heroes of American consumerism, the name Thomas Midgley rings few, if any, bells. And yet despite his relative obscurity, Midgley, a research chemist born in 1889 who died, according to one obituary, by accidentally strangling himself with “a self-devised harness for getting in and out of bed,” played a fundamental role in some of the consumer age’s most celebrated products.1
Over his career, Midgley made two discoveries that profoundly influenced both the course of American consumer culture and the make-up of the atmosphere. In 1921, while working for General Motors (GM), he discovered that lead, when added to gasoline, eliminated engine knock, a breakthrough that allowed automakers to boost engine performance and sell faster, racier cars. The advance, however, came at the expense of releasing a known poison into the environment. Later, Midgley went on to invent Freon, the first chlorofluorocarbon. Americans received better air conditioners, deodorants, and hair sprays, but future generations would pay the price for them with skin cancer as the compounds damaged the ozone layer, which shields the earth from the harmful effects of ultraviolet radiation. Midgley, in the words of historian J. R. McNeill, “had more impact on the atmosphere than any other single organism in earth history.”2
In the automobile-oriented suburbs sprouting up all over the nation after World War II, land that had once yielded food was paved over with asphalt and converted into sprawling subdivisions and lawns. Forced to hit the road to fill their stomachs, Americans piled into cars and headed for supermarkets and restaurants, especially after Congress created more time for leisure by passing legislation (in 1938) making the 40-hour, five-day work week the national standard. Fast food hamburger outfits remained so centered around the automobile that it took over a decade after opening in the 1950s before McDonald’s even bothered to install seats and tables in its restaurants.
Car culture ushered in a vast and sweeping network of roads and interstate highways for knitting together metropolitan centers and outlying suburbs. In a sense, the freeway took the place of manure in uniting the ecological fortunes of the country and the city. Gone were the animals, hay, and potato fields, and instead tract housing arose and the lawn was born, grass that was grown mainly for its aesthetic appeal. Cut off from any direct relationship with their food supply, Americans now had the luxury of planting turf grass (originally imported from northern Europe) and dousing it with tons of water, fertilizer, and pesticides—rendering the land from coast to coast into a verdant sea. With 30 million acres under cultivation, the lawn is now the nation’s number one “crop.” Needless to say, the new national landscape, painted in black and green, left behind a trail of social, ecological, and biological consequences.
LIVE FREE AND DRIVE
The rise of the automobile is a well-known chapter in the American past. First built in the 1890s as a luxury item for the well-to-do, the car eventually became a mass-produced commodity. The man chiefly responsible for this momentous change was Henry Ford, who pioneered the use of the assembly line in auto production. In 1914, concerned that the market for cars would remain limited as long as even autoworkers themselves could not afford them, Ford began paying some of his employees five dollars per day, at the time a relatively high wage. By 1929, nearly 50 percent of all U.S. families owned automobiles, a milestone not reached in England until four decades later.
Although some opposed the automobile—early in the century, a group of Minnesota farmers, for instance, plowed up roads and strung barbed wire between trees—most Americans embraced the car with enormous enthusiasm. The automobile succeeded because it met the real, legitimate needs of people for a means of transportation. Consumerism, in part, had reorganized people’s relationship with the land in ways that transformed the car from a luxury into a necessity. Food and clothing once produced in the home, especially by women, were by the 1920s bought in towns and villages. By the 1930s, 66 percent of rural families and 90 percent of urban families purchased store-bought bread instead of making it on their own. More shopping meant that people spent more time in cars on their way to stores. People’s priorities quickly changed. In the 1920s, an inspector from the USDA asked a farmwoman why she bought a Model T before installing indoor plumbing. “Why you can’t go to town in a bathtub!” she exclaimed.3
The impact of the car went far beyond its ability to provide consumers with a convenient means of doing their marketing. It also helped to stimulate suburbanization. The suburbs began as far back as the 1840s with the advent of railroad travel, proliferating after the Civil War with the development of streetcar lines. By the late nineteenth century, cities such as New York, Chicago, and Philadelphia became increasingly wedded to factory production and, somewhat later, to financial activity. With real estate developers following World War I far more interested in constructing office buildings than new housing, many middle-class residents left the city for the suburbs, relying on cars to shuttle them back and forth to work. By 1940, 13 million Americans lived in auto-centered communities not serviced by public transportation.4
If the automobile met the genuine need for transportation, especially in the more decentralized suburban environment, it also brought with it a vast amount of social baggage. Tellingly, the French term automobile and not the British phrase motorcar came to predominate. Motorcar focused attention on the engine, the driving force behind the new form of transportation. But automobile suggested something far more complex, literally, self-movement. The idea that cars could free people from train and streetcar schedules, instead propelling them on their own through space, conformed to American ideals of freedom, individuality, and democracy.5
Automobiles were sold to Americans with precisely this notion of freedom and liberation in mind. An advertisement by the Ford Motor Company shows a woman persevering in the face of inclement weather. Snuggled in behind the wheel of her Ford’s heated cabin, the woman sets off free from the worry that snow or rain might in some way impede her trip into town. Complete independence from the forces of nature is the message being conveyed.
But the freedom of movement that the automobile made possible came at a price, although advertisers often distracted Americans from confronting it. Far better, at least from the auto industry’s perspective, if people remained walled off from the social and environmental costs of car ownership, much as the woman driver in the Ford advertisement is seen insulated from the elements at large. It would be wrong, however, to conclude that no one at the time recognized the problems presented by the automobile. “Our streets smell badly enough without the addition to the atmosphere of vast quantities of unburned gasoline,” declared one observer in a 1910 issue of the magazine Horseless Age. Gasoline not only caused pollution. Its status as a nonrenewable resource even led some engineers and industry analysts to worry about whether an adequate supply would always remain available. As early as 1905, engineer Thomas J. Fay foresaw that “One of the great problems of the near future in connection with the popularization of the automobile will be that of an adequate and suitable fuel supply.”6
Alternative fuels such as grain alcohol existed. But relative to gasoline, alcohol was more expensive, about double the price per gallon at the turn of the century. And that price did not include a federal excise tax placed on alcohol beginning in 1862 to help defray the Union’s costs in the Civil War. In 1907, the tax was repealed. But the process of denaturing alcohol, to render it undrinkable in an effort to preserve the sobriety of the American republic, added to its price and gave gasoline the edge. Compounding gasoline’s advantage was the fact that it took more alcohol than gas to cover the same distance. Added to this was the political muscle of the petroleum interests. Together these factors combined with the nation’s long-standing concern with temperance to produce a terrible dependency of another kind.7
GET THE LEAD IN
Henry Ford pioneered mass production, but it was Alfred P. Sloan, Jr., the president of General Motors, who figured out a way of selling everyone on the need for all these new cars. In 1927, Sloan introduced the annual model change as a way to “keep the consumer dissatisfied.” He reasoned that if people saw their neighbors driving around in a new car with features their own vehicle did not have, they too would soon be heading off to the showroom. Under Sloan’s leadership, GM surpassed Ford as the nation’s number one auto producer. The company succeeded not by offering consumers a basic means of transportation—Ford’s stock in trade—but by holding out the prospect of faster cars that grew more stylish and larger with every passing year. It was a stroke of genius that set the stage for GM’s decades-long dominance within the industry.8
Leaded gasoline was the key to fulfilling Sloan’s ambitions for GM, at least in the realm of auto performance. Early on, cars had to be cranked by hand to start. But in 1911, the invention of the self-starter eliminated the laborious task of hand cranking, allowing women especially to take to the roads. Automakers could now produce cars with larger, easy-to-start engines. The electrical breakthrough, however, had one drawback: Customers noted a knocking sound coming from the engine. If cars were going to be larger, faster, and easier to use, then a way had to be found to eliminate potential engine damage from “knock.”9
Not long after the invention of the self-starter, the staff at Dayton Engineering Laboratories Company (DELCO) discovered that ethanol or grain alcohol, when burned in a car’s engine, helped to remedy knock. The problem with grain alcohol, however, at least as the oil companies saw it, was that anyone, even ordinary people, could make it. Thus in 1921, when Thomas Midgley, who was working at the DELCO lab, now owned by GM, discovered that tetraethyl lead also functioned as an excellent antiknock agent, the oil and lead interests rejoiced.10
In early 1923, the first gallon of leaded gas was pumped in Dayton, Ohio. The following year, GM, the Du Pont Chemical Company (which controlled roughly a third of GM’s stock), and Standard Oil of New Jersey, combining their various patents, manufactured leaded gasoline under the “Ethyl” brand name. There was only one problem. A few months before Ethyl went on sale, William Mansfield Clark at the U.S. Public Health Service came forward to explain that tetraethyl lead was exceedingly poisonous and had the potential (through the lead oxide it produced when burned) to endanger public health in heavily traveled areas.11
In 1922, U.S. Surgeon General H. S. Cumming wrote a letter to Pierre du Pont, chairman of the board at the chemical company, inquiring about the health hazard posed by leaded gasoline. Midgley responded on the company’s and GM’s behalf. The public health effect of leaded gasoline received “serious consideration,” he wrote, but “no actual experimental data has been taken.” Despite the lack of evidence, Midgley believed that “the average street will probably be so free from lead that it will be impossible to detect it or its absorption.” With no studies to draw on, how could Midgley be so sure of its safety? That remains a mystery, and a doubly curious one given that shortly before responding to Cumming, Midgley had come down with lead poisoning.12
In 1923, General Motors, reasoning that any in-house scientific study it did would be viewed skeptically, agreed to finance a study by the U.S. Bureau of Mines into the safety of tetraethyl lead. The following year, the newly formed Ethyl Gasoline Corporation negotiated a new research contract with the bureau that required the government agency to submit its results to the company for “comment, criticism, and approval.” This was not going to be a disinterested piece of research. The bureau soon issued a report downplaying leaded gasoline’s potential adverse impact on public health. The report prompted one rival car manufacturer to ask whether the bureau existed “for the benefit of Ford and the GM Corporation and the Standard Oil Co. of New Jersey, … or is the Bureau supposed to be for the public benefit and in protection of life and health?”13
The bureau’s biased approach caused some in the scientific community to object. Alice Hamilton, a physician who studied the industrial use of lead and its medical effects (and the first woman faculty member of Harvard Medical School), doubted the safety of leaded gasoline, especially if its use became widespread. In 1925, Yandell Henderson, a physiologist from Yale University, predicted that if the industry had its way, lead poisoning would emerge slowly but “insidiously … before the public and the government awaken to the situation.”14 He turned out to be right. With the burning of huge quantities of gasoline (especially in the three decades after 1950), lead was deposited on the soil and, unknowingly, tracked into houses across the nation. Infants crawling on the floor then picked it up on their fingers and ingested it, interfering with the development of their nervous systems and contributing to hyperactivity and hearing loss, among other effects, although it would be decades, as Henderson surmised, before the full scope of the problem became evident.
In 1926, another federal study again found “no good grounds for prohibiting the use of ethyl gasoline.” The authors did note, however, that widespread use of leaded gasoline might at some point present a health hazard, and they urged additional follow-up studies, but none were ever done. Instead, the manufacturers of the product financed all the research into leaded gasoline’s safety. Although Midgley and others in the auto, oil, and chemical industries knew about other more benign alternative additives (ethyl alcohol blends, for example), they pushed lead, probably because of the huge profits they stood to make from its sale.15
Leaded gasoline allowed Detroit to boost performance and sell more automobiles, but at a high biological price. Even something as pernicious as radioactive waste breaks down over the long run, but not lead. In the United States alone, seven million tons of lead were released between the 1920s and 1986, when it was phased out as automakers switched over to catalytic converters. Ethyl is gone, but the lead remains, having insinuated itself into the land, water, and air, as well as the bodies of all life forms.16
MASS TRANSIT MELTDOWN
As it turned out, it would take more than speed and style to ensure the auto’s dominance over mass transit. Rising numbers of automobiles in the 1910s and 1920s did not directly spell the end of public transportation. In fact, if the figures on mass transit use are broken down, some cities—St. Louis, New York, and Chicago, for example—actually showed an increase in ridership between 1918 and 1927. GM’s effort to spur consumption through model changes and faster, more stylish cars was partly a response to the continued vitality of public transportation. But even these changes failed to give the industry the boost in sales it longed for. Stronger measures, the automakers concluded, were in order.17