Down to Earth: Nature's Role in American History
Wildfire posed an even more persistent menace. The foothills of the San Gabriels are covered with chaparral, a dense thicket of evergreen shrubs that flourish in a climate defined by hot, dry summers and moist, cool winters. Chaparral is extremely prone to fire; indeed, the flames actually help nurture the growth of the various small trees and shrubs. When developers descended on the foothills of Los Angeles, they were building in the midst of one of North America’s most flammable environments.
In a sense, suburbanites and developers conspired to bring disaster upon themselves. Wealthy residents in such places as Malibu enjoyed the privacy and beauty of the brush, although it significantly increased the risk of fire, a point borne out in the devastating blaze that torched the area in 1956. More important, the proliferation of fire-prone wooden roofs in the postwar period boosted that hazard even further. Tragically, fashionable southern California homeowners opposed the one strategy that experts believed could have helped ward off disaster: prescribed burning. Setting fire to the land every five years or so reduces both the fuel load and the possibility of more serious conflagrations. Their own worst enemies, residents of tony neighborhoods objected that such a strategy would blacken the countryside and reduce property values. Instead, they relied on the state and federal governments, with their firefighters and disaster relief, to bail them out when self-inflicted calamity hit. The scale of the subsidy given to suburban development was simply staggering, especially when one considers that in the 1980s alone, 10,000 wildfires struck the Golden State.39
MALIBU FIRE, 1956
Firefighters, who once focused their efforts on controlling outbreaks in wilderness areas, found that the Malibu disaster marked the start of a new breed of conflagration that occurred on the border between backcountry and built-up suburban developments. (Regional History Collection, University of Southern California)
HOME AND GARDEN
Just about everything we associate with the suburban home—the car, the lawn, the very house itself—guzzles nonrenewable fossil fuels. For 25 years following the end of World War II, American homes consumed rising amounts of energy. In the 1960s alone, energy use per house rose an unprecedented 30 percent.40
But the high-energy home was not the inevitable outcome of suburban expansion. It might surprise some to learn, for instance, that in the 1940s even such mainstream magazines as Newsweek touted the virtues of solar design. These innovative homes were oriented toward the south to capitalize on the sun’s heat in the winter and had overhangs to shield them from the scorching summer sun. They saved on precious natural resources and appealed to America’s wartime conservation mentality. Even into the late 1940s and early 1950s, solar homes commanded serious attention from architects, builders, and the popular press. Once again, the World War II era represented a moment of ecological possibility. As the 1950s unfolded, however, the availability of cheap heating fuels like oil and natural gas dimmed the attraction of the sun. Before too long, the federal government retreated from investing in solar research. “Our descendants 1,000 years hence may curse us for using coal, oil, and gas to heat our homes, when we might as well have used the sun,” declared one solar researcher in 1954.41
With solar design waning in popularity, the stage was set for an orgy of suburban home energy use. In 1945, very few American homes had air conditioning, even though the technology was available as far back as the 1930s. But as air conditioning units became cheaper and more compact in the late 1940s, sales began to rise. Even more important was that once the wartime housing shortage ended in the mid-1950s, builders had to figure out how to continue to stimulate demand for new homes. Air conditioning proved the answer to their prayers. In effect, builders found themselves in the same position as the auto industry in the 1920s, after the need for basic transportation had been met by Ford’s Model T. Air conditioning became the equivalent of GM’s annual model change. The addition of air conditioning to new homes tempted buyers to trade up. Women who stayed at home while their husbands left for air-conditioned offices helped fuel the market for central air. Not only that, the National Weather Bureau also worked to further sensitize people to the perils of heat. In 1959, the bureau put forth a “Discomfort Index,” a composite measure of both heat and humidity. The air conditioning industry seized on the index as the perfect way to know when it was time to crank the machine to “Hi Cool.”42
Between 1960 and 1970, the number of air-conditioned houses went from one million to almost eight million. The energy-intensive machines added comfort to the home and unquestionably made life more bearable in the South. They also helped to lengthen the lives of those suffering from heart or respiratory disease. It is worth noting, however, that properly designed solar houses could have achieved at least some of the same ends, and at a far lower environmental cost.43
The suburban home's drain on energy resources also had some less obvious causes. When such large-scale builders as the Levitts cleared the land of trees, they exposed the new homes to both more heat and more cold, increasing the energy that had to be expended to keep the temperature comfortable inside. With the trees gone, developers then planted grass to cover up the scarred earth left behind in the building process. A quick and simple means of sprucing up the terrain, however, soon turned into a major-league obsession. Whatever complexity existed in the agricultural ecosystems that preceded suburbia gave way to a homogeneous sea of green, a mass-produced landscape to accompany the mass-produced homes. Homeowners broke out lawnmowers, pesticides, fertilizers, and sprinklers as they set about furiously transforming the landscape into a lush green carpet. Oil and natural gas, it turns out, are the chief components in the production of nitrogen-based fertilizer. Power mowers, of course, also use fossil fuels and contribute far more than one might expect to air pollution: one hour spent mowing grass produces roughly the same emissions as driving a car 350 miles. Suburban homes guzzled energy, both inside and out.44
Landscape architect Frederick Law Olmsted pioneered the lawn in its suburban incarnation. In the 1860s, Olmsted designed a suburban community outside of Chicago, with houses set back enough from the street to allow homeowners to plant a nice swath of grass. “A smooth, closely shaven surface of grass is by far the most essential element of beauty on the grounds of a suburban house,” wrote lawn advocate Frank J. Scott in 1870. Fittingly, standardization maven Frederick W. Taylor played a role in American lawn history, experimenting with grass that he hoped could be “made in much the same way that an article is manufactured in a machine shop or factory.”45
In the 1920s, as the economy evolved into its present consumption-oriented mode, companies specializing in lawn-care products preyed on the fear that failing to keep grass neatly manicured reflected badly on homeowners themselves. “Many a lawn that looks passable at a distance shows up very poorly when close by,” read a brochure for one lawnmower company in 1928.46 At a time when advertisers urged women to scrutinize the inside of homes for all traces of dirt and sold them new products to eliminate it, companies peddling lawn-related items urged the man of the house to assume his civic duty and make sure the grounds appeared well trimmed and orderly.
During World War II companies such as O. M. Scott & Sons turned the lawn into a means of national unity and moral uplift. “Your lawn is the symbol of peace at home and its proper maintenance a vital factor in keeping up morale,” read a 1942 advertisement for the seed company.47 Tending lawns evolved into a patriotic duty, no less important than fighting the nation’s enemies.
When the war ended, the lawn truly began to swallow the landscape. Gardeners threw down grass seed with each new subdivision. Meanwhile, the game of golf, with its fairways and putting greens, became a national pastime. Earlier in the century, the U.S. Golf Association and the USDA joined forces to find new species of turf grass suited to America's diverse physical environment. When, following World War II, golf became a multibillion-dollar business, champions such as Sam Snead, Arnold Palmer, and Jack Nicklaus were recruited to advertise mowers and other lawn supplies.
Good-looking lawns brought happy families in their wake, or so advertisements and television shows told viewers. The TV show Father Knows Best, first aired in 1954, opened with a shot of a meticulously kept lawn before zooming in on the family itself. The Vigoro Lawn Food company ran a similar spot. Maintain your lawn well, went the message, and you too could look forward to domestic harmony, or at least convince neighbors that life was as perfect indoors as out.48
Of course the perfect lawn, like the ideal family or body, was nearly impossible to achieve. Four out of five Americans, according to a 1981 poll, expressed dissatisfaction with the state of their lawn. A simple ecological fact explains why the quest for the gold medal lawn ended so often in failure: Turf grass is not native to North America. Most of the grass species come from northern Europe, where they evolved under moister, cooler conditions than those found across most of the U.S. mainland. Such species flourish in a wet northern climate like Newfoundland's, but to grow turf grass in the continental United States, one must be prepared to water, fertilize, and mow constantly. In Europe, grasses adapted themselves to being grazed by livestock, meaning that they grew and flowered regularly to meet the dietary needs of animals such as cattle. Were it not for this fact, cutting the lawn would never have become the suburban weekend ritual that it is today.49
It was difficult enough to grow turf grass in America, but the relentless quest for perfection—a neat, bright green expanse free of weeds and insects—has increased the stakes and costs of lawn care. Keeping lawns green requires a huge amount of fertilizer. As late as the 1920s, suburbanites spread manure on their lawns come the fall. But the decline of the horse in urban life led homeowners to turn to commercial fertilizers—increasingly so after World War II, when they became readily available in an artificial form. Prior to 1940, one pound of nitrogen fertilizer per 1,000 square feet of lawn was all the experts recommended; by the 1970s, the figure had risen to eight pounds for the same area. In the early 1980s, Americans spread more chemical fertilizer on their lawns than the entire nation of India used to grow food for its people. Excessive use of nitrogen fertilizer fosters algae blooms, as the nutrients run off into rivers, only to emerge in coastal waters where they harm aquatic life. Fertilizer also contaminates ground water supplies and may play a role in causing cancer, birth defects, and “blue baby syndrome,” where an infant’s blood is deprived of oxygen.50
Were homeowners willing to tolerate a little disorder, they could have forgone the fertilizer. Clover, once a common species in grass seed mixtures, provides a free fertilizer treatment. It absorbs airborne nitrogen and, when it dies, returns the nutrient to the soil, where it helps nurture the grass. But after World War II, clover, like other "weed" species, came under attack. "It's time to take up arms against the weeds," read one 1955 article extolling the virtues of a flawless lawn. "From now on, when man and nature meet on the lawn, it's dog eat dog."51 It was as if when the battles overseas ended, Americans, deprived of an outside threat, declared war on the unseen enemies lurking in their yards.
As went the weeds, so went the bugs. By applying the metaphors of war to insects, Americans elevated the killing of bugs into a battle for the nation’s very existence. Insects played an important role in World War II. At one point in the early part of the war, Gen. Douglas MacArthur estimated that mosquitoes had caused two-thirds of his troops in the South Pacific to come down with malaria. Although considered a dangerous insecticide today, DDT (dichlorodiphenyltrichloroethane), first tested in the United States in 1942, saved the lives of countless U.S. soldiers and millions of others across the globe at risk of malaria. In light of its success abroad, when the war ended, DDT was deployed at home on American soil.52
In 1944, Life magazine contemplated waging war against the Japanese beetle, a turf grass pest especially troublesome in the Northeast. "Japanese beetles, unlike the Japanese, are without guile," the article explained. "There are, however, many parallels between the two. Both are small but very numerous and prolific, as well as voracious, greedy, and devouring." The deeply ingrained racism of wartime rhetoric was adapted to the postwar insecticide mania, as the civilian use of DDT—proclaimed "the atomic bomb of the insect world"—and other poisons snowballed.53
Aside from the dangers pesticides posed to human health, they also killed organisms helpful in decomposing grass clippings. That left homeowners little choice but to throw out the clippings as “waste”—losing yet another opportunity for a free fertilizer treatment. If they wanted healthy and green yards, it was off to the store to purchase more fertilizer, in a never-ending cycle of ecological catch-up.
The lawn has become a perfect vehicle for promoting economic growth under consumerism, luring Americans into a war they cannot win. Much of the nation’s climate and geography stands in the way of triumph. But from the standpoint of those in the lawn-care industry, the fact that total victory remains elusive is, of course, a source of great profits. The unattainable quest for perfection has allowed suppliers to sell Americans on the need for an arsenal of high-energy pesticides, herbicides, and lawn tools in a battle that is essentially over before it starts. What Alfred Sloan accomplished with his yearly model change and postwar builders did by promoting air conditioning, the nation’s lawn-care industry arrived at by virtue of ecological misfortune. Turf grass species developed in colder European climes simply cannot flourish easily in this country. Consumer capitalism, a system of social relations predicated on people’s inability to satisfy their insatiable needs, easily took root in the American lawn.
CONCLUSION
Suburban sprawl has had a profound effect on ecosystems across the United States. Taken together, lawns and automobiles have redefined the American landscape—knitting the nation together in swaths of green and ribbons of black. But sadly, by locking into auto-centered suburbanization, our culture may well have imperiled its ability to respond to future environmental challenges.
The reduction of meadow, prairie, desert, and forest habitats—with their complex plant communities—into a lawn monoculture is one of the singular ecological inventions of modern American history, a shift that affected not just plant life but also birds, insects, butterflies, and small mammals. Simplified lawn ecosystems attract large numbers of birds, it is true. But whereas the native vegetation allowed for a range of avian life, the lawn has drawn only those species—house sparrows and starlings, for example—that feed on the seeds and insects commonly found on the streamlined green expanses. From the loss of species diversity to fertilizer-induced ground water contamination, the American lawn continues to exact a high environmental toll.
The lawn even has the force of law behind it. Many communities prohibit homeowners (and have in some places for 100 years) from letting "weeds"—that is, undomesticated vegetation—take over their front yards. Nor is grass allowed to grow over a certain specified height. Some jurisdictions have even mandated jail time for such offenses. In the late 1980s, one Maryland couple refused to mow their foot-high grass, a protest that eventually helped galvanize resistance to lawn totalitarianism. Although the couple eventually won their case—prevailing on the county to change its law—the suburban meadow movement faces an uphill battle against the thoroughly entrenched ruling lawn culture.54
Compared to the lawn, the ecological impact of the car seems far more glaring. The tendency to evaluate technology in terms of physical objects has focused attention on the automobile's obvious by-products: its role in generating petroleum dependence, air pollution, toxic waste, and global warming—all major effects of the nation's shift to cars as the primary means of transportation. But if we consider not just the car but the system of highways that has arisen around it, a fuller and less recognized understanding of auto culture's impact on nature emerges. The creation of a national interstate highway system, like the making of a national lawnscape, has had significant effects on local habitats. When Interstate 75 slashed through Florida's Big Cypress Swamp, it split the habitat of the Sunshine State's panther population and caused its numbers to decline. Meanwhile, in the Northeast, the salt spread on roads during the winter has caused some tree species, such as oaks, to dwindle and other, salt-tolerant species, such as sycamores, to grow in their place. And anyone wondering why ragweed dominates the edges of such roads might care to know that the species thrives on salt.55
Ecological conditions in modern America have changed, and they will continue to evolve. Unfortunately, however, decisions made by large corporate interests such as the auto and oil lobbies (and their government supporters) have narrowed our culture’s range of options in responding to change. What is good for GM or some other large and powerful institution or interest, as anthropologist Roy Rappaport once observed, cannot be good for the country. By forcing us so thoroughly down one single path, conformity limits our ability to respond in the face of an unforeseen turn of events. Whether we have mortgaged the future by committing ourselves to auto-centered suburban expansion is debatable. But there is no question that we have foreclosed, to a major extent, on our ability to adapt to the ecological changes ahead.56