Where people ate, and where their food was prepared, shifted, another reason for rising caloric intake. Home-cooked family meals became something of an oddity. The growing time pressure on families—more adults working, workers working more hours, and a decline in fixed, daytime work schedules—meant that in more and more homes getting family members together to eat a home-cooked meal became difficult or impossible. (In 1997, only a bare majority of workers had regular, weekday, daytime jobs, as nighttime, weekend, and irregular work became more common.) Instead, people increasingly ate out, often on the run, or ate take-out meals at home, very often not in family groups but individually. In 1995, 29 percent of all meals were eaten away from home, up from 16 percent in 1977.
Fast-food restaurants, which spread from suburban roadsides to city streets and into institutions like colleges and hospitals, provided billions of those meals. And the meals they provided, on average, had ever more calories. In the 1970s and 1980s, fast-food companies discovered that customers would buy more food and drink when offered combinations of items priced at a discount, so-called value meals, and when the size of offerings grew. A single serving of McDonald’s french fries went from 200 calories in 1960 to 320 in the late 1970s, 540 in the late 1990s, and 610 in the early 2000s (at a time when the federal government recommended a total daily intake of 1,600 calories for women and 2,200 for men). An oceanic proliferation of high-calorie snack foods also put pounds on bodies. The poor snacked more than the middle and upper classes, and snacking increased more among Hispanics and African Americans than whites, contributing to their higher levels of overweight and obesity.
Young Americans got more calories at school, too. Chronically starved of funds, many school districts, particularly in states like California that had passed tax limitation laws or referenda, did not have the money to maintain cafeterias that cooked food from scratch. Instead, they turned to buying food prepared off-site, often by fast-food companies that specialized in high-calorie items. Many California high schools allowed restaurant chains like Pizza Hut to set up their own vending stations on their campuses, which students preferred to cafeteria fare. And all across the country school districts signed so-called pouring contracts with beverage companies, which in return for cash payments and sometimes a share of their receipts gave the companies the exclusive right to advertise and sell soft drinks—a major source of caloric intake—in district schools.
The tax limitation movement and the squeeze on school finances contributed to the other half of the bloating equation, too, the decline in physical exercise, especially among those in the lower economic strata. Many schools dealt with the requirement in Title IX of the Education Amendments of 1972 that they provide equal funding for boys’ and girls’ athletics by redistributing funds rather than substantially increasing resources, leading to a decline in standards, at least for boys. In 1976, California allowed school districts to entirely exempt high school juniors and seniors from physical education requirements. More generally, the coincidence of the fitness boom of the 1970s and beyond with the era of tax caps and denunciations of the public sector meant that the infrastructure for exercise and sports increasingly grew up in the private sector rather than in schools and public recreation facilities. Poor children, without nearby parks or public gyms or public recreation workers or a stay-at-home parent with the time to shuttle them to the private soccer and other youth sports leagues that burgeoned during this era, had fewer opportunities to engage in sports or exercise than their wealthier peers.
As children spent more time watching television and playing video and computer games, youth fitness suffered. But not all groups suffered equally. Poor, African American, and Mexican American children watched more television than white and well-off children, with associated consequences for weight and health. In many cases, their parents encouraged them to do so because their neighborhoods were so dangerous that the health risks of violence and drugs outside their homes exceeded the risks of physical inactivity within them. The decline in manufacturing also contributed to weight gain, as more workers sat in sedentary jobs. Culture, ethnicity, gender, and race all mattered in body size, but the most important variable in determining the likelihood of obesity was economic class.
In spite of growing bigger, Americans lived longer. Life expectancy at birth rose to just under seventy-seven years in 2000 (seventy-four years for men, seventy-nine and a half for women), from seventy-three in 1975. Blacks had a life expectancy of seventy-two years, whites a bit over seventy-seven, a modest narrowing of the gap over the previous quarter century. By this most basic of health measures, Americans were doing very well.
But some others were doing even better: Australians, Belgians, British, Canadians, Dutch, French, Germans, Greeks, Italians, Japanese, and Spaniards all could expect to live longer. Differences in social structures and health-care delivery systems accounted for most of the gap. Although the United States spent a much larger share of its GNP on health care than other countries, a substantial number of its residents didn’t have easy access to medical services due to a lack of health insurance (universal in almost every other advanced industrial nation) or an absence of nearby medical facilities. In part for this reason, the United States had a higher infant mortality rate than most industrialized countries. Other particularities of American life also retarded the country’s health standing compared to other industrial nations. Driving more meant a higher death rate from motor vehicle accidents. More firearms meant more firearms deaths, more than twice the per capita rate in France and thirty-four times the rate in England and Wales, which had very strict gun control laws. More obesity portended more cardiovascular and coronary heart disease and led to an epidemic of type 2 diabetes, including among children, for whom it rarely had been a problem until the 1990s. One demographic historian concluded that at the start of the twenty-first century, “a large share of the American population,” more likely to be poor and nonwhite than the rest, had “health standards more common to less developed countries than to the advanced industrial world.”
Global Impact
Living large had planetary effects. In 1997, the United States, with less than 5 percent of the world’s population, consumed almost a quarter of the energy used by human society. Per person, the United States used twice as much energy as Germany, France, and Great Britain. Globally, it was responsible for nearly a quarter of the emissions from fossil fuels.
The whole suburban system of living, working, and shopping depended on large amounts of low-cost energy. The ever-bigger, single-family houses that Americans bought required large amounts of energy to heat and cool. By the early twenty-first century, almost 90 percent of new homes and virtually all new cars had air-conditioning. Increasingly, people and businesses left air-conditioning running all the time. (The United States used more electricity for air-conditioning than the total amount of electricity used in India, a country with more than three times its population.) Because of the increased popularity of SUVs and pickup trucks, the average fuel economy for new passenger vehicles actually fell, from 22.1 miles per gallon in 1987 to 20.8 in 2003.
Even the vast amount of lawn that came with suburbanization—the country had an estimated twenty-five to forty million acres of domesticated grass—pushed up energy usage. To keep lawns green and looking vigorous, Americans applied huge amounts of fertilizer made from natural gas. To cut, trim, clear, and manicure lawns and yards, power lawn mowers, leaf blowers, and other motorized equipment were deployed, usually in the hands of low-paid immigrant workers, as Latinos came to dominate the lawn care workforce. (All that equipment made a racket, so the roar of the small-bore gasoline engine became one of the sensory markers of suburban life.) The EPA estimated that seventeen million gallons of gasoline were spilled each summer in the course of trying to refuel lawn mowers and other garden machinery, nearly twice the estimated amount of petroleum that leaked into Prince William Sound, Alaska, in 1989 from the Exxon Valdez in one of the most notorious environmental disasters in the country’s history.
What made the consequences of the high-energy, high-pollution way of life in the United States so globally significant was the country’s large and growing population. Measured per person, energy use in the United States essentially plateaued from the mid-1970s on, as appliances and industry became more efficient, but with more and more people each year, the total energy consumption of the country kept rising. Eighty percent of the country’s energy (in 2000) came from fossil fuels, so rising energy consumption meant increased emission of carbon dioxide, the main greenhouse gas responsible for global warming.
Similarly, per capita water use dropped by 25 percent during the last quarter of the twentieth century as a result of more efficient appliances and local and state conservation efforts, but growing population resulted in a small increase in total water use. (Nearly half the water went to cooling power plants and another third to irrigation.) While in much of the country water resources remained plentiful, in the arid West, with its fast-growing population, getting all the water that agricultural businesses sought along with supplying residential and industrial needs became an increasing problem. In dry years it simply could not be done. The Ogallala Aquifer, the largest single source of groundwater in the country (underlying eight states in the Southwest and Great Plains), dropped an average of a foot a year from the 1970s on, as withdrawals far exceeded natural replenishment.
Without substantial technological or social changes, late-twentieth-century American life was unsustainable on a long-term basis. So many people living so large required such large amounts of energy, water, and other resources and produced so much pollution, including greenhouse gases, that over time inputs inevitably would be depleted and outputs severely damage the environment, locally and globally. Yet the country, by and large, kept doing more of the same, a collective self-destructiveness that could be seen as a kind of social psychosis.
Many people did not believe that pollution or climate warming or resource depletion were serious problems or had faith that they could be addressed by gradual future changes or technological fixes. After all, the country had faced serious problems in the past and overcome them. The lived experience of most Americans was of an improved environment, not its degradation. Largely as a result of legislation passed in the 1960s and 1970s, rivers and drinking water had become cleaner, the air less smoggy, and open dumps of waste less common. Many of the most severe environmental problems of the late twentieth century, like increased greenhouse emissions, did not have easily observable effects. Also, because a disproportionate number of polluting facilities, from petrochemical plants to dumps to sewage treatment plants, were located in neighborhoods of poor, working-class, or nonwhite residents, members of the politically and culturally influential white middle class were less touched by the downside of living large.
Economic interest often overcame ecological concern. It never had been a secret that the world had only a limited amount of oil, which was being used up at a good clip, and that burning gasoline resulted in dangerous emissions. Yet the automobile industry, seeing mandated energy efficiency as a threat to its profits, led long, largely successful campaigns against raising fuel economy standards and uniformly regulating all passenger vehicles, including SUVs. The auto companies mobilized allies with political clout to support them, most importantly auto dealers and the United Automobile Workers. Though the union supported environmentalism in general, during the George H. W. Bush, Clinton, and George W. Bush administrations it lobbied successfully against efforts to significantly raise fuel efficiency requirements, buying into the industry argument that tougher controls would lead to a loss of jobs.
Population size, which along with per capita resource usage and pollution determined the environmental impact of American society, rarely was addressed during the 1980s and 1990s. Before then, there had been a great deal of discussion of controlling population abroad and, to a lesser extent, in the United States. In the late 1960s, both major parties took for granted the need to limit population in the face of the rapidly growing number of people on the planet. The federal government linked foreign aid to the willingness of recipient countries to establish population control programs. Advocates focused their efforts on poor, nonwhite countries with high rates of population growth, a heritage of the eugenics movement that helped spawn the push to limit population. But some promoters of population control argued for its need in the United States, too. Stanford University entomologist Paul Ehrlich, whose 1968 book The Population Bomb sold two million copies by the mid-1970s, suggested that the optimal population of the United States would be about seventy-five million.
The most complete federal statement on population came from a commission appointed by Richard Nixon, headed by John D. Rockefeller 3rd, a longtime advocate of population planning. Its 1972 report said that “in the long run, no substantial benefits will result from further growth of the Nation’s population.” The report identified one aspect of the “population problem” as “the effect on natural resources of increased numbers of people in search of a higher standard of living.” Its proposals to achieve a “gradual stabilization of . . . population” included education in the schools about population; sex education “available to all”; state laws “affirming the desirability that all persons have ready and practicable access to contraceptive information, procedures, and supplies”; the elimination of legal restrictions on voluntary sterilization and abortion; more funds for family planning; measures to reduce illegal immigration; freezing the level of legal immigration; and periodically reviewing immigration policy “to reflect demographic conditions and considerations.”
The sweeping recommendations of the Rockefeller Commission came out of a political moment that proved short-lived. The many dissenting statements by commission members presaged a shift against population control. Nixon himself, under pressure from the Catholic hierarchy, rejected the recommendations of his commission. In the 1968 encyclical “Humanae Vitae,” Pope Paul VI reaffirmed church opposition to any use of abortion, sterilization, or artificial contraception. Around the world, the church took the lead in fighting population control programs with remarkable effectiveness, joined in the United States by conservatives of other faiths, as abortion became an ever more prominent issue. The Reagan administration acceded to the church and other abortion opponents when it instituted a policy of refusing to give money to international family planning programs that provided abortion services or funded groups that did.
Even many supporters of contraception and abortion mobilized against the population control movement, denouncing it for its historic concentration on the poor and nonwhites while paying little attention to the social and environmental impact of the high-consumption ways of well-off nations and individuals. (The Johnson administration had concentrated its domestic population control efforts on Puerto Rico and other U.S. island possessions, Indian reservations, and inner-city residents.) Also, many population control backers were willing to deny women control over their bodies. Such gender, class, and racial biases became increasingly unacceptable in the wake of the civil rights movement, feminism, and decolonization. Moreover, as Malthusian predictions by population control advocates like Ehrlich that population growth would lead to mass starvation and global chaos proved false, much of the argument justifying the movement disappeared. So by the 1990s, population growth was rarely discussed as a problem within the United States, except as an occasional subsidiary part of arguments against immigration.
Most Americans did not consciously reject the idea that a growing society living large had negative consequences for the national and global environment and might be unsustainable. Rather, they simply did not think about it. People lived large because they could. Living in big houses, driving big cars, going to big malls, even being big, brought physical, psychological, and spiritual satisfactions. Living large created a sense of well-being, of security, of power, a sense of worth and superiority. Just as the aftermath of the Cold War renewed the U.S. sense of limitless power abroad, the unevenly distributed economic bounty of the 1980s and 1990s allowed a large upper stratum of society to have a sense of limitlessness in its lifestyle and consumption. Others liked to live large too, even if it only took the form of consuming a very large serving of fast food washed down by a very large cup of soda. Americans had developed the habit of believing themselves exempt from history and the consequences of their actions. Led by men and women doing very well at the expense of the rest of the society, most Americans had little interest in spurning whatever possibilities lay before them for the comforts of a high-consumption way of life.
EPILOGUE
* * *
America After 9/11
On September 11, 2001, Deputy Secretary of State Richard Armitage declared, “History begins today.” He was meeting with the head of the Pakistani intelligence service, who happened to be in Washington when al Qaeda terrorists flew hijacked airplanes into the twin towers of New York’s World Trade Center and into the Pentagon; a fourth hijacked plane crashed in Pennsylvania after passengers tried to seize control. Armitage warned that the United States held no brief for the history that had led Pakistan to support the Taliban government in Afghanistan, which was allowing Osama bin Laden to operate there. But Armitage’s assertion had a broader meaning, widely shared in Washington and the nation, that a radical disjuncture had taken place. President George W. Bush said that on September 11 “night fell on a different world.”