Fear provided the impetus for gating suburbia. The rate of violent crime was at a postwar high during the 1980s when the construction of gated communities took off. Their popularity continued to grow even after the crime rate began falling in the early 1990s. Generalized apprehension as much as immediate danger pushed people to locate themselves behind barriers. Even when the crime rate was at its height, it remained relatively low in suburban areas, with little evidence that controlled access prevented criminal activity. But by the 1990s, fear of crime and general fearfulness had become free-floating, disassociated from the actualities of crime and threat. A 1995 poll found that nearly 90 percent of the respondents believed (falsely) that crime was getting worse, while a majority worried that they would be the victim of a crime.
Many factors played into the fearfulness of the last decades of the century, including crime rates that remained high compared to earlier eras, drug-related violence, a resurgence of urban youth gangs, growing economic inequality, and the increased flow of immigrants. Many whites had highly racialized perceptions of crime and threats to their person and property, with their fears focused on African Americans and, in some parts of the country, Hispanic immigrants. When in 1984 a white New York subway rider, Bernhard Goetz, pulled out a gun and shot four young black men he believed were about to rob him, he became something of a national hero, a real-life embodiment of the Charles Bronson character in the two Death Wish urban vigilante movies that had come out during the previous decade.
Gated communities provided a physical and psychological barrier against what many people saw as a dangerous, unpredictable society and against people different from themselves. Like high-priced outer suburbs in general, they offered a way of escaping the racial and ethnic hodgepodge of urban life, which had spread to many inner suburbs as well. Southern California and Florida, both of which had very high levels of immigration and suffered major urban riots in the 1980s and early 1990s, housed the largest concentrations of gated communities. By contrast, in the Northeast and Midwest, where suburbs were more racially segregated, suburbanites apparently did not feel the same need for physical barriers.
Gated communities promised order and predictability not only by providing security but also by privatizing functions usually performed by the state. Typically, in these communities, as in many nongated subdivisions, homeowners shared ownership of common facilities and even streets and sidewalks (where there were any) through homeowners’ associations and had to abide by detailed rules and regulations incorporated as covenants to their deeds. Homeowners bore the cost of amenities, security, and routine maintenance, like road repairs, that elsewhere were government responsibilities.
The creation of what were in effect private suburban governments came as part of a broader privatization of American life during the 1980s and 1990s. Individuals and groups who could afford it increasingly provided themselves, through the private sector, services once performed primarily or exclusively by government. Between the early 1980s and the early 1990s, the number of private security guards soared, so that by the end of the period much more money was being spent on private security than on public law enforcement. Private gyms and health clubs multiplied, serving an increasingly health-conscious (and appearance-conscious) population at a time when many towns and cities were cutting back on their recreation budgets (particularly in areas hard-hit by the antitax movements of the late 1970s). School vouchers gained support, with a number of cities, starting with Milwaukee in 1990, experimenting with them. Many states began what in effect was a partial privatization of public higher education, as they reduced the percentage of state college and university costs covered by tax money, forcing increases in tuition.
None of this was completely new. Private and church-affiliated schools predated public education, the wealthy had long had their country clubs and private athletic facilities, and private guards had been used by companies and rich individuals for over a century. But as the income distribution became more top-heavy, a larger group of Americans had the ability and desire to buy themselves out of the use of public services, which in an era of tax-cutting often had deteriorated. Antistate thinking and the extolling of private enterprise, central to Reaganite ideology, legitimated the transfer of functions once seen as the very essence of state responsibility—like educating children, protecting the citizenry, and incarcerating criminals—into the private sector.
People who paid privately for functions like zoning, security, education, street cleaning, and recreation often resented paying taxes to finance parallel government services that they themselves did not use, creating an ongoing political pressure to keep taxes low. Changing demography reinforced this trend. During the 1950s, nearly 70 percent of adults had a child in school, providing a huge bloc of support for public education. By the early 1990s, largely as a result of smaller families and an aging population, the figure had fallen to 28 percent. In many communities, the annual vote on the school budget turned into an ugly battle, with high school athletes and cheerleaders standing by roadsides urging voters to back budgets funding their programs. Just as the post–World War II system of employment-based health and pension benefits created a two-tiered welfare state, the privatization of government services created two-tiered systems of security, education, and recreation, sparking mutual resentments between those who did and did not depend on state services.
Big
Perhaps the most striking aspect of late-twentieth-century suburban growth was the enormous size of just about everything. More than ever before, the country lived large. The original Levittown houses had 750 square feet of living space. By 1970, the average new single-family home was twice as large. By 2000, it had grown by half again, to 2,200 square feet. Since during these years households were getting smaller, the space per person rose even more quickly, far exceeding international norms. As the twenty-first century began, the typical American house provided 718 square feet of space per resident, compared to 544 square feet in Australia, the runner-up in the size derby, 442 square feet in Canada, 256 in Holland, and 170 in Japan.
If one of the great divides in human history was before and after the indoor toilet, the United States did miraculously well in pushing almost all of its residents across that line into convenience and modernity. In 1940, nearly half of all homes lacked indoor plumbing. In 1960, 17 percent still did. But by 2000, only 671,000 houses still lacked it, less than 1 percent of all dwellings. The United States had gone very far in providing decent, spacious, comfortable homes for its residents, though some central cities and poor rural regions still had serious housing problems.
But Americans were not content to stop at decent, spacious, and comfortable. By the millions, they moved into suburban and exurban megahouses, “McMansions” as their critics called them, homes of a size that in the past only the very wealthiest might have built. Whereas in the 1970s an exceptionally large suburban home might have had 4,000 square feet, by the end of the century houses of twice or even three times that size had become common. In 2000, a third of all new homes had four or more bedrooms. Wine cellars, media rooms, home gyms, swimming pools, double-height entry halls, three-car garages, huge walk-in closets, and bathrooms with multiple sinks and multiple toilets and Jacuzzis and steam rooms became common features in the massive houses that sprang up on what had been cornfields and wooded lots beyond the older suburbs of the more prosperous metropolitan regions.
The mushrooming of McMansions reflected the substantial number of families who did very well during the Reagan-Bush-Clinton years. An extraordinary 60 percent of all income growth during the 1980s went to the richest 1 percent of the population, but the rest of the top 20 percent saw their income soar as well. (The income of the bottom 40 percent of earners, adjusted for inflation, fell.) By the end of the century, 17 percent of households earned $100,000 a year or more, up from less than 7 percent receiving an equivalent amount twenty years earlier, constituting what conservative writer David Brooks dubbed “a mass upper class.” One in fourteen households had a net worth of more than a million dollars.
Owners of big homes drove a lot, and they liked to drive big vehicles. Since the advent of the automobile, suburban living had involved a lot of driving, and as suburbs spread farther outward, roads weaving together housing developments, office buildings, industrial parks, and shopping centers became ever more crowded. Typically, drivers traversed the roadways alone. In 1990, nearly three out of four workers commuted to their jobs in a vehicle that only they occupied. By 2003, the typical American household had more vehicles than drivers. The wealthier a household was, the more driving it did.
An increasing number of drivers chose to purchase not cars but sport utility vehicles that were larger, heavier, and more fuel-consuming than traditional automobiles. SUVs made up less than 2 percent of the vehicles sold in 1982, but fifteen years later they had captured over 16 percent of the market. The SUV boom stemmed from an automobile industry effort to evade federal regulation of tailpipe pollution and gasoline consumption. Auto industry lobbying helped convince the Environmental Protection Agency to classify SUVs as light trucks rather than cars, which kept them from being subject to the limits on car pollution under the 1970 Clean Air Act. Similarly, in 1975, Congress allowed the Transportation Department to establish separate fuel efficiency standards for cars and for light trucks, a category that included SUVs. Automobile companies had to achieve fleet-wide average fuel economy for their cars of 27.5 miles per gallon by 1985 but only 20.5 for light trucks. To meet the car standard, companies began producing small, fuel-efficient vehicles, on which they made little profit, while cutting down on production of large, gas-consuming sedans, traditionally favored family vehicles. Beginning with the Jeep Cherokee in 1983, they also introduced a growing number of SUV models, which they could make as big and powerful as they wanted.
SUVs proved immensely profitable for American automobile makers, more so than minivans, the other new type of vehicle that in effect replaced large sedans. For one thing, they initially faced no foreign competition. In 1964, in the course of a trade dispute with Europe, the United States had placed a 25 percent tariff on light trucks, which effectively kept European and Japanese car manufacturers out of the U.S. market for SUVs and pickup trucks. (Eventually, some foreign companies built plants in the United States—in southern, antiunion regions—to circumvent the tariff.) For another thing, automakers did not have to design SUVs from scratch, instead using engines and chassis already designed for trucks. The Michigan Truck Plant, which made the Ford Expedition and the Lincoln Navigator, proved to be one of the most profitable manufacturing facilities in human history. With a profit margin of $12,000 on the Expedition and even more on the Navigator, in 1998 its workers produced a pretax profit of $3.7 billion.
Some people bought SUVs for their putative safety, though in reality they were not safer than cars and had serious problems with tipping over and braking. But more important, as in the case of gated communities, fear of crime and the desire for security propelled the market. Upper-income families that bought the very large, very plush SUVs that manufacturers began introducing in the mid-1990s wanted vehicles that looked and felt secure, an antidote, they believed, to criminal attacks and other threats. Carmakers played along, designing their upscale mega-vehicles to appear as menacing as possible, four-wheeled monsters that could be used—or so they looked—to mow down whatever marauding herds of ne’er-do-wells their owners might encounter on their way to work or shop.
And shop they did, as an explosion of consumption took place during the last decades of the century. Much of the buying took place in the suburbs, where shopping centers continued to proliferate, numbering more than forty-three thousand at the beginning of the new millennium and forming the supply side of the buying bonanza. Institutional investors replaced family firms as the major players in financing shopping centers, malls, and suburban office buildings. Pension funds, banks no longer constrained by government regulation, and real estate investment trusts (a highly liquid means of owning real estate and mortgages, with significant tax advantages, authorized by Congress in 1960) poured billions into suburban development. Half the money to build the largest mall in the country, the vast Mall of America outside Minneapolis–St. Paul, which opened in 1992, came from the Teachers Insurance and Annuity Association (TIAA), a pension fund set up in 1918 for educators (the more tweedy of whom no doubt would have been appalled by this source of money for their golden years).
Earlier, TIAA had helped finance the Woodfield Shopping Center in Schaumburg, Illinois, an outer suburb of Chicago, which epitomized what Joel Garreau called, in a popular 1991 book, an “edge city,” a suburban hub that contained not only housing but major shopping centers, hotels, and office complexes as well. In the mid-1990s, Sears, Roebuck and Company moved its world headquarters from the iconic Sears Tower in downtown Chicago, the country’s tallest building, to the nondescript Schaumburg area. The innovative financial engineering of the 1980s and 1990s literally remade the landscape of the nation, as a generally characterless and often banal architecture of mini-manses, strip malls, big shopping centers, and low-rise office buildings wrapped in reflective glass became the built environment for a huge share of the population (an environment largely constructed using nonunion labor, since unions by and large failed to expand into the most rapidly growing parts of the country).
Large discount stores and purveyors of luxury goods dominated the new landscape of selling. The segmentation of wealth and the segmentation of consumption did not fully coincide. Well-off families, as well as ones of more modest means, took advantage of the low prices that could be found at “superstores” like Wal-Mart, which stocked everything from food and clothing to appliances and CDs. They also patronized “big box” stores that specialized in particular categories of items, like electronics, books, and home furnishings. The growing size of houses, with more and more storage space, made it possible for many families to shop at so-called buying club stores like Costco, founded in 1983, low-price bulk sellers that straddled the line between wholesale and retail business. At the other end of the scale, the 1980s and 1990s saw an explosion in the market for luxury goods and things represented as such, not only among the rich but among the middle class too, and even to some extent among the poor. Consumer companies, most iconically Nike, learned to use branding to turn ordinary items like sneakers and polo shirts into premium goods, for which they could charge higher prices and earn higher margins. Premium cars and ice creams and clothing served as treats, small or large, that consumers gave to themselves. Luxury clothing and jewelry stores inhabited the same suburban environment as the big-box stores, often in shopping malls just down the road.
The decline in the cost of food facilitated the shopping boom. During the 1930s, about a third of household spending went to food. At the end of the century, urban households devoted less than 10 percent of their spending to food, freeing up money for other kinds of purchases. But not enough. Stagnating wages, at least until the last half of the 1990s, forced Americans to borrow more and more money to make all this buying possible. Total consumer debt rose from $352 billion in January 1980 to $803 billion in January 1990 and $1.552 trillion in January 2000. Nearly half of it came in the form of credit card debt. By the end of the 1990s, households on average spent over 12 percent of their income on debt service (including mortgages). Ballooning debt would be an important element in the economic collapse that came less than a decade into the new millennium.
As houses and vehicles and shopping centers and credit card bills became larger and larger, so did the bodies of Americans. But in this case the relationship to social class was reversed: the less money people had, the larger their bodies tended to be. In the early 1960s, American men between ages twenty and seventy-four weighed on average 166 pounds and women 140 pounds. By the start of the twenty-first century they had bloated up to 191 and 164 pounds, respectively. (Over those decades, the average height of both men and women had gone up an inch, but the increase in weight proportionately exceeded the increase in height.) Three-fifths of the population was overweight, one-fifth of it so heavy that excess weight reduced life expectancy. Americans had become, with the exception of some South Sea islanders, the fattest people in the world.
The bulking up of Americans stemmed from basic metabolic calculus: on average they took in more calories than they had in the past, especially if they were poor, and exercised less, at least if they did not have money. Between 1971 and 2000, the caloric intake of men went up by 7 percent and that of women by 22 percent. The availability of new, cheaper, high-calorie food ingredients put more affordable calories within easy reach. High-fructose corn syrup, introduced in the early 1970s, tasted about as sweet as cane sugar but could be produced far more cheaply from the country’s bountiful crop of corn (encouraging even more corn production). By the early 1980s, both Coke and Pepsi were sweetened entirely with corn syrup. Palm oil, a cheap, highly saturated oil imported largely from Malaysia, provided a parallel development for fats; it had many of the characteristics of lard and became widely used in commercial baked goods, potato chips, and baby formula, and for cooking french fries. The low cost of these ingredients meant that food manufacturers could increase the size of their offerings while staying within existing price points.