American Empire


by Joshua Freeman


  As in the case of migration to the North, migration to the West often brought economic mobility, as evidenced in the experience of “Migrant Madonna” Florence Thompson and her children. The relocation of millions of poor Americans to more prosperous regions of the country with greater job opportunities contributed to the postwar rise in average national income. For many, the West did not quite live up to the image of middle-class utopia projected by California governor Earl Warren, whose almost ridiculously attractive family seemed to embody the wondrous possibilities of American life. But the promise that moving on would bring some measure of salvation did not prove completely false for those who made the great trek westward.

  Immigration played a smaller role in American life during the two decades after World War II than it had during most of U.S. history. The post–World War I quota system all but banned immigration from Asia and Africa while limiting European immigration to a level far below its peak at the turn of the century. Unlike in the past, immigrants from elsewhere in the Americas, primarily Canada and Mexico (which the quota system did not cover), arrived in roughly the same numbers as immigrants from Europe.

  Canadian immigrants outnumbered those from Mexico. However, largely English-speaking (with a minority of French speakers from Quebec) and culturally close to the white population of the northern United States, they had little social impact except in the northern parts of New England and the Great Lakes states, where most settled. Mexican immigrants had greater social visibility. World War II and the postwar economic expansion brought increased migration, as Mexicans came north to take advantage of job opportunities, especially in agriculture and transportation. In addition, the wartime shortage of labor led the United States to establish the “Bracero” (from the Spanish brazo for “arm”) program, under which Mexican citizens could enter the United States as contract laborers to do seasonal agricultural work without facing the military draft. Meant to be a temporary program, it remained in place until 1964 because it provided a convenient source of cheap labor for western growers, who had to pay Bracero workers only 75 percent of the prevailing wage.

  On the East Coast, Puerto Rican migration added to the Spanish-speaking population. Though Puerto Ricans had been citizens of the United States since 1917, relocation to the mainland had been modest through World War II. After the war, a decline in rural employment, especially in the sugarcane and coffee industries, led to urbanization on the island and, coming at a time when travel costs to the mainland dropped dramatically, migration to the East Coast, especially New York. By 1970, over one-third of all Puerto Ricans were living on the mainland. New York alone had a Puerto Rican population of 800,000, making it the largest Puerto Rican city in the world.

  The population movements of the post–World War II decades resulted in a vast resegregation of the country. While the African American population became more evenly distributed among the different regions, within regions a new segregation was effected as whites left cities for suburbs and nonwhites replaced them. Between 1950 and 1960, some 3.6 million whites moved out of the country’s twelve largest cities, while 4.5 million blacks moved in. Many African American migrants found themselves living in far more segregated circumstances in the North than they had in the South. With jobs moving out of the cities too, newcomers found their hopes for better lives circumscribed and frustrated. While whites embarked on the great suburban adventure of the 1950s, blacks, Puerto Ricans, Mexicans, and other minority groups were left to inherit cities with decaying infrastructures, declining or stagnant employment opportunities, inadequate housing, and declining tax bases. By the 1960s, the notion became widely accepted that the country faced an “urban crisis,” a crisis that contained within it all the accumulated economic, political, racial, and social tensions created by the mass migrations of the postwar era.

  Toward a National Suburban Culture

  Suburbanization, especially the rapid development of whole communities, necessitated the creation of new structures of everyday life. One early Levittown resident remembered, “There were no telephones, no shops. . . . There was no grass, no trees, just mounds of dirt and snow covering it all.” Along the streets and cul-de-sacs of raw housing tracts, a new national suburban culture emerged.

  The car lay at its heart. Suburbanization reinforced the centrality of the automobile to American life. Though residents of new suburban communities often initially traveled back to a city for work, shopping, entertainment, and services, over time that became less common, as employers and services moved outward or sprang up anew in forms built around the automobile. Shopping centers, office parks, drive-in restaurants, and new churches became the hubs of suburban life.

  Some pre–World War II suburban developers, most notably J. C. Nichols, who built the upper-class Country Club District in Kansas City, had incorporated shopping districts into residential communities, designing them to have an urban feel but with plenty of room for parking. After the war, developers no longer made any effort to integrate shopping areas into their surroundings, instead planting them in the middle of seas of parking lots. To get to shopping, suburbanites had to drive, but once there the larger shopping centers provided a pedestrian experience, with small shops lining walkways between “anchor” department stores. Shopping center managers, to encourage shoppers to come regularly, tried to make them substitutes for village centers or urban downtowns, with professional offices for doctors and lawyers, space for community meetings, post offices and banks, restaurants, movie theaters, even skating rinks.

  The expanding suburban market and favorable federal tax treatment—in 1954, as an antirecessionary measure, Congress amended the tax code to permit the accelerated depreciation of building costs—led to the spread of large shopping centers across the country. More and more were fully enclosed, climate-controlled malls, windowless introverted spaces. By 1970, the country had thirteen thousand shopping centers; by 1984, twenty thousand, which accounted for nearly two-thirds of all retail sales.

  Shopping centers spurned the unruliness and heterogeneity of city life. With few exceptions, they aimed to attract white middle-class shoppers, choosing locations and designs meant to keep out others. Store employees also tended to be white, in many cases women living nearby who worked part-time, a contrast to the large department stores in northern cities, which in the decades after World War II finally began to hire substantial numbers of African American and Puerto Rican workers. (Southern stores remained racially segregated in their customers and staff well into the 1960s.) Though they served as surrogate downtowns, shopping centers were not truly public spaces, generally restricting demonstrations, picketing, and other activities deemed undesirable in an effort to maintain controlled, union-free, controversy-free shopping environments.

  Shopping centers soon began to supplant downtown shopping districts, as suburban customers preferred the convenience of nearby stores and easy parking. With shoppers and investors going elsewhere, in city after city the old downtowns grew shabby and landmark stores began to close. An exodus of employers also hurt established downtowns. Not only did factories move out but so did many offices, which relocated to nondescript buildings strung along suburban roads or, at the high end, to “office parks” on highly landscaped campuses far removed from the public access, unpredictability, and worn-out feel of their old locales.

  A new set of car-oriented services popped up on suburban roadways. The drive-in restaurant was the purest expression of car culture. The first opened in Dallas in 1921. Not long after, the White Tower chain combined fast-food service for automobile travelers with franchise ownership, a combination that proved ideally suited to taking advantage of the market opportunities created by postwar suburban development and the expansion of the national highway network. Franchising opened the door for men and women of modest means to get in on the national romance with entrepreneurship, gave companies capital to grow and a managerial cadre with a stake in their brands, and provided travelers with standardized services across the country. The most successful fast-food chains, like McDonald’s, which expanded from a single store in San Bernardino, California, to a pervasive national presence, used the techniques of Fordism—a limited array of standardized products, specialized machinery, and an intense division of labor—to keep prices down and volume up. Hotel and motel chains, like Holiday Inn, likewise used franchising to profit from the increasing dominance of the automobile for long trips as well as short ones and the sprawling character of postwar development.

  In many parts of the country, a patchwork of new government agencies formed as suburbs grew—school districts, water districts, fire districts, police districts—fragmenting governance into multitudinous geographic and functional units. (Exceptions to this pattern occurred in areas, mostly in the South and Southwest, where state laws made it easy for cities to annex their suburbs.) Suburban politics often revolved around service delivery, but sprawling growth made regional planning and coordination difficult. Political parties found it hard to build infrastructure in this new terrain, increasingly relying on mass media, rather than local clubs and party leaders, to reach voters.

  With population dispersed, town centers not always present, and civil authority fragmented, churches and synagogues emerged as key nodes of suburban social organization. In addition to their spiritual role, suburban religious institutions sponsored a host of secular activities and imparted to their members a sense of community. Their rapid growth contributed to the remarkable increase in formal religious affiliation during the postwar decades and helped sustain a normative acceptance of the divine in the new social landscape.

  Suburbanization muted class distinctions. In communities like Levittown, blue-collar and white-collar workers, civil servants, even small business owners lived side by side in houses that looked the same and contained very similar furnishings. “There is . . . no wrong side of the tracks,” noted Harper’s in 1953 about the new suburbs. As diners (popular in the eastern half of the country) and bowling alleys spread from city to suburban settings, they shed their male, blue-collar ambience, deliberately seeking a broader, middle-income, family clientele.

  Conviviality and informality characterized the new suburban culture. Many young suburban families did not have parents or older relatives living nearby. (The drive back to the city to visit the folks became a cherished—or dreaded—weekend ritual.) For everyday companionship and support, they turned to their neighbors, minding each other’s children and gathering for endless rounds of backyard barbecues, Tupperware parties, and television watching (before a set in every home became common). Children were constantly in and out of each other’s homes. Steve Wozniak, the son of a Lockheed engineer who would play an outsize role in the creation of personal computers, fondly recalled that in the suburb where he grew up, “there were kids all over, so many kids on our block, and we would just go up and down the block and run into each other and start riding bikes and agree to do something.”

  Bike riding aside, the suburban way of life depended on cheap, plentiful energy. Single-family homes tend to be less energy efficient than multi-unit buildings, and postwar tract housing was especially so, since much of it was single-story and lacked traditional temperature-moderating features, depending instead on air-conditioning and extensive heating systems. Dependence on automobiles for so many everyday tasks—getting to work, shopping, transporting children, finding entertainment—contributed to a tripling of national oil consumption between 1948 and 1972, with gasoline accounting for about 40 percent of the use. In 1949, a gallon of gasoline cost only twenty-seven cents, and in real terms the price kept falling until 1972. With little incentive to improve fuel economy, manufacturers let average car mileage slip from fifteen miles per gallon in 1949 to thirteen and a half in 1972.

  Architects, planners, and government officials were not oblivious to the environmental impact of suburbanization. In the early postwar years, recognizing the high energy demands of tract housing, they gave considerable attention to both solar and nuclear power as possible alternatives to fossil fuels for heating and cooling homes and generating electricity. Some architects and the federal government promoted energy-efficient house designs. But with oil and coal so cheap and plentiful, home builders and home buyers saw little reason to make higher initial investments to keep energy usage down. Similarly, the loss of open space to suburban development, the pollution of groundwater by suburban septic tanks, and the destruction of wetlands and floodplains to build tract housing all raised concern but resulted in little concrete action until the mid-1960s.

  Family Togetherness and Gender Divides

  Suburbanization coincided with and reinforced an increased emphasis on the importance of family. In the decades after World War II, cultural authorities and ordinary people embraced marriage and family as the central sources of personal satisfaction to a greater extent than ever before. Millions of Americans turned to their families for stability and assurance in a dangerous, uncertain world. And they expected from them more than ever before: emotional support, recreation, sexual pleasure, self-improvement, and an existential sense of purpose. The families so weighted tended to be nuclear families, as many young couples started new households distant from the homes and neighborhoods in which they grew up.

  Sex played a more prominent role in marriage than in the past. The postwar years saw increasing social disapproval and punishment for sex outside of marriage, especially homosexuality and sexual intercourse by unmarried women. In the mid-1950s, laws in every state in the Union criminalized sodomy, deployed primarily against gay men. Unmarried pregnancy brought shame and hiding, especially for middle-class white women, leading to hundreds of thousands of dangerous, illegal abortions each year and a very large number of babies put up for adoption. But within marriage, sex and sexual pleasure were held forth as healthy and desirable by marriage guidebooks, psychological and medical authorities, and in commercial culture.

  The ideology of family that pervaded postwar society espoused very different roles for men and women. In a reinvigoration of the notion of the family wage, government, cultural, business, and labor organizations promoted the idea that families could and should live solely on the wages of a male breadwinner. For women, in something of a throwback to the nineteenth century, domesticity, specifically motherhood and wifedom, was widely held forth as the appropriate basis of identity and fulfillment.

  Women themselves had complex, sometimes contradictory feelings about what writer Betty Friedan later famously called “the feminine mystique.” When the war ended, the percentage of adult women who worked for wages fell from a historic wartime high of 37 percent to under 30 percent. Most women who left their jobs did so voluntarily, eager to start families or simply to escape the hardships of manufacturing or service work. But many were forced out of their jobs, particularly in well-paid, traditionally male bastions, like the automobile, iron and steel, machine-making, and transportation industries, where employers, often with the support of male unionists, reestablished the prewar sexual division of labor. Women also dropped out of college to begin families, making up a smaller proportion of college students and getting a smaller proportion of advanced degrees during the 1950s than before the war.

  Soon, though, women began returning to wage labor in increasing numbers. Married women, with older children able to take care of themselves or already out of the house, accounted for the bulk of the growth. By 1960, the percentage of adult women who worked outside their home matched the wartime high.

  Most women worked for economic reasons. As had always been the case, for large segments of the population the notion of a family wage taunted reality rather than represented it. Women with no breadwinner to depend on or in households in which the primary earner did not make enough to support a family worked because they had to. This burden disproportionately fell on nonwhite women; continuing a long-standing pattern, a far higher percentage of black women worked for wages than did white women, nearly half in the mid-1950s. Other women took jobs not out of sheer necessity but because doing so enabled their families to afford such discretionary items as vacations, college education for their children, and new appliances.

  Working women found themselves clustered, by cultural norms and discrimination, into just a few areas of the economy. For African Americans, household cleaning and laundering provided the largest source of employment; for whites, light manufacturing, retail trade, clerical work, health care, and education. Even within these categories, men and women worked in different tracks. In 1960, 85 percent of elementary school teachers were female, while 90 percent of high school principals were male. Such job segregation, which all but excluded women from many well-paid sectors of the economy, contributed to a median wage for women that was less than 60 percent of that for men.

  For women who did not work outside their homes—still the large majority—family life could be satisfying, or not, or both at the same time. Many women cherished the opportunity to devote themselves to raising their children and to home life more generally. One survey respondent reported that marriage had given her a “place in life. I feel I am doing exactly as I am fitted. . . . I am happy or content much more of the time than I am not.” But many women chafed or grew depressed by their financial dependence on their husbands and long days spent, without adult company, caring for children, doing laundry, cooking, and keeping up homes to standards being pushed ever upward by advertisers and women’s magazines. Technological advances—indoor plumbing, electricity, clothes washers and dryers, electric irons, vacuum cleaners, refrigerators, freezers, and garbage disposals—did not reduce the amount of time spent on housework (from the 1910s through the 1950s, housewives consistently spent a little over fifty hours a week on domestic labor), instead contributing to a rise in norms of cleanliness, cooking, childcare, and family activity. In some cases, by replacing commercial services like laundries, they actually created more work. One mother of four told Friedan, who in 1957 surveyed her Smith College classmates about their lives fifteen years after their graduation, “I begin to feel I have no personality. I’m a server of food and putter-on of pants and a bedmaker, somebody who can be called on when you want something. But who am I?”

 
