Some other unions took an approach similar to that of the Mine Workers, getting employers to contribute to benefit funds, which in turn were used to finance union-run health facilities and other employee benefits. In New York City, by the late 1950s half a million workers and members of their families were receiving medical care at union clinics. Generally, though, large corporations fiercely resisted paying for benefit plans that they themselves would not control.
Unions won a leg up in 1949, when the federal courts confirmed a National Labor Relations Board ruling that employers had a legal obligation to engage in collective bargaining over pension demands and that workers could strike if employers refused to do so. In September of that year, Ford agreed to a pension plan for workers with at least thirty years’ service in order to avoid a walkout by the Auto Workers. Soon thereafter, 600,000 steelworkers walked off their jobs, returning to work forty-two days later when their employers agreed to provide pensions and pay half the cost of a rudimentary health insurance plan. Other corporations followed suit, often reluctantly. It took a 104-day strike before Chrysler broke down and agreed to a pension plan. Next came General Motors, which agreed to a pension plan and the auto industry’s first medical insurance benefit as part of a five-year contract it negotiated in 1950. In the years that followed, companies in other industries began agreeing to health and pension benefits too, while the terms of such plans steadily improved.
Private and public welfare benefits were intimately linked. The Auto Workers’ and Steelworkers’ drives for company pensions stemmed from the inadequacies of Social Security. Inflation had eaten away at the spending power of federal retirement payments, which in 1948 averaged only $25 a month, and nearly half the country’s workers were not covered by the program at all. Business opposition had repeatedly blocked efforts to raise taxes in order to improve benefits. The CIO unions, by demanding that employers provide retirement benefits above and beyond Social Security payments, created an incentive for business to drop its opposition to improving the government system. After a long legislative battle, in June 1950 Congress passed a law more than doubling Social Security benefits and extending coverage to ten million additional workers, including a million domestic servants and a smaller number of agricultural workers. In 1949, a typical retiring autoworker could count on receiving only $32 a month, all from Social Security. Just three years later, improved Social Security and company pensions combined to nearly quadruple that, to $123 a month.
The United Automobile Workers took the lead in winning yet another benefit, supplementary unemployment payments. In many industries, workers continued to suffer from layoffs as a result of seasonal production patterns and cyclical downturns. During the 1958 recession, steel companies laid off 200,000 workers and put another 300,000 on shortened work schedules. In 1955, Ford agreed to create a fund from which laid-off workers would receive payments in addition to whatever government unemployment benefits they were eligible for. By the early 1960s, such plans covered some two and a half million workers in the steel, rubber, garment, electrical equipment, and auto industries. As in the case of private pensions, supplementary unemployment benefits filled a vacuum created by the inadequacy of government welfare provisions and created pressure to improve them, in this instance leading many states to boost the unemployment insurance benefits they provided.
Many conservatives and businesses remained unreconciled to the growing strength of unionism. In the mid-1950s, midsize manufacturing firms, less able than the industrial giants to pass on rising labor costs to customers, took the lead in a new attack on organized labor, beginning with a political and ideological offensive against the union shop. The newly formed National Right-to-Work Committee framed the issue not as one of the balance of power between business and labor but as one of individual worker rights infringed by compulsory unionism.
Widely publicized hearings between 1957 and 1959 by the Senate Committee on Improper Activities in the Labor and Management Field furthered the notion that unions sometimes exploited the very people they claimed to represent. The committee, led by Arkansas Democrat John McClellan but largely driven by its counsel, Robert F. Kennedy, exposed corruption in a number of local and national unions. The giant Teamsters union came in for a drubbing, as mobbed-up local leaders paraded before the committee. Teamsters president Dave Beck was forced to resign and later went to jail for corruption, while his replacement, Jimmy Hoffa, faced relentless grilling and investigations that made him a national symbol of recalcitrant labor. Public opinion of organized labor, extremely positive before the hearings, fell sharply.
Hoping to capitalize, in 1958 conservative business leaders and Republicans put “right-to-work” referenda on the ballot in six states; if passed, they would have outlawed the union shop in major centers of industry, including Ohio and California. For its backers, the effort proved a disaster. Unions mobilized their members to defeat the ballot measures, succeeding everywhere but Kansas. A large turnout of unionists and liberals contributed to major Republican losses. In California and Ohio, Democratic gubernatorial candidates won in landslides against opponents who endorsed the union shop ban. In congressional races, the Democrats made their strongest showing since the height of the New Deal.
But the victory was short-lived. Divisions within the labor movement and the image of labor leaders as corrupt bosses paved the way for the 1959 passage of the Landrum-Griffin Act, which made only a few concessions to labor while putting in place a new level of government oversight most union officials did not want. The law contained a “bill of rights” for union members; required unions to hold regular, secret-ballot elections and file detailed financial reports; forbade picketing to demand union recognition; and tightened restrictions on secondary boycotts.
Parallel to the effort to check union power politically, employers also tried to check it at the bargaining table. The recession that began in 1957 led businesses to push for greater flexibility on the shop floor, increase workloads, resist union efforts for greater job security, and attempt to weaken or eliminate cost-of-living adjustments, resulting in a series of hard-fought strikes, including in the glass, coal, auto, and copper industries.
The most important clash took place in the steel industry. In 1959, the major steel companies, led by U.S. Steel, set out to undermine the power of the United Steelworkers of America, hoping to force the union to give up contract language that made it difficult for managers to reorganize production and reduce the size of the workforce without negotiating with the union. After the companies put forth demands they knew the union would reject, over half a million steelworkers walked off their jobs. In a remarkable display of solidarity, they stayed out for 116 days, the largest loss of workdays from any labor dispute in the country’s history, returning to their jobs only after the Eisenhower administration obtained a court injunction forcing them back. As it became clear that even then no settlement was near, Eisenhower and Richard Nixon pressured the steel companies to back down. The result was a smashing defeat for the companies, which agreed to a decent wage increase and effectively abandoned their effort to win the freedom to unilaterally change shop floor arrangements.
The postwar stream of wage increases and social benefits won by unions revolutionized working-class life. Heavily unionized midwestern manufacturing centers had some of the highest levels of homeownership in the country. Jack Metzgar, the son of a Johnstown, Pennsylvania, steelworker, recalled, “In 1946, we did not have a car, a television set, or a refrigerator. By 1952 we had all those things.” Union gains, along with improved government welfare programs, meant not only more money each week but the confidence to spend it, as families knew that government and employer benefits would provide security in the event of sickness, layoffs, or old age. When Metzgar’s mother suffered a series of heart attacks, medical bills forced the family to sell their house and move into a government project, but the Steelworkers’ health insurance plan allowed them to avoid financial ruin. As wage rates went up and security grew, working-class families found themselves able to send children to college, take vacations, and retire while still healthy, providing an ongoing stimulus to the economy through the greater consumption of goods and services. Metzgar, remembering the increased income, security, and sense of possibility that the Steelworkers’ union brought his family, wrote, “If what we lived through in the 1950s was not liberation, then liberation never happens in real human lives.”
Not everyone experienced liberation to the same extent. Over the course of the 1950s, the gap between wage rates for union and nonunion workers increased substantially. Even within the unionized sector, gaps grew. In 1947, workers in the heavily unionized but highly competitive apparel industry earned on average 71 percent of what autoworkers made; by 1965 that had fallen to just 45 percent.
Americans liked to believe that hard work paid off morally and financially, but to a much greater extent than they usually acknowledged, their standard of living reflected circumstances largely or entirely out of their control. A worker in a heavily capitalized, unionized, relatively noncompetitive industry, like auto or steel, brought home more money, received more benefits, and had greater security than a worker who happened to work in a more labor-intensive, nonunion, heavily competitive sector, like retail trade. Which type of job a person held rested, to a great extent, on their sex, race, and place of residence, with whites, men, and northerners having a disproportionate hold on the best jobs. Even education, increasingly touted as key to upward mobility, could not overcome the segmentation of the labor market and the discriminatory processes that slotted certain demographic groups into certain types of jobs. In 1959, the median income of white men with only a high school education exceeded that of African American women with a college degree by nearly 20 percent. And while union leaders saw their drive to improve their members’ benefits as promoting an upgrading of social welfare for all workers, it often only added to the social distance between those with extensive private protections and those without.
The Mechanics of Consumerism
A powerful array of cultural and commercial forces helped make mass spending possible. For the most part these forces were not new, but they reached greater sophistication and unmatched pervasiveness after World War II. Already, the national culture had largely repudiated the virtue of thrift. Both Keynesian economic thinking and commercial interests stressed the virtues of spending, not saving. Egging on the buyer were mass marketers who emerged from the war with well-established arsenals of selling techniques, including branded products, credit purchase plans, and extensive advertising.
New sources of consumer credit augmented rising income. Before the war, department stores, hotels, and oil companies had issued various kinds of payment cards and charge plates that could be used to make purchases. After the war, many stores introduced revolving credit accounts. In 1950, Diners Club went a step further when it introduced a credit card that could be used to make purchases from multiple merchants. By the late 1950s, various competing cards had been introduced, including the American Express card; Bank of America’s BankAmericard, which later evolved into the Visa card; and Hilton Hotels’ Carte Blanche. The tax code encouraged consumers to take on debt by allowing them to deduct interest payments in calculating their federal income tax. The percentage of tax returns claiming such deductions rose from under 3 percent in 1950 to over 30 percent ten years later. At the end of the 1960s, credit cards were still used for only a very small percentage of all purchases, dwarfed in dollar amount by other forms of consumer credit, like car loans and home mortgages, but the infrastructure was in place for what would become an explosion of credit card buying in the last decades of the twentieth century.
The emergence of discount stores also facilitated mass purchasing, making appliances, furniture, and other items previously sold through specialty or department stores affordable to young families setting up new households. E. J. Korvette, founded in 1948 with a single Manhattan store, perfected the model, generating a huge volume of sales through rock-bottom prices. Keeping costs down by providing a minimum of store amenities, Korvette’s learned how to make money even with very small markups. Soon it was building ever larger stores within an expanding radius from New York, locating many in rapidly growing suburbs. Other companies followed a similar trajectory. By 1960, the country had over thirteen hundred discount stores. Two years later, two giant variety store companies, Woolworth and Kresge, started discount chains of their own, as did the Dayton’s department store company, which launched its first Target store. That same year, in Rogers, Arkansas, Sam Walton opened his first Wal-Mart Discount City store.
Television, first publicly demonstrated at the 1939 New York World’s Fair, provided a potent new advertising medium to promote mass consumption. The percentage of households that owned a television set rose from just over 2 percent in 1949 to nearly 56 percent in 1954 and 90 percent in 1962. By the end of the 1950s, over a billion and a half dollars a year was going to television advertising. By choosing particular radio and television shows, advertisers could aim products at particular segments of the market. Superficially, rising working-class income allowed a kind of democracy of consumption, as categories of goods and services once reserved for the rich—appliances, cars, vacations, and the like—became widely accessible. But within each category, different types and grades of goods and services were produced for different economic strata, and often for different generational and cultural groupings as well. A working-class teenager might be able to buy a well-worn Chevy with savings from a summer job, but it took a hefty income to afford a new Buick, let alone a Cadillac or Lincoln, designed for the country club set.
Suburbanization
Suburbanization promoted consumer spending. It entailed spending not only on the dwellings themselves but also on the furniture and household appliances to put in them, and on all kinds of goods associated with suburban living.
At the end of World War II, the United States faced a huge housing shortage. Since the start of the Depression a decade and a half earlier, very few homes had been built. Returned servicemen and new families found it nearly impossible to find decent housing. Millions doubled up with friends or relatives or crowded into structures thrown up during the war as temporary shelters. After the father of future basketball star Kareem Abdul-Jabbar returned home from the Army, his family spent years “rooming” in a large apartment in Harlem, which they shared with six other tenants, before finally moving to a public housing project.
In spite of government incentives, postwar housing construction ramped up slowly, hampered by shortages of materials, a cumbersome system for distributing them, inefficient builders, and economic uncertainty. But by the late 1940s, a construction boom was under way. With well over a million new housing units—most of them single-family homes—being built annually, far above the pre-Depression rate, the landscape of the nation was rapidly transformed.
Between 1950 and 1970, over 80 percent of the population growth of the country took place in the suburbs, which went from housing thirty-six million people to seventy-four million. Cities grew at a much slower rate, with fourteen of the fifteen largest cities actually losing population between 1950 and 1960. Only in the South and West did urban populations shoot up, in many cases because cities annexed adjacent land. The proportion of Americans living in a metropolitan area outside of a city proper nearly matched the proportion living within city boundaries by 1960, and well exceeded it by 1970.
Large-scale developers took the lead in creating the new suburbia, a departure from the past, when small outfits operating in localized markets accounted for most residential construction. Metropolitan outskirts provided large, undeveloped tracts of the sort generally no longer available within city limits, on which developers could build many units. Turning farm fields and forests into suburban communities generated exceptionally large, one-time gains from land appreciation and the development process—part of a long national history of finding profitable opportunities on frontiers of one sort or another, where a way of life could be constructed from scratch. Unlike earlier developers who subdivided tracts, put in roads and utilities, and then sold off parcels to individual owners or small builders, the new breed of postwar developers bought large expanses of land, prepared sites, and then built and marketed houses themselves. The scale of the new developments was unprecedented: five thousand homes in Oak Forest, near Houston; eight thousand in Park Forest, outside Chicago; three thousand in Panorama City, California; 17,500 in Lakewood, California; 17,400 in Levittown, New York; 16,000 in Levittown, Pennsylvania; and on and on, all across the country.
Suburban developers took advantage of economies of scale and innovative production techniques. William Levitt gained national fame in the late 1940s by building a new community that ultimately housed eighty-two thousand people on what had been Long Island potato fields, twenty-five miles from Times Square. To keep down prices, he built small houses, just 750 square feet; put them on concrete slabs rather than basements; used nonunion labor; set up his own supply companies; and replaced traditional materials with cheaper ones, like plywood and composition board. He rented his first houses at only $60 a month, soon switching to selling them at the extraordinarily low price of $6,990.
Suburbanization and militarization, two of the great social trends of the twentieth century, had links to each other. Many of the techniques used to make suburban homes affordable had been developed during the war on military-related projects. William Levitt and his brother Alfred learned how to build fast and cheap putting up defense worker housing in Virginia and Hawaii, skills William honed as a Navy Seabee constructing airfields in the Pacific. Industrialist Henry J. Kaiser applied lessons he learned during the war turning out ships, steel, and defense worker housing to mass-produce postwar tract homes near Los Angeles. The federal government requirement that defense plants be built away from existing centers of population created a ready market for suburban homes. Lakewood housed workers from the naval station and Douglas Aircraft plant in nearby Long Beach and from other military contractors that clustered in Southern California. In Levittown, New Jersey, members of the armed services and their families occupied 12 percent of the houses. Veterans Administration loans, authorized by the GI Bill, also connected the military and suburbia. Levitt at first sold houses only to veterans, who had both a moral claim and a ready source of financing for new housing at a time when it still constituted a rare commodity. Nationwide, VA loans financed a sixth of all nonfarm homes built between 1945 and 1955.