A Patriot's History of the United States: From Columbus's Great Discovery to the War on Terror


by Larry Schweikart


  After Pan Am introduced a new transatlantic airliner designed specifically for passengers, the Douglas DC-7, Boeing matched it with the famous 707 jetliner, providing competition in passenger-aircraft production that drove down prices. The frequency of air travel even produced a new term in the language, a physical and mental malady called jet lag.

  It goes without saying that Americans could fly and drive because their paychecks purchased more than ever before. In 1955 the average income for all industries, excluding farm labor, topped $4,220 per year, and by 1959 that had increased by another 10 percent. A New Orleans salesman with a high school education earned $400 per month; a Chicago private secretary pulled in about the same; and a Charlotte housekeeper received about $140 a month. Marilyn Monroe could command $100,000 a picture—just a little more than the $90,000 annual income earned by Peanuts cartoonist Charles Schulz.67 Such high wages bought a lot of potatoes at $.51 per pound, or rice at $.19 per pound, and even new electronics gadgets like movie projectors were within reach at $89. Chevrolet’s Corvette—arguably the hottest car of the 1950s—had a sticker price of $3,670, or below the average annual salary in America.

  Cookie-Cutter America?

  The newfound freedom on the highways and airways held a threat as well as a promise, for although people could break the restraints of their geography, social class, background, and family more easily than ever, they also were exposed to new and unfamiliar, often uncomfortable social settings and customs. People responded by seeking a balance, embracing similar—almost uniform—housing on the one hand and enjoying their visits to other parts of the country on the other. The popularity of the famous Levittown subdivisions, where all houses were almost identical, has led some historians to mistake this need for order, and the cost advantages resulting from economies of scale, for an overarching quest for conformity. It was no such thing at all. Just as the adventurous pilot scans the landscape for a familiar topography every now and then, so too did Americans embrace individualism while they retained some sense of order.

  To see this, all one has to do is examine American travel patterns to observe how people eagerly entered into parts of the country that were in many ways foreign to them, even threatening. Yankees heading to Florida’s vacation spots for the first time crossed through the redneck backwoods of the Old South. Easterners visiting California often encountered Asians for the first time; and midwesterners taking new jobs in the Southwest were exposed to Indian or Mexican cultures and probably ate their first taco or tasted their first salsa. Even such things as housing—in Arizona few multilevel homes were built because of the heat—food, and beverages differed greatly from place to place. Southern iced tea, for example, was always presweetened, and so-called Mexican food in Texas hardly resembled Mexican food in California. Midwesterners, who battled snows and rains all winter and spring, had trouble relating to water politics in the West, where titanic struggles over the Colorado River consumed lawmakers and citizens.

  Food rapidly democratized and diversified, with the specialized dishes of the elites spreading to the middle class throughout the country. Soldiers who had come back from Italy had a yearning for pasta; New Yorkers who knew Coney Island learned the magic of the hot dog and took the concept with them as they traveled. Asian recipes moved inward from the coasts as Mexican cuisine surged northward. America’s eating establishments became the most richly textured on the planet, with the most varied menus anywhere in the world. Within thirty years, Jamaican hot peppers, Indian curry sauce, flour tortillas, lox, teriyaki sauce, Dutch chocolates, innumerable pasta variations, and spices of all descriptions flooded the shelves of American grocers, allowing a cook in North Dakota to specialize in cashew chicken, N’Awlins shrimp, or enchiladas. Not surprisingly, some of the most celebrated chefs to come out of this era drew upon their ethnic roots or experiences for their cooking. Martha Stewart (born Martha Kostyra) frequently prepared Polish dishes. And Julia Child, who worked in Asia with the Office of Strategic Services and then lived in Paris (where she learned to cook), had a broad firsthand exposure to foreign cuisine. Emeril Lagasse, another future star, born in the early 1960s, earned his chef’s apron in his parents’ Portuguese bakery.

  Far from a decade of conformity, as expressed in the lamentations of books on corporate America, such as William Whyte’s The Organization Man (1956) or Sloan Wilson’s The Man in the Gray Flannel Suit (1955), the population had entered a period of sharp transition where technology was the handmaiden of turmoil. These books and others, such as David Riesman’s The Lonely Crowd (1950), emphasized a shift from rugged individualism to a team or corporate orientation. Reality was quite different: American conformity in fact kept the sudden and difficult transitions of the postwar world from careening out of control. It is not surprising that the two most popular movie stars of the day—the establishment’s John Wayne and the counterculture’s James Dean—in different ways celebrated rugged individualism, not conformity.

  No one symbolized the effort to maintain continuity between 1950s America and its small-town roots and patriotic past more than painter Norman Rockwell (1894–1978), who in some ways was the most important and significant American artist in the history of the Republic. Born in New York, Rockwell left school in 1910 to study at the National Academy of Design. Almost immediately his work found an audience and a market. He painted Christmas cards before he was sixteen, and while still a teenager was hired to paint the covers of Boys’ Life, the official publication of the Boy Scouts of America.68 Setting up a studio in New Rochelle, Rockwell worked for a number of magazines until he received a job in 1916 painting covers for The Saturday Evening Post, a magazine Rockwell called the “greatest show window in America.” In all, Rockwell painted 322 covers for the Post, and illustrated children’s books before he began painting for Look magazine.

  Critics despised Rockwell because he presented an honest, yet sympathetic and loving, view of America.69 He insisted on painting those scenes that captured the American spirit of family—independence, patriotism, and commitment to worship. Inspired by one of Franklin Roosevelt’s speeches, Rockwell produced his masterpieces, the Four Freedoms, which ran in consecutive issues of the Post in 1943 along with interpretive essays by contemporary writers. Freedom from Want was inspired by his family’s cook presenting a turkey at Thanksgiving. Freedom of Speech, possibly the best known Rockwell painting of all, featured a small-town meeting in which a laborer in a brown leather jacket speaks with confidence about a bill or proposal tucked in his pocket.

  Rockwell did not ignore the serious deficiencies of American society. His 1964 Look painting The Problem We All Live With remains one of the most powerful indictments of racial discrimination ever produced. Depicting the desegregation of a New Orleans school district in 1960, Rockwell painted a little black girl, Ruby Bridges, being escorted into the formerly all-white school by four federal marshals. The wall in the background has the splattered remains of a tomato just under the graffito nigger that appears above her head. New Kids in the Neighborhood (1967) pictures a moving van with two African American kids standing beside it—the new kids staring at three white children who are looking at them with curiosity, not anger or fear.

  Rockwell’s paintings capture a stability in a sea of unraveling social and regional bonds. Religion tried to adapt to these changes but failed. It took outsiders, such as Billy Graham and Oral Roberts, to cut through the serenity, comfort, and even sloth of the mainstream religions to get Christianity focused again on saving the lost and empowering the body of Christ. Clinging to stability and eschewing change came at a price: the lack of passion and avoidance of contention in many denominations triggered a staggering decline in membership. One researcher found that starting in 1955, the Methodist Church lost an average of a thousand members every week for the next thirty years.70 In the mid-1950s, churches responded by becoming more traditional and turning down the doctrinal voltage.

  As religion grew less denominationally contentious, thus making it less important to live near those of a similar denomination, Americans found one less impediment to relocating to other cities or regions of the country. The market played a role in this sense of regional familiarity too. Entire industries sprang up to meet the demands of an increasingly mobile population. For example, Kemmons Wilson, a Tennessee architect, traveled with his family extensively and was irritated by the quality of hotels and the fact that most hotels or motels charged extra for children. Wilson and his wife embarked on a cross-country trip in which they took copious notes about every motel and hotel where they stayed: size of rooms, facilities, cost, and so on. He then returned home to design the model motel of optimal size, comfort, and pricing—with kids staying free. The result—Holiday Inn—succeeded beyond Wilson’s wildest dreams. By 1962, Wilson had 250 motels in some 35 states. Wilson saw standardization as the key. Each Holiday Inn had to be the same, more or less, as any other. That way, travelers could always count on a “good night’s sleep,” as he said later. Americans’ quest for familiar products, foods, and even fuel and music in an age of growing mobility produced a vast market waiting to be tapped.71

  Ray Kroc saw that potential. A middle-aged paper-cup salesman who had invented a multiple-milk-shake mixer, Kroc was impressed with a California hamburger stand owned by a pair of brothers named McDonald. He purchased the rights to the name and the recipes and standardized the food. All burgers, fries, and milk shakes at all locations had to be made in exactly the same way. In 1955 he opened the first McDonald’s drive-in restaurant in Des Plaines, Illinois, replete with its characteristic golden arches. After five years, there were two hundred McDonald’s restaurants in the United States, and Kroc was opening a hundred more per year.72 By the twenty-first century, “fast food” had become a derogatory term. But fifty years earlier, when truckers planned their stops at roadside truck cafés, the appearance of a McDonald’s restaurant in the distance, with its consistent level of food quality, brought nothing but smiles.

  What Norman Rockwell had done for canvas, Kroc and Wilson did for food and lodging, in the sense that they provided buoys of familiarity in a sea of turbulence and international threats. Americans needed—indeed, demanded—a number of consistent threads, from music to meals, from autos to dwellings, within which to navigate the sea of transformation in which they found themselves.

  The Invisible Man

  One of the main arenas where Americans confronted radical change in the 1950s was in race relations. The continued injustice of a segregated society in which black people were either second-class citizens or, in more “sophisticated” cities, merely invisible, had finally started to change. Ralph Ellison’s novel Invisible Man eloquently captured the fact that to most white Americans, blacks simply did not exist. Television shows never depicted blacks in central roles; black or “nigger” music, as white-dominated radio stations called it, was banned from playlists (as was Elvis Presley, who disc jockeys thought was black, early on). One could search in vain for African American executives heading major white-owned companies.

  Few blacks were even remotely equal to whites in economic, political, or cultural power. This situation existed across the nation, where it was winked at or deliberately ignored by most whites. But in the South racism was open and institutionalized in state and local laws. Since Plessy v. Ferguson the doctrine of “separate but equal” had been applied to southern public facilities, including schools, transportation, public restrooms and drinking fountains, and in the vast majority of private restaurants and in the housing market. On municipal buses, for example, blacks were required to give up their seats to whites, and were always expected to go to the back or middle of the bus. Segregation of the races divided everything from church services to whites-only diners. State universities in many southern states would not admit blacks, nor was any black—no matter how affluent—permitted to join country clubs or civic groups. Indeed, even as late as the 1990s, when the black/Asian golfer Tiger Woods became the youngest pro golfer to win the Masters, he was prohibited from joining some of the private golf clubs at which he had played as part of the Professional Golfers’ Association tour. Also in the 1990s, famous televangelist pastor Frederick K. C. Price was not invited to speak at certain churches because of his skin color.

  Large numbers—if not the vast majority—of whites entertained some racial prejudices if not outright racism. Confederate flag-wavers, white-robed Ku Klux Klansmen (whose organization had plummeted in membership since the 1920s), and potbellied southern sheriffs still stood out as not-so-comical symbols of white racism. Equally dangerous to blacks, though, were well-meaning whites, especially northeastern liberals, who practiced a quiet, and perhaps equally systematic, racism. Those northern white elites would enthusiastically and aggressively support the fight for civil rights in the South while carefully segregating their own children at all-white private schools. They overwhelmingly supported public school systems with their votes and their editorials, but insulated their own children from exposure to other races by sending them to Andover or Sidwell Friends. Few had personal acquaintances who were black, and fewer still, when it was in their power, appointed or promoted blacks to corporate, church, or community positions.

  Not surprisingly, this subterranean prejudice was at its worst in liberal meccas such as Hollywood and New York City, where television production headquarters selected the programming for virtually all TV broadcasting in the 1950s and early 1960s. With the notable exception of the radio show Amos and Andy—whose actors were actually white!—black television characters were nonexistent except as occasional servants or for comic relief or as dancers. There were no black heroes on television; worse, there were no black families. Black children did not have many good role models on television, and those African Americans they did see were seldom entrepreneurs, political leaders, or professionals. Perhaps not surprisingly, the wholesale exclusion of blacks from large segments of American society made African Americans suspicious of the few who did achieve positions of importance in white business or culture. Ellison’s Invisible Man appropriately captured white America’s treatment of more than 10 percent of its population.

  Hardly in the vanguard of civil rights, Eisenhower shielded himself from controversy behind the separation of powers. His position, while perhaps appropriate at times, nevertheless contradicted the constitutionally protected civil rights of blacks and cemented the view among black politicians that their only source of support was the Democratic Party. It is ironic, then, that two key events in America’s racial history occurred during Eisenhower’s presidency. The Legal Defense and Educational Fund of the NAACP (National Association for the Advancement of Colored People), led by its director, attorney Thurgood Marshall, earlier had started to take on the “separate but equal” Plessy decision. Marshall had laid the groundwork with a Texas case, Sweatt v. Painter (1950), in which the Supreme Court found that intangible factors, such as isolation from the legal market, constituted inequality. The real breakthrough, however, came in 1954 through a case from Topeka, Kansas, in which the Supreme Court’s ruling in Brown v. Board of Education overturned Plessy v. Ferguson and prohibited state-supported racial discrimination.

  The Reverend Oliver Brown, whose daughter Linda had to walk past a white school to catch her bus to a black school, had brought a suit against the Board of Education of Topeka, Kansas.73 The board argued that its schools were separate, but equal (à la Plessy). In 1953, President Eisenhower had appointed a Republican, Earl Warren of California, as chief justice. This brought about a shift in the Court against Plessy, with the justices ruling that separate educational facilities were inherently unequal. A year later, the Court required that states with segregated districts (twenty-one states and the District of Columbia) desegregate with “all deliberate speed.” In 1956, Southern states dominated by the Democrats issued a defiant “Southern Manifesto,” in which nineteen senators and eighty-one congressmen promised to use “all lawful means” to reinstate segregation.

  The Court’s language and rulings after the case generated confusion, uncertainty, and resistance. Racist whites would argue that they were moving with “all deliberate speed” decades after the decision. Equally damaging on the other end of the scale, the Court had stated that segregation “generates a feeling of inferiority” by blacks within the community, which implied that the only way blacks could overcome “inferiority” was to “sit next to whites”—a position that by the 1980s, blacks came to ridicule. It further suggested that in any situation, even voluntary arrangements in which some preordained proportion of races was not achieved, it would “generate a feeling of inferiority.” Eisenhower thought the Brown decision set back racial progress, arguing that real integration could not be brought about by force.

  White resistance to integration involved citizens councils, organizations that threatened blacks whose children attended white schools with economic retaliation, job loss, and other veiled intimidation. States that had made up the border areas in the Civil War—Maryland, Delaware, Kentucky, Missouri, Oklahoma, Kansas—grudgingly began to desegregate. Farther south, though, in the heart of Dixie, a full-scale offensive against desegregation ensued. Latching onto the Supreme Court’s wording of “all deliberate speed,” the Deep South engaged in a massive foot-dragging campaign.

  In Little Rock, Arkansas, the city school board admitted nine black students to Central High School. In response, Governor Orval Faubus, citing the likelihood of violence, encircled the high school with national guard troops to prevent the students from entering. Eisenhower, after conferring with Faubus, concluded the Arkansas governor would remain intransigent. A federal court order forced the national guard to withdraw, at which point the students again sought to enter the school. White mobs threatened to drag the students out and intimidated the authorities into removing the black students. A stunned Ike, who had only months earlier said he could not imagine sending federal troops to enforce integration, nationalized the Arkansas Guard and sent in a thousand paratroopers to ensure the students’ safety. Faubus then closed the schools, requiring yet another court ruling to pry them open. Further efforts at “massive resistance,” a phrase coined by Democrat senator Harry F. Byrd of Virginia, led to state attempts to defund desegregated schools. But state and federal courts held firm, and supported with minimal enthusiasm by Eisenhower and then by Kennedy, the segregated structure finally began to fracture.

 
