The 1950s are sometimes thought of as being morally conservative—and in many ways they were, compared with what happened a generation later. But they can also be seen as a time of emerging permissiveness. World War II helped to loosen acceptable standards of behavior in all sorts of ways so that values were rapidly changing. The sexual revolution of the 1960s can be seen as an acceleration of trends that already were well under way. In 1953, Hugh Hefner launched Playboy magazine, propelled in its first issue by what became an iconic nude photograph of Marilyn Monroe. Some feared that publication of the photograph would ruin her movie career, but instead it helped to raise her to superstardom. Another sign of the times was that in 1956, Grace Metalious’s first novel, Peyton Place, became a runaway best-seller, with more than 8 million copies in print by 1958. Peyton Place, like the Kinsey Reports, was not great literature, but it had a similar attraction in making public the kinds of sexual secrets, affairs, and premarital pregnancies that everyone knew about but was not supposed to talk about in small-town America. By 1958, an American publisher dared to release Vladimir Nabokov’s Lolita, the story of a middle-aged man’s affair with a girl in her early teens. The novel, which was exceptional as literature, had been completed in 1953 but turned down by American publishers. It was published in France in 1955 but subsequently suppressed in both Great Britain and France. Yet it passed without censorship in the United States in 1958 and became yet another best-selling sensation. By the next year, a judge had allowed the American publication of the original version of D. H. Lawrence’s Lady Chatterley’s Lover, a 1928 work that had long been banned as pornographic. American movies were still relatively tame, largely because of censorship that was especially responsive to Roman Catholic concerns, but by the end of the decade American public culture was on the verge of definitions of freedom that in a few years would include sexual freedom.12
Clearly, then, the experts did not create the revolution in mores. Rather, as was evident in the case of Kinsey, they simultaneously reflected the trends already in progress and provided them with scientifically based authority. Sexual mores were changing for many reasons, including increased social mobility, massive commercial interests that used sexual innuendo to sell products, and the simple propensities of human nature. Individual freedom was a long-standing ideal, and many people found it attractive to add sexual freedom to their list of natural rights. Nevertheless, scientific and social scientific studies were also providing sanctions for progressive views that said that individual self-determination was a preeminent value.
Arguably, the most influential expert in all of this was Dr. Benjamin Spock. His book The Common Sense Book of Baby and Child Care, first published in 1946, quickly became a best-seller, and eventually, “Dr. Spock,” as the book was known, became the seemingly universal child-care book of the era. It was so widely recognized that it served as the basis of a controversy between Lucy and Ricky in an episode of I Love Lucy. And like the other experts, Spock provided scientific authority that both reflected and shaped America’s increasingly individualistic mores.13
Dr. Spock was a particularly appealing scientifically based expert because he questioned the authority of experts. Child-rearing experts of the previous generation had promoted overly technical models for infant care, stressing the necessity of routine and discouraging old-fashioned instinctual practices, including breast-feeding. Dr. Spock’s opening message, by contrast, was quintessentially American and midcentury: “Trust yourself.” He assured new parents, in his preamble, “You know more than you think you know.” The experts may have calculated scientifically based routines for your child, but new parents should not be overwhelmed. They should trust their common sense. And the counterpart to trusting yourself was to trust your child. As if to counter any lingering Calvinism, Spock, a New Englander himself, assured readers, “Your baby is born to be a reasonable, friendly human being.” That philosophy translated into advice to abandon the harsh methods of discipline that had been common in child rearing up to that time. Parents should not shame their children; spankings should be rare, at most; and far more could be accomplished by demonstrations of love than by punishment, which could create resentment and maladjustment. Dr. Spock’s outlook combined elements of B. F. Skinner and Carl Rogers: one should use positive reinforcement techniques, but also trust oneself and trust one’s child.14
[Cartoon: Sydney Hoff, May 24, 1958, The New Yorker]
These trends that were so characteristic of the 1950s would have a lasting cultural impact. “Trust yourself” had been around at least since the days of Ralph Waldo Emerson, but at midcentury the advice came not just from exceptional individualists, but from almost everywhere, and with the authority of modern science. Social scientists of many stripes were proclaiming that conformity was the problem and nonconformity the solution, and so was almost everyone else, including contemporary artists, novelists, playwrights, poets, humanists, existentialists, and an assortment of pundits and clergy.
The meaning of life, everyone seemed to agree, could be found not by looking to tradition or to community, either past or present, but rather, by looking within. This midcentury consensus recommended that people free themselves, often with the authority of modern science, from traditionalist moralities and mythologies. Individual development, individuality, and self-fulfillment should be preeminent goals.
The resulting outlook has been nowhere better characterized than it was in the late 1970s by Christopher Lasch as “the culture of narcissism.” Lasch, who came of age in the 1950s, grew up in a politically progressive family and spent his career critiquing the defects of the progressive liberal culture that had emerged since midcentury. Outlooks such as those of Carl Rogers were especially subject to his withering gaze. “Economic man himself,” he wrote, “has given way to the psychological man of our times—the final product of bourgeois individualism. The new narcissist is haunted not by guilt but by anxiety.” By way of contrast, he said, nineteenth-century American individualism and the cult of success were not so much based on competition as on “an abstract ideal of discipline and self-denial.” In the twentieth century, success became increasingly defined as victory over one’s competitors, so that people “wish to be not so much esteemed as admired. They crave not fame but the glamour and excitement of celebrity.” The new media greatly enhanced such attitudes. Advertising had created a culture in which people were more fascinated by images of things than they were by the things themselves.
Yet this new narcissistic culture, as Lasch described it, was not purely individualistic, because it also had to suit the needs of bureaucratic capitalism, which helped produce it. Bureaucratic culture (or that regulated in detail by administrators) came to favor the person who, much like David Riesman’s “other-directed man” or William Whyte’s “organization man,” was skilled at manipulating personal relationships, but had no deep attachments. Society also developed what Lasch characterized as a “new paternalism”—directed by the expert, often supported by the government, and offering scientifically based advice on almost every conceivable dimension of human activity. B. F. Skinner might be seen as at least a minor prophet of that side of the culture. Almost every waking moment, in play as much as at work, in personal relationships, in home life, and in matters of health, was considered to be at its best when it was guided by the benevolent advice of the expert.15
As Lasch and others have observed, the two sides of the emerging culture, both narcissistic individualism and regulation by bureaucrats, are not as contradictory as they might seem, but can be complementary. When communities become almost entirely ad hoc—that is, they exist for particular limited purposes, whether for work, personal relationships, or recreation—the individual can reign supreme as at least in principle the designer of his or her own “lifestyle.” Work may be governed by administrators and technical methodologies, but work also has the potential to provide one with the financial means and freedom to select one’s own set of lifestyle activities. What frequently becomes problematic in this often attractive arrangement is the question of meaning. To what extent can meaning be found in the endless quest for competitive advancement, consumer goods, and entertainment? In a culture of self-fulfillment, what is actually fulfilling? As sociologist Robert Bellah and his coauthors wrote in their classic book Habits of the Heart, describing the prevailing American culture as it had evolved by the 1980s, that culture revolved around two poles: the manager and the therapist. “The goal of living,” observed Bellah, “is to achieve some combination of occupation and ‘lifestyle’ that is economically possible and psychically tolerable, that ‘works.’ The therapist, like the manager, takes the ends as they are given; the focus is on the effectiveness of the means.” As Bellah put it, the “center is the autonomous individual, presumed able to choose the roles he will play and the commitments he will make, not on the basis of higher truths but according to the criterion of life-effectiveness as the individual judges it.”16
At midcentury, two of the great ideals inherited from the Enlightenment era—faith in scientific instrumental reason and faith in the individual—were among the most widely shared beliefs in the culture. In the upheavals of the late 1960s, many young people protested against the seeming contradictions of the two, as in countercultural outcries against the technocracy of capitalism and the military, against scientific planning and control, and against the myth of objectivity in Western linear thought—all in the name of individual freedom. But when, a few years later, members of the countercultural generation settled down and got jobs, they reentered a culture in which the two ideals, scientific instrumental reason and individualism, were both still alive and well. Rather than being shaped by a community or a tradition, people might choose their own lifestyle. In such a society, the two ideals could be complementary, at least practically speaking. In “the culture of narcissism,” free individuals would be guided by the manager and the therapist.
FIVE
The Latter Days of the Protestant Establishment
At the same time that faith in individual autonomy and the authority of science was standard fare in so much of midcentury American culture, the United States was experiencing one of the most widespread religious revivals in its history. Record numbers were attending religious services of almost every type and level, from those who went to tent revivalists for healing to prosperous old-line Protestants, from fans of Billy Graham to devotees of Reinhold Niebuhr, from hyper-biblicist sects to broad spiritualists, from the white and the black rural South to the urban ethnic neighborhoods of Roman Catholics or Jews, and including varieties of other Christian and non-Christian beliefs and practices. Millions read Rabbi Joshua Loth Liebman’s 1946 number-one best-seller Peace of Mind and the Reverend Norman Vincent Peale’s 1952 counterpart, The Power of Positive Thinking, both of which promised religiously based self-fulfillment. In 1952, a record 75 percent of Americans responding to pollsters said that religion was “very important” in their lives. And in 1957, more than four out of five affirmed that religion was not “old fashioned and out of date,” but rather, “can answer today’s problems.” Between 1950 and 1960, church affiliation jumped an amazing 14 percentage points, from 55 percent of the total population to 69 percent. By the end of the 1950s, attendance at religious services similarly reached an all-time peak.1
The intriguing question that emerges, then, is this: How did these two simultaneously huge cultural trends—the consensus outlooks celebrating both scientific authority and autonomy, and the religious revival—fit with each other? More broadly, how could a culture that was so modern, secular, and antitraditional in so many of its practices and ways of thinking be at the same time so religious? How did so much religion fit with the rest of the cultural mainstream? How such questions typically were addressed in the 1950s has important implications for the subsequent rise of the religious right by the late 1970s. Such questions also lead into the culminating theme of this book: how best to accommodate a variety of religious viewpoints in pluralistic America.
The story of religion in the 1950s has many dimensions, but at its center is the continuing heritage of cultural leadership of the mainline Protestant churches. These were the predominantly northern, white, Protestant denominations, such as Episcopal, Congregational (United Church of Christ), Presbyterian, American Baptist, United Methodist, Disciples of Christ, various sorts of Lutheran churches, and others, that were regarded as constituting an informal religious “establishment.” That is, even though America had not had “established” state churches supported by taxes since its early days, Protestant Christianity still held a privileged place in the culture as the predominant religion. Mainline Protestant leaders were part of the liberal-moderate cultural mainstream, and their leading spokespersons were respected participants in the national conversation.
Protestantism had played a complementary role to more secular outlooks in public life throughout US history, and many Protestants were, of course, eager for this role to continue. It was a role that was embodied, for example, in the US Constitution. Written with remarkably little religious language for the time, the Constitution defined the federal government in a practically secular way. At the same time, the framers took care in the religion clauses of the First Amendment (“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof”) to guarantee that religion might flourish in many supplemental capacities—even in tax-supported established churches in some New England states. Although the early republic was not a “Christian nation” in the sense that some conservative Christians today claim, neither was it wholly secular. Protestant Christianity retained many public privileges. Some of these were ceremonial and others were substantial. Education, which in Christendom had always had a conspicuous religious component, continued to include Protestant teachings. In the mid-nineteenth century, for instance, even state universities had required chapel attendance and were likely to have clergymen as presidents. Protestants could also form voting blocs large enough to shape religiously based legislation, as in Sabbath laws, or in promoting various social and moral reforms. The last great manifestation of that public influence was in the movement for prohibition of the sale of alcoholic beverages, culminating in the Eighteenth Amendment in 1919.
By the 1950s, although Protestants retained disproportional influence, the question loomed as to how such influence might continue. Prohibition itself had brought strong reactions against allowing one religious group to impose its restrictive teachings on everyone else. Moreover, as it was increasingly recognized that the nation included various religious, secular, and simply profane outlooks, the prospects for specific religious teachings to continue to play a role in shaping a public national consensus were looking increasingly problematic.
A prominently proposed solution to this problem was the one offered by Protestant “modernism.” A modernist was one who saw God’s work continuing to be revealed through the best developments of modern times. During the 1920s, American Protestantism became sharply divided between fundamentalists, who militantly insisted on holding onto strictly biblical teachings, and modernists, who believed that the best way to preserve Christianity was to allow it to grow with the best thought and moral ideals of the modern world.
The continuing influence of Protestant modernism in the 1950s might be illustrated by looking at various clergymen, theologians, and popular religious writers, but the best example is found in someone who was none of these, but far more influential: Henry Luce, head of the Time, Inc., publishing empire. The key to understanding Luce is that he was the child of Presbyterian missionaries, and he always remained a missionary to and for America. Born in China in 1898, he attended Yale in the World War I era. As a student he was active in Yale’s famously evangelical Dwight Hall, where he sometimes preached. Although he remained religious after college, his views concerning the essence of Christianity had radically changed. In response to the challenges of modern science and modern thought, he abandoned more traditional forms of Christianity and emerged as a quintessential Protestant modernist. During the next decades, Luce continued to be an active lay Presbyterian, occasionally preaching to church and college groups. He used Time and Life to keep religion in the news, and he probably did as much as anyone in the era to sustain the idea that religion was still part of the American cultural mainstream. Although Luce’s views were explicitly theistic, his application of them involved no consistent distinction between the church (and similar religious groups) and American society. America was, in effect, his church, and America’s mission to the world was Henry Luce’s Christian mission.2
In 1955, Luce’s influential business magazine, Fortune, marked its twenty-fifth anniversary by publishing a series of articles by leaders in various fields speculating on what America would be like twenty-five years hence, in 1980. The projections concerning America’s economic future were, as one might expect in 1955, exceedingly upbeat. Experts anticipated continuing economic growth that would lead to a doubling of family incomes and a shortening of the workweek. As an aside, the most fascinating predictions concerned the American energy situation in 1980. A number of the authors were confident about harnessing the atom for peaceful purposes. David Sarnoff, president of the Radio Corporation of America, wrote, for instance, “I do not hesitate to forecast that atomic batteries will be commonplace long before 1980,” and that small atomic generators would be installed in homes. In another of the articles, John von Neumann of the Atomic Energy Commission went so far as to suggest that “a few decades hence energy may be free—just like unmetered air.”3