“Put it on that stack behind you,” the receptionist said, pointing to a mound of tumbling boxes, envelopes, and towers of paper already creeping up the wall. A polite rejection soon followed.
He put aside his aspiration to be the next Fitzgerald and got a PhD in sociology at Yale in 1951. Later, he joined the staff at the Russell Sage Foundation, a nonprofit devoted to strengthening social science methods and theory, and became its president in 1963. Brim wondered how children internalize the values and behavior of their family and community. As he watched his young subjects, he realized that their parents were going through a similar process of socialization, learning how to be mothers and fathers. Adults increasingly claimed his attention, and as president of Russell Sage, a post he held until 1972, Brim directed research dollars toward establishing middle age as a bona fide field of study. One seed he planted was at the nonprofit Social Science Research Council, which organized committees to focus on cutting-edge topics. In 1973, he chaired a panel on Work and Personality in the Middle Years. This committee and others that followed gathered leading scholars like Paul Baltes and Glen Elder, who were trying to devise a better model to explain the continual process of growing up.
The approach that ultimately took shape has come to be known as life course or life span development theory. Despite many variations, at its core is the idea that human development is flexible and never-ending. Over a lifetime, shifts in behavior and personality are powered by an unpredictable combination of biology, timing, choice, chance, relationships, history, and geography—the decision to move to a city, the unplanned third child, an unforeseen financial bubble or military conflict.
None of these elements is particularly surprising. Common sense would lead most people to the same conclusions, but as a theory the notion was messy and unwieldy, spilling over the boundaries of any single discipline and any defined age group, characteristics that made it extremely difficult to research, test, or prove. Over time, however, it has upended our understanding of how human beings develop. In 1998, the psychologist Anne Colby called the establishment of life span theory “one of the most important achievements of social science in the second half of the 20th century.”
The idea nonetheless encountered stalwart resistance. Brim remembered the reaction to a 1980 collection of essays he co-edited with the psychologist Jerome Kagan titled Constancy and Change in Human Development. “Most Western contemporary thought” rejects the possibility of growth in adults, they wrote, but “in the new field of life span development, research on middle-aged and older persons indicates that personality and behavior are more malleable than most people think.”
Their colleagues who studied children responded as if they were evolutionary biologists who had just been told that Homo sapiens evolved from chickens instead of apes. When Brim and Kagan presented their ideas at a large conference on child development after the book came out, Eleanor Maccoby, the field’s grande dame and a proponent of biological influences, stood up. “You and Jerry Kagan are fouling the nest of child development,” Brim recalled her saying with contempt. The audience clapped enthusiastically. “And as we were leaving the room,” Brim added, “no one would speak to us.”
The insistent debate over how much people are capable of changing in middle age was as much about human nature as it was about academic theory. Which holds sway: God or man, nature or nurture, biology or environment? The questions reflect two divergent views of humanity: one that maintains we are prisoners of our genetic inheritance; the other that sees an enormous potential for change.
Nineteenth-, twentieth-, and twenty-first-century thinkers have frequently held that human beings are bound by inborn limits. The neurologist Pierre-Paul Broca, the man who founded the Anthropological Society of Paris in 1859 (the same year that Darwin published On the Origin of Species), was convinced the size of the brain limited intelligence. After spending months in the dark morgues of Paris hospitals weighing autopsied brains, Broca used his measurements to support the conventional wisdom that men were smarter than women. Freud’s theory that adult personality was permanently formed in the first years of life similarly rested on a belief in natural limits, as did the notion that men should not squander their finite supply of the vital “male principle” (semen), or that a human being was physiologically incapable of running faster than one mile in four minutes. Assumptions about natural limits underlay negative views of middle age and beyond—“the fixed period” that Anthony Trollope lampooned in his 1882 novel and that William Osler called the “fifteen golden years of plenty.”
Notions of preprogrammed limits, often attributed to genetics, remain popular. A theory bandied about in the late 1990s was that the brain’s critical wiring was completed by age three. At a 1997 White House conference on this issue, Hillary Clinton, then first lady, declared that early experiences “can determine whether children will grow up to be peaceful or violent citizens, focused or undisciplined workers, attentive or detached parents themselves.”
Evidence may prove the fallacy of inherent limits, as when Roger Bannister broke the four-minute-mile barrier in 1954, or when scientists discovered men did not run out of semen. But the belief often persists in spite of the facts because it is based on philosophy or politics. Liberals and conservatives both invoke theories of intrinsic limits, either to argue that there is no point in trying to improve immutable characteristics, as Charles Murray and Richard Herrnstein, the authors of The Bell Curve, said about IQ, or to press for greater government intervention, as Hillary Clinton did on behalf of young children.
Ideas about middle age can be just as stubborn, for they, too, are influenced by intellectual fashion and unrecognized bias.
Middle Age Is Official
As the eighties progressed, middle age, the neglected wallflower of development, attracted more and more attention. In 1980, the year Brim and Kagan’s book Constancy and Change came out, four hundred scholars signed up to receive a nationwide list of research projects on midlife that the Social Science Research Council compiled. By 1984, the sign-up sheet had grown to eight hundred.
During those years, Brim and Baltes recruited ninety-five scholars to contribute to a six-volume series on lifelong development. “They were the first new publications of their time. There was no professional journal then about midlife and they were the principal resource around the early 1980s for ideas about midlife development,” Brim said. “This work was the avenue that led to the MacArthur Foundation program in 1989.” That was when Brim, who was receiving money for research on human development from MacArthur, noticed that the foundation had a task force—what it labeled a network—on children, another on adolescents, and a third on successful aging in older Americans. Absent from that lineup was middle age.
“It’s obvious,” Brim recalled telling the director of MacArthur’s health program. “You guys are missing the whole midlife period. I mean it’s not there. You need a network on midlife development.”
The foundation agreed to create one, with Brim at the helm. Life span theory informed the research design, methods, and goals. No other significant national survey linked information-gathering to a particular psychological and social theory. To accomplish this task, Brim enlisted scholars from fields that had never bothered with midlife before. This cocktail of different disciplines produced a unique national survey. The team combined techniques to acquire breadth and depth. They gathered a large random representative sample of the population to question (an approach favored by sociologists, epidemiologists, and demographers), and arranged for small groups to be interviewed and observed at length (as is preferred by psychologists and anthropologists).
Interest in midlife had been percolating for a while, but securing the MacArthur Foundation funding was a turning point. The emerging political significance of a particular segment of the population can be traced through its mention in public pronouncements and programs. Political parties in the nineteenth century adopted platforms that promised to provide specific protection for children, citing their unique needs and vulnerabilities. In 1909, the same year that Ellen Key’s Century of the Child was published in English, President Theodore Roosevelt hosted the first White House conference on children, which led, a few years later, to the creation of a federal Children’s Bureau. By 1944, both political parties had added teenagers as a group deserving special consideration. The elderly’s addition to the public agenda was signaled by the Kennedy administration’s decision in 1961 to hold a presidential conference on the status of seniors. By 1962, every state in the nation had an agency on aging, and in 1963 the White House designated May as Senior Citizens Month.
In the 1980s, foundations took the lead in directing research and establishing new fields of study, and the MacArthur project sent a signal. “That really did put middle age on the map,” Brim said. It was as if a fringe hobby suddenly turned into an Olympic event. Brim deserves a lot of the credit. David Featherman, a leading expert on aging and a former president of the Social Science Research Council, pronounced that “this enterprise established the legitimacy of a new developmental life period.”
Midlife Without the Crisis
The intensive information-gathering from the nearly 7,200 participants between the ages of 25 and 76 occurred in the mid-1990s as researchers distributed questionnaires, conducted phone interviews, ran lab tests, and convened roundtables for intensive discussions.
At the end of the ten-year period, the MacArthur project—later included under the MIDUS banner—offered a detailed snapshot of midlife in America. The results reflected the more comprehensive and fluid life course perspective. “Midlife is more flexible than are childhood and old age,” Brim reported. “It is less driven by a biological clock that causes the changes in childhood and old age. The same events of midlife can occur at different ages for different people. We know, too, that there is no set order, no series of stages, in which these familiar events may occur. There are different sequences for different people. Thus, though most of us will share the events of midlife, there is no single path that we all take.”
Brim’s team discovered that middle age wasn’t so bad after all. The despondent empty nester, the crisis-ridden sports-car buyer, the stubborn closed-minded matron turned out to be much rarer than depicted in the world of prime-time television, films, and advertisements. “New Study Finds Middle Age Is Prime of Life,” the New York Times announced when the results were released in 1999; “Study finds midlife ‘best time, best place to be,’” the Chicago Tribune wrote. The Washington Post devoted part of a special section to the results titled “Midlife Without the Crisis.” The ambitious project helped to transfigure the way we think of the middle-aged mind and body.
The research debunked a number of popular shibboleths. The notion that men abandon middle-aged spouses because they are programmed by evolution to pursue fertile, wrinkle-free maidens was discredited by the fact that most breakups occurred during the first eight years of a marriage. Divorce in midlife was relatively rare; when it did happen, women were more than twice as likely as men to have initiated it. Eight out of every ten men in middle age rated their marriages good or excellent, as did more than seven out of ten women.
For most women, menopause was a nonevent, just as Beauvoir predicted it would be once women were no longer confined to maternal roles. Nearly sixty-two percent said they felt “only relief” when their periods stopped, while fewer than two percent said they felt “only regret.” Another twenty-three percent said they had no particular feelings about it one way or the other. Women also appreciated having greater control of their sex lives in midlife, with less pressure from men and fewer worries about getting pregnant, something often overlooked when the passing of youthful sex is lamented. Virtually no evidence existed that women experienced a diagnosable “syndrome” when children left home. An empty nest was simply another one of the stresses of parenthood, like hearing your baby cry herself to sleep or allowing your teenager to borrow the car, and one that has eased in recent years with the proliferation of cell phones and Skyping. In many cases the pangs of loss were countered by newfound opportunities for mothers to develop their own talents and interests. Men and women in the survey celebrated the independence and freedom they gained when their children left and noted that their marriages often improved.
The MacArthur network’s innovative search for positive experiences enabled researchers to see how a narrow focus on disease and dysfunction had skewed results in the past. By testing for the six components of well-being, or eudaimonia, discussed in chapter two (a feeling of control, positive relationships, personal growth, a sense of purpose, autonomy, and self-acceptance), they discovered that pluses could counteract minuses. So while middle-aged women might have higher rates of depression than men, they also reported better relationships and more personal growth, which strengthened their psychological resilience.
Although many people made it through middle age in good health, problems with high blood pressure, cholesterol, arthritis, and expanding waistlines sometimes made their debut. Not having enough energy, sleep, and time were also frequent complaints. Offsetting these deficits, though, was delight in the feelings of control, experience, and being settled that middle age brought. For most adults, the largest source of satisfaction came from their friends and family.
MacArthur and the Midlife Crisis
MacArthur confirmed the skepticism that George Vaillant and other researchers had about the scourge of the midlife crisis. Less than ten percent of those between 40 and 60 underwent a turning point or crisis related to a looming sense of mortality, its defining characteristic. Turning 30 was actually much more disruptive for most people than 40 or 50 (an observation Daniel Levinson had made). Among respondents who claimed to have had a midlife crisis, the trauma often turned out not to have occurred during middle age at all. Like the phony psychics whose one accurate prediction is remembered out of hundreds of incorrect ones, the occasional midlife crisis perpetuates the belief that the phenomenon is an inescapable ritual of middle age.
Brim told me he believes that people have an internal thermostat they use to adjust their ambitions in order to maintain a happier outlook. The point at which motivated people seem most content is what he calls “just manageable difficulty,” which he estimates to be a job that demands you work at about eighty percent of your capacity. “This intuitive process by which we constantly reset our goals in response to ups and downs is one of the most overlooked aspects of adult development.”
“Middle age is not the period of high anxiety that we’ve been led to believe,” he added. “For most people, midlife is the place to be.”
Today, more than a dozen years after extensive discussions of these results in the media, what remains surprising is the tenacity of this fiction. The idea of a midlife crisis has persisted despite the lack of evidence that most people ever go through one. Initially, mistaken assumptions about the pervasiveness of midlife crises may have been due to researchers’ tendency to study people who had come to a specialist for treatment. It was like surveying an orthopedist’s patient list and concluding that nearly everyone in the population had suffered a broken limb. Alice S. Rossi, a sociologist and MacArthur investigator, suggested that the classic portrait of the sad-sack midlifer “may strike a future developmental researcher as burned out at a premature age, rather than reflecting a normal developmental process all men go through so early in life.” The middle-aged men that Daniel Levinson wrote about in The Seasons of a Man’s Life, for example, had dutifully climbed the social and career ladder in the fifties and early sixties, hitting middle age at the height of the countercultural rebellion when young people accusingly branded their generation as conformist, complacent, and inauthentic. Their feelings of dislocation and unease may have been triggered by the social upheaval occurring at the time.
We have become conditioned to use the midlife crisis as the default explanation for any discontent or unusual decision. Ninety years ago, G. Stanley Hall recognized how age, like the older sibling who should know better, always got the blame: “Shortcomings that date from earlier years are now ascribed to age.” So David Foster Wallace, a deeply insightful author, described a breakdown he had in college by saying, “I had kind of a midlife crisis at twenty.” Media pundits in 2011 who described the actor Charlie Sheen’s drug abuse, fights, and rants at age 46 as a midlife crisis ignored similar episodes that occurred in his youth, including shooting his fiancée in the arm when he was 25. And the Danish director Lars von Trier, already notorious for outrageous statements, suddenly blamed a midlife crisis for declaring himself a Nazi at the 2011 Cannes Film Festival. As David Almeida, a psychologist at Pennsylvania State University and MacArthur researcher, argues: “Many of the stereotypical hallmarks of a midlife crisis, such as the sudden purchase of the expensive sports car, likely have more to do with middle-age financial status than with a search for youth.”
There is, however, another way to think about the midlife crisis. The bare-bones outline matches an archetypal plot: a hero encounters an obstacle or turning point and is shadowed by thoughts of death, yearns for an Edenic past, and then emerges from the crisis with new insight. Stripped of its ageist sentiments, this heroic journey fits adolescents, and 20- and 30-somethings—like David Foster Wallace—better than it does those in middle age. Trying to return home from Troy, Odysseus has the equivalent of a twenty-year midlife crisis (or crises) before reuniting with his wife, Penelope. This universal theme could explain why the midlife crisis narrative has resonated so deeply and has been so hard to dislodge. Everyone creates a story about their lives, fitting events together in a way that makes sense. These tales are modeled on plots that we have heard since childhood. We choose which events are significant and how they mesh with our overall experience. “One spends a lifetime reconstructing one’s past,” the writer Brendan Gill observed, “in order to approach some tentative, usable truth about oneself by ransacking all the data that have hovered dimly somewhere ‘out there,’ helping to form one’s nature.”
In Our Prime