by Susan Faludi
Yet chlamydia was one of the most poorly publicized, diagnosed, and treated illnesses in the country. Although the medical literature had documented catastrophic chlamydia rates for a decade, and although the disease was costing more than $1.5 billion a year to treat, it wasn’t until 1985 that the federal Centers for Disease Control even discussed drafting policy guidelines. The federal government provided no education programs on chlamydia, no monitoring, and didn’t even require doctors to report the disease. (By contrast, it does require doctors to report gonorrhea, which is half as prevalent.) And although chlamydia was simple to diagnose and easy to cure with basic antibiotics, few gynecologists even bothered to test for it. Nearly three-fourths of the cost of chlamydia infections, in fact, was caused by complications from lack of treatment.
Policymakers and the press in the ’80s also seemed uninterested in signs of another possible infertility epidemic. This one involved men. Men’s sperm count appeared to have dropped by more than half in thirty years, according to the few studies available. (Low sperm count is a principal cause of infertility.) The average man’s count, one researcher reported, had fallen from 200 million sperm per milliliter in the 1930s to between 40 and 70 million by the 1980s. The alarming depletion has many suspected sources: environmental toxins, occupational chemical hazards, excessive X-rays, drugs, tight underwear, even hot tubs. But the causes are murky because research in the area of male infertility is so scant. A 1988 congressional study on infertility concluded that, given the lack of information on male infertility, “efforts on prevention and treatment are largely guesswork.”
The government still does not include men in its national fertility survey. “Why don’t we do men?” William D. Mosher, lead demographer for the federal survey, repeats the question as if it’s the first time he’s heard it. “I don’t know. I mean, that would be another survey. You’d have to raise money for it. Resources aren’t unlimited.”
• • •
IF THE “infertility epidemic” was the first round of fire in the pronatal campaign of the ’80s, then the “birth dearth” was the second. At least the leaders of this campaign were more honest: they denounced liberated women for choosing to have fewer or no children. They didn’t pretend that they were just neutrally reporting statistics; they proudly admitted that they were seeking to manipulate female behavior. “Most of this small book is a speculation and provocation,” Ben Wattenberg freely concedes in his 1987 work, The Birth Dearth. “Will public attitudes change soon, thereby changing fertility behavior?” he asks. “I hope so. It is the root reason for writing this book.”
Instead of hounding women into the maternity ward with now-or-never threats, the birth dearth theorists tried appealing to society’s baser instincts—xenophobia, militarism, and bigotry, to name a few. If white educated middle-class women don’t start reproducing, the birth-dearth men warned, paupers, fools, and foreigners would—and America would soon be out of business. Harvard psychologist Richard Herrnstein predicted that the genius pool would shrink by nearly 60 percent and the population with IQs under seventy would swell by a comparable amount, because the “brighter” women were neglecting their reproductive duties to chase after college degrees and careers—and insisting on using birth control. “Sex comes first, the pains and costs of pregnancy and motherhood later,” he harrumphed. If present trends continue, he grimly advised, “it could swamp the effects of anything else we may do about our economic standing in the world.” The documentation he offered for this trend? Casual comments from some young students at Harvard who seemed “anxious” about having children, grumblings from some friends who wanted more grandchildren, and dialogue from movies like Baby Boom and Three Men and a Baby.
The birth dearth’s creator and chief cheerleader was Ben Wattenberg, a syndicated columnist and senior fellow at the American Enterprise Institute, who first introduced the birth dearth threat in 1986 in the conservative journal Public Opinion—and tirelessly promoted it in an endless round of speeches, radio talks, television appearances, and his own newspaper column.
His inflammatory tactics constituted a notable departure from the levelheaded approach he had advocated a decade earlier in his book The Real America, in which he chided population-boom theorists for spreading “souped-up scare rhetoric” and “alarmist fiction.” The fertility rate, he said, was actually in slow decline, which he saw then as a “quite salutary” trend, promising more jobs and a higher living standard. The birth dearth, he enthused then, “may well prove to be the single most important agent of a massive expansion and a massive economic upgrading” for the middle class.
Just ten years later, the fifty-three-year-old father of four was sounding all the alarms about this “scary” trend. “Will the world backslide?” he gasped in The Birth Dearth. “Could the Third World culture become dominant?” According to Wattenberg’s treatise—subtitled “What Happens When People in Free Countries Don’t Have Enough Babies”—the United States would lose its world power status, millions would be put out of work, multiplying minorities would create “ugly turbulence,” smaller tax bases would diminish the military’s nuclear weapons stockpiles, and a shrinking army would not be able “to deter potential Soviet expansionism.”
When Wattenberg got around to assigning blame, the women’s movement served as the prime scapegoat. For generating what he now characterized as a steep drop in the birthrate to “below replacement level,” he faulted women’s interest in postponing marriage and motherhood, women’s desire for advancing their education and careers, women’s insistence on the legalization of abortion, and “women’s liberation” in general. To solve the problem, he lectures, women should be urged to put their careers off until after they have babies. Nevertheless, he actually maintains, “I believe that The Birth Dearth sets out a substantially pro-feminist view.”
Wattenberg’s birth dearth slogan was quickly adopted by New Right leaders, conservative social theorists, and presidential candidates, who began alluding in ominous—and racist—tones to “cultural suicide” and “genetic suicide.” This threat became the subject of a plank in the political platforms of both Jack Kemp and Pat Robertson, who were also quick to link the fall of the birthrate with the rise in women’s rights. Allan Carlson, president of the conservative Rockford Institute, proposed that the best way to cure birth dearth was to get rid of the Equal Pay Act and federal laws banning sex discrimination in employment. At a 1985 American Enterprise Institute conference, Edward Luttwak went even further: he proposed that American policymakers might consider reactivating the pronatal initiatives of Vichy France; that Nazi-collaborationist government’s attack on abortion and promotion of total motherhood might have valuable application on today’s recalcitrant women. And at a seminar sponsored by Stanford University’s Hoover Institution, panelists deplored “the independence of women” for lowering the birthrate and charged that women who refused to have many children lacked “values.”
These men were as anxious to stop single black women from procreating as they were for married white women to start. The rate of illegitimate births to black women, especially black teenage girls, was reaching “epidemic” proportions, conservative social scientists intoned repeatedly in speeches and press interviews. The pronatalists’ use of the disease metaphor is unintentionally revealing: they considered it an “epidemic” when white women didn’t reproduce or when black women did. In the case of black women, their claims were simply wrong. Illegitimate births to both black women and black teenagers were actually declining in the ’80s; the only increase in out-of-wedlock births was among white women.
The birth dearth theorists were right that women have been choosing to limit family size in record numbers. They were wrong, however, when they said this reproductive restraint has sparked a perilous decline in the nation’s birthrate. The fertility rate has fallen from a high of 3.8 children per woman in 1957 to 1.8 children per woman in the 1980s. But that 1957 peak was the aberration. The national fertility rate has been declining gradually for the last several centuries; the ’80s rate simply marked a return to the status quo. Furthermore, the fertility rate didn’t even fall in the 1980s; it held steady at 1.8 children per woman—where it had been since 1976. And the U.S. population was growing by more than two million people a year—the fastest growth of any industrialized nation.
Wattenberg arrived at his doomsday scenarios by projecting a declining birthrate two centuries into the future. In other words, he was speculating on the number of children of women who weren’t even born—the equivalent of a demographer in preindustrial America theorizing about the reproductive behavior of an ’80s career woman. Projecting the growth rate of a current generation is tricky enough, as post-World War II social scientists discovered. They failed to predict the baby boom—and managed to underestimate that generation’s population by 62 million people.
THE GREAT FEMALE DEPRESSION: WOMEN ON THE VERGE OF A NERVOUS BREAKDOWN
In the backlash yearbook, two types of women were named most likely to break down: the unmarried and the gainfully employed. According to dozens of news features, advice books, and women’s health manuals, single women were suffering from “record” levels of depression and professional women were succumbing to “burnout”—a syndrome that supposedly caused a wide range of mental and physical illnesses from dizzy spells to heart attacks.
In the mid-’80s, several epidemiological mental health studies noted a rise in mental depression among baby boomers, a phenomenon that soon inspired popular-psychology writers to dub the era “The Age of Melancholy.” Casting about for an explanation for the generation’s gloom, therapists and journalists quickly fastened upon the women’s movement. If baby-boom women hadn’t received their independence, their theory went, then the single ones would be married and the careerists would be home with their children—in both cases, feeling calmer, healthier, and saner.
• • •
THE RISING mental distress of single women “is a phenomenon of this era, it really is,” psychologist Annette Baran asserted in a 1986 Los Angeles Times article, one of many on the subject. “I would suspect,” she said, that single women now represent “the great majority of any psychotherapist’s practice,” precisely “sixty-six percent,” her hunch told her. The author of the article agreed, declaring the “growing number” of single women in psychological torment “an epidemic of sorts.” A 1988 article in New York Woman issued the same verdict: Single women have “stampeded” therapists’ offices, a “virtual epidemic.” The magazine quoted psychologist Janice Lieberman, who said, “These women come into treatment convinced there’s something terribly wrong with them.” And, she assured us, there is: “Being single too long is traumatic.”
In fact, no one knew whether single women were more or less depressed in the ’80s; no epidemiological study had actually tracked changes in single women’s mental health. As psychological researcher Lynn L. Gigy, one of the few in her profession to study single women, has noted, social science still treats unmarried women like “statistical deviants.” They have been “virtually ignored in social theory and research.” But the lack of data hasn’t discouraged advice experts, who have been blaming single women for rising mental illness rates since at least the 19th century, when leading psychiatrists described the typical victim of neurasthenia as “a woman, generally single, or in some way not in a condition for performing her reproductive function.”
As it turns out, social scientists have established only one fact about single women’s mental health: employment improves it. The 1983 landmark “Lifeprints” study found poor employment, not poor marriage prospects, the leading cause of mental distress among single women. Researchers from the Institute for Social Research and the National Center for Health Statistics, reviewing two decades of federal data on women’s health, came up with similar results: “Of the three factors we examined [employment, marriage, children], employment has by far the strongest and most consistent tie to women’s good health.” Single women who worked, they found, were in far better mental and physical shape than married women, with or without children, who stayed home. Finally, in a rare longitudinal study that treated single women as a category, researchers Pauline Sears and Ann Barbee found that of the women they tracked, single women reported the greatest satisfaction with their lives—and single women who had worked most of their lives were the most satisfied of all.
While demographers haven’t charted historical changes in single women’s psychological status, they have collected a vast amount of data comparing the mental health of single and married women. None of it supports the thesis that single women are causing the “age of melancholy”: study after study shows single women enjoying far better mental health than their married sisters (and, in a not unrelated phenomenon, making more money). The warning issued by family sociologist Jessie Bernard in 1972 still holds true: “Marriage may be hazardous to women’s health.”
The psychological indicators are numerous and they all point in the same direction. Married women in these studies report about 20 percent more depression than single women and three times the rate of severe neurosis. Married women have more nervous breakdowns, nervousness, heart palpitations, and inertia. Still other afflictions disproportionately plague married women: insomnia, trembling hands, dizzy spells, nightmares, hypochondria, passivity, agoraphobia and other phobias, unhappiness with their physical appearance, and overwhelming feelings of guilt and shame. A twenty-five-year longitudinal study of college-educated women found that wives had the lowest self-esteem, felt the least attractive, reported the most loneliness, and considered themselves the least competent at almost every task—even child care. A 1980 study found single women were more assertive, independent, and proud of their accomplishments. The Mills Longitudinal Study, which tracked women for more than three decades, reported in 1990 that “traditional” married women ran a higher risk of developing mental and physical ailments in their lifetime than single women—from depression to migraines, from high blood pressure to colitis. A Cosmopolitan survey of 106,000 women found that not only do single women make more money than their married counterparts, they have better health and are more likely to have regular sex. Finally, when noted mental health researchers Gerald Klerman and Myrna Weissman reviewed all the depression literature on women and tested for factors ranging from genetics to PMS to birth control pills, they could find only two prime causes for female depression: low social status and marriage.
• • •
IF MENTALLY imbalanced single women weren’t causing “The Age of Melancholy,” then could it be worn-out career women? Given that employment improves women’s mental health, this would seem unlikely. But the “burnout” experts of the ’80s were ready to make a case for it anyway. “Women’s burnout has come to be a most prevalent condition in our modern culture,” psychologists Herbert Freudenberger and Gail North warned in Women’s Burnout, one of a raft of potboilers on this “ailment” to hit the bookstores in the decade. “More and more, I hear about women pushing themselves to the point of physical and/or psychological collapse,” Marjorie Hansen Shaevitz wrote in The Superwoman Syndrome. “A surprising number of female corporate executives walk around with a bottle of tranquilizers,” Dr. Daniel Crane alerted readers in Savvy. Burnout’s afflictions were legion. As The Type E Woman advised, “Working women are swelling the epidemiological ranks of ulcer cases, drug and alcohol abuse, depression, sexual dysfunction and a score of stress-induced physical ailments, including backache, headache, allergies, and recurrent viral infections and flu.” But that’s not all. Other experts added to this list heart attacks, strokes, hypertension, nervous breakdowns, suicides, and cancer. “Women are freeing themselves up to die like men,” asserted Dr. James Lynch, author of several burnout tomes, pointing to what he claimed was a rise in rates of drinking, smoking, heart disease, and suicide among career women.
The experts provided no evidence, just anecdotes—and periodic jabs at feminism, which they quickly identified as the burnout virus. “The women’s liberation movement started it” with “a full-scale female invasion” of the work force, Women Under Stress maintained, and now many misled women are belatedly discovering that “the toll in stress may not be worth the rewards.” The authors warned, “Sometimes women get so enthused with women’s liberation that they accept jobs for which they are not qualified.”
The message behind all this “advice”? Go home. “Although being a full-time homemaker has its own stresses,” Georgia Witkin-Lanoil wrote in The Female Stress Syndrome, “in some ways it is the easier side of the coin.”
Yet the actual evidence—dozens of comparative studies on working and nonworking women—all point the other way. Whether they are professional or blue-collar workers, working women experience less depression than housewives; and the more challenging the career, the better their mental and physical health. Women who have never worked have the highest levels of depression. Working women are less susceptible than housewives to mental disorders big and small—from suicides and nervous breakdowns to insomnia and nightmares. They are less nervous and passive, report less anxiety and take fewer psychotropic drugs than women who stay home. “Inactivity,” as a study based on the U.S. Health Interview Survey data concludes, “. . . may create the most stress.”