by Jill Lepore
Bill and Hillary Clinton frequently appeared together on the campaign trail in 1992.
The Republican Party was losing women, fast. But the Democratic Party that Hillary Rodham joined in 1972 was undergoing a transformation, too, one unprecedented in the twentieth century: it was willfully kicking its own base out from under itself. Since the rise of William Jennings Bryan in 1896, the Democratic Party had been the party of labor. But early in the 1970s, while the Republican Party was courting blue-collar white men, especially men who’d lost their manufacturing jobs, the Democratic Party began abandoning blue-collar union workers, especially white men, in favor of a coalition of women, minorities, and what had come to be called “knowledge workers”: engineers, scientists, and analysts who wore white collars and tapped away at desktop computers at technology firms, universities, consulting firms, and banks.123
Fatefully, the Democratic Party made a bid to become not the party of labor but the party of knowledge. Party leaders, enthralled by the emerging high-tech industry, placed their faith in machines to drive demographic and political change. After the end of the Second World War, with the decline of industrial production, knowledge workers had become the fastest growing occupational sector. Cold War–era government-funded science and technology projects created a civilian offshoot: technology firms grew like weeds in the suburban peripheries of university-rich cities like Boston, New York, New Haven, Philadelphia, Atlanta, Chicago, Seattle, Los Angeles, Ann Arbor, Madison, Austin, Boulder, Chapel Hill, and San Francisco. If small in number, liberals who lived in the suburbs and worked in technology had an outsized influence on the Democratic Party. They favored—and lavishly funded—the campaigns of other highly educated liberals, from George McGovern in 1972 to Michael Dukakis in 1988 to John Kerry in 2004, campaigns that failed miserably.
The new Democrats’ understanding of the world was technocratic, meritocratic, and therapeutic. They believed that technology could fix political, social, and economic problems, and yet they also believed that they owed their own success to their talents and drive, and that people who had achieved less were less talented and driven. They tended not to see how much of their lives had been shaped by government policies, like government-funded research, or the zoning laws and restrictive covenants that had created high-quality schools in the all-white suburbs or the occasional swank urban pockets in which they typically lived. Notwithstanding all the ways in which government assistance had made possible the conditions of their lives and work, they tended to be opposed to government assistance. Believing in individual achievement and the power of the self, they saw the different political vantages of other people, especially of people who had achieved less, as personal, psychological failings: racism, for instance, they saw not as a structural problem but as a prejudice born of ignorance.124
Some of the attitudes of this political class had their roots in the mystique surrounding the personal computer. Big IBM machines and punch-card computing had looked, to the New Left, bureaucratic, organizational, and inhuman. Students were cogs in the machine of the university, draftees were cogs in the war machine, itself figured as a computer. In 1964, free speech activists demonstrating at Berkeley had strung around their necks punch cards that read “I am a UC student. Please do not fold, bend, spindle or mutilate me.”125 Personal computing, a rejection of those punch cards, came out of the 1960s Bay Area counterculture, a rage against the (IBM) machine. Its loudest promoter was Stewart Brand, who had joined Ken Kesey’s Merry Pranksters after graduating from Stanford and founded the Whole Earth Catalog in 1968, in Menlo Park, for the tens of thousands of people who were dropping out, moving back to the land, and living in communes, and for the much bigger number of people who dreamed of dropping out. (The 1971 Whole Earth Catalog won the National Book Award and sold two and a half million copies. It peddled everything from copies of Milton Friedman’s Capitalism and Freedom for $1.50, to parts for an old Volkswagen, to a “Do-it-Yourself Burial” for $50, to instructions on “How to Build a Working Digital Computer” for $4.45.)126 “A realm of intimate, personal power is developing,” Brand wrote, “power of the individual to conduct his own education, find his own inspiration, shape his own environment, and share his adventure with whoever is interested.” For Brand and these New Communalists, dropping out meant plugging in. Mind and consciousness, sun and soil, monitor and keyboard. In 1967, one Haight-Ashbury poet handed out a poem that began: “I like to think (and / the sooner the better!) / of a cybernetic meadow / where mammals and computers / live together in mutually / programming harmony / like pure water / touching clear sky.” Not irrelevantly, this same group of people, whole-earth hippies, usually had quite traditional ideas about the role of women. In the 1960s and 1970s, on back-to-the-land communes where people read Brand’s Whole Earth Catalog and imagined they were living on an American frontier, women baked bread and spun wool and breastfed and saved seeds.127
Brand was interested in planetary thinking—the “whole earth”—and imagined a worldwide network of computers bringing the world’s peoples together, in perfect harmony. That required, first, personal computers, one per person. In 1968, Brand helped produce the Mother of All Demos at a computer industry conference in San Francisco to demonstrate a prototype of a personal computer that Kesey later pronounced “the next thing after acid.” Brand wrote about computing for Rolling Stone in 1972: “Ready or not, computers are coming to the people.”128 Bill Gates and Paul Allen, who met as boys in Seattle, founded Microsoft in 1975, later adopting the motto “A personal computer on every desk.” In Cupertino, Steve Jobs and Stephen Wozniak founded Apple Computer in 1976 and released the Apple II the next year. In 1980, Apple’s IPO was the largest since the Ford Motor Company’s in 1956.129 By the 1990s, wealthy Silicon Valley entrepreneurs would lead a Democratic Party that had restructured itself around their priorities. Beginning in 1972, the DNC instituted quotas for its delegations, requiring set numbers of women, minorities, and young people but establishing no quotas for union members or the working class. The new rules made it possible for affluent professionals to take over the party, a change of course much influenced by longtime Democratic strategist Frederick Dutton’s Changing Sources of Power (1971). Dutton argued that the future of the party lay with young professionals, not old union members.130 Gary Hart, campaigning for a Colorado Senate seat in 1974, took to mocking “Eleanor Roosevelt Democrats” as fusty and old-fashioned, not hip to the young computer people. The press called Hart’s constituency “Atari Democrats.”131
Personal computing enthusiasts liked to invoke “the power of the people,” but they meant the power of the individual person, fortified with a machine. Republicans, the party of big business, remained closely associated with IBM; Democrats, the party of the people, attached themselves to Apple, and jettisoned people who had neither the means nor the interest to own their own computers. The knowledge-worker-not-auto-worker wing of the party tried to move to the center, under the auspices of the Democratic Leadership Council, founded in 1985, and soon joined by Bill Clinton and Al Gore. Calling themselves the “New Democrats,” they blamed Carter’s defeat in 1980 and Mondale’s defeat in 1984 on the party’s support for unions and its old-fashioned New Deal liberalism.132 “Thanks to the near-miraculous capabilities of micro-electronics, we are vanquishing scarcity,” an article in the DLC’s New Democrat announced in 1995. The class politics of scarcity were dying, and in this new, bright age of the microchip, there would be a “politics of plenty” in which the people left behind—“the losers . . . who cannot or will not participate in the knowledge economy”—would be “like illiterate peasants in the Age of Steam.”133 The party stumbled like a drunken man, delirious with technological utopianism.
BILL CLINTON, FORTY-SIX when he entered the White House and already going gray, stood six foot two. He had a grin like a 1930s comic-strip scamp, the cadence of a southern Baptist preacher, and the husky voice of a blues singer. He’d grown up poor in Hope, Arkansas—the boy from Hope—and he climbed his way to the White House by dint of charm and hard work and good luck. During the Vietnam War, he’d dodged the draft. After a Rhodes Scholarship and an education at Yale Law School, he’d begun a career in politics, with his young wife at his side. Like many a president before and since, he liked to be liked and he yearned to be admired, although, unlike most presidents, Clinton wore his neediness on his face; he had, all his life, the face of a boy. He was only thirty-two in 1978 when he was elected governor of Arkansas. He appeared to serve as a bridge between the Old Democrats and the New Democrats. A white southerner from a humble background, he appealed to the party’s old base. An Ivy League–educated progressive with a strong record on civil rights, he appealed to the party’s new base. And yet he was, all along, a rascal.
In 1992, Clinton’s campaign for the Democratic nomination had nearly been felled by his reputation as a philanderer. After allegations of one extramarital affair hit the tabloids, he and his wife appeared on 60 Minutes, sitting together stiffly, and he admitted to “causing pain in my marriage.” Citing his right to privacy, he refused to directly answer any questions about infidelity.134 He also suggested that his candidacy offered an opportunity for the press to turn away from salaciousness.
The year before, the battle for the courts had met the battle of the sexes during the Senate confirmation hearings of Bush’s Supreme Court nominee Clarence Thomas. In 1987, Thurgood Marshall, asked at a conference about the increasingly conservative nature of the court, said, “Don’t worry, I’m going to outlive those bastards.” But, suffering from glaucoma, hearing loss, and other ailments, Marshall retired from the court in 1991.135 To replace him, Bush nominated Thomas, whom he’d earlier appointed to the DC Circuit Court of Appeals. During the confirmation hearings, law professor Anita Hill accused Thomas, her former boss, of sexual harassment. The televised hearings had included graphic details. Despite Hill’s powerful, damning testimony, the Senate confirmed Thomas.
A year later, Clinton attempted to deflect inquiries about his alleged years-long affair with a woman named Gennifer Flowers by piously suggesting that public discourse had been demeaned by televised hearings, proposing to elevate it by refusing to provide details. “This will test the character of the press,” Clinton said on 60 Minutes. “It’s not only my character that has been tested.” The claim lacked even a shred of plausibility, not least because on other occasions Clinton had been perfectly willing to discuss matters that other presidential candidates and officeholders would have scorned as demeaning to the dignity of the office. Asked by a high school student at an MTV-sponsored event in 1994 whether he wore boxers or briefs, for instance, he’d not hesitated to supply an answer: “Usually briefs.”136
Clinton’s two terms in office frustrated the Left, enraged the Right, and ended in scandal. He won the 1992 election with the lowest popular vote—43 percent—since Woodrow Wilson. He set as his first task health care reform, which had been on the progressive docket for nearly a century. “If I don’t get health care done, I’ll wish I didn’t run for President,” he said. He handed this initiative over to his wife, assigning her to head the Task Force on National Health Care Reform, and calling her his Bobby Kennedy.137
Before her husband took office, Hillary Rodham Clinton, a chronic overpreparer, read forty-three biographies of presidential wives to equip herself for her role. After the administration’s first one hundred days, Vanity Fair, in a profile of the First Lady, described her as holding “unprecedented political ambitions.” The magazine reported: “As the first working mother in the White House, the first unapologetic feminist, and arguably the most important woman in the world, she wants not just to have it all, but to do it all.” She also changed her name again, going by “Hillary Rodham Clinton.” Six weeks after Hillary Clinton moved into the White House, Betty Ford came to visit. But Hillary Clinton was no Betty Ford. She had more senior staff assigned to her than did Vice President Al Gore.138
Hillary Clinton’s task force eventually produced a 1,342-page proposal for what was mainly employer-paid health care. Insurance companies and conservative policy groups, in a rerun of the Whitaker and Baxter campaign of 1949, spent hundreds of millions of dollars in advertising and lobbying campaigns to defeat the proposal. One series of ads featured a couple, Harry and Louise, regretting their lack of choice under “health care plans designed by government bureaucrats,” and closed: “KNOW THE FACTS.”
Bill Kristol, like his father before him a prominent conservative writer and strategist, urged Republicans to refuse any deal on health care in order to make the case to the public that “Democratic welfare-state liberalism remains firmly in retreat.” (Conservatives likely also feared that if the Democrats succeeded in passing health care, its popularity would make the Democratic Party unstoppable.) The First Lady, still a neophyte in the capital, urged her husband to make no compromises; in his 1994 State of the Union address, he promised to veto any bill that did not provide for universal coverage. By the midterm elections, when Republicans took over Congress, winning majorities in both houses for the first time in decades, the proposal, much derided for its intricacies and hobbled by conservatives’ distaste for the president’s wife, had failed. Felled by the unyielding partisanship of a new political culture, it never even reached a vote.139
The failure of Clinton’s health care proposal crippled his presidency. His lasting legacy, as a liberal, came in 1993, when he appointed Ruth Bader Ginsburg to the Supreme Court. But Clinton, who had even more millionaires in his cabinet than Bush had, moved to the right—even before the midterms—and much of his agenda amounted to a continuation of work begun by Reagan and Bush.140 He secured the ratification of the North American Free Trade Agreement (NAFTA), against the opposition of labor unions. He took up the War on Drugs waged by Nixon in 1971 and continued by Reagan with the 1986 Anti-Drug Abuse Act. In 1994, the year Newt Gingrich issued a conservative “Contract with America,” Clinton signed a new crime bill that lengthened mandatory sentences and left in place the 100:1 ratio, set in 1986, between sentences for possession of crack and of powder cocaine. (The bill also included an assault weapons ban, set to expire after ten years.) Some members of the Congressional Black Caucus (CBC) supported the bill; others did not. The NAACP called it a “crime against the American people.” The CBC attempted to introduce a Racial Justice Act, to include provisions relating to racial discrimination in sentencing; Republicans threatened a filibuster. When the crime bill passed, liberals boasted about becoming tough on crime. “The liberal wing of the Democratic Party is for 100,000 cops,” announced Joe Biden, a hard-bitten senator who grew up in Scranton, Pennsylvania. “The liberal wing of the Democratic Party is for 125,000 new State prison cells.”141
These bipartisan measures contributed to a social catastrophe, an era of mass incarceration, in which nearly one in every hundred American adults was in prison, virtually the highest rate in the world and four times the world average. The 1994 crime bill didn’t cause that rise, which had begun much earlier, and, in any case, most people in prison are convicted under state laws, not federal ones. But the federal crime bill, changes in state and local prosecution rates, and especially the new sentencing regime made the problem worse. Two-thirds of the rise in the prison population between 1985 and 2000 involved drug convictions. The overwhelming majority of Americans convicted of drug offenses are black men. “The Drug War Is the New Jim Crow,” read a poster stapled to a telephone booth in California, an observation with which social scientists came, painfully, to agree.142
Clinton struck deals on drugs, crime, and guns because he believed in compromise and bipartisanship, but also because he liked and needed to be liked. Especially after the embarrassment of his health care proposal, he moved still further to the right. His political compromises on welfare and the regulation of the economy proved as consequential as his crime bill. In 1996, Clinton and his band of meritocratic New Democrats found common cause with conservatives, led by House Speaker Gingrich, in realizing his campaign pledge to “end welfare as we know it.” Siding with those who described welfare as trapping people in poverty through dependence on the government, his administration abolished Aid to Families with Dependent Children. Under the new regime, welfare was left up to the states. (Clinton vetoed a Republican version of the bill that would have ended guaranteed health care to the poor through Medicaid.)143
In 1999, in another abandonment of the New Deal, one with far-reaching effects, Clinton signed a measure repealing elements of the Glass-Steagall Act, passed in 1933. The repeal lifted a ban on combinations between commercial and investment banks. Larry Summers, Clinton’s Treasury secretary, boasted, “At the end of the 20th century, we will at last be replacing an archaic set of restrictions with a legislative foundation for a 21st-century financial system.” The freewheeling securities industry saw record profits in the wake of the repeal. By the end of the decade, the average CEO of a big company earned nearly four hundred times as much as the average worker. And, not long after that, in 2008, during a global financial collapse, Summers’s twenty-first-century financial system would be revealed as having been cracked from the start.144