Fault Lines


by Kevin M. Kruse


  As jobs disappeared, prisons grew. Stricter drug laws imposed in major states like New York (such as the Rockefeller Drug Laws of 1973) had steadily made it easier for the courts to convict those charged with possession or distribution of illegal substances. This long-term trend inside various states took on a national scope in the 1990s, most notably with the Violent Crime Control and Law Enforcement Act of 1994. Among other things, the $30 billion measure instituted harsher new rules for sentencing convicted felons and fueled new spending for prisons across the nation. Minimum sentences for many drug-related federal crimes were now formally mandated; state sentences were likewise extended, thanks to a provision that linked federal funds for prison development to state promises to make serious offenders serve at least 85 percent of their sentences. Most famously, the crime bill included a “three strikes, you’re out” provision, which mandated life imprisonment for criminals convicted of a violent felony if they had at least two prior convictions.37

  As public laws fueled mass incarceration, the privatization of the penal system created more institutional space and economic incentives for it too. Reporter Eric Schlosser aptly characterized the phenomenon as a “prison-industrial complex.” “The raw material of the prison-industrial complex,” he wrote, “is its inmates: the poor, the homeless, and the mentally ill; drug dealers, drug addicts, alcoholics, and a wide assortment of violent sociopaths.” Prisoners were profitable, however, and fueled a massive growth in the American penal system. California, Schlosser noted, had constructed eight maximum-security prisons between 1984 and 1994 to hold its booming prison population; by 1998, that one state incarcerated more people than the six nations of France, Great Britain, Germany, Japan, Singapore, and the Netherlands combined. By 1994, the private prison industry in America had grown so large and lucrative that it had its own trade magazine: Correctional Building News. Although Republicans had been the first to push for the privatization of prisons, the Clinton administration strongly supported this trend and worked with the Justice Department to place undocumented immigrants in these institutions. In 1997, for instance, the Wackenhut Corporation, a Florida-based company founded by a former FBI agent, was hired by the Federal Bureau of Prisons to manage a major prison.38

  The impact of mass incarceration fell predominantly on racial minorities. “There are today over two million Americans incarcerated in federal and state prisons and local jails throughout the United States,” scholar Manning Marable noted in 2000. “More than one-half, or one million, are black men and women.” The high rates of black imprisonment, he and others observed, stemmed from systemic biases that impacted black suspects at higher rates than whites, ranging from more aggressive policing in inner cities than in suburbs, to “mandatory minimum” laws that set tougher sentences for crystalline “crack” cocaine (popular with poorer African Americans) than for powder cocaine (popular with affluent whites), to widely different rates of referral to juvenile courts for the races. As a result, a new generation of African Americans found itself disproportionately swept up in the carceral state. Roughly a quarter of black men in their 20s, Marable observed, were “either in jail or prison, on parole, probation, or awaiting trial.” As prisons shifted their overall emphasis from rehabilitation to punishment, moreover, the realities of incarceration became a permanent marker for large segments of the population, now cast off by society as unwanted and irredeemable.39

  Meanwhile, even those who had benefited from the 1990s boom economy began to have worries, as cracks in its façade appeared. Early excitement about computers turned to anxiety when fears of a “Y2K” bug led to predictions that on New Year’s Day, 2000, entire systems would shut down as a result of coding errors. The havoc never occurred, but a financial meltdown did. Many of the companies that had led the tech boom had profited from low interest rates and free-flowing private capital rather than the value of their products, but the Federal Reserve soon raised interest rates several times and the economy slowed. On March 20, 2000, the tech-sensitive NASDAQ fell dramatically. “Fast-Forward Stocks Meet Rewind Button,” read a headline in the Wall Street Journal. The quick reversal suggested that the overhyped “momentum stocks” that had fueled the dot-com boom might easily go bust in a panic. “They drove them to ridiculous levels on the way up,” a hedge fund manager said, “and they’ll drive them to ridiculous levels on the way down.” 40

  Indeed, it soon became clear that many companies at the center of the “dot-com” boom were not remotely worth their alleged value. One of the best examples of overvalued stock was Pets.com, a start-up that promised to sell pet supplies online. Founded by Greg McLemore in August 1998, the firm went public a year and a half later, even though it still lacked a basic business plan or any solid market research. Julie Wainwright, with a background in marketing, took over the company and poured enormous resources into public relations. The company relied almost entirely on an aggressive ad campaign. Its mascot, a sock-puppet dog with a microphone, appeared on Good Morning America, in the Macy’s Thanksgiving Day Parade, and even in a $3 million ad that aired during the Super Bowl. With its products sold at steep discounts to lure in new customers, the company made little money of its own and was instead primarily supported by continued investments from private capital firms. In all, Pets.com spent almost $12 million on advertising in its first year, while earning just $619,000 in revenue.41

  With shaky start-ups like this at the heart of the “dot-com” boom, the market soon collapsed and countless companies began to fold. The Dow Jones Internet Index fell by 72 percent from its all-time high of March 2000, with online companies suffering steep drops in stock prices. The burst of the bubble continued into 2001, driving stocks down, bankrupting companies, and leaving many investors with massive losses. Soon, the high-tech marketplace looked like an abandoned mall. While 17 tech companies had purchased $44 million worth of advertisements in the 2000 Super Bowl, only three did so a year later, and at much lower rates.42 The future of the industry remained unclear, with many younger Americans, previously intoxicated by the ease of new wealth, sobered in their expectations. “What a difference a year makes,” lamented the editors of the New York Times. “The NASDAQ sank. Stock tips have been replaced with talks of recession. Many pioneering dot-coms are out of business or barely surviving.” 43 Pets.com stock had initially risen to $14, but toward the end it was selling for $1, finally closing at 22 cents.44 When Pets.com shut down its operations on November 8, 2000, the former owners sold the rights to their once-famous mascot to a car loan firm called Bar None for $125,000.45 The slogan the new owners gave the once-popular puppet was “Everybody deserves a second chance.”

  Indecision 2000

  Despite the turmoil, the stock market recovered and, more importantly, unemployment rates remained low for the remainder of Clinton’s term. Throughout the year, the Gallup Poll asked Americans: “In general, are you satisfied or dissatisfied with the way things are going in the United States?” In January 2000, 69 percent of Americans said they were satisfied with how things were going. The number dipped a little as the campaign season wore on and the typical complaints of an election surfaced, but it remained a solid majority throughout the year. Clinton’s approval ratings remained strong as well, never sinking below 55 percent for the year. In spite of the impeachment process—or, as some mused, because of it—voters remained supportive of the president, largely due to the conditions of peace and prosperity that had marked the decade.46

  Not surprisingly, the candidates running to replace Clinton that year sought to convince the American people that they could best maintain this status quo. The Democratic nominee, Vice President Al Gore Jr., promised “continued peace and prosperity,” while the Republican nominee, Texas governor George W. Bush, advanced a similarly moderate agenda he called “compassionate conservatism.” Pointedly, both candidates sought to distance themselves from the partisan clashes of the previous decade. Gore kept Clinton at arm’s length, worried the president’s personal scandals would tarnish his own image. Bush, meanwhile, criticized House Republicans for heartlessly “balancing the budget on the backs of the poor” and, in a sharp break with the culture wars, complained that all too often “my party has painted an image of America slouching toward Gomorrah,” a direct reference to (and dismissal of) the complaints from Robert Bork.47

  With Gore and Bush both trying to claim the political center, the 2000 election pivoted less on matters of policy than it did on personality. An amiable, easy-going Texan, Bush struck many, in the words of the Wall Street Journal columnist Paul Gigot, as a “likable lightweight.” After decades in the federal government, Gore had a stronger command of the issues, but the political reporters who significantly shaped the narrative of the campaign disliked him intensely. His public performances were routinely derided as “stiff” and “wooden,” and his campaign coverage soon centered on a narrative that he was an inveterate liar. Gore brought some of this on himself, with a tendency to brag about his accomplishments. In 1999, for instance, he noted in an interview: “During my service in the United States Congress, I took the initiative in creating the Internet.” This statement was, on its own terms, true. As Vinton Cerf, the man commonly credited as the “Father of the Internet,” said in 1999, “The Internet would not be where it is in the United States without the strong support given to it and related research areas by the vice president in his current role and in his earlier role as senator. . . . He was the first elected official to grasp the potential of computer communications to have a broader impact than just improving the conduct of science and scholarship. . . . His initiatives led directly to the commercialization of the Internet. So he really does deserve credit.” The press, however, ignored that history and treated the claim as a whopper. Moreover, they simplified what Gore had said into something that would seem even more ridiculous, claiming he said simply “I invented the Internet.” 48 Later in the campaign, Time reporter Margaret Carlson explained in an interview with talk radio host Don Imus how and why the media settled into its narrative. Both candidates made misstatements during the campaign, she said, but the press tended to seize on Gore’s alone. “You can actually disprove some of what Bush is saying if you really get in the weeds and get out your calculator or you look at his record in Texas,” she noted. “But it’s really easy, and it’s fun, to disprove Al Gore. As sport, and as our enterprise, Gore coming up with another whopper is greatly entertaining to us.” 49

  With the emphasis on personality over policy, the 2000 election became one of the closest in American history. In the end, Gore beat Bush in the popular vote by a margin of roughly 540,000 votes. But as election night wore on, it became clear that neither Bush nor Gore had the number of electoral votes needed to declare a victory. With every state except Florida decided, Gore had secured 267 electoral votes to Bush’s 246. Both were beneath the number needed to win—270—and therefore the 25 electoral votes of Florida would determine the winner. Relying on exit polling and competing with each other to call the race first, the networks initially awarded the state to Gore, then backed off, then called it for Bush, and then backed off again.50 Jon Stewart, the comedian and television host, expressed the frustration that he and others felt with the coverage. “The great part was just the giant fuck-up network-wise,” Stewart recalled, “ ‘Gore’s the winner. Gore’s not the winner. Bush is the winner. Bush is not the winner, nobody’s the winner.’ The media declared two people president, and then declared no one president.” 51

  As the color-coded maps of election night remained frozen in place on these networks—first for hours, then days and even weeks as the fight over Florida stretched on—Americans quickly adopted a new shorthand for their deepening political divisions: “red states” for Republican areas and “blue states” for Democratic ones. The assignment of the two colors was, in truth, arbitrary. The networks had started using color-coded maps in 1976, when NBC’s John Chancellor informed viewers that he would be coloring states for Republican Gerald Ford blue and states for Democrat Jimmy Carter red. Over the ensuing decades, the color schemes used in network news coverage of presidential elections changed from year to year, and network to network. (In 1984, Democratic vice presidential nominee Geraldine Ferraro noted how the “big three” networks displayed the results in different hues. “One network map of the United States was entirely blue for the Republicans,” she recalled. “On another network, the color motif was a blanket of red.”) In contrast, Time magazine used blue for the Republicans and red for the Democrats from 1988 through 2000. The networks switched the order in their 2000 coverage, however, and as Americans stared endlessly at the electoral maps, and talked about them, the color scheme became cemented in their minds. With it came a new sense of the nation as one divided into separate camps: one set of red states and another set of blue states, rather than a United States.52

  In the 2000 race, the battle for Florida—the last state to be colored in, the one that would decide the whole election—carried on for weeks. The uncertainty over who had won the state stemmed from confusion over the votes at the county level. There were, closer inspection revealed, a number of serious inconsistencies and irregularities in the voting procedures of several of Florida’s counties. The most famous was the dispute over the so-called “butterfly ballot” clumsily designed by Democratic officials in Palm Beach County, an awkward form that led over 3,000 Jewish retirees—one of the state’s most reliably Democratic demographics—to cast ballots for the archconservative, third-party candidate Pat Buchanan instead of Al Gore.

  More serious than the butterfly ballot was the statewide purge of voter rolls. Between May 1999 and November 2000, Florida secretary of state Katherine Harris ordered the names of nearly 48,000 ex-felons—barred from voting by state law—to be removed from the rolls. In theory, this was perfectly legal; in practice, however, it proved quite controversial. In addition to her official state duties, the Republican Harris also served as one of the state cochairs of the Bush campaign. As reports soon revealed, roughly 2,800 people who had names similar to those of convicted felons had been wrongly removed from the voter rolls, while nearly 8,000 additional individuals who had only misdemeanors on their records—and who thus should have kept their voting rights—were purged as well. (In one of many examples that came to light only later, among those who lost their right to vote was a 64-year-old man whose only conviction had been for falling asleep on a bus-stop bench back in 1959.)53

  When the initial vote totals revealed that George W. Bush had a razor-thin margin of roughly 500 votes statewide, the Gore campaign pushed for selective recounts in a handful of Democratic-leaning counties. As these recounts eroded the Bush lead, the Republican campaign asked the Supreme Court to intervene. On December 12, 2000, the court handed down its pivotal decision in Bush v. Gore. In a bitterly divided 5–4 ruling, the conservative majority ordered an immediate halt to the recounts, awarding Florida’s electoral votes—and therefore, the entire presidential election—to Bush. In the end, a presidential campaign that had begun as a race to the center concluded in a way that only polarized the country further.54

  CHAPTER 12

  Compassion and Terror

  THE SENSE OF PEACE THAT HUNG OVER THE 2000 PRESIDENTIAL election, with each candidate focused on small-scale domestic issues and largely embracing limits on the use of military force overseas, seemed incredibly dated just one year later.

  While President George W. Bush sought to replicate some earlier acts of the Reagan Revolution, he initially focused his administration on an agenda of what he called “compassionate conservatism.” He was determined to forge a different path for the Republican Party, one that eschewed the bitter fights over social issues that had marked the culture wars of the previous decade, but he found his agenda stalled and soon overshadowed by a sudden shift in foreign affairs.

  The horrific, large-scale terrorist attack on September 11, 2001, revealed the nation’s vulnerabilities and swept aside the easy confidence America had held when it seemed to be the last superpower standing in the wake of the Cold War. The events of that one day—so significant it became universally known by the shorthand “9/11”—immediately transformed the national security agenda, ushering in a new paradigm that soon dominated not just America’s diplomatic and military postures abroad, but its social and political life at home. At heart, the attacks represented a challenge not just to the psyche of the nation but also to its national security institutions. Initially, it seemed likely that the country would once again find unity in the face of outside threats, but it quickly became clear that the fault lines that had emerged over previous decades stubbornly remained.

  Compassionate Conservatism

  In its broadest strokes, the “compassionate conservatism” that Bush championed sought to soften the harder edges of the movement conservatism initiated by the Reagan Revolution of the 1980s and amplified by the Gingrich Republicans of the 1990s. Much as his father, President George H. W. Bush, had spoken of creating a “kinder, gentler America” during his term in office, President George W. Bush said he wanted to enable “armies of compassion”—standing in the private sector but financed by public money—to care for America’s needy. “I call my philosophy and approach compassionate conservatism,” he explained. “It is compassionate to actively help our fellow citizens in need. It is conservative to insist on responsibility and results. And with this hopeful approach, we will make a real difference in people’s lives.” 1

 
