American Empire


by Joshua Freeman


  In the case of Rwanda, where no issue like refugees directly impacted the United States, Clinton made the opposite decision, refusing to intervene. When hard-line leaders of the Hutu majority launched a campaign of genocide against the Tutsi minority and Hutu moderates, which resulted in the deaths of over 800,000 people, the Clinton administration did nothing. It even rejected the use of the term “genocide” to describe what was happening in order to minimize pressure to send the military to stop the killing.

  Clinton also avoided, at least at first, intervening in the wars and killing that came in the wake of the breakup of Yugoslavia. When ethnic fighting broke out in Bosnia, the United States, first under Bush and then under Clinton, declined to send forces to provide relief or stop the fighting. Only after the July 1995 murder by Bosnian Serb forces of over seven thousand unarmed Muslim men and boys at Srebrenica, the worst mass murder in Europe since World War II, did the United States, as part of a NATO force, intervene with aerial attacks. It then brokered the Dayton Accords, ending the fighting, and agreed to send a large U.S. contingent to Bosnia as part of a NATO peacekeeping force.

  The United States got involved in the Balkans again in 1999 when NATO—with the United States providing most of the firepower—launched a full-scale air war against Serbia to stop its ethnic cleansing of Albanian Muslims in Kosovo. The Clinton administration believed that it could quickly force the Serbians to cease their campaign using airpower alone, minimizing the risk to American lives. However, the Serbs held fast for eleven weeks, during which time they accelerated their displacement and killing of Albanians. Only after NATO extended its bombing to Serbia proper, including Belgrade, did Serbian leaders finally capitulate. The bombing campaign, conducted in the name of human rights, killed hundreds of civilians and hundreds, possibly thousands, of Serbian soldiers but no Americans. It reinforced the self-appointed global leadership of the United States, as it waged war against a country that had not attacked it, and bolstered the illusion that it could reshape or control faraway parts of the world with little cost in its own blood.

  But a series of terrorist attacks gave the lie to any notion that a large, technically sophisticated military deployed around the world could make the country invulnerable. Just days after Clinton became president, Mir Aimal Kasi, a Virginia resident who had grown up in Pakistan near the Afghan border, angry at American support for Israel, drove to the entry road to the CIA headquarters in Langley and opened fire with a locally purchased AK-47 on cars waiting to get onto the property, killing two agency employees before escaping the country. A month later, followers of a radical Egyptian imam living in New York detonated a rental van loaded with homemade explosives in the garage of the World Trade Center, killing six people and causing half a billion dollars’ worth of damage.

  Clinton authorized a more aggressive antiterrorism program in 1995, as fears grew that terrorists might obtain nuclear, chemical, or biological weapons. By the next year, the CIA had begun to focus on Osama bin Laden, a wealthy Saudi who had been an indirect ally of the United States in funding and helping organize the anti-Soviet war in Afghanistan. Bin Laden turned against the United States after it stationed troops in Saudi Arabia during the buildup to the Gulf War. In August 1998, members of the al Qaeda network, which bin Laden headed, used truck bombs to cause massive damage and loss of life at the U.S. embassies in Kenya and Tanzania, with most of the casualties Africans who just happened to be passing by. In retaliation, Clinton ordered cruise missile strikes against a chemical plant in Sudan and a camp in Afghanistan, which the United States claimed were al Qaeda installations. Unilateral use of force had become so normalized that there was little protest or even comment at home about launching missiles at countries with which the United States was not at war.

  The United States tried to stop bin Laden from launching further attacks, as CIA head George Tenet and other officials believed he would, possibly inside the United States. But the reluctance of the military to use force against al Qaeda in Afghanistan, which bin Laden used as his base under the protection of the Taliban government, and the hesitation of the Clinton administration to engage in assassination or kill civilian bystanders, made an inherently difficult task more so. Endlessly, the Clinton administration made plans to kill bin Laden, debated scenarios, vetted possible actions with lawyers, and worried about killing women and children, but it never had what it considered reliable enough evidence, or a good enough chance of killing its target alone, to launch an attack. Meanwhile, it privately fretted over what it saw as a much greater threat than bin Laden: the global spread of weapons of mass destruction and the missiles to deliver them, as the orderly confrontation of the Cold War was replaced by chaotic multifronted conflicts and crusades, with enormous stockpiles of Cold War weapons and the knowledge of how to make them diffusing around the globe.

  Boom Again

  From 1991 until March 2001, the United States experienced the longest continuous economic expansion in its modern history. Overall, the economy performed better than during the previous two decades, though it did not match its performance during the quarter century after World War II. Recovery started even before Clinton took office. As his stress on fiscal discipline and minimizing inflation kept wages flat, credit readily available, and investment and productivity increasing, the expansion soon accelerated.

  Early in the expansion, export-oriented manufacturing helped lead the way, benefiting from a cheap dollar in relation to the yen and the deutsche mark. However, the Clinton administration soon worried that the exchange rates that favored U.S. manufacturing would so weaken Japanese industry that they would throw Japan into a recession, which in turn might lead to a sell-off of Japanese investments in the United States and an international economic crisis. To avert that possibility, in 1995 the United States, Japan, and Germany agreed to increase the value of the dollar relative to the yen and the deutsche mark. Japanese goods became cheaper in the United States, while export-oriented manufacturing in the United States cooled down.

  But the U.S. economy continued to grow, now increasingly tied to a huge run-up in stock prices, which far exceeded the rate of growth in corporate profits. The bull market stemmed in part from corporations making enormous share purchases, either as part of mergers and acquisitions or to push up their own stock prices. Also, foreign investors put more and more money into American securities, helping to boost prices. The Federal Reserve and the Clinton administration encouraged the market boom and the increasing dominance of the financial sector, with Congress and the president agreeing on the 1999 repeal of the Glass-Steagall Act, the New Deal law that had forced a separation between investment and commercial banking. As the century drew to a close, the stock market seemed to defy gravity and rationality, as the dot-com bubble in media, telecommunications, and Internet stocks pushed prices far out of line with historic price/earnings ratios, until the market finally tumbled in late 2000 and 2001.

  While it lasted, the soaring stock market stimulated spending and economic growth. Corporations and well-off households borrowed huge amounts of money using inflated equities as collateral, which they then spent on investments and consumption. The personal savings rate fell from 8.7 percent in 1992 to −0.12 percent in 2000, as people—especially in the upper income brackets—not only borrowed money but spent down what they had saved in the past. Corporations also raised money by issuing shares at inflated prices, an important mechanism for financing Silicon Valley start-ups as well as more established companies, which in the past had looked to retained earnings and bond sales for capital.

  Most Americans did not experience significant benefits from the economic expansion until its later years. During the first half of the 1990s, real wages continued their long stagnation. From 1989 through 1994, median family income actually declined after adjusting for inflation. Though the unemployment rate began to fall in 1993, mass layoffs continued to be common. With nearly two and a half million workers in 1993 and 1994 losing jobs that they had held for at least three years, due to plant closings, relocations, or cuts in production, the chronic job insecurity that had been haunting the country since the 1980s continued.

  The weak returns for workers even as the economy improved helped spark a revolt within organized labor. During the late 1980s and early 1990s, reform movements gained strength in a number of unions, most importantly the Teamsters, where an insurgent candidate, Ron Carey, with backing from the rank-and-file group Teamsters for a Democratic Union, ousted the candidate of the old guard. The 1994 Republican congressional victories brought dissatisfaction with AFL-CIO president Lane Kirkland to a head. Under pressure from a group of union presidents, Kirkland resigned, succeeded by his secretary-treasurer. But at the AFL-CIO convention in October 1995, dissatisfied union leaders ran their own slate of candidates, with Service Employees International Union president John Sweeney, who had overseen militant action and membership growth in his own organization, capturing the federation presidency in the first contested leadership election since the AFL and CIO merged in 1955.

  Sweeney reenergized the AFL-CIO and reached out to liberal, student, and religious groups of the sort with which labor once had strong ties but from which it had become isolated. He won an early victory when his “America Needs a Raise” campaign pressured Congress to boost the minimum wage, the value of which had been deeply eroded by inflation. Labor issues began receiving more attention in the media and on campuses, where students launched campaigns against goods produced under sweatshop conditions. But Sweeney’s effort to reverse the decline in union membership failed, as only a few unions took up his call to greatly increase their spending on organizing and as employers continued to be effective in defeating unionization drives. In 2000, fewer than 14 percent of workers belonged to a union, and only 9 percent in the private sector.

  With such a small percentage of the workforce unionized, organized labor no longer had much impact on national wage levels. But as unemployment continued to drop in the second half of the 1990s, falling to 3.9 percent in late 2000, the lowest level since 1970, the tight labor market brought substantial increases in real wages. Adjusted for inflation, between 1995 and 1999 the median wage went up 7.3 percent. Low-wage workers made the greatest gains. Black workers did particularly well, as the gap between black and white household incomes diminished. College enrollment and homeownership rates both moved up, while fewer families lived in poverty. Full employment—or as close as the country had come to it since the end of the long postwar boom—proved remarkably effective in raising the overall standard of living and helping the least privileged sectors of society.

  But the broad economic gains of the late 1990s did not reverse the basic dynamic of the economy since the recessions of the 1970s and the corporate revolution. Income and wealth inequality continued to increase during the Clinton years, while economic mobility decreased. In 2000, the top 1 percent of income recipients received more after-tax income than the bottom 40 percent, with the gap between the two groups having doubled over the course of two decades. The average real hourly wage in private industry was still 5 percent below what it had been in 1979.

  The growing gulf between the rich and ordinary Americans manifested itself dramatically in the soaring compensation for CEOs. In 1999, they took home on average 107 times what workers did, up from 56 times in 1989 and 29 times in 1978, far exceeding the ratio in other economically advanced nations.

  Even though income inequality reached a level not seen since before the New Deal, the fortunes made in the last years of the twentieth century did not carry with them the moral disrepute associated with wealth in earlier generations. Some CEOs did come in for criticism for cavalier disregard for their workers, like Al Dunlap, known as “Chainsaw Al” for his massive firing of workers at Scott Paper and Sunbeam. But fortunes gained through financial maneuvers, speculation, and monopoly were no longer viewed as parasitic, as they had been through the Great Depression. Wall Street had managed to remake its image as an arena of mobility, democracy, and social good, while money made in high-tech industries generally was seen as benign. The lack of social pretension of many of the new billionaires, like Bill Gates and Warren Buffett, provided a degree of cultural and political armor. When Chicago magazine calculated the all-time richest residents of the city, in the number two slot stood Ty Warner, who made his money selling the cute Beanie Babies toys, a personage far less likely to raise populist hackles than Samuel Insull, Cyrus McCormick, or George Pullman, industrial titans of earlier times, whose once-famed fortunes, even adjusted for inflation, were no match for those of later, lesser-known real estate and insurance billionaires.

  Deadlock

  The robust economy gave the Democrats a strong card going into the 2000 elections. But Vice President Al Gore proved to be a mediocre candidate, stiff and lecturing. The Republican nominee, George W. Bush, the son of the former president, was not well known outside of Texas, where he served as governor, but he won strong backing from both conservative activists and the Republican establishment. Though once in office Bush became one of the most conservative presidents in the country’s history, on the campaign trail he portrayed himself not as a radical reformer but as a “compassionate conservative,” committed not only to reduced taxes and government but also to bipartisanship and maintaining support for people in need. The election generated only modest interest, with voter turnout at 51.3 percent of eligible voters, a bit higher than in 1988 and 1996 but somewhat lower than in 1980, 1984, and 1992.

  The electorate split almost down the middle, with Gore beating Bush in the popular vote by half a percentage point (with left-leaning consumer advocate Ralph Nader taking 2.7 percent of the vote and conservative Pat Buchanan 0.4 percent). The electoral vote was even closer, with the results in several states that could determine the overall outcome too close to call on election night. The regional polarization of presidential politics had become extreme. Bush won the electoral votes from the entire South and all the Plains and Mountain states except New Mexico, where Gore eked out a 366-vote victory. Gore carried all the Northeast, Middle Atlantic, and West Coast states except New Hampshire and Alaska. (In the desert West and upper South, he lost several states that Clinton had carried, including his home state of Tennessee.)

  The election came down to who won Florida, where both sides claimed victory. Recounts, legal challenges, and political maneuvering proliferated, as the country went through the strange experience of having the election over but not knowing for five weeks who had won. In the end, it all rested on the political sympathies of the Supreme Court. In fragmented and largely incoherent decisions, a majority of justices in Bush v. Gore first stayed a partial recount of Florida ballots that seemed to be eliminating Bush’s slim lead and then gave the state’s electoral votes, and with them the election, to Bush because a full recount could not be completed within a time limit specified in Florida law. Using the political and cultural capital the Court had accumulated during the rights revolution, the Republican-nominated majority, with only a thin pretense of legal consistency and reasoning, picked the next president.

  The 2000 election seemed to confirm the idea of a deeply divided country, with politically and socially conservative “red” states and liberal “blue” states corresponding to two very different ideological and cultural tribes of Americans. The notion had some truth to it. But in many states, the election was quite close, countering the impression of clear regional divisions. Furthermore, for all the intense partisanship of the election and its aftermath, the candidates, though they sharply disagreed on some issues, like abortion rights and gun control, were not terribly far apart about political economy and the role of government. Bush accepted the idea of at least a limited welfare state, seeking to trim back and partially privatize the New Deal–Great Society system of social benefits but not eliminate it, while Gore came from an administration that had accepted many of the basic tenets of Reaganism, including the deregulation of business and finance and the downsizing of government.

  When the dust settled on the disputed election, few Americans viewed the presidency of George W. Bush as illegitimate, while the Supreme Court lost little public respect for its role in determining the outcome. An odd combination of intense partisanship, political apathy, and centrist consensus had come to characterize national politics. Compared to the past, fewer Americans thought much of government, or felt it was worth their while to engage in politics, seeing the private realm and the private market as more important determinants of the quality of their lives. At least intuitively, many Americans recognized the extent to which power had shifted outside the control of the institutions of civic life that had been democratized in the decades after World War II.

  CHAPTER 19

  * * *

  Living Large

  “Fifty-inch screen, money green leather sofa / Got two rides, a limousine with a chauffeur,” intoned rapper Biggie Smalls on his 1994 hit “Juicy.” Rap music emerged from the South Bronx in the mid-1970s, amid New York’s fiscal crisis, part of a hip-hop culture that included graffiti and break dancing as well. From its home ground, hip-hop quickly spread to downtown clubs and galleries and black neighborhoods across the country. Two decades later, rap had become utterly mainstream, the dominant music not only among young African Americans, who pioneered it, but among suburban white youth too. Thematically multifarious, rap lyrics chronicled ghetto hardships, whipped up party frenzies, hailed “gangsta” life, and boasted of sexual conquest. And over and over, rappers celebrated consumption, extolling luxury cars, designer clothes, expensive jewelry, and top-shelf liquor, often by brand name. Puff Daddy, whose Gatsbyesque rise from Harlem to the Hamptons was chronicled by the celebrity magazines at supermarket checkout lines, even acclaimed money itself as a kind of brand in “It’s All About the Benjamins” (alluding to hundred-dollar bills, graced with a picture of Benjamin Franklin). The rap music celebration of things—the more, the more expensive, the more glittery, the better—captured something central to American life a half century after World War II, a testament to the complex fluidity of the United States that artists presenting themselves as representatives of an outlaw underclass became the public face of its dominant values.

 
