The Hell of Good Intentions


by Stephen M. Walt


  The bottom line: the liberal vision of an increasingly democratic and economically open world—a world that many U.S. elites believed was in the offing when the Cold War ended—did not emerge as expected. History did not end; if anything, it galloped off in the opposite direction. Nor were these setbacks the result of a series of unfortunate accidents or a run of bad luck; they were mostly due to inflated expectations, hubris, and bad policy choices.

  By the time Donald Trump took the oath of office, visions of a robust and globalized world economy—guided by Washington and underpinned by American power—had largely evaporated. As the political economist Jonathan Kirshner noted in 2014, “actors throughout the world are disenchanted with the American model and with the U.S. orchestration of global economic governance. Many are now searching for alternative conceptions, and, feeling empowered, for greater voice in determining the rules of global governance and recognition of their own, often distinct, interests.”65 Here, as in many other areas, the strategy of liberal hegemony came up short.

  MAKING GLOBAL PROBLEMS WORSE

  When the unipolar era began, U.S. leaders believed that America’s privileged position would allow them to address and eventually solve a wide array of global problems. Although the United States made some progress on a number of challenges and was able to manage or resolve crises in several places, the overall record of the past three presidents was unimpressive.

  Perhaps most obviously, repeated U.S. efforts to resolve the Israeli-Palestinian conflict all ended in abject failure. Bill Clinton oversaw the Oslo peace process in the 1990s, George W. Bush negotiated the Middle East “Road Map” and convened a summit meeting in Annapolis, and Barack Obama spent eight years trying to halt the continued expansion of Israeli settlements and coax the two sides toward a final status agreement.

  Yet by 2016 the two-state solution that all three presidents had favored was further away than ever. The settler population in the territories Israel conquered in 1967 had grown from roughly 281,000 in 1993 to more than 600,000, and a network of Israeli roads, checkpoints, military bases, and settlements crisscrossed the West Bank, making a viable Palestinian state effectively impossible.66 Given the potential leverage that all three presidents had at their disposal, their inability to make meaningful progress toward a solution they believed to be, as Obama put it, “in Israel’s interest, the Palestinians’ interest, America’s interest, and the world’s interest” was a humiliating display of U.S. impotence and diplomatic incompetence.67

  Efforts to limit the danger from weapons of mass destruction—especially nuclear weapons—achieved only slightly better results. On the positive side, the Clinton administration successfully persuaded Ukraine, Belarus, and Kazakhstan to give up the nuclear arsenals they had inherited from the former Soviet Union, and the 1994 Agreed Framework with North Korea delayed its development of nuclear weapons for a few years. The so-called Nunn-Lugar programs helped place Russia’s vast and poorly secured stockpile of nuclear materials under more reliable control; and sustained pressure from the United States and its European allies eventually persuaded the Libyan leader Muammar Gaddafi to abandon his own WMD programs in exchange for a pledge that the United States would not overthrow him.68 The Obama administration also convened several well-attended Nuclear Security Summits that highlighted the need for further work on this problem.

  But on the negative side, the Agreed Framework with North Korea broke down after 2000, and Pyongyang eventually withdrew from the Non-Proliferation Treaty in 2003, tested a nuclear weapon in 2006, and had amassed a stockpile of at least a dozen bombs by 2016. India and Pakistan resumed nuclear tests in 1998 despite strenuous U.S. objections and continued to expand their nuclear arsenals in later years. UN inspectors dismantled Iraq’s nascent nuclear research program after the 1991 Gulf War, but neighboring Iran eventually mastered the full nuclear fuel cycle and produced a stockpile of enriched uranium that brought it within striking distance of a weapons capability. The Joint Comprehensive Plan of Action (JCPOA) completed in 2015 rolled back Iran’s enrichment capacity and uranium stockpile and increased the time it would take for Tehran to “break out” and build a weapon, but Iran was now a latent nuclear weapons state with the ability to get a bomb if it ever wanted to.

  With hindsight, it is not surprising that U.S. efforts to halt the spread of nuclear weapons achieved relatively little after 1993. Washington kept demanding that other states refrain from developing WMD, at the same time making it clear that it intended to keep a vast nuclear arsenal of its own.69 If the mighty United States believed its security depended on having a powerful nuclear deterrent, then surely a few weaker and more vulnerable states might come to a similar conclusion. Moreover, the U.S. decision to ignore its earlier pledge and topple Muammar Gaddafi in 2011 showed the world that Washington could not be trusted and that states with no deterrent were vulnerable to attack. That lesson was not lost on countries such as North Korea or Iran, which had every reason to fear U.S.-led regime change and thus ample incentive to preserve a nuclear option.70

  Last but by no means least, the U.S. response to international terrorism has been costly and counterproductive despite some modest successes. The Clinton administration recognized that groups like Al Qaeda posed a growing challenge in the 1990s, but it never developed an effective response to them.71 On the contrary, Clinton’s most significant attempts to deter, preempt, or retaliate for attacks on U.S. facilities or personnel were embarrassing debacles: a cruise missile strike on an Al Qaeda camp in Afghanistan in August 1998 missed Osama bin Laden, and a subsequent strike on an alleged chemical weapons facility in Sudan was in all likelihood an error based on faulty intelligence.72 Nor did Clinton or his aides ever reevaluate the policies that had helped to inspire movements like Al Qaeda in the first place, such as the strategy of “dual containment” in the Persian Gulf and unconditional U.S. support for Israel.73

  The most obvious failure of U.S. counterterrorism policy, of course, was September 11.74 The Bush administration responded to the attacks by launching a “global war on terror,” with the president vowing “to rid the world of evil.”75 Unfortunately, this mind-set led directly to the fateful decision to invade Iraq, which Bush and his aides believed would “send a message” to America’s enemies and spark a democratic transformation of the region, which they assumed would make it harder for extremists to recruit new followers.

  They could not have been more wrong. The occupation of Iraq fueled anti-Americanism across the Arab and Islamic world, and Iraq quickly became a magnet for extremists eager to take up arms against Uncle Sam. According to Peter Bergen and Paul Cruickshank, the Iraq conflict “greatly increased the spread of the Al Qaeda ideological virus, as shown by a rising number of terrorist attacks … from London to Kabul, and from Madrid to the Red Sea.”76 There were also incidents of blowback in the United States itself, such as the fatal shooting of thirteen people at Fort Hood in 2009 by a U.S. Army psychiatrist who had become convinced that the United States was at war with Islam itself.77

  As the “virus” spread, the war on terror kept expanding and the number of enemies kept growing. Greater reliance on drone strikes and “targeted killings” by U.S. Special Forces kept the costs of the war low, but these measures could not eliminate the problem and frequently made things worse. As terrorism experts Bruce Hoffman and Fernando Reinares noted in 2014, “despite its systematic attrition as a result of the U.S. drone campaign … al-Qaeda has nonetheless been expanding and consolidating its presence in new and far-flung locales.”78

  Al Qaeda was barely present in Somalia in 2001, for example, but a series of bungled U.S. interventions galvanized an Islamic resurgence and eventually spawned al-Shabaab, a radical Islamist group that conducted a lethal attack on a Nairobi shopping mall in 2013 and remains a dangerous force today.79 U.S. counterterror operations and political interference had similar effects in Yemen, which gradually descended into a brutal civil war and remains a haven for Al Qaeda and other radical extremists.80

  Perhaps the clearest sign that the “war on terror” had not gone as planned was the emergence of ISIS. An even more extreme offshoot of Al Qaeda, the group seized power in portions of western Iraq and Syria in 2014, proclaimed a new “caliphate,” and used social media and online propaganda to attract thousands of recruits from around the world. ISIS agents and sympathizers conducted attacks in a number of countries—including France, Libya, Turkey, and the United States itself—and refugees from its tyrannical rule began fleeing to other countries.

  Bin Laden was dead, but “bin Ladenism” was clearly alive and well. In December 2013, the heads of the Senate and House Intelligence Committees, Senator Dianne Feinstein (D-CA) and Representative Mike Rogers (R-MI), told CNN that “terror was up worldwide … there were more groups than ever and there was huge malevolence out there,” and both agreed that Americans were not safer than they had been a year or two previously. Two years later, the CIA director, John Brennan, one of the leading architects of the war on terror, was admitting to a congressional committee, “our efforts have not reduced [ISIS’s] terrorism capability and global reach.”81

  The problem, as some U.S. officials had recognized from the start, was that there was no shortage of new extremists to replace those whom the United States had killed or captured. As the head of the U.S. Africa Command, General Thomas D. Waldhauser, admitted in 2017, “We could knock off all the ISIL and Boko Haram this afternoon … But by the end [of the] week, so to speak, those ranks would be filled.”82 Further evidence that the war on terror had become an endless, ever-expanding effort was the revelation that 17 percent of U.S. commando troops were now deployed in Africa (up from a mere 1 percent in 2006) and engaged in more than one hundred separate missions, and that the United States was building a $100 million drone base in Niger to facilitate further attacks on extremist groups in West Africa and Libya.83

  Nor was it obvious that all this effort and expense was necessary or cost-effective. Over time, it became increasingly clear that most terrorists were not brilliant criminal masterminds but incompetent bunglers. The 9/11 attacks were not a harbinger of horrific mass attacks to come; they are more properly seen as a tragic incident when Al Qaeda got extremely lucky. And as John Mueller and Mark G. Stewart have shown, even if the losses suffered on 9/11 are included, international terrorism still poses an exceedingly small threat to American lives.84 In 2001, the year the 9/11 attacks occurred, more Americans died from peptic ulcers than from all acts of terrorism.85 Thus, the enormous political, economic, and human costs of the war on terror—including the instability it has sown in many countries—were based on a panicked and faulty estimate of the true danger America faced.

  To be sure, the war on terror can claim some tangible achievements: a team of Navy SEALs eventually found and killed bin Laden, the drone war eliminated most of Al Qaeda’s original leaders, and U.S. airpower helped a coalition of Iraqis, Kurds, and Iranian militias retake the territory ISIS had seized and forced the organization back underground. Along with improving homeland security efforts, these policies made large-scale attacks on the United States even less likely than they already were.

  Viewed as a whole, however, the U.S. response to terrorism is no more impressive than the rest of its recent foreign policy. U.S. leaders understood that terrorism was a problem back in 1993; the problem is more widespread today. Violent extremists are active in more places than ever before and with more far-reaching political consequences, often as a direct result of misguided U.S. responses. Like other key aspects of U.S. foreign policy, the “war on terror” has been a costly failure.

  CONCLUSION

  No country as wealthy, powerful, and energetic as the United States fails every time, and U.S. foreign policy has produced a number of important successes in recent years. American diplomats brokered the peace treaty between Israel and Jordan in 1994 and the agreements that ended the Bosnian War in 1995. A combination of U.S. pressure and the Nunn-Lugar Cooperative Threat Reduction program improved nuclear security in Russia and the former Soviet republics, and the Bush administration’s Proliferation Security Initiative probably discouraged the export of dangerous weapons technologies. Bill Clinton successfully mediated the 1999 Kargil crisis between India and Pakistan, the PEPFAR program helped reduce the incidence of AIDS in Africa, and U.S. officials handled several potentially serious incidents with China (including a midair collision between a Chinese fighter and a U.S. reconnaissance plane) with skill and sensitivity.

  Some observers might include in this list of successes the restoration of diplomatic relations with Cuba and the multilateral agreement that capped Iran’s nuclear program and lengthened the time it would take for Tehran to acquire the bomb. There were also a number of dogs that didn’t bark—such as an all-out war on the Korean Peninsula, a military clash over Taiwan, or an actual nuclear exchange—and the United States can plausibly claim some credit for these “nonevents.” To say that U.S. foreign policy has been mostly a failure is not to say that it fails at everything.

  Nor is U.S. foreign policy solely responsible for the negative developments described above. Some of these adverse trends—such as China’s rapid rise and growing military potential—would probably have occurred no matter what the United States government did. The euro crisis may have begun when the U.S. housing bubble burst and U.S. financial markets crashed, but Washington is not responsible for the design flaws and other errors that made the euro vulnerable.

  But considering where the United States and the world were in 1993 and where both are today, and looking back at the major initiatives the United States undertook and the most fateful decisions U.S. leaders made, America’s outsize responsibility for today’s problems is hard to deny. U.S. leaders may have had the best of intentions and the fondest of hopes, but their ambitious effort to “shape the world … for the benefit of all nations” fell woefully short. The next chapter explains why.

  2.  WHY LIBERAL HEGEMONY FAILED

  THE PREVIOUS CHAPTER described how the optimistic hopes with which the post–Cold War era began had come crashing to earth by 2016. Longtime adversaries were stronger and more assertive; traditional U.S. allies were weaker and more divided; and America’s ambitious attempt to shape regional politics, spread liberal values, promote peace, and strengthen global institutions had mostly come to naught despite repeated and often costly efforts.

  The taproot of these failures was the U.S. commitment to a grand strategy of “liberal hegemony”: an ambitious effort to use American power to reshape the world according to U.S. preferences and political values. Despite important differences in style and emphasis, the Clinton, Bush, and Obama administrations were all deeply committed to this basic approach.

  Yet liberal hegemony proved an elusive goal. At its most basic level, the strategy failed because it rested on mistaken views of how international politics actually works. It exaggerated America’s ability to reshape other societies and underestimated the ability of weaker actors to thwart U.S. aims. The United States had enormous power and in some cases good intentions, but these virtues could not overcome the strategy’s inherent flaws.

  WHAT IS “LIBERAL HEGEMONY”?

  The grand strategy of liberal hegemony seeks to expand and deepen a liberal world order under the benevolent leadership of the United States.1 At the domestic level, a liberal order is one where most states are governed according to liberal political principles: democracy, the rule of law, religious and social tolerance, and respect for basic human rights. At the international level, a liberal order is characterized by economic openness (i.e., low barriers to trade and investment) where relations between states are regulated by law and by institutions such as the World Trade Organization and the Non-Proliferation Treaty or multilateral alliances such as NATO.

  Proponents of liberal hegemony do not believe that liberal orders arise spontaneously or sustain themselves automatically. On the contrary, they believe that such orders require active leadership by powerful countries that are deeply committed to liberal ideals. Not surprisingly, supporters of this strategy believe the United States is uniquely qualified to play that role. In practice, therefore, liberal hegemony rests on two core beliefs: (1) the United States must remain much more powerful than any other country, and (2) it should use its position of primacy to defend, spread, and deepen liberal values around the world.

  To a large extent, the pursuit of liberal hegemony has been an effort to expand the partially liberal order that the United States created and led during the Cold War. From the start of that conflict, U.S. leaders drew a sharp distinction between the democratic “free world” and the un-free world of Soviet-style communism.2 They pushed hard to dissolve the systems of imperial preference that such countries as Great Britain employed, along with other forms of protectionism, in favor of a more open international economic order that would encourage trade and growth and create opportunities for U.S. businesses. And they recognized that any system of states needed norms or rules (i.e., “institutions”) to facilitate mutually beneficial cooperation—at the same time taking care to ensure that these rules were consistent with U.S. interests.

  To be sure, the international order that emerged after World War II was only partly liberal. The Communist world was largely excluded, of course, and some key U.S. allies were not democracies, let alone liberal democracies. There was also considerable disorder within this system at various times and places, and the United States did not hesitate to break the rules (or rewrite them unilaterally) as the need arose. Nonetheless, the Cold War liberal order worked well for the United States and its allies, and their triumph over the Soviet bloc made that order look especially attractive to the countries that built it. With the United States in an overwhelming position of primacy after the Cold War, the time seemed ripe to make that order truly global in scope.
