
The Great Democracy


by Ganesh Sitaraman


  1

  THE ORIGINS AND MEANING OF NEOLIBERALISM

  The day after Ronald Reagan won the presidency in 1980, the PBS television show The MacNeil/Lehrer Report convened a panel of experts for the usual election postmortem. Pat Buchanan, former Nixon aide and future presidential candidate, called the election a “rejection of the Carter administration,” but more importantly, a “repudiation of the liberal philosophy, because by and large the liberals were defeated.” If Reagan could continue stealing blue-collar Democrats, Buchanan foresaw the creation of a “grand coalition… realigning the parties.”

  Anthony Lewis, columnist at the New York Times, largely agreed with Buchanan and thought the election was a “conservative revolution” not directed solely at Jimmy Carter. But his analysis was slightly broader even than Buchanan’s: this was a “conservative time,” he said, in which traditional liberals were never going to win—and that partly explained the loss of twelve Senate Democrats in addition to Carter. In this new era, the Democrats were a “party without an idea.”

  But the most interesting comments came later in the program, from Morton Kondracke, the executive editor of the New Republic. The magazine had historically been a bastion of progressive and liberal thought, but it had endorsed Republican-turned-Independent John Anderson for president that year. Anderson, of course, never had a chance and ultimately won zero states and received zero electoral votes.

  Representing the rogue liberal magazine, Kondracke argued that the entire worldview of the Democratic Party had failed and that the election was a repudiation not just of Carter but of President Lyndon Johnson’s Great Society. Far from being an armchair critic, Kondracke also came with a solution: “It seems to me that what the Democratic Party has to adopt is some sort of a—what might be called a neoliberal ideology.”

  Jim Lehrer had clearly never heard the phrase. “What in the world is that, Mort?”

  Kondracke had an answer. He said it meant embracing the traditional Democratic values of compassion for the downtrodden, but without government action through things like bureaucratic programs. He cited Senators Gary Hart and Paul Tsongas as exemplars of this new liberal ideology. At the time both were young members of Congress; with neoliberalism on the ascent, they would each run for the presidency in the years to come.

  Pat Buchanan chimed in and rejected Kondracke’s position. He didn’t think Democrats needed to repudiate the New Deal or the Great Society and certainly not President John F. Kennedy’s muscular, patriotic New Frontier. If Democrats adopted Kondracke’s neoliberal approach, Buchanan warned, they would cease to offer voters a choice. They would become “what the Republican Party used to be—a ‘me too’ party” that stood for nothing more than “let’s split the difference on this proposal.”1

  This one exchange, at the dawn of the Reagan administration, captured many of the core features and controversies of politics in the neoliberal era: the ideological dominance of conservatives, the strategy of liberals adopting conservative tactics, and the risks of liberals becoming a pale imitation of conservatives.

  But before going further, it is necessary to go back to Jim Lehrer’s question: What in the world is neoliberalism?

  Forty years after Reagan’s election, the term is becoming more and more prominent—from newspaper and magazine articles to scores of academic studies. To some, it is nothing more than a slur, an insult socialists hurl against conservatives, centrists, liberals, and even progressives. To others, it is synonymous with global capitalism. Still others think of it as a totalizing ideology that touches not just public policy but all aspects of life. Of course, historians are quick to note that its meaning has shifted over the eighty-year period in which it has been in use.

  For the most part, the various uses of neoliberalism relate closely to the common understanding of the term in public policy—or at least derive from its worldview. Neoliberalism is an approach to public policy that relies on individuals operating through private markets as much as possible. The role of the state is to provide a minimal framework for markets, and to the extent that government acts, it should do so in ways that maximize market strategies.2

  The intellectual origins of neoliberalism go back to conservative economists and intellectuals of the 1920s. But its most famous proponents were Friedrich Hayek, organizer of the Mont Pelerin Society, and his junior-partner-turned-popularizer, Milton Friedman. Hayek was an Austrian economist who spent most of the 1930s as a professor at the London School of Economics. Ever a skeptic of government action, he dissented from John Maynard Keynes’s approach to economic policy during the Great Depression and even debated the giant in a series of journal articles. In 1938, Hayek helped bring together a colloquium to celebrate Walter Lippmann’s book The Good Society, which argued for a renewed form of liberalism—one that critics and supporters alike characterized as neoliberalism. The goal of neoliberalism was largely economic: ensure free enterprise and prevent price regulation. Hayek’s group met again in 1947 at Mont Pèlerin, Switzerland. The first meeting of the newly christened Mont Pelerin Society included a variety of past and future luminaries who would work over the years to advance the cause of neoliberalism. Hayek’s efforts at institution building didn’t stop there. Over time, he orchestrated the creation of a cohort of like-minded intellectuals at the University of Chicago, including Milton Friedman. The Chicago School, as it was called, would come to outline neoliberalism in economics, law, and public policy.3

  Hayek’s intellectual arguments also paved the way for the emergence of neoliberalism as a force in the late twentieth century. Although his 1944 book, The Road to Serfdom, was often balanced in supporting a role for government in the economy, from the title on down, the book’s rhetoric frequently boiled down to a slippery slope argument. Government action anywhere risked tyranny and fascism everywhere. The conclusion of many slippery slope arguments is not to articulate and defend a sensible, balanced policy but to reject it in favor of something more radical. And this is what made The Road to Serfdom so politically potent in the postwar fights over public policy. Hayek’s ideas offered an ideological counterpoint to the extremes of authoritarianism and communism—one that conservative activists used to argue against even moderate liberal policies. Anti-communist critic Max Eastman and Reader’s Digest editor DeWitt Wallace took it upon themselves to reprint The Road to Serfdom in the popular magazine. Their edition, in the words of one historian, was “less an abridgement than a re-creation.” Sentences were “reordered and reconnected,” “new sentences were written,” and “qualifications were lost.” The real Road to Serfdom sold forty thousand copies. The Reader’s Digest edition sold a million copies. Corporations like General Motors and New Jersey Power and Light gave reprints of the revised edition to their employees. The National Association of Manufacturers sent copies to its fourteen thousand members. The partisan success of The Road to Serfdom made Hayek a celebrity and public intellectual, though this newfound status cost him legitimacy within the economics profession. Years later, Hayek would comment, “I discredited myself by publishing The Road to Serfdom.”4

  Milton Friedman had no such misgivings about his role as a popularizer of ideas. For all their similarities, Hayek and Friedman were fundamentally different. Hayek was a senior figure when he created the Mont Pelerin Society. Friedman was a junior economist who said the Swiss conference was what “got me started in policy and what led to Capitalism and Freedom,” his popular bestseller. Hayek didn’t have organizations to help him navigate public affairs. By the time Friedman was writing for the public, he benefited from a variety of organizations that had emerged to support neoliberal ideas, including those Hayek had helped create. And importantly, Hayek and the other founders of the Mont Pelerin Society were writing during and in response to the Great Depression, when the ideologies of the future were an open question. Friedman’s context was the Cold War, an era of existential conflict between two ideological adversaries.5

  Friedman’s 1962 book, Capitalism and Freedom, articulated a far more radical and minimalistic vision of government. Friedman thought that unregulated monopolies were unlikely to be threatening to society. He was skeptical of government action to alleviate poverty. He called for abolishing the minimum wage, public housing, and even national parks. He wanted to get rid of the Federal Communications Commission and, later, the Food and Drug Administration. Over time, Friedman would also register opposition to the Marshall Plan and promote privatization of education through school vouchers. After supporting Senator Barry Goldwater’s failed presidential run in 1964, Friedman continued his advocacy. “Ideas have little chance of making much headway against a strong tide,” Friedman once wrote. “Their opportunity comes when the tide has ceased running strong but has not yet turned.”6

  By the 1970s, the tide of liberalism had ceased running strong. Economically, the decade ushered in wave after wave of anxiety and insecurity. The 1973 Arab oil embargo and resulting energy shortage hit consumers hard. That same year, the US stock market plummeted, losing half its value and leaving the economy in a recession until 1975. Inflation and a stagnant economy pushed Americans from saving to borrowing. Competition from abroad was also on the rise. At the end of World War II, most of the countries in the world with significant industrial potential were either lying in smoldering ruins or still under the thumb of colonialism. By the 1970s, these countries had bounced back, were industrializing, and were offering goods on the global market. Commentators worried about a “crisis of industrial society.” From 1967 to 1977, manufacturing in Boston, Philadelphia, Pittsburgh, and Chicago fell by a third.7

  Economic shocks were not limited to the United States. Fixed exchange rates were abandoned in 1971, fundamentally changing the Bretton Woods monetary system that had existed since the end of World War II. The IMF had to bail out Great Britain in 1976, demoralizing the Labour Party and imposing greater spending cuts than were necessary. Declaring the end of Keynesian economic policy, Prime Minister James Callaghan remarked to the Labour Party conference, “We used to think you could spend your way out of a recession and increase employment by cutting taxes and boosting government spending. I tell you in all candor that option no longer exists.” In the winter of 1978–1979, when Callaghan called for a limited pay increase for union workers, the unions protested. During the “winter of discontent,” schools were closed, mountains of trash filled the streets, and the dead were left unburied. “It was not a revolution, or an attempt to overthrow a government,” BBC reporter Andrew Marr writes in his history of modern Britain. “Yet that is the effect it had.”8

  The social conditions that helped build the postwar liberal era were also increasingly under stress. In this era, a rising middle class and growing economy meant that the expansion of civil rights, economic rights, and social opportunities posed comparatively little threat to economic security. The 1950s and 1960s brought progress along all these lines—from Brown v. Board of Education to the Civil Rights, Voting Rights, and Immigration Acts of the Johnson years. But in the new, anxious world of the 1970s, working-class whites began to fear that scarcity, not abundance, would define the future. As society became more inclusive, offering greater opportunities to women and minorities than ever before in history, people began to fear that the result would be a zero-sum economic game.9

  Big business contributed to this anxiety as well. Since the end of World War II, big business had cooperated with government and unions to establish a form of regulated capitalism in which the benefits of economic growth were broadly shared. Regulations blocked the most egregious and speculative practices, particularly in the financial sector. Taxes were extremely high: top marginal tax rates were 90 percent during the postwar years and remained at 70 percent after the Kennedy tax cut. Perhaps most importantly, business and labor worked together to share their success. Under the Treaty of Detroit, manufacturers agreed to provide significant social welfare benefits to employees—health insurance, pensions—in addition to regular wage increases. The result was that workers, managers, and shareholders alike benefited from growth.

  The 1970s brought the end of the business-labor-government partnership. Big business terminated the Treaty of Detroit, ending the era of cooperation. The Chamber of Commerce, the Business Roundtable, the National Association of Manufacturers, and other groups now fiercely opposed even moderate labor law reforms proposed during the Carter administration and increasingly targeted unions, wage increases, and benefits provision in order to cut costs and increase profits. George Meany, the head of the AFL-CIO, asked businesses in the Wall Street Journal: “Do you secretly seek a death sentence for the collective bargaining system you so often hail in public forums?” Republican senator Orrin Hatch said the late 1970s was the “starting point for a new era of assertiveness by big business in Washington.”10

  With the crisis of the 1970s, as John Kenneth Galbraith once remarked, “the age of John Maynard Keynes gave way to the age of Milton Friedman.” A new type of politician came to power, pushing neoliberalism against the reigning liberal consensus. In his first inaugural address, Ronald Reagan famously announced, “Government is not the solution to our problem; government is the problem.” Once in office, Reaganomics had four pillars: cut government spending, cut taxes, deregulate, and fight inflation through monetary policy. Prime Minister Margaret Thatcher’s agenda wasn’t so different: reduce taxes and spending, break the power of unions, privatize government services, and shift monetary policy away from a focus on unemployment.11

  The spread of these ideas was not limited to the United States and Britain. Neoliberalism was global from the start: Malcolm Fraser in Australia in 1975, Deng Xiaoping in China in 1979, Brian Mulroney in Canada in 1984. Although contexts were different in every country, neoliberalism’s core came to be associated with four elements: deregulation, liberalization, privatization, and austerity, or DLPA.12

  Deregulation

  Deregulation really involves two things: not issuing new regulations and rolling back existing regulations. Neoliberals generally have a rosy view of markets and profit seekers. As a result, they think that the marketplace will generally do a fine job of policing fraud and bad behavior and that most regulations are unnecessary. Of course, regulation of some kind is essential for neoliberals: private property rules and contract laws, for example, are critical for markets to function. But the goal is to minimize the rules.

  Reagan’s deregulation agenda, for example, began immediately. The president banned the promulgation of any new regulations, appointed deregulatory heads of agencies, and tasked the vice president with leading a task force on regulatory relief. Within six months, some 180 regulations had been “withdrawn, modified, or delayed.” Reagan’s Department of Transportation quickly rescinded a rule requiring auto manufacturers to install seat belts or airbags in cars, claiming there would be virtually no safety benefits from those technologies. The administration also “eased or eliminated price controls on oil and natural gas, cable TV, long-distance telephone service, interstate bus service, and ocean shipping.”13

  In 1982, the administration supported deregulation of the savings and loan sector, which allowed the S&Ls to take on broader—and riskier—lending practices. Instead of placing limits on these risky behaviors, the neoliberal approach instead suggested more deregulation—now for S&L bookkeeping rules. The ultimate result was the S&L crisis in the late 1980s, which cost taxpayers $124 billion in bailouts and interest.14

  Environmental inaction was also particularly notable. During Reagan’s first term, enforcement of strip-mining laws was down 62 percent, and enforcement of hazardous waste laws was down 50 percent. Permissible exposure levels to chemicals were increased. In 1982 alone, the government approved 97 percent of business petitions for an “emergency” ability to use otherwise banned pesticides. In the mid-1980s, more than 375,000 hazardous waste sites needed remediation, mostly because they threatened to pollute groundwater. During Reagan’s first four years, his EPA cleaned only six of these sites, and in 1985, the administration proposed zero funding for groundwater programs and suggested ending the Superfund program for toxic waste cleanup altogether.15

  Democrats were on board with deregulation too. During the Clinton era, the Telecommunications Act of 1996 deregulated the sector, allowing the consolidation of television and radio stations, increased cross-ownership of cable and broadcast networks, and higher cable prices. The Gramm-Leach-Bliley Act of 1999 deregulated the financial sector, repealing parts of the New Deal–era Glass-Steagall regime, which barred investment banks, depository banks, and insurance companies from consolidating. Regulators had watered down the Glass-Steagall regime over the years, but the 1999 law ended a system that had worked since the 1930s. A year later, big banks succeeded in pushing Congress to exempt complex financial products called over-the-counter derivatives from regulation—which would play an important role in the 2008 financial crash.

  Labor also serves as a regulator in the economy—and it, too, came under attack. In Britain, coal miners went on strike in 1984 and gained considerable sympathy across the country. Had the head of the union not made a series of tactical errors, the strike might have brought down Thatcher’s government. But the prime minister had prepared for precisely this occasion—and she refused to cave to the union’s demands. Thatcher’s commitment to fight unions was so strong that she would prevent her government from settling with public sector workers, even if cutting a deal would be financially better for British taxpayers.16

  Although Reagan had been a member of the Screen Actors Guild, he, too, contributed to labor’s decline. Appointees at the National Labor Relations Board and the Department of Labor were part of the strategy, but the marquee event was breaking the air traffic controllers strike. The Professional Air Traffic Controllers Organization (PATCO) sought higher pay given rapidly rising inflation. When they went on strike, Reagan fired eleven thousand air traffic controllers. No one thought Reagan would do it, and the consequences were significant. Strikes had long been one of the most effective tools for labor unions to get managers to bargain. But with a hostile federal government willing to declare war on workers, the number of strikes per year collapsed—and with it the power of the working class.17

 
