Just to give a sense of how extreme a financial crisis we are talking about, here are some statistical charts culled from the pages of the St. Louis Federal Reserve web page.17
Here is the amount of U.S. debt held overseas:

[Chart: U.S. debt held overseas]

Meanwhile, private U.S. banks reacted to the crash by abandoning any pretense that we are dealing with a market economy, shifting all available assets into the coffers of the Federal Reserve itself, which in turn purchased U.S. Treasuries:

[Chart: Federal Reserve holdings of U.S. Treasuries]

This allowed them, through yet another piece of arcane magic that none of us could possibly understand, to end up, after an initial dip of nearly $400 billion, with far larger reserves than they had ever had before.
At this point, some U.S. creditors clearly feel they are finally in a position to demand that their own political agendas be taken into account.
CHINA WARNS U.S. ABOUT DEBT MONETIZATION
Seemingly everywhere he went on a recent tour of China, Dallas Fed President Richard Fisher was asked to deliver a message to Federal Reserve Chairman Ben Bernanke: “stop creating credit out of thin air to purchase U.S. Treasuries.”18
Again, it’s never clear whether the money siphoned from Asia to support the U.S. war machine is better seen as “loans” or as “tribute.” Still, the sudden advent of China as a major holder of U.S. Treasury bonds has clearly altered the dynamic. Some might question why, if these really are tribute payments, the United States’ major rival would be buying Treasury bonds to begin with—let alone agreeing to various tacit monetary arrangements to maintain the value of the dollar, and hence, the buying power of American consumers.19 But I think this is a perfect case in point of why taking a very long-term historical perspective can be so helpful.
From a longer-term perspective, China’s behavior isn’t puzzling at all. In fact it’s quite true to form. The unique thing about the Chinese empire is that it has, since the Han dynasty at least, adopted a peculiar sort of tribute system whereby, in exchange for recognition of the Chinese emperor as world-sovereign, they have been willing to shower their client states with gifts far greater than they receive in return. The technique seems to have been developed almost as a kind of trick when dealing with the “northern barbarians” of the steppes, who always threatened Chinese frontiers: a way to overwhelm them with such luxuries that they would become complacent, effeminate, and unwarlike. It was systematized in the “tribute trade” practiced with client states like Japan, Taiwan, Korea, and various states of Southeast Asia, and for a brief period from 1405 to 1433, it even extended to a world scale, under the famous eunuch admiral Zheng He. He led a series of seven expeditions across the Indian Ocean, his great “treasure fleet”—in dramatic contrast to the Spanish treasure fleets of a century later—carrying not only thousands of armed marines, but endless quantities of silks, porcelain, and other Chinese luxuries to present to those local rulers willing to recognize the authority of the emperor.20 All this was ostensibly rooted in an ideology of extraordinary chauvinism (“What could these barbarians possibly have that we really need, anyway?”), but, applied to China’s neighbors, it proved extremely wise policy for a wealthy empire surrounded by much smaller but potentially troublesome kingdoms. In fact, it was such wise policy that the U.S. government, during the Cold War, more or less had to adopt it, creating remarkably favorable terms of trade for those very states—Korea, Japan, Taiwan, certain favored allies in Southeast Asia—that had been the traditional Chinese tributaries; in this case, in order to contain China.21
Bearing all this in mind, the current picture begins to fall easily back into place. When the United States was far and away the predominant world economic power, it could afford to maintain Chinese-style tributaries. Thus these very states, alone amongst U.S. military protectorates, were allowed to catapult themselves out of poverty and into first-world status.22 After 1971, as U.S. economic strength relative to the rest of the world began to decline, they were gradually transformed back into a more old-fashioned sort of tributary. Yet China’s getting in on the game introduced an entirely new element. There is every reason to believe that, from China’s point of view, this is the first stage of a very long process of reducing the United States to something like a traditional Chinese client state. And of course, Chinese rulers are not, any more than the rulers of any other empire, motivated primarily by benevolence. There is always a political cost, and what that headline marked was the first glimmerings of what that cost might ultimately be.
All that I have said so far merely serves to underline a reality that has come up constantly over the course of this book: that money has no essence. It’s not “really” anything; therefore, its nature has always been and presumably always will be a matter of political contention. This was certainly true throughout earlier stages of U.S. history, incidentally—as the endless nineteenth-century battles between goldbugs, greenbackers, free bankers, bi-metallists, and silverites so vividly attest—as does the fact that American voters were so suspicious of the very idea of central banks that the Federal Reserve system was only created on the eve of World War I, more than two centuries after the Bank of England. Even the monetization of the national debt is, as I’ve already noted, double-edged. It can be seen—as Jefferson saw it—as the ultimate pernicious alliance of warriors and financiers; but it also opened the way to seeing government itself as a moral debtor, and freedom as something literally owed to the nation. Perhaps no one put it so eloquently as Martin Luther King Jr., in his “I Have a Dream” speech, delivered on the steps of the Lincoln Memorial in 1963:
In a sense we’ve come to our nation’s capital to cash a check. When the architects of our republic wrote the magnificent words of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men, yes, black men as well as white men, would be guaranteed the “unalienable Rights” of “Life, Liberty and the pursuit of Happiness.” It is obvious today that America has defaulted on this promissory note, insofar as her citizens of color are concerned. Instead of honoring this sacred obligation, America has given the Negro people a bad check, a check which has come back marked “insufficient funds.”
One can see the great crash of 2008 in the same light—as the outcome of years of political tussles between creditors and debtors, rich and poor. True, on a certain level, it was exactly what it seemed to be: a scam, an incredibly sophisticated Ponzi scheme designed to collapse in the full knowledge that the perpetrators would be able to force the victims to bail them out. On another level it could be seen as the culmination of a battle over the very definition of money and credit.
By the end of World War II, the specter of an imminent working-class uprising that had so haunted the ruling classes of Europe and North America for the previous century had largely disappeared, because class war had been suspended by a tacit settlement. To put it crudely: the white working class of the North Atlantic countries, from the United States to West Germany, were offered a deal. If they agreed to set aside any fantasies of fundamentally changing the nature of the system, then they would be allowed to keep their unions, enjoy a wide variety of social benefits (pensions, vacations, health care …), and, perhaps most important, through generously funded and ever-expanding public educational institutions, know that their children had a reasonable chance of leaving the working class entirely. One key element in all this was a tacit guarantee that increases in workers’ productivity would be met by increases in wages: a guarantee that held good until the late 1970s. Largely as a result, the period saw both rapidly rising productivity and rapidly rising incomes, laying the basis for the consumer economy of today.
Economists call this the “Keynesian era,” since it was a time in which John Maynard Keynes’ economic theories, which already formed the basis of Roosevelt’s New Deal in the United States, were adopted by industrial democracies pretty much everywhere. With them came Keynes’ rather casual attitude toward money. The reader will recall that Keynes fully accepted that banks do, indeed, create money “out of thin air,” and that for this reason, there was no intrinsic reason that government policy should not encourage this during economic downturns as a way of stimulating demand—a position that had long been dear to the hearts of debtors and anathema to creditors.
Keynes himself had in his day been known to make some fairly radical noises, for instance calling for the complete elimination of that class of people who lived off other people’s debts—the “euthanasia of the rentier,” as he put it—though all he really meant by this was their elimination through a gradual reduction of interest rates. As with so much of Keynesianism, this was much less radical than it first appeared. Actually, it was thoroughly in the great tradition of political economy, hearkening back to Adam Smith’s ideal of a debtless utopia, and especially to David Ricardo’s condemnation of landlords as parasites, their very existence inimical to economic growth. Keynes was simply proceeding along the same lines, seeing rentiers as a feudal holdover inconsistent with the true spirit of capital accumulation. Far from being a revolution, he saw this as the best way of avoiding one:
I see, therefore, the rentier aspect of capitalism as a transitional phase which will disappear when it has done its work. And with the disappearance of its rentier aspect much else in it besides will suffer a sea-change. It will be, moreover, a great advantage of the order of events which I am advocating, that the euthanasia of the rentier, of the functionless investor, will be nothing sudden … and will need no revolution.23
When the Keynesian settlement was finally put into effect, after World War II, it was offered only to a relatively small slice of the world’s population. As time went on, more and more people wanted in on the deal. Almost all of the popular movements of the period from 1945 to 1975, even perhaps revolutionary movements, could be seen as demands for inclusion: demands for political equality that assumed equality was meaningless without some level of economic security. This was true not only of movements by minority groups in North Atlantic countries who had first been left out of the deal—such as those for whom Dr. King spoke—but what were then called “national liberation” movements from Algeria to Chile, or, finally, and perhaps most dramatically, in the late 1960s and 1970s, feminism. At some point in the ’70s, things reached a breaking point. It would appear that capitalism, as a system, simply cannot extend such a deal to everyone. Quite possibly it wouldn’t even remain viable if all its workers were free wage laborers; certainly it will never be able to provide everyone in the world the sort of life lived by, say, a 1960s auto worker in Michigan or Turin with his own house, garage, and children in college—and this was true even before so many of those children began demanding less stultifying lives. The result might be termed a crisis of inclusion. By the late 1970s, the existing order was clearly in a state of collapse, plagued simultaneously by financial chaos, food riots, oil shock, widespread doomsday prophecies of the end of growth and ecological crisis—all of which, it turned out, proved to be ways of putting the populace on notice that all deals were off.
The moment that we start framing the story this way, it’s easy to see that the next thirty years, the period from roughly 1978 to 2009, follows nearly the same pattern. Except that the deal, the settlement, had changed. Certainly, when both Ronald Reagan in the United States and Margaret Thatcher in the UK launched a systematic attack on the power of labor unions, as well as on the legacy of Keynes, it was a way of explicitly declaring that all previous deals were off. Everyone could now have political rights—even, by the 1990s, most everyone in Latin America and Africa—but political rights were to become economically meaningless. The link between productivity and wages was chopped to bits: productivity rates have continued to rise, but wages have stagnated or even atrophied:24

[Chart: productivity versus wages]
This was accompanied, at first, by a return to “monetarism”: the doctrine that even though money was no longer in any way based in gold, or in any other commodity, government and central-bank policy should be primarily concerned with carefully controlling the money supply to ensure that it acted as if it were a scarce commodity. At the same time, the financialization of capital meant that most money being invested in the marketplace was completely detached from any relation to production or commerce at all, having become pure speculation.
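The logic of that doctrine can be made explicit with the textbook equation of exchange (a standard formulation, invoked here purely for illustration; it is not one the text itself cites):

MV = PY

where M is the money supply, V the velocity at which it circulates, P the price level, and Y real output. On the monetarist assumption that V is roughly stable, controlling the growth of M pins down nominal income PY: hence the money-supply growth targets that central banks of the period announced in place of interest-rate or employment goals.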
All this is not to say that the people of the world were not being offered something: just that, as I say, the terms had changed. In the new dispensation, wages would no longer rise, but workers were encouraged to buy a piece of capitalism. Rather than euthanize the rentiers, everyone could now become rentiers—effectively, could grab a chunk of the profits created by their own increasingly dramatic rates of exploitation. The means were many and familiar. In the United States, there were 401(k) retirement accounts and an endless variety of other ways of encouraging ordinary citizens to play the market; but at the same time, encouraging them to borrow. One of the guiding principles of Thatcherism and Reaganism alike was that economic reforms would never gain widespread support unless ordinary working people could at least aspire to owning their own homes; to this was added, by the 1990s and 2000s, endless mortgage-refinancing schemes that treated houses, whose value it was assumed would only rise, “like ATMs”—as the popular catchphrase had it, though it turns out, in retrospect, it was really more like credit cards. Then there was the proliferation of actual credit cards, juggled against one another. Here, for many, “buying a piece of capitalism” slithered undetectably into something indistinguishable from those familiar scourges of the working poor: the loan shark and the pawnbroker. It did not help here that in 1980, U.S. federal usury laws, which had previously limited interest to between 7 and 10 percent, were eliminated by act of Congress. Just as the United States had managed to largely get rid of the problem of political corruption by making the bribery of legislators effectively legal (it was redefined as “lobbying”), so the problem of loan-sharking was brushed aside by making real interest rates of 25 percent, 50 percent, or even in some cases (for instance for payday loans) 120 percent annually, once typical only of organized crime, perfectly legal—and therefore, enforceable no longer by just hired goons and the sort of people who place mutilated animals on their victims’ doorsteps, but by judges, lawyers, bailiffs, and police.25
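To get a sense of the arithmetic involved (the figures here are purely illustrative, not drawn from the text): a principal P rolled over at an annual rate r for t years compounds to

B = P(1 + r)^t

so a $500 debt carried for two years under the old 10 percent ceiling comes to $500 × (1.10)² = $605, while the same $500 at a 120 percent payday rate comes to $500 × (2.20)² = $2,420, nearly five times what was borrowed.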
Any number of names have been coined to describe the new dispensation, from the “democratization of finance” to the “financialization of everyday life.”26 Outside the United States, it came to be known as “neoliberalism.” As an ideology, it meant that not just the market, but capitalism (I must continually remind the reader that these are not the same thing) became the organizing principle of almost everything. We were all to think of ourselves as tiny corporations, organized around that same relationship of investor and executive: between the cold, calculating math of the banker, and the warrior who, indebted, has abandoned any sense of personal honor and turned himself into a kind of disgraced machine.
In this world, “paying one’s debts” can well come to seem the very definition of morality, if only because so many people fail to do it. For instance, it has become a regular feature of many sorts of business in America that large corporations, and even some small businesses, faced with a debt, will almost automatically see what happens if they simply do not pay—complying only if reminded, goaded, or presented with some sort of legal writ. In other words, the principle of honor has been almost completely removed from the marketplace.27 As a result, perhaps, the whole subject of debt has become surrounded by a halo of religion.
Actually, one might even speak of a double theology, one for the creditors, another for the debtors. It is no coincidence that the new phase of American debt imperialism has also been accompanied by the rise of the evangelical right, who—in defiance of almost all previously existing Christian theology—have enthusiastically embraced the doctrine of “supply-side economics”: that creating money and effectively giving it to the rich is the most Biblically appropriate way to bring about national prosperity. Perhaps the most ambitious theologian of the new creed was George Gilder, whose book Wealth and Poverty became a best-seller in 1981, at the very dawn of what came to be known as the Reagan Revolution. Gilder’s argument was that those who felt that money could not simply be created were mired in an old-fashioned, godless materialism, failing to realize that just as God could create something out of nothing, His greatest gift to humanity was creativity itself, which proceeded in exactly the same way. Investors could indeed create value out of nothing, through their willingness to accept the risk entailed in placing their faith in others’ creativity. Rather than seeing the imitation of God’s powers of creation ex nihilo as hubris, Gilder argued that it was precisely what God intended: the creation of money was a gift, a blessing, a channeling of grace; a promise, yes, but not one that need ever be fulfilled, since, even as the bonds are continually rolled over, through faith (“in God we trust” again) their value becomes reality: