The Map and the Territory
IN THE AFTERMATH
As I noted in Chapter 7, an immediate, broad, activist financial response to the Lehman collapse was necessary to stabilize America’s fractured markets after the largest structural rupture in eight decades. But instead of backing away and allowing markets to rebalance in early 2009, we counterproductively, under the auspices of the Dodd-Frank Act, designated a number of financial businesses as “systemically important.” Although the Dodd-Frank Act explicitly says it is focused on eliminating financial institutions that are “too big to fail,” it does not do so.25 How can government, having recently bailed out a number of banks, ever allow an institution designated as “systemically important” to fail? At the end of this road is crony capitalism (see page 266).
Propping up “zombie companies” (as near-bankrupt companies have come to be known) diverts part of the nation’s savings from funding productive technologies to continuing the support of the obsolescent technologies of our less productive “systemically important” firms. Productive firms that are well run do not seek or need government support; hence, designation as too big to fail (TBTF) is superfluous. It matters only to those behemoths that are less productive. JPMorgan was pressured in 2008 to take unneeded bailout money because regulators worried that unless the bailouts were seen as an all-encompassing program of assistance, irrespective of need, failing firms would refuse to accept such unprecedented assistance for fear of being permanently labeled second-rate.
Our bankruptcy statutes, with all of their shortcomings, have been a major contribution to the flexibility and success of our economy for generations. Once financial markets had stabilized and debtor-in-possession financing became feasible, these zombie banks should have been put through the normal, time-tested process of balance sheet restructuring rather than the more politically responsive procedures of the postcrisis years. Guarantees backed by sovereign credit are an addictive narcotic. This is especially the case when their heavy costs (in terms of lessened competitiveness) are delayed and, even then, insidiously hard to isolate in clear view. Accordingly, federal credit guarantees have in recent years become regulators’ solution of choice for most financial problems. But unscored and largely unnoticed is their negative effect on economic flexibility, which is so critical to economic growth.
As I noted in Chapter 5, seventeen systemically important financial institutions (SIFIs) have been designated by at least one major regulator as too big to fail, effectively rendering them guaranteed by the federal government. But I fear that even sound banks that are able to compete and survive on their own in the roughest times, if designated as too big to fail, with time will succumb to the awareness that if they take foolish risks, they will not be subjected to extreme punishment by markets. This means that even more of the nation’s scarce savings will be diverted to support less productive institutions, both financial and nonfinancial, depriving productive technologies of funding. I emphasize in Chapter 9 that when the U.S. government runs a deficit ex post,26 some other sector of the American economy must be deprived of resources of an equal amount. Alternatively, we would need to borrow the equivalent from foreign savers to fund those who are too big to fail. Those companies should be allowed to fail and, if necessary, liquidate.
As has occurred with Fannie Mae and Freddie Mac, the market has already accorded these TBTF institutions subsidized funding. This is evident in the cost of funding of large banking institutions relative to competing smaller institutions not favored with subsidized borrowings. As I noted in Chapter 5, IMF researchers in a recent working paper estimated “the overall funding cost advantage of [global] SIFIs as approximately 60 bp in 2007 and 80 bp in 2009.”27 The top forty-five U.S. banks in this study exhibited about the same degree of support as the global average. I also referenced Federal Reserve research that estimated the comparable market subsidy to Fannie Mae and Freddie Mac at 40 bp.28 In competitive financial markets, 40 bp to 60 bp is a very large advantage.
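To put these basis-point estimates in perspective, a funding-cost advantage translates directly into dollars of subsidy per year. The sketch below is purely illustrative: the $1.5 trillion liability figure is a hypothetical round number and is not drawn from the IMF or Federal Reserve studies cited above.

```python
# Illustrative arithmetic only: converts a funding-cost advantage, measured in
# basis points, into a dollar subsidy per year. The liability figure is a
# hypothetical round number, not an estimate from the studies cited in the text.

def annual_subsidy(funded_liabilities: float, advantage_bp: float) -> float:
    """Dollar value per year of a funding-cost advantage of `advantage_bp`
    basis points (1 bp = 0.01 percentage point) on `funded_liabilities`."""
    return funded_liabilities * advantage_bp / 10_000

liabilities = 1.5e12  # hypothetical $1.5 trillion of subsidized funding
for bp in (40, 60, 80):
    savings = annual_subsidy(liabilities, bp)
    print(f"{bp} bp advantage is worth about ${savings / 1e9:.0f} billion per year")
```

On these assumptions, even the low end of the estimated range amounts to several billion dollars a year of implicit support for a single large institution.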
Such a market-based subsidy will enable a bank or other financial intermediary to attract part of the nation’s savings to fund its operations, even if its policies and portfolio, unsubsidized, would fail. Even more worrisome, the market players are beginning to conjecture that, in the event of the next crisis, most of the American financial system effectively would be guaranteed by the U.S. government.
The net effect of the politics of the financial crisis, especially in the form of the Dodd-Frank Act, will be to lower still further the long-term path of the nation’s productivity and hence the growth of standards of living. Regrettably, a slowdown in productivity growth from the path it would otherwise have taken is not politically visible until it becomes obvious in a permanent loss of America’s share of global GDP.
THE “UNTHINKABLE”
Prior to the bailout of Bear Stearns, and later General Motors, Chrysler, and AIG, the notion that a large iconic American nonbank corporation would not be allowed to fail was rarely embodied in anybody’s risk management template.29 Few envisioned a major corporation (aside from Fannie Mae and Freddie Mac) being too big to fail. Virtually all risk managers perceived the future as largely determined by competitive markets operating under a rule of law—until 2008, that is.
Henceforth it will be exceedingly difficult to contain the range of possible policy activism in the face of even a modest economic disruption. Short of a significant, credible shift in policy direction, promises of future government restraint will not be believed by markets. That became evident, postcrisis, in the failure of elevated risk spreads on liquid long-term debt to return to pre-2007 levels.30 Uncertainty, and the variance it adds to the potential outcomes of actions by private firms, makes the world an economically riskier place.
TIMES HAVE SURELY CHANGED
On May 10, 2012, JPMorgan, America’s largest bank, reported a loss of $2 billion that resulted from a failed hedging operation.31 The loss barely reduced JPMorgan’s net worth. Common shareholders of the bank suffered a loss. Depositors of the bank and taxpayers did not. Nonetheless, Jamie Dimon, JPMorgan’s CEO, was called to testify before the Senate Banking Committee. As a director of JPMorgan between 1978 and 1987, I was aware of numerous sizable losses that made shareholders unhappy. But I do not recall any bank regulator publicly commenting on the issue, or the JPMorgan CEO being called to testify. The loss was an issue solely among JPMorgan’s shareholders, its board, and its management.
Yet the world has so changed that this recent loss was implicitly considered a threat to taxpayers. Why? Because of the poorly kept secret of the marketplace that JPMorgan will not be allowed to fail any more than Fannie and Freddie have been allowed to fail. In short, JPMorgan, much to its chagrin, I am sure, has become a de facto government-sponsored enterprise no different from Fannie Mae prior to its conservatorship. When adverse events depleted JPMorgan’s shareholder equity, it was perceived by the market that its liabilities were effectively, in the end, taxpayer liabilities. Otherwise why the political umbrage and congressional hearings following the reported loss?
Many, if not most, of the seventeen systemically important American banks are market competitive and in no immediate danger of failing to meet their obligations. History tells us, however, that such dependence on government protection, as I noted earlier, will dull their competitive drive and can eventually lead to those institutions’ becoming wards of the state.
CRONY CAPITALISM
Crony capitalism emerges when government has wide discretion in controlling markets and favors some private practitioners over others. Those companies effectively become well-compensated tools of a state that shields them from the winds of creative destruction. The quid pro quo is important political support for the state from the favored firms. Such firms have all the essential characteristics of companies that are too big to fail.
My earliest memory of what we now call crony capitalism is President Eisenhower’s famous retirement speech on the dangers of the emergence of a military-industrial complex. Today crony capitalism’s effect on modern global markets is undeniable. Such firms dominate the economy of Russia, where fealty to President Putin counts greatly. Widespread publicity regarding relatives of China’s political leaders who have accumulated large fortunes, allegedly owing to their political connections, has been an embarrassment to the Communist Party. Crony capitalism in its various guises affects almost all countries to a greater or lesser extent. The World Bank attempts to measure and rank countries by the degree to which government has “control of corruption”—a major aspect of which is cronyism. At the top of its list of crony capitalist countries for 2010 were Venezuela, Russia, Indonesia, China, India, and Argentina. The least afflicted are the Scandinavian countries, Switzerland, New Zealand, and Singapore. Cronyism of the blatant type that exists in Russia is rare in the United States because our press would pounce on any evidence of it. Nonetheless, we need to be vigilant.
THE RIGIDITIES OF ENTITLEMENT ECONOMICS
The rise of the role of government in the United States has coincided with, and is doubtless a cause of, increasing market rigidity. Competitive flexibility is a necessary characteristic of an innovative growing economy, and we are at the edge of losing it. According to the World Economic Forum, we have fallen in competitive rankings from first to seventh since 2008. The momentum of innovation of past generations is still working its way through large segments of our economy, but we are increasingly living off the seed corn of past harvests, and unless we reverse the inexorable increase in the role of government, we will surely lose our preeminence as the undisputed global economic leader.
The hegemony of the United States was evident to me as I spent more than two decades32 in government when our country was still the undisputed “special nation.” Our status was most evident in international gatherings at the IMF, the OECD, and meetings of the finance ministers and central bankers in Basel, Switzerland, and, in fact, all over the world. In recent years, according to Americans participating in such gatherings, that hegemony has faded measurably. However, there is no apparent successor to the role of special nation. I believe we can regain our leadership role, provided we end our self-destructive policies. I will elaborate on this in the remaining chapters.
TWELVE
MONEY AND INFLATION
The spectacle of American central bankers’ trying to press the inflation rate higher in the aftermath of the 2008 crisis is virtually without precedent. The only previous case I can recall was during the 1930s. The gold standard was abandoned in 1933 because it seemed to be depressing the general price level and inhibiting recovery out of the Great Depression. More important, the restrictive nature of gold undermined the fiscal flexibility required by the New Deal’s welfare state.
After a century and a half of stable prices (when the dollar was convertible into silver and gold), fiat money1 price inflation took hold.2 Between 1933 and 2008, the Consumer Price Index of the Bureau of Labor Statistics (BLS) increased more than fourteenfold, an average annual rise of 3.4 percent. Most of the rest of the developed world abandoned gold during the 1930s and had inflation experiences similar to that of the United States. Central banks were, as a consequence, ceded the role of controlling the supply of money, and hence prices. Most economists at that time embraced the notion that the long-term growth rate of real GDP was facilitated by “a little inflation,” and accordingly, central bankers made it their goal to keep the rate of inflation down rather than to keep the level of prices unchanged.3 That policy, of course, acquiesced in an ever-rising price level.
In both the ancient and modern worlds, gold and silver have been universally accepted as means of payment. Their values are perceived as intrinsic and, unlike every other form of money, they do not need the further backing of credit guarantees by a third party. I have always found the status of these precious metals in our societies puzzling.4 But by the seventeenth century, gold and silver had started to become too bulky to handle in the normal course of business. That gave rise to paper currencies and eventually bank deposits (see Box 12.1).5 Originally, bank money consisted of promises to redeem a note on demand in gold or silver; but it soon became apparent that sovereign government promises to pay eventually, rather than on demand, still made these fiat monies acceptable, within limits, as a medium of exchange.
But because they are acceptable as a medium of exchange, in addition to facilitating transactions, they can also serve as a store of value. The amount of cash needed to physically facilitate transactions is almost always quite small—amounts hung up in mail float, for example. The rest of money balances along with a large number of other assets—bonds, stocks, and even homes—are held as a store of value.
BOX 12.1: REDEEMABLE PAPER CURRENCY
As trade volumes mounted, the physical quantities of precious metals required for transactions became impractical to handle, and by the seventeenth century paper currency emerged in the form of “warehouse receipts” (banknotes) for stored gold. Among London goldsmiths, the warehousers of gold, paper promises to pay in gold rapidly began to displace the metal itself. It soon became obvious to the goldsmiths that much of the gold lay idle most of the time, and because gold is fungible, they realized they could lend out any part of it. One depositor’s gold was indistinguishable from that of another. Whose gold they lent out did not matter. They could issue far more warehouse receipts than they had warehoused gold. Fractional reserve banking was born.
Clearly, the average length of time gold deposits lay idle in storage determined the extent to which warehouse receipts could be expanded beyond the actual amount of gold on deposit. The longer gold depositors left their bullion idle, the larger the proportion of gold “currency” that could safely be loaned out. The well-established goldsmiths soon found that because withdrawals of stored gold in any month were only a modest fraction of what they held, they could safely re-lend 50–80 percent of their gold without fear of being unable to meet the withdrawals of any depositor. The goldsmiths eventually became today’s commercial bankers, and their gold reserves were replaced by commercial bank deposits at central banks. This principle, in modern guise, holds to this day.
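The arithmetic limit implicit in Box 12.1 is the textbook money-multiplier relationship: if the proceeds of the goldsmiths’ loans tend to be redeposited, a given stock of gold can back total claims equal to the gold divided by the fraction held in reserve. The sketch below is a minimal illustration of that relationship; the reserve ratios are hypothetical, chosen only to mirror the 50–80 percent re-lending range mentioned in the box.

```python
# Minimal sketch of the fractional-reserve arithmetic described in Box 12.1.
# Assumes loan proceeds are redeposited with the goldsmith, so claims can be
# expanded up to gold / reserve_ratio. The reserve ratios below are
# illustrative, not historical estimates.

def max_receipts(gold_on_deposit: float, reserve_ratio: float) -> float:
    """Upper bound on warehouse receipts a stock of gold can support when a
    fixed fraction of outstanding claims is kept on hand as reserves."""
    return gold_on_deposit / reserve_ratio

gold = 100.0  # ounces of gold actually sitting in the vault
for ratio in (0.5, 0.2):  # keep 50 percent or 20 percent of claims in reserve
    print(f"reserve ratio {ratio:.0%}: {gold:.0f} oz of gold can back "
          f"{max_receipts(gold, ratio):.0f} oz worth of receipts")
```

The lower the fraction kept idle in the vault, the larger the pyramid of paper claims a fixed hoard of gold can support, which is the essence of the goldsmiths’ discovery.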
THE HOLDING PERIOD
The average holding period between the acquisition of money and its use to purchase a good, service, or asset is, of course, determined in the end by the holder’s propensity to save. People living hand to mouth have no ability to save, and their holding period for money, of necessity, will be quite short. At higher income levels, holding periods for all forms of assets, money included, are quite long. In all cases, the data show, money will be held only so long as people do not expect it to lose a noticeable part of its value—that is, so long as they don’t expect product prices to rise inordinately. Only when idle store-of-value balances are spent do prices rise. When price rises become extreme, money holders figuratively rush for the exits; the average length of the currency’s holding period collapses, and hyperinflation takes hold. Symmetrically, when people increase their holding period for money balances, by definition, spending and prices fall. But so do interest rates, in an extension of Keynes’s liquidity preference paradigm.
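The relationship between holding periods, spending, and prices can be made concrete with the equation of exchange, MV = PQ, if velocity is treated as the reciprocal of the average holding period. The sketch below uses entirely hypothetical numbers and is meant only to show the direction of the effect: when holders cut the time they are willing to sit on money, velocity rises, and with output fixed, so does the price level.

```python
# Hypothetical illustration of the holding-period argument via the equation of
# exchange, M * V = P * Q, with velocity V taken as 1 / (average holding period).
# All figures are made up for the sake of the example.

money_stock = 1_000.0  # M: money balances outstanding
real_output = 5_000.0  # Q: real transactions per year

for holding_period_years in (0.25, 0.125, 0.05):  # three months, six weeks, roughly 2.5 weeks
    velocity = 1.0 / holding_period_years               # V
    price_level = money_stock * velocity / real_output  # P = M * V / Q
    print(f"holding period {holding_period_years:.3f} yr -> "
          f"velocity {velocity:.1f}, price level {price_level:.2f}")
```

On these assumptions, each halving of the holding period doubles the price level, which is the mechanism behind the rush for the exits described above.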
FIAT CURRENCY
It is the variability of money-holding periods that has played so large a role in the history of fiat money and finance in general, and, as we shall see, in the structure of models of the financial sector. The soon-to-be United States financed its Revolutionary War, starting in 1775, with paper money (“Continentals”), which people initially accepted in payment for goods and services. It was used, for example, by General Washington to buy goods and services for his fledgling army. For a while, most recipients of those proceeds apparently respent them at much the same pace they had with gold before the hostilities began. (That is, the average holding period of the currency was apparently relatively stable.) But the huge volume of fiat issuance (nearly $250 million) engulfed the market, and recipients soon found themselves with more paper money than they needed; they presumably began to increase their consumption expenditures and asset buying, shortening the overall average holding period. People soon began to spend the currency as quickly as they received it. Their average holding period collapsed. Within three years, the purchasing power of the Continental had fallen to less than a fifth of its face value. By the spring of 1781, valueless, it ceased to circulate. It was not, so to speak, “worth a Continental.”
It is not surprising that when adopted, the Constitution of the United States reflected a hard money ethos. United States currency in circulation through the first six decades of the nineteenth century was confined to gold and silver coin, as well as banknotes redeemable for specie, such as those issued by the First and Second Banks of the United States.
Our history, however, appears to confirm that people are still willing to hold fiat paper money despite losing up to, say, 5 percent or even 10 percent a year in purchasing power. But beyond that, fiat money holders evidence increasing unease, especially if their incomes are not rising commensurately. When the fiat money printing presses speed up, money holders bail out.6 As money turnover accelerates, the currency becomes increasingly unacceptable. In the extreme—the Brazilian financial crisis of 1993–94 comes to mind—the holding period shrinks, as prices explode, to the time it takes to accept funds and then unload them.7
THE CHOSEN
This raises a second issue of money and finance: From whom are people willing to accept money in exchange for goods, services, and assets? Anybody can create money simply by issuing a personal IOU, but can he or she get anybody else to accept it as payment for goods or other financial instruments? Sovereign nations with taxing power have proved the most credible purveyors of fiat money, a credibility reflected in their ability to float fiat money debt (IOUs) at interest rates lower than any borrower in their private sectors can obtain. Thus, governments or their central banks can print money and expect it to be held by the public as a store of value for an extended period of time—much longer than Continentals were held, but far shorter than gold or silver. Hence, governments can run budget deficits because they are able to find willing buyers for their notes and bonds.