Regulations Embodying a Forecast Fail with Regularity
The crisis has demonstrated that neither bank regulators nor anyone else can consistently and accurately forecast whether, for example, subprime mortgages will turn toxic or to what degree, or whether a particular tranche of a collateralized debt obligation will default, much less whether the financial system as a whole will seize up. A large fraction of such difficult forecasts will invariably be proved wrong. Regulators can sometimes identify heightened probabilities of underpriced risk and the existence of bubbles. But they almost certainly cannot, except by chance, effectively time the onset of crisis.37 This should not come as a surprise.
A financial crisis is almost always accompanied by an abrupt and sharp decline in the price of income-producing assets. That decline is usually induced by a dramatic spike in the discount rate on expected income flows as market participants swing from euphoria to fear. Implicit in any sharp price change is that it is unanticipated by the mass of market participants, for were it otherwise, the price imbalances would have been arbitraged away.
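The arithmetic behind that swing is worth making explicit. A stylized present-value sketch (the level perpetuity and the 5 percent and 10 percent discount rates are illustrative assumptions, not figures from the text):

```latex
% Price of an income-producing asset as discounted expected cash flows
P \;=\; \sum_{t=1}^{\infty} \frac{E[CF_t]}{(1+r)^t}
\;=\; \frac{CF}{r} \quad \text{(level perpetuity paying } CF \text{)}

% If fear drives the discount rate from 5 percent to 10 percent:
P_{\text{euphoria}} \;=\; \frac{CF}{0.05} \;=\; 20\,CF, \qquad
P_{\text{fear}} \;=\; \frac{CF}{0.10} \;=\; 10\,CF
```

Identical expected income, repriced at double the discount rate, loses half its value; no deterioration in fundamentals is required to halve asset prices.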
Indeed, for years leading up to the increase in foreclosure starts that began in the summer of 2006, it was widely expected that the precipitating event of the “next” crisis would be a sharp fall in the dollar in response to the dramatic increase in the U.S. current account deficit that began in 2002. The dollar accordingly came under heavy selling pressure. The rise in the euro-dollar exchange rate from around 1.10 in the spring of 2003 to 1.30 at the end of 2004 appears to have gradually arbitraged away the presumed dollar trigger of the “next” crisis. The U.S. current account deficit did not play a prominent direct role in the timing of the 2008 crisis, although, perhaps for that reason, it may in the next.
Many analysts argue that forecasting is not required for regulation. A systemic regulator, they hold, could effectively adjust capital and liquidity requirements to match the stage of the business cycle. Properly calibrated, such requirements presumably could be effective in assuaging imbalances. But cycles are not uniform, and it is difficult to judge at any point in time exactly where we are in the cycle. For example, the low of the unemployment rate at cyclical peaks (as identified by the National Bureau of Economic Research) since 1948 has ranged between 2.6 and 7.2 percent. Would policy makers have judged a turn in the business cycle when, for example, the unemployment rate rose to 5.8 percent in April 1995, up from 5.4 percent in March of that year? In the event, the unemployment rate soon reversed itself and continued to fall for five more years.
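To see how easily such a cycle-keyed rule misfires, consider a minimal sketch of a naive trigger (a hypothetical rule constructed for illustration, not one proposed in the text) that calls a cyclical peak whenever unemployment rises a few tenths above its recent low. The March and April 1995 figures are from the text; the surrounding months are illustrative stand-ins:

```python
# Hypothetical "cyclical turn" trigger applied to monthly unemployment
# rates. Only 1995-03 (5.4%) and 1995-04 (5.8%) come from the text;
# the other values are illustrative placeholders.
rates = {
    "1995-01": 5.6, "1995-02": 5.4, "1995-03": 5.4,
    "1995-04": 5.8,  # the one-month jump cited in the text
    "1995-05": 5.6, "1995-06": 5.6, "1995-07": 5.7,
}

def naive_turn_signal(series, threshold=0.3):
    """Flag a 'cyclical peak' whenever the rate rises `threshold`
    percentage points above the minimum observed so far -- a
    deliberately crude rule."""
    low = float("inf")
    signals = []
    for month, rate in series.items():
        low = min(low, rate)
        if rate - low >= threshold:
            signals.append(month)
    return signals

print(naive_turn_signal(rates))  # ['1995-04', '1995-07']
# Both are false alarms: the expansion in fact continued for five
# more years, so capital requirements keyed to this rule would have
# tightened against a downturn that never came.
```

Any fixed threshold faces the same dilemma: set it low and it fires on noise, as here; set it high and it confirms the downturn only after the fact.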
The Federal Reserve has been concerned for years about the ability of regulatory supervisors and examiners to foresee emerging problems that have eluded internal bank auditing systems and independent auditors. I remarked in 2000 before the American Bankers Association that “in recent years rapidly changing technology has begun to render obsolete much of the bank examination regime established in earlier decades. Bank regulators are perforce being pressed to depend increasingly on greater and more sophisticated private market discipline, the still most effective form of regulation. Indeed, these developments reinforce the truth of a key lesson from our banking history—that private counterparty supervision remains the first line of regulatory defense.”38 Regrettably, that first line of defense also failed in 2008.
A century ago, examiners could appraise individual loans and judge their soundness.39 But in today’s global lending environment, how does a U.S. bank examiner judge the credit quality of, say, a loan to a Russian bank, and hence of the loan portfolio of that bank? That, in turn, would require vetting the Russian bank’s counterparties and those counterparties’ counterparties, all to judge the soundness of a single loan. In short, a bank examiner cannot make a judgment and neither can a credit-rating agency. How deep into the myriad layers of examination is enough for certification?
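The layered-vetting problem compounds geometrically. A schematic sketch, assuming (purely for illustration; the branching factor is not a measured figure) that each institution has on the order of twenty significant counterparties:

```python
# Schematic illustration of why layered counterparty vetting explodes.
# A branching factor of 20 counterparties per institution is a
# hypothetical assumption, not data from the text.
def examinations_needed(branching: int, depth: int) -> int:
    """Number of institutions to vet when each has `branching`
    significant counterparties and certifying one loan requires
    looking `depth` layers down the chain."""
    return sum(branching ** level for level in range(1, depth + 1))

for depth in range(1, 5):
    print(depth, examinations_needed(branching=20, depth=depth))
# 1 20
# 2 420
# 3 8420
# 4 168420  -- four layers deep, certifying a single loan already
#            implies examining on the order of 170,000 balance sheets
```

No examination force scales that way, which is the practical content of the question: certification to any meaningful depth is beyond examiners and rating agencies alike.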
The complexity of our financial system in operation spawns, in any given week, many alleged pending crises that, in the event, never happen, and innumerable allegations of financial misconduct. To examine each such possibility at the level of detail necessary to reach meaningful conclusions would require an examination force many multiples larger than those now in place in any of our banking regulatory agencies. Arguably, at such levels of examination, sound bank lending and its necessary risk taking would be impeded.
The Federal Reserve and other regulators were, and are, therefore required to guess which of the assertions of pending problems or allegations of misconduct should be subject to full scrutiny by a regulatory workforce with necessarily limited examination capacity. But this dilemma means that in the aftermath of an actual crisis, we will find highly competent examiners failing to have spotted a Bernie Madoff. Federal Reserve supervision and evaluation is as good as it gets, even considering the failures of past years. Banks still have little choice but to rely upon counterparty surveillance as their first line of crisis defense.40
SIX
SCHOONER INTELLIGENCE AND THEN SOME
Few Americans in the nineteenth century probably thought much about broader economywide developments, but those who did more likely than not embraced the conventional wisdom of Adam Smith and his followers: Markets were always self-adjusting and bouts of unemployment were temporary. All economic forces were inevitably pressing for full utilization of the nation’s resources. If supply was excessive, prices would fall, and lower prices would galvanize suppressed demand. To be sure, much attention was given to booms and busts, but business discussions of the state of the economy overall were phrased in general, qualitative terms: “Business is depressed” or “money is very tight.” Indeed, a century later researchers produced such “annals” going back to 1790.1 We did have some hard data: The decennial census of population, which increasingly included economic data, gave analysts a general idea for census years of the size and makeup of the overall economy. But timely numbers for the American economy as a whole were scarce prior to the twentieth century.
Government was seen by almost all as having little role in the economy, other than providing security against crime and war and fostering a rule of law with particular emphasis on the protection of property rights. Prior to the Civil War, revenues to sustain these functions came almost exclusively from customs fees and some sales of public lands. The introduction of income taxes had to await the Sixteenth Amendment to the U.S. Constitution, which did not arrive until 1913.
There were no official fiscal and monetary policies in today’s meaning of the terms. The government was not seen as having a role in directing macroeconomic developments. There were, of course, isolated instances in which the U.S. Treasury, and especially the Second Bank of the United States, appeared to have engaged in what we today term fiscal or monetary policy. But these actions were not systematic, and they bore only a pale resemblance to today’s policy response to every pending shortfall of economic performance from its perceived optimum.
ECONOMIC FORECASTING AS PRIVATE ENTERPRISE
Through nearly the first century and a half of American history, such economic forecasting as occurred was chiefly a matter for private enterprise and prior to the Civil War was largely related to agriculture and shipping. What businessmen sought above all was information on the competitive forces that affected them directly. In recent decades, we have grown so accustomed to receiving news in real time that we take it largely for granted. But timely information in the first half of the nineteenth century was precious and costly to acquire in a way no one thinks about today. The thirst for information, especially from financial markets, was unquenchable in those early days of the republic. The flow of imports into the populous U.S. East Coast during that century was much less predictable, altering supply and prices, often in unexpected directions. To address that pressing issue, the Journal of Commerce,2 a primary source of breaking shipping news, cleverly deployed two deepwater schooners to intercept incoming ships to get trade stories ahead of the markets. Schooner intelligence was the information high technology of its day. The wasted effort created by lack of timely information was widespread. When, for example, prior to 1850, similar commodities were traded in separate markets, with participants in each market unaware of what was concurrently happening in the others, the price-making process was decidedly suboptimal.
The major information breakthrough in the United States occurred when Samuel F. B. Morse demonstrated a commercially viable telegraph in 1844.3 Within a decade, Morse’s telegraph blanketed much of the country east of the Mississippi River and, shortly thereafter, much of populated California. But that left a large void in the center of the country. In the late 1850s, it still took more than three weeks, by a combination of telegraph and stagecoach, to convey a message from one coast to the other. Starting in 1860, the foreshortened route of the legendary Pony Express brought the transmission time to under ten days.4 But the information innovation of the Pony Express came to an abrupt end on October 24, 1861, when the transcontinental telegraph was joined and businesses could communicate across the continent in a matter of minutes.
There can be little doubt that in the seventeen years between 1844 and 1861 the contribution of information transfer to national productivity was enormous. From the point of view of financial markets, however, the poor state of transatlantic communications remained a problem. The front page of the New York Times of September 18, 1850, well before the laying of the transatlantic telegraph cable, was typical of that period. The Times announced that “the Royal Mail Steamer Europa arrived at Boston yesterday [and] her mails of September 8th reached this city at an early hour last evening. . . . The news has considerable interest.”
The U.S.-European information merger finally took place (after several false starts) on July 28, 1866, when the transatlantic cable went into operation. On that day, market participants in New York, San Francisco, and London could communicate with one another nearly in real time. Those flows of information greatly facilitated more efficient pricing throughout these geographically diverse locations and undoubtedly improved the allocation of global resources.5
As a consequence of these remarkable new technologies, financial information dissemination finally turned national and global after the Civil War. The Wall Street Journal emerged in 1889, eventually becoming the icon of business news and, of course, the creator of the Dow Jones Industrial Average.
LOCATION MATTERS
Signaling in real time to financial market participants profitable ways to invest a society’s savings enhances productivity and standards of living. But if enhanced information can add value to economic output, so too can reduced costs of transportation. Location matters. In the mid-twentieth century, steel sheets coming off a rolling mill near Pittsburgh had less economic value than the same steel entering an automotive assembly line in Detroit. Minimizing the resources necessary to move goods to final users increases net value added.
Railroad expansion and the Homestead Act,6 during and after the Civil War, opened up the Great Plains to a massive migration that led to the more than doubling of national wheat production. Steel production, the backbone of American industrial advance in the last quarter of the nineteenth century, was propelled forward by the invention of the Bessemer furnace (1856) and the discovery in 1866 of iron ore in Minnesota’s Mesabi Range. By 1890, U.S. iron ore production had quadrupled. Northern Michigan’s canal locks, at Sault Ste. Marie, which opened in 1855 and linked Lake Superior and Lake Huron, enabled Mesabi ore to be shipped to the burgeoning steel industry of the Midwest, and channeled a substantial part of national grain output through the Erie Canal and on to the populous East Coast.
A century earlier, aside from coastal and river shipping, transportation was largely horse-drawn. Horses in those days were a major part of our economy’s capital stock. They, along with oxen and mules, were the key to travel and transport until they were displaced by the railroad and motor vehicles. During the closing years of the nineteenth century, real-time economic information and efficient transportation expanded dramatically, materially reducing the costs of producing the nation’s output. The building of the transcontinental railroad (completed in 1869) reduced the travel time across the continent from six months to six days. And the earlier emergence of the telegraph was to communication what the railroads became to the transportation of goods and people.
In today’s world of global real-time communications and jet travel, we tend to lose sight of what was all too evident to our forebears: that the value of production depends on timely location of goods and the speed at which information on prices, interest rates, and exchange rates becomes available to producers. These seminal relationships are covered further in Chapter 8.
THE BIRTH OF THE AGE OF DATA
By the twentieth century, enough data existed to develop estimates of the size and changes in national economic output. Railcar loadings became a popular means of judging production. (I still use those data to help track industrial activity on a weekly basis.) By the 1920s, economic analysts were citing weekly bank clearings (excluding data from heavily financial New York City) to gauge nonfinancial economic trends nationwide. Regular “bank letters” emerged as a major source for interpretations of ongoing business activity, the forerunner of today’s vast commentary on economic trends. The National City Bank letter, originated by George Roberts in 1914, and the Chase National Bank letter, originated in 1920 under the direction of Benjamin Anderson, became prominent. The Economist, published in London, increasingly included articles on U.S. business. The Federal Reserve started producing measures of economic activity in 1919. Successive improvements led to something close to current practice in 1927.
In the 1930s, Simon Kuznets, an economics professor at the University of Pennsylvania, filled the need for more comprehensive measures of national economic developments. Funded by the National Bureau of Economic Research (NBER), Kuznets assembled time series data on national income back to 1869, broken down by industry, final product, and use—a method significantly more detailed than any previous study had produced. His work set the standard for gross national product (GNP) measurement that was later adopted by the Department of Commerce. Eventually ongoing estimates of national income and GNP became available. Efforts to measure the national economy got a major push from the desire to better understand the depth and nature of the contraction of economic activity during the Great Depression, and then received another impetus from the need for planning during the Second World War.
Until the efforts by Kuznets to develop the national accounts, most macroeconomic measurement was directed toward producing a coherent qualitative narrative of the business cycle. That work is most closely associated with Wesley Clair Mitchell, the first director of research of the NBER, which was founded in 1920. Later, Mitchell coauthored with Arthur Burns the 1946 tome Measuring Business Cycles, in which they identified a large number of statistical indicators of economic expansion and contraction that enabled them to identify the turning points of past business cycles. This was the culmination of Mitchell’s thirty-three-year examination of business cycles that dated back to his highly acclaimed 1913 treatise on the subject.
Arthur Burns, under whom I studied at Columbia University in 1950, was also one of my predecessors as Federal Reserve chairman (1970 to 1978) and as chairman of the Council of Economic Advisers (1953 to 1956). We forged a close relationship that continued for nearly four decades. Coincidentally, in 1946, I took an undergraduate course in statistics from Geoffrey Moore, a colleague of Burns’s at the NBER. Moore in later years formalized the work of Burns and Mitchell into a series of “Leading Business Cycle Indicators,” still published by the Conference Board.7 Business cycle turning points, to this day, are “official” dates announced (often well after the fact) by a committee chosen by the NBER. Those turning points have been virtually universally accepted by both private and government economists.
As late as 1947, 18 percent of the U.S. population still lived on farms. (Today the figure is 2 percent.) Economic forecasting was thus still heavily focused on the outlook for crops and livestock; expected crop production was dependent on the then-poorly forecastable weather, and livestock forecasts were dependent on the price of feed grains. Farm output continued to loom large even as industrial production began to take an ever increasing share of GNP in the latter part of the nineteenth century as the United States gradually wrested global economic hegemony from the United Kingdom. Day-to-day economics for a large part of the country, however, was still tied to crops, cattle, and weather that remained decidedly local.
The massive addition of railroad mileage (which peaked in 1930) helped in expanding the division of labor in the U.S. economy beyond its heavily local character during most of the nineteenth century. The development of the motor vehicle industry provided yet another large impetus to industrial activity. The motor vehicle, supported by a rapidly growing petroleum industry, carried us into post‒World War II America. Of course, the value offered by ever faster travel and transport must always be balanced against the monetary and nonmonetary costs of achieving those gains. Our short-lived dalliance with supersonic aircraft demonstrated that what is technologically feasible is not always economically or politically viable.
POSTWAR YEARS
The huge statistical bureaucracy that was spawned during the New Deal, and especially during World War II, morphed into innumerable government statistical agencies after the war. Data collection systems by private research organizations such as the National Industrial Conference Board and the National Bureau of Economic Research (both dominant in the 1920s) covering trends in American business were gradually displaced by government statistical collection that previously had largely been restricted to data collected as part of the decennial census. And the census, originally limited to the population estimates required by the Constitution, gradually over the decades began to ask economically related questions. The advent of the computer and later the Internet vastly expanded the detailed statistical chronicling of America.