Currency Wars: The Making of the Next Global Crisis


by James Rickards


  Merton and other leading sociologists of their time were not economists. Yet in a sense they were, because economics is ultimately the study of human decision making with regard to goods in conditions of scarcity. The sociologists cast a bright light on these decision-making processes. Former Bear Stearns CEO Alan Schwartz can attest to the power of Merton’s self-fulfilling prophecy. On March 12, 2008, Schwartz told CNBC, “We don’t see any pressure on our liquidity, let alone a liquidity crisis.” Forty-eight hours later Bear Stearns was headed to bankruptcy after frightened Wall Street banks withdrew billions of dollars of credit lines. For Bear Stearns, this was a real-life version of Merton’s thought experiment.

  A breakthrough in the impact of social psychology on economics came with the work of Daniel Kahneman, Amos Tversky, Paul Slovic and others in a series of experiments conducted in the 1960s and 1970s. In the most famous set of experiments, Kahneman and Tversky showed that subjects, given the choice between two monetary outcomes, would select the one with the greater certainty of being received even though it did not have the higher expected return. A typical version of this is to offer a subject the prospect of winning money structured as a choice between: A) $4,000 with an 80 percent probability of winning, or B) $3,000 with a 100 percent probability of winning. For supporters of efficient market theory, this is a trivial problem. Winning $4,000 with a probability of 80 percent has an expected value of $3,200 (or $4,000 × .80). Since $3,200 is greater than the alternative choice of $3,000, a rational wealth-maximizing actor would choose A. Yet in one version of this experiment, 80 percent of the participants chose B. Clearly the participants had a preference for the “sure thing” even if its theoretical value was lower. In some ways, this is just a formal statistical version of the old saying “A bird in the hand is worth two in the bush.” Yet the results were revolutionary—a direct assault on the cornerstone of financial economics.
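  To make the arithmetic concrete, here is a minimal sketch in Python of the expected-value calculation the rational-actor model prescribes, using the dollar figures and probabilities of the experiment described above:

```python
# Expected values of the two prospects described above. A rational
# wealth-maximizer picks the larger number; roughly 80 percent of
# experimental subjects nevertheless picked the sure thing.

prospect_a = [(4000, 0.80), (0, 0.20)]  # $4,000 with 80% probability, else nothing
prospect_b = [(3000, 1.00)]             # $3,000 for certain

def expected_value(prospect):
    """Sum of payoff times probability over all possible outcomes."""
    return sum(payoff * prob for payoff, prob in prospect)

print(expected_value(prospect_a))  # 3200.0
print(expected_value(prospect_b))  # 3000.0
```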

  Through a series of other elegantly designed and deceptively simple experiments, Kahneman and his colleagues showed that subjects had a clear preference for certain choices based on how they were presented, even though an alternative choice would produce exactly the same result. These experiments introduced an entirely new vocabulary to economics, including certainty (the overweighting of sure outcomes relative to merely probable ones, closely related to risk aversion), anchoring (the undue influence of early results in a series), isolation (undue weight on unique characteristics versus shared characteristics), framing (undue weight on how things are presented versus the actual substance) and heuristics (rules of thumb). The entire body of work was offered under the title “prospect theory,” which marked a powerful critique of the utility theory used by financial economists.
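  For readers who want to see the critique in symbols, the sketch below implements the prospect-theory value function. The functional form and the median parameter estimates (curvature of about 0.88, loss aversion of about 2.25) come from Tversky and Kahneman’s 1992 follow-up study; the code itself is only an illustration:

```python
# A sketch of the prospect-theory value function: outcomes are valued as
# gains or losses relative to a reference point, and losses loom larger
# than gains. Parameters are the 1992 median estimates, not universal laws.

ALPHA = 0.88   # curvature for gains and losses (diminishing sensitivity)
LAMBDA = 2.25  # loss-aversion coefficient: losses weigh about 2.25x gains

def prospect_value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A $100 loss hurts far more than a $100 gain pleases:
print(prospect_value(100))   # about 57.5
print(prospect_value(-100))  # about -129.5
```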

  Unfortunately, behavioral economics has been embraced by policy makers to manipulate rather than illuminate behavior, based on dubious premises about their own superior wisdom. Bernanke’s campaign to raise inflationary “expectations” by printing money and devaluing the dollar while holding rates low is the boldest contemporary version of such manipulation, yet there are others. Orchestrated propaganda campaigns have involved off-the-record meetings in which corporate CEOs asked business reporters to apply a more favorable spin to business news. These attempted manipulations have their absurd side, as with the phrase “green shoots” repeated ad nauseam by cable TV cheerleaders in the spring of 2009, at a time when America was losing millions of jobs. Tim Geithner’s self-proclaimed “Recovery Summer” in 2010 is another example—that summer came and went with no recovery at all for the forty-four million on food stamps. These are all examples of what Kahneman called “framing” an issue to tilt the odds in favor of a certain result.

  What Bernanke, Geithner and like-minded behavioralists in policy positions fail to see is something Merton might have easily grasped—the positive feedback effect that arises from framing without substance. If the economy is actually doing well, the message requires no framing and the facts will speak for themselves, albeit with a lag. Conversely, when reality consists of collapsing currencies, failed banks and insolvent sovereigns, talk about green shoots has at best a limited and temporary effect. The longer-term effect is a complete loss of trust by the public. Once the framing card has been played enough times without results, citizens will reflexively disbelieve everything officials say on the subject of economic growth, even to the point of remaining cautious if things actually do improve. This does not represent a failure of behavioral economics so much as its misuse by policy makers.

  Behavioral economics possesses powerful tools and can offer superb insights despite occasional misuse. It is at its best when used to answer questions rather than force results. Exploration of the paradox of Keynesianism is one possibly fruitful area of behavioral economic research with the potential to mitigate the currency wars. Keynesianism was proposed in part to overcome the paradox of thrift. Keynes pointed out that in times of economic distress an individual may respond by reducing spending and increasing savings. However, if everyone does the same thing, the distress becomes even worse because aggregate demand is destroyed, which can cause businesses to close and unemployment to rise. Keynesian-style government spending was meant to replace this shortfall in private spending. Today government spending has grown so large and sovereign debt burdens so great that citizens rightly expect that some combination of inflation, higher taxation and default will be required to reconcile the debt burden with the means available to pay it. Government spending, far from stimulating more spending, actually makes the debt burden worse and may increase this private propensity to save. Here is a conundrum that behavioral economists seem well suited to explore. The result may be the discovery that short-term government austerity brightens long-run economic prospects by increasing confidence and the propensity to spend.
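  A toy calculation shows the mechanism at work. The sketch below uses the textbook Keynesian multiplier, income = autonomous spending ÷ (1 − MPC), where MPC is the marginal propensity to consume. The figures are purely hypothetical; this illustrates the paradox of thrift, not any actual economy:

```python
# The paradox of thrift in one function: if households save more (MPC falls),
# equilibrium income falls, even though each household is being "prudent."

def equilibrium_income(autonomous_spending, mpc):
    """Income level at which planned spending equals income: A / (1 - MPC)."""
    return autonomous_spending / (1 - mpc)

autonomous = 1000.0  # investment + government + baseline consumption (hypothetical units)

# Households initially consume 90 cents of each marginal dollar...
print(equilibrium_income(autonomous, mpc=0.90))  # 10000.0

# ...but if distress, or fear of future taxes, pushes saving up and MPC
# down to 0.80, aggregate income is cut in half:
print(equilibrium_income(autonomous, mpc=0.80))  # 5000.0
```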

  Complexity Theory

  Our definition of complex systems included spontaneous organization, unpredictability, the need for exponentially greater energy inputs and the potential for catastrophic collapse. Another way to understand complexity is to contrast it with that which is merely complicated. A Swiss watch may be complicated, but it is not complex. The number and size of various gears, springs, jewels, stems and casings make it complicated. Yet the parts do not communicate with one another. They touch but do not interact. One gear does not enlarge itself because the other gears think it is a good idea. The springs do not spontaneously self-organize into a liquid metallic soup. The watch is complicated; however, complexity is much more than complication.

  Complex systems begin with individual components called autonomous agents, which make decisions and produce results in the system. These agents can be marine species in the oceanic food chain or individual investors in currency markets; the dynamics are the same. To be complex, a system first requires diversity in the types of agents. If the agents are alike, nothing very interesting will happen. If they are diverse, they will respond differently to various inputs, producing more varied results.

  The second element is connectedness. The idea is that the agents are connected to one another through some channel. This can consist of electrical lines in the case of a power grid or Twitter feeds in the case of a social network, but somehow the agents must have a way to contact one another.

  The third element is interdependence, which means that the agents influence one another. If someone is not sure how cold it is outside and she looks out the window to see everyone wearing down coats, she might choose to wear one too. The decision is not automatic—she might choose to wear only a sweater—but in this case a decision to wear a warm coat is partly dependent on others’ decisions.

  The last element is adaptation. In complex systems, adaptation means more than change; rather it refers specifically to learning. Investors who repeatedly lose money on Wall Street themes such as “buy and hold” may learn over time that they need to consider alternative strategies. This learning can be collective in the sense that lessons are shared quickly with others without each agent having to experience them directly. Agents that are diverse, connected, interdependent and adaptive are the foundation of a complex system.

  To understand how a complex system operates, it is necessary to think about the strength of each of these four elements. Imagine each one has a dial that can be turned from settings of zero to ten. At a setting of one, the system is uninteresting. It may have the elements of complexity, but nothing much is going on. Diversity is low, connectedness and interdependence are weak and there is almost no learning or adaptation taking place. At a setting of ten, the system is chaotic. Agents receive too much information from too many sources and are stymied in their decision making by conflicting and overwhelming signals.

  Where complexity is most intriguing is in what Scott Page of the University of Michigan calls the “interesting in-between.” This means the dials are set somewhere between three and seven, with each dial different from the others. This allows a good flow of information, interaction and learning among diverse agents, but not so much that the system becomes chaotic. This is the heart of complexity—a system that continuously produces surprising results without breaking down.
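  One way to make these dials tangible is a toy agent-based model. The sketch below is purely hypothetical, loosely in the spirit of threshold models of collective behavior: each agent acts when enough of its neighbors act, thresholds differ from agent to agent, and agents that act find it slightly easier to act again. The parameter k plays the role of the connectedness dial:

```python
import random

# A toy illustration of the four elements: diverse agents (random thresholds),
# connected (each agent watches its k nearest neighbors on each side of a
# ring), interdependent (an agent acts when enough neighbors act) and adaptive
# (acting lowers an agent's threshold a little). All numbers are hypothetical.

def run(n_agents=100, k=4, steps=50, seed=1):
    rng = random.Random(seed)
    thresholds = [rng.random() for _ in range(n_agents)]     # diversity
    active = [rng.random() < 0.05 for _ in range(n_agents)]  # a few initial actors
    for _ in range(steps):
        nxt = []
        for i in range(n_agents):
            neighbors = [active[(i + d) % n_agents]
                         for d in range(-k, k + 1) if d != 0]
            fraction_acting = sum(neighbors) / len(neighbors)  # connectedness
            acts = fraction_acting >= thresholds[i]            # interdependence
            if acts:
                thresholds[i] = max(0.0, thresholds[i] - 0.01)  # adaptation
            nxt.append(acts)
        active = nxt
    return sum(active)  # how many agents end up acting

# Turning the connectedness dial and watching how many agents end up acting:
for k in (1, 4, 20):
    print(k, run(k=k))
```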

  Two further characteristics of complex systems are of the utmost importance in our consideration of their application to currency markets and the dollar. These are emergent properties and phase transitions.

  Saying a system has an emergent property is like saying the whole is more than the sum of its parts. Tasting a delicious, warm apple pie is more interesting than looking at the dough, sugar, apples and butter that went into it. When systems are highly complex, emergent properties are far more powerful and unexpected. Climate is one of the most complex systems ever studied. It is extremely difficult to model, and reliable weather forecasts can be made only about four days in advance. Hurricanes are emergent properties of climate. Their ingredients, such as low air pressure, warm water, convection and the like, are all easily observed, but the exact timing and location at which hurricanes will emerge is impossible to predict. We know them when we see them.

  The best example of an emergent property is probably human consciousness. The human body is composed of oxygen, carbon and hydrogen, with traces of copper and zinc thrown in for good measure. If one were to combine these ingredients in a vat, stir carefully and even jolt the mixture with electricity, nothing would happen. The same ingredients combined through DNA coding, however, produce a human being. There’s nothing in a carbon atom that suggests thought and nothing in an oxygen atom that suggests speech or writing. Yet the power of complexity produces exactly those capabilities using exactly those ingredients. Thought emerges from the human mind in the same complex, dynamic way that hurricanes emerge from the climate.

  Phase transitions are a way to describe what happens when a complex system changes its state. When a volcano erupts, its state goes from dormant to active. When the stock market drops 20 percent in one day, its state goes from well behaved to disorderly. If the price of gold were to double in one week, the state of the dollar would go from stable to free fall. These are all examples of phase transitions in complex systems.

  Not every complex system is poised for a phase transition—the system itself must be in a “critical state.” This means that the agents in the system are assembled in such a way that the actions of one trigger the actions of another until the whole system changes radically. A good example of a phase transition in a critical state system is an avalanche. A normal snowfield on a flat surface is fairly stable, yet the same amount of snow on a steep incline may be in a critical state. New snow may fall for a while, but eventually one snowflake will disturb a few others. Those others will disturb more adjacent flakes until a small slide begins that takes more snow with it, getting larger along the way until the entire mountainside comes loose. One could blame the snowflake, but it is more correct to blame the unstable state of the mountainside of snow. The snowfield was in a critical state—it was likely to collapse sooner or later, and if one snowflake did not start the avalanche, the next one could have.
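  The avalanche image has a well-known formal counterpart in the self-organized criticality literature: the Bak-Tang-Wiesenfeld sandpile. The sketch below is a minimal version; the grid size and number of drops are illustrative choices. Grains drop one at a time, and a cell that reaches the stability threshold topples onto its neighbors, occasionally triggering enormous avalanches from a single grain:

```python
import random

# A minimal sketch of the Bak-Tang-Wiesenfeld sandpile. Grains drop one at a
# time onto a grid; a cell holding four or more grains topples, passing one
# grain to each neighbor, which can topple others in turn. A single grain
# usually does nothing, but occasionally sets off a system-wide avalanche.

SIZE, THRESHOLD = 20, 4
grid = [[0] * SIZE for _ in range(SIZE)]
rng = random.Random(42)

def drop_grain():
    """Drop one grain at a random cell; return the number of topples it causes."""
    grid[rng.randrange(SIZE)][rng.randrange(SIZE)] += 1
    topples = 0
    unstable = True
    while unstable:
        unstable = False
        for r in range(SIZE):
            for c in range(SIZE):
                if grid[r][c] >= THRESHOLD:
                    grid[r][c] -= THRESHOLD
                    topples += 1
                    unstable = True
                    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if 0 <= nr < SIZE and 0 <= nc < SIZE:
                            grid[nr][nc] += 1  # grains at the edge fall off and are lost
    return topples

sizes = [drop_grain() for _ in range(5000)]
print("largest avalanche:", max(sizes), "harmless drops:", sizes.count(0))
```

  Most drops cause nothing at all, yet the same one-grain trigger occasionally brings down a large part of the pile, exactly the snowflake-and-mountainside dynamic described above.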

  The same process occurs in a stock market crash. Buy and sell orders hit the market all the time, just like snowflakes falling on the mountain. Sometimes the buyers and sellers are arranged in highly unstable ways so that one sell order triggers a few others, which are then reported by the exchange, triggering even more sell orders by nervous investors. Soon the cascade gets out of control, and more sell orders placed in advance and triggered by “stop-loss” rules are automatically executed. The process feeds on itself. Sometimes the process dies out; after all, there are many small disturbances in the snow that do little harm. Sometimes the process grows exponentially until something outside the system intervenes. This intervention can take the form of trading halts, efforts by buying syndicates to reverse the flow or even closing the exchange. Once the cascade stops, the complex system can return to a stable, noncritical state—until the next time.

  The recent multiple catastrophes near Sendai, Japan, perfectly illustrate how phase transitions occur in nature and society and how collapse can spread from one system to another when all are in the critical state. Tectonic plates, oceans, uranium and stock markets are all examples of separate complex systems. However, they can interact in a kind of metasystemic collapse. On March 11, 2011, shifting tectonic plates under the Pacific Ocean off the eastern coast of Japan caused an extremely violent 9.0 earthquake. The thrusting of the ocean floor then transferred energy from one system, the earth’s crust, to another system, the ocean, causing a ten-meter-high tsunami. The tsunami smashed into several nuclear reactors, again transferring energy and causing another catastrophe, this time a partial meltdown in uranium and plutonium fuel rods used in the reactors. Finally, the fear induced by the meltdown in the reactors contributed to a meltdown in the Tokyo stock market, which crashed over 20 percent in two days. The earthquake and tsunami were natural systems. The reactor was a hybrid of natural uranium and man-made design, while the stock exchange is totally man-made. Yet they all operated under the same critical state dynamics embedded in complex systems.

  Importantly, phase transitions can produce catastrophic effects from small causes—a single snowflake can cause a village to be destroyed by an avalanche. This is one secret behind so-called black swans. Nassim Nicholas Taleb popularized the term “black swan” in his book of the same name. In that book, Taleb rightly demolished the normal distribution—the bell curve—as a way of understanding risk. The problem is that he demolished one paradigm but did not produce another to replace it. Taleb expressed some disdain for mathematical modeling in general, preferring to take on the mantle of a philosopher. He dubbed all improbable, catastrophic events “black swans,” as if to say, “Stuff happens,” and he left it at that. The term is widely used by analysts and policy makers who understand the “Stuff happens” part but don’t understand the critical state dynamics and complexity behind it. Yet it is possible to do better than throw up one’s hands.

  A forest fire caused by lightning is a highly instructive example. Whether the fire destroys a single tree or a million acres, it is caused by a single bolt of lightning. Simple intuition might hold that large bolts cause large fires and small bolts cause small fires, but that is not true. The same bolt of lightning can cause no fire or a catastrophic fire depending on the critical state. This is one reason why black swans take us by surprise. They are called extreme events, but it would be more accurate to call them extreme results from everyday events. Extreme results will happen with some frequency; it is the everyday events that trigger them that we don’t see coming precisely because they are so mundane. Studying the system shows us how the everyday event morphs into the black swan. As in the case of the avalanche, what really matters is not the snowflake but the snow.
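  The point can be simulated directly. The sketch below is a simplified, hypothetical forest model in which lightning always strikes the same cell; the size of the fire is simply the size of the connected cluster of trees around that cell, which depends entirely on the state of the forest:

```python
import random

# A simplified, hypothetical forest: each cell holds a tree with probability
# `density`. Lightning always strikes the center cell; the fire burns exactly
# the connected cluster of trees containing that cell. Same bolt, same spot,
# wildly different outcomes depending on the state of the system.

def burn_size(density, size=50, seed=7):
    rng = random.Random(seed)
    forest = [[rng.random() < density for _ in range(size)] for _ in range(size)]
    stack, burned = [(size // 2, size // 2)], set()
    while stack:  # flood-fill the cluster of trees connected to the strike
        r, c = stack.pop()
        if (r, c) in burned or not (0 <= r < size and 0 <= c < size) or not forest[r][c]:
            continue
        burned.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return len(burned)  # 0 means the bolt hit bare ground

# Below the percolation threshold (about 0.59 on a square grid) the fire
# stays local or never starts; above it, one bolt can burn most of the forest.
for density in (0.3, 0.5, 0.7):
    print(density, burn_size(density))
```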

  Two more concepts are needed to round out our understanding of complexity theory. The first involves the frequency of extreme events relative to mild events in a complex system, referred to as a degree distribution. The second is the concept of scale.

  The bell-curve degree distribution used in financial economics says that mild events happen all the time and highly extreme events practically never. Yet the bell curve is only one kind of degree distribution; there are many others. The degree distribution that describes many events in complex systems is called a power law. A curve that corresponds to a power law is shown below as Figure 2.

  FIGURE 2: A curve illustrating a power-law degree distribution

  In this degree distribution, the frequency of events appears on the vertical axis and the severity of events appears on the horizontal axis. As in a bell curve, extreme events occur less frequently than mild events. This is why the curve slopes downward (less frequent events) as it moves off to the right (more extreme events). However, there are some crucial differences between the power law and the bell curve. For one thing, the bell curve (see Figure 1) is “fatter” in the region close to the vertical axis. This means that mild events happen more frequently in bell curve distributions and less frequently in power law distributions. Crucially, this power law curve never comes as close to the horizontal axis as the bell curve. The “tail” of the curve continues for a long distance to the right and remains separated from the horizontal axis. This is the famous “fat tail,” which in contrast with the tail on the bell curve does not appear to touch the horizontal axis. This means that extreme events happen more frequently in power law distributions.
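  The difference in tail weight is easy to quantify. The sketch below compares the tail probability P(X > x) of a standard normal bell curve with that of a Pareto power law with tail exponent 2 (an illustrative choice):

```python
import math

# Tail probabilities P(X > x): a standard normal bell curve versus a Pareto
# power law with tail exponent 2. The bell curve's tail collapses toward
# zero extremely fast; the power law's never really does.

def normal_tail(x):
    """P(X > x) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pareto_tail(x, exponent=2.0):
    """P(X > x) for a Pareto distribution with minimum value 1."""
    return x ** (-exponent)

for x in (2, 5, 10):
    print(x, normal_tail(x), pareto_tail(x))
# At x = 10, the normal tail is about 7.6e-24 while the power-law tail is
# 0.01: the "extreme" event is over twenty orders of magnitude more likely.
```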

 
