More Than You Know
St. Petersburg and Growth Stock Investing
The St. Petersburg Paradox also provides insight for growth stock valuation.8 What should you be willing to pay for a very small probability that a company can grow its cash flows by a very significant amount forever?9
David Durand took up this question in his classic 1957 article, “Growth Stocks and the Petersburg Paradox.”10 He encourages a good deal of caution, emphasizing reversion-to-the-mean thinking and modeling. But if anything, the challenge of valuing the low probability of significant value is even more pressing today than it was when Durand took on the challenge almost fifty years ago.
Consider, for example, that of the nearly 2,000 technology-stock initial public offerings from 1980 through 2006, less than 5 percent account for over 100 percent of the $2-trillion-plus in wealth creation.11 And even within this small wealth-generating group, only a handful delivered the bulk of the huge payoffs. Given the winner-take-most characteristics of many growth markets, there’s little reason to anticipate a more normal wealth-creation distribution in the future.
In addition, the data show that the distribution of economic return on investment is wider in corporate America today than it was in the past.12 So the spoils awaiting the wealth creators, given their outsized returns, are greater than ever before. As in the St. Petersburg game, the majority of the payoffs from future deals are likely to be modest, but some will be huge. What’s the expected value? What should you be willing to pay to play?
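The arithmetic behind the paradox is worth spelling out. In the classic game, a fair coin is flipped until heads appears; if heads first shows on toss n, the payoff is 2^n dollars, so each possible toss contributes (1/2^n) × 2^n = 1 dollar of expected value and the sum diverges. A minimal Python sketch makes the point; the truncation horizons below are illustrative assumptions, not part of the original game:

```python
# St. Petersburg game: flip a fair coin until heads appears.
# If heads first shows on toss n, the payoff is 2**n dollars.
# Each toss contributes (1/2**n) * 2**n = 1 dollar of expected value,
# so the expected value of the untruncated game diverges.

def expected_value(max_tosses):
    """Expected payoff if the game is capped at max_tosses flips."""
    return sum((0.5 ** n) * (2 ** n) for n in range(1, max_tosses + 1))

for horizon in (10, 20, 40):
    print(horizon, expected_value(horizon))  # grows without bound as the cap rises
```

Capping the game at n tosses yields an expected value of exactly n dollars, which is why the "fair" price depends entirely on how long you believe the extreme payoffs can persist, just as a growth stock's value depends on how long you believe supernormal growth can persist.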
Integrating the Outliers
The St. Petersburg Paradox may be centuries old, but its lessons are as fresh as ever. One of the major challenges in investing is how to capture (or avoid) low-probability, high-impact events. Unfortunately, standard finance theory has little to say about the subject.
33
The Janitor’s Dream
Why Listening to Individuals Can Be Hazardous to Your Wealth
In the existing sciences much of the emphasis over the past century or so has been on breaking systems down to find their underlying parts, then trying to analyze these parts in as much detail as possible. . . . But just how these components act together to produce even some of the most obvious features of the overall behavior we see has in the past remained an almost complete mystery.
—Stephen Wolfram, A New Kind of Science
Beyond Newton
Where does consciousness come from? This question has bedeviled philosophers and scientists for centuries. We have cured diseases, put men on the moon, and probed many details of our physical world. Yet even the best thinkers today have difficulty defining consciousness, let alone explaining it. Why have we had so much success in some scientific realms and so little in others, such as unveiling the mysteries of consciousness?
Not all systems are alike, and we can’t understand the workings of all systems on the same level. Let’s start with the systems that we do understand. Many of science’s triumphs over the past few centuries are rooted in Isaac Newton’s principles. Newton’s world is a mechanical one, where cause and effect are clear and systems follow universal laws. With sufficient understanding of a system’s underlying components, we can predict precisely how the system will behave.
Reductionism is the cornerstone of discovery in the Newtonian world, the basis for much of science’s breathtaking advance in the seventeenth through nineteenth centuries. As scientist John Holland explains, “The idea is that you could understand the world, all of nature, by examining smaller and smaller pieces of it. When assembled, the small pieces would explain the whole.”1 In many systems, reductionism works brilliantly.
But reductionism has its limits. In systems that rely on complex interactions of many components, the whole system often has properties and characteristics that are distinct from the aggregation of the underlying components. Since the whole of the system emerges from the interaction of the components, we cannot understand the whole simply by looking at the parts. Reductionism fails.
Neuroscientist William Calvin, who is in the thick of the consciousness dialogue, notes that we can approach the problem in various ways but that the key to understanding consciousness certainly is not in the “basement” of neural chemistry or the “subbasement” of quantum mechanics. There are too many layers of interaction in the brain. The parts don’t explain the whole. Calvin calls the leap from the subbasement of quantum mechanics directly to the penthouse of consciousness the janitor’s dream.2
Why should investors care about the janitor’s dream? If the stock market is a system that emerges from the interaction of many different investors, then reductionism—understanding individuals—does not give a good picture of how the market works. Investors and corporate executives who pay too much attention to individuals are trying to understand markets at the wrong level. An inappropriate perspective of the market can lead to poor judgments and value-destroying decisions.
Sorting Systems
When a system has low complexity and we can define interactions linearly, reductionism is very useful. Many engineered systems fit this bill. A skilled artisan can take apart your wristwatch, study the components, and have a complete understanding of how the system works. Such systems also lend themselves to centralized decision making. Many companies in the industrial revolution were good examples of engineered systems—a product went down a manufacturing line, and each worker contributed to the end product. Through scientific refinement, managers could continually improve the system’s performance.
On the other hand, centralized control fails for systems with sufficient complexity. Scientists call these “complex adaptive systems” and refer to the components of the system as agents. Complex adaptive systems exhibit a number of essential properties and mechanisms:3
Aggregation. Aggregation is the emergence of complex, large-scale behavior from the collective interactions of many less-complex agents.
Adaptive decision rules. Agents within a complex adaptive system take information from the environment, combine it with their own interaction with the environment, and derive decision rules. In turn, various decision rules compete with one another based on their fitness, with the most effective rules surviving.
Nonlinearity. In a linear model, the whole equals the sum of the parts. In nonlinear systems, the aggregate behavior is more complicated than would be predicted by totaling the parts.
Feedback loops. A feedback system is one in which the output of one iteration becomes the input of the next iteration. Feedback loops can amplify or dampen an effect.4
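The feedback-loop mechanism can be sketched in a few lines of Python. This is a hypothetical illustration, not a model of any real market: each iteration's output is fed back in as the next input, and a gain above 1 amplifies the initial effect while a gain below 1 damps it (the gain values are arbitrary choices for the example):

```python
def iterate_feedback(initial, gain, steps):
    """Feed each iteration's output back in as the next iteration's input."""
    value = initial
    history = [value]
    for _ in range(steps):
        value = value * gain  # output of this iteration becomes the next input
        history.append(value)
    return history

amplified = iterate_feedback(1.0, 1.5, 5)  # gain > 1: the effect compounds
damped = iterate_feedback(1.0, 0.5, 5)     # gain < 1: the effect dies out
print(amplified)
print(damped)
```

Even this toy version shows why feedback matters for markets: the same-sized initial shock can snowball or fizzle depending on whether the loop amplifies or damps it.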
EXHIBIT 33.1 Complexity and Decision Making
Source: Sente Corporation.
Complex adaptive systems include governments, many corporations, and capital markets. Efforts to assert top-down control of these systems generally lead to failure, as happened in the former Soviet Union. Exhibit 33.1 contrasts the two types of systems.
Thinking about the market as a complex adaptive system is in stark contrast to classical economic and finance theory, which depicts the world in Newtonian terms. Economists treat agents as if they are homogenous and build linear models—supply and demand, risk and reward, price and quantity. None of this, of course, much resembles the real world.5
The Stock Market as a Complex Adaptive System
The stock market has all of the characteristics of a complex adaptive system. Investors with different investment styles and time horizons (adaptive decision rules) trade with one another (aggregation), and we see fat-tail price distributions (nonlinearity) and imitation (feedback loops). An agent-based approach to understanding markets is gaining broader acceptance. But this better descriptive framework does not offer the neat solutions that the current economic models do.
Investors who view the stock market as a complex adaptive system avoid two cognitive traps. The first is the constant search for a cause for all effects. Critical points, where large-scale reactions are the result of small-scale perturbations, are a characteristic of complex adaptive systems. So cause and effect are not always easy to link. Following the stock market crash of 1987, for instance, the government commissioned a study to isolate the “cause” of the crash. After an exhaustive study, the Brady commission was unable to find a particular cause. The point here is not that cause and effect don’t exist but rather that not every effect has a proportionate cause. Because humans like to identify a cause for every effect, this concept is difficult to internalize.
The second trap is to dwell on the input of any individual at the expense of understanding the market itself. For example, executives often question how it is that the empirical evidence shows the market follows cash flows when most investors talk about accounting results. The answer is that every individual operates with his or her own decision rules, while the market represents the aggregation of these rules. Further, studies of systems with sufficient complexity show that a collective of diverse individuals tends to solve problems better than individuals can—even individuals that are so-called experts.6
Using What You’ve Got
Time-pressured decision makers often rely on rules of thumb, or heuristics, for their decision making. While heuristics don’t always lead to the best answer in a particular situation, they are often useful precisely because they save time for their users. However, heuristics can also lead investors to make biased decisions. One facet of successful decision making is gaining an understanding of these biases so as to mitigate their cost.7
The availability heuristic allows investors to assess the frequency or likely cause of an event by the degree to which similar events are “available” in memory. Ease of recall is one bias that emanates from the availability heuristic. In other words, investors or managers may place greater emphasis on information that is available than on information that is relevant.
I believe this bias is at the heart of the janitor’s-dream problem. Investors and managers spend a lot of time focusing on information that is available, like current earnings and multiples, rather than on information that is more meaningful—that is, what the market reveals about expectations for future performance. Corporate managers see analyst reports that dwell on earnings and hence incorrectly assume that the market is a simple addition of these agents.
Investors and corporate managers trying to understand the market must recognize that it’s a complex adaptive system. The market’s action reflects the interaction of many agents, each with varying knowledge, resources, and motivation. So a disproportionate focus on individual opinions can be hazardous to wealth creation.
34
Chasing Laplace’s Demon
The Role of Cause and Effect in Markets
[Our ancestors] must have felt uncomfortable about their inability to control or understand such [causeless] events, as indeed many do today. As a consequence, they began to construct, as it were, false knowledge. I argue that the primary aim of human judgment is not accuracy but the avoidance of paralyzing uncertainty. We have a fundamental need to tell ourselves stories that make sense of our lives. We hate uncertainty and . . . find it intolerable.
—Lewis Wolpert, Faraday Lecture
We’re accustomed to thinking in terms of centralized control, clear chains of command, the straightforward logic of cause and effect. But in huge, interconnected systems, where every player ultimately affects every other, our standard ways of thinking fall apart. Simple pictures and verbal arguments are too feeble, too myopic.
—Steven Strogatz, Sync
Evolution Made Me Do It
Most people know that the human brain has a left and a right hemisphere. The right hemisphere is superior at performing visual and spatial tasks, and the left brain specializes in language, speech, and problem solving. Right-brain-dominant people are known for their creativity, while the left-brainers are the analytical types.
But the left-brain system does more than just calculate; it is constantly working to find relationships between events it encounters in the world. Dubbed “the interpreter” by neuroscientist Michael Gazzaniga, the left brain tries to tie life together in a coherent story.1
The corpus callosum, a bridge of nerve tissue, connects the left and right sides of the brain. To better understand the distinct roles of the two hemispheres, Gazzaniga and his colleague Joseph LeDoux studied patients with severed bridges between the left and right brain. The scientists knew that if one hemisphere received exclusive information, the information would be unavailable to the other side.
To test the interaction between hemispheres, Gazzaniga and LeDoux crafted a clever experiment. First, through visual cues they secretly instructed the right hemisphere to perform an action. The left side could observe the action but had no idea why it was going on. Next, the scientists asked the split-brain patient to explain why he was acting. Remarkably, the left hemisphere made up explanations for the actions. For example, if the scientists instructed the right hemisphere to laugh, the patient’s left hemisphere would report that the scientists were funny guys. In LeDoux’s words, “the patient was attributing explanations to situations as if he had introspective insight into the cause of the behavior when in fact he did not.”2 The interpreter at work.
Biologist Lewis Wolpert argues that the concept of cause and effect was a fundamental driver of human evolution. It is evolutionarily advantageous to understand the potential effects of a cause and the causes of an effect. Wolpert suggests that a combination of the concept of cause, language, and social interaction drove the increase in size and complexity of the human brain.3
So we humans are wired to make links between causes and effects. And making up causes for effects is not beyond us. For most of human existence, events with no clear causes baffled us (illness, lightning, and volcanoes, for example), and, unsurprisingly, our ancestors turned to the supernatural to explain them. Today we largely understand these effects.
Today, we comprehend many systems but remain vexed by large, interconnected systems—often called complex adaptive systems. We can’t understand the global properties and characteristics of a complex adaptive system by analyzing the underlying heterogeneous individuals. These systems are not linear or additive; the whole does not equal the sum of the parts. As a result, cause and effect defies any simple explanation. The stock market is a perfect example of such a system.4
In investing, our innate desire to connect cause and effect collides with the elusiveness of such links. So what do we do? Naturally, we make up stories to explain cause and effect.
Why should investors care about cause and effect in the market? An appreciation of our need for explanation can be an inoculation against making mistakes. Investors who insist on understanding the causes for the market’s moves risk focusing on faulty causality or inappropriately anchoring on false explanations. Many of the big moves in the market are not easy to explain.
Laplace’s Demon
Two hundred years ago, determinism ruled in science. Inspired by Newton, scientists largely embraced the notion of a clockwork universe. The French mathematician Pierre Simon Laplace epitomized this thinking with a famous passage from A Philosophical Essay on Probabilities:
An intellect which at any given moment knew all of the forces that animate Nature and the mutual positions of the beings that comprise it, if this intellect were vast enough to submit its data to analysis, could condense into a single formula the movement of the greatest bodies of the universe and that of the lightest atom: for such an intellect nothing could be uncertain; and the future just like the past would be present before its eyes.
Philosophers and scientists now call this “intellect” Laplace’s demon. The notion that we can work out the past, present, and future through detailed calculations was, and remains, a very alluring concept precisely because it plays to our cause-and-effect bias.
But complex adaptive systems do not accommodate such simple calculations. We can describe many complex systems as being in a state of self-organized criticality. “Self-organized” means that there is no leader; the system arises from the interaction of many underlying individuals. “Criticality” suggests nonlinearity. More specifically, the magnitude of a perturbation within the system (cause) is not always proportionate to its effect. Small perturbations can lead to large outcomes, and vice versa.
A sand-pile metaphor conveys this idea. Imagine sprinkling sand onto a flat surface. At first, not much happens, and the sand grains obey the basic laws of physics. But once the sand pile builds to a certain height and slope, it enters into a self-organized, critical state. A few additional grains sprinkled on the pile may lead to a small or a large avalanche. The size of the avalanche need not match the amount of sand the researcher sprinkles.
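The sand-pile metaphor can be made concrete with a toy version of the Bak–Tang–Wiesenfeld sandpile model, the standard formalization of self-organized criticality. In this sketch (grid size, toppling threshold, and number of drops are arbitrary illustrative choices), grains land one at a time on a small grid; any cell holding four or more grains topples, passing one grain to each neighbor, and grains falling off the edge are lost. A single dropped grain can trigger avalanches of wildly different sizes:

```python
import random

SIZE = 20       # grid dimension (illustrative choice)
THRESHOLD = 4   # grains at which a cell topples

def drop_grain(grid):
    """Drop one grain at a random cell; return the avalanche size (topples)."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    topples = 0
    unstable = [(r, c)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < THRESHOLD:
            continue
        grid[i][j] -= THRESHOLD  # topple: shed four grains to neighbors
        topples += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < SIZE and 0 <= nj < SIZE:  # edge grains fall off
                grid[ni][nj] += 1
                unstable.append((ni, nj))
    return topples

random.seed(0)  # fixed seed so the run is repeatable
grid = [[0] * SIZE for _ in range(SIZE)]
sizes = [drop_grain(grid) for _ in range(5000)]
print("largest avalanche:", max(sizes))
print("median avalanche:", sorted(sizes)[len(sizes) // 2])
```

Running the sketch shows the signature of criticality: most drops cause little or nothing, while occasional identical drops set off avalanches orders of magnitude larger, which is exactly the disproportion between cause and effect described above.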
To make this metaphor more relevant to investors, replace sand grains with information. Sometimes a piece of information barely moves the market. At other times, seemingly similar information causes a big move. Models of information cascades provide some insight into why this happens.5
Interpreting the Market
Human desire to close the cause-and-effect loop combined with stock market movements that elude simple explanation can lead to some silly after-the-fact narrative. Researchers took the S&P 500 Index’s fifty biggest daily price changes from 1941 through 1987 and examined what the press reported as the cause (see exhibit 34.1). They concluded that up to half of the variance of stock prices was the result of factors other than news on fundamentals. They write:
On most of the sizable return days, however, the information the press cites as the cause of the market move is not particularly important. Press reports on subsequent days also fail to reveal any convincing accounts of why future profits or discount rates might have changed.6