Super Thinking
PORTFOLIO / PENGUIN
An imprint of Penguin Random House LLC
penguinrandomhouse.com
Copyright © 2019 by Gabriel Weinberg and Lauren McCann
Penguin supports copyright. Copyright fuels creativity, encourages diverse voices, promotes free speech, and creates a vibrant culture. Thank you for buying an authorized edition of this book and for complying with copyright laws by not reproducing, scanning, or distributing any part of it in any form without permission. You are supporting writers and allowing Penguin to continue to publish books for every reader.
Image credits appear on this page
LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA
Names: Weinberg, Gabriel, author. | McCann, Lauren, author.
Title: Super Thinking: The Big Book of Mental Models / Gabriel Weinberg and Lauren McCann.
Description: New York: Portfolio/Penguin, [2019] | Includes index.
Identifiers: LCCN 2019002099 (print) | LCCN 2019004235 (ebook) | ISBN 9780525533597 (ebook) | ISBN 9780525542810 (international edition) | ISBN 9780525533580 (hardcover)
Subjects: LCSH: Thought and thinking. | Cognition. | Reasoning.
Classification: LCC BF441 (ebook) | LCC BF441 .W4446 2019 (print) | DDC 153.4/2—dc23
LC record available at https://lccn.loc.gov/2019002099
While the authors have made every effort to provide accurate telephone numbers, internet addresses, and other contact information at the time of publication, neither the publisher nor the authors assume any responsibility for errors, or for changes that occur after publication. Further, the publisher does not have any control over and does not assume any responsibility for authors or third-party websites or their content.
Contents
Title Page
Copyright
Introduction: The Super Thinking Journey
CHAPTER ONE
Being Wrong Less
CHAPTER TWO
Anything That Can Go Wrong, Will
CHAPTER THREE
Spend Your Time Wisely
CHAPTER FOUR
Becoming One with Nature
CHAPTER FIVE
Lies, Damned Lies, and Statistics
CHAPTER SIX
Decisions, Decisions
CHAPTER SEVEN
Dealing with Conflict
CHAPTER EIGHT
Unlocking People’s Potential
CHAPTER NINE
Flex Your Market Power
CONCLUSION
Acknowledgments
Image Credits
Index
About the Authors
Introduction
The Super Thinking Journey
EACH MORNING, AFTER OUR KIDS head off to school or camp, we take a walk and talk about our lives, our careers, and current events. (We’re married.) Though we discuss a wide array of topics, we often find common threads—recurring concepts that help us explain, predict, or approach these seemingly disparate subjects. Examples range from more familiar concepts, such as opportunity cost and inertia, to more obscure ones, such as Goodhart’s law and regulatory capture. (We will explain these important ideas and many more in the pages that follow.)
These recurring concepts are called mental models. Once you are familiar with them, you can use them to quickly create a mental picture of a situation, which becomes a model that you can later apply in similar situations. (Throughout this book, major mental models appear in boldface when we introduce them to you. We use italics to emphasize words in a model’s name, as well as to highlight common related concepts and phrases.)
In spite of their usefulness, most of these concepts are not universally taught in school, even at the university level. We picked up some of them in our formal education (both of us have undergraduate and graduate degrees from MIT), but the bulk of them we learned on our own through reading, conversations, and experience.
We wish we had learned about these ideas much earlier, because they not only help us better understand what is going on around us, but also make us more effective decision makers in all areas of our lives. While we can’t go back in time and teach our younger selves these ideas, we can provide this guide for others, and for our children. That was our primary motivation for writing this book.
An example of a useful mental model from physics is the concept of critical mass, the mass of nuclear material needed to create a critical state whereby a nuclear chain reaction is possible. Critical mass was an essential mental model in the development of the atomic bomb.
Every discipline, like physics, has its own set of mental models that people in the field learn through coursework, mentorship, and firsthand experience. There is a smaller set of mental models, however, that are useful in general day-to-day decision making, problem solving, and truth seeking. These often originate in specific disciplines (physics, economics, etc.), but have metaphorical value well beyond their originating discipline.
Critical mass is one of these mental models with wider applicability: ideas can attain critical mass; a party can reach critical mass; a product can achieve critical mass. Unlike hundreds of other concepts from physics, critical mass is broadly useful outside the context of physics. (We explore critical mass in more detail in Chapter 4.)
We call these broadly useful mental models super models because applying them regularly gives you a super power: super thinking—the ability to think better about the world—which you can use to your advantage to make better decisions, both personally and professionally.
We were introduced to the concept of super models many years ago through Charlie Munger, the longtime business partner of renowned investor Warren Buffett. As Munger explained in a 1994 speech at the University of Southern California Marshall School of Business titled “A Lesson on Elementary, Worldly Wisdom as It Relates to Investment Management and Business”:
What is elementary, worldly wisdom? Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.
You’ve got to have models in your head. And you’ve got to array your experience—both vicarious and direct—on this latticework of models.
As the saying goes, “History doesn’t repeat itself, but it does rhyme.” If you can identify a mental model that applies to the situation in front of you, then you immediately know a lot about it. For example, suppose you are thinking about a company that involves people renting out their expensive power tools, which usually sit dormant in their garages. If you realize that the concept of critical mass applies to this business, then you know that there is some threshold that needs to be reached before it could be viable. In this case, you need enough tools available for rent in a community to satisfy initial customer demand, much as you need enough Lyft drivers in a city for people to begin relying on the service.
That is super thinking, because once you have determined that this business model can be partially explained through the lens of critical mass, you can start to reason about it at a higher level, asking and answering questions like these: What density of tools is needed to reach the critical mass point in a given area? How far away can two tools be to count toward the same critical mass point in that area? Is the critical mass likely to be reached in an area? Why or why not? Can you tweak the business model so that this critical mass point is reachable or easier to reach? (For instance, the company could seed each area with its own tools.)
As you can see, super models are shortcuts to higher-level thinking. If you can understand the relevant models for a situation, then you can bypass lower-level thinking and immediately jump to higher-level thinking. In contrast, people who don’t know these models will likely never reach this higher level, and certainly not quickly.
Think back to when you first learned multiplication. As you may recall, multiplication is just repeated addition. In fact, all mathematical operations based on arithmetic can be reduced to just addition: subtraction is just adding a negative number, division is just repeated subtraction, and so on. However, using addition for complex operations can be really slow, which is why you use multiplication in the first place.
For example, suppose you have a calculator or spreadsheet in front of you. When you have 158 groups of 7 and you want to know the total, you could use your tool to add 7 to itself 158 times (slow), or you could just multiply 7 × 158 (quick). Using addition is painfully time-consuming when you are aware of the higher-level concept of multiplication, which helps you work quickly and efficiently.
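To make the gap concrete, here is a minimal sketch in Python (our own illustration, not anything from the book; the figures are simply the 158 groups of 7 from the example above) contrasting the two approaches:

```python
# Totaling 158 groups of 7: repeated addition versus multiplication.
# Illustrative only; any calculator or spreadsheet makes the same point.

total_by_addition = 0
for _ in range(158):          # add 7 to itself 158 times -- slow, many steps
    total_by_addition += 7

total_by_multiplication = 7 * 158   # one step -- quick

# Both approaches agree: 158 groups of 7 total 1,106.
assert total_by_addition == total_by_multiplication == 1106
print(total_by_multiplication)
```

Both paths arrive at 1,106, but one takes 158 steps while the other takes a single step; that is the whole case for reaching for the higher-level operation.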
When you don’t use mental models, strategic thinking is like using addition when multiplication is available to you. You start from scratch every time without using these essential building blocks that can help you reason about problems at higher levels. And that’s exactly why knowing the right mental models unlocks super thinking, just as subtraction, multiplication, and division unlock your ability to do more complex math problems.
Once you have internalized a mental model like multiplication, it’s hard to imagine a world without it. But very few mental models are innate. There was a time when even addition wasn’t known to most people, and you can still find whole societies that live without it. The Pirahã of the Amazon rain forest in Brazil, for example, have no concept of specific numbers, only concepts for “a smaller amount” and “a larger amount.” As a result, they cannot easily count beyond three, let alone do addition, as Brian Butterworth recounted in an October 20, 2004, article for The Guardian, “What Happens When You Can’t Count Past Four?”:
Not having much of a number vocabulary, and no numeral symbols, such as one, two, three, their arithmetical skills could not be tested in the way we would test even five-year-olds in Britain. Instead, [linguist Peter] Gordon used a matching task. He would lay out up to eight objects in front of him on a table, and the Pirahã participant’s task was to place the same number of objects in order on the table. Even when the objects were placed in a line, accuracy dropped off dramatically after three objects.
Consider that there are probably many disciplines where you have only rudimentary knowledge. Perhaps physics is one of them? Most of the concepts from physics are esoteric, but some—those physics mental models that we present in this book—do have the potential to be repeatedly useful in your day-to-day life. And so, despite your rudimentary knowledge of the discipline, you can and should still learn enough about these particular concepts to be able to apply them in non-physics contexts.
For instance, unless you are a physicist, Coriolis force, Lenz’s law, diffraction, and hundreds of other concepts are unlikely to be of everyday use to you, but we contend that critical mass will prove useful. That’s the difference between regular mental models and super models. And this pattern repeats for each of the major disciplines. As Munger said:
And the models have to come from multiple disciplines—because all the wisdom of the world is not to be found in one little academic department. . . . You’ve got to have models across a fair array of disciplines.
You may say, “My God, this is already getting way too tough.” But, fortunately, it isn’t that tough—because 80 or 90 important models will carry about 90 percent of the freight in making you a worldly-wise person. And, of those, only a mere handful really carry very heavy freight.
Munger expanded further in an April 19, 1996, speech at Stanford Law School similarly titled “A Lesson on Elementary, Worldly Wisdom, Revisited”:
When I urge a multidisciplinary approach . . . I’m really asking you to ignore jurisdictional boundaries. If you want to be a good thinker, you must develop a mind that can jump these boundaries. You don’t have to know it all. Just take in the best big ideas from all these disciplines. And it’s not that hard to do.
You want to have a broad base of mental models at your fingertips, or else you risk using suboptimal models for a given situation. It’s like the expression “If all you have is a hammer, everything looks like a nail.” (This phrase is associated with another super model, Maslow’s hammer, which we cover in Chapter 6.) You want to use the right tool for a given situation, and to do that, you need a whole toolbox full of super models.
This book is that toolbox: it systematically lists, classifies, and explains all the important mental models across the major disciplines. We have woven all these super models together for you in a narrative fashion through nine chapters that we hope are both fun to read and easy to understand. Each chapter has a unifying theme and is written in a way that should be convenient to refer back to.
We believe that when taken together, these super models will be useful to you across your entire life: to make sense of situations, help generate ideas, and aid in decision making. For these mental models to be most useful, however, you must apply them at the right time and in the right context. And for that to happen, you must know them well enough to associate the right ones with your current circumstances. When you deeply understand a mental model, it should come to you naturally, like multiplication does. It should just pop into your head.
Learning to apply super mental models in this manner doesn’t happen overnight. Like Spider-Man or the Hulk, you won’t have instant mastery of your powers. The superpowers you gain from your initial knowledge of these mental models must be developed. Reading this book for the first time is like Spider-Man getting his spider bite or the Hulk his radiation dose. After the initial transformation, you must develop your powers through repeated practice.
When your powers are honed, you will be like the Hulk in the iconic scene from the movie The Avengers. When Captain America wants Bruce Banner (the Hulk’s alter ego) to turn into the Hulk, he tells him, “Now might be a really good time for you to get angry.” Banner replies, “That’s my secret, Captain. . . . I’m always angry.”
This is the book we wish someone had gifted us many years ago. No matter where you are in life, this book is designed to help jump-start your super thinking journey. It reminds us of another adage: “The best time to plant a tree was twenty years ago. The second best time is now.”
1
Being Wrong Less
YOU MAY NOT REALIZE IT, but you make dozens of decisions every day. And when you make those decisions, whether they are personal or professional, you want to be right much more often than you are wrong. However, consistently being right more often is hard to do because the world is a complex, ever-evolving place. You are steadily faced with unfamiliar situations, usually with a large array of choices. The right answer may be apparent only in hindsight, if it ever becomes clear at all.
Carl Jacobi was a nineteenth-century German mathematician who often said, “Invert, always invert” (actually he said, “Man muss immer umkehren,” because English wasn’t his first language). He meant that thinking about a problem from an inverse perspective can unlock new solutions and strategies. For example, most people approach investing their money from the perspective of making more money; the inverse approach would be investing money from the perspective of not losing money.
Or consider healthy eating. A direct approach would be to try to construct a healthy diet, perhaps by making more food at home with controlled ingredients. An inverse approach, by contrast, would be to try to avoid unhealthy options. You might still go to all the same eating establishments but simply choose the healthier options when there.
The concept of inverse thinking can help you with the challenge of making good decisions. The inverse of being right more is being wrong less. Mental models are a tool set that can help you be wrong less. They are a collection of concepts that help you more effectively navigate our complex world.
As noted in the Introduction, mental models come from a variety of specific disciplines, but many have value well beyond the field they come from. If you can use these mental models to help you make decisions as events unfold before you, they can help you be wrong less often.
Let us offer an example from the world of sports. In tennis, an unforced error occurs when a player makes a mistake not because the other player hit an awesome shot, but rather because of their own poor judgment or execution. For example, hitting an easy ball into the net is one kind of unforced error. To be wrong less in tennis, you need to make fewer unforced errors on the court. And to be consistently wrong less in decision making, you consistently need to make fewer unforced errors in your own life.
See how this works? Unforced error is a concept from tennis, but it can be applied as a metaphor in any situation where an avoidable mistake is made. There are unforced errors in baking (using a tablespoon instead of a teaspoon) or dating (making a bad first impression) or decision making (not considering all your options). Start looking for unforced errors around you and you will see them everywhere.
An unforced error isn’t the only way to make a wrong decision, though. The best decision based on the information available at the time can easily turn out to be the wrong decision in the long run. That’s just the nature of dealing with uncertainty. No matter how hard you try, because of uncertainty, you may still be wrong when you make decisions, more frequently than you’d like. What you can do, however, is strive to make fewer unforced errors over time by using sound judgment and techniques to make the best decision at any given time.
Another mental model to help improve your thinking is called antifragile, a concept explored in a book of the same name by financial analyst Nassim Nicholas Taleb. In his words:
Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile.