You Are Not So Smart
In another study, two groups of people who said they were very afraid of snakes were shown slides of snakes while listening to what they believed was their own heart rate. Occasionally one group would see a slide with the word “shock” printed on it. They were given a jolt of electricity when this slide appeared, and the researchers artificially amplified the sound of the heartbeat in the monitor. When the subjects were later asked to hold a snake, they were far more likely to do so than the group who hadn’t seen the shock slide or heard the fake increase in heart rate. They had convinced themselves they were more afraid of being shocked than of snakes and then used this introspection to truly become less afraid of the snakes.
Nisbett and Wilson set up their own study in a department store, where they arranged four identical pairs of nylon stockings side by side. When people came by, the researchers asked them to say which pair was the best quality. Four to one, people chose the stockings on the right-hand side even though they were all identical. When the researchers asked why, people would comment on the texture or the color, but never the position. When asked if the order of presentation had influenced their choice, they assured the scientists it had nothing to do with it.
In these and many other studies the subjects never said they didn’t know why they felt and acted as they did. Not knowing why didn’t confuse them; they instead found justification for their thoughts, feelings, and actions and moved on, unaware of the machinery of their minds.
How do you separate fantasy from reality? How can you be sure the story of your life both from long ago and minute to minute is true? There is a pleasant vindication to be found when you accept that you can’t. No one can, yet we persist and thrive. Who you think you are is sort of like a movie based on true events, which is not necessarily a bad thing. The details may be embellished, but the big picture, the general idea, is probably a good story worth hearing about.
3
Confirmation Bias
THE MISCONCEPTION: Your opinions are the result of years of rational, objective analysis.
THE TRUTH: Your opinions are the result of years of paying attention to information that confirmed what you believed, while ignoring information that challenged your preconceived notions.
Have you ever had a conversation in which some old movie was mentioned, something like The Golden Child, or maybe even something more obscure?
You laughed about it, quoted lines from it, wondered what happened to the actors you never saw again, and then you forgot about it.
Until . . .
You are flipping channels one night and all of a sudden you see The Golden Child is playing. Weird.
The next day you are reading a news story, and out of nowhere it mentions forgotten movies from the 1980s, and holy shit, there are three paragraphs about The Golden Child. You see a trailer that night at the theater for a new Eddie Murphy movie, and then you see a billboard on the street promoting Charlie Murphy doing stand-up in town, and then one of your friends sends you a link to a post at TMZ showing recent photos of the actress from The Golden Child.
What is happening here? Is the universe trying to tell you something?
No. This is how confirmation bias works.
Since the conversation with your friends, you’ve flipped channels plenty of times; you’ve walked past lots of billboards; you’ve seen dozens of stories about celebrities; you’ve been exposed to a handful of movie trailers.
The thing is, you disregarded all the other information, all the stuff unrelated to The Golden Child. Out of all the chaos, all the morsels of data, you noticed only the bits that called back to something sitting on top of your brain. A few weeks back, when Eddie Murphy and his Tibetan adventure were still submerged beneath a heap of pop culture at the bottom of your skull, you wouldn’t have paid any special attention to references to it.
If you are thinking about buying a particular make of new car, you suddenly see people driving that car all over the roads. If you just ended a longtime relationship, every song you hear seems to be written about love. If you are having a baby, you start to see babies everywhere. Confirmation bias is seeing the world through a filter.
The examples above are a sort of passive version of the phenomenon. The real trouble begins when confirmation bias distorts your active pursuit of facts.
Punditry is an industry built on confirmation bias. Rush Limbaugh and Keith Olbermann, Glenn Beck and Arianna Huffington, Rachel Maddow and Ann Coulter—these people provide fuel for beliefs; they pre-filter the world to match existing worldviews. If their filter is like your filter, you love them. If it isn’t, you hate them. You watch them not for information, but for confirmation.
Be careful. People like to be told what they already know. Remember that. They get uncomfortable when you tell them new things. New things . . . well, new things aren’t what they expect. They like to know that, say, a dog will bite a man. That is what dogs do. They don’t want to know that man bites a dog, because the world is not supposed to happen like that. In short, what people think they want is news, but what they really crave is olds . . . Not news but olds, telling people that what they think they already know is true.
—TERRY PRATCHETT, THROUGH THE CHARACTER LORD VETINARI, IN The Truth: A Novel of Discworld
During the 2008 U.S. presidential election, researcher Valdis Krebs at orgnet.com analyzed purchasing trends on Amazon. People who already supported Obama were the same people buying books that painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. Just as with pundits, people weren’t buying books for the information, they were buying them for the confirmation. Krebs has researched purchasing trends on Amazon and the clustering habits of people on social networks for years, and his research shows what psychological research into confirmation bias predicts: you want to be right about how you see the world, so you seek out information that confirms your beliefs and avoid contradictory evidence and opinions.
Half a century of research has placed confirmation bias among the most dependable of mental stumbling blocks. Journalists looking to tell a certain story must avoid the tendency to ignore evidence to the contrary; scientists looking to prove a hypothesis must avoid designing experiments with little wiggle room for alternate outcomes. Without confirmation bias, conspiracy theories would fall apart. Did we really put a man on the moon? If you are looking for proof we didn’t, you can find it.
In a 1979 University of Minnesota study by Mark Snyder and Nancy Cantor, people read about a week in the life of an imaginary woman named Jane. Throughout the week, Jane did things that showcased she could be extroverted in some situations and introverted in others. A few days passed. The subjects were asked to return. Researchers divided the people into groups and asked them to help decide if Jane would be suited for a particular job. One group was asked if she would be a good librarian; the other group was asked if she would be a good real estate agent. In the librarian group, people remembered Jane as an introvert. In the real estate group, they remembered her being an extrovert. After this, when each group was asked if she would be good at the other profession, people stuck with their original assessment, saying she wasn’t suited for the other job. The study suggests even in your memories you fall prey to confirmation bias, recalling those things that support even recently-arrived-at beliefs and forgetting those things that contradict them.
An Ohio State study in 2009 showed people spend 36 percent more time reading an essay if it aligns with their opinions. Another Ohio State study that year showed subjects clips of the parody show The Colbert Report, and people who considered themselves politically conservative consistently reported that Colbert only pretends to be joking and genuinely means what he says.
Over time, by never seeking the antithetical, through accumulating subscriptions to magazines, stacks of books, and hours of television, you can become so confident in your worldview that no one can dissuade you.
Remember, there’s always someone out there willing to sell eyeballs to advertisers by offering a guaranteed audience of people looking for validation. Ask yourself if you are in that audience. In science, you move closer to the truth by seeking evidence to the contrary. Perhaps the same method should inform your opinions as well.
4
Hindsight Bias
THE MISCONCEPTION: After you learn something new, you remember how you were once ignorant or wrong.
THE TRUTH: You often look back on the things you’ve just learned and assume you knew them or believed them all along.
“I knew they were going to lose.”
“That’s exactly what I thought was going to happen.”
“I saw this coming.”
“That’s just common sense.”
“I had a feeling you might say that.”
How many times have you said something similar and believed it?
Here’s the thing: You tend to edit your memories so you don’t seem like such a dimwit when things you couldn’t have predicted come to pass. When you learn things you wish you had known all along, you go ahead and assume you did know them. This tendency is just part of being a person, and it is called hindsight bias.
Take a look at the results of this study:
A recent study by researchers at Harvard shows as people grow older they tend to stick to old beliefs and find it difficult to accept conflicting information about topics they are already familiar with. The findings seem to suggest you can’t teach an old dog new tricks.
Of course the study showed this. You’ve known this your whole life; it’s common knowledge.
Consider this study:
A study out of the University of Alberta shows older people, with years of wisdom and a virtual library of facts from decades of exposure to media, find it much easier to finish a four-year degree ahead of time than an eighteen-year-old who has to contend with an unfinished, still-growing brain. The findings show you are never too old to learn.
Wait a second. That seems like common knowledge too.
So which is it—you can’t teach an old dog new tricks, or you are never too old to learn?
Actually, I made both of these up. Neither one is a real study. (Using fake studies is a favorite way for researchers to demonstrate hindsight bias.) Both of them seemed probable because when you learn something new, you quickly redact your past so you can feel the comfort of always being right.
In 1986, Karl Teigen, now at the University of Oslo, did a study in which he asked students to evaluate famous proverbs. When participants were given adages like “You can’t judge a book by its cover,” they tended to agree with the wisdom. What would you say? Is it fair to say you can’t judge a book by its cover? From experience, can you remember times when this was true? What about the expression “If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck”? Seems like common sense too, huh? So which is it?
In Teigen’s study, most people agreed with all the proverbs he showed them, and then agreed once again when he read to them proverbs that stated opposing views. When he asked them to evaluate the phrase “Love is stronger than fear,” they agreed with it. When he presented them the opposite, “Fear is stronger than love,” they agreed with that too. He was trying to show how what you think is just common sense usually isn’t. Often, when students and journalists and laypeople hear about the results of a scientific study, they agree with the findings and say, “Yeah, no shit.” Teigen showed this is just hindsight bias at work.
You are always looking back at the person you used to be, always reconstructing the story of your life to better match the person you are today. You have needed to keep a tidy mind to navigate the world ever since you lived in jungles and on savannas. Cluttered minds got bogged down, and the bodies they controlled got eaten. Once you learn from your mistakes, or replace bad info with good, there isn’t much use in retaining the garbage, so you delete it. This deletion of your old, incorrect assumptions de-clutters your mind. Sure, you are lying to yourself, but it’s for a good cause. You take all you know about a topic, all you can conjure up on the spot, and construct a mental model.
Right before President Nixon left for China, a researcher asked people what they thought the chances were for certain things to happen on his trip. Later, once the trip was over, knowing the outcomes, people remembered their statistical assumptions as being far more accurate than they were. The same thing happened with people who felt that another terrorist attack was likely after 9/11. When no attack happened, these people recalled having made much lower estimates of the risk of another attack.
Hindsight bias is a close relative of the availability heuristic. You tend to believe anecdotes and individual sensational news stories are more representative of the big picture than they are. If you see lots of shark attacks in the news, you think, “Gosh, sharks are out of control.” What you should think is “Gosh, the news loves to cover shark attacks.” The availability heuristic shows you make decisions and think thoughts based on the information you have at hand, while ignoring all the other information that might be out there. You do the same thing with hindsight bias, by thinking thoughts and making decisions based on what you know now, not what you used to know.
Knowing hindsight bias exists should arm you with healthy skepticism when politicians and businessmen talk about their past decisions. Also, keep it in mind the next time you get into a debate online or an argument with a boyfriend or girlfriend, husband or wife—the other person really does think he or she was never wrong, and so do you.
5
The Texas Sharpshooter Fallacy
THE MISCONCEPTION: You take randomness into account when determining cause and effect.
THE TRUTH: You tend to ignore random chance when the results seem meaningful or when you want a random event to have a meaningful cause.
Abraham Lincoln and John F. Kennedy were both presidents of the United States, elected one hundred years apart. Both were shot and killed by assassins known by three names totaling fifteen letters, John Wilkes Booth and Lee Harvey Oswald, and neither killer would make it to trial. Spooky, huh? It gets better. Kennedy had a secretary named Lincoln. They were both killed on a Friday while sitting next to their wives, Lincoln in the Ford Theater, Kennedy in a Lincoln made by Ford. Both men were succeeded by a man named Johnson—Andrew for Lincoln and Lyndon for Kennedy. Andrew was born in 1808, Lyndon in 1908. What are the odds?
In 1898, Morgan Robertson wrote a novel titled Futility. Given that it was written fourteen years before the Titanic sank, eleven years before construction on the vessel even began, the similarities between the book and the real event are eerie. The novel describes a giant boat called the Titan which everyone considers unsinkable. It is the largest ever created, and inside, it seems like a luxury hotel—just like the as yet unbuilt Titanic. Titan had only twenty lifeboats, half of what it would need should the great ship sink. The Titanic had twenty-four, also half what it needed. In the book, the Titan hits an iceberg in April four hundred miles from Newfoundland. The Titanic, years later, would do the same in the same month in the same place. The Titan sinks, and more than half of the passengers die, just as with the Titanic. The number of people on board who die in the book and the number in the future accident are nearly identical. The similarities don’t stop there. The fictional Titan and the real Titanic both had three propellers and two masts. Both had a capacity of three thousand people. Both hit the iceberg close to midnight. Did Robertson have a premonition? I mean, what are the odds?
In the 1500s, Nostradamus wrote:
Bêtes farouches de faim fleuves tranner
Plus part du champ encore Hister sera,
En caige de fer le grand sera treisner,
Quand rien enfant de Germain observa.
This is often translated to:
Beasts wild with hunger will cross the rivers,
The greater part of the battle will be against Hister.
He will cause great men to be dragged in a cage of iron,
When the son of Germany obeys no law.
That’s rather creepy, considering that it seems to describe a guy with a tiny mustache born about four hundred years later. Here is another prophecy:
Out of the deepest part of the west of Europe,
From poor people a young child shall be born,
Who with his tongue shall seduce many people,
His fame shall increase in the Eastern Kingdom.
Wow. Hister certainly sounds like Hitler, and that second quatrain seems to drive it home. Actually, many of Nostradamus’s predictions are about a guy from Germania who wages a great war and dies mysteriously. What are the odds?
If any of this seems too amazing to be coincidence, too odd to be random, too similar to be chance, you are not so smart. Allow me to explain.
Say you go on a date, and the other person reveals he or she drives the same kind of car you do. It’s a different color, but the same model. Well, that’s sort of neat, but nothing amazing.
Let’s say later on you learn your date’s mom’s name is the same as your mom’s, and your mothers have the same birthday. Hold on a second. That’s pretty cool. Maybe the hand of fate is pushing you toward the other person. Later still, you find out you both own the box set of Monty Python’s Flying Circus, and you both grew up loving Rescue Rangers. You both love pizza, but hate rutabagas. This is meant to be, you think. You are made for each other.
But, take a step back. Now take another. How many people in the world own that model of car? You are both about the same age, so your mothers are too, and their names were probably common in their time. Since you and your date have similar backgrounds and grew up in the same decade, you probably share the same childhood TV shows. Everyone loves Monty Python. Everyone loves pizza. Many people hate rutabagas.
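The deflating step back in that paragraph is really an exercise in base rates: give two people enough independent chances to match, and a few “amazing” coincidences become the expected outcome. The numbers below are illustrative assumptions, not data from the text—a minimal sketch of the idea:

```python
import random

def count_matches(num_traits=50, p_match=0.1, rng=random):
    """Count how many of num_traits independent traits two people share,
    assuming each individual trait matches with probability p_match."""
    return sum(1 for _ in range(num_traits) if rng.random() < p_match)

def prob_at_least(k, trials=100_000, **kwargs):
    """Estimate by simulation the probability of sharing at least k traits."""
    hits = sum(1 for _ in range(trials) if count_matches(**kwargs) >= k)
    return hits / trials

# With 50 chances at a 1-in-10 match (same car model, same mom's name,
# same childhood shows...), sharing three or more coincidences is close
# to a sure thing rather than the hand of fate.
print(prob_at_least(3))
```

Under these assumed numbers the estimate comes out near 0.89: finding several matches with a random date is the norm, which is exactly why the matches feel meaningful only in hindsight.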