Bushman has been doing this research for a while, and it keeps turning up the same results. If you think catharsis is good, you are more likely to seek it out when you get pissed. When you vent, you stay angry and are more likely to keep doing aggressive things so you can keep venting. It’s druglike, because there are brain chemicals and other behavioral reinforcements at work. If you get accustomed to blowing off steam, you become dependent on it. The more effective approach is to just stop. Take your anger off of the stove.
Bushman’s work also debunks the idea of redirecting your anger into exercise or a similar activity. He says exercise only maintains or increases your arousal level, and afterward you may be even more aggressive than if you had simply cooled off. Still, cooling off is not the same thing as not dealing with your anger at all. Bushman suggests you delay your response, relax, or distract yourself with an activity totally incompatible with aggression.
If you get into an argument, or someone cuts you off in traffic, or you get called an awful name, venting will not dissipate the negative energy. It will, however, feel great. That’s the thing. Catharsis will make you feel good, but it’s an emotional hamster wheel. The emotion that led you to catharsis will still be there afterward, and if the catharsis made you feel good, you’ll seek that emotion out again in the future.
32
The Misinformation Effect
THE MISCONCEPTION: Memories are played back like recordings.
THE TRUTH: Memories are constructed anew each time from whatever information is currently available, which makes them highly permeable to influences from the present.
One night your friend tells a story about the time both of you watched Cool Hand Luke and decided to try to eat as many hard-boiled eggs as you could stomach, but you got sick after five and swore never to eat them again. You are both laughing and clinking your glasses at the folly of your youth when another friend blows your mind by saying, “No, that was me. You weren’t even there.”
Your mind reels as the pages of your own comic book flip by. You search the panels for scenes that could confirm or deny whether you have lost your mind, but you can’t find conclusive evidence for either person’s account. Who ate those eggs?
Maybe it’s not this extreme, but every once in a while someone tells a story that conflicts with your recollection. The person embellishes with details that slipped past your mental fact-checkers. When you notice, as above, it is a truly unsettling experience, because normally you are oblivious to your faulty reconstruction of memory. Not only is your memory easily altered by the influence of others, you also smooth over the incongruities, rearrange timelines, and invent scenarios, yet you rarely notice you’re doing this until you see yourself in a video or hear another person’s version of events. You tend to see your memories as a continuous, consistent movie, yet if you think of the last film you saw, how much of it can you recall? Could you sit back, close your eyes, and recall in perfect detail every scene, every line of dialogue? Of course not, so why do you assume you can do the same for the movie of your life?
Take out a piece of paper and get ready to write. Really do it; it will be fun.
OK.
Now, read the following list of words out loud one time, and then try to write as many of them as you can remember on the paper without looking back. When you think you have them all down on paper, come back to the book.
Go:
door, glass, pane, shade, ledge, sill, house, open, curtain, frame, view, breeze, sash, screen, shutter
Now, take a look at the list. How did you do? Did you write down all the words? Did you write the word “window” down? If this test is presented properly, 85 percent of people taking it will remember seeing “window” in the list, but it isn’t there. If you did, you just gave yourself a false memory thanks to the misinformation effect.
In 1974, Elizabeth Loftus at the University of Washington conducted a study in which people watched films of car crashes. She then asked the participants to estimate how fast the cars were going, but she divided the people into groups and asked the question differently for each. These were the questions:
• About how fast were the cars going when they smashed into each other?
• About how fast were the cars going when they collided into each other?
• About how fast were the cars going when they bumped into each other?
• About how fast were the cars going when they hit each other?
• About how fast were the cars going when they contacted each other?
The people’s answers in miles per hour averaged like this:
• Smashed—40.8
• Collided—39.3
• Bumped—38.1
• Hit—34.0
• Contacted—31.8
Just by changing the wording, Loftus altered her subjects’ memories. The car crashes were replayed in the participants’ minds, but this time the word “smashed” required the new version of the memory to include cars going fast enough to justify the verb.
Loftus raised the ante by asking the same people if they remembered broken glass in the film. There was no broken glass, but sure enough the people who were given the word “smashed” in their question were twice as likely to remember seeing it.
Since then, hundreds of experiments into the misinformation effect have been conducted, and people have been convinced of all sorts of things. Screwdrivers become wrenches, white men become black men, and experiences involving other people get traded back and forth. In one study, Loftus convinced people they were once lost in a shopping mall as a child. She had subjects read four essays provided by family members, but the one about getting lost as a kid was fake. A quarter of the subjects incorporated the fake story into their memory and even provided details about the fictional event that were not included in the narrative. Loftus even convinced people they shook hands with Bugs Bunny, who isn’t a Disney character, when they visited Disney World as a kid, just by showing them a fake advertisement where a child was doing the same. She altered the food preferences of subjects in one experiment where she lied to people, telling them they had reported becoming sick from eating certain things as a child. A few weeks later, when offered those same foods, those people avoided them. In other experiments, she implanted memories of surviving drowning and fending off animal attacks—none of them real, all of them accepted into the autobiography of the subjects without resistance.
Loftus has made it her life’s work to showcase the unreliability of memory. She has railed against eyewitness testimony and suspect lineups for decades now, and she has also criticized psychologists who say they can dredge up repressed memories from childhood. For instance, in one of her experiments she had subjects watch a staged crime and then select the culprit out of a lineup. The police told the subjects the perpetrator was one of the people standing before them, but it was a trick. None of them was the real suspect, yet 78 percent of the people still identified one of the innocent people as the person they saw committing the crime. Memory just doesn’t work like that, Loftus says, but despite this, many of our institutions and societal norms persist as though it does.
There are many explanations as to why this happens, but the effect is well established and predictable. Scientists generally agree memories aren’t recorded like videos or stored like data on a hard drive. They are constructed and assembled on the spot, as if from a bucket of Legos in your brain. Neurologist Oliver Sacks wrote in An Anthropologist on Mars about a painter who became colorblind after a brain injury. Not only could he not see certain colors, he couldn’t imagine or remember them. Memories of cars and dresses and carnivals were suddenly drained, washed out. Even though this patient’s memories were first imprinted when he could see color, they could now be conjured up only with the faculties of his current imagination. Each time you build a memory, you make it from scratch, and if much time has passed you stand a good chance of getting the details wrong. With a little influence, you might get the big ones wrong.
In 2001, Henry L. Roediger III, Michelle L. Meade, and Erik T. Bergman at Washington University had students list ten items they would expect to see in a typical kitchen, toolbox, bathroom, and other common areas in most homes. Think about it yourself. What ten items would you expect to find in a modern kitchen? This idea, this imaginary place, is a schema. You have schemas for just about everything—pirates, football, microscopes—images and related ideas that orbit the archetypes for objects, scenarios, rooms, and so on. Those archetypes form over time as you see examples in life or in stories from other people. You also have schemas for places you’ve never been, like the bottom of the ocean or ancient Rome.
For instance, when you imagine the ancient Romans, do you see chariots and marble statues with bone-white columns stretching overhead? You probably do, because this is how ancient Rome is always depicted in movies and television. Would it surprise you to know those columns and sculptures were painted with a rainbow of colors that would be gaudy by today’s aesthetic standards? They were. Your schema is fast, but inaccurate. Schemas function as heuristics; the less you have to think about these concepts the faster you can process thoughts that involve them. When a schema leads to a stereotype, a prejudice, or a cognitive bias, you trade an acceptable level of inaccuracy for more speed.
Back to the experiment. After the psychologists had the students list items they’d expect to find in various household locations, they brought in actors posing as a new batch of students and paired them up with the students who’d just made their lists. Together, the subjects and the confederates looked at slides depicting the familiar locations and were asked to pay close attention to what they saw so they could remember it later on. To clear their mental palates, the subjects did some math problems before moving on to the last part of the experiment. The students then returned with their partners and together recalled out loud what they remembered in the scenes, but the confederates included items that weren’t in the pictures. The kitchen scene, for example, didn’t feature a toaster or oven mitts, but both were falsely recalled by the actors. After the ruse, the subjects were handed a sheet of paper and asked to list all the things they could remember.
As you’ve deduced by now, the subjects were easily implanted with false memories for items they expected to be in the scenes. They listed items that were never shown but had been suggested by their partners. Their schemas for kitchens already included toasters and oven mitts, so when the actors said they saw those things, it was no problem for their minds to go ahead and add them to the memory. If their partners had instead said they remembered seeing a toilet bowl in the kitchen, it would have been harder to accept.
In 1932, psychologist Frederic Bartlett presented a folktale from American Indian culture to subjects and then asked them to retell the story back to him every few months for a year. Over time, the story became less like the original and more like a story that sounded as though it came from the culture of the person recalling it.
In the original story, two men from Egulac are hunting seals along a river when they hear what they believe are war cries. They hide until a canoe with five men approaches. The men ask them to join them in a battle. One man agrees; the other goes home. After this, the story gets confusing because in the battle someone hears someone else say the men are ghosts. The man who traveled with the warriors is hit, but it isn’t clear what hits him or who. When he gets home, he tells his people what happened, saying he fought with ghosts. In the morning, something black comes out of his mouth, and he dies.
The story is not only strange, but written in an unusual way that makes it difficult to understand. Over time, the subjects reshaped it to make sense to them. Their versions became shorter, more linear, and many details were left out that didn’t make sense in the first place. The ghosts became the enemy, or became the allies, but usually became a central feature of the tale. Many people interpreted them to be the undead, even though in the tale the word “ghost” identifies the name of the clan. The dying man is tended to. The seal hunters become fishermen. The river becomes a sea. The black substance becomes his soul escaping or a blood clot. After a year or so, the stories started to include new characters, totems, and ideas never present in the original, like the journey as a pilgrimage, or the death as a sacrifice.
Memory is imperfect, but also constantly changing. Not only do you filter your past through your present, but your memory is easily infected by social contagion. You incorporate the memories of others into your own head all the time. Studies suggest your memory is permeable, malleable, and evolving. It isn’t fixed and permanent, but more like a dream that pulls in information about what you are thinking about during the day and adds new details to the narrative. If you suppose it could have happened, you are far less likely to question yourself as to whether it did.
The shocking part of these studies is how easily memory gets tainted, how only a few iterations of an idea can rewrite your autobiography. Even stranger is how, as memories change, your confidence in them grows stronger. Considering the relentless bombardment of your thoughts and emotions by friends, family, and all media: How much of what you recall is accurate? How much of the patchwork is yours alone? What about the stories handed down through time or across a dinner table; what is the ratio of fiction to fact? Considering the misinformation effect not only requires you to be skeptical of eyewitness testimony and your own history, but it also means you can be more forgiving when someone is certain of something that is later revealed to be embellished or even complete fiction.
Consider the previous exercise when you falsely saw curtains in the list of things around a window. It took almost no effort to implant the memory because you were the one doing the implanting. Recognize the control you have over—wait, was it curtains?
33
Conformity
THE MISCONCEPTION: You are a strong individual who doesn’t conform unless forced to.
THE TRUTH: It takes little more than an authority figure or social pressure to get you to obey, because conformity is a survival instinct.
On April 4, 2004, a man calling himself Officer Scott called a McDonald’s in Mount Washington, Kentucky. He told the assistant manager, Donna Jean Summers, who answered the phone, there had been a report of theft and that Louise Ogborn was the suspect.
Ogborn, eighteen, worked at the McDonald’s in question, and the man on the other end of the line told Donna Jean Summers to take her into the restaurant’s office, lock the door, and strip her naked while another assistant manager watched. He then asked her to describe the naked teenager to him. This went on for more than an hour, until Summers told Officer Scott she had to return to the counter and continue her duties. He asked her if her fiancé could take over, and so she called him to the store. He arrived shortly after, took the phone, and then started following instructions. Officer Scott told him to tell Ogborn to dance, do jumping jacks, and stand on furniture in the room. He did. She did. Then Officer Scott’s requests became more sexual. He told Summers’s fiancé to make Ogborn sit in his lap and kiss him so he could smell her breath. When she resisted, Officer Scott told him to spank her naked bottom, which he did. More than three hours into the ordeal, Officer Scott eventually convinced Summers’s fiancé to force Ogborn to perform oral sex while he listened. He then asked for another man to take over, and when a maintenance worker was called in to take the phone, he asked what was going on. He was shocked and skeptical. Officer Scott hung up.
The call was one of more than seventy made over the course of four years by one man pretending to be a police officer. He called fast-food restaurants in thirty-two states and convinced people to shame themselves and others, sometimes in private, sometimes in front of customers. With each call he claimed to be working with the parent corporation, and sometimes he said he worked for the bosses of the individual franchises. He always claimed a crime had been committed. Often, he said investigators and other police officers were on their way. The employees dutifully did as he asked, disrobing, posing, and embarrassing themselves for his amusement. Police eventually captured David Stewart, a Florida prison security guard who had in his possession a calling card that was traced back to several fast-food restaurants, including one that had been hoaxed. Stewart went to court in 2006 but was acquitted. The jury said there wasn’t enough evidence to convict him. There were no more hoax phone calls after the trial.
What could have made so many people follow the commands of a man they had never met and who offered no proof that he was a police officer?
If I were to hand you a card with a single line on it, and then hand you another card with an identical line drawn near two others, one longer and one shorter, do you think you could match up the original to the copy? Could you tell which line in a group of three was the same length as the one on the first card?
You could. Just about anyone would be able to match up lines of equal length in just a few seconds. Now, what if you were part of a group trying to come to a consensus, and the majority of the people said a line that was clearly shorter than the original was the one that matched? How would you react?
In 1951, psychologist Solomon Asch performed an experiment in which he would gather a group of people and show them cards like the ones described above. He would then ask the group the same sort of questions. Without coercion, about 2 percent of people answered incorrectly. In the next run of the experiment, Asch added actors to the group who all agreed to answer his questions incorrectly. If he asked which line was the same, or longer, or shorter, or whatever, they would force one hapless subject to stand alone in disagreement.
You Are Not So Smart Page 16