According to Wakefield, however, that “mutual agreement” wasn’t the result of a dispute over the tactics he used to reach his conclusions—it was about the implications of his controversial results. “I have been asked to go because my research results are unpopular,” he said at the time. “I did not wish to leave but I have agreed to stand down in the hope that my going will take the political pressure off my colleagues and allow them to get on with the job of looking after the many sick children we have seen.” With a note of defiance in his voice, he added, “I have no intention of stopping my investigations.”
For years, Wakefield had made some variation on this theme the linchpin of his defense whenever he came under scrutiny. Three years earlier, in a response to criticism of his 1998 Lancet paper on the MMR vaccine and autism, he’d trotted out his loyalty to the people he referred to as “desperate parents” and used them to divert attention from the questions that had been raised about his work: “We get these parents ringing up every day,” he told a reporter from The Independent at the time. “They say, ‘My child has autism and bowel problems and we believe they are linked.’ You have to do something for them.”
The genius of Wakefield’s explanation was that it neutralized the charge that he was behaving unethically—if parents were eager for their children to receive endoscopies, what grounds did anyone have for criticizing the doctor who performed them?—while also making the “establishment” figures who opposed him seem patronizing in their willingness to tell parents what not to do despite being unable to suggest any alternatives. As the doctors and scientists with whom he’d once worked went about cutting their ties, Andrew Wakefield bound himself more tightly with each passing day to the parents on whom he’d base his future.
On January 14, 2002, two months after Theresa Cedillo met Wakefield in San Diego, the Cedillos changed their Vaccine Court encephalopathy claim to what was referred to as a “causation-in-fact” case alleging that the MMR vaccine had induced their daughter’s autism. This was a huge roll of the dice, and it all but guaranteed that the litigation would be a major part of the Cedillos’ lives for years to come. Had they stuck with their original table injury claim, the case would have been a straightforward one about a condition already acknowledged to be caused by vaccination in some situations. By making their case about autism, the Cedillos were creating a far more daunting and time-consuming burden of proof for themselves: Before they could even begin to argue that Michelle’s autism had been caused by the MMR vaccine, they would need to prove the general form of the charge—that vaccines were capable of causing autism in anyone. That meant that in order for the Cedillos to receive compensation, the court would need to be convinced that the growing scientific consensus that vaccines did not cause autism was, in fact, incorrect.
The extent to which the Cedillos had complicated their task was highlighted in the announcement the court’s Chief Special Master made that July regarding the Omnibus Autism Proceeding: “[P]etitioners’ representatives have stated that they are not prepared to present their causation case” because they needed more time “for the science to crystallize, to obtain experts, and in general to prepare their proof concerning the difficult medical and legal causation issues.” This was a stunning admission: Despite encouraging thousands of families to file suit, their lawyers had absolutely no proof that the fundamental premise on which they based their claims was correct. As a result, both sides were given sixteen months—until November 2003—to “designate experts” and begin assembling their strongest possible arguments. The deadline for submitting “expert reports with supporting authorities” would come three months after that, and decisions were expected to be handed down in the summer of 2004—which meant that the Cedillos would have to wait a minimum of two more years before they could begin to move on with their lives.
43 There were some exceptions to this rule, the most notable being that there could be no other immediately obvious explanation for the injury. In other words, even though anaphylactic shock was listed as a table injury for the polio vaccine, someone with a severe shellfish allergy wouldn’t be granted automatic relief in the Vaccine Court just because he’d been immunized a couple of hours before eating a plateful of shrimp scampi. A more opaque scenario arose when a petitioner who’d displayed evidence of brain injury before being vaccinated was awarded damages after his symptoms continued to progress following vaccination. That decision was eventually reversed by a unanimous Supreme Court ruling in which David Souter explained the justices’ decision to disallow awards for preexisting conditions with the phrase “One injury, one onset.”
44 Today, the court allows for the reimbursement of “reasonable lawyers’ fees and other legal costs.” There remains a $250,000 cap on “pain and suffering,” but there is no upper limit on the amount of money claimants can receive to cover medical care and lost wages.
45 The Redwoods were never able to file a Vaccine Court claim: In 2001, more than six years had passed since Will Redwood had received the vaccines his parents believed had caused his autism.
CHAPTER 16
COGNITIVE BIASES AND AVAILABILITY CASCADES
For the Cedillos, the decision to initiate litigation was undoubtedly the result of a number of interrelated factors, including love for their daughter, anger at the vaccines that they believed had harmed her, and the hope that they’d be able to alleviate her suffering. Conventional wisdom holds that the more emotional a decision, the less rational it is, but in his 2009 book How We Decide, Jonah Lehrer explains why that is not always the case. By way of illustration, Lehrer describes a patient named Elliot who in 1982 had a tumor removed from an area of the brain just behind the frontal cortex. Elliot survived his surgery with his intellect intact—his IQ was exactly the same before and after he went under the knife—but he completely lost the ability to make decisions, ranging from what route to take to work to what color pen to use to write his name. Soon, one of his doctors realized there’d been another significant change in Elliot’s personality: He’d seemingly lost the ability to feel.
“This was a completely unexpected discovery,” Lehrer writes. “At the time, neuroscience assumed that human emotions were irrational”—which would mean that an inability to feel should make it easier to make decisions. It turns out the exact opposite is true: “When we are cut off from our feelings, the most banal decisions become impossible. A brain that can’t feel can’t make up its mind.”
Today, there’s an almost universal acceptance that what has traditionally been perceived as “rational” thought is in fact intimately connected with our emotions. This discovery has led to an explosion of interest in the cognitive biases we use to convince ourselves that the truth lies with what we feel rather than with what the evidence supports. The origins of many of these traits can be traced back to the primitive conditions under which they were selected for, millennia ago. Take pattern recognition, which evolutionary biologists like to explain through fables about our ancestors: Imagine a primitive hunter-gatherer. Now imagine he sees a flicker of movement on the horizon, or hears a rustle at his feet. Maybe it was nothing—or maybe it was a lion out hunting for dinner or a snake slithering through the grass. In either case, the consequence of failing to take an actual threat seriously is likely to be death—and the end of that particular individual’s genetic line. On the other hand, the repercussions of bolting from what turns out to be the shadow of a swaying tree or the sound of a gentle breeze will likely be nothing worse than a little extra exercise. (In statistical terms, this explains why we’re much more likely to make Type I errors, or false positives, than Type II errors, or false negatives.)
Unfortunately, evolution is a blunt tool, and a by-product of that protective instinct is a tendency to connect the dots even when there are no underlying shapes to be drawn. When our yearning to feel in control and our ability to recognize randomness are in conflict, the urge to feel in control almost always wins—as was likely the case when Lorraine Pace became convinced that there was an unusually high number of breast cancer cases in her Long Island community. (The technical name for this tendency is the clustering illusion.)
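To get a sense of how easily chance alone produces what looks like a meaningful cluster, consider a minimal simulation (a sketch of my own, not anything drawn from Pace's study; the figures of 100 communities and 1,000 cases are arbitrary assumptions): scatter the cases uniformly at random across identical communities and then look at how uneven the counts come out.

import random

# Illustrative sketch of the clustering illusion: distribute cases uniformly
# at random across identical communities and see how lopsided "pure chance"
# can look. The specific numbers below are arbitrary assumptions.
random.seed(0)
communities = 100
cases = 1000
counts = [0] * communities
for _ in range(cases):
    counts[random.randrange(communities)] += 1

average = cases / communities  # 10 cases per community, on average
print("most cases in a single community:", max(counts))
print("fewest cases in a single community:", min(counts))
print("communities at least 50% above average:",
      sum(1 for c in counts if c >= 1.5 * average))

Run a sketch like this a few times and some communities will routinely show close to double the average number of cases, a disparity that feels as if it demands an explanation even though nothing but randomness produced it.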
Pattern recognition and the clustering illusion are just two of the dozens of cognitive biases that have been identified over the past several decades. Some of the others have been alluded to earlier in this book: When SafeMinds members set out to write an academic paper about a hypothesis they already believed to be true, they set themselves up for expectation bias, where a researcher’s initial conjecture leads to the manipulation of data or the misinterpretation of results, and selection bias, where the meaning of data is distorted by the way in which it was collected. In addition to being a natural reaction to the experience of cognitive dissonance, the hardening conviction on the part of vaccine denialists in the face of studies that undercut their theories is an example of the anchoring effect, which occurs when we give too much weight to the past when making decisions about the future, and of irrational escalation, which is when we base how much energy we’ll devote to something on our previous investment and discount new evidence indicating we were likely wrong. (Remember that the next time you refuse to turn around despite signs that you’ve been traveling in the wrong direction or you hold on to a stock because you’re convinced you’ll be able to make back the money you’ve already lost.) My favorite of the cognitive biases, if only on the grounds of its name, refers to the phenomenon of crafting a hypothesis to fit data you’ve already collected, and in the process making it untestable. This is called the Texas sharpshooter fallacy, named after an imaginary cowboy who confirms his skill as a marksman by shooting bullets into the side of a barn and then drawing a target around the resulting holes.
These realities of human cognition help explain why it can be so difficult to demonstrate to someone that their initial read on a situation—their instinct, their gut reaction, their feeling—is, in fact, wrong. They also show why two reasonable, intelligent people who disagree can be equally certain that the evidence supports their understanding of the “facts.” It’s at this point that confirmation bias, the granddaddy of all cognitive biases, kicks into action—which is to say, it’s at the precise moment when we should be looking for reasons that we might be wrong that we begin to overvalue any indication that points to our being right. (This is part of the reason the scientific method can be so hard to grasp, and so hard to adhere to: It goes against our makeup to try to find ways to punch holes in our own arguments.) Misapprehensions about medicine are particularly vulnerable to the effects of confirmation bias, because the process by which a given intervention works is so often contra-logical: It makes no intuitive sense that rebreaking a bone would help a fracture to heal or that using chemotherapy to kill living tissue would help a person survive cancer. Now consider vaccines. Receiving a shot in a doctor’s office might not activate the disgust response in the way that early inoculation methods did, but injecting a healthy child with a virus in order to protect him from a disease that has all but disappeared still feels somehow wrong. The fact that getting vaccinated is so obviously painful and that infants can’t understand that you want to help and not hurt them only makes matters worse.
Confirmation bias, like all the unconscious legerdemains referred to above, is a phenomenon that occurs on an individual level. The extent to which the autism advocacy world has been taken over by anti-vaccine sentiment illustrates a whole other category of cognitive sleights-of-hand: those that arise out of group dynamics. Ten years ago, the ARI/DAN! meeting held in San Diego every fall was the only major national autism conference in the country. In 2010, in addition to ARI conventions in Baltimore, Maryland, and Long Beach, California, there was the annual AutismOne event in Chicago, along with Talk About Curing Autism conferences in Birmingham, Alabama, Madison, Wisconsin, and Orange County, California, and dozens of state- and county-wide meetings sponsored by groups like the National Autism Association (NAA).
Wherever and whenever these gatherings occur, one of the dominant themes ends up being the dangers of vaccines and the venality of the medical establishment. This is not because participants arrive eager to attack the American Academy of Pediatrics or the CDC. Instead, the animus is a natural outgrowth of people with similar interests getting together in the first place: Inevitably, group members leave with views that are simultaneously more extreme and more similar to each other’s than the ones they came in with. As an example, consider the current views of three people whose first exposure to an international network of parents with autistic children came at the 2001 ARI/DAN! conference. Vicky Debold says she was motivated to travel to San Diego by her frustration with the health care industry she worked in and had always trusted, while Theresa Cedillo was enticed by the prospect of learning about ways she might be able to help her daughter. For Jane Johnson, the convention provided a sense of emotional connectedness that had been missing from her life. “I looked out at an audience of hundreds of people, and they knew how I felt,” she says. “To be surrounded by a thousand other people who know [your] pain is really powerful.” That conference, Johnson says, was the first of eight consecutive ones at which she was brought to tears.
Nine years later, the opinions of all three women have coalesced around an opposition to the way vaccines are developed and administered. In a recent conversation, Johnson told me that public health officials around the world are potentially more culpable than the tobacco company executives who hid evidence that smoking is harmful: “This is the federal government giving every kid a carton of cigarettes and saying, ‘Get to work.’ ” Vicky Debold, whose reaction to hearing Barbara Loe Fisher in May 2000 was, “You wouldn’t be saying and doing the kinds of things you’re doing if you had seen the kids die in the ICU that I saw,” has become one of the leaders of Fisher’s National Vaccine Information Center. And Theresa Cedillo lent her daughter’s name to the largest vaccine-related compensation lawsuit in the world.
The process that explains this convergence of views is called an “availability cascade,” a concept that was first articulated in a 1999 paper by Timur Kuran, an economics and political science professor at Duke who was then at the University of Southern California, and Cass Sunstein, who currently heads up the White House’s Office of Information and Regulatory Affairs and was at the time a professor at the University of Chicago Law School. Kuran and Sunstein defined the term as a “self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception increasing plausibility through its rising availability in public discourse.”46 Another way of saying this is that an availability cascade describes how the perception that a belief is widely held—the “availability” of that idea—can be enough to make it so. In this instance, the believability of the notion that vaccines cause autism has grown in proportion to the number of people talking about it, as opposed to the theory’s actual legitimacy.
This blind-leading-the-blind phenomenon is all the more dangerous because one all-but-guaranteed effect of surrounding yourself with like-minded people is increased polarization: Think about the red-meat frenzy of the quadrennial Democratic and Republican conventions or the sign-waving mania churned up by a Tea Party rally. Or to come at it from a different angle, consider what happens when a bunch of otherwise reasonable Red Sox fans get together to watch a baseball game in a bar: It’s a fair bet that nobody’s going home that night feeling any warmer toward the Yankees.47
One reason this us-versus-them mentality has become so ubiquitous is that the mechanisms that have historically allowed for easy access to moderate positions are rapidly disappearing. Twenty years ago, it took a fair amount of effort to create an information cocoon: In 1987, nearly three-quarters of Americans tuned in to a nightly news broadcast from one of the three networks, creating a sort of national common denominator for information about the world. Now that figure has fallen below one-third, as consumers abandon the presumed neutrality of the networks in favor of cable news telecasts that gratify viewers by feeding them exaggerated versions of the opinions they already hold. An even more potent force in this regard is the Internet, where it’s easier than not to fall down a wormhole of self-referential and mutually reinforcing links that make it feel like the entire world thinks the way you do. The anonymity and lack of friction inherent in the online world also mean that a small number of committed activists—or even an especially zealous individual—can create the impression that a fringe viewpoint has broad support. (Kuran and Sunstein refer to these people as “availability entrepreneurs.”) A 2007 study titled “Inferring the Popularity of an Opinion from Its Familiarity: A Repetitive Voice Can Sound like a Chorus” provides statistical evidence of this phenomenon: The authors analyzed a series of group discussions and found that participants tended “to infer that a familiar opinion [was] a prevalent one, even when its familiarity derive[d] solely from the repeated expression of one group member.”
When dealing with debates that hinge on the interpretation of facts—Are April snowstorms a sign of global warming or the opposite? Are the latest unemployment numbers a sign the Obama administration’s policies are succeeding or failing?—these differences of opinion can quickly become personal: When confronted with people who see black where we see white, it’s hard not to assume that there’s something about them—either they’re ignorant, or their emotions have gotten the better of them, or they’re lying in the service of some baser interest—that prevents them from acknowledging what to us is self-evident. Looked at in this light, the seemingly contradictory ideas that have defined the vaccine wars begin to make more sense. The inner workings of the human mind, more than any ambiguity about the facts, explain why more and more parents were rushing to file claims in Vaccine Court at precisely the same time that scientists were growing more certain that vaccines did not cause autism.