
The Panic Virus


by Seth Mnookin


  * * *

  One final legacy of the polio vaccine’s scandal-tinged rollout involves a little-known agency of crack infectious disease specialists housed within the CDC. The Epidemic Intelligence Service (EIS) was the brainchild of a public health giant named Alexander Langmuir, who in 1951 convinced federal health officials that the country needed a team of medical doctors to conduct the first wave of investigation whenever new and disturbing disease patterns emerged.

  It had been the EIS that had first traced the defective batches of polio vaccine back to Cutter Laboratories, and the speed with which it worked had been a major factor in preventing the total collapse of the polio campaign. But in the same investigation in which Langmuir’s ground troops had fingered Cutter, they discovered that Wyeth Pharmaceuticals, one of the other manufacturers licensed to make the vaccine, had also produced defective batches. When confronted with the possibility that a second revelation could spark a panic that would fatally cripple the vaccine program, Langmuir concluded that the risks attributable to Wyeth were small enough that it would be better to keep quiet. Seven years later, Langmuir faced a similar decision after EIS investigators found that in very rare cases—less than one person out of every million vaccinated—the oral polio vaccine developed by Albert Sabin caused paralysis. By that point, polio had all but vanished from the United States. Given this situation, Langmuir chose to bury the data implicating Sabin’s vaccine, calculating that it was better to avoid the potential reemergence of the virus than to let the public know all the facts. Whether he made the correct decision is a subject for medical ethicists and philosophers to reason through. What is beyond debate is that in the years to come, public health officials learned the disastrous consequences of failing to educate the public about the risks and realities of fighting disease.

  10 This method of viral transmission occurs much more frequently than we’d like to admit. Viruses that are transmitted through the ingestion of animal fecal matter are also quite common. Take toxoplasmosis, a parasitic disease that occurs most commonly in cats: In the United States, approximately one-third of the population has been infected; in France, the rate is closer to two-thirds. Food for thought the next time you don’t wash your hands before a meal.

  11 In 2003, an academic paper presented convincing evidence that Roosevelt’s age when he first got sick (he was thirty-nine) and the disease’s progression made it likely that he had actually suffered from Guillain-Barré syndrome, an autoimmune disorder that attacks the nervous system and was all but unknown at the time.

  12 Reporters on site had received an embargoed copy of the report at 9:15 a.m., which was delivered under the protection of four policemen. Proving that the media’s frenzy for beating competitors by mere minutes is not a product of the Internet age, NBC immediately broke the embargo, and was just as quickly denounced by its competitors as forever tainting the sanctity of agreements made between reporters and their sources.

  13 The difficulty in determining whether correlation equals causation causes an enormous number of misapprehensions. Until a specific mechanism demonstrating how A causes B is identified, it’s best to assume that any correlation is incidental, or that both A and B relate independently to some third factor. An example that highlights this is the correlation between drinking milk and cancer rates, which some advocacy groups (including People for the Ethical Treatment of Animals) use to argue that drinking milk causes cancer. A more likely explanation is that cancer diagnoses and milk consumption both have a positive correlation with increased age: On average, milk drinkers live longer than non–milk drinkers, and the older you are, the more likely you are to develop cancer. This does not, however, mean that drinking milk actually causes people to live longer: It could be that people who drink milk have better access to high-quality health care or eat more healthily than those who do not.

  14 Belli’s tort-related work, which resulted in more than $600 million in damages by the time he died in 1996, was not all he was known for: He served as Jack Ruby’s pro bono counsel in his trial for the murder of Lee Harvey Oswald; he received a letter from the Zodiac Killer; in 1969, he helped set up the Rolling Stones–headlined concert at Northern California’s Altamont Speedway at which a member of the Hell’s Angels stabbed a fan to death; over the course of his career, he represented everyone from Mae West to Zsa Zsa Gabor and Chuck Berry to Muhammad Ali; he accused his fifth (of six) wife of throwing their pet dog off the Golden Gate Bridge; and he played an evil overlord named Gorgan in a Star Trek episode titled “And the Children Shall Lead.” Ironically, one of his last victories led to his financial ruin: In 1995, Belli won a class-action lawsuit against breast implant manufacturer Dow Corning, but his firm was unable to recover the $5 million it had spent on the trial when the company declared bankruptcy.

  CHAPTER 4

  FLUORIDE SCARES AND SWINE FLU SCANDALS

  Even taking into account the Cutter Incident, the polio vaccine was, by any objective standard, an enormous success. Within a half-decade of its introduction, cases of paralytic polio in the United States decreased by almost 90 percent. Parents were no longer afraid to let their children play outside in the summer, and the sight of a child stuck inside an iron lung became a rare occurrence. In its 1962 annual report, the New York City Health Department declared that the country was on the verge of winning mankind’s centuries-old “battle against infectious disease” and would soon be ready to commence a “great era of cold war against chronic diseases for which we do not have biologic cures.”

  In order to reach this victory, the government’s role in vaccinations had to evolve from sporadic advocacy to codified law. One of the first indications of this shift was the 1964 formation of a federal advisory committee that would help shape national immunization practices. Soon, state legislatures joined in the effort: In 1969, New York became the first state to enact a compulsory school vaccination law. By the mid-1970s, thirty-nine other states had followed suit.

  The rapid expansion of vaccination law established freedom from disease as a national priority and not just an individual prerogative. The benefits of these laws were felt almost immediately. Before the measles vaccine was introduced in 1963, the country averaged more than 500,000 cases annually. By 1968, only 22,000 cases were reported—a decline of more than 95 percent.

  The retreat of measles and other viruses rippled through nearly every aspect of American life. A 2006 study by researchers at the Harvard School of Public Health estimated that the polio vaccine alone has saved the United States more than $180 billion, a bottom-line quantification of the more than one million cases of paralytic polio and 160,000 deaths it has prevented. (To put that number in context: The population of Providence, Rhode Island, is 171,000.) The benefits of widespread vaccination have been particularly acute for women, who, no longer needing to spend weeks quarantined at home with sick children, have had greater freedom to join the workforce.

  Just as the benefits of this patchwork of municipal, state, and federal regulations were becoming clear, so too were the shortcomings of the hasty way these laws had been constructed. Physicians were devising their own health policies without repercussions: Some counseled against receiving vaccines while others declined to administer them altogether. School vaccination requirements were routinely ignored. The poor were perpetually under-immunized, either because of their lack of access to medical care, their suspicions about the motives of public health workers, or both. As a result, measles cases surged just as the disease appeared poised to disappear: In 1970, there were more than twice as many infections as there’d been two years earlier. The next year, the figure rose another 65 percent.

  The persistence of measles highlighted once again the inescapable paradox of vaccines: When a disease is endemic, any potential side effects of its vaccine appear slight compared to the risks of not getting immunized. The flip side is that the more effective a given vaccine, the more the disease it targets will become an abstraction. (Who born in the last fifty years can remember seeing a case of polio?) This is a well-known catch-22 for public health officials, who constantly fear that vaccines will start to feel, like vegetables or vitamins, like something one could just as easily do without.

  Too often, this quandary has led to a failure to acknowledge potential side effects. Like debts taken on without a plan for paying them off, these unacknowledged risks cause even greater harm in the long run: When officials gloss over concerns that prove to be legitimate, they raise the suspicions of everyone who takes for granted that public health policy is designed to do exactly what it claims—keep people healthy. Combine this drop of doubt with a press corps that’s unconcerned with the particulars of science and you create a scenario in which the beliefs of conspiracy theorists start to sound a lot less bizarre.

  A quintessential example of this dynamic at work is the controversy over the fluoridation of public drinking water. The first hint that fluoride might be an effective bulwark against tooth decay came at the beginning of the twentieth century, when a dentist in Colorado noticed some of his patients had small areas of discoloration on their teeth, a condition known as mottling. Over the course of more than a decade, he connected mottling to the naturally occurring presence of fluoride in drinking water. (The territorial variation in fluoride led to some colorful regional nicknames, including Texas Teeth and Colorado Brown Teeth.) He also noticed that while large quantities of fluoride produced severe mottling, smaller quantities appeared to result in stronger teeth and fewer cavities.

  The next piece of evidence came from Minonk, Illinois, a small coal mining and farming community whose water was naturally fluoridated at approximately 2.5 parts per million (ppm). (To give a sense of just how minuscule that is, that means the water in Minonk was composed of approximately .00025 percent fluoride.) In the 1930s, a retrospective study found that children who’d lived in Minonk all their lives had significantly lower levels of tooth decay than children who’d moved to the area after their teeth were fully developed. Those results spurred a total of thirteen more population-wide studies in five states between the early 1930s and the mid-1940s, by which time data from an additional dozen countries also confirmed fluoride’s protective effects. Finally, in 1945, the United States Public Health Service launched what was to be a decade-long field test to be conducted simultaneously in four metropolitan areas. In each one, two communities with comparable makeups would be chosen. The water supply of one of those would receive 1 ppm of fluoride, while the water supply of the control group would remain untreated.

  It didn’t take ten years for the results to become clear: By 1950, the children in the communities that received fluoridated water had half as much tooth decay as the children in the control communities. That year, the Health Service recommended that all public drinking water be fluoridated. In the vast majority of places, this occurred with little fuss—which was to be expected, considering there was a half-century’s worth of evidence supporting the move. In almost one thousand communities across the country, however, determined anti-fluoride activists succeeded in forcing voter referendums on the issue. Fifty-nine percent of the time, they won. (Fluoride remains a preoccupation of health policy skeptics to this day: One of New Jersey’s most prominent anti-vaccine activists recently urged her followers to fight for “Water Fluoridation Choice” and warned that fluoride is suspected of causing “more human cancer death, faster, than any other chemical.”)

  These debates likely never would have occurred had it not been for the press’s willingness to parrot quack claims under the guise of reporting on citizen concerns. In this instance, anti-fluoridationists fixated on the dangers of fluorine, a poisonous gas that is among the most chemically reactive of all elements. What we know as “fluoride” does not contain fluorine gas—it’s made up of some combination of sodium fluoride, sodium aluminum fluoride, and calcium fluoride. Equating the two is like accusing Joe the Plumber of being a murderous dictator because he shares a first name with Joseph Stalin, or claiming that Joseph Stalin must be a brilliant writer due to the skill of Joseph Conrad. That parallel is not as tenuous as it sounds: Sodium fluoride is used to strengthen teeth, sodium chloride is table salt, and chlorine is a poisonous gas that was used by Germany in World War I.

  Another of the activists’ tactics was using specious arguments that appealed to the press’s love of simplicity. This practice started with their vehement rejection of being labeled as “anti-” anything under the guise of being advocates for choice. In theory, this seemed like a reasonable distinction; in reality, personal “choice” in this matter is impossible, since the treatment of public water supplies is an all-or-nothing proposition. (With vaccination, the situation is more complex: Individual choice doesn’t prohibit other members of a community from being inoculated, but it does put everyone who can’t be vaccinated at risk. What’s more, infectious diseases, unlike tooth decay, can be deadly.) Similarly, the anti-fluoridationists’ appeal for stricter safety requirements sounded like a no-brainer—who could be against those? The problem is that the assurance they sought demanded an impossible standard: Scientists can’t give an ironclad guarantee that minuscule quantities of any substance in the universe are completely benign—all they can say is, “To the best of our knowledge there is no danger.” Finally, anti-fluoridationists exhibited a flagrant disregard for the universal truism that the dose makes the poison: When they asserted that there is some hypothetical quantity of drinking water that would contain enough fluoride to be toxic to humans, they did so confident that nobody in the media would point out that that amount was fifty bathtubs’ worth. Drinking that much water is physiologically impossible: Consuming as little as a tenth of a bathtub’s worth at one time is enough to cause hyper-hydration—and death.

  If the mechanisms through which activists enlist journalists to promote their causes are relatively straightforward, the factors that inspire those partisans in the first place oftentimes reflect a range of moral and social concerns that seemingly have nothing to do with the issue under discussion. Much as had been the case a half-century earlier with Lora Little and her followers, the battle over the fluoridation of public drinking water became, in the words of the social anthropologist Arnold Green, a “surrogate issue” for burgeoning social anxieties about modernity and the leveling effects of faceless bureaucracies in the 1950s.

  Twenty years later, a similar synchronicity between a specific health issue and larger social concerns was seen in Great Britain in relation to the whole-cell vaccine for pertussis. At first blush, the declining frequency of whooping cough infections in the twentieth century looks to be another of the unalloyed triumphs of modern science. In 1942, an American scientist named Pearl Kendrick combined killed, whole-cell pertussis bacteria with weakened diphtheria and tetanus toxins to create the first combination DPT vaccine. Kendrick’s discovery occurred during a decade in which more than 60 percent of British children caught whooping cough. Those infections resulted in more than nine thousand deaths, which was more than the number of fatalities caused by any other infectious disease. After mass DPT inoculations began in the 1950s, these numbers immediately plummeted.15

  But the success of the pertussis vaccine provides another example of the disturbing complacency on the part of trailblazing vaccinologists in the first half of the twentieth century. While whole-cell vaccines can be both effective and safe, their reliance on the actual contagion itself as opposed to an isolated component makes them among the most primitive of all vaccines. The fact that a whole-cell pertussis vaccine was shown to be effective in 1942 should not have stopped immunologists from seeking out more nuanced mechanisms for protecting children in the decades to come. Instead, even after reports about possible complications from the vaccine began surfacing, the reaction on the part of the scientific community was a collective shrug. Hundreds of thousands of children’s lives had already been saved. Even if the claims that a tiny percentage of children had suffered post-vaccination complications were true—and there has never been definitive evidence that they were—the net result was an enormous gain. Awards and accolades come to those who protect the world against hitherto deadly diseases, not to those who tinker with vaccines others have already developed.

  In Britain, this veneer of nonchalance was shattered in January 1974, when a study describing three dozen children purported to have suffered neurological problems following DPT vaccination was published. At the time, social unrest in the U.K. was higher than it had been at any point since the Great Depression. After decades of close to full employment, the number of people out of work rose above one million for the first time since the 1930s. The seemingly imminent threat of everything from nuclear war to a world without oil contributed to a nihilistic disdain for the traditional forces ruling society.

  All these factors help to explain why the British press seized upon this specific report after having ignored similarly equivocal ones that had been published in years past. Newspaper articles, magazine features, and TV segments featured interviews with parents who believed their children had been harmed, regardless of the strength or paucity of the evidence. (One particularly flimsy example ran in The Times, which wrote about a severely handicapped man whose parents were, for the first time, claiming their son had been injured two decades earlier.) Once concerns about vaccine safety had been raised, advocacy groups like the Association of Parents of Vaccine-Damaged Children helped ensure that these issues remained in the forefront of public consciousness. Within months, vaccination rates began to fall—and whooping cough infections began to rise for the first time since World War II.

  While the United States avoided this particular scare, a sense of disaffection was pervasive there as well. With the nation still reeling from more than fifty thousand deaths in Vietnam and the disgrace of President Nixon, its leaders were about to discover that the public was ill disposed to take part in a public health campaign just because they were told to do so.

 
