The Panic Virus

by Seth Mnookin


  The person who best embodies the philosophical continuity of vaccine resistance movements is the early twentieth-century activist Lora Little, whom journalist and author Arthur Allen described in his book Vaccine as a “granola-belt Mother Jones who promoted whole foods and naturopathy and denounced . . . white male medical practitioners before it was fashionable to do so.” According to Little, vaccines (along with mechanization, Western medicine, establishment doctors, processed foods, and white sugar) contributed to most modern ills. During her anti-vaccination campaigns, Little passionately decried the alienation brought about by the efficiencies of industrialization. Speaking of New York City, which even then was the prototypically modern city, she said, “Every doctor there has become a cog in the medical machine. And once the machine gets its grip on you, you cannot escape, you are drawn in and ground through the mill.”

  Little also predicted the current-day conspiracy theories that lash together doctors, government officials, and vaccine manufacturers in their quest for riches. In her 1906 tract “Crimes in the Cowpox Ring,” Little wrote:

  The salaries of the public health officials in this country, reach the sum of $14,000,000 annually. One important function of the health boards is vaccination. Without smallpox scares their trade would languish. Thousands of doctors in private practice are also beneficiaries in “scare” times. And lastly the vaccine “farmers” represent a capital of $20,000,000, invested in their foul business.

  Substitute “measles” for “smallpox” and that same paragraph could appear in pamphlets put out by groups like the National Vaccine Information Center (NVIC) or the Australian Vaccination Network. Here’s Barbara Loe Fisher, the founder and president of the NVIC, talking about pharmaceutical companies in a 2009 speech in which she compared the United States government’s vaccine policy to medical experiments conducted by Nazis during World War II:

  They make trillions of dollars in the global market. . . . Literally, every known disease that you can imagine, there’s a vaccine being created for it. Their intention is to have laws passed requiring businesses to use it because that’s how they make their money. So, what is the situation on the ground, with parents . . . who take their children to expensive pediatricians whose practices are economically heavily dependent on the [vaccine] schedules? . . . Pediatricians are now saying children must get forty-eight doses of fourteen vaccines by age six.

  For all the tactical and rhetorical similarities, the most poignant link between early activists such as Little and their modern-day descendants is their tendency to locate the cause of their personal tragedies in some larger evil. For Little, that tragedy was the death of her seventeen-month-old son, who, according to his medical records, suffered from simultaneous measles and diphtheria infections. Little was convinced the cause of his death lay elsewhere. Her son, she decided, could not have been the victim of the vagaries of human existence; that would be too senseless. Instead, as she told thousands of people over the years, he was killed by the smallpox vaccine. For Fisher, a television program about vaccines that she saw a year and a half after her own son received a combined diphtheria-pertussis-tetanus (DPT) shot convinced her that his physical and developmental difficulties began the very day he had been vaccinated.

  Despite her popularity, Little’s fears were hardly representative of the general mood during the first half of the twentieth century, when science notched one victory after another in what had been previously a one-sided fight between bacteria and viruses and the human beings they attacked. At times, it must have seemed as though scientists needed only identify the cause of a disease in order to cure it: In 1910, Paul Ehrlich discovered that an arsenic-based compound could wipe out syphilis; in the 1930s, sulfur-based medicines were proven to be effective against everything from pneumonia and puerperal sepsis to staph and strep; and in 1941, Howard Florey and Ernst Boris Chain showed that humans could use penicillin without fear of death.

  The implications of these medical triumphs were stunning. Even as deadlier weapons were being created, warfare was becoming safer: In the Spanish-American War in 1898, thirteen soldiers fell ill for each combat-related death. By World War I, that ratio was 1:1; by World War II, it had fallen to 1:85. Between 1920 and 1955, the lifespan of an average American increased by more than 25 percent—an increase that was primarily due to fewer young people dying of disease. Over that same span, California went from having 110,000 cases of diphtheria and seven hundred deaths annually to forty-two cases and two deaths a year. By mid-century, the notion of a world free of infectious diseases seemed, for the first time in human history, to be a possibility and not a pipe dream.

  These victories led to a transnational sense of excitement and pride. In the 1930s, after witnessing state-sponsored institutes in Europe and Asia make one breakthrough after another, the American government for the first time assumed a central role in funding and conducting medical research. Doctors and scientists were repeatedly cited as among the most trusted members of society. It was perhaps inevitable that this march of accomplishments left the scientific establishment increasingly mesmerized by its own power. Vaccine proponents, be they doctors, politicians, or self-styled intellectuals, held fast to their own credo and accused their opponents of taking part, as a New York Times editorial put it, “in a futile attempt to head off human progress.” This smug sense of superiority was mixed with a condescending bewilderment at what physician Benjamin Gruenberg described as the hoi polloi’s insistence “upon the right to hold opinions (and to act according to these opinions) upon such highly technical questions as the efficacy of vaccination, the value of serums, or the causation of cancer.”

  One of the most shocking examples of this hubris occurred during the early years of World War II. Even before America’s involvement in the conflict, the public health infrastructure in the United States had set in place a plan to give yellow fever vaccinations to any troops headed to tropical climates. By the time the country joined the Allied cause in late 1941, some high-ranking military officials had become so overwrought about the threat of biological warfare that they decided that all troops should be vaccinated, regardless of their assignment. The resulting scramble to develop vast quantities of the vaccine—at one point, the Rockefeller Foundation was producing tens of thousands of doses every week—led, not surprisingly, to shoddy quality control. Within months, large numbers of troops were showing signs of jaundice; eventually, up to 10 percent of soldiers in the most severely affected units were hospitalized. It turned out that batches of the vaccine had been contaminated with hepatitis B. By the time the vaccinations were halted, 300,000 troops had been infected and more than sixty had died.

  The entire yellow fever campaign was, as Arthur Allen wrote, a dark page in the history of public health: “None of the 11 million Americans vaccinated against yellow fever during the war got yellow fever. Then again, none was challenged with yellow fever. [The] fear of biologically trained killer mosquitoes was not realized. It was somehow all in vain.”

  Amid the killing of World War II, the deaths of five dozen soldiers from hepatitis B did not attract a lot of notice. In the 1950s, however, the fight to conquer polio made vaccines one of the biggest news stories of the decade. Here, the threat was real and the potential victims came from all parts of society. The result was an unprecedented campaign with almost universal support—and a case where national exuberance led health officials to forget everything they should have learned about the risks of poor oversight. This time the ensuing tragedy wouldn’t escape the public’s attention.

  8 The actual differences in these ratios, while significant, were likely not quite so stark: People were usually inoculated when they were in good health and could receive adequate medical care, thus increasing their chances of survival.

  9 Nelmes was infected by a cow named Blossom, a saintly contrast to Mrs. O’Leary’s Cow, the purported protagonist of the Great Chicago Fire of 1871. To this day, Blossom’s hide hangs on the library wall in St. George’s Medical School, where Jenner studied. Phipps and Nelmes were, thankfully, allowed to rest in peace.

  CHAPTER 3

  THE POLIO VACCINE: FROM MEDICAL MIRACLE TO PUBLIC HEALTH CATASTROPHE

  On June 6, 1916, the first two polio cases of the summer were diagnosed in New York City. At the time, polio—or infantile paralysis, as it was often called—was a mysterious and frightening, although not terribly common, childhood illness. It had been a quietly persistent presence in human populations for thousands of years, but continual, widespread epidemics didn’t appear until the 1880s. The virus quickly made up for lost ground, and in the first decade of the twentieth century, countries around the world were ravaged by outbreaks, especially during warmer weather. One reason for the disease’s sudden virulence was undoubtedly the crowded living conditions of modern cities: Polio, like typhoid, cholera, and hepatitis A, is transmitted through what’s clinically referred to as the fecal-oral route, which typically occurs as a result of inadequately treated drinking water, improper food handling, and poor sewage methods.10

  Even with the growing number of epidemics, polio victims in the first decade and a half of the twentieth century generally did not suffer from lifelong consequences—in fact, in the vast majority of cases, the infection was so minor as to be barely noticeable. Approximately 5 percent of the time, however, the virus reached the central nervous system. Usually, even those infections resulted in relatively benign symptoms—headaches, diarrhea, muscle pain—that disappeared in a matter of weeks. In fewer than 2 percent of cases, however, the virus attacked motor neurons in the spinal cord and brain. At first, those patients also exhibited flulike symptoms such as achiness or general discomfort. Soon the muscles became weak and spastic, before eventually becoming completely unresponsive. The result was paralysis or, if the muscles of the chest became permanently incapacitated, death.

  The epidemic of 1916 completely upended those percentages, and the result was unlike anything New York City, or the country, had experienced before. To start with, the virus was spreading exponentially faster than it previously had. Even more terrifying: The fatality rate among those with clinical infections spiked to between 20 and 25 percent, five times higher than expected. Since 1911, when New York’s Department of Health began keeping statistics, the city had averaged 280 polio patients a year, and the annual death toll had fluctuated from a high of seventy to a low of thirteen. By the end of June, more children were dying of polio every week than had died in the previous five years combined.

  The epidemic that overwhelmed New York City that summer was so unexpected, so inexplicable, and so lethal that it affected the public’s perception of health and medicine for years to come. Every funeral highlighted doctors’ helplessness; every paralyzed child was a stark reminder of the lack of effective treatment. One local expert told the media that polio seemed to “pick the strong and well children in preference to the weak”—an anti-Darwinian truism that mocked scientific progress and knowledge. A mishmash of confusing and contradictory emergency measures only heightened the panic and confusion. Sometimes infected children were told to remain at home; others were instructed to go immediately to the nearest hospital. Quarantines were put into effect in some neighborhoods but not others. Some parents of sick children were unable to find help: In Staten Island, after watching his son die in his car on the way to the hospital, a father proceeded to drive around “with the boy’s body for hours looking for someone who would receive it.” Others fought to resist quarantine efforts: In one instance, a local newspaper reported that it took “the authority and strength of four deputy sheriffs and two physicians to get a child from its father.” Before long, surrounding communities closed their doors to New York City residents: In Hoboken, New Jersey, “policemen were stationed at every entrance to the city—tube, train, ferry, road and cow path—with instructions to turn back every van, car, cart, and person.”

  By early August, with the weekly death count approaching four hundred and parts of New York City poised to descend into anarchy, the police department gave its blessing to the Home Defense League, a “volunteer vigilance force” whose 21,000 members were authorized to patrol the streets and barge into homes looking for “violations of the sanitary code that might mean a spread of infantile paralysis.” Conspiracy theories took hold: In the town of Oyster Bay, on Long Island, city counselors accused John D. Rockefeller and Andrew Carnegie of using their millions to corrupt “men and microbes” in order to create “causeless hysteria and . . . needless hardships.”

  By the end of November, when the epidemic had run its course, 26,212 people had been infected in a total of twenty-eight states. In New York City alone, there were 9,023 total victims and 2,448 deaths. The vast majority of them were under ten years old.

  For the next thirty years, the spread of polio waxed and waned without any discernible logic. In 1923, only 695 cases were reported nationwide. In 1931, that number rose to 14,105. Eleven years later, it was back under 3,000. Then, just as the United States was recovering from its involvement in World War II, the country was battered again. In 1946, the total number of reported polio cases approached those recorded thirty years earlier. Two years later, the number of victims hit an all-time high. The year after that, it rose again—this time by an astounding 35 percent, to 40,076. With parents no clearer than they’d been decades earlier on how to protect their children, each summer’s outbreak undercut the country’s sense of postwar victory and security. Polio might not have been the biggest public health threat—for much of the 1940s, an average of 190,000 Americans died of the flu each year—but it took up the most space in the public’s imagination.

  By 1952, when more than 58,000 people were infected, polio ranked second only to the atomic bomb as the thing Americans feared most. This outsized anxiety was, to be sure, partly due to what a leading medical writer of the time called “polio’s uncommon nastiness.” It was also fueled by the popular press, which then, as now, relied on dramatic stories to draw in readers. Perhaps most responsible of all was the National Foundation for Infantile Paralysis, which had been founded in 1937 by President Franklin D. Roosevelt, polio’s most famous victim.11 By 1945, when Roosevelt died less than three months into his fourth term in office, the foundation had become the best-known charity in the country. Year after year, its celebrity-studded March of Dimes campaigns highlighted its prodigious fund-raising prowess. Outside the federal government, it was perhaps the only organization that had the prestige and the national apparatus in place to unite the public around a single cause. When, in November 1953, a foundation-funded virologist at the University of Pittsburgh named Jonas Salk announced that he’d developed a polio vaccine, millions of citizens had already been primed to do whatever they could to fight this national menace. Within days, parents were lining up to volunteer their children for what would be the largest medical field trial in history.

  The Salk trials began in the spring of 1954; by May, 1.8 million children had been injected in 211 counties spread over forty-four states—and there were still six months of tests remaining. (At the end of the year a Gallup poll found that more people knew about Salk’s field trials than knew the president’s full name.) This degree of public involvement and scrutiny created inevitable pressures. While Salk, the foundation, and the government worked together, they all stood to gain something different. For Salk, the success of the vaccine would make him among the most revered people on the planet; for the foundation, it would validate the billions of dollars it had raised and spent over the years; and for the government, success would be a fulfillment of an implicit promise to defend its citizens, thereby solidifying its place as a positive force in people’s lives.

  That summer and fall, another 25,000 Americans came down with polio, but for the first time there was the expectation of a future free of paralyzed children confined to iron lungs. When the foundation announced that it was committing another $9 million to buy 25 million doses of the vaccine, the press surmised that the vaccine had already been proven effective; why else, they reasoned, would anyone spend all that money? At one point, the New York World-Telegram and Sun reported that Salk’s vaccine had worked in one hundred percent of the test subjects. Almost nobody bothered to point out that that was a scientific impossibility; with the next March of Dimes campaign set to start in January, there was nothing to be gained by tempering expectations.

  As the year drew to a close, Thomas Francis, a professor of epidemiology who had been Salk’s postdoctoral advisor, was chosen to review the reams of data that had been collected and to write an independent report analyzing the results. With the entire world looking over his shoulder, he set to work. The following spring, mere months before that year’s polio season would begin, he announced that he was ready with his conclusions.

  On April 12, 1955—the tenth anniversary of FDR’s death—five hundred doctors and scientists, and hundreds more reporters, gathered in the University of Michigan’s Rackham Auditorium to watch Francis, a short, stocky man with a neat mustache, present his findings. Francis used the same measured tone and somnolent cadence as he did when talking to colleagues or students; even so, one newspaper described the morning as being infused with a sense of “fanfare and drama far more typical of a Hollywood premiere than a medical meeting.” At the back of the auditorium, a special riser supported the sixteen television cameras covering the event; three flights up, a media center housed another two hundred members of the press.12 In ballrooms and conference halls around the country, 54,000 doctors had assembled to watch Francis’s presentation via closed-circuit broadcasts. Six thousand crammed into Manhattan’s Waldorf-Astoria alone.

 
