
Pox

by Michael Willrich


  By 1906, the Philippine Commission was boasting of the real possibility of eradication: “The day should not be far distant when smallpox will disappear from the Philippines.” The following year, Dr. Victor Heiser, the U.S. director of health, stated the argument in its baldest form. “During the year there has been unquestionably less smallpox in the Philippines than has been the case for a great many years previous.... In fact, if any justification were needed for American occupation of these islands, these figures alone would be sufficient, if nothing further had been accomplished for the benefit of the Filipinos.” Between the arrival of the U.S. troops in the summer of 1898 and 1915, some 18 million vaccinations were performed in the Philippines under American rule. The Filipinos, according to U.S. officials, had come to accept vaccination as an effective and necessary measure, suggesting, if true, a dramatic transformation of medical beliefs in a very short time.109

  With the end of the war, the question of force became the greatest political liability of U.S. colonial health policy. Significantly, in 1904 the Philippine Commission ordered that public vaccinators would henceforth be “prohibited from using force in accomplishing vaccinations.” Individuals who refused to submit to vaccination would be tried in the courts. All of these ongoing efforts did not succeed in completely wiping out smallpox on the islands. The tropical climate continued to render much of the American-produced vaccine useless. But the efforts did dramatically reduce the incidence of smallpox there and laid the groundwork for the Philippines to become, in 1931, the first Asian country in which the disease was eradicated.110

  At a time of pervasive opposition to compulsory vaccination at home and abroad, U.S. health officials presented the vaccination campaigns in Puerto Rico and the Philippines as evidence of the efficiency of compulsion. Azel Ames touted the Puerto Rico campaign as “A Lesson for the World.” Surgeon General Walter Wyman of the U.S. Public Health and Marine-Hospital Service declared, “No greater proof as to the efficacy of vaccination exists than in the Philippine Islands.” For Dr. John E. Snodgrass, assistant to the director of health in Manila, the truth of that proposition could be seen in the scarless faces of the rising generation of Filipinos. “The only argument necessary to explode the theories of the anti-vaccinationists,” he proclaimed before the Panama-Pacific International Exposition in 1915, “is to compare the visages of the children of today with those of their parents.”111

  FIVE

  THE STABLE AND THE LABORATORY

  Far from the battlefields of the nation’s first overseas colonial wars, American health officials on the U.S. mainland encountered rising resistance after 1900 to their own widening war on smallpox. The contentious politics of smallpox control centered on the growing divide between public health authorities and the public itself regarding the risks of vaccination.

  Turn-of-the-century Americans lived in a world filled with risk. Each year one out of every fifty workers was killed on the job or disabled for at least four weeks due to a work accident. Railroad and streetcar accidents annually killed and maimed tens of thousands of people. Children worked in mines, stole rides on the back of moving cars, and played stickball in alleys carpeted with horse manure. Apart from a few things recognized by the courts as “imminently dangerous,” such as arsenic or nitroglycerin, product liability did not exist. The average American breadwinner carried just enough insurance to cover his own burial.1

  During the first two decades of the twentieth century, a spate of new progressive social policies would create an enlarged role for the American government in managing the ordinary risks of modern urban-industrial life. The resulting “socialization” of risk, though narrow by the standards of Britain and Germany, was a dramatic departure for American institutions that prized individual freedom and responsibility. European-style social insurance gained traction in the first American workmen’s compensation laws, enacted in forty-two states between 1911 and 1920. Mothers’ pension programs (launched in forty states during the same decade) provided aid to families that lost the wages of the “normal” (male) breadwinner due to his sudden death or disability. In tort law, too, the courts had women and children first in mind as they imposed tougher standards of liability upon railroad corporations. U.S. social politics still had a long way to go before a recognizably modern national welfare state insured its citizens against the financial insecurities of old age, or an American court seriously entertained the argument that an exploding Coke bottle entitled the injured party to compensation from the manufacturer. But the foundation was laid, in the social and political ferment of the Progressive Era, for a government that would one day promise its citizens “freedom from fear.”2

  Arriving just as the American people and their policy makers began to seriously debate these issues, the turn-of-the-century smallpox epidemics raised broad public concerns about the quality and safety of the nation’s commercial vaccine supply. The ensuing controversy caused ordinary Americans, private physicians, and public officials to revise old expectations about risk and responsibility and the role of government in managing both.3

  By the fall of 1901, the wave of American epidemics had carried smallpox to every state and territory in the union. The new mild type smallpox was the culprit in the majority of places, but deadly variola major struck several major American cities, particularly in the Northeast. Compulsory vaccination was the order of the day, enforced at the nation’s borders, in cities and towns, at workplaces, and, above all, in the public schools. The public policy was a boon to the vaccine industry, driving up demand for smallpox vaccine. American vaccine makers of the day ranged in size from rising national pharmaceutical firms such as Detroit’s Parke, Davis & Company and Philadelphia’s H. K. Mulford Company (a U.S. forerunner of today’s Merck) to the dozens of small “vaccine farms” that sprouted up around the country. To meet the unprecedented demand for vaccine-coated ivory points or capillary tubes of liquid lymph, the makers flooded the market with products, some inert, some “too fresh,” and some seriously tainted. Complaints of vaccine-induced sore arms and feverish bodies filled the newspapers and medical journals. Every family seemed to have its own horror story.

  Popular distrust of vaccine surged in the final months of the year, as newspapers across the country reported that batches of tetanus-contaminated diphtheria antitoxin and smallpox vaccine had caused the deaths of thirteen children in St. Louis, four in Cleveland, nine in Camden, and isolated fatalities in Philadelphia, Atlantic City, Bristol (Pennsylvania), and other communities. In all but St. Louis, where antitoxin was the culprit, the reports implicated vaccine. Even The New York Times, a relentless champion of compulsory vaccination, expressed horror at the news from Camden, the epicenter of the national vaccine scare. “Vaccination has been far more fatal here than smallpox,” the paper told its readers. “Parents are naturally averse to endangering their children to obey the law, claiming that the chances of smallpox seem to be less than those of tetanus.”4

  Pain, sickness, and the occasional death after vaccination were nothing new. But the clustering, close sequence, and staggering toll of these events were unprecedented in America. Newspaper stories of children dying in terrible agony—their jaws locked and bodies convulsing, as helpless parents and physicians bore witness—turned domestic tragedies into galvanizing public events. Allegations of catastrophic vaccine failure triggered extraordinary levels of conflict between angry citizens and defensive officials. In one typical incident, which occurred as the ninth Camden child entered her death throes, the health officials of Plymouth, Pennsylvania, discovered that many parents, ordered to get their children vaccinated for school, were secretly wiping the vaccine from their sons’ and daughters’ arms.5

  Jolted from their professional complacency, physicians and public health officials were forced to reconsider the existing distribution of coercion and risk in American public health law. In one sense, compulsory vaccination orders, whether they applied only to schoolchildren or to the public at large, already socialized risk. The orders imposed a legal duty upon individuals (and also parents) to assume the risks of vaccination in order to protect the entire community from the presumably much greater danger of smallpox. Spreading the risk of vaccination across the community made its social benefit (immunity of the herd) seem a great bargain. As any good progressive knew, the inescapable interdependence of modern social life required just such sacrifices for the public welfare and the health of the state. Still, the state did almost nothing to ensure vaccine quality. The bacteriological revolution spawned a proliferating array of “biologics”—vaccines, antitoxins, and sera of endless variety—that were manufactured in unregulated establishments and distributed, by the companies’ druggist representatives and traveling detail men, in unregulated markets. The risks of these products lay where they fell—on the person left unprotected by an inert vaccine or poisoned by a tainted one.6

  The situation illustrates the larger dualism of American law at the turn of the century. Ordinary Americans, particularly working-class people, were caught between the increasingly strong state presence in their everyday social lives and the relatively weak state regulation of the economy. And the government insulated itself from liability. In a leading decision, handed down just three years before the Camden crisis, the Georgia Supreme Court took up the question of whether a municipal government could be sued for injuries caused by bad vaccine used by its public vaccinators. The answer was an unblinking No. Citing “a principle as old as English law, that ‘the King can do no wrong,’” the court refused to allow a resident of Rome, who had submitted to vaccination “under protest,” to sue the government for using “vaccine matter which was bad, poisonous and injurious, and from which blood poisoning resulted.” To allow such a case to proceed, the court warned, “would be to paralyze the arm of the municipal government, and either render it incapable of acting for the public weal, or would render such action so dangerous that the possible evil consequences to it, resulting from the multiplicity of suits, might be as great as the smallpox itself.” The arm of the state was protected; the arm of the citizen was not.7

  Supporters of compulsory vaccination defended the policy in a quasi-scientific rhetoric of risk assessment. From the expert point of view, lay concerns about vaccine safety were steeped in ignorance and fear, which should have evaporated in the face of hard statistical evidence. Officials assured the public that vaccines were safer than ever: “the preparation of glycerinized vaccine lymph has now been brought to such perfection that there should be no fear of untoward results in its use,” Surgeon General Walter Wyman said three years before Camden. Even if untoward results did arise, the social benefits of vaccination outweighed the costs. As the Cleveland Medical Journal put it, “Better [by] far two score and ten sore arms than a city devastated by a plague that it is within our power to avert.”8

  The vaccine crisis of 1901–2 revealed that cost-benefit analysis was not the only way Americans thought about risk. When the Times observed that Camden parents reasonably concluded that vaccination had become more dangerous than smallpox, turning the public health argument on its head, the paper made a rare concession to vaccination critics. As the Times said, the incidents were “furnishing the anti-vaccinationists with the only good argument they have ever had.” But most worried parents would not have called themselves “anti-vaccinationists.” And much more was involved in the rising popular resistance to vaccination in 1901 than a cool-headed consideration of quantifiable facts.9

  Perceptions of risk—the intuitive judgments that people make about the hazards of their world—can be stubbornly resistant to the evidence of experts. This is because risk perceptions are mediated by experience, by culture, and by relations of power. Certain factors tend to elevate the sense of risk that a person associates with a specific thing or activity, even in the face of countervailing statistical data. A mysterious phenomenon whose workings defy the comprehension of laypeople causes more dread than a commonplace hazard. A hazard whose adverse effects may be delayed, rather than immediate, heightens perceived risk. Significantly, perceived risk tends to spike when the hazard is not voluntarily undertaken. This is especially true when the social benefits claimed for a potentially hazardous activity are not readily apparent to those ordered to undertake it.10

  All of which helps to explain why in the fall of 1901 popular perceptions diverged so radically from the official line on vaccine safety. A century after the introduction of Jennerian vaccination, vaccines remained mysterious entities—even to the companies that made them and the physicians who used them. Many American communities had experienced neither a smallpox epidemic nor a general vaccination in over fifteen years, increasing both the public’s sense of complacency about the disease and its unfamiliarity with the prophylactic. By force of law, local health boards and school boards ordered citizens to assume the risks of vaccination. Many did, some eagerly, some grudgingly, some only with a billy club against their back. Then the St. Louis and Camden tragedies shocked the nation. Public confidence in the vaccine supply, already shaky, plummeted. Opposition to compulsory vaccination, already strong, surged. Ultimately, these events pierced the veil of official certitude and corporate confidence. Vaccine companies publicly accused each other of peddling poisonous virus. Some health boards suspended vaccination orders. Others launched investigations of vaccine purity and potency. In medical meetings, newspaper columns, and statehouse floors across the country, the debate increasingly turned on a single issue: the right of the state to regulate vaccines. In the fall of 1901, regulation was a controversial idea. A few months later, it was federal law.11

  A South Jersey industrial city of 76,000 people, Camden lay just across the sewage-choked Delaware River from Philadelphia. Times were good. Camden’s population had grown by 30 percent during the 1890s. Decent jobs could be had at the Pennsylvania Railroad and in the city’s ironworks, chemical plants, shoe factories, cigar companies, lumber mills, oil cloth factories, and woolen mills. Though the presence of immigrants and other newcomers was more keenly felt than in the past, Camden people remained overwhelmingly white and American-born, a generation or more removed from Europe. Crowded tenements of the sort found in New York and Chicago were scarce. Wage earners lived in low-slung neighborhoods of single-family homes. Like most communities, the people of Camden invested their pride and dreams in the rising generation. In September 1901, eight thousand children took their seats in the city’s thirty-two public schools. By mid-November, half of those desks would be empty.12

  The trouble started on October 7. Eight-year-old Pearl Ludwick took ill with smallpox, followed, in quick succession, by her father, an oil cloth printer, and all seven of her brothers and sisters. Only Pearl’s mother was spared the pox; those days must have been among the most trying of her life. Then Pearl’s father and eldest brother rose from bed one night and, both delirious with the fever, bumped a table, which knocked over a lamp. The ensuing blaze burned the Ludwick house to the ground—but not before the Ludwicks got out and hundreds of neighbors rushed to the scene. All, of course, were exposed to smallpox. With this improbable chain of events commenced the Camden smallpox epidemic of 1901–2.13

  New Jersey had seen little smallpox during the past sixteen years, and vaccination had fallen out of practice. But in 1901 smallpox seemed to be causing trouble everywhere in the United States, including Philadelphia. That summer, anticipating an epidemic year, the New Jersey Board of Health issued a public warning. “An extensive outbreak of small-pox can be prevented with absolute certainty if vaccination of all susceptible persons is secured,” the board declared. “[T]he question now arises, Shall general vaccination be done before a great calamity compels resort to this preventive measure, or must there first be startling losses of life to arouse parents, guardians, school boards, the public, and in too many instances the health authorities also, to a realizing sense of their duty to institute precautions against the spread of this pestilential disease?” No matter how you parsed that question, the message was dead serious. But it took the Ludwick family fire to bring its meaning home to Camden.14

  Camden authorities ordered a municipal pesthouse built, and physicians worked long hours to meet the “rush to get vaccinated.” For those families who still needed convincing, the Camden Board of Education announced that it would enforce an 1887 state law that authorized local boards to exclude unvaccinated children. The Camden Board of Health president, Dr. Henry H. Davis, who happened also to be the medical director of the school board, dispatched vaccinators to the city schools. The Camden Medical Society opened a free vaccine station on Federal Street, in the heart of the city. And many residents were vaccinated by private physicians or, on the cheap, by the neighborhood druggist. Within a month, an estimated 27,000 people—more than one third of the city’s residents—had undergone vaccination, including five thousand public schoolchildren. And the scraping continued. Across the city, children and adults alike had the sore arms and fresh scars to show for it.15

  The state board advised physicians to exercise care when performing the procedure. “The operation of vaccination should be conducted with aseptic precautions,” the board instructed, “and none but glycerinized lymph from a trustworthy producer should be employed.” The board was referring to liquid vaccine that had been treated with glycerin, which acted as a preservative and killed bacteria in the product. Glycerinized vaccine was the state of the art. Whether from a sense of political propriety or fair play to the Philadelphia area’s many vaccine companies—including H. M. Alexander’s Vaccine Farm, H. K. Mulford Company, and John Wyeth & Brother—the board refrained from endorsing any make of vaccine and offered no advice as to how anyone might distinguish the “trustworthy” from the more dubious products on the market. Trust was a commercial transaction, not a public dispensation.16

 
