Miracle Cure


by William Rosen


  With the disease’s talent for hijacking the body’s own defenses and capacity for remaining lethal years after it had seemingly been defeated, it’s no surprise that physicians have been battling the white plague for as long as there have been physicians. They haven’t always accurately identified it, or even agreed with one another about any of its characteristics. Hippocrates thought the disease was inherited; Galen that it was contagious. Fifteen centuries later, the debate still raged. Paracelsus, the Swiss physician whose theory of health depended on the proper balance between mercury, sulfur, and salt, couldn’t understand why, if the disease was contagious, so many people in European cities who were exposed to it exhibited no symptoms.

  Even after Robert Koch identified the guilty bacterium in 1882, the controversy wasn’t settled. One reason that the belief in tuberculosis as a hereditary condition proved so durable was that environmental conditions can, indeed, affect the likelihood of activating a latent infection. Pure mountain air didn’t cure TB, but it did seem to slow it down.

  As a result, sanatoriums, places where patients could cough out their lives in relative comfort, sprang up all over Europe, especially in regions near the mountains or the sea. “Climate cures” built boomtowns throughout the American West, from Albuquerque to Pasadena to Colorado Springs, each advertising itself as the ideal destination for tubercular patients. Edward Trudeau, an American physician who believed tuberculosis to be hereditary, contracted the disease himself in 1873 and built a European-style sanatorium in Saranac Lake, New York, where residents* were confined to bed, though not to bedrooms: Trudeau, convinced of the curative powers of clean mountain air, required his patients to sleep outdoors, even in subzero temperatures. The Saranac Lake model became so popular that, in the first decade of the twentieth century, one American in 170 lived in a self-declared sanatorium.

  The idea of housing consumptives together would scarcely have made sense absent the conviction that tuberculosis victims weren’t themselves infectious. It was a plausible enough notion. Because the pathogen could establish colonies without—at first—causing much in the way of symptoms, hosts could be exposed to the disease without appearing to contract it. Since living in places where the air was clean (and, more important, the sanitation well managed) seemed to relieve symptoms and even slow the progress of the disease, many physicians thought tuberculosis couldn’t be infectious (or, at least, not mostly so). At a time when the germ theory itself was still very novel, nineteenth-century European and American societies were largely ignorant of the dangers of spreading the disease. One consequence was that it became a trope for the nineteenth century’s literary romantics. Pale heroines languishing beautifully and consumptive children who “appear like fairies to gladden the hearts of parents and friends for a short season” became notorious clichés of Victorian literature. Nor was tuberculosis of interest only to romantics; the residents of the Berghof atop Thomas Mann’s Magic Mountain form a cross-section of early twentieth-century European intelligentsia, united only by metaphysics and obstructed breathing.

  Like the European society they symbolized, though, their ailment was undergoing a gigantic transformation. As the germ theory took hold, tuberculosis went from a romantic ailment to a contagious one. Consumptives were no longer regarded as brave symbols of individual suffering, but as a social danger, the modern version of lepers.* Public health campaigns warned everyone to take care around coughing and sneezing, and especially advocated isolating tubercular patients. By the end of the 1920s, sanatoriums were already being transformed from comfortable resorts for the wealthy into quasi prisons for the poor.

  Segregating the infected from the not yet infected was harsh, but it was nearly the only useful response to the disease. Koch’s tuberculin had been both a failure and a scandal; Colenso Ridgeon’s cure in The Doctor’s Dilemma was a fiction. Shaw’s play asked audiences if “thirty men . . . found out how to cure consumption . . . Why do people go on dying of it?” Though a tuberculosis vaccine had been developed by the French researchers Albert Calmette and Camille Guérin as early as 1916, and first administered to humans in 1921, it was at best only partly effective.* One popular surgical procedure, the so-called pneumothorax technique, deliberately collapsed an infected lung to allow the lesions caused by the tubercular granulomas to heal. The only characteristic all of these purported cures shared was an almost complete lack of effectiveness. Patients came to sanatoria like Mineral Springs not to be cured, but to be made as comfortable as possible while their bodies repaired themselves, or—more frequently—while they waited to die.

  —

  A month into her stay at the Mineral Springs Sanatorium, on November 15, 1945, Patricia Thomas became the first patient to receive an injection of a new compound called streptomycin. After a series of injections over the following months, she went home, completely cured.

  Her battle with Mycobacterium tuberculosis ended, but the war over the discovery of streptomycin would rage on, never really subsiding. Truces are broken regularly, whenever advocates for the two scientists at the heart of the quarrel—Selman Abraham Waksman and Albert Schatz—square off. Entire books have been devoted to arguing one side or the other. One of the most prestigious science magazines in the world, Nature, was, for a time, a battlefield, with Schatz’s champion, the microbiologist and historian Milton Wainwright, sparring in its pages with Waksman’s defender, William Kingston, a professor of business and history.

  Here’s what isn’t disputed in this war for credit: Albert Schatz graduated from Rutgers University in New Brunswick, New Jersey, in May 1942 with a degree in soil science, and immediately started graduate work under one of the field’s leading lights, Selman Waksman. Both professor and student specialized in actinomycetes, a suborder of soil-dwelling bacteria with thousands of known members, even in the 1940s.

  Their interest was more than simple intellectual curiosity. Ever since the 1920s, actinomycetes, an unusual group of bacteria that exhibit Penicillium-like branching filaments, had been identified as powerful antagonists to other bacteria. This wasn’t really unexpected, since a teaspoon of soil can be home to a billion or more bacteria, which meant a huge number of competitors in the never-ending hunt for the raw materials essential to produce the biomass needed by all life: DNA, RNA, amino acids, fats, and the like. In a nutrient-rich environment like soil, packed with organic material and trace minerals, the competition is fierce, and actinomycetes were already known to be using some very powerful weapons indeed, each one an enthusiastic killer of other bacteria.

  In 1939, the French-born microbiologist René Dubos, a former student of Waksman’s then working at the Rockefeller Institute for Medical Research in New York, had isolated two distinct compounds from another family of soil bacteria, B. brevis. Both compounds, which Dubos named tyrothricin and gramicidin, were, like penicillin, enthusiastic killers of Gram-positive pathogens. Unfortunately, unlike penicillin, they didn’t kill by weakening the distinctive Gram-positive cell wall. Tyrothricin blocked the synthesis of proteins; gramicidin made cell membranes permeable to salts. Since animal cells have membranes and depend on protein synthesis, both activities made the compounds nearly as deadly to hosts as to pathogens.

  Dubos’s discoveries weren’t inconsequential; gramicidin is still prescribed today for infections of the skin and throat. But to treat systemic infections like tuberculosis, a drug must enter the bloodstream. Since neither gramicidin nor tyrothricin could do so safely, their largest contribution to the antibiotic revolution was to hint that soil might be valuable for growing more than just crops. Somewhere in the dirt there had to be something that would kill pathogens while leaving their hosts alive.

  The pursuit of a substance with the right balance of destructiveness and discretion was a full-time job at Waksman’s lab at Rutgers. Throughout the 1920s and 1930s, Waksman and his team had been collecting different soils from all over the eastern United States, isolating the varieties of actinomycetes they contained, and then growing them in Petri dishes filled with agar. Once they had a colony, the graduate students working in Waksman’s labs would expose it to another sort of bacteria, and note the results; if the newly introduced bacteria failed to thrive, then that particular colony of actinomycetes had an antibiotic property.

  This is the most tedious sort of research, completely lacking in bursts of inventive genius or innovative experimental design. It did not, however, lack for institutional support.

  In 1939, Merck & Co. had concluded an agreement with Selman Waksman that provided the lab with an ongoing grant for the study of antibiotics. Merck’s support included money—Waksman had initially been engaged to consult on microbial fermentation for $150 a month; later that year, Merck added another $150 for working on “antibacterial chemotherapeutic agents,” along with experimental animals, and funding for a fellowship in the Rutgers lab. (The investment in the fellowship wasn’t purely altruistic; Waksman’s first fellow, Jackson Foster, later became the director of Merck’s microbiological lab.) Waksman would do the research, and in return for the funding, and for handling “production, purification . . . and to arrange for clinical trials,” Merck would own the patents from any resulting research, and would pay a royalty of 2.5 percent of net sales to the Rutgers Endowment Foundation, a nonprofit charity originally established to solicit donations from alumni.

  For its first year or two, Merck’s investment hadn’t paid much in the way of dividends. However, at the beginning of 1940, Waksman reacted to news of the progress of Florey’s team by saying, “These Englishmen have discovered [what] a mold can do. I know the actinomycetes can do better,” and proved as good as his word. By the spring of 1941, he had presented his patrons at Merck with the first fruits of his actinomycete farm: the antibacterial compounds clavacin, actinomycin,* and streptothricin. The harvest was promising if not spectacular. All three compounds were effective, but toxic; streptothricin, in particular, was frustratingly able to kill a variety of Gram-negative pathogens in mice, but had the unfortunate side effect of destroying kidney function in the four human volunteers on whom it was—prematurely, not to say irresponsibly—tested.

  Waksman was undaunted. It was around this time that he coined the word by which this variety of antibacterial drugs would henceforth be known: antibiotic, which he defined as “a chemical substance produced by microorganisms”—i.e., penicillin, but not Salvarsan—“which has the capacity to inhibit the growth of and even destroy bacteria and other organisms.”

  The problem was finding the antibacterial needle in the enormous actinomycete haystack. Years later, Waksman would tell people, “We isolated one hundred thousand strains of streptomycetes [as actinomycetes were then known]. Ten thousand were active on agar media, one thousand were active in broth culture, one hundred were active in animals, ten had activity against experimental TB, and one turned out to produce streptomycin.” Though the numbers are casual approximations, the technique was essentially that: Throw lots of actinomycetes up against the wall, and see which ones stick.*

  Which is how Schatz spent his days, from May 1942, when he joined Waksman’s lab, to November, when he was drafted to serve in an Army Air Forces medical detachment in Florida. It’s what he did during his off hours in Florida, which he spent finding, and sending, different soil samples back to Waksman’s lab in New Brunswick. And it’s what occupied him after he was given a medical discharge, in June 1943, and returned to Waksman’s lab.

  Credit: Rutgers University

  Albert Schatz (1920–2005) and Selman Waksman (1888–1973)

  He did so as one of the few buck privates in the United States Army who took a cut in pay returning to civilian life. Private Schatz had been earning fifty dollars a month while in the service, along with free housing, food, and clothing. As a civilian PhD candidate, he performed the (literally) dirty work of analyzing soil samples for even less: forty dollars a month, which was well below the minimum wage for a forty-hour week. And Schatz, like PhD candidates then and now, worked a lot more than forty hours. By his own, understandably aggrieved, account, “During the four month interval between June and October, 1943, I worked day and night, and often slept in the laboratory. I prepared my own media and washed and sterilized the glassware I used.” He even cadged his meals from the stuff grown in the university’s agricultural college. He didn’t do so without complaint. But he did it, convinced that actinomycetes held the key to a yet-to-be-discovered class of pathogen killers.

  And those too-toxic-for-humans antibiotics like streptothricin killed Gram-negative bacteria. The dissertation that Schatz was researching on a salary of ten dollars a week was explicit: “Two problems, therefore, appeared to be of sufficient interest to warrant investigation; namely . . . a search for an antibiotic agent possessing . . . activity . . . against Gram-negative eubacteria . . . and a search for a specific antimycobacterial agent.” The sulfa drugs and penicillin were both enzyme blockers, the first inhibiting the ones bacteria needed to synthesize an essential B vitamin, the second blocking the enzymes needed to assemble the giant molecule of amino acids and sugars that make up the bacterial walls of the Gram-positive pathogens streptococci, staphylococci, and clostridia. But Gram-negative bacteria, with their very different cell walls, are some of the most prolific killers in human history, including Yersinia pestis, the bacterium that causes bubonic plague, and Vibrio cholerae, which causes cholera.*

  Schatz’s phrase “specific antimycobacterial agent” was a euphemism for “TB killer.” It was also a red flag for Waksman, Schatz’s boss and thesis advisor, who recognized that his soil science lab wasn’t a secure research facility for something as dangerous as the white death bacillus. Even so, he agreed to support Schatz’s research, with the proviso that it be conducted in an isolated basement that he turned over to his graduate student, either out of an exaggerated fear of a tuberculosis outbreak—Schatz’s recollection—or a perfectly reasonable exercise of caution, since the lab lacked what were, even in 1943, state-of-the-art defenses against disease outbreaks: no ultraviolet lights that could be used to kill dangerous microbes; no negative-pressure fans and air filters to keep them from escaping.

  Whatever the reason, Schatz’s “exile” (his word) produced results. On October 19, 1943, after exposing hundreds of actinomycetes from different soils to the most virulent of tuberculosis bacilli, a variant of M. tuberculosis designated H-37, he found two that were antagonistic, one taken directly from the soil, the other swabbed from the throat of a chicken. Both were variants of a bacterium known since 1914 as Actinomyces griseus. Schatz renamed it Streptomyces griseus. No one knows who named the substance they produced; the first recorded mention of streptomycin was in a letter from Selman Waksman to Randolph Major at Merck.

  Here was the first step in finding a practical antibiotic. The next one, which Florey’s team at Oxford had faced three years before, was running a series of trials to find out whether it worked not just in vitro, but in vivo—in living, breathing animals. As with the first penicillin extracts, this required streptomycin to be produced in quantity. A sufficient amount for one-off experiments could be distilled in Waksman’s New Jersey lab. Or it could be, at least, as long as Schatz was willing to be Waksman’s Norman Heatley. He set up a similarly makeshift system in his basement lab; to prevent the liquid from boiling away while he slept, the night porter at Waksman’s lab awakened him whenever the liquid fell below a red line Schatz had drawn on the distillation flasks.

  However, testing Schatz’s new compound for effectiveness in vivo needed a more sophisticated facility than the young biologist’s basement lab could provide. And in 1945, there was no research hospital in America more sophisticated than the Mayo Clinic. Founded in 1846 by an English émigré doctor named William Worrall Mayo as an outpost for what was literally frontier medicine, the clinic moved to Rochester, Minnesota, in 1864, where Mayo joined the U.S. Army Medical Corps as a member of the state’s enrollment board, which examined recruits for the Union army.

  From the time Mayo’s sons William J. and Charles H. joined him, the clinic was leading the transition from medicine as art to medicine as science. And, as the scientific advances of the nineteenth century were, by the beginning of the twentieth, almost entirely the result of collaboration between specialists, so too was medicine at the Mayo Clinic. By 1889, when the Mayos joined with the Sisters of St. Francis to build Saint Mary’s Hospital, William J. Mayo was writing, “It has become necessary to develop medicine as a cooperative science: the clinician, the specialist, and the laboratory workers uniting for the good of the patient. . . . Individualism in medicine can no longer exist.” When William Mayo first expressed this sentiment about “cooperative science,” he was really writing about a perceived deficiency in clinical practice, not medical research; he wanted to apply the best features of the latter to the former, which, in part, explains the decision to reconfigure the clinic as a not-for-profit charity in 1919. Now, staff would be salaried, not contract, physicians. This, the Mayos believed, would encourage collaboration between specialized researchers and clinicians, who would no longer run the risk of financial penalties for enlisting the best research in their practices.

  This forward-thinking philosophy would transform the Mayo Clinic into a world-famous research laboratory. The transformation was already complete when the top veterinary pathologist in the country, William Feldman, joined its Institute of Experimental Medicine in 1927. His research partner, Corwin Hinshaw, had an even more unusual background: Before receiving his medical degree in 1933 from the University of Pennsylvania, he had already done graduate work in zoology, bacteriology, and parasitology. What the two had in common was an interest in lung disease, particularly bovine, avian, and especially human tuberculosis.

 
