
Miracle Cure


by William Rosen


  Meanwhile, Bristol Laboratories, which was awaiting patent approval for its own method of making tetracycline, would still need a license for the product patent. When it couldn’t get one, Bristol initiated a lawsuit against Pfizer, whose own patent application was looking cloudier by the day; Pfizer had apparently withheld information about the original discovery that would have damaged its claim.

  The only solution that worked for everyone was a complicated roundelay of cross-licenses. American Cyanamid acquired Heyden’s pending patent application for a method of producing tetracycline, then withdrew its own application for a patent on the tetracycline molecule itself. In return, Pfizer granted American Cyanamid/Lederle a license to manufacture the drug. Finally, on the theory that a competitor makes less trouble when its attorneys are inside the boardroom sending cease-and-desist letters out than outside the boardroom sending cease-and-desist letters in, Pfizer and American Cyanamid agreed to grant cross-licenses to Bristol Laboratories as well: They would be allowed to continue making tetracycline, but to supply only Squibb and Upjohn.

  The peace treaty worked brilliantly at achieving its intended objective: propping up the price of the various tetracycline antibiotics, which had fallen by two-thirds between 1948 and 1952 but stabilized thereafter for another decade, mostly for sales of the simplest, most generic form. In 1951, Lederle’s Aureomycin was 41.5 percent of the broad-spectrum market; five years later, it was barely 12 percent; but Achromycin, its version of tetracycline, represented 66 percent of sales, and virtually all of the company’s $43 million in profit. And they weren’t the only ones making money. Squibb sold the antibiotic as Steclin; Upjohn had Panmycin. Even Bristol Laboratories, once it eased out of the restrictions of the original cross-license, sold tetracycline under the name Polycycline.*

  Though selling what were, chemically, identical drugs, the companies didn’t stop competing. But since they were no longer competing to supply a superior product, and had agreed not to compete on price, victory would go to the best drug marketer. And it is in no way a criticism of Pfizer’s research and production brilliance to say that, when it came to marketing, they were truly in a class by themselves.

  Their teacher was the legendary advertising executive Dr. Arthur M. Sackler.

  For a man whose critics recall him as one of the twentieth century’s great hucksters,* Sackler had academic credentials that were certainly impressive enough. He graduated from New York University’s School of Medicine in 1937 and founded the Laboratories for Therapeutic Research in 1938, while simultaneously completing his residency in psychiatry at Creedmoor State Hospital in Queens. Over the years, he would publish more than 150 research papers, largely in the most rarefied realms of neuroendocrinology and the metabolic basis of schizophrenia. His outsized place in the history of medicine, however, is a result of his mastery of a different aspect of human behavior. In 1942, he joined the William Douglas McAdams advertising agency, which he would, shortly thereafter, acquire.

  The timely encounter between a brilliant and ambitious physician, a newly sophisticated advertising business, and a seemingly inexhaustible supply of miracle drugs changed medicine forever. Because Pfizer lacked its competitors’ long-standing relationships with doctors and hospitals, Sackler proposed that the company persuade physicians to try Terramycin and Tetracyn not simply through one-on-one sales calls, but through their trade journal: the Journal of the American Medical Association. His strategy was clear enough, at least in retrospect. Before 1952, advertising in JAMA was almost entirely free of branded drugs. Readers of America’s premier physicians’ journal were far more likely to be introduced to ads for generic medical supplies (“Doctors Get a Heap of Comfort from Grinnell Gloves”) and, of course, ubiquitous ads for cigarettes: “More Doctors Smoke Camels Than Any Other Cigarette.” Even with that level of support, advertising in JAMA was still, by the standards of midcentury magazine publishing, sparse. Drug companies—“ethical” drug companies, as they were still known—hadn’t quite erased the line separating medicine from marketing.

  By 1955, though, the journal was carrying more advertising pages annually than Life magazine, and the number of pages with branded ads increased by a whopping 500 percent, almost all of it from Sackler’s number one client. From 1952 on, Pfizer purchased more than two-thirds of all the antibiotic advertising in JAMA. And, if that weren’t enough, between 1952 and 1956 virtually every issue of JAMA arrived in the offices of hundreds of thousands of physicians with Pfizer’s house newsletter, Spectrum, bound in. The ads were ubiquitous, and striking. One resembled an eye chart:

  O

  CU

  LAR

  INFEC

  TIONS

  RESPOND

  TO BROAD

  SPECTRUM

  TERRAMYCIN

  Eye infections, respiratory ailments, skin lesions: Terramycin treated them all. As part of the strategy of positioning the drug as the antibiotic of choice for the maximum number of potential users, the company even produced a cherry-flavored suspension for children, promising to “Turn Satans into Seraphs.”

  —

  As the various tetracycline producers battled over market share, the impulse to exploit every possible application for their wonder molecules found an unexpected niche.

  Almost all of the pharmaceutical firms that were jump-started by the antibiotic revolution had been in the business of manufacturing vitamins since at least the 1930s. Millions of dollars had flowed to the Wisconsin Alumni Research Foundation to license the technology for supplementing milk with vitamin D by irradiating it with ultraviolet light; the New Zealand company Glaxo had grown to be the preeminent supplier of vitamins in the United Kingdom. Ten percent of Merck’s sales in 1940 came from vitamins, and they were a significant source of revenue for both I. G. Farben in Germany and Hoffmann-La Roche in Switzerland. The discovery that vitamin deficiencies were at the root of dozens of human diseases, from scurvy to rickets, created an enthusiasm for a vitamin-fortified diet that continues to this day.*

  One of the most dangerous such diseases had been discovered and named even before vitamins themselves. For nearly a century, since the middle of the nineteenth century, medicine had recognized the condition known as pernicious anemia, which is caused by a lack of what came to be called the “intrinsic factor” needed for production of red blood cells. The disease, sometimes known as Addison’s or Biermer’s anemia (for the English and German doctors who first identified it), was treatable by eating large amounts of calves’ liver, then later liver juice, and even later, liver extract. The lifesaving ingredient in the liver that corrected the deficiency, however, wasn’t identified until 1948, by a team led by Merck’s Karl Folkers: vitamin B12.

  A dozen different ailments, all of them serious, can be helped by adding B12 to the diet, from inflammation of the gastrointestinal tract, to immune system disorders like lupus, to celiac disease. As the medical community learned of the lifesaving properties of more and more vitamins, however, they made a monumental though understandable error. Although a deficiency of a particular vitamin was frequently the cause of serious disease—scurvy, from a lack of vitamin C, for example—the reverse isn’t true. Taking fewer vitamins than the body needs causes illness; taking more than it needs doesn’t promote health. This proved a difficult concept for people, including most doctors, and explains to this day why some of us take five times the recommended daily allowance of vitamin C in the mistaken belief that it will ward off colds. We want to believe that whenever some is good, more must be better. This is also why dozens of biochemists in the 1940s and 1950s dosed domestic animals with massive quantities of vitamins, particularly the “animal protein factor” known as B12.

  The search for an inexpensive compound that would make for healthier cows, sheep, pigs, and chickens was well under way by the end of the Second World War, and the researchers working for the newly ambitious pharmaceutical companies were as eager to discover it as they were to discover new weapons in the battle against disease. One such biochemist at Lederle, Thomas H. Jukes, had learned from a review of published research that Merck had discovered that Waksman’s world-changing actinomycete, S. griseus, produced not just streptomycin, but also B12. He hoped his own search for soil-dwelling bacteria would hit a similar double jackpot, and in December 1948 Jukes received a sample of Aureomycin to test on chickens as a possible source of the animal protein factor, one cheaper than liver extract and cheap enough for animal feed. The sample was extremely small: enough for Jukes, and his colleague Robert Stokstad, to give supplements to a few laying hens that had been starved to the point that the chicks hatched from their eggs would normally die within two weeks. The hope was that Aureomycin-produced B12, the protein factor, would allow the chicks to survive.

  The results surprised everyone. A dozen young birds had been given Aureomycin, others liver extract, and a control group nothing at all. The chicks given the Aureomycin grew at a rate far faster than those whose hens had been fed a normal diet, which was expected. But they also grew faster than the chicks dosed with liver extract. The reason couldn’t be B12. Something in the antibiotic itself was accelerating their growth.* Streptomyces aureofaciens had proved to be a gold maker in an entirely different realm.

  The demand for Aureomycin as a human antibiotic was so great that Lederle could provide no more of the product to Jukes and his team. In a burst of resourcefulness fully worthy of Norman Heatley at the Dunn School’s penurious best, Jukes “dug residues out of the Lederle dump” in order to extract tiny quantities of Aureomycin from the waste products of the fermentation process by which the company manufactured it. He sent the resulting samples to agronomists all over the country to verify its effect, and when a researcher at the University of Florida reported that he had tripled the growth rate of young pigs, even Lederle was convinced. They started marketing Aureomycin to farmers, not as an antibiotic per se, but as a source of B12 in order to avoid the annoying regulatory hand of the Food and Drug Administration.

  Stokstad and Jukes presented their findings at the annual meeting of the American Chemical Society on April 9, 1950. The following day, the front page of the New York Times announced that the new “Wonder Drug . . . had been found to be one of the greatest growth-promoting substances so far to be discovered.” The paper further reported that the use of Aureomycin as a supplement represented “enormous long-range significance for the survival of the human race in a world of dwindling resources and expanding populations.” The Times even speculated that the drug’s “hitherto unsuspected nutritional powers” might aid the growth of malnourished and underweight children.

  They wrote truer than they knew. That same year, in one of the more disturbing sidebars to the antibiotic revolution, a Florida physician named Charles Carter started a three-year-long study during which he gave a daily dose of 75 milligrams of Aureomycin to mentally disabled children. At a moment in history when even a wealthy country like the United States worried more about food scarcity than obesity (and hardly at all about the ethics of experimentation on the mentally disabled), Jukes could proudly announce, “The average weight for the supplemented group was 6.5 pounds, while the control group averaged 1.9 pounds in yearly weight gain.”

  The use of Aureomycin as a method for increasing the growth rate of children did not, luckily, catch on. Not so with domestic animals. Because of the peace treaty that settled the tetracycline patent competition, Lederle wouldn’t be alone in pursuing the agricultural market with tetracycline-derived nutritional supplements. In 1950, Pfizer had already put its corporate toe in the water with a compound they named Bi-Con, which combined streptomycin with vitamin B12 as a growth additive. If Aureomycin was being profitably sold to farmers, why not Terramycin? In 1952, Herb Luther, an animal nutritionist at Pfizer, started what he called “Project Piglet” in an attempt to find a feed that could be given directly to young pigs, accelerating their growth in order to limit the risk of sows rolling over and crushing their too-small offspring. (Morbid fact: in 1950, more than a third of newborn piglets died in this way.) Luther’s experimental animals, who were awakened to the stirring sounds of the “William Tell Overture” and put to sleep to “Brahms’ Lullaby,” were fed on the nursing formula that Pfizer branded as Terralac. Jukes had opened another front in the antibiotic revolution, and here the battlefields wouldn’t be pharmacies and hospitals, but feed stores.

  It would be more than a little disrespectful to discuss Thomas Jukes without mentioning that he was also a groundbreaking evolutionary biologist, one of the pioneers of what has become a mainstream element of current thinking about molecular evolution—the powerful idea that much evolutionary change is not adaptive, but neutral. This theory, which Jukes first called non-Darwinian evolution and which was independently proposed by the Japanese biologist Motoo Kimura in 1968, holds that most evolutionary variability isn’t due to natural selection, in which adaptations spread because they improve survival or reproductive success, but to the random drift of mutations that aren’t particularly “superior.” In other words, most of the changes observed in a species over time are inconsequential; they do nothing to improve survival or reproduction.*

  Though Jukes’s reputation in the world of biology rests mostly on his work on evolution, he was best known during his lifetime as a journalist who, for more than four decades, fought against pseudoscience and creationism from the bully pulpit of a regular column in Nature magazine. In addition, he was a polemicist and public intellectual well remembered today for battling against proposed bans on the insecticide DDT, arguing that the number of lives it saved (by killing malarial mosquitoes) was dramatically greater than any possible ecological risk.

  But it was his discovery that antibiotics accelerated the growth of meat-producing animals that has had, by far, the longest tail of consequence. It’s not simply, or even mostly, that it led to the increased consumption of animal protein around the world at a dizzying rate. Far more important, it exposed untold quintillions of bacterial pathogens to antibiotics in doses too small to kill them. The result was the cultivation of some of the most robust bacteria the planet has encountered in the last billion years.

  The causes of antibiotic resistance are, of course, much greater and more varied than the promiscuous use of drugs like Aureomycin in animal feed. Even before penicillin had been completely isolated or tested on humans, Ernst Chain and E. P. Abraham had identified an enzyme produced by Staphylococcus aureus—penicillinase—that cleaves the chemical bonds holding the beta-lactam ring together, and so degrades the antibiotic action of nearly an entire family of antibiotics.* It wasn’t until 1945, though, that the phenomenon was documented: An Australian study tested 159 different strains of S. aureus, 128 collected before the advent of penicillin, the other 31 from hospital wards where penicillin was used therapeutically. Only the 31 showed resistance.

  But this was just one study, and the first flush of enthusiasm for the miracle drugs ignored problems of resistance. Articles published during the first years of widespread antibiotic use, roughly 1944 to 1948, seemed to assume that the practice of medicine had largely become a matter of sorting through an array of wonder drugs, and, when in doubt, prescribing them all. Given the uncritical acceptance of the first antibiotics by clinicians—an understandable bit of blinkered thinking, since even when antibiotics were ineffective, they were generally very safe—and the widespread appetite for them by patients, antibiotic resistance was certain to appear, and to appear quickly.

  Nonetheless, it was dramatically accelerated by Jukes’s discovery. Since their introduction, up to a quarter of all the antibiotics manufactured have been administered to animals. Sometimes those drugs were restricted to animal use; in May 1940, milk cows on display at the “Foodzone” exhibit of the 1939 World’s Fair—including the first of Borden’s “Elsies”—came down with the mammary gland infection known as mastitis, which was cured by the too-toxic-for-human-use gramicidin. Much more frequently, however, the antibiotics prescribed were identical to those used by humans. By the 1950s, as much as 10 percent of the milk consumed in the United Kingdom and the United States was contaminated with penicillin, which had been used to treat mastitis in cows who were then milked before the penicillin cleared their systems.

  And that was just when antibiotics were used more or less as intended. Because Aureomycin and Terralac were given not only to cure (or even prevent) infection, but to promote growth, the dosages were very small: generally around 200 grams per ton of feed for a two-week period—enough to grow animals to their slaughterhouse weight as quickly as possible, but far below the level needed to stop a bacterial infection. In fact, these exposures more closely replicate the concentrations of antibiotic molecules in nature, such as Alexander Fleming’s mold juice or Selman Waksman’s soils.

  There, they behave very differently. The standard explanation for the phenomenon of antibiotic effectiveness—the fact that some microbes produce substances that are highly toxic to other microbes—was historically couched in distinctly martial, not to say anthropomorphic, terms: Bacteria (and fungi like the penicilliums) produce the molecules we call antibiotics to defend themselves in an eternal Hobbesian war of all against all.

  Plausible, but for the uncomfortable fact that the molecules that microbes produce aren’t usually toxic in the concentrations that occur in nature.

  The natural history of antibiotics is not, it turns out, the obvious one, in which unicellular life evolved defense mechanisms during billions of years of natural selection. Antibiotic molecules in nature aren’t always, or even usually, weapons. Bacterial populations react to the presence of low concentrations of antibiotic molecules in a variety of ways, and many of those reactions are actually positive. Many antibiotics are what biologists call “hormetic”: beneficial in low doses, even promoting the creation of what are known as biofilms, matrices that hold bacterial cells together with a polymer “glue” and make them far more durable in the presence of both the animal immune system and antibiotics themselves.

 
