by Sonia Shah
The results were quickly picked up by the Associated Press and other media outlets. Panic ensued. Hundreds of calls from distraught patients, doctors, and drug companies poured in to Psaty’s office, forcing him to hire extra help. Doctors and industry execs were irate, calling Psaty’s report alarmist. “This is a great example of news that’s not ready for prime time,” seethed the American Heart Association’s Rodman Starke.118
While Pharmaceutical Executive magazine pooh-poohed Psaty’s findings—“many people who die are taking some drugs,” the magazine noted breezily119—Pfizer demanded that Psaty’s university submit mountains of notes, meeting minutes, and records for their review. Drug manufacturers criticized Psaty’s study and, he says, publicly questioned his integrity. The embattled Psaty and other similarly harassed researchers aired their sad story in a 1997 New England Journal of Medicine paper, under the headline “The messenger under attack.”120
By 2000, the proportion of the nation’s health care budget devoted to drugs was growing by 15 percent every year—almost twice the rate of growth in spending for hospitals and doctors—and was expected to continue to rise over the coming decade.121 While the industry takes pains to point out how this outlay of cash actually saves society money, by preventing costly hospitalizations, in fact most of the increased spending on drugs centered on just a handful of heavily marketed, brand-name drugs—not much more than two dozen of the over nine thousand drugs on the market. The bestsellers included such nakedly commercial hits as AstraZeneca’s acid-reflux drug Nexium, which contains the same active molecule as its off-patent predecessor, Prilosec, and Schering-Plough’s allergy drug Clarinex, a metabolite of its off-patent parent drug, Claritin.122
Clearly, the relentless growth of the drug industry—and the concomitant expansion of its clinical research activities—are not inevitable results of the American quest for good health. According to the World Health Organization, of the thousands of drugs available in the United States, just over three hundred are essential for public health. That’s not to say that the remaining drugs aren’t useful to some subset of patients, as FDA officials and industry execs often point out. “The availability of me-too drugs can reduce health care costs,” says former FDA commissioner Mark McClellan. “If there is just one cholesterol medication available, the price may be very high; if there are three or four or five, the price can come down a lot.”123 Plus, each patient’s unique physiology responds differently to subtly different drugs—when one statin doesn’t work, perhaps another will. And even hyped lifestyle drugs like Viagra have important medical uses, not just by treating those who genuinely suffer from erectile dysfunction, but by counteracting sexual side effects brought on by the treatment of more serious diseases. Drugs that come in tasty syrups, convenient pills, and handy nasal sprays help more people take the medicines they need.
But the number of people helped—and the margin by which they are healed—is certainly small, and the flip side of the mass marketing of blockbuster drugs is a heavy toll in adverse effects that would otherwise remain sporadic. Today, when elderly patients are rushed to the emergency room, it is 50 percent more likely that their problem stems from taking too many drugs than from not taking enough.124 Approved drugs kill over one hundred thousand Americans every year,125 not counting the scores whose bad reactions go unreported or are wrongly attributed to the disease the drug is meant to treat,126 making adverse reactions to pill popping the fifth leading cause of death in the United States.127
Each new drug must be tested on scores of warm bodies, a “conditional privilege” that society grants to investigators because it values medical innovation and beneficial new drugs, as the NIH notes in its 2004 Guidelines for the Conduct of Research Involving Human Subjects.128
And yet, what the recent history of our drug development and marketing system shows is that there is no easy equation between new drugs and health benefits, not even in the United States, where people consume more new drugs than anywhere else in the world. This is as true for the me-too ED drugs that lead drug company sales as for the masses of drugs aimed at more serious illnesses that comprise the majority of industry portfolios. For although the risks to experimental subjects in trials of me-too lifestyle drugs aren’t insignificant, such trials are generally short-term and involve subjects who are not seriously ill. In trials of drugs aimed at more dangerous conditions, the contrast between the risk-shouldering subjects and the distant beneficiaries can be stark indeed.
4
Uncaging the Guinea Pig
Long before the drug industry embarked on its global hunt for bodies in the poor reaches of the world, Western medical researchers relied on the bodies of their own vulnerable populations to satisfy their scientific curiosity.
The notion of manipulating human bodies to answer scientific questions arose, in part, in recognition of the fact that even the most elaborate pharmacopoeias did little to muffle the death toll from disease and infection. For countless bloody centuries nobody really knew how the body functioned or why it became diseased. The circulating blood, the pumping heart, the pulsating nerves and organs: none of these hidden mechanisms had revealed themselves, and the body remained as mysterious as the strange factors that suddenly appeared to sicken it.1
But by dissecting bodies and looking inside them, Western physicians began to figure out how they functioned and what went wrong when they became infirm. Over a thousand years, human dissections and vivisections—the mutilation of live human beings—slowly revealed the body’s mechanisms. Most of the cutting took place on the bodies of the poor and imprisoned, doubling as public spectacle and social opprobrium.2
The travails of those who ended up as “clinical material” rarely surfaced in polite company. Poor and colored people were generally considered less sensitive to pain anyway. As the prominent French physiologist and avid dissector Claude Bernard wrote in his 1865 Introduction to the Study of Experimental Medicine, scientists considered themselves immune to the “cries of people of fashion” along with those of the dissected themselves.3 Medical science was above the fray, he insisted, and could only be judged by its own practitioners. As for the human subjects involved, if society didn’t respect their rights, why should scientists?
Such attitudes persisted uncontested for nearly a century. It was a government-sponsored inquiry into the course of syphilis that would finally expose them to a shocked public.
Syphilis is an old American disease, brought back to Europe by Columbus’s returning sailors. In some people, the corkscrew-shaped Treponema bacterium causes no symptoms for years; some even naturally rid their bodies of it without ever knowing they’ve had it, inadvertently passing it on to others through sexual contact. For an unlucky minority the bacterium causes serious disease. It begins with genital sores, then a general rash and ulceration, and finally “revolting abscesses eating into bones and destroying the nose, lips, and genitals,” as medical historian Roy Porter described it. Untreated, it often proves fatal. (To repair the damage to syphilitics’ noses, sixteenth-century surgeons sewed flaps of skin from the upper arm onto their faces, leaving them to recover, arm attached to nose, for weeks at a time.) Medicine had little to offer. Until the synthesis of arsenic-based drugs in 1908, physicians prescribed the application of mercury ointments, a useless therapy that nevertheless caused teeth to fall out, ulcers to form on the gums, and bones to crumble.4
The economic burden of the disease weighed heavily in the U.S. South of the 1920s. The contagion ran rampant among the impoverished black laborers that many industries relied upon. If those sick with syphilis could be somehow treated, “the results would more than pay for the cost in better labor efficiency,” as one doctor from the U.S. Public Health Service (PHS) pointed out. Medical research into the field was urgent.5
With its potent whiff of sex, disfigurement, and death trailing behind, syphilis was considered an illicit, dirty disease. Syphilitics were so despised that during the 1930s U.S. hospitals refused to treat them. In 1934 a U.S. government health commissioner was kicked off the radio for simply uttering the word “syphilis” on the air. Sufferers of venereal diseases were relegated to special clinics, where their immoral ways couldn’t contaminate the upstanding sick in nearby hospitals. Their fates in the clinics couldn’t have been heartening. By then, standard treatment—over a year of painful weekly injections of arsenic—was expensive, time consuming, and only partially effective.6
Public disgust for syphilitics made experimentation easier in many ways. In 1931, Rockefeller-funded malaria researcher Mark Boyd injected the Plasmodium falciparum malarial parasite into black patients demented with syphilis at a Florida hospital. True, the idea of killing the syphilis bacterium by inducing high malarial fevers was a therapeutic craze at the time. But while white patients were generally administered the mild Plasmodium vivax malarial parasite, Boyd infected his black subjects with the parasite’s deadly cousin falciparum. No law or social mores required that he ask for the patients’ or their families’ consent, although he did do so for the eventual autopsies of the patients’ bodies.7
In 1929, a Public Health Service feasibility study determined that a mass treatment program for rural black workers suffering from syphilis was possible. But by 1932, funding for such elaborate endeavors had dried up, and the government doctors’ attention turned from providing care to scientific research. What if they enrolled syphilis-infected patients into a study, provided no treatment at all, and just watched what happened? Several interesting questions might be answered, maintained the PHS’s Dr. Taliaferro Clark, who conceived of the study. Perhaps the course of the disease was different in blacks than in whites, for instance, or perhaps no treatment was better than treatment. Whatever the case, autopsies of subjects who succumbed to the disease while under the researchers’ watch could help shed light on these pressing questions.8
Even back then such a “natural history” study would likely have been impossible to conduct on white, literate, or middle-class patients. But the subjects for this study would be impoverished, mostly illiterate black male sharecroppers in Macon County, Alabama, around the town of Tuskegee, where syphilis rates were soaring.
American science already relied on blacks as a source of clinical material, just as American plantations relied on them for their back-breaking labors in the field. The black janitors and technicians who cleaned up after American scientists were often called upon to supply animal and human bodies to experiment on. Young black boys could be enticed into capturing and etherizing dogs for experiments, and black men to tend experimental animals in the dark corridors of research hospitals; either could be approached to offer their bodies for ghastly experiments, such as one in which subjects had to swallow a twelve-foot-long tube that would be inflated later while lodged deep in the body.9
Still, the government doctors found recruitment for their no-treatment syphilis study difficult, even among the black workers they derided as ignorant and lazy in private correspondence later collected by Wellesley medical historian Susan Reverby. Finally, they resorted to deception, offering what they called “free treatment.” Nearly four hundred black male sharecroppers who considered themselves ill with “bad blood,” but who in fact were suffering from late-stage syphilis, along with 201 healthy black men who would serve as controls, enrolled in the study. As the subjects were unaware that they suffered from syphilis, the government doctors were under no pressure to offer the standard syphilis treatments of the time. They gave, instead, the long-dismissed mercury ointments, along with aspirin, tonics, free lunches, and burial insurance, watching and taking notes as the condition of the syphilitic men worsened. As it was imperative that the patients in the study not be treated with any medicines, which would contaminate the resulting syphilitic corpses, the government doctors met with local clinicians “to ask their cooperation in not treating the men,” according to a researcher involved in the study.10
Deceived and untreated, the sick men considered themselves lucky to be involved in the study. “The ride to and from the hospital in this vehicle with the Government emblem on the front door, chauffeured by a nurse, was a mark of distinction for many of the men who enjoyed waving to their neighbors as they drove by,” the nurse recruited to entice the men into the study recalled. Believing themselves graced with free medical care from government doctors, they started families, unknowingly spreading the infection to their partners and children.11
The government doctors felt under no obligation to conceal the deception at the heart of their research from their colleagues. After all, in the mid-1930s, medical research was the stuff of heroic drama. A polio victim, Franklin Delano Roosevelt, sat in the White House, urging Americans to send in their spare change to support medical research into polio. In 1936, a blockbuster film, The Story of Louis Pasteur, extolled the field’s forefather. No notion that research subjects might require protections from the ministrations of their doctor scientists, or that black people might be entitled to the same rights and freedoms as whites, existed to counterbalance the prerogative of researchers to do what they may. When the doctors busy at Tuskegee presented their preliminary findings at the annual meeting of the American Medical Association (AMA) that year, noting that the patients from whom they purposely withheld treatment were getting sicker much faster than their controls, nobody batted an eye. Papers about the study appeared in the medical literature at about five-year intervals from then on.12
With the arrival of penicillin in the 1940s, and its remarkable efficacy in treating syphilis, the nontreatment study in Tuskegee lost its primary reason for being. What was the point in seeing how the disease progressed without treatment, when treatment was now so simple and effective? Yet the Public Health Service clung to its original plan. They did not offer penicillin to the hundreds of black men under their care suffering from syphilis. On the contrary, in order to protect the integrity of their data, they conspired with local draft boards to withhold the army’s standard syphilis treatment from any Tuskegee subjects drafted into service.13
After all, medical progress required risk-taking, and sometimes researchers had to employ trickery or exploit their authority over vulnerable subjects in order to get the job done. In 1943, as the country headed to war, the U.S. Public Health Service paid two hundred prisoners one hundred dollars each to be infected with gonorrhea as government doctors watched, hoping to learn about its transmission. In another study government scientists infected eight hundred prisoners and hospital patients with malaria to study the efficacy of new antimalarial drugs. In that study the docs placed malaria-infected mosquitoes on prisoners’ warm stomachs as a Life magazine photographer hovered nearby.14
As documented by University of Virginia bioethicist Jonathan D. Moreno in his 2000 book on secret state experiments, Undue Risk, thousands of American soldiers were used in experiments designed to determine the fatal dose of poison gas, which had felled scores in World War I. In one test, termed a trial of “summer clothing,” soldiers were locked into gas chambers full of mustard gas in exchange for a three-day pass. Wearing only street clothes and a gas mask, some pleaded with their captors to be released but were denied until falling unconscious.15
Starting in 1946, scientists working with the Atomic Energy Commission (a forerunner of today’s Nuclear Regulatory Commission) launched a series of studies involving human consumption of radioactive materials at two schools for troubled children in Massachusetts, Fernald and Wrentham. The child inmates at these brutal institutions were heavily tranquilized, and were consigned half naked to bare, cement-floored rooms outfitted with grates into which their urine and feces could be hosed. Scientists fed the children meals contaminated with radioactive material, drawing their blood afterward to study how their bodies fared.
The children were so neglected, as one resident subject remembered years later, that they “would do practically anything for attention.” Even so, their parents had to be actively misled in order for the experiments to proceed. In letters to the parents, the studies were presented as “examinations” on nutrition, aimed at “brighter children” who would receive “special diets” as part of their membership in a special “Science Club.” The Fernald and Wrentham studies continued until 1973, with periodic reports appearing in medical journals.16
While the government pursued these inquiries in order to further their political and military goals, university scientists signed on for a chance to enter the exciting new field of radiation experimentation. As one prominent scientist remembered, “This was something like bacteriology . . . this was going to be a terrific field.”17 In a 1945 study, eighteen hospitalized patients under the care of physicians from the University of Rochester, the University of Chicago, and the University of California were secretly injected with plutonium. The purpose was to discern how the body processed the metal, and researchers pored over their plutonium-injected patients’ urine and stools and sampled their extracted teeth to find out. Patients were told that the doctors’ pointed and ongoing interest in them had nothing to do with experimentation but was part and parcel of their “long-term care,” as Moreno writes. Some were even dug up from their graves after death to see how much plutonium remained in their bones; their families were told this was in order to discern the effects of “past medical treatment.”18
Similar state experimentation proceeded in other Western countries. In Australia, over eight hundred Jewish refugees and injured soldiers were purposely infected with malaria, sometimes at doses equal to the bites of thirteen thousand infected mosquitoes. The government scientists withheld antimalarial treatment from their shivering charges while they removed up to two pints of blood and injected them with insulin and adrenalin in order to simulate blood loss, starvation, and anxiety. “They never told us anything,” recalled one subject who survived the trial. “At first I didn’t realize it was dangerous. . . . I thought it would be an adventure and that is why I went,” recalled another.19