The Best American Science and Nature Writing 2014




  As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose “growth promoters” given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.
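  (A rough back-of-the-envelope check, assuming the twenty-minute doubling time above were sustained around the clock, an idealized pace that real populations rarely maintain, bears out the "tens of thousands of generations" figure:

\[
\frac{60 \text{ min/hr}}{20 \text{ min/generation}} \times 24 \text{ hr/day} \times 365 \text{ days/yr} = 26{,}280 \text{ generations per year.}
\]

Human beings, by comparison, need roughly twenty-five years to produce a single generation.)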

  Fleming’s prediction was correct. Penicillin-resistant staph emerged in 1940 while the drug was still being given to only a few patients. Tetracycline was introduced in 1950, and tetracycline-resistant Shigella emerged in 1959; erythromycin came on the market in 1953, and erythromycin-resistant strep appeared in 1968. As antibiotics became more affordable and their use increased, bacteria developed defenses more quickly. Methicillin arrived in 1960 and methicillin resistance in 1962; levofloxacin in 1996 and the first resistant cases the same year; linezolid in 2000 and resistance to it in 2001; daptomycin in 2003 and the first signs of resistance in 2004.

  With antibiotics losing usefulness so quickly—and thus not making back the estimated $1 billion per drug it costs to create them—the pharmaceutical industry lost enthusiasm for making more. In 2004, there were only five new antibiotics in development, compared to more than five hundred chronic-disease drugs for which resistance is not an issue—and which, unlike antibiotics, are taken for years, not days. Since then, resistant bugs have grown more numerous and, by sharing DNA with each other, have become even tougher to treat with the few drugs that remain. In 2009 and again this year, researchers in Europe and the United States sounded the alarm over an ominous form of resistance known as CRE (carbapenem-resistant Enterobacteriaceae), for which only one antibiotic still works.

  Health authorities have struggled to convince the public that this is a crisis. In September, Dr. Thomas Frieden, the director of the U.S. Centers for Disease Control and Prevention, issued a blunt warning: “If we’re not careful, we will soon be in a post-antibiotic era. For some patients and some microbes, we are already there.” The chief medical officer of the United Kingdom, Dame Sally Davies—who calls antibiotic resistance as serious a threat as terrorism—recently published a book in which she imagines what might come next. She sketches a world where infection is so dangerous that anyone with even minor symptoms would be locked in confinement until they recover or die. It is a dark vision, meant to disturb. But it may actually underplay what the loss of antibiotics would mean.

  In 2009, three New York physicians cared for a sixty-seven-year-old man who had major surgery and then picked up a hospital infection that was “pan-resistant”—that is, responsive to no antibiotics at all. He died fourteen days later. When his doctors related his case in a medical journal months afterward, they still sounded stunned. “It is a rarity for a physician in the developed world to have a patient die of an overwhelming infection for which there are no therapeutic options,” they said, calling the man’s death “the first instance in our clinical experience in which we had no effective treatment to offer.”

  They are not the only doctors to endure that lack of options. Dr. Brad Spellberg of UCLA’s David Geffen School of Medicine became so enraged by the ineffectiveness of antibiotics that he wrote a book about it. “Sitting with a family, trying to explain that you have nothing left to treat their dying relative—that leaves an indelible mark on you,” he says. “This is not cancer; it’s infectious disease, treatable for decades.”

  As grim as they are, in-hospital deaths from resistant infections are easy to rationalize: perhaps these people were just old, already ill, different somehow from the rest of us. But deaths like this are changing medicine. To protect their own facilities, hospitals already flag incoming patients who might carry untreatable bacteria. Most of those patients come from nursing homes and “long-term acute care” (an intensive-care alternative where someone who needs a ventilator for weeks or months might stay). So many patients in those institutions carry highly resistant bacteria that hospital workers isolate them when they arrive and fret about the danger they pose to others. As infections become yet more dangerous, the health care industry will be even less willing to take such risks.

  Those calculations of risk extend far beyond admitting possibly contaminated patients from a nursing home. Without the protection offered by antibiotics, entire categories of medical practice would be rethought.

  Many treatments require suppressing the immune system, to help destroy cancer or to keep a transplanted organ viable. That suppression makes people unusually vulnerable to infection. Antibiotics reduce the threat; without them, chemotherapy or radiation treatment would be as dangerous as the cancers they seek to cure. Dr. Michael Bell, who leads an infection-prevention division at the CDC, told me: “We deal with that risk now by loading people up with broad-spectrum antibiotics, sometimes for weeks at a stretch. But if you can’t do that, the decision to treat somebody takes on a different ethical tone. Similarly with transplantation. And severe burns are hugely susceptible to infection. Burn units would have a very, very difficult task keeping people alive.”

  Doctors routinely perform procedures that carry an extraordinary infection risk unless antibiotics are used. Chief among them: any treatment that requires the construction of portals into the bloodstream and gives bacteria a direct route to the heart or brain. That rules out intensive-care medicine, with its ventilators, catheters, and ports—but also something as prosaic as kidney dialysis, which mechanically filters the blood.

  Next to go: surgery, especially on sites that harbor large populations of bacteria such as the intestines and the urinary tract. Those bacteria are benign in their regular homes in the body, but introduce them into the blood, as surgery can, and infections are practically guaranteed. And then implantable devices, because bacteria can form sticky films of infection on the devices’ surfaces that can be broken down only by antibiotics.

  Dr. Donald Fry, a member of the American College of Surgeons, who finished medical school in 1972, says: “In my professional life, it has been breathtaking to watch what can be done with synthetic prosthetic materials: joints, vessels, heart valves. But in these operations, infection is a catastrophe.” British health economists with similar concerns recently calculated the costs of antibiotic resistance. To examine how it would affect surgery, they picked hip replacements, a common procedure in once-athletic baby boomers. They estimated that without antibiotics, one out of every six recipients of new hip joints would die.

  Antibiotics are administered prophylactically before operations as major as open-heart surgery and as routine as cesarean sections and prostate biopsies. Without the drugs, the risks posed by those operations, and the likelihood that physicians would perform them, will change.

  “In our current malpractice environment, is a doctor going to want to do a bone marrow transplant, knowing there’s a very high rate of infection that you won’t be able to treat?” asks Dr. Louis Rice, chair of the department of medicine at Brown University’s medical school. “Plus, right now health care is a reasonably free-market, fee-for-service system; people are interested in doing procedures because they make money. But five or ten years from now, we’ll probably be in an environment where we get a flat sum of money to take care of patients. And we may decide that some of these procedures aren’t worth the risk.”

  Medical procedures may involve a high risk of infections, but our everyday lives are pretty risky too. One of the first people to receive penicillin experimentally was a British policeman, Albert Alexander. He was so riddled with infection that his scalp oozed pus and one eye had to be removed. The source of his illness: scratching his face on a rosebush. (There was so little penicillin available that, though Alexander rallied at first, the drug ran out and he died.)

  Before antibiotics, five women died out of every one thousand who gave birth. One out of nine people who got a skin infection died, even from something as simple as a scrape or an insect bite. Three out of ten people who contracted pneumonia died from it. Ear infections caused deafness; sore throats were followed by heart failure. In a post-antibiotic era, would you mess around with power tools? Let your kid climb a tree? Have another child?

  “Right now, if you want to be a sharp-looking hipster and get a tattoo, you’re not putting your life on the line,” says the CDC’s Bell. “Botox injections, liposuction, those become possibly life-threatening. Even driving to work: we rely on antibiotics to make a major accident something we can get through, as opposed to a death sentence.”

  Bell’s prediction is a hypothesis for now—but infections that resist even powerful antibiotics have already entered everyday life. Dozens of college and professional athletes, most recently Lawrence Tynes of the Tampa Bay Buccaneers, have lost playing time or entire seasons to infections with drug-resistant staph, MRSA. Girls who sought permanent-makeup tattoos have lost their eyebrows after getting infections. Last year three members of a Maryland family—an elderly woman and two adult children—died of resistant pneumonia that took hold after simple cases of flu.

  At UCLA, Spellberg treated a woman with what appeared to be an everyday urinary tract infection—except that it was not quelled by the first round of antibiotics, or the second. By the time he saw her, she was in septic shock, and the infection had destroyed the bones in her spine. A last-ditch course of the only remaining antibiotic saved her life, but she lost the use of her legs. “This is what we’re in danger of,” he says. “People who are living normal lives who develop almost untreatable infections.”

  In 2009 Tom Dukes—a fifty-four-year-old inline skater and bodybuilder—developed diverticulosis, a common problem in which pouches develop in the wall of the intestine. He was coping with it, watching his diet and monitoring himself for symptoms, when searing cramps doubled him over and sent him to urgent care. One of the thin-walled pouches had torn open and dumped gut bacteria into his abdomen—but for reasons no one could explain, what should have been normal E. coli were instead highly drug-resistant. Doctors excised 8 inches of his colon in emergency surgery. Over several months, Dukes recovered with the aid of last-resort antibiotics, delivered intravenously. For years afterward, he was exhausted and in pain. “I was living my life, a really healthy life,” he says. “It never dawned on me that this could happen.”

  Dukes believes, though he has no evidence, that the bacteria in his gut became drug-resistant because he ate meat from animals raised with routine antibiotic use. That would not be difficult: most meat in the United States is grown that way. To varying degrees, depending on their size and age, cattle, pigs, and chickens—and, in other countries, fish and shrimp—receive regular doses to speed their growth, increase their weight, and protect them from disease. Out of all the antibiotics sold in the United States each year, 80 percent by weight are used in agriculture, primarily to fatten animals and protect them from the conditions in which they are raised.

  A growing body of scientific research links antibiotic use in animals to the emergence of antibiotic-resistant bacteria: in the animals’ own guts, in the manure that farmers use on crops or store on their land, and in human illnesses as well. Resistant bacteria move from animals to humans in groundwater and dust, on flies, and via the meat those animals get turned into.

  An annual survey of retail meat conducted by the Food and Drug Administration—part of a larger project involving the CDC and the U.S. Department of Agriculture that examines animals, meat, and human illness—finds resistant organisms every year. In its 2011 report, published last February, the FDA found (among many other results) that 65 percent of chicken breasts and 44 percent of ground beef carried bacteria resistant to tetracycline, and 11 percent of pork chops carried bacteria resistant to five classes of drugs. Meat transports those bacteria into your kitchen, if you do not handle it very carefully, and into your body if it is not thoroughly cooked—and resistant infections result.

  Researchers and activists have tried for decades to get the FDA to rein in farm overuse of antibiotics, mostly without success. The agency attempted in the 1970s to control agricultural use by revoking authorization for penicillin and tetracycline to be used as “growth promoters,” but that effort never moved forward. Agriculture and the veterinary pharmaceutical industry pushed back, alleging that agricultural antibiotics have no demonstrable effect on human health.

  Few, though, have asked what multi-drug–resistant bacteria might mean for farm animals. Yet a post-antibiotic era imperils agriculture as much as it does medicine. In addition to growth promoters, livestock raising uses antibiotics to treat individual animals, as well as in routine dosing called “prevention and control” that protects whole herds. If antibiotics became useless, then animals would suffer: individual illnesses could not be treated, and if the crowded conditions in which most meat animals are raised were not changed, more diseases would spread.

  But if the loss of antibiotics changes how livestock are raised, then farmers might be the ones to suffer. Other methods for protecting animals from disease—enlarging barns, cutting down on crowding, and delaying weaning so that immune systems have more time to develop—would be expensive to implement, and agriculture’s profit margins are already thin. In 2002 economists for the National Pork Producers Council estimated that removing antibiotics from hog raising would force farmers to spend $4.50 more per pig, a cost that would be passed on to consumers.

  H. Morgan Scott, a veterinary epidemiologist at Kansas State University, unpacked for me how antibiotics are used to control a major cattle illness, bovine respiratory disease. “If a rancher decides to wean their calves right off the cow in the fall and ship them, that’s a risky process for the calf, and one of the things that permits that to continue is antibiotics,” he said, adding, “If those antibiotics weren’t available, either people would pay a much lower price for those same calves, or the rancher might retain them through the winter” while paying extra to feed them. That is, without antibiotics, those farmers would face either lower revenues or higher costs.

  Livestock raising isn’t the only aspect of food production that relies on antibiotics or that would be threatened if the drugs no longer worked. The drugs are routinely used in fish and shrimp farming, particularly in Asia, to protect against bacteria that spread in the pools where seafood is raised—and as a result, the aquaculture industry is struggling with antibiotic-resistant fish diseases and searching for alternatives. In the United States, antibiotics are used to control fruit diseases, but those protections are breaking down too. Last year, streptomycin-resistant fire blight, which in 2000 nearly destroyed Michigan’s apple and pear industry, appeared for the first time in orchards in upstate New York, which is (after Washington) the nation’s second-largest apple-growing state. “Our growers have never seen this, and they aren’t prepared for it,” says Herb Aldwinckle, a professor of plant pathology at Cornell University. “Our understanding is that there is one useful antibiotic left.”

  Is a post-antibiotic era inevitable? Possibly not—but not without change.

  In countries such as Denmark, Norway, and the Netherlands, government regulation of medical and agricultural antibiotic use has helped curb bacteria’s rapid evolution toward untreatability. But the United States has never been willing to institute such controls, and the free-market alternative of asking physicians and consumers to use antibiotics conservatively has been tried for decades without much success. As has the long effort to reduce farm antibiotic use: the FDA will soon issue new rules for agriculture, but they will be contained in a voluntary “guidance to industry,” not a regulation with the force of law.

  What might hold off the apocalypse for a while is more antibiotics—but first pharmaceutical companies will have to be lured back into a marketplace they have already deemed unrewarding. The need for new compounds could force the federal government to create drug-development incentives: patent extensions, for instance, or changes in the requirements for clinical trials. But whenever drug research revives, achieving a new compound takes at least ten years from concept to drugstore shelf. There will be no new drug to solve the problem soon—and given the relentlessness of bacterial evolution, none that can solve the problem forever. In the meantime, the medical industry is reviving the old-fashioned solution of rigorous hospital cleaning and also trying new ideas: building automatic scrutiny of prescriptions into computerized medical records and developing rapid tests to ensure that the drugs aren’t prescribed when they are not needed. The threat of the end of antibiotics might even impel a reconsideration of phages, the individually brewed cocktails of viruses that were a mainstay of Soviet Union medical care during the Cold War. So far the FDA has allowed them into the U.S. market only as food-safety preparations, not as treatments for infections.

  But for any of that to happen, the prospect of a post-antibiotic era has to be taken seriously, and those staring down the trend say that still seems unlikely. “Nobody relates to themselves lying in an ICU bed on a ventilator,” says Rice of Brown University. “And after it happens, they generally want to forget it.”

  When I think of preventing this possible future, I reread my great-uncle’s obit, weighing its old-fashioned language freighted with a small town’s grief.

  The world is made up of “average” people, and that is probably why editorials are not written about any one of them. Yet among these average people, who are not “great” in political, social, religious, economic or other specialized fields, there are sometimes those who stand out above the rest: stand out for qualities that are intangible, that we can’t put our finger on.

 
