Pandemic


by Sonia Shah


  The fact that wanton consumption of antibiotics—using either more or less than the precise amount required to tame an infection—leads to the development of antibiotic-resistant pathogens has long been known. It was first outlined by Alexander Fleming, the scientist who discovered penicillin. “I would like to sound one note of warning,” he said, in his speech accepting the Nobel Prize in Physiology or Medicine in 1945. “It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them, and the same thing has occasionally happened in the body. The time may come,” he went on, presciently,

  when penicillin can be bought by anyone in the shops. Then there is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant. Here is a hypothetical illustration. Mr. X has a sore throat. He buys some penicillin and gives himself, not enough to kill the streptococci but enough to educate them to resist penicillin. He then infects his wife. Mrs. X gets pneumonia and is treated with penicillin. As the streptococci are now resistant to penicillin the treatment fails. Mrs. X dies. Who is primarily responsible for Mrs. X’s death? Why Mr. X whose negligent use of penicillin changed the nature of the microbe.62

  While Fleming warned about the perils of underuse of antibiotics, the same risks apply with overuse. But while judicious use of antibiotics served the needs of public health, rampant consumption served those of private interests. In many countries, hospital physicians found it convenient to dose whole wards with antibiotics, indiscriminately. Patients found comfort in consuming antibiotics for colds and flus and other viral infections, for which they are useless. Farmers profited by giving antibiotics to their livestock, which for reasons that are still unclear made them grow faster and helped them thrive in factory farms. (Their provision of low-dose antibiotics to their livestock for “growth promotion” accounts for 80 percent of all antibiotic consumption in the United States.) Cosmetic companies enlarged their markets by packaging antibiotics in their soaps and hand lotions.63 By 2009, the people and animals of the United States were consuming upward of 35 million pounds of antibiotics annually.64 “Fleming’s warning,” one microbiologist writes, “has fallen on ears deafened by the sound of falling money.”65

  In countries like India, where there are fewer restrictions on antibiotic consumption, overuse is rampant. Even the most high-end antibiotics are available without a prescription. The poor, who can’t afford full courses of the drugs, pop one or two tablets at a time, modern-day Mr. X’s. While hundreds of thousands of Indians die every year for lack of access to the right antibiotics at the right time, antibiotics are routinely used by others for nonbacterial conditions, such as colds and diarrhea. Studies suggest that up to 80 percent of patients with respiratory infections and diarrhea in India—conditions unlikely to be alleviated by the use of antibiotics—are given antibiotics. Precise diagnostics that would rule out this risky and ineffective use are expensive and hard to come by. Plus, the pharmacists who fill scrips for antibiotics make a good living on it, as do the companies that sell the drugs.66

  Antibiotics, had they been well stewarded, experts say, could have effectively treated infections for hundreds of years. Instead, one by one our bacterial pathogens have figured out how to rout the onslaught of antibiotics to which they’ve been indiscriminately subjected. We now face what some experts call an era of “untreatable infections.” Already, a “growing minority” of infections have become “technically untreatable,” as David Livermore of the U.K.’s national Antibiotic Resistance Monitoring lab wrote in 2009.67

  Controlling antibiotic consumption would almost certainly solve the problem. Places that use antibiotics sparingly, whether due to the difficulty of accessing them, like Gambia, or due to more conscious restraint, like Scandinavia, experience low rates of drug-resistant microbes. MRSA is rare in Finland, Norway, and Denmark, as well as the Netherlands, even in hospitals. Since 1998, new patients admitted to Dutch hospitals have been swabbed and tested for MRSA, and if they’re found positive, they’re treated with antibiotics and kept in isolation until they’ve demonstrably shaken the bug. By 2000, only 1 percent of staph strains in Dutch hospitals showed resistance to methicillin and its cousin compounds. In Denmark, national guidelines restricted antibiotic prescriptions, and MRSA rates dropped from 18 percent of all staph in the late 1960s to just 1 percent within ten years.68

  But even as the toll of drug-resistant bacteria rises, vested interests are reluctant to admit there’s a problem, and weak public institutions are even more reluctant to challenge them. Attempts to slow the consumption of antibiotics in the United States—and thereby threaten the financial interests of the livestock and the drug industries as well as doctors and hospitals—have foundered again and again.

  In 1977, the FDA proposed removing the antibiotics penicillin and tetracycline from use in livestock for growth promotion, but Congress blocked the move. Then in 2002, the FDA announced that it would regulate the use of antibiotics in livestock only if the practice could be proved to cause high levels of drug-resistant infections in people. Even experts who believe it does admit that the connection is nearly impossible to conclusively prove. Finally, in response to a lawsuit filed by a coalition of NGOs, in 2012 a federal court ordered the FDA to regulate the practice anyway.69 In December 2013, the FDA issued a set of voluntary guidelines on antibiotic use in livestock, but it was so full of loopholes that one activist fighting for stricter controls called it “an early holiday gift to industry.”70

  The government has been similarly reluctant to rein in antibiotic use in hospitals and doctors’ offices. In 2006, after a fractious ten-year-long effort, the CDC issued voluntary guidelines on how to prevent the spread of drug-resistant bacteria in hospitals. The guidelines were so jumbled, the Government Accountability Office reported, that they “hindered efforts” to put them into useful action, as the journalist Maryn McKenna recounts in her history of MRSA.71

  Finally, in September 2014, the White House issued a series of guidelines on the issue. Whether political leaders had finally challenged commercial interests remained unclear. The guidelines fell roughly into two categories: those that would restrict the use of antibiotics—and directly conflict with the interests of drug companies, farmers, and hospitals—and those that would foster the development of new antibiotics and diagnostic tests to take the place of the old. Suggestively, the former were delayed, while the latter were fast-tracked. The implementation of the guidelines that would restrict consumption was put off until 2020, pending the actions of a new advisory council and task force. But the government immediately announced plans to provide a windfall to the pharmaceutical industry, with a $20 million prize for the development of a rapid diagnostic test to identify highly antibiotic-resistant bacteria.72

  The burden of drug-resistant pathogens extends beyond the people who will die from infections for which no effective treatment exists. A much larger group of people will suffer infections for which only a select few antibiotics will work. They will show up at hospitals and clinics with what seem to be routine infections and be erroneously treated with the wrong antibiotics. Studies suggest that anywhere from 30 to 100 percent of patients with MRSA are initially treated with ineffective antibiotics.73 Delays in effective treatment allow the pathogen to progress until it is too late. A simple urinary tract infection, for example, becomes a much more serious kidney infection. A kidney infection becomes a life-threatening bloodstream infection.74

  And then there are those of us like me and my son. It used to be that staph infections didn’t really affect otherwise healthy people like us. It was a problem for people weakened by stays in hospitals, acute rehab units, or long-term care facilities. But then in 1999, Staphylococcus aureus, in response to an onslaught of antibiotics, spawned a drug-resistant form, picked up the ability to secrete a toxin, and escaped from the American hospitals in which it first emerged. By 2001, 8 percent of the U.S. general population was colonized by MRSA bacteria, mostly inside their noses.75 Had the surveyors sampled more obscure bodily locations, that number might have been even larger. Two years later, 17.2 percent were colonized. It’s not just the skin and soft-tissue infections that MRSA most often causes in healthy people. If, via a wound, say, or a dental procedure, or a botched lancing of a boil, MRSA penetrates deeper into the body, the consequences are dire. Tissue-destroying infections of the lung (necrotizing pneumonia) and flesh-eating disease (necrotizing fasciitis) are just two of the unpleasant—and often deadly—possibilities. By 2005, MRSA had caused more than 1.3 million infections in the United States, creating what experts have called a public-health crisis in the nation’s emergency rooms and doctors’ offices.76

  For now, the MRSA strain most commonly picked up by people outside of hospitals, the USA300 strain, while impervious to penicillin and other similar “beta-lactam” antibiotics, is still susceptible to non-beta-lactam antibiotics. That may not help much if you have necrotizing pneumonia—38 percent of patients die within forty-eight hours of being admitted to the hospital—but it is still something.77 But not for long, perhaps. Staph strains that can resist non-beta-lactam drugs as well have already been spotted.78

  There are few new drugs on the horizon. Since antibiotics are not used for very long, there’s little market incentive for drug companies to develop new ones. The market value of a brand-new antibiotic is just $50 million, a paltry sum for a drug company considering the research and development costs incurred to create such drugs. As a result, between 1998 and 2008, the FDA approved just thirteen new antibiotics, only three of which boasted new mechanisms of action.79 In 2009, according to the Infectious Diseases Society of America, only sixteen of the hundreds of new drugs in development were antibiotics. None targeted the most resistant and least treatable gram-negative bacteria, like those endowed with NDM-1.80

  * * *

  The U.S. government is not alone in falling prey to the rising power of private interests and allowing pathogens to spread as a result. It’s happened to our premier international agency, the World Health Organization, as well.

  The agency was created by the United Nations in 1948 to coordinate campaigns to protect global public health, using dues collected from the UN’s member nations. But over the course of the 1980s and early 1990s, the major donor nations, skeptical of the UN system, started to slowly starve it of public financing. (They introduced a policy of zero real growth to the UN budget in 1980, and of zero nominal growth in 1993.)81 To make up the budgetary shortfall, the WHO started to turn to private finance, collecting so-called voluntary contributions from private philanthropies, companies, and NGOs, as well as donor countries. In 1970, these voluntary contributions accounted for a quarter of the agency’s budget. By 2015, they made up more than three-quarters of the agency’s nearly $4 billion budget.

  If these voluntary donations simply substituted for missing public funding, they wouldn’t make much of a difference in the way the WHO functions. But they don’t. Public funding (via annual dues from member nations) doesn’t come with any strings attached. The dues are simply assessed and collected, and the WHO is in charge of deciding how to spend the money. That’s not true of voluntary contributions. By making a voluntary contribution, individual donors buy control at the WHO. They can bypass the WHO’s priorities and allot the money for whatever specific purpose they like.82

  Thus the WHO’s activities, the agency’s director-general Margaret Chan admitted in an interview with The New York Times, are no longer driven by global health priorities but rather by donor interests.83 And those interests have introduced a pronounced distortion into the WHO’s activities. While the agency’s regular budget is allocated to different health campaigns in proportion to their global health burden, according to an analysis of the agency’s 2004–2005 budget, 91 percent of the WHO’s voluntary contributions were earmarked for diseases that account for just 8 percent of global mortality.84

  Many of the WHO’s deliberations are conducted behind closed doors, so the full extent of the influence of private donors is unclear. But their conflicts of interest are plain enough. For example, insecticide manufacturers help the WHO set malaria policy, even though their market for antimalarial insecticides would vanish if malaria actually receded. Drug companies help the WHO determine access-to-medicine policies, despite the fact that they stand to lose billions from the cheaper generic drugs that would improve patients’ ability to get the treatments they need. Processed food and drinks companies help the agency craft new initiatives on obesity and noncommunicable diseases, even though their financial health depends on selling products that are known to contribute to these very problems.85

  As the integrity of the WHO has been degraded by private interests, so has its ability to effectively lead global responses to public-health challenges. During the Ebola epidemic in West Africa in 2014, the weakened agency was unable to muster a prompt response. One reason, it turns out, was that the agency had been forced to compromise on the integrity of the officials it hired. Rather than being appointed for their commitment to global health, they’d been appointed for political reasons. When the affected countries wanted to downplay the epidemic so as not to upset their mining companies and other investors, the WHO’s politically appointed local officials went along with it. As an internal document leaked to the Associated Press revealed, they refused to acknowledge the epidemic until it was too late to contain. They failed to send reports on Ebola to WHO headquarters. The WHO official in Guinea refused to get visas for Ebola experts to visit the afflicted country. It wasn’t exactly a cover-up, but the agency’s top polio official, Bruce Aylward, admitted in the fall of 2014 that WHO’s actions ended up “compromising” the effort to control the Ebola epidemic rather than aiding it.86

  As the effectiveness of the WHO’s leadership wanes, the influence of private global health outfits grows. Some have started to eclipse public ones like the WHO entirely. Bill Gates, the cofounder of the computer giant Microsoft, used the fortune he accumulated from the global high-tech economy to form the Bill and Melinda Gates Foundation, the world’s largest private philanthropy, in 2000. The Gates Foundation soon became the world’s third-biggest financier of global health research, outstripped only by the U.S. and U.K. governments, and one of the world’s single largest donors to the WHO.87 Today it’s the privately run Gates Foundation that sets the global health agenda, not the WHO. In 2007, the foundation announced that resources should be devoted to the eradication of malaria, contrary to a long-established consensus among scientists in and outside the WHO that controlling the disease was safer and more feasible. Nevertheless, the WHO immediately adopted the Gates plan. When the agency’s malaria director, Arata Kochi, dared to publicly question it, he was promptly put on “gardening leave,” as one malaria scientist put it, never to be heard from again.88

  The well-intentioned people at the Gates Foundation have no particular private interest that directly conflicts with their ability to promote global health campaigns in the public interest, or at least none that we know of.89 But if they did, there’d be no mechanism to hold them accountable for it. Powerful private interests unfettered by public controls, even when they have charitable intentions, are like royalty. We’ve ceded control to them and now we must simply hope that they are good. Our ability to mount a cooperative defense against the next pandemic depends on it.

  * * *

  Of course, even if political leaders are corrupt and political institutions are rotten, people can still cooperate with each other. They can take matters into their own hands, launching their own cooperative efforts to contain pathogens. For example, when city leaders failed to alert New Yorkers to the spread of cholera in the nineteenth century, private physicians banded together and issued their own bulletins.

  Such actions make sense. And extreme events do tend to bring people closer together. Think of New Yorkers after the September 11 terror attacks or in the wake of recent hurricanes. But that’s not what tends to happen when pandemic-causing pathogens strike.

  Unlike acts of war or catastrophic storms, pandemic-causing pathogens don’t build trust and facilitate cooperative defenses. On the contrary, due to the peculiar psychic experience of new pathogens, they’re more likely to breed suspicion and mistrust among us, destroying social bonds as surely as they destroy bodies.

  SIX

  BLAME

  People on the street eye us warily as my guide and I slowly drive the smooth, wide roads of Cité Soleil, a seaside slum along the outskirts of Port-au-Prince. The dusty, flat neighborhood is mostly treeless, the sun beating down on its shacks and crumbling, bullet-riddled buildings. It’s around noon on a weekday, but despite high unemployment in the slum, the streets are empty. I’d come to visit Cité Soleil in the summer of 2013 to get a sense of how the people most vulnerable to the cholera epidemic felt about it. But I’m reluctant to approach the few people we see, sitting on overturned buckets in a patch of shade or ambling in the expanse of packed dirt in front of their shacks. They frown at us as we pass, although whether that’s because of the sun in their eyes or something else it’s hard to say.

  The sense of incipient violence grows more palpable as we continue on to the edge of the neighborhood, where the city dump is located. Here, people who make their living scavenging through Port-au-Prince’s garbage are walking around, and we can see clumps of people talking in the distance up the road. They seem approachable to me, but before we can reach them, we’re stopped by the helmeted guards posted in front of the gates of the dump. We’re not allowed to wander around on our own, they tell us sternly, without a uniformed official with us. Without visible proof of government permission, someone may “act crazy,” they say. This makes no sense to us, and one of the guards has what looks like an empty ChapStick container shoved up one of his nostrils, which somewhat undermines his authority. We reluctantly climb back into the car anyway. When I snap a few photos before ducking in, he angrily raps on the window. If anyone had seen what I just did, he scolds, they’d throw rocks at me or worse.

 
