
The Plague Cycle


by Charles Kenny


  Losing effective treatment will not only undermine our ability to fight routine infections but will also have serious implications for people who have other medical problems. For example, joint replacements and organ transplants, cancer chemotherapy, diabetes treatment, and treatment of rheumatoid arthritis all depend on our ability to fight infections that may be exacerbated by the treatment of these conditions. And if we lose our antibiotics, we’ll lose the ability to do that effectively.58

  Despite surgeons vowing to clean their hands, arms, and instruments with (even greater) care, the risk of postoperative infection would multiply. And every surgical procedure would become dramatically more dangerous. If antibiotics don’t work on an infected wound, for example, surgeons are forced to use the state-of-the-art treatment that prevailed in the 1930s: debridement. That involves trying to cut out all of the infected tissue and is more dangerous and far more invasive than a course of penicillin.59 As we overprescribe antibiotics, creating a range of antibiotic-resistant superbugs, we may speedily regress toward a world in which maggots (that eat putrefying flesh) become our best defense against gangrene. Indeed, they’re already commercially available for the treatment of wounds responding poorly to antibiotics.60

  A review of the antimicrobial threat, sponsored by the British government and chaired by economist Jim O’Neill, predicts that, if we fail to act, 10 million people a year could be dying worldwide from increased antimicrobial resistance by 2050. That’s more than die worldwide each year from cancer. It compares to a World Health Organization estimate of 250,000 additional deaths each year from the effects of climate change between 2030 and 2050.61 Global deaths from terrorism run at about one one-thousandth of the potential toll of antibiotic resistance.62 (Compare the attention paid by newspapers, cable news, or presidential debates to the three issues: terror comes first and climate second, with antibiotics finishing a distant third.)

  Meanwhile, the number of new drug approvals for antibacterials has been dropping, while the gap between an antibiotic’s introduction and the emergence of resistant infections appears to be shrinking. It took nine years between the introduction of the antibiotic tetracycline and the identification of tetracycline-resistant shigella. It took one year between the introduction of ceftaroline and the identification of ceftaroline-resistant staphylococcus.63 Our profligate use raises the cost of researching new alternatives, and in the meantime treatment costs tick steadily upward. But we’re doing little to increase research budgets for new drugs.

  * * *

  Making matters worse is the ongoing military research aimed at making infection threats more deadly so that they can be weaponized.

  Humans have abused the power of parasites in war for a long time. An ancient torture described by Plutarch used maggots’ taste for the distasteful to slowly consume a victim encased between two boats.64 We’ve seen early accounts of Tartars catapulting the corpses of Black Death victims into the besieged town of Caffa. And the Venetian doctor Michiel Angelo Salamon tried a more complex variation of the same approach. He distilled liquids collected from the spleen, buboes, and carbuncles of plague-stricken victims as a potential bioweapon.65

  A more effective historical attempt at biological warfare involved British colonists giving Native Americans blankets dusted in smallpox scabs. It was an approach that General Washington feared the British would turn on his troops as they marched on Boston, and it led him to order the entire army inoculated through the process of variolation.66

  Microbes were, of course, also used as weapons of war in the twentieth century. Jiang Chun Geng is a farmer who lives in Dachen, China. He was born in 1930 and was caught up in the brutal war between Japan and his country at the age of twelve. His whole family became infected with festering sores soon after Japanese soldiers passed through his village, as he reported to Judith Miller of the Manhattan Institute. His mother and younger brother died from the sores. But he lived on, with a decomposing and swollen right leg that, sixty years later, remained a putrid open wound.67

  As reported by Miller, Jiang was the victim of biological warfare, carried out by the euphemistically titled Epidemic Prevention and Water Purification Department—Unit 731 of the Imperial Japanese Army. Over the period 1932 to 1945, the unit performed experiments on prisoners of war that involved intentional infection and investigative surgery. As well, it carried out germ warfare involving plague, anthrax, and typhus. Around the Chinese city of Harbin, soldiers laced one thousand wells with typhoid bacilli and handed out bottles of lemonade injected with typhus to local children. In Nanking, they distributed anthrax-filled chocolate and cake to kids. They even took their experiments far enough to discover new vectors: a US Army report on the unit’s accomplishments notes “one of the greatest” was the use of the human flea to infect victims with the plague in the Chinese city of Changteh—showing you didn’t need rats to spread the Black Death.68

  But Unit 731 also followed a more traditional route by mixing fleas infected with the bacteria into bags of rice and wheat, which they dropped over the city of Quzhou in October 1940. The rats ate the grain and picked up the plague-carrying fleas. For whatever reason, Yersinia pestis had lost its edge since the Black Death—perhaps sanitary conditions were better in wartime China than they’d been in Europe of the Middle Ages. Only 121 people died of the resulting plague outbreak.69

  US General Douglas MacArthur recognized cutting-edge innovation when he saw it. The perpetrators never faced war crimes charges, with most of the research team co-opted into the US biowarfare program in exchange for immunity. Jiang, along with other victims, has never received reparations for what was done to him, and the Japanese government still refuses to acknowledge the program, which may have killed many thousands in attacks across seventy towns and cities.

  More recently, in 2001, letters laced with anthrax spores were sent to US senators and media outlets, along with notes saying what the spores were and that the recipient might want to visit a hospital. The letters clearly weren’t designed to maximize death, but they did demonstrate the risk of biological weapons, as well as the ease of hiding their production—no one has been charged in connection with the crime. The airborne release of one kilogram of anthrax aerosol over New York could kill hundreds of thousands—and anthrax that is antibiotic-resistant has been developed by both the US and the former Soviet Union.70

  In 2003, then US secretary of state Colin Powell presented evidence to the UN Security Council regarding Iraq’s weapons of mass destruction and used the anthrax incident as an example. At one point he picked up a vial of liquid and noted, “Less than a teaspoon of dry anthrax… in an envelope shut down the United States Senate in the fall of 2001.” Saddam Hussein could have produced twenty-five thousand liters of anthrax, he confided. “We have firsthand descriptions of biological weapons factories on wheels and on rails.… They can produce anthrax and botulinum toxin. In fact, they can produce enough dry biological agent in a single month to kill thousands upon thousands of people.”71

  As we now know, while Iraq previously had a bioweapons program, the anthrax was either never made or was destroyed prior to the US invasion. But biological warfare remains a real risk. Indeed, in the future, it’s possible that the biggest threat to the vast majority of humanity from infectious disease will be a bioengineered weapon of mass destruction.

  Compared to building an atomic weapon—which involves a nuclear power plant or complex enrichment infrastructure alongside advanced engineering and conventional explosives technology—bioweapons are straightforward to manufacture. The Manhattan Project to develop the atom bomb in World War Two was comparable in manpower and capital cost to the entire prewar automobile industry of the United States. Today, single university labs regularly create bioweapons, sometimes by accident. According to Nathan Myhrvold, former chief technology officer for Microsoft, the technology of molecular biology manipulation means that “access to mass death has been democratized; it has spread from a small elite of superpower leaders to nearly anybody with modest resources.” Indeed, a team of virologists at the University of Wisconsin-Madison has listed the simple changes they would need to make to convert a lethal strain of bird flu into something that could be easily transmitted between mammals.

  The small individual scale of operations is why the Soviet Union could secretly continue a massive bioweapons program after a 1972 treaty with the US banned their production. More alarming even than the ease with which these killer pathogens can be produced is that, in contrast to a single nuclear dirty bomb that spreads radioactive material or even a Hiroshima-sized nuclear weapon, the right biological agent can spread worldwide, killing on every continent.72 And the international treaty meant to prevent states from working on infectious threats, the Biological Weapons Convention, lacks significant verification mechanisms, leaving the world at the mercy of the goodwill of its signatories.

  But once again, the threat comes not only from the risk itself but also from our response to it. The recent history of bioterror suggests significant limits to terrorists’ practical ability to create effective weapons: the 1984 tainting of salad with salmonella in Oregon by the Rajneeshee religious cult killed no one; the 2001 anthrax mailings killed very few.73 And Colin Powell’s UN speech was part of an effort to justify the invasion of Iraq on what turned out to be faulty grounds.

  Just as naturally occurring infectious disease has often been used as an excuse to deny liberties, the risk of bioterror was one justification for the growth of fortress America—including increased military spending, greater use of targeted assassination, torture, invasive intelligence-gathering operations, and complex, expensive visa and screening requirements for visitors to the US. It’s not clear that, on net, these programs make Americans safer (let alone the world).74 They certainly have considerable spillover effects—including reduced effectiveness in fighting infectious disease itself, as we’ve seen in the case of polio eradication.

  * * *

  There are many diseases we’re unlikely to wipe out for a long time, if ever: infections such as Covid-19, Ebola, and plague that have animal reservoirs, as well as conditions like flu that mutate so fast and spread so rapidly that we’re constantly playing catch-up. And new diseases will continue to emerge—hopefully only from nature, but perhaps including man-made threats. As long as we remain a planet of 7-plus billion, close-packed and widely traveled, with a love for meat, eggs, and milk, infections will be a force in our lives.

  The only question is “How big?” We have the technology and resources to confine infection to the status of a minor global annoyance—killing few, temporarily disabling more, but in the bush leagues of death rates. But there is also the pessimistic view: we have the technology—and infections have the natural capacity—to become a massive force for global death and decline. And we can make the future even darker by reacting late and at a cost to liberties. The choice is ours.

  CHAPTER ELEVEN Flattening the Plague Cycle

  By 2030, end preventable deaths of newborns and children under 5 years of age…

  —United Nations Sustainable Development Goals

  Covid-19 lockdown empties Times Square. (Source: AP Photo/Seth Wenig)

  The fight against infection is a public fight. Individuals acting alone can only do so much to counter it. That’s why responses to the infectious disease threat have involved an ever-growing role for government. In 1868, Sir John Simon, medical officer of health for the City of London, proudly enumerated the extended powers of government that had already developed to respond to disease in Victorian Britain. The state

  “has interfered between parent and child, not only in imposing limitation on industrial uses of children, but also to the extent of requiring children should not be left unvaccinated. It has interfered between employer and employed [insisting] certain sanitary claims shall be fulfilled… between vendor and purchaser… and has made it a public offense to sell adulterated food or drink or medicine.… It has provided that in any sort of epidemic emergency organized medical assistance, not peculiarly for paupers, may be required of local authorities.”1

  Dorothy Porter suggests that public health interventions were viewed by early Victorians as “the most infamous growth of authoritarian, paternalist power of central government… and the growth of the despotic influence of a particular profession—the medical profession.”2 Nonetheless, the authoritarians, paternalists, and despots won. Since then the (necessary) intrusions have only grown.

  But there’s still more to be done. In 2017, about 10 million people died from communicable diseases worldwide. Many of those deaths were preventable. We need to extend the full benefits of the sanitation and medical revolution. And we need to improve our response to new disease threats. Both efforts will necessarily be global. Just as no one individual or family can fully respond to the infectious threat, in a globalized world neither can any one country.

  Even though everyone is at risk from infection, it’s among poor people in poor countries that the threat most often translates into death. That’s because while fighting infectious disease is cheaper than dealing with the diseases of the rich, poor countries have incomes so low that even simple preventatives are out of reach.

  Take the response to HIV: we have seen that in 2013, for the first time, more people worldwide were put on antiretrovirals than were newly infected with the virus.3 And costs for those antiretroviral drugs had come down markedly. That price decline was delayed by the monopoly (patent) rights given to companies that develop new drugs. But even after generic (off-patent) producers entered the market, antiretrovirals remained a very expensive way to save lives in Africa: their cost of about $350 per year of life saved compared to $42 per life year for programs that support adult male circumcision (which reduces HIV transmission), $24 or less per life year for extending bed net coverage against malaria, and $5 or less per life year for increasing vaccination coverage.4

  The average annual health expenditure in the world’s poorest countries is just $10 per person—about half of it private spending and about half from government. (In the US, health dollars spent are closer to $6,000 per person per year.) If you were a minister of health responsible for a budget that could amount to as little as $5 per person in your population, and the list of needs included everything from shots to cancer treatments, how would you spend the money: on comparatively expensive treatments to keep AIDS patients alive or, rather, on bed nets that would save more people for the same price? In a world that’s as rich as ours, no one should be forced to make decisions like that.
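  To make that trade-off concrete, here is a rough back-of-the-envelope sketch in Python. It uses only the approximate cost-per-life-year figures quoted above; the $1 million budget is a hypothetical number chosen purely for illustration, not a figure from the text.

```python
# Illustrative sketch only: life-years saved per $1 million under the rough
# cost-per-life-year estimates quoted above (all figures approximate).
cost_per_life_year = {
    "antiretroviral treatment": 350,
    "adult male circumcision": 42,
    "bed nets against malaria": 24,
    "expanded vaccination": 5,
}

budget = 1_000_000  # hypothetical $1 million health budget

for intervention, cost in cost_per_life_year.items():
    life_years = budget / cost
    print(f"{intervention}: ~{life_years:,.0f} life-years saved")

# Roughly 2,900 life-years from antiretrovirals versus 200,000 from expanded
# vaccination for the same money; that is the dilemma facing a minister of
# health with $5 per person to spend.
```

  On these very rough numbers, the same dollar buys on the order of seventy times as many life-years through vaccination as through antiretroviral treatment, which is the arithmetic behind the minister’s dilemma.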

  In 2013, the Lancet Global Health Commission, chaired by former treasury secretary Lawrence Summers, declared that the world can and should provide universal access to a range of cheap and effective global health services, most focused on infectious diseases, by 2035. These would prevent about 15 million deaths a year in the world’s poorest eighty-two countries, those classified as low-income or lower-middle-income by the World Bank. The commission estimated that most countries could afford to finance these interventions by themselves, but the poorest countries as a group might require a total of $9 billion in aid financing. That is approximately one-fifteenth of current global aid flows, suggesting it’s eminently affordable.

  Admittedly, a greater challenge may be ensuring that the money committed is turned into improved outcomes in countries where a lot of care is distinctly substandard.5 A recent World Bank–sponsored survey of practice in Nigerian health clinics and hospitals tested to see if doctors and nurses followed clinical guidelines in asking basic questions when presented with patients complaining of symptoms of common illnesses, if they diagnosed the illnesses correctly, and if basic medical tools and basic drugs were available to treat such conditions. The survey indicated a national average of 32 percent adherence to clinical guidelines, 36 percent diagnostic accuracy, and 44 percent drug availability. Slightly more than two-thirds of facilities did at least have one working thermometer. Nigeria is hardly the exception in Sub-Saharan Africa, and national averages hide even worse outcomes in rural hospitals.6

  Frustratingly, donors in rich countries have to date focused on particular disease campaigns—including against malaria, polio, and AIDS—almost to the exclusion of supporting the development of functioning health systems. For all of the great success against smallpox and polio and the significant progress against malaria and AIDS, health improvements will only continue if broader health networks work.7 They deserve more support.

  Beyond the health sector, the better that basic sanitation systems function, the less need people will have for treatments and cures in the first place. The World Bank estimates that it would cost $28 billion a year between now and 2030 to reach universal global access to basic water, sanitation, and hygiene facilities. That means clean water less than fifteen minutes’ walk away and at least a decent pit latrine to use as a toilet. Making sure that every household has its own clean water supply, and a toilet that can safely dispose of feces, would cost $114 billion a year.8

  That’s a lot of money (and, again, it will take more than money). On the other hand, given the global burden of disease related to poor sanitation, it might seem like a bargain.

  It may be that technology can come to the rescue, lowering costs. The Gates Foundation is backing the development of toilets that need no water or sewer connection and convert urine and feces into pathogen-free manure or electrical energy—at a cost below 5 cents per user per day.9 A number of different models are being piloted in developing countries—if they work, they’d hugely simplify the provision of sanitation and could dramatically reduce the risk of feces-related infection in slums and cities worldwide.

  Achieving further dramatic global reductions in infection will also take behavior change: doctors correctly diagnosing and treating, nurses turning up to work, but, most important, people using basic sanitary techniques. Once again, we have some idea how that is done: in 2001, the Indian government launched the Total Sanitation Campaign. Learning from earlier efforts in which latrines were constructed only for them to sit unused or be converted into storage sheds, the government put a lot of effort into fostering use. In some areas, village leaders were given a cash bonus if their village was declared open defecation–free. The program was a partial success—open defecation dropped from 64 to 53 percent of households over the period from 2001 to 2011. And fully eliminating open-field defecation in a village had a dramatic impact on child heights—significantly reducing stunting.10 But that it took ten years of effort to reduce open defecation by one-sixth demonstrates how hard it can be to make progress.

 
