The Plague Cycle


by Charles Kenny


  The eight million residents of New York generate enough waste to employ eleven thousand people to get rid of it—and that does not account for the office buildings, which are served by private trash companies.

  In the US as a whole, there are 2.3 million janitors and building cleaners, as well as 1.4 million maids and housekeeping cleaners, according to the US Bureau of Labor Statistics. What about plumbers, pipefitters, and steamfitters? There are 387,000 of those—many involved in building and fixing bathrooms and kitchens. Add in the soap, cleaning compound, and toilet preparation manufacturing industries that employ another 100,000 people.47 The broad “cleanliness industry” may keep as many as 4.5 million Americans employed—around 3 percent of the labor force.

  The sanitation revolution that led to this considerable powerhouse of the modern economy is all that allows a city like New York, with its mass of population living (literally) on top of one another, to work. In an area of 305 square miles, or 0.0005 percent of global land, the city holds a larger population than the entire world contained as recently as 4000 BCE, suggesting the power of civilization and agriculture—when combined with modern health and sanitation—to overcome Malthusian constraints on progress.

  * * *

  Improved sanitation came with costs. We’ll see that it was a weapon of empire. It was also a gateway to total war, keeping soldiers healthy enough to fight at unprecedented scale. The trench warfare of World War I should have been the gift of the century to microbes. Millions of soldiers living for years packed together in sodden trenches, coughing, sneezing, and excreting in extremely close proximity, periodically exposing themselves to a whole host of injuries that would allow even the laziest bacteria of putrefaction to get ahead and reproduce. Under such conditions, typhus would normally have been rampant, for example. But at least on the Western Front, times had changed. Only 104 cases of typhus were reported among the Allies during the entire war.48 (Russia failed to control the disease, however: as many as 3 million soldiers and civilians on the Eastern Front died of typhus during the conflict.)49

  Hundreds of thousands more soldiers were left healthy enough to be crippled by high explosives and metal-jacketed bullets that tore through soft tissues. That so many survived these wounds can be credited to improvements in hospital care. In the American Civil War, medical officer Middleton Goldsmith first used a treatment process for wounds that involved cutting out all the dead and damaged tissue (“debridement”) and then pouring a mixture of bromine and bromide of potassium onto the wound. Of his 308 patients, only eight died. Four years later, British doctor Joseph Lister demonstrated his antiseptic technique for surgery, and the widespread adoption of sterilization meant that hospital mortality rates for gunshot wounds were half the Civil War level by the time of the Spanish-American War in 1898.50 It was a revolutionary change from the standards of Borodino, one that allowed far more soldiers to recover from injuries to fight again: progress, of a sad and twisted sort.

  Combined with other sanitary interventions, clean water and sewage systems helped upend the rankings of causes of death in rich countries. Infectious diseases accounted for about 50 percent of all deaths in the UK in the 1860s. By 1900, infection was responsible for a little more than 40 percent of deaths in both the United States and the UK. Already, heart disease, cancer, and vascular conditions had taken the top spots in the mortality rankings. The decline continued into the twentieth century to the extent that, even before the invention of most vaccines and antibiotics, the US and Northern European death toll from infection had dramatically fallen.51

  And by the time sewer engineer Joseph Bazalgette died in 1891, after adding flood defenses, river crossings, and street improvements to his list of gifts to England’s capital, urban areas had closed most of the health gap with rural areas.52 English life expectancy, which had stalled for nearly forty years from 1820 to well past the mid-century mark, at last began to climb. In 1870, it surpassed forty-one years—a level it had previously reached in 1582, when Shakespeare was alive and the much smaller population was mostly rural.53 Life expectancy reached fifty by the first decade of the twentieth century.54

  Marcella Alsan and Claudia Goldin, at Stanford and Harvard Universities respectively, studied the rollout of clean water and modern sewage systems across US cities around the turn of the twentieth century. Infant mortality among the white population fell from about 17 percent dying before their first birthday in the 1880s to 9 percent in 1915. Alsan and Goldin used evidence from Boston to point out that water and sewage improvements alone may account for a little less than half of that decline, largely thanks to fewer cases of diarrhea related to water laced with fecal matter.55

  As Alsan and Goldin’s results suggest, Bazalgette and his sanitarian allies cannot take all of the credit for overall health improvements. Nutrition played a role, too—children and adults alike ate more and better food because families had more income.56 But sanitation was a far more influential force. In the 1630s, wealthy English people could certainly afford adequate food but achieved a life expectancy of only thirty-nine years. By the start of the third millennium those same wealthy classes were experiencing an average life expectancy of eighty-one years, largely because sanitation had diminished their exposure to widespread infection.57

  * * *

  Improved sanitation helps against many diseases, but it makes the biggest difference with those that are less easily spread between humans—ones that require a vector animal like a mosquito, flea, or rat, or that only pass through bodily fluids. The unequal spread of sanitation worldwide meant that while everyone remained at risk of diseases like measles, smallpox, or the flu, which travel easily and directly to a new victim through the air, the rich and sanitary became comparatively free of the rest.

  Take the 15 million deaths of the third global plague pandemic: the vast majority of those deaths were in South and East Asia. Europe lost perhaps seven thousand people and the United States five hundred. One factor explaining the difference: for all that Europe and the US were comparatively urbanized and connected, they were also comparatively rich, with better-quality housing and sanitary systems—and fewer rats and fleas.58

  The sanitation gap continues to influence the health gap: in 2000, northern Uganda suffered an outbreak of Ebola. Dr. Matthew Lukwiya ran a missionary hospital in the region and took the lead in fighting the outbreak, alerting authorities in Uganda’s capital, Kampala, and caring for patients in the hospital.

  Ebola isn’t highly infectious—for you to catch it, infected body fluids from a victim have to come into contact with your nose, mouth, genitals, or broken skin. But if you’re working in heat above 100 degrees Fahrenheit in a gown, gloves, and face mask for hours on end, you might be excused for making the deadly mistake of scratching a mosquito bite or wiping sweat from your eyes before scrubbing down. That’s why the people who care for Ebola sufferers are the most likely to catch the disease themselves. Twelve healthcare workers at Lukwiya’s hospital died. The doctor was only able to thwart a staff mutiny through a combination of threatening to leave, appealing to his colleagues’ professional spirit, and singing hymns.

  Eventually, Lukwiya succumbed himself, catching the disease while caring for an infected nurse. He was buried the same day he died, by a team wearing full protective gear that sprayed bleach over the coffin as it was lowered into the ground while the doctor’s wife and children looked on. Thankfully, he was the last to die at the hospital, and Uganda’s Ebola outbreak ended in February 2001.59 Lukwiya had stopped the disease from spreading—and he did so largely using the tools of sanitation and isolation.

  Ebola is distinctive in that it kills or clears up quickly and is only infectious when victims show symptoms. As a result, each victim tends to infect only a very few more people with the disease. If people caring for Ebola patients protect themselves using caps, goggles, gloves, and gowns, while using bleach to clean any blood, vomit, or feces that spills, and the victim is kept isolated, the average infection rate drops below one, and the Ebola epidemic burns out. That’s why the disease is only a major threat in countries with very weak surveillance and isolation capacities, like Uganda and Liberia.
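  To see why an average infection rate below one makes an outbreak burn out, a rough back-of-envelope sum helps (an illustration, assuming only that each case infects an average of R others): the expected size of the chain of cases sparked by a single introduction is a geometric series, and that series converges whenever R stays below one.

$$1 + R + R^2 + R^3 + \cdots = \frac{1}{1-R}, \qquad R < 1$$

  With R of one-half, an imported case produces only about two cases in total before the chain dies out; with a measles-like R of eighteen in a fully susceptible population, the series diverges, and the epidemic keeps growing until immunity or intervention pulls R below one.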

  Contrast Ebola with airborne Covid-19: in the latter case, isolation and social distancing slowed spread, and access to clean water for hand washing was important. But even cities with advanced water and sewer systems, along with armies of workers keeping hospitals, streets, and buildings clean, saw the coronavirus spread.

  Or think about measles, which spreads through the air even more efficiently than Covid-19. The average number of new people infected by every measles sufferer can reach eighteen among populations where no one has had the disease. In those cases, improved care, including boosting sanitary conditions, can reduce mortality rates, but there’s little chance that it will significantly reduce levels of endemic infection. If it weren’t for the fact that measles at its worst is a far less deadly disease than Ebola, it would have wiped out much of humanity long ago.

  Nonetheless, in the twenty-first century, there’s some likelihood that we’ll wipe out measles—thanks to the invention and worldwide spread of the measles vaccine. Hopefully, the same technology will soon reduce Covid-19 to a minor threat. As we’ll see in the next chapter, for all that the disciples of Bazalgette have prevented many millions of deaths—especially in the industrialized world—it’s the followers of vaccine inventor Edward Jenner who’ve saved the most lives worldwide.

  CHAPTER SEVEN Salvation by Needle

  I shall ask if any solid observations have been made from which it may be justly concluded that, in the countries where the art of medicine is most neglected, the mean duration of man’s life is less than in those where it is most cultivated.

  —Rousseau

  Some vaccines (in this case against polio) can be delivered via sugar cube instead of needle. (Credit: Wellcome polio vaccine. Wellcome Collection. Attribution 4.0 International [CC BY 4.0])

  Relying not just on their immune systems and the instincts shaped by the diseases of pre-civilization, civilized humans met the expanded infection challenge with a toolbox that included mostly ineffective treatments (snuff, sacrifice), a short list of more effective sanitary interventions (cooking, clean water), and finally, exclusion of the potentially ill. For most of history, medical advice was often worse than useless when it came to killing microbes or easing symptoms.

  Perhaps the greatest doctor of ancient times was Hippocrates, who was born on the Greek island of Cos around 460 BCE.1 He’s nicknamed “the father of Western medicine” in large part because he was firm in the opinion that diseases usually have earthly rather than supernatural causes. Hippocrates was also the first recorded person to suggest the use of an extract of the aspen tree for pain, anticipating modern aspirin.2

  Hippocrates’s ideas regarding the “humors” of phlegm, blood, and black and yellow bile were part of a theory of well-being and illness that dominated much of Western medicine for two millennia. He maintained that human health was determined by the balance of these four bodily fluids. Sickness followed when the humors were out of equilibrium. And bloodletting—cutting open the veins to regain humoral harmony—was a cure.

  With bleeding people halfway to death the state of the art in treatments, it shouldn’t be surprising that doctors were often seen as helpless in the face of pandemics. The chronicler of Justinian’s plague, Procopius, noted that “the most illustrious physicians predicted that many would die, who unexpectedly escaped entirely from suffering shortly afterwards, and that they declared that many would be saved, who were destined to be carried off almost immediately.… No device was discovered by man to save himself.” For all the attempts there’d been to find natural causes for disease, Procopius suggested that “it is quite impossible either to express in words or to conceive in thought any explanation, except indeed to refer it to God.”3

  Eight hundred years after Justinian and eighteen hundred after Hippocrates, the poet Petrarch complained bitterly about quackery and punditry during the Black Death: “No remedy is exactly right, and there is no solace. And to the accumulated disaster is added not knowing the causes and origin of the evil. For neither ignorance nor even the plague itself is more hateful than the nonsense and tall tales of certain men, who profess to know everything but in fact know nothing.”4 The advice of doctors regarding preventatives and cures included roasting food rather than boiling it, avoiding milk and meat, and eating lettuce.

  Guy de Chauliac, physician to Pope Clement VI and the doctor who’d described Hansen’s disease with such positive effect, survived a bout with the Black Death himself. In 1363, he wrote a medical textbook, Inventarium seu Collectorium in parte Cyrurgicali Medicine (An Inventory or Collection of the Surgical Part of Medicine), that looked back on the first plague outbreaks. “For preservation, there was nothing better than to flee the area before it was infected and to purge oneself with pills of aloe and reduce the blood through phlebectomy,” he wrote—retreating to the cure of bloodletting. As for underlying causes, de Chauliac suggested the conjunction of Saturn, Jupiter, and Mars on March 24, 1345, was the major factor, “which made such an impression upon the air and the other elements that, just as a magnet moves iron, so it changed the thick humors into something scorched and venomous.”5

  Not everyone agreed with the leading medical experts of the day. The contemporary chronicler de Mussi noted that the disease spread through ports by way of infected sailors carrying a “contagious pestilence.”6 That cities imposed ordinances designed to reduce contact with the sick suggests the idea of contagion was taken seriously. But doctors remained impotent to prevent or cure the condition.

  The same was broadly true worldwide. In 1102, China launched a national welfare program that established infirmaries within major cities with the explicit aim of controlling the spread of disease. There were state-sponsored lists of approved medicines, and these were (in theory) often distributed for free at infirmaries. Critics at the time carped at this, and there’s no evidence that the system as a whole significantly reduced the impact of such killers as measles, smallpox, flu, and dysentery.7

  * * *

  In the West, Vesalius, a sixteenth-century anatomist, used his extensive knowledge of the human form acquired from dissection to challenge the reputation of Galen—the most revered Roman proponent of the theory of humors. In 1628, William Harvey published De motu cordis, which demonstrated the general circulation of the blood. The discovery effectively undermined the model of Galen and Hippocrates. In 1637, the French philosopher René Descartes wrote in his Discourse on Method of the hopes that such advances offered:

  It is true that the science of medicine, as it now exists, contains few things whose utility is very remarkable: but… all at present known in it is almost nothing in comparison of what remains to be discovered; and… we could free ourselves from an infinity of maladies of body as well as of mind, and perhaps also even from the debility of age, if we had sufficiently ample knowledge of their causes, and of all the remedies provided for us by nature.8

  Less than a century after Descartes wrote this, the first effective if dangerous preventative against one of the greatest infectious killers, smallpox, did finally begin to spread from Asia to Europe, thanks in part to (somewhat) scientific experimentation.

  The last person to suffer from smallpox (hopefully ever) was Janet Parker, a medical photographer from Birmingham. In 1978, she was working in a lab that held one of the few remaining samples of the smallpox virus worldwide. Somehow, the virus escaped containment. Parker began to feel unwell, experiencing headaches, backache, nausea, and chills as well as terrible dreams. Then large, red, blistering pustules began spreading all over her body: Parker’s skin must have felt as if it were on fire. At a point where she was too weak to stand, she was admitted to a hospital; meanwhile, World Health Organization officials rushed to vaccinate anyone who’d been in contact with her—five hundred people.

  Parker’s condition deteriorated—her eyes scarred to almost complete blindness, and she suffered renal failure and pneumonia. Soon, she stopped responding to people. Professor Henry Bedson, head of the lab where she worked, was so wracked with guilt that he killed himself. And five days later, on September 11, 1978, Janet herself passed away, joining hundreds of millions or more before her who’d also succumbed to the dreaded disease.9

  Something akin to smallpox was described in Chinese texts more than sixteen hundred years ago. It may have been present in ancient Rome; others suggest it reached the shores of the Mediterranean by the tenth century and Southern Europe by the thirteenth. Genetic evidence points to the potential for an even more recent evolution of the disease, but it was certainly circulating by the 1500s.10

  It was an equal opportunity killer, striking down rich and poor alike. Consider England’s royalty in the seventeenth century: King Charles II survived the pox as a child, but lost two siblings to the disease. Tragedy struck the next generation, too, when Charles II’s only descendant died of smallpox in 1660 while the family was exiled in Europe. That left James, Duke of York, Charles II’s brother, in the line of succession. He became king in 1685. James’s career was nearly as unsuccessful as his father’s (who was beheaded at the end of the English Civil War), and he was deposed in the “Glorious Revolution” that put William of Orange and James’s daughter Mary on the throne.

  But the thirty-two-year-old Mary was herself struck down by smallpox. As Donald Hopkins relates, William, “whose father had died of smallpox the week before he was born, who had lost his mother to smallpox when he was ten years old, and who had had a severe case of smallpox himself as a child, now prepared to bury his young wife, a victim of the same savage illness.”11 William was succeeded by Mary’s sister, Anne, whose only son was killed by smallpox in turn, ending the Stuart royal line.

 
