
Very, Very, Very Dreadful


by Albert Marrin


  THE RISE OF SCIENTIFIC MEDICINE

  By the time the plague ended in London, Europeans were beginning to question long-held ideas about the natural world. For centuries, the educated had believed such knowledge came only from the writings of the Greek philosopher Aristotle and other ancient “authorities,” whose works set the standards for judging every medical idea. Supposedly, any explanation that differed from theirs could not be true. This assumption changed in the mid-1600s.

  Advanced thinkers believed that observation, experimentation, and reason counted more than printed words in old books. The Italian Galileo Galilei (1564–1642) showed that the sun is the center of the solar system and that planet Earth revolves around this star, not the other way around. Another researcher, the Englishman Sir Isaac Newton (1642–1727), demonstrated how an invisible force called gravity attracts every particle of matter in the universe according to precise mathematical formulas. In this way, the sun’s gravitational pull holds the planets in their orbits, just as Earth’s gravity draws snowflakes to the ground. Without gravity, the universe would fly apart in chaos.

  An etching of Hippocrates, often considered the “Father of Medicine.” (1584) Credit 12

  Traditional ideas about disease also came into question. The terms doctor and physician once meant a person whose knowledge came from ancient masters like Hippocrates, the “Father of Medicine,” and Galen, both Greek. For over a thousand years, the writings of Hippocrates and Galen were sacred to the medical profession, taught in every university. Sickness, they claimed, had two causes. In the first case, it resulted from having too much of a certain “humor,” or bodily fluid: blood, phlegm, yellow bile, or black bile. When a person fell ill, it was because he or she had become too hot, too cold, too moist, or too dry. The favored treatment was bleeding—cutting into a vein to reduce the amount of the “excess” humor. For example, doctors bled George Washington four times for a bad cold, taking a total of ninety-six ounces, well over half the blood in his body, and hastening his death. Another treatment intended to restore the balance of bodily fluids was purging: massive vomiting. The second cause of sickness was believed to be “miasmas,” poisonous vapors rising from rotting matter in the soil. To clear away “bad air,” physicians advised building bonfires in the streets, beating drums, ringing church bells, and firing cannons.

  These ideas gradually gave way as a host of discoveries laid the groundwork for scientific medicine. In 1663, Robert Hooke (1635–1703), an English scientist, saw tiny box-like compartments, which he called “cells,” in a piece of cork he was examining under a crude microscope. He soon found that his own hair and skin also contained cells; thus, he concluded, cells are the basic structure of all life-forms.

  Antony van Leeuwenhoek, considered the world’s first microbiologist, in a painting by Jan Verkolje. (1673) Credit 13

  Meanwhile, Antony van Leeuwenhoek (1632–1723), a Dutch merchant with a passion for science, built better microscopes than had ever existed. Instantly, another world, swarming with life, opened before his eyes. In 1683, the Dutchman observed “animalcules,” or what we call microorganisms, tiny life-forms invisible to the naked eye. After viewing a drop of dirty canal water for the first time, he wrote about seeing “with great wonder…many very little living animalcules, very prettily a-moving. The biggest sort…had a very strong and swift motion, and shot…like a pike does through water. The second sort…oft spun around like a top.” These were the simplest forms of animal life, single-celled creatures such as amoebas. Though Leeuwenhoek was the first human to see bacteria, he did not realize they could cause disease.17

  The discovery of animalcules did, however, lead the Dutchman to question the theory of spontaneous generation. This theory held that simple life-forms—worms, maggots, fleas, lice—arose from nonliving matter such as dust, mud, or dead flesh. Though Leeuwenhoek’s studies showed that these creatures emerged from eggs, he failed to convince the scientific community of his day. Acceptance would have to wait until the nineteenth century.

  Louis Pasteur, a French chemist and microbiologist renowned for his discoveries of the principles of vaccination, microbial fermentation, and pasteurization, a process that carries his name, in a painting by Albert Edelfelt. (1885) Credit 14

  Louis Pasteur (1822–1895), a French chemist and researcher, settled the question of spontaneous generation. During the 1860s, in a series of careful experiments, he proved that new life can arise only from existing life, not from dirt or other lifeless matter. Equally important, Pasteur championed germ theory, the idea that microorganisms cause diseases. “The microbe causes the illness,” he declared. “Look for the microbe and you’ll understand the illness.” A German physician, Robert Koch (1843–1910), later found that specific types of microbes cause specific infectious diseases. The discoveries of Pasteur and Koch were like golden keys unlocking nature’s secrets. They showed that diseases were not sent as punishment from on high or caused by magic, witchcraft, or the devil. Instead, diseases had natural causes, and the human mind could understand them, control them, and conquer them.18

  Scientists developed vaccines to fight age-old killers. Edward Jenner (1749–1823), an English country doctor, had already used a vaccine to prevent smallpox. Nicknamed the “Speckled Monster,” this disease killed its victims outright or left survivors with faces disfigured by deep scars. While making his rounds, Jenner paid special attention to milkmaids, who worked with cows. He noticed that women who got cowpox, a related though nonfatal disease, seemed immune to smallpox. To test his theory, in 1796 Jenner “vaccinated” (from vacca, Latin for “cow”) a local boy, eight years old—that is, he scratched pus from cowpox sores into the boy’s arm. Sure enough, the child developed a low fever but not the fatal or disfiguring smallpox. Several weeks later, to check his theory, Jenner scraped pus from actual smallpox sores into the boy’s arm. Again, a low fever but no disease: the boy was immune.19

  Nineteenth-century scientists followed Jenner’s lead. Between 1879 and 1897, vaccines against cholera, anthrax, rabies, typhoid fever, yellow fever, diphtheria, and bubonic plague became available in Europe and America. But neither Jenner nor anyone else understood how vaccines worked. We will learn the answer in another chapter.

  Nineteenth-century surgical tools. Credit 15

  Surgery also advanced from the mid-1800s onward. Until then, patients dreaded the pain inevitable in any operation. To keep a person from squirming when the surgeon cut, leather straps and strong men held him or her down. The surgeon had to be tough-minded, someone who had schooled himself to work quickly and ignore screams. Novelist Fanny Burney left a chilling account of her breast operation in 1810: “When the dreadful steel was plunged into the breast, I began a scream that lasted…during the whole time of the incision—& I almost marvel that it rings not in my Ears still! So excruciating was the agony.” Burney was lucky; she survived. Many others died of shock, which occurs when the blood fails to circulate properly in the body.20

  Surgery changed dramatically in the 1840s, as ingenious dentists and physicians began to use anesthetics, drugs that put patients into a deep sleep, making them insensitive to pain. The first such drug, nitrous oxide, or “laughing gas,” made patients giddy after inhaling it, then unconscious for a short time—just enough to pull a tooth. Inhaling the fumes of ether and chloroform put them out for longer periods. This allowed for more complicated procedures on parts of the body surgeons had seldom operated on with success: the abdomen, chest, and brain. Anesthesia also became a blessing to women in childbirth. In 1853, for example, England’s Queen Victoria gave birth while under chloroform. “The effect was soothing, quieting & delightful beyond measure,” Her Majesty told her diary.21

  Anesthetics, however, did nothing to prevent infection, the chief cause of surgical deaths. Before Pasteur’s germ theory changed medical thinking, surgery and infection were synonymous. Surgical masks, gowns, and gloves did not exist until the early 1900s. Surgeons wore their street clothes, and only washed their hands after operating.

  In 1918, William W. Keen (1837–1932), America’s first brain surgeon, recalled how, as a young doctor, he operated on wounded Union army soldiers during the Civil War:

  We surgeons in 1861–65 [were] utterly unaware of bacteria and their dangers….May le bon Dieu [the good God] forgive us our sins of ignorance. We operated in old blood-stained and often pus-stained coats, veterans of a hundred fights. We operated with…undisinfected hands. We used undisinfected instruments from undisinfected plush-lined cases, and still worse, used marine sponges which had been used in prior pus cases and had been washed only in tap water. If a sponge or an instrument fell to the floor, it was washed and squeezed in a basin of tap water and used as if it were clean….The silk with which we sewed up all wounds was undisinfected. If there was any difficulty in threading the needle we moistened it with (as we now know) bacteria-laden saliva, and rolled it between bacteria-infected fingers. We dressed the wounds with clean but undisinfected sheets, shirts, tablecloths, or other old soft linen rescued from the family ragbag. We had no sterilized gauze dressing, no gauze sponges….Death was peering over the shoulder of the surgeon, watching for his victim.22

  No wonder patients routinely developed lethal infections after “successful” operations. Nor should we be surprised that when President Abraham Lincoln lay dying with a bullet in his head, doctors probed for it in the usual way. One by one, they poked their fingers into the wound—unwashed fingers with dirt caked under the nails.

  Before the 1840s, amputations were conducted without anesthesia. (Date unknown) Credit 16

  Humanity owes the English surgeon Joseph Lister (1827–1912) a huge debt of gratitude. Inspired by Pasteur’s germ theory of disease, Lister demanded absolute cleanliness in the operating room. At first, he ordered that carbolic acid, made from coal tar, be sprayed to kill airborne bacteria. He soon realized that the surgeon’s hands and instruments swarmed with unseen killers, so he insisted those be sprayed, too. Equally important, he urged that the moment a surgeon took a scalpel to a patient, antiseptics be used to sterilize the wound. “Listerism” became the rage, the success of his methods leading to their adoption by hospitals throughout the Western world. As a result, deaths from infections following surgery fell from 45 to 15 percent. Today, they are near zero.

  Meanwhile, specialized instruments became available to physicians. The stethoscope enabled them to detect abnormalities in the heart and lungs by listening to the sounds these organs produced. The clinical thermometer measured changes in body temperature, while the hypodermic syringe injected medicine into the bloodstream through a hollow needle. X-rays, discovered by the German physicist Wilhelm Roentgen in 1895, allowed physicians to look into patients’ bodies without having to cut them open.

  One of René Laennec’s early stethoscopes. Laennec invented the instrument in 1816. Credit 17

  Aspirin, developed in 1897, reduced fever, becoming the world’s most widely used medicine. Vitamins were discovered in 1906, and the first successful human blood transfusion took place the following year. Milk was “pasteurized,” heated to kill bacteria. Governments, too, acted upon the germ theory, promoting better sanitation, pest control, and water purification. Devices that we in the West now take for granted—like flush toilets, vacuum cleaners, refrigerators, and window screens—began to keep homes cleaner and food fresh. Fashions changed. To avoid bacteria, men shaved their beards and women shortened their dresses so they wouldn’t sweep along the foul streets. Newly created public health services checked age-old killers. Deaths from tuberculosis, for example, which mainly affects the lungs, fell from around one in four Americans in 1800 to one in ten by the start of the twentieth century.23

  Despite its victories, scientific medicine had its opponents. Many devout Christians rejected any teaching that conflicted with the literal text of the Bible. For them, as for people during the Black Death, disease was “a judgment of God on the sins of the people.” Edward Jenner’s discovery of vaccination drew harsh criticism from the pulpit. Clergymen denounced the doctor for having put himself above God. Only the Almighty, they said, sends illness and only the Almighty cures it. Vaccination, critics charged, was “a diabolical operation,” and its inventor was “flying in the face of Providence.” Anti-vaccination societies sprang up on both sides of the Atlantic. In Boston, clergymen railed against vaccination’s “bidding defiance to Heaven itself, even to the will of God.” Cartoonists drew pictures of skeletons vaccinating babies and of snakes labeled VACCINATION hissing at babies and their terrified mothers.24

  A cartoon from a December 1894 anti-vaccination publication. Credit 18

  Some critics insisted that Satan was the true father of anesthesia. Its development, they warned, was part of his unholy plan to turn humanity away from God. The pains of childbirth, they argued, were divine punishment for Adam and Eve’s disobedience in the Garden of Eden. Pain, therefore, was humanity’s lot, an inevitable aspect of divine justice.

  And as late as the 1870s, several prominent physicians challenged the germ theory of disease. The most famous of these was Samuel D. Gross of Philadelphia. Known as “America’s Surgeon” and the “Emperor of American Surgery,” Gross denounced Listerism as quackery. “The demonstration of disease producing germs,” he thundered, “is wanting and I have never found any appreciable benefits from the use of antiseptic dressings.”25

  The Gross Clinic by Thomas Eakins, regarded as one of the finest American paintings. (1875) Credit 19

  Artist Thomas Eakins immortalized the doctor in The Gross Clinic (1875), often called America’s greatest painting. Eakins depicts an operation on a patient’s infected leg bone. Gross’s assistants administer anesthesia and hold the wound open with curved steel tools. Like them, Gross is unmasked, ungloved, and wearing street clothes. He has stepped back, a bloody scalpel in his bloody hand, to lecture students. These are seated in an open gallery overlooking the operating table, breathing their germs toward the gaping wound. Sometimes the eminent surgeon sharpened his scalpel on the sole of his boot, a practice not shown in the painting.26

  Medicine’s Samuel Grosses, however, were a dying breed. Scientific medicine’s best arguments were beyond dispute: it saved lives and alleviated suffering. By 1900, for the first time since cities came into existence, European and American cities recorded more births than deaths. We get a hint of what this meant from a simple name change. People used to call the house parlor the “laying-out room,” because dead family members lay there in their coffins awaiting burial. The parlor now became the “living room,” where the family spent its leisure time and entertained guests.27

  The medical profession had reason for pride in itself and optimism about the future. In one century, it had learned more about disease than in all of human history. The physician had come into his own. (We say “his” because, thus far, few women had become medical doctors.) No longer was he a “quack” or a “sawbones” who hacked off limbs for a living. He was a new kind of priest, a holy man in a white coat, the symbol of purity and expertise. The Medical Times and Hospital Gazette, an English journal, expressed this view in glowing terms. “The good physician is prepared to meet any emergency that may arise,” it stated. “He is a ‘friend in need,’ a tower of strength in the sick room. He is the man upon whom the people have learned to depend when sickness occurs and Death hovers over their dwelling.”28

  The American physician, according to Dr. Victor C. Vaughan, had become a “co-worker with the Creator.” Outlook, a popular American magazine, declared the medical profession God’s instrument, a sacred thing, because “it is not ‘God’s will’ that children should die of diphtheria or young men be destroyed in the flower of their manhood by typhoid fever.” The sky was the limit. “Public health is purchasable,” proclaimed a publication of New York City’s Department of Health. “Within natural limitations a community can determine its own death-rate.” A prominent physician announced, “No one fears a repetition of the ghastly scenes of the Black Death in the fourteenth century.” Obviously, the conquest of disease was just a matter of will, time, knowledge, and money.29

  Nevertheless, the medical profession had no idea that it was about to face a stunning crisis. Though it had achieved much, it had not reckoned on the perfect storm that seemed to come out of the blue. In 1918, war and influenza joined forces to ignite history’s worst-ever health disaster.

  The history of war has always been a history of epidemics.

  —American Journal of Public Health editorial, April 1918

  BREWING THE PERFECT STORM

  Well-clothed, well-fed military officers plan wars in the safety and comfort of headquarters far from future battlefields. Cautious and meticulous, they try to think of every move and circumstance in advance. There is, however, a problem. What former heavyweight champion Mike Tyson once said about boxing also applies to the military: “Everyone has a plan until they get punched in the mouth.” Despite officers’ best efforts, war plans go awry the moment fighting begins. The reason is simple: the human mind is not omniscient; it cannot foresee everything that can happen. Stupidity, bad information, enemy actions, and dumb luck all play a role in the outcome of battle. Thus, armies, like prizefighters, must quickly adapt to changing conditions or face defeat.

  The war that began in July 1914 came as a shock to everyone. Both the Allied and the German high commands had planned their opening moves in minute detail. Optimists, both sides expected a fast-paced war of movement. There would be attacks by masses of soldiers advancing shoulder to shoulder across open fields. By sheer weight of numbers, each side thought, its army would rout the enemy and seize his capital, gaining victory in a blaze of glory. “On to Berlin!” cried Allied troops as they boarded trains for the front; “On to Paris!” shouted German troops. Few expected the war to last beyond Christmas. By the time Santa Claus arrived with his goodies, Daddy would surely be home, safe and sound. As a popular song told the folks in England, “Keep the home fires burning.”

 
