Scenes like this were playing out all over Britain. Mortality rates within hospitals had reached an all-time high by the 1860s. Efforts to clean up the wards had made little impact on the incidence of hospitalism. What’s more, over the previous several years there had been growing disagreement within the medical community over prevailing disease theories.
Cholera, in particular, had become increasingly difficult to explain within a miasmic paradigm. There had already been three major outbreaks in recent decades that had claimed the lives of nearly 100,000 people in England and Wales alone. The disease was running rampant throughout Europe, creating in its wake a medical, political, and humanitarian crisis that could not be ignored. Although non-contagionists could point to the fact that outbreaks often occurred in filthy urban areas, they could not explain why cholera had followed lines of human communication as it spread from the Indian subcontinent, nor could they resolve why some outbreaks occurred during the winter when bad smells were minimal.
Back in the late 1840s, a physician from Bristol named William Budd argued that the disease was spread by contaminated sewage carrying “a living organism of a distinct species, which was taken by the act of swallowing it, which multiplied in the intestine by self propagation.” In an article published in the British Medical Journal, Budd wrote that “there was no proof whatever” that “the poisons of specific contagious diseases ever originate spontaneously” or were transmitted through the air via miasma. During a later outbreak, he prioritized disinfecting measures with an antiseptic, advising, “All discharges from the sick men to be received, on their issue from the body, if possible, into vessels containing a solution of chloride of zinc.”
Budd wasn’t the only one to question the spontaneous origin and aerial transmission of cholera’s spread. The surgeon John Snow also began investigating the matter when a deadly outbreak occurred near his house in Soho, London, in 1854. Snow started plotting cases on a map, and that was when he noticed that a majority of people who fell ill were receiving their water from a pump on the southwest corner of the intersection of Broad (now Broadwick) Street and Cambridge (now Lexington) Street. Even cases that were at first glance unconnected with the pump turned out to be associated with it after all, such as that of the fifty-nine-year-old woman who lived quite a distance from the water supply. When Snow interviewed her son, he was told that his mother often visited Broad Street because she preferred the taste of water from that particular pump. She was dead within two days of drinking from the supply.
Like Budd, Snow concluded that cholera was transmitted through contaminated water supplies, not by poisonous gases or miasmas in the air. He published a map of the epidemic to support his theory. Despite strong skepticism from the local authorities, Snow was able to persuade them to remove the handle from the Broad Street pump, after which the outbreak quickly subsided.
Incidents like these began to call into question the predominant belief within the medical community that disease arose from filth and was transmitted through the air by noxious gases, or miasmas. More proof came in 1858 when a terrible, inescapable stench crept through the city of London, pervading every nook and cranny within a mile of the river Thames. The scorching summer heat intensified the foul odor. People went out of their way to avoid coming in contact with the river. “The Great Stink” arose from human excrement piled along the riverbanks—a problem that had been growing worse as London became more and more populated. As the scientist Michael Faraday, famed for his work on electromagnetism, observed, “The feculence rolled up in clouds so dense that they were visible at the surface.” As he sailed down the river one afternoon, he noted that the water was an “opaque pale brown fluid.” The smell was so bad that members of Parliament had to cover their windows with heavy cloth just so they could continue working. The Times reported that government officials, “bent upon investigating the matter to its very depth, ventured into the library, [where they were] instantaneously driven to retreat, each man with a handkerchief to his nose.”
Londoners assumed that the “poisonous effluvia” (that is, miasma) arising from the water would lead to an outbreak of disease in the city. There were even rumors that a boatman had already died from inhaling the noxious vapors. Thousands fled the city in fear for their lives. After years of trying to secure funding for a new sewer system in London, hygiene reformers thought it would be poetic justice if Parliament was finally compelled to act because of its own suffering. And yet, strangely, no epidemics occurred that summer.
There was a perceptible shift away from miasma and toward contagion theories in the 1850s and 1860s, due in part to these events. Many doctors, however, remained unconvinced. Snow’s investigations in particular still didn’t suggest a plausible mechanism for the transmission of the disease. His conclusions correlated cholera with contaminated drinking water. But, like other contagionists, Snow didn’t explicitly state what it was that was being transmitted through that water. Was it an animalcule? Or a poisonous chemical? If the latter, wouldn’t it be infinitely diluted in large bodies of water like the river Thames? What’s more, Snow himself acknowledged that contagionism didn’t provide a satisfactory explanation of all diseases, and he continued to allow for the possibility of spontaneous generation in the development of diseases that caused putrefaction, like erysipelas.
The voices crying out for a better explanation of contagious and epidemic diseases were growing ever louder.
* * *
The problem of hospital infection had vexed Lister for so long that he wondered if he would ever find a solution to it. But since his conversation with Professor Anderson about Pasteur’s latest research on fermentation, he felt a renewed optimism. Lister immediately sought out Pasteur’s publications on the decomposition of organic material, and with the help of Agnes he began replicating the French scientist’s experiments in his laboratory at home. For the first time, the answer was within his reach.
The research with which Lister was familiarizing himself began nine years earlier when a local wine merchant approached Pasteur with a problem. Monsieur Bigo had been manufacturing wine from beetroot juice when he noticed that a large number of his vats were turning sour while they fermented. Pasteur was then dean of the Faculty of Sciences at Lille University. His reputation as a brilliant chemist had been established years earlier when he demonstrated that a crystal’s shape, its molecular structure, and its effect on polarized light were all interconnected. He soon formed the view that only living agents could produce optically active asymmetric compounds and that further study of molecular asymmetry would unlock the secrets to the origin of life.
But why would Bigo consult a chemist with his problems? At the time, fermentation was thought to be a chemical process rather than a biological one. Although many scientists recognized that yeast acted as a catalyst in the conversion of sugar to alcohol, most believed that yeast was a complex chemical substance. Bigo had become acquainted with Pasteur’s work because his son was one of Pasteur’s students at the university. So it seemed only natural to Bigo that he should turn to the chemist for help.
In fact, Pasteur had his own reasons for wanting to investigate the causes behind the spoiled vats of wine. For some time, he had been interested in the nature of amyl alcohol, which he discovered was a “complex milieu composed of two isomers; one, which … rotates the plane of light under the polarimeter; the other, which is inactive [and] has no optical activity.” Moreover, the former contained the same asymmetrical characteristics that Pasteur had shown could only arise from living agents. Beetroot juice contained a mixture of both the inactive and the active amyl alcohols and therefore presented Pasteur with a unique opportunity to study the two isomers under different conditions.
Pasteur began making daily visits to the winery, where he eventually transformed the cellar into a makeshift laboratory. Like Bigo, he noticed that some batches of wine smelled fine, while others emanated an almost putrid odor. These vats were also covered in a mysterious film. Puzzled by this, Pasteur took samples from each of the vats and examined them under his microscope. Much to his surprise, he discovered that the shape of the yeast was different depending on the sample. If the wine was unspoiled, the yeast was round. If it was corrupted, the yeast was elongated and appeared alongside other, smaller, rod-shaped structures: bacteria. A biochemical analysis of the spoiled batches also revealed that under the wrong conditions, hydrogen attached itself to the nitrates in the beetroot, producing lactic acid, which gave off the fetid odor and made the wine taste sour.
Crucially, Pasteur was able to show that the amyl alcohol that was optically active had arisen as a result of the yeast, not from the sugar, as some scientists had previously argued. He did this by demonstrating that, when measured under a polarimeter, the amyl alcohol differed too greatly from sugar, a nonliving agent, to have inherited its asymmetry from it. And because Pasteur believed that life alone was responsible for asymmetry, he had to conclude that fermentation was a biological process and that the yeast that helped produce wine was a living organism.
Pasteur’s opponents pointed out that yeast was not required in sugar fermentations that produced lactic or butyric acid, nor was it possible to see yeast organisms in putrefying meat. But it wasn’t the yeast that was responsible for the spoiling of the vats; rather, it was bacteria (the rod-shaped microbes) that caused the wine to go bad. In a similar vein, Pasteur also demonstrated that the same was true in sour milk and rancid butter, though the microbes responsible in each case were different from one another. There seemed to be a specificity to the properties of the microbes that he was observing under the microscope.
Pasteur’s conclusions were bold. To say that the yeast acted on the beetroot juice because it was a living organism was to go against the very tenets of mainstream chemistry in the mid-nineteenth century. While the guardians of the old paradigm were willing to accept the presence of microorganisms in fermentable substances, they only did so on the basis that these microorganisms arose spontaneously as part of the fermenting process. Pasteur, in contrast, believed that these microbes were carried through the air on dust particles and were not born of themselves. They did not come into existence de novo.
In a series of experiments, Pasteur boiled fermentable substances to rid them of any existing microorganisms. He then placed these substances in two different kinds of flasks. The first was an ordinary flask with an open top. The second had a neck shaped like an S that prevented dust and other particles from entering the flask. This flask also remained open and exposed to the air. After a certain amount of time, the first flask began to teem with microbial life, while the swan-neck flask remained uncontaminated. From these experiments, Pasteur finally proved that microbes were not generated spontaneously; otherwise, the flask with the curved neck would have become contaminated as well. His experiments established what is now considered one of the cornerstones of biology: Only life begets life. In a speech on his findings delivered at the Sorbonne, Pasteur said, “Never will the doctrine of spontaneous generation recover from the mortal blow of this simple experiment.” It wasn’t long before the word “germ” was being used to describe these protean microbes.
In an instant, Pasteur went from being a serious chemist held in esteem by most of the scientific community to being considered a maverick for his championing of what he called “the world of the infinitely small.” His research, which threatened to topple long-established views of how the world worked, immediately came under attack. The newspaper La Presse passed damning judgment on the French scientist: “I am afraid that the experiments you quote, M. Pasteur, will turn against you.… The world into which you wish to take us is really too fantastic.”
* * *
Undeterred, Pasteur began to make connections between fermentation and putrefaction. “The applications of my ideas are immense,” he wrote in 1863. “I am ready to approach the great mystery of the putrid diseases, which constantly occupy my mind.” There was good reason for Pasteur to be so preoccupied with the subject of infectious diseases: between 1859 and 1865, three of his daughters died of typhoid fever.
Pasteur believed that putrefaction, like fermentation, was also caused by the growth of minute microorganisms that were carried through the air by dust. “Life directs the work of death at every stage,” he wrote. There was just one problem, however. Pasteur was not a physician, a point he lamented as his research into the matter progressed: “How I wish I had … the special knowledge I need to launch myself wholeheartedly into the experimental study of one of the contagious diseases.” Fortunately for Pasteur, his work had already begun to attract the attention of a select few within the medical community, like Sir Thomas Spencer Wells, surgeon to Queen Victoria.
Wells spoke of Pasteur’s latest work on fermentation and putrefaction in an address to the British Medical Association in 1863, a year before it came to Lister’s attention. In it, Wells argued that Pasteur’s research on the decomposition of organic material shed light on the causes of putrid infections: “[By] applying the knowledge for which we are indebted to Pasteur of the presence in the atmosphere of organic germs … it is easy to understand that some germs find their most appropriate nutriment in the secretions from wounds, or in pus, and that they so modify it as to convert it into a poison when absorbed.” Unfortunately, Wells failed to make the impact he had hoped for at the convention. His peers were not convinced of the existence of germs, and like others who read Pasteur’s work, Wells made no real attempt to put the germ theory of putrefaction into practice.
Lister picked up the baton. Initially, he focused on the parts of Pasteur’s research that confirmed for him a view he already held: that the danger was indeed present in the air around the patient. Like Wells, Lister took away from Pasteur’s work the idea that it wasn’t the air as such but the microbial life it carried that was the source of hospital infection. In those early days, he probably thought that the contamination of the air and the infection of the wound were attributable to the invasion of a single organism. Lister could not yet conceive of the vast number of airborne germs and their varying degrees of virulence, nor did he understand that germs could be transmitted in many different ways and by many different media.
Lister came to the vital realization that he couldn’t prevent a wound from having contact with germs in the atmosphere. So he turned his attention to finding a means of destroying microorganisms within the wound itself, before infection could set in. Pasteur had conducted a number of experiments that demonstrated that germs could be destroyed in three ways: by heat, by filtration, or by antiseptics. Lister ruled out the first two because neither was applicable to the treatment of wounds. Instead, he focused on finding the most effective antiseptic for killing germs without causing further injury: “When I read Pasteur’s article, I said to myself: just as we can destroy lice on the nit-filled head of a child by applying a poison that causes no lesion to the scalp, so I believe that we can apply to a patient’s wounds toxic products that will destroy the bacteria without harming the soft parts of this tissue.”
Surgeons had been using antiseptics to irrigate wounds for some time. The problem was that there was no consensus among doctors on what caused sepsis, and generally these substances were used to control suppuration only after infection had set in. Around this time, The Lancet reported, “It was a great part of the care of the old practitioners to avert inflammation and … to treat it. We are not so fearful of it now. Blood-poisoning is to surgeons of the present day as great a source of dread as inflammation was to their predecessors, and is a far larger and more real evil.” While blood poisoning is indeed far more dangerous than inflammation, the medical journal had it the wrong way around: inflammation, and the suppuration that often accompanies it, is frequently a symptom of blood poisoning, or septicemia. Inflammation is not a disease in and of itself, and it often signifies that something more sinister is going on. Until this distinction was made, it was difficult for surgeons to understand the rationale behind using antiseptics before infection set in, especially because many in the medical community believed inflammation and pus were integral parts of the healing process. Good, clean, and limited “laudable pus” was thought necessary for normal wound healing, but excessive or contaminated pus was seen as a dangerous medium for putrefaction.
Complicating matters was the fact that many antiseptic substances also proved ineffective or caused further damage to the tissue, thus making the wound even more vulnerable to infection. Everything from wine and quinine to iodine and turpentine had been used to treat infected wounds, but none proved consistently effective in stopping putrid suppuration once it had already begun. Corrosive substances like nitric acid, which could effectively combat putrefactive infection, were often diluted too much to be of any real use.
In the first few months of 1865, Lister tested many antiseptic solutions while trying to find the best one to counteract the microbes that he now understood were the cause of hospital infections. Most of these had a bad track record, possibly because they had only been employed after inflammation and suppuration had already set in. Lister wanted to test their efficacy by using these solutions prophylactically. He turned first to one of the most popular substances at this time, called Condy’s fluid, or potassium permanganate, which was also used as a flash powder by early photographers. Lister tested Condy’s fluid on a patient shortly after an operation, before infection had a chance to set in. His dresser, Archibald Malloch, wrote that he “held the limb in one hand, and the flaps, from which all the stitches had been cut out, in the other, while Mr. Lister poured kettle after kettleful of hot diluted Condy’s fluid between the flaps to cleanse them; the stumps being finally covered with a linseed poultice.” Despite the solution’s containing a strong oxidizing agent that could act as an antiseptic, the wound eventually began to suppurate. Lister wasn’t achieving the results he sought and abandoned his trial.