This is one of those places where our basic sensibilities deviate from the sensibilities of our nineteenth-century ancestors. They look and act like modern people in many ways: they take trains and schedule meetings and eat in restaurants. But every now and then, strange gaps open between us and them, not just the obvious gaps in technological sophistication, but more subtle, conceptual gaps. In today’s world, we think of hygiene in fundamentally different ways. The concept of bathing, for instance, was alien to most nineteenth-century Europeans and Americans. You might naturally assume that taking a bath was a foreign concept simply because people didn’t have access to running water and indoor plumbing and showers the way most of us in the developed world do today. But, in fact, the story is much more complicated than that. In Europe, starting in the Middle Ages and running almost all the way to the twentieth century, the prevailing wisdom on hygiene maintained that submerging the body in water was a distinctly unhealthy, even dangerous thing. Clogging one’s pores with dirt and oil allegedly protected you from disease. “Bathing fills the head with vapors,” a French doctor advised in 1655. “It is the enemy of the nerves and ligaments, which it loosens, in such a way that many a man never suffers from gout except after bathing.”
You can see the force of this prejudice most clearly in the accounts of royalty during the 1600s and 1700s—in other words, the very people who could afford to have baths constructed and drawn for them without a second thought. Elizabeth I bothered to take a bath only once a month, and she was a veritable clean freak compared to her peers. As a child, Louis XIII was not bathed once until he was seven years old. Sitting naked in a pool of water was simply not something civilized Europeans did; it belonged to the barbaric traditions of Middle Eastern bathhouses, not the aristocracy of Paris or London.
Slowly, starting in the early nineteenth century, the attitudes began to shift, most notably in England and America. Charles Dickens built an elaborate cold-water shower in his London home, and was a great advocate for the energizing and hygienic virtues of a daily shower. A minor genre of self-help books and pamphlets emerged, teaching people how to take a bath, with detailed instructions that seem today as if they are training someone to land a 747. One of the first steps Professor Higgins takes in reforming Eliza Doolittle in George Bernard Shaw’s Pygmalion is getting her into a tub. (“You expect me to get into that and wet myself all over?” she protests. “Not me. I should catch my death.”) Harriet Beecher Stowe and her sister Catharine Beecher advocated a daily wash in their influential handbook, The American Woman’s Home, published in 1869. Reformers began building public baths and showers in urban slums around the country. “By the last decades of the century,” the historian Katherine Ashenburg writes, “cleanliness had become firmly linked not only to godliness but also to the American way.”
Poster issued by the Central Council for Health Education (1927–1969), 1955
The virtues of washing oneself were not self-evident, the way we think of them today. They had to be discovered and promoted, largely through the vehicles of social reform and word of mouth. Interestingly, there is very little discussion of soap in the popular embrace of bathing in the nineteenth century. It was hard enough just to convince people that the water wasn’t going to kill them. (As we will see, when soap finally hit the mainstream in the twentieth century, it would be propelled by another new convention: advertising.) But the evangelists for bathing were supported by the convergence of several important scientific and technological developments. Advances in public infrastructure meant that people were much more likely to have running water in their homes to fill their bathtubs, and that the water was cleaner than it had been a few decades earlier; most important of all, the germ theory of disease had gone from fringe idea to scientific consensus.
This new paradigm had been achieved through two parallel investigations. First, there was the epidemiological detective work of John Snow in London, who first proved that cholera was caused by contaminated water and not miasmatic smells, by mapping the deaths of a Soho epidemic. Snow never managed to see the bacteria that caused cholera directly; the technology of microscopy at the time made it almost impossible to see organisms (Snow called them “animalcules”) that were so small. But he was able to detect the organisms indirectly, in the patterns of death on the streets of London. Snow’s waterborne theory of disease would ultimately deliver the first decisive blow to the miasma paradigm, though Snow himself didn’t live to see his theory triumph. After his untimely death in 1858, The Lancet ran a terse obituary that made no reference whatsoever to his groundbreaking epidemiological work. In 2014, the publication ran a somewhat belated “correction” to the obit, detailing the London doctor’s seminal contributions to public health.
John Snow’s cholera map of Soho
The modern synthesis that would come to replace the miasma hypothesis—that diseases such as cholera and typhoid are caused not by smells but by invisible organisms that thrive in contaminated water—was ultimately dependent, once again, on an innovation in glass. The German lens crafters Zeiss Optical Works began producing new microscopes in the early 1870s—devices that for the first time had been constructed around mathematical formulas that described the behavior of light. These new lenses enabled the microbiological work of scientists such as Robert Koch, one of the first scientists to identify the cholera bacterium. (After receiving the Nobel Prize for his work in 1905, Koch wrote to Carl Zeiss, “A large part of my success I owe to your excellent microscopes.”) With his great rival Louis Pasteur, Koch and his microscopes helped develop and evangelize the germ theory of disease. From a technological standpoint, the great nineteenth-century breakthrough in public health—the knowledge that invisible germs can kill—was a kind of team effort between maps and microscopes.
Today, Koch is rightly celebrated for the numerous microorganisms that he identified through those Zeiss lenses. But his research also led to a related breakthrough that was every bit as important, though less widely appreciated. Koch didn’t just see the bacteria; he also developed sophisticated tools to measure the density of bacteria in a given quantity of water. He mixed contaminated water with transparent gelatin and viewed the growing bacterial colonies on a glass plate. Koch established a unit of measure that could be applied to any quantity of water: fewer than 100 colonies per milliliter was considered safe to drink.
New ways of measuring create new ways of making. The ability to measure bacterial content allowed a completely new set of approaches to the challenges of public health. Before the adoption of these units of measurement, you had to test improvements to the water system the old-fashioned way: you built a new sewer or reservoir or pipe, and you sat around and waited to see if fewer people would die. But being able to take a sample of water and determine empirically whether it was free of contamination meant that cycles of experimentation could be tremendously accelerated.
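To make Koch’s arithmetic concrete, here is a minimal sketch (in Python) of how a plate count translates into a pass/fail verdict against his 100-colonies-per-milliliter rule. Only that threshold comes from the text; the sample names, counts, and dilution factors below are invented for illustration.

```python
# A sketch of Koch-style plate counting reduced to arithmetic.
# Only the 100-per-milliliter threshold comes from the text; every sample
# value and dilution factor below is hypothetical.

SAFE_LIMIT_PER_ML = 100  # Koch's rule of thumb for drinkable water

def colonies_per_ml(colony_count, volume_plated_ml, dilution_factor=1.0):
    """Estimate bacterial density in the original water sample."""
    return colony_count * dilution_factor / volume_plated_ml

# Hypothetical samples: (colonies counted, mL plated, dilution factor)
samples = {
    "reservoir intake": (240, 1.0, 10.0),  # heavily contaminated
    "treated outflow":  (35, 1.0, 1.0),    # lightly contaminated
}

for name, (count, volume, dilution) in samples.items():
    density = colonies_per_ml(count, volume, dilution)
    verdict = "safe" if density < SAFE_LIMIT_PER_ML else "unsafe"
    print(f"{name}: {density:.0f} colonies/mL -> {verdict}")
```

The point is less the code than the shift it represents: once contamination has a number attached to it, a sample of water can be judged in a day rather than by waiting to see who falls ill.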
Microscopes and measurement quickly opened a new front in the war on germs: instead of fighting them indirectly, by routing the waste away from the drinking water, new chemicals could be used to attack the germs directly. One of the key soldiers on this second front was a New Jersey doctor named John Leal. Like John Snow before him, Leal was a doctor who treated patients but who also had a passionate interest in wider issues of public health, particularly those concerning contaminated water supplies. It was an interest born of a personal tragedy: his father had suffered a slow and painful death from drinking bacteria-infested water during the Civil War. His father’s experience in the war gives us a compelling statistical portrait of the threat posed by contaminated water and other health risks during this period. Nineteen men in the 144th Regiment died in combat, while 178 died of disease during the war.
Leal experimented with many techniques for killing bacteria, but one poison in particular began to pique his interest as early as 1898: calcium hypochlorite, the potentially lethal chemical that is better known as chlorine, also known at the time as “chloride of lime.” The chemical had already been in wide circulation as a public health remedy: houses and neighborhoods that had suffered an outbreak of typhoid or cholera were routinely disinfected with the chemical, an intervention that did nothing to combat waterborne disease. But the idea of putting chlorine in water had not yet taken hold. The sharp, acrid smell of chloride of lime was indelibly associated with epidemic disease in the minds of city dwellers throughout the United States and Europe. It was certainly not a smell that one wanted to detect in one’s drinking water. Most doctors and public health authorities rejected the approach. One noted chemist protested: “The idea itself of chemical disinfection is repellent.” But armed with tools that enabled him to both see the pathogens behind diseases such as typhoid and dysentery and measure their overall presence in the water, Leal became convinced that chlorine—at the right dosage—could rid water of dangerous bacteria more effectively than any other means, without any threat to the humans drinking it.
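As a rough illustration of what “the right dosage” means in practice, the sketch below works out how much chloride of lime would be needed to hold a trace concentration of chlorine across a large daily flow of water. The target dose, the available-chlorine fraction, and the flow figure are all assumptions chosen for the example, not values reported in the text.

```python
# Back-of-the-envelope dosing arithmetic: a tiny target concentration of
# chlorine, multiplied across an enormous daily flow. All numbers here are
# illustrative assumptions, not historical figures.

TARGET_CHLORINE_MG_PER_L = 0.2      # assumed trace dose, far below harmful levels
AVAILABLE_CHLORINE_FRACTION = 0.33  # assumed chlorine content of chloride of lime
LITRES_PER_US_GALLON = 3.785

def daily_chloride_of_lime_kg(flow_million_gallons):
    """Kilograms of chloride of lime needed per day to hit the target dose."""
    litres = flow_million_gallons * 1_000_000 * LITRES_PER_US_GALLON
    chlorine_mg = litres * TARGET_CHLORINE_MG_PER_L
    return chlorine_mg / AVAILABLE_CHLORINE_FRACTION / 1_000_000  # mg -> kg

# e.g. a hypothetical 40-million-gallon daily draw
print(f"{daily_chloride_of_lime_kg(40):.0f} kg of chloride of lime per day")
```

Run with these assumptions, the calculation lands in the range of a hundred kilograms of powder per day to treat tens of millions of gallons of water: the scale of problem a feed facility like the one at Boonton had to solve continuously, without ever overshooting.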
Eventually, Leal landed a job with the Jersey City Water Supply Company, giving him oversight of seven billion gallons of drinking water in the Passaic River watershed. This new job set the stage for one of the most bizarre and daring interventions in the history of public health. In 1908, the company was immersed in a prolonged legal battle over contracts (worth hundreds of millions of dollars in today’s money) for reservoirs and water-supply pipes they had recently completed. The judge in the case had criticized the firm for not supplying water that was “pure and wholesome” and ordered them to construct expensive additional sewer lines designed to keep pathogens out of the city’s drinking water. But Leal knew the sewer lines would be limited in their effectiveness, particularly during big storms. And so he decided to put his recent experiments with chlorine to the ultimate test.
In almost complete secrecy, without any permission from government authorities (and no notice to the general public), Leal decided to add chlorine to the Jersey City reservoirs. With the help of engineer George Warren Fuller, Leal built and installed a “chloride of lime feed facility” at the Boonton Reservoir outside Jersey City. It was a staggering risk, given the popular opposition to chemical filtering at the time. But the court rulings had severely limited his timeline, and he knew that lab tests would be meaningless to a lay audience. “Leal did not have time for a pilot study. He certainly did not have time to build a demonstration-scale facility to test the new technology,” Michael J. McGuire writes in his account, The Chlorine Revolution. “If the chloride of lime feed system lost control of the amount of chemical being fed and a slug of high chlorine residual was delivered to Jersey City, Leal knew that would define the failure of the process.”
Cholera victim
It was the first mass chlorination of a city’s water supply in history. Once word got out, however, it initially seemed as though Leal was a madman or some kind of terrorist. Drinking a few glasses of calcium hypochlorite could kill you, after all. But Leal had done enough experiments to know that very small quantities of the compound were harmless to humans but lethal to many forms of bacteria. Three months after his experiment, Leal was called to appear in court to defend his actions. Throughout his interrogation, he stood strong in defense of his public health innovation:
Q: Doctor, what other places in the world can you mention in which this experiment has been tried of putting this bleaching powder in the same way in the drinking water of a city of 200,000 inhabitants?
A: 200,000 inhabitants? There is no such place in the world, it has never been tried.
Q: It never has been.
A: Not under such conditions or under such circumstances but it will be used many times in the future, however.
Q: Jersey City is the first one?
A: The first to profit by it.
Q: Jersey City is the first one used to prove whether your experiment is good or bad?
A: No, sir, to profit by it. The experiment is over.
Q: Did you notify the city that you were going to try this experiment?
A: I did not.
Q: Do you drink this water?
A: Yes sir.
Q: Would you have any hesitation about giving it to your wife and family?
A: I believe it is the safest water in the world.
Cholera warning, 1866
—
ULTIMATELY THE COURT CASE was settled with near-complete victory for Leal. “I do therefore find and report,” the special master in the case wrote, “that this device is capable of rendering the water delivered to Jersey City, pure and wholesome … and is effective in removing from the water … dangerous germs.” Within a few years, the data supporting Leal’s daring move had become incontrovertible: communities such as Jersey City that enjoyed chlorinated drinking water saw dramatic decreases in waterborne diseases like typhoid fever.
At one point in Leal’s cross-examination during the Jersey City trial, the prosecuting attorney accused John Leal of seeking vast financial rewards from his chlorine innovation. “And if the experiment turned out well,” he sneered, “why, you made a fortune.” Leal interrupted him from the witness box with a shrug, “I don’t know where the fortune comes in; it is all the same to me.” Unlike others, Leal made no attempt to patent the chlorination technique that he had pioneered at the Boonton Reservoir. His idea was free to be adopted by any water company that wished to provide its customers with “pure and wholesome” water. Unencumbered by patent restrictions and licensing fees, municipalities quickly adopted chlorination as a standard practice, across the United States and eventually around the world.
About a decade ago, two Harvard professors, David Cutler and Grant Miller, set out to ascertain the impact of chlorination (and other water filtration techniques) between 1900 and 1930, the period when they were implemented across the United States. Because extensive data existed for rates of disease and particularly infant mortality in different communities around the country, and because chlorination systems were rolled out in a staggered fashion, Cutler and Miller were able to get an extremely accurate portrait of chlorine’s effect on public health. They found that clean drinking water led to a 43 percent reduction in total mortality in the average American city. Even more impressive, chlorine and filtration systems reduced infant mortality by 74 percent, and child mortality by almost as much.
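The reason the staggered rollout matters is that late-adopting cities act as a control group for early adopters over the same years. A toy sketch of that comparison, with invented city names and mortality rates (and nothing like the actual estimation Cutler and Miller performed), looks like this:

```python
# Toy illustration of why a staggered rollout helps: the change in a city that
# had not yet chlorinated captures everything else improving at the same time
# (nutrition, income, other sanitation), so subtracting it isolates the effect
# of chlorination. All names and numbers are invented.

# Mortality per 1,000 residents: (before, after)
early_city = (18.0, 11.0)  # chlorinated between the two observations
late_city = (17.5, 15.5)   # had not yet chlorinated

treated_change = early_city[1] - early_city[0]  # -7.0
control_change = late_city[1] - late_city[0]    # -2.0

effect = treated_change - control_change        # -5.0
print(f"Estimated effect of chlorination: {effect:+.1f} deaths per 1,000")
```

Multiplied across hundreds of city-year observations, comparisons of this kind are what let the two researchers attribute specific shares of the mortality decline to clean water rather than to the era’s many other improvements.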
It is important to pause for a second to reflect on the significance of those numbers, to take them out of the dry domain of public health statistics and into the realm of lived experience. Until the twentieth century, one of the givens of being a parent was that you faced a very high likelihood that at least one of your children would die at an early age. What may well be the most excruciating experience that we can confront—the loss of a child—was simply a routine fact of existence. Today, in the developed world at least, that routine fact has been turned into a rarity. One of the most fundamental challenges of being alive—keeping your children safe from harm—was dramatically lessened, in part through massive engineering projects, and in part through the invisible collision between compounds of calcium hypochlorite and microscopic bacteria. The people behind that revolution didn’t become rich, and very few of them became famous. But they left an imprint on our lives that is in many ways more profound than the legacy of Edison or Rockefeller or Ford.
Chlorination wasn’t just about saving lives, though. It was also about having fun. After World War I, ten thousand chlorinated public baths and pools opened across America; learning how to swim became a rite of passage. These new aquatic public spaces were the leading edge in challenges to the old rules of public decency during the period between the wars. Before the rise of municipal pools, women bathers generally dressed as though they were bundled up for a sleigh ride. By the mid-1920s, women began exposing their legs below the knee; one-piece suits with lower necklines emerged a few years later. Open-backed suits, followed by two-piece outfits, followed quickly in the 1930s. “In total, a woman’s thighs, hip line, shoulders, stomach, back and breast line all become publicly exposed between 1920 and 1940,” the historian Jeff Wiltse writes in his social history of swimming, Contested Waters. We can measure the transformation in terms of simple material: at the turn of the century, the average woman’s bathing suit required ten yards of fabric; by the end of the 1930s, one yard was sufficient. We tend to think of the 1960s as the period when shifting cultural attitudes led to the most dramatic change in everyday fashion, but it is hard to rival the rapid-fire unveiling of the female body that occurred between the wars. Of course, it is likely that women’s fashion would have found another route to exposure without the rise of swimming pools, but it seems unlikely that it would have happened as quickly as it did. No doubt exposing the thighs of female bathers was not in the forefront of John Leal’s mind as he dumped his chlorine into the Jersey City reservoir, but like the hummingbird’s wing, a change in one field triggers a seemingly unrelated change at a different order of existence: a trillion bacteria die at the hands of calcium hypochlorite, and somehow, twenty years later, basic attitudes toward exposing the female body are reinvented. As with so many cultural changes, it’s not that the practice of chlorination single-handedly transformed women’s fashion; many social and technological forces converged to make those bathing suits smaller: various strands of early feminism, the fetishizing gaze of the Hollywood camera, not to mention individual stars who wore those more revealing suits. But without the mass adoption of swimming as a leisure activity, those fashions would have been deprived of one of their key showcases. What’s more, those other explanations—as valid as they are—usually get all the press. Ask your average person on the street what factors drive women’s fashion, and they’ll inevitably point to Hollywood or glossy magazines. But they won’t often mention calcium hypochlorite.
—
THROUGH THE NINETEENTH CENTURY, the march of clean technologies had largely unfolded on the terrain of public health: big engineering projects, mass filtration systems. But the story of hygiene in the twentieth century is a much more intimate affair. Just a few years after Leal’s bold experiment, five San Francisco entrepreneurs invested a hundred dollars each to launch a chlorine-based product. With hindsight it seems like a good idea, but their bleach business was aimed at big industry, and sales didn’t develop as quickly as they had hoped. But the wife of one of the investors, Annie Murray, a shop owner in Oakland, California, had an idea: chlorine bleach could be a revolutionary product for people’s homes as well as factories. At Murray’s insistence, the company created a weaker version of the chemical and packaged it in smaller bottles. Murray was so convinced of the product’s promise that she gave out free samples to all her shop customers. Within months, bottles were selling like crazy. Murray didn’t realize it at the time, but she was helping to invent an entirely new industry. Annie Murray had created America’s first commercial bleach for the home, and the first in a wave of cleaning brands that would become ubiquitous in the new century: Clorox.