Clorox bottles became so commonplace that the remnants our grandparents left behind are used by archaeologists to date dig sites today. (The glass pint bottle of chlorine bleach is to the early twentieth century what spear tips are to the Iron Age or colonial pottery is to the eighteenth century.) It was accompanied by other bestselling hygiene products for the home: Palmolive soap, Listerine, a popular antiperspirant named Odorono. Hygiene products like these were among the first to be promoted in full-page advertisements in magazines and newspapers. By the 1920s, Americans were being bombarded by commercial messages warning them that they faced certain humiliation if they didn’t do something about the germs on their bodies or in their homes. (The phrase “often a bridesmaid, never a bride” originated in a 1925 Listerine advertisement.) When radio and television began experimenting with storytelling, it was the personal-hygiene companies that once again led the way in pioneering new forms of advertising, a marketing move so successful that it still lingers with us today in the phrase “soap opera.” This is one of the stranger hummingbird effects of contemporary culture: the germ theory of disease may have reduced infant mortality to a fraction of its nineteenth-century levels, and made surgery and childbirth far safer than they had been in Semmelweis’s day. But it also played a crucial role in inventing the modern advertising business.
Today the cleaning business is worth an estimated $80 billion. Walk into a big-box supermarket or drugstore and you will find hundreds, if not thousands, of products devoted to ridding our households of dangerous germs: cleaning our sinks, our toilets, our floors and silverware, our teeth and our feet. These stores are effectively giant munitions depots for the war on bacteria. Naturally, some feel that our obsession with cleanliness has now gone too far. Some research suggests that our ever-cleaner world may actually be linked to rising rates of asthma and allergies, as children’s immune systems now develop without exposure to the full diversity of germs.
Clorox advertisement
—
THE CONFLICT BETWEEN MAN and bacteria that played out over the past two centuries has had far-reaching consequences: from the trivial pursuits of swimwear fashion all the way to the existential improvements of lowered infant mortality rates. Our growing understanding of the microbial routes of disease enabled cities to burst through the population ceilings that had constrained them for the entirety of human civilization. As of 1800, no society had successfully built and sustained a city of more than two million people. The first cities to challenge that barrier (London and Paris, followed shortly by New York) had suffered mightily from the diseases that erupted when that many people shared such a small amount of real estate. Many reasonable observers of urban life in the middle of the nineteenth century were convinced that cities were not meant to be built on this scale, and that London would inevitably collapse back to a more manageable size, as Rome had done almost two thousand years before. But solving the problems of clean drinking water and reliable waste removal changed all of that. A hundred and fifty years after Ellis Chesbrough first took his grand tour of European sewage, cities such as London and New York were approaching ten million residents, with infectious-disease rates far lower, and life expectancies far higher, than those of their Victorian antecedents.
Of course, the problem now is not cities of two million or ten million; it’s megacities such as Mumbai or São Paulo that will soon embrace thirty million human beings or more, many of them living in improvised communities—shantytowns, favelas—that are closer to the Chicago that Chesbrough had to raise than a contemporary city in the developed world. If you look only at today’s Chicago or London, the story of the past century and a half seems to be one of incontrovertible progress: the water is cleaner; the mortality rates are much lower; epidemic disease is effectively nonexistent. And yet today there are more than three billion people around the world who lack access to clean drinking water and basic sanitation systems. In absolute numbers, we have gone backward as a species. (There were only a billion people alive in 1850.) So the question before us now is how we bring the clean revolution to the favelas, and not just Michigan Avenue. The conventional assumption has been that these communities need to follow the same path charted by Snow, Chesbrough, Leal, and all the other unsung heroes of our public health infrastructure: they need toilets connected to massive sewer systems that dispose of waste without contaminating reservoirs that pump out filtered water, delivered through an equally elaborate system direct to the home. But increasingly, these new megacities’ citizens—and other global development innovators—have begun to think that history need not repeat itself.
However bold and determined John Leal was, if he had been born just a generation earlier, he would never have had the opportunity to chlorinate the Jersey City water, because the science and the technology that made chlorination possible simply hadn’t been invented yet. The maps and lenses and chemistry and units of measure that converged in the second half of the nineteenth century gave him a platform for the experiment, so much so that it is probably fair to assume that if Leal hadn’t brought chlorination to the mainstream, someone else would have done it within a decade, if not sooner. All of which leads to the question: If new ideas and new technology can make a new solution imaginable, the way the germ theory and the microscope triggered the idea of chemically treating water, then hasn’t there been a sufficient supply of new ideas since Leal’s day to trigger a new paradigm for keeping our cities clean, one that would bypass the big-engineering phase altogether? And perhaps that paradigm might be a leading indicator of a future that we’re all destined to share. The developing world has famously bypassed some of the laborious infrastructure of wired telephone lines, jumping ahead of more “advanced” economies by basing their communications around wireless connections instead. Could the same pattern play out with sewers?
In 2011, the Bill &amp; Melinda Gates Foundation announced a competition to help spur a paradigm shift in the way we think about basic sanitation services. Memorably called the “Reinvent the Toilet Challenge,” the competition solicited designs for toilets that do not require a sewer connection or electricity and cost less than five cents per user per day. The winning entry was a toilet system from Caltech that uses photovoltaic cells to power an electrochemical reactor that treats human waste, producing clean water for flushing or irrigation and hydrogen that can be stored in fuel cells. The system is entirely self-contained; it has no need for an electrical grid, a sewer line, or a treatment facility. The only input the toilet requires, beyond sunlight and human waste, is simple table salt, which is oxidized to make chlorine to disinfect the water.
Those chlorine molecules might well be the only part of the toilet that John Leal would recognize, were he around to see it today. And that’s because the toilet depends on new ideas and technology that have become part of the adjacent possible in the twentieth century, tools that hopefully can allow us to bypass the costly, labor-intensive work of building giant infrastructure projects. Leal needed microscopes and chemistry and the germ theory to clean the water supply in Jersey City; the Caltech toilet needs hydrogen fuel cells, solar panels, and even lightweight, inexpensive computer chips to monitor and regulate the system.
Ironically, those microprocessors are themselves, in part, the by-product of the clean revolution. Computer chips are fantastically intricate creations—despite the fact that they are ultimately the product of human intelligence, their microscopic detail is almost impossible for us to comprehend. To measure them, we need to zoom down to the scale of micrometers, or microns: one-millionth of a meter. The width of a human hair is about a hundred microns. A single cell of your skin is about thirty microns. A cholera bacterium is about three microns across. The pathways and transistors through which electricity flows on a microchip—carrying those signals that represent the zeroes and ones of binary code—can be as small as one-tenth of a single micron. Manufacturing at this scale requires extraordinary robotics and laser tools; there’s no such thing as a hand-crafted microprocessor. But chip factories also require another kind of technology, one we don’t normally associate with the high-tech world: they need to be preposterously clean. A speck of household dust landing on one of these delicate silicon wafers would be comparable to Mount Everest landing in the streets of Manhattan.
Bill Gates inspects the winning entry in the 2011 “Reinvent the Toilet Challenge.”
Environments such as the Texas Instruments microchip plant outside Austin, Texas, are some of the cleanest places on the planet. Even to enter the space, you have to don a full clean suit, your body covered head to toe with sterile materials that don’t shed. There’s something strangely inverted about the process. Normally when you find yourself dressing in such extreme protective outfits, you’re guarding yourself against some kind of hostile environment: severe cold, pathogens, the vacuum of space. But in the clean room, the suit is designed to protect the space from you. You are the pathogen, threatening the valuable resources of computer chips waiting to be born: your hair follicles and your epidermal layers and the mucus swarming around you. From the microchip’s point of view, every human being is Pig-Pen, a dust cloud of filth. Washing up before entering the clean room, you’re not even allowed to use soap, because most soaps have fragrances that give off potential contaminants. Even soap is too dirty for the clean room.
There is a strange symmetry to the clean room as well, one that brings us back to those first pioneers struggling to purify the drinking water of their cities: to Ellis Chesbrough, John Snow, John Leal. Producing microchips also requires large quantities of water, only this water is radically different from the water you drink from the tap. To avoid impurities, chip plants create pure H2O, water that has been filtered not only of any bacterial contaminants but also of all the minerals, salts, and random ions that make up normal filtered water. Stripped of all those extra “contaminants,” ultrapure water, as it is called, is the ideal solvent for microchips. But those missing elements also make ultrapure water undrinkable for humans; chug a glass of the stuff and it will start leaching minerals out of your body. This is the full circle of clean: some of the most brilliant ideas in science and engineering in the nineteenth century helped us purify water that was too dirty to drink. And now, a hundred and fifty years later, we’ve created water that’s too clean to drink.
The interior of Texas Instruments
Standing in the clean room, the mind naturally drifts back to the sewers that lie beneath our city streets, the two polar extremes of the history of clean. To build the modern world, we had to create an unimaginably repellent space, an underground river of filth, and cordon it off from everyday life. And at the same time, to make the digital revolution, we had to create a hyper-clean environment, and once again, cordon it off from everyday life. We never get to visit these environments, and so they retreat from our consciousness. We celebrate the things they make possible—towering skyscrapers and ever-more-powerful computers—but we don’t celebrate the sewers and the clean rooms themselves. Yet their achievements are everywhere around us.
5. Time
In October 1967, a group of scientists from around the world gathered in Paris for a conference with the unassuming name “The General Conference on Weights and Measures.” If you’ve had the questionable fortune to attend an academic conference before, you probably have some sense of how these affairs go: papers are presented, along with an interminable series of panel discussions, broken up by casual networking over coffee; there’s gossip and infighting at the hotel bar at night; everyone has a tolerably good time, and not a whole lot gets done. But the General Conference on Weights and Measures broke from that venerable tradition. On October 13, 1967, the attendees agreed to change the very definition of time.
For almost the entire span of human history, time had been calculated by tracking the heavenly rhythms of celestial bodies. Like the earth itself, our sense of time revolved around the sun. Days were defined by the cycle of sunrise and sunset, months by the cycles of the moon, years by the slow but predictable rhythms of the seasons. For most of that stretch, of course, we misunderstood what was causing those patterns, assuming that the sun was revolving around the earth, and not the reverse. Slowly, we built tools to measure the flow of time more predictably: sundials to track the passage of the day; celestial observatories such as Stonehenge to track seasonal milestones like the summer solstice. We began dividing up time into shorter units—seconds, minutes, hours—with many of those units relying on a base-12 counting system passed down from the ancient Egyptians and Sumerians. Time was defined by grade-school division: a minute was one-sixtieth of an hour, an hour was one-twenty-fourth of a day. And a day was simply the time that passed between the two moments when the sun was highest in the sky.
But starting about sixty years ago, as our tools of measuring time increased in precision, we began to notice flaws in that celestial metronome. The clockwork of the heavens turned out to be, well, a bit wobbly. And that’s what the General Conference on Weights and Measures set out to address in 1967. If we were going to be truly accurate with our measurements of time, we needed to trade the largest entity in the solar system for one of the smallest.
Nundinal calendar, Rome. The ancient Etruscans developed an eight-day market week, known as the nundinal cycle, around the eighth or seventh century BC.
—
MEASURED PURELY BY TOURIST ATTENTION, the Duomo of Pisa is generally overshadowed by its famous leaning neighbor next door, but the thousand-year-old cathedral, with its brilliant white stone and marble façade, is in many ways a more impressive structure than the tilted bell tower beside it. Stand at the base of the nave and gaze up toward the fourteenth-century apse mosaic, and you can re-create a moment of absentminded distraction that would ultimately transform our relationship to time. Suspended from the ceiling is a collection of altar lamps. They are motionless now, but legend has it that in 1583, a nineteen-year-old student at the University of Pisa attended prayers at the cathedral and, while daydreaming in the pews, noticed one of the altar lamps swaying back and forth. While his companions dutifully recited the Nicene Creed around him, the student became almost hypnotized by the lamp’s regular motion. No matter how large the arc, the lamp appeared to take the same amount of time to swing back and forth. As the arc decreased in length, the speed of the lamp decreased as well. To confirm his observations, the student measured the lamp’s swing against the only reliable clock he could find: his own pulse.
Most nineteen-year-olds figure out less scientific ways to be distracted while attending mass, but this college freshman happened to be Galileo Galilei. That Galileo was daydreaming about time and rhythm shouldn’t surprise us: his father was a music theorist and played the lute. In the middle of the sixteenth century, playing music would have been one of the most temporally precise activities in everyday culture. (The musical term “tempo” comes from the Italian word for time.) But machines that could keep a reliable beat didn’t exist in Galileo’s age; the metronome wouldn’t be invented for another few centuries. So watching the altar lamp sway back and forth with such regularity planted the seed of an idea in Galileo’s young mind. As is so often the case, however, it would take decades before the seed would blossom into something useful.
Galileo spent the next twenty years becoming a professor of mathematics, experimenting with telescopes, and more or less inventing modern science, but he managed to keep the memory of that swinging altar lamp alive in his mind. Increasingly obsessed with the science of dynamics, the study of how objects move through space, he decided to build a pendulum that would re-create what he had observed in the Duomo of Pisa so many years before. He discovered that the time it takes a pendulum to swing is not dependent on the size of the arc or the mass of the object swinging, but only on the length of the string. “The marvelous property of the pendulum,” he wrote to fellow scientist Giovanni Battista Baliani, “is that it makes all its vibrations, large or small, in equal times.”
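Galileo’s discovery can be expressed in the modern small-angle formula for a simple pendulum, T = 2π√(L/g), which of course was not available to him; note that the mass of the bob does not appear at all, and (for small arcs) neither does the amplitude. A minimal sketch, assuming standard gravity of 9.81 m/s²:

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g).

    The bob's mass never enters the formula, and for small arcs the
    amplitude doesn't either -- the "equal times" Galileo observed.
    """
    return 2 * math.pi * math.sqrt(length_m / g)

# A 1-meter pendulum completes a full swing in about two seconds,
# no matter how heavy the altar lamp on the end of the string.
print(round(pendulum_period(1.0), 2))  # ~2.01 s

# Quadrupling the length only doubles the period (square-root scaling).
print(round(pendulum_period(4.0), 2))  # ~4.01 s
```

Strictly speaking, Galileo slightly overstated his case: the isochronism holds only approximately, for swings of small amplitude, but at the scale of a gently swaying lamp the approximation is excellent.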
In equal times. In Galileo’s age, any natural phenomenon or mechanical device that displayed this rhythmic precision seemed miraculous. Most Italian towns in that period had large, unwieldy mechanical clocks that kept a loose version of the correct time, but they had to be corrected constantly by sundial readings or they would lose as much as twenty minutes a day. In other words, the state of the art in timekeeping technology was challenged just by staying accurate on the scale of days. The idea of a timepiece that might be accurate to the second was preposterous.
The swinging altar lamp inside the Duomo of Pisa
Preposterous, and seemingly unnecessary. Just like Frederic Tudor’s ice trade, it was an innovation that had no natural market. You couldn’t keep accurate time in the middle of the sixteenth century, but no one really noticed, because there was no need for split-second accuracy. There were no buses to catch, or TV shows to watch, or conference calls to join. If you knew roughly what hour of the day it was, you could get by just fine.
How We Got to Now: Six Innovations That Made the Modern World