A Short History of Nearly Everything: Special Illustrated Edition


by Bill Bryson


  By this point physicists could be excused for thinking that they had just about conquered the atom. In fact, everything in particle physics was about to get a whole lot more complicated. But before we take up that slightly exhausting story, we must bring another strand of our history up to date by considering an important and salutary tale of avarice, deceit, bad science, several needless deaths and the final determination of the age of the Earth.

  The chillingly familiar shape of a mushroom cloud rises above Bikini Atoll in the Pacific in 1954 during one of the American military’s first tests of hydrogen bombs. The blast shown here had a force of 11 megatons, or more than twice the destructive impact of all the explosives used by all sides in the Second World War. (credit 9.12)

  1 The name comes from the same Cavendishes who produced Henry. This one was William Cavendish, seventh Duke of Devonshire, who was a gifted mathematician and steel baron in Victorian England. In 1870 he gave the university £6,300 to build an experimental laboratory.

  2 Geiger would also later become a loyal Nazi, unhesitatingly betraying Jewish colleagues, including many who had helped him.

  3 There is a little uncertainty about the use of the word uncertainty in regard to Heisenberg’s principle. Michael Frayn, in an afterword to his play Copenhagen, notes that several words in German—Unsicherheit, Unschärfe, Ungenauigkeit and Unbestimmtheit—have been used by various translators, but that none quite equates to the English uncertainty. Frayn suggests that indeterminacy would be a better word for the principle and indeterminability would be better still. Heisenberg himself generally used Unbestimmtheit.

  4 Or at least, that is how it is nearly always rendered. The actual quote was: “It seems hard to sneak a look at God’s cards. But that He plays dice and uses ‘telepathic’ methods…is something that I cannot believe for a single moment.”

  Morning rush hour in Mexico City shows a city choking under a haze of pollution and smog. (credit 10.1)

  GETTING THE LEAD OUT

  In the late 1940s, a graduate student at the University of Chicago named Clair Patterson (who was, first name notwithstanding, an Iowa farm boy by origin) was using a new method of lead isotope measurement to try to get a definitive age for the Earth at last. Unfortunately, all his rock samples became contaminated—usually wildly so. Most contained something like two hundred times the levels of lead that would normally be expected to occur. Many years would pass before Patterson realized that the reason for this lay with a regrettable Ohio inventor named Thomas Midgley, Junior.

  Midgley was an engineer by training and the world would no doubt have been a safer place if he had stayed so. Instead, he developed an interest in the industrial applications of chemistry. In 1921, while working for the General Motors Research Corporation in Dayton, Ohio, he investigated a compound called tetraethyl lead (also known, confusingly, as lead tetraethyl), and discovered that it significantly reduced the juddering condition known as engine knock.

  Even though lead was widely known to be dangerous, by the early years of the twentieth century it could be found in all manner of consumer products. Food came in cans sealed with lead solder. Water was often stored in lead-lined tanks. Lead arsenate was sprayed onto fruit as a pesticide. Lead even came as part of the composition of toothpaste tubes. Hardly a product existed that didn’t bring a little lead into consumers’ lives. However, nothing gave it a greater and more lasting intimacy than its addition to motor fuel.

  Lead is a neurotoxin. Get too much of it and you can irreparably damage the brain and central nervous system. Among the many symptoms associated with over-exposure are blindness, insomnia, kidney failure, hearing loss, cancer, palsies and convulsions. In its most acute form it produces abrupt and terrifying hallucinations, disturbing to victims and onlookers alike, which generally then give way to coma and death. You really don’t want to get too much lead into your system.

  On the other hand, lead was easy to extract and work, and almost embarrassingly profitable to produce industrially—and tetraethyl lead did indubitably stop engines from knocking. So in 1923 three of America’s largest corporations, General Motors, Du Pont and Standard Oil of New Jersey, formed a joint enterprise called the Ethyl Gasoline Corporation (later shortened to simply Ethyl Corporation) with a view to making as much tetraethyl lead as the world was willing to buy, and that proved to be a very great deal. They called their additive “ethyl” because it sounded friendlier and less toxic than “lead,” and introduced it for public consumption (in more ways than most people realized) on 1 February 1923.

  Introduced to the world in 1923, tetraethyl lead—more commonly just “ethyl”—promised to make car engines run more smoothly, but also posed considerable dangers to those who made it and those who breathed it in. (credit 10.2)

  Almost at once production workers began to exhibit the staggering gait and confused faculties that mark the recently poisoned. Also almost at once, the Ethyl Corporation embarked on a policy of calm but unyielding denial that would serve it well for decades. As Sharon Bertsch McGrayne notes in her absorbing history of industrial chemistry, Prometheans in the Lab, when employees at one plant developed irreversible delusions, a spokesman blandly informed reporters: “These men probably went insane because they worked too hard.” Altogether, at least fifteen workers died in the early days of production of leaded gasoline, and untold numbers of others became ill, often violently so; the exact numbers are unknown because the company nearly always managed to hush up news of embarrassing leakages, spills and poisonings. At times, however, suppressing the news became impossible—most notably in 1924 when, in a matter of days, five production workers died and thirty-five more were turned into permanent staggering wrecks at a single ill-ventilated facility.


  As rumours circulated about the dangers of the new product, ethyl’s ebullient inventor, Thomas Midgley, decided to hold a demonstration for reporters to allay their concerns. As he chatted away about the company’s commitment to safety, he poured tetraethyl lead over his hands, then held a beaker of it to his nose for sixty seconds, claiming all the while that he could repeat the procedure daily without harm. In fact, Midgley knew only too well the perils of lead poisoning: he had himself been made seriously ill from over-exposure a few months earlier and now, except when reassuring journalists, never went near the stuff if he could help it.

  Buoyed by the success of leaded petrol, Midgley now turned to another technological problem of the age. Refrigerators in the 1920s were often appallingly risky because they used insidious and dangerous gases that sometimes seeped out. One leak from a refrigerator at a hospital in Cleveland, Ohio, in 1929 killed more than a hundred people. Midgley set out to create a gas that was stable, non-flammable, non-corrosive and safe to breathe. With an instinct for the regrettable that was almost uncanny, he invented chlorofluorocarbons, or CFCs.

  Seldom has an industrial product been more swiftly or unfortunately embraced. CFCs went into production in the early 1930s and found a thousand applications in everything from car air-conditioners to deodorant sprays before it was noticed, half a century later, that they were devouring the ozone in the stratosphere. As you will be aware, this was not a good thing.

  Ozone is a form of oxygen in which each molecule bears three atoms of oxygen instead of the normal two. It is a bit of a chemical oddity in that at ground level it is a pollutant, while way up in the stratosphere it is beneficial since it soaks up dangerous ultraviolet radiation. Beneficial ozone is not terribly abundant, however. If it were distributed evenly throughout the stratosphere, it would form a layer just 2 millimetres or so thick. That is why it is so easily disturbed.

  Chlorofluorocarbons are also not very abundant—they constitute only about one part per billion of the atmosphere as a whole—but they are extravagantly destructive. A single kilogram of CFCs can capture and annihilate 70,000 kilograms of atmospheric ozone. CFCs also hang around for a long time—about a century on average—wreaking havoc all the while. And they are great heat sponges. A single CFC molecule is about ten thousand times more efficient at exacerbating greenhouse effects than a molecule of carbon dioxide—and carbon dioxide is of course no slouch itself as a greenhouse gas. In short, chlorofluorocarbons may ultimately prove to be just about the worst invention of the twentieth century.

  Midgley never knew this because he died long before anyone realized how destructive CFCs were. His death was itself memorably unusual. After becoming crippled with polio, Midgley invented a contraption involving a series of motorized pulleys that automatically raised or turned him in bed. In 1944, he became entangled in the cords as the machine went into action and was strangled.

  Thomas Midgley, the American inventor who had the distinction of devising two of the twentieth century’s most regrettable compounds—chlorofluorocarbons and leaded petrol. (credit 10.4)

  If you were interested in finding out the ages of things, the University of Chicago in the 1940s was the place to be. Willard Libby was in the process of inventing radiocarbon dating, allowing scientists to get an accurate reading of the age of bones and other organic remains, something they had never been able to do before. Up to this time, the oldest reliable dates went back no further than the First Dynasty in Egypt—about 3000 BC. No-one could confidently say, for instance, when the last ice sheets had retreated or at what time in the past the Cro-Magnon people had decorated the caves of Lascaux in France.

  Willard Libby, whose invention of radiocarbon dating in the 1940s provided the first reasonably reliable method for dating ancient materials. (credit 10.5)

  Libby’s idea was so useful that he would be awarded a Nobel Prize for it in 1960. It was based on the realization that all living things have within them an isotope of carbon called carbon-14, which stops being replenished the instant they die and thereafter diminishes at a measurable rate. Carbon-14 has a half-life—that is, the time it takes for half of any sample to disappear—of about 5,600 years, so by working out how much of a given sample of carbon had decayed, Libby could get a good fix on the age of an object—though only up to a point. After eight half-lives, only 0.39 per cent of the original radioactive carbon remains, which is too little to make a reliable measurement, so radiocarbon dating works only for objects up to forty thousand or so years old.
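
  To make the arithmetic explicit (a brief sketch in standard notation, not part of Bryson’s text), the fraction of carbon-14 surviving after a time t, and the age implied by a measured fraction, both follow from the half-life t₁/₂ of about 5,600 years:

  \[
  \frac{N(t)}{N_0} = \left(\frac{1}{2}\right)^{t/t_{1/2}}
  \qquad\Longrightarrow\qquad
  t = t_{1/2}\,\log_2\!\frac{N_0}{N(t)}
  \]

  After eight half-lives the surviving fraction is (1/2)⁸ = 1/256, or about 0.39 per cent, and the corresponding age is 8 × 5,600 ≈ 45,000 years—hence the practical limit of roughly forty thousand years mentioned above.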

  Curiously, just as the technique was becoming widespread, certain flaws within it became apparent. To begin with, it was discovered that one of the basic components of Libby’s formula, known as the decay constant, was out by about 3 per cent. By this time, however, thousands of measurements had been taken throughout the world. Rather than restate every one, scientists decided to keep the inaccurate constant. “Thus,” Tim Flannery notes, “every raw radiocarbon date you read today is given as too young by around 3 per cent.” The problems didn’t quite stop there. It was also quickly discovered that carbon-14 samples can be easily contaminated with carbon from other sources—a tiny scrap of vegetable matter, for instance, that has been collected with the sample and not noticed. For younger samples—those under twenty thousand years or so—slight contamination does not always matter so much, but for older samples it can be a serious problem because so few remaining atoms are being counted. In the first instance, to borrow from Flannery, it is like miscounting by a dollar when counting to a thousand; in the second it is more like miscounting by a dollar when you have only two dollars to count.

  Libby’s method was also based on the assumption that the amount of carbon-14 in the atmosphere, and the rate at which it has been absorbed by living things, has been consistent throughout history. In fact it hasn’t been. We now know that the volume of atmospheric carbon-14 varies depending on how well or not the Earth’s magnetism is deflecting cosmic rays, and that that can vary significantly over time. This means that some carbon-14 dates are more dubious than others. Among the more dubious are dates just around the time that people first came to the Americas, which is one of the reasons the matter is so perennially in dispute.

  Finally, and perhaps a little unexpectedly, readings can be thrown out by seemingly unrelated external factors—such as the diets of those whose bones are being tested. One recent case involved the long-running debate over whether syphilis originated in the New World or the Old. Archaeologists in Hull found that monks in a monastery graveyard had suffered from syphilis, but the initial conclusion that the monks had done so before Columbus’s voyage was cast into doubt by the realization that they had eaten a lot of fish, which could make their bones appear to be older than in fact they were. The monks may well have had syphilis, but how it got to them, and when, remain tantalizingly unresolved.

  Because of the accumulated shortcomings of carbon-14, scientists devised other methods of dating ancient materials, among them thermoluminescence, which measures electrons trapped in clays, and electron spin resonance, which involves bombarding a sample with electromagnetic waves and measuring the vibrations of the electrons. But even the best of these could not date anything older than about two hundred thousand years, and they couldn’t date inorganic materials like rocks at all, which is of course what you need to do if you wish to determine the age of your planet.

  The problems of dating rocks were such that at one point almost everyone in the world had given up on them. Had it not been for a determined English professor named Arthur Holmes, the quest might well have fallen into abeyance altogether.

  The English geologist and professor Arthur Holmes, who devised a method for accurately dating rocks based on the decay rate of uranium into lead, allowing him to show that Earth was at least three billion years old. Holmes was later instrumental in advancing the new theory of plate tectonics. (credit 10.6)

  Holmes was heroic as much for the obstacles he overcame as for the results he achieved. By the 1920s, when he was in the prime of his career, geology had slipped out of fashion—physics was the new excitement of the age—and had become severely underfunded, particularly in Britain, its spiritual birthplace. At Durham University, Holmes was for many years the entire geology department. Often he had to borrow or patch together equipment in order to pursue his radiometric dating of rocks. At one point, his calculations were effectively held up for a year while he waited for the university to provide him with a simple adding machine. Occasionally, he had to drop out of academic life altogether to earn enough to support his family—for a time he ran a curio shop in Newcastle upon Tyne—and sometimes he could not even afford the £5 annual membership fee for the Geological Society.

  The technique Holmes used in his work was theoretically straightforward and arose directly from the process first observed by Ernest Rutherford in 1904 by which some atoms decay from one element into another at a rate predictable enough that you can use them as clocks. If you know how long it takes for potassium-40 to become argon-40, and you measure the amounts of each in a sample, you can work out how old a material is. Holmes’s contribution was to measure the decay rate of uranium into lead to calculate the age of rocks, and thus—he hoped—of the Earth.
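
  The standard relationship behind such clocks can be sketched as follows (a textbook formulation rather than anything spelled out here, and one that assumes the mineral crystallized with no daughter atoms and has neither lost nor gained any since). If P is the number of parent atoms remaining in a sample and D the number of daughter atoms they have produced, then:

  \[
  t = \frac{t_{1/2}}{\ln 2}\,\ln\!\left(1 + \frac{D}{P}\right)
  \]

  In practice the arithmetic is messier—potassium-40, for instance, decays to calcium-40 as well as to argon-40, and uranium–lead dating runs two decay chains at once—but the principle is the same: measure the ratio of daughter to parent, and the half-life converts it into an age.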

  But there were many technical difficulties to overcome. Holmes also needed—or at least would very much have appreciated—sophisticated gadgetry of a sort that could make very fine measurements from tiny samples, and, as we have seen, it was all he could do to get a simple adding machine. So it was quite an achievement when in 1946 he was able to announce with some confidence that the Earth was at least three billion years old and possibly rather more. Unfortunately, he now met yet another formidable impediment to acceptance: the conservatism of his fellow scientists. Although happy to praise his methodology, many maintained that he had found not the age of the Earth but merely the age of the materials from which the Earth had been formed.

  It was just at this time that Harrison Brown of the University of Chicago developed a new method for counting lead isotopes in igneous rocks (which is to say those that were created through heating, as opposed to the laying down of sediments). Realizing that the work would be exceedingly tedious, he assigned it to young Clair Patterson as his dissertation project. Famously, he promised Patterson that determining the age of the Earth with his new method would be “duck soup.” In fact, it would take years.

  Harrison Brown, who created a refined technique for determining the age of very ancient rocks, enabling Clair Patterson finally to make an accurate assessment of Earth’s age: 4.55 billion years. (credit 10.7)

  Patterson began work on the project in 1948. Compared with Thomas Midgley’s colourful contributions to the march of progress, Patterson’s discovery of the age of the Earth feels more than a touch anti-climactic. For seven years, first at the University of Chicago and then at the California Institute of Technology (where he moved in 1952), he worked in a sterile lab, making very precise measurements of the lead/uranium ratios in carefully selected samples of old rock.

  The problem with measuring the age of the Earth was that you needed rocks that were extremely ancient, containing lead- and uranium-bearing crystals that were about as old as the planet itself—anything much younger would obviously give you misleadingly youthful dates—but really ancient rocks are only rarely found on Earth. In the late 1940s no-one altogether understood why this should be. Indeed, and rather extraordinarily, we would be well into the space age before anyone could plausibly account for where all the Earth’s old rocks went. (The answer was plate tectonics, which we shall of course get to.) Patterson, meanwhile, was left to try to make sense of things with very limited materials. Eventually, and ingeniously, it occurred to him that he could circumvent the rock shortage by using rocks from beyond Earth. He turned to meteorites.

 
