A Short History of Nearly Everything

by Bill Bryson


  Chemists also used a bewildering variety of symbols and abbreviations, often self-invented. Sweden's J. J. Berzelius brought a much-needed measure of order to matters by decreeing that the elements be abbreviated on the basis of their Greek or Latin names, which is why the abbreviation for iron is Fe (from the Latin ferrum) and that for silver is Ag (from the Latin argentum). That so many of the other abbreviations accord with their English names (N for nitrogen, O for oxygen, H for hydrogen, and so on) reflects English's Latinate nature, not its exalted status. To indicate the number of atoms in a molecule, Berzelius employed a superscript notation, as in H²O. Later, for no special reason, the fashion became to render the number as subscript: H₂O.

  Despite the occasional tidyings-up, chemistry by the second half of the nineteenth century was in something of a mess, which is why everybody was so pleased by the rise to prominence in 1869 of an odd and crazed-looking professor at the University of St. Petersburg named Dmitri Ivanovich Mendeleyev.

  Mendeleyev (also sometimes spelled Mendeleev or Mendeléef) was born in 1834 at Tobolsk, in the far west of Siberia, into a well-educated, reasonably prosperous, and very large family--so large, in fact, that history has lost track of exactly how many Mendeleyevs there were: some sources say there were fourteen children, some say seventeen. All agree, at any rate, that Dmitri was the youngest. Luck was not always with the Mendeleyevs. When Dmitri was small his father, the headmaster of a local school, went blind and his mother had to go out to work. Clearly an extraordinary woman, she eventually became the manager of a successful glass factory. All went well until 1848, when the factory burned down and the family was reduced to penury. Determined to get her youngest child an education, the indomitable Mrs. Mendeleyev hitchhiked with young Dmitri four thousand miles to St. Petersburg--that's equivalent to traveling from London to Equatorial Guinea--and deposited him at the Institute of Pedagogy. Worn out by her efforts, she died soon after.

  Mendeleyev dutifully completed his studies and eventually landed a position at the local university. There he was a competent but not terribly outstanding chemist, known more for his wild hair and beard, which he had trimmed just once a year, than for his gifts in the laboratory.

  However, in 1869, at the age of thirty-five, he began to toy with a way to arrange the elements. At the time, elements were normally grouped in two ways--either by atomic weight (using Avogadro's Principle) or by common properties (whether they were metals or gases, for instance). Mendeleyev's breakthrough was to see that the two could be combined in a single table.

  As is often the way in science, the principle had actually been anticipated three years previously by an amateur chemist in England named John Newlands. He suggested that when elements were arranged by weight they appeared to repeat certain properties--in a sense to harmonize--at every eighth place along the scale. Slightly unwisely, for this was an idea whose time had not quite yet come, Newlands called it the Law of Octaves and likened the arrangement to the octaves on a piano keyboard. Perhaps there was something in Newlands's manner of presentation, but the idea was considered fundamentally preposterous and widely mocked. At gatherings, droller members of the audience would sometimes ask him if he could get his elements to play them a little tune. Discouraged, Newlands gave up pushing the idea and soon dropped from view altogether.

  Mendeleyev used a slightly different approach, placing his elements into groups of seven, but employed fundamentally the same principle. Suddenly the idea seemed brilliant and wondrously perceptive. Because the properties repeated themselves periodically, the invention became known as the periodic table.

  Mendeleyev was said to have been inspired by the card game known as solitaire in North America and patience elsewhere, wherein cards are arranged by suit horizontally and by number vertically. Using a broadly similar concept, he arranged the elements in horizontal rows called periods and vertical columns called groups. This instantly showed one set of relationships when read up and down and another when read from side to side. Specifically, the vertical columns put together chemicals that have similar properties. Thus copper sits on top of silver and silver sits on top of gold because of their chemical affinities as metals, while helium, neon, and argon are in a column made up of gases. (The actual, formal determinant in the ordering is something called their electron valences, for which you will have to enroll in night classes if you wish an understanding.) The horizontal rows, meanwhile, arrange the chemicals in ascending order by the number of protons in their nuclei--what is known as their atomic number.
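  The two readings of the table described above can be sketched in miniature. The handful of elements below are real, but the tiny layout is an illustrative toy, not the full table:

```python
# A minimal subset of the periodic table, keyed by (period, group).
# Reading down a column (fixed group) collects chemically similar
# elements; reading across a row (fixed period) orders them by
# atomic number.
table = {
    (4, 11): ("Cu", 29), (5, 11): ("Ag", 47), (6, 11): ("Au", 79),  # coinage metals
    (1, 18): ("He", 2),  (2, 18): ("Ne", 10), (3, 18): ("Ar", 18),  # noble gases
}

def column(group):
    """Elements sharing a group, in period order (similar chemistry)."""
    return [sym for (p, g), (sym, _) in sorted(table.items()) if g == group]

print(column(11))  # ['Cu', 'Ag', 'Au']
print(column(18))  # ['He', 'Ne', 'Ar']
```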

  The structure of atoms and the significance of protons will come in a following chapter, so for the moment all that is necessary is to appreciate the organizing principle: hydrogen has just one proton, and so it has an atomic number of one and comes first on the chart; uranium has ninety-two protons, and so it comes near the end and has an atomic number of ninety-two. In this sense, as Philip Ball has pointed out, chemistry really is just a matter of counting. (Atomic number, incidentally, is not to be confused with atomic weight, which is the number of protons plus the number of neutrons in a given element.) There was still a great deal that wasn't known or understood. Hydrogen is the most common element in the universe, and yet no one would guess as much for another thirty years. Helium, the second most abundant element, had only been found the year before--its existence hadn't even been suspected before that--and then not on Earth but in the Sun, where it was found with a spectroscope during a solar eclipse, which is why it honors the Greek sun god Helios. It wouldn't be isolated until 1895. Even so, thanks to Mendeleyev's invention, chemistry was now on a firm footing.
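  The bookkeeping in the parenthetical above can be sketched in a few lines, under the simplifying assumption that "atomic weight" here means the mass number of a single isotope:

```python
# Atomic number counts protons alone; the mass number (what the text
# loosely calls atomic weight) counts protons plus neutrons.
def atomic_number(protons, neutrons):
    return protons

def mass_number(protons, neutrons):
    return protons + neutrons

# Hydrogen-1: one proton, no neutrons -> first on the chart.
print(atomic_number(1, 0))     # 1
# Uranium-238: 92 protons, 146 neutrons.
print(atomic_number(92, 146))  # 92
print(mass_number(92, 146))    # 238
```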

  For most of us, the periodic table is a thing of beauty in the abstract, but for chemists it established an immediate orderliness and clarity that can hardly be overstated. "Without a doubt, the Periodic Table of the Chemical Elements is the most elegant organizational chart ever devised," wrote Robert E. Krebs in The History and Use of Our Earth's Chemical Elements, and you can find similar sentiments in virtually every history of chemistry in print.

  Today we have "120 or so" known elements--ninety-two naturally occurring ones plus a couple of dozen that have been created in labs. The actual number is slightly contentious because the heavy, synthesized elements exist for only millionths of seconds and chemists sometimes argue over whether they have really been detected or not. In Mendeleyev's day just sixty-three elements were known, but part of his cleverness was to realize that the elements as then known didn't make a complete picture, that many pieces were missing. His table predicted, with pleasing accuracy, where new elements would slot in when they were found.

  No one knows, incidentally, how high the number of elements might go, though anything beyond 168 as an atomic weight is considered "purely speculative," but what is certain is that anything that is found will fit neatly into Mendeleyev's great scheme.

  The nineteenth century held one last great surprise for chemists. It began in 1896 when Henri Becquerel in Paris carelessly left a packet of uranium salts on a wrapped photographic plate in a drawer. When he took the plate out some time later, he was surprised to discover that the salts had burned an impression in it, just as if the plate had been exposed to light. The salts were emitting rays of some sort.

  Considering the importance of what he had found, Becquerel did a very strange thing: he turned the matter over to a graduate student for investigation. Fortunately the student was a recent émigré from Poland named Marie Curie. Working with her new husband, Pierre, Curie found that certain kinds of rocks poured out constant and extraordinary amounts of energy, yet without diminishing in size or changing in any detectable way. What she and her husband couldn't know--what no one could know until Einstein explained things the following decade--was that the rocks were converting mass into energy in an exceedingly efficient way. Marie Curie dubbed the effect "radioactivity." In the process of their work, the Curies also found two new elements--polonium, which they named after her native country, and radium. In 1903 the Curies and Becquerel were jointly awarded the Nobel Prize in physics. (Marie Curie would win a second prize, in chemistry, in 1911, the only person to win in both chemistry and physics.)

  At McGill University in Montreal the young New Zealand-born Ernest Rutherford became interested in the new radioactive materials. With a colleague named Frederick Soddy he discovered that immense reserves of energy were bound up in these small amounts of matter, and that the radioactive decay of these reserves could account for most of the Earth's warmth. They also discovered that radioactive elements decayed into other elements--that one day you had an atom of uranium, say, and the next you had an atom of lead. This was truly extraordinary. It was alchemy, pure and simple; no one had ever imagined that such a thing could happen naturally and spontaneously.

  Ever the pragmatist, Rutherford was the first to see that there could be a valuable practical application in this. He noticed that in any sample of radioactive material, it always took the same amount of time for half the sample to decay--the celebrated half-life--and that this steady, reliable rate of decay could be used as a kind of clock. By calculating backwards from how much radiation a material had now and how swiftly it was decaying, you could work out its age. He tested a piece of pitchblende, the principal ore of uranium, and found it to be 700 million years old--very much older than the age most people were prepared to grant the Earth.
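  Rutherford's back-calculation reduces to a one-line formula: if a sample now retains a given fraction of its original radioactive parent, its age is the half-life times the base-2 logarithm of the reciprocal of that fraction. The numbers below are illustrative, not Rutherford's actual measurements:

```python
import math

def age_from_decay(remaining_fraction, half_life_years):
    """Age implied by radioactive decay: each halving of the parent
    isotope corresponds to one elapsed half-life."""
    return half_life_years * math.log2(1.0 / remaining_fraction)

# Illustrative: a mineral retaining 1/8 of a parent isotope with a
# 700-million-year half-life has sat through three half-lives.
print(age_from_decay(1 / 8, 700e6))  # 2.1 billion years
```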

  In the spring of 1904, Rutherford traveled to London to give a lecture at the Royal Institution--the august organization founded by Count von Rumford only 105 years before, though that powdery and periwigged age now seemed a distant eon compared with the roll-your-sleeves-up robustness of the late Victorians. Rutherford was there to talk about his new disintegration theory of radioactivity, as part of which he brought out his piece of pitchblende. Tactfully--for the aging Kelvin was present, if not always fully awake--Rutherford noted that Kelvin himself had suggested that the discovery of some other source of heat would throw his calculations out. Rutherford had found that other source. Thanks to radioactivity the Earth could be--and self-evidently was--much older than the twenty-four million years Kelvin's calculations allowed.

  Kelvin beamed at Rutherford's respectful presentation, but was in fact unmoved. He never accepted the revised figures and to his dying day believed his work on the age of the Earth his most astute and important contribution to science--far greater than his work on thermodynamics.

  As with most scientific revolutions, Rutherford's new findings were not universally accepted. John Joly of Dublin strenuously insisted well into the 1930s that the Earth was no more than eighty-nine million years old, and was stopped only then by his own death. Others began to worry that Rutherford had now given them too much time. But even with radiometric dating, as decay measurements became known, it would be decades before we got within a billion years or so of Earth's actual age. Science was on the right track, but still way out.

  Kelvin died in 1907. That year also saw the death of Dmitri Mendeleyev. Like Kelvin's, his productive work was far behind him, but his declining years were notably less serene. As he aged, Mendeleyev became increasingly eccentric--he refused to acknowledge the existence of radiation or the electron or anything else much that was new--and difficult. His final decades were spent mostly storming out of labs and lecture halls all across Europe. In 1955, element 101 was named mendelevium in his honor. "Appropriately," notes Paul Strathern, "it is an unstable element."

  Radiation, of course, went on and on, literally and in ways nobody expected. In the early 1900s Pierre Curie began to experience clear signs of radiation sickness--notably dull aches in his bones and chronic feelings of malaise--which doubtless would have progressed unpleasantly. We shall never know for certain because in 1906 he was fatally run over by a carriage while crossing a Paris street.

  Marie Curie spent the rest of her life working with distinction in the field, helping to found the celebrated Radium Institute of the University of Paris in 1914. Despite her two Nobel Prizes, she was never elected to the Academy of Sciences, in large part because after the death of Pierre she conducted an affair with a married physicist that was sufficiently indiscreet to scandalize even the French--or at least the old men who ran the academy, which is perhaps another matter.

  For a long time it was assumed that anything so miraculously energetic as radioactivity must be beneficial. For years, manufacturers of toothpaste and laxatives put radioactive thorium in their products, and at least until the late 1920s the Glen Springs Hotel in the Finger Lakes region of New York (and doubtless others as well) featured with pride the therapeutic effects of its "Radioactive mineral springs." Radioactivity wasn't banned in consumer products until 1938. By this time it was much too late for Madame Curie, who died of leukemia in 1934. Radiation, in fact, is so pernicious and long lasting that even now her papers from the 1890s--even her cookbooks--are too dangerous to handle. Her lab books are kept in lead-lined boxes, and those who wish to see them must don protective clothing.

  Thanks to the devoted and unwittingly high-risk work of the first atomic scientists, by the early years of the twentieth century it was becoming clear that Earth was unquestionably venerable, though another half century of science would have to be done before anyone could confidently say quite how venerable. Science, meanwhile, was about to get a new age of its own--the atomic one.

  PART III A NEW AGE DAWNS

  8 EINSTEIN'S UNIVERSE

  AS THE NINETEENTH century drew to a close, scientists could reflect with satisfaction that they had pinned down most of the mysteries of the physical world: electricity, magnetism, gases, optics, acoustics, kinetics, and statistical mechanics, to name just a few, all had fallen into order before them. They had discovered the X ray, the cathode ray, the electron, and radioactivity, invented the ohm, the watt, the kelvin, the joule, the amp, and the little erg.

  If a thing could be oscillated, accelerated, perturbed, distilled, combined, weighed, or made gaseous they had done it, and in the process produced a body of universal laws so weighty and majestic that we still tend to write them out in capitals: the Electromagnetic Field Theory of Light, Richter's Law of Reciprocal Proportions, Charles's Law of Gases, the Law of Combining Volumes, the Zeroth Law, the Valence Concept, the Laws of Mass Actions, and others beyond counting. The whole world clanged and chuffed with the machinery and instruments that their ingenuity had produced. Many wise people believed that there was nothing much left for science to do.

  In 1875, when a young German in Kiel named Max Planck was deciding whether to devote his life to mathematics or to physics, he was urged most heartily not to choose physics because the breakthroughs had all been made there. The coming century, he was assured, would be one of consolidation and refinement, not revolution. Planck didn't listen. He studied theoretical physics and threw himself body and soul into work on entropy, a process at the heart of thermodynamics, which seemed to hold much promise for an ambitious young man. In 1891 he produced his results and learned to his dismay that the important work on entropy had in fact been done already, in this instance by a retiring scholar at Yale University named J. Willard Gibbs.

  Gibbs is perhaps the most brilliant person that most people have never heard of. Modest to the point of near invisibility, he passed virtually the whole of his life, apart from three years spent studying in Europe, within a three-block area bounded by his house and the Yale campus in New Haven, Connecticut. For his first ten years at Yale he didn't even bother to draw a salary. (He had independent means.) From 1871, when he joined the university as a professor, to his death in 1903, his courses attracted an average of slightly over one student a semester. His written work was difficult to follow and employed a private form of notation that many found incomprehensible. But buried among his arcane formulations were insights of the loftiest brilliance.

  In 1875-78, Gibbs produced a series of papers, collectively titled On the Equilibrium of Heterogeneous Substances, that dazzlingly elucidated the thermodynamic principles of, well, nearly everything--"gases, mixtures, surfaces, solids, phase changes ... chemical reactions, electrochemical cells, sedimentation, and osmosis," to quote William H. Cropper. In essence what Gibbs did was show that thermodynamics didn't apply simply to heat and energy at the sort of large and noisy scale of the steam engine, but was also present and influential at the atomic level of chemical reactions. Gibbs's Equilibrium has been called "the Principia of thermodynamics," but for reasons that defy speculation Gibbs chose to publish these landmark observations in the Transactions of the Connecticut Academy of Arts and Sciences, a journal that managed to be obscure even in Connecticut, which is why Planck did not hear of him until too late.

  Undaunted--well, perhaps mildly daunted--Planck turned to other matters. We shall turn to these ourselves in a moment, but first we must make a slight (but relevant!) detour to Cleveland, Ohio, and an institution then known as the Case School of Applied Science. There, in the 1880s, a physicist of early middle years named Albert Michelson, assisted by his friend the chemist Edward Morley, embarked on a series of experiments that produced curious and disturbing results that would have great ramifications for much of what followed.

  What Michelson and Morley did, without actually intending to, was undermine a longstanding belief in something called the luminiferous ether, a stable, invisible, weightless, frictionless, and unfortunately wholly imaginary medium that was thought to permeate the universe. Conceived by Descartes, embraced by Newton, and venerated by nearly everyone ever since, the ether held a position of absolute centrality in nineteenth-century physics as a way of explaining how light traveled across the emptiness of space. It was especially needed in the 1800s because light and electromagnetism were now seen as waves, which is to say types of vibrations. Vibrations must occur in something; hence the need for, and lasting devotion to, an ether. As late as 1909, the great British physicist J. J. Thomson was insisting: "The ether is not a fantastic creation of the speculative philosopher; it is as essential to us as the air we breathe"--this more than four years after it was pretty incontestably established that it didn't exist. People, in short, were really attached to the ether.

 
