The God Particle


by Leon Lederman


  Physics and engineering students the world over wear T-shirts sporting these four crisp equations. Maxwell's original equations, however, looked nothing like the above. These simple versions are the work of Hertz, a rare example of someone who was more than the usual experimenter with only a working grasp of theory. He was exceptional in both areas. Like Faraday, he was aware of, but uninterested in, the immense practical importance of his work. He left that to lesser scientific minds, such as Marconi and Larry King. Hertz's theoretical work consisted largely of cleaning up Maxwell, reducing and popularizing his theory. Without Hertz's efforts, physics students would have to lift weights so they could wear triple-extra-large T-shirts in order to accommodate Maxwell's clumsy mathematics.

  True to our tradition and our promise to Democritus, who recently faxed us a reminder, we have to interview Maxwell (or his estate) on atoms. Of course he believed. He was also the author of a very successful theory that treated gases as an assembly of atoms. He believed, correctly, that chemical atoms were not just tiny rigid bodies, but had some complex structure. This belief came out of his knowledge of optical spectra, which became important, as we shall see, in the development of quantum theory. Maxwell believed, incorrectly, that his complex atoms were uncuttable. He said it so beautifully in 1875: "Though in the course of ages catastrophes have occurred and may yet occur in the heavens, though ancient systems may be dissolved and new systems evolved out of their ruins, the [atoms] out of which these systems [earth, solar system, and so on] are built—the foundation stones of the material universe—remain unbroken and unworn." If only he had used the terms "quarks and leptons" instead of "atoms."

  The ultimate judgment on Maxwell comes again from Einstein, who stated that Maxwell made the single most important contribution of the nineteenth century.

  THE MAGNET AND THE BALL

  We have glossed over some important details in our story. How do we know that fields propagate at a fixed speed? How did physicists in the nineteenth century even know what the speed of light was? And what is the difference between instantaneous action-at-a-distance and time-delayed response?

  Consider a very powerful electromagnet at one end of a football field and, at the other end, an iron ball suspended by a thin wire from a very high support. The ball tilts ever so slightly toward the faraway magnet. Now suppose we very rapidly turn the current off in the electromagnet. Precise observations of the ball and its wire would record a response as the ball relaxes back to its equilibrium position. But is the response instantaneous? Yes, say the action-at-a-distance folk. The connection between magnet and iron ball is tight and, when the magnet disappears, the ball instantaneously begins to move back to zero tilt. "No!" say the finite-velocity people. The information "magnet is turned off, you can relax now" travels across the gridiron with a definite velocity, so the ball's response is delayed.

  Today we know the answer. The ball has to wait, not very long because the information travels at the speed of light, but there is a measurable delay. But in Maxwell's time this problem was at the heart of a raging debate. At stake was the validity of the field concept. Why didn't scientists just do an experiment and settle the issue? Because light is so fast that it takes only one millionth of a second to cross the football field. In the 1800s that was a difficult delay to measure. Today it is trivial to measure time intervals a thousand times shorter, so the finite propagation of electromagnetic happenings is easily gauged. For example, we bounce laser signals off a reflector on the moon to measure the distance between earth and moon. The round trip takes about 2.6 seconds.
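These delays reduce to one division. A minimal sketch, using round-number distances that I have assumed (a 100-meter field, an average Earth-moon distance of about 384,000 km), not figures from the text:

```python
# Light travel times: t = distance / c, doubled for a round trip.
C = 3.0e8  # speed of light, m/s

def travel_time(distance_m, round_trip=False):
    """Seconds for light to cover a distance, optionally there and back."""
    d = 2 * distance_m if round_trip else distance_m
    return d / C

field_delay = travel_time(100.0)                        # one football field, ~100 m
moon_round_trip = travel_time(3.84e8, round_trip=True)  # Earth-moon, ~384,000 km

print(f"across the field: {field_delay * 1e6:.2f} microseconds")
print(f"moon round trip:  {moon_round_trip:.2f} seconds")
```

The field crossing comes out to roughly a third of a microsecond, which is why the 1800s couldn't catch it and a modern oscilloscope can.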

  An example on a larger scale: On February 23, 1987, at 7:36 Greenwich Mean Time, a star was observed to explode in the southern sky. This supernova event took place in the Large Magellanic Cloud, a cluster of stars and dust located 160,000 light-years away. In other words, it took 160,000 years for the electromagnetic information from the supernova to arrive at planet earth. And Supernova 87A was a relatively near neighbor. The most distant object observed is about 8 billion light-years away. Its light set out for our telescope rather close to the Beginning.

  The velocity of light was first measured in an earthbound laboratory by Armand-Hippolyte-Louis Fizeau, in 1849. Lacking oscilloscopes and crystal-controlled clocks, he used an ingenious arrangement of mirrors (to extend the length of the light path) and a rapidly rotating toothed wheel. If we know how fast the wheel is turning, and we know the radius of the wheel, we can calculate the time it takes for a gap to be replaced by a tooth. We can adjust the rotation speed so that this time is precisely the time a light beam takes to proceed from gap to distant mirror and back to gap, and then through gap to the eyeball of M. Fizeau. Mon dieu! I see it! Now gradually speed up the wheel (shorten the time) until the light is blocked. There. Now we know the distance the beam traveled—from light source through gap to mirror and back to wheel tooth—and we know the time it took. Fiddling with this arrangement gave Fizeau the famous number 300 million (3 × 10⁸) meters per second or 186,000 miles per second.
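The geometry collapses into one formula. With N teeth (and N equal gaps) spinning at f revolutions per second, a gap is replaced by the neighboring tooth in 1/(2Nf) seconds; set that equal to the round-trip time 2D/c and you get c = 4DNf. A sketch using values commonly quoted in historical accounts of Fizeau's 1849 setup (720 teeth, an 8,633-meter path to the mirror, first blocking near 12.6 revolutions per second), not numbers from this text:

```python
# Fizeau's toothed wheel: light is first blocked when its round trip
# takes exactly as long as a gap takes to rotate into a tooth.

def fizeau_speed_of_light(n_teeth, path_length_m, rev_per_sec):
    """c = 4 * D * N * f: round trip 2D covered in 1/(2*N*f) seconds."""
    blocking_time = 1.0 / (2 * n_teeth * rev_per_sec)  # gap -> tooth, seconds
    return 2 * path_length_m / blocking_time

c_estimate = fizeau_speed_of_light(n_teeth=720,
                                   path_length_m=8633.0,
                                   rev_per_sec=12.6)
print(f"c ~ {c_estimate:.3e} m/s")
```

The result lands near 3.13 × 10⁸ m/s, within a few percent of the modern value, which is about as well as Fizeau himself did.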

  I am continually surprised at the philosophical depth of all these guys during this electromagnetic renaissance. Oersted believed (contrary to Newton) that all forces of nature (at the time: gravity, electricity, and magnetism) were different manifestations of one primordial force. This is s-o-o-o modern! Faraday's efforts to establish the symmetry of electricity and magnetism invoke the Greek heritage of simplicity and unification, 2 of the 137 goals at Fermilab in the 1990s.

  TIME TO GO HOME?

  In these past two chapters we've covered more than three hundred years of classical physics, from Galileo to Hertz. I've left out some good people. The Dutchman Christiaan Huygens, for example, told us a lot about light and waves. The Frenchman René Descartes, the founder of analytical geometry, was a leading advocate of atomism, and his sweeping theories of matter and cosmology were imaginative but unsuccessful.

  We've looked at classical physics from an unorthodox point of view, that of searching for Democritus's a-tom. Usually the classical era is viewed as an examination of forces: gravity and electromagnetism. As we've seen, gravitation derives from the attraction between masses. In electricity Faraday recognized a different phenomenon; matter is irrelevant here, he said. Let's look at force fields. Of course, once you have a force you must still invoke Newton's second law (F = ma) to find the resultant motion, and here inertial matter really matters. Faraday's matter-doesn't-matter approach was derived from the intuition of Boscovich, a pioneer in atomism. And, of course, Faraday provided the first hints about "atoms of electricity." Perhaps one isn't supposed to look at science history this way, as a search for a concept, the ultimate particle. Yet, it's there beneath the surface in the intellectual lives of many of the heroes of physics.

  By the late 1890s, physicists thought they had it all together. All of electricity, all of magnetism, all of light, all of mechanics, all moving things, as well as cosmology and gravity—all were understood by a few simple equations. As for atoms, most chemists felt that the subject was pretty much closed. There was the periodic table of the elements. Hydrogen, helium, carbon, oxygen et al. were indivisible elements, each with its own invisible, indivisible atom.

  There were some mysterious cracks in the picture. For example, the sun was puzzling. Using then-current beliefs in chemistry and atomic theory, the British scientist Lord Rayleigh calculated that the sun should have burned up all its fuel in 30,000 years. Scientists knew that the sun was a lot older than that. This aether business was also troubling. Its mechanical properties would have to be bizarre indeed. It had to be totally transparent, capable of slipping between atoms of matter without disturbing them, yet it had to be as rigid as steel to support the huge velocity of light. Still, it was hoped that these and other mysteries would be solved in due time. Had I been teaching back in 1890, I might have been tempted to send my physics students home, advising them to find a more interesting major. All the big questions had been answered. Those issues that were not understood—
the sun's energy, radioactivity, and a number of other puzzles—well, everybody believed that sooner or later they would yield to the power of the Newton-Maxwell theoretical juggernaut. Physics had been neatly wrapped up in a box and tied with a bow.

  Then suddenly, at the end of the century, the whole package began to unravel. The culprit, as usual, was experimental science.

  THE FIRST TRUE PARTICLE

  During the nineteenth century, physicists fell in love with the electrical discharges produced in gas-filled glass tubes when the pressure was lowered. A glass blower would fashion an exquisite three-foot-long glass tube. Metal electrodes were sealed into the tube. The experimenter would pump out the air as best he could, then bleed in a desired gas (hydrogen, air, carbon dioxide) at low pressure. Wires from each electrode were attached to an external battery. Large electrical voltages were applied. Then, in a darkened room, experimenters were awed as splendid glows appeared, changing shape and size as the pressure decreased. Anyone who has seen a neon sign is familiar with this kind of glow. At low enough pressure, the glow turned into a ray, which traveled from the cathode, the negative terminal, toward the anode. Logically, it was dubbed a cathode ray. These phenomena, which we now know to be rather complex, fascinated a generation of physicists and interested laypersons all over Europe.

  Scientists knew a few controversial, even contradictory details about these cathode rays. They carried a negative charge. They traveled in straight lines. They could spin a fine paddle wheel sealed into the glass. Electric fields didn't deflect them. Electric fields did deflect them. A magnetic field would cause a narrow beam of cathode rays to bend into a circular arc. The rays were stopped by thick metal but could penetrate thin metal foils.

  Interesting facts, but the critical mystery remained: what were these rays? In the late nineteenth century, there were two guesses. Some researchers thought cathode rays were massless electromagnetic vibrations in the aether. Not a bad guess. After all, they glowed like a light beam, another kind of electromagnetic vibration. And obviously, electricity, which is a form of electromagnetism, had something to do with the ray.

  Another camp thought the rays were a form of matter. A good guess was that they were composed of gas molecules in the tubes that had picked up a charge from the electricity. Another guess was that cathode rays were composed of a new form of matter, small particles never before isolated. For a variety of reasons, the idea that there is a basic carrier of electric charge was "in the air." We'll let the cat out of the bag right now. Cathode rays weren't electromagnetic vibrations and they weren't gas molecules.

  If Faraday had been alive in the late 1800s, what would he have said? Faraday's laws strongly implied that there were "atoms of electricity." As you'll recall, he did some similar experiments, except that he passed electricity through liquids rather than gases, ending up with ions, charged atoms. As early as 1874 George Johnstone Stoney, an Irish physicist, had coined the term "electron" for the unit of electricity that is lost when an atom becomes an ion. Had Faraday witnessed a cathode ray, perhaps he would have known in his heart that he was watching electrons at work.

  Some scientists in this period may have strongly suspected that cathode rays were particles; maybe some thought they had finally found the electron. How do you find out? How do you prove it? In the intense period before 1895, many prominent researchers in England, Scotland, Germany, and the United States were studying gaseous discharges. The one who struck pay dirt was an Englishman named J. J. Thomson. There were others who came close. We'll take a look at two of them and what they did, just to show how heartbreaking scientific life is.

  The guy who came nearest to beating Thomson was Emil Weichert, a Prussian physicist, who demonstrated his technique to a lecture audience in January 1897. His glass tube was about fifteen inches long and three inches in diameter. The illuminated cathode rays were easily visible in a partially darkened room.

  If you're trying to corral a particle, you must describe its charge (e) and mass (m). At the time, the particle in question was too small to weigh. To get around this problem many researchers independently seized upon a clever technique: subject the ray to known electric and magnetic forces and measure its response. Remember F = ma. If indeed the rays were composed of electrically charged particles, the force experienced by the particles would vary with the quantity of charge (e) they carried. The response would be muted by their inertial mass (m). Unfortunately, therefore, the effect that could be measured was the quotient of these two quantities, the ratio e/m. In other words, researchers couldn't find individual values for e or m, just a number equal to one value divided by the other. Let's look at a simple example. You are given the number 21 and told that it is the quotient of two numbers. The 21 is a clue only. The two numbers you're looking for might be 21 and 1, 63 and 3, 7 and 1/3, 210 and 10, ad infinitum. But if you have an inkling of what one number is, you can deduce the second.
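The quotient problem can be made concrete. In this sketch the measured ratio is the modern value of e/m for the electron; the charge guesses are hypothetical, and only an independently measured charge pins down the mass:

```python
# A measured ratio alone fixes neither number: any guessed charge e
# implies a mass m = e / ratio that fits the data equally well.

measured_ratio = 1.76e11  # modern e/m for the electron, coulombs per kilogram

for guessed_charge in (1.6e-19, 3.2e-19, 8.0e-19):  # hypothetical guesses, C
    implied_mass = guessed_charge / measured_ratio
    print(f"e = {guessed_charge:.1e} C  ->  m = {implied_mass:.2e} kg")

# Only an independent measurement of e (Millikan, 1909) makes m unique:
electron_mass = 1.6e-19 / measured_ratio
print(f"electron mass ~ {electron_mass:.2e} kg")  # about 9.1e-31 kg
```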

  To go after e/m, Weichert put his tube into the gap of a magnet, which bent the beam into an arc. The magnet pushes on the electric charge of the particles; the slower the particles, the easier it is for the magnet to bend them into a circular arc. Once he figured out the speed, the deflection of particles by the magnet gave him a fair value for e/m.
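Weichert's magnetic step rests on one relation: a particle of charge e and mass m moving at speed v across a field B bends into a circle of radius r = mv/eB, so e/m = v/Br. A sketch with assumed, illustrative numbers (not Weichert's data, and chosen so the answer lands near the modern value):

```python
# Magnetic bending gives e/m once the speed and bending radius are known.
# All numbers below are assumed for illustration.

def e_over_m_from_bending(speed, B_field, radius):
    """Circular motion: e*v*B = m*v**2/r, hence e/m = v / (B*r)."""
    return speed / (B_field * radius)

ratio = e_over_m_from_bending(speed=2.0e7,    # m/s, assumed
                              B_field=1.0e-3,  # tesla, assumed
                              radius=0.1136)   # meters, assumed
print(f"e/m ~ {ratio:.2e} C/kg")
```

Knowing this ratio, a guess at e immediately implies a mass m, which is exactly the leap Weichert made.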

  Weichert understood that if he made an informed guess as to the value of the electric charge, he could deduce the approximate mass of the particle. He concluded: "We are not dealing with atoms known from chemistry because the mass of these moving [cathode ray] particles turns out to be 2,000 to 4,000 times smaller than the lightest known chemical atom, the hydrogen atom." Weichert almost hit the bull's-eye. He knew he was looking at some new kind of particle. He was damn close to the mass. (The electron's mass turned out to be 1,837 times smaller than that of hydrogen.) So why is Thomson famous and Weichert a footnote? Because he simply assumed (guessed) the value of the electric charge; he had no evidence for it. Weichert was also distracted by a job change and a competing interest in geophysics. He was a scientist who reached the right conclusion but didn't have all of the data. No cigar, Emil!

  The second runner-up was Walter Kaufmann, in Berlin. He got to the finish line in April 1897, and his shortcoming was the opposite of Weichert's. The book on him was good data, bad thinking. He also derived e/m using magnetic and electric fields, but he took the experiment a significant step further. He was especially interested in how the value of e/m might change with changes in pressure and with the gas used in the tube—air, hydrogen, carbon dioxide. Unlike Weichert, Kaufmann thought that cathode ray particles were simply charged atoms of the gas in the tube, so they should have a different mass for each gas used. Surprise—he discovered that e/m does not change. He always got the same number no matter what gas, what pressure. Kaufmann was stumped and missed the boat. Too bad, because his experiments were quite elegant. He got a better value for e/m than the champ, J. J. It's one of the cruel ironies of science that he missed what his data were screaming at him: your particles are a new form of matter, dummkopf! And they are the universal constituents of all atoms; that's why e/m doesn't change.

  Joseph John Thomson (1856–1940) started out in mathematical physics and was surprised when, in 1884, he was appointed professor of experimental physics at the famous Cavendish Laboratory at Cambridge University. It would be nice to know whether he really wanted to be an experimentalist. He was famous for his clumsiness with experimental apparatus but was fortunate in having excellent assistants who could do his bidding and keep him away from all that breakable glass.

  In 1896 Thomson sets out to understand the nature of the cathode ray. At one end of his fifteen-inch glass tube the cathode emits its mysterious rays. These head for an anode with a hole that permits some of the rays (read electrons) to pass through. The narrow beam thus formed goes on to the end of the tube, where it strikes a fluorescent screen, producing a small green spot. Thomson's next step is to insert into the glass tube a pair of plates about six inches long. The cathode beam passes through the gap between these plates, which Thomson connects to a battery, creating an electric field perpendicular to the cathode ray. This is the deflection region.

  If the beam moves in response to the field, that means it is carrying an electric charge. If, on the other hand, the cathode rays are photons—light particles—they will ignore the deflection plates and continue on their way in a straight line. Thomson, using a powerful battery, sees the spot on the fluorescent screen move down when the top plate is negative, up when the top plate is positive. He thus proves that the rays are charged. Incidentally, if the deflection plates carry an alternating voltage (varying rapidly plus-minus-plus-minus), the green spot will move up and down rapidly, creating a green line. This is the first step in making a TV tube and seeing Dan Rather on the CBS nightly news.

  But it is 1896, and Thomson has other things on his mind. Because the force (the strength of the electric field) is known, it is easy, using simple Newtonian mechanics, to calculate how far the spot should move if one can figure out the velocity of the cathode rays. Now Thomson uses a trick. He places a magnetic field around the tube in such a direction that the magnetic deflection exactly cancels the electric deflection. Since this magnetic force depends on the unknown velocity, he has merely to read the strength of the electric field and the magnetic field in order to derive a value for the velocity. With the velocity determined, he can now go back to testing the deflection of the ray in electric fields. What emerges is a precise value for e/m, the ratio of the electric charge on a cathode ray particle divided by its mass.
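Thomson's two-step logic can be written out directly. Step one balances the forces (eE = evB, so v = E/B); step two, with the magnet off, uses the deflection y = eEL²/2mv² acquired between plates of length L, so e/m = 2yv²/EL². The field strengths and geometry below are illustrative assumptions, not Thomson's numbers:

```python
# Thomson's crossed-field method, in two steps.
# All field strengths and dimensions below are assumed for illustration.

def velocity_from_crossed_fields(E_field, B_field):
    """Step 1: when electric and magnetic deflections cancel, eE = evB."""
    return E_field / B_field

def e_over_m(E_field, plate_length, deflection, velocity):
    """Step 2: with B off, y = e*E*L**2 / (2*m*v**2) between the plates."""
    return 2 * deflection * velocity**2 / (E_field * plate_length**2)

E = 2.0e4    # electric field, volts per meter (assumed)
B = 1.0e-3   # magnetic field, tesla (assumed)
L = 0.05     # plate length, meters (assumed)
y = 0.011    # observed deflection, meters (assumed)

v = velocity_from_crossed_fields(E, B)
ratio = e_over_m(E, L, y, v)
print(f"v = {v:.1e} m/s, e/m = {ratio:.2e} C/kg")
```

The elegance of the trick is that the unknown velocity drops out of the problem: two field readings give v, and one deflection then gives the ratio.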

  Fastidiously, Thomson applies fields, measures deflections, cancels deflections, measures fields, and gets numbers for e/m. Like Kaufmann, he double-checks by changing the cathode material—aluminum, platinum, copper, and so on—and repeating the experiment. All give the same number. He changes the gas in the tube: air, hydrogen, carbon dioxide. Same result. Thomson does not repeat Kaufmann's mistake. He concludes that the cathode rays are not charged gas molecules but fundamental particles that must be part of all matter.

 
