
The God Particle: If the Universe Is the Answer, What Is the Question?


by Leon Lederman


  In a Fermilab education program for ten-year-olds, we confront them with this problem. We give them an empty square box to look at, shake, weigh. Then we put something in the box, such as a wooden block or three steel balls. Then we ask the students again to weigh, shake, tilt, listen, and to tell us everything they can about the objects: size, shape, weight ... It's an instructive metaphor for our scattering experiments. You'd be surprised how often the kids get it right.

  Let's switch to grownups and particles. Let's say we want to find out the size of protons. So we take a tip from Monet. We look at them in different forms of "light." Could protons be points? To find out, physicists hit protons with other protons at very low energy to explore the electromagnetic force between the two charged objects. Coulomb's law says that this force reaches out to infinity, decreasing in strength as the square of the distance. The target proton and the accelerated proton are, of course, both positively charged, and since like charges repel, the slow proton is readily repelled by the target proton. It never gets very close. In this kind of "light," the proton does in fact look like a point, a point of electric charge. So we increase the energy of the accelerated protons. Now the deviations in the patterns of scattered protons indicate that the penetrations are getting deep enough to touch what's called the strong force, the force that we now know holds the proton's constituents together. The strong force is a hundred times stronger than the Coulomb electrical force, but unlike the electrical force, its range is anything but infinite. The strong force reaches out only to a distance of about 10⁻¹³ centimeters, then fades quickly to zero.
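The Coulomb-repulsion argument above can be made concrete with a little arithmetic. The sketch below (an editorial illustration, not from the text) computes the classical distance of closest approach for a proton fired head-on at another proton, assuming all its kinetic energy converts to electrostatic potential energy; at around 1 MeV that distance is comparable to the 10⁻¹³-centimeter reach of the strong force, while slower protons turn back far outside it.

```python
# Sketch: classical distance of closest approach for a proton fired
# head-on at another (fixed) proton, from Coulomb's law.
# r_min is where all kinetic energy has become potential energy:
#   KE = K * e^2 / r_min  ->  r_min = K * e^2 / KE
K = 8.9875e9          # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19  # proton charge, coulombs

def closest_approach_cm(ke_mev):
    """Closest approach (cm) for a proton with kinetic energy ke_mev (MeV)."""
    ke_joules = ke_mev * 1e6 * E_CHARGE
    r_min_m = K * E_CHARGE**2 / ke_joules
    return r_min_m * 100  # meters -> centimeters

for ke in (0.01, 0.1, 1.0):
    print(f"{ke:5.2f} MeV -> closest approach {closest_approach_cm(ke):.2e} cm")
```

At 0.01 MeV the proton stops about 10⁻¹¹ cm away, a hundred times the range of the strong force; at 1 MeV it gets to roughly 1.4 × 10⁻¹³ cm, close enough to start feeling it.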

  By increasing the energy of the collision, we unearth more and more details about the strong force. As the energy increases, the wavelength of the protons (remember de Broglie and Schrödinger) shrinks. And, as we have seen, the smaller the wavelength, the more detail that can be discerned in the particle being studied.
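The wavelength-versus-energy relation can be sketched numerically. For a highly relativistic probe, λ ≈ hc/E, so comparing Hofstadter's roughly 800 MeV electrons with the later multi-GeV SLAC beam shows directly why the higher energy resolved finer structure (the constant below is the standard value of hc; the comparison is an editorial illustration):

```python
# Sketch: de Broglie wavelength of a highly relativistic probe,
# lambda ~ h*c / E (valid when E is much larger than the rest energy).
HC_MEV_FM = 1239.84  # h*c in MeV*femtometers (1 fm = 1e-13 cm)

def wavelength_cm(energy_mev):
    """Approximate de Broglie wavelength, in centimeters."""
    return HC_MEV_FM / energy_mev * 1e-13  # fm -> cm

# Hofstadter's ~800 MeV electrons vs. a ~10 GeV SLAC-era beam:
print(f"{wavelength_cm(800):.2e} cm")     # about one proton radius
print(f"{wavelength_cm(10_000):.2e} cm")  # an order of magnitude finer
```

An 800 MeV electron has a wavelength of about 1.5 × 10⁻¹³ cm, the scale of the whole proton; at 10 GeV the wavelength shrinks by more than a factor of ten, fine enough to resolve the pointlike constituents inside.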

  Some of the best "pictures" of the proton were taken in the 1950s by Robert Hofstadter of Stanford University. There the "light" used was a beam of electrons rather than protons. Hofstadter's team aimed a well-organized beam of, say, 800 MeV electrons at a small vat of liquid hydrogen. The electrons bombarded the protons in the hydrogen, resulting in a scattering pattern, the electrons emerging in a variety of directions relative to their original motion. Not too different from what Rutherford did. Unlike the proton, the electron does not respond to the strong nuclear force. It responds only to the electric charge in the proton, so the Stanford scientists were able to explore the shape of the charge distribution in the proton. In effect, this revealed the proton's size. It was clearly not a point. The radius was measured to be 2.8 × 10⁻¹³ centimeters, with the charge piling up at the center and fading out at the edges of what we call a proton. Similar results were obtained when the experiments were repeated with muon beams, which also ignore the strong force. Hofstadter was awarded a Nobel Prize in 1961 for his "photograph" of the proton.

  About 1968, physicists at the Stanford Linear Accelerator Center (SLAC) bombarded protons with electrons at a much higher energy—8 to 15 GeV—and got a vastly different set of scattering patterns. In this hard light, the proton presented quite a different picture. The relatively low-energy electrons that Hofstadter used were able to see only a "blurry" proton, a smooth distribution of charge that made the proton look like a mushy little ball. The SLAC electrons probed harder and found little guys running around inside the proton. This was the earliest indication of the reality of quarks. The new data and the old data were consistent—like morning and evening paintings by Monet—but the low-energy electrons could reveal only average charge distributions. The visualization provided by the higher-energy electrons showed that our proton contains three rapidly moving, pointlike constituents. Why did the SLAC experiment show this detail, while the Hofstadter study did not? A collision with high enough energy (determined by what goes in and what comes out) freezes the quarks in place and "feels" the pointlike force. It's the virtue of short wavelengths again. This force promptly induces large-angle scattering (remember Rutherford and the nucleus) and large energy changes. The formal name for this phenomenon is "deep inelastic scattering." In Hofstadter's earlier experiments, the quark motion was blurred out and the protons looked "smooth" and uniform inside because of the lower energy of the probing electrons. Think of taking a photograph of three rapidly vibrating, tiny light bulbs using a one-minute time exposure. The film would show one big blurry undifferentiated object. The SLAC experiment, in a crude sense, used a faster shutter, freezing the spots of light so that they could easily be counted.

  Since the quark interpretation of the higher energy electron scattering was very far out and of tremendous importance, these experiments were repeated at Fermilab and at CERN (an acronym for the European Center for Nuclear Research), using muons of ten times the SLAC energy (150 GeV) as well as neutrinos. Muons, like electrons, test the electromagnetic structure of the proton, but neutrinos, impervious to both the electromagnetic and the strong forces, test what's called the weak-force distribution. The weak force is the nuclear force responsible for radioactive decay, among other things. These huge experiments, carried out in heated competition, each came to the same conclusion: the proton is made of three quarks. And we learned some details about how the quarks move about. Their motion defines what we call "proton."

  Detailed analysis of all three types of experiments—electron, muon, and neutrino—also succeeded in detecting a new kind of particle, the gluon. Gluons are carriers of the strong force, and without them the data just could not be explained. The same analysis gave quantitative details on how the quarks whirl about each other in their proton prison. Twenty years of such study (the technical term is structure functions) has given us a sophisticated model that accounts for all the collision experiments in which protons, neutrons, electrons, muons, and neutrinos as well as photons, pions, and antiprotons are aimed at protons. This is Monet with a vengeance. Perhaps Wallace Stevens's poem "Thirteen Ways of Looking at a Blackbird" would be more to the point.

  As you can see, we learn many things in order to account for what-goes-in-and-what-comes-out. We learn about the forces and how these forces result in complex structures such as protons (made of three quarks) and mesons (made of a quark and an antiquark). With so much complementary information, it becomes less and less important that we can't see inside the black box where the collision actually takes place.

  One can't help being impressed by the sequence of "seeds within seeds." The molecule is made of atoms. The core of the atom is the nucleus. The nucleus is made of protons and neutrons. The proton and neutron are made of quarks. The quarks are made of ... whoops, hold it. The quarks can't be broken down, we think, but of course we are not sure. How dare we say we've come to the end of the road? Nevertheless, that is the consensus—at present—and after all, Democritus can't live forever.

  NEW MATTER: SOME RECIPES

  We have yet to discuss an important process that can take place during a collision. We can make new particles. This happens all the time around the house. Look at the lamp that is valiantly trying to illuminate this dark page. What is the source of the light? It is electrons, agitated by the electrical energy squirting into the filament of the bulb or, if you are energy efficient, into the gas of the fluorescent lamp. The electrons emit photons. That's the process. In the more abstract language of the particle physicist, the electron in the process of a collision can radiate a photon. The energy is provided to the electron (via the wall plug) by an accelerating process.

  Now we have to generalize. In the process of creation, we are constrained by the laws of conservation of energy, momentum, charge, and respect for all of the other quantum rules. Also, the object that is somehow responsible for creating a new particle has to be "connected" to the particle being created. Example: a proton collides with another proton, and a new particle, a pion, is made. We write it like this:

  p⁺ + p⁺ → p⁺ + π⁺ + n

  That is, protons collide and produce another proton, a positive pion (π⁺), and a neutron. These particles are all connected via the strong force, and this is a typical creation process. Alternatively, one can view this as a proton, "under the influence" of another proton, dissolving into a "pi plus" and a neutron.

  Another kind of creation, a rare and exciting process called annihilation, takes place when matter and antimatter collide. The term annihilation is used in its strictest dictionary sense of putting something out of existence. When an electron collides with its antiparticle, the positron, the particle and antiparticle disappear, and in their place energy, in the form of a photon, appears momentarily. The conservation laws don't like this process, so the photon is temporary and must soon create two particles in its place—for example, another electron and a positron. Less frequently the photon may dissolve into a muon and an antimuon, or even a positive proton and a negative antiproton. Annihilation is the only phenomenon that is fully efficient in converting mass to energy in accordance with Einstein's law, E = mc². When a nuclear bomb explodes, for instance, only a fraction of 1 percent of the atomic mass is converted into energy. When matter and antimatter collide, 100 percent of the mass disappears.

  When we're making new particles, the primary requirement is that there be enough energy, and E = mc² is our accounting tool. For example, we mentioned that a collision between an electron and a positron can result in a proton and an antiproton, or a p and a p-bar, as we call them. Since the rest mass energy of a proton is about 1 GeV, the particles in the original collision must bring in at least 2 GeV to produce a p/p-bar pair. More energy increases the probability of this result and gives the newly produced objects some kinetic energy, making them easier to detect.
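The accounting in that paragraph is simple enough to carry out explicitly. This sketch uses the standard proton rest energy of 938.3 MeV (slightly under the text's rounded "about 1 GeV"); any energy brought in above the two-particle threshold is shared out as kinetic energy of the products:

```python
# Sketch of the E = mc^2 bookkeeping for pair production: the collision
# must supply at least twice the rest-mass energy of the particle being
# created; the surplus becomes kinetic energy of the new pair.
PROTON_REST_MEV = 938.3  # proton rest energy in MeV (~1 GeV)

def threshold_mev(rest_mass_mev):
    """Minimum collision energy to create a particle/antiparticle pair."""
    return 2 * rest_mass_mev

def leftover_kinetic_mev(collision_energy_mev, rest_mass_mev):
    """Energy left over to appear as kinetic energy of the pair."""
    return collision_energy_mev - threshold_mev(rest_mass_mev)

print(threshold_mev(PROTON_REST_MEV))               # ~1877 MeV, i.e. ~2 GeV
print(leftover_kinetic_mev(2500, PROTON_REST_MEV))  # surplus at 2.5 GeV
```

A 2.5 GeV collision thus clears the roughly 1.88 GeV threshold with about 0.6 GeV to spare, which is why extra beam energy makes the new pair easier to detect.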

  The glamorous nature of antimatter has given rise to the science fiction notion that it may solve the energy crisis. Indeed, a kilogram (2.2 pounds) of antimatter would provide enough energy to keep the United States going for a day. This is because the entire mass of the antimatter (plus the matter it takes with it to total annihilation) is converted to energy via E = mc². In the burning of coal or oil, only one billionth of the mass is converted to energy. In fission reactors this number is 0.1 percent, and in the long-awaited fusion energy supply (don't hold your breath!) it is about 0.5 percent.
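The kilogram-of-antimatter claim is easy to check with E = mc². Note that two kilograms of mass disappear, since the antimatter annihilates an equal mass of ordinary matter (the comparison figure is an editorial illustration of scale, not from the text):

```python
# Sketch of the antimatter arithmetic: a kilogram of antimatter
# annihilates with a kilogram of ordinary matter, so TWO kilograms
# of mass are converted to energy via E = m * c^2.
C = 2.998e8  # speed of light, m/s

def annihilation_energy_joules(antimatter_kg):
    """Total energy released, counting the matter partner too."""
    return 2 * antimatter_kg * C**2

print(f"{annihilation_energy_joules(1.0):.2e} J")  # ~1.8e17 joules
```

That is about 1.8 × 10¹⁷ joules, which is indeed the order of magnitude of total daily U.S. energy consumption in the early 1990s.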

  PARTICLES FROM THE VOID

  Another way of thinking about these things is to imagine that all space, even empty space, is awash with particles, all that nature in her infinite wisdom can provide. This is not a metaphor. One of the implications of quantum theory is that these particles do in fact pop in and out of existence in the void. The particles, in all sizes and shapes, are all temporary. They are created and then quickly disappear—a bazaar of seething activity. As long as they occur in empty space, vacuum, nothing really happens. This is quantum spookiness, but perhaps it can help to explain what happens in a collision. Here a pair of charmed quarks (a certain kind of quark and its antiquark) appears and disappears; there a bottom quark and its anti-bottom mate. And wait, over there, what's that? Well, whatever: an X and an anti-X appear, something we have no knowledge of in 1993.

  There are rules in this chaotic madness. The quantum numbers must add to zero, the zero of the void. Another rule: the heavier the objects, the less frequent their evanescent appearance. They "borrow" energy from the void to appear for the minutest fraction of a second, then disappear because they must pay it back in a time specified by Heisenberg's uncertainty relations. Now here is the key: if energy can be provided from the outside, then the transient virtual appearance of these vacuum-originated particles can be converted to real existence, existence that can be detected by bubble chambers or counters. How provided? Well, if an energetic particle, fresh out of the accelerator and shopping for new particles, can afford to pay the price—that is, at least the rest mass of the pair of quarks or X's—then the vacuum is reimbursed, and we say that our accelerated particle has created a quark-antiquark pair. Obviously, the heavier the particles we want to create, the more energy we need from the machine. In Chapters 7 and 8 you'll meet many new particles that came into being in just such a fashion. Incidentally, this quantum fantasy of an all-pervading vacuum filled with "virtual particles" has other experimental implications, modifying the mass and magnetism of electrons and muons, for example. We'll explain further when we get to the "g minus 2" experiment.
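The "borrowing" rule above can be sketched with the uncertainty relation: a virtual pair that borrows energy ΔE can exist only for a time of order Δt ≈ ħ/ΔE. The quark masses below are illustrative round numbers chosen by the editor, not values from the text:

```python
# Sketch of the borrowed-energy bookkeeping: by Heisenberg's uncertainty
# relation, a virtual pair borrowing energy dE from the void must repay
# it within a time of order dt ~ hbar / dE.
HBAR_GEV_S = 6.582e-25  # reduced Planck constant, in GeV * seconds

def flicker_time_s(borrowed_energy_gev):
    """Rough lifetime of a virtual pair that borrowed the given energy."""
    return HBAR_GEV_S / borrowed_energy_gev

# the heavier the pair, the briefer its flicker in the vacuum:
print(flicker_time_s(2 * 1.3))  # charm/anticharm pair, ~1.3 GeV each
print(flicker_time_s(2 * 4.7))  # bottom/antibottom pair, ~4.7 GeV each
```

A charm pair's flicker lasts on the order of 10⁻²⁵ seconds, and a heavier bottom pair's is shorter still, which is exactly the rule in the text: the heavier the objects, the less frequent and more fleeting their evanescent appearance.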

  THE RACE

  Beginning in the Rutherford era, the race was on to make devices that could reach very high energies. The effort was helped along in the 1920s by the electric utility companies, because electrical power is transmitted most efficiently when the voltage is high. Another motivation was the creation of energetic x-rays for cancer therapy. Radium was already being used to destroy tumors, but it was enormously expensive and higher energy radiation was thought to be a great advantage. Thus the electric utilities and medical research institutes supported the development of high voltage generators. Rutherford characteristically took the lead when he issued a challenge to England's Metropolitan-Vickers Electrical Company to "give us a potential on the order of ten million volts which can be accommodated in a reasonably sized room ... and an evacuated tube capable of withstanding this voltage."

  German physicists tried to harness the huge voltage of Alpine lightning storms. They hung an insulated cable between two mountain peaks, siphoning off charges as high as 15 million volts and inducing huge sparks that jumped 18 feet between two metal spheres—spectacular, but not too useful. This approach was abandoned when a scientist was killed while adjusting the apparatus.

  The failure of the German team illustrated that one needed more than power. The terminals of the gap had to be housed in a beam tube or vacuum chamber that was a very good insulator. (High voltages love to arc across insulators unless the design is very precise.) The tube also had to be strong enough to withstand having its air pumped out. A high-quality vacuum was essential; if there were too many residual molecules floating around inside the tube they would interfere with the beam. And the high voltage had to be steady enough to accelerate lots of particles. These and other technical problems were worked on from 1926 to 1933 before they were solved.

  Competition was intense throughout Europe, and American institutions and scientists joined the fray. An impulse generator built by the Allgemeine Elektrizitäts-Gesellschaft in Berlin reached 2.4 million volts but produced no particles. The idea was transported to General Electric in Schenectady, which improved the energy to 6 million volts. At the Carnegie Institution in Washington, D.C., physicist Merle Tuve drove an induction coil to several million volts in 1928 but didn't have an appropriate beam tube. Charles Lauritsen at Cal Tech succeeded in building a vacuum tube that would hold 750,000 volts. Tuve adopted Lauritsen's tube and produced a beam of 10¹³ (10 trillion) protons per second at 500,000 volts, theoretically enough particles and energy to probe the nucleus. Tuve did in fact achieve nuclear collisions, but not until 1933, by which time two other efforts had beaten him to the punch.

  Another runner-up was Robert Van de Graaff, of Yale and then MIT, who built a machine that carried electric charge along an endless silk belt up to a large metal sphere, gradually increasing the voltage of the sphere until, at a few million volts, he drew a tremendous arc to the wall of the building. This was the now famous Van de Graaff generator familiar to high school physics students across the land. Enlarging the radius of the sphere postponed the discharge. Encasing the entire sphere in dry nitrogen gas helped increase the voltage. Ultimately, Van de Graaff generators would be the machines of choice in the under-10-million-volt category, but it took years to perfect the idea.

  The race continued through the late 1920s and early '30s. It was a couple of Rutherford's Cavendish gang, John Cockcroft and Ernest Walton, who won, though by a whisker. And (here I have to groan) they were given invaluable help by a theorist. Cockcroft and Walton, after numerous failures, were attempting to reach the one million volts that was perceived to be necessary to probe the nucleus. A Russian theorist, George Gamow, had been visiting Niels Bohr in Copenhagen and decided to hop over to Cambridge before heading home. There he got into an argument with Cockcroft and Walton, telling the experimenters that they didn't need all the voltage they were playing with. He argued that the new quantum theory permitted successful nuclear penetrations even if the energy was not high enough to overcome the electrical repulsion of the nucleus. He explained that the quantum theory gave the protons wave properties, which can tunnel through the nuclear charge "barrier," as we discussed in Chapter 5. Cockcroft and Walton finally took note and redesigned their device for 500,000 volts. Using a transformer and a voltage multiplier circuit, they accelerated protons obtained from a discharge tube of the type that J. J. Thomson used to generate cathode rays.

  In Cockcroft and Walton's machine, bursts of protons, about a trillion per second, accelerated down the evacuated tube and smashed into targets of lead, lithium, and beryllium. The year was 1932, and nuclear reactions had finally been induced by accelerated particles. Lithium was disintegrated by protons of only 400,000 eV, far below the millions of electron volts that had been thought necessary. It was a historic event. A new style of "knife" was now available, although still in its most primitive form.

  A MOVER AND SHAKER IN CALIFORNIA

  The action now switches to Berkeley, California, where Ernest Orlando Lawrence, a native of South Dakota, had arrived in 1928 after a brilliant beginning in physics research at Yale. E. O. Lawrence invented a radically different technique of accelerating particles in a machine called a cyclotron, for which he was awarded the Nobel Prize in 1939. Lawrence was familiar with the clumsy electrostatic machines, with their huge voltages and frustrating electrical breakdowns, and he figured there had to be a better way. Searching through the literature for ways to achieve high energy without high voltages, he came across the papers of a Norwegian engineer, Rolf Wideröe. Wideröe noted that one could double the energy of a particle without doubling the voltage by passing it through two gaps in a row. Wideröe's idea is the basis for what is now called the linear accelerator. One gap is positioned after another down a line, the particles picking up energy at each gap.

 
