The God Particle: If the Universe Is the Answer, What Is the Question?

by Leon Lederman


  In 1973–74, the Stanford electron-positron (e− e+) collider called SPEAR began taking data and ran into an inexplicable result. It appeared that the fraction of collisions yielding hadrons was higher than theoretical estimates. The story is complicated and not too interesting until October of 1974. The SLAC physicists, led by Burton Richter, who, in the hallowed tradition of group leaders, was away at the time, began to close in on some curious effects that appeared when the sum of the energies of the two colliding particles was near 3.0 GeV, a suggestive mass, as you may recall.

  What added salsa to the affair was that three thousand miles east at Brookhaven, a group from MIT was repeating our 1967 dimuon experiment. Samuel C. C. Ting was in charge. Ting, who is rumored to have been the leader of all the Boy Scouts in Taiwan, got his Ph.D. at Michigan, did a postdoc term at CERN, and in the early sixties joined my group as assistant professor at Columbia, where his rough edges were sharpened.

  A meticulous, driven, precise, organized experimenter, Ting worked with me at Columbia for a few years, had several good years at the DESY lab near Hamburg, Germany, and then went to MIT as a professor. He quickly became a force (the fifth? sixth?) to be reckoned with in particle physics. My letter of recommendation deliberately played up some of his weak points—a standard ploy in getting someone hired—but I did it in order to conclude: "Ting—a hot and sour Chinese physicist." In truth, I had a hang-up about Ting, which dates back to the fact that my father operated a small laundry, and as a child I listened to many stories about the Chinese competition across the street. Since then, any Chinese physicist has made me nervous.

  When Ting worked with the electron machine in the DESY lab, he became an expert in analyzing e+ e− pairs from electron collisions, so he decided that detecting electron pairs was the better way to do the Drell-Yan, oops, I mean the Ting dilepton experiment. So here he was in 1974 at Brookhaven, and, unlike his counterparts at SLAC who were colliding electrons and positrons, Ting was using high-energy protons, directing them into a stationary target, and looking at the e+ e− pairs that came out of the black box with the latest word in instrumentation—a vastly more precise detector than the crude instrument we had put together seven years earlier. Using Charpak wire chambers, he was able to determine precisely the mass of the messenger photon or whatever else would give rise to the observed electron-positron pair. Since muons and electrons are both leptons, which pair you choose to detect is a matter of taste. Ting was bump hunting, fishing for some new phenomenon rather than trying to verify some new hypothesis. "I am happy to eat Chinese dinners with theorists," Ting once reportedly said, "but to spend your life doing what they tell you is a waste of time." How appropriate that such a personality would be responsible for finding a quark named charm.

  The Brookhaven and SLAC experiments were destined to make the same discovery, but until November 10, 1974, neither group knew much about the other's progress. Why are the two experiments connected? The SLAC experiment collides an electron against a positron, creating a virtual photon as the first step. The Brookhaven experiment has an unholy, complicated mishmash of an initial state, but it looks at virtual photons only if and when they emerge and dissolve into an e+ e− pair. Both deal then with the messenger photon, which can have any transitory mass/energy; it depends on the force of the collision. The well-tested model of what goes on in the SLAC collision says a messenger photon is created that can dissolve into hadrons—three pions, say, or a pion and two kaons, or a proton, antiproton, and two pions, or a pair of muons or electrons, and so on. There are many possibilities, consistent with the input energy, momentum, spin, and other factors.

  So if something new exists whose mass is less than the sum of the two colliding beam energies, it also can be made in the collision. Indeed, if the new "thing" has the same popular quantum numbers as the photon, it can dominate the reaction when the sum of the two energies is precisely equal to the new thing's mass. I've been told that just the right pitch and force in a tenor's voice can shatter a glass. New particles come into being in a similar fashion.

  In the Brookhaven version the accelerator sends protons into a fixed target, in this case a small piece of beryllium. When the relatively large protons hit the relatively large beryllium nuclei, all kinds of things can and do happen. A quark hits a quark. A quark hits an antiquark. A quark hits a gluon. A gluon hits a gluon. No matter what the energy of the accelerator, collisions of much lower energies occur, because the quark constituents share the total energy of the proton. Thus, the lepton pairs that Ting measured in order to interpret his experiment came out of the machine more or less randomly. The advantage of such a complex initial state is that you have some probability of producing everything that can be reached at that energy. So much is going on when two garbage cans collide. The disadvantage is that you have to find the new "thing" among a big pile of debris. To prove the existence of a new particle, you need many runs to get it to show up consistently. And you need a good detector. Fortunately, Ting had a beauty.

  SLAC's SPEAR machine was the opposite. It collided electrons with positrons. Simple. Pointlike particles, matter and antimatter colliding, annihilating one another. The matter turns into pure light, a messenger photon. This packet of energy in turn coalesces back into matter. If each beam is, say, 1.5525 GeV, you get double that, a 3.105 GeV collision, every time. And if a particle exists at that mass, you can produce this new particle instead of a photon. You're almost forced to make the discovery; that's all the machine can do. The collisions it produces have a predetermined energy. To switch to another energy, the scientists have to reset the magnets and make other adjustments. The Stanford physicists could fine-tune the machine energy to a precision far beyond what had been designed into it, a remarkable technological accomplishment. Frankly, I didn't think it could be done. The disadvantage of a SPEAR-type machine is that you must scan the energy domain, very slowly, in extremely small steps. On the other hand, when you hit the right energy—or if you're tipped off somehow, and this was to become an issue—you can discover a new particle in a day or less.
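  To put rough numbers on that contrast (the 1.5525 GeV beam energy is the figure quoted above; the proton energy below is only an order-of-magnitude stand-in for a Brookhaven-class machine): in the head-on e+ e− collision, essentially all of the beam energy is available to create something new,

  E(available) = E(e+) + E(e−) = 1.5525 GeV + 1.5525 GeV = 3.105 GeV,

while in a fixed-target collision most of the beam energy is wasted on forward motion, leaving only about

  E(available) ≈ √(2 × E(beam) × m(proton)c²) ≈ √(2 × 30 GeV × 0.94 GeV) ≈ 7.5 GeV,

and even that is shared among the quark and gluon constituents, which is why the lepton pairs at Brookhaven came out spread over a whole range of masses rather than at one sharp value.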

  Let's return for a moment to Brookhaven. In 1967–68, when we observed the curious dimuon shoulder, our data went from 1 GeV to 6 GeV, and the number of muon pairs at 6 GeV was only one millionth of what it was at 1 GeV. At 3 GeV there was an abrupt leveling of the yield of muon pairs, and above approximately 3.5 GeV the plunge resumed. In other words, there was this plateau, this shoulder from 3 to 3.5 GeV. In 1969, when we were getting ready to publish our data, we seven authors argued about how to describe the shoulder. Was it a new particle whose effect was smeared out by the highly distorting detector? Was it a new process that produced messenger photons with a different yield? No one knew, in 1969, how the muon pairs were produced. I decided that the data were not good enough to claim a discovery.

  Well, in a dramatic confrontation on November 11, 1974, it turned out that the SLAC and Brookhaven groups each had clear data on an enhancement at 3.105 GeV. At SLAC, when the machine was tuned to that energy (no mean feat!), the counters recording collisions went mad, increasing by a hundredfold and dropping back to the base value when the accelerator was tuned to 3.100 or 3.120. The sharpness of the resonance was the reason it had taken so long to find; the group had gone over that territory before and had missed the enhancement. In Ting's Brookhaven data, the outgoing pairs of leptons, precisely measured, showed a sharp bump centered near 3.10 GeV. He, too, concluded that the bump could mean only one thing: he had discovered a new state of matter.

  The problem of scientific priority in the Brookhaven/SLAC discovery was a very thorny controversy. Who did it first? Accusations and rumors flew. One charge was that the SLAC scientists, aware of Ting's preliminary results, knew where to look. The countercharge was that Ting's initial bump was inconclusive and was massaged in the hours between SLAC's discovery and Ting's announcement. The SLAC people named the new object ψ (psi). Ting named it J. Today it is generally called the J/ψ or J/psi. Love and harmony have been restored in the community. More or less.

  WHY THE FUSS? (AND SOME SOUR GRAPES)

  All very interesting, but why the tremendous fuss? Word of the November 11 joint announcement spread instantly around the world. One CERN scientist recalled: "It was indescribable. Everybody in the corridors was talking about it." The Sunday New York Times put the discovery on its front page: NEW AND SURPRISING TYPE OF ATOMIC PARTICLE FOUND. Science: TWO NEW PARTICLES DELIGHT AND PUZZLE PHYSICISTS. And the dean of science writers, Walter Sullivan, wrote later in the New York Times: "Hardly, if ever, has physics been in such an uproar ... and the end is not in sight." A brief two years later, Ting and Richter shared the 1976 Nobel Prize for the J/psi.

  The news came to me, hard at work on a Fermilab experiment with the exotic designation E-70. Can I now, writing in my study seventeen years later, recall my feelings? As a scientist, as a particle physicist, I was overjoyed at the breakthrough, a joy tinged, of course, with envy and even just a touch of murderous hatred for the discoverers. That's the normal reaction. But I had been there—Ting was doing my experiment! True, the kinds of chambers that made Ting's experiment sharp weren't available in 1967–68. Still, the old Brookhaven experiment had the ingredients of two Nobel Prizes—if we had had a more capable detector and if Bjorken had been at Columbia and if we had been slightly more intelligent ... And if my grandmother had had wheels—as we used to taunt "iffers"—she would have been a trolley car.

  Well, I can only blame myself. After spotting the mysterious bump in 1967, I had decided to pursue the physics of dileptons at the newer high-energy machines coming on the air. CERN, in 1971, was scheduled to inaugurate a proton-proton collider, the ISR, with an effective energy twenty times that of Brookhaven's. Abandoning my Brookhaven bird in hand, I submitted a proposal to CERN. When that experiment started taking data in 1972, I again failed to see the J/psi, this time because of a fierce background of unexpected pions and our newfangled lead-glass particle detector, which was, unknown to us, being irradiated by the new machine. The background turned out to be a discovery in itself: we detected high-transverse-momentum hadrons, another kind of data signifying the quark structure inside protons.

  Meanwhile, also in 1971, Fermilab was getting ready to start a 200 GeV machine. I gambled on this new machine too. The Fermilab experiment turned on in early 1973, and my excuse was ... well, we really didn't get down to doing what we had proposed to do, being diverted by curious data several groups had been seeing in the brand-new Fermilab environment. It turned out to be a red herring or a blue shrimp, and by the time we got around to dileptons, the November Revolution was in the history books. So not only did I miss the J at Brookhaven, I missed it at both new machines, a new record of malpractice in particle physics.

  I haven't yet answered the question, what was the big deal? The J/psi was a hadron. But we have discovered hundreds of hadrons, so why blow a gasket over one more, even if it has a fancy name like J/psi? It has to do with its high mass, three times heavier than the proton, and the "sharpness" of the mass, less than 0.05 MeV.

  Sharpness? What that means is the following. An unstable particle cannot have a unique, well-defined mass. The Heisenberg uncertainty relations spell it out. The shorter the lifetime, the wider the distribution of masses. It is a quantum connection. What we mean by a distribution of masses is that a series of measurements will yield different masses, distributed in a bell-shaped probability curve. The peak of this curve, for example 3.105 GeV, is called the mass of the particle, but the spread in mass values is in fact a measurement of the particle's lifetime. Since uncertainty is reflected in measurement, we can understand this by noting that for a stable particle, we have infinite time to measure the mass and therefore the spread is infinitely narrow. A very short-lived particle's mass cannot be determined precisely (even in principle), and the experimental result, even with superfine apparatus, is a broad spread in the mass measurements. As an example, a typical strong-interaction particle decays in 10⁻²³ seconds and has a mass spread of about 100 MeV.

  One more reminder. We noted that all hadron particles are unstable except the free proton. The higher the mass of a hadron (or any particle), the shorter its lifetime because it has more things into which it can decay. So now we find a J/psi with a huge mass (in 1974 it was the heaviest particle yet found), but the shock is that the observed mass distribution is exceedingly sharp, more than a thousand times narrower than that of a typical strong-interaction particle. Thus it has a long lifetime. Something is preventing it from decaying.
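  In round numbers, the uncertainty relation makes that comparison concrete. Using the standard value ħ ≈ 6.6 × 10⁻²² MeV·s, the mass spread and the lifetime are tied together by

  (mass spread) × (lifetime) ≈ ħ, so lifetime ≈ ħ / (mass spread).

A typical hadron with a spread of about 100 MeV therefore lives roughly 6.6 × 10⁻²² / 100 ≈ 10⁻²³ seconds, while the J/psi's spread of less than 0.05 MeV implies a lifetime of at least 6.6 × 10⁻²² / 0.05 ≈ 10⁻²⁰ seconds, a thousand or so times longer.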

  NAKED CHARM

  What inhibits its decay?

  Theorists all raise their hands: a new quantum number or, equivalently, a new conservation law is operating. What kind of conservation? What new thing is being conserved? Ah, now all the answers were different, for a time.

  Data continued to pour in, but now only from the e+ e− machines. SPEAR was eventually joined by a collider in Italy, ADONE, and later by DORIS, in Germany. Another bump showed at 3.7 GeV. Call it ψ′ (psi prime), no need to mention J, since this was Stanford's baby entirely. (Ting and company had gotten out of the game; their accelerator had been barely capable of discovering the particle and not capable of examining it further.) But despite feverish effort, attempts to explain the surprising sharpness of J/psi were at first stymied.

  Finally one speculation began to make sense. Maybe J/psi was the long-awaited bound "atom" of c and c̄, the charm quark and its antiquark. In other words, perhaps it was a meson, that subclass of hadron consisting of quark and antiquark. Glashow, exulting, called J/psi "charmonium." As it turned out, this theory was correct, but it took another two years for the speculation to be verified. The reason for the difficulty is that when c and c̄ are combined, the intrinsic properties of charm are wiped out. What c brings, c̄ cancels. While all mesons consist of quark and antiquark, they don't have to consist of a quark with its own particular antiquark, as does charmonium. A pion, for example, is ud̄.

  The search was on for "naked charm," a meson that was a charm quark tethered with, say, an antidown quark. The antidown quark wouldn't cancel the charm qualities of its partner, and charm would be exposed in all its naked glory, the next best thing to what is impossible: a free charm quark. Such a meson, a cd̄, was found in 1976 at the Stanford e+ e− collider by a SLAC-Berkeley group led by Gerson Goldhaber. The meson was named D0 (D zero), and studies of D's were to occupy the electron machines for the next fifteen years. Today, mesons like cd̄, cs̄, and c̄d are grist for the Ph.D. mill. A complex spectroscopy of states enriches our understanding of quark properties.

  Now the sharpness of J/psi was understood. Charm is a new quantum number, and the conservation laws of the strong force did not permit a c quark to change into a lower-mass quark. To do this, the weak and electromagnetic forces had to be invoked, and these are much slower to act—hence the long lifetime and narrow width.

  The last holdouts against the idea of quarks gave up about this time. The quark idea had led to a far-out prediction, and the prediction had been verified. Probably even Gell-Mann began to give quarks elements of reality, although the confinement problem—there can be no such thing as a free quark—still differentiates quarks from other matter particles. With charm, the periodic table now was balanced again:

  Quarks

  up (u) charm (c)

  down (d) strange (s)

  Leptons

  electron neutrino (νe) muon neutrino (νμ)

  electron (e) muon (μ)

  Now there were four quarks—that is, four flavors of quarks—and four leptons. We now spoke of two generations, arranged vertically in the above table. The u-d-νe-e is the first generation, and since the up and down quarks make protons and neutrons, the first generation dominates our present world. The second generation, c-s-νμ-μ, is seen in the intense but fleeting heat of accelerator collisions. We can't ignore these particles, exotic as they may seem. Intrepid explorers that we are, we must struggle to figure out what role nature had planned for them.

  I have not really given due attention to the theorists who anticipated and helped to establish the J/psi as charmonium. If SLAC was the experimental heart, Harvard was the theoretical brain. Glashow and his Bronx High School of Science classmate Steve Weinberg were aided by a gaggle of young whizzes; I'll mention only Helen Quinn because she was in the thick of the charmonium euphoria and is on my role-model team.

  THE THIRD GENERATION

  Let's pause and step away. It's always more difficult to describe recent events, especially when the describer is involved. There is not enough of the filter of time to be objective. But we'll give it a try anyway.

  Now it was the 1970s, and thanks to the tremendous magnification of the new accelerators and the matching ingenious detectors, progress toward finding the a-tom was very rapid. Experimenters were going in all directions, learning about the various charmed objects, examining the forces from a more microscopic point of view, poking at the energy frontier, addressing the outstanding problems of the minute. Then a brake on the pace of progress was applied as research funds became increasingly difficult to find. Vietnam, with its drain on the spirit and the treasury, as well as the oil shock and general malaise, resulted in a turning away from basic research. This hurt our colleagues in "small science" even more. High-energy physicists were in part protected by the pooling of efforts and sharing of facilities in large laboratories.

  Theorists, who work cheap (give them a pencil, some paper, and a faculty lounge), were thriving, stimulated by the cascade of data. We still saw the same pros: Lee, Yang, Feynman, Gell-Mann, Glashow, Weinberg, and Bjorken, but other names would soon appear: Martinus Veltman, Gerard 't Hooft, Abdus Salam, Jeffrey Goldstone, Peter Higgs, among others.

  Let's just quickly touch on the experimental highlights, thereby unfairly favoring the "bold salients into the unknown" over the "slow steady advance of the frontier." In 1975, Martin Perl, almost singlehandedly and while dueling, d'Artagnan-like, with his own colleague-collaborators, convinced them, and ultimately everyone, that a fifth lepton lurked in the SLAC data. Called tau (τ), it, like its lighter cousins the electron and the muon, comes in two signs: τ+ and τ−.

 
