The 4-Percent Universe
Karl van Bibber didn't have that luxury. But he did have an advantage over other dark-matter hunters: He'd know his prey if he saw it.
How do you see something that is dark, if by "dark" you mean, as astronomers beginning in the 1970s and 1980s did, "impossible to see"? How do you do something that is, by your own definition, impossible to do?
You don't. You rethink the question.
For thousands of years, astronomers had tried to apprehend the workings of the universe by looking at the lights in the sky. Then, starting with Galileo, they learned to look for more lights in the sky, those that they couldn't see with their eyes alone but that they could see through a telescope. By the middle of the twentieth century, they were expanding their understanding of "light," looking through telescopes that saw parts of the electromagnetic spectrum beyond the optical—radio waves, infrared radiation, x-rays, and so on. After the acceptance of evidence for dark matter, astronomers realized they would need to expand their understanding of "look." Now, if they wanted to apprehend the workings of the universe, they would have to learn to look in a broader sense of the word: to seek, somehow. To come into some manner of contact with. Otherwise, they could do only what ancient astronomers had been compelled to do, in the absence of instruments that extended one of the five senses: Save the appearances. Think. Theorize.
And theoretical was all that dark matter was. From the start, the evidence for it was indirect. We "knew" it was there because of how it affected stuff we could see. The obvious answer to what it was, was more of the same—more of the stuff we would be able to see, if only it weren't so distant, or so inherently dim, that it foiled our usual means of observation. Ockham's razor argued for a universe that consists of matter we already know—matter made from baryons—not matter we don't. Maybe, as Vera Rubin liked to joke, dark matter was "cold planets, dead stars, bricks, or baseball bats."
In 1986, Princeton's Bohdan Paczynski suggested that if these massive objects we couldn't see did exist in the halo of our own galaxy—where astronomers thought most of the Milky Way's dark matter resided—we could recognize their presence through a technique called gravitational lensing. In 1936, Einstein had suggested that a foreground star could serve as a lens of sorts on a background star. The gravitational mass of the foreground star would bend space, and with it the trajectory of the light from the background star, so that even though the background star was "behind" the foreground star from our line of sight, we would still be able to see it. "Of course," Einstein wrote in an article, "there is no hope of observing this phenomenon directly." To the editor of the journal he privately confided, regarding his paper, "It is of little value."
Einstein, however, was thinking small. He was still stuck in the universe in which he'd come of age. But the universe no longer consisted only of the stars in our galaxy; it was swimming in galaxies. A few months after Einstein published his brief paper on the subject, Fritz Zwicky pointed out that rather than a foreground star, a foreground galaxy could serve as a gravitational lens. And because a galaxy had the mass of billions of stars, "the probability that nebulae which act as gravitational lenses will be found becomes practically a certainty."
In 1979, that prediction came true when astronomers found two images of the same quasar thanks to the gravitational intervention of a galaxy. The advent of CCD technology and supercomputers, Paczynski realized, might allow astronomers to make gravitational-lens detections on the small scale that Einstein had described, then dismissed. Paczynski reasoned that if, from our line of sight, a dark object in the halo of our galaxy—a Massive Compact Halo Object, or MACHO—passed in front of a star in a neighboring galaxy, the gravitational effect of the dark foreground object would cause the light from the background object to appear to brighten. In 1993, two teams reported that after monitoring the brightness of millions of stars in the Large Magellanic Cloud, they had likely observed three such events—an impressive exercise in astronomy, but not a rate of discovery that suggested a Milky Way halo teeming with dark and massive objects made of baryons.
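Paczynski's signal has a simple quantitative shape: the closer the dark object drifts to the line of sight, the brighter the background star appears, rising and falling in a smooth, symmetric curve over weeks or months. Here is a minimal sketch of that light curve in Python, using the standard point-lens magnification formula; the parameter values are purely illustrative.

```python
import numpy as np

def magnification(u):
    """Point-source, point-lens magnification for a lens-source
    separation u, measured in units of the Einstein radius."""
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def light_curve(t, t0, u0, tE):
    """Apparent brightening of a background star as a dark lens drifts
    past the line of sight. t0: time of closest approach; u0: minimum
    separation in Einstein radii; tE: Einstein-radius crossing time."""
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return magnification(u)

# A lens passing at 0.3 Einstein radii brightens the star about 3.4x
# at peak, then lets it fade symmetrically back to normal.
t = np.linspace(-40, 40, 81)  # days relative to closest approach
print(light_curve(t, t0=0.0, u0=0.3, tE=20.0).max())
```

The 1993 teams were searching millions of stars for exactly this kind of one-off, symmetric rise and fall.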
Then again, maybe the problem wasn't some unobservable matter but the observable effect—gravity. In 1981, Mordehai Milgrom, of the Weizmann Institute in Rehovot, Israel, arrived at Modified Newtonian Dynamics, or MOND—a mathematical formula that he claimed described the rotation curves of galaxies just as well as, and probably better than, the presence of some sort of mystery matter. It did not, however, describe galaxy clusters very well.
But even if it had, physicists had already recognized a seemingly less obvious yet, somewhat paradoxically, more persuasive solution to the dark-matter problem than either the stuff we know or modified gravity: stuff we don't know.
As part of his inner space/outer space research, David Schramm and his students had discovered that deuterium (an isotope of hydrogen that has one neutron in the nucleus instead of none) could only be destroyed in stars, never created (as other elements could be). Therefore, all the deuterium in the universe today must have been present in the earliest universe, and you could conclude that the primordial amount was at least the present amount. Through further calculations you could figure out how dense with baryons the early universe could have been in order for that minimum amount of deuterium to have survived that primordial period. The denser the baryonic matter, the steeper the drop in the deuterium survival rate. In order for at least this much deuterium to have existed in the early universe, the density of baryonic matter must have been at most a certain amount. This analysis therefore revealed an upper limit on the density of baryonic matter. (Schramm and Turner came to call deuterium a "baryometer.")
By similar reasoning and calculations, you could arrive at a lower limit for baryonic matter. Helium-3 (two protons plus a neutron) could only be created in stars, never destroyed, so you could conclude that the primordial amount was at most the present amount. Then you could calculate how dense with baryons the early universe must have been in order to have produced no more than that maximum amount of helium-3, and from that amount you could arrive at a lower limit on the density of baryonic matter.
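The logic of these two limits runs the same way in both directions: the primordial yield of each isotope falls steadily as the baryon density rises, so a measured floor on deuterium caps the density from above, while a measured ceiling on helium-3 props it up from below. A toy sketch of that bracketing in Python; the power-law yield curves and the abundance values are stand-ins for illustration, not real big-bang-nucleosynthesis fits.

```python
import numpy as np

# Toy stand-ins for the BBN yield curves: both isotopes' primordial
# abundances fall as the baryon-to-photon ratio eta rises.
eta = np.logspace(-10.5, -8.5, 200)
d_yield = 1e-4 * (eta / 1e-10) ** -1.6    # deuterium/hydrogen
he3_yield = 2e-5 * (eta / 1e-10) ** -0.6  # helium-3/hydrogen

d_floor = 1e-5        # present D/H: a floor on the primordial value
he3_ceiling = 1.5e-5  # present He-3/H: a ceiling on the primordial value

# Invert the monotonic curves: the deuterium floor yields a maximum
# density, the helium-3 ceiling a minimum density.
eta_max = eta[d_yield >= d_floor].max()
eta_min = eta[he3_yield <= he3_ceiling].min()
print(f"baryon density bracketed: {eta_min:.2e} < eta < {eta_max:.2e}")
```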
By using particle physics to set upper and lower limits on the density of baryonic matter in the universe, Schramm and others converged on an omega for baryonic matter of about 0.1.
That amount, however, said nothing about non-baryonic matter.* Soon observations "weighing" the universe on different scales began converging on a number of their own—an omega in the 0.2 range, and perhaps higher. That disparity alone—0.1 baryonic matter versus 0.2 total matter—provided evidence for the existence of more than black holes and baseball bats in the halos of galaxies or suffusing galaxy clusters. The universe needed non-baryonic matter. And in a Big Bang model, such matter could come from only one place—the same place as the protons and neutrons and photons and everything else in the universe: the primordial plasma.
Even if particle physicists didn't know what these particles were, they knew that, like all the other particles that had been streaming through the universe since its first second, they had to be either fast or slow. Particles that were very light and moved at velocities approaching the speed of light—relativistic velocities—were called hot dark matter. Particles that were heavier and therefore more sluggish, attaching themselves to galaxies and moving at the same pace as the stars and gas, were called cold dark matter. And those two interpretations came with a crucial test.
In the early 1980s, astronomers hadn't yet detected the primordial ripples in the background radiation that would have corresponded to the so-called seeds of creation—the gravitational gathering grounds that would become the structures we see in the current universe. Even so, theorists knew that if those ripples did exist, then the two models of dark matter—hot and cold—would have affected them in different ways, leading to two opposite evolutionary scenarios for the universe.
Hot dark matter—particles moving at relativistic velocities—would have smeared the primordial ripples across large volumes, like a downpour on sidewalk chalk. In a universe full of matter gathering around those vast swaths, larger structures would have formed first. These vast gobs of matter would then have had to break up over time into the specks we see today—galaxies. The universe would have had a top-down, complex-to-simple history.
Cold dark matter—particles moving at a small fraction of the speed of light—would have sprinkled the primordial ripples much more subtly and affected the evolution much more slowly. Structure in that universe would have started as specks, or galaxies, and worked its way up to larger and larger structures. The universe would have had a bottom-up, simple-to-complex history. The observations in the early 1980s indicating that the Milky Way is part of a Local Supercluster, or that superclusters are separated by great voids, provided enough support for the cold-dark-matter model that most theorists abandoned the hot-dark-matter model by the middle of the decade. Then astronomers began using redshift surveys to map the universe in three dimensions, beginning in the late 1980s with the dramatic Harvard-Smithsonian Center for Astrophysics sighting of a "Great Wall" of galaxies. From 1997 to 2002, the Two-degree-Field Galaxy Redshift Survey, using the 3.9-meter Anglo-Australian Telescope, mapped 221,000 galaxies; beginning in 2000, the Sloan Digital Sky Survey, operating on the 2.5-meter telescope at the Apache Point Observatory in New Mexico, mapped 900,000 galaxies.
In those surveys and others, astronomers found that the farther across the universe they looked—and therefore the farther back in time—the less complexity they saw. Which is another way of saying that the closer they got to the present, the more complexity they saw. Galaxies formed first, at redshifts of 2 to 4—roughly ten to twelve billion years ago. Then those galaxies gathered into clusters, at redshifts of less than 1—within roughly the last eight billion years. And now, today (in a cosmic sense), those clusters are gathering into superclusters. Matter clumped first in small structures, and those small structures continued to gather together. The universe has apparently had a bottom-up, simple-to-complex history, consistent with theoretical cold-dark-matter models.
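Those redshift-to-time conversions are easy to reproduce with a modern cosmology library. A quick check using astropy; note that the Planck18 parameter set postdates the surveys described here, so the numbers come out slightly different from what astronomers would have quoted at the time.

```python
from astropy.cosmology import Planck18 as cosmo

# Lookback time: how long ago the light we see at redshift z set out.
for z in (4, 2, 1):
    print(f"z = {z}: {cosmo.lookback_time(z):.1f} ago")
```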
Still, what those surveys mapped were sources of light. They showed where the galaxies were, leaving scientists to infer where the dark matter was. In 2006, the Cosmic Evolution Survey, or COSMOS, released a map of the dark matter itself. The survey studied 575 Hubble Space Telescope images of instances in which two galaxies or clusters of galaxies lined up one behind the other. Like the microlensing technique that the MACHO surveys had used, weak gravitational lensing relied on a foreground concentration of mass to distort the light from a more distant source. Unlike microlensing, however, weak gravitational lensing recorded not individual events, as objects passed in front of other objects, but ongoing relationships between objects that were, for all practical purposes, stationary relative to each other—galaxies or clusters of galaxies. The light from a foreground object told astronomers how much mass appeared to be there. The gravitational-lensing effect on the background object told them how much foreground mass was there. The difference between the two amounts was the dark matter.
The COSMOS map not only covered an area of the sky nine times the diameter of the full moon, but was three-dimensional; it showed depth. It was like the difference between a map that shows only roads and a map that also shows the hills and valleys that the roads traverse. And because looking deeper into space means looking back in time, the COSMOS map showed how those hills and valleys got there—how the dark matter evolved. According to this "cosmopaleontology," as the team called this approach, the dark matter collapsed upon itself first, and then those centers of collapse grew into galaxies and clusters of galaxies—again, an image consistent with the bottom-up, cold-dark-matter formulation.
Perhaps the most dramatic, and certainly the most famous, indirect evidence for the existence of dark matter was a 2006 photograph of a collision of two galaxy clusters, collectively known as the Bullet Cluster. By observing the collision in x-rays and through gravitational lensing, Douglas Clowe, then at the University of Arizona, separated visible gas from invisible mass. The visible (in x-ray) gas from both clusters pooled in the center of the collision, where the atoms had behaved the way atoms behave—attracting one another and gathering gravitationally. Meanwhile, the invisible mass (detectable through gravitational lensing) appeared to be emerging on either side of the collision. It was as if dark-matter boxcars from both clusters had raced, ghostlike, right through the cosmic train wreck.
The photograph appeared around the world, and the Bullet Cluster became synonymous with dark matter. The false color helped: NASA assigned the visible gas pinkish red and the invisible mass blue. The headline on the press release also helped: "NASA Finds Direct Proof of Dark Matter."
But that wasn't quite true. Even leaving aside the dubious use of the word "proof," the "direct" was subject to debate—and had been closely parsed during the writing of the press release. The problem was that astronomers had been saying for a generation that dark matter dominated baryonic matter in the universe. Now they were saying it again, only with a more vivid picture. "It's not 'direct,'" Clowe conceded. "A true dark-matter direct detection would be catching a particle."
So how could you catch one? How could you capture the evidence that, as Mike Turner liked to say, "you could put in a bottle and bring to the aunt from Missouri who's saying, 'Show me'"? First, you would have to know what to look for—or "look" for.
By the late 1970s, theorists had finished fashioning the standard model of particle physics, an explanation of the relationships among three of the four fundamental forces in the universe—electromagnetism, the weak interaction (or weak nuclear force), and the strong interaction (or strong nuclear force). The particles themselves came in two types, bosons and fermions—those that, respectively, can and cannot occupy the same quantum space. Some theorists proposed a "supersymmetry" between bosons and fermions; each boson would have a fermion partner, and vice versa. The photon, for instance, got a photino superpartner, the gauge boson a gaugino, the gluon a gluino. And the neutrino got a neutralino.
The neutralino—even before the axion or MACHO—turned out to be an attractive candidate for dark matter. Theorists' calculations predicted how many of these neutralinos would have survived into the present universe, and they predicted the mass of the neutralino, and when they multiplied those two numbers together, the answer was nearly identical to the best estimates of the amount of dark matter. Aesthetically, physicists liked that the neutralino wasn't ad hoc; nobody invented it to solve the problem of dark matter. The neutralino would just be there, and its connection to dark matter was a bonus.
The trouble with the neutralino, from an observer's perspective, was that it interacted only through the weak force. Hence the name that Mike Turner bestowed on this class of dark-matter candidates: Weakly Interacting Massive Particle, or WIMP.* A WIMP wouldn't interact through electromagnetism, meaning that we couldn't see it in any wavelength. It also wouldn't interact through the strong nuclear force, meaning that it would rarely interact with atomic nuclei. The key word, though, is "rarely."
The very occasional exception was the opening that dark-matter detectives needed. It allowed them to take evidence that would be inaccessible to our senses and transform it into evidence that would be accessible. They still wouldn't be able to see the WIMPs themselves, but they would theoretically be able to see two aftereffects of a WIMP-nucleus interaction. One would be a minuscule amount of heat from the agitated nucleus. The other would be an electric charge from loosened electrons. Neither of those aftereffects in itself would be enough to identify a neutralino. But the combination of the two in a single event would be a signature unique to the particle.
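That two-channel signature suggests a simple discriminator: for a given amount of heat, a recoiling nucleus frees fewer electrons than a recoiling electron does, so the ratio of charge to heat separates WIMP candidates from ordinary radioactive background. A hypothetical sketch of such a cut in Python; the Event fields and the band edges are illustrative, not the real CDMS calibration.

```python
from dataclasses import dataclass

@dataclass
class Event:
    phonon_keV: float      # heat deposited in the crystal
    ionization_keV: float  # charge collected from loosened electrons

def looks_like_wimp(e: Event, lo: float = 0.1, hi: float = 0.5) -> bool:
    """A nuclear recoil (WIMP-like) deposits plenty of heat but
    relatively little charge, so its ionization-to-heat ratio falls
    below that of the electron recoils that dominate backgrounds."""
    if e.phonon_keV <= 0:
        return False
    y = e.ionization_keV / e.phonon_keV  # "ionization yield"
    return lo < y < hi

# An electron recoil (yield near 1) fails the cut;
# a nuclear recoil (yield near 0.3) passes it.
print(looks_like_wimp(Event(10.0, 9.5)))  # False
print(looks_like_wimp(Event(10.0, 3.0)))  # True
```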
To "look" for these effects, however, scientists would have to adopt another kind of "telescope," one that was new to astronomy: the laboratory.
One of the start-up programs at the Center for Particle Astrophysics in the late 1980s (along with the experiment that would become the Supernova Cosmology Project) was an effort at this kind of detection, the Cryogenic Dark Matter Search, or CDMS. In order to stabilize the target atoms—germanium, in this case—the detector had to maintain a temperature of 0.07 degrees Fahrenheit above absolute zero. And in order to block out cosmic rays and other offending ordinary particles, the detector had to be shielded.
Under the leadership of the Center for Particle Astrophysics director Bernard Sadoulet, the CDMS project began life in a shallow site on the Stanford campus, seventy feet below ground, or roughly the equivalent of several hard turns in a subterranean parking garage. The problem wasn't getting a ping—a reading that showed an interaction with the nucleus of a germanium atom. Pings it got. The depth was sufficient to block out most cosmic rays but not muons, which are like a heavy version of the electron. Muons penetrated the seventy feet of rock, hit the detector, and made neutrons, which leave a signal similar to the neutralino's but aren't, alas, neutralinos. The problem was getting the right kind of ping.
There was nowhere to go but down. In 2003, the successor detector, CDMS II, began operating under half a mile of rock in a former iron mine in northern Minnesota. By then CDMS had inspired a generation of similar detectors, though the high cost, large scale, and long data gestation for CDMS prompted researchers to consider cheaper, faster approaches. Many of the second-generation detectors relied on the noble gases argon, neon, and xenon, which don't need to be cooled to anywhere near absolute zero to turn into a usable liquid form, and which are far less expensive. In 2007 the XENON10 experiment, a 15-kilogram tank of liquid xenon operating in the underground laboratory at Gran Sasso, Italy, established itself as a viable rival with the release of results at a far more sensitive level than CDMS II had yet been able to reach.
Back in 1992, Sadoulet had told a journalist, "I may be bragging, but I think we're close." Sixteen years, numerous rotations of graduate students and postdocs, and two generations of detector later, a group of twelve CDMS team members gathered at his home to await a "blind" analysis of their data—a test of whether the latest research they had done would coalesce into a quantifiable result. According to their calculations, over the preceding year the CDMS II particle detector should have registered no more than one or two "hits" from stray subatomic particles of ordinary matter. The fewer hits they saw, the more confidently they could eliminate a segment of WIMP phase space—the graph that showed all reasonable combinations of particle mass and interaction strength. Like the settlement of a frontier by pioneers, the elimination of each swath of the graph left a narrower region to explore. At precisely midnight, they gathered around a computer in Sadoulet's living room, "unlocked" the data, and waited for the answer to bloom into view.
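"Eliminating a swath of the graph" is, at bottom, counting statistics: if a hypothesized WIMP mass and interaction strength would have produced more hits than the detector actually recorded, that combination is ruled out at some confidence level. A simplified sketch of the reasoning in Python, using plain Poisson counting and illustrative numbers; the real CDMS II analysis was considerably more elaborate.

```python
from scipy.stats import poisson

def excluded(n_observed: int, mu_signal: float,
             mu_background: float = 1.5, cl: float = 0.90) -> bool:
    """True if seeing n_observed or fewer events would be improbable
    (probability below 1 - cl) under the hypothesized signal rate
    plus the expected background."""
    p_n_or_fewer = poisson.cdf(n_observed, mu_signal + mu_background)
    return p_n_or_fewer < 1 - cl

# With zero hits observed, a WIMP that predicts ~5 signal events is
# ruled out at 90% confidence; a much weaker one survives for now.
print(excluded(0, mu_signal=5.0))  # True: that swath of the graph is gone
print(excluded(0, mu_signal=0.5))  # False: not excluded yet
```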