Still, from an astronomer's perspective, not until the Sun goes down and stays down—March through September, the austral autumn and winter, when temperatures drop to -100°F—does the South Pole get "benign."
For six months, telescopes at the Pole do nothing but swallow sky and funnel it north, and they do so while operating under impeccable conditions for astronomy. The atmosphere is thin; the Pole is more than 9,300 feet above sea level (of which the first 9,000 down are ice). The atmosphere is also stable, thanks to the absence of the heating and cooling effects of a rising and setting sun. And the area doesn't suffer whiteouts during the dark months; the Pole has the lowest peak wind velocity—55 miles per hour—of any weather station on Earth. But most important for the kind of astronomy performed by the South Pole Telescope, the air is exceptionally dry.
Technically, the Pole has a desert climate. Snowfall is rare. (The snow that's there is the result of millions of years of windblown accumulation from the periphery of the continent.) Chapped hands at the Pole can take weeks to heal, and perspiration isn't really an issue. The level of moisture is so low that if you took all the water vapor in the atmosphere at any one time during the coldest months and compressed it, the sheet would be less than a hundredth of an inch thick. For astronomers working at microwave or submillimeter wavelengths—cosmic microwave background researchers such as Holzapfel—the less water vapor the better. Even small amounts of atmospheric moisture can absorb submillimeter-wavelength signals, meaning that those CMB photons wouldn't even reach the telescope. That same water vapor can also emit its own submillimeter-wavelength signals, meaning that observers could find themselves mistaking humidity for history.
For Holzapfel, the SPT was only the latest in a series of CMB detectors at the Pole. When Holzapfel was a young grad student at Berkeley, in the late 1980s and early 1990s, he would see the standard computer simulations of the CMB and think, "Well, that's a nice story." The simulations would show the temperature that had to be there, if the Big Bang theory was correct. They would show the fluctuations in the radiation that had to be there, if inflation was correct. To Holzapfel, these simulations were targets—ideals that future data, when instruments were sensitive enough, could hope to approximate.
Then came COBE.
"Astonishing," thought Holzapfel. Forget about approximations. The match to the simulations was exact, at least within the (very) narrow margins of error. Not only did the CMB have a story, it was a story that Holzapfel himself wound up helping to tell. In a series of CMB surveys at the South Pole, the agreement between ideal simulations and actual data had become so fine that scientists couldn't adjust a variable without destroying the universe. Tweak the density of the dark matter in a CMB simulation even slightly, and it no longer matched the data. Leave the dark matter alone and tweak the dark energy instead, and again the simulation departed from the data. Do the same with the density of baryons, or the expansion rate of the universe; same thing happened.
In a way, the South Pole Telescope was one more CMB study. Like the dozens of other dark-energy experiments that had arisen in the first years of the twenty-first century, the SPT would be studying how the universe had changed over time—the evolution of its structure from those seeds in the CMB to what we observe today. This time, however, the story Holzapfel was hoping to tell wasn't only what the universe was, way back when, and how it got to be the way it is today. Instead, the SPT would be "seeing" into the future. In effect, the part of the story of the universe that Holzapfel would be helping to write was the end.
The fate of the universe? Again? Hadn't the Supernova Cosmology Project and the High-z team settled that question back in 1998?
Not exactly. The universe, as it happens, was not as simple as they thought.
Even while the SCP team's acceleration paper, "Measurements of Ω and Λ from 42 High-Redshift Supernovae"—submitted to the Astrophysical Journal in September 1998 and published the following June—was making its way through internal revisions and peer review, Saul Perlmutter was thinking about the next step in the supernova game. How could you get the greatest number of supernovae at the highest redshifts? The obvious answer: a space telescope. Hubble's field of view was too small for the kind of sky-grabbing such a project would require. And securing time on HST was always a dicey proposition. Better to have a satellite telescope of one's own. So, true to the Berkeley Lab tradition, Perlmutter, his colleagues, and the Department of Energy agreed that they should build one.
Over the next few years they drew up designs and, in a giant hangar in the hills above Berkeley, built cardboard models. And as technology improved, they rejiggered the designs and refashioned the models. They published glossy brochures and produced optimistic press releases. A couple of times they thought they might be one phone call away from final approval. But if the Supernova Acceleration Probe—SNAP—was going to literally get off the ground, it was going to have to do so with the help of NASA, and NASA was not in the habit of agreeing to a $600 million space mission on the say-so of the scientists who would most benefit from the launch.
In 2004, NASA convened a Science Definition Team to determine the viability of the project. An advocate for the DOE and an advocate for NASA would co-chair. For its own advocate, NASA called on Charles L. Bennett, a veteran of COBE and the principal investigator on the Wilkinson Microwave Anisotropy Probe, the satellite successor to COBE, named after former Dicke bird David Wilkinson, who died in 2002.
"But I don't know anything about dark energy!" Bennett protested.
Perfect, he was told. He would meet with dark-energy experts in complete ignorance and ask fundamental questions.
In the first meeting Bennett asked, "Is this going to be the last dark-energy mission? When you fly this mission, are you going to learn everything that you need to or can learn? Or is this just the first one, and then later you're going to have another?" Whether because, to someone who did know something about dark energy, the answer was obviously that you weren't going to learn everything about dark energy in one mission, or because, as Bennett suspected, no members of the SNAP team had ever asked themselves that question, there was silence. And it unnerved him.
In the end, he advised that NASA hold a "serious competition." Open it to the community. See what other approaches might be out there. "Space," he said, "is not the best place to try a crapshoot."
In 2005 Bennett left NASA for Johns Hopkins. There he found himself in the office next to Adam Riess. Bennett told Riess he thought that Perlmutter might have "froze in too early" on supernovae. The choice of method made sense in 1999, but, as he had seen in his role as co-chair for the Science Definition Team, other methods had arisen since then. "Let's erase the blackboard and start from scratch," he said to Riess. They invited other collaborators, and they called themselves ADEPT, for the Advanced Dark Energy Physics Telescope—an acronym suggestive of astronomy's fast-and-loose image of itself, and a name notably lacking in what method the satellite would be using, because the collaborators themselves didn't know.
But the competition was also indicative of how dark-energy astronomy was changing. It had entered a new era. Carlton Pennypacker had likened the early search for distant supernovae to The Treasure of the Sierra Madre, and although he hadn't survived the final reel, his gold dust had indeed led to a gold rush. Perlmutter and Riess were still around, working the mine, but they had plenty of competition—more and more prospectors bearing better and better equipment.
In the fall of 1999 the National Research Council had initiated a study of what science could accomplish "at the intersection of astronomy and physics"—a topic that vindicated David Schramm's vision a quarter of a century earlier. Mike Turner chaired the committee, and he dedicated the resulting study to Schramm: Connecting Quarks with the Cosmos: Eleven Science Questions for the New Century. The first question was "What Is Dark Matter?" But the second question was "What Is the Nature of Dark Energy?"
Its nature. Not what it is, but what it's like. What it does. How it behaves. Like dark-matter astronomers, dark-energy astronomers had to confront a paradoxical question: How do you see something you can't see? And like dark-matter astronomers, they had to expand their understanding of "seeing" until it could encompass some manner of "coming into contact with." In doing so, however, they couldn't content themselves with the possibility of one day trapping a neutralino and hearing the ping, or converting an axion into a photon. Dark energy wasn't going to be a particle. The goal wasn't to detect it but to define it.
In particular, astronomers wanted to know if it was truly a cosmological constant—unchanging over space and time—or quintessence—something that did change over space and time. If it was unchanging, then as the universe expanded and the density of matter decreased, dark energy's influence would become greater and greater, leading to faster and faster acceleration, and the universe would indeed devolve into a Big Chill. If it changed over space and time, then it would be some kind of dynamical field previously unknown to physics, so it could, as far as anyone knew, either accelerate or decelerate cosmic expansion in the distant future. "In a Universe with dark energy," Michael Turner wrote in 2001, "the connection between geometry and destiny is severed." "What Is the Nature of Dark Energy?" might have rated second to "What Is Dark Matter?" in the Connecting Quarks with the Cosmos survey of science questions—and would perhaps have ranked number one if the topic weren't still so new when the committee finalized the list in January 2002—but it was, the report said, "probably the most vexing."
That vexation led NASA, the NSF, and the DOE to commission a task force. This time it was Turner's old partner-in-Pizza, Rocky Kolb, who chaired the committee. The Dark Energy Task Force, which released the results of its deliberations in 2006, recommended four methods for investigating dark energy's nature.
One was the old standby, Type Ia supernovae. In the thirty years since Stirling Colgate devised his unsuccessful remote supernova search in the New Mexico desert, and Luis Alvarez challenged Rich Muller to see if he could succeed where Colgate had failed, astronomers had come down from the mountaintops. The two supernova teams from the 1990s had done most of their work in the control rooms of the telescopes—for instance, on Mauna Kea, a dormant volcano on Hawaii's Big Island, where astronomers would sit at the computers, muddy-headed from the thirteen-thousand-foot altitude. By the late 1990s, astronomers using those same telescopes were actually sitting in a control room at sea level, in an office building on a commercial strip in nearby Waimea. A few years later they were migrating to their own offices, back in Baltimore or Berkeley or Cambridge or La Serena or Paris. In the spring of 2007, the principal investigator of Berkeley Lab's Nearby Supernova Factory broke both his ankles in a fall at home and didn't miss a night of observing. He simply installed himself in a burgundy leather armchair in his living room, popped open his laptop, and monitored the University of Hawaii's 88-inch telescope, on Mauna Kea. Petting the dog with one hand while working the keyboard with the other, he spent his evenings studying lists of supernovae for spectroscopic follow-up, asking himself, "Which of these guys should I keep, and which should I throw away?"
So many supernovae, so little time.
Who wasn't working from a laptop in an airport? Who wasn't wireless? Who wasn't sending e-mail to the person next to him or her in bed? But if you had been around in the era when the detection of one distant supernova was a thrill without parallel, then having twelve or seventeen supernovae on your laptop was not something you took for granted.
Or having twelve or seventeen supernovae anywhere, laptop or not. Observing three nights a week for nine months of the year, the Nearby Supernova Factory was designed to discover about 150 or 200 supernovae annually, of which fifty or sixty would be Type Ia. And it wasn't the only factory out there. The Supernova Legacy Survey, a collaboration using the Canada-France-Hawaii Telescope on Mauna Kea, discovered 500 Type Ia during the decade. The Sloan Digital Sky Survey-II Supernova Survey discovered about another 500. The Center for Astrophysics Supernova Group—185. The Lick Observatory Supernova Survey—about 800. The Carnegie Supernova Project—around 100.
For some purposes, quantity was important. The premise of the Nearby Supernova Factory, for example, was that astronomers were never going to know the intrinsic brightness of a supernova. To perform the searches in the 1990s, they had invented methods to standardize supernovae. But in order to further refine their measurements, they needed a wealth of nearby supernovae in all sorts of varieties, in order to have a basis of comparison for whatever kinds of distant supernovae nature might throw their way.
Such as the distant supernovae that Adam Riess was pursuing with his own collaboration, now calling itself the Higher-z team. His showstopping presentation of SN 1997ff at the 2001 Dark Universe meeting offered persuasive evidence that at some point the universe had "turned over"—that the expansion went from decelerating to accelerating, from slowing down under the gravitational attraction of matter to speeding up under the countergravitational force of dark energy. Riess applied for more Hubble time to study supernovae, and in 2003 his team announced that they had determined when the universe turned over—about five billion years ago. In 2004 and 2006 his team produced evidence that even when dark energy was losing the tug of war with matter, as long as nine billion years ago, dark energy had nonetheless been present in the universe.
Another method the Dark Energy Task Force recommended was baryon acoustic oscillations, or BAO. In 1970 Jim Peebles had noted that in the creation of the CMB, the cosmological perturbations would have excited sound waves ("acoustic oscillations") that coursed through the primordial gas, creating peaks at intervals of 436,000 light-years. As the universe expanded, so did the spacing between these peaks; today they were 476 million light-years apart. And because galaxies tended to form on the peaks of these very large waves, astronomers could measure galaxy distribution at different eras, allowing them to see how the peak spacing changed over time—and thus how fast the universe had expanded over time. Whereas Type Ia supernovae behaved like standard candles, the spacing between the peaks acted like a standard ruler. But 476 million light-years was a lot of sky, even for cosmology. Astronomers needed enormous swaths of the whole universe just to lay the "ruler" on the map—technically impossible until 2005, when the Sloan Digital Sky Survey mapped the locations of 46,748 galaxies.
The third was weak lensing, the distortion of light from distant galaxies through the gravitational influence of foreground clusters of galaxies. Astronomers had been using this method to "weigh" dark matter by determining the shapes of millions of galaxies at various distances, which provided a direct probe of the mass of intervening clusters. After 1998, they began using weak lensing to measure the numbers of clusters over the evolution of the universe. That clustering rate depended on how fast the universe was expanding, and therefore the effects of dark energy, at different epochs.
The final approach—the one that Holzapfel was using at the South Pole Telescope—also used galaxy clusters. It was an effort to detect the Sunyaev-Zel'dovich (SZ) effect, named after the two Soviet physicists who predicted its existence in the 1960s. As a CMB photon made its journey from the primordial fireball to us, it might interact with the hot gas of a galaxy cluster, an encounter that would bump it up in energy—and out of the bandwidth the telescope was observing in. When that photon reached the SPT, landing on a microscopic thermometer at the heart of an ultra-cold (0.2 K) gold-plated spiderweb, it would appear as a hole in the CMB. There was something koan-like about the methodology: To see the unseeable, make the visible invisible.
"Very exciting!" Holzapfel said one afternoon, entering the Dark Sector lab that served as headquarters for the South Pole Telescope. Sitting at the controls was an incoming graduate student at Berkeley. She was knitting. "I can see the excitement is at a fever pitch," Holzapfel added.
She shrugged and said she would prefer to be handling Bakelite knobs and huge levers. But that wasn't how telescopes worked anymore. "I hit 'go' and wait twenty minutes for the script to run. At least this way"—she held up her knitting—"I get science and a sweater."
The United States established a year-round presence at the Pole in 1956, and the National Science Foundation's U.S. Antarctic Program long ago got everyday life there down to, well, a science. Not that the South Pole wasn't still the South Pole. The case of Jerri Nielsen, the doctor who, during the austral winter of 1998, diagnosed her own breast cancer, took a biopsy, and administered chemotherapy, probably couldn't have happened anywhere else on Earth. And then there was Rodney Marks, an Australian astrophysicist who died suddenly on May 12, 2000; not until sunrise several months later could his body be flown to New Zealand for an autopsy, which revealed that the cause of death was methanol poisoning and raised the possibility that one of his fellow Polies had committed the perfect murder.
Usually, though, workers at the Pole referred to struggles for survival with irony, as if they, too, might encounter the kind of hardship faced by the crew fighting an alien-from-outer-space rampage in The Thing—not the 1951 original, which was set at the North Pole, but the 1982 John Carpenter remake, which took place inside the iconic geodesic dome that had served as science headquarters at the South Pole since the mid-1970s. In early 2008 a new base station officially opened, replacing the geodesic dome (which remained partly visible above the snow). But the new station resembled a small cruise ship more than a desolate outpost. It could house two hundred in private quarters. Through the portholes that lined the two floors, you could contemplate a horizon as hypnotically level as any ocean's. The new station rested on lifts that, as snow accumulated over the decades, would allow it to be jacked up two full stories. Amenities included a state-of-the-art fitness center, a gym, a twenty-four-hour cafeteria, a greenhouse, a computer lab, TV rooms with couches deep enough to hibernate in, and Internet access about nine hours a day, when the communications satellites were above the horizon. As one mechanic said, looking out the cafeteria window at the New Year's Day revelers posing in swimsuits, beach-blanket style, at the ceremonial pole, "Hey, it's a harsh continent, haven't you heard?"
The 4-Percent Universe