Looking at the Standard Model, we see sixteen subatomic particles: six quarks, six leptons (such as the electron), and four force-carrying bosons (such as the photon), plus the Higgs boson, charted in a table reminiscent of Mendeleev’s periodic table of the elements—except that there is no periodicity, no apparent ordering at work.
Three of the six leptons (“small” in Greek; particles that don’t participate in the strong nuclear force) are the three “generations” of neutrinos: electron, muon, and tau neutrinos. As integral as they are to the foundations of matter, we’re in the dark about their masses. A particle’s mass is arguably its most distinctive property, so this lacuna is rightly seen as an embarrassment for physics. That is about to change.
Neutrinos are generated in nuclear reactions such as fusion and radioactive decay. The ultimate reactor, of course, was the biggest cauldron of them all: the Big Bang. Like light, neutrinos are stable; their lifetimes are infinite, because there is nothing for them to decay into. They change their flavor (generation type) as they sail through the cosmos, a phenomenon called oscillation. The 2015 Nobel Prize in physics went to Takaaki Kajita and Arthur B. McDonald “for the discovery of neutrino oscillations, which shows that neutrinos have mass.” Their work devastatingly refutes claims presented in John Updike’s poem “Cosmic Gall” (Sorry, John): While they remain small, neutrinos do have mass after all. Thanks to Kajita and McDonald, we not only know that neutrinos have mass; we also have a lower limit on their masses. At least one of the three must have a mass bigger than about 1/20 of an electron volt. This is svelte; the next heaviest elementary particle is the electron, whose mass is 10 million times larger. Most important, these lower limits on neutrino masses give experimentalists thresholds to target. All that’s left is to build a scale sensitive enough to weigh them.
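As a rough check of that comparison (using the standard electron mass of about 511,000 electron volts, a value not quoted in the essay, and writing m_e and m_ν for the electron and heaviest-neutrino masses):

\[
\frac{m_e}{m_\nu} \lesssim \frac{5.11\times10^{5}\ \mathrm{eV}}{0.05\ \mathrm{eV}} \approx 10^{7},
\]

so the electron outweighs the heaviest neutrino by at most a factor of roughly ten million, consistent with the figure above.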
Since it’s impossible to collect enough neutrinos to weigh in a terrestrial laboratory, cosmologists will use galaxy clusters as their scales. Sprinkled amid the luminous matter in the clusters are innumerable neutrinos. Their masses can be measured using gravitational lensing, a direct consequence of Einstein’s general theory. All matter, dark and luminous, gravitationally deflects light. The gravitational-lensing effect rearranges photon trajectories, as Eddington showed during the 1919 total solar eclipse. Star positions were displaced from where they would have been seen in the absence of the Sun’s warping of spacetime. The light that should have been there was lensed; the amount of displacement told us the mass of the lens.
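For reference, the deflection general relativity predicts for a light ray passing a mass M at closest distance b is given by the standard formula (not quoted in the essay):

\[
\alpha = \frac{4GM}{c^{2}b},
\]

which works out to about 1.75 arcseconds for a ray grazing the Sun’s limb, roughly the displacement Eddington measured in 1919. Because the deflection scales directly with M, measuring the displacement weighs the lens.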
What kind of light should we use to weigh poltergeist particles like neutrinos? There certainly aren’t enough neutrinos in our solar system to bend the Sun’s light. The most promising light source of all is also the oldest and most abundant light in the universe, the “3 Kelvin” cosmic microwave background. These cosmic photons arose from the same ancient cauldron that produced the neutrinos plying the universe today. The CMB is “cosmic wallpaper,” a background against which the mass of all matter in the foreground galaxy clusters, including neutrinos, can be measured.
In 2015, the Planck satellite showed powerful evidence for gravitational lensing of the CMB, using a technique eventually guaranteed to detect neutrino masses. This technique, based on the CMB’s polarization properties, will dramatically improve in 2016, thanks to a suite of experiments deploying tens of thousands of detectors cooled below 0.3 Kelvin at the South Pole and in the Chilean Atacama desert.
Neutrinos are also the paradigm of dark matter: they’re massive, dark (they interact with light only via gravitational lensing) and neutral—all required properties of dark matter. While we know that neutrinos aren’t the dominant form of the cosmos’s missing mass, they are the only known form of dark matter. After we measure their masses, we’ll use neutrinos to thin the herd of potential dark-matter candidates. Just as there are many different types of ordinary matter, ranging from quarks to atoms, we might expect there to be several kinds of dark matter. Perhaps there is a “dark” periodic table.
The hunt is on to directly detect dark matter, and several upgrades to liquid noble-gas experiments are coming online in 2016. Perhaps there will be detections. But so far the direct-detection experiments have produced only upper limits on the mass of the dark matter other than neutrinos. In the end, neutrinos just might be the only form of dark matter we ever get to “see.”
The next century of general relativity promises to be as exciting as the first. “Spacetime tells matter how to move; matter tells spacetime how to curve,” said John Archibald Wheeler. We’ve seen what the curvature is. Now we just need to find out what’s the matter. And where better to look for lost matter than where the dark is?
Simplicity
Neil Turok
Director, Perimeter Institute, Waterloo, Ontario; Niels Bohr Chair in Theoretical Physics; author, The Universe Within
We live at a remarkable moment in history. Our scientific instruments have allowed us to see the far reaches of the cosmos and study the tiniest particles. In both cases, they have revealed a surprising simplicity, at odds with the most popular theoretical paradigms. I believe this simplicity to be a clue to a new scientific principle, whose discovery will represent the next revolution in physics and our understanding of the universe.
It is not without irony that at the very moment the observational situation is clearing so beautifully, the theoretical scene has become overwhelmingly confused. Not only are the most popular models very complicated and contrived, they are also being steadily ruled out by new data. Some physicists appeal to a “multiverse” in which all possible laws of physics are realized somewhere, since then (they hope) there would at least be one region like ours. It seems more likely to me that the wonderful new data are pointing us in the opposite direction. The cosmos isn’t wild and unpredictable; it is incredibly regular. In its fundamental aspects, it may be as simple as an atom and eventually just as possible to understand.
Our most powerful microscope, the Large Hadron Collider, has found the Higgs boson. This particle is the basic quantum of the Higgs field, a medium that pervades space and endows particles with mass and properties like electric charge. As fundamental to our understanding of particle physics as the Higgs field is, it is equally important to our understanding of cosmology. It makes a big contribution to the energy in empty space, the so-called dark energy, whose density astronomical observations reveal to be a weirdly tiny, yet positive, number. Furthermore, according to the LHC measurements and the Standard Model of particle physics, the Higgs field is delicately poised on the threshold of instability in today’s universe.
The discovery of the Higgs boson was a triumph for the theory of quantum fields, the amalgamation of quantum mechanics and relativity that dominated 20th-century physics. But quantum field theory has great trouble explaining the mass of the Higgs boson and the energy in empty space. In both cases, the problem is essentially the same. The quantized vibrations of the known fields and particles become wild on small scales, contributing large corrections to the Higgs boson mass and the dark-energy density and generally giving them values much greater than those we observe.
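To put numbers on that mismatch (standard order-of-magnitude figures, not taken from the essay): the observed dark-energy density is roughly

\[
\rho_{\Lambda}^{\mathrm{obs}} \sim (10^{-3}\ \mathrm{eV})^{4} \sim 10^{-122}\,\rho_{\mathrm{Planck}},
\]

whereas cutting off the quantum vibrations only at the Planck scale would naively suggest a value near \(\rho_{\mathrm{Planck}}\) itself, a discrepancy of some 120 orders of magnitude.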
To overcome these problems, many theorists have postulated new particles, whose effects would almost precisely cancel those of all the known particles, “protecting” the mass of the Higgs boson and the value of the dark-energy density from quantum effects. But the LHC has looked for these extra partner particles and, so far, failed to find them. It seems that nature has found a simpler way to tame quantum phenomena on short distances, in a manner we have yet to fathom.
Meanwhile, our most powerful telescope, the Planck satellite, has scanned the universe on the largest visible scales. What it has revealed is equally surprising. The whole shebang can be quantified with just six numbers: the age and temperature of the cosmos today; the density of the dark energy and the dark matter (both mysterious, but simple to characterize); and the strength, and slight dependence on scale, of the tiny initial variations in the density of matter from place to place as it emerged from the Big Bang. None of the complications, like gravitational waves or the more involved density patterns expected in many models, appear to be there. Again, nature has found a simpler way to work than we can currently understand.
The largest scale in physics—the Hubble length—is defined by the dark energy. By accelerating the expansion of the cosmos, the dark energy carries distant matter away from us and sets a limit to what we will ultimately see. The smallest scale in physics is the Planck length, the minuscule wavelength of photons so energetic that two of them will form a black hole. While exploring physics down to the Planck length is beyond the capabilities of any conceivable collider, the universe itself probed this scale in its earliest moments. So the simple structure of the cosmos is likely to be an indication that the laws of physics become simple at this extreme.
All the complexity in the world—including stars, planets, and life—apparently resides in the “messy middle.” It is a striking fact that the geometric mean of the Hubble and Planck lengths is the size of a living cell—the scale on which we live, where Nature is at her most complex.
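As a rough check of that geometric-mean claim (using standard values for the two lengths, which the essay does not quote):

\[
\sqrt{L_{\mathrm{Hubble}}\,\ell_{\mathrm{Planck}}} \approx \sqrt{(10^{26}\ \mathrm{m})(1.6\times10^{-35}\ \mathrm{m})} \approx 4\times10^{-5}\ \mathrm{m},
\]

a few tens of microns, which is indeed the size of a typical living cell.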
What is exciting about this picture is that it requires a new kind of theory, one that is simple at both the smallest and largest scales, and at very early and very late cosmological times, so that it can explain these properties of our world. In fact, there are more detailed hints, from both theory and data, that at these extremes the laws of physics should become independent of scale. Such a theory won’t be concerned with kilograms, meters, or seconds, only with information and its relations. It will be a unified theory not only of all the forces and particles but also of the universe as a whole.
The LHC Is Working at Full Energy
Gordon Kane
Theoretical physicist and cosmologist; Victor Weisskopf Distinguished University Professor, University of Michigan; author, Supersymmetry and Beyond
The most interesting recent physics news is that the Large Hadron Collider at CERN, in Geneva, is finally working at its highest-ever design energy and intensity. That is so important because it may at last allow the discovery of new particles—superpartners—that would enable formulating and testing a final theory underlying the physical universe.
As Max Planck immediately recognized when he discovered quantum theory over a century ago, the equations of the final theory should be expressed in terms of universal constants of nature, such as Newton’s gravitational constant G, Einstein’s universal speed of light c, and Planck’s constant h. The natural size of a universe is then tiny, about 10⁻³³ cm, and the natural lifetime about 10⁻⁴³ seconds, far from the sizes of our world. Physicists need to explain why our world is large and old and cold and dark. Quantum theory provides the opportunity to connect the Planck scales with our scales, our world, and our physical laws, because in quantum theory virtual particles of all masses enter the equations and mix scales.
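The numbers Kane cites follow from combining those three constants in the only combinations that yield a length and a time (a standard construction, with ħ = h/2π):

\[
\ell_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-33}\ \mathrm{cm}, \qquad t_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{5}}} \approx 5\times10^{-44}\ \mathrm{s}.
\]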
But that only works if the underlying theory is what is called a supersymmetric one, with each of our familiar particles, such as quarks, electrons, and force-mediating bosons, having a superpartner particle (squarks, selectrons, gluinos, etc.). In collisions at the LHC, the higher energy of the colliding particles turns into the masses of previously unknown particles via Einstein’s E = mc².
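For a sense of scale (a back-of-the-envelope illustration; the 13 TeV figure is the nominal Run II collision energy, not stated in the essay): the heaviest particle that could in principle be created in a 13 TeV proton-proton collision is

\[
m_{\max} = \frac{E}{c^{2}} \approx 13\ \mathrm{TeV}/c^{2},
\]

though in practice only a fraction of each proton’s energy is carried by the individual colliding quarks and gluons, so the realistic reach for new heavy particles is a few TeV.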
The theory did not tell us how massive the superpartners should be. Naïvely, there were arguments that they should not be too heavy (“naturalness”), so they could be searched for with enthusiasm at every higher energy that became accessible, but so far they have not been found. In the past decade or so, string theory and M-theory have been better understood and now provide clues as to how heavy the superpartners should be. String theories and M-theories differ technically in ways not important for us here. To be mathematically consistent, and part of a quantum theory of gravity and the other forces, they must have nine or ten space dimensions (and one time dimension). To predict superpartner masses, they must be projected onto our world of three space dimensions, and there are known techniques for doing that.
The bottom line is that well-motivated string/M-theories do indeed predict that the LHC run (Run II), which started in late 2015 and is moving ahead strongly in 2016, should be able to produce and detect some superpartners, thus opening the door to the Planck-scale world and promoting study of a final theory to testable science. The news that the LHC works at its full energy and intensity and is expected to accumulate data for several years is a strong candidate for the most important scientific news of recent years.
New Probes of Einstein’s Curved Spacetime—and Beyond?
Steve Giddings
Theoretical physicist; professor, Department of Physics, UC Santa Barbara
One of the most profound puzzles in modern physics is to describe the quantum nature of spacetime. A real challenge here is that of finding helpful experimental guidance. Interestingly, we are just now on the verge of gaining key new experimental information about classical spacetime, in new and important regimes—and this offers a possibility of learning about quantum spacetime as well.
The community has been abuzz about the possible discovery of a new particle at the LHC, seen by its disintegration into pairs of photons. If this is real and not just a fluctuation, there’s a slim chance it is a graviton in extra dimensions, which, if true, could well be the discovery of the century. While this would indeed be a probe of quantum spacetime, I’ll put it aside until more data reveals what’s happening at the LHC.
But we are clearly entering a new era in several respects. First, miles-long instruments built to detect gravitational waves have just reached a sensitivity where they should be able to see these spacetime ripples, emitted from collisions and mergers of distant black holes and neutron stars. In fact, at this writing there have been recent hints of signals seen in these detectors, though we are awaiting a verifiable signal. Once found, these will confirm a major prediction of Einstein’s general relativity and open a new branch of astronomy, where distant objects are studied by the gravitational waves they emit. It’s also possible that precise measurements of the microwave radiation left over from the Big Bang will reveal gravitational waves, though the community has backpedaled from the premature announcement of this in 2014, and so the race may well be won by the Earth-based gravitational-wave detectors. Either way, developments will be exciting to watch.
Even more profound tests of general relativity may come via the Event Horizon Telescope, which is being brought online to study the 4-million-solar-mass black hole at the center of our galaxy. The EHT is really a network of radio telescopes, which combine to make a telescope the size of planet Earth. This will offer an unprecedentedly sharp focus both on our central black hole and on the 6-billion-solar-mass black hole at the center of the nearby elliptical galaxy M87. In fact, with the telescopes that have been networked so far, we’re beginning to see structure whose size is close to that of the event horizon of our central black hole. The EHT should ultimately probe gravity in a regime where it gets extremely strong—so strong that the velocity needed to escape its pull approaches the speed of light. This will give us a new view of gravity in a regime where it has so far not been well tested.
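To make “extremely strong” concrete (a standard estimate, with numbers not taken from the essay): the escape velocity from a mass M reaches the speed of light at the Schwarzschild radius,

\[
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}} \rightarrow c \quad \text{at} \quad r_{s} = \frac{2GM}{c^{2}} \approx 1.2\times10^{7}\ \mathrm{km}
\]

for the 4-million-solar-mass black hole at the galactic center, a radius only about seventeen times that of the Sun.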
Even more tantalizing is the possibility that the EHT will start to see effects that begin to reveal a more basic quantum reality underlying spacetime. For the 2014 Edge Question, I wrote that our fundamental concept of spacetime seems ready for retirement and needs to be replaced by a more basic quantum structure. There are many reasons for this: One good one is the crisis arising from the attempt to explain black-hole evolution with present-day physics; our current foundational principles, including the idea of spacetime, conflict with each other in describing black holes. Although Stephen Hawking initially predicted that quantum mechanics must break down when we account for emission of particles of Hawking radiation from a black hole, there are now good indications that it should not be abandoned. And if quantum mechanics is to be saved, this tells us that quantum information must be able to escape a black hole as it radiates particles—and this confronts our understanding of spacetime.
The need for information to escape an evaporating black hole conflicts with our current notions of how fields and particles move in spacetime; here, escape is forbidden by the prohibition of faster-than-light travel. A key question is how the familiar spacetime picture of a black hole must be modified to allow such escape. The modifications apparently must extend out at least to the hole’s event horizon. Some have postulated that the new effects abruptly stop right there, at the horizon, but this abruptness is unnatural and leads to other seemingly crazy conclusions associated with what has been recently renamed the “firewall” scenario. A more natural scenario is that the usual spacetime description is also modified in a region extending outward beyond the black-hole horizon, at least through the region where gravity is very strong; the size of this region is perhaps a few times the horizon radius. In short, the need to save quantum mechanics indicates quantum modifications to our current spacetime description that extend into the region that the EHT observations will be probing! Important goals are to improve understanding of the nature of these alterations to the familiar spacetime picture and determine more carefully their possible observability via the EHT’s measurements.
Supermassive Black Holes
Jeremy Bernstein
Professor emeritus, Stevens Institute of Technology; former staff writer, The New Yorker
The most interesting thing I learned was the presence of supermassive black holes at the centers of galaxies, including our own. Where did they come from? At what stage were they created? They are not the result of the collapse of a star. I have not heard an explanation that makes a lot of sense to me. Maybe they are pre-Big Bang relics.