Since the first particle accelerator was built in 1932, no one has ever detected an elementary particle with quantum spin 0—that is, an elementary scalar field. (In quantum physics, the terms particle and field are interchangeable.) The pi meson (pion), discovered in 1947 by Cecil Powell, César Lattes, and Giuseppe Occhialini, does have spin 0, but it is not an elementary particle because it is composed of a quark and an antiquark. If the Higgs particle is detected at the LHC as an elementary particle, it will be the first elementary spin-0 particle discovered in nature, with the same quantum numbers as the vacuum.
Of course, if a Higgs boson of spin 0 does exist, it could be a composite of quarks and antiquarks like the pi meson, rather than an elementary particle. However, the pi meson and the other light mesons are “pseudoscalar particles”; that is, they have negative parity, in contrast to the elementary Higgs boson, which has positive parity and is a true scalar particle. Spin-0 composite scalar mesons have been observed in quark spectroscopy; they are bound quark–antiquark pairs in excited states, much like the excited states of hydrogen-like atoms. The hints of a putative Higgs boson at 125 GeV at the CMS and ATLAS detectors could conceivably be interpreted as such a quark–antiquark state. However, this would require the existence of a quark with a mass of about 50 to 60 GeV, and no quark with this mass has been detected. If the Higgs boson is indeed an elementary particle with a mass of about 125 GeV, it will be the first elementary particle with spin 0 and positive parity ever discovered.
Despite its status as a particle that has never been detected and may not exist, the Higgs boson has played a significant role in particle physics and cosmology. It is not as if theoretical physicists have much choice in the matter. Rather, the physics of both the standard model of particle physics and of cosmology demands that the basic Higgs particle be an elementary spinless particle. And the Higgs is not the only spin-0 scalar elementary particle that plays an important role in physics. The popular theory of inflation, which was proposed to explain fundamental features of the early universe, is based on an effective spin-0 particle called the inflaton, which has also never been detected experimentally.
WHY IS THE HIGGS SO IMPORTANT?
The Higgs boson is an important player in the particle physics theater, for it is hypothesized to produce the masses of all the elementary bosons, quarks, and leptons in the standard model. Elementary particles, as we recall, do not contain more basic constituents, in contrast to protons, neutrons, and mesons. The proton and neutron are not elementary particles because they are composed of quarks, and the mesons, such as the pi meson and K meson, are composed of quarks and antiquarks. These composite particles belong to a family called hadrons.
We know from accelerator experiments that the W and Z bosons differ in mass by about 10 GeV, and the photon is known experimentally to have zero mass. These facts have to be explained by the electroweak theory within the standard model of particle physics, as developed first by Weinberg and Salam in 1967/1968. Unbroken electroweak symmetry means that there was a phase during the early universe when the masses of all the particles except the Higgs boson were zero. The “Higgs mechanism” is invoked in the standard model to give the spinless scalar Higgs field a vacuum energy that is not equal to zero. Normally, the vacuum energy of particles is considered to be zero in modern quantum field theory; the vacuum is not composed of “nothing,” but consists of particles and their antiparticles annihilating one another to produce a net energy of zero. The postulated nonzero vacuum energy of the Higgs field causes the “spontaneous symmetry breaking” that gives elementary particles their masses.
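In the standard textbook sketch of this mechanism (conventions for the numerical factors vary), the Higgs field φ is assigned a potential whose minimum lies away from zero:

V(\phi) = -\mu^2\, \phi^\dagger \phi + \lambda\, (\phi^\dagger \phi)^2, \qquad \phi^\dagger \phi \big|_{\text{min}} = \frac{v^2}{2}, \qquad v = \frac{\mu}{\sqrt{\lambda}} \approx 246\ \text{GeV}.

The field settles at the nonzero vacuum value v, whose size is fixed by the measured strength of the weak interactions, and each elementary particle coupling to the field acquires a mass proportional to v; a fermion with Yukawa coupling y_f, for example, gets a mass m_f = y_f v/\sqrt{2}.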
The Higgs boson is so crucial because it makes the unified theory of electromagnetism and the weak force within the standard model—the so-called electroweak theory—renormalizable within modern quantum field theory. This means that we are able to perform calculations of particle collisions, for example, that are finite and meaningful, rather than having to deal with impossible infinities. By way of analogy, the early development of quantum field theory during the 1930s and 1940s produced QED, a renormalizable and therefore finite theory. QED, the relativistic quantum theory of James Clerk Maxwell’s classical electromagnetic field equations, is one of the most successful theories in physics. Particle physicists hope that the standard model with the Higgs boson can repeat the success of renormalizable QED.
WHERE IS THE HIGGS BOSON?
The final results from the Tevatron accelerator in the United States and the results up to March 2012 from the LHC had already ruled out great swaths of the mass–energy range in which the Higgs boson could exist. Only a small window of energy was left in which the Higgs particle could be hiding. As it turns out, this was the most difficult energy window for the LHC to search—namely, the range between 115 GeV and 145 GeV. The LHC was built to investigate particle collisions at very high energies, up to 14 TeV. At lower energies, such as between 115 GeV and 145 GeV, the unwanted background, or statistical noise, is high, making it very difficult to detect the decay products of the Higgs boson and thereby infer its existence. It is in this small remaining window that the putative signal of the Higgs boson has been found. For the LHC to discover the Higgs boson at these lower energies required a significant luminosity, or intensity, of proton–proton collisions, which was reached by July 2012.
By 2012, the top quark and W boson masses were accurately known experimentally. Armed with this knowledge, one could estimate the mass of the Higgs boson. Moreover, fits to precise electroweak data collected during the past decade or two by LEP at CERN, the Tevatron collider, and, more recently, the LHC experiments also allowed one to constrain the possible mass of the Higgs boson. Combining all the data shows that the Higgs boson should be light—between 115 GeV and 130 GeV. Indeed, the best fit to the electroweak data puts the Higgs mass at around 97 GeV. This value had already been excluded at LEP, which stopped operating in the year 2000 to make room for the LHC, housed in part in the original LEP underground ring. The LEP exclusion, however, has a statistical significance of only one standard deviation (sigma), which is not significant enough to rule out a possible Higgs boson between 97 GeV and 135 GeV.
[Figure 4.1 © Farley Katz, New Yorker Magazine]
By March 2012, spokespeople for the ATLAS and CMS detectors at the LHC were claiming that the proton–proton collision data to be gathered in 2012 would either rule out the Higgs boson or discover it in the narrow band of remaining, unexplored low energy, a band only 15 to 20 GeV wide. This, they claimed, would end the search for the elusive “God particle” that has been going on for more than three decades.
If the discovery of the Higgs boson is not confirmed, this will create a crisis in physics. Indeed, the many textbooks on particle physics and relativistic quantum field theory published during the past 40 years will need to have three or four chapters ripped out. Our whole understanding of the fundamental origin of matter and what makes the unified electroweak theory consistent physically would have to be rethought from scratch.
DO SUPERSYMMETRIC PARTICLES EXIST?
The search for symmetries in nature has been going on since ancient Greece. Symmetry is such a pleasing concept to the human mind that we assume it must be the basis of nature. There are many obvious examples: symmetric shapes such as spheres and snowflakes, the bilateral symmetry of butterflies, and the axial symmetry of trees.
Physicists’ search for symmetries in nature led them to a symmetric description of space and time called supersymmetry. The use of “super” suggests that this is the most symmetric description of particle physics and spacetime that can be achieved. The idea of supersymmetry was proposed by Hironari Miyazawa in 1966 and was rediscovered subsequently during the early 1970s by several physicists, including Y. Golfand, E. P. Likhtman, Julius Wess, and Bruno Zumino. The origin of supersymmetry lay mainly in mathematical speculations about the possible maximum symmetries of space and time. However, the concept was soon adopted by particle physicists for more practical purposes.
When supersymmetry is translated into particle physics, it requires that the number of particles observed in the particle zoo be doubled. Each new particle is a superpartner of an existing one; of each pair, one is a boson with integer spin and the other a fermion with half-integer spin.1 That is, the spins of the superpartners always differ by a spin unit of ½. For example, the electron has spin ½, whereas its superpartner, the selectron, has spin 0. Without supersymmetry, only the bosons are the carriers of forces between particles; but, in supersymmetry theory, fermions can also play this role. For example, the photon, a boson and the carrier of the electromagnetic force, has spin 1, whereas its superpartner, the photino, a fermion, has spin ½ and carries a new supersymmetric force that acts between the superpartners.
HOW IS SUPERSYMMETRY USEFUL?
One of the earliest applications of supersymmetry was in attempting to solve the so-called Higgs mass hierarchy problem. According to the standard model of particle physics, the mass of the Higgs boson receives an enormous contribution from the interaction of this particle with itself—its so-called self-interaction. This is in contrast to the masses of the quarks and leptons, and of the W and Z bosons, which are generated directly by the interactions of the Higgs boson, or Higgs field, with these elementary particles. The Higgs mass hierarchy problem arises because, to get a down-to-earth value for this mass, physicists have to perform a tremendous fine-tuning—a very delicate mathematical cancelation involving numbers with a great many decimal places—between the “bare mass” of the Higgs boson, which is its mass in the absence of interactions with other particles, and the contribution coming from the self-interaction. Such fine-tuning produces a very unnatural situation in theoretical calculations, and it is unacceptable to most theoretical physicists. One of the major challenges in particle physics during the past four decades has been to remove this Higgs mass hierarchy problem. The most efficacious way to do this has been with supersymmetry. This solution has been called natural supersymmetry.
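Written schematically, with numerical factors suppressed (a rough sketch, not a full loop calculation), the problem looks like this:

m_H^2(\text{physical}) = m_{\text{bare}}^2 + \delta m^2, \qquad \delta m^2 \sim \frac{g^2}{16\pi^2}\, \Lambda^2,

where g is a typical coupling and Λ is the energy scale up to which the standard model is trusted. If Λ is taken to be the Planck energy of about 10^19 GeV, then δm^2 is of order 10^36 GeV^2, whereas the physical value (125 GeV)^2 is only about 1.6 × 10^4 GeV^2, so the two terms on the right must cancel to roughly 30 decimal places.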
Another notorious fine-tuning problem in particle physics is the calculation of the energy density of the vacuum using relativistic quantum field theory. The calculated vacuum energy density can disagree with the observed value by a factor of as much as 10^122. This absurd result is considered one of the worst predictions in the history of physics. In particular, when the vacuum energy density is calculated from the Higgs field vacuum, it produces by itself this absurd fine-tuning of the vacuum energy density. See Chapter 10 for a more detailed discussion of the concepts of fine-tuning, the Higgs boson mass hierarchy problem, the gauge hierarchy problem, and the cosmological constant problem.
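The size of the mismatch follows from crude dimensional analysis (a sketch; the precise number depends on the energy cutoff chosen):

\rho_{\text{theory}} \sim M_{\text{Planck}}^4 \approx (1.2 \times 10^{19}\ \text{GeV})^4 \approx 10^{76}\ \text{GeV}^4, \qquad \rho_{\text{observed}} \approx (2 \times 10^{-3}\ \text{eV})^4 \approx 10^{-47}\ \text{GeV}^4,

a ratio of roughly 10^122 to 10^123.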
Returning to supersymmetry, Leonard Susskind and others suggested during the early 1970s that if superpartners really existed, they could solve the Higgs mass hierarchy problem. Technically speaking, this required the masses of the superpartners to be not too different from the known masses of the quarks and leptons. In this interpretation, some suggested that the superpartners carried a “supercharge” in addition to ordinary electric charge, and the supercharges of the particles and their superpartners canceled one another, which alleviated the Higgs mass hierarchy problem. In other words, during the calculation of the Higgs mass, the positive particle contributions are canceled by the negative superpartner contributions, which brings the mass of the Higgs boson into agreement with its anticipated experimental value. A similar cancelation in a supersymmetric theory would resolve the serious fine-tuning problem in the calculation of the vacuum energy density of particles.
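In schematic, textbook form, the cancelation works because a fermion loop and the loops of the fermion’s two scalar superpartners contribute to the Higgs mass with opposite signs:

\delta m_H^2 \;\supset\; -\frac{|\lambda_f|^2}{8\pi^2}\, \Lambda^2 \;\; (\text{fermion loop}) \;+\; \frac{\lambda_S}{8\pi^2}\, \Lambda^2 \;\; (\text{superpartner loops}).

Supersymmetry forces the couplings to be equal, λ_S = |λ_f|^2, so the dangerous Λ^2 pieces cancel exactly, leaving only mild contributions that grow with the mass splitting between the particles and their superpartners; this is why the argument requires the superpartners not to be too heavy.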
During the 1970s and 1980s, Peter van Nieuwenhuizen, Sergio Ferrara, and Daniel Freedman, among others, introduced the framework of supergravity, in which the boson force carrier of gravity, named the graviton, with spin 2, has a superpartner called the gravitino with spin 3/2. The supergravity theory can help solve the problem of how to unify gravity with the other three forces of nature.
Another important problem that supersymmetry may solve is the cosmological constant problem. We recall that the calculation of the vacuum energy density is absurdly big compared with observational data. This vacuum energy density can be related directly to the so-called cosmological constant. Einstein introduced the cosmological constant in 1917 into his gravitational field equations to make the cosmological model based on his gravity theory lead to a static universe. This was the first paper introducing modern cosmology based on Einstein’s gravity theory.2 Then, during the 1920s, the Russian cosmologist Alexander Friedmann solved Einstein’s field equations and discovered that the universe of general relativity was dynamical and would undergo an expansion. This was later found independently by Belgian cosmologist Georges Lemaître. With astronomer Edwin Hubble’s discovery in 1929 that the universe is indeed expanding, the need for Einstein’s cosmological constant disappeared. However, it has reappeared in today’s so-called standard model of cosmology to explain the apparent acceleration of the expansion of the universe, which astronomers discovered through supernovae data in 1998. Einstein’s cosmological constant, designated by the Greek letter Lambda, Λ, produced a repulsive, antigravity force in the universe that was able to balance the attractive force of gravity, as was originally required by Einstein in his static model of the universe. As it is interpreted today by cosmologists, the vacuum energy density associated with the cosmological constant produces a repulsive force that accelerates the expansion of the universe. The energy associated with this force is called dark energy.
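In Einstein’s field equations, written here in units in which the speed of light is 1, the cosmological constant enters as an extra term on the geometric side:

G_{\mu\nu} + \Lambda\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu}.

A positive Λ behaves like a fluid with constant energy density ρ_Λ = Λ/(8πG) and negative pressure p_Λ = −ρ_Λ, and it is this negative pressure that pushes the expansion of the universe to accelerate.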
During the mid 1960s, the Russian physicist Yakov Zeldovich identified the cosmological constant with the energy of the vacuum. The seething sea of annihilating particles and antiparticles that constitutes the modern vacuum produces a constant vacuum energy that is identified with the cosmological constant through Newton’s gravitational constant. Calculating this vacuum energy leads to the enormous discrepancy of more than 120 orders of magnitude between theory and observation, and constitutes the most extreme fine-tuning problem in all of physics, signaling a major crisis in both cosmology and particle physics.
In supersymmetry, because of the cancelation of energies between particles and their superpartners, this extreme fine-tuning of the vacuum density no longer occurs. However, we now know experimentally, as of March 2013 from LHC results, that no superpartners exist below a mass of 600 to 800 GeV. In addition, the gluino, which is the superpartner of the gluon, has been excluded up to 1.24 TeV. This promising solution of the cosmological constant problem has been dealt a serious blow.
If the search for superpartners continues to come up empty-handed at the LHC, it will remove one of the early motivations for promoting supersymmetry. The theory would then fail to solve the Higgs mass hierarchy problem and the fine-tuning problem associated with the vacuum energy density calculations. It now appears that if, indeed, superpartners exist, they will have masses beyond what can be detected by the LHC.
SUPERCASTLES IN THE AIR
In its early days, supersymmetry also played an important role in the development of string theory with its many extra dimensions. The original version of string theory contained only boson particles with integer spin. Relativistic quantum field theory and particle physics were founded on the concept that particles are points in space—that is, they have zero dimensions—whereas, in string theory, particles are one-dimensional strings. These strings are very tiny, only about 10^-33 cm long, which is about the Planck length, and it is the frequencies of their vibrations that identify them as different particles. This theory requires a total of 26 dimensions, in stark contrast to our known universe of three spatial dimensions and one time dimension. The extra dimensions arose from the requirement that string theory satisfy Einstein’s special relativity, which is formulated in the three spatial dimensions and one time dimension of what is called Minkowski spacetime.3 The mathematics of string theory requires all 26 dimensions for consistency, and the extra ones must be compactified, or shrunk, so that only the four we observe remain. It was eventually discovered that supersymmetry, with all its partner particles, was needed to make string theory physically consistent with fermions as well as bosons. This led to what is now called superstring theory, whose viability depends on the discovery of superpartners by the LHC and future accelerators.
Superstring theory has declined in popularity during the past few years because string theorists have not been able to propose realistic tests of the many versions of the theory, apart from experimental searches for the extra space dimensions it requires. One of the major claims of superstring theory is that it leads to a finite theory of particle physics—that is, that the standard model of particle physics falls out of the equations of string theory. However, this has not been proved satisfactorily. Nor has it been proved rigorously that replacing zero-dimensional points in spacetime with one-dimensional strings makes finite quantum calculations possible. Because superstring theory is claimed to unify all the forces of nature, including gravity, it would also lead to finite calculations in quantum gravity. However, the latest experimental results from the LHC have excluded the existence of extra dimensions up to about 3 TeV, which represents a lot of territory, and have thereby weakened the viability of superstring theory.
The story about extra dimensions starts with the publication of a unified theory by Theodor Kaluza, born in Silesia in 1885. He was inspired by Einstein’s attempt to unify gravity and electromagnetism, the only known forces during the 1920s. Kaluza proposed that there is a fourth spatial dimension in addition to the three known ones, making spacetime five-dimensional when we include the dimension of time. Oskar Klein, born in 1894, developed Kaluza’s theory further, and compactified the fifth dimension to a very small size so that we could understand why we had not observed the fifth dimension yet.
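Klein’s compactification has a simple quantitative consequence, sketched here in units in which ħ = c = 1. A field living on a circle of radius R splits into a tower of modes, each behaving in four dimensions like a particle of mass

m_n = \frac{|n|}{R}, \qquad n = 0, \pm 1, \pm 2, \ldots,

so the smaller the circle, the heavier the first extra state. For R near the Planck length of about 10^-33 cm, m_1 is of order 10^19 GeV, hopelessly beyond the reach of any accelerator, which is why a compactified fifth dimension could easily have escaped notice.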