
Hiding in the Mirror: The Quest for Alternate Realities, From Plato to String Theory (By Way of Alice in Wonderland, Einstein, and the Twilight Zone)


by Lawrence M. Krauss


  To get an idea about how the normal rules of addition and subtraction can become meaningless when one is considering infinite quantities, my favorite tool involves something called Hilbert’s hotel, named after the famous mathematician David Hilbert, who was one of the pioneers in studying the properties of numbers, and whom I referred to earlier in the context of the development of general relativity.

  Hilbert’s hotel is not like a normal hotel, because it has an infinite number of rooms. Other than being rather large, you might think it would not be qualitatively different from normal hotels, but you would be wrong. For example, say that one evening Hilbert’s hotel has every room occupied. In a normal hotel the manager would put up a NO VACANCY sign, but not so in this case. Say a weary traveler comes in with his family and asks for a room. The manager would happily reply that every room was now occupied, but if the traveler just waited a bit, a room would be available shortly. How would this be possible? Simple. Just take the family from room 1 and put them in room 2, the family from room 2 and put them in room 3, and so on. Since there are an infinite number of rooms, everyone gets accommodated, but now room 1 is vacant, and free for the new traveler.
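  For readers who want to see the manager’s bookkeeping made explicit, here is a minimal sketch (in Python, purely illustrative and not something from the text) of the rule being applied: every family in room n moves to room n + 1, so every family still has a room, yet room 1 is left empty.

```python
# Illustrative only: we can simulate just finitely many rooms, but the
# reassignment rule n -> n + 1 is perfectly well defined for all of the
# (infinitely many) occupied rooms in Hilbert's hotel.

def reassign(occupied_rooms):
    """Send the family in room n to room n + 1."""
    return {room: room + 1 for room in occupied_rooms}

before = range(1, 11)          # pretend rooms 1..10 stand in for "all" rooms
after = reassign(before)

print(after)                   # {1: 2, 2: 3, ..., 10: 11}
print(1 in after.values())     # False: room 1 is now free for the newcomer
```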

  Say that the new traveler arrives not merely with his family, but with his friends, as well. Because he is a very popular fellow, he brings an infinite number of friends along, each of whom wants his or her own room. No problem. The manager takes the family from room 1 and puts them in room 2, the family from room 2 and puts them in room 4, the family from room 3 and puts them in room 6, and so on. Now only the even-numbered rooms are occupied, and there are an infinite number of odd-numbered rooms vacant to accommodate the new travelers. As these examples demonstrate, adding up infinite numbers of things is a confusing process, but mathematicians have developed rules that allow one to do so consistently. In performing such operations, however, one can find not only that the sum of an infinite series may be smaller than some of the individual terms, but that it can be smaller than every single term. Moreover, this can be the case not only for series whose terms alternate in sign, but for series in which every term is positive. Perhaps the most important example of this, and one of great relevance for much of the physics that follows, is the following: When considered using appropriate mathematical tools developed to handle infinite series, the sum of the series 1 + 2 + 3 + 4 + 5 + . . . can be shown to equal not infinity, but rather −1⁄12!
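  For the mathematically curious, the “appropriate mathematical tools” alluded to above are a standard technique known as zeta-function regularization; the following sketch is drawn from the standard mathematical literature rather than from this book. One defines the Riemann zeta function by the sum

```latex
\zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}}, \qquad \operatorname{Re}(s) > 1,
```

which converges only when the exponent is large enough, and then extends it to other values of s by analytic continuation. The divergent series in question is formally ζ(−1), and the continued function assigns it the finite value

```latex
1 + 2 + 3 + 4 + 5 + \cdots \;\longrightarrow\; \zeta(-1) \;=\; -\tfrac{1}{12}.
```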

  Now, in a similar vein, using similar mathematical tools, those who studied the mathematical relations associated with the scattering of strongly interacting particles recognized that if a very specific relation called “duality” (which I shall describe in more detail shortly) exists between all of the particles in the theory, then it is possible to write the total scattering rate as an infinite sum of individual contributions. Each contribution might blow up as the energy of the scattering particles increased, but the sum would nevertheless add up to a finite number.

  In 1968 the physicist Gabriele Veneziano postulated a precise formula for the scattering of strongly interacting particles that had exactly the required duality properties. It was, one should emphasize, a purely mathematical postulate, with at most marginal physical or experimental support. Nevertheless, the fact that it appeared to offer a possible resolution of a conundrum that had been plaguing particle physics meant that many physicists started following up on Veneziano’s ideas. It was soon discovered that Veneziano’s purely mathematical “dual model” actually did have a physical framework through a theory not of point particles, but of “relativistic strings” (i.e., extended one-dimensional objects moving at near light-speed). Specifically, if the fundamental objects that interacted and scattered were not zero-dimensional pointlike objects, but rather one-dimensional stringlike objects, then one could show that the particular mathematical miracles associated with duality could naturally and automatically result.
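  For reference, the formula Veneziano wrote down (quoted here from the standard literature, not reproduced in this book) is remarkably compact: the scattering amplitude is Euler’s beta function evaluated on the so-called Regge trajectories,

```latex
A(s,t) \;=\; \frac{\Gamma\!\left(-\alpha(s)\right)\,\Gamma\!\left(-\alpha(t)\right)}{\Gamma\!\left(-\alpha(s)-\alpha(t)\right)},
\qquad \alpha(x) \;=\; \alpha(0) + \alpha' x,
```

where s and t characterize the energy and scattering angle of the collision. Its manifest symmetry under interchanging s and t is precisely the duality property described above.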

  Faced with the prospect of an embarrassing plethora of new particle states, and with what appeared to be an otherwise mathematically untenable theory based on the old-fashioned idea that the fundamental quantum mechanical excitations in nature are manifested as elementary particles, many physicists felt that the strong interaction had to be, at its foundation, a theory of strings.

  This may all sound a bit too fantastic to be true, and those of you who are old enough to have followed popular science ideas in the 1960s and ’70s may wonder why you never heard tell of strings. The answer is simple: It was too fantastic to be true. Almost as soon as dual string models were developed, a number of even more embarrassing problems arose, both theoretical and experimental. The theoretical problem was, as we physicists like to say, “highly nontrivial”: It turns out that when one examines the specific mathematical miracle associated with the infinite sums that duality is supposed to provide, there is a slight hitch. The sums are supposed to produce formulae for describing the scattering of objects one measures in the laboratory. Now there is one simple rule that governs a sensible universe: If one considers all of the possible outcomes of an experiment and then conducts the experiment, one is guaranteed that one of the outcomes will actually happen. This property, which we call unitarity, really arises from the laws of probability: namely, that the sum of the probabilities of all possible outcomes of any experiment is precisely unity. With dual string models, however, it turned out that the infinite sums in question do not, in general, respect unitarity. Put another way, they predict that sometimes when you perform an experiment, none of the allowed outcomes of the experiment will actually occur.
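  Stated symbolically (a formulation that is standard, though not spelled out in the text): if an experiment has possible outcomes labeled by i, each occurring with probability P_i, unitarity is simply the requirement that

```latex
\sum_{i} P_i \;=\; 1,
```

and the trouble with the original dual models was that their predicted “probabilities” failed to satisfy this condition.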

  Thankfully, however, there turned out to be an explicit mathematical solution to this mathematical dilemma, which will be far from obvious upon first reading it, but here goes: If the fundamental objects in the theory, relativistic strings, lived not in a four-dimensional world, but a twenty-six-dimensional world, then unitarity (i.e., sensible probabilities) could be preserved.

  It turns out that it is precisely the infinite sum I discussed earlier that implies this weird need for twenty-six dimensions to preserve unitarity. Considering scattering processes between strings, “virtual strings” could be exchanged, with the possibility of having an infinite number of virtual strings contributing to the scattering process. Now it turns out that the result of performing this sum yields a term that screws up the calculation of probabilities, of the following form: [1 + 1⁄2(D − 2)(1 + 2 + 3 + 4 + 5 + . . .)], with D representing the dimension of space-time. Now, if D = 26, and the infinite series in the second term sums up to −1⁄12, the total result for this offending contribution to physical scattering is precisely zero. Now, you may recall that when Kaluza postulated the existence of a hypothetical mathematical fifth dimension, he did so sheepishly, noting “all the physical and epistemological difficulties.” He essentially suggested that this extra dimension was primarily a mathematical trick, a way of unifying two disparate theories. But Kaluza’s proposal was nothing compared to what appeared to be required for the consistency of dual string models—namely, that the universe must be not five-dimensional, but twenty-six-dimensional.
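  To see the arithmetic explicitly, substitute the regularized value −1⁄12 for the infinite series in the expression just quoted (this worked step is implicit in the text rather than written out there):

```latex
1 + \tfrac{1}{2}(D-2)\bigl(1 + 2 + 3 + 4 + 5 + \cdots\bigr)
\;\longrightarrow\;
1 + \tfrac{1}{2}(D-2)\left(-\tfrac{1}{12}\right)
\;=\; 1 - \frac{D-2}{24},
```

which vanishes precisely when D − 2 = 24, that is, when D = 26.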

  You might wonder whether a mathematical trick is sufficient reason to believe in twenty-two new dimensions of space, and no doubt many physicists at the time did, too. However, nature ultimately came to the rescue to resolve the debate, so that no one had to worry about this issue. Or rather, a much simpler theory than dual strings came along to completely explain the strong force.

  The first inklings that dual strings might not provide the answer to the puzzling nature of the strong interaction came from experiments performed within a year or so of the time that Veneziano first proposed his mathematical solution for duality. If duality held true, then at high energies the rate at which strongly interacting particles scattering off one another produced particles flying off at a fixed angle should have declined dramatically as the energy increased. But the observed falloff, while it did exist, was much less severe than the prediction. It turned out that this finding provided clear support for an idea first proposed at the beginning of the decade by the brilliant theoretical physicist Murray Gell-Mann, who between the mid-1950s and the mid-1960s seemed to have an unerring sense of what directions might prove fruitful for unraveling the experimental confusion in elementary particle physics. Gell-Mann suggested in 1961 that one could classify the existing strongly interacting particle states into a very attractive mathematical pattern, which he called the eightfold way. What made this classification system more than mere taxonomy was that one of its first predictions was that new particles would have to exist in order to fill out some parts of the pattern that had not yet been seen. In one of the most remarkably prescient combinations of experiment and theory in recent times, in 1964 one of those new particles, called the omega-minus, was discovered, more or less exactly as Gell-Mann and his collaborators had predicted.

  By 1964 Gell-Mann—and independently, George Zweig—had recognized that this underlying mathematical framework could have a physical basis if all of the dozens of strongly interacting particles, now called “hadrons,” were composed of yet more fundamental particles, which Gell-Mann, the consummate scholar and linguist, dubbed “quarks” in honor of a term from James Joyce’s Finnegans Wake. Quarks themselves remained a purely theoretical construct that nevertheless proved remarkably useful in classifying all the observed hadrons. However, in the late 1960s the reality of quarks as physical entities was suggested when the scattering experiments that killed the dual string picture proved instead to be completely compatible with the notion that hadrons were themselves composed of pointlike particles acting almost independently. On its own, however, the quark model was not sufficient to explain the data. If quarks existed, why had they not been directly observed in high-energy scattering experiments? What force or forces might bind them into hadrons, and how could one explain hadron properties in terms of quark properties? And most confusing of all, if hadrons were strongly interacting, which meant that quarks had to be as well, why did the pointlike particles that appeared to make up hadrons act independently, as if they were almost noninteracting, in these high-energy scattering experiments?
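  To make the quark picture concrete, here are the standard assignments from the quark model (textbook values, not quoted from this passage), showing how the observed electric charges of the proton, the neutron, and Gell-Mann’s omega-minus arise from their quark content:

```latex
p = (uud), \qquad n = (udd), \qquad \Omega^{-} = (sss),
\qquad q_u = +\tfrac{2}{3}, \;\; q_d = q_s = -\tfrac{1}{3},
```

so the charges add up to +1, 0, and −1, respectively.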

  Well, I already gave the punch line away several chapters earlier. In 1972–74 a series of remarkable theoretical breakthroughs basically resolved almost all the outstanding problems in elementary particle physics, as it was then understood. In particular, David Gross at Princeton, who had been a student of Geoffrey Chew’s at Berkeley during the heyday of the bootstrap model, and his own student Frank Wilczek were exploring the mathematical behavior of a type of quantum field theory called a Yang-Mills theory, named after the two physicists who had first proposed it way back in 1954. They did so in a last-ditch effort to put an end to what had become known as “quantum field theory,” the theoretical framework that results when one straightforwardly combines quantum mechanics and relativity using familiar fundamental particles. Yang-Mills theories have another, more technical, name that is even more daunting: nonabelian gauge theories. What this term means is that these theories are similar to electromagnetism, which has a mathematical property called gauge invariance, a form of which was first explored by the mathematician Hermann Weyl in his efforts to unify electromagnetism and gravity.
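  A concrete illustration of the gauge invariance being referred to (standard textbook material, not taken from this passage): in electromagnetism the measurable electric and magnetic fields are built from potentials, and those potentials can be shifted by an arbitrary function χ without changing the fields at all,

```latex
\vec{A} \;\to\; \vec{A} + \nabla\chi, \qquad
\phi \;\to\; \phi - \frac{\partial \chi}{\partial t},
```

leaving E and B, and hence all physical predictions, untouched. It is this kind of freedom, suitably generalized, that defines a gauge theory.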

  An equation is said to possess a certain symmetry, or be invariant under some change, whenever that change does not alter its meaning. For example, if A = B, then A + 2 = B + 2. Adding 2 to each side of an equation leaves the meaning of the equation invariant. If A and B represent positions in space, for example, then adding 2 to both sides of the equation would be equivalent to translating both A and B by two units in some direction. Each point would still be in the same position relative to the other. This transformation is called a “translation,” and the equation is said to be “translationally invariant,” or to possess a “translation symmetry.”

  Similarly, the fundamental equations of both gravity and electromagnetism remain invariant when one changes certain quantities in the theory—in the case of gravity, these include the coordinates used to measure the distance between points. As pointed out earlier, the specific coordinates one uses to describe some space are chosen for convenience. The underlying physical properties, like curvature, do not depend upon the choice of coordinates. For electromagnetism, however, the quantity one can freely change is related to an intrinsic characteristic of charged objects, associated, it turns out, with multiplying all charged quantities by a complex number. Weyl thought one could make this latter quantity appear as if it were a kind of coordinate transformation, achieved by changing the scale (or “gauge”) of distance measurements. In this way he hoped to “unify” the “symmetries” of electromagnetism and gravity as being associated with different kinds of coordinate transformations, but he was wrong. Nevertheless, it turns out that the separate symmetries of these two theories imply that gravity and electromagnetism share one feature in common: In both, the strength of the force between (massive or charged, respectively) objects falls off with the square of the distance between them. It turns out that when one attempts to turn these theories into quantum theories, this particular force law, which means the force is long ranged, requires, via the uncertainty principle, the existence of a massless particle that can be exchanged between objects and by which the force is transmitted. In the case of electromagnetism this particle is called the photon, and in gravity we call the (not yet directly measured) particle the graviton.
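  The shared feature singled out here can be written down directly. In the familiar static limits (standard formulas, not reproduced in the text), both forces fall off as the square of the separation r:

```latex
F_{\text{grav}} \;=\; G\,\frac{m_1 m_2}{r^2},
\qquad
F_{\text{elec}} \;=\; \frac{1}{4\pi\epsilon_0}\,\frac{q_1 q_2}{r^2}.
```

It is this long-range, inverse-square behavior that, in the quantum theory, goes hand in hand with a massless carrier particle: the photon and the graviton, respectively.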

  However, in nonabelian or Yang-Mills theories, because the transformations that can leave the equations the same are more complex, instead of having only one massless force carrier field, like the photon in electromagnetism, these theories can have numerous such fields. Moreover, in electromagnetism the photon, while it is emitted and absorbed by objects that carry electric charge, does not itself carry an electric charge. But in Yang-Mills theories the force carriers themselves are charged and thus interact with one another as well as with matter. These theories had begun to have newfound currency in the late 1960s after it was proposed—it later turned out correctly—independently by Glashow, Weinberg, and Salam, who later shared the Nobel Prize for their insight, that one such nonabelian gauge theory could correctly describe all aspects of the weak interaction that converted protons into neutrons, and was responsible for the decay of neutrons into protons, electrons, and neutrinos. Gross, Wilczek, and independently David Politzer, a graduate student of Sidney Coleman’s at Harvard, each turned his attention to another nonabelian gauge theory whose form ultimately turned out to have certain properties that suggested it might be appropriate to describe the interactions between quarks that bound them together into hadrons. Recall that Gross, who was trained in Chew’s “bootstrap” group at Berkeley, was exploring this theory in hope of ruling it out as the last possible quantum field theory—and hence the last theory that was based on elementary particles as the fundamental quantities of interest—that might explain the exotic properties that seemed to be required to result in the high-energy scattering behavior of hadrons.

  Much to his surprise, however, when he, Wilczek, and also Politzer completed their calculations, which explored precisely how virtual particles and antiparticles in this theory might affect how the force between quarks evolved as the quarks got closer together, it turned out that a miracle occurred. As Gross later put it: “For me, the discovery of asymptotic freedom was totally unexpected. Like an atheist who has just received a message from a burning bush, I became an immediate true believer.”

  The theory, which we now call quantum chromodynamics, or QCD for short, had precisely the property needed to explain the experimental data: Namely, the force between quarks would grow weaker as the quarks got closer—which implies, naturally, that as one pulled them farther apart the force would get stronger. This could explain why in high-energy scattering experiments the individual quarks close together inside the proton might appear almost noninteracting, while at the same time no scattering experiment had yet been successful in knocking a single quark apart from its neighbors. Discovering the property that quark interactions grew weaker with closer proximity—which they dubbed asymptotic freedom—enabled them, and since then many other researchers, to calculate and predict very precisely the behavior of strongly interacting particles in high-energy collisions. Needless to say, the predictions have all been correct. The converse property, which suggests that the force between quarks continues to grow without bound as you try to separate them, and which has since been dubbed confinement, has not yet been fully proven to arise from QCD. However, numerical calculations with computers all suggest that it is indeed a property of the theory that is now known to describe the strong force. Gross, Wilczek, and Politzer were hence awarded the Nobel Prize in 2004 for their discovery of asymptotic freedom thirty years earlier. Thus, out of the incredible experimental confusion of the 1940s, ’50s, and ’60s had ultimately arisen a beautiful set of theories, now called the standard model, that described all the known, nongravitational forces in nature in terms of rather elegant mathematical quantum field theories called gauge theories. The simplest extension of the basic laws of nature, involving quantum mechanics, relativity, and electromagnetism, had ultimately triumphed over the competing mathematical elegance of exotic ideas such as dual string models, along with their exciting, if somewhat daunting, requirement of extra dimensions.
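  For the quantitatively inclined, asymptotic freedom has a compact standard expression (the leading-order formula from the literature, not derived in this book): the strong coupling “runs” with the energy Q at which it is probed, shrinking logarithmically as Q grows,

```latex
\alpha_s(Q^2) \;=\; \frac{12\pi}{\left(33 - 2 n_f\right)\,\ln\!\left(Q^2/\Lambda^2\right)},
```

where n_f is the number of quark flavors and Λ is a reference scale of a few hundred MeV. Because 33 − 2n_f is positive for the quark flavors found in nature, the coupling indeed weakens at high energies (short distances) and grows at low energies (large distances), just as described above.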

  But the game was far from over. The fatal warts of dual strings, at least as far as explaining the strong interaction was concerned, would later be turned into beauty marks in a much more ambitious program to unify gravity with the other three forces in nature. And the very properties of gauge fields and the matter that couples to them, combined with the remarkable theoretical successes that had been achieved by studying them, would lead theorists to once again revisit the very first effort to unify the first known gauge theories: gravity and electromagnetism. In so doing they would once again be driven to reconsider whether extra dimensions might be the key to understanding nature.

 
