The Quantum Framework
Quantum mechanics is a conceptual framework for understanding the microscopic properties of the universe. And just as special relativity and general relativity require dramatic changes in our worldview when things are moving very quickly or when they are very massive, quantum mechanics reveals that the universe has properties that are equally startling, if not more so, when examined on atomic and subatomic distance scales. In 1965, Richard Feynman, one of the greatest practitioners of quantum mechanics, wrote,
There was a time when the newspapers said that only twelve men understood the theory of relativity. I do not believe there ever was such a time. There might have been a time when only one man did, because he was the only guy who caught on, before he wrote his paper. But after people read the paper a lot of people understood the theory of relativity in some way or other, certainly more than twelve. On the other hand, I think I can safely say that nobody understands quantum mechanics.1
Although Feynman expressed this view more than three decades ago, it applies equally well today. What he meant is that although the special and general theories of relativity require a drastic revision of previous ways of seeing the world, when one fully accepts the basic principles underlying them, the new and unfamiliar implications for space and time follow directly from careful logical reasoning. If you ponder the descriptions of Einstein's work in the preceding two chapters with adequate intensity, you will—even if just for a moment—recognize the inevitability of the conclusions we have drawn. Quantum mechanics is different. By 1928 or so, many of the mathematical formulas and rules of quantum mechanics had been put in place and, ever since, it has been used to make the most precise and successful numerical predictions in the history of science. But in a real sense those who use quantum mechanics find themselves following rules and formulas laid down by the "founding fathers" of the theory—calculational procedures that are straightforward to carry out—without really understanding why the procedures work or what they really mean. Unlike relativity, few if any people ever grasp quantum mechanics at a "soulful" level.
What are we to make of this? Does it mean that on a microscopic level the universe operates in ways so obscure and unfamiliar that the human mind, evolved over eons to cope with phenomena on familiar everyday scales, is unable to fully grasp "what really goes on"? Or, might it be that through historical accident physicists have constructed an extremely awkward formulation of quantum mechanics that, although quantitatively successful, obfuscates the true nature of reality? No one knows. Maybe some time in the future some clever person will see the way clear to a new formulation that will fully reveal the "whys" and the "whats" of quantum mechanics. And then again, maybe not. The only thing we know with certainty is that quantum mechanics absolutely and unequivocally shows us that a number of basic concepts essential to our understanding of the familiar everyday world fail to have any meaning when our focus narrows to the microscopic realm. As a result, we must significantly modify both our language and our reasoning when attempting to understand and explain the universe on atomic and subatomic scales.
In the following sections we will develop the basics of this language and describe a number of the remarkable surprises it entails. If along the way quantum mechanics seems to you to be altogether bizarre or even ludicrous, you should bear in mind two things. First, beyond the fact that it is a mathematically coherent theory, the only reason we believe in quantum mechanics is that it yields predictions that have been verified to astounding accuracy. If someone can recount the intimate details of your childhood in excruciating detail, it's hard not to believe their claim of being your long-lost sibling. Second, you are not alone in having this reaction to quantum mechanics. It is a view held to a greater or lesser extent by some of the most revered physicists of all time. Einstein refused to accept quantum mechanics fully. And even Niels Bohr, one of the central pioneers of quantum theory and one of its strongest proponents, once remarked that if you do not get dizzy sometimes when you think about quantum mechanics, then you have not really understood it.
It's Too Hot in the Kitchen
The road to quantum mechanics began with a puzzling problem. Imagine that your oven at home is perfectly insulated, that you set it to some temperature, say 400 degrees Fahrenheit, and you give it enough time to heat up. Even if you had sucked all the air from the oven before turning it on, by heating its walls you generate waves of radiation in its interior. This is the same kind of radiation—heat and light in the form of electromagnetic waves—that is emitted by the surface of the sun, or a glowing-hot iron poker.
Here's the problem. Electromagnetic waves carry energy—life on earth, for example, relies crucially on solar energy transmitted to the earth by electromagnetic waves. At the beginning of the twentieth century, physicists calculated the total energy carried by all of the electromagnetic radiation inside an oven at a chosen temperature. Using well-established calculational procedures they came up with a ridiculous answer: For any chosen temperature, the total energy in the oven is infinite.
It was clear to everyone that this was nonsense—a hot oven can embody significant energy but surely not an infinite amount. To understand the resolution proposed by Planck it is worth understanding the problem in a bit more detail. When Maxwell's electromagnetic theory is applied to the radiation in an oven, it shows that the waves generated by the hot walls must have a whole number of peaks and troughs that fit perfectly between opposite surfaces. Some examples are shown in Figure 4.1. Physicists use three terms to describe these waves: wavelength, frequency, and amplitude. The wavelength is the distance between successive peaks or successive troughs of the waves, as illustrated in Figure 4.2. More peaks and troughs mean a shorter wavelength, as they must all be crammed in between the fixed walls of the oven. The frequency refers to the number of up-and-down cycles of oscillation that a wave completes every second. It turns out that the frequency is determined by the wavelength and vice versa: longer wavelengths imply lower frequency; shorter wavelengths imply higher frequency. To see why, think of what happens when you produce waves by shaking a long rope that is tied down at one end. To generate a long wavelength, you leisurely shake your end up and down. The frequency of the waves matches the number of cycles per second your arm goes through and is consequently fairly low. But to generate short wavelengths you shake your end more frantically—more frequently, so to speak—and this yields a higher-frequency wave. Finally, physicists use the term amplitude to describe the maximum height or depth of a wave, as also illustrated in Figure 4.2.
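For readers who want the bookkeeping behind these words (the symbols here are standard physics notation, not part of the text itself): if the opposite walls of the oven are a distance L apart, the whole-number requirement restricts the allowed wavelengths, and each wavelength fixes a frequency through the speed of light c:

\[ \lambda_n = \frac{2L}{n}, \qquad \nu_n = \frac{c}{\lambda_n} = \frac{nc}{2L}, \qquad n = 1, 2, 3, \ldots \]

Larger n means more peaks and troughs crammed between the walls, hence a shorter wavelength and a higher frequency, just as described above.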
In case you find electromagnetic waves a bit abstract, another good analogy to keep in mind is the waves produced by plucking a violin string. Different wave frequencies correspond to different musical notes: the higher the frequency, the higher the note. The amplitude of a wave on a violin string is determined by how hard you pluck it. A harder pluck means that you put more energy into the wave disturbance; more energy therefore corresponds to a larger amplitude. You can hear this, as the resulting tone is louder. Similarly, less energy corresponds to a smaller amplitude and a lower volume of sound.
By making use of nineteenth-century thermodynamics, physicists were able to determine how much energy the hot walls of the oven would pump into electromagnetic waves of each allowed wavelength (how hard the walls would, in effect, "pluck" each wave). The result they found is simple to state: Each of the allowed waves—regardless of its wavelength—carries the same amount of energy (with the precise amount determined by the temperature of the oven). In other words, all of the possible wave patterns within the oven are on completely equal footing when it comes to the amount of energy they embody.
At first this seems like an interesting, albeit innocuous, result. It isn't. It spells the downfall of what has come to be known as classical physics. The reason is this: Even though requiring that all waves have a whole number of peaks and troughs rules out an enormous variety of conceivable wave patterns in the oven, there are still an infinite number that are possible—those with ever more peaks and troughs. Since each wave pattern carries the same amount of energy, an infinite number of them translates into an infinite amount of energy. At the turn of the century, there was a gargantuan fly in the theoretical ointment.
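In the same spirit (this is the textbook equipartition result of nineteenth-century statistical mechanics, stated here as a sketch rather than derived): each allowed wave pattern is assigned the same average energy, proportional to the temperature T, so the sum over the infinite tower of allowed waves diverges:

\[ \bar{E}_n = k_B T \ \text{ for every } n, \qquad E_{\text{total}} = \sum_{n=1}^{\infty} k_B T = \infty, \]

where k_B is Boltzmann's constant. This divergence is what physicists came to call the ultraviolet catastrophe, since the limitless supply of ever shorter wavelengths is the source of the trouble.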
Making Lumps at the Turn of the Century
In 1900 Planck made an inspired guess that allowed a way out of this puzzle and would earn him the 1918 Nobel Prize in physics.2 To get a feel for his resolution, imagine that you and a huge crowd of people—"infinite" in number—are crammed into a large, cold warehouse run by a miserly landlord. There is a fancy digital thermostat on the wall that controls the temperature, but you are shocked when you discover the charges that the landlord levies for heat. If the thermostat is set to 50 degrees Fahrenheit everyone must give the landlord $50. If it is set to 55 degrees everyone must pay $55, and so on. You realize that since you are sharing the warehouse with an infinite number of companions, the landlord will earn an infinite amount of money if you turn on the heat at all.
But on closer reading of the landlord's rules of payment you see a loophole. Because the landlord is a very busy man he does not want to give change, especially not to an infinite number of individual tenants. So he works on an honor system. Those who can pay exactly what they owe, do so. Otherwise, they pay only as much as they can without requiring change. And so, wanting to involve everyone but wanting to avoid the exorbitant charges for heat, you compel your comrades to organize the wealth of the group in the following manner: One person carries all of the pennies, one person carries all of the nickels, one carries all of the dimes, one carries all of the quarters, and so on through dollar bills, five-dollar bills, ten-dollar bills, twenties, fifties, hundreds, thousands, and ever larger (and unfamiliar) denominations. You brazenly set the thermostat to 80 degrees and await the landlord's arrival. When he does come, the person carrying pennies goes to pay first and turns over 8,000. The person carrying nickels then turns over 1,600 of them, the person carrying dimes turns over 800, the person with quarters turns over 320, the person with dollars gives the landlord 80, the person with five-dollar bills turns over 16, the person with ten-dollar bills gives him 8, the person with twenties gives him 4, and the person with fifties hands over one (since 2 fifty-dollar bills would exceed the necessary payment, thereby requiring change). But everyone else carries only a denomination—a minimal "lump" of money—that exceeds the required payment. Therefore they cannot pay the landlord and hence rather than getting the infinite amount of money he expected, the landlord leaves with the paltry sum of $690.
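The landlord's take is easy to check. Here is a minimal sketch in Python (the code and its variable names are illustrative, not part of the story; the $80 charge and the denominations are the ones above):

```python
# Each tenant owes $80 but carries only a single denomination and
# cannot receive change, so each pays the largest multiple of that
# denomination not exceeding the amount owed. Working in cents keeps
# the arithmetic exact.
owed = 8000  # $80.00, in cents
denominations = [1, 5, 10, 25, 100, 500, 1000,
                 2000, 5000, 10000, 100000]  # pennies ... thousand-dollar bills

total = 0
for d in denominations:
    count = owed // d  # the most units payable without requiring change
    total += count * d
    print(f"denomination ${d / 100:>8.2f}: pays {count:5d}, ${count * d / 100:6.2f}")

print(f"The landlord collects ${total / 100:.2f}")  # $690.00
```

Tenants holding hundred-dollar bills or anything larger contribute nothing at all, and the expected infinite haul collapses to $690.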
Planck made use of a very similar strategy to reduce the ridiculous result of infinite energy in an oven to one that is finite. Here's how. Planck boldly guessed that the energy carried by an electromagnetic wave in the oven, like money, comes in lumps. The energy can be one times some fundamental "energy denomination," or two times it, or three times it, and so forth—but that's it. Just as you can't have one-third of a penny or two and a half quarters, Planck declared that when it comes to energy, no fractions are allowed. Now, our monetary denominations are determined by the United States Treasury. Seeking a more fundamental explanation, Planck suggested that the energy denomination of a wave—the minimal lump of energy that it can have—is determined by its frequency. Specifically, he posited that the minimum energy a wave can have is proportional to its frequency: larger frequency (shorter wavelength) implies larger minimum energy; smaller frequency (longer wavelength) implies smaller minimum energy. Roughly speaking, just as gentle ocean waves are long and luxurious while harsh ones are short and choppy, long-wavelength radiation is intrinsically less energetic than short-wavelength radiation.
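Stated as a formula, using the symbol ħ that the text introduces shortly for the proportionality factor: a wave of frequency ν can carry only the energies

\[ E = n \, \hbar \nu, \qquad n = 1, 2, 3, \ldots \]

(or it can lie dormant and carry none), so ħν is the minimal lump, the energy "denomination" of that wave.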
Here's the punch line: Planck's calculations showed that this lumpiness of the allowed energy in each wave cured the previous ridiculous result of infinite total energy. It's not hard to see why. When an oven is heated to some chosen temperature, the calculations based on nineteenth-century thermodynamics predicted the common energy that each and every wave would supposedly contribute to the total. But like those comrades who cannot contribute the common amount of money they each owe the landlord because the monetary denomination they carry is too large, if the minimum energy a particular wave can carry exceeds the energy it is supposed to contribute, it can't contribute and instead lies dormant. Since, according to Planck, the minimum energy a wave can carry is proportional to its frequency, as we examine waves in the oven of ever larger frequency (shorter wavelength), sooner or later the minimum energy they can carry is bigger than the expected energy contribution. Like the comrades in the warehouse entrusted with denominations larger than fifty-dollar bills, these waves with ever-larger frequencies cannot contribute the amount of energy demanded by nineteenth-century physics. And so, just as only a finite number of comrades are able to contribute to the total heat payment—leading to a finite amount of total money—only a finite number of waves are able to contribute to the oven's total energy—again leading to a finite amount of total energy. Be it energy or money, the lumpiness of the fundamental units—and the ever-increasing size of these lumps as we go to higher frequencies or to larger monetary denominations—changes an infinite answer to one that is finite.3
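For the mathematically inclined, this dormancy has a standard quantitative expression (a textbook statistical-mechanics result, quoted here rather than derived in the text): once energies come only in lumps of ħν, the average energy a wave of frequency ν actually carries at temperature T becomes

\[ \bar{E}(\nu) = \frac{\hbar \nu}{e^{\hbar \nu / k_B T} - 1}, \]

which is close to the classical value k_B T when ħν is much smaller than k_B T but plunges exponentially toward zero once ħν exceeds k_B T. Summing this over all the allowed waves now yields a finite total.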
By eliminating the manifest nonsense of an infinite result, Planck had taken an important step. But what really made people believe that his guess had validity is that the finite answer that his new approach gave for the energy in an oven agreed spectacularly with experimental measurements. Specifically, Planck found that by adjusting one parameter that entered into his new calculations, he could predict accurately the measured energy of an oven for any selected temperature. This one parameter is the proportionality factor between the frequency of a wave and the minimal lump of energy it can have. Planck found that this proportionality factor—now known as Planck's constant and denoted ħ (pronounced "h-bar")—is about a billionth of a billionth of a billionth in everyday units.4 The tiny value of Planck's constant means that the size of the energy lumps is typically very small. This is why, for example, it seems to us that we can cause the energy of a wave on a violin string—and hence the volume of sound it produces—to change continuously. In reality, though, the energy of the wave passes through discrete steps, à la Planck, but the size of the steps is so small that the discrete jumps from one volume to another appear to be smooth. According to Planck's assertion, the size of these jumps in energy grows as the frequency of the waves gets higher and higher (while wavelengths get shorter and shorter). This is the crucial ingredient that resolves the infinite-energy paradox.
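To get a feel for just how small these steps are, here is a rough numerical sketch (the 440-hertz note and the one-millijoule wave energy are illustrative guesses, not figures from the text; for the proportionality factor we use the conventional value of Planck's constant in joule-seconds):

```python
# Size of one Planck energy step for a violin string sounding concert A,
# compared with a rough guess for the total energy of the vibration.
h = 6.63e-34         # Planck's constant, joule-seconds (conventional value)
frequency = 440.0    # hertz: concert A, an illustrative musical note
wave_energy = 1e-3   # joules: a rough guess for an audible string vibration

step = h * frequency  # the minimal energy lump at this frequency
print(f"one energy step  : {step:.2e} J")              # about 2.9e-31 J
print(f"steps in the wave: {wave_energy / step:.1e}")  # about 3.4e+27
```

With billions of billions of billions of steps between silence and an audible tone, no ear or meter could notice that the volume changes in jumps.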
As we shall see, Planck's quantum hypothesis does far more than allow us to understand the energy content of an oven. It overturns much about the world that we hold to be self-evident. The smallness of ħ confines most of these radical departures from life-as-usual to the microscopic realm, but if ħ happened to be much larger than it is, the strange happenings at the H-Bar would actually be commonplace. As we shall see, their microscopic counterparts certainly are.
What Are the Lumps?
Planck had no justification for his pivotal introduction of lumpy energy. Beyond the fact that it worked, neither he nor anyone else could give a compelling reason for why it should be true. As the physicist George Gamow once said, it was as if nature allowed one to drink a whole pint of beer or no beer at all, but nothing in between.5 In 1905, Einstein found an explanation, and for this insight he was awarded the 1921 Nobel Prize in physics.
Einstein came up with his explanation by puzzling over something known as the photoelectric effect. In 1887 the German physicist Heinrich Hertz was the first to find that when electromagnetic radiation—light—shines on certain metals, they emit electrons. By itself this is not particularly remarkable. Metals have the property that some of their electrons are only loosely bound within atoms (which is why they are such good conductors of electricity). When light strikes the metallic surface it relinquishes its energy, much as it does when it strikes the surface of your skin, causing you to feel warmer. This transferred energy can agitate electrons in the metal, and some of the loosely bound ones can be knocked clear off the surface.
But the strange features of the photoelectric effect become apparent when one studies more detailed properties of the ejected electrons. At first sight you would think that as the intensity of the light—its brightness—is increased, the speed of the ejected electrons will also increase, since the impinging electromagnetic wave has more energy. But this does not happen. Rather, the number of ejected electrons increases, but their speed stays fixed. On the other hand, it has been experimentally observed that the speed of the ejected electrons does increase if the frequency of the impinging light is increased, and, equivalently, their speed decreases if the frequency of the light is decreased. (For electromagnetic waves in the visible part of the spectrum, an increase in frequency corresponds to a change in color from red to orange to yellow to green to blue to indigo and finally to violet. Frequencies higher than that of violet are not visible and correspond to ultraviolet and, subsequently, X rays; frequencies lower than that of red are also not visible, and correspond to infrared radiation.) In fact, as the frequency of the light used is decreased, there comes a point when the speed of the emitted electrons drops to zero and they stop being ejected from the surface, regardless of the possibly blinding intensity of the light source. For some unknown reason, the color of the impinging light beam—not its total energy—controls whether or not electrons are ejected, and if they are, the energy they have.
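These observations can be compressed into one relation, the very relation that Einstein's lump-based explanation accounts for (the symbol Φ below, the minimum energy needed to free an electron from a given metal, is introduced here just for illustration):

\[ \tfrac{1}{2} m v_{\max}^{2} = \hbar \nu - \Phi, \]

where m is the electron's mass, v_max is the speed of the fastest ejected electrons, and ν is the frequency of the light. Raising the frequency raises the electron speed; raising the intensity raises only the number of ejected electrons; and below the threshold frequency ν₀ = Φ/ħ no electrons emerge at all, however bright the beam.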