original interference pattern to the originally expected pattern—with
a bright region behind each of the two slits, just as if one were
shooting billiard balls or bullets and not waves toward the screen.
In other words, in attempting to verify your classical intuition,
you changed the behavior of the electrons. Or, as more commonly
asserted in quantum mechanics, measurement of a system can alter
its behavior.
One of the many seemingly impossible aspects of quantum
mechanics is that there is no experiment you can perform that
demonstrates that in the absence of measurement the electrons
behave in a sensible classical way.
This strange wavelike nature of objects that would otherwise be
considered to be particles—such as electrons—is mathematically
expressed by assigning to each electron a “wave function,” which
describes the probability of finding that electron at any given point.
If the wave function takes on non-zero values at many different
points, then the electron’s position cannot be pinned down in
advance of a measurement. In other words, there is a non-zero
probability that the electron is not actually localized at any
specific point in space before its position is measured.
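For readers who like symbols, the standard modern statement of this rule, due to Max Born, is compact: the probability of finding the electron at a position x is the squared magnitude of its wave function ψ,

\[
P(x) = |\psi(x)|^2 .
\]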
While you might imagine that this is a simple problem of not
having access to all the information we need to locate the particle
until we make a measurement, Young’s double-slit experiment,
when updated for electrons, demonstrated that this is most certainly
not the case. Any “sensible” classical picture of what is happening
between measurements is inconsistent with the data.
• • •
The strange behavior of electrons was not the first evidence that the
microscopic world could not be understood by intuitive classical
logic. Once again, in keeping with the revolutionary developments in
our understanding of nature since Plato, the discovery of quantum
mechanics began with a consideration of light.
Recall that if we perform Young’s double-slit experiment in
Plato’s cave with light rays, we get the interference pattern on the
wall that Young discovered, which demonstrated that light was
indeed a wave. So far, so good. However, if the light source is
sufficiently weak, then if we try to detect the light as it passes
through either of the slits, something strange happens. We will
measure the light beam as traveling through one slit or the other,
not both. And as with electrons, in this case the pattern on the wall
will now change, looking as it would if light were particles and not
waves.
In fact, light also behaves like both a particle and a wave,
depending on the circumstances under which you choose to
measure it. The individual particles of light, which we now call
photons, were first labeled quanta by the German theoretical
physicist Max Planck, who suggested in 1900 that light might be
emitted or absorbed in some smallest bundle (although the idea that
light might come in discrete packets had earlier been floated by the
great Ludwig Boltzmann in 1877).
I have come to admire Planck even more as I have learned about
his life. Like Einstein, he was an unpaid lecturer and was not offered
an academic position after completing his thesis. During those years
he devoted himself to understanding the nature of heat and
produced several important results in thermodynamics. Five
years after defending his thesis, he was finally offered a university
position, and he then quickly rose up the ranks and became a full
professor at the prestigious University of Berlin in 1892.
In 1894 he turned to the question of the nature of light emitted by
hot objects, in part driven by commercial considerations (the first
example I know of in the story I have been telling where
fundamental physics was commercially motivated). He was
commissioned to explore how to get the maximum amount of light
out of the newly invented lightbulbs while using the minimum
amount of energy.
We all know that when we heat up an oven element it first glows
red, and then, when it gets hotter, it begins to glow blue. But why?
Surprisingly, the conventional approaches to this problem were
unable to reproduce these observations. After struggling with the
problem for six years, Planck presented a revolutionary proposal
about radiation that agreed with observations.
Originally there was nothing revolutionary about his derivation,
but within two months he had revised his analysis to accommodate
ideas about what was happening at a fundamental level. In a quote
that has endeared him to me since I first read it, he wrote that his
new approach arose as “an act of despair. . . . I was ready to sacrifice
any of my previous convictions about physics.”
This reflects to me the fundamental quality that makes the
scientific process so effective, and which is so clearly represented in
the rise of quantum mechanics. “Previous convictions” are just
convictions waiting to be overturned—by empirical data, if
necessary. We throw out cherished old notions like yesterday’s
newspaper if they don’t work. And they didn’t work in explaining the
nature of radiation emitted by matter.
Planck derived his law of radiation from the fundamental
assumption that light, which was a wave, nevertheless was emitted
only in “packets” of some minimum energy—proportional to the
frequency of the radiation in question. He labeled the constant that
related the energy to the frequency the “action quantum,” which is
now called Planck’s constant.
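In modern notation, Planck’s assumption fits on a single line: a quantum of light of frequency ν carries energy

\[
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\cdot s},
\]

and the tiny size of Planck’s constant h is why this graininess of light escapes notice in everyday life.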
This may not sound so revolutionary, and as Faraday did with
electric fields, Planck viewed his assumption as merely a formal
mathematical crutch to aid in his analysis. He later stated, “Actually I
did not think much about it.” Nevertheless, this proposal that light
was emitted in particle-like packets is clearly difficult to reconcile
with the classical picture of light as a wave. The energy carried by a
wave is simply related to the magnitude of its oscillations, which can
change continuously from zero. However, according to Planck, the
amount of energy that could be emitted in a light wave of a given
frequency had an absolute minimum. This minimum was termed an
“energy quantum.”
Planck subsequently tried to develop a classical physical
understanding of these energy quanta, but failed—causing him, as he
put it, “much trouble.” Still, unlike a number of his colleagues, he
recognized that the universe didn’t exist to make his life easier.
Referring to the physicist and astronomer Sir James Jeans, who was
unwilling to give up classical notions in the face of the evidence
provided by radiation, Planck stated, “I am unable to understand
Jeans’s stubbornness—he is an example of a theoretician as should
never be existing, the same as Hegel was for philosophy. So much
the worse for the facts if they don’t fit.” (Just to be clear, in case
readers are moved to write me letters, Planck cast this aspersion on
Hegel, not me!)
Planck later became friends with another physicist who had let
the facts drive him toward another revolutionary idea, Albert
Einstein. In 1914, when Planck had become dean at the University of
Berlin, he established a new professorship for Einstein there. At
first Planck could not accept Einstein’s remarkable proposal—made
in 1905, the same year in which he proposed the Special Theory of
Relativity—that not only was light emitted by matter in quantum
packets, but that light beams themselves existed as bunches of these
quanta—that light itself was made up of particle-like objects, which
we now call photons.
Einstein was driven to this proposal to explain a phenomenon
called the photoelectric effect, discovered by Philipp Lenard in 1902
—a physicist whose anti-Semitism would later play a key role in
delaying Einstein’s Nobel Prize, and ensuring, curiously, if perhaps
poetically, that it would be not for Einstein’s work on relativity, but
rather on the photoelectric effect. In the photoelectric effect, light
shining on a metal surface can knock electrons out of atoms and
produce a current. However, no matter how intense the light, no
electrons would be emitted if the frequency of the light was below
some threshold. The moment the frequency was raised above that
threshold, a photoelectric current would be generated.
Einstein realized, correctly, that this could be explained if the light
came in minimum packets of energy, with the energy proportional
to the frequency of light—as Planck had postulated for light emitted
by matter. In this case, only light with frequencies greater than some
threshold frequency could contain quanta energetic enough to kick
electrons out of atoms.
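In symbols, Einstein’s argument is disarmingly simple. If each quantum carries energy hν, and freeing an electron from the metal costs a minimum energy W (what we now call the work function), then electrons emerge only when hν exceeds W, carrying away the difference as kinetic energy:

\[
E_{\mathrm{kinetic}} = h\nu - W .
\]

The threshold frequency is simply W/h, no matter how intense the light.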
Planck could accept the quantized emission of radiation as
explaining his radiation law, but the assumption that light itself was
quantum-like (i.e., particle-like) was so foreign to the common
understanding of light as an electromagnetic wave that Planck
balked. Only six years later, at the Solvay Conference in Belgium, a
gathering that would later become famous, was Einstein finally able to
convince Planck that the classical picture of light had to be
abandoned, and that quanta—aka photons—were real.
Einstein was also the first to actually use a fact that he later
denounced in his famous statement deriding the probabilistic
essence of quantum mechanics and reality: “God does not play dice
with the universe.” He showed that if atoms spontaneously (i.e.,
without direct cause) absorb and emit finite packets of radiation as
electrons jump between discrete energy levels in atoms, then he
could rederive the Planck radiation law.
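The formula he recovered, Planck’s radiation law, gives the brightness of the light emitted at frequency ν by a body at temperature T:

\[
B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/k_B T} - 1},
\]

where c is the speed of light and k_B is Boltzmann’s constant.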
It is ironic that Einstein, who started the quantum revolution but
never joined it, was also perhaps the first to use probabilistic
arguments to describe the nature of matter—a strategy that the
subsequent physicists who turned quantum mechanics into a full
theory would place front and center. As a result, Einstein was one of
the first physicists to demonstrate that God does play dice with the
universe.
To take the analogy a little further, Einstein was one of the first
physicists to demonstrate that the classical notion of causation
begins to break down in the quantum realm. Many people have
taken exception to my proposal that the universe needed no cause
but simply popped into existence from nothing. Yet this is precisely
what happens with the light you are using to read this page.
Electrons in hot atoms emit photons—photons that didn’t exist
before they were emitted—which are emitted spontaneously and
without specific cause. Why is it that we have grown at least
somewhat comfortable with the idea that photons can be created
from nothing without cause, but not whole universes?
The realization that electromagnetic waves were also particles
began a quantum revolution that would change everything about the
way we view nature. To be a particle and a wave at the same time is
impossible classically—as should be clear from the earlier discussion
in this chapter—but it is possible in the quantum world. As should
also be clear, this was just the beginning.
Chapter 7

A UNIVERSE STRANGER THAN FICTION
Therefore do not throw away your confidence, which has a great
reward.
—HEBREWS 10:35
Conventional wisdom might suggest that physicists love to
invent crazy esoterica to explain the universe around us, either
because we have nothing better to do, or because we are particularly
perverse. However, as the unveiling of the quantum world
demonstrates, more often than not it is nature that drags us
scientists, kicking and screaming, away from the safety of what is
familiar.
Nevertheless, to say that the pioneers who pushed us forward into
the quantum world lacked confidence would be a profound
misstatement. The voyage they embarked upon was without
precedent and without guides. The world they were entering defied
all common sense and classical logic, and they had to be prepared at
every turn for a change in the rules.
Imagine taking a road trip to another country, where the
inhabitants all speak a foreign language, and the laws are not based
on experiences that compare to any you have ever had in your life.
Moreover, imagine that the traffic signals are hidden and can change
from place to place. Then you can get a sense of where the young
Turks who overturned our understanding of nature in the first half
of the twentieth century were heading.
The analogy between exploring strange new quantum worlds and
embarking on a trek through a new landscape may seem strained,
but exactly such a parallel played out in the
life of none other than Werner Heisenberg, one of the founders of
quantum mechanics, who once reminisced about an evening in the
summer of 1925 on the island of Helgoland, a lovely oasis in the
North Sea, when he realized he had discovered the theory:
It was almost three o’clock in the morning before the final result of
my computations lay before me. The energy principle had held for
all the terms, and I could no longer doubt the mathematical
consistency and coherence of the kind of quantum mechanics to
which my calculations pointed. At first, I was deeply alarmed. I
had the feeling that, through the surface of atomic phenomena, I
was looking at a strangely beautiful interior and felt almost giddy
at the thought that I now had to probe this wealth of
mathematical structures nature had so generously spread out
before me. I was far too excited to sleep, and so, as a new day
dawned, I made for the southern tip of the island, where I had
been longing to climb a rock jutting out into the sea. I now did so
without too much trouble and waited for the sun to rise.
Heisenberg, fresh from obtaining his PhD, had moved to the
distinguished German university in Göttingen to work with Max
Born to try to come up with a consistent theory of quantum
mechanics (a term first used in the paper “On Quantum Mechanics”
by Born in 1924). However, spring hay fever had laid Heisenberg
low, and he escaped the green countryside for the sea. There, he
polished off his ideas about the quantum behavior of atoms and sent
the resulting paper off to Born, who submitted it for publication.
You may be familiar with Heisenberg’s name, not least because of
the famous principle associated with it. The Heisenberg uncertainty
principle has gained a New Age aura, providing fuel for many a
charlatan to take advantage of people for whom quantum mechanics
seems to offer hope of a world where any dream, no matter how
outlandish, is realizable.
Other familiar names, Bohr, Schrödinger, Dirac, and later
Feynman and Dyson, each made great leaps into the unknown. But
they weren’t alone. Physics is a collaborative discipline. Too often
science stories are written as if the protagonists had a sudden Aha!
experience alone late at night. Heisenberg had been working on
quantum mechanics for several years with his PhD supervisor, the
brilliant German scientist Arnold Sommerfeld (whose students
would win four Nobel Prizes, and whose postdoctoral research
assistants would win three), and later with Born (who was finally
recognized with a Nobel almost thirty years later), as well as a young
colleague, Pascual Jordan. Every major triumph we celebrate with a
name and a prize is accompanied by a legion of hardworking, often
less heralded, individuals, each of whom moves forward the line of
scrimmage by a little bit. Baby steps are the norm, not the exception.
The most remarkable leaps into the unknown are often not fully
appreciated, even by their developers, until much later. Thus
Einstein, for example, never trusted his beautiful General Relativity
enough to believe its prediction that the universe cannot be static
but must be expanding or contracting—until observations
demonstrated the expansion. And the world didn’t stand on its head