Genius: The Life and Science of Richard Feynman


by James Gleick


  The Newest Physics

  The theory of the fast and the theory of the small were narrowing the focus of the few dozen men with the suasion to say what physics was. Most of human experience passed in the vast reality that was neither fast nor small, where relativity and quantum mechanics seemed unnecessary and unnatural, where rivers ran, clouds flowed, baseballs soared and spun by classical means—but to young scientists seeking the most fundamental knowledge about the fabric of their universe, classical physics had no more to say. They could not ignore the deliberately disorienting rhetoric of the quantum mechanicians, nor the unifying poetry of Einstein’s teacher Hermann Minkowski: “Space of itself and time of itself will sink into mere shadows, and only a kind of union between them shall survive.”

  Later, quantum mechanics suffused into the lay culture as a mystical fog. It was uncertainty, it was acausality, it was the Tao updated, it was the century’s richest fount of paradoxes, it was the permeable membrane between the observer and the observed, it was the funny business sending shudders up science’s all-too-deterministic scaffolding. For now, however, it was merely a necessary and useful contrivance for accurately describing the behavior of nature at the tiny scales now accessible to experimenters.

  Nature had seemed so continuous. Technology, however, made discreteness and discontinuity a part of everyday experience: gears and ratchets creating movement in tiny jumps; telegraphs that digitized information in dashes and dots. What about the light emitted by matter? At everyday temperatures the light is infrared, its wavelengths too long to be visible to the eye. At higher temperatures, matter radiates at shorter wavelengths: thus an iron bar heated in a forge glows red, yellow, and white. By the turn of the century, scientists were struggling to explain this relationship between temperature and wavelength. If heat was to be understood as the motion of molecules, perhaps this precisely tuned radiant energy suggested an internal oscillation, a vibration with the resonant tonality of a violin string. The German physicist Max Planck pursued this idea to its logical conclusion and announced in 1900 that it required an awkward adjustment to the conventional way of thinking about energy. His equations produced the desired results only if one supposed that radiation was emitted in lumps, discrete packets called quanta. He calculated a new constant of nature, the indivisible unit underlying these lumps. It was a unit, not of energy, but of the product of energy and time—the quantity called action.
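In modern notation (a gloss, not Gleick's text), Planck's hypothesis reads:

```latex
% Planck (1900): radiation of frequency \nu is emitted in discrete quanta,
% each carrying energy
E = h\nu , \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s} .
% The constant h has dimensions of energy times time -- that is, of action.
```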

  Five years later Einstein used Planck’s constant to explain another puzzle, the photoelectric effect, in which light absorbed by a metal knocks electrons free and creates an electric current. He, too, followed the relationship between wavelength and current to an inevitable mathematical conclusion: that light itself behaves not as a continuous wave but as a broken succession of lumps when it interacts with electrons.
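Einstein's account of the photoelectric effect, in the standard modern textbook form (a gloss on the paragraph above):

```latex
% Photoelectric effect (Einstein, 1905): a light quantum of frequency \nu
% delivers all its energy h\nu to a single electron; W is the metal's
% work function.
K_{\max} = h\nu - W
% Electrons escape only if h\nu > W, however intense the light --
% the signature of lumps rather than a continuous wave.
```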

  These were dubious claims. Most physicists found Einstein’s theory of special relativity, published the same year, more palatable. But in 1913 Niels Bohr, a young Dane working in Ernest Rutherford’s laboratory in Manchester, England, proposed a new model of the atom built on these quantum underpinnings. Rutherford had recently imagined the atom as a solar system in miniature, with electrons orbiting the nucleus. Without a quantum theory, physicists would have to accept the notion of electrons gradually spiraling inward as they radiated some of their energy away. The result would be continuous radiation and the eventual collapse of the atom in on itself. Bohr instead described an atom whose electrons could inhabit only certain orbits, prescribed by Planck’s indivisible constant. When an electron absorbed a light quantum, it meant that in that instant it jumped to a higher orbit: the soon-to-be-proverbial quantum jump. When the electron jumped to a lower orbit, it emitted a light quantum at a certain frequency. Everything else was simply forbidden. What happened to the electron “between” orbits? One learned not to ask.
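Bohr's quantization rule and jump condition, stated in the modern form (not quoted from the text):

```latex
% Bohr (1913): the allowed orbits, fixed by Planck's constant, give
% hydrogen a discrete ladder of energies:
E_n = -\frac{13.6\ \mathrm{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots
% A jump between rungs emits or absorbs exactly one light quantum:
h\nu = E_m - E_n .
```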

  These new kinds of lumpiness in the way science conceived of energy were the essence of quantum mechanics. It remained to create a theory, a mathematical framework that would accommodate the working out of these ideas. Classical intuitions had to be abandoned. New meanings had to be assigned to the notions of probability and cause. Much later, when most of the early quantum physicists were already dead, Dirac, himself chalky-haired and gaunt, with just a trace of white mustache, made the birth of quantum mechanics into a small fable. By then many scientists and writers had done so, but rarely with such unabashed stick-figure simplicity. There were heroes and almost heroes, those who reached the brink of discovery and those whose courage and faith in the equation led them to plunge onward.

  Dirac’s simple morality play began with LORENTZ. This Dutch physicist realized that light shines from the oscillating charges within the atom, and he found a way of rearranging the algebra of space and time that produced a strange contraction of matter near the speed of light. As Dirac said, “Lorentz succeeded in getting correctly all the basic equations needed to establish the relativity of space and time, but he just was not able to make the final step.” Fear held him back.

  Next came a bolder man, EINSTEIN. He was not so inhibited. He was able to move ahead and declare space and time to be joined.

  HEISENBERG started quantum mechanics with “a brilliant idea”: “one should try to construct a theory in terms of quantities which are provided by experiment, rather than building it up, as people had done previously, from an atomic model which involved many quantities which could not be observed.” This amounted to a new philosophy, Dirac said.

  (Conspicuously a noncharacter in Dirac’s fable was Bohr, whose 1913 model of the hydrogen atom now represented the old philosophy. Electrons whirling about a nucleus? Heisenberg wrote privately that this made no sense: “My whole effort is to destroy without a trace the idea of orbits.” One could observe light of different frequencies shining from within the atom. One could not observe electrons circling in miniature planetary orbits, nor any other atomic structure.)

  It was 1925. Heisenberg set out to pursue his conception wherever it might lead, and it led to an idea so foreign and surprising that “he was really scared.” It seemed that Heisenberg’s quantities, numbers arranged in matrices, violated the usual commutative law of multiplication that says a times b equals b times a. Heisenberg’s quantities did not commute. There were consequences. Equations in this form could not specify both momentum and position with definite precision. A measure of uncertainty had to be built in.
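Heisenberg's noncommuting quantities can be exhibited directly. The sketch below is illustrative, not Heisenberg's original notation: it builds position and momentum matrices for a harmonic oscillator, truncated to a 4×4 corner of the infinite matrices, in units where ħ = 1, and shows that the order of multiplication matters.

```python
import numpy as np

# Illustrative truncation (4x4) of Heisenberg's infinite matrices,
# in units with hbar = 1; the variable names are our own.
n = 4
a = np.diag(np.sqrt(np.arange(1, n)), k=1)   # lowering operator (real matrix)
X = (a + a.T) / np.sqrt(2)                   # position matrix
P = (a - a.T) / (1j * np.sqrt(2))            # momentum matrix (Hermitian)

commutator = X @ P - P @ X
# Away from the truncation edge, the diagonal entries equal i*hbar = 1j:
# X and P do not commute, so position and momentum cannot both be
# specified with definite precision.
print(np.round(np.diag(commutator), 6))
```

The nonzero commutator is exactly the "measure of uncertainty" built into equations of this form: no rearrangement of finite matrices can make `X @ P` equal `P @ X`.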

  A manuscript of Heisenberg’s paper made its way to DIRAC himself. He studied it. “You see,” he said, “I had an advantage over Heisenberg because I did not have his fears.”

  Meanwhile, SCHRÖDINGER was taking a different route. He had been struck by an idea of DE BROGLIE two years before: that electrons, those pointlike carriers of electric charge, are neither particles nor waves but a mysterious combination. Schrödinger set out to make a wave equation, “a very neat and beautiful equation,” that would allow one to calculate electrons tugged by fields, as they are in atoms.

  Then he tested his equation by calculating the spectrum of light emitted by a hydrogen atom. The result: failure. Theory and experiment did not agree. Eventually, however, he found that if he compromised and ignored the effects of relativity his theory agreed more closely with observations. He published this less ambitious version of his equation.

  Thus fear triumphed again. “Schrödinger had been too timid,” Dirac said. Two other men, KLEIN and GORDON, rediscovered the more complete version of the theory and published it. Because they were “sufficiently bold” not to worry too much about experiment, the first relativistic wave equation now bears their names.

  Yet the Klein-Gordon equation still produced mismatches with experiments when calculations were carried out carefully. It also had what seemed to Dirac a painful logical flaw. It implied that the probability of certain events must be negative, less than zero. Negative probabilities, Dirac said, “are of course quite absurd.”
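The equation Dirac objected to, written in the modern standard symbols (a gloss, not the original notation):

```latex
% Klein-Gordon equation for a free particle of mass m:
\frac{1}{c^2}\frac{\partial^2 \psi}{\partial t^2} - \nabla^2 \psi
  + \left(\frac{mc}{\hbar}\right)^{\!2} \psi = 0 .
% Because the equation is second order in time, its conserved density,
\rho \;\propto\; i\left( \psi^* \frac{\partial \psi}{\partial t}
       - \psi \frac{\partial \psi^*}{\partial t} \right),
% is not positive definite -- the "negative probabilities" Dirac
% found quite absurd.
```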

  It remained only for Dirac to invent—or was it “design” or “discover”?—a new equation for the electron. This was exceedingly beautiful in its formal simplicity and the sense of inevitability it conveyed, after the fact, to sensitive physicists. The equation was a triumph. It correctly predicted (and so, to a physicist, “explained”) the newly discovered quantity called spin, as well as the hydrogen spectrum. For the rest of his life Dirac’s equation remained his signal achievement. It was 1927. “That is the way in which quantum mechanics was started,” Dirac said.
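Dirac's equation, in the covariant notation that later became standard (a modern gloss, not Gleick's text):

```latex
% Dirac equation for the electron: \psi is a four-component spinor,
% \gamma^\mu are the 4x4 Dirac matrices.
\left( i\hbar\, \gamma^\mu \partial_\mu - mc \right) \psi = 0 .
% First order in time, so the density \psi^\dagger \psi stays positive;
% and the four components carry spin one-half automatically.
```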

  These were the years of Knabenphysik, boy physics. When they began, Heisenberg was twenty-three and Dirac twenty-two. (Schrödinger was an elderly thirty-seven, but, as one chronicler noted, his discoveries came “during a late erotic outburst in his life.”) A new Knabenphysik began at MIT in the spring of 1936. Dick Feynman and T. A. Welton were hungry to make their way into quantum theory, but no course existed in this nascent science, so much more obscure even than relativity. With guidance from just a few texts they embarked on a program of self-study. Their collaboration began in one of the upstairs study rooms of the Bay State Road fraternity house and continued past the end of the spring term. Feynman returned home to Far Rockaway, Welton to Saratoga Springs. They filled a notebook, mailing it back and forth, and in a period of months they recapitulated nearly the full sweep of the 1925–27 revolution.

  “Dear R. P. …” Welton wrote on July 23. “I notice you write your equation:

  This was the relativistic Klein-Gordon equation. Feynman had rediscovered it, by correctly taking into account the tendency of matter to grow more massive at velocities approaching the speed of light—not just quantum mechanics, but relativistic quantum mechanics. Welton was excited. “Why don’t you apply your equation to a problem like the hydrogen atom, and see what results it gives?” Just as Schrödinger had done ten years before, they worked out the calculation and saw that it was wrong, at least when it came to making precise predictions.

  “Here’s something, the problem of an electron in the gravitational field of a heavy particle. Of course the electron would contribute something to the field …”

  “I wonder if the energy would be quantized? The more I think about the problem, the more interesting it sounds. I’m going to try it …

  “… I’ll probably get an equation that I can’t solve anyway,” Welton added ruefully. (When Feynman got his turn at the notebook he scrawled in the margin, “Right!”) “That’s the trouble with quantum mechanics. It’s easy enough to set up equations for various problems, but it takes a mind twice as good as the differential analyzer to solve them.”

  General relativity, barely a decade old, had merged gravity and space into a single object. Gravity was a curvature of space-time. Welton wanted more. Why not tie electromagnetism to space-time geometry as well? “Now you see what I mean when I say, I want to make electrical phenomena a result of the metric of a space in the same way that gravitational phenomena are. I wonder if your equation couldn’t be extended to Eddington’s affine geometry…” (In response Feynman scribbled: “I tried it. No luck yet.”)

  Feynman also tried to invent an operator calculus, writing rules of differentiation and integration for quantities that did not commute. The rules would have to depend on the order of the quantities, themselves matrix representations of forces in space and time. “Now I think I’m wrong on account of those darn partial integrations,” Feynman wrote. “I oscillate between right and wrong.”

  “Now I know I’m right … In my theory there are a lot more ‘fundamental’ invariants than in the other theory.”

  And on they went. “Hot dog! after 3 wks of work … I have at last found a simple proof,” Feynman wrote. “It’s not important to write it, however. The only reason I wanted to do it was because I couldn’t do it and felt that there were some more relations between the An & their derivatives that I had not discovered … Maybe I’ll get electricity into the metric yet! Good night, I have to go to bed.”

  The equations came fast, penciled across the notebook pages. Sometimes Feynman called them “laws.” As he worked to improve his techniques for calculating, he also kept asking himself what was fundamental and what was secondary, which were the essential laws and which were derivative. In the upside-down world of early quantum mechanics, it was far from obvious. Heisenberg and Schrödinger had taken starkly different routes to the same physics. Each in his way had embraced abstraction and renounced visualization. Even Schrödinger’s waves defied every conventional picture. They were not waves of substance or energy but of a kind of probability, rolling through a mathematical space. This space itself often resembled the space of classical physics, with coordinates specifying an electron’s position, but physicists found it more convenient to use momentum space (denoted by Pα), a coordinate system based on momentum rather than position—or based on the direction of a wavefront rather than any particular point on it. In quantum mechanics the uncertainty principle meant that position and momentum could no longer be specified simultaneously. Feynman in the August after his sophomore year began working with coordinate space (Qα)—less convenient for the wave point of view, but more directly visualizable.

  “Pα is no more fundamental than Qα nor vice versa—why then has Pα played such an important role in theory and why don’t I try Qα instead of Pα in certain generalizations of equations …” Indeed, he proved that the customary approach could be derived directly from the theory as cast in terms of momentum space.
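The equivalence Feynman proved has a compact modern statement: the coordinate-space and momentum-space descriptions are Fourier transforms of one another, carrying identical information, so neither is more fundamental (a gloss in modern notation; Qα and Pα are the notebook's symbols):

```latex
% A state written in momentum space, \phi(p), determines the same state
% in coordinate space, \psi(q), and vice versa:
\psi(q) = \frac{1}{\sqrt{2\pi\hbar}} \int \phi(p)\, e^{ipq/\hbar}\, dp ,
% and the Fourier pairing enforces the uncertainty relation
\Delta q\, \Delta p \ge \frac{\hbar}{2} .
```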

  In the background both boys were worrying about their health. Welton had an embarrassing and unexplained tendency to fall asleep in his chair, and during the summer break he was taking naps, mineral baths, and sunlamp treatments—doses of high ultraviolet radiation from a large mercury arc light. Feynman suffered something like nervous exhaustion as he finished his sophomore year. At first he was told he would have to stay in bed all summer. “I’d go nuts if it were me,” T. A. wrote in their notebook. “Anyhow, I hope you get to school all right in the fall. Remember, we’re going to be taught quantum mechanics by no less an authority than Prof. Morse himself. I’m really looking forward to that.” (“Me too,” Feynman wrote.)

  They were desperately eager to be at the front edge of physics. They both started reading journals like the Physical Review. (Feynman made a mental note that a surprising number of articles seemed to be coming from Princeton.) Their hope was to catch up on the latest discoveries and to jump ahead. Welton would set to work on a development in wave tensor calculus; Feynman would tackle an esoteric application of tensors to electrical engineering, and only after wasting several months did they begin to realize that the journals made poor Baedekers. Much of the work was out of date by the time the journal article appeared. Much of it was mere translation of a routine result into an alternative jargon. News did sometimes break in the Physical Review, if belatedly, but the sophomores were ill equipped to pick it out of the mostly inconsequential background.

  Morse had taught the second half of the theoretical physics course that brought Feynman and Welton together, and he had noticed these sophomores, with their penetrating questions about quantum mechanics. In the fall of 1937 they, along with an older student, met with Morse once a week and began to fit their own blind discoveries into the context of physics as physicists understood it. They finally read Dirac’s 1935 bible, The Principles of Quantum Mechanics. Morse put them to work calculating the properties of different atoms, using a method of his own devising. It computed energies by varying the parameters in equations known as hydrogenic radial functions—Feynman insisted on calling them hygienic functions—and it required more plain, plodding arithmetic than either boy had ever encountered. Fortunately they had calculators, a new kind that replaced the old hand cranks with electric motors. Not only could the calculators add, multiply, and subtract; they could divide, though it took time. They would enter numbers by turning metal dials. They would turn on the motor and watch the dials spin toward zero. A bell would ring. The chug-chug-ding-ding rang in their ears for hours.

  In their spare time Feynman and Welton used the same machines to earn money through a Depression agency, the National Youth Administration, calculating the atomic lattices of crystals for a professor who wanted to publish reference tables. They worked out faster methods of running the calculator. And when they thought that they had their system perfected, they made another calculation: how long it would take to complete the job. The answer: seven years. They persuaded the professor to set the project aside.

  Shop Men

  MIT was still an engineering school, and an engineering school in the heyday of mechanical ingenuity. There seemed no limit to the power of lathes and cams, motors and magnets, though just a half-generation later the onset of electronic miniaturization would show that there had been limits after all. The school’s laboratories, technical classes, and machine shops gave undergraduates a playground like none other in the world. When Feynman took a laboratory course, the instructor was Harold Edgerton, an inventor and tinkerer who soon became famous for his high-speed photographs, made with a stroboscope, a burst of light slicing time more finely than any mechanical shutter could. Edgerton extended human sight into the realm of the very fast just as microscopes and telescopes were bringing into view the small and the large. In his MIT workshop he made pictures of bullets splitting apples and cards; of flying hummingbirds and splashing milk drops; of golf balls at the moment of impact, deformed to an ovoid shape that the eye had never witnessed. The stroboscope showed how much had been unseen. “All I’ve done is take God Almighty’s lighting and put it in a container,” he said. Edgerton and his colleagues gave body to the ideal of the scientist as a permanent child, finding ever more ingenious ways of taking the world apart to see what was inside.

  That was an American technical education. In Germany a young would-be theorist could spend his days hiking around alpine lakes in small groups, playing chamber music and arguing philosophy with an earnest Magic Mountain volubility. Heisenberg, whose name would come to stand for the twentieth century’s most famous kind of uncertainty, grew enraptured as a young student with his own “utter certainty” that nature expressed a deep Platonic order. The strains of Bach’s D Minor Chaconne, the moonlit landscapes visible through the mists, the atom’s hidden structure in space and time—all seemed as one. Heisenberg had joined the youth movement that formed in Munich after the trauma of World War I, and the conversation roamed freely: Did the fate of Germany matter “more than that of all mankind”? Can human perception ever penetrate the atom deeply enough to see why a carbon atom bonds with two but never three oxygen atoms? Does youth have “the right to fashion life according to its own values”? For such students philosophy came first in physics. The search for meaning, the search for purpose, led naturally down into the world of atoms.

 
