Genius: The Life and Science of Richard Feynman


by James Gleick


  The demystification of genius in the age of inventors shaped the scientific culture—with its plainspoken positivism, its experiment-oriented technical schools—that nurtured Feynman and his contemporaries in the twenties and thirties, even as the pendulum swung again toward the more mysterious, more intuitive, and conspicuously less practical image of Einstein. Edison may have changed the world, after all, but Einstein seemed to have reinvented it whole, by means of a single, incomprehensible act of visualization. He saw how the universe must be and announced that it was so. Not since Newton …

  By then the profession of science was expanding rapidly, counting not hundreds but tens of thousands of practitioners. Clearly most of their work, most of science, was ordinary—as Freeman Dyson put it, a business of “honest craftsmen,” “solid work,” “collaborative efforts where to be reliable is more important than to be original.” In modern times it became almost impossible to talk about the processes of scientific change without echoing Thomas S. Kuhn, whose Structure of Scientific Revolutions so thoroughly changed the discourse of historians of science. Kuhn distinguished between normal science—problem solving, the fleshing out of existing frameworks, the unsurprising craft that occupies virtually all working researchers—and revolutions, the vertiginous intellectual upheavals through which knowledge lurches genuinely forward. Nothing in Kuhn’s scheme required individual genius to turn the crank of revolutions. Still, it was Einstein’s relativity, Heisenberg’s uncertainty, Wegener’s continental drift. The new mythology of revolutions dovetailed neatly with the older mythology of genius—minds breaking with the standard methods and seeing the world new. Dyson’s kind of genius destroyed and delivered. Schwinger’s quantum electrodynamics and Feynman’s may have been mathematically the same, but one was conservative and the other revolutionary. One extended an existing line of thought. The other broke with the past decisively enough to mystify its intended audience. One represented an ending: a mathematical style doomed to grow fatally overcomplex. The other, for those willing to follow Feynman into a new style of visualization, served as a beginning. Feynman’s style was risky, even megalomaniacal. Reflecting later on what had happened, Dyson saw his own goals, like Schwinger’s, as conservative (“I accepted the orthodox view … I was looking for a neat set of equations …”) and Feynman’s as visionary: “He was searching for general principles that would be flexible enough so that he could adapt them to anything in the universe.”

  Other ways of seeking the source of scientific creativity had appeared. It seemed a long way from such an inspirational, how-to view of discovery to the view of neuropsychologists looking for a substrate, refusing to speak merely about “mind.” Why had mind become such a contemptible word to neuropsychologists? Because they saw the term as a soft escape route, a deus ex machina for a scientist short on explanations. Feynman himself learned about neurons; he taught himself some brain anatomy when trying to understand color vision; but usually he considered mind to be the level worth studying. Mind must be a sort of dynamical pattern, not so much founded in a neurological substrate as floating above it, independent of it. “So what is this mind of ours?” he remarked. “What are these atoms with consciousness?”

  Last week’s potatoes! They can now remember what was going on in my mind a year ago—a mind which has long ago been replaced. … The atoms come into my brain, dance a dance, and then go out—there are always new atoms, but always doing the same dance, remembering what the dance was yesterday.

  Genius was not a word in his customary vocabulary. Like many physicists he was wary of the term. Among scientists it became a kind of style violation, a faux pas suggesting greenhorn credulity, to use the word genius about a living colleague. Popular usage had cheapened the word. Almost anyone could be a genius for the duration of a magazine article. Briefly Stephen Hawking, a British cosmologist esteemed but not revered by his peers, developed a reputation among some nonscientists as heir to Einstein’s mantle. For Hawking, who suffered from a progressively degenerative muscular disease, the image of genius was heightened by the drama of a formidable intelligence fighting to express itself within a withering body. Still, in terms of raw brilliance and hard accomplishment, a few score of his professional colleagues felt that he was no more a genius than they.

  In part, scientists avoided the word because they did not believe in the concept. In part, the same scientists avoided it because they believed all too well, like Jews afraid to speak the name of Yahweh. It was generally safe to say only that Einstein had been a genius; after Einstein, perhaps Bohr, who had served as a guiding father figure during the formative era of quantum mechanics; after Bohr perhaps Dirac, perhaps Fermi, perhaps Bethe … All these seemed to deserve the term. Yet Bethe, with no obvious embarrassment or false modesty, would quote Mark Kac’s faintly oxymoronic assessment that Bethe’s genius was “ordinary,” by contrast to Feynman’s: “An ordinary genius is a fellow that you and I would be just as good as, if we were only many times better.” You and I would be just as good … Much of what passes for genius is mere excellence, the difference a matter of degree. A colleague of Fermi’s said: “Knowing what Fermi could do did not make me humble. You just realize that some people are smarter than you are, that’s all. You can’t run as fast as some people or do mathematics as fast as Fermi.”

  In the domains of criticism that fell under the spell of structuralism and then deconstructionism, even this unmagical view of genius became suspect. Literary and music theory, and the history of science as well, lost interest not only in the old-fashioned sports-fan approach—Homer versus Virgil—but also in the very idea of genius itself as a quality in the possession of certain historical figures. Perhaps genius was an artifact of a culture’s psychology, a symptom of a particular form of hero worship. Reputations of greatness come and go, after all, propped up by the sociopolitical needs of an empowered sector of the community and then slapped away by a restructuring of the historical context. The music of Mozart strikes certain ears as evidence of genius, but it was not always so—critics of another time considered it prissy and bewigged—nor will it always be. In the modern style, to ask about his genius is to ask the wrong question. Even to ask why he was “better” than, say, Antonio Salieri would be the crudest of gaffes. A modern music theorist might, in his secret heart, carry an undeconstructed torch for Mozart, might feel the old damnably ineffable rapture; still he understands that genius is a relic of outmoded romanticism. Mozart’s listeners are as inextricable a part of the magic as the observer is a part of the quantum-mechanical equation. Their interests and desires help form the context without which the music is no more than an abstract sequence of notes—or so the argument goes. Mozart’s genius, if it existed at all, was not a substance, not even a quality of mind, but a byplay, a give and take within a cultural context.

  How strange, then, that coolly rational scientists should be the last serious scholars to believe not just in genius but in geniuses; to maintain a mental pantheon of heroes; and to bow, with Mark Kac and Freeman Dyson, before the magicians.

  “Genius is the fire that lights itself,” someone had said. Originality; imagination; the self-driving ability to set one’s mind free from the worn channels of tradition. Those who tried to take Feynman’s measure always came back to originality. “He was the most original mind of his generation,” declared Dyson. The generation coming up behind him, with the advantage of hindsight, still found nothing predictable in the paths of his thinking. If anything he seemed perversely and dangerously bent on disregarding standard methods. “I think if he had not been so quick people would have treated him as a brilliant quasi-crank, because he did spend a substantial amount of time going down what later turned out to be dead ends,” said Sidney Coleman, a theorist who first knew Feynman at Caltech in the fifties.

  There are lots of people who are too original for their own good, and had Feynman not been as smart as he was, I think he would have been too original for his own good.

  There was always an element of showboating in his character. He was like the guy that climbs Mont Blanc barefoot just to show that it can be done. A lot of things he did were to show, you didn’t have to do it that way, you can do it this other way. And this other way, in fact, was not as good as the first way, but it showed he was different.

  Feynman continued to refuse to read the current literature, and he chided graduate students who would begin their work on a problem in the normal way, by checking what had already been done. That way, he told them, they would give up chances to find something original. Coleman said:

  I suspect that Einstein had some of the same character. I’m sure Dick thought of that as a virtue, as noble. I don’t think it’s so. I think it’s kidding yourself. Those other guys are not all a collection of yo-yos. Sometimes it would be better to take the recent machinery they have built and not try to rebuild it, like reinventing the wheel.

  I know people who are in fact very original and not cranky but have not done as good physics as they could have done because they were more concerned at a certain juncture with being original than with being right. Dick could get away with a lot because he was so goddamn smart. He really could climb Mont Blanc barefoot.

  Coleman chose not to study with Feynman directly. Watching Feynman work, he said, was like going to the Chinese opera.

  When he was doing work he was doing it in a way that was just—absolutely out of the grasp of understanding. You didn’t know where it was going, where it had gone so far, where to push it, what was the next step. With Dick the next step would somehow come out of—divine revelation.

  So many of his witnesses observed the utter freedom of his flights of thought, yet when Feynman talked about his own methods he emphasized not freedom but constraints. The kind of imagination that takes blank paper, blank staves, or a blank canvas and fills it with something wholly new, wholly free—that, Feynman contended, was not the scientist’s imagination. Nor could one measure imagination as certain psychologists tried to do, by displaying a picture and asking what will happen next. For Feynman the essence of the scientific imagination was a powerful and almost painful rule. What scientists create must match reality. It must match what is already known. Scientific creativity, he said, is imagination in a straitjacket. “The whole question of imagination in science is often misunderstood by people in other disciplines,” he said. “They overlook the fact that whatever we are allowed to imagine in science must be consistent with everything else we know….” This is a conservative principle, implying that the existing framework of science is fundamentally sound, already a fair mirror of reality. Scientists, like artists in the freer-seeming disciplines, feel the pressure to innovate, but in science the act of making something new contains the seeds of paradox. Innovation comes not through daring steps into unknown space,

  not just some happy thoughts which we are free to make as we wish, but ideas which must be consistent with all the laws of physics we know. We can’t allow ourselves to seriously imagine things which are obviously in contradiction to the known laws of nature. And so our kind of imagination is quite a difficult game.

  Creative artists in modern times have labored under the terrible weight of the demand for novelty. Mozart’s contemporaries expected him to work within a fixed, shared framework, not to break the bonds of convention. The standard forms of the sonata, symphony, and opera were established before his birth and hardly changed in his lifetime; the rules of harmonic progression made a cage as unyielding as the sonnet did for Shakespeare. As unyielding and as liberating—for later critics found the creators’ genius in the counterpoint of structure and freedom, rigor and inventiveness.

  For the creative mind of the old school, inventing by pressing against constraints that seem ironclad, subtly bending a rod here or slipping a lock there, science has become the last refuge. The forms and constraints of scientific practice are held in place not just by the grounding in experiment but by the customs of a community more homogeneous and rule-bound than any community of artists. Scientists still speak unashamedly of reality, even in the quantum era, of objective truth, of a world independent of human construction, and they sometimes seem the last members of the intellectual universe to do so. Reality hobbles their imaginations. So does the ever more intricate assemblage of theorems, technologies, laboratory results, and mathematical formalisms that make up the body of known science. How, then, can the genius make a revolution? Feynman said, “Our imagination is stretched to the utmost, not, as in fiction, to imagine things which are not really there, but just to comprehend those things which are there.”

  It was the problem he faced in the gloomiest days of 1946, when he was trying to find his way out of the mire that quantum mechanics had become. “We know so very much,” he wrote to his friend Welton, “and then subsume it into so very few equations that we can say we know very little (except these equations) … Then we think we have the physical picture with which to interpret the equations.” The freedom he found then was a freedom not from the equations but from the physical picture. He refused to let the form of the mathematics lock him into any one route to visualization: “There are so very few equations that I have found that many physical pictures can give the same equations. So I am spending my time in study—in seeing how many new viewpoints I can take of what is known.” By then Welton had mastered the field theory that was becoming standard, and he was surprised to discover that his old friend had not. Feynman seemed to hoard shadow pools of ignorance, seemed to protect himself from the light like a waking man who closes his eyes to preserve a fleeting image left over from a dream. He said later, “Maybe that’s why young people make success. They don’t know enough. Because when you know enough it’s obvious that every idea that you have is no good.” Welton, too, was persuaded that if Feynman had known more, he could not have innovated so well.

  “Would I had phrases that are not known, utterances that are strange, in new language that has not been used, free from repetition, not an utterance which has grown stale, which men of old have spoken.” An Egyptian scribe fixed those words in stone at the very dawn of recorded utterance—already jaded, a millennium before Homer. Modern critics speak of the burden of the past and the anxiety of influence, and surely the need to innovate is an ancient part of the artist’s psyche, but novelty was never as crucial to the artist as it became in the twentieth century. The useful lifetime of a new form or genre was never so short. Artists never before felt so much pressure to violate such young traditions.

  Meanwhile, before their eyes, the world has grown too vast and multifarious for the towering genius of the old kind. Artists struggle to keep their heads above the tide. Norman Mailer, publishing yet another novel doomed to fall short of ambitions formed in an earlier time, noticed: “There are no large people any more. I’ve been studying Picasso lately and look at who his contemporaries were: Freud, Einstein.” He saw the change in his own lifetime without understanding it. (Few of those looking for genius understood where it had gone.) He appeared on a literary scene so narrow that conventional first novels by writers like James Jones could make their authors appear plausible successors to Faulkner and Hemingway. He slowly sank into a thicket of hundreds of equally talented, original, and hard-driving novelists, each just as likely to be tagged as a budding genius. In a world into which Amis, Beckett, Cheever, Drabble, Ellison, Fuentes, Grass, Heller, Ishiguro, Jones, Kazantzakis, Lessing, Nabokov, Oates, Pym, Queneau, Roth, Solzhenitsyn, Theroux, Updike, Vargas Llosa, Waugh, Xue, Yates, and Zoshchenko—or any other two dozen fiction writers—had never been born, Mailer and any other potential genius would have had a better chance of towering. In a less crowded field, among shorter yardsticks, a novelist would not just have seemed bigger. He would have been bigger. Like species competing in ecological niches, he would have had a broader, richer space to explore and occupy. Instead the giants force one another into specialized corners of the intellectual landscape. They choose among domestic, suburban, rural, urban, demimondaine, Third World, realist, postrealist, semirealist, antirealist, surrealist, decadent, ultraist, expressionist, impressionist, naturalist, existentialist, metaphysical, romance, romanticist, neoromanticist, Marxist, picaresque, detective, comic, satiric, and countless other fictional modes—as sea squirts, hagfish, jellyfish, sharks, dolphins, whales, oysters, crabs, lobsters, and countless hordes of marine species subdivide the life-supporting possibilities of an ocean that was once, for billions of years, dominated quite happily by blue-green algae.

  “Giants have not ceded to mere mortals,” the evolutionary theorist Stephen Jay Gould wrote in an iconoclastic 1983 essay. “Rather, the boundaries … have been restricted and the edges smoothed.” He was not talking about algae, artists, or paleontologists but about baseball players. Where are the .400 hitters? Why have they vanished into the mythic past, when technical skills, physical conditioning, and the population on which organized baseball draws have all improved? His answer: Baseball’s giants have dwindled into a more uniform landscape. Standards have risen. The distance between the best and worst batters, and between the best and worst pitchers, has fallen. Gould showed by statistical analysis that the extinction of the .400 hitter was only the more visible side of a general softening of extremes: the .100 hitter has faded as well. The best and worst all come closer to the average. Few fans like to imagine that Ted Williams would recede toward the mean in the modern major leagues, or that the overweight, hard-drinking Babe Ruth would fail to dominate the scientifically engineered physiques of his later competitors, or that dozens of today’s nameless young base-stealers could outrun Ty Cobb, but it is inevitably so. Enthusiasts of track and field cannot entertain the baseball fan’s nostalgia; their statistics measure athlete against nature instead of athlete against athlete, and the lesson from decade to decade is clear. There is such a thing as progress. Nostalgia conceals it while magnifying the geniuses of the past. A nostalgic music lover will put on a scratchy 78 of Lauritz Melchior and sigh that there are no Wagnerian tenors any more. Yet in reality musical athletes have probably fared no worse than any other kind.

 
