The Conservative Sensibility


by George F. Will


  The marvels of applied science are not nearly as astonishing as what science continues to reveal by deciphering what has been around for about 13.5 billion years—the universe—and predates us by most of these years. The human species has been around for about 250,000 years, approximately .0015 percent of the history of all earthly life. The planet, like the rest of the universe, got along swimmingly without us until, as it were, just the other day. This makes our arrival look awfully like a cosmic afterthought, an accident rather than the culmination of a plan. Astronomers are gathering and studying light that has taken 10 billion years to reach their instruments. This light helps them to understand what Martin Rees, Astronomer Royal of Britain, with British understatement, calls the universe’s “inclement” conditions: “violent explosions, jets of particles at 99.99 percent of the speed of light, flashes that disgorge in a few seconds far more light than the Sun emits in its 10-billion-year history.” And speaking of the sun, which means everything to us: Although it is a small thing in the cosmic scheme of things, it is so big that, Rees says, “it would take as many human bodies to make up the Sun’s mass as there are atoms in each of us.”61 It has taken just a bit more than a billion years for natural selection to proceed from the first multicellular organisms to us, and those billion years are 20 percent of the entire story of life on Earth. In just five billion years, the sun will be extinguished, and we with it.

  “It is not to be conceived,” wrote Isaac Newton, “that mere mechanical causes could give birth to so many regular motions.… This most beautiful system of the sun, planets and comets could only proceed from the counsel and dominion of an intelligent and powerful Being.”62 Since Newton’s day, however, new instruments of astronomy have revealed that there are many highly irregular goings-on in the universe, the beauty of which is real but terrifyingly inhospitable. Yes, Earth is so finely tuned as to be hospitable to human and other life. This leads many people to conclude that a Fine Tuner had us in mind: If the Earth’s atmosphere were slightly thicker or slightly thinner, not enough or too much of the sun’s radiation would arrive to sustain life. But this planetary friendliness can be understood as a happy accident of the evolution of this cooling cinder, Earth. And although this planet is friendly to human life, it has always been less than friendly for many lives. Searing heat, punishing cold, tornados, hurricanes, earthquakes, volcanoes, tsunamis, wild animals, poisonous snakes and spiders and plants, infestations of food-destroying and disease-carrying insects, diseases galore, and the proclivity of humans to prey upon and war against one another—all these are ingredients of life as experienced beneath the biophilic atmospheric layer that envelops Earth. Surely the Fine Tuner could have left out some of these ingredients—could have tuned the world to work a bit differently—had He been feeling a bit more friendly.

  Life is, presumably, either a cosmic fluke or a cosmic imperative. But because everything is a reverberation from the big bang—every atom of the material in us, and in everything else, is nuclear waste from that explosion—what, really, is the difference between fluke and imperative? Our universe is biophilic, meaning friendly to life, only because molecules of water and atoms of carbon, which are necessary for life, would not have resulted from a big bang if there had been even a slightly different recipe for the cosmic soup that existed after the explosion, a recipe that was cooked in the universe’s first one-hundredth of a second, when the temperature was a hundred thousand million degrees centigrade. The fact that a biophilic universe is like Goldilocks’ porridge—not too hot, not too cold, just right—is an invitation for natural theology to say this: The distillation of the post–big bang residue into particles and then into atoms and then, about a billion years ago, into the first multicellular organisms that led to us—all this involves a precision of such stupendous improbability that there must have been a Designer. This, however, requires a problematic premise—that no hugely consequential accident is really an accident.

  Besides, natural theology must reckon with the fact that this is not going to end well. The antecedent of the pronoun “this” is: everything. The universe, currently expanding, will either continue to do so, ending in intolerable cold, or it will collapse backward, ending in incinerating heat. What does natural theology make of these destinations for what the Designer set in motion? Astronomy’s and cosmology’s withering rejoinder to zealotry is: “What’s the use?” Recently two University of Michigan astronomers reported that if the laws of physics as currently understood continue to operate, then in 10,000 trillion trillion trillion trillion trillion trillion trillion trillion years the universe, which is pretty much everything, will run down. Carbon-based life, including us, will long since have disappeared, and the entire cosmos will be a thin soup of diffuse particles.

  Early astronomy might have displaced our planet from the place of honor in the cosmos, but at least Newton said that the universe is intelligible, even tidy. Early in the twentieth century, however, a minor Swiss civil servant, who traveled home in a streetcar from his job in the Bern patent office, wondered: What would the city’s clock tower look like if observed from a streetcar racing away from the tower at the speed of light? The clock, he decided, would appear stopped because light could not catch up to the streetcar, but his own watch would tick normally. “A storm broke loose in my mind,” Albert Einstein later remembered.63 He produced five papers in 1905, and for physicists the world has never seemed the same. For laypeople, it has never felt the same. Hitherto, space and time were assumed to be absolutes. They still can be for our everyday business because we and the objects we deal with do not move at the speed of light. But since Einstein’s postulate of relativity, measurements of space and time are thought to be relative to speed.

  In the 1920s, while people were enjoying being told that space is warped and that it pushes things down (this is the real “force” of gravity), Einstein became an international celebrity of a sort not seen before or since. Selfridges department store in London pasted six pages of an Einstein paper on a plate glass window for passersby to read. Charlie Chaplin said to him, “The people applaud me because everyone understands me, and they applaud you because no one understands you.”64 The precision of modern scientific instruments makes possible the confirmation of implications of Einstein’s theories—e.g., the universe had a beginning (the big bang) and its expansion is accelerating; time slows in a large gravitational field and beats slower the faster one moves; the sun bends starlight from across the sky, and there are black holes so dense that they swallow light. Does all this bewilder you? The late Richard Feynman, winner of the Nobel Prize in Physics, said, “I think I can safely say that nobody understands quantum mechanics.”65

  Einstein’s theism, such as it was, was expressed in his aphorism that God does not play dice with the universe. He meant that there are elegant, eventually discoverable laws, not mere randomness, at work. Saying “I’m not an atheist,” he explained: “We are in the position of a little child entering a huge library filled with books in many different languages. The child knows someone must have written those books. It does not know how. It does not understand the languages in which they are written. The child dimly suspects a mysterious order in the arrangement of the books but doesn’t know what it is.”66 Einstein postulated that fixed “space” and “time” are illusory, and that “energy” and “matter” are fungible, and the public accepted this. What choice did it have? And what did accepting it mean or entail? An interesting subject for intellectual history is whether Einstein’s theory of relativity somehow contributed to the subversion of other absolutes, including religious and political ones.

  In 1927, in The Future of an Illusion, Sigmund Freud confidently said, “The more people gain access to the treasures of our knowledge, the more widespread will be the falling away from religious belief.”67 This idea that religion is just a rung on humanity’s ladder up from ignorance is challenged by those, and they are legion, who find in some new knowledge—of cosmology, biology, and much else—sustenance for theism. The 1965 discovery that the universe is permeated with background radiation confirmed the theory that a big bang set what are now distant galaxies flying apart. A famous aphorism holds that the most incomprehensible thing about the universe is that it is comprehensible. It is becoming ever more so because of advances in mathematics and particle physics, and in telescopes that, operating above the filter of Earth’s atmosphere, “see” the past by capturing for analysis light emitted from events that occurred perhaps—we cannot be sure how fast the universe is expanding—12 billion years ago.

  Astronomy is history. It is the history of what has happened in the 13 billion or so years since the big bang, which lasted a trillionth of a trillionth of a trillionth of a second, inflating a microscopic speck into everything that is. As astronomy unfolds the history of all this, mankind is being put in its place. But where is that? Martin Rees said we must put aside “particle chauvinism”: All the atoms that make up us can be truly said to be stardust, or residues from the fuel that makes the stars shine.68

  So, perhaps the supposedly crucial question—is life a cosmic fluke or a cosmic imperative?—is not much of a question. Human life exists at the back of beyond in an overwhelmingly hostile universe. So cosmology gets pressed into the service of natural theology, which rests on probability—or, more precisely, on the stupendous improbability of the emergence from chaos of complexity and then consciousness. Natural theology says: A watch implies a watchmaker, and what has happened in the universe—the distillation of the post–big bang cosmic soup into particles, then atoms, then wonderful us—reveals or implies a creator with a precise design. We know, however, that biological evolution has been beset by lots of accidents—climate changes, asteroid impacts, epidemics, etc. As Rees said, if Earth’s history were to be rerun, the biosphere almost certainly would end up quite different. And without us.

  The estimated number of stars is 10 followed by 22 zeros—so far. When the Hubble telescope took a picture of a speck of space less than 1/150th the size of a full moon, it peered into the location of more than 5,000 galaxies. From this fact it is possible to make a rough estimate that the visible universe contains more than 150 billion galaxies, each with billions of stars. It is not yet known what initiated life on Earth, but whatever it was, it probably has been in some way replicated somewhere else in the still expanding universe. Is there a plausible reason for thinking otherwise? As to whether there are other planets with life like Earth’s, Rees said the chance of there being two similar ecologies is less than the chance of two randomly typing monkeys producing the same Shakespearean play.
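The rough estimate above can be checked with back-of-envelope arithmetic. A minimal sketch, assuming standard round figures not given in the text (the full moon covers about 0.2 square degrees, and the whole celestial sphere about 41,253 square degrees):

```python
# Back-of-envelope check of the "more than 150 billion galaxies" estimate.
# The sky and moon areas are assumed round figures, not from the text.
FULL_SKY_SQ_DEG = 41_253          # area of the entire celestial sphere
MOON_AREA_SQ_DEG = 0.2            # approximate area of the full moon

patch_area = MOON_AREA_SQ_DEG / 150      # the Hubble patch, per the text
patches_in_sky = FULL_SKY_SQ_DEG / patch_area
galaxies = 5_000 * patches_in_sky        # 5,000 galaxies seen in that patch

print(f"patches covering the sky: {patches_in_sky:.1e}")
print(f"implied galaxy count:     {galaxies:.1e}")
```

With these inputs the product comes out on the order of 1.5 × 10¹¹, consistent with the text's "more than 150 billion galaxies."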

  As fascinating and disconcerting as are the questions raised by what astronomy and cosmology are learning, perhaps more disorienting and revolutionary are discoveries about what is within us. Not all revolutions are heralded by rhetorical drumrolls like “When in the course of human events” and “A specter is haunting Europe.” And not all revolutions are explicitly political; some are revolutions in thinking or sensibility that have political reverberations. On April 25, 1953, such a revolution was announced by a bland sentence in a British science journal, Nature: “We wish to suggest a structure for the salt of deoxyribose nucleic acid (D.N.A.).”69 The world that was shaken by this deceptively downbeat announcement from Francis Crick and James Watson was the narrow one inhabited by a few scientists who were competing with the intensity of athletes in a race to discover the structure of the chemical that controls heredity in all living things. History is, however, the history of ideas, perceptions, assumptions, values—the history of mind. And the wider world continues to be changed by the aftershocks of the intellectual earthquake that had its epicenter in a laboratory at Cambridge University in England. Each of us has 10 thousand trillion cells, give or take. Each cell contains a strand of DNA that, uncoiled, would extend about six feet. If an individual’s DNA were spliced into a single strand, it would extend 20 million kilometers, enough to wrap around the equator approximately 500 times. This is just one measure of the unfathomable strangeness of us.

  The myriad and rapidly multiplying benefits from DNA research include advancements in medicine and agriculture that have improved, and saved, the lives of hundreds of millions. But from this still-new science has come a challenge to something ancient: mankind’s estimate of itself. Nineteenth-century biology, and especially the theory of evolution, seemed strongly to suggest that mankind is not the apex of nature’s pyramid, but rather is a bead on a string—perhaps an early bead on a potentially very long string. Twentieth-century biology posed a political problem: It seemed to make dubious the concept around which liberal societies are organized, the concept of the self. The supreme value of liberal democratic societies is self-expression, including the political manifestation of this in self-government. But how, in the aftermath of the scientific and social revolution that Crick and Watson set in motion, are we to understand the self? A nineteenth-century poet famously insisted, “I am the master of my fate: I am the captain of my soul.”70 But what does such mastery mean if many things—if perhaps most things, including some of the most important things—are somehow foreordained by the genetic code?

  Darwin, Marx, and Freud suggested that various kinds of change—biological, personal, social—are driven by autonomous processes. Now DNA suggests new circumscriptions of autonomy. Expanding knowledge of how DNA works makes possible willful interventions in the chemical engine of existence. Such interventions can reassert human autonomy, but perhaps at a cost to human dignity. Can our understanding of human autonomy, and hence of moral agency, be enhanced by thinking of the self as something reducible to chemistry? Or by affirming a radical materialism? In 1980, the US Supreme Court ruled that a living microorganism—a product of human artifice; a genetically engineered product with industrial uses—was patentable matter. This was just a technical ruling, a matter of statutory interpretation (of patent law). The court, however, held this: The fact that the manufactured product was alive was irrelevant to its patentability because it was a “composition of matter,” just like a better mousetrap. But what if we decide that human beings, too, are just “compositions of matter”? What if thinking that this is the case makes us behave in certain ways?

  Liberal democratic societies assume that individuals are in some sense self-constituting creatures, producing themselves by their free and educated choices, assembling their purposes from a vast and potentially limitless buffet of possibilities. But what if this process of self-assertion, this exercise of autonomy, takes place in the context of an assumption that human beings are, like everything else, mere “compositions of matter,” neither more nor less? Then what is this “self” that is doing the asserting? Again, one of the greatest novels of the first modern nation is about Jay Gatsby’s creation of his self. This did not turn out well, which was his fault. Or was it? How do we assign fault, or make sense of praise or blame, among “compositions of matter”?

  Christianity was a source of three ideas central to the American founding. One is the idea of humanity’s irremediable imperfectability. The second is that original sin does not vitiate individual dignity. The third is that there are universal moral truths. All this poses a challenge for societies that are increasingly secular and given increasingly to believing in the social or genetic influences on consciousness: How do we define and defend the integrity of the self? This matters for self-government because since the second half of the nineteenth century the unitary understanding of the human personality—the idea of personhood—has come to seem problematic. Yet, strangely—or perhaps understandably—as the idea of the self has become more attenuated there has been increased emphasis on self-assertion and self-expression. As the self has become a hazier concept, there has been a more urgent desire to celebrate the assertion and expression of this elusive thing.

  The human brain has about a hundred billion neurons, each with roughly 1,000 connections to other neurons, so there are hundreds of trillions of connections. This means that the number of possible permutations is larger than the number of stars in the universe—not in the galaxy, in the universe. This lump of matter in the skull is the seat of the self. The brain is responsible for perception, motor control, cognition, and emotion. Hence the brain is an organ of behavior. Actually, it is the organ. A stomachache can make a person cranky or sad, but those are mental conditions, not stomach conditions. More and more, brain imaging can, in some still very limited sense, show how some brain functions underlie human behavior. Neuroscience deepens our understanding of the human experience and has the potential to make the human experience better. It is, however, healthy to have—in the back of one’s brain, so to speak—an anxiety about whether, or in what sense, the improved experience will still be recognizably human, as human experience has hitherto been understood.
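The arithmetic behind these figures can be sketched. The neuron and star counts come from the text; treating each connection as simply present or absent, to count wiring patterns, is my own illustrative assumption:

```python
import math

NEURONS = 1e11                    # ~a hundred billion, per the text
CONNECTIONS_PER_NEURON = 1e3      # ~1,000, per the text
connections = NEURONS * CONNECTIONS_PER_NEURON

print(f"connections: {connections:.0e}")   # 1e+14: hundreds of trillions

STARS = 1e22                      # "10 followed by 22 zeros," per the text
# Counting only on/off states of each connection (an assumption for
# illustration), the possible wiring patterns number 2**connections.
# That is far too large to compute directly, so compare via log10:
digits_in_patterns = connections * math.log10(2)
print(f"wiring patterns ~ 10**{digits_in_patterns:.1e}, stars ~ 10**22")
```

The raw connection count (about 10¹⁴) is smaller than the star count (about 10²²); it is the number of possible arrangements of those connections, a number with tens of trillions of digits, that dwarfs the stars.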

  A fundamental philosophic debate is between those who say, “I have a body,” and those who say, “I am a body.” There are many more of the former than of the latter, partly, no doubt, because humanity’s self-esteem is served by—or is dependent upon—the idea that there is more to us than flesh and blood and sinew. Probably for as long as there have been human beings, they have been comfortable thinking of themselves as creatures composed of flesh and blood and also something grander. Now, however, neurobiology is making problematic the idea that we are both bodies and quite distinct minds or spirits. The idea of “the ghost in the machine” may be yielding to the idea that we are just machines.71 Are we, however, merely the sum of the chemical reactions bubbling within us? Happily, the more we know, the less we know. The more we know about the brain, the more we are awed by how much there is to know, not only about the brain but about the totality of creation that has culminated (we are the culmination…aren’t we?) in something as intricate as we are. But the more that is learned about our intricacies, the more attenuated our sense of personhood becomes. Brain “mapping” is not just a way of discovering what particular parts of the brain do in response to external events; it is a way of discovering how the brain’s parts engage in “conversation” with one another, and how they can change over time. Much brain activity—much thinking—is not the result of external stimuli. So, is the brain conversing with—acting upon—itself? This internal conversation is at the core of who—and what—we are. New technologies, such as functional magnetic resonance imaging (fMRI), enable scientists to watch the brain in action, monitoring neural activity as it thinks. In fifty years, fMRI images probably will seem as crude as Magellan’s maps. We will understand thought processes with instantaneous cellular resolution, and hence the essence of what brains do and what disrupts them.
