Cosmic Apprentice: Dispatches from the Edges of Science


by Dorion Sagan


  What does it mean to communicate with an “alien” when we barely have the first idea of how to understand the intricacies of rain forest plant communications, nuclei-trading mycelial networks, and ultraviolet light–detecting superorganisms of bees, let alone the conscious or unconscious minds of a humpback, great blue, or white whale—whose mathematical, philosophical, aesthetic, and perceptual abilities may, for all we know, far outstrip those of our greatest geniuses (whose intelligences we also don’t judge by their skill in bloodshed)?

  In 2008, at a conference in Basel, the ethnopharmacologist Dennis McKenna characterized psychedelic drugs as “molecular messages sent by Gaia,” one of whose messages is that we—like monkeys excitedly trading glowing bits of a mysterious crashed starship found in a jungle—are not so smart. Indeed: Why should we worry about hypothetical interspecies communication (even Hawking says intelligent aliens are unlikely to exist within one hundred light-years or we would have detected them already) when we do not even understand the local “starship” of Earth’s many species?

  Burnett may be right that dolphins are not that smart. But we may not be that smart either. Nonetheless, if brain size has anything to do with it, some individual whales may be far smarter than individual human beings. Considering how much extra intelligence goes on beyond the level of conscious rational awareness—in our immune system, in our physiology, and in our intuitions—it’s almost overwhelming to consider the capacity of a white or blue whale’s unconscious mind. Despite their size (which you’d think would make them easy for our scientific sleuths to track)—the biggest ever in evolution, weighing two hundred tons each, their hearts as big as a car, their brains ten times more voluminous than ours—we do not even know where the blue whales go to breed.

  If such whales with whom we share this oceanic planet remain deeply mysterious, intelligent aliens in our midst, the same may also be true of a far larger being, even closer to us. I speak of the planetary biosphere of which we humans seem to be minute parts, not unlike some of the cells of our own bodies that, if they are sentient, which some may well be, likely have zero conception of the coffee-sipping, car-driving wholes of which they are part.

  Fossil and mass spectrometric evidence strongly suggests that Gaia—the visionary scientist James Lovelock’s name for this systemic, cybernetic, intelligent-acting nexus of life-forms at Earth’s surface taken as a physiological whole—regulates the chemistry and temperature of our planet’s surface in the unconscious manner of a living being. Whales are mammals with whose sentience we can empathize, even if we can’t understand them or they us. But, conscious or not, the living biosphere appears to be a far bigger fish, so to speak, one whose existence we’ve barely divined.

  If we are to worry, we should worry about this complex beast of a planet with whose vast, mostly unconscious living intelligence we have been seriously meddling. Compared with the possible actions, which may soon be visited on us by this leviathan, the alien of which we are a metastatic part, the concerns voiced by our charismatic physicists are distracting at best, irresponsible at worst.

  The media-grabbing headlines about being detected and eaten by distant aliens seem histrionically misplaced. The cosmological worrywarts are not providing a public service so much as displacing our local guilt over degrading and killing off whole species of our own very real and close relations. I’m not laughing, and I signed the petition to keep alive the ban on whale hunting. But it would be a rather fitting bit of cosmic irony if this giant intelligence, this greater leviathan of which we are part, this Brobdingnagian body we are feeding on, this living surface of Earth whose physiological abilities still remain unknown and thus in a sense alien to the majority of people on Earth, itself turns out to have an immunelike system capable of regulating us out of existence, and does so, without us ever having truly established communication with it, while we twitter on about the man-eaters in the stars.

  PART III

  GAIA SINGS THE BLUES

  CHAPTER 8

  THERMOSEMIOSIS

  Boltzmann’s Sleight, Trim’s Hat, and the Confusion concerning Entropy

  THERMODYNAMICS STARTED OFF bright enough, practical and blond, saving the world from its limits. But then, overcome by shadows, its shiny children got dirt in their fingernails, soot in their hair; the world darkened with a foreboding of smokestacks. To the injury of overpopulation was added the attractiveness of thermodynamics as an incentive for geek speak, theoretical discussions that, with poetic justice, generated more heat than light.

  Unlike economics, a different kind of dismal science, thermodynamics was an indisputable success, its application helping ignite the Industrial Revolution and its theory, in the form of Maxwell’s demon, helping kindle computers and the information age. Indeed, thermodynamics may be responsible for your existence, as well as most of the nitrogen atoms in your body. In early 1912 the German chemists Fritz Haber and Carl Bosch produced inexpensive ammonia using nitrogen from the air and hydrogen gas. This in turn enabled heavily populated countries to make cheap ammonia-based fertilizers, staving off starvation on a global scale. An interesting feedback loop: technology is man-made and man now is factory-made.

  Even if you are a vegan eating organic food, some 50 percent of the nitrogen atoms inside your body, including in the amino acids that make up your proteins, and in your DNA, are synthetic: they were made under high pressures and temperatures in giant factories that use 2 percent of Earth’s energy, breaking the covalent bonds of atmospheric nitrogen molecules and fixing the nitrogen into forms that can be taken up by crops and eaten by food animals and by us. According to Thomas Hager, these “giant factories, usually located in remote areas, that drink rivers of water, inhale oceans of air . . . burn about 2 percent of all the earth’s energy. If all the machines these men invented were shut down today, more than two billion people would starve to death.”1

  Yet despite its importance, the essence of thermodynamics remains confusing. Perhaps the enormous success of thermodynamics, in both academic theory and industrial production, has led experts over the past century to neglect simple descriptions of what the second law means. The astronomer Arthur Eddington said,

  The law that entropy always increases—the second law of thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.2

  Appointed to give the Sir Robert Rede Lecture on May 7, 1959, Charles Percy Snow chose to critique higher education. Snow—a baron of the city of Leicester, England, as well as a physicist, mystery writer, defender of the realist novel, and author of the seven-volume Strangers and Brothers (made into a BBC series)—prodded his august audience in words that were, for all intents and purposes, the first shots in what would become the culture wars:

  A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: “Have you read a work of Shakespeare’s?”

  I now believe [he added later to published versions of his remarks] that if I had asked an even simpler question—such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, “Can you read?”—not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their Neolithic ancestors would have had.3

  According to the chemist Frank L. Lambert, who has written extensively about simple thermodynamics,4 even Snow neglected to define the essence of thermodynamics he said was so important. It has nothing to do with mandating an inevitable increase in disorder, as all sorts of cultural theorists and science geeks believe. Rather, the elegant essence of the phenomenon the second law describes is that energy, if not hindered, spreads.

  The intellectual sin of focusing on disorder is wildly democratic in its choice of victims. It afflicts not only the Caltech astrophysicist Sean Carroll, in From Eternity to Here, but Pope Pius XII, who offers the second law as proof of the existence of God, because only he had the wherewithal, in creating organized life and man, to resist this all-inclusive law of ever-increasing disorder. Yet hardly anyone is safe from this widespread mistaken meme, not even the most scientific-seeming rationalists and atheists. For example, the Darwinist philosopher Daniel Dennett repeats a version of this same mistake when he writes that life-forms “are things that defy” and constitute a “systematic reversal” of the second law.5 Superficially, this may seem to be the case, as some of life’s key chemicals concentrate rather than spread energy. But it is crucial to realize that, overall, living systems spread energy and that their partial molecular concentration of energy, and production of gradients, abets this process. Saying that life defies the second law is like saying that Robin Hood is against the spread of wealth because he gives it to the poor. A watch must have its watchmaker. A car does not put itself together from parts. Nonetheless, moving atoms do join to form compounds and more complex molecules. Unhindered, energy spontaneously spreads out.

  While a red Ferrari doesn’t assemble itself from spare parts in a junkyard during a windstorm, this has little to do with the organization we see in life. The macro-objects of our everyday life do not behave in the same way as atoms and molecules. Car parts in a junkyard don’t routinely whiz by at two hundred to two thousand miles an hour, colliding with one another, fusing and releasing so much energy that they become white-hot. Such behavior, however, is normal for molecules.

  The vast majority of compounds, some quite complex, form easily. But molecules are not atoms mixed up at random like the batter around a chicken leg in a bag of Shake ’n Bake. When three or more atoms aggregate to make a molecule, they possess a precise order. Their atoms, in a relatively fixed geometric relationship, generally stay stable. When atoms “bond” after their violent collisions, they aggregate into molecules so stable that temperatures of thousands of degrees are needed to pry them apart. Melt amino acids together (to make them move more rapidly), and they form huge new compounds: “proteinoids” with hundreds to thousands of amino acid units firmly joined by the same kind of bonds that hold proteins together. The result is not useful or valuable proteins, but it does show how easily gigantic complex molecules can form naturally.

  There are millions of compounds that have less energy in them than the elements of which they are composed; they are the result of “downhill” reactions, formed easily, resulting in the spread of energy. Their formation is no more mysterious than a glob of toothpaste appearing at the end of a squeezed tube.

  The rules of energy science favor formation of complex, geometrically ordered molecules. But there are also compounds that require, like objects in a factory, additions of energy from outside, ending up with more energy in them than the elements they came from. Such molecules may result, for example, from simpler molecules being energized by lightning.

  Though less likely, such reactions that create higher-energy molecules happen all the time. Alkanes, for example, are among the simplest of organic compounds, composed only of carbon and hydrogen and containing portions or sections with one carbon atom holding two or three hydrogen atoms. Both simple and complex alkanes have been detected by spectroscopic methods in space. Simple alkanes with two to five carbon atoms joined to one another (and hydrogens attached to each carbon) all contain less energy than their elements. More complex alkanes, with six or more carbon atoms joined to make their molecules, have more energy in them than the elements from which they come.

  These alkanes, like life’s key energy-storage molecule, ATP—adenosine triphosphate, structurally a cousin to DNA—require energy to be formed. ATP is an amazing molecule. Not only is it as omnipresent in life as DNA or RNA, but because it is built up from energy and spent in metabolism, it is like cash in a casino: You synthesize and break up roughly your entire stock (about 8.8 ounces) of ATP each day.

  Alkanes and similar so-called endergonic chemicals contain more energy in them than the elements that go into them because they are forged via input of external energy—ultraviolet or X-rays, both plentiful in many parts of the universe. This energy is not so hard to come by. Indeed, high-energy cosmic rays are penetrating your body this very moment. On average, about five cosmic rays penetrate every square inch of your body every second, right now, even as you are reading this sentence: gamma rays, X-rays, subatomic particles. Some come from the sun, but a lot come from supernova explosions.

  Such abundant energy—and UV radiation was more plentiful on the early Earth—bombarded simple chemicals all the time, sometimes creating more energetic compounds that stored energy which, released later, created a cascade of delayed reactions. So even with science’s splendid emphasis on connecting humans and life to natural cosmic processes, it is easy to see where the notion that life defies the second law comes from, but it is wrong. It is far better to think of living systems as temporarily deferring second law–“based” predictions of immediate gradient breakdown, but as part of a process of greater overall and longer-lasting energy delocalization.

  Temperature measurements by low-flying airplanes over the H. J. Andrews Experimental Forest in Oregon corroborated thermal satellite pictures showing that rain forests in summer are (because of cloud cover) as cool as Siberia in the winter. Quarries and clear-cut forests had higher temperatures than a twenty-five-year-old Douglas fir plantation and a natural forest twenty-five years old, and neither of these was as cool as an old-growth forest four hundred years old. At first one might be tempted to explain these data by saying that the capture of solar energy in the cooler versus more fallow areas is due to the buildup in them of energy-storing chemicals that prevent energy’s spread. Yet there is another interpretation, which is quite different. Consider a refrigerator, keeping itself cool internally but generating excess heat. Is not this the essence of the grasslands compared with the desert, the forests compared with grasslands, and the great jungles compared with temperate forests? Most of the solar energy in the plants goes not into “blocking” the second law to make energy-storing compounds but into the thermodynamically open process of evapotranspiration, whose latent heat is released when the water vapor condenses into rain. Given the solar gradient, the difference between the hot sun and cold space, the cooling provided by evapotranspiration-produced clouds must, like a refrigerator, lead to energy spread, entropy production, farther out.

  Like natural nonliving complex systems, and our cooling machines that require an outside source of energy and dump heat into their surroundings, organisms have impressive internal structure. Yet, seen as energetic processes rather than firmly bound things and compared with less-organized regions of matter, they produce more heat, even as they keep themselves cool. They spread more energy. Constitutively open systems, they do not defy the second law. Rather, their order is connected to a more effective, elegant, and continuous production of entropy, dispersed energy.

  Entropy is a confusing word. In 1854 the German physicist Rudolf Clausius combined the word energy with “tropos,” Greek for transformation, to come up with entropy for the change produced when a quantity of heat, dq, flows at temperature T: dS = dq/T.6 This was later given a statistical formulation by Ludwig Boltzmann (1844–1906), one of the founders of modern thermodynamics. Boltzmann, in a single statement, is probably responsible for the lion’s share of our confusion about the conceptual meaning of entropy. After more than four hundred pages of heavy math, in a common-language summation, Boltzmann writes (in Lectures on Gas Theory [Vorlesungen über Gastheorie]) that the universe, “or at least most of the parts of it surrounding us are initially in a very ordered—therefore very improbable—state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”7
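  To make Clausius’s ratio concrete, here is a rough back-of-envelope illustration (mine, not the author’s; the latent-heat figure is the standard textbook value of about 6.0 kJ per mole of ice): melting one mole of ice at its melting point gives an entropy change of

\[
  \Delta S \;=\; \frac{q_{\mathrm{rev}}}{T}
  \;\approx\; \frac{6010\ \mathrm{J\,mol^{-1}}}{273\ \mathrm{K}}
  \;\approx\; 22\ \mathrm{J\,K^{-1}\,mol^{-1}}.
\]

Nothing in that macroscopic bookkeeping mentions order or disorder; it is Boltzmann’s statistical reading of the same quantity, summarized in the quotation above, that opens the door to the trouble discussed next.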

  This concept—of entropy as “disorder” and thus any type of disorder as “entropy”—was dominant throughout the twentieth century. In the equation for entropy, the symbol S stands for entropy. Boltzmann had developed an equation for the entropy change, ΔS, in terms of energy states, but he could not do actual calculations because he did not know how to find the value of k, now known as Boltzmann’s constant. Before Boltzmann’s suicide, though without his ever hearing of it, the physicist Max Planck established that k was equal to R/N—the gas constant divided by the number of molecules in a mole. The equation S = k log W (engraved on Boltzmann’s tombstone after his death in 1906) is actually a version of Boltzmann’s equation, written down about 1900 by Planck.
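  In modern notation (the numerical values below are the standard textbook ones, not figures from the text), Planck’s identification lets the statistical entropy be written out explicitly:

\[
  S \;=\; k \log W,
  \qquad
  k \;=\; \frac{R}{N_{A}}
  \;=\; \frac{8.314\ \mathrm{J\,K^{-1}\,mol^{-1}}}{6.022 \times 10^{23}\ \mathrm{mol^{-1}}}
  \;\approx\; 1.38 \times 10^{-23}\ \mathrm{J\,K^{-1}},
\]

where W counts the microstates, the distinct molecular arrangements compatible with a system’s macroscopic state.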

  In retrospect, Boltzmann’s common-language summary after four-hundred-odd pages full of mathematics clouded the issue far more than it illuminated it. Adding to the confusion was the code-breaking physicist John von Neumann, who advised Claude Shannon, innovator of information theory, which was to become the basis of global telecommunications, to adopt the term entropy to describe information. “Nobody knows what entropy really is” anyway, counseled the troublemaker von Neumann, a heavy drinker who had so many car wrecks that they named an intersection in Princeton Von Neumann Corner.8 Shannon took the advice.

  The notion that the key concept of entropy is a spontaneous change from order to disorder stems from this 1898 summary Boltzmann gave of his own work. But what is order? It certainly appears that a sparkling crystal of ice is obviously more “ordered” than an equal volume of water, but the difference in the numbers of energy states is totally beyond our comprehension. As Lambert writes, “If liquid water at 273 K, with its 10^1,991,000,000,000,000,000,000,000 accessible microstates [quantized molecular arrangements] is considered ‘disorderly,’ how can ice at 273 K that has 10^1,299,000,000,000,000,000,000,000 accessible microstates be considered ‘orderly’?”9
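  Those towering exponents follow directly from Boltzmann’s formula. As a sketch (assuming the entropy of one mole at 273 K is roughly 63 joules per kelvin for liquid water and 41 for ice; those round numbers are my assumption, not the author’s), inverting S = k ln W gives

\[
  W \;=\; 10^{\,S/(k \ln 10)},
  \qquad
  \frac{S_{\mathrm{water}}}{k \ln 10}
  \;\approx\; \frac{63\ \mathrm{J\,K^{-1}}}{(1.38\times 10^{-23}\ \mathrm{J\,K^{-1}})(2.303)}
  \;\approx\; 2.0 \times 10^{24},
\]

so one mole of liquid water at 273 K has on the order of 10^(2×10^24) accessible microstates, while a mole of ice, with an entropy near 41 J K⁻¹, lands near 10^(1.3×10^24), which is just Lambert’s point: both numbers, “orderly” or not, are beyond comprehension.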

 
