The Aliens Are Coming!

by Ben Miller


  RISE AND SHINE

  So here goes. Every object in the universe radiates light, centered around a peak wavelength. Light, as you’ll remember,11 can have a whole spectrum of wavelengths, starting with radio waves, then on to microwaves, through the infrared to visible light, and on up through the ultraviolet to x-rays, and finally gamma rays. The higher an object’s surface temperature, the shorter the peak wavelength of the light it emits. In case that sounds a bit dry, let’s talk about a concrete example. You.

  The core body temperature of the average healthy human being is, as most people know, roughly 98.6°F. It won’t surprise you to learn that your surface temperature when clothed is somewhat less than that. Assuming you are relaxing in a reasonably well-heated room at around 68°F, perhaps wearing a onesie, you might expect it to be something in the region of 82°F. Any object—and I mean any object, it could be a piece of granite, a plastic statue of Elvis, or a possum—with a surface temperature of 82°F will radiate light, and the peak wavelength of that light will be roughly ten millionths of a meter, in the region of the spectrum we call the infrared.
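
  If you’d like to check that number yourself, here is a minimal Python sketch of the rule at work (Wien’s displacement law: peak wavelength = b/T, with b ≈ 2.898 × 10⁻³ meter-kelvins). The only input is the 82°F surface temperature:

    # Wien's displacement law: peak wavelength = b / T, with T in kelvins
    WIEN_B = 2.898e-3  # Wien's displacement constant, meter-kelvins

    def peak_wavelength_m(temp_f):
        """Peak emission wavelength, in meters, of a surface at temp_f Fahrenheit."""
        kelvin = (temp_f - 32) * 5 / 9 + 273.15  # Fahrenheit -> kelvin
        return WIEN_B / kelvin

    print(peak_wavelength_m(82))  # ~9.6e-06 m: roughly ten millionths of a meter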

  Maybe you’ve worn a pair of thermal imaging goggles, or, as in my case, seen Walter White doing so in an episode of Breaking Bad. Either way, you will know that people show up bright, and almost everything else shows up dark. That’s because the goggles are capable of detecting infrared light and using it to form a visible image. The shorter the wavelength of the infrared, the brighter the image in the goggles. When Walter looks at Jesse, Jesse will show up brighter than his surroundings because he is at a higher temperature than they are. Should Jesse be holding a cup of hot tea, then that would appear brighter still.

  HERE COMES THE SUN

  So every object in the universe gives off light, and the hotter it is the shorter the wavelength of the light it gives off. The very hottest things in the universe, of course, are stars. Our own Sun has a core temperature of some 16 million °C, and a surface temperature of 5.5 thousand °C.12 The light it emits spans the entire spectrum from radio waves to gamma rays, but it peaks around the infrared to visible part of the spectrum.

  Color, in other words, is closely linked to temperature. Your gas ring burns blue, and the embers in the fire burn red, because gas burns at a higher temperature than wood. Temperature, and temperature alone, dictates the color of the light an object radiates. Heat a banana to 5,500°C and it will give off the same characteristic blend of colors as the Sun.13
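
  Running the same Wien’s-law calculation at 5,500°C shows why:

    # Wien's law again, this time at 5,500 degrees C (the Sun's surface, or the banana)
    kelvin = 5500 + 273.15
    print(2.898e-3 / kelvin)  # ~5.0e-07 m: about 500 nm, in the visible part of the spectrum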

  So what happens to the light radiated by the Sun, after it strikes the Earth? The short answer is that it bounces around like nobody’s business until it finally gets absorbed. That Ferrari on your drive is red because the paint is absorbing every light photon that hits it except those that are red. These reflected red photons will eventually land on something that will absorb them; your monogrammed black leather driving gloves, perhaps.

  After they have absorbed the red photons, and any other photons that happen to strike them, the temperature of your driving gloves will increase. They will then radiate a spectrum of electromagnetic energy, peaking in the infrared, which will in turn be reflected or absorbed by any other object—the obsidian-grey leather steering wheel, the tan calf-leather seats, your pink jumpsuit—that they come into contact with.

  I’m sure you can guess where this is heading. In this universe of ours, objects don’t have to be in contact with one another for heat to spread between them until they reach the same temperature. It can happen just as easily in a vacuum, via the process of radiation. In fact, because most of the universe is a vacuum, radiation is by far the most common way that heat gets around. Seen this way, the reason the stars shine is indeed to guide your destiny. It’s just that your destiny happens to be thermal equilibrium with the rest of the universe.

  So what has this got to do with life? Everything, it turns out. Life is an extremely perverse cooling process. The flow of heat in the universe may be a one-way street, from teacups to saucers, planetary cores to atmospheres, and from stars out into the cosmos, but it can be harnessed by life-forms to do work, reproduce, and—most importantly—organize themselves. It’s time to meet the First Law of Thermodynamics.

  PINBALL WIZARD

  The First Law is great, because it effectively provides an accountancy system for the entire universe. Let’s remind ourselves of what it tells us:

  (1) The total energy of the universe remains constant.

  Put simply, this is nothing more than conservation of energy. Fire the ball in a pinball machine, and you may watch impotently as it careers around, missing every jackpot and reward channel before it finally rolls with unswerving accuracy directly between the flippers and into the drain. But the beauty of energy is this: If we know the energy of the fired ball, and the energy of the returning ball, by subtraction we can work out exactly how much energy has been absorbed by the machine, despite knowing nothing at all about what went on in the game.

  How is this relevant to our search for life, you ask? Well, like our pinball machine, cells consume energy, and expend that energy as work and heat. To understand politics we follow the money; to understand cell biology we follow the energy. Take any kind of cell, and we can ask some simple but far-reaching questions. How does it get its energy? How does it store that energy? And what does it use it for? As we shall see in the next chapter, when we ask these questions of the very first bacteria we get fascinating clues as to how life on Earth got its start.

  But back to the plot. The First Law tells us that energy may be converted from one form to another, but it can’t be created or destroyed. The pinball machine is a classic example. When we pull back the pin, we convert chemical energy in our muscles into the mechanical energy of a compressed spring. Release the pin and the spring’s mechanical energy is converted to kinetic energy as the ball flies away at speed. The tilt of the machine then converts some of the ball’s kinetic energy into gravitational potential energy as it rolls up the slope of the machine, rising in the Earth’s gravitational field. And that’s before the game has even properly started.
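
  To watch the bookkeeping in action, here is a toy version with made-up numbers. The spring constant, compression, and ball mass are all assumed for illustration, and every conversion is treated as lossless:

    # Hypothetical pinball numbers, chosen purely to illustrate the First Law:
    # energy changes form at each stage, but the total is conserved.
    k = 400.0  # spring constant, N/m (assumed)
    x = 0.05   # plunger compression, m (assumed)
    m = 0.08   # ball mass, kg (assumed)
    g = 9.81   # gravitational acceleration, m/s^2

    spring_energy = 0.5 * k * x**2          # energy stored in the compressed spring
    speed = (2 * spring_energy / m) ** 0.5  # all of it becomes kinetic energy...
    height = spring_energy / (m * g)        # ...or all of it lifts the ball uphill

    print(f"{spring_energy:.2f} J -> {speed:.2f} m/s -> {height:.2f} m of rise")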

  In a perfect world, at every one of these conversion stages no energy would be lost. Each type of energy would be completely converted into the next, and, theoretically at least, a turn at pinball could go on forever. But real life, as we all know, is not like that. Compressing a spring, rolling a metal ball across a wooden surface, and just about everything else all generate heat. What’s worse, that heat leaks out into the universe and gets absorbed. You can never get that energy to do anything useful again; it is lost to you forever. The quest to understand why—despite energy being conserved—you never get as much out of a system as you put in is what led to the extraordinary bit of physics that is the Second Law. Because it turns out that one of the reasons that the globules in the Martian meteorite look like they might have once been alive is that they appear to break the Second Law of Thermodynamics.

  ONLY HERE FOR THE BEER

  Even if you don’t know what the Second Law is, you will almost certainly have heard of it. Thomas Pynchon, one of my favorite American authors, is obsessed with it, and if you are an enthusiast of abstruse fiction I thoroughly recommend his novel The Crying of Lot 49 for its deep insights into what has to be the most fascinating of all physical laws. To really understand it, we first need a deeper understanding of something we have so far taken for granted: heat.

  Our goal, as ever, is a deeper understanding of biological systems, but the work and character of a nineteenth-century brewer named James Prescott Joule are so especially noteworthy I can’t resist a momentary diversion. The brewing game requires precise measurement of temperature, and Joule carried the art to virtuosic levels. He became fascinated with exactly what it was he was measuring, and in a landmark experiment demonstrated that what had previously been thought of as a fluid was in fact the random motion of atoms.

  Put simply, heat is atoms jiggling about. The hotter something gets, the more its atoms jiggle. Put something cold—or, in other words, unjiggly—in contact with something hot, and the jiggly hot atoms will eventually jiggle the unjiggly cold atoms into some sort of intermediate jiggly-ness. Thermometers measure this jiggly-ness. This, of course, is why temperature has a zero, because in theory it’s possible for something to have no heat and therefore not jiggle at all.14

  Joule’s trick was to drop a precisely known weight through a precisely known distance, making it do work on the way down by spinning a paddle in a bath of water. The spinning paddle pushes the water molecules around, increasing their speed and therefore their temperature. By accurately measuring the temperature increase in the water, Joule was able to demonstrate that the increase in its thermal energy exactly equaled the gravitational potential energy imparted by the falling weight. At a stroke, he showed that heat is basically a type of atomic motion. Even today, it still seems somehow revolutionary. But there you have it. Sloshing water about increases its temperature.15
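
  If you want a feel for the numbers, here is a back-of-the-envelope version in Python. The weight, drop, and water mass are invented for illustration, not taken from Joule’s actual apparatus:

    # Joule's principle: potential energy lost by the falling weight reappears
    # as thermal energy in the water, so m * g * h = m_water * c * dT.
    g = 9.81          # gravitational acceleration, m/s^2
    c_water = 4186.0  # specific heat of water, J/(kg*K)

    weight_kg, drop_m, water_kg = 10.0, 2.0, 1.0  # illustrative values, assumed

    energy_in = weight_kg * g * drop_m          # work done by the falling weight
    delta_t = energy_in / (water_kg * c_water)  # resulting temperature rise
    print(f"{energy_in:.0f} J warms the water by {delta_t:.3f} degrees C")

  A rise of less than a twentieth of a degree, which is why the brewing trade’s obsession with precise thermometry served Joule so well.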

  THE INFORMATION ENGINE

  Right. We’re ready for the Second Law. At first sight it appears to be one of the most unhelpful bits of science you could wish for, a non-event of the first water. Do not be fooled. You are about to go down a rabbit hole that will lead you deep into the innermost workings of the universe. Here she is:

  (2) The entropy of the universe always increases.

  So what exactly is entropy? Step by step, our understanding has deepened. It was first defined during the age of steam. A steam engine, as you will know, works by burning fuel to boil a body of water, and then using the steam created to drive its pistons. The steam then liquefies in a cold condenser before being reheated. The problem that quite rightly intrigued capitalists of the age was how to arrange this system of hot tank (boiling water) and cold tank (condenser) in order to generate the most mechanical work for a given amount of fuel.

  The problem tested some of the finest scientific minds of the time. First out of the blocks was the French physicist Sadi Carnot, who in true Gallic fashion published one of the most whimsically entitled works in the history of the physical sciences with his 1824 memoir, Reflections on the Motive Power of Fire. Carnot rightly determined that what really counted in a steam engine’s efficiency was not the amount of fuel burned, but the temperature of the hot and cold tanks. But what was the formula?
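
  For the record, the modern answer, put in this form only later once Kelvin’s absolute temperature scale existed, is that the best possible efficiency depends only on the two temperatures, measured in kelvins:

    # Maximum efficiency of any heat engine running between two temperatures,
    # both in kelvins: the Carnot limit.
    def carnot_efficiency(t_hot_k, t_cold_k):
        return 1 - t_cold_k / t_hot_k

    # Boiler at 100 degrees C (373 K), condenser at 20 degrees C (293 K):
    print(carnot_efficiency(373.15, 293.15))  # ~0.21: at most 21% of the heat becomes work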

  If it was Carnot who framed the problem, it was the German physicist Rudolf Clausius who enabled its solution. He did so by recognizing that it is not the total amount of heat energy sloshing around in a steam engine that counts, but the portion of that energy that is available to do work. That led him to consider what had happened to the portion of an engine’s heat that isn’t available for work. In the case of a real steam engine, for example, there will be energy lost to its surroundings, waste energy generated by friction in its pistons, and losses due to the viscosities of both steam and water. None of that energy is ever coming back, or at least it is extremely unlikely to. How to characterize these unknown, unpredictable losses?

  Clausius, ingeniously, had a way. He invented a measure of this unavailable energy, and called it entropy. Remember the pinball machine? In that case, the gain in entropy of the machine and its surroundings will be that fraction of the ball’s initial energy that has been absorbed by friction and heating effects during the course of a turn. Entropy, as defined by Clausius, is simply a measure of how much of the input energy has become unavailable for work.
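
  Clausius’s recipe is disarmingly simple: the entropy gained when heat flows is the heat transferred divided by the absolute temperature at which the transfer happens. A toy version for our pinball machine, with assumed numbers:

    # Clausius entropy: dS = Q / T, heat transferred over absolute temperature.
    # Suppose 0.4 J of the ball's energy ends up as heat in a 293 K room (assumed).
    Q = 0.4    # heat absorbed by the machine and its surroundings, J
    T = 293.0  # room temperature, K

    print(f"Entropy gain: {Q / T:.2e} J/K")  # ~1.4e-03 J/K, gone for good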

  I know, I know; it’s hard to see what any of this has to do with the origin of life. Your righteous indignation is understandable. No one, frankly, was expecting the Second Law of Thermodynamics to lead where it did. Clausius just wanted a working theory for steam engines. Instead, he ended up discovering nothing less than a brand new physical quantity, an entity that appears to be as fundamental to the way the universe works as energy or temperature. Because when the time came to try and understand entropy on the scale of atoms, an astonishing discovery was made. Entropy is directly related to information.

  THE DEVIL IN THE DETAIL

  The first man to put this discovery on a mathematical footing was the Austrian physicist Ludwig Boltzmann. His is a sad story. A brilliant man, he suffered vicious attacks from his German contemporaries, who were unable to follow his deliciously nimble calculations and refused to subscribe to the atomic theory that underpinned them. Prone to depression, he committed suicide in 1906, aged just sixty-two, but not before crafting the equation that appears on his tombstone:

  S = k log W

  In this masterpiece, Boltzmann relates the entropy, S, of a gas16 to the number of ways its particles can be configured, W, while still having the same overall pressure, volume, and temperature. The constant k, by the way, is respectfully known as Boltzmann’s constant. All we’re aiming for here is the gist, so the point to grasp is this: The greater the number of configurations a system has, the more uncertain we are about which one it is actually in.

  How so? Let’s take a concrete example. Say I take a party balloon and blow it up. I can easily measure the volume, pressure, and temperature of the air in the balloon. In how many ways could the air molecules inside the balloon be configured17 to create the precise values of volume, pressure, and temperature that I measure? In W ways, that’s how many, and W is a big number. The entropy of the air inside my balloon is then k log W.

  Now say I burst the balloon. The air molecules that were inside now start to mix with the air molecules in the room. Given enough time, they will escape the room and mix their way around the globe. In how many ways might they be configured then? I have no idea, but I can tell you that it’s a lot more than W, the number of ways they could be configured within the balloon. My uncertainty about the configuration of those air molecules has increased, and their entropy has therefore gone up.
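
  We can put a toy number on that. Simplify the burst balloon to an ideal gas doubling its available volume: each molecule then has twice as many places to be, so W picks up a factor of two per molecule, and Boltzmann’s formula says the entropy rises by Nk ln 2. The molecule count below is a rough, assumed figure for a one-liter balloon:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K
    N = 2.5e22          # rough number of air molecules in a 1-liter balloon (assumed)

    # W -> (2**N) * W when each molecule's available volume doubles, so
    # S = k log W grows by N * k * ln 2.
    delta_S = N * k_B * math.log(2)
    print(f"Entropy increase: {delta_S:.2e} J/K")  # ~2.4e-01 J/K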

  INFORMATION IS THE RESOLUTION OF UNCERTAINTY

  Following Boltzmann, the union between entropy and information was solemnized by the work of a brilliant American electrical engineer and mathematician named Claude Shannon. Although he is arguably the founder of the digital age, Shannon is criminally underacknowledged. At the tender age of twenty-one he used his master’s thesis to invent digital circuit design, the foundation of all modern electronics, and during the Second World War he was employed at Bell Labs on a US military contract.

  Radio communication had become essential to warfare, and Shannon was tasked with both improving the existing military systems and making them more secure. One of the great problems in electronic devices is how to reduce the effect of “noise”—random fluctuations in the signal. To help solve the problem, Shannon defined a new quantity, H, called the Shannon entropy, as a measure of the receiver’s uncertainty as to the precise letters of a message. To be precise, he said that the Shannon entropy, measured in bits, was given by the expression

  H = –∑ pᵢ log₂ pᵢ

  where H is the Shannon entropy, measured in bits,18 of a message whose letters, indexed by i, each occur with probability pᵢ, and that funny squiggle is sigma, meaning “sum up the following for all values of i.”
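
  That formula transcribes directly into a few lines of Python, a minimal sketch using the standard convention that letters with zero probability contribute nothing:

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)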

  In case that’s a little abstract, let’s make these equations flesh. You and I arrange to go apple-scrumping19 after lights out. At nine o’clock, when it’s dark—it is autumn after all—I will creep into your garden and look up at your bedroom window. If your flashlight is on, you are going to shin down the drainpipe and join me in Farmer Benson’s orchard. If it’s off, well, we are going to have to go tomorrow night instead.

  In Shannon’s terms, the message—“Yes, I am scrumping” or “No, I am not”—is coded in two “letters,” “flashlight on” and “flashlight off.” At one minute to nine, while I watch my breath condense in the rhododendron bushes, I have no idea whether you are coming or not. If both outcomes are equally likely, then Shannon’s equation tells us that the Shannon entropy is as follows:

  H = –∑ pᵢ log₂ pᵢ

  Two possible outcomes, each of which is equally likely, means p₁ = p₂ = ½, giving us

  H = – { ½ log₂ (1/2) + ½ log₂ (1/2) }

  Realizing that ½ is a common factor, and remembering that log(A/B) = log A – log B, we get

  H = – ½ {log₂1 – log₂2 + log₂1 – log₂2}

  H = – ½ {2 log₂1 – 2 log₂2}

  Recalling that log 1 = 0 in any base and log₂2 = 1, it all shakes down so that

  H = – ½ { – 2 } = 1 bit

  OK. This is heading somewhere, I promise. Let’s now imagine that we decide to include a third “letter,” where you waggle your flashlight about, meaning “wait because I am still deciding.” Let’s further imagine that each of the three possible letters is equally likely. What’s the Shannon entropy then? Clearly it’s more than in the previous case, but by precisely how much? Let’s plug in the numbers to give us the answer.

  H = –∑ pᵢ log₂ pᵢ

  H = – { 1/3 log₂ (1/3) + 1/3 log₂ (1/3) + 1/3 log₂ (1/3) }

  H = – 1/3 { – 3 log₂3 }

  H = log₂3 ≈ 1.58 bits
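
  Both results drop out of the shannon_entropy sketch from earlier, if you care to check:

    # Reusing the shannon_entropy function defined above:
    print(shannon_entropy([1/2, 1/2]))       # 1.0 bit: flashlight on/off
    print(shannon_entropy([1/3, 1/3, 1/3]))  # ~1.585 bits: on/off/waggle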

  Maybe you’re starting to see the pattern? For two letters, we had H = log₂2 bits. For three, H = log₂3 bits. And if we had W letters, each of them equally likely, Shannon’s formula tells us that the information entropy of the message in bits is given by

  H = log₂ W

  Or changing the base of the logarithm to the natural number, e, we get20

  H = (1/ln 2) ln W
