Programming the Universe


by Seth Lloyd


  We have seen that entropy is information about the microscopic jigglings of atoms—jigglings far too small for us to see even with the most powerful microscope. Each of the helium atoms in our balloon registers twenty bits. But unless we know where an individual atom is in the balloon and how fast it is going (up to the accuracy allowed by quantum mechanics), we have no idea what those bits are. In other words, entropy, which is just invisible information, is also a measure of ignorance.

  We do have some information about the atoms in the balloon. For example, we can measure the macroscopic state of the balloon: its size, its temperature, the pressure the atoms exert on its walls. Typically, we possess only a few hundred bits of macroscopic information about a physical system such as a balloon. For any system, we can make a distinction between bits whose values (0 or 1) we know and those whose values we don’t know. Bits with values of which we are ignorant constitute the entropy of the system: a bit of entropy is a bit of ignorance.

  Note that the division between known and unknown information is to some degree subjective. Different people know different things. For example, suppose you send me a brief e-mail containing 100 bits of information. You know what those bits are, because you sent them. To you, the information in the e-mail is known. Before I open the e-mail, I don’t know what those bits are: they are still invisible to me. At this point, I would count its 100 bits as a form of entropy. So different observers can assign different values to the entropy of a system. Remember Maxwell’s demon? Since it is monitoring the microscopic state of the gas, it has more information than an observer who simply knows the gas’s temperature and pressure. Accordingly, the demon assigns the gas a lower entropy than the macroscopic observer does. For the purpose of the second law of thermodynamics, the important quantity is the total amount of information in a physical system. The total amount of information, known and unknown, in a physical system does not depend on how it is observed.

  Suppose an unknown bit of information interacts with a known bit of information. After the interaction, the first bit is still unknown, but now the second bit is unknown, too. The unknown bit has infected the known bit, spreading the lack of knowledge, and increasing the entropy of the system. We can use the ideas of computation developed earlier to clarify this picture of the infectious nature of ignorance.

  Consider two bits. The first bit is unknown: it could read either 0 or 1. The second bit is known to be, say, 0. Thus, the two bits together are either in the state 00 or 10. Now apply the following simple logic operation to the bits. Flip the second bit if and only if the first bit is 1. This operation is called a controlled-NOT op, because it performs a bit-flip (or NOT operation) on the second bit, an operation controlled by the state of the first bit (which in this case is unknown). If the first bit is 1, then the controlled-NOT op will flip the second bit from 0 to 1. If the first bit is 0, the controlled-NOT op will leave the second bit a 0. After the controlled-NOT op, the two bits taken together will either be in the state 00 or in the state 11. The two bits are now correlated—that is, they have the same value. If we look at the first bit, we know the value of the second bit, and vice versa.

  After the op, the first bit is still unknown: it could still be in either the state 0 or the state 1. But look at the second bit. Now it, too, could be in either the state 0 or the state 1. The second bit, which was known to be 0 before the controlled-NOT operation, is now unknown, too. The controlled-NOT operation has caused the unknown information in the first bit to infect the second bit—the ignorance has spread! (The spread of ignorance is reversible. To get the original state of the two bits back, just perform the controlled-NOT twice. The controlled-NOT operation is its own inverse: applying it twice is like doing nothing at all.)
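  To make this concrete, here is a minimal sketch in Python (my own illustration; the function name and structure are not from the book) of the controlled-NOT acting on the two possible joint states, first once and then a second time:

```python
# A minimal sketch of the controlled-NOT operation on two classical bits.

def controlled_not(control, target):
    """Flip the target bit if and only if the control bit is 1."""
    return control, target ^ control

# The first (control) bit is unknown: it could be 0 or 1.
# The second (target) bit is known to be 0.
possible_states = [(0, 0), (1, 0)]

after = [controlled_not(c, t) for c, t in possible_states]
print(after)     # [(0, 0), (1, 1)] -- the bits are now correlated;
                 # the second bit, too, could now read 0 or 1

# Applying the operation again undoes it: controlled-NOT is its own inverse.
restored = [controlled_not(c, t) for c, t in after]
print(restored)  # [(0, 0), (1, 0)] -- the second bit is known to be 0 again
```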

  The spread of ignorance increases the entropy of the individual bits in a system. The entropy of the first bit remains at one bit, but the entropy of the second bit has increased. But the entropy of the bits taken together remains constant. Before the controlled-NOT, the two bits could have been in one of two states, 00 or 10. There is one bit of entropy, all in the first bit. After the controlled-NOT operation, the two bits could still be in one of two states, 00 or 11. There is still only one bit of entropy, but now it is spread out between the two bits.

  The spread of ignorance is reflected in the increase of a quantity called “mutual information.” Each bit, on its own, has one bit’s worth of entropy, but the two bits taken together also have only one bit’s worth of entropy. The mutual information is equal to the sum of the entropies taken separately, minus the entropy of the two bits taken together. In other words, the two bits have exactly one bit of mutual information. Whatever information they have is held in common.
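  The bookkeeping in the last two paragraphs can be checked numerically with the formula just stated: mutual information = H(bit 1) + H(bit 2) - H(both bits). A brief sketch (the shannon_entropy helper and the probability tables are my own illustration, assuming the two possible joint states are equally likely):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# After the controlled-NOT, the joint state is 00 or 11, each with probability 1/2.
joint = [0.5, 0.5]

# Marginal distributions: each bit, on its own, reads 0 or 1 with equal probability.
first = [0.5, 0.5]
second = [0.5, 0.5]

h_first = shannon_entropy(first)    # 1 bit
h_second = shannon_entropy(second)  # 1 bit
h_joint = shannon_entropy(joint)    # 1 bit (only two equally likely joint states)

mutual_information = h_first + h_second - h_joint
print(mutual_information)  # 1.0 -- one bit of information held in common
```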

  Atomic Ignorance

  The infectious nature of information applies to colliding atoms as well as to bits in a computation. The argument that the entropy of the individual atoms in a gas tends to increase was originally put forward by Ludwig Boltzmann in the 1880s. Boltzmann defined the quantity he called H as the degree to which we know the position and velocity of any given atom in a gas.

  Boltzmann’s quantity H is in fact the entropy of an individual atom, multiplied by minus one. He showed that when the positions and velocities of the atoms are uncorrelated—that is, independent of each other—collisions between them will decrease H and increase the entropy of the individual atoms. Subsequent collisions, he argued, would continue to increase that entropy. He concluded that his H-theorem justified the second law of thermodynamics by supplying a mathematical proof that entropy must increase.
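  For reference, in the conventional textbook notation (a standard formula, not one quoted in this chapter), H is built from the distribution function f of a single atom’s position and velocity, and the H-theorem says that collisions between uncorrelated atoms can only drive it down:

```latex
H(t) = \int f(\mathbf{x},\mathbf{v},t)\,\ln f(\mathbf{x},\mathbf{v},t)\;d^{3}x\,d^{3}v,
\qquad \frac{dH}{dt} \le 0 .
```

  Since the entropy of an individual atom is, up to a constant, just -H, a falling H is the same statement as a rising single-atom entropy.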

  The problem with Boltzmann’s H-theorem is that it is not, strictly speaking, true of actual atoms in a gas. Boltzmann was quite correct that collisions between initially uncorrelated atoms will increase the atoms’ individual entropies. These entropies increase because of the infectious nature of information. When two atoms collide, any uncertainty about the position and velocity of the first atom tends to infect the second atom, rendering its position and velocity more uncertain and thus increasing its entropy. This entropy increase of the second atom is analogous to the entropy increase of the second bit described above, when that bit was subjected to a controlled-NOT operation with an unknown bit as the controller.

  The flaw in the H-theorem lies with subsequent collisions. Once two atoms have collided and exchanged information, subsequent collisions can decrease the entropy of the individual atoms. To understand how interaction between two atoms that have previously collided can decrease their entropy, return to the two bits above. The first time the controlled-NOT operation is applied, the entropy of the control bit infects the second bit, increasing its entropy by one bit. But if the controlled-NOT operation is applied again, the second bit is restored to its initial, known state, decreasing its entropy by one bit.

  In principle, a similar inverse operation, resulting in a similar entropy decrease, could be engineered for the atoms. When Boltzmann presented his H-theorem as a proof of the second law of thermodynamics, his colleague Joseph Loschmidt pointed out that the H-theorem couldn’t always be true, since reversing the velocities of the atoms would undo the collision and decrease their entropies. (The hypothetical being that could reverse the velocities of the atoms is known as Loschmidt’s demon. Back in those days, everyone had demons.) Confronted with this (correct) argument, Boltzmann resorted to sarcasm: “Go ahead,” he said, “reverse them.”

  Boltzmann’s original argument for his H-theorem relied on an assumption about the nature of atomic collisions called “the assumption of molecular chaos.” Even though the positions and velocities of two atoms might be correlated before their collision, Boltzmann argued, repeated collisions between many atoms tend to dilute that correlation, so that two colliding atoms in a gas would in effect be uncorrelated at the moment of their collision. Right after they collide, the two atoms’ positions and velocities are correlated. But as they go on to collide with other atoms, their correlations with each other tend to fade. Boltzmann argued that the next time they collide, they can be assumed to be uncorrelated: it is as if the two atoms had never collided before. If the assumption of molecular chaos is true, then the entropies of the individual atoms almost always increase. This increase can be undone in principle by reversing the process of collision, à la Loschmidt. But in practice such a reversal rarely happens.

  The assumption of molecular chaos is a good one and is true for many complex systems, such as gases. It is not true for all physical systems, however. As will be seen, in many physical systems it is possible to reverse the interactions between the pieces of the system, thereby reversing the entropy increase of those pieces.

  By and large, however, Boltzmann’s assumption works well. Even after atoms have collided once, their further collisions tend to increase their individual entropies. Why does the assumption of molecular chaos work so well? In my M.Phil. thesis, “The Spread of Ignorance,” and Ph.D. thesis, “Black Holes, Demons, and the Loss of Coherence,” I provided an answer for this question by developing an approach for treating the second law of thermodynamics in terms of the spread of ignorance. This method shows that Boltzmann’s H-theorem is “almost true” for “almost all” physical systems.

  Snooker

  Some background to my approach is in order. After graduating from Harvard, I went to Cambridge University on a Marshall Scholarship. These scholarships are given by the British government out of gratitude for the Marshall Plan, which helped rebuild Europe after World War II. (Gratitude goes only so far. On my first day in Cambridge I went to a pub called the Locomotive. The fellow sitting next to me at the bar had spiky green hair and was wearing a dog collar. When I mentioned to him that his government was paying for me, an American, to attend Cambridge, he ungratefully insisted that I leave the premises.) My first year at Cambridge was spent taking Part III Maths, a course in mathematics and physics one of whose goals is to identify promising scholars and weed out the rest. Students who obtain first-class degrees in Part III typically go on to do Ph.D.s. The very top students are known as wranglers (yippee-ti-yi-yo!). Maxwell had been a wrangler. As for the rest—well, let’s just say that the reward for the bottom student at graduation used to be a four-foot-long wooden spoon.

  To become a wrangler required ceaseless application to study. Many of my classmates were cocooned in the library throughout Part III. Their personalities would not emerge until after they graduated. I had read about student life at Cambridge in the novels of E. M. Forster and the poems of Wilfred Owen. Though I wished to avoid the spoon, I knew that if there was a branch of physics I wanted to study, it was the interplay between mechanics and fluid dynamics involved in rowing in a gentleman’s eight or punting to Grantchester. After my morning lectures, I would go off to the pub by the river and down the lane from the Department of Applied Math and Theoretical Physics (or DAMTP, encouragingly pronounced “damped”) to eat a Cornish pasty and drink a pint of Guinness. Then I would go either for a row on the river or to the graduate student lounge for a game of snooker.

  Snooker is a game related to pool. It is played with cues and balls on a table much larger than a pool table. The special cue employed to reach for shots at the far end of the table could equally well be used for pole vaulting. Snooker shares with cricket, lawn bowling, and sheepherding the classic feature of televised British sports: it is played on a vast green expanse on which small objects (men, balls, sheep) are distributed. Snooker is also like pool in that its goal is to use a cue ball to knock colored balls into pockets, a procedure known as potting. But in snooker, unlike pool, one alternates between potting red balls and potting yellow, blue, pink, or black balls.

  The secret to entropy increase is to be found in snooker. The collision of two snooker balls contains in two dimensions almost all the elements of the collision between two helium atoms in three. At the beginning of the game, the balls all start out in specified positions with zero velocity: their entropy is small. After a few shots, they are all over the table, in positions that depend sensitively on the past history of collisions between the balls and on slight variations in how they were struck by the cue. That uncertainty in how the cue ball is struck—a few bits of unknown information—infects all the balls with which the cue ball collides.

  In the early part of the twentieth century, Emile Borel (he of the monkeys typing) suggested that entropy increase could be thought of as arising from the interactions between systems spreading information around. Starting from Borel’s observations, my thesis work showed that interactions between pieces of a system, such as atoms in a gas—or snooker balls on a table—tend to increase the entropies of those pieces, even if they have interacted before. This result justifies Boltzmann’s assumption of molecular chaos, because it implies that a collision between two atoms will almost always increase their entropies even if they have collided before. Eventually the entropies of the individual parts of a system such as a gas tend to rise to their maximum possible value.

  Figure 6. Pool and the Second Law of Thermodynamics

  6a. The balls are in a low-entropy, triangular arrangement, but the cue ball is heading toward them.

  6b. After the cue ball breaks up the array, the balls move off in a pattern whose entropy and randomness increase with every collision.

  When atoms bounce off each other, they exchange information and spread entropy. Any ignorance about the state of one atom spreads to the state of the other. The spread of ignorance is also familiar in snooker, where the same arguments apply. The cue ball imparts part of its velocity (that is, some of its bits) to the red ball. The red ball bounces off the pink ball, spreading some of its bits, including those it got from the cue ball, to the velocity of the pink. As more and more collisions take place, the number of bits of ignorance distributed among the balls increases, until the bits (and the balls) are spread all over the table. Bits are infectious.

  A particularly interesting case of this bit-contamination process arises when some of the information about a system is macroscopic (that is, information we can access directly through observation and measurement) and the rest is microscopic, “invisible” information (or entropy). Over time, we would expect the microscopic, hidden information to infect the macroscopic, observable information. Eventually the information and entropy of all the system’s bits tend to their maximum allowed value.

  This infection of macroscopic bits by microscopic ones is a feature of chaos. Recall that a chaotic system is one whose dynamics tend to amplify small perturbations, so that microscopic information is pumped up to the macroscopic regime. In a chaotic system, the invisible information in the microscopic bits infects the macroscopic bits, causing the observable characteristics to wander in an uncertain fashion—like the butterfly’s effect on the course of a hurricane.

  The collision of snooker balls is also a chaotic process. Suppose you make a small error in striking the cue ball, so that its initial speed and direction are a bit off. That error is amplified when the cue ball strikes the red ball. The direction in which the red ball now moves has a greater error than the error in the initial speed and direction of the cue ball. The more collisions that take place, the more the initial error is magnified. If you planned to knock the red ball off the pink ball and then knock that off a third ball in order to pot it, you will probably have failed: by the third collision, the initial error has typically grown too great to afford any measure of control over the speed and direction of the third ball.
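  A toy calculation (my own numbers, assumed purely for illustration, not a computation from the book) shows how fast the error grows. A rough rule of thumb for hard spheres is that each collision amplifies an angular error by roughly the ratio of the distance travelled between collisions to the ball’s radius:

```python
# Toy model of error amplification in successive ball collisions.
# All numbers below are illustrative assumptions.

ball_radius = 0.026        # metres, roughly a snooker ball's radius
path_between_hits = 0.5    # metres travelled between collisions (assumed)
amplification = path_between_hits / ball_radius   # about 19x per collision

error = 1e-3               # initial aiming error in radians (assumed)
for collision in range(1, 5):
    error *= amplification
    print(f"after collision {collision}: error ~ {error:.3f} rad")

# By the third collision the error exceeds a radian: the speed and
# direction of the third ball are effectively unpredictable.
```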

  Ignorance spreads, individual entropies increase. In this picture of the second law of thermodynamics, entropy increase is like an epidemic. Bits of ignorance are like viruses that are copied and spread by interaction. The contagion continues until all parts of the system have been infected. At this point, the entropies of the parts taken individually are close to their maximum value.

  The Spin-Echo Effect

  When Joseph Loschmidt suggested that it might be possible to decrease the entropy of a gas by simultaneously reversing the velocities of all its atoms, Boltzmann taunted him. But as we’ll now see, it is possible to realize Loschmidt’s proposal in actual physical systems. In such systems, entropy can appear to decrease, in apparent (but not actual) violation of the second law of thermodynamics.

  What happens if you reverse the motions of the components of a system? The interactions between the pieces of the system can undo themselves, decreasing their entropies. Loschmidt’s original proposal—to reverse the velocities of the atoms in a gas—is impractical. But for some systems, when Boltzmann challenges you to “reverse them,” you can.

  A simple example of such a reversible dynamics is the controlled-NOT operation described earlier. In this straightforward logic operation, one bit is flipped if and only if a control bit reads 1. As noted, if the second bit is initially 0 and the control bit can be either 0 or 1, then afterward the two bits are either both 0 or both 1. The controlled-NOT operation causes the second bit, with zero bits of entropy initially, to become correlated with the state of the first bit, so that the second bit’s final entropy is one bit. The ignorance in the first bit infects the second, causing its entropy to increase.

  Figure 7. Pool and Loschmidt’s Paradox

  7a. Loschmidt pointed out that if one reverses the velocities of the molecules in a gas, or of the balls on a pool table, entropy can apparently decrease. In figure 7a, the balls are in the same positions as in figure 6b, but their velocities have been reversed.

  7b. After their velocities are reversed, in the absence of friction the subsequent collisions of the balls undo their previous collisions, returning the balls to their original, low-entropy arrangement.

  To undo the controlled-NOT, simply perform it a second time. After the first operation, the two bits are either both 0 or both 1. During the second controlled-NOT op, if the control bit is 0 the second bit remains 0. If the control bit is 1, then the second bit is flipped from 1 to 0. In either case, the second controlled-NOT undoes the effect of the first and restores the second bit to 0. As a result, the bit’s entropy decreases from one to zero bits.
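  As a quick check of this self-inverse property, the hypothetical controlled_not function sketched earlier returns every possible two-bit state to itself when applied twice:

```python
def controlled_not(control, target):
    """Flip the target bit if and only if the control bit is 1."""
    return control, target ^ control

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    once = controlled_not(*state)
    twice = controlled_not(*once)
    assert twice == state  # two controlled-NOTs leave every state unchanged
print("controlled-NOT is its own inverse")
```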

 
