Time Loops

by Eric Wargo


  Price didn’t come up with the idea originally. It was first suggested in the 1950s by the French physicist Olivier Costa de Beauregard as a way of rescuing Einstein’s relativity from his own proposal of entanglement (and at the same time, rescuing entanglement from relativity). But that was before Bell established the reality of entangled states with his theorem and before the many experimental demonstrations that followed. More recently, Price and his collaborator, San José State University physicist Ken Wharton, have reinterpreted some of those experiments, showing that “spooky action at a distance” really is better explained through Costa de Beauregard’s “Parisian zigzag” than through the “superdeterminism” that is implicit in standard descriptions of entanglement. 16

  Price’s assumption when he first advocated a less temporally biased view of causation in his 1996 book Time’s Arrow and Archimedes’ Point was that, even though time-symmetric solutions to quantum mechanics seemed persuasive on many levels, it would be hard or impossible to test them directly because of the famous uncertainty that rules the quantum realm. Since every measurement changes what it measures, how could you ever get direct evidence of a future influence? Your measurement is changing that future, making it impossible to distinguish any backwards influence from the perturbing effects of your measurement. For the same reason, advocates of retrocausal solutions often emphasize, as though anticipating the abuse of these ideas by the uninitiated, that it would be impossible to actually send a signal into the past. 17 But developments over the past two decades in laboratory methods—some of which were devised by Aharonov and his colleagues specifically to study aspects of his two-state vector formalism—have opened new doors for testing retrocausal hypotheses experimentally. They may even open doors to a real kind of informational time travel.

  For instance, in 2009, a team at the University of Rochester led by physicist John Howell shot a laser beam through a beam splitter and then at a mirror attached to a very sensitive motor; the motor could measure the extremely tiny amount that the mirror was nudged or deflected by the photons in the beam. This experiment used a new technique proposed by Aharonov called “weak measurement” that manages to partly get around the old limitations imposed by Heisenberg’s uncertainty principle. Each individual weak measurement interacts only very weakly with the particle so as not to perturb it too much; that measurement, in isolation, is too imprecise to tell you anything useful, but when aggregated over many particles, such measurements can produce valuable information without disturbing the light beam. The weakly measured laser light in this experiment then passed through either of two gates, beyond which one portion of the split beam was measured a second time in the usual, “strong” way. This after-the-fact selection of some of the light to measure again is known in these experiments as post-selection—a key concept that we will be returning to throughout the remainder of this book. The astonishing finding was that the post-selected light, the portion of the laser beam measured at the later time, turned out to have been amplified, having deflected the mirror earlier about a hundred times more than the light that wasn’t destined for the second measurement. 18 In other words, that second measurement seems to have influenced the light’s behavior in its past. 19
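
  For readers curious about the formal nub of the technique—a standard textbook expression, not anything unique to the Rochester team—the average reading of a weak measurement performed on particles prepared (“pre-selected”) in a state |ψ⟩ and later post-selected in a state |φ⟩ is given by Aharonov’s “weak value”:

  \[
  A_{w} \;=\; \frac{\langle \varphi \mid \hat{A} \mid \psi \rangle}{\langle \varphi \mid \psi \rangle}
  \]

  When the pre- and post-selected states barely overlap, the denominator becomes tiny, and the weak value can far exceed any of the measured quantity’s ordinarily allowed values—which is how a roughly hundredfold amplification of the mirror’s deflection can show up without violating the uncertainty principle.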

  As hard as it is to wrap our heads around, retrocausation could radically simplify—even “collapse”—much of the famously mystical weirdness of quantum mechanics and explain why particles so often seem to know so much about their destinies. Take for instance a phenomenon called frustrated spontaneous emission. It sounds like an embarrassing sexual complaint that psychotherapy might help with. In fact, it involves the decay of excited atoms, which ordinarily takes place at a predictably random rate. The exception, however, is when the atoms are placed in an environment that cannot absorb the photons that are emitted by decay. In that case, decay ceases—the atoms become “frustrated.” How do these atoms “know” to stop decaying until conditions are suitable? According to Wharton, the unpredictable decay of excited atoms may be determined in part by whatever receives their emitted photons in the future. 20 Decay may not really be random at all, in other words.

  Another quantum mystery that arguably becomes less mysterious in a retrocausal world is the quantum Zeno effect. Usually, the results of measurements are unpredictable—again according to the famous uncertainty believed to govern the quantum kingdom—but there is a loophole. Persistent, rapid probing of reality by repeating the same measurement over and over produces repetition of the same “answer” from the physical world, almost as if it is “stopping time” in some sense (hence the name of the effect, which recalls Zeno’s arrow paradox: an arrow that, examined at any single instant, appears frozen in place, and so seems never to move toward its target at all). 21 If the measurement itself is somehow influencing a particle retrocausally, then repeating the same measurement in the same conditions may effectively be influencing the measured particles the same way in their past, thereby producing the consistent behavior.
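
  A toy calculation—offered here as an illustrative gloss on the standard account, not as anything from the sources cited in this chapter—shows how relentless observation “freezes” a quantum system. If an undisturbed system would drift away from its initial state by some small angle θ over a given interval, then checking on it N times at equal sub-intervals projects it back onto that initial state each time with probability cos²(θ/N), so the chance of still finding it unchanged at the end is

  \[
  P_{\text{survive}}(N) \;=\; \left[\cos^{2}\!\left(\tfrac{\theta}{N}\right)\right]^{N} \;\approx\; 1-\frac{\theta^{2}}{N} \;\longrightarrow\; 1 \qquad (N \to \infty).
  \]

  The more often you look, the more completely the evolution stalls—the “stopped time” the effect’s name gestures at, whatever one’s favored explanation for why measurement has this power.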

  Retrocausation may also be at the basis of a long-known but, again, hitherto unsatisfyingly explained quirk of light’s behavior: Fermat’s principle of least time. Light always takes the fastest possible path to its destination, which means bending at the boundary between different media like water or glass rather than simply following the shortest, straight-line path. It is the rule that accounts for the refraction of light through lenses, and the reason why an object underwater appears displaced from its true location. 22 It is yet another example of a creature in the quantum bestiary that makes little sense unless photons somehow “know” where they are going in order to take the most efficient possible route to get there. If the photon’s angle of deflection when entering a refractive medium is somehow determined by its destination, Fermat’s principle would make much more sense. (We will return to Fermat’s principle later in this book; it plays an important role in Ted Chiang’s short story, “Story of Your Life,” the basis for the wonderful precognition movie Arrival.)
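
  The “least time” claim itself is easy to check numerically. The short sketch below—written in Python purely for illustration, with made-up distances and speeds rather than numbers from any experiment discussed in this book—searches for the surface crossing point that minimizes a ray’s total travel time from a point in air to a point underwater, and confirms that the winning path obeys Snell’s law of refraction rather than following the shortest (straight-line) route:

    # Illustrative sketch: Fermat's principle of least time (toy numbers).
    # A ray travels from a point in air (speed v1) to a point underwater
    # (speed v2 < v1). We search for the surface crossing point x that
    # minimizes total travel time, then check Snell's law at that point.
    import math

    v1, v2 = 1.0, 0.75   # relative light speeds in air and water (assumed values)
    h1, h2 = 1.0, 1.0    # heights of the start/end points above/below the surface
    d = 2.0              # horizontal distance between the start and end points

    def travel_time(x):
        """Total travel time if the ray crosses the surface at position x."""
        return math.sqrt(h1**2 + x**2) / v1 + math.sqrt(h2**2 + (d - x)**2) / v2

    # Brute-force search over candidate crossing points along the surface.
    xs = [i * d / 100000 for i in range(100001)]
    x_best = min(xs, key=travel_time)

    # Snell's law check: sin(theta1)/v1 should equal sin(theta2)/v2 at the minimum.
    sin1 = x_best / math.sqrt(h1**2 + x_best**2)
    sin2 = (d - x_best) / math.sqrt(h2**2 + (d - x_best)**2)
    print(f"least-time crossing point: x = {x_best:.4f} (straight line would cross at 1.0)")
    print(f"sin(theta1)/v1 = {sin1 / v1:.4f}   sin(theta2)/v2 = {sin2 / v2:.4f}")

  The minimizing path bends toward the normal as it enters the slower medium, exactly as refraction requires; the straight-line path, though shorter in distance, takes longer to traverse.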

  And retrocausation could also offer new ways of looking at the double-slit experiment and its myriad variants. 23 Again, we would have to assume that the photon’s path through the slits is already determined in part by its future interaction with the screen, not just by the slits themselves (or the light source). The interference pattern may not simply reflect possible paths interfering with each other in some quantum nowhere prior to measurement but may reflect the entanglements between photons and the whole experimental apparatus (including the screen) across time.

  In other words, particles may know so much about the future because they have already been to the future … although again, a more correct way of looking at it is that they are halfway “in” the future already.

  Think back to that glass block of Minkowski: Every particle is really a spaghetti strand snaking across time. Or better yet, imagine particles as colored threads on a vast and chaotic loom; everywhere a thread entwines with another thread is an interaction (or “measurement”), and the patterns woven by the zig-zagging, criss-crossing threads as they are stretched between both sides of that loom would not appear random at all if we could take a higher, Archimedean vantage point outside of time, as Price recommends. 24 From a viewpoint that could grasp the whole cloth, it would be strange—indeed, ridiculously biased—to privilege one side, or one direction of causation, over the other. It would be like saying the pattern created by the interwoven colored threads in a blanket is “caused” by its lefthand side, with a component of capriciousness in the threads’ turnings as we scan the blanket from left to right, and the righthand side of the blanket exerting no “leftward” influence on the pattern at all. With retrocausation, we can no longer privilege the past, as though causation is only a matter of “pushing” (sometimes called efficient causation). The real mystery becomes why those efficient causes are so much more apparent and intuitively understood, and why influences propagating in reverse give us headaches to even think about. This is a mystery that physics alone might not be able to solve, but the first step toward finding a solution may be to train ourselves to “think backwards” about events. (Consider this book a primer in doing exactly that.)

  What at least some retrocausal frameworks suggest is that the so-called collapse of the wavefunction may really be just a handy fiction, a statement about our current lack of knowledge not only about the outcome of a measurement but also about how our measurement will turn out to have affected the particle’s prior behavior once we have interpreted the relevant data. In other words, the many supposedly “superposed” possibilities, and the statistical contours of that landscape of possibility, do not simply reflect our lack of knowledge about something that already possessed definite properties independently of us (the old-fashioned, realist view never quite given up on by Einstein), nor do they mean that measurement creates actuality out of a cloud of mathematical nothing (the Copenhagen Interpretation). Instead, the measurement itself is what will determine those prior, uncertain properties. Measurement creates—but really, created—the past. Aharonov and his colleague Jeff Tollaksen write that time-symmetric reformulations of quantum mechanics

  change the meaning of uncertainty from ‘capriciousness’ to exactly what is needed in order that the future can be relevant to the present, without violating causality, thereby providing a new perspective on the question ‘Why does God play dice?’ … [It] suggests that two ‘identical’ particles are not really identical, but there is no way to find out their differences based only on information coming from the past, one must also know the future. 25

  It is important to pause at this point and “absorb” the implications of all this. What it does not mean is that, tucked away safely in a few laboratories on a few university campuses, there might be funny, exceptional situations where trivial teensy-weensy effects precede trivial teensy-weensy causes. The implication of experiments like John Howell’s beam-amplification experiment at Rochester or numerous variants of the double-slit experiment like Wheeler’s delayed-choice experiment is that the behavior of all matter at a fundamental level is determined not just by its past but also by its future. Everything that happens to every photon or electron, every interaction it has with other photons and electrons, other objects, other forces, sends ripples into its past history as well as its future history. What looked for all the world like randomness, quantum uncertainty, really has been hiding a variable we didn’t understand or have any way of measuring until recently: the influence of the Not Yet on the Now. 26

  Even if it represents a new way of looking at the measurement problem, the idea that measurement “creates the past” flows in some ways from insights articulated by Bohr already in the 1930s. Bohr argued that quantum reality totally undermined the classical assumption Einstein adhered to, that subject and object, observer and observed, could be distinguished. There is no pre-existing reality apart from and independent of the observer and the apparatus used to measure that reality, and thus no a priori distinction between observer and observed. They are all of a piece: Not only is the wavelike (versus particle-like) nature of an electron “created” under the influence of a certain specific kind of measuring apparatus, but the creation of and choice to deploy that apparatus is itself a product of the social conditions making the experiment possible, the history of science that led to it, and even the biological and cultural forces that converged to produce the experimenter herself. The measurement sits at the center of an infinitely widening gyre of what theoretical physicist and feminist scholar Karen Barad, using a more modern idiom, calls “entanglements.” 27 Other apparatuses, other conditions, and another experimenter making other choices would produce a different reality.

  In her stunning 2007 meshing of critical theory, physics, and gender studies, Meeting the Universe Halfway, Barad revisited Bohr and his then-untestable Gedanken-experiments from the standpoint of recent developments in what she calls “experimental metaphysics”—the use of new, highly precise laboratory techniques to probe the once-insoluble questions that Bohr and Einstein (and Heisenberg and Schrödinger) debated, never thinking empirical answers would ever be forthcoming. The Rochester experiment and others using weak measurement might be examples of such experiments. Variants of the double-slit experiment that spy on individual photons to see “which path” they took through the slits are among those that Barad cites. In the early 1990s, Marlan O. Scully and colleagues at the Max Planck Institute described an experiment collecting which-path information on atoms passing through two slits using newly developed micromaser technology. Again, Bohr had predicted that the screen would show no interference pattern in such a case—the atoms, “knowing” they had been spied on, would behave like particles, not waves. This turns out to be the case. But astonishingly, Scully and colleagues showed that if the experimenters erase the which-path information they have collected (but not yet analyzed) after they run the experiment, an interference pattern is restored. 28 It is a delayed-choice version of the double-slit experiment in which the experimenters seem able to dictate what happens in the past by erasing (versus not erasing) quantum information in the present.

  From this and other similar experiments, Barad argues that we can no longer think of time as some objective framework, some empty container holding and preceding its contents. It too is shaped, and in a very real sense created, by our experimental choices and by the scientific and discursive practices that we bring to bear on reality when we investigate it—a radical form of what is sometimes called the “social construction of reality,” but one penetrating to the very heart of matter, space, and time. She writes:

  It’s not that the experimenter changes a past that had already been present or that atoms fall in line with a new future simply by erasing information. The point is that the past was never simply there to begin with and the future is not simply what will unfold; the “past” and the “future” are iteratively reworked and enfolded through the iterative practices of spacetimemattering … neither space nor time exist as determinate givens outside of phenomena. 29

  (Yes, she said “spacetimemattering.”)

  To be clear, Barad is not adopting an interpretation like that of John von Neumann or any number of more recent writers excited by the possibility that consciousness is somehow constitutive of reality, or that consciousness is even an intrinsic part of observation and measurement. 30 And she is not simply repeating the quantum truism that our choice of how to measure a particle effectively determines the face it reveals going forward. She is instead asking the reader to grapple with the fact that time itself does not escape the skein of entanglements in which the experiment and the experimenter are not only embedded but actually constituted. Everything is entangled. It is a radical kind of contextualism that extends entanglements not only across space but across history (in both directions). There are no individual agents acting upon separate objects, for Barad, and thus no way to separate the knower from the known. It is all “intra-action” in which agents and objects come into being, along with their histories, through the enactment of cuts (meaningful distinctions) in an otherwise undifferentiated phenomenal reality.

  Barad does not use the idiom of retrocausation, which has gained in popularity in the decade since she wrote her book—and she might not necessarily endorse the two-state vector or transactional interpretations described here. But what she is articulating is the literal participation of measurement (and by extension, all physical interactions and entanglements) in producing the historical conditions that led to them. The particle’s past is determined by how the experimenter makes contact with it in the present, as well as by further decisions that will be made about that contact going forward. The past never ceases being made by the future, and we ourselves never cease being made in the intra-actions that create this new knowledge. We and time are all of a piece—an idea we will loop back to at the end of this book.

  * * *

  “The herd of mainstream opinion has resigned itself to the spooks,” Huw Price and Ken Wharton lament, “but there is an open gate leading in a different direction.” 31 They contend that there really are no good arguments against retrocausation from a physics standpoint. The objections are purely philosophical: The problem of free will is what has tended to deflect physicists from confronting time-symmetric interpretations of quantum mechanics with the seriousness that these quite elegant solutions deserve. 32 When suddenly confronted with the radical and troubling idea of the block universe implied by retrocausation, physicists frequently turn into philosophers, anxious to preserve and protect one of the philosophical pillars of our individualistic, freedom-loving society. Even when retrocausation advocates broach the subject in polite company, either in journal articles or the popular science press, they generally do some obligatory rhetorical dance promising that free will is safe and sound (even if it really may not be). 33 This kind of anxiety tells us we are in the world of a powerful taboo. We’ll come back to that too.

  But even if it remains a minority interpretation in physics and touches on deep-seated hangups, resistance to retrocausal thinking is softening. If you follow the science news, it seems like every month some new theory or experiment is throwing the universality of cause-preceding-effect into question. 34 For example, research in quantum information theory by Časlav Brukner and Philip Walther at the University of Vienna and by Giulio Chiribella at the University of Hong Kong is showing that causal order can be “indefinite” at the quantum level. 35 Recent experiments are even showing that quantum-entangled systems can defy the second law of thermodynamics: Heat can be made to flow from a colder object to a hotter one—another kind of “time reversal.” 36 “Time crystals” are also now being created in laboratories: little perpetual motion machines that continually oscillate without any energy being added to them, as though they are stuck in an endless, timeless loop, little chunks of eternity. 37 What do these new developments and possibilities tell us—or not tell us—about the possibility of gaining useful information about the future before it arrives—effectively the obverse (and thus equivalent) of “sending a signal into the past”? Even on this question, the skeptics and deniers may have less of a leg to stand on than was once believed. That “gate” Price and Wharton mentioned really is wide open to a lot of amazing possibilities.

 
