
The End of Absence: Reclaiming What We've Lost in a World of Constant Connection


by Michael Harris


  A prime example is the Google Books project, which has already scanned tens of millions of titles with the ultimate goal of democratizing human knowledge at an unprecedented scale—the new technology needs the old one (briefly) for content; the old one needs the new (forever) to be seen by a larger audience. Screen resolution and printout resolution are now high enough that digital versions satisfy researchers, who no longer need to look at original manuscripts (unless they’re hungry for first-person anecdotes). A real Latin copy of Copernicus’s De revolutionibus, for example, waits for us in the stacks of Jagiellonian University in Kraków; but it, like a fourth-century version of Virgil’s work or a twelfth-century version of Euclid’s, is handily available in your living room (sans airfare). It’s thrilling: our sudden and undreamed-of access to magazine spreads as they appeared in the pages of Popular Science in the 1920s or copies of Boccaccio’s Decameron as they appeared in the nineteenth century. The old, white-gloved sacredness of the manuscript is rendered moot in the face of such accessibility. Literary critic Stephan Füssel has argued that this means the “precious old book and the new medium have thus formed an impressive invaluable symbiosis.” I would only add: For now.

  One authenticity must eventually replace the other. But first, this wobble, this sense of two authenticities overlaid, a kind of bargaining.

  When Gutenberg published his Bible, he took great pains to please his readers’ sense of the authentic by matching his printed work to earlier scribal versions. John Man describes in The Gutenberg Revolution an intense labor, geared toward creating a kind of überversion of a handmade Bible rather than something entirely new. Authenticity, or an entrenched idea of authenticity, was key: Three punch cutters worked for four months to create all the punches that would do the printing, painstakingly copying them from a handwritten Bible to replicate the texture of a human touch. (His Bible’s 1,282 pages also incorporated accents that scribes had used to indicate short forms of words.) Although paper was readily available—and he did print his Bible on paper—he also imported five thousand calfskins in order to print around thirty “authentic” vellum copies. Gutenberg’s Bible, a manufactured masterpiece, claimed traditional authenticity even as it began to rub out that which came before. Yet first came that fascinating moment of flux: In the late fifteenth century, scribal culture and print culture were coexisting, with handwritten manuscripts being copied from printed books just as printed books were copied from scribal ones. The old “authentic” artifact and the new “fake” artifact—for a moment in time—informed each other.

  • • • • •

  When we step away from earlier, “more authentic” relations, it makes sense that we also fetishize the earlier “real.” Sherry Turkle argues that, in fact, our culture of electronic simulation has so enamored us that the very idea of authenticity is “for us what sex was for the Victorians—threat and obsession, taboo and fascination.” (One can imagine future citizens sneaking into underground clubs where they “actually touch each other.”) When I walk through the chic neighborhoods of London or Montreal—when I look through the shops that young, moneyed folk are obsessed by—it is this notion of ironic “authenticity,” the refolking of life, that seems always to be on offer. A highly marketable Mumford & Sons–ization. Young men buy “old-fashioned” jars of mustache wax, and young women buy 1950s-style summer dresses. At bars, the “old-fashioned” is one of the most popular cocktails, and hipster youths flock to the Ace hotel chain, where iPhone-toting customers are granted access to record players and delightfully archaic photo booths.

  The fascination with the authentic tin of biscuits or vintage baseball cap remains, of course, the exception that proves the rule. The drive toward the inauthentic still propels the majority of our lives. When aren’t we caught up in a simulacrum? Millions of us present fantasy versions of ourselves—skinnier, richer avatars—in the virtual world of Second Life (while our First Life bodies waste away in plush easy chairs). Some even watch live feeds of other people playing video games on Twitch.tv (hundreds of thousands will watch a single person play Grand Theft Auto and send cash donations to their favorite players). Meanwhile, in Japan, a robotic seal called Paro offers comfort to the abandoned residents of nursing homes; and the photo- and video-sharing site Instagram is less interested in recording reality and more interested in pouring it through sepia filters. The coup de grâce: Advances in the field of teledildonics promise us virtual sex with absentee partners. All in all, it seems the safety of our abstracted, cyborg lives is far more pleasing than the haptic symphony of raw reality. Digital life is a place where we can maintain confident—if technically less authentic—versions of ourselves.

  It’s also a perfect place to shirk certain larger goals. The psychologist Geoffrey Miller, pondering why we haven’t yet come across any alien species, decided that they had probably all become addicted to video games and been lulled into an extreme state of apathy—the exploratory opposite of the heroes in Star Trek, who spend all their time seeking out “new life and new civilizations.” The aliens “forget to send radio signals or colonize space,” he wrote in Seed magazine,

  because they’re too busy with runaway consumerism and virtual-reality narcissism. They don’t need Sentinels to enslave them in a Matrix; they do it to themselves, just as we are doing today. . . . They become like a self-stimulating rat, pressing a bar to deliver electricity to its brain’s ventral tegmental area, which stimulates its nucleus accumbens to release dopamine, which feels . . . ever so good.

  Wouldn’t it make sense to shunt authentic tasks like child rearing, or space exploration, or the protection of the environment to one side while pursuing augmented variations on the same theme?

  • • • • •

  Our devotion to the new authenticity of digital experience—the realness of the patently incorporeal—becomes painfully apparent in moments of technological failure. Wi-Fi dies at a café and a fleet of bloggers will choke as though the oxygen level just dropped.

  Mostly these strangulations are brief enough that they don’t cut us off in any significant way from our new reality. The realness of our digital lives is firm. The breach was just a hiccup. But how invincible, really, is our new reality, our gossamer web?

  In 1909, E. M. Forster published a smart little story called “The Machine Stops,” in which the web does drop away. In Forster’s vision of the future, humans live below the surface of the earth, happily isolated in hexagonal rooms like bees in a massive hive. They each know thousands of people but are disgusted by the thought of physical interaction (shades of social media). People communicate through “plates” (they Skype, essentially), and all human connection is conducted through the technological grace of what’s simply called the Machine, a massive networked piece of technology that supplies each person with pacifying entertainment and engaging electronic connections with other people. The Machine does not transmit “nuances of expression,” but gives “a general idea of people” that’s “good enough for all practical purposes.” When “speaking-tubes” deliver too many messages (e-mail), people can turn on an isolation mode, but they’re then flooded by anxious messages the moment they return. Year by year, humans become more enamored of the Machine, eventually developing a pseudoreligion around it in what Forster terms a “delirium of acquiescence.”

  Humans are warned off authentic experience. “First-hand ideas do not really exist,” one advanced thinker proclaims. “They are but the physical impressions produced by love and fear, and on this gross foundation who could erect a philosophy? Let your ideas be second-hand, and if possible tenth-hand, for then they will be far removed from that disturbing element—direct observation.” Inevitably, though, the virtuosic Machine begins to fall apart, and with it the very walls of their micromanaged underground society.

  Author Jaron Lanier recalls Forster’s story as a message of hope, a fantasy where mankind casts off its shackles (or has those shackles forced off, anyway). “At the end of the story . . . ,” Lanier recounts, “survivors straggle outside to revel in the authenticity of reality. ‘The Sun!’ they cry, amazed at luminous depths of beauty that could not have been imagined.”

  But in fact Lanier is misremembering here. The underground citizens of Forster’s story do not climb out from the Machine’s clutches and discover the sun. The air above is toxic to them, and when the Machine dies, Forster’s heroes are buried alive, catching a glimpse of “the untainted sky” only as rubble crashes down and kills them. There’s no revelation; it’s a cold, dark death for all of them. The final words spoken in the story are not the euphoric ones remembered by Lanier. The last words anyone speaks are, “Humanity has learned its lesson.” Forster is describing a reverse Gutenberg moment. An undoing of the future.

  Our own Machine has been similarly threatened before, though we were far less reliant on communication technologies then. On September 1, 1859, a storm on the surface of our usually benevolent sun released an enormous megaflare, a particle stream that hurtled our way at four million miles per hour. The Carrington Event (named for Richard Carrington, who saw the flare first) cast green and copper curtains of aurora borealis as far south as Cuba. By one report, the aurorae lit up so brightly in the Rocky Mountains that miners were woken from their sleep and, at one a.m., believed it was morning. The effect would have been gorgeous, to be sure. But this single whip from the sun had devastating effects on the planet’s fledgling electrical systems. Some telegraph stations burst into flame.

  Pete Riley, a scientist at Predictive Science in San Diego, published an article in Space Weather in 2012 stating that our chances of experiencing such a storm in the next decade are about 12 percent. That’s a one in eight chance of a massive digital dismantling. If it doesn’t happen soon, it’ll happen eventually. Great Britain’s Royal Academy of Engineering has pegged the chance of a Carrington-type event within the next two centuries at about 95 percent. Such an event almost took place in the summer of 2012, involving a particle stream larger than we imagine the original Carrington Event to have been. But it just missed the earth, shooting harmlessly over our heads (over the top of a STEREO spacecraft, actually). When we are hit, at any rate, we won’t be able to save ourselves with some missile defense system meant for meteors; no missile could halt the wraithlike progress of a megaflare.
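  (A back-of-the-envelope check suggests the two estimates are roughly consistent. The compounding below is my own illustration, not Riley’s or the Academy’s arithmetic: if each decade independently carries a 12 percent risk, then over two centuries, or twenty decades, the chance of at least one Carrington-class storm is

```latex
% Chance of at least one hit in 20 independent decades at 12% risk per decade:
P = 1 - (1 - 0.12)^{20} = 1 - 0.88^{20} \approx 0.92
```

  which lands within a few points of the Academy’s 95 percent figure.)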

  What will happen, exactly? Electricity grids will fail; some satellites will break down; aircraft passengers will be exposed to cancer-causing radiation; electronic equipment will malfunction; for a few days, global navigation satellite systems will be inoperable; cellular and emergency communication networks may fail; the earth’s atmosphere will expand, creating a drag on satellites in low earth orbit; satellite communication and high-frequency communication (used by long-distance aircraft) will probably not work for days.

  I daydream about a latter-day Carrington Event weirdly often, actually. (It’s pleasant to have something truly morbid to fix on while sitting on a subway, and if Milton isn’t doing the trick, then I switch to other celestial damnations.) Joseph Weizenbaum, the creator of ELIZA, whom we met in chapter 3, noticed even in the mid-1970s that computers had become as essential to human life as our most basic tools: If extracted from us cyborgs, “much of the modern industrialized and militarized world would be thrown into great confusion and possibly utter chaos.” I imagine our transportation and communication systems crashing to a halt, our banks and governments freezing or, worse, misfiring commands. I imagine our refrigeration systems failing and, with them, all our stores of perishable food. Entire power grids blinking off. GPS systems becoming fuzzy to the point of fouling precise military actions. A team of scientists from Atmospheric and Environmental Research estimated that such an event would cost the United States alone up to $2.6 trillion in damage and would take as long as a decade to recover from.

  A single lashing from the sun—the most authentic body we know—could shake our fantastic Machine. The prospect prompts us to imagine a moment when our Machine stops entirely (as Forster did). The thought experiment is as enlightening as it is gruesome.

  Think of that moment when the fridge shuts off, causing you to realize—in the silence that ensues—that you’d been hearing its persistent hum all along. You thought you knew silence, but you were really surrounded by the machine’s steady buzz. Now multiply that sensation by the world. Think how cold, how naked, how alone, how awake, you might be. Your own private Carrington Event.

  Amazing, how through the creeping years absence could leave us so quietly, so stealthily—yet the return of absence might be so violent a shock.

  PART 2

  * * *

  Breaking Away

  When from our better selves we have too long

  Been parted by the hurrying world, and droop,

  Sick of its business, of its pleasures tired,

  How gracious, how benign, is Solitude.

  —William Wordsworth, The Prelude

  CHAPTER 6

  Attention!

  In proportion as our inward life fails, we go more constantly and desperately to the post-office.

  —Henry David Thoreau, Walden

  THOREAU was right. Whenever I am frustrated, miserable, thwarted, I’ll open my in-box twice as often. But this is not my in-box’s fault. It’s mine. My need for distraction is tied to my emotional state. The unopened incoming message is always the best one; its only content is promise. W. H. Auden described this same love of forthcoming letters in his poem “Night Mail,” where he meditates on the effect of a mail train running from London to Scotland:

  And none will hear the postman’s knock

  Without a quickening of the heart.

  For who can bear to feel himself forgotten?

  In fact, I don’t expect to find anything so very extraordinary in my in-box at all. But the act of calling up the mail is itself a solace. And I can call it up whenever I choose, unlike the locals waiting for Auden’s mail train, who knew they couldn’t control its arrival time and so would have put the idea of new messages out of their heads for much of the day. To check my mail I issue just a click, no commitment at all, and in the indefinite moment when the e-mail is called forth I feel the jolt of hope: in that secret instant I may have received something wonderful; my gray little life may be changed. Sometimes I think that is the only real moment when I relax, when the world’s voice has not quite arrived and I can watch those messages load.

  I never used to think this behavior meant I was turning into a stimulus junkie. But then, one day, I forced myself to count the number of times I compulsively checked the status of my in-box. Answer: fifty-two. And what was I looking for there? I go for pure distraction from my duties, certainly, but it’s also true that some part of me has an oversize expectation of the messages therein. My eight-year-old self saw the mailman as a bringer of daily gifts, and my adult self is similarly enamored of unopened missives. Psychologists have a term for behavior like mine: “operant conditioning.” It’s a phrase, coined by B. F. Skinner in 1937, that describes any voluntary behavior that is shaped by its consequences. At its most basic level, operant conditioning implies that a creature will repeat an activity that produces positive rewards (sugar cubes for horses, bingo winnings for humans, and so on). But then comes the insidious part: the “variable-interval reinforcement schedule.” Studies show that constant, reliable rewards do not produce the most dogged behavior; rather, it’s sporadic and random rewards that keep us hooked. Animals, including humans, become obsessed with reward systems that only occasionally and randomly give up the goods. We continue the conditioned behavior longer after the reward is taken away because surely, surely, the sugar cube is coming up next time. In my case, I need to receive only one gratifying e-mail a month (praise from an editor, a note from a long-lost friend) before I’m willing to sift through reams of mundane messages in the hopes of stumbling on another gem.
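  For the technically inclined, that hook can be sketched in a few lines of code. What follows is a toy model, my own illustration rather than anything from Skinner: it borrows the “discrimination” account of extinction, in which a checker gives up only once the current dry spell looks abnormal next to the dry spells it was trained on.

```python
import random

def checks_before_quitting(reward_prob, training_checks=200, tolerance=3, seed=1):
    """Toy model of resistance to extinction under a given reward schedule.

    During training, each in-box check pays off with probability
    `reward_prob`. The checker remembers the longest drought (run of
    empty checks) it sat through. When rewards then stop for good, it
    keeps checking until the current drought exceeds `tolerance` times
    the worst drought it saw in training; the return value is how many
    fruitless checks that takes.
    """
    rng = random.Random(seed)
    longest_drought, drought = 1, 0
    for _ in range(training_checks):
        if rng.random() < reward_prob:
            drought = 0          # a gem arrives; the drought resets
        else:
            drought += 1         # another empty in-box
            longest_drought = max(longest_drought, drought)
    # Extinction: no reward will ever come again, but the checker only
    # quits once the dry spell looks abnormal compared with training.
    return tolerance * longest_drought

if __name__ == "__main__":
    print("rewarded every check:", checks_before_quitting(1.0))  # quits almost at once
    print("rewarded 1 in 10    :", checks_before_quitting(0.1))  # persists far longer
```

  Run it and the constantly rewarded checker quits after a handful of empty checks, while the sporadically rewarded one soldiers on through dozens: to a creature raised on random rewards, no drought ever feels conclusive.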

  I’m not sure I’m as far gone an e-mail junkie as the average American office worker, who in one depressing report was found to be managing her e-mail for a quarter of each day. But let’s say I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. I once asked a friend to keep tabs on how many times I looked away from the book I was reading. He told me I glanced away from that particularly good Alan Hollinghurst novel an average of six times per page. The distractions—What am I making for dinner?, Who was that woman in Fargo?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing? And how content might I become if I weren’t so constantly sure that the mailman has my true, far more glamorous life in that bag?

 
