
The End of Absence: Reclaiming What We've Lost in a World of Constant Connection


by Michael Harris


  Our genes and memes have been working to shape us since humans first started copying one another’s raw vocalizations. But now we may be witness to a third kind of evolution, one played out by our technologies. This new evolution is posited not by a Silicon Valley teen, but by a sixty-one-year-old woman in rural England named Susan Blackmore. Just as Darwinism submits that genes good at replicating will naturally become the most prevalent, Blackmore submits that technologies with a knack for replication will naturally rise to dominance. These “temes,” as she’s dubbed the new replicators, could be copied, varied, and selected as digital information—thus establishing a new evolutionary process (and one far speedier than our genetic model). Evolutionary theory holds that given a million technological efforts, some are bound to be better at making us addicted to them, and these give rise, organically (as it were), to more and more addictive technologies, leaving each generation of humans increasingly in service to, and in thrall to, inanimate entities. Until we end up . . . well, where we are.

  Blackmore’s work offers a fascinating explanation for why each generation seems less and less capable of managing that solitude, less likely to opt for technological disengagement. She suggests that technology-based memes—temes—are a different kind of replicator from the basic memes of everyday material culture, the ones Dawkins was describing. What sort of difference is there? I wanted to know. “The most important difference is the fidelity of copying,” she told me. This is important because a meme’s ability to propagate grows as its fidelity rate increases. “Most memes . . . we forget how often we get them wrong.” (Oral traditions of storytelling, for example, were characterized by constant twists in the tale.) “But with digital machines the fidelity is almost 100 percent. As it is, indeed, with our genes.” This is a startling thought, though a simple enough one: By delivering to the world technologies capable of replicating information with the same accuracy as DNA, we are playing a grand game indeed. The fidelity of our earliest memetic acts would have improved significantly with the advent of writing, of course, and then again thanks to the printing press, which might (like us) be called a meme machine. But we now have near perfect replication online.

  We are now becoming, by Blackmore’s estimation, teme machines—servants to the evolution of our own technologies. The power shifts very quickly from the spark of human intention to the absorption of human will by a technology that seems to have intentions of its own.

  Kevin Kelly takes this notion to the nth degree in his 2010 book, What Technology Wants, where he anthropomorphizes technologies and asks what they would like us to do. “The evolution of technology converges in much the same manner as biological evolution,” he argues. He sees parallels to bioevolution in the fact that the number of lines of code in Microsoft Windows, for example, has multiplied tenfold since 1993, the software growing more complex over time just as biological organisms tend to do.

  But viewed in the clear light of morning, we’ll likely find there was no robotic villain behind the curtain. Your iPhone does not “want” anything in the way that we perceive “want” to exist. Instead of animal “want,” we will confront only the cool, unthinking intelligence of evolution’s law. And, to be sure, our own capitalist drive pushes these technologies, these temes, to evolve (if that’s what they’re doing). Consider the fact that Google tested forty-one shades of blue on its toolbar to see which elicited the most favorable response. We push the technology down an evolutionary path that results in the most addictive possible outcome. Yet even as we do this, it doesn’t feel as though we have any control. It feels, instead, like a destined outcome—a fate.

  • • • • •

  Blackmore’s conception, if thrilling, is also harrowing. Genes must cooperate with us to get copied into the next generation, and they produce animals that cooperate with one another. And temes (being bits of information, not sentient creatures) need humans to build the factories and servers that allow them to replicate, and they need us to supply the power that runs the machines. But as temes evolve, they could demand more than a few servers from future generations of humans. Blackmore continued:

  What really scares me is that the accelerating evolution of temes and their machinery requires vast amounts of energy and material resources. We will go on supplying these as long as we want to use the technology, and it will adapt to provide us what we want while massively expanding of its own accord. Destruction of the climate and of earth’s ecosystems is the inevitable outlook. It is this that worries me—not whether they are amoral or not.

  Blackmore’s vision for our children’s technological future may seem nightmarish to the point of fantasy, especially since she is, in effect, reporting on the future, suggesting eventualities that cannot be known with the certainty we might like.

  Yet when I think now of all that Blackmore told me, and of the eerie promise of decoded neurofeedback, when I think of the advancing multitudes of youths (and adults, too) who seem so perilously entranced by inanimate tools, I do sometimes lose heart. Then again, I want to counter all that with an opposing vision.

  The best way I can describe this optimistic alternative is to call up a scene from that 1999 gem of a movie The Matrix. In the Wachowski siblings’ film, a population enslaved by computers has been turned into a warehouse of mere battery cells, kept complacent by a mass delusion, the Matrix, which is fed into human brains and keeps them thinking they are living their own lives, freely, on the sunny streets of the world. In fact, as they dream out their false experiences, their physical bodies are held in subterranean caverns, each sleeping human jacked into a womblike pod. In my favorite scene, Neo, our hero, is torn from this dreamworld and awakens in that great dark chamber. Gasping for air, the first real air he has ever breathed, Neo stares out with stunned eyes and sees the raw world at last.5

  The Matrix is a technologically derived web of illusions, a computer-generated dreamworld that’s been built to keep us under control. The people in its thrall are literally suspended, helpless in their servitude to a larger technological intelligence. The solution to this very real human problem is the same solution presented by Buddhism and Gnosticism—we must, like Neo, awaken.

  • • • • •

  It’s becoming more and more obvious. I live on the edge of a Matrix-style sleep, as do we all. On one side: a bright future where we are always connected to our friends and lovers, never without an aid for reminiscence or a reminder of our social connections. On the other side: the twilight of our pre-Internet youths. And wasn’t there something . . . ? Some quality . . . ?

  I began this chapter lamenting little Benjamin’s confusion over the difference between a touch-sensitive iPad screen and a hard copy of Vanity Fair. But now I have a confession to make. I’m not much better off. This is not a youth-only phenomenon. A 2013 study from the University of Michigan found that those of us in our late thirties have now reached the point of having as many electronic interactions as we have face-to-face interactions. What a dubious honor that is—to be the first generation in history to have as many exchanges with avatars as with people. I wonder, sometimes, if this means I’ll start to treat friends and family as avatars in person. Occasionally, I’m hit with how weirdly consistent a group of people appears during a dinner party—how weird it is that they aren’t changing or scrolling like thumbnail portraits on a Twitter feed, being replaced, or flicking off. I’m suffering the same brain slips that young Benjamin suffered when he tried to use a hard-copy magazine as a digital interface. The only difference is that I’m more freaked out.

  Increasingly, I notice small moments when I treat hard-copy material as though it were digital. I’ve seen my fingers reach instinctively to zoom in to a printed photo or flick across a paper page as though to advance the progress of an e-book. These slips are deeply disturbing, a little like early signs of dementia. And they crop up in more meaningful scenarios, too. Just the other day, while discussing a particularly dreadful acquaintance with a friend of mine, I actually said, “Ugh, unfollow,” using Twitter’s term for removing an avatar from one’s ranks. And it wasn’t a semantic joke, is the thing. I clicked a button in my head and felt that jerk’s swift removal from my mental address book.

  There is one key difference here between young Benjamin and me. I am aware of my own confusion and can confront it. I can still recall my analog youth.

  In the quiet suburb where I was raised, there was a green hill near our house, a place where no one ever went. It was an easy trek, over the backyard fence and up a dirt path, and I would go there on weekends with a book if I wanted to escape the company of family or merely remove myself from the stultifying order of a household. Children do need moments of solitude as well as moments of healthy interaction. (How else would they learn that the mind makes its own happiness?) But too often these moments of solitude are only stumbled upon by children, whereas socialization is constantly arranged. I remember—I was nine years old—I remember lying on the green hill and reading my book or merely staring for a long, long time at the sky. There would be a crush of childish thoughts that would eventually dissipate, piece by piece, until I was left alone with my bare consciousness, an experience that felt as close to religious rapture as I ever had. I could feel the chilled sunlight on my face and was only slightly awake to the faraway hum of traffic. This will sound more than a little fey, but that young boy on the hillside did press daisies into his book of poetry. And just the other day, when I took that book down from its dusty post on my shelf, the same pressed flowers fell out of its pages (after a quarter century of stillness) and dropped onto my bare toes. There was a deep sense memory, then, that returned me to that hushed state of mind on the lost green hill, a state that I have so rarely known since. And to think: That same year, a British computer scientist at CERN called Tim Berners-Lee was writing the code for the World Wide Web. I’m writing these words on the quarter-century anniversary of his invention.

  That memory of a quieter yesteryear is dearly useful. Awake—or at least partly so—to the tremendous influence of today’s tech-littered landscape, I have the choice to say yes and no to the wondrous utility of these machines, their promise and power. I do not know that Benjamin will have that same choice.

  Regardless, the profound revelations of neuroplasticity research are constantly reinscribing the fundamental truth that we never really outgrow our environments. That the old, like the young, are vulnerable to any brave new world they find themselves walking through. The world we fashion for ourselves, or think we fashion, remains an insistent shaper of our minds until the day we die. So, in fact, we are all Kids These Days.

  Despite the universality of this shift, which we’re all buffeted by, there is a single, seemingly small change that I’ll be most sorry about. It will sound meaningless, but: One doesn’t see teenagers staring into space anymore. Gone is the idle mind of the adolescent.

  I think that strange and wonderful things occur to us in those youthful time snacks, those brief reprieves when the fancy wanders. We know that many scientists and artists spring from childhoods of social deprivation. The novels of Anthony Trollope, for example, are the products of a friendless youth. He describes in his autobiography years and years of boyish daydreaming, which continued in adulthood:

  Other boys would not play with me. . . . Thus it came to pass that I was always going about with some castle in the air. . . . There can, I imagine, hardly be a more dangerous mental practice; but I have often doubted whether, had it not been my practice, I should ever have written a novel. I learned in this way to maintain an interest in a fictitious story, to dwell on a work created by my own imagination, and to live in a world altogether outside the world of my own material life.

  Solitude may cause discomfort, but that discomfort is often a healthy and inspiring sort. It’s only in moments of absence that a daydreaming person like Anthony Trollope can receive truly unexpected notions. What will become of all those surreptitious gifts when our blank spaces are filled in with duties to “social networks” and the relentless demands of our tech addictions?

  I fear we are the last of the daydreamers. I fear our children will lose lack, lose absence, and never comprehend its quiet, immeasurable value. If the next generation socializes more online than in the so-called real world, and if they have no memory of a time when the reverse was true, it follows that my peers and I are the last to feel the static surrounding online socialization. The Internet becomes “the real world” and our physical reality becomes the thing that needs to be defined and set aside—“my analog life,” “my snail life,” “my empty life.”

  Montaigne once wrote, “We must reserve a back shop, all our own, entirely free, in which to establish our real liberty and our principal retreat and solitude.” But where will tomorrow’s children set up such a shop, when the world seems to conspire against the absentee soul?

  CHAPTER 3

  Confession

  The highest and most beautiful things in life are not to be heard about, nor read about, nor seen but, if one will, are to be lived.

  —Søren Kierkegaard

  THE third most Googled person in 2012 was a small fifteen-year-old girl from Port Coquitlam—a nondescript Canadian town composed mainly of box stores, parking lots, and teenagers with nothing to do. The girl’s name was Amanda Todd. She liked cheerleading—being petite, she got to be the girl at the top of the pyramid. And she liked to sing—she would perform covers before her computer’s camera and post the videos on YouTube under an account titled SomeoneToKnow. In these, her adolescent pursuits, she was entirely typical. But when Amanda Todd killed herself on Wednesday, October 10, a different light was cast on her seemingly ordinary life; within days, media alighted on the most notorious cyberbullying case in history.

  I will not fill these pages with a detailed account of the years of abuse that led up to her death (that story is readily available on the Internet, which proved such an entrenched and toxic commentator on Todd). Suffice it to say that when still in grade seven, she was convinced by an unidentified man to expose her breasts via webcam. That man then proceeded to blackmail and harass her with the captured image of her nude body for years. (“Put on a show for me,” he would later order.) Todd became the subject of a tormenting Facebook profile, which featured her breasts as its profile picture. She attended three schools in the space of a year in an effort to avoid the ensuing harassment from peers. She was beaten by a gang of young girls (while others stood by and recorded the scene on their phones). And, eventually, Todd became so paranoid and anxious that she could not leave her home. Her first suicide attempt, drinking from a bottle of bleach, was unsuccessful and only led to more of the online bullying that had driven her to it in the first place.

  Then, a month before her death, Todd posted a video on her YouTube channel, unpacking her troubled story. This time she wasn’t singing someone else’s song, but describing for viewers (in a broken way, for she suffered from a language-based learning disability) her own suffering. Naturally, this opened her again to the attacks of faceless online “commenters.” By stepping into the buzzing crowds of Internet forums, we hazard a deep cruelty. While Amanda Todd lived, these waves of ridicule pushed her toward more public confessions, which were broadcast over the very mass communication technologies that had spurred her distress. Later, after her suicide, she was transformed into a meme and a hashtag, bandied about online in a series of suicide jokes and vandalisms on her memorial pages. She is taunted in death even more than she was in life.

  What interests me more than the common tale of online abuse, though, is the outlet that Todd turned to as a balm for her wounds—the video she posted as a final creative act. She turned, against all reason except perhaps that of an addict, to the very thing that made her suffer so. She turned to an online broadcast technology. When I first read about this tortured girl, I kept wondering how much we all subvert our emotional lives into our technologies; how much of the pain and suffering we each live with is now funneled away from traditional outlets (diaries, friends, counselors) and toward an online network that promises solace.

  Two weeks after Todd killed herself, her mother, Carol, sat on a black sofa and spoke to the media. “It’s not about a child who . . . just sat on her computer in her room,” she said. She spoke with a soft and uncertain voice. She did not look into the camera. She tried to describe her daughter’s state of mind in the days leading up to that final act. “She realized the error in her actions, but that error couldn’t be erased. . . . She tried to forget, she tried to make it go away. She tried to change schools. But wherever she went, it followed her.” The footage is hard to watch; Carol Todd looks understandably distraught and annoyed by the attention of the media. In the end, she bites her lip, says to the scores of imagined “bad mom” accusers, “Amanda was born into the right family.” And then she asks for the camera to be turned off.

  When, months later, I asked Carol Todd for an interview, she deferred or canceled our meeting a half-dozen times; her reasons sometimes seemed genuine and sometimes not. I’d resigned myself to not meeting her at all when she had an apparent change of heart and asked me to lunch. So I traveled to her hometown and sat myself down at a restaurant she likes called Earl’s, where pretty, polished girls, about the age Amanda would be by now, brought us our sandwiches and coffee.

 
