The End of Absence: Reclaiming What We've Lost in a World of Constant Connection


by Michael Harris


  I am certain my childhood brain was less distractible than my adult brain. I have a distinct feeling that I’ve lost some ability to remain attentive to a book or given task. Who was that boy who read all of Jurassic Park in a single sitting in the cloister of his parents’ living room? Who was that teenager who, sailing through the Gulf Islands one August, simultaneously whizzed through Great Expectations and (sigh) memorized every word of the Sunset Boulevard libretto? And who is this frumpy thirty-something man who has tried to read War and Peace five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.

  • • • • •

  When the film critic Roger Ebert died, I reread an essay of his from a May 2010 issue of the Chicago Sun-Times. In it, Ebert describes his faded love for nineteenth-century novelists—Austen and Dickens and Dostoyevsky. For years, “I would read during breakfast, the coffee stirring my pleasure in the prose.” He read them all, spent hours at a go in their complicated worlds. But then he dropped the breakfast and dropped the reading, too. Novels became something one used to fill up transatlantic flights. (And then, once an iPad could be loaded with a few seasons of Entourage, they weren’t needed there, either.) In the previous year, he had tried to tackle Dickens’s Dombey and Son, and although he loved the writing, he kept finding himself incapable of continuing. As it was for me with my War and Peace, there always seemed to be something else to do. Deadlines, he shrugged. “Tweeting. Blogging. Surfing.” Ebert found, as do many other damaged Dickensophiles, that “instead of seeking substance, we’re distractedly scurrying hither and yon, seeking frisson.” Frisson, in this case, is French for “adorable videos of cats trapped in cardboard boxes.”

  Are the luxuries of time on which deep reading relies available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.

  It’s important to note this slippage. To remember that those cat videos were not always there. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And books were a true refuge. What I liked best about them was the fact that they were a world unto themselves, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.

  The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me. I couldn’t perform this magic trick in the company of others, though. I couldn’t enjoy reading at all if there was anyone else present. If my parents or brothers came into the living room when I was reading and stationed themselves with a book on the couch opposite, I would be driven to distraction, wondering what was going on in their book, and would be forced to leave the room in search of some quieter psychic hollow.

  In the purgatory of junior high school, I spent every recess and lunch break holed up in a wooden stall at the school’s library, reading a series of fantasy novels called DragonLance, which then consisted of several dozen books (I read them all). And I was not so rare in my behavior, either. Many writers, and also the general population of introverts, will take to reading as a form of retreat. Alberto Manguel relates in A History of Reading how the novelist Edith Wharton would escape the stultifying rules of nineteenth-century life by reading and writing in her bedroom exclusively. I had my bullies on the playground, but Wharton had the irredeemable constraints of corsets and polite conversation. In R. W. B. Lewis’s biography, she is described as throwing “a minor fit of hysterics because the bed in her hotel room was not properly situated.” I’m inclined to believe it wasn’t in a proper position for reading. Any devoted reader knows how important it is to have a proper cave in which to commit the act.

  But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Wharton could shut out distraction in her locked bedroom. Today there is no room in my house, no block in my city, where I am unreachable.

  At the end of that Roger Ebert essay, he says he decided to force himself to do the reading that he knew, deep down, his brain wanted and needed. When he gave himself the proper literary diet (and found a room in the house where his Wi-Fi connection failed), “I felt a kind of peace. This wasn’t hectic. I wasn’t skittering around here and there. I wasn’t scanning headlines and skimming pages and tweeting links. I was reading. . . . Maybe I can rewire my brain, budge it back a little in the old direction.”

  Well, I thought, maybe I can, too. Maybe I can “fortify the wavering mind,” as Seneca suggested, “with fervent and unremitting care.”

  I made a list of all my current commitments—work projects and personal ones—and started hacking away at that list while refusing any additions. Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when Kenny flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.

  I’ve taken War and Peace back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.

  You and me, old friend. You, me, and two weeks. I open the book, I shut the book, I open the book again. The ink swirls up at me. This is hard. Why is this so hard?

  • • • • •

  Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young—attention deficit is meant to be something he’s mastered. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”

  The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching. That is what the brain is very good at doing—quickly diverting its attention from one place to the next. We think we’re being productive because we are, indeed, being busy. But in reality we’re simply giving ourselves extra work.” A machine may be able to spread its attention simultaneously across numerous tasks, but in this respect, we humans are far more limited. We focus. Author Tom Chatfield points out that our rapid-switch attention strategy works perfectly well when we’re checking our e-mail or sending texts about that horrible thing Susan wore last night, but

  when it comes to the combination of these “packets” of attention with anything requiring sustained mental effort, however, our all-round performance rapidly decays. According to internal research from Microsoft, for example, it took workers an average of a quarter of an hour to return to “serious mental tasks” after replying to email or text messages.

  The multitasking mind, having abbreviated any deep deliberation it was set to undertake, is therefore more likely to rely on rote information and mechanical analysis. Yet look at the multitasker in action. He or she appears to be a whir of productivity, not some slave to mindless responses. Phone (and cappuccino) held aloft while crossing the intersection—barely avoiding a collision with that cyclist (also on the phone)—the multitasker is in the enviable position of getting shit done.

  We can hardly blame ourselves for being enraptured by the promise of multitasking, though. The tunnel vision involved in reading, say, War and Peace is deeply unnatural; meanwhile, the frenetic pace of goofing around on the Internet has a nearly primal attractiveness. This is because computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.

  • • • • •

  This is why a TED Talk lecture, for example, can be even more engaging on your computer screen than it was in person. In a lecture hall, you are charged with mustering your own attention and holding it, whereas a video is constantly triggering your orienting response with changes in camera angle and lighting; it does these things to elicit attention from you. “Televisions and computers,” says Gentile, “are crutches for your attention. And the more time you spend on those crutches, the less able you are to walk by yourself.”

  So now, just as the once useful craving for sugar and fat has turned against us in an environment of plenty, our once useful orienting responses may be doing as much damage as they do good. Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.

  “In my opinion,” he told me, “we’ve focused for thirty or forty years on the biological and genetic aspects, which has allowed us to come up with all these drugs to handle attention deficit disorder, but we’ve been focused on only half the equation. We’ve focused exclusively on the nature side of things. Everyone seems to think attention problems are purely genetic and unchangeable except by medication.”

  Given that children today spend so much more time in front of flashing screens (more than ten hours per day, when “multitasking” is accounted for), it would be a willful kind of ignorance to assume so much sparking of our orienting responses wouldn’t rewire the brain. “We’re now finding,” Gentile told me, “that babies who watch television in particular end up more likely to have attention deficit problems when they reach school age. It’s pretty obvious: If you spend time with a flickering, flashing thing, it may leave the brain expecting that kind of stimulation.” And it’s not just infants who need to be protected from such flashes. “We’ve found that whenever kids exceed the one to two hours of recreational screen time a day the AAP [American Academy of Pediatrics] recommends, levels of attention issues do go up an awful lot.”

  I stopped him there. “One or two hours of screen time a day? That’s the recommendation?”

  “Yes.”

  “Does anybody meet that standard?”

  “Well, no. Probably not.”

  • • • • •

  It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read War and Peace so much as suffer through it. This is not to disparage the book itself, only the frailty of its current readers. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read War and Peace. It’s too long, too boring, not worth the effort. And you’re elitist for trying.

  War and Peace is in fact thirteen hundred (long) pages long and weighs the same as a dead cat. Each of its thirty-four principal characters goes by three or four different (Russian) names. The aristocrats portrayed often prefer to speak in French, which is odd considering they spend much of their time at war with Napoleon. In my edition, the French is translated only in tight footnotes, as though the translators (Pevear and Volokhonsky) mean to say, “Really? You want us to do that for you, too?” There are also hundreds of endnotes, which are necessary to decode obscure sayings and jokes, so I flip about once each page to the rear of the tome, pinching down to hold my place at the front. (Endnotes: the original hyperlink.) It is, manifestly, the product of a culture with far fewer YouTube videos than our own.

  In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.

  Whether or not I’m alone in this is an open question. Numbers from the Pew Research Center and Gallup suggest that reading levels have in fact remained relatively constant since 1990. Curiously, the largest hit that book reading has taken seems to have occurred somewhere in the 1980s. Then again, the National Endowment for the Arts (NEA) released a massive and scathing report in 2007 that claimed Americans are indeed spending less time reading, that their reading comprehension is eroding, and that such declines “have serious civic, social, cultural, and economic implications.” Nearly half of all Americans from eighteen to twenty-four, apparently, “read no books for pleasure.”

  How are we to square those numbers? Perhaps by focusing less on quantity and more on quality. The NEA report may have come to such wildly different conclusions from those of Gallup and Pew because the NEA is interested in “literary” reading. The report states explicitly, “Literary reading declined significantly in a period of rising Internet use.” The NEA was also concerned with quality of reading environment, noting that a third of reading by young adults is accomplished while “multitasking” with other media, including TV and music.

  • • • • •

  I experienced my first intuition of Tolstoy’s larger phraseology around page fifty-eight. Princess Anna Mikhailovna is subtly begging for money from a countess so that she may pay for her son’s military uniform.

  Anna Mikhailovna was already embracing her and weeping. The countess was also weeping. They wept because they were friends; and because they were kind; and because they, who had been friends since childhood, were concerned with such a mean subject—money; and because their youth was gone.

  In the larger context of war preparations and complex aristocratic maneuvering, I find the sudden vain mention of their vanished youth startling and beautiful (a little comic, too). And this passage has the effect of plunging me into the book properly. For a page or two I am rapt, utterly lost. And then my phone goes off; the miniature absence, and the happiness it gave me, is ended. I want to read, but I stop. I know the distractions are unproductive and I fly to them all the same.

  • • • • •

  Humans are not the only animals who behave in unproductive and irrational ways—and it may be easier to observe the arbitrary nature of our behavior if we look to another species first. Consider the three-spined stickleback fish. The stickleback is a two-inch-long bottom-feeder that lives throughout the northern hemisphere. From late April into July, sticklebacks make their way to shallow mating grounds, where the males, as in most mating grounds, get aggressive with one another. Male sticklebacks develop a bright red throat and underbelly during mating season; the coloring is a product of carotenoids found in the fish’s diet, so a bright red male, having sourced plenty of food for himself, can be seen by females as a desirable mate, and he can also be seen by other males as serious competition—the reddest male sticklebacks elicit more aggression from other males. The Nobel Prize–winning ethologist Niko Tinbergen found, however, that male sticklebacks actually attack whatever piece of material in their environment is reddest. (Place a red ball in a stickleback mating ground and the boys go crazy.) They respond purely to the stimulus of the color itself and not to the fish behind the red. A neural network in the male stickleback’s head is triggered by the sign stimulus, the color red, and produces instinctive aggression on the spot.

  What, I’m now left to wonder, is my red? What kind of stimulus derails my attention against my will; what ingrained tendencies do technologies capitalize on each time they lead me away from the self I hope to fashion? And are they fixed actions, after all? Or are these patterns that I can change?

  In the wild, some species have evolved to take advantage of the fixed action patterns of other creatures. The North American cowbird, for example, will lay its eggs in another species’ nest, and its young are later fed thanks to the parental instinct of the host bird. Is it possible our more successful technologies have reached a point where they are expert exploiters of our own automatic behavior? The Internet’s constantly flashing, amorphous display is an orienting response’s dreamboat, after all.

 
