The Winter of Our Disconnect

by Susan Maushart


  It was so very foreign to the way I myself work best: in monastic seclusion and pin-droppingly pure silence. Even the most innocuous background melody grates like fingernails on the blackboard of my attention span. Music with lyrics can make me scream with frustration. When we had an extension built some years ago, I grew accustomed to the racket of the power tools. But the talkback on the tradesmen’s radios nearly killed me. In the end, I was driven to using earphones on my own radio, tuned to static. As for television, having it on in the background while I try to write—or even read—is almost physically painful. At the first few bars of the Simpsons theme song, I swear I can hear my ideas snapping like twigs.

  “Can’t you just block it out and pretend it’s not there?” plead the kids.

  “Absolutely,” I say, advancing on the “Off” button. If I can’t hear what I’m writing, I try to explain, it’s no longer creating prose. It’s tossing word salad.

  My children naturally regard my work habits as freakish. But they are fairly common among Boomers. In our day, people still sat down to watch television. It was an intentional thing, and it usually involved a program guide. There were no remotes. There were no DVDs in people’s bedrooms. (That, I explain to my wondering children, would have been like having a gas barbecue next to the bed, or a flush toilet in the closet. “Sick!” they chorus.) Television had not yet become the soundtrack for family life—the not-beautiful, not-useful wallpaper lining every household’s personal space; the visual cud on which the entire culture chewed.

  “But Mum . . . We need the background noise,” they explain. They speak slowly, as if I were the one with the intellectual impairment. As, arguably, I may be.

  Sussy confides that one of the reasons she loves her school is that the teachers allow iPods during class. “During CLASS?!” I sputter. (For this educational privilege I’m refinancing the family home?)

  “Well, yeah. Why not? When we’re doing projects and stuff.”

  I tell her about the time my father yelled at me for humming under my breath at the dinner table. I tell her about my primary school cafeteria, where you weren’t allowed to talk. I tell her about the fitness revolution sparked by the appearance of the Walkman in the late seventies. (“The idea that you could go jogging and listen to music at the same time blew people’s minds,” I explain. “Random!” she cries with real feeling. “But I don’t get it, Mum. What’s ‘jogging’?”)

  No wonder my generation doesn’t do well with a lot going on in the background—not cognitively, at any rate. For us, as for Gerald Ford, who famously struggled to chew gum and lead the free world, doing more than one thing at a time is a reach.

  The multitasking millennials we have spawned may be fully functional with a device in every orifice, or they may not. But we, their Boomer and Gen X parents, know our limits. We excel at monotasking.

  Our kids smile indulgently at our limitations. It’s as if our inability to multitask were some quaint but faintly repulsive evolutionary vestige—like hairy knuckles or a monobrow. When we thunder, “But how can you THINK with that racket going on?” they explain sweetly, “The thing is: Our brains are different.” (I don’t know about yours, but my own kids seem to have learned neuroscience before their times tables.) They don’t say “superior,” or “more highly evolved,” but that’s what they mean, and they and I both know it.

  Bill’s idea of dressing for dinner is wearing board shorts that Velcro all the way up. Shoes? That’s pushing it. Anni, who prides herself on her immaculate grooming, sheds her hair extensions as unthinkingly as a snake sheds its skin. Yesterday I noticed them on top of the piano, snarled like roadkill. Sussy pours cereal directly into the drawer, where, most of the time, it hits a bowl.

  Believe me when I tell you: Their brains are different.

  However, I’m pretty sure that’s not what they’re getting at when they assure me that multitasking is a piece of cake (when, to me, it looks like a mangled collection of crumbs). They probably mean more or less what Don Tapscott says in his buoyantly optimistic book Grown Up Digital. “Parents have said to me: ‘How can my kid be doing homework, while he’s also listening to MP3 files, he’s texting on his phone, he’s got three windows open on the computer—one of them Facebook—and he’s petting the dog? How is this possible?’” Tapscott’s answer is scarily identical to Bill’s. “Well,” he says, “it turns out they have different brains.”1

  A Canadian business consultant with an undergraduate degree in psychology and no children, Tapscott assures parents that worrying about kids’ multitasking is like worrying that your mate no longer carries a club. “These kids grow up interacting and collaborating, thinking and organizing, scrutinizing, having to remember things, managing information. And that affects the actual wiring, synaptic connections, and structure of kids’ brains. So they have better switching abilities and better working active memory. If I’m doing several things at once, I can’t keep track of what the heck is going on, but they can. So this is creating a generation that thinks, works, and learns very differently.”2

  The last sentence I have no problem with. About the rest, I’m not so sure. Better switching abilities? Hmm. On the remote, perhaps. Better working memory? Well, maybe—as long as we’re not counting the number of cell phones, iPods, and chargers missing in action on the bus last year. Call me unevolved, but I get nervous when non-scientists start talking about the “actual wiring” in our heads.

  Before we began The Experiment, my own (admittedly equally unscientific) hunch was at odds with Bill’s and Tapscott’s. The idea that constant track-jumping could conduce to a smooth train of thought for anybody, under any circumstances, seemed like a stretch.

  Post-Experiment, I know it is. I watched as my kids awoke slowly from the state of cognitus interruptus that had characterized many of their waking hours, to become more focused, logical thinkers. I watched as their attention spans sputtered and took off, allowing them to read for hours—not minutes—at a stretch; to practice their instruments intensively; to hold longer and more complex conversations with adults and among themselves; to improve their capacity to think beyond the present moment, even if that only translated into remembering to wash out tights for tomorrow morning.

  I’m not saying anybody suddenly went from Hannah Montana to Homer. They didn’t develop an unquenchable thirst for their set texts, or learn to love their trigonometry worksheets. In fact, they probably did no more homework during The Experiment than they had done before—Sussy swears she did much less and, though her grades improved significantly, this may be true. But they all completed their schoolwork far more efficiently, far more quickly, and with visibly greater focus.

  I can’t say what went on in the “actual wiring” inside their heads once they were forcibly encouraged to monotask. Luckily, I don’t have to. Any number of major studies has shown that all that spin about “different brains” has been greatly exaggerated, like Mark Twain’s death or the redemptive power of bleached teeth. In fact, it turns out that multitaskers are not ahead of the cognitive curve at all—not even in those skill areas where one would expect otherwise. It’s true that our kids’ brains are being changed by the media they habitually interact with, and that many of those changes are as yet dimly understood. It’s also true that bookish people like me, who need cocoonlike isolation in order to work effectively, have our own wiring issues. But the fact is no one’s brain is different enough to make constant interruptions, distractions, and task-switching an optimal environment in which to function. No one’s.

  As a fifty-two-year-old, post-reproductive female, my brain is “different” too. Well, hello. Half the time I can’t even remember where I left my last partner, let alone my reading glasses. As we all know, the prevailing cultural mythology for people my age—especially women my age—is all about memory loss, vagueness, and diminution of brain function. At least, I’m pretty sure it is . . . Hang on. Isn’t it?

  LOLishly enough, the latest neuroscience suggests that people at my stage of the game have particularly agile neural ability. (Think Barack Obama, not Menopause: The Musical.) True, we are somewhat slower at acquiring new information. But our ability to process, organize, and contextualize that information is unparalleled—and it shows in our “wiring”—aka our neural structures. Midlife brains are marked by a proliferation of glial cells (that’s Greek for “glue”) and experience optimal convergence between right and left hemispheres. The cumulative effect, notes one neuroscientist, is that “our brains graduate from a dial-up modem pace to high-speed DSL.” No wonder we occasionally exceed our personal download allowance!

  The superior cognitive function we experience at midlife is one reason people tend not to vote for world leaders who are in their twenties and thirties. It also explains why younger air traffic controllers are consistently outperformed by their more experienced elders. A recent study by researchers at MIT and the University of Illinois found that middle-aged workers’ reaction time, memory, and attentional ability were significantly worse than those of their younger colleagues, when both groups were tested in isolation and under laboratory conditions. But when they were tested in real-life conditions, the elderly tortoises absolutely hammered the upstart hares.3

  You and I might call the midlife advantage “wisdom.” Neuroscientists locate it within actual structures in the brain. It all adds up to the same thing: When it comes to sorting through and weighing up multiple bits of information, midlife heads do it better and faster.

  While we’re on the subject of good news for mother, midlife brains are also more efficient when it comes to controlling temperament. Contrary to prevailing stereotypes, males and females alike become less grumpy with age. We also tend to grow less impulsive, less labile in our moods, and less prone to extreme emotional responses. A study carried out at the University of California, Berkeley, assessed 123 women in their early twenties and again in subsequent decades. It found “likable personality traits”—such as the ability to remain objective, to tolerate ambiguity, to handle interpersonal relationships successfully—peaked for women in their fifties and sixties.4 ☺

  Admittedly, on the day of the e. e. cummings assignment, you could be forgiven for thinking otherwise. It was one of those days where something definitely clicked in my head. It was somewhere between an aha! moment and a WTF?! moment.

  That night, after I’d kissed everybody goodnight and switched their screens to sleep mode, I found myself thinking back to my own high school days. I’d usually do my homework in my bedroom, away from the depressingly familiar dialogue on my mother’s afternoon soap opera. Other kids tore handwritten pages from their spiral-bound notebooks, but not me. Those untidy curly edges always set my teeth on edge. I preferred to do my assignments double-spaced neatly on onionskin, on my beloved orange Olympia portable. (Once a nerd, always a nerd.) Yet there was nothing particularly Dickensian about my bedroom. I had a telephone. It was white and had a bleached-blond troll doll glued to the receiver.

  I also had a radio and a stereo, and in my senior year even a portable color TV. But using any of them while I worked would have been as unthinkable as singing karaoke in the middle of a school assembly. And there was no karaoke back then.

  I did get bored sometimes. But most of my distraction strategies seemed to revolve around fire: lighting a stick of incense or a scented candle—hey, it was the seventies, okay?—or, in extremis, crawling through my bedroom window onto the roof to smoke. I also had a weird but obscurely satisfying habit of melting crayons on the light bulb of my desk lamp. That was the closest I ever came to something resembling multitasking. I listened to a ton of music, like all normal teenagers. But when I listened, I listened. Hard, and usually studying the lyrics on the back cover of the LP.

  Other kids I knew were pretty much the same. Some listened to the radio while they studied—something our parents and teachers frowned upon, it amuses me to remember—but that was about as “stimulating” as our media environment ever got.

  These are anecdotal recollections, I realize. Yet the fact that there exist no hard data from this period on teen media use is evidence of how much has changed. Today, entire journals are devoted to the subject, and new articles and books appear as regularly as reruns of Seinfeld. We are so much more interested in how our kids interact with technology. Partly that’s because there is so much more technology. Partly it’s because there’s so much more fear. Thirty-five years ago, we didn’t know enough to know how much we didn’t know. Today we are beginning to.

  You don’t need a Ph.D. in social psychology to tell you something’s up when you have to fight to make eye contact with your teenage children, or get them to sit down and eat a meal, or have the occasional grunt-free conversation. As parents, we comfort ourselves with the excuse that all this is normal, natural, age-appropriate stuff. But somewhere in the back of our beleaguered, Boomer-ish brains, we remember a time—perhaps even our own teen years—when it wasn’t. For many of us, that is exactly the opposite of a comforting thought. It’s a scary one. And it’s exactly that fear factor that creates such fertile ground for the growth of noxious “experts,” whether they are the cheerleaders, who insist we are moving toward a user-generated golden age waaay too cool for parents or other vestigial organisms to understand, or the doomsayers, who prophesy with equal confidence the collapse of civilization as we know it.

  Among the former are authors Don Tapscott—the aforementioned “different brain” dude—and Steven Johnson, whose irresistibly titled Everything Bad Is Good for You argues that the new media only seem to be dumbing us down; in fact, they are making us smarter. True, our kids know fewer facts and less history, these authors acknowledge. They struggle to construct arguments and to maintain focus. But their ability as information hunter-gatherers, their visual acuity, their narrative and creative intelligence, leave an older generation for dead. This is exactly the message our kids want us to hear. It’s also the one they themselves quite earnestly believe. To be fair, there is some compelling evidence to support the cheerleaders’ case—including the fact that IQ (as opposed to SAT scores, say) has been rising for decades.

  In the opposite corner sit observers like Emory literature professor Mark Bauerlein, author of The Dumbest Generation (no prizes for guessing how he feels about multitasking); journalist Maggie Jackson, whose meticulously researched book Distracted argues that, in the age of the Internet, we all have ADHD; and clinical psychologist Michael Osit, whose 2008 book Generation Text thunders against a generation used to “instant everything.” The case the doomsayers argue is persuasive, fact-filled—and deeply depressing. Literacy as we know it is vanishing. Attention spans are anorexic. Narcissism is up—knowledge is down. The culture is coarsening and so is our cognitive edge.

  The cheerleaders tell only the rah-rah side of the story. They fall over themselves in their eagerness to seize the new day. The doomsayers’ gaze, by contrast, is fixed determinedly in the rearview mirror. They see only a rapidly retreating landscape and devote most of their considerable rhetorical power to mourning its passing.

  The question of which side to believe is so pre-Web 2.0. As critic Neil Postman was fond of remarking, “Information explosions blow things up.” That doesn’t mean they blow everything up. But some stuff, yes, inevitably—sometimes some pretty important stuff. The fireworks are amazing—but we pay a price for admission. Media give, and media take away.

  History shows us that, with the turning of each new technological tide, there is always somebody who’ll forecast tsunami. Socrates was one of them. He feared that the written word—basically the Twitter of fourth-century Athens—would undermine education, and he warned that reading would cause people to “cease to exercise their memory and become forgetful.” (Yup. The exact same argument you and I are still making about the use of calculators in school.) Having too many facts at one’s fingertips “without proper instruction” was dangerous too, leading people to be “filled with the conceit of wisdom instead of real wisdom.”5 (The exact same argument you and I are still making about Google.)

  Fifteenth-century Venetian man of letters Hieronimo Squarciafico thought the printing press was the devil. “Already abundance of books makes men less studious,” he fumed. “It destroys memory and enfeebles the mind by relieving it of too much work.”6 A German critic, writing at the dawn of the reading revolution that would sweep Europe and the New World in the early nineteenth century, prophesied a pandemic of “colds, headaches, weakening of the eyes, heat rashes, gout, and arthritis.”7

  In Everything Bad Is Good for You, Steven Johnson cranks the culture jam even louder, imagining what today’s conservative critics—the ones who are convinced that Wikipedia is the devil’s workbench—might have said about printed books. That they are “tragically isolating.” That they understimulate the senses. That they suppress social interaction and breed intellectual passivity. (Imagine! You simply sit back and have the story dictated to you.) All true, of course . . . and all conveniently overlooked by Digital Immigrants such as you and me.

  Most of us don’t think of books as “media” at all—which is both ridiculous and a reminder of how utterly embedded in our media ecology they have become. It’s sobering to realize that Socrates’ version of The Experiment would have been a six-month ban on reading or writing. Random! It occurs to me that I should really try to do without for a day or two, out of a sense of fair play to the Natives. The prospect frankly terrifies me. What the hell would I do? How would I keep up? (Wait a minute. Could this be the way Anni, Bill, and Sussy felt about relinquishing their Facebook accounts?) The concept of having dependency issues around literacy had never occurred to me, any more than the concept of having dependency issues around oxygen.

 
