The End of Absence: Reclaiming What We've Lost in a World of Constant Connection


by Michael Harris


  I’ve come to see “IMHO” as a harbinger of bullshit. IMHO usually portends a comment that is ill-conceived and born of either private prejudice or a desire to trumpet. It’s part of a public debate culture in which the “honest opinion” is worthy of publication and consumption not because of any particular merit, but because it is “honestly” the writer’s “opinion.” In his charming little book On Bullshit, the moral philosopher Harry G. Frankfurt offers a useful equation for predicting the manufacture of the manure in question:

  Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about. Thus the production of bullshit is stimulated whenever a person’s obligations or opportunities to speak about some topic exceed his knowledge of the facts that are relevant to that topic.

  By this reckoning, haven’t we created bullshit machines? In the more than one hundred million amateur travel reviews that fuel TripAdvisor, for example, isn’t it likely that our ability to speak publicly almost always exceeds our knowledge? The invitation to bullshit, anyhow, is constant.

  When I find myself drowning in bullshit—my own and that of others—I think about what it’d be like to sit outdoors at some New York City café, circa 1930, and open a copy of The New Yorker, maybe read a book review by Dorothy Parker. What must that have felt like? To draw in a few hundred words of commentary, both discernible and, yes, discerning, completely void of miscellaneous commentary? Wipe away the democratic clamor of “honest opinion” and find beneath it a single salient voice. Ms. Parker’s “honest opinions” were often briefly stated; she knew the soul of wit (“like the soul of lingerie”) was its brevity. When Parker reviewed A. A. Milne’s now-beloved The House at Pooh Corner she made short work of it: After describing Pooh and Piglet humming in the snow, she demurs, “Oh darn—there I’ve gone and given away the plot.” And nobody jabbered a response. . . . Clarion calls like Parker’s weren’t smothered by dozens of voices clouding the air with half-baked comebacks.

  The review read, the magazine folded and tossed aside, one decides to trust or not trust Parker’s opinion and leave it at that. Perhaps on rare occasions a letter is written to the editor (which might be published, if thoughtful enough), but mostly the discussion is one-way and finite. What a lovely thing, to shut up and listen and not broadcast anything back. There’s a certain serenity in it and even a kind of light grace.

  There has always been an abundance of bullshit. But never before have so many been implicated in the bullshit rigmarole that is public conversation. Before, most of us could point at the bullshitters among us (the politicians and hotheaded pundits) and shake our heads. Today, no such finger-pointing can be allowed because we all swim in the mess. When the armchair philosophers are handed megaphones and the crowd of “honest opinion” starts to overwhelm the trained few, will we begin to think that bullshitting is the only and the natural way to make a call? Or will we engineer opinion vacuums, weed out the bullshit, and separate what is best from what is customary?

  CHAPTER 5

  Authenticity

  But isn’t everything here green?

  —Dorothy, in L. Frank Baum’s The Wonderful Wizard of Oz

  Andrew Ng holds a position in the Computer Science Department at Stanford University, where he regularly lectures, year after year, to classrooms of roughly four hundred bright and privileged students. Back in 2008, a video project he launched called Stanford Engineering Everywhere (SEE) let him broadcast base approximations of those classes online. Ng simply stuck a camera at the back of the lecture hall and posted the whole thing on his site, like the world’s most boring bootlegged movie. Yet the response—excited viewers kept chatting him up at his Silicon Valley Starbucks—got Ng thinking. There was an appetite for world-class education among those without the means or wherewithal to attend an institution like Stanford. How far could that hunger be satisfied? Could the Internet, like other communication advances, allow us (even compel us) to redistribute monopolies of knowledge? Doesn’t all information in fact want to be free?

  Over the next few years, Ng worked out of his living room, developing much of the technology and theory that’s used today in “massive open online courses” (MOOCs). Ng was driven by a single question: How can we develop a course that scales to arbitrarily large numbers of students? The answer came in the form of autograded quizzes, discussion forums, a more dynamic lecture recording style, and the startling proposal that peer grading could be as effective as grading from a single authority (if every student grades, and is graded by, five others). In the summer of 2011, Ng set up a new course online, and one hundred thousand students signed up. He did the math in his head: I’ll need to teach a class at Stanford for 250 years to reach that many people.

  The MOOC revolution had begun. On April 18, 2012, Ng announced (along with Daphne Koller) the online learning platform Coursera.org. And Ng’s assumptions about that hidden appetite for higher learning proved correct. Latest numbers show Coursera hosts more than five million students who collectively enroll in 532 courses offered by 107 institutions around the globe, including Princeton and Yale.

  The advantages of MOOCs are many and clear. Online videos of lectures are massively abbreviated, so an hourlong lecture might become a five-minute video focusing on single action-minded outcomes. Instead of showing a lecturer pacing back and forth in front of bored students, Ng now overlays talk onto visuals that show graphics and handwritten key points—“just the content,” as Ng has it. “We also use video editing to cut out the boring bits,” he told me. “Some things, like a professor writing for ages on a board, you just don’t need to see.”

  And then there’s the data. The piles and piles of data. Coursera doesn’t just educate you, it learns from you, too. Every keystroke, every mouse click, is logged in Coursera’s rapidly expanding data banks. When a student pauses a video, Coursera takes note; when a student needs more time than usual to answer a question, Coursera logs that, too; it knows when you skip a video, what questions you asked of peers, and what time of day you read the answer. Over its first year or so, Ng told me, “Coursera collected more educational data than the entire field of education has collected in its five-thousand-year history.”

  To what end? Consider a single programming assignment that Ng devised for one of his MOOCs. Thousands of students submitted a wrong answer—but what struck Ng was that Coursera could scan its data and reveal that two thousand students had made exactly the same mistake in their work. “I was able to create a custom correction message, which would pop up when people had the usual misconception. In a normal class of one hundred students, you won’t notice these patterns. So, ironically, in order to achieve this level of personalization, what we needed was to teach a class to one hundred thousand people.” (I take his point, though I’m not sure that my definition of personalization is the same as Ng’s.) The hope, here, is that mass data analysis will allow Coursera, and other MOOC providers, to create software that personalizes education in much the same way that Netflix, Google, and Amazon already personalize your experience of movie watching, searching, and shopping. Imagine taking a class on twentieth-century literature and receiving helpful advice that “other learners like you” have found useful. The process of intellectual exploration, once highly idiosyncratic, becomes an opportunity to promote whatever material has the highest view count. “Until now,” Ng told me, “education has been a largely anecdotal science, and we can now make it data-driven.” This reminded me, of course, of Karthik Dinakar, eager to “harden” the soft sciences of psychology and psychiatry with reams of crowdsourced data.

  The crowdsourcing of education is further highlighted by Ng’s interest in Wiki lecture notes. “At Stanford,” he explained to me, “I taught a class for a decade, and writing the lecture notes would take forever. And then, every year, students would find more bugs, more errors, in my notes. But for online classes, I put up a Wiki and invite students to write their own lecture notes; students watch my lectures and create the notes themselves. When you have one hundred thousand students reading and editing the same Wiki lecture notes, the result is a higher quality of text than I could create on my own. Bugs are rapidly squashed.” I ask whether the same principle that works for his engineering classes would work for classes on art history or creative writing. Ng pauses for a beat before replying: “I haven’t seen any evidence that would suggest otherwise.”

  Nevertheless, MOOCs and the attendant dematerialization of the education process are creating a certain crisis of authenticity. A large Pew Research Center survey found that most people believe we’ll see a mass adoption of “distance learning” by 2020, and many are wondering whether that will brush aside the sun-dappled campuses, shared coffees, and lawn lolling that pre-Internet students considered so essential to their twenty-something lives.

  There are also more concrete points to consider. Graduation rates, for starters: Another MOOC godfather at Stanford, Sebastian Thrun (of Udacity), was tantalized for a while by the possibility of bringing Ivy League education to the world’s unfortunates, but he later announced in Fast Company magazine that less than 10 percent of his MOOC students were actually completing courses. He had become deeply dissatisfied with the MOOC world he had helped to bring about: “We don’t educate people as others wished, or as I wished,” he said. “We have a lousy product.” After signing up nearly two million students for online courses, Thrun despaired at the dismal completion rates; and only about half of those who did complete courses appeared to be learning the material in a meaningful way.

  Ng remains hopeful. “I think a lot of content can and will go online,” he told me. “The economics certainly work out better that way. But I don’t see us replicating the crucial mentor experience, the small-group experience. What I’d like to do is automate routine activities like grading papers and lectures, to free up the professor’s time for real conversations. The role of the traditional university is going to be transformed.” Meanwhile, the nonprofit enterprise edX announced in 2013 an artificial intelligence program that instantly grades essays and written answers, eliminating the need for a professor’s comments.

  Ng himself often compares the digital revolution with the original Gutenberg moment, so it’s natural that he would assume a digital enlightenment is about to follow. “I think we can change the world one day,” he says matter-of-factly. “If a poor child in a developing country takes a dozen courses in computer sciences that he didn’t have access to before, and then can earn a decent wage, I think in that way we can change the world.” Who would deny such an enlightenment? But it may be worth noting here that most Coursera students are not from developing countries. At present, Africa makes up 3.6 percent of the students, while more than a third come from North America and a further third hail from Europe.

  Neil Postman, the pioneering technology critic, argues in Technopoly that “school was an invention of the printing press and must stand or fall on the issue of how much importance the printed word has.” By this measure, Coursera and its ilk are a kind of necessity, a rearrangement of education that’s inevitable as our means of communication changes. “For four hundred years schoolteachers have been part of the knowledge monopoly created by printing,” continues Postman, “and they are now witnessing the breakup of that monopoly.” In the days of Thamus (see chapter 2), the written word was a kind of inauthentic knowledge, and then it became the only true form of knowledge. Is it so unlikely that we’re undergoing a similar reevaluation today?

  The new knowledge monopoly will feel comparatively abstract, if history is any guide. Advances in cartography, for example, delivered maps that substituted an abstract space for firsthand experiences with natural landscapes; the mechanical clock parsed leisurely “natural time” into regimented sections so that the gong of a church bell had more sway over your comings and goings than your body’s own inclinations. Arguably, the larger and more productive world that our technologies deliver is simultaneously an impoverished version of the older one—a version that rejects direct experience and therefore rejects an earlier conception of reality that had its own value. We see more, yet our vision is blurred; we feel more things, yet we are numbed. Marshall McLuhan argues that whenever we amplify some part of our experience with a given technology, we necessarily distance ourselves from it, too. (A friend of mine saw those airplanes crash into the World Trade Center while sitting in her living room on the other side of the continent—and thought, against her will, of a movie.)

  • • • • •

  Some lens has been shuttered over our vision. We all have felt it. Even as we draw more of the world into our lives, gaining access to people and events we never had access to before, we feel that the things we gather lose some veracity in transit. It happens to me constantly. At my brother’s wedding, a hundred of us gathered in my parents’ backyard, beneath the glow of trailing paper lanterns strung throughout the trees and white tents. I remember breaking away from the festivities to check my phone, only to find that my friend was posting photos of the very wedding I’d stepped away from: pixelated simulacra of the moment I had left.

  The most obvious reason a person would ditch the authentic is, of course, to gain access to a heightened version of dull reality. Enter the promise and wonder of Google Glass, released in 2013, which offers just that—augmented reality. The “wearable computer” is a (slightly futuristic, slightly dorky) headset fixed with a miniature display and camera, which responds to voice commands. We can tell it to take a picture of what we’re looking at or simply pull up Google Images’ archive of vintage Hulk Hogan photos because we want to compare the hairdo being sported by that guy on the metro. The company’s welcoming Web site smiles: “Welcome to a world through glass.” Welcome to augmented (read: inauthentic) reality.

  Remember that the Emerald City in The Wonderful Wizard of Oz isn’t actually emerald. In the Hollywood film version, yes, Judy Garland and her gang traipse through a gorgeous, sparkling town. But in L. Frank Baum’s original book, Dorothy and the others are exhorted to put on “safety goggles” to protect their eyes. “If you do not,” they are warned, “the brightness and glory of the Emerald City would blind you.” Only much later do they discover that it was the green-tinted goggles all along that gave the city its apparent luster. The Emerald City (like the “wizard” behind the curtain) is a fake. “But isn’t everything here green?” asks Dorothy. “No more than in any other city,” replies Oz. “But my people have worn green glasses on their eyes so long that most of them think it really is an Emerald City.”

  When we wear emerald glasses with the intention of augmenting reality, we’re always giving ourselves over to some authority’s vision and relinquishing a portion of our own independent sight.

  All our screen time, our digital indulgence, may well be wreaking havoc on our conception of the authentic—how could it not? But, paradoxically, it’s the impulse to hold more of the world in our arms that leaves us holding more of reality at arm’s length. Coursera.org delivers the world’s great teachers to your living room but turns education into a screen interface; a child’s cell phone keeps her in constant touch with her friends but trains her to think of text messaging as a soulful communication.

  When Walter Benjamin meditated on the advent of mechanical reproduction in 1936, he was already wondering at the uncanny changes that take place when “a cathedral quits its site to find a welcome in the studio of an art lover” or “a choral work performed in a hall or in the open air can be heard in a room.” When Benjamin went to the movies—which were now, amazingly, delivering talking images on the screen—he saw that they turned rare beauties into easily accessible experiences, all the while degrading the “aura” of that which they projected, their “genuineness.” He wrote: “The genuineness of a thing is the quintessence of everything about its creation that can be handed down, from its material duration to the historical witness that it bears.” What a strange concern, we might think—historical witness. It’s that antique notion of actually being there, of becoming richer by being one of the few people or things to have lived in a singular moment, a singular place. Benjamin even worried about the actors he saw on movie screens, noting that “for the first time . . . a person is placed in the position, while operating with his whole being, of having to dispense with the aura that goes with it. For that aura is bound to his here and now; it has no replica.” It’s a worry, a sensibility, that’s nearly demolished by YouTube and its ilk; we aren’t trained to care about the genuineness of things when digital copies give a zombie-scale crowding of content. This outdated concern for genuineness—for aura—requires absence, that one thing we have in short supply. The endgame is this: Without absence in our lives, we risk fooling ourselves into believing that things (a message from a lover, the performance of a song, the face of a human body) matter less. De Beers hoards its diamonds to invent a scarcity that equals preciousness. Perhaps we now need to engineer scarcity in our communications, in our interactions, and in the things we consume. Otherwise our lives become like a Morse code transmission that’s lacking breaks—a swarm of noise blanketing the valuable data beneath.

  • • • • •

  I often feel as though I’m living through a moment of authenticity wobble. Depending on the person I’m talking to—a youth or a senior citizen—my sense of what’s authentic, what’s real, flips back and forth. My own perception of the authentic is caught in the sloshing middle. Perhaps that means I’m less authentic than those who came before and those who came after. I dispute both origins. For my peers and me, this confusion is all around us, an ambient fog, though we don’t often name it. We look up symptoms on Mayoclinic.org but indulge in “natural” medicine; we refuse to obey any church’s laws, yet we want to hold on to some idea of spirituality and read our Eckhart Tolle; we hunch plaintively over our cell phones for much of the year and then barrel into the desert for a week of bacchanalian ecstasy at the Burning Man festival. One friend of mine, who is more addicted to his phone than most, visits a secret traveling sauna once a month located inside a specially outfitted van. He gets naked with a bunch of like-minded men and women and chats about life inside the van’s superheated cabin; then he tugs his clothes back on and reenters his digital life. I’m at the point where I won’t call one experience authentic and the other inauthentic. We are learning to embrace both worlds as real, learning to accept the origin and aura of things that rain down mysteriously from the clouds.

 
