What of the new Web-dependent phenomena: open access and open-source programming, virtual social networking, the co-construction of knowledge? All these are gains and reflect something hopeful: the collaborative effort of our joint endeavor, our willingness to share. The inclusive nature of these phenomena is encouraging. I want to join in, and I like the idea of making a modest contribution to a larger enterprise. But the new technologies let me witness their distancing and distorting influences: Internet-fueled fantasies where everyone can be a celebrity, or people can live through their avatars in virtual reality or develop alternative personalities in chat rooms—fantasies that someone, somewhere on the Internet, is making money from.
How do I cope with the speeded-up information age? The overload is overwhelming, but so is my desire to know and not to miss anything. I’m tempted to know a little bit about everything and look for predigested, concise, neatly formatted content from reliable sources. My reading habits have changed, making me aware of how important well-packaged information has become. It’s now necessary to consume thousands of abstracts from scientific journals, doing one’s own fast search for what should be read in more detail. Debates seem to be decided at the level of abstracts, repudiations signaled by the title and a hundred words. The real work, of course, goes on elsewhere, but we want the Internet to bring us the results. This leaves me knowing less and less about more and more. At the same time, I’m exhilarated by the dizzying effort to make connections and integrate information. Learning is faster—though the tendency to forge connecting themes can feel dangerously close to the search for patterns that overtakes the mentally ill. Time to slow down and engage in longer study.
The Internet shows me more and more about those who participate in it, but I worry lest I forget that not everything or everyone in the world has a home on the Internet. Missing are those who cannot read or write, who have no access to a computer, or who choose to remain disconnected. There is a danger of coming to think that what cannot be found on an Internet search doesn’t exist, and that the virtual world is the world. It isn’t. However bizarre and incredible the people populating the Internet are, they are still akin to me—people with knowledge of computers and their applications. Certainly there is diversity and hierarchy and vast domains of varied information, but nevertheless—except when Internet users turn their attention on those who are excluded, or who exclude themselves—a mirror will be held up to those who sustain the information age, and it is only this part of the world that I have come to have scattered information about.
Ephemera and Back Again
Chris DiBona
Open Source and Public Sector, Google
I often feel as though my brain is at best a creative and emotional caching front end on the Internet. With a few bare exceptions (my children, my wife, my family), I feel little practical need anymore to commit my long-term memory to endeavors I formerly spent days, weeks, months, and years on. I’ve come to think I should memorize things more for the health of my brain rather than for any real practical need to know—for example, that decimal 32 is a space in ASCII, or that the second stanza of the Major General’s song shows his acquaintance with the binomial theorem.
I hardly ever memorize phone numbers of people outside my immediate family these days, and I used to proudly tuck away nearly all of them. Now, as a result of the richness of a life connected to the Internet, I mostly retain area codes, so that I can guess who might be calling. A casualty of contact syncing, perhaps, but still I find myself considering many voice conversations or audio recordings too information-sparse to be listened to unless I’m otherwise occupied—with driving, say, or washing the dishes.
For elements of culture especially, I don’t wonder for long who was in the movie about the fall of communism with the woman in a coma. I just look it up, faster, online. I don’t spend much time considering in what techno song the dude from Star Trek says, “Time becomes a loop,” nor do I find it difficult to find, online, the name of that book I read in which the dude orbiting a neutron star for an alien race discovers its tidal effects. Nor do I have to consider what game it was that had a dog accompanying me through post-apocalyptic California. As I scroll, Pavlovian, through my feed, the waves of knowledge roll over me.
When I travel, I no longer take pictures of these outings, except ones with my family in them; there are better photos available to me online, if I feel like jogging my memory about a trip.
I don’t even especially worry about where I am, either, considering myself not unlike a packet being routed—not from client machine to router to server to backhaul to peer to machine to client machine, but instead from house to car to plane to car to hotel to car to office or conference to car to hotel to car to plane to car to home, with jet lag my only friend and my laptop my source of entertaining books (Neutron Star), movies (Good Bye Lenin!), games (Fallout), or music (Orbital, Meat to Munich), with cellular data, headphones, and circuits.
Some would equate this sort of information pruning to a kind of reinforced and embraced ignorance or evidence of an empty life. Nicholas Carr, writing in the Atlantic, enjoyed some attention in 2008 with his article titled “Is Google Making Us Stupid?” The author, reacting to (or justifying) his own reduced attention span, accuses Google (my employer) of trying to do away with deep thinking, while indulging in what comes off as an absurd nostalgia for making knowledge difficult to find and obtain.
There was an important thought worthy of exploration within that article—that there is a kind of danger in reinforcing the shallow. I have come to understand, expect, and accept that people try to find the Internet that aligns with their beliefs. This is impossible to change without strangling the creativity that makes the Internet so useful, as for every Wikipedia expanding and storing humankind’s knowledge about everything, there is a Conservapedia rewriting the Bible to be more free-market friendly.
But people who wallow in ignorance are no different online than off. I don’t believe that the Internet creates ignorant people. But what the Internet changes is the notion of unique thought. I have come to think that with more than 6.7 billion people on the planet, with more than a billion capable of expressing themselves on the Internet and hundreds of millions if not billions on the Internet via their cell phones, there is very little chance that any idea I might have outside my specialty hasn’t already been explored, if not executed. Within my specialty, even, there is a fair amount of what I’d charitably call non-unique thinking. This is not to say that the world doesn’t need practitioners. I consider myself to be a good one, but only rarely do I come up with an approach that I’d consider unique within my specialty.
At one time, I found this a rather bleak realization—thinking we’re all just conduits from urge to hand to Net to work—but over the last decade I’ve come to find it a source of comfort. Not all ideas need be mine; I can use the higher functions more for where they matter—locally, with my family, and on my work, on things I enjoy and treasure—and less on loading a browser or opening a tab into today’s ephemera.
Queries I executed while writing this article:
modern major general
Google stupid
garden paving pruning cleaving
garden paring pruning cleaving
garden paring pruning
garden paring
dense antonyms
major general’s song
define: stanza
ASCII chart
game had a dog accompanying me through post-apocalyptic California
orbiting a neutron star for an alien race finds out about tidal effects
for an alien race finds out about tidal effects
orbiting a neutron star in a ship built by aliens
dude orbiting a neutron star for an alien race with eyes in their hands
time becomes a loop
the German movie about the fall of communism with the woman in a coma
books printed each year
Conservapedia
Internet enabled cell phones
people with Internet enabled cell phones
people with Internet-enabled cellphones
planet population
What Do We Think About? Who Gets to Do the Thinking?
Evgeny Morozov
Commentator on Internet and politics, Net Effect blog; contributing editor, Foreign Policy
Since it might take decades for the Internet to rewire how our brains actually process information, we should expect that the most immediate changes will be social rather than biological in nature. Of those, two bother me in particular. One has to do with how the Internet changes what we think about; the other with who gets to do the thinking.
What I find particularly worrisome with regard to the “what” question is the rapid and inexorable disappearance of retrospection and reminiscence from our digital lives. One of the most significant but overlooked Internet developments of 2009—the arrival of the so-called real-time Web, whereby all new content is instantly indexed, read, and analyzed—is a potent reminder that our lives are increasingly lived in the present, completely detached even from the most recent past. For most brokers dealing on today’s global information exchange, the past is a “strong sell.”
In a sense, this is hardly surprising. The social beast that has taken over our digital lives has to be constantly fed with the most trivial of ephemera. And so we oblige, treating it to countless status updates and zettabytes of multimedia (almost 1,000 photos are uploaded to Facebook every second). This hunger for the present is deeply embedded in the very architecture and business models of social networking sites. Twitter and Facebook are not interested in what we were doing or thinking about five years ago; it’s what we are doing or thinking about right now that they like to know.
These sites have good reasons for such a fundamentalist preference for the present, as it greatly enhances their ability to sell our online lives to advertisers. After all, much of the time we are thinking of little else but satisfying our needs, spiritual or physical, and the sooner our needs can be articulated and matched with our respective demographic group, the more likely it is that we’ll be coerced into buying something online.
Our ability to look back and engage with the past is one unfortunate victim of such reification of thinking. Thus, amid all the recent hysteria about the demise of forgetting in the era of social networking, it’s the demise of reminiscence that I find deeply troublesome. The digital age presents us with yet another paradox: Whereas we have nearly infinite space to store our memories, as well as all the multipurpose gadgets to augment them with GPS coordinates and 360-degree panoramas, we have fewer opportunities to look back on and engage with those memories.
The bottomless reservoirs of the present have blinded us to the positive and therapeutic aspects of the past. For most of us, “reengaging with the past” today means nothing more than feeling embarrassed over something we did years ago after it has unexpectedly resurfaced on social networks. But there is much more to reminiscence than embarrassment. Studies show that there is an intricate connection between reminiscence (particularly about positive events in our lives) and happiness: the more we do of the former, the more we experience the latter. Substituting links to our Facebook profiles and Twitter updates for links to our past risks turning us into hyperactive, depressive, and easily irritated creatures who don’t know how to appreciate our own achievements.
The “who” question—who gets to do the thinking in the digital age?—is much trickier. The most obvious answer—that the Internet has democratized access to knowledge, and we are all thinkers now, bowed over our keyboards much like Rodin’s famous sculpture—is wrong. One of my greatest fears is that the Internet will widen the gap between the disengaged masses and the overengaged elites, thus thwarting our ability to collectively solve global problems (climate change and the need for smarter regulation in the financial industry come to mind) that require everyone’s immediate attention. The Internet may yield more “thinking” about such issues, but such thinking will not be equally distributed.
The Marxists have been wrong on many issues, but they were probably right about the reactionary views espoused by the lumpenproletariat. Today we are facing the emergence of a cyber-lumpenproletariat—of people who are being sucked into the digital whirlwind of gossip sites, trashy video games, populist and xenophobic blogs, and endless poking on social networking sites. The intellectual elites, on the other hand, continue thriving in the new digital environment, exploiting superb online tools for scientific research and collaboration, streaming art house films via Netflix, swapping their favorite books via e-readers, reconnecting with musical treasures of bygone eras via iTunes, and, above all, perusing materials in the giant online libraries, like the one that Google could soon unveil. The real disparities between the two groups become painfully obvious once members of the cyber-lumpenproletariat head to the polls and push for issues of an extremely dubious, if not outright unethical, nature. (The referendum on minarets in Switzerland is a case in point; the fact that Internet users on Obama’s change.gov site voted the legalization of marijuana as the most burning issue is another.)
As an aside, given the growing concerns over copyright and the digitization of national cultural heritage in many parts of the world, there is a growing risk that this intellectual cornucopia will be available only in North America, creating yet another divide. Disconnected from Google’s digital library, the most prestigious universities in Europe or Asia may look less appealing even than middling community colleges in the United States. It seems increasingly likely that the Internet will not diffuse knowledge production and thinking around the globe but further concentrate it in one place.
The Internet Is a Cultural Form
Virginia Heffernan
Columnist (“The Medium”), the New York Times
People who study the real world, including historians and scientists, may find that the reality of the Internet changes how they think. But those of us, including philosophers and literary critics, who study symbolic systems find in the Internet yet another symbolic system, albeit a humdinger, that yields—spectacularly, I must say—to our accustomed modes of inquiry.
Anyway, a new symbolic order need not disrupt Truth, wherever Truth may now be said to reside (neurons? climate change? atheism?). Certainly to those of us who read more novels than MRIs, the Internet—and especially the World Wide Web—looks like what we know: a fictional world made mostly of words.
Philosophers and critics must only be careful, as we are trained to be careful, not to mistake this highly stylized and artificial order, the Internet, for reality itself. After all, cultural vocabularies that gain currency and influence—epic poetry, the Catholic Mass, the British Empire, photography—always do so by purporting to be reality, to be transparent, to represent or circumscribe life as it really is. As an arrangement of interlocking high, pop, and folk art forms, the Internet is no different. This ought to be especially clear when what’s meant by “the Internet” is that mostly comic, intensely commercial, bourgeois space known as the World Wide Web.
We who have determinedly kept our heads while suffrage, the Holocaust, the highway system, Renaissance perspective, coeducation, the Pill, household appliances, the moon landing, the Kennedy assassination, and rock and roll were supposed to change existence forever cannot falter now. Instead of theatrically changing our thinking, this time we must keep our heads, which means—to me—that we must keep on reading and not mistake new texts for new worlds, or new forms for new brains.
Wallowing in the World of Knowledge
Peter Schwartz
Futurist, business strategist; cofounder, Global Business Network, a Monitor Company; author, Inevitable Surprises
In 1973, just as I was starting work at Stanford Research Institute (SRI), I had the good fortune to be one of the earliest users of what was then known as ARPANET. Collaborative work at a distance was the goal of the experiment that led to the suitcase-size TI Silent 700 portable terminal with an acoustic coupler and thermal printer on the back (no screen) sitting on my desk at home in Palo Alto. I was writing scenarios for the future of Washington state with the staff of Governor Dan Evans in Olympia. It was the beginning of the redistribution of my sense of identity.
In the 1980s, I was also a participant in the WELL, one of the first meaningful online communities. Nearly everyone who was part of the WELL had this sense of a very rich set of multiple perceptions constantly and instantly accessible. And (though not because the Deadheads were a large part of that community) my sense of an aware, distributed consciousness began to develop.
Finally, with the coming of the modern Internet, the World Wide Web, and the incredible explosion of knowledge access, another level of transformation took hold. I am one of those people who used to read encyclopedias and almanacs. I just wanted to know more—actually, everything. I also make my living researching, writing, speaking, and consulting. Depth, breadth, and richness of knowledge are what make it work in my passions and my profession. Before the Internet, that was limited by the boundaries of my brain. Now there is a nearly infinite pool of accessible information that becomes my knowledge in a heartbeat measured in bits per second. For those of us who wallow in the world of knowledge for pleasure and profit, the Internet has become a vast extension of our potential selves.
Is the Internet Changing the Way You Think?