Is the Internet Changing the Way You Think?


by John Brockman


  Like many others, I suspect that the Internet has made my experience more fragmented, splintered, and discontinuous. But I’d argue that that’s not because of the Internet itself but because I have mastered the Internet as an adult. Why don’t we feel the same way about reading and schooling that we feel about the Web? Those changes in the way we get information have had a pervasive and transformative effect on human cognition and thought, and universal literacy and education have been around for only a hundred years or so.

  It’s because human change takes place across generations rather than within a single life. This is built into the very nature of the developing mind and brain. All the authors of these essays have learned how to use the Web with brains that were fully developed long before we sent our first e-mail. All of us learned to read with the open and flexible brains we had when we were children. As a result, no one living now will experience the digital world in the spontaneous and unself-conscious way that the children of 2010 will experience it, or in the spontaneous and unself-conscious way we experience print.

  There is a profound difference between the way children and adults learn. Young brains are capable of much more extensive change—more rewiring—than the brains of adults. This difference between old brains and young ones is the engine of technological and cultural innovation. Human adults, more than any other animal, reshape the world around them. But adults innovate slowly, intentionally, and consciously. The changes that take place within an adult life, like the development of the Internet, are disruptive, attention-getting, disturbing, or exciting. But those changes become second nature to the next generation of children. Those young brains painlessly absorb the world their parents created, and that world takes on a glow of timelessness and eternity, even if it was created only the day before you were born.

  My experience of the Web feels fragmented, discontinuous, and effortful (and interesting) because, for adults, learning a new technology depends on conscious, attentive, intentional processing. In adults, this kind of conscious attention is a limited resource. This is true even at the neural level. When we pay attention to something, the prefrontal cortex—the part of our brain responsible for conscious, goal-directed planning—controls the release of cholinergic transmitters, chemicals that help us learn, to certain very specific parts of the brain. So as we wrestle with a new technology, we adults can change our minds only a little bit at a time.

  Attention and learning work very differently in young brains. Young animals have much more widespread cholinergic transmitters than adults do, and their ability to learn doesn’t depend on planned, deliberate attention. Young brains are designed to learn from everything new or surprising or information-rich, even when it isn’t particularly relevant or useful.

  So children who grow up with the Web will master it in a way that will feel as whole and natural as reading feels to us. But that doesn’t mean their experience and attention won’t be changed by the Internet, any more than my print-soaked twentieth-century life was the same as the life of a barely literate nineteenth-century farmer.

  The special attentional strategies we require for literacy and schooling may feel natural because they are so pervasive, and because we learned them at such an early age. But at different times and places, different ways of deploying attention have been equally valuable and felt equally natural. Children in Mayan Indian cultures, for example, are taught to distribute their attention to several events simultaneously, just as print and school teach us to focus on only one thing at a time. I’ll never be able to deploy the broad yet vigilant attention of a hunter-gatherer—though, luckily, a childhood full of practice caregiving let me master the equally ancient art of attending to work and babies at the same time.

  Perhaps our digital grandchildren will view a master reader with the same nostalgic awe we now accord to a master hunter or an even more masterly mother of six. The skills of the hyperliterate twentieth century may well disappear—or, at least, become highly specialized enthusiasms, like the once universal skills of hunting, poetry, and dance. It is sad that after the intimacy of infancy our children inevitably end up being somewhat weird and incomprehensible visitors from the technological future. But the hopeful thought is that my grandchildren will not have the fragmented, distracted, alienated digital experience that I do. To them, the Internet will feel as fundamental, as rooted, as timeless, as a battered Penguin paperback, that apex of the literate civilization of the last century, feels to me.

  “Go Native”

  Howard Gardner

  Psychologist, Harvard University; author, Changing Minds

  The Internet has changed my life greatly, but not in a way that I could have anticipated, nor in the way that the question implies. Put succinctly, just as if a newly discovered preliterate tribe had challenged my beliefs about human language and human culture, the Internet has altered my views of human development and human potential.

  Several years ago, I had a chance conversation with Jonathan Fanton, then president of the MacArthur Foundation. He mentioned that the foundation was sponsoring, to the tune of $50 million, a major study of how young people are being changed by the new digital media, such as the Internet. At the time, as part of the GoodWork research project, I was involved in studies of ethics, focusing particularly on the ethical orientation of young people. So I asked Fanton, “Are you looking at the ways in which the ethics of youth may be affected?” He told me that the foundation had not thought about this issue. After several conversations and a grant application, my colleagues and I launched the GoodPlay project, a social science study of ethics in the digital media.

  Even though I myself am a digital immigrant—I sometimes refer to myself as a digital paleolith—I now spend many hours a week thinking about the ways in which nearly all of us, young and old, are affected by being online, networked, and surfing or posting for so much of the day. I’ve become convinced that the “digital revolution” may be as epoch-making as the invention of writing or, certainly, the invention of printing or of broadcasting. While I agree with those who caution that it is premature to detail the effects, it is not too early to begin to think, observe, reflect, or conduct pivotal observations and experiments. Indeed, I wish that social scientists and/or other observers had been around when earlier new media debuted.

  Asked for my current thinking, I would make the following points. The lives and minds of young people are far more fragmented than at earlier times. This multiplicity of connections, networks, avatars, messages, may not bother them but certainly makes for identities that are more fluid and less stable. Times for reflection, introspection, solitude, are scarce. Long-standing views of privacy and ownership/authorship are being rapidly undermined. Probably most dramatically, what it has meant for millennia to belong to a community is being totally renegotiated as a result of instant, 24/7 access to anyone connected to the Internet. How this will affect intimacy, imagination, democracy, social action, citizenship, and other staples of humankind is up for grabs.

  For older people (even older than I am), the digital world is mysterious. Those of us who are middle-aged or beyond continue to live in two worlds—the predigital and the digital—and we may either be nostalgic for the days without BlackBerrys or relieved that we no longer have to trudge off to the library. But all persons who want to understand their children or their grandchildren must make the effort to “go native”—and at such times we digital immigrants or digital paleoliths can feel as fragmented, as uncertain about privacy, as pulled by membership in diverse and perhaps incommensurate communities, as any fifteen-year-old.

  The Maximization of Neoteny

  Jaron Lanier

  Musician, computer scientist; pioneer of virtual reality; author, You Are Not a Gadget: A Manifesto

  The Internet, as it evolved up to about the turn of the century, was a great relief and comfort to me and influenced my thinking positively in a multitude of ways. There were the long-anticipated quotidian delights of speedy information access and transfer, but also the far more important optimism born from seeing so many people decide to create Web pages and become expressive—proof that the late twentieth century’s passive society on the couch in front of the TV was only a passing bad dream.

  In the last decade, however, the Internet has taken on unpleasant qualities and has become gripped by reality-denying ideology.

  The current mainstream dominant culture of the Internet is the descendant of what used to be the radical culture of the early Internet. The ideas, unfortunately, are motivated to a significant degree by a denial of the biological nature of personhood. The new true believers attempt to conceive of themselves as becoming ever more like abstract, immortal, information machines instead of messy, mortal, embodied creatures. This is just another approach to an ancient folly—the psychological denial of aging and dying. To be a biological realist today is to hold a minority opinion in an age of profound, overbearing, technologically enriched groupthink.

  When I was in my twenties, my friends and I were motivated by the eternal frustration of young people—that they are not immediately all made rulers of the world. It seemed supremely annoying to my musician friends, for instance, that the biggest stars—like Michael Jackson—would get millions of dollars in advance for an album, while an obscure minor artist like me would get only a $100,000 advance (and this was in early 1990s dollars).

  So what to do? Kill the whole damned system! Make music free to share, and demand that everyone build reputations on a genuine all-to-all network instead of a broadcast network, so that it would be fair. Then we’d all go out and perform to make money, and the best musician would win.

  The lecture circuit was particularly good to me as a live performer. My lecture career was probably one of the first of its kind driven mostly by my online presence. (In the old days, my crappy Website got enough traffic to merit coverage by mainstream media such as the New York Times.) Money seemed available on tap.

  A sweet way to run a culture back then, but in the bigger picture it’s been a disaster. Only a tiny, token number of musicians (if any) do as well within the new online utopia as even I used to do in the old world, and I wasn’t particularly successful. All the musicians I have been able to communicate with about their true situation, including a lot of extremely famous ones, have suffered after the vandalism of my generation, and the reason isn’t abstract but biological.

  What we denied was that we were human and mortal, that we might someday have wanted children, even though it seemed inconceivable at the time. In the human species, neoteny, the extremely slow fading of juvenile characteristics, has made child rearing into a draining, long-term commitment.

  That is the reality. We were all pissed at our parents for not coming through in some way or other, but evolution has extended the demands of human parenting to the point that it is impossible for parents to come through well enough, ever. Every child must be disappointed to some degree because of neoteny, but economic and social systems can be designed to minimize the frustration. Unfortunately the Internet, as it has come to be, maximizes it.

  The way that neoteny relates to the degradation of the Internet is that as a parent, you really can’t run around playing live gigs all the time. The only way for a creative person to live with what we can call dignity is to have some system of intellectual property to provide sustenance while you’re out of your mind with fatigue after a rough night with a sick kid.

  Or spouses might be called on to give up their own aspirations for a career—but there was this other movement, called feminism, happening at the same time, which made that arrangement less common.

  Or there might be a greater degree of socialism, to buffer biological challenges, but there was an intense libertarian tilt coincident with the rise of the Internet in the United States. All the options have been ruled out, and the result is a disjunction between true adulthood and the creative life.

  The Internet, in its current fashionable role as an aggregator of people through social networking software, values humans only in real time and in a specific physical place—that is, usually away from their children. The human expressions that used to occupy the golden pyramidion of Maslow’s pyramid are treated as worthless in themselves.

  But dignity is the opposite of real time. Dignity means, in part, that you don’t have to wonder if you’ll successfully sing for your supper for every meal. Dignity ought to be something one can earn. I have focused on parenting here because it is what I am experiencing now, but the principle becomes even more important as people become ill, and then even more as people age. So for these reasons and many others, the current fashionable design of the Internet, dominated by so-called social networking, has an antihuman quality. But very few people I know share my current perspective.

  Dignity might also mean being able to resist the near consensus of your peer group.

  Wisdom of the Crowd

  Keith Devlin

  Executive director, H-STAR Institute, Stanford University; author, The Unfinished Game: Pascal, Fermat, and the Seventeenth-Century Letter That Made the World Modern

  In this year’s Edge question, the key phrase is surely “the way you think,” and the key word therein is “think.”

  No one can contribute to an online discussion forum like this without thereby demonstrating that the Internet has changed, and continues to change, the way we work. The Internet also changes the way we make decisions. I now choose my flights on the basis of a lot more information than any one air carrier would like me to have (except perhaps for Southwest, which currently benefits from the Internet decision process), and I select hotels based on reviews by other customers, which I temper by a judgment based (somewhat dubiously, I admit) on their use of language as to whether they are sufficiently “like me” for their views to be relevant to me.

  But is that really a change in the way I think? I don’t think so. In fact, we Edge contributors are probably a highly atypical societal grouping to answer this question, since we have all been trained over many years to think in certain analytic ways. In particular, we habitually begin by gathering information, questioning that information and our assumptions, looking at (some) alternatives, and basing our conclusions on the evidence before us.

  We are also used to having our conclusions held up to public scrutiny by our peers. Which, of course, is why it is rare—though intriguingly (and I think all to the good) not totally impossible—to find trained scientists who believe in biblical creationism or who doubt that global warming is a real and dangerous phenomenon.

  When I reflect on how I go about my intellectual work these days, I see that the Internet has changed it dramatically, but what has changed is the execution process (and hence, on some occasions, the conclusions I reach or the way I present them), not the underlying thinking process.

  I would hope for humanity’s future that the same is true for all my fellow highly trained specialists. The scientific method for reaching conclusions has served us well for many generations, leading to a length and quality of life for most of us that was beyond the imagination of our ancestors. If that way of thinking were to be replaced by a blind “wisdom of the crowd” approach, which the Internet offers, then we would likely be in for real trouble. For wisdom of the crowd—like its best-known exemplar, Google search—gives you the mostly best answer most of the time.

  As a result of those two mostlys, using the wisdom of the crowd without questioning it, though fine for booking flights or selecting hotels, can be potentially dangerous, even when restricted to experts. To give one example, not many decades ago, the wisdom of the crowd among the scientific community told us that plate tectonics was nonsense. Now it is the accepted theory.

  The good thing about the analytic method, of course, is that once there was sufficient evidence in support of plate tectonics, the scientific community switched from virtual dismissal to near total acceptance.

  That example alone explains why I think it is good that a few well-informed (this condition is important) individuals question both global warming and evolution by natural selection. Our conclusions need to be constantly questioned. I remain open to having my mind changed on either. But to make that change, I require convincing evidence, which is so far totally lacking. In the meantime, I will continue to accept both theories.

  The real Edge question, for me, is one that is only implied by the question as stated: Does the Internet change the way of thinking for those people born in the Internet age—the so-called digital natives? Only time can really answer that.

  Living organisms adapt, and the brain is a highly plastic organ, so it strikes me as not impossible that the answer to this modified question may be yes. On the other hand, recent research by my Stanford colleague Cliff Nass (and others) suggests that there are limitations to the degree to which the digital environment can change our thinking.

  An even more intriguing question is whether the Internet is leading society as a whole (at least, those who are on the Net) to constitute an emergent global thinking. By most practical definitions of “thinking” I can come up with—distinguishing it from emotions and self-reflective consciousness—the answer seems to be yes. And that development will surely change our future in ways we can only begin to imagine.

  Weirdness of the Crowd

  Robert Sapolsky

  Neuroscientist, Stanford University; author, Monkeyluv: And Other Essays on Our Lives as Animals

  I should start by saying that I’m not really one to ask about such things, as I am an extremely unsophisticated user of the Internet. I’ve never sold anything on eBay, bought anything from Amazon, or posted something on YouTube. I don’t have an avatar on Second Life and I’ve never “met” anyone online. And I’ve never successfully defrauded the wealthy widow of a Nigerian dictator. So I’m not much of an expert on this.

 
