The Shallows

by Nicholas Carr


  The debate between determinists and instrumentalists is an illuminating one. Both sides command strong arguments. If you look at a particular technology at a particular point in time, it certainly appears that, as the instrumentalists claim, our tools are firmly under our control. Every day, each of us makes conscious decisions about which tools we use and how we use them. Societies, too, make deliberate choices about how they deploy different technologies. The Japanese, looking to preserve the traditional samurai culture, effectively banned the use of firearms in their country for two centuries. Some religious communities, such as the Old Order Amish fellowships in North America, shun motor cars and other modern technologies. All countries put legal or other restrictions on the use of certain tools.

  But if you take a broader historical or social view, the claims of the determinists gain credibility. Although individuals and communities may make very different decisions about which tools they use, that doesn’t mean that as a species we’ve had much control over the path or pace of technological progress. It strains belief to argue that we “chose” to use maps and clocks (as if we might have chosen not to). It’s even harder to accept that we “chose” the myriad side effects of those technologies, many of which, as we’ve seen, were entirely unanticipated when the technologies came into use. “If the experience of modern society shows us anything,” observes the political scientist Langdon Winner, “it is that technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning.”13 Though we’re rarely conscious of the fact, many of the routines of our lives follow paths laid down by technologies that came into use long before we were born. It’s an overstatement to say that technology progresses autonomously—our adoption and use of tools are heavily influenced by economic, political, and demographic considerations—but it isn’t an overstatement to say that progress has its own logic, which is not always consistent with the intentions or wishes of the toolmakers and tool users. Sometimes our tools do what we tell them to. Other times, we adapt ourselves to our tools’ requirements.

  The conflict between the determinists and the instrumentalists will never be resolved. It involves, after all, two radically different views of the nature and destiny of humankind. The debate is as much about faith as it is about reason. But there is one thing that determinists and instrumentalists can agree on: technological advances often mark turning points in history. New tools for hunting and farming brought changes in patterns of population growth, settlement, and labor. New modes of transport led to expansions and realignments of trade and commerce. New weaponry altered the balance of power between states. Other breakthroughs, in fields as various as medicine, metallurgy, and magnetism, changed the way people live in innumerable ways—and continue to do so today. In large measure, civilization has assumed its current form as a result of the technologies people have come to use.

  What’s been harder to discern is the influence of technologies, particularly intellectual technologies, on the functioning of people’s brains. We can see the products of thought—works of art, scientific discoveries, symbols preserved on documents—but not the thought itself. There are plenty of fossilized bodies, but there are no fossilized minds. “Gladly would I unfold in calm degrees a natural history of the intellect,” wrote Emerson in 1841, “but what man has yet been able to mark the steps and boundaries of that transparent essence?”14

  Today, at last, the mists that have obscured the interplay between technology and the mind are beginning to lift. The recent discoveries about neuroplasticity make the essence of the intellect more visible, its steps and boundaries easier to mark. They tell us that the tools man has used to support or extend his nervous system—all those technologies that through history have influenced how we find, store, and interpret information, how we direct our attention and engage our senses, how we remember and how we forget—have shaped the physical structure and workings of the human mind. Their use has strengthened some neural circuits and weakened others, reinforced certain mental traits while leaving others to fade away. Neuroplasticity provides the missing link to our understanding of how informational media and other intellectual technologies have exerted their influence over the development of civilization and helped to guide, at a biological level, the history of human consciousness.

  We know that the basic form of the human brain hasn’t changed much in the last forty thousand years.15 Evolution at the genetic level proceeds with exquisite slowness, at least when gauged by man’s conception of time. But we also know that the ways human beings think and act have changed almost beyond recognition through those millennia. As H. G. Wells observed of mankind in his 1938 book World Brain, “His social life, his habits, have changed completely, have even undergone reversion and reversal, while his heredity seems to have changed very little if at all, since the late Stone Age.”16 Our new knowledge of neuroplasticity untangles this conundrum. Between the intellectual and behavioral guardrails set by our genetic code, the road is wide, and we hold the steering wheel. Through what we do and how we do it—moment by moment, day by day, consciously or unconsciously—we alter the chemical flows in our synapses and change our brains. And when we hand down our habits of thought to our children, through the examples we set, the schooling we provide, and the media we use, we hand down as well the modifications in the structure of our brains.

  Although the workings of our gray matter still lie beyond the reach of archaeologists’ tools, we now know not only that it is probable that the use of intellectual technologies shaped and reshaped the circuitry in our heads, but that it had to be so. Any repeated experience influences our synapses; the changes wrought by the recurring use of tools that extend or supplement our nervous systems should be particularly pronounced. And even though we can’t document, at a physical level, the changes in thinking that happened in the distant past, we can use proxies in the present. We see, for example, direct evidence of the ongoing process of mental regeneration and degeneration in the brain changes that occur when a blind person learns to read Braille. Braille, after all, is a technology, an informational medium.

  Knowing what we do about London cabbies, we can posit that as people became more dependent on maps, rather than their own memories, in navigating their surroundings, they almost certainly experienced both anatomical and functional changes in the hippocampus and other brain areas involved in spatial modeling and memory. The circuitry devoted to maintaining representations of space likely shrank, while areas employed in deciphering complex and abstract visual information likely expanded or strengthened. We also now know that the changes in the brain spurred by map use could be deployed for other purposes, which helps explain how abstract thinking in general could be promoted by the spread of the cartographer’s craft.

  The process of our mental and social adaptation to new intellectual technologies is reflected in, and reinforced by, the changing metaphors we use to portray and explain the workings of nature. Once maps had become common, people began to picture all sorts of natural and social relationships as cartographic, as a set of fixed, bounded arrangements in real or figurative space. We began to “map” our lives, our social spheres, even our ideas. Under the sway of the mechanical clock, people began thinking of their brains and their bodies—of the entire universe, in fact—as operating “like clockwork.” In the clock’s tightly interconnected gears, turning in accord with the laws of physics and forming a long and traceable chain of cause and effect, we found a mechanistic metaphor that seemed to explain the workings of all things, as well as the relations between them. God became the Great Clockmaker. His creation was no longer a mystery to be accepted. It was a puzzle to be worked out. Wrote Descartes in 1646, “Doubtless when the swallows come in spring, they operate like clocks.”17

  THE MAP AND clock changed language indirectly, by suggesting new metaphors to describe natural phenomena. Other intellectual technologies change language more directly, and more deeply, by actually altering the way we speak and listen or read and write. They might enlarge or compress our vocabulary, modify the norms of diction or word order, or encourage either simpler or more complex syntax. Because language is, for human beings, the primary vessel of conscious thought, particularly higher forms of thought, the technologies that restructure language tend to exert the strongest influence over our intellectual lives. As the classical scholar Walter J. Ong put it, “Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word.”18 The history of language is also a history of the mind.

  Language itself is not a technology. It’s native to our species. Our brains and bodies have evolved to speak and to hear words. A child learns to talk without instruction, as a fledgling bird learns to fly. Because reading and writing have become so central to our identity and culture, it’s easy to assume that they, too, are innate talents. But they’re not. Reading and writing are unnatural acts, made possible by the purposeful development of the alphabet and many other technologies. Our minds have to be taught how to translate the symbolic characters we see into the language we understand. Reading and writing require schooling and practice, the deliberate shaping of the brain.

  Evidence of this shaping process can be seen in many neurological studies. Experiments have revealed that the brains of the literate differ from the brains of the illiterate in many ways—not only in how they understand language but in how they process visual signals, how they reason, and how they form memories. “Learning how to read,” reports the Mexican psychologist Feggy Ostrosky-Solís, has been shown to “powerfully shape adult neuropsychological systems.”19 Brain scans have also revealed that people whose written language uses logographic symbols, like the Chinese, develop a mental circuitry for reading that is considerably different from the circuitry found in people whose written language employs a phonetic alphabet. As Tufts University developmental psychologist Maryanne Wolf explains in her book on the neuroscience of reading, Proust and the Squid, “Although all reading makes use of some portions of the frontal and temporal lobes for planning and for analyzing sounds and meanings in words, logographic systems appear to activate very distinctive parts of [those] areas, particularly regions involved in motoric memory skills.”20 Differences in brain activity have even been documented among readers of different alphabetic languages. Readers of English, for instance, have been found to draw more heavily on areas of the brain associated with deciphering visual shapes than do readers of Italian. The difference stems, it’s believed, from the fact that English words often look very different from the way they sound, whereas in Italian words tend to be spelled exactly as they’re spoken.21

  The earliest examples of reading and writing date back many thousands of years. As long ago as 8000 BC, people were using small clay tokens engraved with simple symbols to keep track of quantities of livestock and other goods. Interpreting even such rudimentary markings required the development of extensive new neural pathways in people’s brains, connecting the visual cortex with nearby sense-making areas of the brain. Modern studies show that the neural activity along these pathways doubles or triples when we look at meaningful symbols as opposed to meaningless doodles. As Wolf describes, “Our ancestors could read tokens because their brains were able to connect their basic visual regions to adjacent regions dedicated to more sophisticated visual and conceptual processing.”22 Those connections, which people bequeathed to their children when they taught them to use the tokens, formed the basic wiring for reading.

  The technology of writing took an important step forward around the end of the fourth millennium BC. It was then that the Sumerians, living between the Tigris and Euphrates rivers in what is now Iraq, began writing with a system of wedge-shaped symbols, called cuneiform, while a few hundred miles to the west the Egyptians developed increasingly abstract hieroglyphs to represent objects and ideas. Because the cuneiform and hieroglyphic systems incorporated many logosyllabic characters, denoting not just things but also speech sounds, they placed far greater demands on the brain than did the simple accounting tokens. Before readers could interpret the meaning of a character, they had to analyze the character to figure out how it was being used. The Sumerians and the Egyptians had to develop neural circuits that, according to Wolf, literally “crisscrossed” the cortex, linking areas involved not only in seeing and sense-making but in hearing, spatial analysis, and decision making.23 As these logosyllabic systems expanded to include many hundreds of characters, memorizing and interpreting them became so mentally taxing that their use was probably restricted to an intellectual elite blessed with a lot of time and brain power. For writing technology to progress beyond the Sumerian and Egyptian models, for it to become a tool used by the many rather than the few, it had to get a whole lot simpler.

  That didn’t happen until fairly recently—around 750 BC—when the Greeks invented the first complete phonetic alphabet. The Greek alphabet had many forerunners, particularly the system of letters developed by the Phoenicians a few centuries earlier, but linguists generally agree that it was the first to include characters representing vowel sounds as well as consonant sounds. The Greeks analyzed all the sounds, or phonemes, used in spoken language, and were able to represent them with just twenty-four characters, making their alphabet a comprehensive and efficient system for writing and reading. The “economy of characters,” writes Wolf, reduced “the time and attention needed for rapid recognition” of the symbols and hence required “fewer perceptual and memory resources.” Recent brain studies reveal that considerably less of the brain is activated in reading words formed from phonetic letters than in interpreting logograms or other pictorial symbols.24

  The Greek alphabet became the model for most subsequent Western alphabets, including the Roman alphabet that we still use today. Its arrival marked the start of one of the most far-reaching revolutions in intellectual history: the shift from an oral culture, in which knowledge was exchanged mainly by speaking, to a literary culture, in which writing became the major medium for expressing thought. It was a revolution that would eventually change the lives, and the brains, of nearly everyone on earth, but the transformation was not welcomed by everyone, at least not at first.

  Early in the fourth century BC, when the practice of writing was still novel and controversial in Greece, Plato wrote Phaedrus, his dialogue about love, beauty, and rhetoric. In the tale, the title character, a citizen of Athens, takes a walk with the great orator Socrates into the countryside, where the two friends sit under a tree beside a stream and have a long and circuitous conversation. They discuss the finer points of speech making, the nature of desire, the varieties of madness, and the journey of the immortal soul, before turning their attention to the written word. “There remains the question,” muses Socrates, “of propriety and impropriety in writing.”25 Phaedrus agrees, and Socrates launches into a story about a meeting between the multitalented Egyptian god Theuth, whose many inventions included the alphabet, and one of the kings of Egypt, Thamus.

  Theuth describes the art of writing to Thamus and argues that the Egyptians should be allowed to share in its blessings. It will, he says, “make the people of Egypt wiser and improve their memories,” for it “provides a recipe for memory and wisdom.” Thamus disagrees. He reminds the god that an inventor is not the most reliable judge of the value of his invention: “O man full of arts, to one is it given to create the things of art, and to another to judge what measure of harm and of profit they have for those that shall employ them. And so it is that you, by reason of the tender regard for the writing that is your offspring, have declared the very opposite of its true effect.” Should the Egyptians learn to write, Thamus goes on, “it will implant forgetfulness in their souls: they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.” The written word is “a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance.” Those who rely on reading for their knowledge will “seem to know much, while for the most part they know nothing.” They will be “filled, not with wisdom, but with the conceit of wisdom.”

  Socrates, it’s clear, shares Thamus’s view. Only “a simple person,” he tells Phaedrus, would think that a written account “was at all better than knowledge and recollection of the same matters.” Far better than a word written in the “water” of ink is “an intelligent word graven in the soul of the learner” through spoken discourse. Socrates grants that there are practical benefits to capturing one’s thoughts in writing—“as memorials against the forgetfulness of old age”—but he argues that a dependence on the technology of the alphabet will alter a person’s mind, and not for the better. By substituting outer symbols for inner memories, writing threatens to make us shallower thinkers, he says, preventing us from achieving the intellectual depth that leads to wisdom and true happiness.

  Unlike the orator Socrates, Plato was a writer, and while we can assume that he shared Socrates’ worry that reading might substitute for remembering, leading to a loss of inner depth, it’s also clear that he recognized the advantages that the written word had over the spoken one. In a famous and revealing passage at the end of The Republic, a dialogue believed to have been written around the same time as Phaedrus, Plato has Socrates go out of his way to attack “poetry,” declaring that he would ban poets from his perfect state. Today we think of poetry as being part of literature, a form of writing, but that wasn’t the case in Plato’s time. Declaimed rather than inscribed, listened to rather than read, poetry represented the ancient tradition of oral expression, which remained central to the Greek educational system, as well as the general Greek culture. Poetry and literature represented opposing ideals of the intellectual life. Plato’s argument with the poets, channeled through Socrates’ voice, was an argument not against verse but against the oral tradition—the tradition of the bard Homer but also the tradition of Socrates himself—and the ways of thinking it both reflected and encouraged. The “oral state of mind,” wrote the British scholar Eric Havelock in Preface to Plato, was Plato’s “main enemy.”26
