The Shallows
But that wasn’t the whole story. The proliferation of printed pages had another effect, which Socrates didn’t foresee but may well have welcomed. Books provided people with a far greater and more diverse supply of facts, opinions, ideas, and stories than had been available before, and both the method and the culture of deep reading encouraged the commitment of printed information to memory. In the seventh century, Isidore, the bishop of Seville, remarked how reading “the sayings” of thinkers in books “render[ed] their escape from memory less easy.”1 Because every person was free to chart his own course of reading, to define his own syllabus, individual memory became less of a socially determined construct and more the foundation of a distinctive perspective and personality. Inspired by the book, people began to see themselves as the authors of their own memories. Shakespeare has Hamlet call his memory “the book and volume of my brain.”
In worrying that writing would enfeeble memory, Socrates was, as the Italian novelist and scholar Umberto Eco says, expressing “an eternal fear: the fear that a new technological achievement could abolish or destroy something that we consider precious, fruitful, something that represents for us a value in itself, and a deeply spiritual one.” The fear in this case turned out to be misplaced. Books provide a supplement to memory, but they also, as Eco puts it, “challenge and improve memory; they do not narcotize it.”2
The Dutch humanist Desiderius Erasmus, in his 1512 textbook De Copia, stressed the connection between memory and reading. He urged students to annotate their books, using “an appropriate little sign” to mark “occurrences of striking words, archaic or novel diction, brilliant flashes of style, adages, examples, and pithy remarks worth memorizing.” He also suggested that every student and teacher keep a notebook, organized by subject, “so that whenever he lights on anything worth noting down, he may write it in the appropriate section.” Transcribing the excerpts in longhand, and rehearsing them regularly, would help ensure that they remained fixed in the mind. The passages were to be viewed as “kinds of flowers,” which, plucked from the pages of books, could be preserved in the pages of memory.3
Erasmus, who as a schoolboy had memorized great swathes of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization’s sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.” Far from being a mechanical, mindless process, Erasmus’s brand of memorization engaged the mind fully. It required, Rummel writes, “creativeness and judgment.”4
Erasmus’s advice echoed that of the Roman Seneca, who also used a botanical metaphor to describe the essential role that memory plays in reading and in thinking. “We should imitate bees,” Seneca wrote, “and we should keep in separate compartments whatever we have collected from our diverse reading, for things conserved separately keep better. Then, diligently applying all the resources of our native talent, we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state.”5 Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.
Erasmus’s recommendation that every reader keep a notebook of memorable quotations was widely and enthusiastically followed. Such notebooks, which came to be called “commonplace books,” or just “commonplaces,” became fixtures of Renaissance schooling. Every student kept one.6 By the seventeenth century, their use had spread beyond the schoolhouse. Commonplaces were viewed as necessary tools for the cultivation of an educated mind. In 1623, Francis Bacon observed that “there can hardly be anything more useful” as “a sound help for the memory” than “a good and learned Digest of Common Places.” By aiding the recording of written works in memory, he wrote, a well-maintained commonplace “supplies matter to invention.”7 Through the eighteenth century, according to American University linguistics professor Naomi Baron, “a gentleman’s commonplace book” served “both as a vehicle for and a chronicle of his intellectual development.”8
The popularity of commonplace books ebbed as the pace of life quickened in the nineteenth century, and by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century—audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives—greatly expanded the scope and availability of “artificial memory.” Committing information to one’s own mind seemed ever less essential. The arrival of the limitless and easily searchable data banks of the Internet brought a further shift, not just in the way we view memorization but in the way we view memory itself. The Net quickly came to be seen as a replacement for, rather than just a supplement to, personal memory. Today, people routinely talk about artificial memory as though it’s indistinguishable from biological memory.
Clive Thompson, the Wired writer, refers to the Net as an “outboard brain” that is taking over the role previously played by inner memory. “I’ve almost given up making an effort to remember anything,” he says, “because I can instantly retrieve the information online.” He suggests that “by offloading data onto silicon, we free our own gray matter for more germanely ‘human’ tasks like brainstorming and daydreaming.”9 David Brooks, the popular New York Times columnist, makes a similar point. “I had thought that the magic of the information age was that it allowed us to know more,” he writes, “but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”10
Peter Suderman, who writes for the American Scene, argues that, with our more or less permanent connections to the Internet, “it’s no longer terribly efficient to use our brains to store information.” Memory, he says, should now function like a simple index, pointing us to places on the Web where we can locate the information we need at the moment we need it: “Why memorize the content of a single book when you could be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored.” As the Web “teaches us to think like it does,” he says, we’ll end up keeping “rather little deep knowledge” in our own heads.11 Don Tapscott, the technology writer, puts it more bluntly. Now that we can look up anything “with a click on Google,” he says, “memorizing long passages or historical facts” is obsolete. Memorization is “a waste of time.”12
Our embrace of the idea that computer databases provide an effective and even superior substitute for personal memory is not particularly surprising. It culminates a century-long shift in the popular view of the mind. As the machines we use to store data have become more voluminous, flexible, and responsive, we’ve grown accustomed to the blurring of artificial and biological memory. But it’s an extraordinary development nonetheless. The notion that memory can be “outsourced,” as Brooks puts it, would have been unthinkable at any earlier moment in our history. For the ancient Greeks, memory was a goddess: Mnemosyne, mother of the Muses. To Augustine, it was “a vast and infinite profundity,” a reflection of the power of God in man.13 The classical view remained the common view through the Middle Ages, the Renaissance, and the Enlightenment—up to, in fact, the close of the nineteenth century. When, in an 1892 lecture before a group of teachers, William James declared that “the art of remembering is the art of thinking,” he was stating the obvious.14 Now, his words seem old-fashioned. Not only has memory lost its divinity; it’s well on its way to losing its humanness. Mnemosyne has become a machine.
The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer. If biological memory functions like a hard drive, storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but, as Thompson and Brooks argue, liberating. It provides us with a much more capacious memory while clearing out space in our brains for more valuable and even “more human” computations. The analogy has a simplicity that makes it compelling, and it certainly seems more “scientific” than the suggestion that our memory is like a book of pressed flowers or the honey in a beehive’s comb. But there’s a problem with our new, post-Internet conception of human memory. It’s wrong.
AFTER DEMONSTRATING, IN the early 1970s, that “synapses change with experience,” Eric Kandel continued to probe the nervous system of the lowly sea slug for many years. The focus of his work shifted, though. He began to look beyond the neuronal triggers of simple reflex responses, such as the slug’s withdrawal of its gill when touched, to the much more complicated question of how the brain stores information as memories. Kandel wanted, in particular, to shed light on one of the central and most perplexing riddles in neuroscience: how, exactly, does the brain transform fleeting short-term memories, such as the ones that enter and exit our working memory every waking moment, into the long-term memories that can last a lifetime?
Neurologists and psychologists had known since the end of the nineteenth century that our brains hold more than one kind of memory. In 1885, the German psychologist Hermann Ebbinghaus conducted an exhausting series of experiments, using himself as the sole subject, that involved memorizing two thousand nonsense words. He discovered that his ability to retain a word in memory strengthened the more times he studied the word and that it was much easier to memorize a half dozen words at a sitting than to memorize a dozen. He also found that the process of forgetting had two stages. Most of the words he studied disappeared from his memory very quickly, within an hour after he rehearsed them, but a smaller set stayed put much longer—they slipped away only gradually. The results of Ebbinghaus’s tests led William James to conclude, in 1890, that memories were of two kinds: “primary memories,” which evaporated from the mind soon after the event that inspired them, and “secondary memories,” which the brain could hold onto indefinitely.15
At around the same time, studies of boxers revealed that a concussive blow to the head could bring on retrograde amnesia, erasing all memories stored during the preceding few minutes or hours while leaving older memories intact. The same phenomenon was noted in epileptics after they suffered seizures. Such observations implied that a memory, even a strong one, remains unstable for a brief period after it’s formed. A certain amount of time seemed to be required for a primary, or short-term, memory to be transformed into a secondary, or long-term, one.
That hypothesis was backed up by research conducted by two other German psychologists, Georg Müller and Alfons Pilzecker, in the late 1890s. In a variation on Ebbinghaus’s experiments, they asked a group of people to memorize a list of nonsense words. A day later, they tested the group and found that the subjects had no problem recalling the list. The researchers then conducted the same experiment on another group of people, but this time they had the subjects study a second list of words immediately after learning the first list. In the next day’s test, this group was unable to remember the initial set of words. Müller and Pilzecker then conducted one last trial, with another twist. The third group of subjects memorized the first list of words and then, after a delay of two hours, were given the second list to study. This group, like the first, had little trouble remembering the initial list of words the next day. Müller and Pilzecker concluded that it takes an hour or so for memories to become fixed, or “consolidated,” in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind.16
Subsequent studies confirmed the existence of short-term and long-term forms of memory and provided further evidence of the importance of the consolidation phase during which the former are turned into the latter. In the 1960s, University of Pennsylvania neurologist Louis Flexner made a particularly intriguing discovery. After injecting mice with an antibiotic drug that prevented their cells from producing proteins, he found that the animals were unable to form long-term memories (about how to avoid receiving a shock while in a maze) but could continue to store short-term ones. The implication was clear: long-term memories are not just stronger forms of short-term memories. The two types of memory entail different biological processes. Storing long-term memories requires the synthesis of new proteins. Storing short-term memories does not.17
Inspired by the groundbreaking results of his earlier Aplysia experiments, Kandel recruited a team of talented researchers, including physiological psychologists and cell biologists, to help him plumb the physical workings of both short-term and long-term memory. They began to meticulously trace the course of a sea slug’s neuronal signals, “one cell at a time,” as the animal learned to adapt to outside stimuli such as pokes and shocks to its body.18 They quickly confirmed what Ebbinghaus had observed: the more times an experience is repeated, the longer the memory of the experience lasts. Repetition encourages consolidation. When they examined the physiological effects of repetition on individual neurons and synapses, they discovered something amazing. Not only did the concentration of neurotransmitters in synapses change, altering the strength of the existing connections between neurons, but the neurons grew entirely new synaptic terminals. The formation of long-term memories, in other words, involves not only biochemical changes but anatomical ones. That explained, Kandel realized, why memory consolidation requires new proteins. Proteins play an essential role in producing structural changes in cells.
The anatomical alterations in the slug’s relatively simple memory circuits were extensive. In one case, the researchers found that, before a long-term memory was consolidated, a particular sensory neuron had some thirteen hundred synaptic connections to about twenty-five other neurons. Only about forty percent of those connections were active—in other words, sending signals through the production of neurotransmitters. After the long-term memory had been formed, the number of synaptic connections had more than doubled, to about twenty-seven hundred, and the proportion that were active had increased from forty percent to sixty percent. The new synapses remained in place as long as the memory persisted. When the memory was allowed to fade—by discontinuing the repetition of the experience—the number of synapses eventually dropped to about fifteen hundred. The fact that, even after a memory is forgotten, the number of synapses remains a bit higher than it had been originally helps explain why it’s easier to learn something a second time.
Through the new round of Aplysia experiments, Kandel wrote in his 2006 memoir In Search of Memory, “we could see for the first time that the number of synapses in the brain is not fixed—it changes with learning! Moreover, long-term memory persists for as long as the anatomical changes are maintained.” The research also revealed the basic physiological difference between the two types of memory: “Short-term memory produces a change in the function of the synapse, strengthening or weakening preexisting connections; long-term memory requires anatomical changes.”19 Kandel’s findings fit seamlessly with the discoveries being made about neuroplasticity by Michael Merzenich and others. Further experiments soon made it clear that the biochemical and structural changes involved in memory consolidation are not limited to slugs. They also take place in the brains of other animals, including primates.
Kandel and his colleagues had unlocked some of the secrets of memory at the cellular level. Now, they wanted to go deeper—to the molecular processes within the cells. The researchers were, as Kandel later put it, “entering completely uncharted territory.”20 They looked first at the molecular changes that occur in synapses as short-term memories are formed. They found that the process involves much more than just the transmission of a neurotransmitter—glutamate, in this case—from one neuron to another. Other types of cells, called interneurons, are also involved. The interneurons produce the neurotransmitter serotonin, which fine-tunes the synaptic connection, modulating the amount of glutamate released into the synapse. Working with the biochemists James Schwartz and Paul Greengard, Kandel discovered that the fine-tuning occurs through a series of molecular signals. The serotonin released by the interneuron binds to a receptor on the membrane of the presynaptic neuron—the neuron carrying the electric pulse—which starts a chemical reaction that leads the neuron to produce a molecule called cyclic AMP. The cyclic AMP in turn activates protein kinase A, a catalytic enzyme that spurs the cell to release more glutamate into the synapse, thereby strengthening the synaptic connection, prolonging the electrical activity in the linked neurons, and enabling the brain to maintain the short-term memory for seconds or minutes.