Permanent Present Tense

by Suzanne Corkin


  In 1984, I asked a psychiatrist, George Murray, to evaluate Henry. Murray noted that Henry “was always smiling, and had a relatively warm interaction with me.” Henry did not know whether his appetite was good or bad but smiled when he reported that he did not like liver. When Murray asked him whether he was sleeping normally, he replied, “I guess so.” He said that he did not think of death and, to his knowledge, did not cry. When asked if he felt helpless, he said, “Yes and no,” and if he felt hopeless, with a wide smile said, “Yes, and mostly no.” When asked if he felt worthless, Henry smiled again and said, “This could be the same as hopeless.” The preceding question had lingered in his short-term memory. When Murray asked him whether he liked himself, he once more had a cautious smile and said, “Yes and no—I can’t be a brain surgeon.” (Over the years, a recurring theme in Henry’s conversations was that he had wanted to be a brain surgeon.) Murray concluded that Henry “does not have any depression. This does not mean that he cannot feel sad on occasion.”

  Murray continued to probe Henry’s emotional life with additional questions about his parents and taste in music. They laughed together over their mutual dislike of “jive.” Then Murray moved into the area of sex. He asked Henry if he knew what an erection was, and he said “a building.” Then Murray said, “Well, let me use some other language” and asked him what a “hard-on” was. Henry, without smiling, frowning, or any change in his facial musculature, said, “a man gets it, below the belt.” Henry knew that men have penises and women do not, and he described how babies are conceived. During this line of questioning, Henry had no facial responses to Murray’s questions and said that he did not have any sexual desire. Murray described him as asexual—having no libido. (Buckler, Henry’s boss, had described Henry as a perfect gentleman who “never as much as looks at the girls at the Center.”)

  When my colleagues and I interacted with Henry, he was always friendly but passive. He had an excellent sense of humor that occasionally popped up in everyday conversation. For example, one day in 1984, a neurologist in our lab walked with Henry out of a testing room into the hall. As the door closed behind them, the neurologist wondered aloud whether he had left his keys inside the room. Henry replied, “At least you’ll know where to find them!”

  Henry’s inherent easygoing and generous nature was apparent in the supreme patience he exhibited in undertaking all our tests. Of course, he had no long-term memory for the testing episodes, so each one was a new experience for him, and he never seemed bored. Once, when talking with one of the members of our lab, he summed up his testing experiences this way: “It’s a funny thing—you just live and learn. I’m living, and you’re learning.”

  Seven

  Encode, Store, Retrieve

  In 1972, I visited Mrs. Molaison and Henry when the Watergate scandal was dominating the news and asked him what Watergate meant to him.

  “Well, I think of a prison right off, and I think of a riot in Watergate Prison,” he replied.

  “Have you heard anything on the news recently about riots or about Watergate?” I asked.

  “No. Then I think of an investigation into it.”

  “That’s right,” I said, encouragingly.

  “But that, uh, I can’t put my finger more on it, I guess.”

  “Have you ever heard the name John Dean?”

  “Well, I think of an assassin right off, but then, after I said that, after I said assassin, I thought of, uh, a leader, you know, labor leader or worker that was killed or injured. That’s what I think of.”

  “You read all about them in the papers and everything,” Mrs. Molaison interjected.

  John Dean was Nixon’s White House counsel. Henry had been exposed to the extensive coverage of the Watergate break-in, but like a computer with a faulty hard drive, his brain was unable to store and retrieve this information.

  The modern study of the human brain owes much to advances in computer science. Our search for the underlying cognitive operations of long-term memory is now grounded in information theory, introduced in 1948 by Claude Shannon, an engineer at New Jersey’s Bell Telephone Laboratories. In his mathematical theory of communication, Shannon drew on applied mathematics, electrical engineering, and cryptography to describe the transmission of information as a statistical process, coining the term bit for the most fundamental unit of information. In the early 1950s, the cognitive psychologist George A. Miller brought information theory to the study of natural-language processing, thus integrating Shannon’s ideas into the field of psychology.1

  Conceptualizing learning and memory in terms of information processing was a key advance, allowing researchers to divide memory into three stages of development, akin to computer processes. The first stage is encoding information by turning sensory inputs from the world into representations in the brain. The second stage is storing those representations so they can be extracted later. The third stage is retrieving the stored memories when they are needed. Researchers now design memory experiments to examine each of the three stages separately, and to watch how they interact.

  Scientists divided the underlying information-processing stream into these discrete stages to make the scientific study of memory tractable. This artificial division is a simplification, but a necessary one: it allows researchers to describe in detail the numerous processes within each stage. In reality, encoding, storage, and retrieval occur constantly and simultaneously. Understanding the constituent parts of memory formation is essential to assembling them into a comprehensive theory.

  Henry had no trouble encoding information. When I would ask him whether he wanted milk in his tea, he could register my question in his short-term memory and reply that he never took milk but did take sugar. Henry’s problem was with the last two stages of information processing—storage and retrieval of new information. If I distracted him by changing the topic of conversation and then asked what we had talked about earlier, he would not know. The stimuli his brain received could be held briefly, but they could not be squirreled away and revisited later.
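The division of labor described above, and where it broke down for Henry, can be caricatured in a short sketch. This is purely illustrative—the class names and the idea of modeling amnesia as a failing storage step are a simplification for this passage, not a model from the memory literature:

```python
# Illustrative toy model of the three stages of memory formation.
# (A deliberate caricature: real encoding, storage, and retrieval are
# continuous and interacting, as the text notes.)

class Memory:
    def __init__(self):
        self.short_term = None   # brief register for the current input
        self.long_term = {}      # durable store, keyed by cue

    def encode(self, cue, event):
        """Turn a sensory input into a representation (stage 1)."""
        self.short_term = (cue, event)

    def store(self):
        """Move the current representation into the durable store (stage 2)."""
        if self.short_term is not None:
            cue, event = self.short_term
            self.long_term[cue] = event

    def retrieve(self, cue):
        """Recover a stored memory when it is needed (stage 3)."""
        return self.long_term.get(cue)


class AmnesicMemory(Memory):
    """Henry's pattern, caricatured: encoding works, storage does not."""
    def store(self):
        pass  # nothing is squirreled away for later


normal, henry = Memory(), AmnesicMemory()
for m in (normal, henry):
    m.encode("tea", "no milk, yes sugar")
    m.store()
    m.encode("weather", "raining")  # a distraction displaces short-term memory
    m.store()

print(normal.retrieve("tea"))   # "no milk, yes sugar"
print(henry.retrieve("tea"))    # None (encoded, but never stored)
```

The sketch captures only the logic of the distinction: Henry could register my question and answer it, but once a new input displaced the old one, nothing remained to retrieve.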

  Beginning with the publication of Scoville and Milner’s paper in 1957, Henry’s case helped to launch decades of research that dissected the cognitive and neural processes within each of the three stages of memory formation. Just as important, Henry’s case illustrated the fractionation of memory—the idea that our brains constantly juggle different kinds of short-term and long-term memory processes, each mediated by a separate, specialized memory circuit. Milner’s landmark discovery, showing that some of Henry’s long-term memory processes were disrupted while others were not, led to the important theoretical distinction between declarative or explicit memory—seriously impaired in Henry—and nondeclarative or implicit memory—intact in Henry.2

  Declarative memory, rooted in the medial temporal lobes, refers to the type of memory we invoke when in everyday conversation we say, “I remember” or “I forget.” This kind of memory includes the capacity to recollect consciously two kinds of information: episodic knowledge—the recollection of specific experiences we were part of in the past—and semantic knowledge—general knowledge, such as information we gather about people, places, language, maps, and concepts, not linked to a particular learning event. In many ways, declarative memory is the backbone of everyday life, enabling us to acquire the knowledge we need to pursue goals and dreams and to function as independent people.

  Henry lived for fifty-five years without acquiring any new declarative memories. He could not tell us about exact incidents, such as what he had for breakfast, what tests he had taken the day before, or how he celebrated his last birthday. He also could not learn new vocabulary words, the name of the current president, or the faces of people he encountered at the CRC. On memory tests, his scores were no better than if he had guessed at the answers. The structures removed from Henry’s brain were dedicated to declarative memory. His surgery, however, left intact other circuits that supported his nondeclarative memory, so he could learn new motor skills and acquire conditioned responses.

  Research that stemmed from Henry’s case shed light on the fundamental processes that underlie the encoding, storage, and retrieval of episodic knowledge. Over the last fifty-five years, scientists have made great progress in characterizing these three stages of processing. In the 1990s, these investigations were fueled by the advent of brain-imaging tools such as positron emission tomography (PET) and functional MRI. These technologies made it possible for researchers to examine, for the first time, brain activity for each stage of processing separately.

  After the discovery that conscious remembering depends on mechanisms in the hippocampus and its close neighbor, the parahippocampal gyrus, scientists began to tackle basic questions in the psychology and biology of episodic learning: What specific cognitive operations contribute to long-term memory of a single event? What are the complex workings within the hippocampus and parahippocampal gyrus? What is the role of the cerebral cortex in long-term memory? What cognitive processes and corresponding brain circuits mediate how well we encode, store, and retrieve episodes, and how much we forget?

  Simply registering a sensory event—seeing, hearing, smelling, touching, or tasting—does not ensure that learning will happen. How well we remember events and facts depends greatly on how effectively they are encoded initially. The likelihood that we will remember a name, face, date, address, directions to a party, or anything else is related to the richness of the representation. Researchers call this the depth-of-processing effect.

  Psychologists Fergus Craik and Robert Lockhart first described this effect in the early 1970s after conducting a series of experiments to study how deeply their subjects processed information. The researchers persuasively argued that when the brain receives information, it can process that information to different depths. Craik and Lockhart gave their subjects short words such as speech and daisy as test stimuli, allowing participants to view each word briefly, and then asking them a question about each word. By using three kinds of questions, the researchers hoped to generate different levels of processing—shallow, intermediate, and deep.3

  Envision the printed word TRAIN. Craik and Lockhart encouraged shallow processing through questions about the physical structure of the word (Is the word in lowercase?), intermediate processing through questions about the rhyming characteristics of the word (Does the word rhyme with brain?), and deep processing through questions about the meaning of the word (Is the word a mode of travel?). After participants encoded a list of words in this way, there was a brief pause, followed by a surprise memory test to see which words they recalled. The experiments showed that subjects remembered best those words encoded by processing their meaning, followed by words encoded by their rhyming characteristics, and then by words encoded by their physical structure. Overall, participants’ retention of the words depended on how elaborately and descriptively they thought about the words as they encoded them. Thus, Craik and Lockhart illustrated that deep processing produces stronger memories than shallow processing.4

  In 1981, we became curious to see whether Henry would show the depth-of-processing effect. We designed a depth-of-processing test in which we helped him think hard about the meanings of words to enhance his ability to recognize those words later. Would he be more likely to recognize words he processed deeply than words he processed superficially? The stimuli for Henry’s test were thirty common nouns such as hat, flame, and map. For the encoding task, the examiner played an audiotape. Henry first heard a word such as hat, and then one of three kinds of questions, which he answered “Yes” or “No.” For example, “Does a woman say the word?” targeted the physical (shallow) level. “Does the word rhyme with glass?” targeted the phonological (intermediate) level. “Is the word a type of clothing?” targeted the semantic (deep) level.5

  After the encoding phase, Henry took an unexpected memory test to see whether he recognized the words he had just encoded. The examiner read him three words, asking him to choose the one he had heard before and encouraging him to guess if he was unsure. Henry performed the depth-of-processing task on two occasions. Just by chance, he should have gotten ten out of thirty correct; and in two separate test sessions, his score was no better than chance—twelve in 1980, and ten in 1982. His overall performance was deficient; to answer our original question, he did not show the depth-of-processing effect.6
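The chance level here is simple arithmetic: with thirty trials and three alternatives per trial, guessing alone yields one third correct on average, or ten. A quick check (an illustration only, not an analysis from the original study) shows that even Henry’s score of twelve sits comfortably within the range that guessing produces:

```python
# Expected chance performance on a 30-item, three-alternative recognition test.
from math import comb

n_items = 30
p_guess = 1 / 3
expected_correct = n_items * p_guess  # 10.0

# Probability of scoring 12 or more by guessing alone (binomial tail):
# this comes out to roughly 0.3, so a score of 12 is unremarkable.
p_at_least_12 = sum(
    comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
    for k in range(12, n_items + 1)
)
print(expected_correct)
print(p_at_least_12)
```

In other words, a guesser would match or beat Henry’s better score about one time in three, which is why both sessions count as chance performance.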

  We now understand that Henry did poorly not just because his hippocampus was damaged, but also because he lacked the critical connections and interactions that occur between medial temporal-lobe structures—where information is processed initially—and the cortical areas specialized for storing representations of words and other information. While the hippocampus makes a vital contribution to encoding, the cortex plays an equally important role. Functional MRI studies, which allow us to look at brain activity during task performance, show convincingly that activation in the cortex is greater during deep processing than during shallow. The cortex, however, cannot perform the job of encoding by itself. Henry’s senses could perceive words, pictures, sounds, and touch, and could deliver this information to his brain where his cortex registered it. But beyond that stage, his ability to store that information was so deficient that deeper processing did not help. Although he could receive and understand incoming sensory information normally, he was unable, even with added elaboration, to form deep representations that would result in better memory.7

  In general, the more deeply we characterize a name, face, date, address, or anything else, the better we remember it. This is true whether we extract the information from long-term memory without help—free recall—or it automatically jumps out at us when we consider several options—recognition. Imagine you want to search online for directions to a particular Italian restaurant. If your brain has a rich representation of the restaurant based on past visits, you will spontaneously recall its name and the town it is in and plug that information into your computer’s search engine. On the other hand, if your representation of the restaurant is sparse because you have never eaten there and just drove by it once, you may not recall the specific name and will need to view a list of possible restaurants until you recognize the one you are looking for.

  When we encode new information deeply, we are more likely to retrieve it later because we have linked it to a wealth of semantic information already stored throughout the temporal, parietal, and occipital cortices. Elaborative rehearsal, in which we mentally manipulate information and relate it to other facts we know, helps long-term memory much more than if we simply repeat the information. Students know the value of preparing for tests and exams by forming small study groups in which they ask one another practice questions drawn from lectures and readings. The ensuing discussions constitute elaborative rehearsal: they encourage deeper processing and more robust encoding of the material than if the students simply read over their notes silently in the library.8

  Unlike our hypothetical students, Henry could not benefit from dynamic elaborative rehearsal. In 1985, however, Henry did use simple rehearsal to accomplish a feat that at first blush looked like long-term memory formation. A postdoctoral fellow in my lab wanted to test his appreciation of the passage of time. She told him that she was going to leave the room and that when she returned, she would ask him how long she had been gone. She left the room at 2:05 and returned at 2:17. When she asked Henry how long she was out of the room, he replied “Twelve minutes—got you there!” She was astonished, until she noticed the large clock on the wall and understood how Henry had come up with the correct answer. During her absence, he repeated 2:05 over and over to himself, keeping it foremost in his mind, and when she came back, he looked at the clock and saw that it was 2:17. He then harnessed his working memory capacities to do some simple arithmetic, subtracting 2:05 from 2:17. Henry could not remember things, but occasionally he could be quite clever in finding ways to compensate for his disorder.9

  Elaborative rehearsal is not the only way to beef up memory. History is rich with examples of people using intricate techniques to recall information, creating mental representations and organizing them so that information is available later. In 1596, Matteo Ricci, an Italian Jesuit missionary and scholar working in China, wrote a short book entitled Treatise on Mnemonic Arts, laying out a technique for Chinese men to memorize the vast burden of knowledge they needed to pass challenging civil-service examinations. Ricci’s technique, based on a medieval European idea, centered around a “memory palace”: an imposing edifice with a reception hall and many rooms that contained lively, complex images, such as emotion-provoking paintings, located at different points. Because items that have emotional content are more memorable than neutral items, the trick is to create an outrageous or emotional association between each piece of information to be remembered and an object in the room. In doing so, we form vivid mental associations. A modern term for Ricci’s memory technique is the method of loci: picturing a familiar route in the mind’s eye, placing distinct landmarks along that route, and associating different to-be-remembered material with each one.10

  If you want to create a memory palace, choose a familiar landmark—an office building, a nearby grocery store, or your house. For example, let’s say that you are trying to memorize a toast you will deliver to the bride at her wedding reception. You have a specific set of stories you want to remember—about soccer in elementary school, gymnastics in middle school, travel to France in high school, acquiring a dog in college, and later meeting the groom. Pick out a familiar location to become your memory palace—for instance, your local supermarket. Insert cues to the anecdotes in your speech into the orderly arrangement of the foods. You place an eye-catching reminder on the door of the market, and then progress, in succession, to the fruit, vegetable, meat, and frozen foods departments. As you approach the entrance, you picture an enormous soccer ball filling the glass door, with the bride and her best friend at age seven in their soccer uniforms, perched on top of the ball holding hands. Moving on to the fruit department, you imagine the bride’s gymnastics team doing handstands on the watermelons, and in the vegetables, a giant Eiffel Tower atop asparagus spears. In the meat section, place a full-size Siberian Husky inside the showcase with a five-pound steak in his mouth, and in the frozen foods aisle, visualize the groom inside the freezer, on one knee, holding a gigantic bag of onion rings. Once you have constructed these images and fixed them in your mind, you can embark on a mental excursion through the supermarket, with your mind’s eye traveling from one memory image to another. As you deliver your speech, you can retrieve, in a specific order, the memories and anecdotes you have stored there. People of all ages can take advantage of memory-enhancing tricks like this one.
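Stripped of the vivid imagery that makes it work, the structure of the technique is just an ordered association of loci with to-be-remembered items. A minimal sketch, using the hypothetical wedding-toast route above:

```python
# The method of loci as a data structure: an ordered route through a
# familiar place, each stop paired with one vivid image that cues one
# anecdote. Retrieval is a mental walk along the route, in order.
memory_palace = [
    ("entrance",     "soccer ball filling the door",  "soccer in elementary school"),
    ("fruit",        "gymnasts on the watermelons",   "gymnastics in middle school"),
    ("vegetables",   "Eiffel Tower atop asparagus",   "travel to France in high school"),
    ("meat",         "Husky with a steak",            "acquiring a dog in college"),
    ("frozen foods", "groom proposing in the freezer", "meeting the groom"),
]

# Walking the route recovers the anecdotes in delivery order.
toast_order = [anecdote for _, _, anecdote in memory_palace]
for locus, image, anecdote in memory_palace:
    print(f"{locus}: {image} -> {anecdote}")
```

The ordering is the point: the fixed spatial route guarantees that each image, and therefore each anecdote, is cued in sequence.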
