by John D'Agata
So fifty years after first studying and abandoning one compound to immobilize our nation’s nuclear waste, the Department of Energy found itself with a report from the American Society for Testing and Materials in which Synroc was now recommended for use at Yucca Mountain, but not as a substance for containing Yucca’s waste—for which England and Australia and Japan and Pakistan and Russia and China and France are testing it—but rather as a medium on which to write our nation’s message about this problem that we never found a real solution for.
“This is a project about faith,” said the anthropologist David Givens, another member of the Department of Energy’s Expert Judgment Panel. “I mean, it’s a project about having faith that we can even pull this off. But it’s also a project that requires faith to even think this would be necessary. These days, a lot of scientists are predicting that the human race won’t survive beyond the next few centuries. So there’s something touching about the U.S. government making the assumption that it’s crucial we warn our descendants about the dangers of nuclear waste. They’re assuming that we’ll have descendants.”
David was in Tacoma, and I was out of the country, and so we spoke by way of a satellite relay.
“Hello?” David said.
“Hello?” I responded.
“Have you ever played that game they call ‘Telephone’?” he asked. “You know, you whisper something into the ear of the person sitting next to you, and then that person whispers what you said to the person sitting next to them, and then that person whispers what they heard, and so on, and so on? It ends with the last person down the line finally revealing what they heard, which ideally would be the same thing that the very first person said, but of course it never is. There’s always something that happens in the transmission of the message. People mishear things, so the message gets garbled. Or they can’t hear all of it, so they make stuff up. Sometimes there’s just some jerk who decides to change the message for the hell of it. The point is, in a situation like that, the message has to rely on the diligence and goodwill of its carriers. And this project is no different. What we’re all talking about is a species-wide game of Telephone that’s going to last for the next ten millennia. So, yeah, there’s some faith involved. We’re gonna have to trust each other.”
“But the best thing to rely on in a project like this,” explained Louis Narens, a cognitive scientist who was also on the panel, “is not some emotional connection that we imagine we have with the future, but rather the cognitive skills that are inherent in all of us. It’s our intelligence that unites our species.”
Louis has been teaching for a quarter of a century at the Institute for Mathematical and Behavioral Sciences at the University of California in Irvine. His publications have titles like Utility-Uncertainty Tradeoff Structures and Homeomorphism Types of Generalized Metric Spaces and A General Theory of Ratio Scalability with Remarks About the Measurement-Theoretic Concept of Meaningfulness, but when we met one sunny morning at his home by the beach there was a Labrador standing beside him at the door and his wife blending smoothies in a yellow jogging bra.
“Some people on the panel wanted to take an emotional approach to designing this sign, and I understand that motivation,” he said, “because emotions are dramatic. Their impact is immediate, and they feel very deep. But the problem is that emotions are unreliable, too. An aesthetic approach to designing this marker would ultimately be irresponsible because nobody’s really proven that emotions are universal.”
Cognitive science is the study of how humans know themselves. It explores how we perceive, reason, and interact with the world through the complex negotiation of objects and ideas.
Its primary focus, rooted in the linguistic theories of Noam Chomsky from the late 1950s, is that which makes the human mind unique in the world: its ability to represent things through the use of abstract signs.
“Language itself was developed out of signs,” Louis said. “Five thousand years ago, small cuneiform notches were made on clay vessels in order to tell people how many sheep a person owned, or how many sheaves of wheat they owed somebody else. Inside the vessels were a corresponding number of tokens in the shape of sheep or wheat. Eventually, though, people realized that the tokens inside those vessels weren’t really necessary as long as the notches on the outside of the vessels were acknowledged to be a counterpart to actual sheep or wheat. They were symbols, in other words. And this is what helped pave the way for representational language, allowing those notches to stand in for greater and greater abstractions. Not ‘sheep,’ for example, but ‘goods.’ And then not ‘goods,’ but ‘property.’ This is the beginning of cuneiform, the origin of written language. And it’s when our species began to think about thought itself.”
He wiped off a smoothie mustache.
“It’s tricky, though,” he said. “Because how far can we take the significance of a symbol? Jung would have us believe that what unifies human consciousness is the existence of emotional archetypes, generalized subconscious stimuli that affect all humans exactly the same way. As a cognitive scientist, however, I’m not so sure about that. I mean, as attractive as the idea of emotional archetypes is, their existence can’t be proven. Lots of people have tried testing for the existence of archetypes by asking subjects to look at three different pictures and then to choose which of them express the same emotion. But do you know what happens? It doesn’t work. There’s never been any evidence from these studies that proves we ‘feel’ things in universal ways, because there’s never any consistency in what people choose. And that makes sense, because this kind of test is really just an exercise in interpretation.”
According to administrators of the Thematic Apperception Test, a psychological evaluation developed in 1935, “the human imagination in an individual mind is more unique than even a fingerprint.” In the test, subjects are shown thirty-one individual images and then asked to compose stories to accompany each. As the test’s originators once explained, the test is based on “the well recognized fact that when someone attempts to interpret something complex he is apt to tell as much about himself as he is about the matter he is trying to interpret. At such times, the subject is off his guard, since he believes that he is merely explaining objective occurrences. To a trained professional, however, he is exposing inner forces, wishes, fears, and traces of past traumas.”
By 1943, the Thematic Apperception Test was said to have eclipsed the Rorschach Test in popularity among psychologists. Within another decade, it evolved into a series of specialized tests for “adult male subjects” or “adult female subjects” or “boys,” “girls,” “Negro Americans,” “the handicapped,” “the elderly,” “veterans of foreign wars.”
It was, as two psychologists claimed in the 1960s, “modern psychology’s reigning personality assessment.”
However, in a controversial 1999 study entitled “The Thematic Apperception Test: A Paradise of Psychodynamics,” the Thematic Apperception Test was administered to a twenty-five-year-old subject dubbed “John Doe.” Following the test’s protocol, John was asked to view the thirty-one standard images and then to compose a story that could accompany each. In response to an image of a boy who is sitting alone while looking at a violin, John wrote:
“This child is sick in bed. He has been given sheet music to study, but instead of music he has come across a novel that interests him more. It is probably an adventure story. He evidently does not fear the possibility that his parents will find him thusly occupied as he seems quite at ease. He seems to be quite a studious type and perhaps regrets missing school, but he still seems quite occupied with the adventure in the story. The adventure in the story has something to do with oceans. He is not too happy, not too sad. His eyes are somewhat blank, the result of reading a book without any eyes. He disregards the music and falls asleep with the book.”
The administrator then invited thirty-one of the world’s preeminent psychologists to review what John had written, the results of which—
“Quite likely he is a member of some minority group”
“He is probably a mainstream All-American boy”
“He was very clearly overprotected while growing up”
“Feels neglected by the world”
“Fears social disapproval”
“Lacks positive personal relationships”
“Is afraid of his father”
“Sees his father as inadequate”
“He may have had an excessive number of premature sexual affairs”
“He has never had an authentic sexual experience”
“He has had homosexual relationships out of desperation rather than desire”
“His Oedipal problems are not yet resolved”
“I see no significant damage”
“There is little hope for recovery”
“Seems fatalistically resigned to the rejection of the world”
“Has an ingrained hopeless pessimism”
“Seems hopeful for the future”
“…a dominant drive for security”
“…an enormous amount of hostility”
“…a great deal of love to share”
“…a decreasing ability to distinguish between fantasy and reality”
“…an exceptional imagination at work”
“…a pathological loosening of an orderly thought process”
“…seems to be in the early stages of paranoid schizophrenia”
“…the depression is so powerful that suicide could be likely”
“…I think he is very creative”
—were described in a review in the Journal of the American Medical Association as “probably the most profound psychological dissection ever performed on a single human.”
“We all interpret differently,” Louis repeated. “But the point is that the more complex we make our message, the more likely it is that a future civilization will be able to decipher it correctly, because a complex message will leave less room for vagueness and more opportunities to countercheck our intentions. It can’t be just a shorthand message for ‘danger,’ in other words, or a picture of a stick figure entering a mountain with a big X drawn over it. It has to be something that’s multilayered, a mixture of different communicative media that people in the future will have to work through in order to understand its overall meaning, using the natural functions of the cognitive mind. In short, we have to give these people a problem to solve. We can’t just give them the answer. And part of that strategy will have to involve using language effectively.”
“Yeah,” said Fritz Newmeyer, a linguist at the University of Washington in Seattle, “except languages have the unfortunate habit of regularly failing.”
Red-haired and slim and bearded-to-a-point, Fritz met me on his campus at the student union center. He bought a banana and a cup of coffee and said, “You’ve got an hour.”
Fritz described himself as the “maverick” on the panel.
“I was alienated from other members of the group,” he explained, “because I was really outspoken. I thought this whole project was bullshit from the start.”
In his field, Fritz is best known for compiling a four-volume study on the history of modern linguistics, a discipline that tracks the roots, uses, and transformations of the world’s many languages, as well as their demise.
“I have a set of the Encyclopaedia Britannica from 1911,” he said, “and every now and then I like to look at it just to see what kinds of changes our culture has experienced. There’s Latin and Greek in this edition, for example, and the Latin and Greek aren’t translated. But these days, of course, everything in the Britannica would be translated for us. So obviously this tells us a lot about the differences between the encyclopedia’s readership in 1911 and its readership today. It tells us, for example, that Latin and Greek aren’t considered integral to our educations anymore. And this not only suggests something about the amount of social change that we’ve experienced since 1911, but it also says something about the changes that those two languages have undergone. Latin and Greek are now the domain of scholars, or of those who might have attended private schools where these languages are still taught. Nevertheless, there has been some change to both these languages and to our culture. Does this mean our culture is going to hell, as we tend to say when change occurs? Or does it just mean that our culture has evolved, and thus so have our needs for language? My point is that a language either changes to better suit its culture, or it atrophies, just like everything else. And then it eventually dies.”
Linguists estimate that there are currently 6,700 languages in use around the world, half of which will disappear within the next century. Among these, only about 100 are spoken by 90 percent of the world’s inhabitants, leaving the fates of the remaining 6,600 languages to just 10 percent of the population.
“So the problem that we faced on the panel,” Fritz explained, “was that even if we worked with the world’s strongest languages, there was still no guarantee that any of them would last, because languages naturally fall out of use. For example, say we left a message that was written in English, and it was simple and straightforward and incredibly precise. What do you think are the chances that someone in ten thousand years will still be able to read it? How many people even speak English today?”
“I don’t know,” I said. “Maybe about thirty percent?”
“Thirty percent of what?” Fritz said.
“Of the world.”
He put down his banana.
“How many people live in the United States?” asked Fritz. “Plus in England, in Australia, in Canada, South Africa, New Zealand, Ireland, Scotland, etc.? How many people in all these cultures? How many of them can speak and read English fluently?”
“Okay,” I said. “Thirty-five percent?”
He stared.
I looked down.
“Let me give you a hint,” he said. “It’s not thirty percent and it’s not higher than that.”
“Twenty?” I said.
Fritz closed his eyes.
“Fifteen? Ten?”
“Stop,” he said. “It’s five percent. Okay? Just five percent of the world’s population knows how to read or speak English. That’s all. And only fifteen percent speak Mandarin, which is the world’s most popular language. So nobody’s got a majority. And that’s my point. This is a project that’s attempting to accomplish what can’t be done, which is communicate in a way that will be coherent to anyone on the planet at any given time between now and ten millennia from now. We have never been able to do that. And we probably never will. So what we really ought to do is invent a language that can be universally understood.”
In 1941, American linguist Morris Swadesh began devising a system to help him more efficiently trace the roots of languages in order to determine how quickly they’d evolved from their original sources. In linguistics, the most common reason for doing this is to better understand the cultural influences that have affected a language over time. The less obvious reason for doing it, however, is to pursue what some linguists like to call the ultimate “mother tongue,” a single root for every language, but a root that some linguists don’t even believe existed.
Swadesh, working in exile in an underfunded lab at the National School of Anthropology in Mexico City, believed that an easier way to pursue his research than conducting expensive firsthand fieldwork would be to construct a template of basic vocabulary with which to compare a variety of languages, thus achieving a quick survey of languages’ interrelatedness. While most languages change dramatically over time, Swadesh theorized that a basic set of vocabulary was not only likely to exist in every world language, but that such a basic vocabulary would also be more likely to resist significant change, because fundamental words serve essential roles in human culture. The vocabulary list that Swadesh created was made up of words that describe body functions, natural phenomena, sensory experiences, and physical dimensions.
He developed a list of 200 such words—
all, animal, ashes, back, bark, belly, berry, big, bird, bite, black, blood, bone, breast, breathe, brother, burn, child, claw, clothing, cloud, cold, come, cook, count, cry, day, die, dig, dirty, dog, drink, dry, dull, dust, ear, earth, eat, egg, eight, eye, fall, fat, father, fear, feather, fight, fire, fish, five, float, flower, fly, fog, foot, four, full, freeze, give, good, grass, green, guts, hair, hand, he, head, hear, heart, heavy, here, hold, horn, how, hundred, hunt, husband, I, ice, if, kill, knee, know, lake, last, laugh, leaf, left, leg, lie, live, liver, long, louse, man, many, meat, moon, mother, mountain, mouth, name, near, neck, new, night, nine, nose, not, old, one, other, play, pull, push, raid, rain, red, right, river, road, root, rope, rub, salt, sand, say, scratch, sea, see, seed, seven, sew, sharp, shoot, short, sing, sister, sit, six, skin, sky, sleep, small, smell, smoke, smooth, snake, snow, speak, spit, split, squeeze, stab, stand, star, stick, stone, straight, sun, swell, swim, tail, ten, that, there, they, thick, thin, think, three, throw, tie, tongue, tooth, tree, turn, two, walk, warm, wash, water, we, wet, what, when, where, white, who, wide, wife, wind, wing, wipe, woman, woods, work, year, and yellow
—and called it the “Swadesh List.”
By comparing the different versions of this basic vocabulary in any two languages, Swadesh believed that he could determine how closely related those two languages were, and thus how recently they had diverged from a common root.
In his book The Origin and Diversification of Language, Swadesh proposed that his technique could be used to trace human languages back 200,000 years, roughly 95 percent farther than with conventional techniques.
“If we can show by means of comparative linguistics that various peoples spoke similar languages sometime in the past,” he wrote, “we can infer the identities of those predecessor languages, and thus even more intimate connections between all human cultures…arriving eventually at an original tongue.”
Swadesh came to believe so strongly in the potential of his method that he began to call it “glottochronology,” comparing it to the precision of radiocarbon dating.
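The arithmetic behind glottochronology is simple enough to sketch. In the standard formulation attributed to Swadesh and the mathematician Robert Lees (the details below come from the conventional linguistics literature, not from this essay), the time since two languages diverged is estimated from the fraction of Swadesh-list words they still hold in common, assuming basic vocabulary decays at a roughly constant rate—about 80.5 percent of the 200-word list surviving each millennium. A minimal illustration, with the example figures chosen for demonstration only:

```python
import math

def divergence_time(shared_cognates, list_size=200, retention_rate=0.805):
    """Estimate millennia since two languages diverged, using the
    standard glottochronology formula t = ln(c) / (2 * ln(r)),
    where c is the fraction of shared cognates on the Swadesh list
    and r is the assumed per-millennium retention rate
    (conventionally ~0.805 for the 200-word list)."""
    c = shared_cognates / list_size
    return math.log(c) / (2 * math.log(retention_rate))

# Two languages sharing 120 of the 200 basic words (c = 0.60) would be
# estimated to have split roughly 1.2 millennia ago.
print(round(divergence_time(120), 2))  # → 1.18
```

The comparison to radiocarbon dating is apt in form—both assume a fixed decay constant—but in practice the retention rate varies between languages, which is a standard criticism of the method.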