When We Are No More


by Abby Smith Rumsey


  We keep our mental model of the world up to date by learning new things. Fortunately, our memory is seldom really fixed and unchangeable. If it were, that would put us in extreme peril—unless the world were to suddenly stop, become fixed and unchangeable too. We must be as adept at forgetting what is no longer true or useful as we are at remembering what is valuable and necessary. As we take on new roles and responsibilities in life, such as parent, partner, worker, or citizen, we shed old ones—child, student, or dependent. Like muscles, memories weaken with time when they are not used. Just as in the art of packing, in which what we leave out is as important as what we put in the bag, so too does the art of memory rely on the art of forgetting.

  What this means for the digital age is that data is not knowledge, and data storage is not memory. We use technology to accumulate facts about the natural and social worlds. But facts are only incidental to memory. They sometimes even get in the way of thoughtful concentration and problem solving. It is the ability of information to be useful both now and in the future that counts. And it is our emotions that tell us what is valuable for our survival and well-being. When distracted—for example, by too many bright shiny things and noisy bleeping devices—we are not able to learn or develop strong reusable memories. We fail to build the vital repertoire of knowledge and experience that may be of use to us in the future. And it is the future that is at stake. For memory is not about the past. It is about the future.

  Human memory is unique because from the information stored in our brains we can summon not only things that did or do exist, but also things that might exist. From the contents of our past we can generate visions of the future. We know there is a past, a present, and a future, and in our heads we travel freely among these time zones. We know that generations of people were born and died before we came into existence, and that we, too, are destined to die. This deep temporal perception is unique in Nature. We engage in mental time travel, imagining possible future outcomes, or traveling backward in time to re-create how something in the present came to be this way and not that. The richer our memories, the greater our imaginative capacities. And our destiny as problem solvers lies in this conjectural thinking.

  As we consider memory in the digital age, we will see how our personal memory is enhanced, and at times compromised, by the prodigious capacities and instantaneous gratifications of electronic information. We will also look at memory “at scale,” as computer engineers would say: the collective memory of humanity as it has grown and at times shrunk over thousands of years. Collective memory—the full scope of human learning, a shared body of knowledge and know-how to which each of us contributes and from which each of us draws sustenance—is the creation of multiple generations across vastly diverse cultures. We will see how single acts of learning can be shared and accumulate over time to become knowledge, how people have worked to secure that knowledge and share it across generations, and how each generation builds upon this foundation. Digital networks make our collective memory accessible across political and linguistic boundaries. Everyone with access to the Internet can turn personal memory and learning into shared knowledge, ensuring that the collective memory of humanity continues to be culturally diverse as it grows exponentially.

  The past as well as the future of this collective memory is being fundamentally reshaped by digital technology. How do we guarantee that this uncontrolled experiment in human memory will turn out well? There are no guarantees, of course. But what happens is in our hands. We face critical decisions as a society and as individuals about how to rebuild memory systems and practices to suit an economy of information abundance. It is rare that any generation is called upon to shape so much of the world that future generations will inherit. It is my goal to deepen our understanding of memory’s role in creating the future and to expand the imaginative possibilities for rebuilding memory systems for the digital age.

  This is a little book about a big idea. It is not a book of predictions, for the future is unknowable. Nor is it a comprehensive history or analysis of cultural and biological memory. Instead, it is an exploration into new territories of memory, past and future. Like all travelers making their way in terra incognita, we have to take strategic detours in order to stay headed in the same direction. We must, with regret, sidestep some very interesting but ultimately distracting diversions along the way. For those who want to explore a topic we consider too briefly for their taste, I have included a list of the key resources I used in my research, as well as pointers to specific sources and additional commentary in the notes.

  Our journey begins by looking behind us, into the deep past of human memory, to learn how we arrived at this pass. Along the way, we spend time at key inflection points with individuals whose stories exemplify how changing ideas about memory interact with technologies for sharing knowledge to expand human potentials. We survey the biology of memory, a field still in its infancy, to gain insights into the brain’s natural filtering systems that capture what is valuable and dump the rest. We end in the present day to consider the personal, social, and cultural choices we face as we work to master the abundance of memory in the digital age.

  CHAPTER TWO

  HOW CURIOSITY CREATED CULTURE

  Thoughts in the concrete are made of the same stuff as things are.

  —WILLIAM JAMES, “DOES ‘CONSCIOUSNESS’ EXIST?” 1904

  We do not know where our ability to experience different tenses of time comes from, or why we alone see time, past and present, in increments of weeks, years, centuries, and even longer spans. Because we are the sole surviving branch of the genus Homo, it has proven hard to trace how our ability to engage in mental time travel arose and why. We routinely and consciously make physical records of our knowledge to distribute to distant times and faraway places inhabited by people we will never meet. Through culture, a collective form of memory, we create a shared view of the past that unites us into communities and allows large-scale cooperation among perfect strangers. All the evidence we have tells us that we alone have figured out how to do this. For all we know, this may be why we survived and our closest cousins, the Neanderthals and Denisovans, did not.

  New techniques in DNA extraction and analysis allow us to compare our genome to those of our close cousins and see how their biological inheritance mirrors ours in most details. Neanderthals were physically more powerful than our ancestors when we came out of Africa and moved into their territory. Neanderthals had been living in Eurasia for tens of thousands of years. They were better adapted to the cold, bigger in all dimensions including the brain. They may have shared some distinctively human behaviors that are now ours alone: controlling fire, crafting sophisticated tools, decorating their bodies, and even burying their dead. But Neanderthals did not band together in large groups to cooperate. They did not exchange goods across long distances, as early humans did. Nor did they create durable objects for the purposes of remembering and sharing information.

  Our ancestors, on the other hand, communicated through language and gesture, song and dance. They led rich interior lives, creating ideas, images, and wholly imagined realities that shaped their interior worlds just as deftly as their tools made clothes to keep them warm and huts to shelter them. At some point in our past, we began to mark the passage of time. We started to see the relationship between action and reaction, cause and effect. We began to travel backward in time to understand causation, and forward in time to make predictions. We know that we did these things because we left material evidence of our thoughts. We began to think with things.

  Over forty thousand years ago, Homo sapiens turned mind into matter by creating physical records of their mental lives. Strikingly realistic and profoundly sophisticated renderings of beasts and birds survive on the walls of caves in France, Spain, Italy, Germany, the steppes of Russia, and Indonesia. Objects that are less realistic but still recognizably human are found at related sites. These objects range from large caches of beads made from shell, bone, metals, and ivory to fertility fetishes and even musical instruments, such as a flutelike one fashioned from a deer’s bone. Items found at some sites bear traces of being manufactured far away, evidence that we were making and trading objects—along with women and livestock—at least seventy thousand years ago.

  We can only conjecture what these images and objects meant and how they were used. But one thing is beyond doubt: whoever made and used them was human. They even signed some of their paintings, leaving handprints silhouetted in red and black tints. We recognize ourselves in the irrational yet compelling desire to breach the limits of time and space, to bear witness to our existence, and to speak to beings distant in time and space. We look at these images in the caves as a window into the past. What we find is a mirror.

  Some scientists believe these paintings were made to invoke the spirits of hunting or fertility. Others argue that the caves were sites of shamanistic rituals. A third group thinks that the caves represent nothing more or less than Paleolithic graffiti. We cannot read the code in which they are written because we have no context for interpretation. In the absence of evidence that can be interpreted unambiguously, we continue to test our conjectures against any new evidence that might come in, sketchy as it may be. We demand that our knowledge be based squarely on the evidence at hand. This is a very modern mindset, this insistence on verifiable truths. Our ancestors were more direct in their explanation of how we came to be separate from other creatures: We asked one too many questions.

  CREATION STORIES

  How did memory grow from a feature common to invertebrates like snails and jellyfish to the elaborate cocoon of human culture? The unique mental capacities that make possible not just learning but also teaching and the wholesale accumulation of knowledge over millennia require self-awareness, symbolic thought, and language. The eerie certainty we have of existing as separate creatures in a world full of things that are not us, our ability to create abstract symbolic representations of our mental states and use language to communicate these interior states of mind—all this had to be in place before we could keep knowledge from dying with the person who possesses it.

  Creation myths usually feature people who are dangerously curious. The generations of dangerously curious men we honor as the founders of modern science were raised on the biblical tale of origins, a calamitous account of curiosity and its consequences. Roger Bacon and Galileo Galilei, Francis Bacon and Isaac Newton, and Charles Darwin and Albert Einstein all grew up in a culture that instructed them that reverence must always trump knowledge. When Adam and Eve rashly found out for themselves what the fruit of the tree of knowledge of good and evil tasted like, the resulting breach of the natural order divided all time forever into Before and After. Having succumbed to curiosity, they sealed our fate.

  In the Garden of Eden, there was neither toil nor trouble. All was provided for the well-being of God’s creations. The Lord God instructed his special favorite, man, that “you may freely eat of every tree of the garden; but of the tree of the knowledge of good and evil you shall not eat, for in the day that you eat of it you shall die.” Despite this prohibition and the explicit warning of deadly consequences, Adam and Eve had their apple. They chose knowledge over reverence, and at that moment, history began. They were expelled from the Garden of Easy Living, permanently alienated from the natural, un-self-conscious state they were born into. The moral of the story is that in Paradise, there is no curiosity.

  From that time on, we have been condemned to live by our wits. Culture is the collective wit by which we live. Having access to a collective memory that accumulates over time dramatically lowers the cost of our innate curiosity by providing a richer and more diverse set of possible answers to our endless questions. The story of Adam and Eve provides a wonderful thought experiment for us to learn exactly how all-encompassing and cocoon-like culture is. In the starkest terms, Adam and Eve present a picture of humans who are all Nature and no nurture. Try to imagine their plight, suddenly cut off from the source of everything they need to live, thrown out on their own, having to fend for themselves with a newfound awareness of their own frailty and mortality. Without the vast legacy of know-how we take for granted as our birthright, they were forced to start from scratch and learn everything—how to find food, how to clothe and shelter themselves, how to bear children, how to raise them, how to die. Pain came as a surprise, and so did pleasure. They were like children—except they had no parents to turn to for explanations about what they were feeling.

  Trying to imagine every aspect of life that Adam and Eve had to invent completely on their own makes clear how impossible it is to imagine being human in such circumstances. This premise has been the inspiration for enchanting fictions about the human condition, from Robinson Crusoe to Tarzan of the Apes. And in the midst of a new wave of apocalyptic thinking sweeping through culture at the beginning of the third millennium, some people are building libraries from which we could “restart civilization” in the event of the catastrophe they fear awaits us. These libraries are thought experiments too, ways of speculating about what we need to save and what we can afford to lose.

  To be fair, Genesis conveys a contradictory message. Prior to having tasted of the fruit, Adam and Eve could not be said to be making a conscious choice to commit a sin. They were natural creatures, lacking any self-consciousness, ignorant of good and evil. They had no idea what death was. It did not exist in their experience, so when God said if they ate the fruit they would die, they may have had no idea what he was talking about. Their lack of experience is the very essence of their innocence. On the other hand, Adam and Eve were clearly endowed by God with their fateful curiosity—all that exists, exists by the grace and power of the Creator. So they may be guilty, but they are not ultimately responsible.

  Whether Galileo or Darwin believed all or none of the story, it was the model of historical thinking that they learned as children and, as we will see, that tale had a powerful effect on how memory and knowledge were imagined in the West. Today the pursuit of curiosity for its own sake is hailed as the bedrock of our culture of knowledge. Through a curious interbreeding of biblical theology and Greek thought, the West gradually stopped seeing knowledge as a threat to reverence and instead began to cherish it as a form of reverence in itself. The story of how this change occurred unfolds over thousands of years and is full of twists, turns, and many dead ends. But it begins, humbly and inevitably, with accounting.

  WHY THE ACCOUNTANTS INVENTED WRITING

  Well before 3000 B.C., people were clustering into densely populated urban centers in Mesopotamia, today’s Iraq (perhaps the historical site of the mythical Eden). One group, living in a proto-city in the region we know as Sumer, began to write receipts when they were conducting small trades. These receipts were not made of paper, which had not been invented yet. Instead, people would recruit someone skilled in the art of inscription to incise certain symbols, understood by both parties to the trade, on a cylindrical clay token, thereby creating an object as witness and warrant of the transaction. These tokens are known today as cuneiforms, from the Latin for wedge (cuneus), the shape a scribe’s stylus would make in the clay.

  Over time, these tokens became larger, flatter tablets of either dried or fired clay. The wedges developed into a vocabulary of words, objects, names, numbers, and actions. Even as it evolved over thousands of years in the Middle East, long after the Sumerians passed into history, writing remained intrinsic to the process of accounting for goods and services and making public—publishing—that information. Recorded information lasts only as long as the medium on which it lives. The more durable and secure the carrier of the information, the more durable and secure the information itself. There may have been forms of inscription that preceded the cuneiforms. But if so, they were on more fragile media and have left no trace. Clay, on the other hand, stabilized that information literally for millennia.

  The Sumerians are credited with inventing writing, and they no doubt deserve a special place of honor in our memory. But cuneiforms represent a more powerful innovation in the deep history of memory than a technical solution. This innovation made the invention of writing not only possible but almost inevitable. It led to the creation of objects as evidence, capable of transcending the frailty of human memory and, by holding people accountable, thwarting the temptation to shade the truth. The cuneiforms are objective witnesses that cannot lie or forget. These truth tokens mitigated the risk of exchanging goods and services with strangers. They were warrants of trust that were hard to tamper with. No one could alter what was written in clay without making a mess of the tablet. The invention of physical objects that provided information—which if not fully unimpeachable was certainly far more reliable than human testimony—opened up a world of possibilities that lay beyond the walls of the city. But the need came first, then the invention.

  The earliest surviving cuneiforms were created before 3300 B.C., and soon—in Sumerian if not Internet time—an information culture centered on these objects began to flourish. The oldest tablets comprise inventories of goods such as grains, oils, textiles, and livestock. But like any good invention, writing began to reveal its latent powers with use. People turned to writing for much more than accounting. Succeeding royal governments created the full array of documents familiar today—treaties, laws and decrees, and records of military prowess. All these were created with clear public purposes: to manage economic assets, secure control over a ruler’s possessions, and extol his power.

 
