A key to these advances is an invention that transformed our capacity to both accurately create and effectively share mental scenarios: writing.
Current evidence suggests that the first script was not devised, as one might have expected, by poets, philosophers, or historians but by accountants in Near Eastern farming communities.1 At the end of the last ice age some tribes began to abandon a hunter-gatherer lifestyle in favor of a sedentary agricultural existence, which enabled rapid population growth and was a catalyst for development.2 Selectively planting the seeds of wild plants with desirable characteristics led to the domestication of high carbohydrate cereals such as wheat and barley, which are also easy to store. Wild herding animals, such as sheep and goats, were penned, and selective culling and breeding drove their domestication. Farming required planning, cooperation, and work, but it also produced surplus that allowed some people to increasingly focus on activities other than obtaining food. Through the distribution of grains, meat, and other goods, people could make a living in trade, building, prostitution, security, or administration. The first cities and temples were built; increasingly complex civilizations, including their languages, animals, and crops, started to spread. Economic activity expanded, and as early as 9,500 years ago, clay tokens were employed to bolster human memory. Different shapes, such as a cone or a cylinder, represented different units of merchandise, such as a measure of grain or an animal. A thousand years later, seals emerged and were used to identify the people responsible for the goods represented by tokens, opening up a brand-new world of accounting. Farmers, temple administrators, and traders could keep track of their transactions, including debts and pledges.
About five and a half thousand years ago Sumerian accountants used sealed, hollow clay balls to envelop tokens for longer periods. Once sealed, the content of the container could only be checked by breaking it, so someone clever came up with the idea of impressing the tokens on the outside of the wet clay container before putting them inside. Six impressions on the outside meant six corresponding tokens on the inside. It did not take long for people to realize there was no longer any point in having the tokens on the inside. Subsequently, clay containers were replaced by impressed clay tablets, and the technique became widespread in Syria and Mesopotamia. Accountants then began to add pictographic symbols that were traced rather than impressed, such as small marks for counting or a simple picture of an ear of barley to represent barley. Signs were increasingly simplified, and the first cuneiform script, involving various arrangements of a basic wedge (cuneus in Latin), emerged over five thousand years ago. Eventually, scribes began to use writing to record things other than accounting. The rest is history—literally.3
Writing has allowed us to connect our minds and accumulate knowledge across time and space like never before. Early inscribed statues from the city of Ur, for instance, record the name, title, and ancestry of a king, as well as what palaces and temples he built and what lands he acquired. Writing was used to record creation myths, laws, prayers, calendars, teachings, and funerary texts. Pharaohs and kings began to send couriers to disseminate written decrees, letters, and warrants. Postal services eventually emerged in Persia, Rome, and China, allowing ordinary individuals to exchange their thoughts and experiences. Ancient Greek historians began to systematically document wars and narrate significant events. Once fixed in writing, thoughts became lasting objects.
Some written teachings came to be revered as originating from divine sources, and sacred scriptures have had an incredibly powerful and enduring influence. Most of today’s major moral traditions have roots going back to important thinkers some 2,500 years ago. During this period the Buddha offered his philosophy in India, Confucius in China, Socrates in Greece, and, perhaps a bit earlier, Zoroaster (Zarathustra) in Persia. It may even have been possible for a single person, such as the fictional grandson of Zoroaster in Gore Vidal’s historical novel Creation, to have lived long enough to speak to all of these great men. Why did these influential moral traditions emerge almost simultaneously? The reason may be less a coincidence of moral insight (or divine communication) than a function of the spreading influence of writing. Written down, moral teachings became standardized and so could proliferate like no oral traditions had done. Social norms became written laws. Even if interpretations of, say, the Old Testament differ widely, people can always return to the source.
Written words invite critical reflection, focused debate, and commentary. Readers at any time and place can assess and build on others’ ideas. Aristotle’s writings, for instance, influenced much of subsequent Western philosophy. If we depended on word of mouth alone, we might only have heard fragments of his thoughts, with no way of distinguishing the original ideas from what retelling has added or subtracted. With writing, we can still “hear” the voices of the writers centuries after they have died. Their scenario-building minds can still be wired to ours (even if the information flow is unidirectional). We can learn from the dead and, in a sense, collaborate with them across time, confirming one idea, debunking another, and qualifying a third. As Carl Sagan put it: “The library connects us with the insights and knowledge, painfully extracted from Nature, of the greatest minds that ever were, with the best teachers, drawn from the entire planet and from all of our history, to instruct us without tiring, and to inspire us to make our own contribution to the collective knowledge of the human species.”
For much of history, texts were hand-copied, painted on silk, bamboo, or paper, making distribution limited and vulnerable to loss. Books were precious treasures. The great library of Alexandria, first organized by one of Aristotle’s students some 2,300 years ago, systematically attempted to gather the world’s knowledge, collecting hundreds of thousands of scrolls. It took accumulation of culture to another level. With its eventual demise much of the recorded thought of the ancient world was lost. Many documents were destroyed in a fire during Caesar’s attack in 48 BCE, but the library continued to be a hub of science for several more centuries. Its last famous librarian was the female mathematician and astronomer Hypatia, who was murdered by followers of Alexandria’s Archbishop Cyril in 415 CE. The great library waned as the dark ages beckoned.
Our drive to exchange our minds has fuelled a search for ever-more effective media. The invention of woodblock printing in China in the third century enabled more rapid copying. Europeans caught on later, with Gutenberg’s printing press appearing around 1440 and revolutionizing mass production. Within a century the number of books in Europe is thought to have multiplied from tens of thousands to tens of millions. Written accounts and ideas became accessible to ever-larger audiences around the world. The printing of newspapers from the seventeenth century onwards attuned many people’s minds to the same scenarios we call “current affairs.”
Printing helped people focus their collective efforts on understanding nature. Books and journals created the opportunity for an intellectual interchange that culminated in the Enlightenment: the age of reason. The first journal fully devoted to science, Philosophical Transactions of the Royal Society, was founded in 1665 and still appears regularly to this day. Researchers such as Isaac Newton reported their discoveries through the journal, and a hundred years later, Wilhelm Herschel would use it to tell the world about his profound astronomical observations. Without printing, the inductive scientific approach his son John Herschel advocated would not have gathered the momentum it did. With printing, scientists could efficiently share data, compare hypotheses, and systematically put them to the test. Findings could be rapidly and reliably disseminated, ratcheting up human knowledge dramatically. Numerous technological breakthroughs followed, as scientists and engineers solved problems and informed each other about exciting new opportunities—some, in turn, with far-reaching consequences on how we exchange our minds.
Postal services had long enabled remote communication, but the invention of the telegraph in the nineteenth century made long-distance mental connections instant. At the same time, motorized transportation opened up new opportunities for visiting family and friends who lived far away and so enabled us to more frequently indulge our urge to link up. We came to increasingly rely on various media to satisfy this drive. Telephone, radio, television, fax, and email are more recent technological advances that have enhanced our capacity to connect our minds across time and space.
The rise of the internet and satellite networks makes it possible to hook up virtually any human mind with any other anywhere (feel free to insert sarcastic remark about your local service provider here, if you like). The web offers access to others’ writing from across the globe. Social media such as Facebook and Twitter occupy an important place in many people’s daily lives, and by the time you read this there will be yet other ways of communicating. It may seem a mystery to older generations that these computer-mediated interactions are so extraordinarily popular, but they are logical extensions of a long historical trend. These media enable us to satisfy our urge to connect and inform each other in an instant, wherever we are, about any idea or whim we care to share.
We increasingly collaborate economically, politically, and intellectually within a range of global networks. More and more people use these technologies to discuss, complain, or gossip and to instigate and coordinate cooperative projects. Many a quirky idea or hobby that may have wilted in solitude in the past can flourish with input of like-minded people from elsewhere. Progress in science and technology has been extraordinary as a result of the rapid exchange of research reports through electronic journals and, increasingly, through open-access outlets that enable anyone with an internet connection to search and read the latest findings. Within a few generations our understanding of the nature of our world, and our place within it, has changed dramatically.
THE TEMPLE OF APOLLO AT Delphi is said to have been inscribed, “Know thyself.” When Linnaeus published his famous classification of all living things, Systema Naturae, he included humans but did not provide a taxonomic description. Instead he simply wrote, “Nosce te ipsum,” the Latin version of that same ancient phrase.4
It was long thought that our apparently unique position on Earth is miraculous. The knowledge accrued over the last couple of hundred years of systematic scientific inquiry has provided a different perspective on humanity. Only recently did we discover that other hominins used to share the planet with our ancestors. Now we know that the vast gap between humans and other animals is in part due to the disappearance of our most closely related species. By developing a perspective that encompasses hundreds, thousands, even millions of years, we are gaining a very different view on who we are. Our ancestors shaped much of this world, burning forests, draining swamps, and domesticating some species while wiping out others. With industrialization our forces grew monumentally, and it may only be our generation that is waking up to the fact we are rapidly altering this planet—potentially undermining our own future prosperity. Only with a clearer understanding of where we come from and who we are can we gain a better view of where we are heading.
Oracles, prophets, and diviners have long been in the business of telling the future. Like inventions in science fiction that inspire engineers, prophecies can guide people’s actions in foreseeable ways. Indeed, in extreme cases, prophets of doom have led to mass suicide and utopian visions to revolution. Science has not only brought us a more systematic route to explanations but given us new tools for reliably forecasting the future—and shaping it.
We have begun to record changes systematically, giving us databases from which to make models and predictions. Accounting of past crop yields and related variables such as rainfall or temperature allows us to extrapolate future yields. Unique events, such as new inventions or the consequences of the introduction of a new species to an ecosystem, remain difficult to predict, but more continuous change is easily plotted. Statistical models link numerous variables and enable complex extrapolations in which we can even quantify the errors we are likely to make. We are increasingly able to predict what we care about, such as how long we are likely to live, when we will run out of certain resources, and what consequences our activities have on fauna and flora. Environmental impact studies are now commonplace. Computer simulations can be used to create scenarios of what will happen if we continue down one path and what is likely to happen if we change one parameter or another. We can compare alternative future situations in terms of possibility, probability, and desirability.
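The kind of extrapolation described above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, with invented rainfall and yield figures, of fitting a least-squares line to past records and using it both to predict a future value and to quantify the typical error of the model on the data at hand; real forecasting models involve many more variables and more careful error estimation.

```python
# Hypothetical illustration: extrapolating crop yield from rainfall
# with a simple least-squares line fit. All numbers are invented.

def fit_line(xs, ys):
    """Return slope a and intercept b of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Invented historical records: annual rainfall (mm) and yield (tonnes/hectare).
rainfall = [300, 350, 400, 450, 500]
yields = [1.2, 1.5, 1.9, 2.2, 2.6]

a, b = fit_line(rainfall, yields)

def predict(mm):
    """Extrapolated yield for a given rainfall."""
    return a * mm + b

# Quantify the error we are likely to make: root-mean-square residual
# of the model on the observed data.
rmse = (sum((predict(x) - y) ** 2
            for x, y in zip(rainfall, yields)) / len(rainfall)) ** 0.5

print(f"predicted yield at 550 mm: {predict(550):.2f} t/ha "
      f"(typical error {rmse:.3f} t/ha)")
```

As the text notes, this works for continuous change; a one-off event such as a new invention has no past trend line to extrapolate from.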
On an ecological macro level, current models predict profound changes in climate, atmosphere, and the oceans. The rapid decline of forest habitats, biodiversity, and resources such as oil or fish, as well as accumulating waste and pollution, are now widely recognized as global problems with significant future consequences. What can we do to avert disasters and create a sustainable future? It may turn out to be most fortunate that as we seem to be reaching the breaking point on many fronts, we are beginning to exchange our minds globally and are becoming self-aware of our interconnected fate.
Our future depends on how accurately we can build scenarios of the future and how well we can link our minds to cooperatively tackle global problems—abilities that set us apart from other animals (and that, arguably, got us into this mess). We face colossal challenges. The buck stops with us. There is no sign one of the other creatures on Earth will jump into the fray and sort things out. We are the only species capable of launching strategic cooperative efforts designed to address these challenges.
I HAVE REVIEWED HERE CURRENT evidence on the nature and origin of what makes us human. The data led me to propose that the peculiarity of the human mind primarily stands on two legs: our open-ended capacity to create nested mental scenarios and our deep-seated drive to connect to other scenario-building minds. These traits have had dramatic consequences for the way we communicate, our access to past and future, our understanding of and cooperation with others, and our intelligence, culture, and morality. We have managed to create a fast and efficient cultural inheritance system through which human groups have accumulated novel powers that ultimately allowed us to dominate much of the planet.
Of course this analysis is far from the last word on the gap. The key to Herschel’s scientific method is to continue to test hypotheses by further observation and experimentation. There are at least two sides to this in the present case. On one side, the claim that nonhuman animals do not have these capacities requires further scrutiny. We need to establish more precisely the competences and limits of animals on some of the key components, such as working memory. If future research demonstrates, say, that an orangutan can compute certain problems recursively, then this will falsify part of the hypothesis.5
On the other side, we need to further examine if there are other factors that I overlooked that are critical to human uniqueness. Though we have seen the central relevance of the two legs (and the interplay of their consequences) in the six domains reviewed, more systematic work is required to examine if this is also the case in other domains. Have a fresh look at your own intuitions about what sets human minds apart. Do the two factors ultimately play a key role in what is unique about the traits you consider?6
Some of my deductions will be proven wrong. New evidence will challenge results that inform the current investigation. Romantics and killjoys will continue to quibble, and new middle ground will be charted. As scientific progress accelerates, our understanding of the gap will become increasingly refined. I have no doubt that genetics, neuroscience, comparative psychology, and paleoanthropology have more surprises in store.7 Having said that, the current picture of the gap is clearer than we could reasonably have expected even a few years ago. In this book I have offered a snapshot of this picture. I hope that you have enjoyed the show and that your own scenario-building mind has been inspired to ponder this mental fodder some more. Whether much of my interpretation turns out to be correct or not, I hope to have convinced you that there is a need to systematically research these questions in a science of the gap.
The book’s focus on what sets us apart from other animals should not distract from the long list of traits we share with other animals. Our minds are wired in ancient ways. Many of our fundamental cognitive processes, emotions, and desires are not unique. For instance, frustrated humans can work themselves into a rage, just as chimpanzees do. People can become agitated, “flip,” or go “ape shit”—although we generally try to rein in such outbursts. We punish violations of our social norms as we try to uphold a polite and civil society; our culture and morality help cultivate less aggressive and more socially compliant behavior. Still, our primate heritage cannot be denied:
Darwinian man, though well behaved, at best is only a monkey shaved.
—GILBERT AND SULLIVAN
Reminders of our animal nature are a counterweight to the common view that humans are separate from the natural world, but they should not obscure the fact that we are peculiar indeed. There is no point belittling the extraordinary powers that separate us from other animals nor denying that we are a primate. It is time we established a more balanced view that acknowledges both the similarities and the differences between animals and humans. This may require letting go of some long-cherished notions of self-importance, but it should in no way diminish our sense of wonder about our peculiar existence. Know thyself.
A MORE PRECISE ANALYSIS OF the gap has some applied benefits we can throw into the bargain. Researchers often study mice, rats, and other animals in an effort to understand the genetic or neurological bases of human mental functions and associated disorders, but this only makes sense when these animals have some rudiments of the human trait. For characteristics that turn out to be entirely unique to humans, the use of animal models is, for the most part, profoundly misguided. A clearer understanding of the gap will give us a better framework for deciding when such research is not promising. It may save laboratory animals’ lives and help researchers avoid many a garden path.