
Lone Survivors


by Chris Stringer


  This brings us to the critical question of religion and belief systems, to which rituals are often closely linked. It seems likely that a sense of guilt for social infringements (for example, stealing from a neighbor, hitting a defenseless person who had done no harm) had evolved in early humans, since what appears to be a sense of shame can be programmed into social animals such as dogs and some primates. But only humans have a sense of sin—an infraction not against a person but against a divinely sanctioned law. The law in question may relate to hurting another (for example, adultery or murder) or infringing a religiously enforced code of behavior (for example, combing the hair during a thunderstorm for the Semang peoples of Malaysia or eating pork).

  So what could have begun this process of separation from the natural world, and the belief in the supernatural? In The Descent of Man, Darwin discussed how his dog barked every time the wind caught a parasol, perhaps because in its confusion it imagined there must be an agent (invisible to the dog) that was causing the movement. Darwin added that such imaginings could have been the source of an early belief in spiritual agencies. Thus the mind-reading abilities we discussed earlier, combined with the human understanding of cause and effect, so essential for activities like toolmaking and hunting, might lie at the root of spiritual beliefs, as argued by both Robin Dunbar and the anatomist Lewis Wolpert. Unexplained phenomena such as lightning, environmental crises, and human illness must have causes, so perhaps invisible spirit forces were at work—as Darwin put it, ones with “the same passions, the same love of vengeance or simplest form of justice, and the same affections they themselves experienced.” In particular, once self-awareness had evolved, belief in an afterlife could soon have followed, allowing the mystery of death to be addressed and dealt with—the essence of those who had loved us and looked after us during our lives would surely live on to look after us after they died.

  I mentioned shamanism earlier in relation to Lewis-Williams’s interpretations of European cave art, and his suggestion that modern humans may be the only species that can remember its dreams, hence providing the imaginative basis for spirit worlds to which humans may gain privileged access. He and others have argued that shamanism is an ancient form of religion, perhaps the very oldest, with an antiquity going back to the African Middle Stone Age at least. In both San and Paleolithic art there are representations of therianthropes (human–animal chimeras—the centaurs of Greek mythology, for example), and in recent depictions these often relate to “soul flights,” where the shaman’s soul leaves the body during a trance and merges with or is possessed by a spiritually powerful animal. The trances may be brought on by repetitive chanting, dancing, or drumming, by sensory deprivation or sensory overload—for example, through eating, drinking, or smoking hallucinogenic plant compounds. In evolutionary terms, the benefits to the shaman may be obvious—high status and possibly privileged access to group resources or sexual partners—but what are the advantages to the group and the other individuals within it? This brings us to the tricky question of why spiritual beliefs evolved in the first place, and why they seem to have such a hold on humanity, despite occasional and largely unsuccessful attempts to cast them off.

  For some, religious beliefs are a pathology—a mass delusion—or they are akin to a virus that perpetuates itself via information imprinted by adults on impressionable young minds. Others argue that spiritual beliefs evolved because they were useful to those who possessed them, conferring survival benefits on those individuals and their close relatives. Data show that human feelings such as depression, pessimism, and anxiety are handicaps to health and longevity, so religious beliefs that alleviated those “symptoms” could certainly have been favored. Humans do seem to be preprogrammed for religious beliefs, readily taking these on board, however irrational they may seem to nonbelievers or those of different faiths—and this seems to be as true for adult converts as it is for religiously groomed children. There is disputed evidence that people with strong religious convictions tend to be healthier, live longer, have more surviving children, and are even somewhat wealthier than nonbelievers. If that was true in the past, selection would have favored those with religious beliefs, as long as the benefits outweighed the costs. (Religions or sects that demanded complete sexual abstinence or castration of male followers have understandably not thrived!)

  As an example of the social benefits that may have applied in the past, we can return to shamans in tribal societies, who act as spiritual emissaries and appear to have functioned successfully as healers, fortune-tellers, peacemakers, and interlocutors with the world of spirits and ancestors. They may benefit personally through their perceived powers, of course, but they can also act as social enforcers, discouraging aberrant behavior or prophesying in order to lead their groups in new directions. And when we compare recent human societies of all types, there seems to be an association between larger group sizes and the prevalence of morally concerned gods, which could again aid social cohesion and conformity to social norms. Furthermore, modern psychological experiments have shown that religious beliefs can breed selfless behavior (and hence social reputation), discourage “freeloaders,” and encourage mutual trust.

  From the evidence of burials and symbolic objects, rituals and religious beliefs probably go back more than 100,000 years, but could they actually have been central to the origins of modern humans? A British anthropologist, Chris Knight, certainly thinks so, and in a wide-ranging synthesis of data from present-day anthropology, primatology, and sociobiology, together with archaeology, he and his collaborators argued that women collectively produced a social revolution in Africa over 100,000 years ago. The symbolic use of red ocher began as part of a female response to accumulating social and reproductive stresses caused by the increasing demands of pregnancy, infant and child care, and the need for male provisioning. The bloodred pigment was deployed by menstruating and nonmenstruating women, smeared on their bodies to spread the taboo of menstruation across alliances of female kin. This instituted a “sex strike,” which could only be broken when the men returned from collaborative hunts with food to share. Female rituals evolved around the sex strike, male rituals around the hunt (begun under a dark moon, returning at full moon, thus linking menstrual and lunar cycles and the blood of women and of animals), and tribal rituals of celebration and feasting would follow the return of the successful hunters.

  I think these ideas are ingenious, and I do believe that human behavior changed in revolutionary ways during the Middle Stone Age, to trigger our expansions within and then outside of Africa. However, I don’t think that Chris Knight’s views provide the correct explanation or even the correct kind of explanation. This is because I no longer think that there is a single “right” answer to the question of our behavioral origins. What we have seen so far is that there are many interconnected strands to modern human behavior, ranging from our enhanced mind-reading talents, symbolism, and artistic and musical expression to rituals and religion. And, as I will discuss next, we have complex survival mechanisms, which are fueled by our language abilities.

  6

  Behaving in a Modern Way: Technology and Lifeways

  Eight years before quarrymen came across a strange skeleton in the Neander Valley (Neander Thal) in Germany, which gave its name to that whole ancient population, something similar happened in Gibraltar, though the result was rather different. There, the skull of a Neanderthal woman was discovered in 1848, but was then left unrecognized on a museum shelf for the next fifteen years rather than studied and published. So today we talk about Neanderthal man and Homo neanderthalensis, rather than Gibraltar Man (or Woman) and “Homo calpicus” (a name based on an ancient name for the Rock, suggested by the paleontologist Hugh Falconer in a letter to George Busk in 1863 but never properly published). By then the Gibraltar fossil had made its way to London, and it now resides in a metal safe outside my room. Unfortunately that fossil was blasted from its quarry and no other bones, tools, or associated materials were recovered, though they must have been there. So in 1994 I jumped at the chance of excavating more caves in Gibraltar with a team including Oxford archaeologist Nick Barton and Clive Finlayson from the Gibraltar Museum. Although we never managed to add to the total of two Neanderthal fossils from this tiny pinnacle of limestone, we did find a wealth of evidence of the way of life of these ancient Europeans. It included tools, hearths, food debris, and some of the best evidence yet discovered that the Neanderthals shared a fundamental behavioral feature with us—exploitation of marine resources such as shellfish and seals. That work was published a few years ago, and since then even more evidence of the complexity of both Neanderthal and early modern behavior has emerged.

  Around 300,000 years ago, the more complex technologies of the Middle Paleolithic began to appear among the descendants of Homo heidelbergensis in Africa (Homo sapiens) and western Eurasia (Neanderthals). Techniques that required additional steps in tool manufacture became widespread across Africa and western Eurasia, and the first truly composite tools were invented, ones which must have been mounted in or on wooden handles. The wooden handles or shafts almost invariably perished, but traces of adhesive are present on European, western Asian, and African tools. Those used in the African Middle Stone Age were often mixtures of plant gum and red ocher, and the artisans were able to effect changes in their properties through heating and variations in moisture and acidity, implying a high level of knowledge, planning, and thought. Further evidence of these abilities recently emerged from observations of Middle Stone Age tools and from modern experiments.

  Archaeologists such as Kyle Brown and Curtis Marean found that they could not match the appearance and quality of the many tools they were excavating from the Pinnacle Point Caves in South Africa in any of the local sources of the silcrete rock from which they had been made. They finally discovered that the glossier, darker sheen and finer flaking of those tools appeared only when the silcrete had been pretreated: buried under a hearth kept burning for many hours at a high temperature, and then left to cool slowly. Such engineering allowed the removal of longer, shallower flakes and more control over the final shape and cutting edges, and its use on the ancient tools was further demonstrated through physical tests of their fabric, showing that they had indeed been subjected to prolonged heating. Given the systematic and widespread application of such processes on the Pinnacle Point tools, the results could not have been produced by the tools being left accidentally near a hearth made for other reasons. Not only did this skillful pretreatment lead to tools that looked and performed better, but by improving the quality of the local raw materials, it gave these ancient inhabitants of the southern African coast more options in their choices of rock sources for their tools. This was an essential prerequisite in decisions about where to live, and a sign of their increasing ability to shape their local environment—a key factor in the development of our modern human capacity to adapt to almost any place on Earth.

  Fire has, of course, been a vital aid in human survival for at least 800,000 years (based on evidence for hearths at the Israeli handaxe site of Gesher Benot Ya’aqov) and possibly for much longer. As Darwin argued in The Descent of Man (1871), “The art of making fire … is probably the greatest [discovery], excepting language, ever made by man.” It provided warmth and protection from predators, illumination to extend “daylight,” and a new social focus as people sat to talk, sleep, or work (and later to sing and dance) around the flickering flames. But the anthropologist Richard Wrangham argued that it had an equally important role in shaping our evolution through the introduction of cooking. In most cases, cooking reduces the time and energy needed to chew and digest foods, although heat also reduces their vitamin content, and nutrients are lost in the fat and water that are driven off. The process not only helped to provide a broader diet and more fuel for a growing and energy-sapping brain, but also reduced the effect of harmful toxins and pathogens such as parasites, bacteria, and viruses that are present in many raw foods. And by adding food to the flames, cooking provided an extra social focus for fire, in that individuals could cook for each other, for partners, kin, friends, and honored guests. Once cooking became central to human life, it would have influenced our evolution, leading to changes in digestion, gut size and function, tooth and jaw size, and the muscles for mastication.

  So when did humans first control the use of fire, and when did cooking become important? As we discussed earlier, brain size increase and dental reduction had certainly begun in Homo erectus, were well developed by the time of heidelbergensis, and reached levels comparable to those of modern humans in the Neanderthals. There is disputed evidence for human control of fire dating back to about 1.6 million years in Africa, and stronger support for its presence at about 800,000 years in Israel, and in Britain by about 400,000 years (the site of Beeches Pit in Suffolk). However, the majority of early human sites at this time lack such evidence, which perhaps indicates that the use of fire was not yet ubiquitous among early humans. Within the last 200,000 years, by contrast, there are many Neanderthal and early modern sites with accumulations of hearths, but, interestingly, the associated food debris does not always show strong evidence that the meat was being cooked. For example, at Neanderthal sites I have been involved in excavating in Gibraltar, it seems that the Neanderthals knew about baking mussels in the dying embers of a fire to get them to open up for consumption, but many of the animal remains around their hearths seem to have been butchered and eaten raw.

  From fragments of debris preserved in their sites and even around their teeth, we also know that Neanderthals were processing and cooking plant resources like grains and tubers. Similarly, through studies of 100,000-year-old Middle Stone Age tools from the Niassa Rift in Mozambique, Julio Mercader and his colleagues detected traces of starches from at least a dozen underground and above-ground plant foods, suggesting that the complex processing of plants, fruits, and tubers, including cooking to remove toxins, was something that had also developed in Africa, providing a vital adaptation as our species traveled around the world. Anna Revedin and her colleagues identified starch grains from wild plants on 30,000-year-old Gravettian grinding stones from sites in Italy, Russia, and the Czech Republic, apparently part of the production of flour long before the agricultural revolution. The plants included rushes and grasses that, from modern comparisons, were probably exploited at different times of the year and processed by specialized slicing tools found at the sites. Jiří Svoboda described large underground earth ovens full of hot stones that were in use about 30,000 years ago, at what is now Pavlov in the Czech Republic, to cook huge slabs of mammoth meat. They and surrounding pits, which seem to have been used to boil water with hot stones, were placed within large tents or yurts, to judge by the excavated patterns of holes in the ground. As discussed earlier, such places would have been foci for groups who cooked and ate together.

  Just as Chris Knight’s model of female bonding had menstruation at its heart, so the anthropologists James O’Connell and Kristen Hawkes argued that the collection and processing of plant resources, especially underground ones like tubers that required specialist knowledge to collect and treat, was critical in catalyzing social change in early humans. Although meat became very important, it was also an unpredictable food resource, so while hunting was left to the men, women—especially those unencumbered with children—developed and shared the skills of gathering and refining plant resources as an insurance policy. So perhaps Darwin’s 1871 suggestion that “[man] has discovered the art of making fire, by which hard and stringy roots can be rendered digestible and poisonous roots or herbs innocuous” was actually most appropriate for bands of females. In what has been called the Grandmother Hypothesis, Hawkes and O’Connell proposed that selection would also have favored experienced and postreproductive women who survived for decades after the menopause, something very rare in other primates. These women could have helped to provide for their daughters and other dependent kin, and also acted as general helpers, as many grandmothers do today, thus aiding the survival of their genes and the reinforcement of this supportive behavior.

  The anthropologist Sarah Blaffer Hrdy took this line of thought further still with the wider concept of alloparents—individuals who regularly took over the provisioning and care of infants and children from the mother. This occurs in other animals, including some primates, and Hrdy believes that the presence of large-brained and dependent infants by the time of Homo erectus meant that such supportive social behavior by older siblings and wider kin had already developed out of necessity by then. In her view, such cooperative breeding allowed children to grow up slowly and remain dependent on others for many years, which in turn permitted the evolution of even bigger-brained modern humans. If we compare hunter-gatherer populations today with, say, chimps, there is a huge difference in fecundity: the interval between births averages about seven years in apes but is only three to four years in humans. In apes, mothers do not usually welcome others carrying or even touching their young babies, whereas human mothers are very tolerant of such sharing behavior, eliciting support that Hawkes, O’Connell, and Hrdy argue is the reason humans can cope with such closely spaced births of demanding infants. And Hrdy further suggests that the immersion of human babies in a pool of alloparents would have honed the mind-reading skills and empathy that are so important to our species, faster than anything else. Whether alloparents necessarily included fathers in the past is still unclear, since in the vast majority of mammals, males have little or no specific interaction with their offspring, and the extent of their involvement in infant care also varies widely in humans today. Undoubtedly this would also have depended on the extent of specialized roles in Paleolithic societies; if men were mostly away tracking and hunting, they simply would not have been around much to take on the role of infant care.

 
