Lone Survivors

by Chris Stringer


  Nevertheless, the care that both Neanderthals and early moderns bestowed on other group members would have had both social and demographic effects, and this may provide further clues about why modern humans were ultimately the most successful of all human species. Earlier, we discussed the distinctiveness of human age profiles compared with those of apes: we have a longer period of infant dependency; we reach puberty later; we have later ages for first births but closer birth spacings; postreproductive survival in women is very common; and overall we live longer. This means that humans develop and need much longer-lasting social ties, beyond those of their immediate kin, throughout their lives. There are probably specific genetic bases for our longevity. For example, it has been suggested that unique mutations in a gene for the cholesterol-transporting apolipoprotein E occurred about 250,000 years ago. The variant ApoE3 lowers the risk of many age-related conditions such as coronary disease and Alzheimer’s, and it will be interesting to see whether this variant was also present in the Neanderthal genome.

  As discussed in chapter 3, the Neanderthals had a human rather than an apelike developmental pattern, but at the same time their lives must have been stressful. About twenty years ago, a nurse turned anthropologist, Mary Ursula Brennan, compared the pattern of growth interruptions in dental enamel formation in Neanderthals and early modern humans and found these indicators of childhood stress were much more common among the Neanderthals. In old age there are further indications of the problems that they and our African ancestors faced (again, from research using teeth—this time to assess the longevity of Neanderthals and early modern humans). While Erik Trinkaus found little difference in survivorship between archaic and modern humans, the anthropologists Rachel Caspari and Sang-Hee Lee came to different conclusions. Their studies were conducted using a technique called wear seriation, in which the degree of wear of each molar tooth is used to assess the relative age of an individual. So, for example, the age of third molar (wisdom tooth) eruption was taken to mark adulthood, and when cumulative molar tooth wear indicated an individual was about double that age, they were considered to have reached older adulthood and could potentially have been grandparents. Additionally, Caspari used microCT (see chapter 3) on some dental samples as an aging guide, since the pulp cavities of molars decrease in size through life as dentine is accumulated in them.

  Caspari and Lee carried out comparisons ranging from ancient hominins such as australopithecines through to Neanderthals and Cro-Magnons, assessing the ratios of young adults to old adults. They found that only the Cro-Magnons of Europe had a high representation of middle-aged to old individuals (about four times as many, compared with their Neanderthal predecessors in Europe, and even more distinct when compared with earlier humans and prehumans). Interestingly, the Skhul and Qafzeh early moderns were no different from the Neanderthals in their relatively low survivorship to middle and old age. This in turn suggested that cultural, social, or environmental factors—rather than biology—were probably at work in catalyzing the change in age profiles; otherwise the difference should already have been showing in the 100,000-year-old moderns from Israel. If the Cro-Magnons had had more older adults, they would have had more reproductive opportunities, packing extra children into each fertile life span, and there would have been more intergenerational overlap, allowing greater transfer of knowledge and experience down the years. In addition, some data from recent humans suggest that the frontal lobes of the brain, which are closely involved in the planning of behavior, continue their wiring-up until at least twenty-five years of age, so this is something that might only be complete in adults who survived that long. But harking back to the Grandmother Hypothesis and alloparents, these results suggest that their beneficial effects would have barely been felt in early humans, including the Neanderthals. Caspari’s study of the seventy-five or so Neanderthals from the site of Krapina in Croatia showed no individuals were likely to have been older than thirty-five at death, so there were not many grandparents around, and that would have been even worse news when so many younger parents were evidently dying before they reached thirty. Thus orphaned Neanderthals would have mainly had to rely on older siblings rather than grandparents for social support.

  It was perhaps only with the broadening of food supplies and of those involved in their collection that the change in age profiles could develop in modern humans. And something else of great importance would have been enabled by the overlap of three or four generations in the Cro-Magnons: extended kinship. An example of how important this could have been is shown by the complex kinship systems of many Australian Aboriginal groups, which determine not only where individuals are placed in society but what their duties are and how they will be treated. The system determines who can marry whom, what roles they will take in ceremonies, and how they should react to both kin and nonkin (for example, social intimacy, joking relationships, or—cue for many comedians—avoidance relationships such as between a mother-in-law and son-in-law). And when times are hard, groups may need support—or at least tolerance—from each other, such as when a water hole needs to be shared. Then it is critical for negotiators to establish if they are kin or potential enemies by tracking back their genealogies to see if they can find relatives (who may be long dead) in common, or if there is a history of unresolved disputes. All of this requires extensive records and mapping of relationships, which, in the absence of written or digital storage, is only feasible when several generations overlap, in order to provide a kind of collective memory.

  In that last example, from Australia, we see the two opposing forces of intergroup relations at work in modern humans—cooperation and conflict—and undoubtedly these have both been important in influencing recent human evolution. I have spent some time discussing the role of mutual social support within groups, but humans undoubtedly also evolved vital mechanisms to defuse potentially aggressive encounters with neighbors. These would have included intermarriage, so that potential enemies could instead become kin, and it is possible that some of the symbolism we see in the Paleolithic—whether it is strings of beads as friendly trade items or cave art intended to signal territorial boundaries—was aimed at managing external relations. The anthropologists Robin Fox and Bernard Chapais developed the argument that the exchange of mates, and in particular the exchange of women, associated with marriage, was the critical evolutionary step in the development of the kinship systems that can be found in hunter-gatherers and pastoralists around the world. Two critical building blocks in such relationships are found in primates: alliance and descent. Alliance consists of stable breeding bonds, such as a male gorilla and the several females with whom he mates. Descent consists of groups of related individuals, such as female monkeys who share a mother, who bond, and who can acquire the status of their mother and pass it on to their offspring.

  But human kinship combines both of these, since the mode of descent (traced through one of the parents) is a mechanism for the construction of alliances. So although offspring disperse, as one sex (usually women) marries outside their immediate group, they maintain their original ties of mutual descent. The change from relatively promiscuous mating to pair-bonding allowed the unique recognition in humans of fatherhood, of paternal relations, and of “in-laws,” all of which were essential building blocks in truly human kinship systems. We have little evidence of the kinship systems of the first modern humans or of the Neanderthals (although see chapter 7), but the proliferation of symbolic objects such as beads 80,000 years ago suggests to me that mate exchanges (and most commonly these are exchanges of females) were probably in place between human groups in Africa.

  However, the injuries carried by early humans, and especially the Neanderthals, show that encounters with others in the Paleolithic were not always friendly, and although there is less evidence of such wounds in early modern humans, researchers like Raymond Kelly believe that the potential for both conflict and coalitions was also a significant force in the development of modern humanity. I discussed the possibility that only modern humans had projectile weaponry in relation to the rib wound on the Shanidar 3 Neanderthal, and the emergence of “killing at a distance” would have been a threat to humans as much as to hunted prey. Male chimps form aggressive coalitions to carry out lethal raids on other troops, so it is likely that such behavior was part of our evolutionary heritage, and that tools in the form of rocks, clubs, sharp stones, or pointed sticks would soon have been recruited for defense or attack (as in one of the famous opening scenes in Stanley Kubrick’s film 2001: A Space Odyssey). As Darwin put it in 1871: “A tribe including many members who, from possessing in a high degree the spirit of patriotism, fidelity, obedience, courage, and sympathy, were always ready to aid one another, and to sacrifice themselves for the common good, would be victorious over most other tribes; and this would be natural selection.” Over the last 130 years, such views have formed the basis of ideas on “group selection” by distinguished researchers ranging from Arthur Keith and Raymond Dart to Richard Alexander and James Moore.

  But from the 1970s onward, work by biologists such as William Hamilton, Robert Trivers, and Richard Dawkins emphasized the selfishness of genes and undermined the basis of many previous formulations of group selection. Selection acts only on genes or individuals, not populations, and while altruism (selflessness) can evolve, it will only be favored in genetically closely related groups. Mathematical tests showed that group selection would fail when there was even a small amount of migration between groups, or when “cheaters” exploited the benevolence of others to propagate their own genes. However, more recently, biologists and anthropologists such as Paul Bingham and Samuel Bowles have returned to the issue by recruiting weaponry and genes to the cause of group selection. The argument goes that by coming together to use effective projectile weaponry, individuals reduced their separate risks, and thus coalitions of warriors would have been advantageous for group defense and offense. Bingham proposed that this development would also have been important within societies by deterring free riders who tried to reap the rewards of group membership without contributing their fair share of commitment to the associated costs or risks. However strong individually, they could soon be brought into line when faced with a coalition of spear-armed peers, who could act as general enforcers of within-group rules and solidarity.

  Bowles posited the idea that if Paleolithic groups were relatively inbred and genetically distinct from each other, and warfare between groups was prevalent, then group selection through collaborative defense and attack could evolve and be maintained. Without warfare, a gene with a self-sacrificial cost of only 3 percent would disappear in a few millennia, but with warfare, Bowles’s model showed that even levels of self-sacrifice of up to 13 percent could be sustained. He used archaeological data (although mainly post-Paleolithic) to argue that lethal warfare was indeed widespread in prehistory, and that altruistic group-beneficial behaviors that damaged the survival chances of individuals but improved the group’s chances of winning a conflict could emerge and even thrive by group selection. Moreover, the model could work whether the behavior in question was genetically based or was a cultural trait such as a shared belief system. As mentioned above, Bowles’s archaeological data do not come from the Paleolithic, but there is one observation that does resonate with his views: the French archaeologist Nicolas Teyssandier noted that the period of overlap of the last Neanderthals and first moderns in Europe was characterized by a profusion of different styles of stone points. This might reflect a sort of arms race to perfect the tips of spears, perhaps to hunt more efficiently, but equally this could suggest heightened intergroup conflict.

  Social relations, cooperation and conflict, food acquisition, and changing age profiles could all have been important in shaping modern humanity, but one of the markers of Homo sapiens—language—was undoubtedly a key factor. For the primatologist Jane Goodall, the lack of sophisticated spoken language was what most differentiated the chimps she studied from us. Once humans possessed this faculty, “they could discuss events that had happened in the past and make complex contingency plans for both the near and the distant future … The interaction of mind with mind broadened ideas and sharpened concepts.” Despite the rich repertoire of communication in chimps, without a humanlike language “they are trapped within themselves.”

  So how could such a critical thing as language evolve in humans, and was its evolution gradual or punctuational? Darwin certainly favored a gradual evolution, under the effects of both natural and sexual selection. He wrote in 1871:

  With respect to the origin of articulate language … I cannot doubt that language owes its origin to the imitation and modification of various natural sounds, the voices of other animals, and man’s own instinctive cries … may not some unusually wise ape-like animal have imitated the growl of a beast of prey, and thus told his fellow-monkeys the nature of the expected danger? This would have been a first step in the formation of a language.

  As the voice was used more and more, the vocal organs would have been strengthened and perfected through the principle of the inherited effects of use; and this would have reacted on the power of speech. But the relation between the continued use of language and the development of the brain has no doubt been far more important. The mental powers in some early progenitor of man must have been more highly developed than in any existing ape, before even the most imperfect form of speech could have come into use, but we may confidently believe that the continued use and advancement of this power would have reacted on the mind itself, by enabling and encouraging it to carry on long trains of thought. A complex train of thought can no more be carried on without the aid of words, whether spoken or silent, than a long calculation without the use of figures or algebra.

  In contrast to Darwin’s gradualist evolutionary views, the linguist Noam Chomsky has long argued that modern human language did not evolve through Darwinian selection; in a sense, for him, it is an all-or-nothing faculty, emanating from a specific language domain in the brain that may have appeared through a fortuitous genetic mutation. He believes that all human languages, however different they may sound at first, are structured around a universal grammar that is already present in the brain of infants and which they use intuitively to interpret and then re-create the patterns of speech presented to them by the group into which they are born. The evolutionary psychologist Steven Pinker has shared some of Chomsky’s views, in particular that there is a specific hard-wired domain for language capabilities in the brain. In his opinion, this domain generates mentalese (a term created by the cognitive scientist Jerry Fodor), an underlying and innate mental code out of which all human languages can be forged. However, Pinker parted from Chomsky in arguing that gradual genetically based change (comparable to that which eventually led to complex eyes) could have evolved the human “language organ” and its language-generating systems, in a series of evolutionary steps, with selection (either natural or sexual/cultural) favoring increased richness of expression.

  Earlier I discussed the view of the archaeologist Richard Klein that there was a punctuational origin for modern human behavior in Africa about 50,000 years ago, and, to an extent, his views can be compared with Chomsky’s. Klein critically assessed the evidence for “modern” behavior prior to 50,000 years and found it unconvincing. In his view it is only after that date that a consistent pattern of finds demonstrates the presence of things like increasing diversity and specialization in tools, undoubted art, symbolism, and ritual, expansion into more challenging environments, diversification of food resources, and relatively high population densities. As a trigger, he suggests there may have been “a fortuitous mutation that promoted the fully modern brain … the postulated genetic change at 50 ka fostered the uniquely modern ability to adapt to a wide range of natural and social circumstances with little or no physiological change.” He further speculates that this brain rewiring may have rapidly facilitated the full language capabilities of Homo sapiens, which up to then had been little different from those of earlier humans, and, as he recognizes, this is something that is very difficult to demonstrate from the fossil and archaeological record. Although I disagree with Klein about a unique “switch” that turned on modern human behavior, I agree with his views on the critical importance of language to our species.

  However, there could have been premodern languages in earlier humans and in the Neanderthals. Robin Dunbar and the anthropologist Leslie Aiello argued that human language perhaps first developed through “gossip,” as a supplement to (and eventually a replacement for) social grooming. The activity of fur grooming is performed on a one-to-one basis by many primates to help maintain their relationships and social cohesion. Dunbar and Aiello speculated that without the benefit of language, the burgeoning size of Homo erectus groups would have required individuals to expend up to half their time on individual social grooming, leaving little time for other vital activities. But by allowing groups of early humans to chatter to each other, a primitive language could have facilitated social intimacy and cohesion, freeing up time otherwise spent in grooming.

 
