
Before the Dawn: Recovering the Lost History of Our Ancestors


by Nicholas Wade


  Both Keeley and LeBlanc believe that for a variety of reasons anthropologists and their fellow archaeologists have seriously underreported the prevalence of warfare among primitive societies. “While my purpose here is not to rail against my colleagues, it is impossible to ignore the fact that academia has missed what I consider to be some of the essence of human history,” writes LeBlanc. “I realized that archaeologists of the postwar period had artificially ‘pacified the past’ and shared a pervasive bias against the possibility of prehistoric warfare,” says Keeley.

  Keeley suggests that warfare and conquest fell out of favor as subjects of academic study after Europeans’ experiences of the Nazis, who treated them, also in the name of might makes right, as badly as they were accustomed to treating their colonial subjects. Be that as it may, there does seem a certain reluctance among archaeologists to recognize the full extent of ancient warfare. Keeley reports that his grant application to study a nine-foot-deep Neolithic ditch and palisade was rejected until he changed his description of the structure from “fortification” to “enclosure.” Most archaeologists, says LeBlanc, ignored the fortifications around Mayan cities and viewed the Mayan elite as peaceful priests. But over the last 20 years Mayan records have been deciphered. Contrary to archaeologists’ wishful thinking, they show the allegedly peaceful elite was heavily into war, conquest and the sanguinary sacrifice of beaten opponents.

  Archaeologists have described caches of large round stones as being designed for use in boiling water, ignoring the commonsense possibility that they were sling stones. When spears, swords, shields, parts of a chariot and a male corpse dressed in armor emerged from a burial, archaeologists asserted that these were status symbols and not, heaven forbid, weapons for actual military use. The large number of copper and bronze axes found in Late Neolithic and Bronze Age burials were held to be not battle axes but a form of money. The spectacularly intact 5,000-year-old man discovered in a melting glacier in 1991, named Ötzi by researchers, carried just such a copper axe. He was found, Keeley writes dryly, “with one of these moneys mischievously hafted as an ax. He also had with him a dagger, a bow, and some arrows; presumably these were his small change.”

  Despite the fact that the deceased was armed to the teeth, archaeologists and anthropologists speculated that he was a shepherd who had fallen asleep and frozen peacefully to death in a sudden snowstorm, or maybe that he was a trader crossing the Alps on business. Such ideas were laid to rest when an X-ray eventually revealed an arrowhead lodged in the armed man’s shoulder. “In spite of a growing willingness among many anthropologists in recent years to accept the idea that the past was not peaceful,” LeBlanc comments, “a lingering desire to sanitize and ignore warfare still exists within the field. Naturally the public absorbs this scholarly bias, and the myth of a peaceful past continues.”

  If primitive societies of the historic past were heavily engaged in warfare, it seems quite possible that their distant ancestors were even more aggressive. A genetic discovery made as part of a study of mad cow disease lends some credence to this idea.

  The Skeleton in the Human Past

  Among the least appetizing aspects of primitive warfare is cannibalism. Cannibalism implies the existence of warfare since the victims do not voluntarily place themselves on the menu. Anthropologists and archaeologists have long resisted the idea that cannibalism took place in the peaceful past. In his 1979 book The Man-Eating Myth, William Arens, an anthropologist at the State University of New York at Stony Brook, argued that there was no well-attested case of cannibalism and that most reports of it were propaganda made by one society to establish its moral superiority to another. Christy G. Turner, an archaeologist at Arizona State University, met only disbelief when he first proposed that the cut, burned, and defleshed bones of 30 individuals at a site occupied by Anasazi Indians were the remains of an ancient cannibal feast. His critics attributed the cuts on such bones to scavenging animals, funerary practices, the roof falling in—anything but anthropophagy.

  Though some accounts of cannibalism may well have been fictive, Turner and Tim White of the University of California at Berkeley have now found cannibalized human remains at 25 sites in the American southwest. Turner believes these are the work of Anasazi Indians who dominated the area between AD 900 and 1700 and used cannibalism as an instrument of social control. Cannibalism has been reported from Central and South America, Fiji, New Zealand and Africa. The Aztecs made a state practice of sacrificing captives and their civilization has furnished a recipe for human stew. A common belief that accompanies ritual cannibalism is the notion that by eating particular parts of the victim, often a slain warrior, the consumer absorbs his strength or courage. The frequency of reports of cannibalism by societies in all regions of the world suggests, Keeley concludes, “that, while hardly the norm, ritual consumption of some part of enemy corpses was by no means rare in prestate warfare.”

  Could cannibalism in fact have been so widespread and so deeply embedded in human practice as to have left its signature in the human genome? This gruesome possibility has emerged from the work of English researchers trying to assess the likely extent of the outbreak of mad cow disease among Britons who had eaten tainted beef. Mad cow disease belongs to a group of brain-eroding pathologies caused by misshapen brain proteins known as prions. Contrary to the expectations of British agricultural officials, prions can cross species barriers; cow prions, which rot cows’ brains, can also rot human brains if the cow’s neural tissue is eaten.

  Even more effective at rotting the human brain are human prions. People are at risk of exposure to human prions when they eat other people’s brains. This was a regular practice among the Fore of New Guinea who, sometime around the year 1900, adopted the novel funerary practice of having women and children eat the brains of the dead. By about 1920, the first case of a brain-wasting disease they called kuru appeared.

  A very similar disease, called Creutzfeldt-Jakob disease or CJD, occurs at low incidence in many populations of the world. CJD arises when a spontaneous mutation causes brain cells to make the misshapen form of the protein instead of the normal one. Kuru presumably started when the brain of a deceased Fore with a natural case of CJD was eaten by his relatives.

  Once kuru got a foothold in the Fore population, the disease progressed relentlessly until some villages became almost devoid of young adult women. The epidemic quickly subsided after Australian administrative authorities banned the Fore’s mortuary feasts in the 1950s.

  A research team led by Simon Mead of University College London recently looked at the genetics of Fore women aged over 50. All these survivors had attended many funeral feasts and presumably must have possessed some genetic protection against the disease. Mead’s team analyzed the DNA of their prion protein gene and found that more than 75% had a distinctive genetic signature.186 Every person in Britain infected with mad cow disease, on the other hand, had the opposite genetic signature.

  Having identified this protective signature, Mead’s team then analyzed other populations around the world. They found that every ethnic group they looked at possessed the signature with the exception of the Japanese, who had a protective signature of their own at a different site in the gene.

  Various genetic tests showed that the protective signature was too common to have arisen by chance, and must have been amplified through natural selection. Other tests suggested the signature was very ancient and was probably present in the human population before it dispersed from Africa. Under this scenario the Japanese presumably lost the signature through the process known as genetic drift, but then evolved a new one at a different site in the gene because the protection was so necessary.
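  To make the phrase “amplified through natural selection” concrete, here is a minimal sketch in Python, assuming purely for illustration that the protective signature behaves like a variant kept common by heterozygote advantage (carriers of both versions of the gene faring best), which is one standard way selection can hold a signature at high frequency; the fitness values below are invented for the example and are not Mead’s measured parameters.

```python
# Sketch: deterministic heterozygote advantage (balancing selection).
# Fitness values are illustrative assumptions, not measured parameters.

def next_allele_frequency(p, w_aa, w_ab, w_bb):
    """One generation of viability selection on allele A at frequency p."""
    q = 1.0 - p
    mean_fitness = p * p * w_aa + 2 * p * q * w_ab + q * q * w_bb
    # Marginal fitness of allele A, divided by mean fitness of the population.
    return p * (p * w_aa + q * w_ab) / mean_fitness

def simulate(p0=0.05, generations=200, w_aa=0.7, w_ab=1.0, w_bb=0.9):
    """Track a rare allele whose heterozygous carriers fare best."""
    p = p0
    history = [p]
    for _ in range(generations):
        p = next_allele_frequency(p, w_aa, w_ab, w_bb)
        history.append(p)
    return history

if __name__ == "__main__":
    traj = simulate()
    # With heterozygotes fittest, the allele rises from 5% toward a stable
    # intermediate equilibrium instead of drifting back out of the population.
    print(f"start: {traj[0]:.3f}  after 50 generations: {traj[50]:.3f}  final: {traj[-1]:.3f}")
```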

  So why has the British epidemic of mad cow disease proved far less deadly in that nation of beef-eaters than was initially feared? It seems that Britons have been in part protected by their ancient cannibal heritage. That the British and other world populations have maintained the protective signature many generations after their last cannibal feast is an indication of how widespread cannibalism may have been in the ancestral human population and its worldwide descendants. The frequency of cannibalism in turn attests to the prevalence of warfare among the earliest human populations.

  “There is an innate predisposition to manufacture the cultural apparatus of aggression, in a way that separates the conscious mind from the raw biological processes that the genes encode,” writes the biologist Edward O. Wilson. “Culture gives a particular form to the aggression and sanctifies the uniformity of its practice by all members of the tribe.”187 The genes supply the motivation for warfare, Wilson is saying, in humans as they do in chimps, but people, blessed with the power of language, look for some objective cause of war. A society psychs itself up to go to war by agreeing that its neighbors have wronged it, whether by seizing property or failing to deliver on some promise. Religious leaders confirm that the local deity favors their cause and off go the troops.

  The human predisposition for socially approved aggression falls into a quite different category from that of individual aggression. Bellicose individuals usually get themselves locked up in jail for long periods or, in primitive societies, killed with social sanction. Individual aggression is seldom a good strategy for propagating one’s genes. But socially approved aggression—that is, warfare—can be. A predisposition to warfare does not mean war is inevitable, since the predisposition is expressed only in certain contexts. The warlike Vikings of the tenth century became the peaceful Scandinavians of the twentieth.

  Among forager societies, warfare can benefit the victor, by expanding territory and increasing reproductive success. That is the conclusion that archaeologists and anthropologists have been so anxious to avoid endorsing, because it seems to offer a justification for war, even a glorification of it. But by playing down the prevalence of warfare in the past they have obscured the important and surprising fact adduced by Keeley, that modern societies have succeeded in greatly reducing the frequency of warfare.

  On the assumption that warfare was an incessant preoccupation of early human existence, the picture of the Upper Paleolithic era that specialists have so far constructed seems strangely incomplete. What does it mean to say that the Aurignacian culture was succeeded by the Gravettian? That the makers of the Aurignacian tool kit woke up one morning and decided thenceforward they would all do things the Gravettian way? Or that after many sanguinary battles people bearing the Gravettian culture ousted those following the Aurignacian? When the Last Glacial Maximum made northern latitudes uninhabitable and the glaciers pushed their populations south, is it likely they were welcomed with open arms by the southerners whose territory they invaded? If warfare was the normal state of affairs, it would have shaped almost every aspect of early human societies.

  Warfare is a dramatic and distinctive feature of history, and it thoroughly overshadows an even more remarkable feature of human societies. This feature, the polar opposite of war, is the unique human ability to cooperate with others, and specifically with unrelated individuals. Social organisms like bees and ants form groups centered around members who are related to each other and have a common genetic interest. So do people to some extent when organized in tribal societies. But humans have extended sociality far beyond the extended family or tribe and have developed ways for many unrelated individuals to cooperate in large, complex, cohesive societies.

  The uniquely human blend of sociality was not easily attained. Its various elements evolved over many years. The most fundamental, a major shift from the ape brand of sociality, was the human nuclear family, which gave all males a chance at procreation along with incentives to cooperate with others in foraging and defense. A second element, developed from an instinct shared with other primates, was a sense of fairness and reciprocity, extended in human societies to a propensity for exchange and trade with other groups. A third element was language. And the fourth, a defense against the snares of language, was religion. All these behaviors are built on the basic calculus of social animals, that cooperation holds more advantages than competition.

  The Evolutionary Basis of Social Behavior

  Though we take the necessity of social behavior for granted, group living in the animal world is highly unusual. In fact, even the most rudimentary forms of sociality have long been a puzzle for biologists to explain in terms of evolutionary theory.

  The reason is that a society serves no purpose unless members help one another, yet any effort an individual makes in assisting others takes away from investment in his own offspring and reproductive success. If altruists have fewer children, altruistic behavior will be eliminated by natural selection. Yet without altruism there is no benefit to living in a society. How, therefore, can social behavior ever have evolved?

  Evolutionary biologists have developed a reasonably good account of how social behavior may have emerged in groups of closely related individuals, in the theory known as inclusive fitness. Another theory, that of reciprocal altruism, explains how helping behavior could have evolved even toward unrelated people, or at least toward those who can be expected to reciprocate the favor at a later time.

  Why will a bee sacrifice its life in the hive’s defense? Why should a worker ant embrace sterility and devote her life to raising the queen’s offspring? The late William Hamilton made a major addition to Darwin’s theory in showing how altruism, at least toward one’s own kin, makes evolutionary sense. Darwinian fitness, defined as reproductive success, is all about getting as many of one’s own genes as possible into the next generation. Hamilton’s insight was that the notion of Darwinian fitness should properly be expanded to include the genes one shares with one’s kin. Since these shared genes are identical, being inherited from the same parent, grandparent or great-grandparent, helping to get them into the next generation is as good as transmitting one’s own.

  This notion of expanded fitness, or inclusive fitness as Hamilton called it, predicts that individuals will have a special interest in promoting the survival of children, full siblings and parents, with all of whom they have about 50% of their genes in common, and a substantial though lesser interest in the survival of grandchildren, nephews and nieces, half siblings, grandparents, and aunts and uncles, with whom they share 25% of their genes.

  To maximize their inclusive fitness, individuals must restrain their own competitive behavior and make some degree of self-sacrifice on behalf of kin—in other words, develop social behavior. Thus altruists can be inclusively fitter than non-altruists and their genes, under certain conditions, will spread. Hamilton’s theory of inclusive fitness explains many otherwise puzzling features of social organisms, such as the self-sacrificing behavior of social insects like bees, ants and termites. It also helps explain why chimp communities and human tribal societies are organized along kinship lines.
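  As a rough illustration of the accounting behind inclusive fitness, the sketch below tabulates the standard textbook coefficients of relatedness mentioned above and applies Hamilton’s rule, under which an altruistic act is favored when relatedness times the benefit to the recipient exceeds the cost to the actor; the rule in this cost-benefit form and the example numbers are conventional illustrations, not figures drawn from this book.

```python
# Sketch: coefficients of relatedness and Hamilton's rule (r * b > c).
# Values are the standard textbook coefficients for outbred diploid kin.

RELATEDNESS = {
    "parent": 0.5,
    "child": 0.5,
    "full sibling": 0.5,
    "grandparent": 0.25,
    "grandchild": 0.25,
    "half sibling": 0.25,
    "aunt/uncle": 0.25,
    "niece/nephew": 0.25,
    "first cousin": 0.125,
}

def hamilton_favors(relative: str, benefit: float, cost: float) -> bool:
    """Return True if helping this relative is favored by kin selection."""
    r = RELATEDNESS[relative]
    return r * benefit > cost

if __name__ == "__main__":
    # An act costing the altruist 1 unit of fitness but yielding 3 units
    # for the recipient pays off for a full sibling (0.5 * 3 > 1) ...
    print(hamilton_favors("full sibling", benefit=3, cost=1))   # True
    # ... but not for a first cousin (0.125 * 3 < 1).
    print(hamilton_favors("first cousin", benefit=3, cost=1))   # False
```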

  It can take extreme circumstances to make evident the survival value of human kinship ties. Some 51% of the 103 Mayflower pioneers in the Plymouth colony perished during their first winter in the New World. It turns out that the survivors had significantly more relatives among other members of the colony than did those who died. Among the Donner party, a group of 87 people stranded in the Sierra Nevada in the winter of 1846, only 3 of 15 single young men survived, whereas men who survived had an average of 8.4 family members with them.188

  But kinship alone seems to have limited power as a cohesive social force. Napoleon Chagnon, in his study of the Yanomamo, noticed that as village populations grew larger, the average degree of relatedness would decrease. The population would then split, usually along kinship lines, with the result that people within the two smaller groups would be more highly related to each other. “Kinship-organized groups can only get so large before they begin falling apart,” Chagnon writes. Disputes break out over the usual things—sexual trysts, infidelity, snide comments or veiled insults. “As villages grow larger, internal order and cooperation become difficult, and eventually factions develop: Certain kin take sides with each other, and social life becomes strained. There appears to be an upper limit to the size of a group that can be cooperatively organized by the principles of kinship, descent and marriage, the ‘integrating’ mechanisms characteristically at the disposal of primitive peoples.”189

  In most Yanomamo villages, members are on average related to each other more closely than half-cousinship.190 But nontribal societies are a lot larger, as if some new cohesive factor has to come into play if a community is to outgrow the organizational limits imposed by kinship. Recent human history, Chagnon writes, could be viewed as a struggle to overcome these limits: “Many general discussions of our social past as hunters and early cultivators allude to the ‘magic’ numbers of 50 to 100 as the general community size within which our recent cultural and biosocial evolution occurred, a maximal community size that was transcended only in the very recent past—within the last several thousand years.”191

  One principle that biologists think may help explain larger societies, both human and otherwise, is that of reciprocal altruism, the practice of helping even a nonrelated member of society because they may return the favor in future. A tit-for-tat behavioral strategy, where you cooperate with a new acquaintance, and thereafter follow his strategy toward you (retaliate if he retaliates, cooperate if he cooperates), turns out to be superior to all others in many circumstances. Such a behavior could therefore evolve, providing that a mechanism to detect and punish freeloaders evolves in parallel; otherwise freeloaders will be more successful and drive the conditional altruists to extinction.
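  The tit-for-tat strategy described above is easy to make concrete. The sketch below is an illustrative toy, using the conventional prisoner’s-dilemma payoffs of 3 for mutual cooperation, 1 for mutual defection, 5 for exploiting a cooperator and 0 for being exploited; none of these numbers come from the book. It plays tit-for-tat against an unconditional cooperator and an unconditional defector, showing why the conditional strategy prospers while still punishing freeloaders.

```python
# Sketch: tit-for-tat in the iterated prisoner's dilemma.
# Payoff values are the conventional textbook choices, assumed for illustration.

PAYOFF = {  # (my move, their move) -> my payoff; "C" cooperate, "D" defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(my_history, their_history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not their_history else their_history[-1]

def always_cooperate(my_history, their_history):
    return "C"

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    """Return the two players' total payoffs over a repeated game."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    # Two tit-for-tat players settle into steady cooperation.
    print("TFT  vs TFT: ", play(tit_for_tat, tit_for_tat))
    # A freeloader exploits an unconditional cooperator round after round...
    print("AllD vs AllC:", play(always_defect, always_cooperate))
    # ...but gains almost nothing against tit-for-tat, which retaliates.
    print("AllD vs TFT: ", play(always_defect, tit_for_tat))
```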

 
