Moral Origins


by Christopher Boehm


  With respect to selection by reputation, in a small band that is talking about people’s behavior all the time, it’s far more difficult to dissemble a good, generous reputation than it is in a modern urban society with its relative anonymity. With free riding basically immobilized, selection by reputation—as a distinctively human type of social selection81—could be an important and efficient means of favoring extrafamilial generosity. This would hold not only for marriage choices but also for choices of subsistence partners and for choices of political allies in and out of the band, for who is favored or disfavored in extending safety-net help, and more generally for situations in which families are choosing to live in one band or another and must be granted permission to do so.

  As a mechanism for selection in favor of self-sacrificial generosity, I believe that social selection needs the most theoretical development. And while none of the mechanisms we have discussed could have done the job alone, I suspect that social selection will prove to be very important. For humans, social selection as I shall be defining it involves a unique combination of selection by reputation and free-rider suppression, and, as we'll be seeing later, reputational selection by itself contributes to powerful interactive effects similar to those found in Darwinian sexual selection. There, exaggerated maladaptive traits like peacocks' resplendent but unwieldy tails are kept in place by female choice, which at the level of gene selection serves as a means of compensation.

  Altruism, too, is by definition basically maladaptive, which means that unless group selection is strongly operative, some kind of individual compensation must be taking place. Social selection probably could not fix altruistic genes in our species’ gene pools all by itself; this will require further research by scholars who do such modeling. But it seems likely to have been a leading force in what was a multifaceted selection process based on contributions from a number of mechanisms,82 including reciprocal altruism and group selection.

  In this book a heavy emphasis will be placed on the two types of social selection we have discussed here, one of which is reputational selection and the other is selection that takes place when groups crack down on deviants. In both cases, we will be exploring the role of human preferences, which stem from human nature, in further shaping that same human nature.

  4

  KNOWING OUR IMMEDIATE PREDECESSORS

  FROM PRESENT TO RECENT PAST

  In this and later chapters I’ll be looking for predictable kinds of anti-free-rider social control found very widely today so that I can confidently project them into the more recent hunter-gatherer evolutionary past, and thereby assess their impact on evolving human gene pools. This treatment actually began in the previous chapters, but here I’ll try to justify making such projections. I’ll be particularly interested in dire forms of punishment that could have affected individual reproductive success drastically and therefore could have strongly shaped gene frequencies and, ultimately, human nature.

  To reliably make the case for the punitive type of social selection’s having acted on our genetic makeup, it will be useful to project the group behaviors of today’s foragers into the more recent Pleistocene past as conservatively and accurately as possible. This means we must limit our reconstruction to human predecessors who had brains equal to our own and whose cultures had become flexible and advanced like our own. Archaeologists call these people culturally modern humans, and it’s widely agreed that in Africa they had arrived by 45,000 BP.1

  SETTING ASIDE THE “MARGINALIZATION” ARGUMENT

  In the African archaeological record, cultural modernity is assessed in terms of a rather abrupt appearance of more complex and regionally changeable stone tool technologies, along with objects of self-adornment and “art,” often in the form of engravings. However, interesting as these developments may be, they tell us precious little about what was happening with these people socially. For this reason, it will be necessary to use today’s foragers to reconstruct the group life of their predecessors.

  Past attempts to do so have met with major objections from scientists who deal in human prehistory, so we’ll have to get technical here. For doubters like the influential late political anthropologist Elman Service or more recently archaeologist and hunter-gatherer expert Robert Kelly,2 a main problem is that most of today’s foragers have been “marginalized” by aggressive tribal agriculturalists and, eventually, by civilizations and then empires that took over our planet’s more desirable areas. In contrast, Pleistocene foragers had their pick of world environments, and, so the theory goes, they didn’t have to cope with the not-so-productive semideserts, arctic wastes, and other marginal habitats that often limit subsistence possibilities today. Thus, there’s no telling what they were up to.

  Service made his persuasive marginalization argument more than three decades ago, and it made sense at the time. Unfortunately, it has become almost a truism in archaeological and evolutionary circles that Pleistocene foragers must have been living in a fat city situation because unmarginalized small populations could pick their rich environments at will. However, the available prehistoric information has changed since then, and changed dramatically. What’s new is our understanding of Late Pleistocene climates and their all but unbelievable instability.3 Frequently, and cyclically, rapidly changeable weather patterns could have led to two kinds of prehistoric “marginalization” that, roughly speaking, would have been comparable to what we see today.

  One was purely ecological marginalization. This was likely when areas with adequate patterns of rainfall became drier and only smaller populations could be supported in widely scattered bands. Such climatic downturns could have created localized-drought challenges directly comparable to those arising in the capricious Kalahari Desert area that people like today’s !Kung and !Ko Bushmen have to cope with, or challenges faced by desert Aborigines in Australia or Great Basin foragers in North America.4 The second type of marginalization would have been political. As cyclically better conditions allowed Pleistocene groups to multiply, competition could have intensified as more aggressive hunter-gatherer groups began to monopolize the better resources, marginalizing other foragers just as today’s foragers have been marginalized by territorially aggressive farmers.

  Not only that, but when there were shifts toward ecological good times that permitted gradual but eventually substantial population growth, and then a sudden downturn arrived, it’s likely that foragers of one language or ethnic group would have been prone to aggressively push aside foragers of another. This would have been especially the case if the resources they were competing over were rich enough, and concentrated enough, to be readily defensible.5 Such marginalization could have resulted in outright warfare, and even though direct prehistoric evidence before 15,000 BP is lacking, to judge from certain foragers today6 such conflict could have become quite intensive under the right conditions.7

  Some of the Holocene microclimates that today's foraging nomads deal with are not just thin in resources but are quite unpredictable in the shorter term, and at least a few cases of famine have been recorded by ethnographers.8 In the recent Holocene these perilous junctures have occurred so rarely that only occasionally does an anthropologist even visit a field site at which a true famine is well remembered. Striking exceptions are certain Inuit speakers like the Netsilik in central Canada or Inuit groups in Greenland,9 while Bushmen and many other foragers living on semideserts are at least able to recall episodes of serious privation.10 The social, emotional, and genetic effects of such dire scarcity will be weighed in Chapter 10.

  Using today’s nomadic hunter-gatherers as models for their nomadic predecessors must be further justified here because Service’s “marginalization taboo” still enjoys such wide adherence. Of course, when archaeologists show their unreadiness to reconstruct the social life of “prehistoric foragers,”11 often this involves a legitimate fear of projecting modern human behaviors on to much earlier types of humans who had smaller brains and, in all likelihood, had a significantly different behavioral potential. With respect to smaller-brained humans who had not yet developed culturally modern tool kits, such conservatism has been and still is quite appropriate. Here, however, I am considering only the more recent prehistoric humans who matched us in brains and cultural capacity.

  My theory is that the main outlines of their social and ecological life can be reconstructed quite straightforwardly simply by identifying behavior patterns that similarly nomadic foragers share very strongly today. However, such reconstructions must be carefully strategized, and I will be reconstructing only what might be called core behavior patterns,12 that is, behaviors involved with gaining a living, along with the social behaviors that are basic to such an enterprise. In addition, in recreating Late Pleistocene socioecology, I will be focusing only on those carefully selected contemporary foragers whose ecological lifestyles would have been likely 45,000 years ago.

  FINDING THE RIGHT HUNTER-GATHERERS

  This analysis has involved ten years of research effort.13 My first task was to evaluate the great majority of the world’s ethnographically described hunter-gatherer societies, 339 of them,14 to weed out those that obviously would have been atypical in the Late Pleistocene. I eliminated, for instance, the many North American mounted hunters like the Apache or the Comanche because horses were domesticated only recently.15 I also eliminated a few bands that lived dependently at missions, like the well-known South American Aché, and ones that symbiotically traded food with horticulturalists like the Pygmies or the Agta of the Philippines or foragers who had begun to cultivate a few plants themselves. And then I had to set aside dozens of societies that had been heavily involved for centuries with the European fur trade, such as the North American Ojibwa and the Cree, and of course I had to eliminate several dozen sedentary foraging societies that began to intensively store food and eventually lost their egalitarian ways to become markedly hierarchical, like Japan’s aboriginal Ainu or the Kwakiutl of British Columbia—who actually had slaves. After this triage was finished, only about half of the world’s foraging societies were left. They were uniformly independent, nomadic, and egalitarian, and they were suitable—if used in quantity with some statistical sophistication—as models for humans in the latter part of the Late Pleistocene Epoch, which overall lasted from about 125,000 BP until our present Holocene Epoch began to kick in.

  The contemporary models I’ll be using, then, are taken from the perhaps 150 groups that I’ll be referring to as “Late Pleistocene appropriate”16 foraging societies, or, in a more streamlined fashion, as “LPA foragers.” My assumption is that they are very similar to the culturally modern people who were evolving in Africa around 45,000 BP and were spreading to most parts of the world.17 (Keep in mind that the people who painted those beautiful cave paintings in Spain and France first evolved their artistic potential in Africa, where cultural modernity had its beginnings.)18

  With a third of these worldwide LPA societies now coded in fine detail with respect to their social life, this is what I’ve found so far. To start with, these fifty societies are definitely all mobile, and as nomads, instead of trying to store their large-game meat as individual families, they share it widely. It doesn’t matter whether these people live on Arctic tundras or in tropical forests—they never dwell in permanent, year-round villages, and they always combine hunting and gathering to make a living according to what is environmentally available, with an emphasis on eating the relatively fatty meat of large mammals. Normally, their camps or “bands” average around twenty to thirty persons, and each family cooks at its own hearth.19

  In the case of camp size and the butchering of large game, today’s ethnography coincides with what we know of yesterday’s archaeology.20 Just from the ethnography, we know that invariably these people believe in sharing their large game with everyone in the band, and that they all face problems of social deviance like bullying and theft and employ similar basic means of social control to combat them. These foragers very predictably share a core of moral beliefs with an egalitarian emphasis on every hunter’s being a political equal, while the political positions of women as nonhunters are much more subject to diversity. We also know that their bands involve highly flexible camping arrangements, with families moving in and out as needed, and that at any given time a band will be composed of a mixture of some related and many more unrelated families.21

  If these bands were just big extended families, the cooperation and altruism they engage in would be much easier to explain, for kin selection theory would do the trick. But they aren’t, and we may readily assume that the same was true 45,000 years ago. That’s why, as our evolutionary story unfolds, we’ll be so interested in seeing by proxy how prehistoric forager lifestyles could have generated distinctive types of social selection, as agencies that could have supported generosity outside of the family at the level of genes.

  Today, the social patterns I’ve discussed hold all but uniformly across an almost incredibly wide variety of environmental niches that these LPA foragers manage to cope with successfully. These range from arctic tundras to boreal forests in the far north, to productive temperate or tropical forests, to resource-stingy jungles, and to fertile plains or game-rich savannas and arid semideserts.22 These environments include coastal areas as well, which prehistorically were likely to have served as refuges from glacial cold snaps or droughts. These sites today often would be under water, and it’s conceivable that people could have become sedentary for a time while exploiting them. It’s even possible that sometimes they did so for long enough to begin to lose their egalitarian, meat-sharing lifestyle if a long-term habitat was rich enough to permit food storage. However, while families’ economic standards of living may have begun to differ, it’s likely that political egalitarianism would have been more resistant to change, and in any event these outliers would not have negated the social central tendencies I’ve been describing; they would have held very widely.

  Climates today range from hot to frigid and from stable to sometimes fairly unpredictable, but before the Holocene phased in, Late-Pleistocene-type climates could change with a rapidity we seldom see today. It’s no accident that during the lengthy Pleistocene Epoch, human brains just kept on getting bigger, for we’ve had a lot of challenges to cope with,23 and surely some of them involved situations of desperation and famine. In Chapter 10 we’ll learn, from today’s foragers, exactly how desperate these situations were likely to have been and what could have happened to the usual food-sharing practices when people were facing actual starvation.

  It’s remarkable that a single main “type” of band composition and group life can work so successfully when such a startling array of environmental challenges is faced, but this in fact is the case. Scholars agree that socioecological flexibility is what makes this possible, and although the band is an obvious focus, to get the total picture we have to think in terms of many culturally similar bands dispersed over sizable regions, with families changing bands on a rather frequent basis. In the Late Pleistocene with its dangerously capricious environments, very likely this highly flexible approach to group living and subsistence was not merely convenient but often absolutely necessary to getting by—with the sheer survival of entire bands or regional populations surely being on the line much more frequently than is the case today.

  In that epoch, as today, there would have been at least a few exceptions to the basic overall patterns—that is, the strong central tendencies I’ve been describing. I’ve just suggested that temporary sedentary adaptations were likely, and that food storage could have reduced the sharing of large game. Another readily understood contemporary exception that was likely prehistorically can be seen in the few foraging societies that cope with environments so spare that most of the time they are able to forage only as families, without forming bands—as with some desert Australian Aborigines, who subsist partly on insects, or as with certain of the Shoshonean Indians living in America’s semidesertic Great Basin area, whose fat and protein come mainly from fluctuating harvests of piñon nuts rather than from wild game.24 In the unstable and periodically dangerous Late Pleistocene, occasional divergences from the central tendencies I described previously were likely to have been more frequent. However, the great majority of these prehistoric foragers would still have followed today’s main pattern, meaning that they lived in mobile, flexible, egalitarian multifamily bands of twenty to thirty and they invariably shared their beloved large game with its exceptional fat content. That, I propose, was the central tendency for those large-game hunters, and surely it was a strong one, then as now, even though these and possibly some other outliers were likely.

  I’ve taken 45,000 years before the present as the time when Homo sapiens populations in Africa had become culturally modern; this means that they had a full capacity to flexibly invent and maintain the remarkably variable material and social patterns that LPA foragers exhibit today. However, this date may be somewhat conservative,25 for humans had already become anatomically modern by 200,000 BP,26 which means that they were then at least physically indistinguishable from us. Increasingly, it’s looking as though cultural modernity, as deduced from the making of increasingly complex and variable artifacts, some of them symbolic, was phasing in earlier than 45,000 BP. The problem is that cultural modernity evolved in Africa and African archaeology is just getting up a real head of steam. Thus, even though I shall use the 45,000 BP figure to keep the analysis conservative, we might put in its place a date of 50,000 BP or even 75,000 BP or earlier. Only time, and more excavations, will tell.

 
