The Great Shift
No one knows when religions first began. It used to be argued that religion is a universal phenomenon—every known civilization has one, it was said, or at least had one at some point—and this fact was held to prove the truth of every religion’s basic claim, that God (or the gods) must exist. But such a claim is wrong on several counts. To begin with, not every religion is theistic, that is, centered on God or the gods; more broadly, there certainly are now (and have been in the past) civilizations that lack anything corresponding to what we term, even in the most general sense, a religion.5 Perhaps most basic of all, everything we now know about evolution indicates that human beings came into existence through a long process of development from earlier hominins.* Homo habilis6 first appeared somewhat less than three million years ago; then came Homo erectus/Homo ergaster7 (less than two million years ago). Fossil evidence of Homo erectus demonstrates that he was a traveler, apparently the first hominin to leave the African continent and settle widely in Europe and Asia. He may also have been the first hominin to control fire.8 After this there appears a remarkable predecessor of the first Neandertals* and Homo sapiens, a species known as Homo heidelbergensis, dating back perhaps six hundred thousand years.9 Evidence of this species’ existence has been found in various parts of western Europe, including Spain, France, Germany, and England. A later representative of Homo heidelbergensis, dating to almost four hundred thousand years ago, was discovered in Germany in 1994; he was the presumed manufacturer of an impressive collection of four hunting spears, wooden shafts sharpened at both ends with stone tools, which had apparently been used to kill wild horses—“the oldest reliably identified hunting weapons ever discovered.”10
Could any of these early humans also be called Homo religiosus,11 the first species to worship supernatural beings, or sacred places or animals or inanimate objects from the natural world, or in some other way manifest what could be considered religious behavior? It is sometimes argued that the practice of burying dead bodies (as Neandertals apparently did) indicates the existence of some sort of religious belief, but this is not necessarily so. After a few days, a dead body begins to stink; what is more, it can become a dangerous carrier of disease. Something must be done with it. A corpse can, of course, be disposed of in various ways: tossed into a bonfire, dropped into the sea, or simply carried to a remote location (where, however, its flesh will soon be consumed by animals and insects). Faced with such alternatives, early human beings (apparently starting with Neandertals) came to practice burial as a thoughtful and caring way of disposing of a former member of the family or clan.12 But this does not indicate any hypothesizing of a present or future encounter of the departed with anything divine, or his or her coming back to life at some later date.
However, the practice at ancient burial sites of placing grave goods—a favorite tool, or perhaps a weapon of some sort—next to a dead person’s body tells a different story. To do so seems to attest to the belief that the lifeless corpse will somehow make use of the buried item sometime in the future (otherwise, why throw away a perfectly good animal jawbone, or a carefully sharpened stone?). Scholars continue to debate the age of the earliest grave goods, but some date them to more than one hundred thousand years ago. From a later period (forty thousand years ago) come meticulous drawings of game animals; while the evidence is far from conclusive, one hypothesis is that these drawings were intended to evoke spirits or deities connected with the hunt and in this way to gain the favor or aid of such spirits in the endless search for food.
Our Puny Selves
What, then, is the significance of this snapshot of God walking about among the humans in the Garden of Eden? In a sense, this image takes us back to the schematic beginnings of human contemplation, to the time when, one might imagine, the little men and women first began to consider themselves and the world in which they lived. This act of contemplation is crucial, going back to a time when humans saw themselves surrounded on all sides, endlessly being acted upon by all that was beyond their control or even comprehension. So in the biblical story, God walks among the naked humans in the Garden simply because His continued presence is necessary for nearly everything that the story narrates. That is why He never enters or appears; He is just always there, the principal actor. He makes the earth and the sky. He shapes Adam out of dirt: first He makes a kind of muddy statue, then He breathes His breath into the statue’s nostrils, and Adam comes to life. After this He plants the plants in the Garden, and still later anesthetizes Adam into a deep sleep, which enables Him to take part of Adam’s side or ribcage and turn it into Eve.13 This God does everything, even making the animal hides to clothe the first humans after their sin (Gen 3:21, presumably after the crude fig-leaf garment they themselves had fashioned [3:7] had fallen apart). And even their sin, it should be remembered, was not the result of any human initiative; Adam and Eve were manipulated by the snake into violating God’s orders (3:13).
In short, this is a world in which very little is done by the humans. Everything happens to them, and while an anthropologist would rightly object that this biblical picture of human passivity telescopes a lot of what we now know about the development of early man, for our purposes it is most suggestive about the sense of self that a hypothetical Adam and Eve must have had, inherited as it was from still earlier times and then passed on into far later periods. In a word, as soon as early humans were capable of contemplating themselves and their surroundings, they could not but have an overwhelming sense of their own smallness. The world around them was so obviously big and active. It impinged on them at every turn; it was always taking the initiative and making things happen. In their emerging consciousness, this surrounding world might best be described as a vague, great Outside. It was not personified, of course; we are still far from that. The Outside was just everywhere, endlessly doing; it was huge and they were very little. At times, the Outside’s being must have seemed as palpable, and as accessible, to the humans as their own, filling up all of what we would think of as the empty spaces in their world, pulsing in the nighttime darkness or shining through the thick branches at noon. Its great hand, gloved in sky or sea or soil, was sometimes kind, sometimes cruel. And its presence was heard everywhere, in the rustling of the wind through the leaves or the croaking of the forest at night; all these sounds were part of a single chorus, “the sound of the LORD God walking about in the Garden.” Since the Outside was not any kind of person, it was not given to any sort of characterization; it was the undifferentiated Outside, a hugeness not yet analyzed into discrete beings or functions, but just the “not-us” (or, more accurately, the “mostly not us”) that was everywhere and left the little people full of awe.14 Humans will develop and change, but this fundamental sense of their own little selves and how they fit into the larger, active world will not. Hardwired into the brain over eons and eons, it will only begin to leave them much later, and then not fully until modern times.
The Discovery of Agriculture
The wise snake causes Adam and Eve to understand. At his urging, they eat the fruit of the Tree-of-Knowing-Everything* and “the eyes of both of them were opened.” If the knowledge that the tree imparted included an understanding of agriculture, then we are indeed at a very late point in human history (roughly twelve thousand years ago),15 virtually yesterday. But returning to the preceding stages is important for our understanding of the developing Homo religiosus in his larger environment.
It all began about two and a half million years ago, when there emerged a certain bipedal ape with a slightly larger brain than his fellows. “Bipedal” means that he could walk around erect on two legs, which he often did; like other apes, however, he also spent a lot of time in trees, sleeping in constructed nests for safety. His larger-than-average brain actually presented an evolutionary disadvantage, since babies with larger heads are more difficult to deliver, and also because big brains consume significantly more energy. To help with the former problem, the thickness and shape of the head gradually changed, but what is more, much of the brain’s development was postponed until after the baby’s birth. This meant that humans bear young that are altricial—utterly helpless, because in a sense they are born too soon. But if it were otherwise, the baby’s head would be just too big to come out. (On the other hand, once out of the womb, the brains of newborn humans develop at a much faster pace in the first year than those of other primates. As Steven Pinker has noted, if our bodies grew proportionally to our brains in that first year, we would all be ten feet tall and weigh half a ton.)16 As for the high energy demanded by bigger brains, this man-ape solved the problem by developing an energy-rich (carnivorous) diet along with a reduction in his own digestive tissue.17
As time went on, further adaptations led him to live at ground level much of the time, opening up new feeding opportunities and gradually separating him from other primates, until he became a distinct species, Homo habilis (starting ca. 2.6 million years ago), so named for his primitive toolmaking abilities.18 Homo erectus, considered by some to be the first truly human species, emerged around 1.8 million years ago, standing and walking on his hind legs alone. This posture was not without problems, but it had distinct advantages, freeing up the hands to carry things while walking and providing a higher perspective on the surroundings. Over the next million years, Homo erectus gave way to Homo heidelbergensis (mentioned above), which in turn evolved, according to many paleoanthropologists, into both Homo neanderthalensis and Homo sapiens. The African continent was humanity’s birthplace, but our ancestors left it in stages19 and migrated to other continents, until ultimately they were able to “fill the earth and subdue it” (Gen 1:28).
What were their ancestors, those early hominins, thinking? If the great Outside had been undifferentiated at first, doing virtually everything, filling land and sea and sky and utterly overshadowing the little people, gradually this quality began to recede just a bit. From approximately 2.6 million years ago comes the first evidence of hominins’ knapping—banging two stones together to break off sharp, thin flakes. The flakes could then be used to cut up meat and other foods—in fact, to cut almost anything that a modern knife can cut. By around 1.7 million years ago, symmetrically chipped stones were being used as hand axes; later, mastery of fire allowed early humans to cook their food, a practice evidenced by around 790,000 years ago,20 if not earlier. Hunting large animals for food was the apparent purpose of the already-mentioned spears of Homo heidelbergensis (400,000 years ago).
Banging one stone against another to chip off flakes is certainly doing something. Does it require thinking? This is a more complicated question. To begin with, scholars distinguish two kinds of stone-flake production. The first involves banging the “hammerstone” against the “core” stone to produce a flake sharp enough to cut up some meat or open a nut. This is a fairly straightforward act, a necessary part of the sequenced task of cutting or opening something, which, scholars say, merely requires “procedural memory.” The second kind aims at shaping both faces of the stone chip so as to produce a more or less symmetrical blade, such as that of the above-mentioned hand axes. The mental abilities necessary to produce such bifaces are more complicated, demanding multiple levels of intentional organization. (In fact, some scholars have sought a physical connection between the neurological requirements of these shaping skills and those needed for the emergence of language.)21 What is more, the biface itself was probably a different kind of object from the simple stone flake in the minds of its producers. It had an independent existence, a tool in its own right,22 whereas the simple stone chip was merely part of a larger and oft-repeated procedure. As a result, the biface was probably carried about even when no specific task was in sight, especially since it was a multipurpose tool, used for butchering animals, cutting or shaving wood, and digging into the ground, and perhaps even thrown at animals for defensive or offensive purposes.23
Worshiping Bears
Archaeologists sometimes quote a saying (attributed to the cosmologist Martin Rees): “Absence of evidence is not evidence of absence.” In other words: just because we haven’t found something doesn’t mean it didn’t exist back then. So, for example, with regard to the stone tools mentioned above: it is convenient to use them to chart human development, since bifaces and other stone tools stay around forever, just waiting to be unearthed by archaeologists. But their discovery tells us very little about the actual way of life of the humans who made them, since those same humans probably manufactured all sorts of other things that don’t stay around for millennia—baskets, baby carriers, bows and arrows, blowguns, boomerangs, and much more (these are only a few perishable things that begin with the letter B).24
The same is true when it comes to early evidence of what we like to call “religion.” All we have is what we have: those grave goods, cave drawings, and the like. They tell us nothing of the thinking that preceded them—perhaps by millennia—nor, on the other hand, can we always be sure of things that are claimed to give evidence of the “religious.” A recent example is instructive.
After prehistoric bear bones were discovered in caves in various parts of western and central Europe, some scholars argued for the existence of an ancient bear cult in Neandertal times. Neandertals did generally inhabit caves, and the position of the bones in some of these caves seemed suggestive of some deliberate arrangement: the longer bones (tibia, femur, humerus) appeared to have been laid along the sides of the cave and the bear heads in the corners. This apparently deliberate disposition, it was thought, might have been made for religious reasons. Such a conclusion had been encouraged by an early find of the Swiss scholar Emil Bächler, who excavated the Drachenloch (“Dragon Lair,” so named for the mass of bear bones unearthed there) in the eastern Swiss Alps during the period 1919–1923. Especially intriguing to Bächler was a bear skull he uncovered with a femur stuck into the area of the cheekbone. He argued that the femur could only have ended up in this position if it was deliberately turned as it was moved behind the cheekbone—suggesting direct contact between a Neandertal inhabitant and the bear skull.
The idea of a prehistoric bear cult in Europe was, and still is, thrilling. It bespeaks a time when we humans were still closely connected to the rest of the animal kingdom—still frightened by the power and predations of these huge fellow-mammals, perhaps seeking symbolically to harness that power or overcome it, or even in some way identifying ourselves with them as, for example, the totem of our group. Those big bones and skulls neatly arranged in caves seemed light-years away from what we normally think of as religion today, and yet there they were, mute testimony to a world of emotions and ideas that we can scarcely imagine. These Swiss discoveries also linked up with evidence of bear cults (if that is what they were) from later times in the colder regions of Asia, Europe, and North America; it was tempting to think that the roots of this animal’s numinous quality for humans were far more ancient, perhaps instinctive, or an ancient meme that never died. And indeed, a certain fascination with bears may go back very far.25
At the same time, however, these findings have been vigorously challenged. To begin with, Neandertals were not the only inhabitants of caves; bears themselves were finding refuge in caves for their months of hibernation. In fact, bears and humans seem to have occupied caves at different periods, so the coexistence of bear bones and evidence of humans in caves was not an indication in itself of any Neandertal interest in bears. What is more, some Neandertals lived in open-air settlements, but no bear bones were found there.26 As for the position of the bones in caves, other factors—water seepage in the caves, or even strong winds—might have produced the same results. In short, many scholars today doubt that the hypothetical bear cult ever existed, and these doubts now extend to the very existence of something resembling religion among Neandertals.27 It could have existed; we just don’t have the evidence.
Humans Contemplating
More fundamentally, one might say that looking for material remains to confirm the existence of religious ideas or practices inevitably puts the cart before the horse. Long before humans made any object that gave evidence of such things, they were looking at, and fitting into, the world around them in a certain way, conceiving of their own existence in some form or other in their own minds. When did this begin, and what did it consist of? Scholars can make a start by comparing the cranial capacities of chimpanzees (whose mental abilities have been pretty well explored by contemporary scientists) with those of various early humans: chimpanzee brains average about 400 cubic centimeters (gorilla brains go slightly higher, to an average of 500 cc or so); the brains of Homo habilis already exceeded these, rising to 500–800 cc; Homo erectus to 750–1250 cc; early Homo sapiens to 1200–1700 cc; and modern man to an average of about 1350 cc. Somewhere along this continuum, our ancestors began to contemplate themselves and their surroundings, but fossil remains and material culture can shed little light on when this first occurred. It certainly did occur, however; about this there can be no doubt.