Sex, Time, and Power

by Leonard Shlain


  Eventually, he tested this novel approach to problem solving in the field. Many times it failed, giving the process the appropriate name of “trial and error.” But sometimes it worked, and through the sustained application of reason, early Homo sapiens sought to achieve a special kind of knowing called wisdom. Mother Nature repeatedly humbled him, forcing him to discover the trying lesson that all subsequent humans have been pained to learn: Good judgment is based on experience, and experience is often based on poor judgment.

  Over time, he became increasingly adept at controlling extraneous factors in his if = then predicting machine, and Mother Nature rewarded his accuracy with kernels of knowledge about how the world worked. Thus, the storehouse of communal wisdom steadily enlarged.

  Communication between individuals further enhanced the stockpiling of valuable information in an ectoplasmic silo called culture. Any member of a tribe, having learned a useful fact, could place his or her contribution in the invisible tower by exhaling a few controlled puffs of air in the direction of another’s ears. The expenditure of calories was minuscule: Human language was an incredibly energy-efficient transfer and storage system.

  The if = then equation was linear and sequential. The dimension of time forms the very core of cause-and-effect logic, and it issues forth primarily from the left hemisphere of right-handed men and women. Because problem solving using this technique is mental rather than “real,” it falls under the rubric of “abstract” thinking—another primarily left-brain function.

  A core principle of abstract thinking—reductionism—greatly facilitated the if = then equation. An opposable thumb and forefinger enabled Homo sapiens to indulge his insatiable curiosity. Like many primate species before him, he was able to pick apart things to learn what was inside. Abstract thinking allowed him to take this skill to another level—a virtual one.

  Now, without actually dissecting something, he could use his imagination to break down a whole (system, object, or pattern) into its component parts. This mental manipulation gave him a huge advantage over every other animal. Reductionism is synonymous with analysis, and it advances in a measured step-by-step process that depends heavily on the concept of linear sequence. If this, then that; if that, then this… Extending the line requires a knowledge of deep time. It is the underlying mechanism that drives reason, logic, rationality, and ultimately art and science.

  The artist looks at nature and breaks it down into its component parts, reassembling them in a novel and compelling way. The scientist does the same, but is interested in understanding how the parts relate to the whole. When the scientist reassembles the parts in a novel way, it is called technology. The artist uses reductionism and synthesis in the service of aesthetics; the scientist uses reductionism and synthesis in the service of advancing knowledge. The artist employs the images and metaphors of nature to interpret the relationships of reality; the scientist imposes number and equation on nature to express the relationships of reality. The writer Vladimir Nabokov observed, “There can be no science without fancy and there can be no art without facts.”1 The revolutionary artist and the visionary scientist are both fundamentally engaged in investigating the essence of reality.

  The if = then equation played a crucial role in the invention of new tools and weapons. It had an immense bearing on Homo sapiens’ rapid ascent in the standings of nature’s survival-of-the-fittest contest. Rationality boosted his ability to imagine, construct, and deploy ever-superior weapons with which to kill prey and, later, perceived enemies. For all but the final 0.8 percent of the five-million-year bipedal-primate experiment, his weapons of choice were rocks, bones, pits, and sticks. Then, beginning somewhere between forty thousand and thirty thousand years ago, he invented the spear thrower, or atlatl, a small, hand-held, leveraged device that a hunter could attach to the base of his spear. It greatly increased the speed, thrust, and accuracy of a hurled projectile. Homo sapiens invented the bow and arrow fifteen thousand years ago, the iron sword five thousand years ago, the chariot four thousand years ago, flaming naphtha three thousand years ago, gunpowder one thousand years ago, the airplane one hundred years ago, and the atom bomb fifty years ago. The Doubly Wise Man had transmogrified into the formidable Weapons Maker Man.

  Imagine a ladder. On each rung sits a predator; the higher the rung, the more dangerous the predator. Emerging from a long primate tradition of chewing nuts, roots, and leaves, Homo sapiens began to ascend the ladder rapidly, successively displacing more fearsome predators at each step. In the astonishingly brief span of 150,000 years, Homo sapiens attained the top rung, and from there he claimed the title of undisputed king of the jungle…and sea, plain, desert, tundra, forest, and mountain. Just as Genesis had foretold, humans had gained “dominion over the fish of the sea, and over the fowl of the air, and over every living thing that moveth upon the earth” (Genesis 1:28).

  Gripping the top rung of the predators’ ladder, Homo sapiens surveyed all that he had conquered. No single creature could withstand the juggernaut of his onslaught. Just about every plant and animal simmered as an ingredient in one or another flavorful sapient recipe. He had succeeded in stalking, harvesting, capturing, foraging, taming, plucking, domesticating, and killing the entire outpouring of billions of years of evolution. His sovereignty was secure. Nowhere on the horizon was there an animal that posed a threat to his dominion.

  And yet, despite his stunning and rapid success, he increasingly experienced a gnawing sense of anxiety. His unease resulted from his having discovered one of time’s terrible certainties. Let us attempt to reconstruct the moment of his insight that forever changed the human species.

  Somewhere, sometime, a Homo sapiens sat alone on a rock lost in contemplation. Let us name this particular individual Adam. Although physically in an attitude of repose, he was actually intensely exercising the expanded new circuits in his brain’s left hemisphere, challenging the limits of how far he could peer into the future.

  Each time Adam tried to extend his vision, he was blocked by an obstruction: an impenetrable black wall. Placing his fingertips against his temples, he increased his concentration. Despite his most focused efforts, he could not go through, around, under, or over the obsidian barrier that loomed before him. And then, in a searing insight, it dawned on him: He was going to die. The black monolith stood for his own personal death. He could imagine nothing beyond it. His discovery filled him with terror. Here, then, was the terrible price he paid for following Gyna sapiens into the mysterious vortex of time’s tunnel.

  Death is an all-too-common occurrence in nature. It exists side by side with the exuberance of life. Animals see, hear, and smell it all around them. Zebras nervously whinny as they witness one of their own being torn to pieces by hyenas. A vixen mews as she listens to the weakening cry of her young kit dying of disease or malnutrition. A distressed baby monkey cringes in horror as an eagle snatches his mother, who, only a moment before, was at his side grooming him. “Nature,” as the poet Alfred Tennyson remarked, “red in tooth and claw.” Psychoanalytically inclined philosopher Ernest Becker summarized the situation thus: “Creation is a nightmare spectacular taking place on a planet that has been soaked for hundreds of millions of years in the blood of all its creatures. The soberest conclusion we can make is that the planet is being turned into a vast pit of fertilizer.”2

  For our closest relatives, the chimpanzees, death appears to be an unfathomable enigma. When a chimp suddenly dies by falling from a great height, the others clamber down and gather round the corpse. Amid much panting, hooting, and obvious agitation, they peer and prod. But finally, after what we would consider an astonishingly short period of concern for a fallen troop-mate, they nonchalantly resume their daily activities. In all but a few extraordinary cases, it would appear that the matter has been quickly forgotten.*

  Paleontologist Richard Leakey and Roger Lewin, in their 1992 book, Origins Reconsidered, observed:

  In all human societies, the awareness of death has played a large part in the construction of mythology and religion. There seems, however, to be no awareness of death among chimpanzees. Females have been known to carry around the corpse of an infant for a few days after its death, but they seem to be experiencing bewilderment rather than grief. More important, other mature individuals appear to offer no condolence or sympathy to the bereaved mother. The emotional experience seems to go unappreciated by others, and unshared. So far no observer has seen reliable indications that chimpanzees have an awareness of their own death, the extinction of self.4

  Dogs obviously mourn the loss of their owners. But it is not evident from their sorrowful keenings and whimperings that they are aware that the same fate awaits them. As death approaches for some higher animals, they appear to behave as if they were anticipating this final event. Elephants seem to know when one of their herd is going to die. They appear to become more protective and sympathetic, as if trying to alleviate the suffering of the old one. Observers report that elephants sometimes cover a dead companion with leaves and branches before they move on.5 There is, however, nothing in their observable behavior to suggest that earlier in their lives they confronted the inevitability of their own demise.

  We may never know with certainty what other animals know about their own death, but it appears that an absolute precondition for being able to anticipate personal extinction is the capacity to think in an extended time frame. Humans seem to be the only animals to have mastered this complex feat—and perform it at an early age.

  The human confrontation with death’s finality altered the course of every subsequent culture.

  The jolt of adrenaline that activates the fight-or-flight response when any animal confronts mortal danger is among the most basic survival instincts. No animal other than a human, however, appears fixated on the prospect of its death in the far-distant future. Only a human clearly understands the harsh truth that death is inevitable, inescapable, and nonnegotiable.

  Because humans are animals, we experience instant fear whenever danger suddenly thrusts itself smack into our awareness. In addition to these sudden episodic spasms of acute fear, we became the first animal to be plagued by an ever-present, free-floating anxiety resulting from an awareness of death’s omnipresence. Becker, aware of his own impending demise, wrote poignantly:

  Anxiety is the result of the perception of the truth of one’s condition. What does it mean to be a self-conscious animal? The idea is ludicrous, if it is not monstrous. It means to know that one is food for worms. This is the terror: to have emerged from nothing, to have a name, consciousness of self, deep inner feelings, an excruciating inner yearning for life and self-expression—and with all this, yet to die.6

  The psalmist expressed the same idea poetically:

  As for man, his days are as grass: as a flower of the field, so he flourisheth.

  For the wind passeth over it, and it is gone; and the place thereof shall know it no more. [Psalm 103:15-16.]

  Herodotus recounts an incident from the fifth century B.C. Xerxes, the Persian despot, was leading a vast army to invade Greece. As he neared the storied ruins of Troy, he felt a strong desire to visit the site. Surveying the scene of the ancient battle deeply moved him. The next day, he told his staff that he wanted to review his troops. Xerxes had a throne hastily set upon a high knoll so that he could view, on the plain below, the panorama of a million men on the march accompanied by their engines of war.

  At first, he exulted vaingloriously in the pageantry of so many virile men parading before him. Then he began to weep. His aide-de-camp, witnessing the king’s distress, asked why he cried. “I was thinking,” Xerxes replied, “and it came into my mind how pitifully short is human life—for of all these thousands of men not one will be alive in a hundred years.”7

  One could safely assume that Xerxes was weeping for himself as well. Death would be his fate, too. As he stood at the height of his power, Xerxes understood that life’s impermanence made all his strivings seem a preposterous vanity. It is extremely doubtful that any other animal might have been similarly overcome with emotion. Coming to grips with the finality of death was one of the major milestones in the evolution of our species, and it remains a key insight in the maturation of each individual.

  The child psychologist Jean Piaget reported that infants as young as six months can make a distinction between animate and inanimate objects. Late in the second year, a toddler ambles past a mirror, stops, touches it, and then begins to make funny faces, an activity we share with chimps that confirms that the child has a concept of self. The awareness of chronological time becomes refined in children between the ages of three and three and a half, when they begin to use the tenses of verbs correctly. Properly conjugating past, present, and future tenses indicates that a child has acquired an understanding of time. At the beginning of the fourth year, a child understands that other individuals can hold beliefs different from his or her own. This constitutes one of life’s major discoveries. For the first time, children relish experimenting with a new behavior mode they will put to extensive use for the rest of their lives—lying. Leo Tolstoy wrote, “From the child of five to myself is but a step. But from the newborn baby to the child of five is an appalling distance.”8

  Yet, in the early years of every child’s life, death is an incomprehensible concept. At around the age of six or seven, children finally become aware that dead birds, goldfish, and hamsters will never return. Then, often after the death of a grandparent, they come to the realization that someday their parents will die. Soon afterward, they formulate the key insight that the same fate awaits them, too.

  The Catholic Church considers seven to be the age of moral understanding and uses it as the milestone for when a child can receive first communion. Confucian followers believe seven is the age of the beginning of wisdom. Many other cultures have used this age as a dividing line between innocence and the beginnings of a mature mind. The reason, I suspect, that the great traditions acknowledge seven as a turning point in one’s life is that this is the moment one grasps the inevitability and finality of death. Advance knowledge of one’s own personal final exit is uniquely human. Eve took a bite from an apple and gained godlike knowledge. She then offered it to Adam. Neither noticed that there was a worm in the core.

  Homo sapiens emerged as a distinct species 150,000 years ago. Paleontologists have not identified a single major physical attribute that has changed since that time. An ancestral Homo sapiens dressed in modern clothing and walking a crowded street in New York today would not attract the slightest attention.

  For the first 110,000 years of our species, there is no consistent record of art, burials, or grave goods.* Then what author John Pfeiffer calls the Creative Explosion began. A magnificent awakening mysteriously emerged nearly all at once approximately forty thousand years ago, during which the abovementioned cultural markers unique to humans began to appear regularly in areas as disparate as Australia, Morocco, Siberia, and, especially, Europe. Thereafter, they became staples of every human culture. Since paleontologists have not identified any increase in brain capacity, what discovery could Homo sapiens have suddenly made that engendered the practice of the ritual disposition of dead relatives? What encouraged them to relinquish perfectly usable goods and place them in a grave alongside the deceased? Why adorn the dead with painstakingly crafted ornaments? And most intriguing: What compelled sapients to begin creating art in abundance after the passage of tens of thousands of years without experiencing a similar urge?†

  The “Great Leap Forward,” as Jared Diamond calls the Creative Explosion, is also referred to by others as the “Upper Paleolithic Revolution.” Many different theories besides the Big Bang of language have been advanced to explain this forty-thousand-year-old unsolved evolutionary whodunit. These include a “restructuring of social relations,” the “appearance of economic specialization,” and an as yet to be identified “technological invention.”9

  Steven Mithen, in his 1996 book, The Prehistory of the Mind, proposes that early Homo sapiens’ cognition was divided among separated “domains of knowledge.” Although each domain had been expanding, the technical intelligence concerning how to make tools was isolated from natural-history intelligence, which in turn had little interaction with the domains of social intelligence, general intelligence, and language. Mithen believes that the Creative Explosion occurred when some event caused the walls to come tumbling down. Like Joshua’s trumpet blasts at Jericho, an unknown factor smashed through the barriers erected in the brain that kept the various domains from knowing what the others knew.

  In 1866, the biologist Ernst Haeckel proposed his Biogenetic Law: “Ontogeny is the short and rapid recapitulation of phylogeny.” “Ontogeny” in humans is the process by which a fertilized ovum develops into a full-term baby; “phylogeny” is the process by which simple life-forms evolve into more complex ones. Phylogeny took billions of years to occur; ontogeny mimics (recapitulates) the journey in nine months. Each process parallels the other. Stated simply: The successive developmental stages that a human conceptus-embryo-fetus-neonate runs through in only nine months of gestation repeat the entire 3.8-billion-year evolution of all animals.*

  The instant a sperm and an ovum unite, the future human has all the characteristics of a single-celled organism, similar to an amoeba. Within hours, the fertilized ovum begins to rapidly divide. The spherical multicelled blastocyst it becomes in four days is nearly indistinguishable from other primitive multicellular organisms, such as molds and sponges. The journey from single-celled organisms to multicelled ones took evolution several billion years. Within three weeks, the human embryo takes on the appearance of a worm, and then, in another week, it develops gills to mimic those of a fish. Several weeks later, lungs replace gills, a process that took millions of years in evolution. Shortly afterward, it morphs into a reptile, complete with a tail. Eventually, the fetus begins to manifest all the characteristics of a mammal, and then those of a primate. With the disappearance of the fetus’s tail, a fetal ape appears. Finally, when the changeling sheds its outer layer of dense intrauterine fur (called “lanugo”), he or she differentiates into a human in the remaining weeks of pregnancy. The fifth-century-B.C. Greek philosopher Empedocles was aware of the process when he wrote in Purifications, “Already have I once been a boy and a girl, and a bush and a bird, and a silent fish in the sea.”

 
