The Immense Journey

by Loren Eiseley


  To continue our writing of the story of human evolution we are totally dependent upon finding additional fossils. Until further discoveries accumulate, each student will perhaps inevitably read a little of his own temperament into the record. Some, as Hürzeler has done, will dwell upon short faces, vertical front teeth and little rounded chins. They will catch glimpses of an elfin human figure which mocks us from a remote glade in the forest of time. Others, just as competent, will say that this elusive homuncular elf is a dream spun from our disguised human longing for an ancestor like ourselves. They will say that in the living primate world around us there are lemurs with short faces and vertical teeth, and that there are monkeys which have the genuine faces of elves and the capacious craniums of little men.

  In the end we may shake our heads, baffled, and have to admit that many lines of seeming relatives, rather than merely one, lead to man. It is as though we stood at the heart of a maze and no longer remembered how we had come there.

  THE DREAM ANIMAL

  It will now be seen that in spite of the dramatic press announcements which thundered the end of Darwinism and of missing links, our little homuncular elf proved nothing of the kind. Even if he turned out to be on the main line of evolutionary ascent leading to man—and this is still exceedingly doubtful—he has, at present, nothing to tell us about the human brain. He is small, he is not by any stretch of imagination a man, and if he did indeed become one, the event still lay millions of years in the future. No amount of headlines can turn the little creature from Tuscany into a human being without recourse to evolutionary change. The writers who had seized upon the “little man” as a refutation of Darwin’s general thesis had, at best, been merely acclaiming a new “missing link.”

  We must now examine, however, some recent aspects of the problem to which I have previously given attention: the mystery which enshrouds the rise of the human brain. A most perceptive philosopher once remarked that the truth about man is inside him. This may well prove to be the case, but the difficulty is to get the secret out, if indeed it lies there, and once it is revealed, to be sure that it is read correctly.

  Every so often out of the millions of the human population, a six-year-old child or a teen-age youth dies of old age. The cause of this curious disease, known as progeria, or premature aging, is totally unknown. Clinical cases are reported of complete hairlessness, wrinkled and flabby skin, along with senile changes in the heart and blood vessels. Medical science has observed in these rare cases an enormous increase in the velocity of aging, but the mechanism involved remains as yet undiscovered, though the cause may lie somewhere among the ductless glands.

  The affliction, rare though it is, reveals a mysterious clock in the body, a clock capable of running fast or slow, shortening life or extending it and, like the more visible portions of our anatomy, being subjected to evolutionary selection. This clock, however, has another even more curious aspect: it may affect the growth rate of particular organs. In this way certain peculiar animal specializations have appeared, such as the huge antlers of the extinct Irish elk, or the dagger-like fangs of the saber-toothed tiger.

  Man, too, has a curious specialization of a more abstract and generalized type, his brain. If this brain, a brain more than twice as large as that of a much bigger animal—the gorilla—is to be acquired in infancy, its major growth must take place with far greater rapidity than in the case of man’s nearest living relatives, the great apes. It must literally spring up like an overnight mushroom, and this greatly accelerated growth must take place during the first months after birth. If it took place in the embryo, man would long since have disappeared from the planet—it would have been literally impossible for him to have been born. As it is, the head of the infant is one of the factors making human birth comparatively difficult. When we are born, however, our brain size, about 330 cubic centimeters, is only slightly larger than that of a gorilla baby. This is why human and anthropoid young look so appealingly similar in their earliest infancy.

  A little later, an amazing development takes place in the human offspring. In the first year of life its brain trebles in size. It is this peculiar leap, unlike anything else we know in the animal world, which gives to man his uniquely human qualities. When the leap fails, as in those rare instances where the brain does not grow, microcephaly, “pinheadedness,” is the result, and the child is then an idiot. Somewhere among the inner secrets of the body is one which keeps the time for human brain growth. If we compare our brains with those of other primate relatives (recognizing, as we do, many similarities of structure) we are yet unable to perceive at what point in time or under what evolutionary conditions the actual human forerunner began to manifest this strange postnatal brain expansion. It has carried him far beyond the mental span of his surviving relatives. As our previously quoted authority, Dr. Tilly Edinger of Harvard, has declared, “the brain of Homo sapiens has not evolved from the brains it is compared with by comparative anatomy; it developed within the Hominidae, at a late stage of the evolution of this family whose other species are all extinct.”

  We can, in other words, weigh, measure and dissect the brains of any number of existing monkeys. We may learn much in the process, but the key to our human brain clock is not among them. It arose in the germ plasm of the human group alone and we are the last living representatives of that family. As we contemplate, however, the old biological law that, to a certain degree, the history of the development of the individual tends to reproduce the evolutionary history of the group to which it belongs, we cannot help but wonder if this remarkable spurt in brain development may not represent something roughly akin to what happened in the geological past of man—a sudden or explosive increase which was achieved in a relatively short period, geologically speaking. We have already opened this topic in our discussion of the Darwin-Wallace argument. Let us now see what new evidence bears upon the facts we set forth there.

  In discussing the significance of the Piltdown hoax and its bearing upon the Darwin-Wallace controversy, I used the accepted orthodox geological estimate of the time involved in that series of fluctuating events which we speak of popularly as the “Ice Age.” I pointed out that almost all of what we know about human evolution is confined to this period. Long though one million years may seem compared with our few millennia of written history, it is, in geological terms, in evolutionary terms, a mere minute’s tick of the astronomical clock.

  Among other forms of life than man, few marked transformations occurred. Rather, the Ice Age was, particularly toward its close, a time of great extinctions. Some of the huge beasts whose intercontinental migrations had laid down the first paths along which man had traveled, vanished totally from the earth. Mammoths, the Temperate Zone elephants, dropped the last of their heavy tusks along the receding fringes of the ice. The long-horned bisons upon whose herds man had nourished himself for many a long century of illiterate wanderings, faded back into the past. The ape whose cultural remnants at the beginning of the first glaciation can scarcely be distinguished from chance bits of stone has, by the ending of the fourth ice, become artist and world rover, penetrator of the five continents, and master of all.

  There is nothing quite like this event in all the time that went before; the end of brute animal dominance upon earth had come at last. For good or ill, the growth of forests or their destruction, the spread of deserts or their elimination, would lie more and more at the whim of that cunning and insatiable creature who slipped so mysteriously out of the green twilight of nature’s laboratory a short million years ago.

  A million years is a short time as evolution clocks its progress. We assume, of course, that below that point the creature which was to become man was still walking on his hind feet, but there is every reason to think that the bulging cortex which would later measure stars and ice ages was still a dim, impoverished region in a skull box whose capacity was no greater than that of other apes. Still, a million years in the life history of a single active species like man is a long time, and powerful selective forces must have been at work as ice sheets ground their way across vast areas of the temperate zones. But suppose, just suppose for a moment, that this period of the great ice advances did not last a million years—suppose our geological estimates are mistaken. Suppose that this period we have been estimating at one million years should instead have lasted, say, a third of that time.

  In that case, what are we to think of the story of man? Into what foreshortened and cramped circumstances is the human drama to be reduced, a drama, moreover, which, besides evolutionary change, involves time for the spread of man into the New World? Such an episode, it is obvious, would involve a complete reexamination of our thinking upon the subject of human evolution. In 1956 Dr. Cesare Emiliani of the University of Chicago introduced just this startling factor into the dating of the Ice Age. He did it by the application of a new dating process developed in the field of atomic physics.1

  The method, it should be explained at the outset, is not the carbon-14 technique which has become so widely publicized in the last decade. That method has applications which, at best, can carry us back around thirty to forty thousand years. The new technique elaborated in the University of Chicago laboratories involves oxygen-18. By studying the amount of this isotope in the shells of sea creatures it was found that the percentage of oxygen-18 in the limy shell of, say, an oyster would reveal the temperature of the water in which the oyster had lived when its shell was being secreted. This is because oxygen-18 enters chemical reactions differently at different temperatures. For example, as the temperature of the water increases, the oxygen-18 in the shell decreases.

  By using marine cores, specimens of undisturbed sediments brought up from the ocean floor, Dr. Emiliani has been able to subject these chalky deposits full of tiny shells to careful oxygen-18 analysis. He has found, as he analyzed the chemical nature of the seas’ “long snowfall,” that is, the age-long rain of microscopic shells falling gently to the sea bottom, that marked changes in water temperature could be discerned for different periods in the past. As he studied layer after layer of the chalky ooze brought up in sequential order from the depths, he found that the times of maximum ice expansion on the continents coincided with periods of marked cold beyond that of the present, as revealed in the oxygen-18 content of the minute shells from the ocean floor.

  Studying Atlantic and Caribbean cores, Emiliani came to the conclusion that the earliest great cold period, most probably coinciding with the onset of the first glaciation in Europe, was probably no earlier than about three hundred thousand years ago. Oxygen-18, of course, indicates periods of relative warmth or cold, not years. The dating triumph was achieved by the well-known carbon-14 technique for the upper levels of the deposits within the forty-thousand-year range, since carbon-14 also occurs in the chalk ooze.

  By establishing the beginning of the last ice recession at about twenty thousand years, it was possible, as a result of the undisturbed uniform nature of the sea deposits, to project the datings backward by the combination of the cold graph and the apparent rate at which the deposits had been laid down, as determined from the carbon dates of the more recent levels. The study reveals a considerable degree of regularity in the waxing and waning of the ice sheets at intervals of about fifty to sixty thousand years.

  Dr. Emiliani and his co-workers have thus produced an Ice Age chronology startlingly different from orthodox estimates, but one which is being widely and favorably considered. The newer scheme allows about six hundred thousand years for the total of Ice Age time. Actually the modification is more striking than this figure would indicate. Older figures placed the first, or Gunz glaciation, distant from us at the bottom of the Ice Age by almost a million years. The new chronology would place this ice sheet only about three hundred thousand years remote and then allow perhaps three hundred thousand more years, much less accurately computable and quite indefinite, for certain vague preglacial events. These might include our oldest traces of the Australopithecine man-apes and the first dim traces of crude pebble and bone tools, possibly made by some, at least, of these South African anthropoids.

  As we have already indicated, most of our collection of human fossils is derived from the last half of the Pleistocene, even by the old chronology. In this new arrangement the bulk of this material is found to be less than two hundred thousand years old. Man, in Dr. Emiliani’s own words, had “the apparent ability to evolve rapidly.” This is almost an understatement. The new chronology would appear to suggest a spectacular, even more explosive development than I have previously suggested.

  Unfortunately the full outlines of this story cannot, as yet, be made out. Our fossils are too scattered and too few. If the Fontechevade cranium from the French third interglacial represents a man essentially like ourselves, as in brain he appears to be, we can date our species as in existence perhaps seventy thousand years ago, though its total diffusion, in terms of area, at that date would be unknown. If the problematical Swanscombe skull—discovered in England—whose face is missing but whose cranial capacity falls within the modern range, should prove, in time, to be also of our own species, “modern” man would have been in existence perhaps one hundred twenty thousand years before the present.

  Even if the men of this period should, in the end, prove to have a face somewhat more massive than that of modern man, an essentially modern brain at so early a date can only suggest, in the light of Emiliani’s new datings, that the rise of man from a brain level represented in earliest preglacial times by the South African man-apes took place with extreme rapidity. Either this occurred, or other fossil forms are not on the main line of human ascent at all. This latter theory, if we still try to cling to a slow type of human evolution, would imply that the true origin of our species is lost in some older pre-Ice Age level, and that all the other human fossils represent side lines and blind alleys of development, living fossils already archaic in Pleistocene times.

  Some, contending for this view, have pointed out that carbon-14 datings close to the forty-thousand-year mark have recently been recorded in America. This, it has been argued, suggests a remarkably wide and early diffusion for man, if he is really so young as is now suggested. Just lately, however, some of the earliest carbon-14 dates from the Southwest have been challenged. Professor Frederick Zeuner of the University of London has recently (1957) reported that carbon samples subjected to alkaline washing give dates much earlier than they should actually be. Some of the carbon-14 necessary for accurate dating is apparently removed by subjection to this treatment, thus raising the age of the sample. As a consequence, some of the very earliest American dates from the Southwest may be subject to upward revision. There is no doubt that man had reached America in the closing Ice Age, but these earlier dates will be subject to serious scrutiny.

  Interestingly enough, the Keilor skull from Australia, once supposed to be a very early third interglacial man of our own species, has now been elevated, on the basis of carbon dates, to definitely postglacial times. Thus, on this remote continent, there is now no reliable evidence of extremely ancient human intrusion. Furthermore, if we turn to the Old World and seek to carry men much like ourselves further back toward the first glaciation, we have to ask why we so rapidly descend into seemingly cultureless or almost cultureless levels. If man approximating ourselves is truly much older than we imagine, it is conceivable that his physical remains might for long escape us. It seems unlikely, however, that a large-brained form, if widely diffused, would have left so little evidence of his activities. It would appear, then, that within the very brief period between about five hundred thousand and one hundred fifty thousand years ago, man acquired the essential features of a modern brain. Admittedly the outlines of this process are dim, but all the evidence at our command points to this process as being surprisingly rapid.

  Such rapidity suggests other modes of selection and evolution than those implied in the nineteenth-century literature with its emphasis on intergroup “struggle” which, in turn, would have demanded large populations. We must make plain here, however, that to reject the older Darwinian arguments is not necessarily to reject the principle of natural selection. We may be simply dealing with a situation in which both Darwin and Wallace failed, in different ways, to see what selective forces might be at work in man. Most of the Victorian biologists were heavily concerned with the more visible aspects of the struggle for existence. They saw it in the ruthless, expanding industrialism around them; they tended to see nature as totally “red in tooth and claw.”

  The anthropologist had yet to subject native societies to careful scrutiny, or to learn that people of different cultures were remarkably like ourselves in their basic mental make-up. They were often regarded as mentally inferior, living fossils pushed to the wall and going under in the struggle with the dominating white. Wallace, as we have already seen, stood somewhat outside this Victorian prejudice, and having himself endured economic want, almost alone among the great biologists of his time, sought for another key to the development of man.

  His thoughts led him in a somewhat mystical direction, yet certain of the facts he recorded were valid enough. He wrote early, however, so that natural explanations which could now be offered were, understandably, not available to him at that time. It is impressive that Wallace observed, though he did not understand, what we today call the pedomorphic features of man—his almost hairless body, his helpless childhood, his surprisingly developed brain—which he rightly judged to be in some manner related to the uniqueness of man. His conclusion that the linguistic ability of natives is in no way inferior to that of “higher” races—a commonplace today—was, in its own time, a courageous statement made in considerable contradiction to beliefs widely held even among scientists.

 
