
The Scars of Evolution


by Elaine Morgan


  But Dart’s find had come first. It held the field long enough to give rise to an imaginative concept of how man’s earliest ancestors may have lived. The fossils Dart and Broom found in South Africa came from a hot, dry area, buried in cave debris, mingled with horns and baboon skulls and other indications of a grassland habitat. Perhaps man was the ape who moved out onto the savannah …?

  This theory proved instantly acceptable as an answer to the perennial question of how man originated. In the public mind the images inspired by Kipling’s The Jungle Book and Edgar Rice Burroughs’s Tarzan of the Apes gave place to a vision of our earliest predecessors striding across the grassy plains which cover half the total area of the African continent.

  Once such a concept takes hold it becomes extremely hard to dislodge. Scientists use it, tentatively at first, as a model. They construct hypotheses around it, publish papers on it, commit themselves, acquire a vested interest in its continued acceptance. It becomes the conventional wisdom. Savannah explanations were advanced to account for hairlessness, bipedalism, sexual bonding, tool using, and a whole spectrum of other human features.

  If the Ethiopian discoveries had been made before the South African ones, it is probable that none of that would have happened. The Hadar site, in the Afar Triangle, is now arid, but there are geological deposits there which prove that it was once a lakeside or riverine habitat. The same thing is true of the Olduvai Gorge.

  Nearly all the fossil sites in the Rift Valley were wetter when the hominids were there than they are today. (A possible exception is Laetoli, where a couple of hominids left their footprints on a layer of newly deposited volcanic ash.)

  All the other Rift Valley sites, now desert or near-desert, were then green. Don Johanson summed up what is known of them: ‘Once they were lush lake regions, swarming with game, laced with winding rivers and thick stands of tropical forests.’ Many creatures other than the hominids left their bones at these sites. They include early ancestors of modern pigs and elephants and herd animals and rodents and horses. They also include species of crocodiles, fish, hippopotamuses, frogs, swamp snails, waterfowl, and the fossilised pollen of rushes and other water plants. The most famous of all the fossils to date is known as ‘Lucy’. Her almost complete skeleton was discovered in East Africa among the remains of crocodile and turtle eggs and crab claws. And in 1985, when researchers revisited the Taung site, they concluded that there, too, when the baby died the climate was humid rather than arid as it is today.

  So if the prospecting had started in the north and worked down, popular illustrations of groups of Australopithecus would have shown them reclining under a shady tree at the water’s edge, living perhaps on fruit and greenery and fish. Instead, they are depicted as shaggy creatures trekking through parched grass and a scatter of stunted thorn bushes, turning to scavenging and hunting to supplement their diet.

  There has been no retreat from the savannah theory; Lucy has simply been assimilated into it. It is argued that the Rift Valley relics do not represent a cross-section of the indigenous fauna at the time but are merely ‘death assemblages’, and that although Lucy died at the lake edge, that does not prove she lived there. The fossils must include the bones of swamp and lake species which lived on the spot, and the bones of savannah species which merely visited the place to drink, and died before they could go away again. We have no way of knowing to which category the early hominids belonged.

  Savannah theorists would argue that for every hominid that left its bones in the lakeside mud, ten or twenty may have left theirs on the arid plains. They say we should not be too surprised that no evidence of this can be produced because the corpses would have been devoured by carnivores and in that kind of environment the bones would not be preserved. It is a good argument; though it is not easy to understand why, if Lucy and her kind could so easily return to the lush places to die, they would not have chosen also to live in them.

  Meanwhile, the fossil-hunters’ world had again been shaken up – as in Raymond Dart’s time – by a ‘preposterous’ suggestion. It generated a lot of heat, not only because it ran contrary to their own beliefs, but also because it came from outsiders who had never sweated on a fossil site or pondered over the cusps of a prehistoric molar. It came from a new breed of anatomists armed with a shining new investigative technique – biochemical analysis – and it concerned the dating of the emergence of man.

  There has never been agreement between the experts concerning the exact shape of the anthropoid family tree. Louis Leakey, for example, never accepted that any of the Australopithecus fossils from any of the sites belonged on the line leading to man. They all belonged, he believed, to side branches which had later become extinct, and man’s first true ancestor – like the Holy Grail – was still to be sought for. Others were quite ready to believe that Lucy or some other of the Hadar specimens might have been ancestral to Homo sapiens.

  What none of them knew with any certainty was how long ago the split between apes and man had taken place. It was generally assumed that it happened a long time ago. People are very different from apes, and evolution is a slow process. Throughout the ’60s and into the ’70s estimates of up to 30 or 50 million years ago were being put forward as possible dates for the ape/man split, and nobody raised an eyebrow. ‘At least fifteen million’ was the bottom line.

  Suddenly molecular biologists were publishing their own figures. They asserted that only five million years had passed since the apes and ourselves had a common ancestor. They worked it out by comparing proteins and nucleic acids from the bodies of living humans and of living African apes, and measuring the differences between them on the assumption that the longer two species have been evolving separately, the greater the differences will be. Between man and chimpanzee the difference turned out to be very small – only one per cent.
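  The arithmetic behind such a ‘molecular clock’ estimate can be sketched in a few lines. This is purely an illustration: Sarich and Wilson’s actual calibration rested on immunological comparisons of blood proteins, and the figures below (a six per cent difference standing for a thirty-million-year-old split) are hypothetical round numbers, not their data.

```python
# Illustrative molecular-clock arithmetic.
# Assumption: genetic differences accumulate at a roughly constant rate,
# so time since divergence scales linearly with measured difference.

def divergence_time_myr(observed_diff, calibration_diff, calibration_time_myr):
    """Estimate time since two lineages split, in millions of years.

    observed_diff        -- measured difference between the two species (%)
    calibration_diff     -- difference for a pair whose split date is known (%)
    calibration_time_myr -- that known split date, in millions of years
    """
    return observed_diff / calibration_diff * calibration_time_myr

# Hypothetical calibration: a 6% difference corresponds to a 30-million-year
# split. A 1% human/chimpanzee difference then implies a split of:
print(divergence_time_myr(1.0, 6.0, 30.0))  # 5.0 million years
```

The proportionality is the whole of the method; the controversy turned not on the arithmetic but on whether the ‘constant rate’ assumption holds.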

  The ‘five million year’ claim was published by Vincent Sarich and Allan Wilson at the end of 1967, and it met with a mixed reaction from the palaeontologists. Most of them, if they heard about it at that time, ignored it. That is always the easiest and most time-honoured response, though not terribly constructive. A few of them flatly stated that it had to be nonsense because it disagreed with their own reading of the fossil record. Some hedged their bets by quietly revising their previous estimates of the divergence dates. For example, ‘not less than fifteen million years ago’ became ‘not more than fifteen million’.

  Almost all of them found five million years impossible to swallow. ‘They didn’t offer any reason why it couldn’t be true,’ Wilson commented, ‘they just felt it was somehow obvious it couldn’t be true.’ He had come up against exactly the same blank wall as Raymond Dart had done, and was reacting with the same sweet reason, excusing his attackers. ‘I don’t think,’ he said, ‘one can blame the person particularly because it’s the social context they’re in.’

  But his colleague Sarich did not have a Dartian temperament. He was not about to sit around for twenty years and then say he had not been in a hurry and had not expected to be believed. He expected to be believed, pronto. Sherwood Washburn, Professor of Anthropology at the University of California, said of him: ‘… he wanted to convert people faster than I think it was reasonable they should be converted.’ He was not interested in arguments about old skulls and teeth and jawbones, saying it didn’t matter what the fossils looked like. The old estimates should be abandoned at once because ‘… one no longer has the option’ of clinging to them.

  For a time there was bitter controversy. The new technique was first attacked as unreliable, then examined more attentively. The turning point came when two other scientists, Charles Sibley and Jon Ahlquist, applied a technique called DNA hybridisation, measuring the differences in the overall structure of the genetic material instead of a detailed nucleotide sequence. They arrived at a time-scale longer than that of Sarich and Wilson, but not much longer – between seven and nine million years, rather than between four and six million.


  Conversion took place piecemeal. There was a tendency to take the new evidence on board but to keep it in quarantine by expressing it in the subjunctive, on the lines of: ‘If we are to believe the molecular biologists, we should have to conclude that …’ The nearest thing to Keith’s ‘Professor Dart was right and I was wrong’ came in a paper by David Pilbeam. Pilbeam had special reason to be discouraged by the new time-scale, because it shattered a new and promising theory he had been working on, but in 1984 he wrote: ‘It is now clear that the molecular record can tell us more about hominid branching patterns than the fossil record does.’

  Sarich expressed the same idea from the opposite side of the barricades and in somewhat patronising terms. He said: ‘One doesn’t want to be foolish and say that the fossil record contributes nothing. But it is true that what it does contribute has been enormously overrated.’

  If one did say that the fossil record contributed nothing, one would be talking through one’s hat: it is central and vital to our understanding. Molecular biologists contribute to our knowledge of when the split took place, but they cannot tell us precisely where it took place; they cannot tell us in what kind of habitat it took place; and they could certainly never have told us, as the fossil-hunters did, that the hominids were walking on two legs for millions of years before they developed the big brain.

  The salient point is that we are now more able than ever before to close in on a probable time and place for the emergence of one key feature found only in human beings – bipedality.

  In the last couple of decades the molecular biologists have moved the date of the man/ape split sharply forward in time; the fossil-hunters have moved the date of the first bipedalists steadily backward in time. From both ends the gap has narrowed.

  It can now be said with some confidence that at some time between six or seven million years ago and three-and-a-half million years ago, probably somewhere in north-east Africa in the region of the Red Sea, certain anthropoids stood up and began walking erect. Something must have happened which meant that the near-universal mammalian mode of locomotion (walking on four legs) rather suddenly ceased to be efficient for them. They switched to a mode which was not merely different, but unique among mammals. It might be thought that the cause of such a dramatic change is likely itself to have been dramatic.

  The orthodox scenario says no. It claims the cause was a gradual dwindling of the forests, so that they could not accommodate so many arboreal animals; some bands of the apes might have dropped out of the branches and ventured out into open country.

  ‘And therefore,’ so the story goes, ‘they became bipedal.’ That is where the logic breaks down. Why should they? No other savannah animal has done so either before or since. It was very far from an easy option. The price for going bipedal – a price we are still paying today – is very high, and has been largely glossed over. In the next chapter an attempt will be made to count the cost.

  The story of the bones tells us much about the origins of man and it also tells us a few things about scientists. With few exceptions, when confronted with a maverick idea they are confident that gut instinct alone can tell them whether or not it is preposterous. Most of them feel that this absolves them of any obligation to examine it in detail or to give their reasons for rejecting it.

  Possibly nine times out of ten, they are right; but sometimes they are wrong. They were wrong in 1912 about Piltdown Man, and in the 1920s about the Taung skull, and in the ’70s about the date of our divergence from the ape. They are wrong today in clinging to the savannah theory without reassessing the flimsiness of the evidence in its favour.

  3

  The Cost of Walking Erect

  ‘For any quadruped to get up on its hind legs in order to run is an insane thing to do. It’s plain ridiculous.’

  Owen Lovejoy

  Walking on two legs does not seem by any means a difficult trick to perform when you belong to a species that has been practising it for a few million years. Unless, that is, you are fourteen months old and have just tumbled down for the 65th time; or unless you have broken a leg and, not having three others to fall back on, are reduced to hopping or hobbling with a stick; or unless you are drunk, or suffering from backache, or growing old.

  Because people have found it easy, bipedalism did not strike the early evolutionists as a difficult thing to explain. When Darwin wrote The Descent of Man he did not even bother to list it in his index. The sequence of events seemed clear enough at that time. The first development, it was assumed, was that the ancestral primates acquired greater dexterity and intelligence. They learned to make tools and wield weapons, and it is easier to cope with the weapons, if not the tools, while standing up with both feet on the ground and both arms free. ‘Then,’ Darwin concluded with a cautious double negative, ‘I can see no reason why it should not have been advantageous to the progenitors of man to have become more and more erect and bipedal.’

  He would have seen good reason if he had still been alive when the fossils of Lucy and her companions came to light. They had small brains, and even at sites where their bones were fairly thick on the ground, there was not the slightest trace of evidence of tool making or weapon wielding. This discovery reversed all previous assumptions about the sequence of events. It became clear that bipedalism did not emerge in the wake of other human-like acquirements, as a way of making them work better. It came first.

  It should not have occasioned quite as much surprise as it did. It was over half a century since Raymond Dart had inferred from his Taung baby’s skull that it must have belonged to a bipedal species, and over thirty years since his claims had in general terms been accepted.

  But there is a wide difference between a logical argument in a scientific magazine and the sight of the almost complete skeleton of a bipedal primate. Lucy and the other Hadar fossils – some over three million years old – were the earliest hominids ever discovered. (By contrast, the date of the Taung site has now been assessed at around one million years old.) The first bipedalists were not semihuman creatures. They were animals opting to walk on their hind legs.

  It was a costly option for them to take up, and we are still paying the instalments. The mammalian spine evolved over a hundred million years and reached a high degree of efficiency, on the assumption that mammals are creatures with one leg at each corner and that they walk with their spine in a horizontal plane. Under those conditions the blueprint is one that would command the admiration of any professional engineer. The vertebral column is designed on cantilever principles, as a single shallow arch supported by two pairs of movable pillars; the weight of the internal organs is vertically suspended from the arch and evenly distributed along the length of it. Such a mammal resembles a walking bridge.

  Our distant ancestors departed from this time-honoured mode of locomotion and converted themselves into walking towers, with a high centre of gravity and a narrow base. There are some other creatures which make use of bipedalism either some of the time or all the time – kangaroos, for example, and ostriches. But they do not proceed with their spines perpendicular; their total body weight is equally and fairly widely distributed around the point where their feet touch the ground. This gives them much greater stability, rather as a tight-rope walker improves his equilibrium by equipping himself with a long balancing pole.

  Human vertical locomotion presents more formidable problems, and any sensible engineer confronted with them would insist on starting from scratch. He would probably suggest a spinal column running down the centre of the trunk, with heart, lungs, liver and other organs arranged around it as symmetrically as possible. Supporting ligaments would be attached to the collar bones rather than the spine … and so on. Evolution, of course, does not work like that. Every re-adaptation is a process of make-do and mend.

  By now, the human skeleton has undergone a notable transformation. The single arch of the spinal column has been abandoned. Babies are still born with it, but they lose it soon after birth in favour of a perfectly straight spine – straighter than it will ever be again. When they learn to sit up, a forward curve develops near the top of the spine, and when they learn to stand they acquire a second forward curve near the base of it. These kinks are essential to prevent the bipedal primate from falling over.

  The lower vertebrae have grown bigger to sustain the unprecedented vertical pressure they now have to bear. The pelvic girdle has been moved into a different plane, and the iliac blades on each side of it have spread and flattened into a saucer-like shape to hold the main weight of the intestines. (Otherwise in a bipedal mammal they would be in danger of slumping into the bottom of the body cavity and prolapsing.)

  These improvements have made things a lot easier; it is safe to conjecture that the first few million years of bipedalism were the worst. But even today, metaphorically speaking, Nature should still be displaying the sign: ‘Reconstruction in Progress. The Management regrets any inconvenience that may be caused.’

  Of all the man-hours lost to industry through various forms of illness, the highest percentage derives from our mode of locomotion. Every doctor in general practice must have lost count of the times he has been assured: ‘I would be perfectly all right, doctor, if it wasn’t for my back.’

  The Dresden pathologist Schmorl demonstrated that the spinal column is the first organ in man to ‘age’ – a process which at least begins as a consequence of the cumulative effects of wear and tear. Pathological changes in the spinal column have been detected in people as young as eighteen years old. In a recent year in Britain, lower back pain occasioned the loss of nineteen million working days. In the United States it has been calculated that 70 per cent of American citizens are affected by it at some time in their lives.

  It has been argued that bipedalism was not a very surprising development in a creature as closely allied to the apes as we are, because when an ape moves through the trees by brachiating (hanging from the branches) it is travelling with its spine vertical, as we do. This is sometimes thought to have pre-adapted us for the new mode of locomotion and made it less of a problem.

 
