
Lone Survivors

by Chris Stringer


  First we’ll look at the brain, because one leading theory about our African origins—that of the archaeologist Richard Klein—argues that the development of modern human behavior came about suddenly around 50,000 years ago, as a result of genetic mutations that enhanced the workings of our brain, essentially making us “modern” at a stroke. A similar view is espoused by the neuroscientist Fred Previc, who highlights the importance of the neurotransmitter dopamine to human creative thought and hypothesizes that it reached critical levels about 80,000 years ago, driving behavioral evolution to modernity. Unfortunately it is very difficult to test such ideas properly from the surviving evidence, since while we can make a real or a virtual model of the inside of a fossil skull, that will only reflect the external shape and proportions of the ancient brain that was once inside that skull. Such a model can tell us nothing about the internal workings and wirings of the once-living brain, which would have contained billions of interconnected nerve cells. However, from such data we do know that during human evolution our brains have certainly increased in overall volume relative to body mass (this ratio is known as the encephalization quotient, or EQ). Early humans had EQs of only about 3.4 to 3.8, and this even included H. heidelbergensis individuals, who had human-sized brains but much larger bodies than the average today. More evolved humans such as our African ancestors prior to 200,000 years ago and Neanderthals had EQs between about 4.3 and 4.8, and when we arrive at early moderns such as those from Skhul and Qafzeh and the Cro-Magnons, EQ reaches its highest values at around 5.3 to 5.4.
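
  (For readers who want the arithmetic behind these numbers: an EQ compares actual brain mass with the brain mass expected for a mammal of the same body mass. The text does not say which scaling equation underlies the figures quoted here, so take the following, based on one widely used formulation, Martin's, as an illustrative sketch rather than the author's own calculation:

\[ \mathrm{EQ} = \frac{E}{11.22\,M^{0.76}} \]

where \(E\) is brain mass in grams and \(M\) is body mass in kilograms. On this formula, a 1,350 g brain in a 60 kg body implies an expected brain mass of about 11.22 × 60^0.76 ≈ 252 g, and hence an EQ of about 1,350/252 ≈ 5.4, consistent with the values given above for early modern humans.)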

  Since then, H. sapiens seems to have leveled off at those values or has even suffered a slight decline in EQ. But in brains, as in many other things, size isn’t everything, and we can infer that there must also have been significant reorganizations in the human brain for activities like toolmaking and speech. In order to maximize the surface area of the outermost cortical layer of our brain (the “gray matter,” which includes nerve cells and their interconnections), it is complexly folded into convolutions, thus allowing our cortical surface area to be about four times that of a chimpanzee’s, matching the increase in overall brain volume. While there have been many careful studies of the impressions of these convolutions of sulci (furrows) and gyri (ridges) on the inner surface of fossil braincases, such markings are often faint and difficult to interpret. Work done in the last century on the fake Piltdown skull’s convolutions found many supposedly apelike features, and yet we now know that the skull in question was actually that of a recent human, so much of that work consisted of wishful thinking or even fantasy—I even compared some of the old work to the pseudoscience of phrenology. Yet another approach to the analysis of ancient brains has focused on changes in the relative proportions of the various components rather than their convolutions, as these can be determined quite well from the preserved inner surface of the braincase or from CT data.

  The cortex or cerebrum is by far the largest part of the brain in humans and is divided centrally into two cerebral hemispheres—the left and right—which have different specializations, but which are interconnected by bundles of nerve fibers. The cerebral hemispheres are made up of four lobes, corresponding in position to the cranial bones of the same name: the frontal, parietal, temporal, and occipital lobes. We know quite a lot about the general roles that these lobes play in our brain: the frontal is involved with thinking and planning; the parietal in movement and the senses; the temporal with memory, hearing, and speech; and the occipital with vision. Tucked underneath and behind the cerebral hemispheres is the smaller cerebellum, which is predominantly concerned with regulation and control of the body.

  However, recent studies have shown that the cerebellum also plays a role in many so-called higher functions and is extensively interconnected with the cerebrum. As well as regulating body functions, it seems the cerebellum is also concerned with the processes of learning. The increase in both gross brain size and EQ really took off about 2 million years ago, soon after the first clear archaeological evidence of both meat eating and toolmaking appeared in Africa. All areas of the brain enlarged, but proportionately the cerebral hemispheres increased more than the cerebellum. The pace of cerebral enlargement accelerated in heidelbergensis and peaked in Neanderthals and early moderns, seemingly correlated with an increase in behavioral complexity. But interestingly, in recent humans this long-term pattern has reversed, since the cerebellum today is proportionately larger. At the moment it is not clear what, if anything, this change signifies. On average, human brains have shrunk some 10 percent in size over the last 20,000 years, so is the cerebellum maintaining its size better than the cerebrum, or, as some have claimed, does a relatively larger cerebellum perhaps provide greater computational efficiency? We simply do not know the answer yet.

  It is certain, however, that the overall shape of the brain and the braincase that envelops it changed from archaic to modern humans, becoming shorter and higher, narrower lower down and broader higher up, with a particular expansion in the upper parietal area. Brain shape is inevitably closely matched to skull shape, since the two must develop and grow in harmony—but which is the driver and primary determinant of their closely matched shapes? This is not a simple question to address, since even the braincase and the brain do not grow and exist in isolation. For example, the base of the skull anchors the upper parts of the vocal, digestive, and respiratory tracts and articulates the head and the spine, while the front of the skull contains the teeth and jaws and the muscles that work them. Those factors seem to have constrained brain shape from changing much in those regions.

  But the upper areas of the skull and brain are not so constrained, and my Ph.D. results back in 1974 highlighted changes in the frontal, parietal, and occipital bones of modern humans. Each contributed to the increased globularity of the braincase in modern sapiens, and our domed forehead is particularly noticeable. There are data from my collaborative research with Tim Weaver and Charles Roseman that suggest that many of these cranial changes might not be significant in evolutionary terms and could simply be the result of chance changes (genetic drift) as modern humans followed their own separate path in isolation in Africa. I will return to this issue in chapter 9, but nevertheless the cranial shape of modern humans is so idiosyncratic, compared with the patterns found in all the other known human species, that I do think it is worth considering whether brain evolution might lie behind our globular vault shape. This is particularly so as research by Philipp Gunz and his colleagues suggests that this shape change began to separate archaic and modern skulls soon after they were born.

  It seems likely that important changes occurred in the frontal lobes of our brain, given their importance in forward thinking, yet I was surprised by CT studies of fossil skulls in which I was involved. These showed that the profile and relative size of the frontal lobes inside the brain cavity had changed much less in modern humans, compared with the obvious external changes in forehead development. At the rear of the skull, the occipital bone is relatively narrower and more evenly curved in modern humans. In erectus and heidelbergensis skulls the occipital was more sharply angled, and this must have been partly related to the powerful neck muscles that attached across the bone in primitive humans. And in Neanderthals the profile of the occipital was influenced by the rather bulging occipital lobes of their brains, the significance of which is still debated—these lobes contain the visual cortex, for example. In modern humans, the parietals are heightened and lengthened, while the arch they make is narrower at the base but wider higher up. The paleoneurologist Emiliano Bruner investigated these aspects of ancient brain shape using geometric morphometrics. He confirmed earlier, more traditional, studies that blood vessel impressions on the inside of the parietals (reflecting blood supply to the parietal lobes) are altered in modern humans, forming a much more complex network.

  So is there anything in the function of the parietal lobes that might explain their expansion in the modern human brain? They are involved in integrating sensory information, in processing data from different parts of the brain, and in social communication, all of which could be reflected in the behavioral changes recognized with the arrival of modern humans. The cognitive archaeologists Thomas Wynn and Frederick Coolidge argued that a key change in the modern human mind must have been the development of an episodic working memory. Memory in humans can be subdivided into declarative memories, such as basic facts and information, and procedural memories, such as strings of words or actions (such as making a tool or route finding). It is known from brain studies that these are separate modules, in the sense that brain damage may interfere with one but not the other, and brain imaging studies show they are controlled by different pathways. It is very likely that both of these types of memory are enhanced in the modern human brain, but there is one special and important type of declarative memory called episodic, personal, or autobiographical memory—a storylike reminiscence of an event, with its associated emotions. This can be used mentally to rerun past events, and, just as important, it can also rehearse future events—a sort of “inner-reality” time machine that can run backward or project possible scenarios forward, and which seems closely linked to concepts of self-awareness (“consciousness”). As we have seen already, archaeological evidence suggests that the reach of modern humans across the landscape in terms of food gathering, sourcing raw materials, and social networks increased during the Middle Stone Age and continued to increase during the Later Stone Age in Africa and contemporaneous industries outside of Africa, such as the Upper Paleolithic. Such developments could reflect the arrival of a modern kind of episodic memory. Moreover, the ability to conjure up vivid inner-reality narratives could also have been critical in the development of religious beliefs, since imaginary scenarios could be created as well as actual ones. Once people could foresee their own deaths, religious beliefs that provided reassurance about such events could perhaps have been selected for their value in promoting survival.

  Experiments and observations suggest that the parietal lobes are indeed involved in episodic memory, but it is clear that they are not the only location implicated, since the recall of such memories involves a network of links between the frontal, parietal, and temporal lobes. Moreover, even episodic memory is not a single straightforward path. For example, some patients with selective parietal lobe damage can recall a particular event in detail from a general cue such as “your birthdays” (top-down recall), whereas others need a detailed cue such as a photo of a particular birthday cake (bottom-up recall) to remember one special event properly. But the lower parts of the parietal lobes are also implicated in another vital property of the modern human brain: inner speech. This is our inner voice that consciously and unconsciously guides so much of our thinking and decision making; in a sense it provides one of the most vital bits of software for the hardware of our brains. Indeed, there is evidence that an inability to create and use this program—for example, in people who were born deaf, mute, and blind, and who have been given little sensory stimulation from other humans—greatly limits higher brain functions. Even so, such severely impaired people, when given appropriate inputs from an early age, can develop and use their own codes of inner speech, for example, by recalling the symbols of sign language that they have been taught, instead of spoken words.

  Stanley Ambrose, the champion of the impact of the Toba eruption on modern human evolution, also argued for the importance of memory and the development of particular parts of the brain in the success of modern humans. In his view, what was most important was the integration of working memory with prospective memory (dealing with near-future tasks) and constructive memory (mental time traveling), which are centered in the front and lower rear of the frontal lobes. Such links would have facilitated everything from the construction of composite artifacts to the development of the fullest levels of mind reading and social cooperation. In his view, archaic humans like the Neanderthals had developed the memory for short-term planning and the production of composite artifacts, but they lacked the full brain integration and hormonal systems that promoted the levels of trust and reciprocity essential for the much larger social networks of modern humans.

  All of this shows how complex our brains are, and what a long way we still have to go to understand their workings in living humans, let alone ones who died 100,000 years ago. Unfortunately for Richard Klein’s views of a significant cognitive event about 50,000 years ago, the heightening of the frontal and the expansion of the parietal lobes had apparently already occurred 100,000 years earlier, as shown by the shape of the early modern skulls from Omo Kibish and Herto. Overall, there is little evidence so far for any detectable changes in the modern human brain when Klein argues these should have occurred. Brain volume and EQ apparently increased fairly steadily in modern humans until the last 20,000 years, after which they seem to have declined somewhat, and similarly the trend in increasing cerebellum/cerebrum ratio seems to have altered only in the last 20,000 years or so.

  Therefore, all we can say is that there is no obvious physical evidence for such a change in the workings of the human brain 50,000 years ago. Perhaps some genetic support will eventually emerge; there are claims that the gene DRD4, which when negatively mutated is linked with attention-deficit/hyperactivity disorder (ADHD), underwent changes around that time. DRD4 affects the efficacy of the neurotransmitter dopamine in the brain, and it has been suggested that a positive effect of such mutations would be to encourage novelty seeking and risk taking—perhaps important qualities for a migration out of Africa. John Parkington is one of several archaeologists and biologists who have argued that the fish oils obtained by early moderns when they began to seriously exploit marine resources would have boosted the brain power of early Homo sapiens—and there are further claims that omega-3 fatty acids would have conferred additional benefits in terms of health and longevity. But unfortunately, at the moment, such changes can only be inferred from indirect evidence such as the archaeological record, which itself can be interpreted in very different ways. Here we need to return to two of the key elements of modernity that might be decipherable from that archaeological evidence: the presence of symbolism and, by inference, complex language.

  In chapters 5 and 6, we discussed some of the key behavioral “signatures” of modernity that are usually highlighted by archaeologists—things like figurative art and burials with grave goods. And we saw that there is not yet any strong evidence for figurative or clearly representational art anywhere before the European material dated at about 40,000 years. Equally, there is no evidence of symbolic burials older than the early modern examples known from Skhul and Qafzeh at about 100,000 years, even if older African sites like Herto are suggestive of the ritual treatment of human remains. However, the processing and use of red pigments in Africa does go back considerably farther, to beyond 250,000 years, at sites like Kapthurin and Olorgesailie in Kenya. The record is sporadic after this but emerges at Pinnacle Point in South Africa at about 160,000 years, and much more strongly at sites in North and South Africa from about 120,000 years. In particular, there is the rich material from Blombos Cave, South Africa, which includes about twenty engraved ocher fragments and slabs, dated to around 75,000 years ago, with some extending back to 100,000 years. These fragments seem to be generally accepted as symbolic in intent rather than accidental or utilitarian, but many of the earlier examples are only suggestive of symbolic meaning, rather than definitive.

  The evidence seems much stronger in the case of tick shell beads, present at the known limits of the early modern human range at least 75,000 years ago, from cave sites in Morocco, Israel, and South Africa. But even here, the context in which they were being used becomes critical in deciding what level of symbolic meaning they carried. The archaeologist Paul Pettitt suggests an alternative way of looking at symbolic intent by moving away from judging an absolute (and contentious) presence/absence, and instead deconstructing different levels of symbolic meaning, in line with the Robin Dunbar stages of “mind reading” that we discussed in chapter 5. This deconstruction is valuable because it also allows an evolutionary sequence for symbolism to be considered, rather than just an on–off switch, where symbolism is either not there at all or fully developed, with no intermediates. Pettitt points out that symbols can only function as such in recent humans if the “writer” and “reader” are in accord over meaning, but in interpreting archaeological finds we tend to focus on the writer, without considering those who might receive the intended message. He also warns that unless a symbol is repeated at a number of different sites in a given time period, we should be wary of assuming how widespread the behavior was or how efficiently it conveyed its meaning.

 
