
Dragons of Eden


by Carl Sagan


  Moreover, there is a statistical correlation between brain mass or size and intelligence in human beings. The relationship is not one-to-one, as the Byron-France comparison clearly shows. We cannot tell a person’s intelligence in any given case by measuring his or her brain size. However, as the American evolutionary biologist Leigh van Valen of the University of Chicago has shown, the available data suggest a fairly good correlation, on the average, between brain size and intelligence. Does this mean that brain size in some sense causes intelligence? Might it not be, for example, that malnutrition, particularly in utero and in infancy, leads to both small brain size and low intelligence, without the one causing the other? Van Valen points out that the correlation between brain size and intelligence is much better than the correlation between intelligence and stature or adult body weight, which are known to be influenced by malnutrition, and there is no doubt that malnutrition can lower intelligence. Thus beyond such effects, there appears to be an extent to which larger absolute brain size tends to produce higher intelligence.

  Sensory and motor homunculi, after Penfield. These are two maps of the specialization of function in the cerebral cortex. The distorted mannequins are maps of how much attention is given in the cortex to various parts of the body; the larger the body part shown, the more important it is. At left is a map of the somatic sensory area, which receives neural information from the parts of the body shown; at right is a corresponding map for the transmission of impulses from brain to body.

  In exploring new intellectual territory, physicists have found it useful to make order-of-magnitude estimates. These are rough calculations that block out the problem and serve as guides for future studies. They do not pretend to be highly accurate. In the question of the connection between brain size and intelligence, it is clearly far beyond present scientific abilities to perform a census of the function of every cubic centimeter of the brain. But might there not be some rough and approximate way in which to connect brain mass with intelligence?

  The difference in brain mass between the sexes is of interest in precisely this context, because women are systematically smaller in size and have a lower body mass than men. With less body to control, might not a smaller brain mass be adequate? This suggests that a better measure of intelligence than the absolute value of the mass of a brain is the ratio of the mass of the brain to the total mass of the organism.
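
  As a quick illustration of this measure, the short Python sketch below computes the brain-to-body-mass ratio for a human being. The figures it uses are assumptions chosen only for illustration: a brain of roughly 1,375 grams (taking the brain volume quoted later in this chapter at about unit density) and an adult body mass of about 70 kilograms.

```python
# A rough sketch of the measure proposed above: the ratio of brain mass to
# total body mass. The human figures are assumptions for illustration only
# (a ~1,375 g brain and an assumed 70 kg adult body).

def brain_to_body_ratio(brain_mass_g, body_mass_g):
    """Dimensionless ratio of brain mass to total body mass."""
    return brain_mass_g / body_mass_g

human = brain_to_body_ratio(1375.0, 70_000.0)
print(f"human brain/body ratio ~ {human:.3f}")   # ~0.020, i.e. about 2 percent
```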

  The chart on this page shows the brain masses and body masses of various animals. There is a remarkable separation of fish and reptiles from birds and mammals. For a given body mass or weight, mammals have consistently higher brain mass. The brains of mammals are ten to one hundred times more massive than the brains of contemporary reptiles of comparable size. The discrepancy between mammals and dinosaurs is even more striking. These are stunningly large and completely systematic differences. Since we are mammals, we probably have some prejudices about the relative intelligence of mammals and reptiles; but I think the evidence is quite compelling that mammals are indeed systematically much more intelligent than reptiles. (Also shown is an intriguing exception: a small ostrich-like theropod class of dinosaurs from the late Cretaceous Period, whose ratio of brain to body mass places them just within the region of the diagram otherwise restricted to large birds and the less intelligent mammals. It would be interesting to know much more about these creatures, which have been studied by Dale Russell, chief of the Palaeontology Division of the National Museums of Canada.) We also see from the chart on this page that the primates, a taxon that includes man, are separated, but less systematically, from the rest of the mammals; primate brains are on the average more massive by a factor of about two to twenty than those of nonprimate mammals of the same body mass.

  A scatter diagram of brain mass versus body mass for primates, mammals, birds, fish, reptiles, and dinosaurs. The diagram has been adapted from the work of Jerison (1973), with some points added for the dinosaurs and now-extinct members of the family of man.

  When we look more closely at this chart, isolating a number of particular animals, we see the results on this page. Of all the organisms shown, the beast with the largest brain mass for its body weight is a creature called Homo sapiens. Next in such a ranking are the dolphins.* Again I do not think it is chauvinistic to conclude from evidence on their behavior that humans and dolphins are at least among the most intelligent organisms on Earth.

  A closer look at some of the points in the diagram on this page. Saurornithoides is the ostrich-like dinosaur mentioned in the text.

  The importance of this ratio of brain to body mass had been realized even by Aristotle. Its principal modern exponent has been Harry Jerison, a neuro-psychiatrist at the University of California at Los Angeles. Jerison points out that some exceptions exist to our correlation—e.g., the European pygmy shrew has a brain mass of 100 milligrams in a 4.7 gram body, which gives it a mass ratio in the human range. But we cannot expect the correlation of mass ratio with intelligence to apply to the smallest animals, because the simplest “housekeeping” functions of the brain must require some minimum brain mass.

  The brain mass of a mature sperm whale, a close relative of the dolphin, is almost 9,000 grams, six and a half times that of the average man. It is unusual in total brain mass, though not (compare with the figure below) in its ratio of brain to body weight. Yet the largest dinosaurs had brains only about 1 percent the mass of the sperm whale's. What does the whale do with so massive a brain? Are there thoughts, insights, arts, sciences and legends of the sperm whale?
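
  The Python sketch below works through the arithmetic behind these two cases. The shrew and whale brain figures are taken from the text above; the whale's body mass (about 40 tonnes) and the human figures are assumptions used only to show why the whale's enormous brain is nevertheless unremarkable as a fraction of its body.

```python
# Hedged arithmetic for the cases discussed above. The shrew figures come from
# the text; the sperm-whale body mass (~40 tonnes) and the human figures are
# assumptions made only to set the whale's brain/body ratio in context.

def ratio(brain_g, body_g):
    return brain_g / body_g

shrew = ratio(0.100, 4.7)            # ~0.021, in the human range
human = ratio(1375.0, 70_000.0)      # ~0.020 (assumed 70 kg adult body mass)
whale = ratio(9000.0, 40_000_000.0)  # ~0.0002, a large brain in a far larger body

for name, r in [("pygmy shrew", shrew), ("human", human), ("sperm whale", whale)]:
    print(f"{name:12s} brain/body ratio ~ {r:.4f}")
```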

  The criterion of brain mass to body mass, which involves no considerations of behavior, appears to provide a very useful index of the relative intelligence of quite different animals. It is what a physicist might describe as an acceptable first approximation. (Note for future reference that the Australopithecines, who were either ancestral to man or at least close collateral relatives, also had a large brain mass for their body weight; this has been determined by making casts of fossil braincases.) I wonder if the unaccountable general appeal of babies and other small mammals—with relatively large heads compared to adults of the same species—derives from our unconscious awareness of the importance of brain to body mass ratios.

  The data so far in this discussion suggest that the evolution of mammals from reptiles over two hundred million years ago was accompanied by a major increase in relative brain size and intelligence; and that the evolution of human beings from nonhuman primates a few million years ago was accompanied by an even more striking development of the brain.

  The human brain (apart from the cerebellum, which does not seem to be involved in cognitive functions) contains about ten billion switching elements called neurons. (The cerebellum, which lies beneath the cerebral cortex, toward the back of the head, contains roughly another ten billion neurons.) The electrical currents generated by and through the neurons or nerve cells were the means by which the Italian anatomist Luigi Galvani discovered electricity. Galvani had found that electrical impulses could be conducted to the legs of frogs, which dutifully twitched; and the idea became popular that animal motion (“animation”) was in its deepest sense caused by electricity. This is at best a partial truth; electrical impulses transmitted along nerve fibers do, through neurochemical intermediaries, initiate such movements as the articulation of limbs, but the impulses are generated in the brain. Nevertheless, the modern science of electricity and the electrical and electronic industries all trace their origins to eighteenth-century experiments on the electrical stimulation of twitches in frogs.

  Only a few decades after Galvani, a group of literary English-persons, immobilized in the Alps by inclement weather, set themselves a competition to write a fictional work of consummate horror. One of them, Mary Wollstonecraft Shelley, penned the now-famous tale of Dr. Frankenstein’s monster, who is brought to life by the application of massive electrical currents. Electrical devices have been a mainstay of gothic novels and horror films ever since. The essential idea is Galvani’s and is fallacious, but the concept has insinuated itself into many Western languages—as, for example, when I am galvanized into writing this book.

  Most neurobiologists believe that the neurons are the active elements in brain function, although there is evidence that some specific memories and other cognitive functions may be contained in particular molecules in the brain, such as RNA or small proteins. For every neuron in the brain there are roughly ten glial cells (from the Greek word for glue), which provide the scaffolding for the neuronal architecture. An average neuron in a human brain has between 1,000 and 10,000 synapses or links with adjacent neurons. (Many spinal-cord neurons seem to have about 10,000 synapses, and the so-called Purkinje cells of the cerebellum may have still more. The number of links for neurons in the cortex is probably less than 10,000.) If each synapse responds by a single yes-or-no answer to an elementary question, as is true of the switching elements in electronic computers, the maximum number of yes/no answers or bits of information that the brain could contain is about 10^10 × 10^3 = 10^13, or 10 trillion bits (or 100 trillion = 10^14 bits if we had used 10^4 synapses per neuron). Some of these synapses must contain the same information as is contained in other synapses; some must be concerned with motor and other noncognitive functions; and some may be merely blank, a buffer waiting for the new day’s information to flutter through.
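
  The counting argument above amounts to a single multiplication, sketched below in Python; the one-bit-per-synapse assumption is the text's own simplification, and both counts are order-of-magnitude figures.

```python
# The synapse-counting estimate above, written out. "One bit per synapse" is
# the text's simplifying assumption; both counts are order-of-magnitude figures.

neurons = 10**10                      # ~ten billion cortical neurons
for synapses_per_neuron in (10**3, 10**4):
    bits = neurons * synapses_per_neuron
    print(f"{synapses_per_neuron:>6} synapses/neuron -> ~{bits:.0e} bits")
# prints ~1e+13 (ten trillion) and ~1e+14 (a hundred trillion) bits
```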

  If each human brain had only one synapse—corresponding to a monumental stupidity—we would be capable of only two mental states. If we had two synapses, then 2^2 = 4 states; three synapses, then 2^3 = 8 states, and, in general, for N synapses, 2^N states. But the human brain is characterized by some 10^13 synapses. Thus the number of different states of a human brain is 2 raised to this power—i.e., multiplied by itself ten trillion times. This is an unimaginably large number, far greater, for example, than the total number of elementary particles (electrons and protons) in the entire universe, which is much less than 2 raised to the power 10^3. It is because of this immense number of functionally different configurations of the human brain that no two humans, even identical twins raised together, can ever be really very much alike. These enormous numbers may also explain something of the unpredictability of human behavior and those moments when we surprise even ourselves by what we do. Indeed, in the face of these numbers, the wonder is that there are any regularities at all in human behavior. The answer must be that all possible brain states are by no means occupied; there must be an enormous number of mental configurations that have never been entered or even glimpsed by any human being in the history of mankind. From this perspective, each human being is truly rare and different and the sanctity of individual human lives is a plausible ethical consequence.
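
  Because 2 raised to the power 10^13 is far too large to write out, the sketch below makes the comparison with logarithms. The particle count used for scale (roughly 10^80) is the commonly quoted order-of-magnitude figure rather than a number given in the text.

```python
import math

# The comparison above, carried out with logarithms, since 2 raised to the
# power 10^13 cannot be written out directly. The particle count used for
# scale (~10^80) is the commonly quoted figure, not a number from the text.

synapses = 10**13
digits_of_brain_states = synapses * math.log10(2)    # ~3e12 decimal digits
digits_of_particle_count = math.log10(1e80)          # ~80 decimal digits

print(f"2^(10^13) has roughly {digits_of_brain_states:.1e} decimal digits")
print(f"~10^80 particles has roughly {digits_of_particle_count:.0f} decimal digits")
```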

  In recent years it has become clear that there are electrical microcircuits in the brain. In these micro-circuits the constituent neurons are capable of a much wider range of responses than the simple “yes” or “no” of the switching elements in electronic computers. The microcircuits are very small (typical dimensions are about 1/10,000 of a centimeter) and thus able to process data very rapidly. They respond to about 1/100th of the voltage necessary to stimulate ordinary neurons, and are therefore capable of much finer and subtler responses. Such microcircuits seem to increase in abundance in a manner consistent with our usual notions about the complexity of an animal, reaching their greatest proliferation in both absolute and relative terms in human beings. They also develop late in human embryology. The existence of such microcircuits suggests that intelligence may be the result not only of high brain-to-body-mass ratios but also of an abundance of specialized switching elements in the brain. Microcircuits make the number of possible brain states even greater than we calculated in the previous paragraph, and so enhance still further the astonishing uniqueness of the individual human brain.

  We can approach the question of the information content of the human brain in a quite different way—introspectively. Try to imagine some visual memory, say from your childhood. Look at it very closely in your mind’s eye. Imagine it is composed of a set of fine dots like a newspaper wirephoto. Each dot has a certain color and brightness. You must now ask how many bits of information are necessary to characterize the color and brightness of each dot; how many dots make up the recalled picture; and how long it takes to recall all the details of the picture in the eye of the mind. In this retrospective, you focus on a very small part of the picture at any one time; your field of view is quite limited. When you put in all these numbers, you come out with a rate of information processing by the brain, in bits per second. When I do such a calculation, I come out with a peak processing rate of about 5,000 bits per second.*
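
  A hedged reconstruction of that introspective estimate is sketched below. None of the intermediate numbers appear in the text; they are assumptions chosen only to show how such an order-of-magnitude calculation is assembled, and they are tuned to land near the quoted peak of about 5,000 bits per second.

```python
# A hedged reconstruction of the introspective estimate above. None of these
# intermediate numbers appear in the text; they are assumptions chosen only to
# show how such an order-of-magnitude estimate is put together.

dots_attended_to = 500     # assumed: picture elements in the small region examined
bits_per_dot = 20          # assumed: a few bits each for colour and brightness
seconds_to_examine = 2.0   # assumed: time spent "looking over" that region in memory

rate = dots_attended_to * bits_per_dot / seconds_to_examine
print(f"peak visual-recall rate ~ {rate:,.0f} bits per second")   # ~5,000
```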

  Most commonly such visual recollections concentrate on the edges of forms and sharp changes from bright to dark, and not on the configuration of areas of largely neutral brightness. The frog, for example, sees with a very strong bias towards brightness gradients. However, there is considerable evidence that detailed memory of interiors and not just edges of forms is reasonably common. Perhaps the most striking case is an experiment with humans on stereo reconstruction of a three-dimensional image, using a pattern recalled for one eye and a pattern being viewed for the other. The fusion of images in this anaglyph requires a memory of 10,000 picture elements.

  But I am not recollecting visual images all my waking hours, nor am I continuously subjecting people and objects to intense and careful scrutiny. I am doing that perhaps a small percent of the time. My other information channels—auditory, tactile, olfactory and gustatory—are involved with much lower transfer rates. I conclude that the average rate of data processing by my brain is about (5,000/50) = 100 bits per second. Over sixty years, that corresponds to 2 × 10^11 or 200 billion total bits committed to visual and other memory if I have perfect recall. This is less than, but not unreasonably less than, the number of synapses or neural connections (since the brain has more to do than just remember) and suggests that neurons are indeed the main switching elements in brain function.
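
  The lifetime arithmetic in this paragraph can be checked directly, as in the brief sketch below.

```python
# The lifetime arithmetic from the paragraph above.

average_rate = 5000 / 50                     # ~100 bits/s (recall active ~2% of the time)
seconds_in_sixty_years = 60 * 365.25 * 24 * 3600
total_bits = average_rate * seconds_in_sixty_years

print(f"average rate ~ {average_rate:.0f} bits/s")
print(f"total over sixty years ~ {total_bits:.1e} bits")   # ~1.9e11, about 2 x 10^11
```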

  A remarkable series of experiments on brain changes during learning has been performed by the American psychologist Mark Rosenzweig and his colleagues at the University of California at Berkeley. They maintained two different populations of laboratory rats—one in a dull, repetitive, impoverished environment; the other in a variegated, lively, enriched environment. The latter group displayed a striking increase in the mass and thickness of the cerebral cortex, as well as accompanying changes in brain chemistry. These increases occurred in mature as well as in young animals. Such experiments demonstrate that physiological changes accompany intellectual experience and show how plasticity can be controlled anatomically. Since a more massive cerebral cortex may make future learning easier, the importance of enriched environments in childhood is clearly drawn.

  This would mean that new learning corresponds to the generation of new synapses or the activation of moribund old ones, and some preliminary evidence consistent with this view has been obtained by the American neuroanatomist William Greenough of the University of Illinois and his co-workers. They have found that after several weeks of learning new tasks in laboratory contexts, rats develop the kind of new neural branches in their cortices that form synapses. Other rats, handled similarly but given no comparable education, exhibit no such neuro-anatomical novelties. The construction of new synapses requires the synthesis of protein and RNA molecules. There is a great deal of evidence showing that these molecules are produced in the brain during learning, and some scientists have suggested that the learning is contained within brain proteins or RNA. But it seems more likely that the new information is contained in the neurons, which are in turn constructed of proteins and RNA.

  How densely packed is the information stored in the brain? A typical information density during the operation of a modern computer is about a million bits per cubic centimeter. This is the total information content of the computer, divided by its volume. The human brain contains, as we have said, about 10^13 bits in a little more than 10^3 cubic centimeters, for an information content of 10^13/10^3 = 10^10, about ten billion bits per cubic centimeter; the brain is therefore ten thousand times more densely packed with information than is a computer, although the computer is much larger. Put another way, a modern computer able to process the information in the human brain would have to be about ten thousand times larger in volume than the human brain. On the other hand, modern electronic computers are capable of processing information at a rate of 10^16 to 10^17 bits per second, compared to a peak rate ten billion times slower in the brain. The brain must be extraordinarily cleverly packaged and “wired,” with such a small total information content and so low a processing rate, to be able to do so many significant tasks so much better than the best computer.
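
  The density comparison is again a matter of simple division, sketched below; the computer figures are the text's own, reflecting machines of its era rather than present-day hardware.

```python
# The information-density comparison above, spelled out. The computer figures
# are the text's own era-specific values, not those of present-day machines.

brain_bits = 10**13
brain_volume_cm3 = 10**3
computer_bits_per_cm3 = 10**6

brain_bits_per_cm3 = brain_bits / brain_volume_cm3    # 1e10 bits per cm^3
density_advantage = brain_bits_per_cm3 / computer_bits_per_cm3

print(f"brain:    ~{brain_bits_per_cm3:.0e} bits/cm^3")
print(f"computer: ~{computer_bits_per_cm3:.0e} bits/cm^3")
print(f"the brain is ~{density_advantage:,.0f} times more densely packed")
```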

  The number of neurons in an animal brain does not double as the brain volume itself doubles. It increases more slowly. A human brain with a volume of about 1,375 cubic centimeters contains, as we have said, apart from the cerebellum about ten billion neurons and some ten trillion bits. In a laboratory at the National Institute of Mental Health near Bethesda, Maryland, I recently held in my hand a rabbit brain. It had a volume of perhaps thirty cubic centimeters, the size of an average radish, corresponding to a few hundred million neurons and some hundred billion bits—which controlled, among other things, the munching of lettuce, the twitchings of noses, and the sexual dalliances of grownup rabbits.

  Since animal taxa such as mammals, reptiles or amphibians contain members with very different brain sizes, we cannot give a reliable estimate of the number of neurons in the brain of a typical representative of each taxon. But we can estimate average values, as I have done in the chart on this page. The rough estimates there show that a human being has about a hundred times more bits of information in his brain than a rabbit does. I do not know that it means very much to say that a human being is a hundred times smarter than a rabbit, but I am not certain that it is a ridiculous contention. (It does not, of course, follow that a hundred rabbits are as smart as one human being.)
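
  Using the chapter's round numbers, the factor of one hundred follows immediately, as the short sketch below shows.

```python
# The human/rabbit comparison above, using the chapter's round numbers.

human_bits = 10**13      # ~ten trillion bits
rabbit_bits = 10**11     # ~a hundred billion bits

print(f"information content ratio ~ {human_bits // rabbit_bits} to 1")   # ~100
```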

 
