The Meaning of Human Existence
For ages no tribe could survive unless the meaning of its existence was defined by a creation story. The price of the loss of faith was a hemorrhage of commitment, a weakening and dissipation of common purpose. In the early history of each tribe—late Iron Age for Judaeo-Christianity, and seventh century CE for Islam—the myth had to be set in stone in order to work. Once set, no part of it could be discarded. No doubts must be heard by the tribe. The only solution to an outmoded dogma was to finesse or conveniently forget it. Or, in the extreme, break away with a new, competing dogma.
Obviously no two creation stories can both be true. All of those invented by the many thousands of known religions and sects have certainly been false. A great many educated citizens have realized that their own faiths are indeed false, or at least questionable in details. But they understand the rule attributed to the Roman Stoic philosopher Seneca the Younger: religion is regarded by the common people as true, by the wise as false, and by rulers as useful.
Scientists by nature tend to be cautious in anything they say about religion, even when expressing skepticism. The distinguished physiologist Anton (Ajax) J. Carlson, when asked what he thought of the 1950 ex cathedra (that is, infallible) pronouncement by Pius XII that the Virgin Mary ascended bodily into heaven, is reported to have responded that he couldn’t be sure because he wasn’t there, but of one thing he was certain, that she passed out at thirty thousand feet.
Might it be better just to leave this vexatious matter alone? Not deny, just forget? After all, the great majority of people in the world are sort of getting along, more or less. However, negligence in the matter is dangerous, both short-term and long-term. National wars may have subsided, obviously due to the fear of their possibly catastrophic outcomes to both sides. But insurgencies, civil wars, and terrorism have not. The principal driving force of mass murders committed during them is tribalism, and the central rationale for lethal tribalism is sectarian religion—in particular the conflict between those faithful to different myths. At the time of writing the civilized world flinches before the brutal struggles between Shiites and Sunnis, the murder of Ahmadiyya Muslims in Pakistan’s cities by other Muslims, and the slaughter of Muslims by Buddhist-led “extremists” in Myanmar. Even the blocking by ultra-Orthodox Jews of liberal Jewish women from the Western Wall is a menacing early symptom of the same social pathology.
Religious warriors are not an anomaly. It is a mistake to classify believers of particular religious and dogmatic religionlike ideologies into two groups, moderate versus extremist. The true cause of hatred and violence is faith versus faith, an outward expression of the ancient instinct of tribalism. Faith is the one thing that makes otherwise good people do bad things. Nowhere do people tolerate attacks on their person, their family, their country—or their creation myth. In America, for example, it is possible in most places to openly debate different views on religious spirituality—including the nature and even the existence of God, providing it is in the context of theology and philosophy. But it is forbidden to question closely, if at all, the creation myth—the faith—of another person or group, no matter how absurd. To disparage anything in someone else’s sacred creation myth is “religious bigotry.” It is taken as the equivalent of a personal threat.
Another way of expressing the history of religion is that faith has hijacked religious spirituality. The prophets and leaders of organized religions, consciously or not, have put spirituality in the service of groups defined by their creation myths. Awe-inspiring ceremonies and sacred rites and rituals and sacrifices are given the deity in return for worldly security and the promise of immortality. As part of the exchange the deity must also make correct moral decisions. Within the Christian faith, among most of the denominational tribes, God is obliged to be against one or more of the following: homosexuality, artificial contraception, female bishops, and evolution.
The Founding Fathers of the United States understood the risk of tribal religious conflict very well. George Washington observed, "Of all the animosities which have existed among mankind those which are caused by difference of sentiments in religion appear to be the most inveterate and distressing and ought most to be deprecated." James Madison agreed, noting the "torrents of blood" that result from religious competition. John Adams insisted that "the government of the United States is not in any sense founded on the Christian religion." America has slipped a bit since then. It has become almost mandatory for political leaders to assure the electorate that they have a faith, even if, as with the Mormonism of Mitt Romney, it looks ridiculous to the great majority. Presidents often listen to the counsel of Christian advisers. The phrase "under God" was introduced into the Pledge of Allegiance in 1954, and today no major political candidate would dare suggest it be removed.
Most serious writers on religion conflate the transcendent quest for meaning with the tribalistic defense of creation myths. They accept, or fear to deny, the existence of a personal deity. They read into the creation myths humanity's effort to communicate with the deity, as part of the search for an uncorrupted life now and beyond death. Intellectual compromisers one and all, they include liberal theologians of the Niebuhr school, philosophers battening on learned ambiguity, literary admirers of C. S. Lewis, and others persuaded, after deep thought, that there must be Something Out There. They tend to be unconscious of prehistory and the biological evolution of human instinct, both of which can shed light on this very important subject.
The compromisers face an insoluble problem, which the great, conflicted nineteenth-century Danish philosopher Søren Kierkegaard called the Absolute Paradox. Dogmas forced on believers, he said, are not just impossible but incomprehensible—hence absurd. What Kierkegaard had in mind in particular was the core of the Christian creation myth. "The Absurd is that the eternal truth has come to exist, that God has come to exist, is born, has grown up and so on, and has become just like a person, impossible to tell apart from another person." It was incomprehensible, even if declared true, Kierkegaard continued, that God as Christ entered into the physical world in order to suffer, leaving martyrs to suffer for real.
The Absolute Paradox tears at all in every religion who seek an honest resolution of body and soul. It is the inability to conceive of an all-knowing divinity who created a hundred billion galaxies, yet whose humanlike emotions include feelings of pleasure, love, generosity, vindictiveness, and a consistent and puzzling lack of concern for the horrific things Earth-dwellers endure under the deity’s rule. To explain that “God is testing our faith” and “God moves in mysterious ways” doesn’t cut it.
As Carl Jung once said, some problems can never be solved, only outgrown. And so it must be for the Absolute Paradox. There is no solution because there is nothing to solve. The problem is not in the nature or even in the existence of God. It is in the biological origins of human existence and in the nature of the human mind, and what made us the evolutionary pinnacle of the biosphere. The best way to live in this real world is to free ourselves of demons and tribal gods.
14
Free Will
Neuroscientists who work on the human brain seldom mention free will. Most consider it a subject better left, at least for the time being, to philosophers. “We will attend to it when we’re ready and have time,” they seem to say. Meanwhile, their sights are set on the brighter and more realistically conceived grail of science, the physical basis of consciousness, of which free will is a part. No scientific quest is more important to humanity than to nail the phantom of conscious thought. Everyone, scientists, philosophers, and religious believers alike, can agree with the neurobiologist Gerald Edelman that “consciousness is the guarantor of all we hold to be human and precious. Its permanent loss is considered to be the equivalent of death, even if the body persists in its vital signs.”
The physical basis of consciousness won't be an easy phenomenon to grasp. The human brain is the most complex system known in the Universe, either organic or inorganic. Each of the billions of nerve cells (neurons) composing its functional part forms synapses and communicates with an average of ten thousand others. Each launches messages along its own axon pathway, using an individual digital code of membrane firing patterns. The brain is organized into regions, nuclei, and staging centers that divide functions among them. The parts respond in different ways to hormones and sensory stimuli originating from outside the brain, while sensory and motor neurons all over the body communicate so intimately with the brain as to be virtually a part of it.
Half of the twenty thousand to twenty-five thousand genes of the entire human genetic code participate in one manner or other in the prescription of the brain-mind system. This amount of commitment has resulted from one of the most rapid evolutionary changes known in any advanced organ system of the biosphere. It entailed a more than twofold increase in brain size across three million years, from at or below 600cc in the australopith prehuman ancestor to 680cc in Homo habilis, thence to about 1,400cc in modern Homo sapiens.
Philosophers have labored off and on for over two thousand years to explain consciousness. Of course they have, it’s their job. Innocent of biology, however, they have for the most part understandably gotten nowhere. I don’t believe it too harsh to say that the history of philosophy when boiled down consists mostly of failed models of the brain. A few of the modern neurophilosophers such as Patricia Churchland and Daniel Dennett have made a splendid effort to interpret the findings of neuroscience research as these become available. They have helped others to understand, for example, the ancillary nature of morality and rational thought. Others, especially those of poststructuralist bent, are more retrograde. They doubt that the “reductionist” or “objectivist” program of the brain researchers will ever succeed in explaining the core of consciousness. Even if it has a material basis, subjectivity in this view is beyond the reach of science. To make their argument, the mysterians (as they are sometimes called) point to the qualia, the subtle, almost inexpressible feelings we experience about sensory input. For example, “red” we know from physics, but what are the deeper sensations of “redness”? So what can the scientists ever hope to tell us in larger scale about free will, or about the soul, which for religious thinkers at least is the ultimate of ineffability?
The procedure of the more skeptical philosophers is top-down and introspective—thinking about how we think, then adducing explanations, or else discovering reasons why there can be no explanation. They describe the phenomena and they provide thought-provoking examples. They conclude that there is something fundamentally different from ordinary reality in the conscious mind. Whatever that may be, it is better left to philosophers and poets.
Neuroscientists, who are relentlessly bottom-up as opposed to top-down, will have none of this. They have no illusions about the difficulty of the task, understanding that mountains are not provided with escalators built for dreamers. They agree with Darwin that the mind is a citadel that cannot be taken by frontal assault. They have set out instead to break through to its inner recesses with multiple probes along the ramparts, opening breaches here and there, and by technical ingenuity and force enter and explore wherever space is found to maneuver.
You have to have faith to be a neuroscientist. Who knows where consciousness and free will may be hidden—assuming they even exist as integral processes and entities? Will they emerge in time, metamorphosing from the data like a butterfly from a caterpillar, the image filling us like Keats's men around Balboa with a wild surmise? Meanwhile, neuroscience, primarily because of its relevance to medicine, has grown rich. Its research projects run on budgets of hundreds of millions to billions of dollars each year. (In the science trade it's called Big Science.) The same surge has occurred, successfully, in cancer research, the space shuttle program, and experimental particle physics.
As I write, neuroscientists have begun the direct assault Darwin called impossible. It envisions an effort called the Brain Activity Map (BAM) Project, conceived by key government institutes in the United States, including the National Institutes of Health and National Science Foundation, in collaboration with the Allen Institute for Brain Science, and endorsed as government policy by President Obama. The program, if successfully funded, will parallel in magnitude the Human Genome Project, which was the biology moon shot completed in 2003. Its goal is nothing less than a map of the activity of every neuron in real time. Much of the technology will have to be developed on the job.
The basic goal of activity mapping is to connect all of the processes of thought—rational and emotional, conscious, preconscious, and unconscious, held both still and moving through time—to a physical base. It won't come easy. Bite into a lemon, fall into bed, recall a departed friend, watch the sun sink beyond the western sea. Each episode comprises mass neuronal activity so elaborate, so little of which has yet been seen, that we cannot even conceive of it, much less write it down as a repertory of firing cells.
Skepticism concerning BAM is rife among scientists, but that is nothing new. The same resistance was also the case for the Human Genome Project and much of space exploration being conducted by NASA. An added incentive to push ahead is the practical application mapping will have for medicine—in particular the cellular and molecular foundations of mental illness and the discovery of deleterious mutations even before the symptoms are expressed.
Assuming that BAM or other, similar enterprises are successful, how might they solve the riddle of consciousness and free will? I suggest that the solution will come relatively early in the functional mapping program rather than as a grand finale at the end, provided neurobiology remains favored as Big Science. The evidence lies in the large amount of information already archived in brain studies, especially when combined with the principles of evolutionary biology.
There are several reasons for optimism in the search for an early solution. First is the gradual emergence of consciousness during evolution. The extraordinarily high human level was not reached suddenly, like a light turned on with the flick of a switch. The gradual albeit rapid increase in brain size leading up from the habiline prehumans to Homo sapiens suggests that consciousness evolved in steps, in a manner similar to those of other complex biological systems—the eukaryotic cell, for example, or the animal eye, or colonial life in insects.
It should then be possible to track the steps leading to human consciousness through studies of animal species that have come partway to the human level. The mouse has been a prominent model in early brain-mapping research, and will continue to be productive. This species has considerable technical advantages, including convenient laboratory rearing (for a mammal) and a strong supporting foundation of prior genetic and neuroscience research. A closer approach to the actual sequence can be made, however, by also adding humanity’s closest phylogenetic relatives among the Old World primates, from lemurs and galagos at the more primitive end to rhesus macaques and chimpanzees at the higher end. The comparison would reveal what neural circuits and activities were attained by nonhuman species, and when and in what sequence. The data obtained might detect, even at a relatively early stage of research, the neurobiological traits that are uniquely human.
The second point of entry into the realm of consciousness and free will is the identification of emergent phenomena—entities and processes that come into existence only with the joining of preexisting entities and processes. They will be found, if the results of current research are indicative, in the linkage and synchronized activity of various parts of both the sensory system and the brain.
Meanwhile, the nervous system can be usefully conceived as a superbly well organized superorganism built upon a division of labor and specialization in the society of cells—around which the body plays a primarily supportive role. An analog, if you will, is to be found in a queen ant or termite and her supporting swarm of workers. Each worker on its own is relatively stupid. It follows a program of blind, untutored instinct, which is subject to only a small amount of flexibility in its expression. The program directs the worker to specialize on one or two tasks at a time, and to change programs in a particular sequence—typically nurse to builder and guard to forager—as the worker ages. All the workers together are, in contrast, brilliant. They address all needed tasks simultaneously, and can shift the weight of their effort to meet potentially lethal emergencies such as flooding, starvation, and attacks by enemy colonies. This comparison should not be considered a stretch. Something like it has been a common theme in serious literature as far back as Douglas Hofstadter's 1979 classic Gödel, Escher, Bach: An Eternal Golden Braid.
A strong additional advantage is the narrowness of the range of human perception. Our sight, hearing, and other senses impart the feeling that we are aware of almost everything around us in both space and time. Yet, as I’ve emphasized earlier, we are aware of only minute slivers of space-time, and even less of the energy fields, in which we exist. The conscious mind is a map of our awareness in the intersection only of those parts of the continua we happen to occupy. It allows us to see and know those events that most affect our survival in the real world, or, more precisely, the real world in which our prehuman ancestry evolved. To understand sensory information and the passage of time is to understand a large part of consciousness itself. Advance in this direction might prove easier than previously assumed.
The final reason I'd like to suggest for optimism is the human necessity for confabulation. Our minds consist of storytelling. In each instant of present time, a flood of real-world information flows into our senses. Added to the severe limitation of the senses is the fact that the information they receive far exceeds what the brain can process. To augment the fraction it can handle, we summon the stories of past events for context and meaning. We compare them with the unfolding present in order to apply the decisions that were made back in time, variously right or wrong. Then we look forward to create—not just to recall this time—multiple competing scenarios. These are weighed against one another by the suppressing or intensifying effect imposed by aroused emotional centers. A choice is made in the unconscious centers of the brain, recent studies show, several seconds before the decision arrives in the conscious part.