Impulsivity
Impulsivity has two core manifestations among modern people with ADHD. The first is impulsive behavior—acting without thinking things through or the proverbial “leap before you look.” Often this takes the form of interrupting others or blurting things out in conversation. Other times it’s reflected in snap judgments or quick decisions.
To the prehistoric hunter, impulsivity was an asset because it provided the ability to act on instant decisions as well as the willingness to explore new, untested areas. If the hunter were chasing a rabbit through the forest with his spear and a deer ran by, he wouldn’t have time to stop and perform a risk/benefit analysis. He would have to make an instant decision about which animal to pursue, then act on that decision without a second thought.
Thomas Edison purportedly said that his combined distractibility and impulsiveness helped him in his “hunt” for world-transforming inventions. He said, “Look, I start here with the intention of going there [drawing an imaginary line] in an experiment, say, to increase the speed of the Atlantic cable; but when I have arrived partway in my straight line, I meet with a phenomenon and it leads me off in another direction, to something totally unexpected.”
The second aspect of impulsivity is impatience. For the hunter, that same impatience was an advantage: when an area stopped producing game, it pushed him to move on quickly to more promising ground. For a primitive farmer, however, impatience and impulsivity would spell disaster. A very patient approach, all the way down to the process of picking bugs off plants for hours each day, day after day, would have to be hardwired into the brain of a farmer. The word boring couldn’t be in his vocabulary. His brain would have to be built in such a way that it tolerated, or even enjoyed, sticking with something until it was finished.
Risk-Taking
Risk-taking or, as Dr. Edward Hallowell and Dr. John Ratey describe it in their book Driven to Distraction: Recognizing and Coping with Attention Deficit Disorder from Childhood through Adulthood, “a restive search for high stimulation,” is perhaps the most destructive of the behaviors associated with ADHD in contemporary society. It probably accounts for the high percentage of people with ADHD among prison populations and plays a role in a wide variety of social problems, from the risky driving of a teenager to the infidelity or job-hopping of an adult.
Yet for a primitive hunter, risk and high stimulation were a necessary part of daily life. In fact, the urge to experience risk, the desire for that adrenaline high, would have been necessary in a hunting society because it would have propelled its members out into the forest or jungle in search of stimulation and dinner.
If a farmer were a risk-taker, however, the result could be starvation. Because decisions made by farmers had such long-ranging consequences, their brains would have to have been wired to avoid risks and to carefully determine the most risk-free way of going about their work.
So the agricultural revolution highlighted two very different types of human societies: farmers and hunters. Each group lived different lives, in different places. Those with the ADHD gene in farming societies were probably culled from the gene pool by natural selection, or they became warriors for their societies, “hunting” other humans as various tribes came into conflict. In some societies—evolving into the countries of India and Japan, for instance—this role was even institutionalized into a caste system. History is replete with anecdotes about the unique personalities of the warrior castes such as the Kshatriya in India and the Samurai in Japan.
Where Have All the Hunters Gone?
If we accept for a moment the possibility that the gene that causes ADHD was useful in another time and place but has become a liability in our modern society based on the systems of agriculture and industry, these questions arise: How did we reach a point in human evolution where the farmers so massively outnumber the hunters? If the hunter gene was useful for the survival of people, why have hunting societies largely died out around the world, and why is ADHD seen in only 3 to 20 percent of the population (depending on how you measure it and whose numbers you use) instead of 50 percent or more?
Recent research from several sources shows that hunting societies are always wiped out by farming societies over time. Fewer than 10 percent of the members of a hunting society normally survive when their culture collides with an agricultural society—and it has nothing to do with the hunter’s “attention deficits” or with any inherent superiority of the farmers.
In one study reported in Discover magazine,2 the authors traced the root languages of the peoples living throughout central Africa and found that at one time the area was dominated by hunters: the Khoisans and the Pygmies. But over a period of several thousand years, virtually all of the Khoisans and Pygmies (the Hottentots and the Bushmen, as they’ve been referred to in Western literature) were wiped out and replaced by Bantu-speaking farmers. Two entire groups of people were rendered nearly extinct, while the Bantu-speaking farmers flooded across the continent, dominating central Africa.
There are several reasons for this startling transformation. First, agriculture is more efficient at generating calories than hunting. Because the same amount of land can support up to 10 times more people when used for farming rather than hunting, farming societies generally have roughly 10 times the population density of hunting societies. In war, numbers are always an advantage, particularly in these ratios. Few armies in history have survived an onslaught by another army 10 times larger.
Second, diseases such as chicken pox, influenza, and measles began as diseases of domesticated animals. These same illnesses virtually wiped out vulnerable populations such as the native peoples of North and South America, who died by the thousands when exposed to the germs of the invading Europeans. The farmers who were regularly exposed to such diseases developed relative immunities; though they would become ill, these germs usually wouldn’t kill them. Those with no prior exposure, and thus no immunity, however, would often die. So when farmers encountered hunters, the latter were often killed off simply through exposure to the farmers’ diseases.
Finally, agriculture provides physical stability to a culture. The tribe stays in one spot while its population grows, which allows its members to specialize in individual jobs. Some people become tool and weapon makers, some build devices that can be used in war, and some create governments, armies, and kingdoms—all of which give farming societies a huge technological advantage over hunting societies, which are generally more focused on day-to-day survival issues.
So now we have an answer to the question: Where have all the hunters gone?
Most were killed off, from Europe to Asia, from Africa to the Americas. Those who survived were brought into farming cultures through assimilation, kidnapping, or cultural change—and they provided the genetic material that appears in the small percentage of people with ADHD today.
Further evidence of the anthropological basis of ADHD is seen among the modern survivors of ancient hunting societies.
Indigenous Hunters Today
Cultural anthropologist Jay Fikes points out that members of traditional Native American hunting tribes normally behave differently from those who have traditionally been farmers. The farmers such as the Hopi and other Pueblo Indian tribes are relatively sedate and risk-averse, he says, whereas the hunters, such as the Navajo, are “constantly scanning their environment and are more immediately sensitive to nuances. They’re also the ultimate risk-takers. They and the Apaches were great raiders and warriors.”
A physician who recently read my first book on the subject, Attention Deficit Disorder: A Different Perception, and concluded that he saw proof of the hunter-versus-farmer concept in his work with some of the Native Americans in southwest Arizona, dropped me the following unsolicited note over the Internet:
Many of these descendants of the Athabaskan Indians of western Canada have never chosen to adapt to farming. They had no written language until an Anglo minister, fairly recently, wrote down their language for the first time. They talk “heart to heart,” and there is little “clutter” between you and them when you are communicating. They hear and consider everything you say. They are scanning all the time, both visually and auditorily. Time has no special meaning unless it is absolutely necessary (that’s something we Anglos have imposed on them). They don’t use small talk, but get right to the point, and have a deep understanding of people and the spiritual. And their history shows that they have a love of risk-taking.
But what sent humankind on this radical social departure from hunting to farming? Few other animals, with the exception of highly organized insects such as ants, have developed a society based on anything that approaches agriculture.
In The Ascent of Man, Jacob Bronowski points out that 20,000 years ago every human on earth was a hunter and forager. The most advanced hunting societies had started following wild herd animals, as is still done by modern Laplanders. This had been the basis of human and pre-human society and lifestyle for several million years.
Until 1995 the earliest hard evidence of human activity (and hunting activity, at that) came from the Olduvai Gorge in Tanzania, Africa, with fragments of stone tools and weapons that dated back 2.5 million years. More recently, University of Southern California anthropologist Craig Stanford was quoted in the Chicago Tribune as saying that recent research he conducted in Africa indicates that early hominids may have been tribally hunting as early as 6 million years ago.
So for 6 million years, our ancestors were hunters, and then, suddenly, in a tiny moment of time (10,000 years is to 6 million years what less than three minutes is to a 24-hour day), the entire human race veered in a totally new direction.
The Agricultural Revolution
The reason for the change, according to Bronowski and many anthropologists, probably has to do with the end of the last ice age, which roughly corresponds to the beginning of the agricultural revolution. (Bronowski and most authorities place the agricultural revolution as occurring roughly 12,000 years ago.) At that time mutated grasses appeared simultaneously on several continents, probably in response to the sudden and radical change in climate. These grasses were the first high-yield, edible ancestors of modern rice and wheat and provided the humans who lived near them an opportunity to nurture and grow these staple foods.
Those people with the farmerlike patience to grow crops evolved into farming societies that emptied their ranks of the impulsive, sensation-seeking hunters among them. Those persons who were not patient enough to wait for rice to grow maintained their hunting tribes, the last remnants of which we see today in a few surviving indigenous peoples on the earth. The Old Testament, for example, is in large part the story of a nomadic hunting tribe moving through the wrenching process, over several generations, of becoming a settled farming tribe.
Our Society’s Hunters
In 1996 the Journal of Genetic Psychology published an article titled “Attention Deficit Disorder: An Evolutionary Perspective,” which suggested that, “Although no theory entirely explains the occurrence of ADHD, it is worthwhile to note that, at least historically, ADHD may have served an adaptive function and may have been selected by the environment for survival.”3
In 1997 Peter S. Jensen, the head of the Child and Adolescent Psychiatry division of the National Institute of Mental Health (NIMH), was the lead author of a paper published in the peer-reviewed Journal of the American Academy of Child and Adolescent Psychiatry. In that paper, titled “Evolution and Revolution in Child Psychiatry: ADHD as a Disorder of Adaptation,”4 Jensen and his co-authors strongly argued that ADHD children shouldn’t be told they have an illness but that instead parents and teachers should emphasize their positive characteristics. “In reframing the child who has ADHD as ‘response ready,’ experience-seeking, or alert,” they wrote, “the clinician can counsel the child and family to recognize situations in modern society that might favor such an individual, both in terms of school environments, as well as future career opportunities, e.g., athlete, air-traffic controller, salesperson, soldier, or entrepreneur.”5
But it was all just speculation until 2000, when the article “Dopamine Genes and ADHD” appeared in Neuroscience and Biobehavioral Reviews. This paper, by lead author James M. Swanson and 10 other scientists, noted that, “The literature on these candidate genes and ADHD is increasing. Eight molecular genetic studies have been published, so far, about investigations of a hypothesized association of ADHD with the DAT1 and the DRD4 gene.”6
Soon other scientists were saying that the “hunter gene” may be a good thing. Dr. Robert Moyzis said of an NIMH-funded study of the gene, which he helped conduct, “We found a significant positive selection for the genetic variation associated with ADHD and novelty-seeking behavior in the human genome. This study strengthens significantly the connection between genetic variations and ADHD. It also provides a clue as to why ADHD is so pervasive.”7
Numerous other scientific journals over the years have published similar reports or studies. There’s a growing consensus that such children are carrying a gene for a behavior set that’s really a skill set, a collection of useful adaptations. And a growing number of voices, such as Howard Gardner (Frames of Mind: The Theory of Multiple Intelligences) and Daniel Goleman (Emotional Intelligence: Why It Can Matter More Than IQ), are calling for a wider definition of the kinds of “intelligence” and talent our schools accept and teach to.
The Edison Gene
All these discoveries have led me to a new hypothesis built upon my earlier work with the hunter/farmer model.
The early development of this came, in part, from Michael Garnatz, who, with his wife, Heidi, has founded and runs the website www.hypies.com, one of the largest and most well-known ADHD sites in Germany. Michael confronted me one day with the essential inconsistency of a theory that suggests the Edison gene, which emerged in full form only about 40,000 years ago, was useful for hunters when humans had been hunters for 100,000 years before that. Although I’d never called it the farmer gene, Michael pointed out that its emergence in the human genome seemed to coincide with some of the earliest documented examples of ancient peoples’ experimenting with agriculture. So how could I continue to call it the hunter gene?
I thought about this for some time, trying to figure out how to put together all the pieces in a way that was scientifically consistent and not merely some glib renaming of an older theory. Nothing seemed to work.
And then I read William H. Calvin’s book, A Brain for All Seasons. Calvin had synthesized some of the most remarkable science of the last decade of the twentieth century and had done so in a way that gave sudden and clear meaning to the appearance and the persistence of the Edison gene.
The Crisis-Survival Gene
It’s amazing what science can learn in a decade. In the early 1990s, most paleoclimatologists believed that climate change was a gradual phenomenon and that ice ages lasting 10,000 or more years alternated gradually with periods of warmth. Nobody understood what caused the ice ages or even our current warmth, although variations in solar radiation were suspected.
Similarly, paleoanthropologists had known for decades that fully modern humans emerged in Africa about 200,000 years ago and, about 40,000 years ago, began behaving in highly organized and cooperative ways. Scientists had figured out that if they could take a child from the world of 38,000 years ago and raise him in today’s world, he would be indistinguishable from the rest of us.
The problem was that nobody could understand why it took these “modern” people so long—up until about 10,000 years ago—to begin building city-states and engaging in intensive agriculture. Of course, people pointed to the ice age that had North America, northern Asia, and Europe in its grip up until 10,000 years ago, but this didn’t explain why civilizations weren’t being built in still-warm Africa or other equatorial areas of the world.
In part this confusion was due to the common assumption that ice ages were relatively constant phenomena that had come on slowly throughout ancient history, with the glaciers of the most recent one gradually melting about 10,000 years ago, leading to today’s weather.
A decade later, however, scientists discovered that what they thought to be true was actually wrong. As William Calvin documents so brilliantly in A Brain for All Seasons, it turns out that each and every one of the transitions between the cold, dry, windy ice ages and today’s wet, warm weather happened not gradually, over hundreds or thousands of years, but suddenly—in fewer than a dozen years.
The weather might have become strange one year, with a cold summer and a hard winter, and then the next year, or the one after that, the summer simply never returned. The winters then became so harsh that very few humans could survive them (with the exception, it seems, of the Neanderthals, who could and did survive them more easily because their bodies were better adapted to cold).
Although ice gripped the north during these times, the weather changes also hit Africa and southern Asia: when it became cold in the north, drought struck the equatorial regions. Waterholes dried up, rivers stopped flowing, and jungles and rain forests turned to tinder and then blazed in massive, continent-consuming wildfires.
When, a thousand or so years later, the worldwide weather switched to being wet and warm, ice in the north thawed and monsoonlike floods swept Africa, tearing away whatever life had managed to hang on through the cool, dry, windy period.
This is the cauldron in which humans evolved. When we look at a graph of worldwide temperatures, we discover that in the past 150,000 years—virtually the whole of human history—there has rarely been a period of 10,000 years (or even as much as 2,000 years) of steady, warm, comfortable weather. The single exception is the past 10,000 years, which is referred to as the Holocene period.