The Wild Life of Our Bodies: Predators, Parasites, and Partners That Shape Who We Are Today


by Rob Dunn


  Fincher knew that as humans settled into permanent or semi-permanent villages, the pathogens that cause infectious diseases grew more diverse and more common. When we stopped moving, the diseases started to catch up with us. Every so often a new disease would arise. By 200 years ago, despite being naked and relatively flea- and louse-free, humans collectively hosted hundreds of different kinds of pathogens, more types, for example, than can be found on all of the carnivore species in North America combined. The process is ongoing. Even now, each year more pathogens jump from their main hosts onto us. Many of these pathogens are transmitted person to person, one body to the next. The denser our populations become, the easier it is for them to spread. Yet the fact that we have persisted this long suggests that we have many ways of coping, and perhaps some of them are behavioral. Any individuals or societies that had new and effective ways to deal with new and terrible pathogens would have done better.

  Evasive maneuvers could be taken. Humans could simply move. If one moved fast enough to new places, sometimes not all of the diseases caught up. Native Americans walked across the Bering land bridge and in doing so were able to shake many of the worst human diseases (the diseases caught up in 1492, when Columbus and his ship full of diseased mates set sail). But one did not necessarily need to move fast. My own research has shown that both the number of different diseases in a place and how common they are (the number of cases of a given disease) are influenced strongly by climate. Cold places and dry places have fewer diseases. Malaria, as one example among hundreds, requires specific mosquitoes to move it from one body to another. All one has to do is move away from these Anopheles mosquitoes to escape malaria, whether it be toward the poles or to higher elevations.

  There was also another option, though: changing how we behave toward each other. If being social and sedentary makes us more prone to disease, changes in the ways in which we are social might have the opposite effect. We could groom disease-causing parasites off each other’s bodies. It is not romantic, but neither are lice or fleas. Grooming is an old and highly functional approach to disease control. Rats, pigeons, cows, antelope, and monkeys groom. When pigeons are prevented from grooming, they grow speckled with lice. Cows prevented from cleaning themselves have four times as many ticks and six times as many lice as those left unhindered. Antelope have a specialized tooth called a “dental comb” that seems to serve no purpose other than to aid in grooming away ectoparasites (evidence of yet another case in which ectoparasites seem to have posed a cost significant enough to cause animal bodies to evolve).4 Many animals groom themselves and each other even though such efforts are costly in time. Rats spend up to 30 percent of their time grooming, time that might be spent foraging or searching for mates. Howler monkeys spend about a quarter of the calories they consume swatting at flies. Clearly, grooming is a behavior that both helps to reduce parasite (and likely pathogen) loads and varies from species to species and probably place to place.

  Fincher thought about other behaviors that might affect our chances of getting diseases, other behaviors wired into our brains or embedded in our cultures. Sure, we swat flies, pick lice, and move, but moving is hard, and for many pathogens, once they have accomplished their most difficult feat of arriving on our bodies, grooming comes too late. One cannot groom away malaria, or any of the many diseases transmitted body to body rather than via a vector. What Fincher wondered was whether some human behaviors and cultural practices influence the probability of coming in contact with a disease in the first place, a kind of behavioral immunity. Insect societies sometimes organize into smaller groups within colonies to reduce disease transmission. Some ants assign only a few individuals to the role of undertakers, to reduce contact with the dead. In at least two species of ants, sick workers leave their nests to die alone, where they pose no risk of passing their disease on to their sisters. Fincher wondered whether humans showed some of these sorts of behaviors, even if subconsciously—whether, in other words, we were as smart as the ants. He wondered too whether the kind of behavior humans showed depended on the frequency of their exposure to disease.

  A big hint came in 2004. Jason Faulkner, a graduate student in the psychology lab of Mark Schaller at the University of British Columbia, suggested that xenophobia, the fear of others, evolved to control the spread of disease. Faulkner imagined that when diseases are common, xenophobia might guard us against diseases that travel from one tribe to the next. Perhaps it is for this reason that “others” have often, across history and cultures, been described not only as scary but more specifically as dirty and disease-ridden. It was “the others” that almost always had fleas, lice, and rats. To Faulkner, our dislike of the others looked like an evolutionary universal, one that might be stronger where disease is more prevalent and that, while it causes social problems today, may once have saved lives. What if, he wondered, xenophobia arose as a specific and useful form of disgust, itself an emotion with no known value other than to keep us away from disease?

  Fincher saw Faulkner’s work and began to pull together his even bigger, more speculative, idea. Forget the water bugs; he would figure out the story of humans. He may have been too ambitious, but far be it from Thornhill to discourage him. Fincher read papers on anthropology, sociology, and, of course, insects. He latched on to the basic attributes of human culture and behavior that seemed to differ from place to place, in particular our sense of individualism. Anthropologists have long commented on the differences among cultures in the extent to which individuals act with their own interests in mind—cowboy-style—versus those of their whole clan. This difference between individualist cultures on the one hand and collectivist cultures on the other is one of the biggest differences among peoples globally, bigger even than differences of livelihood, marriage practices, or taboos. In many Amazonian groups, one’s family or clan is nearly as important as one’s self. In such cultures, often referred to as collectivist, the big distinction is not between one individual and another, but between one group and another. Deviations from the group’s norms are frowned upon. Individual creativity and personality are regarded as unimportant or even bad. Fincher, along with his growing list of individualist Western collaborators, imagined that collectivism might emerge in response to disease prevalence, where behaving in the “traditional” ways of the group might help to reduce disease, whereas behaving individually, in ways untested by time, might have the opposite effect. Maybe individualism and all that it leads to, from Western heroes to rogue biologists or even democracy, is only possible when societies are removed from the pressures of disease.

  Meanwhile, in British Columbia, Jason Faulkner’s adviser, Mark Schaller, and another student, Damian Murray, were studying whether xenophobia, along with other behaviors such as extroversion and sexual openness, was influenced by disease. Just as for xenophobia, being introverted and sexually conservative both seemed like good ideas when socially transmitted diseases were common. Together then, Fincher, Thornhill, Schaller, and Murray imagined that the key elements of differences among cultures and individuals were nearly all related to disease. We are who we are because of disease. Or so these men, all of them individualists, born in environments with a low prevalence of most diseases, had begun to believe.

  Some of the links between disease and behavior are beyond reproach. People who live in rural parts of the tropics, where lice are still a part of life, groom each other, whereas families in New Jersey or Cleveland rarely do. This difference is a result of the differences in the abundance of external parasites from place to place. No one picks lice from hair in which none exist, but what about our other behaviors, behaviors more central to our identities? Could they really differ depending on the levels of disease into which we are born? Swine flu offered a kind of lesson in what is possible. In 2009, swine flu, H1N1, emerged as a potential threat. Anyone who turned on their TV even occasionally knew to be “on alert.” And so what did people do? In Mexico, people began to stop kissing as a greeting. They refused to shake hands. Elsewhere, flights were canceled, particularly flights out of infected regions. In other words, people cut off physical contact with strangers. They became, one might even say, xenophobic, clustering, like ants, into groups. They began to call for an end to flights from other countries. They did not, of course, stop hugging their children or kissing their husbands or wives. They simply avoided other people. They thought first of their collective, their most intimate tribe.

  The ways in which we responded to H1N1 were what Fincher and his colleagues proposed happened all around the world, again and again, when diseases became prevalent. Of course, biologists have many theories, and not all of them are right or testable. But what was interesting about Fincher’s theory was that it was testable. If he was right, people from regions in which pathogens were more prevalent ought to be more xenophobic. They ought to be more protective of their own people and less inviting to their neighbors. You can imagine, though, that many other things influence the personalities of individuals within cultures. Anthropologists can give you long lists of the idiosyncratic bits and pieces of history that might be significant. Isolation, for example, might favor xenophobia (if the arrivals from afar are less predictable and hence pose a greater risk). Scarcity of resources might make one a little more hostile to one’s neighbors. Amid so many competing influences, it would be surprising to find any detectable relationship between pathogen prevalence and the behaviors of modern humans at all.

  Fincher and his colleagues wanted to test their theories by seeing whether the regions with the greatest historical prevalence of diseases were also the same ones that were the most collectivist, xenophobic, and introverted. Many surveys have been conducted across cultures that aim to understand core attributes of behavior and personality. In one of the largest, 100,000 IBM employees in countries all over the world were interviewed. The interviews included questions aimed at distinguishing between cowboys (aka individualists) and collectivists. Using the database that resulted from these interviews and others, Fincher and his colleagues compared the individuality scores of people around the world.5 What they found was that in regions where deadly diseases are more common, people consistently think more about the tribe and less about their own individual fate and decisions. They are also more xenophobic. Separately, Mark Schaller also found that where diseases are more prevalent, individuals are less culturally and sexually open and less extroverted.6 What Fincher, Schaller, and others observed were correlations. Just because two things, such as disease prevalence and personalities, show the same patterns of variation from one place to another does not mean that one causes the other. But at the very least, the patterns these scientists observed do not rule out their ideas.

  On the basis of their results, Fincher, Schaller, Thornhill, Murray, Faulkner, and the scientists who work with them have begun to think they have discovered general rules of human behavior and culture. They have looked at us from a distance and claim to understand—to see us for what we are. They may be right. But one hesitates to jump to a firm conclusion too quickly. What they have discovered is an interesting pattern, a statistical relationship between pathogens and human behaviors and cultures. Just how disease affects behavior is a more difficult question, or at least it seemed to be until recently.

  As he sat back in his chair in his office, thinking about disease, Mark Schaller wondered how disease influenced behavior and culture. Schaller is the son of the great mammal biologist George Schaller. And like his father, a man who has spent years chasing rare beasts, Mark Schaller likes the pursuit, albeit of ideas rather than snow leopards. He wondered, could our subconscious really measure, in some way, the level of disease to which we might be exposed? Schaller wondered if we might all have an innate ability to recognize diseased individuals and to respond to them differently. It might be an ability more finely tuned in some places than in others, or perhaps only activated when necessary. What if our brains recognize and categorize the level of disease present in our surroundings and then without ever bothering to alert our consciousness, respond to this perceived risk? At face value, the idea seemed ridiculous. But Schaller and his collaborators decided to do an experiment to test the idea anyway. The results of this experiment seem likely to change our understanding of our bodies, our selves, and our relationship to the world.

  Schaller set up a computer screen in his lab on which he played images of nonstressful things such as furniture and then either a series of images related to guns and violence or a series related to disease, for example a woman coughing or the face of a smallpox victim. Would the individuals who saw the images of disease actually respond in some subconscious, bodily way? Direct links exist between the response of people to stressful situations and the production of hormones such as cortisol and norepinephrine, which can, in turn, affect immune function. But could a picture of someone looking sick really affect our immune system? It was hard to imagine that our subconscious responses to disease might be as sophisticated as Schaller’s idea would require them to be.

  The experimental subjects were brought into the lab. Their blood was taken and then they were shown the neutral slide show and one of the two sets of stressful slide shows. After the slide shows, the participants’ blood was taken again. Each blood sample was then exposed, in a test tube, to a compound found in many pathogenic bacteria, lipopolysaccharide. Schaller and his colleagues thought that the blood cells of the participants who had seen the images of disease might more aggressively attack the bacterial compound by producing more cytokines. But, truth be told, they had no idea what they would see. Then the results came in. The blood taken from the individuals who had seen the disease slide shows produced 23.6 percent more bacteria-attacking cytokines (IL-6) than did the blood taken from the same individuals before the slide shows. But what about the individuals who saw the violent slides, the ones without images of disease? Perhaps the response was just due to stress. It was not. The blood of the individuals who had seen the violent slides did not change at all. Seeing signs of disease primed the participants’ immune systems to respond to a pathogen like E. coli. This happened simply because they saw the images. It happened subconsciously. It happened incredibly quickly and easily. If you walk out of your room and see someone coughing, the same thing is likely to happen to you.7

  What Schaller and Fincher have gone on to argue is that in addition to our immune system (and perhaps our hairlessness), we also fight disease through a behavioral immune system. That system is born in part of an emotion, disgust, which rises to our consciousness but also seems to directly affect our bodies, behaviors, and cultures. It seems possible that because of this system, in places where diseases are common, we are innately more likely to express behaviors that reduce disease risk. This may include xenophobia and other attributes of who we are. In addition, though, our behaviors are modulated by culture. Collectivism, for example, and other features of societies come to be encoded in taboos and norms. Norms may be shaped by the innate biology of individuals, but they also have a life all their own. Even if disease prevalence is reduced in a region, culture is likely to be slow to change. A case in point is the correlation that Fincher, Thornhill, Schaller, and others are able to show between disease prevalence and individualism. Our behaviors and cultures seem associated not with current disease prevalence, but instead with historical prevalence, the diseases we used to face a few hundred years ago. Old habits die hard, leaving us, once again, with ghosts of our past.

  How does all of this relate back to who you are today, wherever you are? It suggests that how you behave toward your friends and strangers is shaped not just, as you might hope, by your consciousness but by something deeper. The manifestations of this effect may include many aspects of our personalities and social behavior, but even if it includes nothing more than disgust, it is of consequence. Disgust evolved to prompt us to distance ourselves from disease-related stimuli and to trigger our immune systems to “get ready.” But the stimuli that trigger our disgust are imperfectly honed. Our minds seem to have evolved in such a way that they make mistakes in what they judge as signs of disease, perhaps because for a long time it was better to make a mistake and avoid someone who was not diseased than to make a mistake and not avoid someone who was.

  For those of us with the good fortune to live somewhere where infectious diseases are rare or, because of changes to our lives and public health, have become rare, there are many potential costs to responding to the wrong cues associated with disease (and, in the absence of many diseases, fewer costs to letting an occasional disease slip by). The most obvious cost is that our immune systems and behaviors related to defense against disease may become hyperactive. Notably, Schaller’s study of the response of individuals to disease stimuli showed them pictures of sick people, pictures not unlike those we see on TV every day. Could our bodies be reacting not just to actual sick people but also to television sick people? No one knows.

  A more insidious, and perhaps more significant, cost of our bodies’ misreading the signs of disease has to do with the social groups that our bodies lead us, subconsciously, to avoid. Schaller has begun to argue (and in bits and pieces to demonstrate) that many of the attributes of old age, diseases that are not infectious (such as morbid obesity), and disabilities trigger our disgust reaction. If so, they do so by accident, our subconscious minds having mistaken the signs of age, obesity, or disability for indications of infectious disease. Schaller has shown that when individuals perceive disease to be a threat, they are more likely to act in ways that can be construed as ageist.8 Similar results hold for our perceptions of the obese, which are worse when we are worried about getting sick. These responses, if real and general, have broad implications for how we deal with the aging, disabled, and chronically ill in our societies. That the elderly, the chronically sick, and the disabled can become marginalized in many modern societies is beyond doubt. That such marginalization is a result of our misplaced evolutionary disgust, disgust that evolved to save us from disease, is more speculative, but plausible. Regardless, it seems that our behavioral immune system, whose intricacies we have yet to fully understand, is only partially functional in the world we have made for ourselves. It tugs subconsciously at our actions and our immune systems even before we are able to consider what is right or wrong.
