Mothers and Others


by Sarah Blaffer Hrdy


  To my knowledge, Karlen Lyons-Ruth is the first child psychologist engaged in clinical practice to attempt to integrate Bowlbian attachment theory with new findings about humankind’s legacy of cooperative breeding. It is part of her search to understand the peculiar need for infant-caretaker attunement that she and her colleagues have been documenting in our species. “As the explicit sharing of intentional states became a more powerful force in human evolution,” she wrote in 2005 together with her colleague Katherine Hennighausen, “this shift also affected the infant-parent attachment system, moving the center of the attachment relationship to primarily intersubjective processes.” All infant primates are soothed by close bodily contact, but in humans the sharing of emotional cues became a more important part of the quest for continuing commitment.24

  Post-Bowlby, generations of developmental psychologists, infant psychiatrists, and psychoanalysts have worked to demonstrate the importance of early attachments for security and self-confidence as infants gradually learn how to regulate their emotions. We already know that early in development these little connoisseurs of commitment become attuned to facial expressions, rhythms, and tones of voice—the entire spectrum of cues with which caretakers (most of this research was done with mothers, of course) signal how sensitive they are to the infant’s mental state and needs.25 Now for the first time, a growing awareness of this unusual dimension to the needs of human infants, different from the emotional needs of other apes, is being combined with an evolutionary explanation for why this should be so.

  When a (usually older) child complains that “no one understands me,” and we ask ourselves why a child would care, a big part of the answer has to be that we descended from creatures whose minds were pre-adapted to evaluate the understanding and commitment of others. No other social creature is capable of feeling quite so “lonely” even when surrounded by familiar conspecifics. Beginning with Émile Durkheim on anomie and continuing with Robert Putnam in Bowling Alone, Shelley Taylor in The Tending Instinct, or John Cacioppo and William Patrick in Loneliness, a number of distinguished writers have commented on how the centrifugal pressures of modern life are diminishing our sense of community. The modern emphasis on individualism and personal independence along with consumption-oriented economies, compartmentalized living arrangements in high-rise apartments or suburban homes, and neolocal residence patterns combine to undermine social connectedness.

  But from my perspective as an evolutionist interested in the role that childrearing played in the evolution of prosocial impulses, the trouble started earlier. All through the Pleistocene, infant survival depended on the ability of infants to maintain contact and solicit nurture from both mothers and others. If, in African foraging societies like those of the Efe or the Aka, children grew up feeling surrounded by responsive caretakers, it was because as a matter of fact they were. Those who were not were unlikely to survive. No wonder these children learned to perceive their world as a “giving place.” Within the first two years of life, infants fortunate enough to be reared in responsive caretaking relationships develop innate potentials for empathy, mind reading, and collaboration, and often do so with astonishing speed. Such behavior is the outcome of complex interactions between genes and nurture, and this drama is played out on the stage of the developing brain. Thus, the development of innate potentials is far from guaranteed.

  The end of the Pleistocene marked a consequential divide in the way children were raised, as people began to settle in one place, build walled houses, and grow and store food. While predation rates declined, malnutrition remained a problem, and deaths from diseases like malaria and cholera actually increased. Nevertheless, child survival became increasingly decoupled from the need to be in constant physical contact with another person, or surrounded by responsive, protective caretakers, in order to pull through. Many other things began to change as well. For one thing, girls growing up in sedentary agricultural societies reached puberty sooner and became capable of giving birth at younger ages. Among foragers, any girl sufficiently well-fed to ovulate in her early teens was, almost by definition, a girl surrounded by supportive kin, people who after she gave birth were likely to be willing to help her rear her young. After the Pleistocene, and increasingly over the ensuing centuries, even young women still psychologically immature and woefully lacking in sympathy or social support could nevertheless be well-fed enough to ovulate and conceive in their early teens.

  Cultivated fields, livestock, and food stores were accompanied by population growth and social stratification, and with them the need to defend property and, even more than before, to defend women as well. Property, higher population densities, and larger group sizes all put new pressures on men to remain near fathers and brothers, their most reliable allies. “In-group amity” as a way to survive in the face of “out-group enmity” took on greater importance. With men remaining near their own kin, it was women who moved—either exchanged between groups or perhaps captured. With a diminished role for the mother’s kin in rearing young, old compunctions against raiding with the purpose of taking women by force began to fade.

  As property accumulated and residence patterns also became more patrilocal, inheritance patterns became patrilineal. Male heirs were better positioned to hold on to intergenerationally transmitted resources. Such developments led to an increased emphasis on being certain about paternity. As cultures emphasizing female chastity flourished, women’s freedom of movement was severely curtailed. No longer could women use sexuality to line up extra “fathers”; no longer could daughters move to be near kin at birth, or mothers move to be near daughters who needed their help. Increasingly, young women found themselves giving birth for the first time far from their own mothers and sisters, more likely to be in competition with, rather than bonded to, the women they saw around them.

  More important, patriarchal ideologies that focused on both the chastity of women and the perpetuation and augmentation of male lineages undercut the long-standing priority of putting children’s well-being first. Customs such as sequestration of women, chaperoning, veiling, and suttee took a huge toll on women, but they also took a toll on children. With settled lifestyles, intervals between births were already growing shorter. At the same time, the need for competing clans to out-man rivals put even greater emphasis on large numbers of heirs, particularly males. The fecundity of women took priority over the health or quality of life of any individual child. Conventions that kept men separated from women and children discouraged the development of the nurturing potentials of fathers, depriving children of yet another source of allomaternal care.

  Fast forward now to the modern postindustrial era, as patriarchal institutions have begun to lapse and women in many parts of the developed world have begun to regain considerable freedom of movement and control over reproduction and mating choices. As always, though, mothers still need a tremendous amount of help to successfully rear their young, and yet they often reside far from supportive kin. Among many immigrants from the Old World to the New, or more recently from Latin America to the United States, extended kin were left far behind, and mothers in these truncated families were forced to abandon older traditions of childrearing and invent new ones.

  As mothers began to work outside their homes, in locations incompatible with childcare, many became accustomed to using paid allomothers, often crèching infants together in one supervised place. The highest-quality daycare centers do an excellent job of simulating the nurture on offer from extended families, with high ratios of adults to infants and stable cadres of responsive caretakers. But daycare of this caliber is not necessarily available, or, if available, rarely affordable. Many women, who for the first time in the history of our species have a choice, are opting to delay childbirth or forgo it altogether. Yet those children who do come into the world are now surviving at higher rates than ever before.

  Child mortality in developed countries has plummeted. More than 99 percent of those born survive to age 5, and those who do not are more likely to die from accidents (automobiles being the biggest killers) than from malnutrition or disease. Meanwhile, in the developing world, child mortality from disease and malnutrition remains high, and in war-torn or AIDS-stricken regions with burgeoning populations of under-nurtured orphans, children's chances of surviving are little better than they were in the Pleistocene. But as everywhere in the post-Neolithic era, survival of even the neediest youngsters has become largely decoupled from the responsiveness of caretakers. And perhaps for the first time in human history, exceedingly high rates of child survival coincide with sobering statistics about the emotional well-being of children.

  In a finding that is not so surprising, developmental psychologists report that as many as 80 percent of children from populations at high risk for abuse or neglect grow up confused by or even fearful of their main caretakers, suffering from a condition known as “disorganized attachment.” Far more unsettling is the finding that 15 percent of children in what are described as “normal, middle-class families,” children not ostensibly at special risk, are also unable to derive comfort from or to constructively organize their emotions around a caretaker they trust; these children too exhibit symptoms of disorganized attachment.26

  From the outset, there were always a number of children who could not be categorized using attachment theory’s classic designations of “secure” versus “insecure” attachment.27 In 1990, the psychologist Mary Main at the University of California, Berkeley, recognized that many of these difficult-to-classify children seemed dazed or disoriented. Some appeared dissociated from where they were, or would suddenly freeze for no apparent reason, as if alarmed by the proximity of their caretaker and paralyzed by their own contradictory emotions of fear and need. As Main put it, the attachment figure is normally “the primate infant’s haven of safety in times of alarm,” but not for these children. She hypothesized that infants repeatedly exposed to frightening behavior by their caretakers, or whose caretakers seemed to be frightened themselves—rendering them insensitive or unresponsive to infants’ needs—encountered an irreconcilable dilemma that left them unable to mount any coherent strategy to elicit the attention and nurturing they required. She called this disorganized attachment.28

  So far, follow-up studies of these children extend only as far as the late teens, but already we know that by the time they reach school age, children classified with disorganized attachment as infants have difficulty interpreting the feelings of others, are significantly more aggressive toward their peers, and are prone to behavior disorders.29 Patterns of attachment between infants and their caretakers have not been studied long enough for psychologists to be able to say whether they might be changing over time, or whether they are predictive of adult behavior and emotional health. But what we can confidently surmise is that prior to about 15,000 years ago, the conditions leading to a serious attachment disorder in a child would not have been compatible with that child’s survival.

  Perverse as it sounds, when viewed this way, it appears that children today have begun to survive too well. Pleistocene parents and other kin were selected to respond to grave threats to their children’s survival—predation and starvation—by providing constant physical protection. As they held infants and passed them around to provisioning group members, who in the course of these intimate interactions became emotionally primed to nurture their charges, parents and alloparents communicated their commitment to the children in their group. Back in the Pleistocene, any child who was fortunate enough to grow up acquired a sense of emotional security by default. Those without committed mothers and also lacking allomothers responsive to their needs would rarely have survived long enough for the emotional sequelae of neglect to matter. Today, this is no longer true, and the unintended consequences are unfolding in ways that we are only beginning to appreciate.

  ARE WE LOSING THE ART OF NURTURE?

  As in all higher primates, only more so in the human case, prior experience and learning loom large in the way mothers and allomothers nurture infants in their charge. Compared with other mammals, like dogs or cats, human mothers have a near absence of what ethologists call “fixed action patterns.” Nurture, in our species, is more nearly an art form passed down from mothers or others to subsequent generations. Contrary to the notion of a “maternal instinct,” a person’s responsiveness to the needs of infants is to a large degree acquired through experience—through both the experience of nurturing and the experience of being nurtured. As we have seen throughout this book, both males and females start out with an innate capacity for empathy with others and for nurture, but past experiences along with proximate cues are critically important for the development and expression of nurturing responses. A study of foster mothers and the way they responded to their charges, undertaken by the University of Delaware psychologist Mary Dozier and her colleagues, illustrates my point here.

  Fifty infants between birth and 20 months of age were placed with women who had no biological relationship to them. Prior to placement, each of these foster mothers was asked to describe her own attachment experiences as a child, during an in-depth “Adult Attachment Interview.” The interviews were recorded, transcribed, coded, and classified by four independent, specially trained raters. Some of the foster mothers clearly remembered and valued their own early attachment relationships. Others were more dismissive about them. In their analysis, the researchers took into account race, socioeconomic status, number of prior placements, and especially the age of the infant at the time he or she was placed with a particular foster mother. Age at placement mattered, as we might expect. But the single best predictor of how securely attached an infant would become to a given caregiver turned out to be the way the foster mother recalled her own childhood experiences. Her state of mind about her past relationships dwarfed other effects.30

  It is well known that genetics plays a role in personality development, and of course these babies did not arrive in foster care as “blank slates.” Just as in other primates, some individuals are innately calm while others are more reactive. Some children are extroverted, others shy; and such traits are clearly heritable. However, attachment styles are known not to be heritable in the same way, and certainly in this instance the degree of concordance between the attachment styles of caregivers and their charges was clearly not due to shared genes.31 Rather, the quality of the attachment relationship that babies forged reflected the emotional state of the allomothers currently providing their care.

  Human infants are born monitoring the intentions of others, and by the second year of life their increasingly sophisticated sense of self, along with their awareness of the connections between self and others, helps them to understand the various goals that someone else might have in mind, as well as to communicate their own. These capacities provide the underpinnings for inter-individual communication and cooperation.32 Children cared for by responsive others exhibit a high potential for collaboration, and this may help explain why infants who are classified as securely attached become better at making friends in preschool.33 But “equally impressive,” Lyons-Ruth reminds us, “is the potential for derailment.”34

  We are learning that a subset of children today grow up and survive to adulthood without ever forging trusting relationships with caring adults, and their childhood experiences are likely to be predictive of how they in turn will take care of others. For hundreds of thousands of years, an interest in mind reading and in sharing mental and emotional states has provided the raw material for the evolution of our unusually prosocial natures. But if the empathic capacities of infants find expression only under certain rearing conditions, and if natural selection can only act on genetic traits that are actually expressed in the phenotype, perhaps we need to ask how even the most useful innate predispositions can persist if their development is not encouraged.

  After all, “the” human species is no more static than other species are. If our environment changes (or, more pertinent in the human case, as we transform our environment), we change with it. So why wouldn’t novel modes of childrearing continue to shape not just child development but human nature? To anyone who wonders if processes postulated in this book could ever be reversible, I would say that there is no reason why not. Just because humans have become “advanced” enough to vaccinate their young, write histories, and speculate about our origins, this does not mean that evolutionary processes have ceased to operate.

  Far from it. Indeed, some anthropologists such as Henry Harpending at the University of Utah and John Hawks of the University of Wisconsin are convinced that over the last 40,000 years or so—since the Upper Paleolithic and especially since the Neolithic—selection on our species has actually accelerated as human activities and population pressure transformed local environments and as an exponentially expanding population generated many more mutations for selection to act on. The best-documented cases of post-Pleistocene selection involve adaptations for resisting new diseases like cholera, smallpox, yellow fever, typhus, malaria, and, more recently, AIDS, as well as digestive mechanisms for coping with novel diets.35 But there is no reason why cognitive and behavioral traits would be any less susceptible to ongoing selection than digestive enzymes.

  Indeed, Hawks argues that some of the fastest-evolving genes in the human genome are those associated with the development of the central nervous system. His views are consistent with the discovery of new genetic variants responsible for increased brain size that are probably no more than 6,000 years old. Under strong positive selection, these variants have spread rapidly.36 As one evolutionist has quipped: “The ten or so [hominin] species that preceded modern humans came and went at a rate of about 200,000 years per species. Ours began some 130,000 years ago, so we could be just about due for a change.”37 It will not matter how spectacularly well prosocial tendencies served humans in the past if the underpinnings for such traits remain unexpressed and thus can no longer be favored by selection. Over evolutionary time, traits no longer used eventually disappear.

 
