by Jesse Bering
Let’s also not forget that many individuals are put off by the idea of cunnilingus or fellatio because of those pesky pubic hairs that can lodge inadvertently in their gratifying throats. In fact, this was the theme of an episode of Curb Your Enthusiasm, where Larry David had to embarrassingly explain this bothersome tickle to a rather serious doctor. But now this is turning into a different type of story altogether.
In any event, pubic hair coiffure is not a zero-sum game. Typing “pubic hair styles” into my Google search bar yielded 467,000 hits at the time of this writing, every single one of which I was hesitant to click on—until I got home from the public library, of course.
Bite Me: The Natural History of Cannibalism
While I was strolling not so long ago through one of the dimly lit backrooms of a wing in the National Galleries of Scotland, my inner eye still tingling with thousands of Impressionist afterimages, pudgy Rubensian cherubs, and Gothic quadrangles, one irreverent painting leaped out at me in a very contemporary sort of way. It was part of an early-sixteenth-century triptych showing what appeared to be a solemn, middle-aged clergyman in gilded ecclesiastical robes commanding three naked adolescent boys before him in a bathtub.
Now, I must say, my first thought on seeing this salacious image was that the Catholic Church has been an ephebophile’s haven for far longer than anyone has ever realized. But my uneasiness was put to rest once I leaned in to read the caption, which stated that the Dutch artist Gerard David, a prolific religious iconographer based in Bruges, Belgium, was merely painting a scene of starvation cannibalism. Phew! What a relief that it was only an innocent case of anthropophagy (the eating of human flesh by humans) and nothing more sinister than that. The boys had been killed by a butcher, you see, and their carcasses were being salted in a makeshift vat awaiting ingestion by famished townspeople. Fortunately, that most notorious child lover himself, Saint Nicholas—the middle-aged clergyman—just happened to be passing through town when he caught wind of the boy-eating scandal and resurrected the lads in the tub.
In any event, my time in Edinburgh offered plenty of food for thought on the subject of human meat. From the art gallery, my partner, Juan, and I galloped over to the Surgeons’ Hall Museum, where we wandered through aisles packed floor to ceiling with pickled gangrenous feet, hairy severed arms of industrial-age elderly women, trephined heads, and sundry sickly genitals. Also on display was an elegant leather notebook, composed of a substance resembling cowhide but, in fact, made of the skin of the famous corpse supplier-cum-murderer William Burke.
And all of this got me thinking about the logistics of cannibalism. The slick commercialization of the food industry has changed things dramatically, but there were, at one time, relatively frequent conditions—crop failures, habitat depletion, famine—in which cannibalism would have had lifesaving adaptive utility for our species. One pair of anthropologists, for example, actually crunched the numbers, concluding that the average human adult provides sixty-six pounds of edible food, including fat, connective tissue, muscle, organs, blood, and skin. Protein-rich blood clots and marrow are said (by the rare connoisseur) to be special treats. At least one prominent evolutionary theorist, Lewis Petrinovich, has argued that cannibalism is a genuine biological adaptation common to all human beings—including those of you gripping the toilet seat as you’re reading this.
Anthropophagy routinely emerges, says Petrinovich, under predictable starvation conditions, and at least during our early evolution human cannibalism was not as rare as you might think. Today the term “cannibalism” conjures up sensationalistic stories of plane crashes in remote regions of the Andes, serial killers, or failed nineteenth-century expeditions to the Arctic. But our deeper history suggests that it would not have been an entirely uncommon occurrence. “The point is that cannibalism is in the human behavioral repertoire,” writes Petrinovich in The Cannibal Within, “and probably is exhibited for a number of reasons—a common one being severe and chronic nutritional deprivation. A behavior might be exhibited only under extreme circumstances and still be part of our biological inheritance, and the fact that its course follows a systematic pattern argues against the hypothesis that it is psychotic in character.”
Petrinovich wends his way through a human history littered with the gnawed-on bones of our cannibalized ancestors, revealing that—contrary to critiques arguing that man-eating is a myth conjured up by Westerners to demonize “primitives”—we really have been gobbling each other up for a very, very long time. We’re just one of thirteen hundred species for which intraspecific predation has been observed. Among primates, cannibalism can usually be accounted for by nutritional and environmental stress, or it appears as a reproductive strategy in which baboons, for example, consume unhealthy infants to make way for more viable offspring.
Pinpointing the specific factors that cause cannibalism is a rather difficult affair in the laboratory, mainly because of those pesky university ethics review boards. Still, an intrepid Japanese researcher shrugged off these considerations and induced cannibalism among a captive population of squirrel monkeys by feeding the pregnant females a low-protein diet. This led to a high rate of spontaneous abortion and the mothers’ devouring their aborted fetuses—a much-needed bolus of protein. Now imagine doing this same study with human beings under similar controlled laboratory conditions. Rather horrific, I should say, but that doesn’t mean the findings couldn’t generalize to our own species. And don’t get me started on the many ways that mammalian mamas feast on placental afterbirth. Some of our own prefer it with a dash of paprika, others as a spaghetti and meatballs dish.
But the fact that cannibalism in primates, including human beings, is motivated by starvation is precisely the point that Petrinovich is arguing. Where he differs from other evolutionary theorists, however, is in his assertion that anthropophagy represents a true adaptation in our species, just as cannibalism does for other animals. It is not simply an anomalous behavior found in a handful of depraved individuals. Such people do exist, to be sure—like this man who was so curious to know what his own flesh tasted like that he … well, I’ll let the clinical psychiatrists who examined him tell you in their own words:
After he cut the first toe, he first showed it to his flatmates before he ate it raw while he walked the streets. He chewed as much of the bone as possible and then spat it out. He recalls eating it “for the experience” and that it was a “once in a lifetime opportunity to eat human flesh.” He was excited by the shock value of doing so. The second toe was cooked in an oven before eating. In between cutting his toes he continued to work on renovating houses.
That man is presumably wearing special orthopedic shoes now. But again, whereas cannibalism can certainly be deviant, in other cases it’s even somewhat routine. Our close cousins the Neanderthals were essentially carnivorous predators and were driven to cannibalism at the end of the last glacial maximum in the face of dwindling numbers of large game animals. Osteoarchaeological research at a cave in southeast France yielded the roasted, haphazardly discarded bones of about six Neanderthals, deliberately defleshed and disarticulated, with the marrow extracted.
As for our own species, the Aztec were notorious for their bloodthirsty sacrifice and cannibalism rituals. These were largely symbolic religious events, but some scholars have suggested that the greasy surfeit of Aztec sacrifice victims may have also been a high-energy nutritional supplement for the wealthy elite, who had first dibs on this so-called “man corn.” Actually, noncannibals may be the outliers, both historically and cross-culturally speaking. Researchers have documented evidence of ritual anthropophagy throughout societies in Africa (Zandeland, Sierra Leone, the Belgian Congo), South America (eastern Brazil, Ecuador, western Colombia, Paraguay), Melanesia (Fiji, Papua New Guinea, Vanuatu, the East New Guinea Highlands), and Native America. It’s appeared in industrialized societies, too, including famine-stricken China during the Great Leap Forward (1958–62) and Soviet-era Russia.
The bottom line, says Petrinovich, is that when you’re hungry enough, ravenous really, and when all other food sources—including “inedible” things you’d rather not stomach such as shoes, shoelaces, pets, steering wheels, rawhide saddlebags, or frozen donkey brains—have been exhausted and expectations are sufficiently low, even the most recalcitrant moralist among us would shrug off the cannibalism taboo and savor the sweet meat of man … or woman, boy or girl, for that matter. It’s either that or die, and between the two choices only one is biologically adaptive.
A behavior can be adaptive without being an inherited biological adaptation, of course. But because starvation occurred with such regularity in our ancestral past, and because the starving mind predictably relaxes its cannibalistic proscriptions, and because eating other people restores energy and sustains lives, and because the behavior is universal and proceeds algorithmically (we eat dead strangers first, then dead relatives, then live slaves, then live foreigners, and so on down the ladder to live kith and kin), there is reason to believe—for Petrinovich at least—that anthropophagy is an evolved behavior. The taboo against cannibalism is useful in times of health and prosperity; groups wouldn’t survive very long if members were eating one another up. Yet starvation has a way of releasing the cannibal within.
In fact, some scientists have suggested that starvation cannibalism may have been so prevalent in the ancestral past that it literally changed our DNA. Modern human populations appear to contain genetic adaptations designed specifically to combat the pathogens passed along by eating other people. Typically, when a predator species consumes a prey species, there are substantive differences between the two animals’ immune systems and between the varieties of pathogens they carry. But the more similar the eater and the eaten, the more vulnerable the former is to debilitating food-borne disease. This is because organisms can be compromised only by parasites that have adapted to the particular environment of the host species; they require a recognizable genetic substrate to thrive.
According to the microbiologist Carleton Gajdusek, who won the 1976 Nobel Prize in Physiology or Medicine for his epidemiological research on cannibalism, this is what almost certainly happened with the New Guinea Fore people in the case of kuru, a neurodegenerative disease that devastated that population in the first half of the last century. Gajdusek traced the disease to mortuary cannibalism; women and children were eating the brains of the recently deceased as part of the local funerary rites. (Brain consumption was a ritual act, but it spiked in frequency—perhaps not coincidentally—whenever pork had fallen into short supply, so human brains also supplied a welcome dose of protein.) The interesting thing is that kuru is a variant of Creutzfeldt-Jakob disease (CJD) and probably resulted, originally, from a single case of cannibalism among the Fore of a CJD-ridden brain, with kuru then evolving on its own course. In an issue of Current Biology, the geneticist John Brookfield speculated that over the past 500,000 years, human beings have developed increasing variation in the gene for the human prion protein. Those who are heterozygous for this gene, he points out, were protected against CJD-like diseases contracted through cannibalism. “This sustained heterozygote advantage [was possibly] created by a lifestyle of habitual cannibalism, implying a new vision of the lifestyles of our ancestors.”
As we’ve seen, not all cases of cannibalism are due to nutritional needs. Sociopathic individuals such as Jeffrey Dahmer, Armin Meiwes, and Issei Sagawa lived in urban environments peppered with fast-food restaurants and overflowing grocery stores, yet still they dined on people. In SuperSense, the psychologist Bruce Hood argues that such cases reflect essentialist beliefs, the idea that the victims’ hidden “essences” or personality attributes are acquired by physical ingestion. It’s also interesting that many such cases have a sexual component. As Margaret St. Clair wrote teasingly in the foreword of To Serve Man: A Cookbook for People: “There is no form of carnal knowledge so complete as that of knowing how somebody tastes.” I suspect there’s some truth to that uncomfortable joke. Essentialist beliefs may account for our species’ peculiar history of medical cannibalism as well. The conquistadores and their New World heirs were known to have used human fat from agile natives to grease their arthritic joints. Long before Armin Meiwes was even a twinkling in his mother’s eye, pregnant Aché women of Paraguay were nibbling on boiled penises in the hopes that it would bring them sons.
So with all of these scenes swimming in my head, and pragmatist that I am, I’m left wondering why, exactly, it is that the consumption of already dead human bodies is such a taboo, especially for societies in which the soul is commonly seen as flitting off at death like an invisible helium balloon. If you subscribe to such dualistic notions, after all, the body is only an empty shell that the now-liberated spirit no longer needs. Even resurrectionists could gleefully feed the impoverished with their own flesh, lest they, God forbid, allow such a bounty of edible meat to go to rot. All those wasted commercial goods, burned down to dry, gravelly dust in crematories, squirreled away behind ornate vaults, fed extravagantly to bloated subterranean organisms! If you’d rather not eat meat from aged or possibly diseased dead people, and if you’re worried about the dignity of the individual, it would be easy enough to breed and then factory-farm brain-dead or free-ranging anencephalic human beings, treating them humanely, of course, but enforcing food safety standards to control for any outbreaks.
After all, let us not forget those starving people of this world, surrounded by—as some epicures swear—the most succulent meat on the planet.
The Human Skin Condition: Acne and the Hairless Ape
Humans are pimply. It’s part of what sets us apart from the rest of the animal kingdom. While it’s true that some form of acne vulgaris affects other species—it’s been found in some Mexican hairless dogs and induced experimentally in rhino mice—acne is largely an affliction of our accursed species alone. (Somewhere between 85 and 100 percent of adolescents exhibit acne—and a significant minority of adults, too.) Why is the human animal so peculiar in its tendency to form volcanic comedones, papules, pustules, nodular abscesses, and, in some severe cases, lasting scars? According to the evolutionary theorists Stephen Kellett and Paul Gilbert, we probably owe these unsavory blemishes to our having lost our apish pelts too rapidly for our own good.
Although increasingly glabrous (hairless) skin probably evolved for adaptive purposes—it may have enabled our ancestors to keep cool, for example, while traveling across the hot savanna—the sure-footed pace at which genes for depilated flesh were selected posed some cosmetic problems. Kellett and Gilbert observe that the evolution of our sebaceous glands, which were accustomed to dealing with hair-covered flesh, lagged behind this change in our appearance. As a consequence, all that oily and waxy sebum, normally committed to lubricating fur, hadn’t much fur to lubricate. So the sebum started to build up and clog our pores instead. (There are many issues that a person suffering from hypertrichosis—also known as werewolf syndrome—has to worry about, but acne tends not to be one of them.) Better this evolutionary account than pimples by intelligent design, in any event. What a heartless God indeed that would wind up the clock so that our sebaceous glands might overindulge in sebum production precisely at the time in human development when we’d become most acutely aware of our appearance.
It only makes matters worse that evolution has given us another distinctly human trait, and one that makes any outbreak of acne infinitely more upsetting. I’m referring to our crippling sensitivity to other minds. Although this statement is not entirely without controversy, it seems likely, based on the available evidence, that other species do not share our fine-tuned facility at taking on the rich psychological perspective of others. If this is so, then seeing the flash of disgust, or even a more innocent curiosity, reflected in other human eyes whenever they steal away to our physical flaws triggers in us an aversive state entirely original to our species. Anyone who has ever had a ripe, loathsome pimple placed strategically upon the tip of her nose by the epidermal fates has felt this painful interpersonal state.
Consider a scene from Jean-Paul Sartre’s No Exit, in which three strangers come to realize that they’ve just been cast to hell, which is, strangely enough, an average, furnished drawing room. The Devil’s insidious rub, however, is that there are no windows, no mirrors, and no sleep permitted in this room. Even the characters’ eyelids are paralyzed, disallowing them the simple luxury of blinking. Their exquisite little torture is for all eternity to be under one another’s unrelenting glare. Inez, a sadistic lesbian, knows just how to push the buttons of the other female in the room. “What’s that?” she asks, examining Estelle’s face. “That nasty red spot at the bottom of your cheek. A pimple?” “A pimple?” replies the frantic, mirror-deprived, pampered debutante Estelle. “Oh, how simply foul!”
Sartre’s chthonic allegory bears a striking resemblance, in fact, to the sort of living hell that many acne sufferers report experiencing on an everyday basis. For a report in the British Journal of Health Psychology, for instance, the psychologists Craig Murray and Katherine Rhodes interviewed around a dozen members of an online acne support group, who’d been prescribed antibiotics or hormone treatments for their condition and suffered from acne for at least a full year. “Michelle” eloquently describes what it feels like to meet someone new, face-to-face: