Love at Goon Park: Harry Harlow and the Science of Affection


by Deborah Blum


  Terman said it didn’t matter whether Harry was Jewish or not; the problem was that his name sounded Jewish. “He also indicated that even though the depression had already hit they would keep me on some kind of basis for the forthcoming year.” As it turned out, Harry didn’t need the extra support from Stanford. Shortly after his conversation with Terman—while the young psychologist was still considering his department head’s proposal—a job offer came through. The University of Wisconsin had sent a one-line telegram to Harry Israel. It asked, “Will you accept an assistant professorship at the UW paying $2,750 a year?” In a heartbeat. He was packed, on the road, out of there. He almost left California, though, still as Harry Israel.

  But Harry’s last name still troubled Terman. In his letter to Wisconsin recommending Harry Israel, Terman had acknowledged the Jewish sound of the name. He then assured the potential employers that Dr. Israel was not “that kind of Jew.” He called Harry into the office and said that he was glad about the job but he still thought the name Israel was just wrong, too negative. It would continue to hold him back. Didn’t Harry want to have a great future, not just an ordinary one? Of course he did. This was the Harry Israel who had written down “fame” as his ambition in his high school yearbook. He still desperately wanted to please Terman in some way, to prove himself beyond that junior college designation. And “I had seen anti-Jewish prejudice and did not want any son or particularly daughter of mine to go through it.” Okay, Harry said, give me a new name. Terman suggested that Harry choose a name that at least belonged to his family. Harry came up with two possibilities: Crowell, after an uncle; and Harlow, from his father’s middle name. “Terman chose Harlow and, as far as I know, I am the only scientist who has ever been named by his major professor.”

  By the time the news reached Fairfield, Stanford was already printing the graduation program, and listed under Ph.D. graduates was that new man again, Harry F. Harlow of Palo Alto, California. Only once, and only to a close friend, did Harry say that he regretted the change, that it seemed to dishonor whatever Jewish ancestry he had, be it only the 1/64th that ran in the Israel family. That was long after he’d left Stanford, and after World War II and Adolf Hitler had put an end to the fancy—if people ever really believed it—that there could be benign prejudice toward any people, that such attitudes were mere silliness. His fall-back position—as always—was a joke. In an interview in Psychology Today, he told it like this: “So I became a Harlow. I guess I’m not alone. Once a man called me up and said he was looking up the Harlow ancestry. I said I was sorry, but I had changed my name. ‘Oh, heavens, not again,’ he replied. ‘Everyone named Harlow that is worth a damn has changed his name.’”

  Changing the name, of course, doesn’t change the person. Years later, one of Harry’s best-known post-doctoral researchers, California psychologist William Mason, would wonder which man was the real Harry: the Wisconsin crusader called Harry Harlow or the shy loner from Iowa named Harry Israel. “What was the real man like?” Mason asked. “Very complex.” There’s one aspect of the almost forgotten Harry Israel, though, that remains straightforward: He understood that you could win a lost cause. Against both odds and expectations, he’d become a Stanford-trained research psychologist. During his career, that belief—that you should rarely declare a battle lost—would guide many of his most defiant research choices. Eventually, his fondness for unpopular causes would lead him to labor for love. And to appreciate what a lost cause that was in the world of mid-twentieth-century psychology, you must appreciate the depth and righteousness of the opposition to love as part of daily life. Psychologists argued vehemently against cuddling children. Doctors stood against too close contact between even parent and child. There was real history behind this, built on experience from orphanages and hospitals, built on lessons learned from dead children and lost babies. There were careful experiments and precise data and numerical calculations of behavior to prove that emotions were unnecessary and unimportant. There was a decades-thick wall of research and an army of researchers to counter any upstart psychologist, including Harry Harlow. Naturally, it was just the kind of challenge that appealed to him most.

  TWO

  Untouched by Human Hands

  The apparent repression of love by modern psychologists stands in sharp contrast with the attitude taken by many famous and normal people.

  Harry F. Harlow,

  The Nature of Love, 1958

  THE FRUSTRATING, IMPOSSIBLE, TERRIBLE thing about orphanages could be summarized like this: They were baby killers.

  They always had been. One could read it in the eighteenth-century records from Europe. One foundling home in Florence, the Hospital of the Innocents, took in more than fifteen thousand babies between 1755 and 1773; two thirds of them died before they reached their first birthday. In Italy, around the same time, there were so many orphanage deaths that residents of Brescia proposed that a motto be carved into the foundling home’s gate: “Here children are killed at public expense.” One could read it in the nineteenth-century records from American orphanages, such as this report from St. Mary’s Asylum for Widows, Foundlings, and Infants in Buffalo, New York: From 1862 to 1875, the asylum offered a home to 2,114 children. Slightly more than half—1,080—had died within a year of arrival. Most of those who survived had mothers who stayed with them. “A large proportion of the infants, attempted to be raised by hand, have died although receiving every possible care and attention that the means of the Sisters would allow as to food, ventilation, cleanliness, etc.”

  And yet babies, toddlers, elementary school children, and even adolescents kept coming to foundling homes, like a ragged, endless, stubbornly hopeful parade. In the orphanages, the death of one child always made room for the next.

  Physicians were working in and against an invisible lapping wave of microorganisms, which they didn’t know about and couldn’t understand. Cholera flooded through the foundling homes, and so did diphtheria and typhoid and scarlet fever. Horrible, wasting diarrheas were chronic. The homes often reeked of human waste. Attempts to clean them foundered on inadequate plumbing, lack of hot water, lack even of soap. It wasn’t just foundling homes, of course, where infections thrived in the days before antibiotics and vaccines, before chlorinated water and pasteurized milk. In the United States, more than one fourth of the children born between 1850 and 1900 died before age five. But foundling homes concentrated the infections and contagions, brought them together in the way a magnifying glass might focus the sun’s rays until they burn paper. The orphanages raised germs, seemingly, far more effectively than they raised children. If you brought a group of pediatricians together, they could almost immediately begin telling orphanage horror stories—and they did.

  In 1915, a New York physician, Henry Chapin, made a report to the American Pediatric Society that he called “A Plea for Accurate Statistics in Infants’ Institutions.” Chapin had surveyed ten foundling homes across the country; his tally was—by yesterday’s or today’s standards—unbelievable. At all but one of the homes, every child admitted was dead by the age of two. His fellow physicians rose up—not in outrage but to go him one better. A Philadelphia physician remarked bitterly that “I had the honor to be connected with an institution in this city in which the mortality among all the infants under one year of age, when admitted to the institution and retained there for any length of time, was 100 percent.” A doctor from Albany, New York, disclosed that one hospital he had worked at had simply written “condition hopeless” on the chart as soon as a baby came into the ward. Another described tracking two hundred children admitted into institutions in Baltimore. Almost 90 percent were dead within a year. It was the escapees who mostly survived, children farmed out to relatives or put in foster care. Chapin spent much of the rest of his career lobbying for a foster care system for abandoned children. It wasn’t that he thought foster homes would necessarily be kinder or warmer—he hoped only that they wouldn’t kill children so quickly.

  By Chapin’s time, of course, thanks to researchers such as Louis Pasteur and Robert Koch and Edward Jenner, doctors recognized that they were fighting microscopic pathogens. They still didn’t fully understand how those invisible infections spread—only that they continued to do so. The physicians’ logical response was to make it harder for germs to move from one person to the next. It was the quarantine principle: Move people away from each other, separate the sick from the healthy. That principle was endorsed—no, loudly promoted—by such experts of the day as Dr. Luther Emmett Holt, of Columbia University. Holt made controlling childhood infections a personal cause. The premier childcare doctor of his time, he urged parents to keep their homes free of contagious diseases. Remember that cleanliness was literally next to godliness. And remember, too, that parents, who weren’t all that clean by doctors’ standards, were potential disease carriers. Holt insisted that mothers and fathers should avoid staying too close to their children.

  Before Holt, American parents usually allowed small children to sleep in their bedrooms or even in their beds. Holt led a crusade to keep children in separate rooms; no babies in the parental bedroom, please; good childcare meant good hygiene, clean hands, a light touch, air and sun and space, including space from you, mom and dad. And that meant avoiding even affectionate physical contact. What could be worse than kissing your child? Did parents really wish, asked Holt, to touch their baby with lips, a known source for transmitting infection?

  If parents had doubts about such lack of contact, Holt’s colleagues did not. In The Wife’s Handbook (with Hints on Management of the Baby), published in 1888, physician Arthur Allbutt also warned each mother that her touch could crawl with infection. If she really loved the baby, Allbutt said, she should maintain a cautious distance: “It is born to live and not to die,” and so always wash your hands before touching, and don’t “indulge” the baby with too much contact, so that “it”—the baby is always “it” in this book—may grow up to fill a “useful place in society.”

  In foundling homes, packed to the windows with abandoned children, there was no real way to isolate an ailing child—nor did anyone expect the foundlings to occupy many useful places in society. But administrators did their best to keep their charges alive. They edged the beds farther apart; they insisted that, as much as possible, the children be left alone. On doctors’ orders, the windows were kept open, sleeping spaces separated, and the children touched as little as possible—only for such essentials as a quick delivery of food or a necessary change of clothes. A baby might be put into a sterile crib with mosquito netting over the top, a clean bottle propped by its side. The child could be kept virtually untouched by another human being.

  In the early twentieth century, the hyperclean, sterile-wrapped infant was medicine’s ideal of disease prevention, the next best thing to sending the baby back to the safety of the womb. In Germany, physician Martin Cooney had just created a glass-walled incubator for premature infants. His Kinderbrutanstalt (“child hatchery”) intrigued both manufacturers and doctors. Because preemies almost always died in those days anyway, many parents handed them over to their physicians. Doctors began giving them to Cooney. He went on an international tour to promote the hatchery, exhibiting his collection of infants in their glass boxes. Cooney went first to England and then to the United States. He showed off his babies in 1901 at the Pan-American Exposition in Buffalo, New York. During the next two years, he and his baby collection traveled to shows as far west as Nebraska. Cooney settled in Coney Island, where he successfully cared for more than five thousand premature infants. Through the 1930s, he continued, occasionally, to display them. In 1933, he borrowed babies from Michael Reese Hospital for the Chicago World’s Fair and sold tickets to view the human hatchlings. According to fair records, his exhibit made more money that year than any other, with the exception of that of Sally Rand, the famed fan-dancer. The babies in the boxes were like miracles of medicine; they were alive when generations before them had died. Cooney said his only real problem was that it was so hard to convince mothers to take their infants back. Oddly enough, they seemed to feel disconnected from those babies behind the glass.

  Sterility and isolation became the gods of hospital practice. The choleras and wasting diarrheas and inexplicable fevers began to fall away. Children still got sick—just not so mysteriously. There were always viruses (measles, mumps, things we now vaccinate against) and still those stubborn bacterial illnesses that plague us today: pneumonias, respiratory infections, drearily painful ear infections. But now doctors took the position that even the known infections could be best handled by isolation. Human contact was the ultimate enemy of health. Eerily unseeable pathogens hovered about each person like some ominous aura. Reports from doctors at the time read like descriptions of battle zones in which no human was safe—and everybody was dangerous. One such complaint, by Chicago physician William Brenneman, discussed the risks of letting medical personnel loose in the wards. Nurses weren’t allowed enough sick leave, and they were bringing their own illnesses into the hospital; interns seemed not to appreciate that their “cold or cough or sore throat” was a threat. Physicians themselves, Brenneman added sarcastically, apparently felt they were completely noninfectious when ill, as long as they wore a long “white coat with black buttons all the way down the front.” How could you keep illness out of the hospital when doctors and nurses kept coming in?

  Brenneman, of Children’s Memorial Hospital in Chicago, thought children’s wards were similar to concentration camps, at least when it came to infection potential. He evoked the prison camps of World War I, where doctors had found that captured soldiers were crawling with streptococcus bacteria. Were wards so different? Tests had shown that 105 of 122 health workers at the hospital were positive for the same bacteria, a known cause of lethal pneumonias. “It is known what the streptococcus did in concentration camps during the World War. One is constantly aware of what it does in the infant ward under similar conditions of herding and massed contact.” The less time a child spent in the hospital, the better: that was Brenneman’s rule, and he urged doctors to send their patients home, or, if they had no home, into foster care, as quickly as possible. And if they had to be hospitalized? Push back the beds, wrap up the child quickly, and keep even the nurses away when you could.

  Harry Bakwin, a pediatrician at Bellevue in New York, described the children’s ward of the 1930s like this: “To lessen the danger of cross infections, the large open ward of the past has been replaced by small, cubicled rooms in which masked, hooded, and scrubbed nurses and physicians move about cautiously so as not to stir up bacteria. Visiting parents are strictly excluded, and the infants receive a minimum of handling by the staff.” One hospital even “devised a box equipped with inlet and outlet valves and sleeve arrangements for the attendants. The infant is placed in this box and can be taken care of almost untouched by human hands.” By such standards, the perfectly healthy child would be the little girl alone in a bed burnished to germ-free perfection, visited only by gloved and masked adults who briskly delivered medicine and meals of pasteurized milk and well-washed food.

  Hospitals and foundling homes functioned, as Stanford University psychologist Robert Sapolsky puts it today, “at the intersection of two ideas popular at the time—a worship of sterile, aseptic conditions at all costs, and a belief among the (overwhelmingly male) pediatric establishment that touching, holding and nurturing infants was sentimental maternal foolishness.” It wasn’t just that doctors were engaged in a quest for germ-free perfection. Physicians, worshipping at the altars of sterility, found themselves shoulder to shoulder with their colleagues in psychology, who directly reassured them that cuddling and comfort were bad for children anyway. They might be doing those children a favor by sealing them away behind those protective curtains.

  Perhaps no one was more reassuring on the latter point than John B. Watson, a South Carolina–born psychologist and a president of the American Psychological Association (APA). Watson is often remembered today as the scientist who led a professional crusade against the evils of affection. “When you are tempted to pet your child remember that mother love is a dangerous instrument,” Watson warned. Too much hugging and coddling could make infancy unhappy, adolescence a nightmare—even warp the child so much that he might grow up unfit for marriage. And, he cautioned, this could happen in a shockingly short time: “Once a child’s character has been spoiled by bad handling, which can be done in a few days, who can say that the damage is ever repaired?”

  Nothing could be worse for a child, by this calculation, than being mothered. And being mothered meant being cradled, cuddled, cosseted. It was a recipe for softness, a strategy for undermining strong character. Doting parents, especially the female half of the partnership, endowed their children with “weaknesses, reserves, fears, cautions and inferiorities.” Watson wrote an entire chapter on “The Dangers of Too Much Mother Love,” in which he warned that obvious affection always produced “invalidism” in a child. The cuddling parent, he said, is destined to end up with a whiny, irresponsible, dependent failure of a human being. Watson, who spent most of his research career at Johns Hopkins University, was a nationally known and respected psychologist when he trained his sights on mother love. Articulate, passionate, determined, he was such an influential leader in his field that his followers were known as “Watsonian psychologists.” And, like him, they came to consider coddling a child the eighth of humankind’s deadly sins. “The Watsonian psychologists regard mother love as so powerful (and so baneful) an influence on mankind that they would direct their first efforts toward mitigating her powers,” wrote New York psychiatrist David Levy in the late 1930s.

 
