by Ron Powers
In 1946, Truman signed the National Mental Health Act, which provided federal funding, for the first time ever, for research into the human mind. William Menninger, who by then was the head of the neuropsychiatry division of the US Army, helped draft the act. A key argument made by Menninger and others was that an infrastructure of sound psychiatric counseling would end up saving money when measured against the tremendous costs to society of incarcerating the insane. In the words of the historian Ellen Herman, they were “advocating that mental health, rather than mental illness, be the centerpiece of federal policy.”8
The act led to the formation of the National Institute of Mental Health in 1949. NIMH is now the world’s largest research organization that is devoted to mental illness. Its annual budget is $1.5 billion.
President Truman’s fight to guarantee public financing of mental health care was squarely in line with the progressive Democratic ideals of his time, and, in that sense, unsurprising. But Truman brought a special understanding to the enormous bulge in the numbers of mentally damaged Americans that World War II had produced. Truman knew what happened to combatants in twentieth-century warfare. He’d been one of them.
First Lt. Harry Truman had arrived in France in March 1918 with the 129th Artillery Regiment. He rose to the rank of captain, took command of Battery D, and directed artillery fire from forward positions during the horrific Meuse-Argonne Offensive the following autumn. It was the largest (and the most climactic) American engagement of the war, pitting twenty-two US and four French divisions against forty-seven German divisions across a twenty-mile front. The Thirty-Fifth Division, of which Battery D was a part, went into battle with 27,000 men and took 7,300 casualties, the highest rate suffered by any American division in the war.9
Adding to the casualties caused by bullets and shells was the unholy noise generated by the machines that fired the bullets and shells. World War I was history’s first war, the Civil War perhaps excepted, in which sound itself was a debilitating weapon—but a weapon that did not take sides. Battery D’s four 75mm howitzers contributed their small part to a universe of acoustic hell that often reached 140 to 185 decibels or more, levels that ripped men’s eardrums open and could be heard in London, two hundred miles and across a sea channel to the west.10 The maximum tolerable decibel level over a several-hour period is currently held to be about 85.11
In one of war’s infinite little ironies, the mission of Truman’s battery was to provide support for a nearby light tank brigade commanded by a captain named George S. Patton. Patton was destined for glory and an ambiguous legacy in World War II: “ambiguous” because his heroic record of lightning advances at the head of his Third Army was marred by two incidents in Sicily in which he slapped soldiers in hospital tents for their “cowardice.” Patton kicked one of these men out of the tent and drew a pistol on the other. At least one of these soldiers was recovering from shell shock; the other was later diagnosed as having “malarial parasites.”*
World War II increased the din and its tortures to the psyche. Its combat arms were more varied and more powerful than ever: The tank, a marginal presence in the first war, now saturated the battlefield. Its 90mm guns fired at 187 decibels. The new howitzers were even louder, at 189 decibels. Recoilless rifles reached 188 decibels, machine guns 155 decibels, and even a submachine gun could generate 160 decibels.12
During engagements, all or most of these battlefield Frankensteins could be in full roar at the same time, for hours, along miles of front, on both sides of the lines. Their racket approached physical dimensions. Some soldiers believed that they could actually “see” the noise as it curled over them like a giant wave. The mere concussions of exploding shells gouged deep craters. Given all this, it seems miraculous that any combatant could survive ten minutes inside this hell with his sanity undemolished, much less an entire campaign or the war in full. (The madness, of course, was hardly generated by noise alone. Fatigue, anxiety, fear of death, grief over the loss of a comrade or the horror of shooting an enemy—these and other factors did their share in separating fighting men from their senses.) Whatever the causes, the incidence of mental flameouts proved to be double the rate seen in World War I.
The war’s effects on the human mind produced even more insidious consequences. Like Patton, many officers assumed that twitching, convulsing, or fetally positioned men without visible wounds were faking trauma to get out of combat. The captains, majors, and generals ordered these wrecks back into the line, thus heaping humiliation on top of their jangled psyches.
The vast majority of these “fighting men” in that war, of course, as in all wars, were, and are, boys in the peak years of their susceptibility to schizophrenia.
The Nazi atrocities of human experimentation, revealed to the world in the Nuremberg “Doctors’ Trials” in December 1946, abruptly revoked the popular prestige that eugenics had enjoyed since around the turn of the century. The mentally ill have mostly been spared this particular form of mass torture since the first liberating British tanks rolled into Bergen-Belsen.*
Yet the demise of eugenics did not spell the end of suffering under “the lights of perverted science” for America’s mentally ill. Even as World War II—at long last—laid bare the simplistic assumptions of eugenics theory and the moral depravity inherent in its practice, the war pushed an even more outrageous pseudoscience into the mainstream of psychiatric “cure.” That perversion of the healing arts was called the lobotomy.
The modern lobotomy—the back-alley abortion of brain surgery—had been conceived as an antidote to schizophrenia in 1935. Its inventor was a Portuguese neurosurgeon, as he styled himself, named António Egas Moniz.* Moniz called what he did “leukotomies” because he was after white matter—as in brain tissue—and leukos means “white” or “clear” in the ever-dignifying Greek.
Diagnosis was imprecise in those years and would remain so for a long time. No one in the 1930s, as we have seen, had as yet established a baseline for differentiating insanity from severe psychological problems. Thus there was no way to verify that Moniz’s patients—twenty hospitalized and helpless men and women—were in fact insane. As for a cure, no one really had a clue. Lobotomy made as much sense as electroshock, insulin coma therapy, even “refrigeration” therapy. These and other untested methods were being rushed into operating rooms as fast as doctors and tinkerers could dream them up.
Moniz came to believe that his patients’ common problem was an oversupply of emotion. He did not have a lot of training in neurosurgery; in fact, his new technique helped create the field. He knew a little about the brain’s geography, just enough to theorize where the emotional “off” switch was located. He hit upon an idea that had some crude nineteenth-century provenance: drilling holes into a patient’s skull, then poking inside with a long thin rod to probe the edges of the frontal lobe. The rod had a small wire attached to the business end. When the doctor gave the rod a twirl, the wire would sever the long nerve fibers that link the frontal lobe with the emotion-producing parts of the brain, the limbic system.
Moniz believed this could neutralize psychosis. And he was right; it could, and did, and often neutralized the patient’s memory, personality, and, sometimes, the entire patient as well. Accidents happen.
Moniz won a Nobel Prize.
It took less than a year for Moniz’s brain-scraping technique to make its inevitable way to the United States, a continental seller’s market for cures. Its importer and promoter was a goateed and dapper Washington doctor named Walter Freeman. Freeman was a brain surgeon in the manner that Professor Harold Hill was a marching band consultant. In fact, he wasn’t a neurosurgeon at all; he was a neuropathologist, and thus no more qualified to stick things into people’s heads than Moniz. So he hired a qualified sidekick named James Watts to handle the drilling and twirling.
Freeman seems to have decided that the European product was underperforming somehow; it could use some American pep and zip. He rebranded it “lobotomy,” perhaps to carve out some marketplace distinction. Lobos means “lobe” in Greek, and is every bit as classy as “leukos.” After several years of directly replicating Moniz’s approach via Watts, Freeman hit upon a way to make the operation more user-friendly, plus eliminate the middleman. Why not just slide the rod in under the eye socket? No sheepskins necessary for that! Freeman saw that he needed a thinner rod than Moniz had used. He settled on an ice pick—one that he’d found in his kitchen drawer.
The pick needed a couple of knocks from a hammer to get it started, but once inside, it was as easy as one, two… what comes after two?
Freeman named this refinement “transorbital” lobotomy. Watts, now superfluous and finally repelled by it all, fled. No more middleman.
And no problem! Walter Freeman could handle everything on his own. He was a natural publicity animal. (It was he who had nominated Moniz for the Nobel Prize in the first place.) He honed a personal style that set him apart from the pack: he never washed his hands before an operation nor wore a mask during it. He disdained anesthesia for his patients. He performed up to twenty-five lobotomies a day. Sometimes he performed two simultaneously, one with each hand. Often, he would invite audiences into the operating room, including the press: an archival photograph in the Wall Street Journal shows him gripping an ice pick dagger-style, his head cocked in rakish preparation, as observers crowd in. Sometimes he had a bad day at the office. A couple of times the tip of the pick broke off and lodged in the patient’s skull. (Oops.) Even more embarrassing, Freeman once looked up from his patient into a photographer’s lens, lost his concentration, and let the pick slide too deeply into the brain. The patient died.13 The photograph turned out well.
This unfortunate victim thus joined the estimated one-third of Freeman’s patients whose cases the doctor himself admitted were “failures.” Not all died; some simply lost all affect, or were bedeviled by seizures, incontinence, or emotional outbursts.14
Ethically conscious doctors and surgeons were appalled by Freeman’s method, not to mention his style. They pointed out that no medical literature existed to verify its legitimacy or warn of its side effects. Certainly Freeman provided none.
A few thoughtful souls did step forward to excoriate him. In 1948, Nolan Lewis, director of the New York State Psychiatric Institute, demanded of his colleagues: “Is quieting a patient a cure? Perhaps all it accomplishes is to make things more convenient for those who have to nurse them. The patients become rather child-like; they are as dull as blazes. It disturbs me to see the number of zombies that these operations turn out. It should be stopped.”15 The great mathematician and social theorist Norbert Wiener took a similar line of attack that same year: “Prefrontal lobotomy… has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier.”16
Such condemnations were met with the same judiciousness, compassion, and restraint that had greeted eugenics and “scientific racism”: in 1949, civilian and military doctors across the United States were twirling away to the tune of an estimated five thousand lobotomies a year.17
What under the stars kept this P. T. Barnum of the brain propped up as a legitimate doctor for so long? (His career lasted thirty-two years before his recklessness finally caught up with him.)
The law could not touch him. No laws existed to prohibit lobotomy. No such laws exist today. But the larger reason for Freeman’s impunity derived from need. Specifically, it derived from World War II: the war, and the unprecedented numbers of deranged veterans—both men and women—that this global charnel house was disgorging back to the United States. They had been streaming home, or directly into military hospitals, since Pearl Harbor in late 1941. By war’s end, around 680,000 of them had been wounded in combat. Those were the physically wounded. What truly shocked the populace, as well as psychiatrists, was that almost three times as many veterans, some 1.8 million, had come home needing treatment for wounds to their minds.
For a while in the postwar years, the Veterans Administration hospital psychiatric chiefs tried to keep Freeman at bay. But the overwhelming stream of needy patients soon made it impossible for them to be, as it were, picky. They held their noses and allowed him and Dr. Watts over the threshold. Each man was soon raking in $50 a day—$678 and change in 2016 currency—in consulting fees; that is, fees for teaching other doctors how to tap, shove, and twirl.
When the supply of raw material in the VA hospitals around the country at last began to taper off, Walter Freeman realized that he needed to create a new market. So he purchased a van, christened it “The Lobotomobile,” and went haring around the country, stopping at mental hospitals to do his specialty and, again, to demonstrate it for the resident doctors. It really wasn’t all that hard. A no-brainer, so to speak.
Not until 1967 did the medical community decide that it had had about enough of Walter Freeman. Doctors informally agreed to relieve him of his operating-room privileges. This decision was reached after the woman who proved to be his last victim died from a brain hemorrhage—on Freeman’s third intrusion into her skull. By the time of his own death in 1972—of cancer—Freeman had directed or performed thirty-five hundred operations.
Lobotomy did not expire with Freeman, but it became extremely rare. The antipsychotic drug revolution, which had started in the 1950s, gradually replaced it as a more humane form of mass treatment. The most eloquent eulogy was written by Stephen T. Paul, professor of psychology and social sciences at Robert Morris University in Pittsburgh: “Lobotomy was finally seen for what it was: Not a cure, but a way of managing patients. It did not create new people; it subtracted from the old ones. It was an act of defeat, of frustration.”18
Walter Freeman and his ghoulish fad aside, the early postwar years marked one of the few eras in which the United States seriously engaged the problem of madness amid its populace. It didn’t last long, and it was abruptly supplanted by a kind of Dark Age from which the momentum of public policy has yet to recover. But for a time at least, serious professionals seemed to be on the verge of wresting the fate of mentally ill people from the control of quacks, deluded ideologues, and callous public servants.
The most legendary among them hailed from Topeka, Kansas: the above-mentioned Menninger brothers, Karl and William. These sons of an old-fashioned Presbyterian town doctor and a pious, domineering mother were big men with high domes and prominent beaks and straight-arrow values—well, mostly straight-arrow values. William, born in 1899, became a lifetime Sea Scout. Karl, older than Will by six years, liked to equate mental health with moral health, and occasionally salted his books with pious exhortations. In Whatever Became of Sin? he enjoined men of the cloth to “teach! Tell it like it is. Say it from the pulpit. Cry it from the housetops… Cry comfort, cry repentance, cry hope. Because recognition of our part in the world transgression is the only remaining hope.”19
Evangelistic in their boosting of psychiatry, driven, paternalistic, and brilliant, the two accomplished something that probably no one else among their countrymen could have managed. They rescued psychiatry from the liabilities that were threatening to extinguish its early-century cachet (its taints of Europeanism and elitism on the one hand; clowns such as Freeman with his ice picks on the other). They replaced this imagery with their own stamp—then unique among US psychiatrists—of home-cooked American optimism regarding mental cure, flavored with their entrepreneurial genius. Paradoxically, they accomplished this with a staff liberally stocked with German-Jewish psychiatrists who had fled the encroaching Third Reich. In truth, their conception of psychiatry was destined for obsolescence.
It all started in 1919, when Karl Menninger returned to Topeka from the Harvard Medical School, where he’d graduated cum laude. His mission was to help his father, Dr. Charles Frederick Menninger, establish the Menninger Diagnostic Clinic. Karl was twenty-six then, and William was twenty. The clinic welcomed patients with emotional and “psychological” problems, though years would pass before the brothers could afford to include psychoanalysts on their staff.
For a while, it seemed that there might be no staff—and no clinic, either. The Wicked Witch of the West herself could not have been less welcome in this respectable Kansas town of fifty thousand people and eighty-odd churches than doctors who opened their doors to “maniacs.” Even though the family was known, several upstanding citizens tried to sue their clinic out of town. It didn’t work, but the Menningers’ persuasive powers did, though father and son had to smuggle their patients in under fake diagnoses until everybody calmed down. It helped that Charles Frederick Menninger was a reputable physician, a homeopathy man, which suited the region’s self-reliant traditions. People began to notice that his son spoke in new and fresh and reassuring ways, unlike that gloomy sex-minded Freud over there in Europe. Karl promised “a psychotherapy for the people” and a movement toward “progressive analysis” (which meant roughly the same thing).
The clinic gained popularity, local investors got interested, the father and son attracted psychiatrists who at first had been skeptical, and within five years the clinic had become the Menninger Sanitarium. Starting out in a converted farmhouse with thirteen beds on Southwest Sixth Street, it grew into a nationally known enterprise that spread to over 430 acres on two campuses. Its staff grew to nine hundred.
They cared for patients housed in thirty-nine buildings, including an administration building with a clock tower. Patients were encouraged to linger for months, even years, if they could afford it. These lengthy stays had a self-selecting effect on the clientele: movie stars, politicians, even political officeholders came for treatment. (The brothers were not in fact elitists; their aim was to get psychiatry ingrained into the nation’s cultural fabric. On the other hand, a movie star was a movie star.) In time, the sanitarium became a de facto salon as well; it attracted psychiatric intellectuals and social activists from around the world for formal and informal talks and debates.