
The Great Pretender


by Susannah Cahalan


  (Though they have different causes, the symptoms of syphilis share many similarities with those of autoimmune encephalitis, the disease that struck me, which I guess could give autoimmune encephalitis the dubious honor of being the syphilis of my generation.)

  The more we learned about the science of the mind, the hazier the boundary between neurology and psychiatry became. During the twentieth century, neurology broke off into a distinct branch of medicine, and in doing so “claimed exclusive dominion over the organic diseases of the nervous system”—like stroke, multiple sclerosis, and Parkinson’s. Meanwhile, psychiatrists took on the ones “that could not be satisfactorily specified by laboratory science”—like schizophrenia, depression, and anxiety disorders. Once a biological breakthrough was achieved, the illness moved out of psychiatry and into the rest of medicine. Neurologists work to uncover how damage to the brain impairs physical function; psychiatrists are there to understand how this organ gives rise to emotion, motivation, and the self. Though the two fields overlap considerably, the separation embodies our mind/body dualism—and this continues today.

  Clearly, syphilis and Alzheimer’s disease weren’t the only causes of insanity. In order to track down and cure the others—if they could be found—psychiatrists still needed to develop a diagnostic language that could pinpoint the different types of mental illness (and, hopefully, lead to the cleaving out of their different causes).

  German psychiatrist Emil Kraepelin had been tackling this issue since the late nineteenth century, and though you’ve likely never heard of him, his work has had more influence on the way psychiatry is practiced today than did the famous Sigmund Freud, born the same year: 1856. The son of a vagabond actor / opera singer / storyteller, Kraepelin dedicated his life to organizing mental illnesses into orderly parts, perhaps as a reaction to such an unorthodox father. In doing so he endowed the nascent field with a new nosology, or system of diagnosis, that would later inspire the Diagnostic and Statistical Manual of Mental Disorders, the bible of psychiatry today. Kraepelin studied thousands of cases and subdivided them, breaking down what was described as “madness” into clear categories with varied symptoms as best he could. This work culminated in his description of dementia praecox. Kraepelin defined dementia praecox in his 1893 textbook Psychiatrie as an early-onset, permanent dementia, a biological illness that caused psychosis and followed a deteriorating course with little hope of improvement, causing “incurable and permanent disability.” Kraepelin separated dementia praecox patients from those with “manic-depressive psychosis,” a disorder of mood and emotion that ranged from depression to mania and had a better long-term prognosis. This division continues today with schizophrenia (and its component parts) and bipolar disorder (and its component parts). (In 1908, fifteen years after Kraepelin presented the diagnosis dementia praecox to the public, Swiss psychiatrist Paul Eugen Bleuler tested out the new term schizophrenia, which translates to “splitting of the mind,” contributing to a long-running confusion2 over the term. Later, psychiatrist Kurt Schneider further defined schizophrenia with a list of “first-rank symptoms” that include auditory hallucinations, delusions, and thought broadcasting.)

  Now, finally, psychiatrists could make predictions about course and outcome. Most important, they could provide a name for their patients’ suffering, something I personally would argue is one of the most important things a doctor can do, even if a cure isn’t in sight. Still, the cause remained elusive—as it does to this day.

  Doctors began to slice and dice their way through “insane” brains. They removed living people’s thyroids, women’s ovaries, and men’s seminal vesicles based on half-baked theories about the genetic origins of madness. An American psychiatrist named Henry Cotton, superintendent of Trenton State Hospital in New Jersey, offered a “focal infection theory” of mental illness, which posited that the toxic by-product of bacterial infections had migrated to the brain, causing insanity. It wasn’t a terrible idea in theory (there are infectious causes of psychosis), but Cotton’s solutions were a nightmare. In an attempt to eliminate the infection, he began by pulling teeth. When that didn’t work, he refused to reconsider and instead removed tonsils, colons, and spleens, which often resulted in permanent disablement or death—and got away with it because his patient population had neither the resources nor the social currency to stop him.

  Clinicians and researchers also embraced the growing eugenics movement that argued that insanity was a heritable condition passed down through inferior genes. In America, thirty-two states passed forced sterilization laws between 1907 and 1937—why not stop the spread of undesirables, they thought, by cutting off their ability to reproduce? The Nazis adopted America’s science-approved sadism, sterilizing three hundred thousand or so German psychiatric patients (the most common diagnosis was “feeblemindedness,” followed by schizophrenia and epilepsy) between 1934 and 1939 before they took it one step further and began exterminating “worthless lives”—executing over two hundred thousand mentally ill people in Germany by the end of World War II.

  In the aftermath of the war, as the full horror of Nazi atrocities hit the American public, a reassessment of psychiatry and its obsession with finding biological causes for mental illness seemed overdue—especially in 1955, when over half a million people lived in psychiatric hospitals, the highest number ever.

  In a strange confluence of events, the same year that Kraepelin popularized dementia praecox, Freud emerged with a new theory of treating the mind called psychoanalysis. While asylum psychiatrists interrogated the body, another group of doctors, psychoanalysts, had moved so far away from the search for an answer in the physical that it was as if they were practicing a different discipline altogether. Psychiatry outside the asylum had little in common with that practiced inside. Outside the asylum, the idea reigned that the mind was the seat of all mental suffering, not the gray matter of the brain. For someone like me, so accustomed to talk of neurotransmitters, dopaminergic pathways, and NMDA receptors, the popular terms of that era, like penis envy, phallic stage, and Oedipal conflict, feel awkward and clumsy, holdovers from a quainter world. But it wasn’t that long ago when these were the norms. Every Baby Boomer alive today was born when terms like these dominated the field.

  Psychoanalysis invaded the US by way of Europe right before World War II, offering up a new theory that provided fresh insight into mental anguish—and, for once, real cures—as war-weary soldiers returned from battle healthy by all physical estimations, but emotionally unable to join the workforce or engage in family life. For the first time ever, there were more recorded casualties related to the mind than to the body. It was a sobering thought: If a healthy young man could be reduced to a shaking, fearful, hysterical one without any physical cause, then couldn’t this happen to any of us?

  Freud (who died before psychoanalysis really took off in America) gave us a path out of this dark forest of uncertainty. In his explanation, our minds were divided into three parts: the id (the unconscious—rife with repression and unfulfilled desires); the ego (the self); and the superego (the conscience), all engaged in battle. The analyst’s goal was to “make the unconscious conscious” and with a surgeon’s focus zero in on the underlying conflict—our libidos, repressed desires, death drives, projections, and wish fulfillment fantasies; all that deep, dark, murky stuff from our childhoods—on the way to insight. There was “nothing arbitrary or haphazard or accidental or meaningless in anything we do,” wrote Janet Malcolm in Psychoanalysis: The Impossible Profession.

  And who wouldn’t want this kind of careful attention and promise of a cure over the dour inevitability that the biological side (à la Emil Kraepelin) was offering? Consider the two differing interpretations of one patient’s story, as analyzed by Kraepelin’s followers and by Freud. In 1893, fifty-one-year-old German judge Daniel Paul Schreber started to become obsessed with the idea that to save the world, he needed to become a woman and give birth to a new human race. He blamed these disturbing thoughts on his psychiatrist, whom he called a “soul murderer” who had implanted these delusions via “divine rays.” Doctors diagnosed Schreber with Kraepelin’s dementia praecox and committed him to a psychiatric hospital, where he eventually died. When Freud read Judge Schreber’s account, Memoirs of My Nervous Illness, he suggested that, instead, Schreber’s behaviors stemmed from repressed homosexual impulses, not from an incurable brain disease. Treat the underlying conflict and you’d treat the person. If you had your choice, which kind of treatment would you pick? Americans overwhelmingly chose Freud, and Kraepelin and his acolytes were relegated to the professional boondocks.

  By the 1970s, nearly every tenured professor in psychiatry was required to train as an analyst, and most textbooks were written by analysts, too. Overnight, it seemed, analysts got “a power, a secular power, that they never had before and they never had since,” psychiatrist Allen Frances told me. You no longer went to your priest or parents; you paid an analyst to shrink you. Now “mind doctors” wanted to mine your “family relations, cultural traditions, work patterns, gender relations, child care, and sexual desire.” Psychiatrists were thrilled to leave the back wards of mental hospitals, where difficult patients had few options for cures, and instead to retrain as analysts, catering lucrative talk therapy treatments (five days a week!) to the so-called worried well who suffered from cases of nerves brought on by modern life. The people who needed help the most were left behind as analysts comfortably cherry-picked their patients—mostly wealthy, white, and not very sick.

  Americans jumped on the couch, embracing the “blank screens” of their therapists and the idea that the mind could be improved. Decades after his death, Freud’s method was suddenly everywhere: in women’s magazines, in advertising (Freud’s nephew Edward Bernays is called the father of public relations); even the CIA started snatching up analysts. Dr. Benjamin Spock’s The Common Sense Book of Baby and Child Care, which was based on Freudian theories, became America’s second-biggest bestseller after the Bible. Another huge book of the moment was Norman O. Brown’s Life Against Death: The Psychoanalytic Meaning of History, which attempted to reframe the past as a Freudian battle between freedom and repression. Hollywood kept psychiatrists on retainer on movie sets. Insurance companies paid for months of talk therapy and reimbursed at levels equal to other serious medical procedures.

  No matter how many psychiatrists enlisted, however, there still weren’t enough. By 1970, despite the influx of doctors, the demand exceeded the supply. Unlike the custodians of the sick in the past, psychoanalysts now promised to listen to their patients. In the best cases, patients found clarity and meaning from this relationship. Instead of pathologizing people outright, analysts saw each patient as unique in her psychic suffering. They gave us a deeper understanding of how fraught and layered our interior lives are: the complexities of sexuality; the key role that our childhoods play in our adult lives; how the unconscious speaks to us through our behaviors. Through the “interchange of words between patient and physician,” as Freud put it, we could explore, comprehend, and even heal the sick parts inside us. “Words were originally magic, and the word retains much of its old magical powers even today,” Freud wrote in 1920. “Therefore let us not underestimate the use of words in psychotherapy.”3

  One of the many downsides was that doctors played vivid blame games with their patients (and their patients’ families), especially mothers. (See the refrigerator mother [lack of maternal warmth] and the schizophrenogenic mother [an overbearing, nagging, domineering female, usually paired with a weak father], both of whom were believed to create symptoms of schizophrenia and autism in their children.) Viennese psychoanalyst Bruno Bettelheim,4 a “psychoanalyst of vast impact,” argued in his 1967 book The Empty Fortress that the family structure of those with mental illness, especially autism, resembled a concentration camp—a particularly damning comparison because Bettelheim himself had survived two years in Dachau and Buchenwald. The only way one could recover, in this view, was to completely sever relationships with family.

  But what you didn’t get with Freud was a focus on diagnosis. In fact, his followers practiced “extreme diagnostic nihilism.” Nomenclature, shared diagnostic language—these didn’t really matter to the analysts. Instead, psychiatrists expanded the scope of social deviance, pathologizing almost everyone in the process and effectively closing the chasm between sanity and insanity by showing that “true mental health was an illusion,” as anthropologist Tanya Marie Luhrmann wrote in Of Two Minds, her study of the profession. According to a now infamous 1962 Midtown Manhattan study based on two-hour interviews with sixteen hundred people in the heart of the city, only 5 percent of the population were deemed mentally “well.” The whole world was suddenly crazy, and psychiatrists were its caped crusaders.

  America was again starting to look a lot like it had in the time of Nellie Bly—where anyone could be and often was (mis)diagnosed.

  And then, in February 1969, “David Lurie” walked into the intake room at an unspecified hospital in Pennsylvania and set off a metaphorical bomb. He finally proved what so many people had long suspected: Psychiatry had too much power and didn’t know what the hell to do with it.

  4

  ON BEING SANE IN INSANE PLACES

  I often imagine Bly’s trip back to Manhattan aboard the transport ferry from Blackwell’s Island—the air whipping her hair, the foul smells of the river, the buzzy relief—as her thoughts turned to the women she had abandoned.

  “For ten days I had been one of them. Their sorrows were mine, mine were theirs, and it seemed intensely selfish to accept freedom while they were in bondage,” Bly wrote. “I left them in their living grave, their hell on earth—and once again I was a free girl.”

  That was exactly how I felt every time I thought about my mirror image, and all those who had not been saved as I had—the others whom psychiatry had left behind.

  A month or two after my presentation at the psychiatric hospital, I had dinner with Dr. Deborah Levy, a McLean Hospital psychologist who studies (among other things) genes that appear to put people at risk for developing serious mental illness, and her colleague Dr. Joseph Coyle, a McLean Hospital psychiatrist who is one of the foremost experts on the NMDA receptor, a part of the brain that is tampered with in the illness that struck me. (Tracking two neuroscience researchers in conversation is much like following an intense hockey game. Take your eye off the puck for one second, and you’re lost.) We spoke about the hysterias of the past and the conversion disorders of the present, and about the difference between malingering and Munchausen syndrome. The former describes faking an illness for some kind of gain (to win a lawsuit, for example), while the latter is a mental disorder in which one pretends to be sick without any obvious incentive. (The famous case of Gypsy Rose Blanchard is an extreme example of Munchausen by proxy, in which one makes someone else sick, often a child.) We talked a bit about the great pretender illnesses that blur the boundary between psychiatry and neurology, about how hard it is for physicians to parse those out, and about how my disease appeared to be a bridge between the two worlds, a “physical” disorder that masked itself as a “psychiatric” one.

  I chimed in with the story I had recently learned of my mirror image. There shouldn’t have been any difference between us; she should have received the same treatment, she should have had the same quick and urgent interventions, and she should have had the opportunity to recover as I had. But she had been derailed because of one crucial difference: Her mental diagnosis had stuck. Mine hadn’t. Sympathetic, Dr. Levy asked me if I had ever heard of the study by Stanford professor David Rosenhan.

  “Do you know it? The one where the people purposefully faked hearing voices and were admitted to psychiatric hospitals and diagnosed with schizophrenia?” she asked.

  Nearly fifty years after its publication, Rosenhan’s study remains one of the most reprinted and cited papers in psychiatric history (despite being the work of a psychologist rather than a psychiatrist). In January 1973, the distinguished journal Science published a nine-page article called “On Being Sane in Insane Places,” whose driving thesis was, essentially, that psychiatry had no reliable way to tell the sane from the insane. “The facts of the matter are that we have known for a long time that diagnoses are often not useful or reliable, but we have nevertheless continued to use them. We now know that we cannot distinguish insanity from sanity.” Rosenhan’s dramatic conclusions, backed up for the first time by detailed, empirical data and published by Science, the sine qua non of scientific journals, were “like a sword plunged into the heart of psychiatry,” as an article in the Journal of Nervous and Mental Disease observed three decades later.

  Rosenhan, a professor of both psychology and law, had posed this opening salvo: “If sanity and insanity exist, how shall we know them?” Psychiatry, it turned out, didn’t have an answer—as it hadn’t for centuries. This study “essentially eviscerated any vestige of legitimacy to psychiatric diagnosis,” said Jeffrey A. Lieberman, chairman of Columbia’s Department of Psychiatry. In the wake of the study’s publication, “Psychiatrists looked like unreliable and antiquated quacks unfit to join in the research revolution,” added psychiatrist Allen Frances.

 
