Transorbital lobotomies were still being performed when I entered medical school. My sole encounter with a lobotomized patient was a rather cheerless affair. He was a thin, elderly man in St. Elizabeths Hospital in Washington, DC, who sat staring out at nothing in particular, as still as a granite statue. If you asked him a question, he responded in a quiet, robotic tone. If you made a request, he complied as dutifully as a zombie. Most disconcerting were his eyes, which appeared lifeless and blank. I was informed that at one time he had been unremittingly aggressive and unruly. Now, he was the “perfect” patient: obedient and low-maintenance in every way.
Walter Freeman performing a lobotomy. (© Bettmann/CORBIS)
Astonishing though it may seem, Moniz received the Nobel Prize in 1949 “for his discovery of the therapeutic value of leucotomy in certain psychoses,” marking the second Nobel Prize given for the treatment of mental illness. That the Nobel committee saw fit to honor malaria cures and lobotomies underscores just how desperate physicians were for any effective psychiatric treatment.
Fortunately, contemporary psychiatry has long since discarded the dangerous and desperate methods of fever therapy, coma therapy, and transorbital lobotomies after the revolution in treatments beginning in the 1950s and ’60s. But one form of therapy from the “snake pit” era has survived as the most common and effective somatic treatment in psychiatry today.
Electrified Brains
As the use of fever therapy and coma therapy spread through mental hospitals around the world, alienists observed another unexpected phenomenon: The symptoms of psychotic patients who also suffered from epilepsy seemed to improve after a seizure. Since fever improved the symptoms of patients with GPI, and insulin dampened the symptoms of psychosis, might seizures also be harnessed as a treatment?
In 1934, the Hungarian psychiatrist Ladislas J. Meduna began experimenting with different methods for inducing seizures in his patients. He tried camphor, a scented wax used as a food additive and in embalming fluid, and then metrazol, a stimulant that causes seizures in high doses. Amazingly, Meduna discovered that psychotic symptoms really did diminish after a metrazol-induced seizure.
Meduna’s novel seizure treatment quickly became known as convulsive therapy, and in 1937 the first international meeting on convulsive therapy was held in Switzerland. Within three years, metrazol convulsive therapy had joined insulin coma therapy as a standard treatment for severe mental illness in institutions all around the world.
There were problems with metrazol, however. First, before the convulsions actually started, the drug induced a feeling of impending doom in the patient, a morbid apprehension that was only heightened by the awareness that he was about to experience an uncontrollable seizure. This fearful anxiety must have been even worse for a psychotic patient already suffering from frightening delusions. Metrazol also provoked thrashing convulsions so violent that they could become, quite literally, backbreaking. In 1939, an X-ray study at the New York State Psychiatric Institute found that 43 percent of patients who underwent metrazol convulsive therapy experienced fractures in their vertebrae.
Physicians began to look for a better way to induce seizures. In the mid-1930s, an Italian professor of neuropsychiatry, Ugo Cerletti, was experimentally inducing seizures in dogs by delivering electrical shocks directly to their heads. He wondered if electrical shocks might also induce seizures in humans, but his colleagues dissuaded him from attempting such experiments on people. Then one day, while buying meat from the local butcher, he learned that butchers often applied electrical shocks to pigs’ heads during slaughter, putting the animals into a kind of anesthetized coma before cutting their throats. Cerletti wondered: Would an electrical shock to a patient’s head also produce anesthesia before provoking convulsions?
Before you decry Cerletti’s project as wanton barbarism, it is worth reviewing the circumstances that led a trained doctor to consider running electricity through a person’s brain—a notion, absent context, that sounds as terrifyingly absurd as the suggestion that dropping a pile of bricks on your toes will cure athlete’s foot. First, there was still no effective treatment for severe mental illness besides insulin coma therapy and metrazol seizure therapy—dangerous, volatile, and highly invasive treatments. Second, for most patients, the only alternative to these extreme therapies was permanent institutionalization within a soul-crushing asylum. After watching shocked pigs become oblivious to the butcher’s knife, Cerletti decided that shooting 100 volts of electricity through a person’s skull was worth the obvious risks.
In 1938, Cerletti called upon his colleague Lucio Bini to build the first device explicitly designed to deliver therapeutic shocks to humans and, with Bini’s collaboration, tried the device on their first patients. It worked just as Cerletti dreamed: The shock anesthetized each patient so that when he woke up he had no memory of the seizure—and, as with metrazol, the patients showed marked improvement upon waking.
Beginning in the 1940s, Cerletti and Bini’s technique, dubbed electroconvulsive therapy, or ECT, was adopted by almost every major psychiatric institution around the world. ECT was a welcome replacement for metrazol therapy because it was less expensive, less frightening to patients (no more feelings of impending doom), less dangerous (no more broken backs), more convenient (just flick the machine on and off)—and more effective. Depressed patients in particular often showed dramatic improvements in mood after just a few sessions, and while there were still some side effects to ECT, they were nothing compared to the daunting risks of coma therapy, malaria therapy, or lobotomies. It was truly a miracle treatment.
One of the side effects of ECT was retrograde amnesia, though many doctors considered this a perk rather than a drawback, since forgetting the procedure spared patients any unpleasant memories of being shocked. Another side effect stemmed from the fact that early ECT was usually administered in “unmodified form”—a euphemistic way of saying that psychiatrists used no anesthesia or muscle relaxants—which resulted in full-scale convulsions that could produce bone fractures, though these were far less frequent and damaging than those caused by metrazol-induced seizures. The introduction of suxamethonium, a synthetic alternative to curare, combined with a short-acting anesthetic, led to the widespread adoption of a much safer and milder “modified form” of ECT.
One of the earliest practitioners of ECT in the United States was Lothar Kalinowsky, a German-born psychiatrist who arrived in this country in 1940. He settled in Manhattan, where he practiced psychiatry and neurology for more than forty years. I first met Kalinowsky as a resident in 1976, when he lectured and instructed the residents in ECT at St. Vincent’s Hospital. A slender man with silver-gray hair and a heavy German accent, he was always immaculately attired, usually in a well-tailored three-piece suit, and carried himself with a dignified manner and professorial posture. I received excellent training in electroconvulsive therapy from the man who pioneered its use in American psychiatry.
To a young medical resident, the experience of delivering ECT can be quite disturbing. Since medical students are exposed to the same cultural stereotype of shock therapy as everyone else—that it is gruesome and barbaric—when you administer ECT for the very first time your conscience is pricked with the unsettling feeling that you are doing something wrong. A moral tension mounts inside you, and you must keep reminding yourself that extensive research and data support the therapeutic effects of ECT. But once you’ve seen the astonishingly restorative effects of ECT on a severely troubled patient, it all gets much easier. This is no lobotomy, producing vacant zombies. Patients are smiling and thanking you for the treatment. The experience is much like a medical student’s first attempt at surgery: Cutting into a patient’s abdomen and fishing around for an abscess or tumor can be gruesome and unnerving, but you must harm the patient a little in order to help him a lot—or even save his life.
Psychiatric treatment is not known for producing rapid results. Medical school lore holds that if you want to go into psychiatry you must be able to tolerate delayed gratification. Surgeons see the results of their treatment almost immediately after they stitch up an incision; for psychiatrists, waiting for drugs or psychotherapy to kick in is like watching ice melt. Not so with ECT. I’ve seen patients nearly comatose with depression joyfully bound off their cot within minutes of completing their ECT.
Whenever I think of ECT, one particular case comes to mind. Early in my career I treated the wife of a well-known New York restaurateur. Jean Claude was charismatic, cultured, and dedicated to his enormously successful French eatery. Still, not even his beloved restaurant came before his wife, Genevieve. She was a beautiful middle-aged woman who had once been a talented actress and still played the part of an ingénue. She also suffered from recurrent episodes of psychotic depression, a severe disorder that manifests with depressed mood, extreme agitation, and delusion-driven behavior. In the throes of an acute episode, she would become frantic, losing all control. Her usual impeccably mannered and charming demeanor would devolve into moaning and rocking. When her distress rose to a crescendo, she would shudder and thrash her body in every direction, often shredding her clothes, and as if in counterpoint to her wild gyrations, Genevieve would break into loud baleful songs in her native French, sounding like a wounded Edith Piaf.
I first met Jean Claude when Genevieve was in the midst of one of her full-blown episodes. Other physicians had tried antidepressant and antipsychotic medications individually and in combination, with little effect. Rather than repeating the same medications, I suggested ECT. After the first session, Genevieve was calmer and screamed less, though she remained frightened and preoccupied. After several more treatments over the course of three weeks, she returned to her usual courteous self and thanked me, telling me this was the first time a psychiatrist had ever made her feel better. Jean Claude could not thank me enough and insisted that I dine at his restaurant whenever I wanted. I confess I took advantage of his offer and over the next couple of years I brought women I was dating to his chic gastronomic establishment whenever I wanted to make a good impression. One of these women became my wife.
Today, improved technologies enable ECT to be individually calibrated for each patient so that the absolute minimum amount of electrical current is used to induce a seizure. Moreover, the strategic placement of the electrodes at specific locations on the head can minimize side effects. Modern anesthetic agents combined with muscle relaxants and abundant oxygenation also render ECT an extremely safe procedure. ECT has been assiduously investigated over the past two decades, and the APA, NIH, and FDA all approve its use as a safe and effective treatment for patients with severe cases of depression, mania, or schizophrenia, and for patients who cannot take or do not respond to medication.
It strikes me as supremely ironic that the Nobel committee saw fit to award prizes for infecting patients with malaria parasites and for surgically destroying frontal lobes, two short-lived treatments that were neither safe nor effective, while passing over Cerletti and Bini despite the fact that their invention was the only early somatic treatment to become a therapeutic mainstay of psychiatry.
Despite the notable success of ECT, psychiatrists in the mid-twentieth century still yearned for a treatment that was cheap, noninvasive, and highly effective. But in 1950, such a therapy appeared to be nothing more than a pipe dream.
Chapter 6
Mother’s Little Helper: Medicine at Last
Mother needs something today to calm her down
And though she’s not really ill
There’s a little yellow pill
She goes running for the shelter of a mother’s little helper
—MICK JAGGER AND KEITH RICHARDS
It’s better to be lucky than smart.
—HENRY SPENCER
Chloral Simmering in My Spine
These days, it’s difficult to imagine the practice of psychiatry without medication. You can hardly watch TV without seeing an ad for some mood-enhancing pill, usually featuring merry families frolicking on sandy beaches or joyous couples hiking through sun-dappled forests. Young people are far more likely to associate my profession with Prozac, Adderall, and Xanax than with reclining on a couch week after week, divulging one’s dreams and sexual fantasies. Schools, colleges, and nursing homes in every state openly endorse the liberal use of psychoactive drugs to mollify their more disruptive charges. What is less well known is that psychiatry’s dramatic transformation from a profession of shrinks to a profession of pill-pushers came about through sheer serendipity.
When I was born, not a single therapeutically effective medication existed for any mental disorder. There were no antidepressants, no antipsychotics, no anti-anxiety drugs—at least, no sort of psychiatric drug that quelled your symptoms and enabled you to function effectively. The few existing treatments for the major categories of mental illness (mood disorders, schizophrenia, and anxiety disorders) were all invasive, risky, and burdened with appalling side effects, and these desperate measures were mostly used to control disruptive inmates in mental institutions. Similarly, the first psychiatric drugs were not intended to be curative or even therapeutic—they were blunt instruments for pacification. Their daunting side effects were only deemed acceptable because the alternatives—fever cures, coma therapy, induced convulsions—were even worse.
In the late nineteenth century, asylums used injections of morphine and other opiate-derived drugs to subdue recalcitrant inmates. While the patients may have ranked this among the most agreeable psychiatric treatments of the Victorian Era, the practice was discontinued once it became clear that opioids turned patients into hardcore addicts. The first behavior-altering drug commonly prescribed outside of asylums (a psychotropic drug, in the argot of medicine) was chloral, a sleep-inducing non-opiate prescribed to relieve insomnia in anxious and depressed patients. Like morphine, chloral was not intended to treat a patient’s most salient symptoms—namely, the fearfulness in anxiety disorders or the feelings of sadness in depression—it was intended to knock the patient out cold. Chloral was preferable to morphine because it was reliable in strength from dose to dose and could be given orally, but patients disliked its awful taste and the distinctive odor it imparted to their breath, known as “alky-breath.”
Even though chloral was less addictive than morphine, it was still habit-forming. Women suffering from “nervous conditions” often self-administered the drug at home in order to avoid the embarrassment of institutionalization and frequently ended up as chloral addicts. The celebrated author Virginia Woolf, who suffered from manic-depressive illness and was repeatedly institutionalized, frequently swallowed chloral in the 1920s. From her boudoir, she wrote to her lover Vita Sackville-West about its effects: “Goodnight now, I am so sleepy with chloral simmering in my spine that I can’t write, nor yet stop writing—I feel like a moth, with heavy scarlet eyes and a soft cape of down—a moth about to settle in a sweet bush—would it were—ah, but that’s improper.”
Once its sleep-inducing properties became widely known, chloral quickly gained notoriety as perhaps the first drug employed to surreptitiously incapacitate a victim. Adding a few drops of chloral to someone’s drink gave rise to the expression “slip him a mickey.” (The term may have originally referred to a Chicago bartender, “Mickey” Finn, who added chloral to the drinks of customers he intended to rob.)
The simple act of putting a patient to sleep will inevitably reduce his symptoms. After all, when you lose consciousness, your anxieties, delusions, and manias subside, along with your nervous tics, ranting, and pacing. From this matter-of-fact observation, it was a short leap of imagination for psychiatrists to extrapolate the hypothesis that by prolonging their patients’ sleep, they might diminish their symptoms during waking hours as well. Around the turn of the twentieth century, Scottish psychiatrist Neil Macleod experimented on a variety of mental illnesses with a powerful sedative known as sodium bromide. He claimed that by rendering patients unconscious for an extended period of time, he could produce a complete remission of their mental disorders, a remission that sometimes lasted for days or even weeks. He called his treatment “deep sleep therapy”—an appealing moniker, for who doesn’t feel rejuvenated after a restful slumber?
Unfortunately, there’s quite a bit of difference between natural deep sleep and the sleep produced by a chemical strong enough to knock out an elephant. Deep sleep therapy could produce a cauldron of frightening side effects, including coma, cardiovascular collapse, and respiratory arrest; one of Macleod’s own patients died during his experiments. It was also difficult to judge the right dose, and sometimes patients slept for a day or two longer than intended. Most problematic was the fact that bromide is a toxin that accumulates in the body, becoming more harmful with each dose.
At first, bromide compounds spread rapidly through public asylums because they were cheaper and easier to manufacture than chloral, while producing more potent effects. The “bromide sleep cure” was briefly taken up by other physicians, too, before being abandoned as too dangerous.
Even though morphine, chloral, and bromide were all crude and addictive sedatives with harmful side effects, the notion that drug-induced sleep was therapeutic became firmly established by the start of World War II. (Except, of course, among the psychoanalysts, who dismissed sleeping pills out of hand, insisting they did nothing to resolve the unconscious conflicts that were the true mainspring of all mental illness.) Even so, no psychiatrist, psychoanalyst or otherwise, believed that there would ever be a drug that targeted the symptoms of mental illness or empowered a patient to lead a normal life—at least, not until 1950, the year the first psychopharmaceutical drug was born, a drug providing true therapeutic benefits for a troubled mind.