Quackery


by Lydia Kang


  In the eighteenth century, one of the most infamous psychiatric hospitals in the world, St. Mary of Bethlehem in London, was notoriously nicknamed “Bedlam” for the horrific behaviors, conditions, and treatments found within. Writer Alexander Cruden was institutionalized multiple times for shocking behaviors such as attempting to date widows and getting upset over incest. How dare he? He had noted that “The common Prescriptions of a Bethlemitical Doctor are a Purge and a Vomit, and a Vomit and a Purge over again, and sometimes a Bleeding.” This was sadly before aerosolized room deodorizer was invented.

  Benjamin Rush, physician and founding father, recommended “heroic depletion therapy” (see Mercury, page 3) for many ailments, including this prescription for mania: “20 to 40 ounces of blood [two and a half pints] may be taken at once … early and copious bleeding are wonderful in calming people.” He’s sort of correct. After all, people of all temperaments are calm when they’re too fatigued and anemic to care.

  Even the Sushruta Samhita, the ancient Sanskrit text, remarked that bloodletting is followed by a sense of cheerfulness upon the patient. And who doesn’t want that? Pass the knife.

  Well, maybe not just yet. Here are a few people who probably wouldn’t describe their experience as “cheery.”

  The Exsanguinated Lives of the Rich and Famous

  Marie Antoinette was bled after she gave birth before a room full of courtly observers. (If you think this is impressive, remember that if she’d had social media, she could’ve given birth in front of millions.) The queen ended up fainting and was revived by bloodletting, or at least the pain of it.

  Some had far worse outcomes, especially when bloodletting was employed as a desperate last resort. In 1685, Charles II of England succumbed to “fits” while shaving. His fourteen physicians were under great pressure to keep him alive. Besides bleeding, the poor king endured enemas, purgatives, and cupping, and had to eat the gallstone of an East Indian goat. Plasters made from pigeon droppings were thoughtfully applied to his feet. They bled copious amounts from him again and again, once even slitting open his jugular veins. At the end, he was left nearly bloodless before he died, though perhaps his soul was simply running screaming from the bird-poop poultices. Thirty years later, Charles II’s niece, Queen Anne—then on the throne herself—was bled and purged after having fits and falling unconscious; she survived only two days after the doctors arrived.

  Lord Byron, suffering from a violent cold complete with fevers and body aches, had an ongoing battle with his physicians about bloodletting. He adamantly refused, stating that it hadn’t helped in previous illnesses. Finally, he gave in to their nagging and proclaimed, “Come as you are; I see a damned set of butchers. Take away as much blood as you will but have done with it.” After several pints were bled over the course of three bleedings, his physicians were surprised that Byron worsened. Desperate, the doctors blistered him and applied leeches around his ears. Lord Byron died soon after, and his physicians promptly blamed him for putting off the bleeding for too long.

  “Breathing a vein” in 1860.

  George Washington was yet another famous victim of bloodletting. Three years after retiring from his presidency, he came down with a fever after riding in snowy weather. He had trouble breathing, probably from a bout of severe epiglottitis. His physicians aggressively bled him, tried a drink of molasses, vinegar, and butter (which nearly choked him to death), blistered him, bled him again, tried laxatives and emetics, and bled him some more for good measure. A day later, he was bled yet again. All told, he may have been bled of five to nine pints of blood and died shortly after. Quite a price to pay for an illness that started out as a bad cold.

  An array of bloodletting instruments

  The Bleeding Slows to a Trickle

  Even in the face of his critics, Dr. Benjamin Rush maintained his staunch and loud defense of bloodletting, and his landscaping proved this. At the height of the Yellow Fever epidemic in Philadelphia, his front lawn had so much congealed, spilled blood that it stank and buzzed with flies. No show on HGTV could have fixed that disaster. Unfortunately for Rush’s patients, the doctor greatly overestimated the body’s blood volume—by 200 percent. He would often remove four to six pints of blood in a single day (the average human male has about twelve pints). And remember, he’d often bleed several days in a row. His treatment mortality rate was so high that a critic named William Cobbett decried, “The times are ominous indeed, when quack to quack cries purge and bleed.” Cobbett even went so far as to say that Rush’s so-called heroic therapy was “a perversion of nature’s healing powers.” Burn!

  Though bleeding was a well-loved weapon among physicians for more than two millennia, detractors like Cobbett were always around. Erasistratus thought blood loss would weaken patients (he was right). In the seventeenth century, an Italian scholar named Ramazzini claimed, “It seems as if the phlebotomist [bloodletter] grasped the Delphic Sword in his hand to exterminate the innocent.”

  By the eighteenth and nineteenth centuries, opposition from many physicians and scientists began to turn the tide of change. Louis Pasteur and Robert Koch showed that inflammation came from infection and wouldn’t be cured with bloodletting. In 1855, John Hughes Bennett, a physician from Edinburgh, used statistics to show that pneumonia mortality decreased as bloodletting declined. With the current understanding of human physiology and pathology, medical practices in the West began to move away from the antiquated ideas of humoral medicine.

  Today, bloodletting, or phlebotomy as it’s called (Greek for “cutting a blood vessel”), is still used throughout the world. California had to ban bloodletting by acupuncturists in 2010. And it’s still a modern practice with Unani, a Persian-Arabic branch of medicine that traces its roots back to the thirteenth century. Bleeding with suction cup therapy, wet-cupping, is also still done in traditional Arabic medicine, with some positive studies. (In the 2016 Summer Olympics, swimmer Michael Phelps was seen covered with bruises from “dry cupping,” using just suction without the bleeding.)

  With our modern understanding of the human body, it would make sense that bloodletting might improve symptoms of high blood pressure and occasionally heart failure; instead, we have noninvasive pills that don’t require slitting a vein wide open. But for some diseases, bloodletting remains a suitable remedy. Hemochromatosis, a disorder that causes dangerous over-accumulation of iron, is treated with regular bleeding that depletes the body of this element. Phlebotomy can also be used for polycythemia vera, which causes a pathologic increase in red blood cells. After all Galen wrote, it turns out that too much blood is truly a bloody problem to have.

  Pity that the bloodletters of the past didn’t realize that most of the time, blood is best left inside the body, rather than outside.

  13

  Lobotomy

  Of Ancient Holey Heads, the Stone of Madness, Neural Eggbeaters, Kitchen Ice Picks, and Walter Freeman’s Lobotomobile

  No one doubts that the Kennedys were America’s own royal family. Handsome and beautiful, well bred and well connected, they had the money, pedigree, smarts, and political ties to leave an indelible mark on our nation’s history and cultural consciousness. They also had secrets to hide.

  For decades, Rosemary Kennedy was the least known of all the siblings of John F. Kennedy. Photos of her appearance at King George and Queen Elizabeth’s court in 1938 show her smiling, her dark hair coiffed perfectly, white gloves and couture gown fitting her curvy frame to perfection. The British press was wild for her beauty. Eligible young men wooed her at events. At first glance, she easily outshone her own patrician mother and plainer sister, Kathleen.

  But what most didn’t know is that much of Rosemary’s inner world was kept a secret. Her birth had been delayed and her mother held her legs closed until the doctor could arrive two hours later—on the advisement of a nurse and despite the fact that the baby was crowning. Many blame this for Rosemary’s mental deficiencies, perhaps from lack of oxygen during those crucial hours. Her siblings were athletes and achievers, but Rosemary didn’t hit her developmental milestones on time, if at all. As an adult, she possessed the intelligence of a fourth grader and wrote letters with simplistic handwriting riddled with spelling errors. Her father, Joe Kennedy, ambassador to England, can be seen grasping her arm tightly in a few photographs, evidence of his attempts to keep Rosemary’s behavior in check.

  By her early twenties, all the cognitive gains from years of tutoring and constant vigilance were slowly slipping away. She’d escape her convent boarding school and wander the streets at night. Her unexpected outbursts of emotion—sometimes yelling, sometimes punching (and the punches hurt, for she was strong and healthy)—were becoming too difficult to contain. For an elite Boston family as socially active as the Kennedys were, having an uncontrollable child with “disgraceful” mental deficiencies could amount to social suicide. They just needed her to be calm, predictable, and more … Kennedy-like.

  It just so happened that a new neurosurgical technique was stirring up excitement and interest. A Saturday Evening Post article in 1941 claimed that it could help patients who were “problems to their families and nuisances to themselves.”

  Joe Kennedy called Dr. Walter Freeman for help, unbeknownst to his wife, who was overseas. In November 1941, Rosemary Kennedy was lobotomized, and she disappeared from the public eye.

  Kathleen, Rose, and Rosemary Kennedy in 1938.

  A Brief History of Holey Heads

  The oldest form of surgery, trepanning (also called trephining, both from the Greek word trypanon, meaning to drill or bore) was performed by scraping the skull away, cutting a square shape to remove the center, drilling a circlet of tiny holes like stamp perforations, or drilling out a circle. The tools used could be flint, obsidian, metal, or shell. Supposedly, this wasn’t brain surgery. No joke intended—really, it wasn’t. The brain, its blood vessels, and the skinlike covering, the meninges, weren’t touched. People seemed to understand that if you stirred up some brain pudding, bad things happened.

  Why was the procedure done? For a lot of good reasons: There is plenty of evidence that trepanations were performed after skull fractures, possibly to remove broken fragments and alleviate pressure by removing blood clots. In fact, plenty of skulls show evidence of healing bone, which means the patients survived.

  The bad reasons for trepanning? Random headaches. Epilepsy. Melancholy. Mental illness. And also—minor head injuries. Hippocrates recommended the procedure when all that was suffered was a bump on the head. Just in case. (Suddenly, the quip “I need that like I need a hole in the head” makes a little more sense.)

  During the Renaissance, the use of firearms increased the number of traumatic head injuries and treatment with trepanning. Unfortunately, by the eighteenth century, trepanning had become a dangerous prospect. Pre-antisepsis Europe was a rather dirty place. Some estimated that 50 percent of those trepanned died (unlike the ancient skulls found before—which boasted closer to a 20 percent mortality rate). It was such a barbaric situation that in 1839, surgeon Sir Astley Cooper argued, “If you were to trephine, you ought to be trephined in turn.”

  Though trepanning is still used for treatment of traumatic brain injury, an occasional few have veered from this obvious lifesaving strategy and instead bored themselves for a buzz. In 1965, a Dutchman named Bart Huges thought it could bring him to a higher state of consciousness. Using an electric drill, knife, and hypodermic needle, he went to work. Afterward, he stated, “I feel as I felt before the age of fourteen.” (As if we wanted to relive our most awkward hormonal teen years—forever.) This incident occurred after he failed out of medical school and before he went on to write Trepanation: The Cure for Psychosis. Others followed suit, but luckily most reasonable people prefer LSD to neurosurgery for existential psychedelic assistance. It’s much less messy.

  Drilling to the Root of Madness

  To better understand Rosemary’s fate, we need to crank back the timeline to the origins of brain surgery, the first kind of surgery ever, actually: the practice of trepanning (see box “A Brief History of Holey Heads,” opposite). Trepanning is the process of creating a hole in the skull. It’s the earliest recorded surgical procedure in history. Skulls from Mesolithic times (possibly as far back as 8000 to 10,000 bce) unequivocally show signs of the procedure, which we know was practiced in several ancient civilizations, including Mesoamerica, Greece, the Roman Empire, India, and China.

  For every sound use of trepanning, such as removing pieces of bone in skull fractures or relieving pressure, there were plenty of misfires. The good news was that people rightly theorized that the brain was the seat of thought and emotion; the bad news was that we had horrific methods of fixing a disordered thought process. A twelfth-century Greek surgeon recommended trepanning for melancholy and madness. A thirteenth-century Greek surgical text recommended that those with epilepsy be trepanned so that “the humors and air may go out and evaporate.” As easy as letting the air out of a balloon, right? Demons causing sickness were also thought to skedaddle with a little help from a cranial escape hatch.

  During the Renaissance, a theory emerged that a stone residing within the brain was the seat of madness, idiocy, and dementia. Remove it and you might prevent the befouling of the rest of the mind. In Hieronymus Bosch’s 1475 painting Cutting the Stone, also called The Extraction of the Stone of Madness, a poor soul sits tied to a fancy chair, gazing with a decided side-eye at the viewer. A doctor (who, for unknown reasons, is wearing a metal funnel) is cutting into his head. Multiple other works of art during this and the next century depict this hopeful surgery. It’s unclear if the paintings were theatrical in nature or depicted true surgical attempts at removing that dratted (and nonexistent) rock.

  Trepanning demo and tools, for the DIYer.

  Life came to imitate art, however, when Swiss doctor Gottlieb Burckhardt sliced into six brains in 1888. With no surgical experience, Burckhardt operated on patients with schizophrenia and psychotic hallucinations. Like the ancient doctors of yore, he used a trephine (basically, a round cookie-cutter-like bone saw on a stick) to drill holes near the temples, but here’s where he departed: He then cut through the brain’s dura and scooped out parts of the cerebral cortex with, in some cases, a sharp spoon. Yes, spoonfuls of brain were removed. Though some of the patients became “quieter” and no longer hallucinated, many were left with lingering neurological problems, died from ensuing complications, or even committed suicide. A psychiatrist at the time commented that “[Burckhardt] suggested that restless patients could be pacified by scratching away the cerebral cortex.”

  Burckhardt’s procedure was the first lobotomy, though that term wouldn’t be coined until decades later. Unlike trepanning, which aimed only at opening a hole in the skull without disturbing the brain or its meningeal covering, this new approach to surgery was a whole other kettle of, er, spoons. (And ice picks. And egg beaters. More on these other tools shortly.) It also marked the dawn of psychosurgery—damaging the brain on purpose to cure mental illness—a new invention that accompanied exciting discoveries about the links between our brains and behaviors (see box “Phineas Gage, the Hot Dude with a Hole in His Head,” page 147), and other developments in neuroanatomy.

  The medical community thought Burckhardt barbaric and received his work with cold horror. He never performed the procedures again. It would be almost fifty years before someone tried another lobotomy.

  What changed? The world had entered a mental health crisis.

  The Lobotomy: An American (Stolen) Invention

  In the late 1930s and early 1940s, physicians in the United States were desperate. The number of institutionalized mentally ill patients had grown to more than four hundred thousand. Psychiatric patients took up more than half of the hospital beds across the country. There were no good pharmacological treatments, and these patients took enormous emotional, physical, and financial tolls on their families and the asylums. Patients were treated in often horrific conditions. Their savior? A gout-ridden Portuguese neuroscientist with a syringe full of booze.

  In 1935, Egas Moniz attempted another psychosurgical cure for mental illness: the leucotomy (Greek for “cutting the white,” as in the white matter of the brain). The first patient chosen was an institutionalized woman suffering from years of debilitating depression. His own hands deformed by gout, Moniz employed a surgeon to drill a hole in the patient’s skull near the top of the head and inject pure ethanol to kill parts of the frontal lobe. (Yes, it’s the same alcohol found in your wine, but no, you won’t kill your brain cells after a glassful of rosé. So stop panicking.)

  In later procedures, they used an instrument called a leucotome, which was a nifty metal rod that, when pushed into your squishy brain, would shoot out a wire loop that spun around and got a nice churn going. It was less like an eggbeater scrambling a good flan, and more like a melon baller used on an overripe cantaloupe. The brain texture was later described by American surgeon James Watts as “what butter’s like when it’s been out of the refrigerator for a while.” There you go. Now we’ve ruined flan, cantaloupe, and butter, too.

  Moniz was later given the Nobel Prize for his work, despite the fact that many of his patients ended up right back at the asylums where they started. Although the medical community was yet again horrified, Moniz did not back away like Burckhardt. He spread the word.

  One of the doctors who listened to Moniz’s gospel was Walter Freeman, the American neurologist who would eventually lobotomize Rosemary Kennedy. Freeman partnered with neurosurgeon James Watts to continue Moniz’s work on American soil. In 1936, after their first patient survived and seemed cured (her anxiety diminished, and she seemed healthy but “shrewish and demanding with her husband”), they moved onward. But many patients had zero or only fleeting improvement. Many lost spontaneity. Hallucinations often continued.
