
The Cadaver King and the Country Dentist


by Radley Balko


  But the fair’s celebration of technological advancement also came with jingoistic claims of American superiority, ugly demonstrations of alleged racial supremacy, and crank theories passed off as science. Among the fair’s exhibits, for example, were living dioramas of native “savages” gathered from all over the globe—essentially a human zoo. One featured Ota Benga, a Congolese man who had been captured and sold into slavery, then “rescued” and brought to St. Louis, where viewers could pay admission to see his filed and sharpened “cannibal teeth.” Other exhibits touted pseudosciences like phrenology and physiognomy, which posited that trained experts could make broad generalizations about intelligence, criminal proclivity, and morality based on physical characteristics like skull shape and size or the relative ratios of various body parts.

  In short, it was a time when it could sometimes be difficult to distinguish legitimate scientific breakthroughs from puffery and dangerous pseudoscientific nonsense. Occasionally the same intellectuals proffered both. Sir Francis Galton, for example, was a Victorian-era statistician, mathematician, meteorologist, and polymath. He is often credited with developing modern fingerprint analysis, an interest he’d nurtured while studying how French police agencies were using biological data to identify and classify criminal suspects. Galton thought anthropometry (the study of measurements of the human body) had uses well beyond mere identification. He believed other physical characteristics could not only be used to distinguish one person from another, but could be predictive of traits like criminality, intelligence, and virtue. Such thinking led to theories that people of a certain nose size, skull shape, or skin tone were inherently more or less virtuous, productive, or intelligent than those who were differently proportioned or colored.

  These theories are of course nonsense, and applying them to social policy has proven catastrophic. But Galton became one of their earliest and most vocal proponents. In his autobiography, he advocated for the forced sterilization of entire groups of people. “Stern compulsion ought to be exerted to prevent the free propagation of the stock of those who are seriously afflicted by lunacy, feeble-mindedness, habitual criminality, and pauperism,” he wrote. One of the fathers of modern fingerprint analysis wasn’t just a champion of eugenics; he coined the term.

  It seems somehow appropriate, then, that the embryonic field of forensics found a home in the progressive movement. Like progressives, early forensics practitioners enthusiastically forged ahead with “science” that didn’t always adhere to the principles of scientific inquiry; they embraced expertise but didn’t always check the credentials of the experts.

  There were exceptions, to be sure. One of the earliest adopters of modern forensics was Berkeley, California, police chief August Vollmer. A pioneer of standardization and professionalization, Vollmer also pushed for specialization in policing. He thought law enforcement should be a career, not the patronage position it had become. Led by Vollmer, progressive police departments created units dedicated solely to investigating homicides or to enforcing narcotics laws. Forensics emerged as another such area of specialization. It was Vollmer who created the country’s first crime laboratory, which included a focus on the nascent forensic sciences.

  But more broadly, the newly open atmosphere toward people who claimed expertise presented new opportunities for hucksters and frauds. Everyone seemed to love the experts, but no one was making sure the experts actually knew what they were talking about. Within the justice system, it became the courts’ responsibility to tackle the daunting challenge of separating expertise from artifice. “The standard at the time was that if someone had specialized knowledge, and that knowledge seemed to be helpful to investigators, then the court would allow the testimony,” says Jonathan Koehler, a behavioral scientist and law professor at Northwestern University. “The problem was that there was no attempt to check the validity of what these witnesses were actually claiming.”

  By the early 1900s, American medical schools had improved dramatically, as had the standards and prestige of the medical profession in general. This only intensified the antipathy between the profession and the country’s coroners. Just as they had with the nascent forensics fields, progressives joined forces with the newly resurgent medical field to push for reform. In some cities, reformers tried to require all coroners to be licensed physicians. In others, they tried to abolish the coroner system entirely and replace it with a medical examiner system.

  In most places, the coroner system prevailed. The reasons were mostly institutional. First, many states had enshrined the position in their state constitutions, so abolishing it required a constitutional amendment. Second, by the early 1920s, many former coroners had worked their way through politics and now sat on city councils or in state legislatures, or held gubernatorial appointments. Lawmakers weren’t keen to abolish the position that had given them or their colleagues their start in politics. Third, the medical profession itself was not well positioned to take over the job. Though medical schools, certifications, and professional standards had improved dramatically in a short period of time, the field of forensic pathology remained underdeveloped.

  But there were some success stories, mostly in large cities. New York City moved to a medical examiner system in the 1910s. That office soon became an example for reformers all over the country. Its success was due to the extraordinary vision of the men who led it, Charles Norris and his protégé, Alexander Gettler. The office in Cleveland, Ohio, switched to such a system in 1914. Newark followed in the next decade. The state of Illinois still used the coroner system, but in Chicago, Cook County coroner Peter Hoffman saw the value in utilizing the knowledge and expertise of medical professionals. Hoffman ordered autopsies when investigating suspicious deaths, and he hired only qualified doctors to perform them. He also kept epidemiological statistics, tracking deaths from accidents and negligence for public health purposes.

  But overall, coroners were able to beat back the push to make them obsolete, either outright or by fighting to water down reforms. The frequent result of the back-and-forth was a compromise that preserved the coroner system but also created state and local medical examiner’s offices, to either oversee coroners or operate alongside them. That compromise, however, often only made things worse.

  In 1928, the Rockefeller Foundation and the National Research Council published the first of many damning reports on America’s coroner system. It exposed the widespread use of kickbacks and other corruption, found that few states required any qualifications at all for the position, and concluded that inquests tainted by corruption and political influence were reaching highly suspect findings about cause of death. The report recommended that states and cities retire their coroner systems, establish a medical examiner system in their place, reassign the coroner’s legal responsibilities to the judiciary, and require medical schools to teach medical death investigation as a full-time specialty. The foundation then followed its own recommendations, sponsoring the first academic department of legal medicine at Harvard Medical School in 1937—a program that soon began graduating qualified forensic pathologists.

  Outside of the medical community, though, all of this research and anecdotal data was generally met with silence. As of 1942 just seven states had replaced the coroner system with a medical examiner system. By the end of that decade, only eleven states had attempted any meaningful legal reform. In some parts of the country, the coroners had actually managed to strengthen their position.

  Some reformers began to look for a different approach. Chief among them was Richard Childs, a progressive activist from New York state and the longtime chair of the National Municipal League. Childs seized on what would become an enormously effective (if not always honest) tactic to win public support for reform: scare the hell out of people about crime. Childs crossed the country making the case that because of the coroners’ ineptitude and corruption, solvable crimes weren’t being investigated, and murderers, rapists, and other violent criminals were not only going free, but were ready to strike again. Newspapers loved the angle, publishing series and investigations critical of the coroner system, most with variations of the tagline that their own city or state had become “a good place to get away with murder.” Childs had more success than his predecessors. By 1960, about half the states had instituted some type of reform or reorganization of their coroner systems, many borrowing heavily from Childs’s model policies.

  Childs’s chief nemesis was the National Association of Coroners, led by Cleveland’s Samuel Gerber, a charismatic, politically savvy coroner best known for leading the 1954 investigation that implicated Dr. Sam Sheppard in the murder of his wife—the inspiration for the movie and TV series The Fugitive. The other main obstacles to reform over the years were funeral directors. Between the fees for transporting bodies and the referrals for embalming and memorial services, successful funeral home owners found that good business meant maintaining good relations with the coroner’s office. Funeral home directors were enormously influential in coroner races in much of the country, either by running for the office themselves or by donating to candidates. In an election for such a seemingly obscure office, even a small donation could give one candidate a huge advantage. At least in this niche of politics, funeral directors proved to be a formidable lobby.

  Twelve years after the Jennings fingerprint case, the US Court of Appeals for the District of Columbia Circuit became the first federal appellate court in the country to attempt to set some standards for the use of forensic expertise in the courtroom. In the 1923 case Frye v. United States, the defense sought to introduce testimony from a polygraph expert that a rise in systolic blood pressure was indicative of a witness’s dishonesty. In considering whether to allow the expert opinion, the court ruled that in order for scientific evidence to be admissible it must have “gained general acceptance in the particular field in which it belongs.” The court determined that the nascent field of polygraph testing did not possess such acceptance and excluded the testimony. Though it wasn’t a Supreme Court case, Frye was eventually adopted by every other federal court of appeals, and over time by nearly every state. “That [decision] put some teeth in the law,” says law professor Jonathan Koehler. “It looked beyond mere witness qualifications to evaluate the content of a witness’s testimony.”

  But the decision also had at least one ancillary effect that would have an even bigger impact than its core holding: it made judges the gatekeepers of what expertise would be allowed into court. The problem is that judges are trained in legal analysis, not scientific inquiry. These are two entirely different ways of thinking. Nevertheless, for the next fifty years Frye remained the model both in federal court and in most state courts across the country.

  By 1952, just sixteen of the seventy-two medical schools in the United States offered instruction in legal medicine. Fewer still offered instruction in forensic pathology. Again, the problem was driven by low demand. Evidently, most medical students had little interest in working with the dead, and those who did typically learned on the job. Many (but certainly not all) who chose the profession at the time did so because they weren’t skilled enough physicians to specialize in anything else. This presented a significant obstacle to reforming the coroner system. How could reformers convince states to move to a medical examiner system if there weren’t any competent medical examiners to take over? Recognizing the problem, reformers set out to professionalize and standardize forensic pathology. In the mid-1950s, the American Board of Pathology—the nation’s leading certifying body for pathologists—began developing a specific certification in forensic pathology. The first class of candidates was certified in 1959.

  But the effort was far from an overwhelming success. After an initial surge of “grandfathered” certifications of experienced pathologists (though not necessarily experienced in forensic pathology), the stream of candidates thinned. Even as a rising homicide rate in the 1960s created an urgent need for additional medical examiners, the field still had to overcome its professional stigma, as well as the comparatively low pay offered by state and municipal governments.

  To address the shortage, the American Board of Pathology continued to grandfather certifications, even to applicants with no formal training in the field, despite data showing they performed significantly worse on the board-certifying exam. Morale was low, and new applicants were hard to come by. Opening the door to less-qualified candidates only exacerbated the stigma. The field slipped into a self-perpetuating cycle of mediocrity.

  By the early 1970s, over half of the country’s traditional coroner offices had been abolished, and just over half the states had implemented some sort of reform. But considering that the reformers had been waging the fight for decades, those advancements were fairly incremental and overshadowed by the continued strength of the status quo. By 1970 the office of coroner remained in some substantial form in thirty-nine states. Even today, just eighteen states have fully abolished the coroner system in favor of a medical examiner system, the same figure as in 1967. And of the thirty-two states in which coroners still serve part or all of a county, twelve require no specialized training for the position at all.

  The United States still faces a critical shortage of medical examiners. The field enjoys significantly more respect than it once did, thanks in part to celebrity medical examiners, high-profile criminal cases, and positive pop culture portrayals on television shows like Quincy and, more recently, the CSI franchise, all of which have contributed to public understanding of the job. But forensic pathology continues to suffer from many of the problems that have plagued it from the beginning. Medical students go to medical school to help the sick; few enroll to work with the dead. Most medical examiner jobs today are still public positions, meaning they tend to pay far less than medical specialties in private practice. And though medicolegal investigations have at least advanced to the point where autopsies are now standard in most suspicious deaths, the problem persists: there just aren’t enough medical examiners to do them all—or at least to do them properly.

  All of which means that despite the relatively low pay of the state positions in forensic pathology, a doctor willing to bend the profession’s guidelines to help supply meet demand could make good money. There are quite a few places across the country where that’s exactly what has happened—where doctors have willingly performed significantly more autopsies than the field’s governing bodies recommend. But nowhere did it happen on the scale it did in Mississippi.

  4

  AT THE HANDS OF PERSONS UNKNOWN

  We couldn’t even count the bullet holes in my brother’s head. But they called it heart failure.

  —A black Mississippi woman, 1964

  Of all the ways America’s coroner system could be—and was—corrupted, the most damaging and consequential was its deployment to excuse and cover up lynchings, civil rights assassinations, and other racial violence during the twentieth century. “The coroner’s jury is often formed among those who took part in the lynching,” wrote the journalist and civil rights activist Ida B. Wells in 1894, “and a verdict, ‘Death at the hands of parties unknown to the jury’ is rendered.” Her observation was echoed four decades later by the Washington, DC, publication the Afro-American, which editorialized in 1932, “There is a righteous complaint that officials from coroners’ juries to state governors carefully whitewash lynchings in order to keep down public clamor.”

  Of course, the coroner system wasn’t the cause of the hundreds of lynchings that stained America in the late nineteenth and early twentieth centuries. They’d no doubt have occurred anyway. But the system certainly facilitated them. As an elected official in a culture in which blacks were effectively barred from voting and denigrated as second-class citizens, the coroner was a powerful ally in preserving the racial status quo. The conclusions of a coroner’s jury carried the force of law. Once an inquest determined that a black man with noose marks around his neck and riddled with bullet holes had died of natural causes, or suicide, or by accident, or that, say, the owner of a sharecropping plantation had shot one of his tenant farmers in self-defense, that conclusion nearly always ended the investigation. There would be no arrests, no charges, no trial. In this sense, the coroner had extraordinary power. Though no coroners attained the raw power or public notoriety of a George Wallace or a Bull Connor, by persistently thwarting justice for the victims of racial violence they were easily as culpable for the era’s worst crimes as any racist lawman or segregationist governor.

  As racial violence increased in the 1930s and 1940s, the US House of Representatives passed several different versions of a federal anti-lynching bill. Each time those bills reached the Senate, they were blocked by Southern lawmakers. The argument was as straightforward as it was disingenuous: federal intervention was unnecessary because lynching was already illegal under state law.

  Strictly speaking, that was true. Some Southern states had even passed anti-lynching laws as early as the late 1800s. But of course to convict under those laws, someone had to identify the lynchers. Dutifully, coroners inevitably found that the killers’ identities were impossible to ascertain. As Stewart Tolnay and E. M. Beck wrote in their history of Southern lynchings, “Potentially severe punishment for mob members was virtually irrelevant when coroners consistently concluded that lynch victims met their fates at the hands of a ‘party or parties unknown.’”

  Variations on that phrase—“persons, party or parties unknown”—haunt the accounts of lynchings throughout American history generally, and in the deep South in particular. Again and again, reports of lynchings in black publications like the Chicago Defender and catalogs of lynchings in the NAACP’s annual reports described coroners’ inquests that concluded the murders were committed “at the hands of persons unknown.” The phrase was so common among coroners and their juries that the author and historian Philip Dray made it the title of his own influential book on the history of lynching in America.

 
