A hero’s legend and a stolen skull rustle up a DNA drama
Christine Kenneally
Even with the best scientific techniques, you can’t always get what you want. But if you try, as the Rolling Stones put it, sometimes you get what you need.
Consider the case of Ned Kelly’s skull.
In Australia, Kelly needs no introduction; for Americans, it may help to think of him as Jesse James, Thomas Paine and John F. Kennedy rolled into one.
Born about 1854 to an Irish convict exiled to Australia, Kelly became a folk hero as a very young man. He took up arms against a corrupt British constabulary, robbed banks and wrote an explosive manifesto. He was wounded and captured in a shootout in which he wore homemade metal armour, and in 1880 he was hanged by the Anglo-Irish establishment he despised.
As with any semi-mythical hero, Kelly’s public has always hungered to get closer to the legend. His armour, cartridge bag, boots and a bloody sash became state treasures.
But perhaps the most prized among them is his missing skull – the subject of a tangled forensic drama that was finally resolved in September 2011, at least in part, after decades of investigation, debate, tantalising leads, stalemates, false starts and what can only be called skulduggery.
After his execution, Ned Kelly was buried in a mass grave at the Melbourne Gaol. There his remains might have quietly and invisibly decomposed but for a mistake by 19th-century gravediggers: they used a type of lime that slowed decomposition instead of hastening it.
So when the grounds were dug up for development in 1929, startled workers found the site full of skeletons. Officials began to move the remains to another prison. But in a scene of chaos that became a local scandal, a crowd of schoolboys and other onlookers ran amok among the coffins, seizing bones – including, it was thought, the skulls of Ned Kelly and Frederick Bailey Deeming, the notorious British serial killer who may have been Jack the Ripper.
The remaining bones were reburied at Pentridge prison, and the skulls were recovered soon after being stolen. They then embarked on a separate, winding journey through the back doors of a number of institutions.
In the 1970s, one skull was put on display in a jail museum alongside Kelly’s death mask, a plaster cast impression made shortly after his execution. (It is unknown whether that mask was the original or a copy.)
But in 1978 the skull was stolen again, and a man named Tom Baxter told journalists that he had it.
Mr Baxter held onto the skull for over three decades, promising to return it if the government gave Kelly a Christian burial. The government did not respond, and the stalemate continued until 2008, when yet another excavation uncovered more prisoners’ remains. At least 3000 bone fragments were exhumed and sent to the Victorian Institute of Forensic Medicine. It was thought that Ned Kelly’s bones might be among them.
Shortly after that, Mr Baxter handed over a fragile, sun-bleached skull to the authorities.
The forensic institute conducted a 21-month investigation of the skull, mixing historical detective work with an array of innovative scientific analyses.
Scientists used historical photographs, cranial plaster casts and a copy of the Kelly death mask to determine whether the skull from Mr Baxter had indeed been unearthed in the 1929 exhumation. When it came to the skull’s genetic material, however, the scientists faced some serious obstacles. DNA is well preserved in bone but highly vulnerable to contamination. They could not simply cut a square out of the skull, grind it to a powder and extract DNA from that; Joy Beyer, a molecular biologist at the institute, said she was told that the skull could not be damaged.
Finally, the institute sent samples from the skull and other remains to a forensic laboratory in Argentina that specialises in degraded and aged remains. That lab successfully extracted DNA from almost all the samples.
Even so, the DNA meant little in isolation. The investigators needed something, or someone, to match it against.
Hoping to find DNA in Kelly’s dried blood, they located the boots, bag and sash he wore on the night he was shot. ‘Dried specimens on cloth can preserve DNA for hundreds, even thousands, of years,’ said David Ranson, a pathologist at the institute.
But the boot and the bag had no usable DNA. The sash, which they found in a country museum, had been thoroughly washed before it was put on display. And a search for the original Kelly death mask – which might hold a stray eyelash or some skin – came up empty.
Next, the investigators looked for relatives. They found Leigh Olver, an art teacher who was descended from Ned Kelly’s mother, down a direct line of women. He donated blood for analysis, and they compared his mitochondrial DNA with that of the skull.
On 1 September 2011, the forensic institute announced the disappointing results of that analysis. It appears that after all this time, after being abducted more than once, placed on display for the world to see, hidden for decades, cherished, handled, sought after and tested, the skull is not Ned Kelly’s. ‘Mr Olver’s DNA and the DNA from the skull do not match,’ said Fiona Leahy, a historian and legal adviser at the institute.
There was one rather powerful note of consolation. The investigators found a match between the Olver DNA and one set of bones dug up at Pentridge, including a palm-size fragment of skull. So while most of Kelly’s skull is still missing, the rest of him appears to have been found.
As for the stolen skull, it could belong to the serial killer Frederick Deeming, who died in 1892. The forensic institute is seeking a maternal relative to test DNA.
What of Kelly’s skeleton? Should it be returned to the extended family? Or should there be a public grave? Many Australians regard Kelly as a national hero. Countless books and movies tell the story of his life. But others see him as a villain.
‘You can’t just bury the man,’ Mr Olver said. ‘Someone is going to dig him up again in half an hour.’
The rise and fall of infant reflux
Pamela Douglas
At the dawn of the 21st century Queensland infants were in the grip of an epidemic. Babies screamed, vomited and woke frequently at night. They refused to feed, arched their backs, drew up their knees. Parents were frantic: even if they could soothe the flailing fists and the little crumpled face, the minute they put their baby down, the piercing shrieks began again.
Once, we called this colic. We attributed it to wind, and a woman struggled through the nightmarish first months of a colicky baby’s life without much support from health professionals or even sympathy from those around her, secretly and horribly convinced of her own public failure. But by 1982, when a small group of ‘reflux mums’ formed the Vomiting Infants Support Association of Queensland, the nascent sub-specialty of paediatric gastroenterology had found in colic a cause célèbre.
The association went national in 2000 and became RISA, the Reflux Infants Support Association, aiming to give confidence and moral support to families of infants with problems associated with gastroesophageal reflux. But the epidemic appeared to be at its worst in Queensland, a state prone to statistical exaggeration. One prominent Queensland paediatric gastroenterologist pioneered the link between infant irritability and gastroesophageal reflux disease (GORD), and took to the lecture circuit to raise professional and community awareness. He subsequently relocated overseas and remains a dedicated and caring doctor, but he saw the world, particularly crying babies, through a very special – specialised – lens.
I subscribed guiltily to RISA News. Throughout the 1990s, as the epidemic worsened, my own robust offspring grew into preschoolers, then primary schoolchildren. They never cried much and, as the newsletters explained, only the parents of a reflux baby can truly relate to the exhaustion, despair, headaches and lack of sleep. But I’m a GP, and throughout the 1990s until the mid-2000s, infant GORD was rampant. Many of the babies I saw came pre-diagnosed with ‘reflux’ by the paediatrician, the hospital midwife, the child-health nurse, the breastfeeding counsellor, or the lady across the road. New mothers stepped carefully into the consulting room, manoeuvring the pram through the door or lugging the car capsule or carrying the baby, sat down in the chair by my desk, and wept.
Babies from the first days and weeks of life were being given cisapride, ranitidine or cimetidine, antacids – often in double doses – and, from the end of the decade, proton-pump inhibitors (PPIs). An American study showed that PPI use in infants multiplied sixteen-fold between 1999 and 2004.
In the Australian Family Physician, Medicine Today and Australian Doctor, diligent GPs read articles about crying babies and GORD written by paediatricians and gastroenterologists. Parents were angry with any incompetent practitioner who ‘missed’ the diagnosis. They were especially angry with the hapless doctor who ventured that maybe the baby was just a bad sleeper, or that the mother was unnecessarily worried. Having a new baby is not the blow-waved, lacy-white sensuality of the Lux mother: a brutal collision with reality lurks beneath the sentimental images of motherhood, shocking us early on, and it’s reasonable to expect a sensitive response from the GP. But parents came to believe that a failure to diagnose was a failure to care.
In the A4 newsletter that arrived in my mailbox every couple of months I read a stream of heartbreaking testimonials, alongside hints on sterilising medicine cups and removing the smell of vomit from clothing, or recipes for blancmange to thicken expressed breast milk or formula. There were diagrams illustrating how to change nappies with a pillow under the baby’s shoulders and without lifting the baby’s legs, or how to breast- or bottle-feed holding the baby vertical. There were ads for approved cot harnesses to secure the baby once the head of the cot was raised 30 degrees, or for slogan T-shirts, cheap and cute, for the discerning ‘reflux’ baby, alongside order forms for a fundraising drive. Contributions from paediatric gastroenterologists and GPs advised frequent burping, thickened breast milk, thickened formula, frequent breastfeeds, spaced breastfeeds, different bottles, different formula.
It felt voyeuristic, peering into the newsletters like this, browsing families’ misery and their plucky attempts to keep each other’s spirits up, all written in homey prose. But it was also clear to me that Queensland babies, at least in the first few months of life, were in the grip of an imaginary disease. It’s true that premature infants, and infants with certain underlying health problems, for example, neurological abnormalities, are prone to GORD. But in otherwise healthy, full-term babies in the first few months of life, excessive crying, crying in a piercing shriek, back-arching, turning red in the face, flexing up the knees to the tummy, disrupted sleep, vomiting, and crying when put down are common behaviours, not caused by pain or reflux. I could see that using the diagnosis of GORD to explain these behaviours caused harm to mothers and babies.
For a start, parents were desperately focused on performing the various odd, disruptive and time-consuming manoeuvres supposed to protect their baby’s imaginary oesophageal lesions. These preoccupations certainly didn’t help parents learn to read and respond to their infants’ cues. Yet learning to read and engage the baby’s communications (a difficult task in unsettled babies, one that may even require professional help) is a very important way of protecting the mother–infant relationship and the child’s long-term mental health.
Multiple other problems were often not identified or addressed in the frenzy of activity surrounding GORD: for example, feeding difficulty, cow’s milk allergy, maternal anxiety or depression, or lack of familial and social support. Worse still, if correctable clinical problems weren’t diagnosed, mothers and babies were at risk of developing entrenched long-term problems, including ongoing feeding difficulties. The consequences of undetected and unmanaged feeding difficulties may be catastrophic for some, resulting in severely disrupted and anxious mother–infant relations, since it is not easy for a mother to remain calm at feed times if she believes her baby is starving.
Some babies do develop true GORD down the track. Could it be that by over-diagnosing GORD in the first few months of life, we also predisposed some babies to oesophagitis later on? This is a sensible interpretation of what we know about the multiple factors that do predispose babies to GORD, and the effects of failing to identify them.
Worst of all, cisapride (trade name Prepulsid) could fatally disrupt the beating of a tiny heart. This was recognised in 2000, after two children died. But the potential for disaster didn’t halt the GORD juggernaut: we simply substituted PPIs, even though they had not been trialled on a large scale, over time, in infants in the first weeks and months of life.
* * * * *
As a GP, I specialise in generalism. I started my professional life in the turbulence of an Aboriginal and Islander Community Health Service, which alerted me to other frames of reference, including to cross-cultural differences in infant care. I have a better-than-average grasp of the physiology of lactation, of breastmilk substitution and the infant gut, because I qualified as an international-board-certified lactation consultant when I had my babies, in snatched hours while they slept. Many GPs, paediatricians and paediatric gastroenterologists remain inadequately educated about breastfeeding, and even midwives and child-health nurses have variable standards of skills in lactation support. These knowledge deficits – the problem of the health professional who doesn’t know what he or she doesn’t know – are significant for unsettled babies.
Clinical epidemiology is a branch of medicine that expanded dramatically in the 1980s. It aims to understand patterns of disease and the way treatment changes these. The most authoritative of its analyses, the randomised controlled trial (RCT), assesses the effectiveness of interventions by making two populations as similar as possible, then comparing them, with and without the added intervention, applying statistical analyses to assure us that any differences didn’t occur by chance. In 1992, a group of academics at McMaster University in Canada took clinical epidemiology and repackaged it as ‘evidence-based medicine’ (EBM). In their manifesto they proclaimed EBM a paradigm shift, a revolution. They critically ranked the quality of research, eliminated much of the poor science, and developed tools for synthesising the results of multiple trials. They were ‘manning the barricades’ against the health professional who did not know what he or she did not know. They had chutzpah (hubris, their critics called it), and began harnessing the explosion of digitally available research with clever inventions like specialised databases, search filters, hierarchical ranking of evidence, systematic reviews and meta-analyses.
Like all good brand-makers they were revisionist, claiming to cleave the history of medicine into the pre-EBM era of dangerous, expert-led, opinion-driven, inconsistent care, and a post-EBM era that was clear-eyed and modern, pragmatic and anti-authoritarian. A democracy of evidence, except (so the critics complained) the EBM men set the rules. Brand EBM became a catchcry, a simplified and highly successful approach to health research that was rapidly co-opted by politicians and governments. Massive funds were diverted to the cause, and academic careers took off. Eighteen years later, critics are still scathing. A zombie, or a dead fish swimming, they call it, arguing that EBM has long since been exposed as a very limited approach to health knowledge: dead in the epistemological sense, but made to act as if alive because it’s inflated with funding and bouncing about.
All of this has had a remarkable effect on the culture in which I’ve practised over the past 25 years. Three-quarters of all medical consultations in Australia are with GPs, and most Australians consult their GP once a year, yet general practice research has always been drastically underfunded. Despite remarkable gains in the past decade, it is still, as a result, fifty times less productive than research in, say, internal medicine or surgery. From the late 1990s a handful of general practice academics in Australia became preoccupied with the fight to improve general practice’s credentials, trying to secure a foothold in a clinical research landscape utterly dominated by hospital-based specialists (who have at times been referred to in primary care as partialists). In the real world of GP consultations, where the messy stuff of patients’ lives and contexts writes itself into the body in dynamic and unpredictable ways, there are serious limitations to the usefulness of RCTs, but these powerful Australian EBM men were focused on making up ground. Genuine conservatives (or are they the genuine radicals?), thoughtful about the nature of evidence and its place in the complexity of primary care, struggling to articulate an authentic clinical practice, were brushed off as out of touch. By the time the GORD epidemic really took hold in crying babies, doctors were expected to follow brand EBM unquestioningly. A vigilant moralism about how we practise came to the fore.
So what do you do when you are confronted by an expectation that you practise according to the ‘evidence’ – an agreed clinical protocol written up in authoritative, peer-reviewed journals – when that evidence contradicts what you have reason to believe, from your own transdisciplinary knowledge base, is in your patient’s best interests? Diverging from accepted best practice is professionally compromising, even dangerous. But not contesting a harmful diagnosis is ethically compromising, dangerous to one’s personal integrity and peace of mind, not to mention patient health. I found this cognitive dissonance across a range of issues acutely painful. It seriously compromised my capacity to enjoy general practice. Finally, I took up research – an act of subversion.
But what madness was it to spend my limited free time, over the years, doing searches of the Medline or PubMed or CINAHL databases, poring over papers about my chosen issue: infant crying and GORD? Why did I lock myself away that warm outback Easter, when the kids played with their cousins in the red dirt amid tailings of freshly picked cotton, to read accounts of RCTs and cohort studies in the Journal of Pediatric Gastroenterology and Nutrition, or the Archives of Disease in Childhood, or the Journal of Gastroenterology and Hepatology?