Taking the Medicine: A Short History of Medicine’s Beautiful Idea, and our Difficulty Swallowing It

by Druin Burch


  This was hugely important. Sulphanilamide had been discovered and described in 1908, the product of a chemist’s doctoral studies. Neither I. G. Farben nor anyone else could now patent it. If the Pasteur Institute were correct, not only was their insight relevant for the future development of similar drugs, it also meant that the production and sale of sulphanilamide belonged to the general commonwealth, not to any one company.

  Colebrook continued cautiously to test Prontosil. In a further series of sixty-four women he showed a death rate of under 5 per cent. In another group of a hundred, in which he used both Prontosil and the straight sulphanilamide, eight died. In every group of women in which he used one of these new drugs, the death rate was substantially lower than the 20-odd per cent seen before they had become available.

  Why was Colebrook so slow to be satisfied? Partly, it was because this thoughtful man found himself in agreement with the Scottish doctor John Cowan. Puerperal fever was more lethal than pneumonia. So in none of his examinations of the new drugs did Colebrook include a group of control women: patients with puerperal fever to whom he deliberately denied the drug. It simply seemed to him to be too cruel.

  That left him, however, with alternative explanations for the improvements he was seeing. What if the streptococcus was becoming spontaneously less hazardous, happening to evolve in a benign direction at the same time as these new compounds were being introduced? It sounded remarkably unlikely but there were reasons to take it seriously. Scarlet fever appeared to be doing exactly that, and it was a disease caused by the same streptococcus. Looking at the records of other hospitals, it was apparent to Colebrook that death rates from puerperal fever were declining, both before and during the introduction of these new sulfa drugs. And there was also reason to worry that the drugs themselves were dangerous. Colebrook found that a number of women treated with them suffered a change in the colour of their skin. It became dusky, dark and blue – a sign associated with potentially fatal changes in the blood’s ability to carry oxygen and carbon dioxide.

  Colebrook achieved two vital things. His series of 1936 papers in the Lancet meant that more admiring attention began to be paid to the sulfa drugs, and their popularity spread. His assertion of lingering uncertainty was also taken seriously. Colebrook managed to provoke not only useful enthusiasm but also fertile doubt. Further studies during the 1930s continued to look at the effects of sulfa drugs on puerperal fever. All of them were positive. Even though none used control groups (relying instead on the historical records from before the introduction of the sulfa drugs) they were persuasive.1

  What became apparent to doctors, as a result of the introduction of sulfa drugs, was that the sort of proof a drug required depended on the magnitude of the benefit it offered. In puerperal fever and, from 1938 onwards, meningococcal meningitis, the diseases were so fatal and the drug so effective that certainty was relatively easy to come by. Comparing what happened at a hospital before and after a drug was introduced was likely to be reliable. For other infections, however, sulfa drugs were harder to figure out. Not only did most people with pneumonia recover anyway, for example, but rates of pneumonic infections also varied hugely from month to month and year to year. Subtler methods were needed to determine what impact sulfa drugs had on such comparatively mild diseases. In scarlet fever, doctors used concurrent controls, giving the sulfa to some people and not to others. The drug had no meaningful effect. That was a surprising finding. Reasoning logically, doctors assumed it was going to be useful – scarlet fever was caused, after all, by the same bug as puerperal fever. Despite that, there seemed to be no effect at all. Logic and reason were proving no match for experiment.
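
  To see why a before-and-after comparison convinced in one case and misled in the other, a toy simulation helps. (This sketch is an editorial illustration, not the book's: only the 20 and 5 per cent puerperal fever death rates come from Colebrook's figures above; the pneumonia-like numbers and the assumed 30 per cent drug benefit are invented for the example.)

  # A minimal sketch of why historical controls can suffice when a drug's
  # effect is large against a stable baseline, but mislead when a disease
  # is mostly survived anyway and its severity drifts from year to year.
  # All parameters are illustrative assumptions, not historical data,
  # except the 20 and 5 per cent puerperal fever rates quoted in the text.
  import random

  random.seed(1)

  def deaths(n, rate):
      # Deaths among n patients, each dying independently with probability `rate`.
      return sum(random.random() < rate for _ in range(n))

  # Puerperal fever: roughly 20 per cent mortality before the sulfa drugs,
  # roughly 5 per cent in Colebrook's treated series.
  before = deaths(200, 0.20)  # historical, untreated series
  after = deaths(200, 0.05)   # treated series
  print(f"puerperal fever: {before} deaths before vs {after} after")

  # A pneumonia-like disease: low mortality that drifts between 'years',
  # with a hypothetical 30 per cent relative benefit from treatment.
  for year in range(3):
      base = random.uniform(0.04, 0.10)  # this year's background mortality
      untreated = deaths(200, base)
      treated = deaths(200, base * 0.7)
      print(f"year {year}: untreated {untreated} deaths, treated {treated}")

  Run the sketch repeatedly and the first comparison points the same way almost every time, because the gap dwarfs chance fluctuation. In the second, year-to-year drift can easily swamp the drug's modest effect; only concurrent controls, treated and untreated patients drawn from the same wards at the same time, let such a signal emerge.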

  Before Domagk and Colebrook’s work, one in every 500 births in Britain led to the mother’s death. Knowing that the disease was sometimes spread by the hands of doctors and nurses had helped reduce rates, but not terrifically. Death rates had remained largely static over the previous eighty years. By 1940, after Prontosil, the death rate fell to one in 2,000.

  *

  The French suggestion about the irrelevance of Prontosil rubrum’s colouring to its activity turned out to be correct. Sulphanilamide was the portion of the compound with anti-bacterial activity. Fortunately for Farben, enough doctors ignored this fact to make the dye-based drug highly profitable. They seemed attached to Prontosil’s name rather than its actions, choosing Prontosil rubrum over sulphanilamide in exactly the way that they had previously prescribed Antifebrin in place of acetanilide.2

  In the meantime, Domagk himself got to use the drug. His six-year-old daughter Hildegarde was making Christmas decorations early in December 1935. Wanting help threading a needle, she carried it downstairs to look for her mother. She fell while carrying it carefully, as children are taught to do, with the sharp end pointing downwards and away from her. She landed badly enough that although the blunt eye of the needle was the part pressed against her palm, it was driven deep into her hand, sticking into the bone where the metal snapped in two.

  After an X-ray, a surgeon removed the broken needle. The next day Hildegarde was feverish. An abscess developed where the metal had pricked her. Despite repeated surgical drainage of the pus, the infection grew worse. It spread from her hand into her arm. In a state of septic shock – her blood pressure desperately low as a result of the bacterial toxins – she became delirious. The surgeons talked about cutting her arm off in an attempt to save her life. Samples of her blood grew streptococcus. She was plainly dying.

  Asking permission from the surgeon looking after her, Domagk fed his daughter Prontosil, putting the tablets in her mouth and watching while she swallowed them. Her recovery was quick and, by the standards of the day, wholly miraculous. By Christmas she was back at home, healthy.

  Gerhard Domagk’s smaller reward was the Nobel, in 1939. It came four years after Hitler forbade Germans from receiving these prizes. The 1935 Peace Prize had been awarded to Carl von Ossietzky, a pacifist critical of the Nazis. This had infuriated Hitler. In 1938 Ossietzky died in his concentration camp, his death doing nothing to lift the ban. When Domagk carefully acknowledged the award, thanking the committee but saying he was not sure if he could accept, it seemed far too polite a response in the eyes of the German authorities. Domagk was taken away by the Gestapo. After a period in jail, he was released. Travelling to Berlin to give a talk, he heard his name being called over the loudspeakers at the railway station in Potsdam. The Gestapo were waiting for him again, and told him there would be no talk. They offered him a letter declining the Nobel. He signed.

  Domagk finally collected his prize in 1947. By then the money, according to the rules of the Nobel Foundation, was forfeit. Domagk accepted the prize anyway.

  * * *

  1 There was a paradox here. Colebrook felt that control groups were unethical for such a dangerous disease. He was unwilling to take the risks with women’s lives that they entailed. The result was that acceptance of the drug’s effects was slow. Far fewer women might actually have died had Colebrook used controls to begin with. By trying to avoid harm to a few, Colebrook inflicted it on many.

  2 The same preference exists in the marketplace today. Ibuprofen tablets, for example, are available over the counter at chemists at very much cheaper prices than branded versions of the same thing, like Nurofen.

  14 Penicillin and Streptomycin

  TO THE EXTENT that sulphonamides turned out to act independently of their staining power, their development was partly accidental. It was reminiscent of the sorts of accident that had led to cinchona replacing Peruvian balsam for treating malaria, or the Reverend Stone selecting willow bark for fevers on the basis that the tree grew in swamps. In the two older examples, though, a mistaken theory had led by good fortune to a happy outcome. Ehrlich and Domagk had done something different. Their study of dyes led them, correctly, to the idea that a compound might have selective toxicity for different organisms. Starting with dyestuffs was sensible, both because they had already been shown to enter bacteria and specific cell types, and because their visual nature made them relatively easy to monitor. Actually, though, the development of antibiotics depended on only two things: that people could correctly conceive of them, and that a system could be put in place for screening large numbers of molecules in order to seek them out. Prontosil rubrum, by sheer chance, did not have to be red to work.

  A huge group of scientists worked in Domagk’s laboratory, testing one compound after another for its ability to save deliberately sickened and infected animals. That seems like a long way from the ideas of science that had begun to emerge in the seventeenth century. It was not. Francis Bacon, looking ahead, had seen that science was not for an elite only, that science ‘is not a way over which only one man can pass at a time . . . but one in which the labours and industries of men (especially as regards the collecting of experience), may with the best effect be first distributed and then combined. For then only will men begin to know their strength.’ That communal effort, more than any one person’s inspiration, was behind the discovery of antibiotics.

  During the First World War, Lieutenant (later Captain) Alexander Fleming spent time in France. Fleming worked to try to understand why war wounds went bad so often. The bandages soaked in antiseptic, so effective for the British during the Boer War, seemed to have little effect in Flanders. Wounds on the Western Front, Fleming found, were filthier. The mud and the high-velocity armaments drove dirt deep into damaged bodies, and oddly, as Fleming showed, washing out wounds with antiseptics did more harm than good. What seemed a reasonable approach was actually making things worse. The antiseptics killed lots of bacteria in wounds but they also killed the body’s own cells. And the wounded depended on those for their ability to heal. Even the latest antiseptic technology cost lives if it was put into practice without being soundly tested first.

  In the summer of 1928, while working in London, Fleming started experimenting with a mould. His actual interests at that stage were in lysozymes, a group of enzymes found in snot and other bodily fluids that were able to dissolve bacteria. The lysozymes had no effect on Staphylococci, a common bacterium in humans, but the mould did.

  It was not a novel, surprising or even a particularly interesting finding. Mould’s ability to kill bacteria had been described in 1876, by John Tyndall. He watched a piece of mutton rot, noting that the bacteria that grew on it were all killed by the mould. Like Fleming, Tyndall knew that he was observing the effects of a mould called Penicillium. ‘In every case where the mould was thick and coherent,’ Tyndall wrote, ‘the Bacteria died, or became dormant.’

  That made Penicillium no different in concept to carbolic acid, bleach or the other antiseptics. Fleming’s official biographer noted the routine nature of what occurred, explaining that Fleming observed a mould that appeared to kill bacteria: ‘probably the mould was producing acids which were harmful to the staphylococci – no unusual occurrence.’ It was not an important event.

  A far more astonishing demonstration of Penicillium’s properties had already come in 1897. A Frenchman named Ernest Duchesne submitted a doctoral thesis on the competition between micro-organisms, the battles they waged in order to live. ‘Contribution à l’étude de la concurrence vitale chez les microorganismes – Antagonisme entre les moisissures et les microbes’ was a remarkable work. Duchesne claimed to have shown that injections of his Penicillium safely cured typhoid in animals. Unfortunately Duchesne’s conclusions were rejected by the Pasteur Institute, his ideas had no impact, and it is not possible now to be certain about his methods. Whether his injections actually did work cannot be known. His understanding of what he was close to, however, is unmistakable. ‘The question of vital competition was indeed studied up until now only for the higher animals and plants,’ he wrote:

  It is not without interest to see whether, at the level of the infinitely small ones, this struggle for existence does not also exist, and whether it provides concepts useful for pathology or for therapeutics. The role of microbes in the genesis of disease is now well-known to us: we know that, not only do they generate disease, but they can also be the remedy for it, either by their attenuated culture or by products that they secrete.

  Duchesne concluded that Penicillium, injected at the same time as a dose of typhoid, made the latter harmless. He thought that this was important. He died of tuberculosis in 1912, at the age of thirty-seven, without having convinced anyone or having continued his own work beyond his widely ignored doctorate.

  Others had also noted the way in which penicillin killed bacteria. Like Fleming, they spared a thought for the possibility that this could be useful. ‘Life hinders life,’ Pasteur wrote in 1877. ‘A liquid invaded by an organised ferment, or by an aerobe, makes it difficult for an inferior organism to multiply . . . These facts may, perhaps, justify the greatest hope from the therapeutic point of view.’ The notion of simple organisms competing with others through chemical attacks was christened in 1889. Jean-Paul Vuillemin described how moulds fought with bacteria: ‘Here there is nothing equivocal; one creature destroys the life of the other. The conception is so simple that no one has ever thought of giving it a name.’ He called it antibiosis. A host of other nineteenth-century workers, including Lister, noted this ability of moulds to poison bacteria. In the same year that Pasteur wrote about his hopes, Lister even tried using Penicillium mould to treat infected wounds. He gave it up as hopeless.

  Fleming attempted to purify the bacteria-killing substance that the mould produced. He was unsuccessful. Nevertheless he took steps that moved him closer to realising that ‘the greatest hope’ of mankind was sitting in the culture dishes on his laboratory bench. The extracts of mould juice that he made were not toxic either to white blood cells or to living mice. That made them exceptionally different from the antiseptics that people slapped onto operating theatre floors or soaked wound dressings in. It was not a difference, however, that greatly impinged on Fleming.

  Fleming might have tested his ‘penicillin’ (a name he came up with early in 1929) on an infected animal, just as Domagk’s laboratory took pains to do in the same year with their pigments. Animal activists at the time, however, argued forcefully that you could not learn about treating human diseases by trying remedies on rabbits or mice. From the middle of the nineteenth century, public opinion in Britain in particular was appalled at the number of pointless vivisections carried out by doctors in the name of discovery. Astute observers pointed out that the vivisections almost never led to any new therapies, or to freshly effective treatments. Injecting his penicillin into deliberately infected laboratory animals was simply not something that occurred to Fleming as useful. Here, as well as in making the most of dyestuffs, the Germans led the way.

  Knowing that his extracts were too full of impurities to be injected safely into humans, Fleming tried using his penicillin as a wound dressing. He put it on the infected stump of a woman’s leg. It was no use. Fleming published a 1929 paper ‘On the Antibacterial Action of Cultures of a Penicillium with Special Reference to Their Use in the Isolation of B. Influenzae’. Because penicillin killed most bacteria, he explained that it was superb at leaving the research clinician with culture plates free of everything except B. influenzae. That was a boon to those who wanted to explore the nature of this particular bug, but of no therapeutic use to anyone at all. Fleming moved on to other things, keeping cultures of penicillin growing in order to clean bacteria from his laboratory equipment, as his paper described. Various accounts have him occasionally using, and talking of, penicillin as the 1930s swept by and sulphonamides came onto the market. If he did talk to people about his belief that penicillin might be of great human value, and occasionally use his extracts of it to treat simple skin and eye infections, it only makes his overall inactivity more puzzling. Given the harms that doctors inflicted without realising, though, perhaps it should not be surprising to find them also being blind to benefits.

  As a medical student in London in the 1920s, Cecil George Paine heard about penicillin from Fleming’s lectures. When Paine started work in Sheffield he wrote and asked Fleming for some samples of the Penicillium mould. During 1930 and 1931, Paine tried it out. He gave extracts of the mould juice to three patients with skin infections, seeing no benefits. ‘The attempts to use penicillin against these infections’, said Paine, ‘came to nothing.’ He turned to using it on babies born with gonorrhoea. Although this was a sexually transmitted disease, an infant could pick up gonorrhoea in the process of being born. It affected the baby’s eyes, filling them with infected pus. In one surviving set of medical notes, penicillin broth was applied to the eyes of a three-month-old named Peter. An inpatient since the age of three weeks, Peter was undergoing treatment with silver nitrate drops for his infected eyes.1 On 25 November 1930 the notes say: ‘Started c. Pinicillin’ (sic). By 2 December, both eyes had cleaned up. The penicillin, remembered Paine, ‘worked like a charm!’

  Paine’s mould was grown on meat broth, and like Fleming he was unable to manufacture any pure extracts of the broth’s active ingredient. He simply used the broth in its entirety. Nevertheless, his experiments continued. A patient came in from one of the nearby mines, his eye pierced with a fragment of stone driven underneath his iris. The wound was infected, and Paine was able to swab the emerging pus and show which bacteria were living inside the man’s eye. ‘We took a culture from his cornea and grew up the organism that was absolutely dreaded in the eye – Pneumococcus,’ said Paine. Operating on an infected wound was only done in order to cut out the infection entirely. Opening up the eye to remove the stone was simply likely to spread the infection. Removing the whole globe of the eye, before the infection spread back into the man’s brain, seemed the best way forward. Instead, Paine and his colleagues took a different approach. ‘We tried penicillin and it cleared up the infection like nobody’s business, and they were able to deal with him and he made a good recovery.’ With the infection cleared from his eye, the surgeons safely removed the stone. The man recovered and his vision returned to normal.

 
