The Rise and Fall of Modern Medicine


by James Le Fanu


  Thus the therapeutic revolution of the post-war years was not ignited by a major scientific insight, rather the reverse: it was the realisation by doctors and scientists that it was not necessary to understand in any detail what was wrong, but that synthetic chemistry blindly and randomly would deliver the remedies that had eluded doctors for centuries. This is certainly different from the conventional way in which medical progress is believed to have occurred, but is well illustrated by the career of the chemicals that started the therapeutic revolution in the first place – the sulphonamides identified by the German chemist Gerhard Domagk in 1933.

  In 1927 the chemical company Bayer had appointed Domagk as Director of Research with a brief to investigate whether synthetic dyes might have antibacterial properties that could be used to treat infectious diseases.

  The principles of his research programme were as follows. His colleague Josef Klarer synthesised new chemical dyes, which he then passed to Domagk, who tested them in mice that had been experimentally infected with different types of bacteria that caused meningitis, gonorrhoea, puerperal fever and so on. It was a typical piece of systematic German research. Domagk did all of his own post-mortem dissections and microscopic examination of the animals’ organs, during which time he was accessible to no one, took no telephone calls and received no visitors. ‘We dissected until we could no longer stand on our feet, and looked through microscopes until we could no longer see.’

  Domagk’s first four years of research – up until 1932 – were ‘not particularly encouraging’. And then came prontosil, a red dye that had originally been synthesised in the hope it would be valuable for colouring leather. In December 1932 Domagk conducted his standard experiments on two groups of mice infected with the bacterium streptococcus. The group given prontosil survived; the control group died.3 Domagk did not publish the results of these experiments for another two years, but in 1933 he had direct personal experience of prontosil’s effectiveness in humans when his four-year-old daughter developed a fulminating skin infection of the hand, for which the only treatment up till then had been amputation. She was cured by prontosil.

  Within a few months of Domagk’s publication of the results of his mouse experiments in 1935, scientists at the Pasteur Institute in Paris discovered that the therapeutic effect of prontosil in killing bacteria had nothing to do with its chemical properties as a dye; the active component was rather a chemical to which the dye was linked, known as a sulphonamide.4

  And so by this circuitous route the sulphonamides were discovered. It is appropriate to pause for a moment before following their dizzying career to reflect on the difference they made to the treatment of infectious diseases and in particular the three caused by the streptococcus bacterium: puerperal fever in women following labour, erysipelas (the skin infection suffered by Domagk’s daughter) and scarlet fever. In Britain in the 1930s over 1,000 women died every year from puerperal fever, where the traumatised vaginal tissues became contaminated by streptococcal bacteria that then gained entry to the bloodstream, causing blood poisoning, collapse of the circulation and death within a few days. After Domagk discovered prontosil, Dr Leonard Colebrook, director of research at Queen Charlotte’s Maternity Hospital in London, managed to treat thirty-eight women within a few months, thirty-five of whom were cured. His comments on the cases convey some of the astonishment he felt at this new treatment. Thus one 36-year-old woman with a fulminating infection of the womb is described as being ‘very ill, debilitated and delirious. After admission her condition deteriorated until she was given prontosil. Spectacular improvement.’5 The curative power of prontosil had a dramatic effect on the mortality graph of puerperal fever, which fell like a stone from 2.5 per 1,000 live births in 1937 to less than 0.5 three years later.

  It was the same story with the skin infection erysipelas, a particular hazard for doctors and nurses. ‘Most large hospitals have records of tragic deaths [among their staff] from this cause. In each of the cases of which I have personal knowledge, infection was through a finger prick or scratch during attendance on a septic patient . . . there was an acute cellulitis [skin infection] of rapid onset, spreading to the arm, the onset of septicaemia was heralded by high fever and rigor.’ The decline in mortality from erysipelas paralleled that of puerperal fever. Nor was that all, for the sulphonamides were also effective against ‘strep sore throats’, which had just been implicated as the cause of rheumatic fever attacking the joints, kidneys and heart valves to cause respectively arthritis, kidney failure and heart failure from diseased valves. Sulphonamides saved the lives of Franklin Roosevelt’s son and of Winston Churchill when he contracted pneumonia in Carthage in December 1943.6

  It is scarcely necessary to emphasise what an extraordinary phenomenon the sulphonamides represented. But this is only the beginning of the story of their contribution to modern medicine. They were, up until the discovery of penicillin, the only effective drugs against infectious disease and, besides being widely prescribed, were naturally a focus of great scientific interest. Consequently a whole series of other quite unanticipated therapeutic benefits soon became apparent.

  1939: Soon after Gerhard Domagk’s fortuitous discovery of the effectiveness of sulphonamides in treating bacterial infection, Donald Woods of London’s Middlesex Hospital discovered they worked by the principle of competitive inhibition, as already described. Put simply, he found the structure of sulphonamides to be very similar to another chemical, PABA, an important constituent of the essential vitamin folic acid. Humans obtain their folic acid from their diet, but bacteria have to manufacture it for themselves. Hence, Woods inferred, sulphonamides acted as a ‘false building-block, jamming the works’ of the bacteria’s chemical metabolism: in attempting to make their own folic acid, the bacteria utilised sulphonamide rather than PABA – with lethal consequences.7

  Subsequently in the United States George Hitchings and Gertrude Elion would apply the same principle to DNA in the hope of finding the ‘false building-blocks’ that would prevent cells from dividing and which might then be useful in the treatment of cancer and similar disorders. Over the next twenty years, this approach led to, inter alia, drugs for leukaemia (6-mp), the immunosuppressant azathioprine that would make transplantation possible, allopurinol for the prevention of gout, and the successful drug for the treatment of viral illnesses, acyclovir.8

  1940: At the Department of Pathology in Oxford on 25 May, Howard Florey and colleagues performed precisely the same experiment with penicillin on mice infected with streptococci as Domagk had seven years earlier. Without Domagk’s precedent Florey would have been unlikely to perform this experiment, as up to this time the true potential of the antibiotics does not seem to have been fully appreciated. In this way the sulphonamides – or at least Domagk’s fastidious experimental method – were crucial to the early discovery of the power of antibiotics.

  In the same year patients receiving large doses of sulphonamides reported the unusual side-effect of passing considerable amounts of urine – because, it eventually transpired, the sulphonamide blocked an enzyme in the kidney. Several serious medical conditions result from increased fluid in the tissues, including the breathlessness of heart failure and the waterlogging of the tissues of kidney failure, while others, such as raised blood pressure and the blinding condition glaucoma, benefit from fluid reduction. There was at the time no effective treatment for any of these conditions, but this unusual side-effect encouraged the chemists to start playing around with the sulphonamide molecule, leading in rapid succession to the discovery of the powerful drug for lowering the blood pressure, bendrofluazide, diuretics or water pills (frusemide) for the treatment of heart and kidney failure, and acetazolamide for the treatment of glaucoma.9,10

  1941: F. V. McCallum at the Johns Hopkins Medical School in Baltimore observed that a variant of sulphonamide markedly increased the size of the thyroid gland in rats. Investigating the matter further, he found that a chemical, thiourea, blocked the synthesis of thyroxine, leading to its logical therapeutic use as a treatment for an overactive thyroid or thyrotoxicosis.11 In the same year, another sulphonamide was found to prevent the growth of the leprosy organism in mice. This in turn led to the rediscovery of another sulphonamide, dapsone, which remains the mainstay of anti-leprosy treatment.12

  1942: Marcel Janbon of the Medical Faculty at Montpellier University observed that a derivative of a sulphonamide used for the treatment of typhoid fever made some patients extremely ill by lowering their blood sugar. This raised the possibility that it might be of use in the treatment of diabetes, leading to a group of drugs, the sulphonylureas, which, along with insulin, became a mainstay of treatment.13

  1946: A further variant of sulphonamides was discovered to be weakly effective against malaria, which led to the introduction of proguanil, still used in the prevention of this terrible illness.14

  Now it is possible to get a sense of how synthetic chemistry was going to transform medicine. We start with sulphonamides, a simple compound made of sulphur, hydrogen, nitrogen, carbon and oxygen atoms ‘accidentally’ discovered by Domagk, which revolutionised the treatment of a whole range of infectious illnesses. Then, over the next twenty years, the same sulphonamides were responsible in one way or another for the discovery of treatments for hypertension (thus reducing the incidence of stroke), diabetes, heart failure, glaucoma, thyrotoxicosis, malaria and leprosy. Further, the discovery of their mode of action opened up, through the work of Hitchings and Elion, a vast new field of therapeutics, leading to treatments for cancer, gout and viral illnesses.

  The fortuitous, serendipitous and accidental nature of drug discovery is amply illustrated in many of the definitive moments already discussed. It could not have happened any other way, as scientific understanding of disease was much too limited to provide an intellectual basis for the purposive design of drugs. Regrettably it is simply not possible to begin to describe in any detail the way in which this cornucopia of new drugs transformed every aspect of medicine. Some idea of the extraordinary scale of this phenomenon can, however, be gleaned by referring to the table on page 246 that summarises, decade by decade, the achievements of this golden age of drug discovery across the entire spectrum of medicine, the vast majority of which are still in use today.

  The one final ingredient necessary for the realisation of the full promise of the application of chemistry in medicine was a highly competitive ‘capitalist’ pharmaceutical industry. The lessons of the discovery of the sulphonamides, antibiotics and cortisone were not lost on the drug companies. The potential market for these drugs was so vast, the profits to be made from just one discovery so enormous, that they started to invest massively in research, recruiting every chemist they could lay their hands on. The potential of synthesising new chemical compounds seemed virtually limitless, and the whole process was best achieved by placing it on an industrial footing. This was a high-risk venture with no guarantee of return on the investment – as there was no predicting where the next discovery might come from – but ‘risk’ is what capitalism is all about. The dynamics of the therapeutic revolution owed more to a synergy between the creative forces of capitalism and chemistry than to the science of medicine and biology.

  The atmosphere of optimism and energy driving the therapeutic revolution forward is illustrated by events at the drug company Upjohn soon after Philip Hench’s revelation of the therapeutic benefits of cortisone. Its cost, because of its method of extraction, was still extremely high, so vast financial rewards would go to the drug company that found a cheaper way of manufacturing it:

  Top members of the research division mapped out a massive seven-pronged programme aimed at finding practical new methods of producing cortisone on an industrial scale. One group would work on a modification of the method of producing cortisone from bile-acids employed by Merck [who had originally provided the cortisone used by Philip Hench]. A second would make an effort to synthesise cortisone from simple raw materials, such as coal tar chemicals. A third team would work on the chemical conversion of an easily obtained steroid made by yeasts to cortisone; the fourth would seek to prepare the hormone from a steroid compound extracted from the root of the green hellebore. A fifth group of researchers would look into the possibility of obtaining the hormone from steroids made by the adrenal cortex gland with the aid of enzymes. Another group would see whether micro-organisms might be enlisted in carrying out a particularly difficult step in the cortisone synthesis. And as a final measure, Upjohn would take part in an expedition to Africa, one of six sent by pharmaceutical companies and government agencies, in search of the famous lost strophanthus vine, whose seed was reported to contain a substance from which cortisone might be made easily – if the vine could be found and cultivated. Well over half of the three hundred people in the research division were thrown into the cortisone fray. It was the biggest research gamble the company had ever undertaken.15

  The golden age of drug discovery, 1940–75

  Compounding this frenetic activity was what can best be called the multiplier effect. The more chemicals that were synthesised and the more drugs that were produced, so the greater the chance for the ‘accidental’ observation, whether in the laboratory or on the ward, that would draw attention to other useful therapeutic avenues to explore. The consequences can be seen in every new edition of the doctor’s therapeutic bible, the Pharmacopoeia, as by the 1960s over 100 new drugs were being registered each year. But this process could not go on indefinitely. Sooner or later the chemists must run out of new chemicals to test – and then what would happen?

  4

  TECHNOLOGY’S TRIUMPHS

  Technology, along with drug discovery, shares the prize for the massive expansion of medicine in the post-war years. They are similar, in that both provided empirical solutions to the problems of disease without the necessity for a profound understanding of its nature or causes. They differ, however, in that the manner of technological innovation is almost the precise opposite of drug discovery for, whereas most drugs were discovered by ‘accident’, technological solutions are, by definition, highly intentional, specific answers to well-defined problems.

  Many medical problems proved highly amenable to technological solutions, which subdivide into three categories: Life-sustaining, Diagnostic and Surgical (see opposite). Several have already been encountered, notably the life-sustaining technologies of the intensive-care unit, and particularly the ventilator machine which, by ensuring adequate oxygenation of tissues, can keep people alive during an acute illness until their physiological functions have recovered. Dialysis and the heart pacemaker extend this principle to keeping alive for many years those with chronic illnesses, such as kidney failure or potentially lethal abnormalities of heart rhythm.

  Next, the new methods of diagnostic technology permitted doctors to scrutinise every nook and cranny of the body. The brain, thanks to the CT and MRI scanners, can now be seen with a haunting clarity, while the foetus that previously grew hidden from view within the womb can, thanks to ultrasound, be observed virtually from the moment of conception.

  Finally, the surgical technology of the pump and joint replacements, as already described, created respectively the entirely new specialty of cardiac surgery and transformed the scope of orthopaedics.

  Three forms of medical technology

  LIFE-SUSTAINING1

  Intensive care

  Ventilator

  Dialysis

  Pacemakers

  DIAGNOSTIC2

  CT scanner

  MRI scanner

  Ultrasound

  PET scanner

  Angiography

  Cardiac catheterisation

  SURGICAL3

  Joint replacement

  Intraocular lens implant

  Cochlear implant

  The pump

  Operating microscope

  Endoscopy

  The significance of these technological innovations needs no elaboration. But the most important of all, in the comprehensiveness of its effects, was optics. The Zeiss operating microscope and the endoscope would permit surgeons not only to ‘see more’ but also to ‘do more’, and in the process would have a major impact across the whole range of surgical disciplines: ENT surgery, ophthalmology, neurosurgery, plastic surgery, replantation surgery, gynaecology, orthopaedics and abdominal surgery.

  The Operating Microscope

  The possibilities of the operating microscope are best conceived by thinking of a household pin whose head is slightly less than 1 mm or 1/20 inch in diameter. Then imagine the pinhead is an artery that has to be sewn on to another artery of the same dimensions. It can’t be done. But if the same pinhead-sized arteries are viewed through an operating microscope and magnified twenty-fold then, to the surgeon’s eye, both ends now appear to be an inch in diameter, and with delicate instruments they can be sewn together. Welcome to the world of microsurgery.4

  Microsurgery effectively started in 1954, when the German optics company Zeiss produced the first binocular surgical microscope. Among the first to see its possibilities were the ear, nose and throat (ENT) surgeons, for reasons that are readily apparent when leaning over the surgeon’s shoulder to watch an operation on a patient with deafness due to hardening of the bones of the middle ear. Peering down the ear canal, the eardrum is readily identifiable as a curtain of tissue. The surgeon takes a knife and cuts away its lower half and then lifts it upwards like a tent flap to expose beneath the three small bones of the middle ear, the most distant of which, the stapes (so called because of its physical resemblance to a rider’s stirrup), is in close proximity to the organ of hearing, the cochlea of the inner ear. In otosclerosis, as this patient’s condition is called, the stapes becomes immobile and is unable to transmit the vibrations of sound from the eardrum. The surgeon duly mobilises the stapes by drilling a minute hole through its centre and dropping through a piston whose movements can now transmit the vibrations. It requires little imagination to appreciate how this and similar delicate operations, performed deep within the ear, can only be reliably accomplished with an operating microscope.5

 
