Lifeblood

by Alex Perry


  Malaria, then, is perpetual, a global scourge, a driver of human history, even a determinant of human evolution. To this day, a third of West Africans carry the sickle cell trait, an often deadly mutation of red blood cells that might have been eliminated through natural selection but for the protection it offers against the most severe attacks of malaria.9

  If much human history can be told through the story of malaria, that’s just as true of the annals of medicine. The Western world’s first physician, Hippocrates, lived from 460 to 370 BC in ancient Greece and earned that title from his studies of malaria. He was the first to distinguish malaria from other types of fever, such as typhoid. Some of his conclusions were wide of the mark: he claimed there were two types of malaria fever, those that fell on even days and those that fell on odd days. More accurately, he noted the disease’s ability to enlarge the spleen and made a connection between it, the marshes outside Athens, and the onset of rain in the autumn.10

  It would be another two thousand years before the disease was associated not with stagnant water and its vapors but with the mosquitoes that bred in it. But then, in the late nineteenth century, the discoveries came all at once. In 1880 Frenchman Charles Louis Alphonse Laveran identified the malaria parasite in a military hospital in Algeria—though he was initially widely disbelieved. In 1885, Italian Camillo Golgi established there was more than one type of malaria. And in a series of experiments in 1897 and 1898, a British imperial physician, Surgeon Major Ronald Ross, discovered the malaria parasite in the dissected gut of a mosquito that had bitten a malarious patient, thus proving how malaria is transmitted. Also in 1898, an Italian biologist, Giovanni Battista Grassi, did the same by passing malaria into a human volunteer through the bite of an infected mosquito. Ross was awarded the Nobel Prize for medicine in 1902. Laveran won the same prize for his work on malaria in 1907.

  Though we did not understand how malaria worked until the late nineteenth century, we had by then been treating it successfully for hundreds of years. In 1631, a Jesuit apothecary in Peru called Agostino Salumbrino sent back to Rome the bitter bark from the cinchona tree, explaining that Peruvians used it as a cure for fever. As Fiammetta Rocco writes in her history of cinchona, The Miraculous Fever Tree, its alkaloid extract, quinine, became “the modern world’s first real pharmaceutical drug.”11 A later synthetic version, chloroquine, became the ubiquitous malaria treatment of the twentieth century. In time, the antimalarial properties of an extract of the Chinese sweet wormwood tree, artemisinin, would also become known to a wider world.

  After Laveran, Grassi, and Ross had shown how malaria was transmitted, we could not only treat malaria but also prevent it. As more was discovered about mosquitoes, we learned that males were harmless, feeding on nectar and plant juices, and that only females needed the nutrients of a blood meal to produce eggs. Female Anopheles had adapted themselves to feeding almost exclusively on humans, generally at night when their prey was asleep and stationary. They located their victims with sensors that followed heat and the plume of carbon dioxide in a person’s breath. In theory, that made prevention easy. For people: a nighttime regimen of long-sleeved shirts, full-length pants, and socks, and nets hung around beds. For the landscape: the draining of marshes, swamps, and ponds; the oiling of puddles; the fumigating of buildings and hedgerows; and the screening of doors and windows.

  Straightforward, perhaps. But given malaria’s global spread and the limited resources of the time, unfeasible. As Richard Feachem, the first executive director of the Global Fund to Fight AIDS, Tuberculosis and Malaria, says: “People forget today, but even as recently as the end of World War II, every country in the world had endemic malaria transmission within its borders, even inside the Arctic Circle.”12 Eradication efforts initially concentrated on limited areas where it made economic sense. The first took place in Panama, scene of Scotland’s earlier misfortune. In 1904, the US took on the long-standing project to build a canal between the Atlantic and the Pacific and—since malaria had defeated all previous attempts—employed a US military doctor, William Crawford Gorgas, to sanitize the area. The campaign was abhorrent in its selectivity. Gorgas included white managers and overseers in his efforts but excluded black laborers, who subsequently died in the thousands. But Gorgas’s efforts succeeded as far as they went: malaria and yellow fever deaths among whites plummeted, the Panama Canal was completed in 1914, and the narrow exclusion zones in which whites lived and worked remained malaria-free for decades.13

  In Europe and the US in the early twentieth century, malaria also declined rapidly. This was due not just to health campaigns. Mosquitoes’ opportunities to breed were also being reduced by the draining of swampland for pasture and the mechanization of agriculture, which prompted former farm workers to migrate en masse to the cities.14 But the malarious battlefields of World War II—around the Mediterranean and in the Middle East and Southeast Asia—forced the disease back onto the health agenda. In 1944, malaria even held up the Allied advance north through Italy after the German army deliberately caused an outbreak by flooding the land around Rome and confiscating local stockpiles of antimalarial drugs. In response to the threat, the US Army recruited Theodor Seuss Geisel, better known as Dr. Seuss, to produce a cartoon pamphlet for the troops featuring a sultry Anopheles drinking what appeared to be a glass of red wine. “This is Ann,” wrote Seuss.

  She drinks blood! Her full name is Anopheles Mosquito and she’s dying to meet you! Ann moves around at night (a real party gal) and she’s got a thirst. No whiskey, gin, beer or rum coke for Ann. She drinks G.I. blood. She can make you feel like a combination of a forest fire, a January blizzard, and an old dish mop . . . and now and then she can knock you flat for keeps.... If you go running around like a strip-teaser, you haven’t got a chance. Bathing and swimming at night where Ann hangs out really is asking for trouble. Head nets, rolled-down sleeves, leggings and gloves may seem like sissy stuff and not so comfortable—BUT a guy out cold from MALARIA is just as stiff as the one who stopped a hunk of steel.... Now IF you really are looking for trouble and you don’t want to miss [out]—just drop down to the nearest native village some evening. The places are lousy with fat little Anns sitting around waiting for you with their bellies full of germs. They stock up on MALARIA bugs from the hometown boys and gals and when they find a nice new sucker they give him the works. So, lay off the native villages if you want to keep the top of your head on.... She’ll bat you down and it won’t be funny.15

  The US Army wanted GIs to cover up. But its generals also knew the best way to fight a war was not protecting your own side but killing the enemy. Carbon, hydrogen, and chlorine had first been synthesized into dichloro-diphenyl-trichloroethane (DDT) in 1874, but it wasn’t until 1939 that the Swiss chemist Paul Hermann Müller discovered the chemical’s properties as an insecticide. (Müller would also win the 1948 Nobel Prize for medicine for his breakthrough.) In his experiments, Müller showed DDT was a long-lasting poison, lethal to insects, and cheap to produce but, miraculously, seemingly harmless to people. The US Army began using it in 1944. Malaria was quickly beaten back, and Germany and Japan were defeated soon afterward.

  When the war ended, DDT went on sale to civilians. As GIs returned with tales of the wonder powder, sales rocketed. Backed by government scientists and wild public excitement, farmers set out to exterminate all insects—less because of the risk of malaria, which was now in abeyance, than out of a simple dislike of creepy crawlies.16 The chemical was sprayed on walls, fields, and swamps and dropped from the air. By 1952 malaria had been eradicated in the US.

  Outside the US, the advent of DDT coincided with the birth of foreign aid. In addition to the Marshall Plan, in 1943 the Allies set up the United Nations Relief and Rehabilitation Administration (UNRRA), the first ever UN organization to administer relief in war zones. That included public health, and in that endeavor they were joined by private philanthropic groups such as the Rockefeller Foundation. After the war, these bodies used DDT to launch a new war on insect-borne disease. In 1946, the expanding UNRRA was split into several more specialized UN agencies, among them the World Health Organization (WHO) and the United Nations Children’s Fund (UNICEF). And it was the WHO that initiated the first Global Malaria Eradication Program, in 1955.

  Persuaded that US-funded malaria eradication was a way to convince the world that Washington was a more beneficent patron than the Soviet Union, in 1958 the US Congress allocated $100 million for a five-year malaria eradication program, an amount it would increase to a total of $490 million by 1963.17 The scale of this effort can be measured by the fact that in the early 1960s a total of ninety-three countries, almost half the world, had American-funded malaria eradication programs that used DDT spraying—conducted by hand, from fumigating trucks, and from the air.

  The results were spectacular. Cases of malaria fell from 3 million in Sri Lanka in 1946 to 7,300 in 1951 to just 18 in 1963;18 life expectancy there rose from forty-three years to fifty-seven. In India, the toll plummeted from 75 million infections and 800,000 deaths in 1947 to 100,000 infections and fewer than 100 deaths in 1965.19 In Sardinia, an early test case, infections plunged from a pre-eradication 75,000 in the 1940s to 9 in 1951.20 Over the duration of the campaign, eighteen countries around the world became malaria-free, accounting for 32 percent of humankind. Malaria cases fell from 350 million to 100 million.21 Our most historic disease, it was confidently predicted, would soon itself be history.

  So what happened? Why, half a century after we looked set to rid the planet of malaria, does it still sicken half a billion people a year and kill a million? The uncomfortable truth is that what should have been one of our great successes became one of our greatest failures. After its initial stunning success, the WHO’s global eradication campaign ran out of steam. Then it was abandoned altogether.

  Why? First, spraying wasn’t always popular with the people it was supposed to protect. DDT killed insects, certainly. But it also killed chickens and cats that ate the poisoned bugs. (In an incident lampooned for years afterward by environmentalists, in early 1960 Britain’s Royal Air Force parachuted twenty cats in wicker baskets into a remote Borneo village to compensate for those killed by DDT.)22 Teams of sprayers, sent by central governments and foreigners, found a cold welcome. Dispirited, they began overspraying in the mornings so they could stop early, taking bribes not to spray, or even selling their DDT in markets. Surveillance teams checking up on the sprayers were similarly uncommitted: many failed to visit remote areas or lacked the skills to collect and analyze blood samples properly. Partly because DDT was not being applied with sufficient thoroughness, and partly because farmers were also using it as a crop pesticide, natural selection began to favor mosquitoes resistant to low levels of DDT. “What didn’t kill them only made them stronger,” writes the malaria historian Sonia Shah in The Fever.23

  There were other reasons to turn against DDT. Research revealed rising concentrations of the stuff in the food chain. It was killing birds and contaminating milk. In 1962, the environmentalist Rachel Carson published Silent Spring, which suggested pesticides such as DDT were endangering the entire ecosystem and pushing species such as the bald eagle, the US national bird, toward extinction. The book became one of the most influential of the twentieth century. Not only did it change opinion on DDT forever—the US stopped funding the WHO campaign in 1963 and banned the substance altogether in 1972—but Carson’s vision of the planet as an interconnected, living organism also became the intellectual foundation of the ecological movement.

  But what doomed the WHO’s campaign from the start was that, in reality, it was not global at all. Incredible as it might seem today, the WHO simply left out Africa. The WHO had a procedural excuse for the omission: most of Africa was still under colonial rule, at least at the start of the campaign, and the WHO’s mandate did not cover colonies. But there was little mystery as to the real reason. Staff at the WHO wanted a success. So they chose not to include the continent with the worst infrastructure and capacity—and, not coincidentally, the worst malaria. That heartless, circular logic—we can’t help the poorest because their poverty makes it too hard—was matched by the ruthless calculus of the eradication program’s main funder: the US deemed Africa less critical to the outcome of the Cold War than, say, Central America or Southeast Asia.24

  The legacy of the WHO campaign was mixed. By its end, large parts of the world, notably the US, Europe, Russia, and parts of Asia like the Korean peninsula, were malaria-free. Today that list has grown. As Feachem says, the entire planet had malaria in 1945 but now “100 countries have got rid of it, and there are 100 malarious countries.”25

  But in those countries where the disease survived, the campaign bequeathed a more powerful mosquito, inured to DDT, and a malaria parasite increasingly resistant to chloroquine. Unsurprisingly, the disease rebounded. Sri Lanka was back to 500,000 cases by 1969; India resurged to a million.26 Worse, in its African epicenter, malaria was untouched. In 1969, the WHO formally dumped eradication as a goal and switched to learning to live with the disease through treatment and scattershot prevention.

  The campaign may have failed to kill malaria. But it was murder on malariology. When DDT seemed, at least for a few years, finally to have solved a problem that had dogged medicine for millennia, the field lost much of its interest for doctors. Malariologists changed disciplines. Few new doctors felt the specialty worth pursuing. Within a few years, there was just a handful of practicing malaria experts in the world.

  Brian Greenwood shouldn’t have been one of them. A 1962 graduate of London’s Middlesex Hospital, Greenwood says he had intended to follow his contemporaries to Harley Street, the exclusive London address of Britain’s best private doctors. Greenwood—tall, charming, unfailingly polite—is from that generation of Englishmen who view introspection as poor manners. “Something was not quite right, so in 1965 I went to Africa,” is all he will say of the decision that changed his life and those of millions of others. The non sequitur is as apparent to him as to anyone. “I still don’t know why I did that,” he adds.27

  After a short spell in Zambia, “where I was inspired by independence and [anticolonial leader Kenneth] Kaunda,” Greenwood took a job as a medical registrar at a teaching hospital in Ibadan in the sweltering forests of southern Nigeria. Overstaffing in Ibadan—there were eight doctors for thirty patients—left plenty of time for immunology research, which was developing into a passion for Greenwood. He also saw a bright future in African medicine. He expected, he says, “that Africa would soon have a lot more hospitals like that. The indicators at the time showed we were better than Thailand or Singapore.”

  It wouldn’t work out that way. As so often in postindependence Africa, civil war blighted expectations. In 1967 Nigeria was split by the Biafran conflict, which was to last for three years. “The doctors who were Biafrans left, and I suddenly had more responsibility,” says Greenwood. Safety was a concern, of course. But the lure of more patients on whom to conduct research was overpowering. Greenwood’s studies consumed him. “I began to wonder why autoimmune diseases were so high,” he says. “Since malaria had such an impact on the immune system, could the answer be malaria?”

  After two years, Greenwood returned to London with the data and samples he had gathered. Three years of lab work proved preventing malaria was key to improving health in Africa: stopping malaria, and so blocking its damaging impact on the immune system, also checked a host of other diseases. “That was it,” says Greenwood. “I was hooked by Africa and by malaria. It’s such a beautiful and clever parasite. I was offered a master’s position in Canada. But they had a new medical school in northern Nigeria, and I went there.”

  By now Greenwood was not so much choosing a path less trodden as forging one never trodden before. Newly married, with children, he brought his family with him to Africa and the new medical school at Ahmadu Bello University in the remote town of Zaria in the heat and dust on the southern edge of the Sahara. Few other doctors were willing to do the same. In addition to the dubious attractions of specializing in a small and shrinking field and working at a teaching hospital in northern Nigeria, Greenwood couldn’t offer much pay. “We struggled,” he says. “There was never much money, and we were always fighting off attempts to close us down.”

  But the obscurity and difficulty of the work acted like a filter for quality. Only the best and most dedicated applied. A small group of malariologists began to gather around Greenwood in Nigeria, and when in 1980 he moved to the Medical Research Council Laboratories in Banjul, The Gambia, they followed. Among Greenwood’s disciples: Eldryd Parry, future Cambridge professor and founder of the Tropical Health and Education Trust; David Warrell, later president of the Royal Society of Tropical Medicine and Hygiene and professor emeritus at Oxford University; Bob Snow, future professor of tropical public health at Oxford and head of the Malaria Public Health and Epidemiology Group in Nairobi; Kevin Marsh, who now runs the KEMRI-Wellcome Research Programme in Kilifi, Kenya; and Pedro Alonso, who founded the Manhiça Health Research Centre in Mozambique to develop malaria vaccines. “It was definitely not the thing to do,” says Greenwood of his small band of malariologists. “You had to be pretty sure that’s what you wanted. Most of their colleagues thought these guys were crazy, just stepping off the ladder for a job in a tiny field where you were lucky if you got £20,000 a year. But if you were interested in malaria, there was nowhere else to go, and as a result, we gathered a very small, very dedicated group of experts. Pretty much everybody went on to become famous.” Greenwood pauses for a moment of reflection. “I have to say I really enjoyed it. I had a lovely time.”

 
