Word that the bark of the quina tree could cure malaria quickly spread to Europe. In 1633 Father Antonio de la Calancha recorded the amazing properties of the bark of the “fever tree,” and other members of the Jesuit order in Peru began using quina bark to both cure and prevent malaria. In the 1640s Father Bartolomé Tafur took some of the bark to Rome, and word of its miraculous properties spread through the clergy. The papal conclave of 1655 was the first without a death from malaria among the attending cardinals. The Jesuits were soon importing large amounts of the bark and selling it throughout Europe. Despite its excellent reputation in other countries, “Jesuit’s powder”—as it became known—was not at all popular in Protestant England. Oliver Cromwell, refusing to be treated with a papist remedy, succumbed to malaria in 1658.
Another remedy for malaria gained prominence in 1670, when Robert Talbor, a London apothecary and physician, warned the public to beware of the dangers associated with Jesuit’s powder and started promoting his own secret formulation. Talbor’s cure was taken to the royal courts of both England and France; his own king, Charles II, and the son of Louis XIV, the French king, both survived severe bouts of malaria thanks to Talbor’s amazing medication. It was not until after Talbor’s death that the miraculous ingredient in his formulation was revealed; it was the same cinchona bark as in Jesuit’s powder. Talbor’s deceit, while no doubt making him wealthy—presumably his main motive—did save the lives of Protestants who refused to be associated with a Catholic cure. That quinine cured the disease known as ague is taken as evidence that this fever, which had plagued much of Europe for centuries, was indeed malaria.
Through the next three centuries malaria—as well as indigestion, fever, hair loss, cancer, and many other conditions—was commonly treated with bark from the cinchona tree. It was not generally known what plant the bark came from until 1735, when a French botanist, Joseph de Jussieu, while exploring the higher elevations of the South American rain forests, discovered that the source of the bitter bark was various species of a broad-leafed tree that grew as high as sixty-five feet. It was a member of the Rubiaceae, the same family as the coffee tree. There was always great demand for the bark, and its harvesting became a major industry. Although it was possible to gather some of the bark without killing the tree, greater profits could be made if the tree was felled and all the bark stripped. By the end of the eighteenth century an estimated 25,000 quina trees were being cut down each year.
With the cost of cinchona bark high and the source tree possibly becoming endangered, isolating, identifying, and manufacturing the antimalarial molecule became an important objective. Quinine is thought to have been first isolated, although probably in an impure form, as far back as 1792. Full investigation of what compounds were present in the bark started around 1810, and it was not until 1820 that researchers Joseph Pelletier and Joseph Caventou managed to extract and purify quinine. The Paris Institute of Science awarded these French chemists a sum of ten thousand francs for their valuable work.
Cinchona tree from whose bark quinine is obtained. (Photo courtesy of L. Keith Wade)
Among the almost thirty alkaloids found in cinchona bark, quinine was quickly identified as the active ingredient. Its structure was not fully determined until well into the twentieth century, so early attempts to synthesize the compound had little chance of success. One of these was the effort of the young English chemist William Perkin (whom we met in Chapter 9) to combine two molecules of allyltoluidine with three oxygen atoms to form quinine and water.
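Perkin’s reasoning was purely arithmetical. A reconstruction of the equation he hoped would hold might be written as follows (the atom bookkeeping balances on paper, but no such reaction exists; molecular formulas say nothing about how the atoms are actually connected):

$$2\,\mathrm{C_{10}H_{13}N} + 3\,\mathrm{O} \;\longrightarrow\; \mathrm{C_{20}H_{24}N_{2}O_{2}} + \mathrm{H_{2}O}$$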
Perkin worked in 1856 on the basis that the formula of allyltoluidine (C₁₀H₁₃N) was almost half that of quinine (C₂₀H₂₄N₂O₂); his experiment was doomed to fail. We now know that the structure of allyltoluidine and the more complicated structure of quinine are as follows:
While Perkin failed to make quinine, his work was extremely fruitful for making mauve—and money—for the dye industry and for the development of the science of organic chemistry.
As the Industrial Revolution brought prosperity to Britain and other parts of Europe during the nineteenth century, capital became available to tackle the problem of unhealthy, marshy farmland. Extensive drainage schemes turned bogs and fens into more productive farms, meaning that less stagnant water was available for breeding mosquitoes, and the incidence of malaria decreased in regions where it had been most prevalent. But the demand for quinine did not decrease. On the contrary, as European colonization increased in Africa and Asia, there was more demand for protection against malaria. The British habit of taking quinine as a prophylactic precaution against malaria developed into the evening “gin and tonic”—the gin being considered necessary to make the bitter-tasting quinine in the tonic water palatable. The British Empire depended on supplies of quinine, as many of its most valuable colonies—in India, Malaya, Africa, and the Caribbean—were in regions of the world where malaria was endemic. The Dutch, French, Spanish, Portuguese, Germans, and Belgians also colonized malarial areas. Worldwide demand for quinine was enormous.
With no synthetic route to quinine in sight, a different solution was sought—and found: the cultivation of cinchona species from the Amazon in other countries. Profit from the sale of cinchona bark was so great that the governments of Bolivia, Ecuador, Peru, and Colombia, in order to maintain their monopoly over the quinine trade, prohibited the export of living cinchona plants or seeds. In 1853, the Dutchman Justus Hasskarl, director of a botanical garden on the island of Java in the Dutch East Indies, managed to smuggle a bag of seeds from Cinchona calisaya out of South America. They were successfully grown in Java, but unfortunately for Hasskarl and the Dutch, this species of the cinchona tree had a relatively low quinine content. The British had a similar experience with smuggled seeds from Cinchona pubescens, which they planted in India and Ceylon. The trees grew, but the bark had less than the 3 percent quinine content needed for cost-effective production.
In 1861, Charles Ledger, an Australian who had spent a number of years as a quina bark trader, managed to persuade a Bolivian Indian to sell him seeds of a species of cinchona tree that supposedly had a very high quinine content. The British government was not interested in buying Ledger’s seeds; their experience with cultivation of cinchona had probably led them to decide that it was not economically viable. But the Dutch government purchased a pound of seeds of the species, which became known as Cinchona ledgeriana, for about twenty dollars. Although the British had made the smart choice nearly two hundred years previously in ceding the isoeugenol molecule of the nutmeg trade to the Dutch in exchange for the island of Manhattan, it was the Dutch who made the correct call this time. Their twenty-dollar purchase has been called the best investment in history, as the quinine levels in Cinchona ledgeriana bark were found to be as high as 13 percent.
The C. ledgeriana seeds were planted in Java and carefully cultivated. As the trees matured and their quinine-rich bark was harvested, the export of native bark from South America declined. It was a scenario repeated fifteen years later, when smuggled seeds from another South American tree, Hevea brasiliensis, signaled the demise of indigenous rubber production (see Chapter 8).
By 1930 over 95 percent of the world’s quinine came from plantations on Java. These cinchona estates were hugely profitable for the Dutch. The quinine molecule, or perhaps more correctly the monopoly on the cultivation of the quinine molecule, almost tipped the scales of World War II. In 1940 Germany invaded the Netherlands and confiscated the complete European stock of quinine from the Amsterdam premises of the “kina bureau.” The 1942 Japanese conquest of Java further imperiled the supply of this essential antimalarial. American botanists, led by Raymond Fosberg of the Smithsonian Institution, were sent to the eastern side of the Andes to secure a supply of quina bark from trees still growing naturally in the area. Although they did manage to procure a number of tons of bark, they never found any specimens of the highly productive Cinchona ledgeriana with which the Dutch had had such astounding success. Quinine was essential to protect Allied troops in the tropics, so once again its synthesis—or that of a similar molecule with antimalarial properties—became extremely important.
Quinine is a derivative of the quinoline molecule. During the 1930s a few synthetic derivatives of quinoline had been created and had proven successful at treating acute malaria. Extensive research on antimalarial drugs during World War II identified a 4-aminoquinoline derivative—now known as chloroquine, and originally made by German chemists before the war—as the best synthetic choice.
Both quinine (left) and chloroquine (right) incorporate (circled) the quinoline structure (center). The chlorine atom in chloroquine is arrowed.
Chloroquine contains a chlorine atom—another example of a chlorocarbon molecule that has been extremely beneficial to humanity. For over forty years chloroquine was a safe and effective antimalarial drug, well tolerated by most people and with little of the toxicity of the other synthetic quinolines. Unfortunately, chloroquine-resistant strains of the malaria parasite have spread rapidly in the past few decades, reducing the effectiveness of chloroquine, and compounds such as Fansidar and mefloquine, with their greater toxicity and sometimes alarming side effects, are now being used for malarial protection.
THE SYNTHESIS OF QUININE
The quest to synthesize the actual quinine molecule was supposedly fulfilled in 1944, when Robert Woodward and William Doering of Harvard University converted a simple quinoline derivative into a molecule that previous chemists, in 1918, had allegedly been able to transform into quinine. The total synthesis of quinine was finally presumed complete. But this was not the case. The published report of the earlier work had been so sketchy that it was not possible to ascertain what had really been done and whether the claim of chemical transformation was valid.
Organic natural product chemists have a saying: “The final proof of structure is synthesis.” In other words, no matter how much the evidence points to the correctness of a proposed structure, to be absolutely sure it is correct, you have to synthesize the molecule by an independent route. And in 2001, 145 years after Perkin’s now-famous attempt to make quinine, Gilbert Stork, professor emeritus at New York’s Columbia University, together with a group of coworkers, did just that. They started with a different quinoline derivative, they followed an alternative route, and they carried out each and every step of their synthesis themselves.
As well as being a reasonably complicated structure, quinine, like many other molecules made in nature, presents the particular challenge of determining which way various bonds around certain of the carbon atoms are positioned in space. The quinine structure has an H atom pointing out of the page (indicated by a solid wedge) and an OH directed behind the page (indicated by a dashed line) around the carbon atom adjacent to the quinoline ring system.
The quinine molecule
An example of the different spatial arrangements of these bonds is shown below for quinine and a version inverted around the same carbon atom.
Quinine (left) and the very similar version (right) that would also be synthesized in the laboratory at the same time as quinine
Nature often makes only one of a pair of compounds like this. But when chemists attempt to make the same molecule synthetically, they cannot avoid making an equal mixture of the two. Because they are so similar, separating the two molecules of the pair is tricky and time-consuming. There are three other carbon atom positions in the quinine molecule where both the natural and the inverted versions are unavoidably made during a laboratory synthesis, so these painstaking operations must be repeated four times in all. It was a challenge that Stork and his group overcame—and there is no evidence that the problem was even fully appreciated in 1918.
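A back-of-envelope count shows the scale of the problem (our arithmetic, assuming each of the four carbon centers independently comes out in one of its two possible spatial arrangements): an uncontrolled synthesis can yield up to

$$2^4 = 16$$

distinct spatial versions of the molecule, only one of which is natural quinine.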
Quinine continues to be harvested from plantations in Indonesia, India, Zaire, and other African countries, with lesser amounts coming from natural sources in Peru, Bolivia, and Ecuador. Its main uses today are in quinine water, tonic water, and other bitter drinks and in the production of quinidine, a heart medication. Quinine is still thought to provide some measure of protection against malaria in chloroquine-resistant regions.
MAN’S SOLUTION TO MALARIA
While people were searching for ways to harvest more quinine or make it synthetically, physicians were still trying to understand what caused malaria. In 1880 a doctor in the French army in Algeria, Charles-Louis-Alphonse Laveran, made a discovery that ultimately opened the way for a new molecular approach to the fight against this disease. Laveran, using a microscope to check slides of blood samples, found that the blood of patients with malaria contained cells that we now know are a stage of the malarial protozoan Plasmodium. Laveran’s findings, initially dismissed by the medical establishment, were confirmed over the next few years with the identification of P. vivax and P. malariae, and later P. falciparum. By 1891 it was possible to identify the specific malaria parasite by staining the Plasmodium cell with various dyes.
Although it had been hypothesized that mosquitoes were somehow involved in the transmission of malaria, it was not until 1897 that Ronald Ross, a young Englishman who was born in India and was serving as a physician in the Indian Medical Service, identified another life stage of Plasmodium in gut tissue of the anopheles mosquito. Thus the complex association between parasite, insect, and man was recognized. It was then realized that the parasite was vulnerable to attack at various points in its life cycle.
The life cycle of the Plasmodium parasite. The merozoites periodically (every 48 or 72 hours) break out of their host’s red blood cells, causing a fever to spike.
There are several possible ways to break this disease cycle, such as killing the merozoite stage of the parasite in the liver and blood. Another obvious line of attack is the disease “vector,” the mosquito itself. This could involve preventing mosquito bites, killing adult mosquitoes, or preventing them from breeding. It is not always easy to avoid mosquito bites; in places where the cost of reasonable housing is beyond the means of most of the population, window screens are just not feasible. Nor is it practical to drain all stagnant or slow-moving waters to prevent mosquitoes from breeding. Some control of the mosquito population is possible by spreading a thin film of oil over the surface of water, so mosquito larvae in the water cannot breathe. Against the anopheles mosquito itself, however, the best line of attack is powerful insecticides.
Initially the most important of these was the chlorinated molecule DDT, which acts by interfering with a nerve control process unique to insects. For this reason, DDT—at the levels used as an insecticide—is not toxic to other animals, even while it is lethal to insects. The estimated fatal dose for a human is thirty grams. This is a considerable amount; there are no reported human deaths from DDT.
The DDT molecule
Thanks to a variety of factors—improved public health systems, better housing, fewer people living in rural areas, widespread drainage of stagnant water, and almost universal access to antimalarial drugs—the incidence of malaria had, by the early years of the twentieth century, greatly decreased in western Europe and North America. DDT was the final step necessary to eliminate the parasite in developed countries. In 1955 the World Health Organization (WHO) began a massive campaign using DDT to eliminate malaria from the rest of the world.
When DDT spraying started, about 1.8 billion people lived in malarial areas. By 1969 malaria had been eradicated for nearly 40 percent of these people. In some countries the results were phenomenal: in 1947 Greece had approximately two million cases of malaria, while in 1972 the number was seven. If any one molecule can be said to be responsible for the increase in economic prosperity of Greece during the last quarter of the twentieth century, it surely must be DDT. Before DDT spraying began in India, in 1953, there were an estimated 75 million cases a year; by 1968 there were only 300,000. Similar results were reported from countries all around the world. It was no wonder that DDT was considered a miracle molecule. By 1975 the WHO had declared Europe to be malaria free.
As it was such a long-lasting insecticide, treatment every six months—or even yearly where the disease was seasonal—was sufficient to give protection against the disease. DDT was sprayed on the inside walls of houses where the female mosquito clung, waiting for nighttime to seek her meal of blood. DDT stayed in place where it was sprayed, and it was thought that very little would ever make its way into the food chain. It was an inexpensive molecule to produce, and it seemed at the time to have little toxicity for other forms of animal life. Only later did the devastating effect of DDT bioaccumulation become obvious. We have also since realized how overuse of chemical insecticides can upset ecological balance, causing more serious pest problems.
Although the WHO’s crusade against malaria had initially looked promising, the global eradication of the parasite proved more difficult than expected for a number of reasons, including the development of resistance to DDT by the mosquito, human population increase, ecological changes that reduced the number of species preying on mosquitoes or their larvae, war, natural disasters, decline of public health services, and the increase of Plasmodium resistance to antimalarial molecules. By the early 1970s the WHO had abandoned its dream of complete eradication of malaria and concentrated its efforts on control.