Plagues and Peoples


by William H. McNeill


  A number of other infectious diseases of long-standing importance also quickly succumbed to the new techniques bacteriologists had learned to command. Thus typhoid fever was first identified as a distinct disease in 1829; its causative bacillus was discovered and an effective vaccine developed by 1896; and in the first decade of the twentieth century mass inoculations against typhoid proved capable of checking the disease. Diphtheria bacilli were identified in 1883, and an antitoxin was proved effective in 1891. Bacilli in milk were brought under control by pasteurization, i.e., heating the milk to a temperature at which most potentially harmful bacteria were killed. The city of Chicago made this method of guarding infants and others against milk-borne infection legally compulsory in 1908. It was the first major city to do so, but others swiftly followed suit, so that this source of infection also ceased to be significant before World War I.72

  Other infections proved more difficult to deal with. From the 1650s European doctors had been aware that the debilitating symptoms of malaria could be suppressed by drinking an infusion prepared by soaking the bark of the cinchona tree, a native of South America, in water or some other solvent. (The medically active agent in the infusion was later known as quinine.) But confusion over the identity of the tree that actually yielded the healing bark, together with commercial adulteration of the supply, later discredited the cure. This was especially true among Protestants, whose suspicions of the Jesuits, who spread knowledge of the bark around the world, extended also to their cure for malaria.73 Not until 1854, when the Dutch established cinchona plantations in Java, did Europeans command a reliable supply of the right kind of bark. In fact, the penetration of the interior of Africa that became a prominent feature of Europe’s expansion in the second half of the nineteenth century would have been impossible without quinine from the Dutch plantations. They continued to supply the European world until World War II.74 In 1942, when the Japanese seized Java, a concerted effort to discover substitute chemicals for suppressing malaria became necessary, and led to the synthesis of Atabrine and a number of other, and quite effective, drugs.

  Regular ingestion of suitable quantities of quinine allowed human beings to survive in regions where malaria would otherwise have killed them off; but the drug merely suppressed the fever, and did not either prevent or cure the disease. The identity and intricate character of the life cycle of the malarial plasmodium were worked out in the 1890s. No vaccine or antitoxin could be developed, and mosquito control proved so difficult to organize that it was only attempted in a few strategically important places before the 1920s.

  Yellow fever excited even greater attention than malaria, partly because it was more often lethal to susceptible adults, and partly because it threatened to disrupt American imperial expansion into the Caribbean. But yellow fever was a viral disease, so that its causative organism could not be identified by the techniques available to nineteenth-century bacteriologists. Nevertheless, an American medical team headed by Walter Reed went to Cuba to combat the disease, and proved that it was spread by mosquitoes. In 1901 a campaign was launched to eliminate yellow fever from Havana by attacking mosquito-breeding places. The effort proved successful, largely because the medical campaign was backed by the prestige and resources of the United States army.

  In 1901, Havana had but recently broken away from Spanish imperial control as a result of the Spanish-American War (1898). Thereafter the United States’ ambitions and strategic considerations turned decisively toward the Caribbean, as plans for building a canal across the Isthmus of Panama assumed a new vivacity. The French attempt to pierce the isthmus (1881–88) had been abandoned because costs escalated unbearably, as a result of the heavy die-off among the work force from malaria and yellow fever. If a canal were to be successfully constructed, therefore, control of these mosquito-borne diseases became critically important. Hence American political leaders and military commanders concurred in placing hitherto unexampled resources at the disposal of the medical officers entrusted with this task.

  The result was indeed spectacular, for a rigorous and energetic sanitary police—supported and sustained by meticulous observation of mosquito numbers and patterns of behavior—did succeed in reducing these previously formidable killers to trifling proportions. After 1904, when the Canal Zone was legally constituted, United States troops survived quite successfully while garrisoning what had previously been one of the world’s most notorious fever coasts.75

  United States military administrators limited their responsibility to safeguarding the health of American soldiers and did not seriously entertain the larger ambition of combatting yellow fever on a world-wide basis. Yet the opening of the Panama Canal in 1914 offered the prospect—or seemed to, since the relationship between dengue fever and yellow fever was then not understood—that ships passing through the Canal Zone might by ill-chance pick up the yellow fever infection and spread it throughout the Pacific islands and to the coastlands of Asia, where the disease was totally unknown.

  To try to head off such a disaster, the newly established Rockefeller Foundation in 1915 undertook a global program for the study and control of yellow fever. In the ensuing twenty years much was learned about the complexities of the disease. A number of spectacularly successful control programs eliminated foci of infection from the west coast of South America; and the tough ecological system that sustains the disease in its African homeland was sufficiently explored to convince all concerned that elimination of the disease on a global basis was impractical. By 1937, however, the development of a cheap and effective vaccine deprived yellow fever of its former significance for human life.76

  Success against yellow fever encouraged the Rockefeller Foundation to undertake a similar assault upon malaria in the 1920s. Mosquito control of the kind that had driven yellow fever from Caribbean cities proved locally successful in countries such as Greece. But it was not until after World War II, and the discovery of the insecticidal power of DDT, that methods of combatting mosquitoes became cheap enough to affect the world-wide incidence of malaria very significantly. After World War II, administration of anti-malarial campaigns passed from the private hands of the Rockefeller Foundation to the World Health Organization, established in 1948 to carry through just such operations on an official, international basis.

  The sudden lifting of the malarial burden brought about by liberal use of DDT in the years immediately after World War II was one of the most dramatic and abrupt health changes ever experienced by humankind. In some localities resultant changes in population growth rates were both spectacular and in their way as difficult to live with as malaria itself had been.77 In addition, massive distribution of DDT destroyed a wide spectrum of insect life, and sometimes poisoned animals that fed on organisms that had been tainted with the chemical. Another unwished-for and unintended effect was the development of DDT-resistant strains of mosquitoes. But chemists responded by developing new lethal compounds, and so far have been able to evolve such variants faster than insects have been capable of developing tolerances to the chemical assault. All the same, the long-range ecological consequences of this chemical warfare between humans and insects are by no means clear. Nor is it certain that malaria has been permanently subdued, despite the fact that the World Health Organization formally declared it (along with smallpox) to be a principal target for eradication from the face of the earth.78

  Tuberculosis was another infectious disease that proved unusually tenacious. As we saw in Chapter IV, it is possible that pulmonary tuberculosis had acquired a new importance by displacing the bacillus of leprosy among European populations after the fourteenth century. Some authorities think the incidence of the disease among European populations reached a peak in the seventeenth century and declined in the eighteenth, only to crest a second time among the ill-housed and ill-fed inhabitants of industrial towns in the nineteenth century.79 Upper classes were, of course, also liable to infection; and “consumption” actually became fashionable in literary and aesthetic circles in the early decades of the nineteenth century.

  Nevertheless, after about 1850, deaths from tuberculosis, at least in England, had already begun to decline very markedly when Robert Koch won instant fame in 1882 by announcing the discovery of the causative bacillus. Almost forty years later, in 1921, a partially effective vaccine against tuberculosis was finally produced. Long before then, new knowledge of how the disease propagated itself, and systematic efforts to isolate sufferers from consumption in sanatoria, together with such simple methods of prophylaxis as slaughtering milk cattle found to harbor tuberculosis bacilli and prohibiting spitting in public places, had done a good deal to hasten the retreat of pulmonary forms of the infection from western countries.

  On the other hand, tuberculosis remained virulent among a wide variety of previously isolated and primitive peoples brought into contact with outsiders by the continuing evolution of mechanical transport; and in much of Oceania, Asia, and Africa, tuberculosis remains a major source of human debility and death. The development of antibiotic drugs, during and after World War II, that were capable of attacking the bacillus without doing much damage to the human body, meant that in places where modern medical services were available the disease lost its former importance. But since the dramatic retreat of malaria in the post-World War II years, tuberculosis has remained probably the most widespread and persistent human infection in the world at large, with an annual death toll of something like 3.5 million.80

  Successes in discovering relatively cheap and efficacious ways to check these and other less well-known infectious diseases went hand in hand with the spread of more efficient organizations for acting on the new knowledge medical researchers developed so spectacularly. National and local boards of health and medical services proliferated through the world; army medical corps marched alongside (and usually in advance of) their civilian counterparts.

  Decisive breakthroughs in military medical administration came just after the turn of the twentieth century. Until then, in even the best-managed armies, disease was always a far more lethal factor than enemy action, even during active campaigns. In the Crimean War (1854–56), for example, ten times as many British soldiers died of dysentery as from all the Russian weapons put together; and half a century later, in the Boer War (1899–1902), British deaths from disease as officially recorded were five times as great as deaths from enemy action.81 Yet a mere two years thereafter, the Japanese showed what systematic inoculation and careful sanitary police could accomplish. Their losses from disease in the Russo-Japanese War (1904–5) were less than a quarter of deaths from enemy action.82

  This remarkable breakthrough was not lost on other countries. In the course of the next decade, all the world’s important armies made it a standard practice to do what the Japanese had done, i.e., routinely inoculate recruits against a whole battery of common infections—typhoid, smallpox, tetanus, and sometimes others as well. Previously, some European armies had adhered to Napoleon’s example and vaccinated recruits against smallpox as a matter of course. Oddly enough, after 1815 the French did not continue this practice into peacetime, whereas the Prussians did. As a result, in 1870–71, during the Franco-Prussian War, smallpox put about 20,000 French soldiers out of action, whereas their German foes remained immune from the disease.83 What was new in military medicine was not the idea of immunization; rather it was the systematic way it now began to be applied to all infections against which convenient immunization procedures could be devised.

  In the decade before World War I another important medical discovery altered the epidemiology of European armies profoundly, for it was between 1909 and 1912 that the role of the louse in spreading typhus fever was figured out. This, together with systematic immunization against other common infections, was what made the unexampled concentration of millions of men in the trenches of northern France, 1914–18, medically possible. Passing men and clothing through delousing stations became part of the ritual of going to and returning from the front; and this prevented typhus from playing the lethal role on the Western Front that it did, sporadically but dramatically, in the East. Even when typhus did break out on the Eastern Front in 1915, disease losses in the ranks remained well below losses from enemy action as long as organization and discipline remained intact.84 Only when these cracked, as happened among the Serbs in 1915–16 or the Russians in 1917–18, did epidemic disease resume its accustomed lethality among soldiers and civilians alike. Syphilis was the only disease to flourish in the face of a functioning medical corps during World War I. That disease did attain epidemic proportions among British troops, and army doctors failed to handle it effectively at first, more for moral than medical reasons.85

  Similar successes occurred during World War II, when even the epidemiological perils of the monsoon forests of southeastern Asia and the rigors of the Russian steppe proved incapable of paralyzing medically well-managed armies. New chemicals—DDT, sulfas, penicillin, Atabrine, for instance—made formerly formidable diseases easy to prevent or cure; and military command channels proved exceedingly effective in delivering the resulting medical miracles to the places where they were most needed. Soldiers and sailors regularly took precedence when shortages arose, but it was also true that military medical administration was extended to civilian communities whenever some infectious disease threatened to cause trouble for occupation authorities. Wholesale and compulsory delousing of civilians in Naples in 1943, for instance, stopped an incipient typhus epidemic in its tracks; and innumerable refugee camps, slave labor camps, and other forms of official accommodation for displaced persons shared in some degree or other the pattern of medical administration that had proved so valuable for military units.86

  Another remarkable by-product of the administrative innovations of World War II was improvement in health through food rationing. During World War I rationing was managed in ignorance of exact human dietary requirements, and came to be associated, especially in Germany, with malnutrition and intense human suffering. During World War II, hunger wreaked its ravages among some populations as before; but in Germany itself and still more in Great Britain, special allowances of critically short foods for children, pregnant women, and other specially vulnerable elements of the population, and a more or less rational allocation of vitamin pills, protein, and carbohydrates in accordance with scientifically established physiological needs for different classes of the population, actually improved the level of health in Great Britain, despite severe shortages and stringencies; and allowed the Germans to maintain a generally satisfactory level of health until almost the end of the struggle.87

  Such triumphs of administrative rationality prepared the way for the amazingly successful post-war international health programs that have fundamentally altered disease patterns in nearly all the inhabited world since 1948.

  International medical organization of a formal and official kind dates back to 1909, when an International Office of Public Hygiene was set up in Paris to monitor outbreaks of plague, cholera, smallpox, typhus, and yellow fever. The office also attempted to define uniform sanitary and quarantine regulations for the European nations. Between the two great wars of the twentieth century, the League of Nations set up a Health Section. Several special commissions discussed world incidence of such diseases as malaria, smallpox, leprosy, and syphilis. But more important work was done during this period by the Rockefeller Foundation with its programs attacking yellow fever and malaria. Then in 1948, a new and more ambitious World Health Organization was set up. With substantial government support, WHO set out to bring the benefits of up-to-date scientific medical knowledge to backward parts of the world, wherever local governmental authorities would co-operate.88

  Since the 1940s, therefore, the impact of scientific medicine and public health administration upon conditions of human life has become literally worldwide. In most places epidemic diseases have become unimportant, and many kinds of infection have become rare where they were formerly common and serious. The net increment to human health and cheerfulness is hard to exaggerate; indeed, it now requires an act of imagination to understand what infectious disease formerly meant to humankind, or even to our own grandfathers. Yet as is to be expected when human beings learn new ways of tampering with complex ecological relationships, the control over microparasites that medical research has achieved since the 1880s has also created a number of unexpected by-products and new crises.

  One interesting and ironic development has been the appearance of new diseases of cleanliness. The chief example of this phenomenon was the rising prevalence of poliomyelitis in the twentieth century, especially among the hygienically most meticulous classes. It seems clear that in many traditional societies minor infection in infancy produced immunity to the polio virus without provoking any very pronounced symptoms; whereas persons whose sanitary regimen kept them from contact with the virus until later in life, often suffered severe paralysis or even death.89 Fear of annual outbreaks of poliomyelitis crested in the United States in the 1950s, assisted by careful propaganda aimed at securing funds for research into its causes and cure. As in so many earlier cases, an effective vaccine was developed in 1954, whereupon the disease sank again to a marginal position in public attention, affecting only a very few who escaped or refused vaccination.

 
