Why We Get Sick


by Randolph M. Nesse and George C. Williams


  9. Finally, the disadvantage of resistant strains in the absence of an antibiotic is gradually lost by further evolutionary changes, so that resistance can prevail even where no antibiotics have been used for a long time.

  The implications of these findings for medical practice are now widely appreciated. If one antibiotic doesn’t alleviate your disease, it may be better to try another, instead of increasing the dose of the first. Avoid long-term exposure to antibiotics; taking a daily penicillin pill to ward off infection is accepted therapy for some conditions, such as infection of vulnerable heart valves, but has the incidental effect of selecting for resistant strains. Unfortunately, we may often be exposed to this side effect without knowing it, by consuming meat or eggs or milk from animals routinely dosed with antibiotics. This is a hazard that has recently provoked conflict between food producers and public health activists. The problem of antibiotic use in farm animals needs to be more widely recognized and carefully evaluated in relation to whatever economic gains may be claimed. As Harold Neu, professor of medicine at Columbia University, says in concluding his 1992 article “The Crisis in Antibiotic Resistance,” “The responsibility of reducing resistance lies with the physician who uses antimicrobial agents and with patients who demand antibiotics when the illness is viral and when antibiotics are not indicated. It is also critical for the pharmaceutical industry not to promote inappropriate use of antibiotics for humans or for animals because this selective pressure has been what has brought us to this crisis.” Such advice is unlikely to be heeded. As Matt Ridley and Bobbi Low point out in a recent article in The Atlantic Monthly, moral exhortations for the good of the many are often welcomed but rarely acted upon. To get people to cooperate for the good of the whole requires sanctions that make lack of cooperation expensive.

  Viruses don’t have the same kind of metabolic machinery as bacteria and are not controllable by fungal antibiotics, but there are drugs that can combat them. An important recent example is zidovudine (AZT), used to delay the onset of AIDS in HIV-infected individuals. Unfortunately, AZT, like antibiotics, is not as reliable as it once was because some HIV strains are now (no surprise) resistant to AZT. HIV is a retrovirus, a really minimal sort of organism with special limitations and special strengths. It has no DNA of its own. Its minute RNA code acts by slowly subverting the DNA-replicating machinery of the host to make copies of itself. The cells it exploits include those of the immune system. The virus can hide inside these cells, where it is largely invulnerable to the host’s antibodies.

  A retrovirus’s lack of self-contained proliferation machinery is both its weakness and its strength. It reproduces and evolves more slowly than DNA viruses or bacteria. Another weakness is its low level of reproductive precision, which means that it produces an appreciable number of defective copies of itself. This functional weakness can be an evolutionary strength, however, because some of the defective copies may be better at evading the host’s immune system or antiviral drugs. Another strength of retroviruses is their lack of any easily exploited Achilles’ heel in their simple makeup.

  It takes months or years for HIV to evolve resistance to AZT, in marked contrast to the few weeks it takes bacteria to evolve significant levels of resistance to some antibiotics. Unfortunately, HIV has a long time to evolve in any given host. A single infection, after years of replication, mutation, and selection, can result in a diverse mixture of competing strains of the virus within a single host. The predominant strains will be those best able to cope with whatever difficulties must be overcome (e.g., AZT or another drug). They will be the ones that most rapidly divert host resources to their own use—in other words, the most virulent.

  SHORT-TERM EVOLUTION OF VIRULENCE

  The evolution of virulence is a widely misunderstood process. Conventional wisdom has it that parasites should always be evolving toward reduced virulence. The reasoning assumes, correctly, that the longer the host lives, the longer the parasites can live and the longer they can disperse offspring to new hosts. Any damage to the host on which they depend will ultimately damage all dependent parasites, and the most successful parasites should be those that help the host in some way. The expected evolutionary sequence starts with a virulent parasite that becomes steadily more benign until finally it may become an important aid to the host’s survival.

  There are several things wrong with this seemingly reasonable argument. For example, it ignores a pathogen’s ultimate requirement of dispersing offspring to new hosts. This dispersal, as noted in the previous chapter, frequently makes use of host defenses, such as coughing and sneezing, that are activated only as a result of appreciable virulence. A rhinovirus that does not stimulate the host to defend itself with abundant secretion of mucus and sneezing is unlikely to reach new hosts.

  Another error in the traditional view is the assumption that evolution is a slow process not only on a time scale of generations, but also in absolute time. Such a belief arises from a failure to appreciate the capacity for rapid evolution of any parasite that will go through hundreds or thousands of generations in one host’s lifetime. If the virulence of the amoeba that causes dysentery is too low or too high for maximizing its fitness, the virulence can be expected to evolve quickly toward whatever level is currently ideal. We should not expect the present virulence of any pathogen to be in transit from one level to another unless conditions have changed recently. By “recently,” we mean last week or last month, not the last ice age, which is what an evolutionary biologist often means by “recently.”
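  To get a feel for how fast "quickly" can be, consider a minimal sketch in Python. The generation time, fitness advantage, and starting frequency below are illustrative assumptions, not figures from the text:

```python
# Sketch: how quickly a slightly fitter pathogen variant can sweep through
# a within-host population. All numbers here are illustrative assumptions.

def variant_frequency(p, w, generations):
    """Track the frequency of a variant with relative fitness w against a
    resident strain with fitness 1, one generation at a time."""
    history = [p]
    for _ in range(generations):
        p = p * w / (p * w + (1 - p))
        history.append(p)
    return history

# Assume a bacterium dividing every 30 minutes: 48 generations per day.
GENS_PER_DAY = 48
trajectory = variant_frequency(p=0.001, w=1.05, generations=10 * GENS_PER_DAY)

for day in range(0, 11, 2):
    print(f"day {day:2d}: variant frequency = {trajectory[day * GENS_PER_DAY]:.3f}")
```

  Even a modest 5 percent advantage carries a variant from one in a thousand to near fixation within about a week of bacterial generations, which is why "recently" here means days or weeks rather than ice ages.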

  Yet another flaw in the conventional wisdom is its neglect of selection among different parasites within hosts, as we just implied in our discussion of HIV. What good would it do a liver fluke to restrain itself so as not to harm the host if that host is about to die of shigellosis? The fluke and the Shigella are competing for the same pool of resources within the host, and the one that most ruthlessly exploits that pool will be the winner. Likewise, if there is more than one Shigella strain, the one that most effectively converts the host’s resources to its own use will disperse the most progeny before the host dies. As a rule, all else being equal, such within-host selection favors increased virulence, while between-host selection acts to decrease it. A recent comparative study of eleven species of fig wasps and their parasites confirmed that increased opportunities for parasite transmission are associated with increased parasite virulence.

  As with many other applications of evolutionary theory, careful quantitative reasoning is needed to understand the balance between natural selection within and between hosts. Figure 4-1 is a naive representation of what we have in mind.

  An adequate theory of the evolution of virulence must take into account the rate of establishment, in a given host, of new infections; the extent to which these competing pathogens differ in virulence; the rate of origin of new strains by mutation within a host; and the extent to which these new strains differ in virulence. From such considerations it should be possible to infer the expected levels of virulence for a given pathogen, assuming that conditions stay the same, which they never really do. The most important changes would be those that alter the means by which a pathogen reaches new hosts. If dispersal depends not only on a host’s survival but also on its mobility, any damage to the host is especially harmful to the pathogen. If you are so sick from a cold that you stay home in bed, you are unlikely to come into contact with many people that your virus might infect. If you feel well enough to be up and about, you may be able to disperse it far and wide. It is very much in a cold virus’s interest to avoid making you really sick. By contrast, the malaria agent Plasmodium gets no benefit from the host’s feeling well. In fact, as shown by experiments with rabbits and mice, a prostrate host is more vulnerable to mosquitoes. People in the throes of a malarial attack are not likely to expend much effort warding off insects. Mosquitoes can feast on them at leisure and spread the disease far and wide.

  FIGURE 4–1. SELECTION WITHIN AND BETWEEN HOSTS.
  A shows the effects of an extremely virulent pathogen, which would be favored by natural selection within a host. It exploits its host to maximize the current rate of dispersal of new individuals to new hosts. It may kill the host quickly, but while the host lives it does better than any competing pathogen. B shows the effects of a pathogen that is favored by selection between pathogen communities of different hosts. It maximizes its long-term total productivity (rate of reproduction times duration, graphically the area under the production curve). Host death in B is most likely from something other than the pathogen.
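  The caption's "rate of reproduction times duration" can be made concrete with a toy calculation, sketched below in Python; the per-day transmission rates and host survival times are invented for illustration:

```python
# Sketch of the Figure 4-1 tradeoff: within a host the fast exploiter wins,
# but total transmission (rate x duration, the area under the production
# curve) can favor the milder strain. All numbers are invented.

strains = {
    # name: (new infections dispersed per day, days the host survives)
    "A (virulent)": (10.0, 5),   # exploits the host hard; kills it quickly
    "B (mild)": (2.0, 60),       # exploits gently; host dies of other causes
}

for name, (rate_per_day, survival_days) in strains.items():
    total = rate_per_day * survival_days  # area under the production curve
    print(f"{name}: {rate_per_day}/day for {survival_days} days "
          f"-> {total:.0f} total new infections")
```

  Within a single host the virulent strain outreproduces the mild one every day and will come to dominate a mixed infection, yet over a host's lifetime the mild strain disperses more than twice as many offspring. Which pressure prevails depends on how often hosts carry mixed infections, exactly the accounting an adequate theory must do.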

  This evolutionary perspective suggests that diseases spread by personal contact should generally be less virulent than those conveyed by insects or other vectors. Do the facts fit this expectation? They do indeed. Among Paul Ewald’s important discoveries is the truth of this generalization and its importance for public health. He has shown that diseases from vector-borne pathogens tend to be more severe than those spread by personal contact and that mosquito-borne infections are generally mild in the mosquito and severe in vertebrate hosts. This is to be expected because any harm to the mosquito would make it less likely to bite another vertebrate. For gastrointestinal pathogens, the death rate is lower for direct, as compared to waterborne, transmission, as long as really sick hosts can effectively contaminate the water supply. As pure water became the norm in the United States early in this century, the deadly Shigella dysenteriae was displaced by the less virulent Shigella flexneri. As water was purified in South Asia during the middle of the century, the lethal form of cholera was steadily displaced by a more benign form, and the transition took place earliest at the places where water was first purified.

  An unsanitary water supply is only one example of what Ewald calls cultural vectors. The history of medicine shows repeatedly that the best place to acquire a fatal disease is not a brothel or a crowded sweatshop but a hospital. In hospitals, large numbers of patients may be admitted with infectious diseases normally transmitted by personal contact. People who are acutely ill do not move around much, but hospital personnel and equipment move rapidly from such people to others not yet infected. Inadequately cleaned hands, thermometers, or eating utensils can be quite effective cultural vectors, and the transmitted diseases may rapidly become more virulent.

  Take, for instance, the streptococci that can cause uterine infection in women after childbirth. Most nineteenth-century women knew that they risked their lives by having their babies in the hospital, but some still did so. Viennese physician Ignaz Semmelweis noted in 1847 that women in a clinic staffed by medical personnel contracted childbed fever three times as frequently as those in a clinic staffed by midwives. On investigating, he found that doctors came directly from doing autopsies on women who had died from childbed fever to do pelvic examinations on women in labor. Semmelweis proposed that they were transmitting the causative agent and showed that infections were less frequent when examiners washed their hands in a bleach solution. Was he thanked for his wonderful discovery? No. He was dismissed from his post for suggesting that doctors were causing the deaths of patients. He became more and more frantic in his efforts to save the thousands of women who were dying unnecessarily, but he was ignored, and finally, at age forty-seven, he died in an insane asylum. Nowadays, we all accept the need for hygiene in hospitals, but whenever it becomes lax, conditions are perfect for selecting for increased virulence, as in the virulent hospital-acquired (versus community-acquired) infantile diarrhea studied by Paul Ewald.

  It is widely believed that HIV is a new pathogen, perhaps originating from a monkey infected with simian immunodeficiency virus (SIV). However, evidence now suggests that monkeys might have acquired SIV from people with HIV. While HIV may have been present in some humans for many generations, AIDS is apparently a new disease, resulting from the evolutionary origin in recent decades of highly virulent HIV strains. AIDS may have arisen because of changed sexual behavior resulting from the socioeconomic disruption of some traditional societies. Large numbers of prostitutes serving hundreds of men per year were so effective at spreading infection that host survival became much less important to virus survival. Those strains that most rapidly exploited their hosts came to prevail within the hosts, and even the highly virulent strains had plenty of opportunity to disperse to new hosts before the old ones died.

  In Western countries, AIDS appeared initially as a disease mainly of male homosexuals because their large numbers of sex partners greatly accelerated sexual transmission, and of intravenous drug users because the drug users’ needles were effective vectors. As in Africa, the most virulent HIV strains prevailed over the less virulent because between-host selection for lower virulence was greatly weakened. Even highly virulent viruses had abundant opportunities to reach new hosts before the original host died. Conversely, the use of clean needles and condoms can not only curtail the transmission of the virus but also cause the evolution of lower virulence.

  COSTS AND BENEFITS OF THE IMMUNE RESPONSE

  As described in the previous chapter, natural selection has given us a fiendishly effective system of chemical warfare. For every invading pathogen there will be a worst-case scenario as to what kind of molecules it might encounter. Our immune systems have been shaped over a hundred million years to make the pathogen’s worst nightmares come true. Unfortunately, every effective weapon can sometimes be dangerous to the one who wields it.

  The immune system can make two kinds of mistakes: failing to attack when it should and attacking something when it shouldn’t. The first kind of mistake results from inadequate response, so that a disease that should have been nipped in the bud becomes serious. The second kind of mistake results from mounting too aggressive a response to minute chemical differences. Autoimmune diseases such as lupus erythematosus and rheumatoid arthritis could be the result. The average person’s degree of sensitivity and responsiveness is presumably close to what has historically been the optimum: enough to counter pathogens but not so great as to attack the body’s own structure.
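  The idea that sensitivity sits near a historical optimum can be expressed as a simple cost-benefit calculation. The sketch below is illustrative only; the error curves and cost weights are assumptions, not measurements:

```python
# Sketch: the immune system's two error types pull its trigger threshold in
# opposite directions. A lax threshold misses real pathogens; a hair trigger
# attacks self tissue. All parameters below are illustrative assumptions.

def expected_cost(sensitivity, miss_cost=100.0, autoimmune_cost=20.0):
    """Cost of errors at a given sensitivity (0 = never react, 1 = react
    to everything). Misses fall, false alarms rise, as sensitivity grows."""
    p_miss = (1 - sensitivity) ** 2       # failing to attack a pathogen
    p_false_alarm = sensitivity ** 2      # attacking the body's own tissue
    return miss_cost * p_miss + autoimmune_cost * p_false_alarm

best = min((expected_cost(s / 100), s / 100) for s in range(101))
print(f"lowest expected cost {best[0]:.1f} at sensitivity {best[1]:.2f}")
```

  Because a missed infection has historically been costlier than a false alarm, the optimum lies well toward the sensitive end, and a system tuned that way will occasionally attack its owner.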

  Given that we have this chemical superweapon—immunity—how can we possibly remain vulnerable to infectious diseases? Once again, it is because the infectious agents can evolve rapidly and become better adapted by natural selection. Those variants that are least vulnerable to immunological attack will be those whose genes are best represented in future generations. So the pathogens may evolve one or another kind of defensive superweapon. Molecular mimicry, mentioned in the last chapter, is one such weapon.

  ESCALATING DECEPTION

  Scientists first developed the concept of mimicry to describe the patterns on butterflies’ wings. For instance, the viceroy butterfly looks almost exactly like the monarch butterfly, which birds do not attack because they want to avoid the toxins the monarch caterpillar gets from eating milkweed leaves. The viceroy has no such toxins, but birds mistake it for its bitter look-alike and likewise shun it. Examples are now also known in many other animal groups. Any edible species that by chance resembles a toxic species will have an advantage, and selection will make this mimic species look increasingly like the toxic model. This is bad for the model because predators that eat the edible mimic learn to go after the model as well. This sets up an arms race between the mimic, which evolves an ever closer resemblance to the model, and the model, which evolves to be as different as possible from its edible neighbors. Some environmental circumstances favor the mimic to such an extent that really detailed resemblances between unrelated species may evolve. We notice such mimicry easily because we perceive so much of the world visually. Detection of chemical mimicry requires more subtle techniques, but there is no reason to think it less common than visual examples.

  The molecular mimicry shown by pathogens turns out to be at least as subtle, complicated, and full of surprises as the visual mimicry shown by butterflies and other animals. Deceptive resemblances to human proteins are shown by the surfaces of various parasitic worms, protozoa, and bacteria. If there is any deficiency in the mimicry of human tissues by a bacterium, we can expect it to evolve an improvement rapidly. Pathogen surfaces may have a complex sculpturing of convexities and concavities, and the molecular forms most readily recognized by antibodies are hidden in crevices. As noted in the last chapter, some pathogens alter their exposed molecular structures so rapidly that the host has difficulty producing newly needed antibodies fast enough. This is rapid change without evolution, because the same pathogen genotype codes for a variety of molecular structures.

  Mimicry may not only permit pathogens to escape from immunological attack but also make active use of hosts’ cellular processes. For instance, streptococcal bacteria make molecules similar to host hormones that have receptor sites on cell membranes. In effect, the bacterium has a key to the lock on the door that normally admits a hormone. Once inside the cell, the bacterium is shielded from immunological and other host defenses. The host has an endosome-lysosome complex that can attack pathogens within its cells, but molecular mimicry and other countermeasures protect the pathogen there too.

  NOVEL ENVIRONMENTAL FACTORS

  Before leaving infectious disease, we will anticipate a theme of Chapter 10 by noting the large proportion of epidemics that have resulted from novel environmental circumstances. We have already mentioned how changed social conditions may have initiated the AIDS epidemic, but the same is true for many other plagues. Richard Krause, of the National Institutes of Health, reports that early measles and smallpox epidemics spread along caravan routes in the second and third centuries and killed a third of the people in some communities. Bubonic plague, the Black Death, had long festered in Asia, but became epidemic only when Mongol invaders brought it to unexposed populations in Europe who lived with large populations of flea-infested rats. While we like to imagine that such events are in the past, AIDS continues to spread alarmingly, and the causes of other sudden outbreaks of infection are unknown. The Ebola virus ravaged parts of Africa in the 1970s, killing half of those who became ill, including most of the doctors and nurses who cared for the patients. It stopped as suddenly as it started, for reasons that remain unclear.

 
