Dark Banquet
In that regard, one of the most common ways that antibiotics are misused occurs when a prescribed antibiotic regimen is abandoned, usually once a patient feels better. Few people, in fact, seem to realize the danger posed when they decide to stop their antibiotic treatment before it’s complete—and here’s why. Think of a hypothetical population of 1,000 microbes inside a person who has been instructed to take an antibiotic for seven days. Discounting bacterial reproduction for a moment (since this is a model), let’s say the antibiotic kills 900 microbes by day five and 990 by day six. If the patient were then to stop taking the antibiotic after day six, which microbes would be left alive? The ten survivors who were the most resistant to the antibiotic in the first place. Now factor in bacterial reproduction—and as these surviving microbes begin to multiply, each new generation will have the same antibiotic resistance exhibited by the ten original survivors.*118
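The arithmetic above can be sketched as a short simulation. This is a hypothetical illustration of the toy model only, not anything from the book: the resistance scores, kill cutoffs, and doubling rate are assumptions chosen to mirror the numbers in the text.

```python
import random

random.seed(42)

# A hypothetical sketch of the toy model: each of 1,000 microbes gets a
# resistance score in [0, 1), and each extra day of treatment raises the
# kill cutoff, so the least-resistant microbes die first.
population = [random.random() for _ in range(1000)]

# Cutoffs chosen to match the text: ~900 dead by day 5, ~990 by day 6,
# and all 1,000 by day 7 (every score is below 1.0).
CUTOFF = {5: 0.90, 6: 0.99, 7: 1.0}

def survivors(microbes, days_taken):
    """Return the microbes still alive after `days_taken` days of antibiotic."""
    return [r for r in microbes if r >= CUTOFF[days_taken]]

after_six = survivors(population, 6)    # roughly ten microbes left
after_seven = survivors(population, 7)  # full course: none survive

# Quitting after day six selects for resistance: every survivor sits in the
# top ~1 percent of scores, and (ignoring new mutations) each offspring
# inherits its parent's score, so the next generation starts out resistant.
offspring = [r for r in after_six for _ in range(2)]  # each survivor doubles
```

Running the full seven days leaves nothing alive to reproduce; stopping even one day early hands the next generation exclusively to the most resistant survivors.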
Now that we’ve seen some of the nasty effects of chiggers it’s time to figure out just what they are. The short answer is that they’re the parasitic juvenile stages of some mite species.†119
And what about ticks?
Like chiggers, their story has become an extremely important one because of the pathogens they transmit through their bites. Since the mid-1970s, there has been growing concern in the United States and elsewhere over tick-transmitted diseases—especially Lyme disease and Rocky Mountain spotted fever.
Are ticks (like bed bugs) a pest on the rise, and if so, why? Are tick-transmitted pathogens also becoming immune to our treatments, or is there another answer? And what about Lyme disease? Why are the symptoms so variable (ranging from minor annoyance to catastrophic and life altering)? Some people believe that there’s a chronic version of the disease—one that the experts refuse to talk about. And while we’re in conspiracy mode, whatever happened to Lymerix, the Lyme disease vaccine?
All right, before we deal with the grassy knoll (and the chiggers lying in wait there), let’s cover some basics. First of all, chiggers, mites, and ticks are not insects, but like insects, they belong to the enormous invertebrate phylum Arthropoda. In fact, they’re members of the only arthropod group that rivals the insects in diversity—Arachnida (a subphylum that also includes the spiders and scorpions).
While chiggers aren’t exactly blood feeders, they merit interest because their blood-sucking cousins, ticks, share much of their biology as well as their penchant for strange behavior. Additionally, hundreds of mite species are vampires as adults. It’s just that many of them feed on nonvertebrate blood—like hemolymph, the blood found in arthropods like insects. Some of these mite/insect interactions have also been in the news lately, especially as they relate to agriculture and the honey bee industry.
Evolutionarily, mites and ticks are extremely close. In all likelihood, tick ancestors (prototicks) were actually mites that somehow evolved to become obligate blood feeders (similar to vampire bat ancestors, which were non–blood feeders).
Why did some mites retain their larval blood-feeding lifestyles into adulthood while other body characteristics (like sexual organs) developed normally? The answer is that maintaining larval (or juvenile) characteristics as an otherwise mature adult is yet another example of how new species can evolve. The basic premise, proposed by the evolutionary embryologist Gavin de Beer (1930) and reinvigorated by Stephen Jay Gould in his book Ontogeny and Phylogeny (1977), is that “evolution occurs when ontogeny*120 is altered in one of two ways: when new characters are introduced at any stage of development with varying effects on the subsequent stages, or when characters already present undergo changes in developmental timing.”
In the first scenario, Gould’s “introduced characters” result from genetic alterations like mutations—changes in an individual’s genetic blueprint that occur during DNA replication.*121 It is at this time that errors (variations to some) in the copying mechanism result in mutated strands of DNA. In this classic explanation of how evolution operates, in some instances this mutant DNA results in new characteristics for the individual.†122
We’ve already seen the hypothetical results of such mutations (horses and vampire bats), but let’s look at the second of these examples a bit closer. Let’s say that a protovampire bat had a genetic mutation that resulted in a change in tooth structure. If this mutation happened to produce sharper teeth (giving the protovampire a better chance of biting animals without being detected and thus increasing its chances of surviving and reproducing), then this novel characteristic would be considered an “adaptation.” Subsequent generations of protovampires, which would include the progeny of the bat with the mutation, would exhibit greater incidences of sharper teeth since those protovampires without this trait would be less likely to survive and reproduce in their local environments. In time, protovampire populations might accumulate more and more new characteristics (like salivary anticoagulants and amped-up excretory systems), adaptations similarly “selected” by the existing environmental conditions. Eventually, these bats would become different enough from their ancestors to be considered a new species—in this case, true vampire bats. Alternatively (and this is the part that many people overlook), the mutation that altered tooth sharpness could have just as easily been one that produced duller teeth, and this “maladaptive” character would have lessened the chances of that individual surviving to reproductive age, where it could pass that mutation (and the trait) on to the next generation.
Considering this last possibility makes it easier to understand how evolution is, in some ways, a genetic crapshoot. It also allows us to recognize the problem with such assumptions as “The environment changed and ancient vampire bats needed sharper teeth to make painless bites—so they evolved sharper teeth.” This reasoning, which seems to make sense, betrays a common misconception that people have about the mechanism of evolution.
For Jean-Baptiste Lamarck (1744–1829), the first naturalist to propose a mechanism for evolutionary change, the concept of need-based evolution (and other related ideas) got the Frenchman buried in a rented grave and his life’s work all but forgotten. In reality, Lamarck was a scientific heavyweight, and his résumé included prescient insights into botany, taxonomy, and organic evolution. He may have been, in fact, the first scientist to propose that species actually changed gradually over time and that they did so because of natural processes (as opposed to supernatural ones). In his spare time, Lamarck was the first naturalist to separate crustaceans, arachnids, and annelids from insects (although housecleaners had been coping with that very problem for many years), and he also coined the term invertebrate. All in all, a fairly hefty set of accomplishments—most of which go completely unmentioned, unappreciated, and most important (for high school students, at least), unmemorized. Instead, Lamarck has been hammered in nearly every introductory biology text ever written. Indeed, Lamarck’s folly, referred to as “the inheritance of acquired characteristics,” has hung around the poor man’s neck like an albatross (or more accurately, a giraffe).
In Lamarck’s giraffe story, used to explain how evolution might proceed, there was once a population of short-necked animals (let’s call them protogiraffes) feeding happily on low-lying leaves. For whatever reason, the environment changed, these plants died out, and the short-necked animals were left with a dwindling food supply.*123 According to Lamarck, the protogiraffes that had previously fed on the vertically challenged (and now extinct) foliage needed longer necks in order to feed from the higher branches of trees that hadn’t been wiped out. This need somehow produced elongated necks, resulting in the evolution of the modern giraffe.
Although it took a bit of time to discredit Lamarck (Charles Darwin actually fell back on Lamarckian concepts in his later editions of Origin of Species), eventually folks came up with questions like “If Lamarck was right, then why are boys with circumcised fathers born with a foreskin?”†124
In reality, most of what you do or experience in your lifetime has little or no effect on the genetic makeup of your offspring. Whether we’re talking about longer snouts and legs for Miocene protohorses, or sharper teeth for ancient vampire bats, any inheritable modifications inevitably came about as a result of changes that occurred at the genetic level (i.e., changes in part of a genetic blueprint or in the timing of genetically programmed events).*125
It’s this change in the timing of genetically programmed events that explains how ticks may have evolved from chiggers. In this case, blood feeding may not have been a novel characteristic (as in ancient vampire bats), but possibly the timing of its appearance was. In a process known as heterochrony, the timing of developmental events is altered. Heterochrony, then, could explain the origin of the first tick from an ancestral mite—a mite that somehow maintained its larval feeding behavior into adulthood.
How could something like that come about?
There are numerous examples of this process in nature, but the most well known is neoteny—in which an organism reaches sexual maturity while retaining juvenile characteristics. The classic example concerns the giant salamander Necturus (the mudpuppy), which retains its gills throughout adulthood. In the vast majority of amphibians (like most salamanders, as well as mudpuppy cousins, like frogs and toads), these respiratory structures are lost as the larvae metamorphose into semiterrestrial adults.
It’s been hypothesized that in this famous case of neoteny, a mutation allowed some salamanders to retain their gills as they reached sexual maturity. The obvious question is, Why would that particular characteristic become an adaptation? The best hypothesis thus far is that the selection pressure to retain gills as an adult might have been a change in the terrestrial environment (e.g., a new predator or drier conditions), making it safer to extend the time salamanders spent in the ponds where they swam as larvae.
Similarly, with regard to the evolution of ticks, perhaps increases in local vertebrate populations or species diversity (both are also forms of environmental change) led to an evolutionary advantage for some mites that accidentally retained the parasitic dietary habits they had as larvae. Basically, more vertebrates meant more exploitable sources of food. As in Necturus, this adaptation had evolved from a mutation that hadn’t produced a new character but instead had changed the developmental timing of a previously existing character. Following this hypothetical scenario to its conclusion, true ticks would have evolved as prototicks transitioned from feeding on liquefied cell contents (like their mite ancestors) to feeding on blood.
However ticks came about, most researchers think that the first ticks appeared sometime during the early Cretaceous period (around one hundred million years ago) and, not coincidentally, during a period of tremendous vertebrate diversity.
Within the arachnids, chiggers, ticks, and mites belong to the order Acari (or Acarina), which contains between 850 and 900 species of ticks and approximately 50,000 species of mites.
According to Gwilym O. Evans, author of Principles of Acarology, acarines are unlike other arachnids because of the intimate associations they’ve developed with other animals. In mites, these associations range from symbiosis to commensalism to parasitism.
Briefly, symbiotic relationships are those between two different organisms in which both derive some benefit. Among the acarids, perhaps the strangest example of symbiosis is the relationship between the eastern subterranean termite Reticulitermes flavipes and the slime mite Histiostoma. Researchers have found that termite colonies often become infected with a pathogenic fungus (Metarhizium anisopliae). The fungus invades the termite’s body, secretes a fatal toxin, and then derives nutrients from the decomposing wood muncher. Finally, the rootlike fungal mycelia erupt through the cadaver’s exoskeleton to grow and spread reproductive spores throughout the termite colony. So destructive is this fungus that it’s even been considered for use in the biological control of termites. Fortunately for the termite (although unfortunately for homeowners and pest-control types), the slime mites living in the nest not only scarf down the pathogenic fungus, but as they cruise around the nest they spread a trail of bacteria, yeast, and other microbial organisms. This sets up competition between the pathogenic fungus and these nonlethal decomposers with the result being the suppression of growth and sporulation (release of the reproductive spores) in Metarhizium. In many ways, it’s as if the slime mite is able to serve as an external immune system for the termite.
Commensalism (another type of mite/animal association) is a relationship between two organisms in which one benefits and the other neither benefits from the relationship nor is harmed. One example, in the case of mites, is a form of commensalism known as phoresy, in which a smaller organism (in this case, the mite) attaches itself to other organisms (like an insect) for the purpose of transportation. Since the carrier isn’t harmed, you can think of phoresy as a milder version of the passive transport we saw in bed bugs. In perhaps the strangest case of phoresy, hummingbird-flower mites (Proctolaelaps kirmsei) are chauffeured from flower to flower within the nasal cavities of the hummingbirds. Although the hummingbirds aren’t physically harmed by the mites, they both wind up competing for the same pollen and nectar—and so this isn’t really a textbook example of commensalism.
Acarologist Tyler Woolley lists five significant ways that mites affect humans: health (through transmission of diseases as well as our bodies’ allergic and inflammatory reactions to them), agriculture (they infest crops, household and garden plants, and farm animals), stored agricultural products (they cause tremendous damage to grains, cereals, and veggies in which they live and multiply),*126 biological control (in which predatory mites are involved in controlling pests like fire ants or even other mites), and aesthetics (nobody likes a mangy mutt or mite-damaged houseplants).
As a group, mites exhibit a bewildering variety of ways to make a living. For example, approximately 140 species of them have been identified as living in house dust. Additionally, if you look closely enough you’ll find mites infesting algae, books, cheese, dried fruits, dried meats, drugs, flour, fungi, furniture, grains (like corn, wheat, oats, barley, rye, buckwheat, and millet), jams, jellies, mattresses, mildew, mushrooms, nectar, nuts, paper, plant bulbs, pollen, seaweed, seeds, spores, straw, sugar, vanilla pods, and wallpaper. Mites affect hundreds of plant species, and pretty much every type of wild animal, farm animal, and pet you can name. For creatures troubled by mites, infestation sites range from ears to anuses and all stops in between.
Besides an allergic reaction to dust mites and their droppings, perhaps the most commonly encountered mite-related health problem is scabies. Caused by Sarcoptes scabiei, scabies is a condition that produces a rash and intense itching.*127 The symptoms result from the host body’s reaction to mite-secreted and -excreted substances released as the mites go about their parasitic business. Young female scabies mites, which are about one-fiftieth of an inch long (a half millimeter), excavate a burrow in the host’s skin where a male soon joins them. Copulation occurs only once and renders the female fertile for life. Soon after, she emerges from the honeymoon suite (leaving the male behind to die). The pregnant female motors around the surface of the host (reaching speeds of up to 60 inches per hour) until she locates a site for a permanent burrow (hands and wrists are popular). Burrowing at a rate of about one-fifth of an inch per day (five millimeters), the female feeds on liquid from ruptured host cells. She also takes time to pump out several eggs per day, which are applied to the walls of the ever-lengthening burrow. When the larvae hatch, they leave mom and their nursery burrow behind, passing through several instars before reaching adulthood. During their wanderings topside, scabies mites are commonly spread to new hosts during periods of prolonged physical contact.
Until relatively recently, scabies was thought to be a disease of the poor, the unwashed, and the sexually promiscuous. This view was challenged in a rather unique manner in an article titled “Scabies Among the Well-to-Do,” published in 1936 in the prestigious Journal of the American Medical Association:
Scabies is a disease of herding, promiscuity and travel, of family school and vacation life. A plague of armies, tenements and slums. It may with equal force invade a pedigreed school, Camp Wawa Wawa or the baronial castle on the hill. An ever present differential consideration, wholly without social boundaries, the possible explanation of the itches of the tycoon, the socialite and the university professor equally with the mechanic’s daughter on relief.
Another mite causing major concern today is Varroa destructor, which preys on several types of bees, including honey bees (Apis) and bumble bees (Bombus). Varroa can be considered an invertebrate vampire because it feeds on hemolymph. Since the bee’s circulatory system doesn’t function in gas transport, there is no oxygen-carrying hemoglobin, and as a result hemolymph lacks the red color of vertebrate blood. It is, however, a complex liquid containing a variety of hemocytes, cells that carry out many of the same functions as their leukocyte counterparts—functions that include phagocytosis and a role in the immune response. There’s even a hemocytic version of stem cells.
Female mites enter bee nests (or hives) where they lay their eggs just before the brood chambers containing the developing bees are capped by the adult bees. The parasites feed on larval and pupal instars as well as the emerging adult bees, which are also used for transportation. As with other arthropod parasites, as Varroa destructor feeds it can transmit viral and bacterial pathogens to its host.
Recently, the dramatic and nearly worldwide loss of honey bees has become a major concern not only within the beekeeping industry but also among farmers who raise the more than ninety commercial crops commonly pollinated by bees.*128 Colony collapse disorder (CCD, formerly known as fall dwindle disease) is characterized by the sudden departure of most of the adult worker bees from the hive, leaving behind the queen, a few young workers, and an abandoned brood of larvae and pupae. Although the cause of CCD is still under investigation, the list of potential suspects includes mites, bacteria, fungi, viruses, long-term exposure to substances like pesticides—especially neonicotinoids (chemicals that mimic the neurotoxic effects of nicotine, the compound found in tobacco)—and poor nutrition.*129 There is even a suggestion, albeit far-fetched, that cell phones are the causative agent.