The Duffy antigen is not especially important to red blood cells. Nonetheless, researchers have written hundreds of papers about it. The reason is that Plasmodium vivax also uses the Duffy antigen as a receptor. Like a burglar with a copy of the front-door key, the parasite latches onto the Duffy antigen, fooling the blood cell into thinking it is one of the intended compounds and thereby gaining entrance.
Duffy’s role was discovered in the mid-1970s by Louis H. Miller and his collaborators at the National Institutes of Health’s Laboratory of Parasitic Diseases. To nail down the proof, Miller’s team asked seventeen men, all volunteers, to put their arms into boxes full of mosquitoes. The insects were chockablock with Plasmodium vivax. Each man was bitten dozens of times—enough to catch malaria many times over. Twelve of the men came down with the disease. (The researchers quickly treated them.) The other five had not a trace of the parasite in their blood. Their red blood cells lacked the Duffy antigen—they were “Duffy negative,” in the jargon—and the parasite couldn’t find its way inside.
The volunteers were Caucasian and African American. Every Caucasian came down with malaria. Every man who didn’t get malaria was a Duffy-negative African American. This was no coincidence. About 97 percent of the people in West and Central Africa are Duffy negative, and hence immune to vivax malaria.
Duffy negativity is an example of inherited immunity, available only to people with particular genetic makeups. Another, more famous example is sickle-cell anemia, in which a small genetic change ends up deforming the red blood cell, making it unusable to the parasite but also less functional as a blood cell. Sickle-cell is less effective as a preventive than Duffy negativity—it provides partial immunity from falciparum malaria, the deadlier of the two main malaria types, but its disabling of red blood cells also leads many of its carriers to an early grave.
Both types of inherited immunity differ from acquired immunity, which is granted to anyone who survives a bout of malaria, in much the way that children who contract chicken pox or measles are thereafter protected against those diseases. Unlike the acquired immunity to chicken pox, though, acquired malaria immunity is partial; people who survive vivax or falciparum acquire immunity only to the particular strain that infected them; another strain can readily lay them low. The only way to gain widespread immunity is to get sick repeatedly with different strains.
Inherited malaria resistance occurs in many parts of the world, but the peoples of West and Central Africa have more than anyone else—they are almost completely immune to vivax, and (speaking crudely) about half-resistant to falciparum. Add in high levels of acquired resistance from repeated childhood exposure, and adult West and Central Africans were and are less susceptible to malaria than anyone else on earth. Biology enters history when one realizes that almost all of the slaves ferried to the Americas came from West and Central Africa. In vivax-ridden Virginia and Carolina, they were more likely to survive and produce children than English colonists. Biologically speaking, they were fitter, which is another way of saying that in these places they were—loaded words!—genetically superior.
Racial theorists of the last century claimed that genetic superiority led to social superiority. What happened to Africans illustrates, if nothing else, the pitfalls of this glib argument. Rather than gaining an edge from their biological assets, West Africans saw them converted through greed and callousness into social deficits. Their immunity became a wellspring for their enslavement.
How did this happen? Recall that vivax, covertly transported in English bodies, crossed the Atlantic early: certainly by the 1650s, given the many descriptions of tertian fever, and quite possibly before. Recall, too, that by the 1670s Virginia colonists had learned how to improve the odds of survival; seasoning deaths had fallen to 10 percent or lower. But in the next decade the death rate went up again—a sign, according to the historians Darrett and Anita Rutman, of the arrival of falciparum. Falciparum, more temperature-sensitive than vivax, never thrived in England and thus almost certainly was ferried over the ocean in the bodies of the first African slaves.
Falciparum created a distinctive pattern. Africans in Chesapeake Bay tended to die more often than Europeans in winter and spring—the result, the Rutmans suggested, of bad nutrition and shelter, as well as unfamiliarity with ice and snow. But the African and European mortality curves crossed between August and November, when malaria, contracted during the high mosquito season of early summer, reaches its apex. During those months masters were much more likely to perish than slaves—so much so that the overall death rate for Europeans exceeded that for Africans. Much the same occurred in the Carolinas. Africans there, too, died at high rates, battered by tuberculosis, influenza, dysentery, and human brutality. Many fell to malaria, as their fellows brought Plasmodium strains they had not previously encountered. But they did not die as fast as Europeans.
Because no colonies kept accurate records, exact comparative death rates cannot be ascertained. But one can get some idea by looking at another continent with endemic malaria that Europe tried to conquer: Africa. (The idea that one can compare malaria rates in places separated by the Atlantic Ocean is in itself a mark of the era we live in, the Homogenocene.) Philip Curtin, one of slavery’s most important historians, burrowed into British records to find out what happened to British soldiers in places like Nigeria and Namibia. The figures were amazing: nineteenth-century parliamentary reports on British soldiers in West Africa concluded that disease killed between 48 percent and 67 percent of them every year. The rate for African troops in the same place, by contrast, was about 3 percent, an order-of-magnitude difference. African diseases slew so many Europeans, Curtin discovered, that slave ships often lost proportionately more white crewmen than black slaves—this despite the horrendous conditions belowdecks, where slaves were chained in their own excrement. To forestall losses, European slavers hired African crews.
The disparity between European and African death rates in the colonial Americas was smaller, because many diseases killed Europeans in Africa, not just malaria and yellow fever. But a British survey at about the same time as the parliamentary report indicated that African survival rates in the Lesser Antilles (the southern arc of islands in the Caribbean) were more than three times those of Europeans. The comparison may understate the disparity; some of those islands had little malaria. It seems plausible to say that in the American falciparum and yellow fever zone the English were, compared to Africans, somewhere between three and ten times more likely to die in the first year.
For Europeans, the economic logic was hard to ignore. If they wanted to grow tobacco, rice, or sugar, they were better off using African slaves than European indentured servants or Indian slaves. “Assuming that the cost of maintaining each was about equal,” Curtin concluded, “the slave was preferable at anything up to three times the price of the European.”
Slavery and falciparum thrived together. Practically speaking, P. falciparum could not establish itself for long in Atlantic City, New Jersey; the average daily minimum temperature is above 66 degrees, the threshold for the parasite, for only a few weeks per year. But in Washington, D.C., just 120 miles south, slightly warmer temperatures let it become a menace every fall. (Not for nothing is Washington called the most northern of southern cities!) Between these two cities runs the Pennsylvania-Maryland border, famously surveyed by Charles Mason and Jeremiah Dixon in the 1760s. The Mason-Dixon Line roughly split the East Coast into two zones, one in which falciparum malaria was an endemic threat, and one in which it was not. It also marked the border between areas in which African slavery was a dominant institution and areas in which it was not (and, roughly, the division between indigenous societies that kept slaves and those that did not). The line delineates a cultural boundary between Yankee and Dixie that is one of the most enduring divisions in American culture. An immediate question is whether all of these are associated with each other.
For decades an influential group of historians argued that southern culture was formed in the cradle of its great plantations—the sweeping estates epitomized, at least for outsiders, by Tara in the movie Gone with the Wind. The plantation, they said, was an archetype, a standard, a template; it was central to the South’s vision of itself. Later historians criticized this view. Big colonial plantations existed in numbers only in the southern Chesapeake Bay and the low country around Charleston. Strikingly, these were the two most malarial areas in the British colonies. Sweeping drainage projects eliminated Virginia’s malaria in the 1920s, but coastal South Carolina had one of the nation’s worst Plasmodium problems for another two decades. From this perspective, the movie’s Tara seems an ideal residence for malaria country: atop a hill, surrounded by wide, smooth, manicured lawns, its tall windows open to the wind. Every element is as if designed to avoid Anopheles quadrimaculatus, which thrives in low, irregular, partly shaded ground and still air. Is the association between malaria and this Villa Plasmodia style a coincidence? It seems foolish to rule out the possibility of a link.
“What would be the attitudes of a population that had a relatively high rate of illness and short life expectancy?” asked the Rutmans. Some have suggested that the reckless insouciance and preoccupation with display said to be characteristic of antebellum southern culture are rooted in the constant menace of disease. Others have described a special calm in the face of death. Maybe so—but it is hard to demonstrate that southerners were, in fact, unusually rash or vain or stoic. Indeed, one could imagine arguing the opposite: that the steady, cold breath of mortality on southerners’ necks could make them timid, humble, and excitable.
Tara (shown behind Scarlett O’Hara in this publicity image from Gone with the Wind) was created on a studio backlot. Nonetheless, it was a faithful image of the classic southern plantation. High on a nearly treeless hill, with tall windows to admit the breeze, it was ideally suited to avoid mosquitoes and the diseases that accompanied them. (Photo credit 3.3)
More than four hundred species of mosquito belong to the genus Anopheles. Perhaps a quarter can transmit malaria, but only about thirty species are common vectors. More than a dozen of these thirty exist in the Americas, the most important being A. quadrimaculatus, A. albimanus, and A. darlingi. Their habitat ranges and the average temperature go far toward explaining why the history of certain parts of the Americas—and not others—was dominated by malaria.
A different point is more susceptible to empirical demonstration: the constant risk of disease meant that the labor force was unreliable. This unreliability penalized small farmers, who were disproportionately hurt by the loss of even a few hands. Meanwhile, the Rutmans noted, “a large labor force insured against catastrophe.” Bigger planters had higher costs but were better insulated. Over time, they gained an edge; smaller outfits struggled. Accentuating the gap, wealthy Carolinian plantation owners could afford to move to resorts in the fever-free mountains or shore during the sickness season. Poor farmers and slaves had to stay in the Plasmodium zone. In this way disease nudged apart rich and poor. Malarial places, the Rutmans said, drift easily toward “exaggerated economic polarization.” Plasmodium not only prodded farmers toward slavery, it rewarded big plantations, which further lifted the demand for slaves.
Malaria did not cause slavery. Rather, it strengthened the economic case for it, counterbalancing the impediments identified by Adam Smith. Tobacco planters didn’t observe that Scots and Indians died from tertian fever and then plot to exploit African resistance to it. Indeed, little evidence exists that the first slave owners clearly understood African immunity, partly because they didn’t know what malaria was and partly because people in isolated plantations could not easily make overall comparisons. Regardless of whether they knew it, though, planters with slaves tended to have an economic edge over planters with indentured servants. If two Carolina rice growers brought in ten workers apiece and one ended up after a year with nine workers and the other ended up with five, the first would be more likely to flourish. Successful planters imported more slaves. Newcomers imitated the practices of their most prosperous neighbors. The slave trade took off, its sails filled by the winds of Plasmodium.
Slavery would have existed in the Americas without the parasite. In 1641 Massachusetts, which had little malaria, became the first English colony to legalize slavery explicitly. During the mid-eighteenth century, the healthiest spot in English North America may have been western Massachusetts’s Connecticut River Valley, according to an analysis by Dobson and Fischer. Malaria there was almost nonexistent; infectious disease, by the standards of the day, extremely rare. Yet slavery was part of the furniture of daily life—at that time almost every minister, usually the most important man in town, owned one or two slaves. About 8 percent of the inhabitants of the main street of Deerfield, one of the bigger villages in the valley, were African slaves.
At the other end of the hemisphere’s malaria belt, the habitat of Anopheles darlingi, the main South American vector for falciparum, reaches its southern terminus near the Rio de la Plata (Silver River), the border between Spanish and Portuguese America. South of the river is Argentina. With few mosquitoes to transmit Plasmodium, Argentina had little malaria. Yet, like Massachusetts, it had African slaves; between 1536, when Spain founded its first colony on the Rio de la Plata, and 1853, when Argentina abolished slavery, 220,000 to 330,000 Africans landed in Buenos Aires, the main port and capital.
On the other side of the mosquito border were the much bigger Brazilian ports of Rio de Janeiro and São Paulo, where at least 2.2 million slaves arrived. Despite the difference in scale, southern Brazil and Argentina were demographically similar: in the 1760s and 1770s, when Spain and Portugal first took systematic censuses of their colonies, about half of the population in both areas was of African descent. Yet the impact of slavery in the two was entirely different. Slavery was never critical to colonial Argentina’s most important industries; colonial Brazil could not have functioned without it. Argentina was a society with slaves; Brazil was a slave society, culturally and economically defined by slavery.
All American colonies, in sum, had slaves. But those to which the Columbian Exchange brought endemic falciparum malaria ended up with more. Falciparous Virginia and Brazil became slave societies in ways that non-falciparous Massachusetts and Argentina were not.
YELLOW JACK
In the 1640s a few Dutch refugees from Brazil landed on Barbados, the easternmost Caribbean island. Unlike the rest of the Caribbean, Barbados never had a large Indian population. English colonists moved in, hoping to capitalize on the tobacco boom. When the Dutch refugees arrived the island had about six thousand inhabitants, among them two thousand indentured servants and two hundred slaves. Tobacco had turned out not to grow particularly well on Barbados. The Dutch showed the colonists how to plant sugarcane, a skill they had acquired during their ill-fated venture in Brazil. Europe, then as now, had a sweet tooth; sugar was as popular as it was hard to come by. Barbados proved to be good cane territory. Production rapidly expanded.
Sugar production is awful work that requires many hands. The cane is a tall, tough Asian grass, vaguely reminiscent of its distant cousin bamboo. Plantations burn the crop before harvest to prevent the knifelike leaves from slashing workers. Swinging machetes into the hard, soot-smeared cane under the tropical sun, field hands quickly splattered themselves head to foot with a sticky mixture of dust, ash, and cane juice. The cut stalks were crushed in the mill and the juice boiled down in great copper kettles enveloped in smoke and steam; workers ladled the resultant hot syrup into clay pots, where the pure sugar crystallized out as it cooled. Most of the leftover molasses was fermented and distilled to produce rum, a process that required stoking yet another big fire under yet another infernal cauldron.
The question as ever was where the required labor would come from. As in Virginia, slaves then typically cost twice as much as indentured workers, if not more. But the Dutch West India Company, a badly run outfit that was desperate for cash, was willing to sell Africans cheap in Barbados. Slaves and indentured servants there were roughly the same price. As one would expect, the island’s new sugar barons imported both by the thousands: the sweepings of English streets and luckless captives from Angolan and Congolese wars. Covered in perspiration and gummy cane soot, Europeans and Africans wielded machetes side by side. Then the Columbian Exchange raised the relative cost of indentured servants.
Hidden on the slave ships was a hitchhiker from Africa: the mosquito Aedes aegypti. In its gut A. aegypti carried its own hitchhiker: the virus that causes yellow fever, itself also of African origin. The virus spends most of its time in the mosquito, using human beings only to pass from one insect to the next. Typically it remains in the body no more than two weeks. During this time it drills into huge numbers of cells, takes over their functioning, and uses the hijacked cellular machinery to produce billions of copies of itself. These flood the bloodstream and are picked up by biting aegypti. For imperfectly understood reasons this cellular invasion usually has little impact on children. Adults are hit by massive internal bleeding. The blood collects and coagulates in the stomach. Sufferers vomit it up, blackened—the signature symptom of yellow fever. Another symptom is jaundice, the yellow in the disease’s nickname of “yellow jack.” (The “jack” was the yellow flag flown by quarantined ships.) The virus kills about half of its victims—43 to 59 percent in six well-documented episodes McNeill compiled in Mosquito Empires. Survivors acquire lifelong immunity. In Africa yellow fever was a childhood disease that inflicted relatively little suffering. In the Caribbean it was a dire plague that passed over Africans while ravaging Europeans, Indians, and slaves born in the islands.
The first yellow fever onslaught began in 1647 and lasted five years. Terror spread as far away as Massachusetts, which instituted its first-ever quarantine on incoming vessels. Barbados had more Africans and more Europeans per square mile than any other Caribbean island, which is to say that it had more potential yellow fever carriers and potential yellow fever victims. Unsurprisingly, the epidemic hit there first. As it began a man named Richard Ligon landed in Barbados. “We found riding at Anchor, 22 good ships,” he wrote later,