Cotton Mather, the late-seventeenth-century minister of Boston’s Old North Church, a great proponent of witchcraft trials as well as a recognized medical expert who had studied medicine at Harvard, opined that women who did not breastfeed their babies were “dead while they live.” Mather, who persuaded Elihu Yale to found a new college because Harvard found his ideas about witchcraft unacceptable, said that God would judge women negatively if they refused to breastfeed. Harvard did not disagree on this point—Harvard president Benjamin Wadsworth termed the decision not to breastfeed “criminal and blame worthy.”
According to these men, a woman who did not breastfeed was turning her back on that with which God in his wisdom had provided her. Inevitably, classism entered into the argument, because most of the women who did not breastfeed were upper-class. There was a great deal of talk about women ignoring their maternal responsibilities to live luxuriously. The implication was that they were too idle and vain to live up to the obligations of motherhood. Overlooked in the debate was the fact that women whose labor was essential on family farms or in family businesses needed an alternative to breastfeeding.
The Protestants in particular were advocates of breastfeeding. In fact, attacking women for not breastfeeding was unusual until the Protestant Reformation. Among the more radical sects, such as the New England Puritans, it was rare for a capable mother not to breastfeed. The more radical ministers regularly delivered sermons on the evils of not breastfeeding.
It seemed that men everywhere, both religious and secular, weighed in on what women should do. Benjamin Franklin, master of the homily that says so little yet is hard to refute, proclaimed, “There is no nurse like a mother.” Jean-Jacques Rousseau, after abandoning five children to orphanages, expressed strong opinions on appropriate child-rearing and denounced wet-nursing.
At the heart of the debate was a belief, held even before the perils of animal milk were fully understood, that breastfeeding by mothers was the safest option. Another widely held belief was that wet-nursing led to a higher infant mortality rate. Little evidence of this existed among the upper classes, who employed wet nurses privately, but in orphanages, the death rate among wet-nursed infants was horrifying. After visiting French foundling hospitals, many people became passionate advocates for infants being breastfed by their mothers, and some, such as the British physician Hugh Smith, widely read in the British colonies, went so far as to suggest that bottle feeding was safer and preferable to wet-nursing.
Not so, according to Dr. Alphonse Leroy, sent to the foundling hospital in Aix in 1775 to find out why so many children were dying. He concluded that the cause of the deaths was artificial feeding, not wet-nursing. The existence of bacteria was unknown at the time, so he could not have known that milk that was no longer fresh could harbor deadly bacteria, or that feeding vessels, if not properly washed, could also transmit them. What he did conclude—the right conclusion for the wrong reason—was that the animal milk was killing the babies, because milk, whether human or animal, became deadly when exposed to air. His successful solution was to have the children suckle directly from goats.
Still, the infant mortality rate at hospitals remained tragically high, as babies continued to be artificially fed. It was, after all, extremely difficult to maintain a dairy herd at a hospital. At the end of the eighteenth century, a Dublin hospital had a 99.6 percent infant mortality rate. In other words, it was extraordinarily rare for a baby to survive this hospital. It was finally shut down in 1829.
Wet nurses often came from the lower middle class, from working families. Sometimes they artificially fed their own children so as to have the milk to wet-nurse someone else’s. It paid better than being a domestic.
A wet nurse had to have clean habits—no promiscuity or alcoholism—and a happy disposition. Brunettes were thought to be the most stable. Blondes had cheerful temperaments, but that made them excitable, and that excitement could change the quality of their milk. A study in Berlin in 1838 compared the composition of milk from brunettes, blondes, and redheads and claimed to show definitively that redheads had the worst milk and brunettes the best.
It is curious that for all this concern about wet nurses passing on their characteristics to others’ infants, most wet nurses in slave societies were slaves. In fact, slaves were in great demand as wet nurses, so much so that if a slave had a child and was lactating, her sale value would increase.
A baby could also be fed what was called pap or panada, which meant food supplements—i.e., formula. In poorer regions of the world, additives stretched milk further. In the impoverished cane-growing regions of the Dominican Republic, sugar and water was, and is, a frequent milk substitute. Excavations of Roman gravesites have revealed not only baby milk bottles, indicating a considerable amount of artificial feeding, but also “pap boats” for feeding babies flour and milk or even flour and water.
By the fifteenth century, pap had come to mean flour and bread crumbs cooked in milk or water. Panada was broth, possibly with vegetables or butter added, cooked with milk and even sometimes eggs. When made with milk, these formulas had an advantage in that the milk was boiled, which made them safer.
Not a great deal is known about Simon de Vallambert, who in 1565 published the first printed book in France on pediatrics. In it he gave this recipe for pap:
The flour from which it is made nowadays the greater part of the nurses pass simply through a sieve without other preparation. Others cook it in the oven in a leaded or vitrified earthen pot after the bread is drawn, to finally take away the viscosity which is in the crude flour. The milk mixed with the flour is commonly from the goat or cow, that of the goat is better. When one intends to add more nourishment one adds finally an egg yolk, when one wished to guard against constipation one adds honey.
A simpler recipe came from Jane Sharp, an English midwife who in 1671 published the first English book on midwifery by a woman. Her pap was simply “Barley bread steeped a while in water and then boiled in milk.”
By the eighteenth century, doctors were starting to endorse cow’s milk diluted with supplements, claiming that this made the milk nutritionally closer to human milk. By the nineteenth century, both artificial feeding and milk supplements were widely accepted. In their 1869 book, Catherine Beecher and Harriet Beecher Stowe advised:
If the child be brought up “by hand” [as opposed to being nursed] the milk of a new milch cow, mixed with one third water and sweetened a little with white sugar, should be the only food given until the teeth come. This is more suitable than any preparation of flour or arrowroot, the nourishment of which is too highly concentrated.
Even as drinking milk was becoming more popular and its health benefits extolled, people were becoming increasingly leery of it. One hot Fourth of July in 1850, Zachary Taylor, the twelfth president of the United States, a tough old soldier nicknamed “Old Rough and Ready,” attended ceremonies at the unfinished Washington Monument and then refreshed himself with a cool glass of milk. But summer milk is dangerous, and the president died soon after. Many have attributed his death to cholera, but some thought, and perhaps it is true, that it was the glass of milk that killed him.
If the Taylor story is true, he was not the only president to be struck by the ravages of milk. When Abraham Lincoln was seven years old, his family left Kentucky and moved to the small community of Little Pigeon Creek in southern Indiana. In 1818, when he was nine, his mother, Nancy Lincoln, died of something called “milk sickness.” It was an epidemic: Nancy’s aunt and uncle also died. Then the disease disappeared for twelve years. When it came back to the area in 1830, the Lincolns left.
Milk sickness was caused when cows ate a plant called white snakeroot, also known as squaw-weed, richweed, pool wort, pool root, white sanicle, Indian sanicle, deer wort, white top, or steria. Known to botanists as Eupatorium urticaefolium, the plant usually struck cows, and the humans who drank their milk, in the late summer. In the nineteenth century it decimated communities in what are now the Midwest and Plains states. But incidents occurred even earlier, among the first white settlers in Maryland, North Carolina, Kentucky, Tennessee, Alabama, Missouri, Illinois, Indiana, and Ohio. The disease produced violent vomiting and a burning sensation, and after three days, victims often died. It was often confused with malaria, though the symptoms are not identical.
As far back as prerevolutionary North Carolina, milk sickness was recognized as a separate disease and was suspected of being caused by milk. Those who abstained from milk, cheese, and all dairy products in the late summer were not stricken, and even those who were, but then abstained from dairy, suffered only a mild attack.
Some suspected that the disease was caused by a poisonous dew that formed at night. Others suspected an invisible microorganism—an early anticipation of Louis Pasteur’s later “germ theory.” That was an astute guess, but microorganisms actually had nothing to do with the cause of this disease.
Cows grazed on the poisonous white snakeroot plant during late-summer and early-fall droughts, when the normal grasses were not available and the herds foraged for alternatives. Cows that grazed in enclosed pastures with few weeds were unaffected.
Exactly which weed caused the disease remained a mystery for some time. All the usual poisons were suspected, including poison ivy, water hemlock (Cicuta maculata), Indian hachy, Indian tobacco (Lobelia inflata), Indian hemp (Apocynum cannabinum), Virginia creeper (Parthenocissus quinquefolia), cross vine (Bignonia capreolata), Indian currant (Symphoricarpos orbiculatus), marsh marigold (Caltha palustris), spurge (Euphorbia esula), mushrooms, and parasitic fungi and molds that grow on various plants. A few people even had it right—white snakeroot (Eupatorium urticaefolium). It was a woodland plant, and as settlers cleared more and more land for pastures, the disease disappeared.
But if milk was killing people in the countryside, it was a small toll compared to the milk-related deaths occurring in cities such as New York, Chicago, and London.
From the first days of colonization, Manhattan stood out as a dairy center because it was settled by the dairy-crazed Dutch. Unlike the English, the Dutch specifically recruited dairy farmers to settle in their colony. Even after the British takeover in 1664, when New Amsterdam became New York, it remained a place where a great deal of dairy was produced and consumed, and the Dutch remained dairy farmers into the next century. Both butter and buttermilk were popular. Bread with butter was a standard breakfast and was also served at dinner. Milk with morsels of bread could be consumed for breakfast or dinner. Even after the British introduced coffee shortly after taking over, the most popular hot drink was tea with milk. New Yorkers also served cheese for both breakfast and dinner.
Even as New York City became more urban, the tradition of owning a cow or two continued. By the nineteenth century, cows were tied to stakes and often fed garbage. Property owners would rent out spaces for staked cows and demand a claim on the manure, which sold well to farmers. This did not do much for the smell of the city, but New York already had a sewage problem and was a redolent town to begin with.
The old Dutch farms in Europe had been fastidiously kept, but there was no such concern for hygiene in New York City. The cows lived and were milked surrounded by garbage, and the milk was kept in open pails. A street vendor could carry two pails with the help of a yoke across the shoulders, roaming the streets and ladling out milk to customers.
In the nineteenth century, the Western territories and states with their huge expanses of land became the main producers of grains and other crops in the United States. Easterners could not compete. New England farmland was already showing signs of exhaustion. But the one product with which they could compete, even with limited space, was dairy, and so they became great dairy producers. The extreme example was New York City, which produced tremendous quantities of milk with little space at all.
As transportation improved, milk could be brought to New York City by steamboats that traveled down the Hudson or by train. But the hours of transport on a hot summer day made this milk risky. Cities were probably the worst places for raw milk, but ironically, that was where the drinking of milk first caught on.
Milk drinking increased with the growth of cities. It was in cities that milk became the preferred substitute for breastfeeding and the food of choice for weaned toddlers and children.
Milk was supposed to be good for you. There was even a fashion for the “milk cure,” six weeks at a milk home drinking six quarts a day. Oddly, though, some of the true health benefits of milk had not yet been discovered. The role of calcium and phosphorus in bone development would not be fully understood until the early twentieth century. But with the industrial revolution and the growth of cities came a belief that breastfeeding was primitive and that the modern, industrialized, urbanized woman was no longer a nutritious milk provider; animal milk was a better alternative.
Old beliefs in the health benefits of whey lingered. Lydia Maria Child, America’s first great woman novelist, turned to writing cookbooks when her novels were blackballed because of her abolitionist stand. She and her husband, both antislavery activists, had little money, and she knew how to cook on a tight budget, publishing The Frugal Housewife in 1829 and The Family Nurse in 1837. In the later book she gives recipes for nine different wheys made by adding acid to fresh milk—vinegar, orange, cider, wine, lemon, and others. All were offered as cures for various complaints: lemon whey for high fevers, molasses whey for wet nurses with insufficient milk, mustard whey for low fevers and nervous fevers, etc.
Yet as the demand for animal milk in the cities increased, its quality grew worse. A few cows staked here and there could no longer provide enough milk for the many clamoring customers. Large stables holding hundreds of cows were established adjacent to breweries, and milk became a big, profitable business. The leftovers from making beer—the mash, or slop, or swill—were poured down wooden chutes into the dairies next door. But beer waste was not good feed for cows, and the milk they produced was low in fat and watery, with a light blue color. Producers added annatto to improve its color and chalk to give it body. They also added water to increase the amount of milk they could sell, and covered up the dilution by adding more chalk. Sometimes a little molasses was added, too, to give the concoction the slightly sweet flavor of fresh milk.
By the 1840s almost half the babies born in Manhattan were dying in infancy, mostly from cholera. There were many theories as to the cause of this high infant mortality rate, but it was Robert Milham Hartley, a temperance crusader, who was the first to put the blame on the milk produced in the brewery dairies.
Hartley was a social reformer with many causes. As a young man, he had left his job as a factory manager in the Mohawk Valley and moved to New York City to agitate for a variety of social issues, which he continued doing all his life. Temperance and the plight of the poor were his leading concerns for a period. But then he took on milk, which he believed to be a perfect food, at least when produced by wholesome methods. He was probably the inventor of the term “swill milk,” whose production he wanted to expose and stop. He reported that “about ten thousand cows in the city of New York and neighborhood are most inhumanely condemned to subsist on the residuum or slush of this grain, after it has undergone a chemical change, and reeking hot from distilleries.”
Hartley also reported that the crowded brewery stables were filthy and that many of the cows were sick and dying, but being milked nonetheless—sometimes even when they were too weak to stand and had to be held up by straps. He identified five hundred dairies in Manhattan and Brooklyn, most around the edges of the city, producing 5 million gallons of doctored bluish milk a year. Many were located near the Hudson River or between Fifteenth and Sixteenth streets, which was then the northern edge of the city. He reported that an unbearable stench came from the stables, as they had no cleaning facilities and no ventilation. He also noted that numerous European countries, notably England and Germany, had brewery dairies too, and that swill milk was being made in Boston, Cincinnati, and Philadelphia as well.
But Hartley’s most important point was the possible connection between the swill milk and the rise in infant mortality. In 1815 children under five had represented 33 percent of deaths in Boston, which was horrific enough, but by 1839, children under five represented 43 percent of Boston deaths. Children under five had represented 25 percent of deaths in Philadelphia and 32 percent in New York in 1815, but rose to over 50 percent in both cities in 1839. The rapid rise appeared to correspond with the rapid growth of milk production in brewery dairies. Was this milk poison?
The extent of the impact of Hartley’s 1842 book An Essay on Milk is uncertain, but it was the first to raise the issue of swill milk, and it started a debate on the subject that took fifteen years to fully erupt. In 1848 the New York Academy of Medicine studied swill milk and concluded that it had far less nutritional value than milk from farms. This was a significant finding, as a fair number of infant deaths were caused by malnutrition.
There was also another huge problem with the milk: microorganisms. But until the mid-nineteenth century, almost nothing was known of these invisible organisms, and it would take another forty years before their ability to spread disease was fully understood.
By 1855, the 700,000 New Yorkers living in the largest city in the United States were spending $6 million annually on milk. More than two thirds of that was spent on swill milk, and infant mortality was continuing to rise. Between the publication of Hartley’s book in 1842 and 1856, the percentage of children under five dying each year had more than tripled. More and more people began to wonder whether that rise was related to swill milk.