At Home

by Bill Bryson


  Start with salt. Salt is a cherished part of our diet for a very fundamental reason. We need it. We would die without it. It is one of about forty tiny specks of incidental matter—odds and ends from the chemical world—that we must get into our bodies to give ourselves the necessary zip and balance to sustain daily life. Collectively, those specks are known as vitamins and minerals, and there is a great deal—a really quite surprising amount—that we don’t know about them, including how many of them we need, what exactly some of them do, and in what amounts they are optimally consumed.

  That they were needed at all was a piece of knowledge that was an amazingly long time coming. Until well into the nineteenth century, the notion of a well-balanced diet had occurred to no one. All food was believed to contain a single vague but sustaining substance—“the universal aliment.” A pound of beef had the same value for the body as a pound of apples or parsnips or anything else, and all that was required of a human was to make sure that an ample amount was taken in. The idea that embedded within particular foods were vital elements that were central to one’s well-being had not yet been thought of. That’s not altogether surprising, because the symptoms of dietary deficiency—lethargy, aching joints, increased susceptibility to infection, blurred vision—seldom suggest dietary imbalance. Even today if your hair started to fall out or your ankles swelled alarmingly, it is unlikely your first thoughts would turn to what you had eaten lately. Still less would you think about what you hadn’t eaten. So it was with bewildered Europeans who for a very long time died in often staggering numbers without knowing why.

  Of scurvy alone it has been suggested that as many as two million sailors died between 1500 and 1850. Typically, scurvy killed about half the crew on any long voyage. Various desperate expedients were tried. Vasco da Gama on a cruise to India and back encouraged his men to rinse their mouths with urine, which did nothing for their scurvy and can’t have done much for their spirits either. Sometimes the toll was truly shocking. On a three-year voyage in the 1740s, a British naval expedition under the command of Commodore George Anson lost fourteen hundred men out of two thousand who sailed. Four were killed by enemy action; virtually all the rest died of scurvy.

  Over time, people noticed that sailors with scurvy tended to recover when they got to a port and received fresh foods, but nobody could agree what it was about those foods that helped them. Some thought it wasn’t the foods at all, but just a change of air. In any case, it wasn’t possible to keep foods fresh on long voyages, so simply identifying efficacious vegetables and the like was slightly pointless. What was needed was some kind of distilled essence—an antiscorbutic, as the medical men termed it—that would be effective against scurvy but portable, too. In the 1760s, a Scottish doctor named William Stark, evidently encouraged by Benjamin Franklin, conducted a series of patently foolhardy experiments in which he tried to identify the active agent by, somewhat bizarrely, depriving himself of it. For weeks he lived on only the most basic of foods—bread and water chiefly—to see what would happen. What happened was that in just over six months he killed himself, from scurvy, without coming to any helpful conclusions at all.

  In roughly the same period, James Lind, a naval surgeon, conducted a more scientifically rigorous (and personally less risky) experiment by finding twelve sailors who had scurvy already, dividing them into pairs, and giving each pair a different putative elixir—vinegar to one, garlic and mustard to another, oranges and lemons to a third, and so on. Five of the groups showed no improvement, but the pair given oranges and lemons made a swift and total recovery. Amazingly, Lind decided to ignore the significance of the result and doggedly stuck with his personal belief that scurvy was caused by incompletely digested food building up toxins within the body.

  It fell to the great Captain James Cook to get matters onto the right course. On his circumnavigation of the globe in 1768–71, Captain Cook packed a range of antiscorbutics to experiment on, including thirty gallons of carrot marmalade and a hundred pounds of sauerkraut for every crew member. Not one person died from scurvy on his voyage—a miracle that made him as much a national hero as his discovery of Australia or any of his other many achievements on that epic undertaking. The Royal Society, Britain’s premier scientific institution, was so impressed that it awarded him the Copley Medal, its highest distinction. The British navy itself was not so quick, alas. Even in the face of all the evidence, it procrastinated for another generation before finally providing citrus juice to sailors as a matter of routine.*

  The realization that an inadequate diet caused not only scurvy but a range of common diseases was remarkably slow to become established. Not until 1897 did a Dutch physician named Christiaan Eijkman, working in Java, notice that people who ate whole-grain rice didn’t get beriberi, a debilitating nerve disease, while people who ate polished rice very often did. Clearly some thing or things were present in some foods, and missing in others, and served as a determinant of well-being. It was the beginning of an understanding of “deficiency disease,” as it was known, and it won Eijkman the Nobel Prize in medicine even though he had no idea what these active agents were. The real breakthrough came in 1912, when Casimir Funk, a Polish biochemist working at the Lister Institute in London, isolated thiamine, or vitamin B1, as it is now more generally known. Realizing it was part of a family of molecules, he combined the terms vital and amines to make the new word vitamines. Although Funk was right about the vital part, it turned out that only some of the vitamines were amines (that is to say, nitrogen-bearing), and so the name was changed to vitamins to make it “less emphatically inaccurate,” in Anthony Smith’s nice phrase.

  Funk also asserted that there was a direct correlation between a deficiency of specific amines and the onset of certain diseases—scurvy, pellagra, and rickets in particular. This was a huge insight and had the potential to save millions of shattered lives, but unfortunately it wasn’t heeded. The leading medical textbook of the day continued to insist that scurvy was caused by any number of factors—“insanitary surroundings, overwork, mental depression and exposure to cold and damp” were the principal ones its authors thought worth listing—and only marginally by dietary deficiency. Worse still, in 1917 America’s leading nutritionist, E. V. McCollum of the University of Wisconsin—the very man who coined the terms vitamin A and B—declared that scurvy was not in fact a dietary deficiency disease at all, but was caused by constipation.

  Finally in 1939, a Harvard Medical School surgeon named John Crandon decided to settle matters once and for all by the age-old method of withholding vitamin C from his diet for as long as it took to make himself really ill. It took a surprisingly long time. For the first eighteen weeks, his only symptom was extreme fatigue. (Remarkably, he continued to operate on patients through this period.) But in the nineteenth week he took an abrupt turn for the worse—so much so that he would almost certainly have quickly died had he not been under close medical supervision. He was injected with 1,000 milligrams of vitamin C and was restored to life almost at once. Interestingly, he had never acquired the one set of symptoms that everyone associates with scurvy: the falling out of teeth and bleeding of gums.

  Meanwhile, it turned out that Funk’s vitamines were not nearly as coherent a group as originally thought. Vitamin B proved to be not one vitamin but several, which is why we have B1, B2, and so on. To add to the confusion, vitamin K has nothing to do with an alphabetical sequence. It was called K because its Danish discoverer, Henrik Dam, dubbed it Koagulations vitamin for its role in blood clotting. Later, folic acid (sometimes called vitamin B9) was added to the group. Two other vitamins—pantothenic acid and biotin—don’t have numbers or, come to that, much profile, but that is largely because they almost never cause us problems. No human has yet been found with insufficient quantities of either.

  The vitamins are, in short, a disorderly bunch. It is almost impossible to define them in a way that comfortably embraces them all. A standard textbook definition is that a vitamin is “an organic molecule not made in the human body which is required in small amounts to sustain normal metabolism,” but in fact vitamin K is made in the body, by bacteria in the gut. Vitamin D, one of the most vital substances of all, is actually a hormone, and most of it comes to us not through diet but through the magical action of sunlight on skin.

  Vitamins are curious things. It is odd, to begin with, that we cannot produce them ourselves when we are so very dependent on them for our well-being. If a potato can produce vitamin C, why can’t we? Within the animal kingdom only humans and guinea pigs are unable to synthesize vitamin C in their own bodies. Why us and guinea pigs? No point asking. Nobody knows. The other remarkable thing about vitamins is the striking disproportion between dosage and effect. Put simply, we need vitamins a lot, but we don’t need a lot of them. Three ounces of vitamin A, lightly but evenly distributed, will keep you purring for a lifetime. Your B1 requirement is even less—just one ounce spread over seventy or eighty years. But just try doing without those energizing specks and see how long it is before you start to fall to pieces.

  The same considerations apply exactly to the vitamins’ fellow particles, the minerals. The fundamental difference between vitamins and minerals is that vitamins come from the world of living things—from plants and bacteria and so on—and minerals do not. In a dietary context, minerals is simply another name for the chemical elements—calcium, iron, iodine, potassium, and the like—that sustain us. Ninety-two elements occur naturally on Earth, though some in only very tiny amounts. Francium, for instance, is so rare that it is thought that the whole planet may contain just twenty francium atoms at any given time. Of the rest, most pass through our bodies at some time or other, sometimes quite regularly, but whether they are important or not is still often unknown. You have a lot of bromine distributed through your tissues. It behaves as if it is there for a purpose, but nobody yet has worked out what that purpose might be. Remove zinc from your diet and you will get a condition known as hypogeusia, in which your taste buds stop working, making food boring or even revolting, but until as recently as 1977 zinc was thought to have no role in diet at all.

  Several elements, like mercury, thallium, and lead, seem to do nothing good for us and are positively detrimental if consumed excessively.* Others are also unnecessary but far more benign, of which the most notable is gold. That is why gold can be used as a filling for teeth: it doesn’t do you any harm. Of the rest, some twenty-two elements are known or thought to be of central importance to life, according to Essentials of Medical Geology. We are certain about sixteen of them; the other six we merely think are vital. Nutrition is a remarkably inexact science. Consider magnesium, which is necessary for the successful management of proteins within the cells. Magnesium abounds in beans, cereals, and leafy vegetables, but modern food processing reduces the magnesium content by up to 90 percent, effectively annihilating it. So most of us are not taking in anything like the recommended daily amount—not that anyone really knows what that amount should be. Nor can anybody specify the consequences of magnesium deficiency. We could be taking years off our lives, or points off our IQ, or the edge off our memory, or almost any other bad thing you care to suggest. We just don’t know. Arsenic is similarly uncertain. Obviously, if you get too much in your system you will very quickly wish you hadn’t. But we all get a little arsenic in our diets, and some authorities are absolutely certain it is vital to our well-being in these tiny amounts. Others are not so sure.

  Which brings us back, in a very roundabout way, to salt. Of all the minerals, the most vital in dietary terms is sodium, which we mostly consume in the form of sodium chloride—table salt.* Here the problem is that we are getting not too little, but possibly way too much. We don’t need all that much—200 milligrams a day, about what you would get with six or eight vigorous shakes of a salt cellar—but we take in about sixty times that amount on average. In a normal diet it is almost impossible not to overload on sodium, because there is so much salt in the processed foods we eat with such ravenous devotion. Often it is heaped into foods that don’t seem salty at all—breakfast cereals, prepared soups, and ice cream, for instance. Who would guess that an ounce of cornflakes contains more salt than an ounce of salted peanuts? Or that the contents of one can of soup—almost any can at all—will considerably exceed the total daily recommended salt allowance for an adult?

  Archaeological evidence shows that once people settled down in agricultural communities they began to suffer salt deficiencies—something that they had not experienced before—and so had to make a special effort to find salt and get it into their diet. One of the mysteries of history is how they knew they needed to do so, because the absence of salt in the diet awakens no craving. It makes you feel bad and eventually it kills you—without the chloride in salt, cells simply shut down like an engine without fuel—but at no point would a human being think: “Gosh, I could sure do with some salt.” So how they knew to go searching for it is an interesting question, particularly as in some places getting it required some ingenuity. Ancient Britons, for instance, heated sticks on a beach, then doused them in the sea and scraped the salt off. Aztecs, by contrast, acquired salt by evaporating their own urine. These are not intuitive acts, to put it mildly. Yet getting salt into the diet is one of the most profound urges in nature, and it is a universal one. Every society in the world in which salt is freely available consumes, on average, forty times the amount needed to sustain life. We just can’t get enough of the stuff.

  Salt is now so ubiquitous and cheap that we forget how intensely desirable it was once, but for much of history it drove men to the edges of the world. Salt was needed to preserve meats and other foods, and so was often required in vast quantities: Henry VIII had twenty-five thousand oxen slaughtered and salted for one military campaign in 1513. So salt was a hugely strategic resource. In the Middle Ages caravans of as many as forty thousand camels—enough to form a column seventy miles long—conveyed salt across the Sahara from Timbuktu to the lively markets of the Mediterranean.

  People have fought wars over it and been sold into slavery for it. So salt has caused some suffering in its time. But that is nothing compared with the hardship and bloodshed and murderous avarice associated with a range of tiny foodstuffs that we don’t need at all and could do perfectly well without. I refer to salt’s complements in the condiment world: the spices. Nobody would die without spices, but plenty have died for them.

  A very big part of the history of the modern world is the history of spices, and the story starts with an unprepossessing vine that once grew only on the Malabar coast of southwestern India. The vine is called Piper nigrum. If presented with it in its natural state, you would almost certainly struggle to guess its importance, but it is the source of all three “true” peppers—black, white, and green. The little round, hard peppercorns that we pour into our household pepper mills are actually the vine’s tiny fruit, dried to pack a gritty kick.* The difference between the varieties is simply a function of when they are picked and how they are processed.

  Pepper has been appreciated since time immemorial in its native territory, but it was the Romans who made it an international commodity. Romans loved pepper. They even peppered their desserts. Their attachment to it kept the price high and gave it a lasting value. Spice traders from the distant East couldn’t believe their luck. “They arrive with gold and depart with pepper,” one Tamil trader remarked in wonder. When the Goths threatened to sack Rome in 408, the Romans bought them off with a tribute that included three thousand pounds of pepper. For his wedding meal in 1468, Duke Karl of Bourgogne ordered 380 pounds of black pepper—far more than even the largest wedding party could eat—and displayed it conspicuously so that people could see how fabulously wealthy he was.

  Incidentally, the long-held idea that spices were used to mask rotting food doesn’t stand up to much scrutiny. The only people who could afford most spices were the ones least likely to have bad meat, and anyway spices were too valuable to be used as a mask. So when people had spices they used them carefully and sparingly, and not as a sort of flavorsome cover-up.

  Pepper accounted for some 70 percent of the spice trade by bulk, but other commodities from farther afield—nutmeg and mace, cinnamon, ginger, cloves, and turmeric, as well as several largely forgotten exotics such as calamus, asafoetida, ajowan, galangal, and zedoary—began to find their way to Europe, and these became even more valuable. For centuries spices were not just the world’s most valued foodstuffs, they were the most treasured commodities of any type. The Spice Islands, hidden away in the Far East, remained so desirable and prestigious and exotic that when James I gained possession of two small islets, it was such a coup that for a time he was pleased to style himself “King of England, Scotland, Ireland, France, Puloway and Puloroon.”

  Nutmeg and mace were the most valuable because of their extreme rarity.* Both came from a tree, Myristica fragrans, which was found on the lower slopes of just nine small volcanic islands rising sheer from the Banda Sea, amid a mass of other islands—none with quite the right soils and microclimates to support the nutmeg tree—between Borneo and New Guinea in what is now Indonesia. Cloves, the dried flower buds of a type of myrtle tree, grew on six similarly selective islands some two hundred miles to the north in the same chain, known to geography as the Moluccas but to history as the Spice Islands. Just to put this in perspective, the Indonesian archipelago consists of sixteen thousand islands scattered over 735,000 square miles of sea, so it is little wonder that the locations of fifteen of them remained a mystery to Europeans for so long.

  All of these spices reached Europe through a complicated network of traders, each of whom naturally took a cut. By the time they reached European markets, nutmeg and mace fetched as much as sixty thousand times what they sold for in the Far East. Inevitably, it was only a matter of time before those at the end of the supply chain concluded it would be a lot more lucrative to cut out the intermediate stages and get all the profits at the front end.

 
