The Horse, the Wheel, and Language: How Bronze-Age Riders From the Eurasian Steppes Shaped the Modern World


by David W. Anthony


  WHAT IS A DOMESTICATED HORSE?

  We decided to investigate bit wear on horse teeth, because it is difficult to distinguish the bones of early domesticated horses from those of their wild cousins. The Russian zoologist V. Bibikova tried to define a domesticated skull type in 1967, but most zoologists found her small sample of horse skulls insufficient to define a reliable type.

  The bones of wild animals usually are distinguished from those of domesticated animals by two quantifiable measurements: measurements of variability in size, and counts of the ages and sexes of butchered animals. Other criteria include finding animals far outside their natural range and detecting domestication-related pathologies, of which bit wear is an example. Crib biting, a stall-chewing vice of bored horses, might cause another domestication-related pathology on the incisor teeth of horses kept in stalls, but it has not been studied systematically. Marsha Levine of the McDonald Institute at Cambridge University has examined riding-related pathologies in vertebrae, but vertebrae are difficult to study. They break and rot easily, their frequency is low in most archaeological samples, and only eight caudal thoracic vertebrae (T11–18) are known to exhibit pathologies from riding. Discussions of horse domestication still tend to focus on the first two methods.8

  The Size-Variability Method

  The size-variability method depends on two assumptions: (1) domesticated populations, because they are protected, should contain a wider variety of sizes and statures that survive to adulthood, or more variability; and (2) the average size of the domesticated population as a whole should decline, because penning, control of movement, and a restricted diet should reduce average stature. Measurements of leg bones (principally the width of the condyle and shaft) are used to look for these patterns. This method seems to work quite well with the leg bones of cattle and sheep: an increase in variability and reduction in average size does apparently identify domesticated cattle and sheep.
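The two criteria can be expressed as a simple check. The sketch below is an illustration only: the measurements are invented, and it uses a plain sample standard deviation rather than the two-standard-deviation boxes plotted in figure 10.3.

```python
import statistics

# Invented condyle-width samples in mm -- NOT measurements from the book.
wild_site = [28.0, 28.5, 29.0, 29.5, 30.0, 30.5]
candidate_site = [25.0, 26.5, 28.0, 29.5, 31.0, 32.5]

def looks_domesticated(baseline, sample):
    """Apply both size-variability criteria against a wild baseline:
    (1) a lower average size, and (2) a wider spread of sizes."""
    smaller = statistics.mean(sample) < statistics.mean(baseline)
    more_variable = statistics.stdev(sample) > statistics.stdev(baseline)
    return smaller and more_variable

print(looks_domesticated(wild_site, candidate_site))
```

Both criteria must hold at once; a population that is merely smaller, or merely more variable, is not flagged.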

  But the underlying assumptions are not known to apply to the earliest domesticated horses. American Indians controlled their horses not in a corral but with a “hobble” (a short rope tied between the two front legs, permitting a walk but not a run). The principal advantage of early horse keeping—its low cost in labor—could be realized only if horses were permitted to forage for themselves. Pens and corrals would defeat this purpose. Domesticated horses living and grazing in the same environment with their wild cousins probably would not show a reduction in size, and might not show an increase in variability. These changes could be expected if and when horses were restricted to shelters and fed fodder over the winter, like cattle and sheep were, or when they were separated into different herds that were managed and trained differently, for example, for riding, chariot teams, or meat and milk production.

  During the earliest phase of horse domestication, when horses were free-ranging and kept for their meat, any size reductions caused by human control probably would have been obscured by natural variations in size between different regional wild populations. The scattered wild horses living in central and western Europe were smaller than the horses that lived in the steppes. In figure 10.3, the three bars on the left of the graph represent wild horses from Ice Age and Early Neolithic Germany. They were quite small. Bars 4 and 5 represent wild horses from forest-steppe and steppe-edge regions, which were significantly bigger. The horses from Dereivka, in the central steppes of Ukraine, were bigger still; 75% stood between 133 and 137 cm at the withers, or between 13 and 14 hands. The horses of Botai in northern Kazakhstan were even bigger, often over 14 hands. West-east movements of horse populations could cause changes in their average sizes, without any human interference. This leaves an increase in variability as the only indicator of domestication during the earliest phase. And variability is very sensitive to sample size—the larger the sample of bones, the better the chance of finding very small and very large individuals—so changes in variability alone are difficult to separate from sample-size effects.
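The sample-size effect is easy to see in a simulation. This sketch uses invented numbers, not data from the book: it draws samples of increasing size from a single unchanging population, and the observed extremes (the "whiskers") widen as the sample grows even though the underlying variability never changes.

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is repeatable

# One hypothetical wild population with a fixed mean and spread
# (leg-bone widths in mm, values invented for illustration).
MEAN_MM, SD_MM = 30.0, 2.0

results = {}
for n in (10, 100, 1000):
    sample = [random.gauss(MEAN_MM, SD_MM) for _ in range(n)]
    # The range (max - min) grows with sample size, while the
    # standard deviation stays close to the true value of 2.0.
    results[n] = (max(sample) - min(sample), statistics.stdev(sample))
    print(f"n={n:4d}  range={results[n][0]:5.1f} mm  sd={results[n][1]:4.2f} mm")
```

This is why the extremes are unreliable indicators of population variability, and why standard-deviation-based measures are compared instead.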

  Figure 10.3 The size-variability method for identifying the bones of domesticated horses. The box-and-whisker graphs show the thickness of the leg bones for thirteen archaeological horse populations, with the oldest sites (Paleolithic) on the left and the youngest (Late Bronze Age) on the right. The whiskers, showing the extreme measurements, are most affected by sample size and so are unreliable indicators of population variability. The white boxes, showing two standard deviations from the mean, are reliable indicators of variability, and it is these that are usually compared. The increase in this measurement of variability in bar 10 is taken as evidence for the beginning of horse domestication. After Benecke and von den Dreisch 2003, figures 6.7 and 6.8 combined.

  The domestication of the horse is dated about 2500 BCE by the size-variability method. The earliest site that shows both a significant decrease in average size and an increase in variability is the Bell Beaker settlement of Csepel-Háros in Hungary, represented by bar 10 in figure 10.3, and dated about 2500 BCE. Subsequently many sites in Europe and the steppes show a similar pattern. The absence of these statistical indicators at Dereivka in Ukraine, dated about 4200–3700 BCE (see chapter 11), and at Botai-culture sites in northern Kazakhstan, dated about 3700–3000 BCE, is widely accepted as evidence that horses were not domesticated before about 2500 BCE. But marked regional size differences among early wild horses, the sensitivity of variability measurements to sample size effects, and the basic question of the applicability of these methods to the earliest domesticated horses are three reasons to look at other kinds of evidence. The appearance of significant new variability in horse herds after 2500 BCE could reflect the later development of specialized breeds and functions, not the earliest domestication.9

  Age-at-Death Statistics

  The second quantifiable method is the study of the ages and sexes of butchered animals. The animals selected for slaughter from a domesticated herd should be different ages and sexes from those obtained by hunting. Herders would probably cull young males as soon as they reached adult meat weight, at about two to three years of age. A site occupied by horse herders might contain very few obviously male horses, since the eruption of the canine teeth in males, the principal marker of gender in horse bones, happens at about age four or five, after the age when the males should have been slaughtered for food. Females should have been kept alive as breeders, up to ten years old or more. In contrast, hunters prey on the most predictable elements of a wild herd, so they would concentrate their efforts on the standard wild horse social group, the stallion-with-harem bands, which move along well-worn paths and trails within a defined territory. Regular hunting of stallion-with-harem bands would yield a small number of prime stallions (six to nine years old) and a large number of breeding-age females (three to ten years old) and their immature young.10

  But many other hunting and culling patterns are possible, and might be superimposed on one another in a long-used settlement site. Also, only a few bones in a horse’s body indicate sex—a mature male (more than five years old) has canine teeth whereas females usually do not, and the pelvis of a mature female is distinctive. Horse jaws with the canines still embedded are not often preserved, so data on gender are spotty. Age is estimated based on molar teeth, which preserve well, so the sample for age estimation usually is bigger. But assigning a precise age to a loose horse molar, not found in the jaw, is difficult, and teeth are often found loose in archaeological sites. We had to invent a way to narrow down the very broad range of ages that could be assigned to each tooth. Further, teeth are part of the head, and heads may receive special treatment. If the goal of the analysis is to determine which horses were culled for food, heads are not necessarily the most direct indicators of the human diet. If the occupants of the site kept and used the heads of prime-age stallions for rituals, the teeth found in the site would reflect that, and not culling for food.11

  Marsha Levine studied age and sex data at Dereivka in Ukraine (4200–3700 BCE) and Botai in northern Kazakhstan (3700–3000 BCE), two critical sites for the study of horse domestication in the steppes. She concluded that the horses at both sites were wild. At Dereivka the majority of the teeth were from animals whose ages clustered between five and seven years old, and fourteen of the sixteen mandibles were from mature males.12 This suggested that most of the horse heads at Dereivka came from prime-age stallions, not the butchering pattern expected for a managed population. But, in fact, it is an odd pattern for a hunted population as well. Why would hunters kill only prime stallions? Levine suggested that the Dereivka hunters had stalked wild horse bands, drawing the attention of the stallions, which were killed when they advanced to protect their harems. But stalking in the open steppe is probably the least productive way for a pedestrian hunter to attack a wild horse band, as stallions are more likely to alarm their band and run away than to approach a predator. Pedestrian hunters should have used ambush methods, shooting at short range on a habitually used horse trail. Moreover, the odd stallion-centered slaughter pattern of Dereivka closely matches the slaughter pattern at the Roman military cemetery at Kesteren, the Netherlands (figure 10.4), where the horses certainly were domesticated. At Botai, in contrast, the age-and-sex profile matched what would be expected if whole wild herds were slaughtered en masse, with no selection for age or sex. The two profiles were dissimilar, yet Levine concluded that horses were wild at both places. Age and sex profiles are open to many different interpretations.

  If it is difficult to distinguish wild from domesticated horses, it is doubly problematic to distinguish the bones of a mount from those of a horse merely eaten for dinner. Riding leaves few traces on horse bones. But a bit leaves marks on the teeth, and teeth usually survive very well. Bits are used only to guide horses from behind, to drive or to ride. They are not used if the horse is pulled from the front, as a packhorse is, as this would just pull the bit out of the mouth. Thus bit wear on the teeth indicates riding or driving. The absence of bit wear means nothing, since other forms of control (nosebands, hackamores) might leave no evidence. But its presence is an unmistakable sign of riding or driving. That is why we pursued it. Bit wear could be the smoking gun in the long argument over the origins of horseback riding and, by extension, in debates over the domestication of the horse.

  Figure 10.4 The age-at-death method for identifying the bones of domesticated horses. This graph compares the age-at-death statistics for Late Eneolithic horses from Dereivka, Ukraine, to domesticated horses from the Roman site of Kesteren, Netherlands. The two graphs are strikingly similar, but one is interpreted as a “wild” profile and the other is “domesticated.” After Levine 1999, figure 2.21.

  BIT WEAR AND HORSEBACK RIDING

  After Brown and I left the Smithsonian in 1985 we spent several years gathering a collection of horse lower second premolars (P2s), the teeth most affected by bit chewing. Eventually we collected 139 P2s from 72 modern horses. Forty were domesticated horses processed through veterinary autopsy labs at the University of Pennsylvania and Cornell University. All had been bitted with modern metal bits. We obtained information on their age, sex, and usage—hunting, leisure, driving, racing, or draft—and for some horses we even knew how often they had been bitted, and with what kind of bit. Thirteen additional horses came from the Horse Training and Behavior program at the State University of New York at Cobleskill. Some had never been bitted. We made casts of their teeth in their mouths, much as a dentist makes an impression to fit a crown—we think that we were the first people to do this to a living horse. A few feral horses, never bitted, were obtained from the Atlantic barrier island of Assateague, MD. Their bleached bones and teeth were found by Ron Keiper of Penn State, who regularly followed and studied the Assateague horses and generously gave us what he had found. Sixteen Nevada mustangs, killed in 1988 by ranchers, supplied most of our never-bitted P2s. I read about the event, made several telephone calls, and was able to get their mandibles from the Bureau of Land Management after the kill sites were documented. Many years later, in a separate study, Christian George at the University of Florida applied our methods to 113 more never-bitted P2s from a minimum of 58 fossil equids 1.5 million years old. These animals, of the species Equus “leidyi,” were excavated from a Pleistocene deposit near Leisey, Florida. George’s Leisey equids (the same size, diet, and dentition as modern horses) had never seen a human, much less a bit.13

  We studied high-resolution casts or replicas of all the P2s under a Scanning Electron Microscope (SEM). The SEM revealed that the vice of bit chewing was amazingly widespread (figure 10.5). More than 90% of the bitted horses showed some wear on their P2s from chewing the bit, often just on one side. Their bits also showed wear from being chewed. Riding creates the same wear as driving, because it is not the rider or driver who creates bit wear—it is the horse grasping and releasing the bit between its teeth. A metal bit or even a bone bit creates distinctive microscopic abrasions on the occlusal enamel of the tooth, usually confined to the first or metaconid cusp, but extending back to the second cusp in many cases. These abrasions (type “a” wear, in our terminology) are easily identified under a microscope. All bits, whether hard (metal or bone) or soft (rope or leather) also create a second kind of wear: a wear facet or bevel on the front (mesial) corner of the tooth. The facet is caused both by direct pressure (particularly with a hard bit of bone or metal), which weakens and cracks the enamel when the bit is squeezed repeatedly between the teeth; and by the bit slipping back and forth over the front or mesial corner of the P2. Metal bits create both kinds of wear: abrasions on the occlusal enamel and wear facets on the mesial corner of the tooth. But rope bits probably were the earliest kind. Can a rope bit alone create visible wear on the enamel of horse teeth?

  With a grant from the National Science Foundation and the cooperation of the State University of New York (SUNY) at Cobleskill we acquired four horses that had never been bitted. They were kept and ridden at SUNY Cobleskill, which has a Horse Training and Behavior Program and a thirty-five-horse stable. They ate only hay and pasture, no soft feeds, to mimic the natural dental wear of free-range horses. Each horse was ridden with a different organic bit—leather, horsehair rope, hemp rope, or bone—for 150 hours, or 600 hours of riding for all four horses. The horse with the horsehair rope bit was bitted by tying the rope around its lower jaw in the classic “war bridle” of the Plains Indians, yet it was still able to loosen the loop with its tongue and chew the rope. The other horses’ bits were kept in place by antler cheek-pieces made with flint tools. At four intervals each horse was anaesthetized by a bemused veterinarian, and we propped open its mouth, brushed its teeth, dried them, pulled its tongue to the side, and made molds of its P2s (figure 10.6). We tracked the progress of bit wear over time, and noted the differences between the wear made by the bone bit (hard) and the leather and rope bits (soft).14

  Figure 10.5 Bit wear and no wear on the lower second premolars (P2s) of modern horses.

  Left: A Scanning Electron Micrograph (SEM) taken at 13x of “a-wear” abrasions on the first cusp of a domesticated horse that was bitted with a metal bit. The profile shows a 3.5 mm bevel or facet on the same cusp.

  Right: An SEM taken at 15x of the smooth surface of the first cusp of a feral horse from Nevada, never bitted. The profile shows a 90˚ angle with no bevel.

  Figure 10.6 Brown and Anthony removing a high-resolution mold of the P2 of a horse bitted with an organic bit at State University of New York, Cobleskill, in 1992.

  The riding experiment demonstrated that soft bits do create bit wear. The actual cause of wear might have been microscopic grit trapped in and under the bit, since all the soft bits were made of materials softer than enamel. After 150 hours of riding, bits made of leather and rope wore away about 1 mm of enamel on the first cusp of the P2 (figure 10.7). The mean bevel measurement for the three horses with rope or leather bits at the end of the experiment was more than 2 standard deviations greater than the pre-experiment mean.15 The rope and leather mouthpieces stood up well to chewing, although the horse with the hemp rope bit chewed through it several times. The horses bitted with soft bits showed the same wear facet on the same part of the P2 as horses bitted with metal and bone bits, but the surface of the facet was microscopically smooth and polished, not abraded. Hard bits, including our experimental bone bit, create distinctive “a” wear on the occlusal enamel of the facet, but soft bits do not. Soft bit wear is best identified by measuring the depth of the wear facet or bevel on the P2, not by looking for abrasions on its surface.

  Figure 10.7 Graph showing the increase in bevel measurements in millimeters caused by organic bits over 150 hours of riding, with projections of measurements if riding had continued for 300 hours.

  TABLE 10.1 Bevel Measurements on the P2s of Bitted and Never-Bitted Mature (>3 yr) Horses

  Table 10.1 shows bevel measurements for modern horses that never were bitted (left column); Pleistocene North American equids that never were bitted (center left column); domestic horses that were bitted, including some that were bitted infrequently (center right column); and a smaller sub-group of domestic horses that were bitted at least five times a week up to the day we made molds of their teeth (right column). Measurements of the depth of the wear facet easily distinguished the 73 teeth of bitted horses from the 105 teeth of never-bitted horses. The never-bitted/bitted means are different at better than the .001 level of significance. The never-bitted/daily-bitted means are more than 4 standard deviations apart. Bevel measurements segregate mature bitted from mature never-bitted horses, as populations.16
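As a rough illustration of how such a separation is computed, the sketch below uses invented bevel depths (not the measurements in table 10.1) and reports the distance between the group means in units of pooled standard deviation.

```python
import statistics
from math import sqrt

# Illustrative (invented) bevel depths in mm -- NOT the study's data.
never_bitted = [0.5, 0.8, 1.0, 1.2, 0.7, 0.9, 1.1, 0.6]
daily_bitted = [3.2, 4.1, 3.8, 5.0, 4.4, 3.6, 4.9, 4.2]

def sd_separation(a, b):
    """Distance between two group means, in pooled standard deviations."""
    pooled = sqrt((statistics.variance(a) + statistics.variance(b)) / 2)
    return abs(statistics.mean(a) - statistics.mean(b)) / pooled

sep = sd_separation(never_bitted, daily_bitted)
print(f"means {statistics.mean(never_bitted):.2f} vs "
      f"{statistics.mean(daily_bitted):.2f} mm; "
      f"separation = {sep:.1f} pooled SDs")
```

A separation of several pooled standard deviations means the two distributions barely overlap, which is what lets the measurement segregate the groups as populations.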

  We set a bevel measurement of 3.0 mm as the minimum threshold for recognizing bit wear on archaeological horse teeth (figure 10.8). More than half of our occasionally bitted teeth did not exhibit a bevel measuring as much as 3 mm. But all horses in our sample with a bevel of 3 mm or more had been bitted. So the last question was, how adequate was our sample? Could a 3 mm wear facet occur naturally on a wild horse P2, caused by malocclusion? Criticisms of bit wear have centered on this problem.17
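The 3.0 mm rule amounts to a deliberately conservative classifier: values below the threshold are indeterminate rather than "wild," because the absence of bit wear proves nothing. A minimal sketch, with invented sample values:

```python
# Threshold from the text; the sample depths below are hypothetical.
BIT_WEAR_THRESHOLD_MM = 3.0

def classify(bevel_mm):
    """Conservative rule: flag bit wear only at or above 3.0 mm.
    Shallower bevels are indeterminate, not evidence of a wild horse."""
    return "bitted" if bevel_mm >= BIT_WEAR_THRESHOLD_MM else "indeterminate"

samples = {"feral_nevada": 0.4, "occasionally_bitted": 2.1, "daily_ridden": 4.3}
for name, depth_mm in samples.items():
    print(f"{name}: {depth_mm} mm -> {classify(depth_mm)}")
```

The threshold trades sensitivity for specificity: many occasionally bitted horses fall below it, but nothing in the reference sample above it was unbitted.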

 
