The introduction of voluntary enlistment proved attractive mostly to the urban poor and the surplus landless population that had manned armies for centuries. No longer driven by the cudgel or impressment, these social elements were attracted by the prospect of regular food and pay. The health of these recruits, however, proved to be as generally poor as it had historically been. In periods of social disruption or economic hardship, recruits flooded the recruiting stations, and large numbers of marginally healthy adults with poor sanitary habits entered military service. The huge losses to disease in the wars of the period led military officials to introduce regular physical examinations for recruits. For the first quarter of the century, however, the unit or regimental commander conducted only cursory examinations of recruits. Beginning in 1726, the French Army instituted regular medical examinations. After 1763, each recruit regiment had a surgeon whose duty was to examine recruits for physical fitness and to weed out those whose health failed during training. In France, an inspector general of recruiting was appointed in 1778 and charged with overseeing the selection and health of new troops. Mandatory medical examinations were not instituted in the British Army until 1790. Prussia, meanwhile, had required regimental and battalion medical officers to conduct regular physical examinations of all soldiers since 1788.8
France was the first country to institute uniform clothing, as early as the 1670s.9 English regulation military dress was mandated in 1751, following similar regulations issued by Frederick Wilhelm of Prussia (1688–1740) a few years earlier.10 As noted in chapter 3, the purpose of uniform clothing was to make friendly units easier to identify on the smoky battlefield, but the leadership gave little thought to the effects of this clothing on the health and endurance of the soldier. Uniforms were most often made of cheap cotton that provided little warmth in cold climates and no protection from the rain. Tight stockings often restricted circulation and provided no padding inside the leather buckle shoes, which offered little defense against frostbite or trench foot. Tight buttons and belts often restricted the soldier's breathing, and the heavy, high-crowned shakos and hats added to his load without preventing head wounds from enemy shell fragments and bullets. It would be at least another two centuries before anyone seriously considered designing a uniform for battlefield use with the health, comfort, and protective needs of the soldier in mind.11
The standard military ration, meanwhile, did much to improve the soldier's general health, and most soldiers ate better and more regularly in military messes than they had in civilian life.12 Rations were provided by a central commissary at government expense as a matter of right.13 In France, the soldier's daily allowance was twenty-four ounces of wheat bread, one pound of meat, and one pint of wine or two pints of beer. Frederick the Great provided his soldiers with two pounds of bread daily and two pounds of meat a week.14 Unfortunately, the promise of regular, good-quality food was more often broken than honored. All European armies relied on a supply system in which commissary officers contracted with provisioners, sutlers, and transporters for supplies. This arrangement invited fraud and theft, and the pressure to keep expenditures down often left the troops' rations insufficient in both quantity and quality.
Before examining the improvements in military surgery, it is worth exploring the casualty burdens that the medical establishments of armies under fire had to bear. By the eighteenth century, armies were universally equipped with more accurate and deadly firearms, and the introduction of truly mobile artillery of increased range had the inevitable effect of greatly increasing casualties. Even in the early part of the century, these innovations had an enormous impact on casualty rates. For example, during the War of the Spanish Succession (1701–1714), the allied armies at the Battle of Blenheim in 1704 suffered 5,000 dead and 8,900 wounded. British forces alone endured 670 dead and 1,500 wounded, while the Bavarian armies suffered 12,000 dead and 14,000 wounded. Two years later at Ramillies, the French lost 2,000 dead and 5,000 wounded. Casualties on this scale were not unusual during the period, and they routinely overwhelmed the primitive field medical establishments of the day.
In this era, military surgery improved markedly and introduced a number of new techniques. Some of the old wound treatments, such as sympathetic powder and wound salve, mainstays of the previous century, disappeared, giving way to the more extensive use of styptics to stop minor bleeding. Pressure sponges, alcohol, and turpentine came into widespread use for minor wounds. Military surgeons still cauterized arteries, but less frequently, as they increasingly used the new locked forceps in applying ligatures. They also increasingly applied Petit's screw tourniquet, which made thigh amputations practicable and greatly reduced the risk associated with amputations above the knee. Military surgeons placed greater emphasis on preparing limbs for prostheses, and flap and lateral incision amputations became common.
Although more surgeons questioned the doctrine that suppuration of wounds was necessary and inevitable, many still provoked infection by inserting charpie and other foreign matter into wounds. While they continued to use the old oils and salve dressings, the new technique of applying dry bandages moistened only with water held much practical promise. That many of the old chemical and salve treatments endured is not surprising, since doctors often prepared and sold these potions themselves at considerable profit. The practice of enlarging and probing battle wounds continued unabated, though the new technique of debridement was gaining acceptance as an alternative. Despite a literature that established clinical circumstances and guidelines for carrying out the procedure, the improvements in amputation surgery inevitably provoked a spate of unnecessary operations that continued for many years. Yet, without doubt, military surgery was improving at a rapid pace as military surgeons learned new and better techniques of wound treatment.
John Hunter (1728–1793) is generally credited with the first real improvements in understanding the nature of wound treatment. He began his career as an anatomist, only later becoming a surgeon, and his training in linking anatomy to clinical signs of pathology served him well. He accepted an army commission in the middle of the Seven Years' War (1756–1763) against France and gained valuable surgical experience at the Battle of Belle Île (1761). Afterward he argued against the normal practice of enlarging gunshot wounds and against bloodletting, urging instead a conservative approach to treating gunshot wounds. His Treatise on the Blood, Inflammation, and Gunshot Wounds, published in 1794, is regarded as a major milestone in the surgical treatment of battle wounds.
Pierre-Joseph Desault (1738–1795) coined the term “débridement” and recommended against the common practice of enlarging wounds. His technique called for cutting away only the necrotic tissue within the wound, thereby removing a source of infection. Desault was the first to use the technique on traumatic wounds.15
For decades bullet wounds to the head had carried a great risk of infection. Because doctors believed that blood accumulating in the extradural or subdural spaces would eventually become pus, they allowed it to remain as a seat of infection. The English surgeon Percival Pott (1713–1788) was the first to argue against this practice, suggesting that the residual blood could be removed by cranial drainage. His contribution is often cited as a major advance in cranial surgery.16 For decades surgeons had routinely operated on all head wounds with exploratory trephination. Many of these operations were unnecessary and exposed the patient to great risk of infection. Near the end of the century, Sylvester O'Halloran (1728–1807), an Irish surgeon, demonstrated that exploratory trephination was usually not needed, and within a decade the practice generally came to an end. O'Halloran was also the first to improve the treatment of penetrating head wounds by regularly employing debridement.17
Habits of personal, medical, and surgical cleanliness were still dismal during this period, and the soldier faced a greater risk to his health while in the hospital than on the battlefield. It has been estimated that
in the American Revolutionary War (1775–1783), the Continental soldier had ninety-eight chances out of a hundred of escaping death on the battlefield, but once he was hospitalized, his chances of surviving medical treatment and exposure to disease and infection fell to 75 percent.18 Some surgeons, however, did perceive a relationship between cleanliness and surgical infection. Claude Pouteau (1724–1775), a French surgeon, made cleanliness a requirement in his operating area and achieved the remarkable result of losing to infection only 3 of the 120 lithotomy patients he operated on. John Pringle (1707–1782), the famous English physician and surgeon, coined the term “antiseptic” in 1750 and in 1753 published the results of forty-three experiments, performed over a three-year period, that confirmed the antiseptic value of mineral acids.19 In 1737 Alexander Monro I, who also insisted on cleanliness, claimed to have performed fourteen amputations at his Edinburgh surgery with no hospital mortality. By 1752, he had performed more than a hundred major amputations with a hospital mortality rate of only 8 percent, an achievement all the more remarkable given that for the next century the mortality rate for hospital surgery generally ran between 45 and 65 percent.20 Despite this overwhelming evidence of the relation between cleanliness and infection, however, the work of surgeons Pringle, Monro, and Edward Alanson (1747–1823) was largely ignored until the next century, when Joseph Lister (1827–1912) introduced general antisepsis. One can only guess how many soldiers would have survived their wounds had they not been exposed to infection while recovering in the hospital.
While the eighteenth century saw numerous improvements in the establishment and organization of military hospitals, especially the introduction of mobile field hospitals that accompanied armies on the march, these hospitals still offered the same unsanitary, dismal care they had in the previous century. Hospital buildings were often little more than hastily constructed huts in the field.21 While every army had a hospital medical organization to provide treatment and administration, these organizations were rarely fully staffed. Moreover, there was a notorious lack of coordination among regimental, field, and general rear-area hospitals, especially in the provision of medical supplies. Few armies had any organized, dedicated transport to move the wounded from the front to rear-area hospitals, and it was not uncommon for a third of the patients to die en route. No army developed a satisfactory solution for extracting the wounded from the battle line, and troops usually made their way to the medical facilities as best they could. As in the previous century, some armies, notably Prussia's, actually forbade attempts to treat the wounded until the battle had ended. No one seems to have thought of copying the Romans' practice of stationing combat medics within the battle units themselves.
Disease continued to be the major threat to military manpower despite military physicians' many attempts at preventive medicine. Among Continental soldiers in the American Revolution, disease caused 90 percent of all deaths; among British regulars, the figure was 84 percent.22 Hugues Ravaton, the French military surgeon, noted in 1768 that one of every hundred soldiers in the French Army would be unfit for duty because of illness at the start of a field campaign. Halfway through the campaign, another five or six would drop out of combat because of disease, and by the campaign's end ten to twelve more would be too ill to fight. By comparison, the death and injury rate from combat fire was approximately one man in ten.23
The range of diseases that afflicted the troops had changed little from the previous century. Respiratory illnesses were most often seen in cold weather and dysentery-like conditions in hot climates. Disease diagnosis had not yet become a science, and descriptions of disease from this period cannot be entirely trusted. The common “intermittent” and “remittent” fevers of the day were most probably malaria, a disease then widespread in Europe and the colonial dominions. The conditions called “putrid,” “jail,” or “hospital fever” were probably typhus or typhoid. Dysentery and other stomach disorders were rampant, often as a consequence of poor field hygiene. Venereal disease was almost epidemic, and pneumonia and pleurisy were also common threats. The records of the period also show that scabies was endemic. The infestation caused constant scratching, which produced serious skin infections requiring medical treatment. Although not fatal, scabies brought more patients to British Army hospitals during the Seven Years' War than did any other medical condition.24 Scabies continued to plague armies into modern times: in World War I, scabies and the pyodermas produced by constant scratching accounted for 90 percent of the illness for which Allied troops sought hospitalization or field treatment.25
Smallpox was among the most debilitating and dangerous diseases that afflicted field armies, with numerous examples of entire campaigns being halted as a consequence of outbreaks. In 1775, Gen. Horatio Gates (1727–1806) had to break off the American Northern Army Command’s campaign for five weeks because an outbreak of smallpox sent 5,500 of his 10,000 troops to the hospital.26 The disease afflicted the civilian population without mercy in multiple epidemics that marked the period.
The prevalence of smallpox in the civilian and military populations drove one of the more important medical advances of the century: inoculation. Credit for introducing protective inoculation is generally given to Edward Jenner (1749–1823), but in fact inoculation against smallpox was an established practice long before Jenner formalized the method. The practice of inoculating the healthy with matter taken from smallpox pustules, known as variolation, was common in the Ottoman Empire before it was introduced into Europe. Lady Mary Wortley Montagu, the wife of the British ambassador to Constantinople, knew of the practice as early as 1718 and had her two children inoculated against the disease.27 In 1721, a smallpox epidemic broke out in Boston, and Dr. Zabdiel Boylston (1679–1766) used inoculation to prevent its further spread. He inoculated 247 persons with a loss rate of only 2 percent, compared with the usual 15 percent death rate.28 Jenner's contributions seem to have been substituting the far safer cowpox matter for smallpox matter and being the first person to conceive of inoculating whole populations against the disease, for which he developed the necessary popular support. Jenner did not perform his first cowpox vaccination until 1796.
The first army to attempt wholesale inoculation of its soldiers was the American Army. In 1775, noting General Gates's debacle with smallpox, Gen. George Washington (1732–1799) obtained the approval of the Continental Congress to inoculate recruits upon their entry into military service.29 The program was less than successful, however, and we do not know how many soldiers actually received inoculations. The British Army did not allow inoculation against smallpox until 1798, when the Sick and Wounded Board authorized the procedure at military hospitals for those who wanted it. As the century ended, there was still no mandatory inoculation for British troops.30 The successful immunization of military forces had to await the next century, when vaccination became more generally accepted. Holland and Prussia were the first countries to require vaccination of all their troops, while the French and English continued to lag behind. In the Franco-Prussian War of 1870–1871, unvaccinated French prisoners suffered 14,178 cases of smallpox, of which 1,963 were fatal. The vaccinated German troops suffered only 4,835 cases, of which only 178 died, a mortality rate of less than 4 percent.31
Recognizing the importance of military medical care in maintaining the fighting ability of their armies, some states established mechanisms for ensuring an adequate supply of surgeons and other field medical personnel. In the first quarter of the century, the French established schools for training surgeons and surgeon's mates at a number of army and navy hospitals. The most important medico-military institution of the century was established when the French opened the Académie Royale de Chirurgie in Paris in 1731. That five of its seven directors and half of the forty members nominated by the king were prominent military surgeons who had served in battle attests to the academy's dedication to military medical matters. Further, army and naval surgeons wrote more than a third of the period's four volumes of medical papers.32 Saxony followed the French example and established an army medical school in 1748, and additional military medical schools were founded in Austria in 1784 and in Berlin in 1795. In 1766, Richard de Hautesierck (1712–1789), inspector of hospitals, published the world's first medical journal devoted exclusively to military medicine.33
Gradually armies established regular field medical facilities. At the Battle of Fontenoy in 1745, the British military medical service treated the wounded on the first line and collected them at ambulance stations. Surgeons performed capital operations at medical stations behind the lines and then transferred the more seriously wounded to hospitals prepared for them in nearby cities and towns. When these hospitals became overcrowded, the army arranged to ship the wounded farther to the rear. Although this model was becoming commonplace in all armies of the period, military medical facilities did not operate so efficiently as a matter of course. More commonly, medical facilities were understaffed, poorly supplied, short of transport, and overwhelmed by large numbers of casualties. Nonetheless, the structural articulation that the armies of the day were demonstrating in other areas was also evident in their attempts to provide better medical care for the soldier. It would take yet another century, but the seeds of a full-time professional military medical service sown in the eighteenth century would eventually come to fruition.
Before examining the development of the national military medical services, it is worth noting another advance that did much to foster medical care in the armies of the period. The practice of exempting the wounded from slaughter or imprisonment had begun in the seventeenth century, and the idea gained added support during the eighteenth. In July 1743, at the close of the Dettingen campaign, both sides signed an agreement declaring that medical personnel serving with the armies would be considered noncombatants and not taken as prisoners of war. In addition, medical personnel would be given safe passage back to their own armies as soon as practical. Most important, both sides agreed to care for the enemy's wounded and sick prisoners as they would their own and to return them upon recovery.34 The Dettingen agreement was important for its humanity in dictating the treatment of the sick and wounded, but it was also a significant spur to the further development of military medical facilities. Whereas the old practice of slaughtering the wounded had reduced the casualty load on medical facilities, the Dettingen agreement forced armies to increase their medical staffs to deal with the enemy's wounded as well.