Still the Iron Age

by Vaclav Smil


  That requirement alone (leaving aside all charcoal needed for further metal processing) would have necessitated (even with a high average increment of 7 t/ha in natural forests) an annual wood harvest from nearly 180,000 km2 of forest (Smil, 1994). That area would be equal to Missouri or Oklahoma (or a third of France), and if it were a square its side would stretch from Philadelphia to Boston, or from Paris to Frankfurt. Obviously, even forest-rich America could not afford this, and coke was the obvious replacement. Moreover, there could be no doubt about coke's superiority as a metallurgical fuel. Coke is produced by heating suitable kinds of bituminous coal (they must have low ash and low sulfur content) in the absence of oxygen: this pyrolysis (destructive distillation) drives off virtually all volatile matter and leaves behind nearly pure carbon with a low apparent density of just 0.8–1 g/cm3 but with a higher heating value of 31–32 MJ/kg, roughly twice as energy dense as air-dried wood but only slightly more energy dense than the best charcoal.
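
  A rough back-of-the-envelope check, sketched in Python, makes the scale concrete (the area and increment are the figures cited above; the ~20% wood-to-charcoal mass yield is an assumed, historically typical value, not from the text):

    import math

    area_km2 = 180_000              # forest area cited in the text
    increment_t_ha = 7              # high average annual wood increment, t/ha
    area_ha = area_km2 * 100        # 1 km2 = 100 ha
    wood_t = area_ha * increment_t_ha     # annual wood harvest
    charcoal_t = wood_t * 0.20            # assumed ~20% charcoaling mass yield
    side_km = math.sqrt(area_km2)         # side of an equivalent square

    print(f"annual wood harvest: {wood_t / 1e6:.0f} Mt")          # 126 Mt
    print(f"implied charcoal supply: {charcoal_t / 1e6:.0f} Mt")  # ~25 Mt
    print(f"square side: {side_km:.0f} km")   # ~424 km, about Philadelphia-Boston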

  As with most technical advances, the efficiency of early coke-producing methods was very poor. For more than a century the standard way to make coke was in enclosed beehive ovens (Sexton, 1897; Washlaski, 2008). These hemispherical structures (American beehives had a diameter of about 3.8 m) were usually built in banks (called batteries, with some American batteries eventually having 200 to 300 beehives), often into a hillside (making it easier to cover them with earth), and always with strong frontal retaining walls. After 4 or 5 days of preheating (first with wood, then with coal) the starter fuel was removed, the front doors were bricked up to two-thirds of their height, and the ovens were charged with coal. The average charge for a standard oven was 5–5.5 t; the loaded coal was leveled with an iron bar, the doors were fully bricked up and sealed with clay, and for the next 2 to 3 days (burning periods varied between 40 and 75 h) the slow-burning beehive ovens lit the night skies with an orange-reddish glow and emitted hot gases through their open tops (trunnel heads).

  Once the controlled burning ended, the coke was finished by quenching with water, the doors were broken open, and the fuel was removed from the beehives to be transported to blast furnaces. Early beehive coking consumed up to 2 t of coal per tonne of coke (a yield of just 50%); later the yield increased to 60% and eventually to about 70%. Coke’s ability to support heavier charges of ore and limestone made it possible to build taller blast furnaces with larger capacities and higher outputs, and this, in turn, increased demand for coke. Some of these early coke furnaces used Newcomen’s inefficient steam engines, and after 1776 the diffusion of coke-based smelting was greatly aided by the adoption of Watt’s steam engines as the drivers of more powerful bellows: they were first used in that way in 1782, and by 1790 England and Wales had 83 coke-fueled furnaces in operation, 71 of them with steam-powered bellows (Hyde, 1977). But charcoal-fueled furnaces did not disappear: in 1810 they still smelted one-third of English and Welsh iron.
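
  Expressed consistently, the cited coal-to-coke ratios translate as follows (a quick sketch using only the figures in the text):

    coal_per_t_coke = 2.0              # early beehive practice, t of coal per t of coke
    print(f"early yield: {1 / coal_per_t_coke:.0%}")               # 50%
    for y in (0.60, 0.70):             # later beehive yields
        print(f"at {y:.0%} yield: {1 / y:.2f} t of coal per t of coke")  # 1.67, 1.43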

  Larger Furnaces and Hot Blast

  Coke-based smelting using steam-driven blast severed the traditional locational links to forested regions with fast-flowing streams and led to rapid increases of individual furnace volumes and hence to an unprecedented growth in pig iron production. In order to follow the subsequent design changes in height, volume, and output capacity it is first necessary to understand the basic structural components of these increasingly massive structures. Modern designs are always built on substantial foundations, and at their bottom is a circular hearth, where the liquid metal and slag collect between tappings, and whose top perimeter is pierced by water-cooled tuyères blowing in pressurized and heated air.

  Then comes the bosh, a short, truncated, and slightly outward-sloping cone whose interior contains the highest smelting temperatures. The belly, the furnace’s widest section, lies between the bosh and the stack (shaft), the longest and only slightly narrowing section, where the downward movement of ore, coke, and flux meets the upward movement of hot, CO-rich gases that reduce iron oxides. The furnace throat is surmounted by a top cone that contains the arrangements for furnace charging as well as for the collection of hot furnace gases. Molten metal is periodically released through tapholes whose clay plugs are opened by a special drill, and the slag (now commonly used in construction or as a fertilizer) is taken out through cinder notches. Furnace campaigns (uninterrupted operations) last many years before the smelting stops and the furnace is left to cool for the relining of its walls with refractory materials and of its hearth with carbon. Furnaces’ growth in height and capacity changed every one of these basic structural components.

  The first coke-fueled blast furnaces were not taller (about 8 m) and were no more voluminous (with inside capacities of less than 17 m3) than their large contemporary charcoal-fired structures, but in 1793 Neath Abbey was the first furnace to reach the exceptional height of 18 m. By 1810 coke-fueled furnaces were around 14 m tall with volumes of more than 70 m3, and their typical annual outputs increased from about 1000 t/year during the late 1780s to 1500 t/year by 1810. The next two decades saw few fundamental changes: by the late 1830s large British furnaces were less than 15 m tall, with stacks of less than 9 m and square hearths with sides of less than 2 m; a vertical cross-section shows a narrow hearth, a relatively tall and sloping bosh, and a fairly short narrowing stack.

  Lowthian Bell (1816–1904), one of the leading metallurgists and industrialists of the nineteenth century, concluded that such furnaces were too short and too narrow for efficient iron ore reduction, and in 1840 his redesign introduced a furnace that has served as a prototype of all subsequent modern designs (Bell, 1884). Bell increased the overall height by two-thirds, to 24 m, with a stack of nearly 17 m (stack:total height ratio of 0.7 compared to previous 0.6); his bosh was relatively shorter (bosh:total height ratio of 0.22 compared to previous 0.44) and less sloping, and he also enlarged the top opening by more than 80% and the hearth diameter by a third (Bell, 1884).
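
  The proportions implied by these dimensions can be checked directly (all figures are those cited above):

    # Furnace proportions before and after Bell's redesign
    old_total, old_stack = 15.0, 9.0    # m, large British furnaces of the late 1830s
    new_total, new_stack = 24.0, 17.0   # m, Bell's taller profile
    print(f"stack:total, old: {old_stack / old_total:.2f}")   # 0.60
    print(f"stack:total, new: {new_stack / new_total:.2f}")   # ~0.71
    # bosh:total fell from 0.44 to 0.22 of overall height
    print(f"bosh, old: ~{0.44 * old_total:.1f} m; new: ~{0.22 * new_total:.1f} m")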

  These larger coke-fueled furnaces opened the way to unprecedented increases of iron output and made the United Kingdom Europe’s largest pig iron producer. Production in England and Wales grew 2.33 times during the last decade of the eighteenth century to 282,000 t. That was nearly four times as much as the stagnating Swedish production (King, 2005; Olsson, 2007). A decade later the British output was in excess of 400,000 t/year while the US shipments in 1810 (the first year for which we have an output estimate) reached just 49,000 t (USBC, 1975). By far the most important smelting innovation of the first half of the nineteenth century was the introduction of hot blast during the late 1820s and its nearly universal use just 15 years later.

  James Beaumont Neilson (1792–1865) was not an experienced ironmaster but an outsider who was able to apply a general observation (he noticed how a flame’s luminosity can be enhanced by supplying it with preheated air) to a specific engineering problem. Neilson patented his hot blast technique in 1828, and it was first used at the Clyde Ironworks in Scotland; Scottish ironworks continued to adopt hot blast much faster than their English counterparts and soon came to claim the largest share of newly installed blast furnace capacities (Birch, 1967; Harris, 1988). In retrospect this may seem an eminently logical step, but it was resisted by conservative ironmasters who believed in the efficacy of cold blast and maintained, erroneously, that hot blast would lower the quality of the metal.

  Initially, all preheating was done by burning a relatively small amount of coal outside the furnace to heat the blast air, yet this additional outside combustion resulted in impressive net fuel savings inside the furnace. In 1829 the first preheating, to just 150°C, saved more than a third of the fuel compared to cold blast, and in 1833 preheating to 325°C saved an additional 45% (Bone, 1928). Part of these savings was due to less wasteful coking methods and better boiler efficiencies, but Bell (1884) concluded that the savings attributable to hot blast, equivalent to at least 1.75 t of coal, were achieved by the additional external combustion of just 100–150 kg of coal, a payoff he rightly judged astounding.
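
  The size of that payoff follows directly from the cited figures (a minimal sketch):

    # Bell's hot-blast payoff: coal saved inside the furnace vs. coal burned outside it
    coal_saved_kg = 1.75 * 1000        # at least 1.75 t of coal saved
    for burned_kg in (100, 150):       # external coal burned to preheat the blast
        print(f"{burned_kg} kg burned -> saving ratio of about "
              f"{coal_saved_kg / burned_kg:.0f}:1")
    # roughly 12:1 to 18:1, the return Bell judged astounding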

  Besides the universal benefits of hot blast—reduced consumption of coke and limestone, increased furnace productivity (by about half, as furnaces could be tapped more often), and better quality of cast iron—there were three important specific gains. Hot blast opened the way to large-scale ironmaking in Scotland by making it possible to use local low-quality iron ore (blackband ironstone, discovered at the beginning of the nineteenth century) and to substitute local raw coal for more expensive coke. As a result, Scottish production of cast iron more than quintupled during the 1830s to nearly 200,000 t/year. Similarly, hot blast in South Wales allowed the use of local anthracite, and the use of high-quality Pennsylvania anthracite (for the first time by Frederick W. Gessenhainer in 1836) became the foundation of that state’s large iron industry (Bone, 1928). Hot blast also changed access to the hearth: the furnace’s outer casing was supported by cast iron pillars that allowed access from every direction and made it possible to deploy a larger number of better (water-cooled) tuyères that could withstand the constant heat of more than 300°C.

  The combination of hot blast and larger furnaces led to another round of productivity increases: 15-m tall cold-blast furnaces of the late 1820s produced less than 80 t of metal per week; two decades later 18-m tall hot-blast furnaces produced up to 200 t/week (Birch, 1967). The next logical step took another two decades to accomplish: why heat the blast in separate coal-fired stoves when large volumes of hot gases were constantly escaping from the open tops of blast furnaces? This large loss of heat began to be tackled during the late 1840s (James P. Budd filed his heat recovery patent in 1845), and soon afterwards furnace tops became closed, usually by a massive cup (fixed) and cone (movable) apparatus introduced by George Parry (1813–1873) in 1850. The final step in eliminating furnace waste heat came with the adoption of regenerative hot-blast stoves during the 1860s (see the next chapter).

  As productivities kept on increasing, typical output per furnace rose to 2500 t/year during the 1820s and the maxima surpassed 10,000 t/year during the 1830s. Aggregate British output of pig iron rose by about 70% during the 1820s (to 677,000 t); it then doubled during the next decade (to 1.396 Mt), and by 1850 it was up by almost 80% to 2.5 Mt, produced by 655 blast furnaces compared to just over 300 furnaces in 1825, when production was less than a quarter of the 1850 level (Birch, 1967). The British lead became enormous when compared to any other European country as well as to the United States: by 1850 Swedish output of pig iron was about 125,000 t, and although American pig iron shipments rose from about 49,000 t in 1810 to 150,000 t in 1830 and 512,000 t in 1850, the latter total was less than a quarter of the British output (USBC, 1975), and it took the United States almost another four decades to become the world’s largest producer of iron.
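
  These growth rates form a consistent series, as a quick check shows (all inputs are the figures cited in this and the preceding paragraphs):

    # Consistency of the cited British pig iron series (t/year)
    out_1830 = 677_000                 # after growing ~70% during the 1820s
    out_1820 = out_1830 / 1.70         # implied ~398,000 t, close to the
                                       # ">400,000 t/year" cited for circa 1810
    out_1840 = 1_396_000               # "doubled during the next decade"
    out_1850 = 2_500_000               # "up by almost 80%" by 1850
    print(f"implied 1820 output: {out_1820:,.0f} t")
    print(f"1830s growth: {out_1840 / out_1830:.2f}x")    # ~2.06x
    print(f"1840s growth: {out_1850 / out_1840:.2f}x")    # ~1.79x
    print(f"US 1850 share of British output: {512_000 / out_1850:.0%}")  # ~20%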

  Wrought Iron

  A higher share of the metal produced in coke-fueled furnaces was cast to produce parts for machinery and construction, and the higher fluidity of that metal made it possible to produce more delicate castings. At the same time, converting pig iron into wrought iron, previously done in forges associated with furnaces, became easier with the introduction of Henry Cort’s puddling process, patented in 1784. Decarburization was achieved by continuous stirring (or, more correctly, turning and pushing) of molten pig iron with long rods (rakes) through the doors of coke-fired reverberatory furnaces: exposure to oxygen removed the carbon and produced a nearly pure metal containing less than 0.1% of carbon.

  The method was widely adopted in the United Kingdom after the mid-1790s, and it remained the dominant means of decarburization during the first half of the nineteenth century: British wrought iron production was surpassed by steel production only in 1885. Puddling’s yield was greatly improved by Joseph Hall (1789–1862), who replaced the sand at the furnace bottom (the silica lining of what is commonly known as Cort’s dry puddling) with iron oxide, a change that produced partially liquid iron (hence “wet” puddling), yielded much less slag, and allowed a more complete conversion of pig to malleable iron: the losses were as low as 15% and no more than 25% of the charged metal, compared to at least 30% of pig iron that was incorporated in sandy slag. This has often been portrayed as a mere improvement of Cort’s process, but, as Flemings and Ragone (2009) point out, Hall’s innovation was entirely different from a metallurgical point of view because it involved oxidation of carbon by iron oxide rather than by oxygen present in the furnace atmosphere. Hall’s experiments began in 1811 and his process was commercialized during the 1830s.
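
  The cited losses translate into pig-to-wrought-iron yields as follows (a trivial sketch using only the percentages above):

    # Yields implied by the cited puddling losses
    losses = {"Cort's dry puddling": 0.30,        # at least 30% lost to sandy slag
              "Hall's wet puddling, worst": 0.25,
              "Hall's wet puddling, best": 0.15}
    for process, loss in losses.items():
        print(f"{process}: yield of about {1 - loss:.0%}")   # 70%, 75%, 85%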

  The sequence began with the firing of a reverberatory furnace lined mostly with Fe3O4; between 175 and 250 kg of cast iron, and some iron oxide, were shovelled in, the openings were closed, and the furnace was heated for half an hour to melt the metal. Then the puddler began to stir, exposing the metal to oxide-rich slag for up to 10 min as the metal’s color changed from reddish to bluish and virtually all silicon and phosphorus were removed (the stage known as the clearing process). More iron oxide was added, the air intake was restricted, and after about 10 min of vigorous stirring carbon oxidation began; once it was largely complete the temperature rose, large bubbles of gas began to escape, and the boiling caused red masses of slag to overflow. This high boil stage was followed by further stirring of the hot bath; then pasty masses of iron began to form at the furnace bottom, and the metal “came to nature” and was balled by the puddler into masses of 35–40 kg (30–38 cm in diameter).

  Puddling was one of the most taxing labor tasks of the industrialization era: Caron (2013, 153) calls a puddler’s work “virtually inhuman.” Here is a description of part of the process in the memoirs of James J. Davis (1873–1947), a Welsh-born puddler in American mills who rose to be the US Secretary of Labor between 1921 and 1930:

  Six hundred pounds was the weight of pig-iron we used to put into a single hearth … my forge fire must be hot as a volcano. There were five bakings every day and this meant the shoveling in of nearly two tons of coal. In summer I was stripped to the waist and panting while the sweat poured down across my heaving muscles. My palms and fingers, scorched by the heat, became hardened like goat hoofs, while my skin took on a coat of tan that it will wear forever. What time I was not stoking the fire, I was stirring the charge with a long iron rabble that weighed some twenty-five pounds. Strap an Oregon boot of that weight to your arm and then do calisthenics ten hours in a room so hot it melts your eyebrows and you will know what it is like to be a puddler. But we puddlers did not complain. There is men’s work to be done in this world, and we were the men to do it. We had come into a country built of wood; we should change it to a country built of steel and stone (Davis, 1922, 98–99).

  Looking back at this accomplishment, Flemings and Ragone (2009, 1964) expressed their “respect for those master puddlers who could control such a complex process with little other than their senses to guide them.” Expert puddlers would process about 1.2 t of pig iron daily and produce more than one tonne of malleable iron. Eventually some of this exceptionally hard labor—prolonged manhandling of heavy iron chunks (weighing nearly 200 kg) while exposed to the high heat radiating from the furnace—was displaced by mechanical arrangements, mainly by revolving furnaces first developed by the Dowlais Iron Company in Wales (Bell, 1884).
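
  That daily throughput squares with the losses of wet puddling (a one-line check):

    # An expert puddler's daily conversion (figures from the text)
    pig_in_t, wrought_out_t = 1.2, 1.0
    print(f"implied daily yield: {wrought_out_t / pig_in_t:.0%}")
    # ~83%, consistent with Hall's wet-puddling losses of 15-25%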

  Iron balls taken from the furnace were squeezed to remove slag and then they went through roughing and finishing rollers (their early designs were also patented by Henry Cort) to produce iron bars roughly 2 cm thick, 6–20 cm wide, and 5–9 m long that had a mere trace (0.1%) of carbon and were cut to shorter (60–120 cm) pieces for processing into a variety of wrought-iron manufactures. Here, once more, is how Davis (1922, 86) saw it:

  Flaming balls of woolly iron are pulled from the oven doors, flung on a two-wheeled serving tray, and rushed sputtering and flamboyant to the hungry mouth of a machine, which rolls them upon its tongue and squeezes them in its jaw like a cow mulling over her cud. The molten slag runs down red-hot from the jaws of this squeezer and makes a luminous rivulet on the floor like the water from the rubber rollers when a washer-woman wrings out the saturated clothes. Squeezed dry of its luminous lava, the white-hot sponge is drawn with tongs to the waiting rollers—whirling anvils that beat it into the shape they will.

  After 1830 an increasing share of wrought iron was turned into rails (all rails during the first 25 years of the railway era were made from wrought iron) and plates (needed for locomotive boilers as well as for the cladding of warships), and wrought iron was still used in construction even in the 1880s. The Eiffel Tower, a 320.75-m tall structure designed by Alexandre Gustave Eiffel (1832–1923) for the 1889 Exposition Universelle and completed after two years of construction, was built with wrought iron (a total of 7,300 t) puddled at French mills and fabricated into 18,000 parts at Eiffel’s factory at Levallois-Perret on the outskirts of Paris (Seitz, 2014).

  Despite large increases in pig iron production, steel remained a rare commodity during the eighteenth century. Blister steel was made by cementation (prolonged heating of iron bars in charcoal), and Sheffield, its principal British source, made only about 200 t/year to be used for a small range of expensive products including razors and swords. The crucible process was introduced in 1748 by Benjamin Huntsman (1704–1776), a clockmaker in search of higher-quality steel. The melting was done in crucibles (50 cm tall and 20 cm in diameter, set at floor level) made from a refractory material that could withstand temperatures of up to 1600°C. They were heated in coke-fueled furnaces and charged with blister steel and flux. After 3 h of melting the crucibles (holding about 20 kg of metal and used for three melts before being discarded) were lifted from the furnace and the steel was cast into ingots.

 
