Asimov's New Guide to Science

by Isaac Asimov


  The word diamond brings up the most glamorous of all the high-pressure feats. Diamond, of course, is crystallized carbon, as is also graphite. When an element appears in two different forms, these forms are allotropes. Diamond and graphite are the most dramatic example of the phenomenon. Ozone and ordinary oxygen are another example. Yellow phosphorus and black phosphorus, mentioned in the previous paragraph (there is red phosphorus, too), are still another example.

  Allotropes can seem entirely different in appearance and properties, and there is no more startling pair of allotropes than graphite and diamond—except, possibly, coal and diamond (anthracite coal is, chemically speaking, a sloppy version of graphite).

  That diamond is but graphite (or coal) with a different organization of atoms seems, at first sight, completely unbelievable, but the chemical nature of diamond was first proved as long ago as 1772 by Lavoisier and some fellow French chemists. They pooled their funds to buy a diamond and proceeded to heat it to a temperature high enough to burn it up. The gas that resulted was found to be carbon dioxide. Later the British chemist Smithson Tennant showed that the amount of carbon dioxide measured could be produced only if diamond was pure carbon, as graphite is; and in 1799, the French chemist Guyton de Morveau clinched the case by converting a diamond into a lump of graphite.

  That was an unprofitable maneuver, but now why could not matters be reversed? Diamond is 55 percent denser than graphite. Why not put graphite under pressure and force the atoms composing it into the tighter packing characteristic of diamond?
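  For those who like to check the figure, the commonly quoted handbook densities make the arithmetic easy (the round values below are approximate, and real samples vary a little); a few lines of Python will do:

    # Approximate densities in grams per cubic centimeter (commonly
    # quoted round values; real samples vary a little).
    density_graphite = 2.26
    density_diamond = 3.51

    # Fractional increase in density going from graphite to diamond.
    increase = (density_diamond - density_graphite) / density_graphite
    print(f"Diamond is about {increase:.0%} denser than graphite")  # about 55%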

  Many efforts were made; and, like the alchemists, a number of experimenters reported successes. The most famous was the claim of the French chemist Ferdinand Frederic Henri Moissan. In 1893, he dissolved graphite in molten cast iron and reported that he found small diamonds in the mass after it cooled. Most of the objects were black, impure, and tiny, but one was colorless and almost a millimeter long. These results were widely accepted; and, for a long time, Moissan was considered to have manufactured synthetic diamonds. However, his results were never successfully repeated.

  The search for synthetic diamonds was not without its side victories, however. In 1891, the American inventor Edward Goodrich Acheson, while heating graphite under conditions he thought might form diamond, stumbled upon silicon carbide, to which he gave the trade name Carborundum. This proved harder than any substance then known but diamond; and ever since, it has been a much-used abrasive—that is, a substance used for grinding and polishing.

  The efficiency of an abrasive depends on its hardness. An abrasive can polish or grind substances less hard than itself, and diamond, as the hardest substance, is the most useful in this respect. The hardness of various substances is commonly measured on the Mohs scale, introduced by the German mineralogist Friedrich Mohs in 1818. This assigns minerals numbers from 1, for talc, to 10, for diamond. A mineral of a particular number is able to scratch all minerals with lower numbers. On the Mohs scale, Carborundum is given the number 9. The divisions are not equal, however. On an absolute scale, the difference in hardness between 10 (diamond) and 9 (Carborundum) is four times greater than the difference between 9 (Carborundum) and 1 (talc).
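  The unequal spacing of the divisions is easy to illustrate numerically. The absolute-hardness figures below are not measurements; they are made-up values chosen only to reproduce the proportions just described:

    # Illustrative absolute-hardness values in arbitrary units.  These are
    # assumptions chosen to match the proportions in the text, not data.
    absolute_hardness = {"talc": 1, "Carborundum": 2_000, "diamond": 10_000}

    gap_diamond_to_carborundum = absolute_hardness["diamond"] - absolute_hardness["Carborundum"]
    gap_carborundum_to_talc = absolute_hardness["Carborundum"] - absolute_hardness["talc"]

    # A single Mohs step (9 to 10) versus eight Mohs steps (1 to 9):
    print(gap_diamond_to_carborundum / gap_carborundum_to_talc)  # about 4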

  The reason for all this is not hard to see. In graphite, the carbon atoms are arranged in layers. In each individual layer, the carbon atoms are arranged in tessellated hexagons, like the tiles on a bathroom floor. Each carbon atom is bonded to three others in equal fashion; and since carbon is a small atom, the neighbors are close together and strongly held. The tessellation is hard to pull apart but is very thin and easily broken. Each tessellation lies a comparatively large distance from the next one above and below, so that the bonds between layers are weak, and one layer can easily be made to slide upon the next. For that reason, graphite is not only not particularly hard but can actually be used as a lubricant.

  In diamond, however, carbon atoms are arranged with absolute three-dimensional symmetry. Each carbon atom is bonded to four others at equal distances, each of the four being at the apices of a tetrahedron of which the carbon atom under consideration forms the center. This is a very compact arrangement, so that diamond is substantially denser than graphite. Nor will it pull apart in any direction except under overwhelming force. Other atoms will take up the diamond configuration; but, of these, the carbon atom is the smallest and holds together tightest. Thus diamond is harder than any other substance under the conditions of Earth’s surface.
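  The geometry itself can be verified with a little arithmetic. Place the central carbon atom at the origin and its four neighbors at alternating corners of a cube; those corners are the apices of a regular tetrahedron, all equally distant from the center, and every pair of bonds meets at the same angle of about 109.5 degrees. A minimal sketch in Python:

    import math

    # Four alternating corners of a cube, centered on the carbon atom at
    # the origin; they form the apices of a regular tetrahedron.
    neighbors = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]

    def angle_between(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(b * b for b in v))
        return math.degrees(math.acos(dot / (norm_u * norm_v)))

    # Every bond has the same length, and every pair of bonds meets at
    # about 109.47 degrees.
    print(angle_between(neighbors[0], neighbors[1]))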

  In silicon carbide, half the carbon atoms are replaced by silicon atoms. As the silicon atoms are considerably larger than the carbon atoms, they do not hug their neighbors as closely, and their bonds are weaker. Thus, silicon carbide is not as hard as diamond (though it is hard enough for many purposes).

  Under the surface conditions on Earth, the graphite arrangement of carbon atoms is more stable than the diamond arrangement. Hence, there is a tendency for diamond to turn spontaneously into graphite. You are, however, in no danger of waking up some morning to find your splendid diamond ring has become worthless overnight. The carbon atoms, even in their unstable arrangement, hold together so tightly that it would take many millions of years for the change to take place.

  This difference in stability makes it all the harder to change graphite to diamond. It was not till the 1930s that chemists finally worked out the pressure requirements for converting graphite to diamond. It turned out that the conversion called for a pressure of at least 10,000 atmospheres, and even then it would be impracticably slow. Raising the temperature would speed the conversion but would also raise the pressure requirements. At 1500° C, a pressure of at least 30,000 atmospheres would be necessary. All this proved that Moissan and his contemporaries, under the conditions they used, could no more have produced diamonds than the alchemists could have produced gold. (There is some evidence that Moissan was actually a victim of one of his assistants, who, tiring of the tedious experiments, decided to end them by planting a real diamond in the cast-iron mixture.)
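  The two figures just quoted give a rough feel for how the requirement climbs with temperature. Treating the rise as a straight line between them is purely an illustrative assumption (the real boundary between graphite and diamond is not that simple, and the text does not pin down the temperature that goes with the lower figure), but it allows a crude interpolation:

    # A crude sketch using only the two figures quoted above: roughly
    # 10,000 atmospheres (room temperature is assumed here for that figure)
    # and roughly 30,000 atmospheres at 1,500 degrees C.  Linearity is
    # assumed purely for illustration.
    t1, p1 = 25, 10_000       # degrees C, atmospheres
    t2, p2 = 1_500, 30_000

    def required_pressure(t_celsius):
        """Linearly interpolated pressure requirement (illustrative only)."""
        return p1 + (p2 - p1) * (t_celsius - t1) / (t2 - t1)

    print(round(required_pressure(750)))  # roughly 20,000 atmospheres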

  Aided by Bridgman’s pioneering work in attaining the necessary high temperatures and pressures, scientists at the General Electric Company finally accomplished the feat in 1955. Pressures of 100,000 atmospheres or more were produced, along with temperatures of up to 2500° C. In addition, a small quantity of metal, such as chromium, was used to form a liquid film across the graphite. It was on this film that the graphite turned to diamond. In 1962, a pressure of 200,000 atmospheres and a temperature of 5000° C could be attained. Graphite was then turned to diamond directly, without the use of a catalyst.

  Synthetic diamonds are too small and impure to be used as gems, but they are now produced commercially as abrasives and cutting tools and, indeed, are a major source of such products. By the end of the decade, an occasional small diamond of gem quality could be produced.

  A newer product made by the same sort of treatment can supplement the use of diamond. A compound of boron and nitrogen (boron nitride) is very similar in properties to graphite (except that boron nitride is white instead of black). Subjected to the high temperatures and pressures that convert graphite to diamond, the boron nitride undergoes a similar conversion. From a crystal arrangement like that of graphite, the atoms of boron nitride are converted to one like that of diamond. In its new form it is called borazon. Borazon is about four times as hard as Carborundum. In addition it has the great advantage of being more resistant to heat. At a temperature of 900° C, diamond burns up but borazon comes through unchanged.

  Boron has one electron fewer than carbon; nitrogen, one electron more. The two in combination, alternately, set up a situation closely resembling the carbon-carbon arrangement, but there is a tiny departure from the perfect symmetry of diamond. Boron nitride is therefore not quite as hard as diamond.

  Bridgman’s work on high pressure is not the last word, of course. As the 1980s began, Peter M. Bell of the Carnegie Institution made use of a device that squeezes materials between two diamonds, and has managed to reach pressures of 1,500,000 atmospheres, over two-fifths that at the Earth’s center. He believes it is possible for the instrument to go to 17,000,000 atmospheres before the diamonds themselves fail.

  At the California Institute of Technology, shock waves are used to produce momentary pressures that are higher still—up to 75,000,000 atmospheres perhaps.

  Metals

  Most of the elements in the periodic table are metals. As a matter of fact, only about 20 of the 102 elements can be considered definitely nonmetallic. Yet the use of metals came relatively late in the history of the human species. One reason is that, with rare exceptions, the metallic elements are combined in nature with other elements and are not easy to recognize or extract. Primitive people at first used only materials that could be manipulated by simple treatments such as carving, chipping, hacking, and grinding; and thus their materials were restricted to bones, stones, and wood.

  Primitive people may have been introduced to metals through discoveries of meteorites, or of small nuggets of gold, or of metallic copper in the ashes of fires built on rocks containing a copper ore. In any case, people who were curious enough (and lucky enough) to find these strange new substances and look into ways of handling them would discover many advantages in them. Metal differs from rock in that it has an attractive luster when polished. It can be beaten into sheets and drawn into wire. It can be melted and poured into a mold to solidify. It is much more beautiful and adaptable than rock and ideal for ornaments. Metals probably were fashioned into ornaments long before they were put to any other use.

  Because they were rare, attractive, and did not alter with time, these metals were valued and bartered until they became a recognized medium of exchange. Originally, pieces of metal (gold, silver, or copper) had to be weighed separately in trading transactions, but, by 700 B.C., standardized weights of metal stamped in some official government fashion were issued in the Asia Minor kingdom of Lydia and the Aegean island of Aegina. Coins are still with us today.

  What really brought metals into their own was the discovery that some of them would take a sharper cutting edge than stone could, and would maintain that edge under conditions that would ruin a stone ax. Moreover, metal was tough. A blow that would splinter a wooden club or shatter a stone ax would only slightly deform a metal object of similar size. These advantages more than compensated for the fact that metal is heavier than stone and was harder to obtain.

  The first metal obtained in reasonable quantity was copper, which was in use by 4000 B.C. Copper itself is too soft to make useful weapons or armor (though it will make pretty ornaments), but it was often found alloyed with a little arsenic or antimony, which resulted in a substance harder than the pure metal. Then samples of copper ore must have been found that contained tin. The copper-tin alloy (bronze) was hard enough for purposes of weaponry. Men soon learned to add the tin deliberately. The Bronze Age replaced the Stone Age in Egypt and western Asia about 3000 B.C. and in southeastern Europe by 2000 B.C. Homer’s Iliad and Odyssey commemorate that period of culture.

  Iron was known as early as bronze; but for a long time, meteorites were its only source. It remained no more than a precious metal, limited to occasional use, until methods were discovered for smelting iron ore and thus obtaining iron in unlimited quantities. The difficulty lay in working with fires hot enough and methods suitable enough to add carbon to the iron and harden it into the form we now call steel. Iron smelting began somewhere in Asia Minor about 1400 B.C. and developed and spread slowly.

  An iron-weaponed army could rout a bronze-armed one, for iron swords would cut through bronze. The Hittites of Asia Minor were the first to use iron weapons to any extent, and they had a period of power in western Asia. Then the Assyrians succeeded the Hittites. By 800 B.C., they had a completely ironized army which was to dominate western Asia and Egypt for two and a half centuries. At about the same time, the Dorians brought the Iron Age to Europe by invading Greece and defeating the Achaeans, who committed the error of clinging to the Bronze Age.

  IRON AND STEEL

  Iron is obtained essentially by heating iron ore (usually a ferric oxide) with carbon. The carbon atoms carry off the oxygen of the ferric oxide, leaving behind a lump of pure iron. In ancient times, the temperatures used did not melt the iron, and the product was a tough metal that could be worked into the desired shape by hammering—that is, wrought iron. Iron metallurgy on a larger scale came into being in the Middle Ages. Special furnaces were used, and higher temperatures were reached that melted the iron. The molten iron could be poured into molds to form castings, so it was called cast iron. This was much less expensive than wrought iron and much harder, too, but it was brittle and could not be hammered. Increasing demand for iron of either form helped to deforest England, for instance, consuming its wood in the iron-smelting furnaces. But then, in 1780, the English ironworker Abraham Darby showed that coke (carbonized coal) would work as well as, or better than, charcoal (carbonized wood). The pressure on the forests thus eased, and the more-than-century-long domination of coal as an energy source began.
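  In modern notation, the essential change can be summed up in one simplified textbook form (the chemistry in a real furnace is more involved): the carbon burns to carbon monoxide, and the carbon monoxide then strips the oxygen from the ore,

    Fe2O3 + 3 CO -> 2 Fe + 3 CO2

  so that the oxygen is carried off as gas and metallic iron is left behind.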

  It was not until late in the eighteenth century that chemists, thanks to the French physicist René Antoine Ferchault de Réaumur, finally realized that it was the carbon content that dictates the toughness and hardness of iron. To maximize those properties, the carbon content ought to be between 0.2 percent and 1.5 percent; the steel that then results is harder and tougher and generally stronger than either cast iron or wrought iron. But until the mid-nineteenth century, high-quality steel could be made only by the complicated procedure of carefully adding the appropriate quantity of carbon to wrought iron (itself comparatively expensive). Steel remained therefore a luxury metal, used only where no substitute could be found—as in swords and springs.

  The Age of Steel was ushered in by a British engineer named Henry Bessemer. Originally interested primarily in cannon and projectiles, Bessemer invented a system of rifling intended to enable cannon to shoot farther and more accurately. Napoleon III of France was interested and offered to finance further experiments. But a French artillerist killed the idea by pointing out that the propulsive explosion Bessemer had in mind would shatter the cast-iron cannons used in those days. Bessemer, chagrined, turned to the problem of creating stronger iron. He knew nothing of metallurgy, so he could approach the problem with a fresh mind. Cast iron was brittle because of its carbon content. Therefore the problem was to reduce the carbon.

  Why not burn the carbon away by melting the iron and sending a blast of air through it? This seemed at first a ridiculous idea. Would not the air blast cool the molten metal and cause it to solidify? Bessemer tried it anyway and found that quite the reverse was true. As the air burned the carbon, the combustion gave off heat and the temperature of the iron rose rather than fell. The carbon burned off nicely. By proper controls, steel could be produced in quantity and comparatively cheaply.
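  The chemistry involved is simply that of ordinary burning: the dissolved carbon combines with the oxygen of the air blast (C + O2 -> CO2 when the air supply is ample, 2 C + O2 -> 2 CO when it is not), and both reactions release heat, which is why the bath grows hotter instead of freezing solid.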

  In 1856, Bessemer announced his blast furnace. Ironmakers adopted the method with enthusiasm, then dropped it in anger when they found that inferior steel was being formed. Bessemer discovered that the iron ore used by the industry contained phosphorus (which had been absent from his own ore samples). Although Bessemer explained to the ironmakers that phosphorus had betrayed them, they refused to be twice-bitten. Bessemer therefore had to borrow money and set up his own steel works in Sheffield. Importing phosphorus-free iron ore from Sweden, he speedily produced steel at a price that undersold the other ironmakers.

  In 1875, the British metallurgist Sidney Gilchrist Thomas discovered that by lining the interior of the furnace with limestone and magnesia, he could easily remove the phosphorus from the molten iron. After this, almost any iron ore could be used in the manufacture of steel. Meanwhile, in 1868, the German-British inventor Karl Wilhelm Siemens developed the open-hearth method, in which pig iron was heated with iron ore; this process also could take care of the phosphorus content.

  The Age of Steel then got under way. The name is no mere phrase. Without steel, skyscrapers, suspension bridges, great ships, railroads, and many other modern constructions would be almost unthinkable; and, despite the rise of other metals, steel still remains the preferred metal in a host of everyday uses, from automobile bodies to knives.

  (It is a mistake, of course, to think that any single advance can bring about a major change in the way of life of humanity. Such change is always the result of a whole complex of interrelated advances. For instance, all the steel in the world could not make skyscrapers practical without the existence of that too-often-taken-for-granted device, the elevator. In 1861, the American inventor Elisha Graves Otis patented a hydraulic elevator; and in 1889, the company he founded installed the first electrically run elevators in a New York commercial building.)

  With steel cheap and commonplace, it became possible to experiment with the addition of other metals (alloy steel) to see whether it could be still further improved. The British metallurgist Robert Abbott Hadfield pioneered in this direction. In 1882, he found that adding manganese to steel to the extent of 13 percent produced a harder alloy, which could be used in machinery for particularly brutal jobs, such as rock crushing. In 1900, a steel alloy containing tungsten and chromium was found to retain its hardness well at high temperatures, even red heat; this alloy proved a boon for high-speed tools. Today, for particular jobs, there are innumerable other alloy steels, employing such metals as molybdenum, nickel, cobalt, and vanadium.

  The great difficulty with steel is its vulnerability to corrosion—a process that returns iron to the crude state of the ore whence it came. One way of combating this is to shield the metal by painting it or by plating it with a metal less likely to corrode—such as nickel, chromium, cadmium, or tin. A more effective method is to form an alloy that does not corrode. In 1913, the British metallurgist Harry Brearley discovered such an alloy by accident. He was looking for steel alloys that would be particularly suitable for gun barrels. Among the samples he discarded as unsuitable was a nickel-chromium alloy. Months later, he happened to notice that these particular pieces in his scrap heap were as bright as ever, although the rest were rusted. That was the birth of stainless steel. It is too soft and too expensive for use in large-scale construction, but it serves admirably in cutlery and small appliances where nonrusting is more important than hardness.

 
