by Isaac Asimov
Humphry Davy used an electric current to pull apart the atoms of tightly bound molecules and was able for the first time, in 1807 and 1808, to prepare such metals as sodium, potassium, magnesium, calcium, strontium, and barium. Faraday (Davy’s assistant and protégé) went on to work out the general rules of such molecule-breaking electrolysis; and his work, a half century later, was to guide Arrhenius in working out the hypothesis of ionic dissociation (see chapter 5).
The manifold uses of dynamic electricity in the century and a half since Volta’s battery seem to have placed static electricity in the shade and to have reduced it to a mere historical curiosity. Not so, for knowledge and ingenuity need never be static. By 1960, the American inventor Chester Carlson had perfected a practical device for copying material by attracting carbon-black to paper through localized electrostatic action. Such copying, involving no solutions or wet media, is called xerography (from Greek words meaning “dry writing”) and has revolutionized office procedures.
The names of the early workers in electricity have been immortalized in the names of the units used for various types of measurement involving electricity. I have already mentioned coulomb as a unit of quantity of electricity. Another unit is the faraday: 96,500 coulombs is equal to 1 faraday. Faraday’s name is used a second time: a farad is a unit of electrical capacity. Then, too, the unit of electrical intensity (the quantity of electric current passing through a circuit in a given time) is called the ampere, after the French physicist Ampère (see chapter 5). One ampere is equal to 1 coulomb per second. The unit of electromotive force (the force that drives the current) is the volt, after Volta.
A given EMF did not always succeed in driving the same quantity of electricity through different circuits. It would drive a great deal of current through good conductors, little current through poor conductors, and virtually no current through nonconductors. In 1827, the German mathematician Georg Simon Ohm studied this resistance to electrical flow and showed that it can be precisely related to the amperes of current flowing through a circuit under the push of a known EMF. The resistance can be determined by taking the ratio of volts to amperes. This is Ohm’s law, and the unit of electrical resistance is the ohm, 1 ohm being equal to 1 volt divided by 1 ampere.
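The relations among these units are simple arithmetic, and a short sketch can make them concrete. The function names and the sample figures below are my own illustrations, not anything from the history:

```python
# Ohm's law: resistance (in ohms) is the ratio of EMF (volts) to current (amperes).
def resistance_ohms(volts: float, amperes: float) -> float:
    """R = V / I."""
    return volts / amperes

# One ampere is one coulomb per second, so charge delivered = current * time.
def charge_coulombs(amperes: float, seconds: float) -> float:
    return amperes * seconds

print(resistance_ohms(12.0, 2.0))   # 12 volts driving 2 amperes -> 6.0 ohms
print(charge_coulombs(2.0, 10.0))   # 2 amperes flowing for 10 seconds -> 20.0 coulombs
```

Thus a current of 2 amperes sustained for 10 seconds delivers 20 coulombs; sustained for about 13.4 hours, it would deliver 96,500 coulombs, or 1 faraday.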
GENERATING ELECTRICITY
The conversion of chemical energy to electricity, as in Volta’s battery and the numerous varieties of its descendants, has always been relatively expensive because the chemicals involved are not common or cheap. For this reason, although electricity could be used in the laboratory with great profit in the early nineteenth century, it could not be applied to large-scale uses in industry.
There have been sporadic attempts to make use of the chemical reactions involved in the burning of ordinary fuels as a source of electricity. Fuels such as hydrogen (or, better still, coal) are much cheaper than metals such as copper and zinc. As long ago as 1839, the English scientist William Grove devised an electric cell running on the combination of hydrogen and oxygen. It was interesting but not practical. In recent years, physicists have been working hard to prepare practical varieties of such fuel cells. The theory is all set; only the practical problems must be ironed out, and these are proving most refractory.
When the large-scale use of electricity came into being in the latter half of the nineteenth century, it is not surprising, then, that it did not arrive by way of the electric cell. As early as the 1830s, Faraday had produced electricity by means of the mechanical motion of a conductor across the lines of force of a magnet (figure 9.5; see also chapter 5). In such an electric generator, or dynamo (from a Greek word for “power”), the kinetic energy of motion could be turned into electricity. Such motion could be kept in being by steam power, which in turn could be generated by burning fuel. Thus, much more indirectly than in a fuel cell, the energy of burning coal or oil (or even wood) could be converted into electricity. By 1844, large, clumsy versions of such generators were being used to power machinery.
Figure 9.5. Faraday’s dynamo. The rotating copper disk cuts the magnet’s lines of force, inducing a current on the voltmeter.
What was needed were ever stronger magnets, so that motion across the intensified lines of force could produce larger floods of electricity. These stronger magnets were obtained, in turn, by the use of electric currents. In 1823, the English electrical experimenter William Sturgeon wrapped eighteen turns of bare copper wire about a U-shaped iron bar and produced an electromagnet. When the current was on, the magnetic field it produced was concentrated in the iron bar which could then lift twenty times its own weight of iron. With the current off, it was no longer a magnet and would lift nothing.
In 1829, the American physicist Joseph Henry improved this gadget vastly by using insulated wire. Once the wire was insulated, it could be wound in close loops over and over without fear of short circuits. Each loop increased the intensity of the magnetic field and the power of the electromagnet. By 1831, Henry had produced an electromagnet, of no great size, that could lift over a ton of iron.
The electromagnet was clearly the answer to better electrical generators. In 1845, the English physicist Charles Wheatstone made use of such an electromagnet for this purpose. Better understanding of the theory behind lines of force came about with Maxwell’s mathematical interpretation of Faraday’s work (see chapter 5) in the 1860s; and, in 1872, the German electrical engineer Friedrich von Hefner-Alteneck designed the first really efficient generator. At last electricity could be produced cheaply and in floods, and not only from burning fuel but from falling water.
EARLY APPLICATION OF ELECTRICITY TO TECHNOLOGY
For the work that led to the early application of electricity to technology, the lion’s share of the credit must fall to Joseph Henry. Henry’s first application of electricity was the invention of telegraphy. He devised a system of relays that made it possible to transmit an electric current over miles of wire. The strength of a current declines fairly rapidly as it travels at constant voltage across long stretches of resisting wire; what Henry’s relays did was to use the dying signal to activate a small electromagnet that operated a switch that turned on a boost in power from stations placed at appropriate intervals. Thus a message consisting of coded pulses of electricity could be sent for a considerable distance. Henry actually built a telegraph that worked.
Because he was an unworldly man, who believed that knowledge should be shared with the world and therefore did not patent his discoveries, Henry got no credit for this invention. The credit fell to the artist (and eccentric religious bigot) Samuel Finley Breese Morse. With Henry’s help, freely given (but later only grudgingly acknowledged), Morse built the first practical telegraph in 1844. Morse’s main original contribution to telegraphy was the system of dots and dashes known as the Morse code.
Henry’s most important development in the field of electricity was the electric motor. He showed that electric current could be used to turn a wheel, just as the turning of a wheel can generate current in the first place. And an electrically driven wheel (or motor) could be used to run machinery. The motor could be carried anywhere; it could be turned on or off at will (without waiting to build up a head of steam); and it could be made as small as one wished (figure 9.6).
Figure 9.6. Henry’s motor. The upright bar magnet D attracts the wirewound magnet B, pulling the long metal probes Q and R into the brass thimbles S and T, which act as terminals for the wet cell F. Current flows into the horizontal magnet, producing an electromagnetic field that pulls A and C together. The whole process is then repeated on the opposite side. Thus the horizontal bars oscillate up and down.
The catch was that electricity had to be transported from the generating station to the place where the motor was to be used. Some way had to be found to cut down the loss of electrical energy (taking the form of dissipated heat) as it traveled over wires.
One answer was the transformer. The experimenters with currents found that electricity suffers far less loss if it is transmitted at a low rate of flow. So the output from the generator was stepped up to a high voltage by means of a transformer that, while multiplying the voltage, say, three times, reduces the current (rate of flow) to one-third. At the receiving station, the voltage can be stepped down again so that the current is correspondingly increased for use in motors.
The transformer works by using the primary current to induce a current at high voltage in a secondary coil. This induction requires varying the magnetic field through the second coil. Since a steady current will not do this, the current used is a continually changing one that builds up to a maximum and then drops to zero and starts building in the opposite direction—in other words, an alternating current.
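The arithmetic behind stepping up the voltage is worth spelling out. Since the power delivered is voltage times current, tripling the voltage cuts the current to one-third; and since the heat wasted in the line grows as the square of the current, the loss falls to one-ninth. A brief sketch, using illustrative figures of my own choosing:

```python
# Line loss in transmission: for a given power P sent at voltage V over a line
# of resistance R, the current is I = P / V and the heat dissipated is I**2 * R.
def line_loss_watts(power_w: float, volts: float, line_resistance_ohms: float) -> float:
    current = power_w / volts
    return current ** 2 * line_resistance_ohms

P, R = 9000.0, 1.0                    # a 9-kilowatt load over a 1-ohm line (assumed values)
print(line_loss_watts(P, 100.0, R))   # at 100 volts: (90 A)^2 * 1 ohm = 8100 W lost
print(line_loss_watts(P, 300.0, R))   # at 300 volts: (30 A)^2 * 1 ohm = 900 W lost
```

Tripling the transmission voltage cuts the wasted heat to one-ninth, which is why the transformer made long-distance power distribution practical.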
Alternating current (A.C.) did not win out over direct current (D.C.) without a struggle. Thomas Alva Edison, the greatest name in electricity in the final decades of the nineteenth century, championed direct current and established the first dc generating station in New York in 1882 to supply current for the electric light he had invented. He fought alternating current on the ground that it was more dangerous (pointing out, for instance, that it was used in electric chairs). He was bitterly opposed by Nikola Tesla, an engineer who had worked for Edison and been shabbily treated. Tesla developed a successful system of alternating current in 1888. In 1893, George Westinghouse, also a believer in alternating current, won a crucial victory over Edison by obtaining for his electric company the contract to develop the Niagara Falls power plants on an ac basis. In the following decades, Charles Proteus Steinmetz established the theory of alternating currents on a firm mathematical basis.
Today alternating current is all but universal in systems of power distribution. (In 1966, to be sure, engineers at General Electric devised a direct-current transformer—long held to be impossible; but it involves liquid-helium temperatures and low efficiency. It is fascinating theoretically, but of no likely commercial use right now.)
Electrical Technology
The steam engine is a prime mover: it takes energy already existing in nature (the chemical energy of wood, oil, or coal) and turns it into work. The electric motor is not a prime mover: it converts electricity into work, but the electricity must itself be formed from the energy of burning fuel or falling water. For this reason, electricity is more expensive than steam for heavy jobs. Nevertheless, it can be used for the purpose. At the Berlin Exhibition of 1879, an electric-powered locomotive (using a third rail as its source of current) successfully pulled a train of coaches. Electrified trains are common now, especially for rapid transit within cities, for the added expense is more than made up for by increased cleanliness and smoothness of operation.
THE TELEPHONE
Where electricity really comes into its own, however, is where it performs tasks that steam cannot. There is, for instance, the telephone, patented by the Scottish-born inventor Alexander Graham Bell in 1876. In the telephone mouthpiece, the speaker’s sound waves strike a thin steel diaphragm and make it vibrate in accordance with the pattern of the waves. The vibrations of the diaphragm, in turn, set up an analogous pattern in an electric current, which strengthens and weakens in exact mimicry of the sound waves. At the telephone receiver, the fluctuations in the strength of the current actuate an electromagnet that makes a diaphragm vibrate and reproduce the sound waves.
The telephone was crude, at first, and barely worked; but even so, it was the hit of the Centennial Exposition held at Philadelphia in 1876 to celebrate the hundredth anniversary of the Declaration of Independence. The visiting Brazilian emperor, Pedro II, tried it and dropped the instrument in astonishment, saying “It talks!” which made newspaper headlines. Another visitor, Kelvin, was equally impressed, while the great Maxwell was astonished that anything so simple would reproduce the human voice. In 1877, Queen Victoria acquired a telephone, and its success was assured.
Also in 1877, Edison devised an essential improvement. He constructed a mouthpiece containing loose-packed carbon powder. When the diaphragm pressed on the carbon powder, the powder conducted more current; when it moved away, the powder conducted less. In this way, the sound waves of the voice were translated by the mouthpiece into varying pulses of electricity with great fidelity, and the voice one heard in the receiver was reproduced with improved clarity.
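The principle of Edison’s carbon mouthpiece can be captured in a toy model: the pressure of the diaphragm packs the carbon granules tighter, lowering their resistance, so that the current drawn from a fixed battery voltage rises and falls in mimicry of the sound wave. The particular resistance law and the numbers below are simplified assumptions of mine, chosen only to illustrate the mechanism:

```python
import math

# Toy model of a carbon-button microphone. Diaphragm pressure (in arbitrary
# units) modulates the resistance of the carbon powder; the current then
# follows Ohm's law, I = V / R. All constants here are illustrative.
def carbon_mic_current(volts: float, rest_ohms: float, pressure: float) -> float:
    # Assumed simplified law: more pressure -> tighter packing -> lower resistance.
    resistance = rest_ohms / (1.0 + pressure)
    return volts / resistance

# A sound wave modeled as a small oscillating pressure about a resting value;
# the resulting current oscillates in step with it.
wave = [0.5 * math.sin(2 * math.pi * t / 8) for t in range(8)]
currents = [carbon_mic_current(6.0, 100.0, 0.5 + p) for p in wave]
```

The list of currents traces the same oscillation as the pressure wave, which is exactly what the telephone line then carries to the receiver.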
Telephone messages could not be carried very far without ruinous investment in thick (therefore low-resistance) copper wire. At the turn of the century, the Yugoslavian-American physicist Michael Idvorsky Pupin developed a method of loading a thin copper wire with inductance coils at intervals. These reinforced the signals and allowed them to be carried across long distances. The Bell Telephone Company bought the device in 1901; and by 1915, long-distance telephony was a fact as the line between New York City and San Francisco was opened.
The telephone operator became an unavoidable and increasing part of life for half a century, until her domination (she was almost invariably a woman) began to fade with the beginnings of the dial telephone in 1921. Automation continued to advance until, by 1983, hundreds of thousands of telephone employees could go out on strike for a couple of weeks while telephone service continued without interruption. Currently, radio beams and communications satellites add to the versatility of the telephone.
RECORDING SOUND
In 1877, a year after the invention of the telephone, Edison patented his phonograph. The first records had the grooves scored on tinfoil wrapped around a rotating cylinder. The American inventor Charles Sumner Tainter substituted wax cylinders in 1885, and then Emile Berliner introduced wax-coated disks in 1887. In 1904, Berliner introduced a still more important advance: the flat phonograph record on which the needle vibrates from side to side. Its greater compactness allowed it to replace Edison’s cylinder (with a needle vibrating up and down) almost at once.
In 1925, recordings began to be made by means of electricity through the use of a microphone, which translated sound into a mimicking electric current via a piezoelectric crystal instead of a metal diaphragm—the crystal allowing a better quality of reproduction of the sound. In the 1930s, the use of radio tubes for amplification was introduced.
In 1948, the Hungarian-American physicist Peter Goldmark developed the long-playing record, which turned 33⅓ times per minute rather than the till-then regulation 78. A single LP record could hold six times the amount of music of the old kind and made it possible to listen to symphonies without the repeated necessity of turning and replacing records.
Electronics made possible high-fidelity (hi-fi) and stereophonic sound, which have had the effect, so far as the sound itself is concerned, of practically removing all mechanical barriers between the orchestra or singer and the listener.
Tape-recording of sound was invented in 1898 by a Danish electrical engineer named Valdemar Poulsen, but had to await certain technical advances to become practical. An electromagnet, responding to an electric current carrying the sound pattern, magnetizes a powder coating on a tape or a wire moving past it, and the playback is accomplished through an electromagnet that picks up this pattern of magnetism and translates it again into a current that will reproduce the sound.
ARTIFICIAL LIGHT BEFORE ELECTRICITY
Of all the tricks performed by electricity, certainly the most popular was its turning night into day. Human beings had fought off the daily crippling darkness-after-sundown with the campfire, the torch, the oil lamp, and the candle; for half a million years or so, the level of artificial light remained dim and flickering.
The nineteenth century introduced some advances in these age-old methods of lighting. Whale oil and then kerosene came to be used in oil lamps, which grew brighter and more efficient. The Austrian chemist Karl Auer, Baron von Welsbach, found that if a fabric cylinder impregnated with compounds of thorium and cerium was put around a lamp flame, it would glow a brilliant white. Such a Welsbach mantle, patented in 1885, greatly increased the brightness of the oil lamp.
Early in the century, gas lighting was introduced by the Scottish inventor William Murdock. He piped coal gas to a jet where it could be allowed to escape and be lit. In 1802, he celebrated a temporary peace with Napoleon by setting up a spectacular display of gas lights; and by 1803, he was routinely lighting his main factory with them. In 1807, some London streets began to use gas lighting, and the custom spread. As the century progressed, large cities grew ever lighter at night, reducing the crime rate and enhancing the security of citizens.
The American chemist Robert Hare found that a hot gas flame played upon a block of calcium oxide (lime) produces a brilliant white light. Such limelight came to be used to illuminate theater stages to a brighter level than had hitherto been possible. Although this technique has long since been outmoded, people who are in the blaze of publicity are still said to be “in the limelight.”
All of these forms of lighting from bonfires to the gas jet involve open flames. Some device must exist to light the fuel—be it wood, coal, oil, or gas—if a flame does not already exist in the vicinity. Prior to the nineteenth century, the least laborious method was to use flint and steel. By striking one against another, a spark could be elicited that might, with luck, light some tinder (finely divided inflammable material) which could, in turn, light a candle, and so on.
In the early nineteenth century, chemists began devising methods for coating one end of a piece of wood with chemicals that would burst into flame when the temperature was elevated. Such a piece of wood was a match. Friction would raise the temperature, and “striking a match” on a rough surface produced a flame.