The Amazing Story of Quantum Mechanics


by Kakalios, James


  In the 2008 film Iron Man, Tony Stark designs a suit of armor that contains a host of high-tech gadgets, nearly all of which are within the realm of physical plausibility. The one miracle exception from the laws of nature that the film invokes is the “arc reactor” that powers Stark’s high-tech exoskeleton. This device, a cylinder about the size of a hockey puck, is capable of producing “three gigawatts of power,”75 sufficient to keep a real-world jet pack aloft and flying for hours. Sadly, we have no way of producing such compact, lightweight, high-energy-content power cells.

  Had the revolution in energy anticipated by the science fiction pulp magazines indeed occurred, and had we commuted to work or the corner grocery store on personal jet packs powered by some exotic energy source, the need for conventional fossil fuels would of course be dramatically reduced, with a concurrent dramatic shift in geopolitical relations. But there is one important use of potential jet-pack technology, involving not transportation but thirst quenching, that would have an immediate beneficial impact.

  According to the World Health Organization, as of 2009, 40 percent of the world’s population suffers from a scarcity of potable fresh water. The most straightforward way to convert seawater to fresh water is to boil it: the liquid water becomes steam, leaving the salts behind as residue. This is, after all, what occurs during evaporation from the oceans, which is why rainwater is salt free. The energy needed to boil water in such quantities is not easily provided by solar cells, but if one had a power supply fit for a fully functioning jet pack, the lives of more than two billion people would be profoundly improved, even if everyone’s feet stayed firmly planted on the ground.
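A back-of-the-envelope sketch makes the connection between jet-pack power and thirst quenching concrete. The 3-gigawatt figure is the film's; the heat values are standard properties of water, and the starting temperature of 20 degrees Celsius is my assumption:

```python
# Hedged sanity check: how much seawater could a hypothetical
# 3-gigawatt "arc reactor" distill per day, if all its output went
# into boiling water?
POWER_W = 3e9                  # three gigawatts, as quoted from the film
SECONDS_PER_DAY = 86_400

# Energy to heat 1 kg of water from ~20 C to 100 C, then vaporize it
heat_to_boil = 4186 * 80       # specific heat (J/kg per K) * temperature rise (K)
heat_to_vaporize = 2.26e6      # latent heat of vaporization (J/kg)
energy_per_kg = heat_to_boil + heat_to_vaporize   # roughly 2.6 MJ/kg

kg_per_day = POWER_W * SECONDS_PER_DAY / energy_per_kg
print(f"{kg_per_day / 1e6:.0f} million liters of fresh water per day")
```

Roughly a hundred million liters a day from a single hockey-puck-sized source, which is why a jet-pack-class power supply would matter even for feet planted on the ground.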

  Can quantum mechanics help in the production of energy, so that the jet-pack dreams of the 1930s can be at long last realized? Possibly. Global consumption of energy, estimated in 2005 at sixteen trillion watts, will certainly increase in the future, with many experts projecting that demand will grow by nearly 50 percent in the next twenty years. One strategy to meet this additional need involves the construction of new power plants, each capable of producing a gigawatt of power, at the rate of one new facility every day for the next two decades. This does not seem likely to happen.
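The one-plant-per-day figure follows directly from the numbers quoted above, as a quick check shows (figures exactly as given in the text):

```python
# Checking the arithmetic behind "one new power plant every day"
current_demand_W = 16e12            # sixteen trillion watts (2005 estimate)
growth = 0.50                       # ~50 percent growth over twenty years
extra_demand_W = current_demand_W * growth   # eight trillion watts of new capacity

plant_output_W = 1e9                # one gigawatt per new facility
plants_needed = extra_demand_W / plant_output_W
days_in_twenty_years = 20 * 365

print(f"{plants_needed:.0f} new plants needed")        # 8000
print(f"{days_in_twenty_years} days in twenty years")  # 7300
# -> a bit more than one new gigawatt plant per day, as the text says
```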

  Another approach is to tap the vast amount of energy that is, for the most part, ignored by all nations—sunlight. The surface of the Earth receives well over a hundred thousand trillion watts of power, more than six thousand times the total global energy usage and more than enough to meet the world’s energy needs for decades to come. As described in Chapter 16, the simple diode, composed of a junction between one semiconductor with impurities that donate excess electrons and a second semiconductor with impurities that donate holes, can function as a solar cell. When the diode absorbs a photon, an electron is promoted into the upper band, leaving a mobile hole in the lower filled band. These charge carriers feel a force from the strong internal electric field at the pn junction, and a current can be drawn out of the device, simply as a result of exposing it to sunlight. Work is under way to improve the conversion efficiency of these devices—that is, to maximize the current that results for a given intensity of sunlight. But even using current cells, with conversion efficiencies of only 10 percent (that is, 90 percent of the energy that shines on the solar cell does not lead to electrical power), we could provide all the electricity needs of the United States with an array of solar cells of only one hundred miles by one hundred miles.
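The hundred-mile-square claim can be checked in a few lines. The round-the-clock average insolation of about 200 watts per square meter (averaged over night, weather, and latitude) is my assumption, not a figure from the text:

```python
# Rough check of the 100-mile-by-100-mile solar array claim
MILE_M = 1609.34
side_m = 100 * MILE_M
area_m2 = side_m ** 2                  # ~2.6e10 square meters

avg_insolation_W_m2 = 200              # day/night/weather average (assumed)
efficiency = 0.10                      # 10 percent, per the text

power_W = area_m2 * avg_insolation_W_m2 * efficiency
print(f"{power_W / 1e9:.0f} GW of average electrical power")
```

The result, around five hundred gigawatts, is indeed in the neighborhood of average U.S. electricity consumption, so the book's figure holds up under these assumptions.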

  The problem is, we don’t have enough solar cells on hand to cover a one-hundred-mile-by-one-hundred-mile grid, and at the present production capacity it would take many years to fabricate these devices. Moreover, even if the solar cells existed, we would need to get the electrical power from bright sunny locales to the gloomy cities with large population densities. Here again, quantum mechanics may help.

  In Chapter 13 we saw that at low temperatures certain metals become superconductors, when their electrons form bound pairs through a polarization of the positive ions in the metal lattice. Electrons have intrinsic angular momentum of ℏ/2 and individually obey Fermi-Dirac statistics (Chapter 12), which stipulate that no two electrons can occupy the same quantum state. When the electrons in a metal at low temperature pair up, they create composite charge carriers that have a net total spin of zero. These paired electrons obey Bose-Einstein statistics, and as the temperature is lowered they condense into a low-energy state. If the temperature of the solid is low enough, then for moderate currents there is not enough energy to scatter the electrons out of this lowest-energy state, and they can thus carry current without resistance. This phenomenon—superconductivity—is an intrinsically quantum mechanical effect and is observed only in metals at extremely low temperatures, below -420 degrees Fahrenheit.

  At least—that was the story until 1986. In that year two scientists, Johannes Bednorz and Karl Müller, at the IBM research laboratory in Zurich, Switzerland, reported their discovery of a ceramic that became a superconductor at -400 degrees Fahrenheit. That’s still very cold, but at the time it set a record for the highest temperature at which superconductivity had been observed. Once the scientific community knew that this class of materials, containing copper, oxygen, and rare-earth metals, could exhibit superconductivity, the race was on, and research labs around the world tried a wide range of elements in a host of combinations. A year later a group of scientists from the University of Houston and the University of Alabama discovered a compound of yttrium, barium, copper, and oxygen that became a full-fledged superconductor at a balmy -300 degrees Fahrenheit. Liquid nitrogen, used in many dermatologists’ offices for the treatment of warts, is about 20 degrees colder, at -321 degrees Fahrenheit. These materials are referred to as “high-temperature superconductors,” as their transition into a zero-resistance state can be induced using a refrigerant found in many walk-in medical clinics. There is no definitive explanation for how these materials are able to become superconductors at such relatively toasty temperatures, and their study remains an active and exciting branch of solid-state physics. The most promising models to account for this effect invoke novel mechanisms that quantum mechanically induce the electrons in these solids to form a collective ground state.
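For orientation, here are the Fahrenheit figures above converted to the kelvin scale physicists actually use (a conversion sketch only; the temperatures are the ones quoted in the text):

```python
# Fahrenheit-to-kelvin conversion for the superconductor milestones
def f_to_k(temp_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (temp_f - 32) * 5 / 9 + 273.15

milestones = [
    ("conventional superconductors", -420),
    ("Bednorz & Muller ceramic (1986)", -400),
    ("liquid nitrogen boiling point", -321),
    ("yttrium-barium-copper-oxide ceramic", -300),
]

for label, temp_f in milestones:
    print(f"{label}: {temp_f} F = {f_to_k(temp_f):.0f} K")
```

Liquid nitrogen comes out at 77 kelvin, which is why a ceramic that superconducts at roughly 90 kelvin was such a breakthrough: the coolant is suddenly cheap and ordinary.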

  High-temperature superconductors would be ideal to transmit electricity generated from a remote bank of solar cells or windmills to densely populated regions where the power is needed. While it would need to be kept cool, liquid nitrogen is easy to produce, and when purchased for laboratory needs it is cheaper than milk (and certainly cheaper than bottled water). Unfortunately, to date, challenging materials-science issues limit the currents that can be carried by these ceramics, such that if we were to use them for transmission lines they would cease to be superconductors and would in fact have resistances higher than those of ordinary metals.

  If these problems are ever solved, then along with transmitting electrical power, these innovations may help transportation undergo a revolution as well. As discussed in Chapter 13, in addition to carrying electrical current with no resistance, superconductors are perfect diamagnets, completely repelling any externally applied magnetic field. The material sets up screening currents that cancel out the external field trying to penetrate the superconductor, and as there is no resistance to current flow, these screening currents can persist indefinitely. If high-temperature superconductors can be fabricated that are able to support high enough currents to block out large enough magnetic fields, then high-speed magnetically levitating trains are possible, where the major cost involves the relatively cheap and safe liquid nitrogen coolant.

  Bednorz and Müller won the Nobel Prize in Physics just one year after they published their discovery of high-temperature superconductivity in ceramics. However, more than twenty years later, the trains still do not levitate riding on rails composed of novel copper oxide compounds. Unlike giant magnetoresistance and the solid-state transistor, both of which went from the research lab to practical applications in well under a decade, there are no preexisting consumer products for which raising the transition temperature of a superconductor would make a significant difference. Nevertheless, research on these materials continues, and someday we may have high-temperature superconductors overhead in our transmission lines and underfoot on our rail lines.

  Another untapped source of energy that quantum mechanics-based devices may be able to exploit in the near future involves waste. I speak here not of garbage but of waste heat, generated as a by-product of any combustion process.

  Why is heat wasted under the hood of your car? Heat and work are both forms of energy. Work, in physics terms, involves a force applied over a given distance, as when the forces exerted by the collisions of rapidly moving gas molecules lift a piston in a car engine. Heat in physics refers to the transfer of energy between systems having different average energy per atom. Bring a solid whose atoms are vigorously vibrating into contact with another whose atoms are slowly shaking, and collisions and interactions between the constituent atoms cause the more energetic atoms to slow down while the sluggish atoms speed up. We say that the first solid initially had a higher temperature while the second had a lower temperature, and that through collisions they exchange heat until they eventually come to some common temperature. We can do work on a system and convert all of it to heat, but the Second Law of Thermodynamics informs us that we can never, with 100 percent efficiency, transform a given amount of heat into work.

  Why not? Because of the random nature of collisions. Consider the molecules in an automobile cylinder, right before the compression stroke and ignition spark cause the gasoline and oxygen molecules to undergo combustion. They are zipping in all directions, colliding with one another and with the walls and bottom and top of the cylinder. The pressure is uniform on all surfaces in the cylinder. Following ignition, the gasoline-oxygen mixture undergoes an explosive chemical reaction, yielding other chemicals and releasing heat; that is, the reaction products have greater kinetic energy than the reactants had before the explosion. This greater kinetic energy leads to a greater force being exerted on the head of the piston as the gas molecules collide with it. This larger force raises the piston and, through a clever system of shafts and cams, converts this lifting into a rotational force applied to the tires. But the higher gas pressure following the chemical explosion pushes on all surfaces of the cylinder, though only the force on the piston head results in useful work. The other collisions wind up warming the walls and piston of the cylinder, and from the point of view of getting transportation from the gasoline, this heat is “wasted.”

  When heat is converted to work, the Second Law of Thermodynamics quantifies how much heat must be left over. In an automobile, in the best-case scenario, one can expect to convert only one-third of the available chemical energy into energy that moves the car, and very few auto engines are even that efficient. There’s a lot of energy under the hood that is not being effectively utilized. Similarly, cooling towers for power plants eject vast quantities of heat into the atmosphere. It is estimated that more than a trillion watts of power is lost this way, as heat that is never converted to work. This situation may change in the future, thanks to solid-state devices called “thermoelectrics.” These structures convert temperature differences into voltages and are the waste-heat version of solar cells (also known as “photovoltaic” devices), which convert light into voltages.
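The Second Law bound alluded to above is the Carnot efficiency, 1 - T_cold/T_hot, with both temperatures in kelvin. The engine temperatures below are illustrative assumptions of mine, not figures from the text:

```python
# The ideal (Carnot) limit on converting heat to work
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work, temperatures in kelvin."""
    return 1 - t_cold_k / t_hot_k

t_combustion = 1000    # K, rough peak gas temperature in a cylinder (assumed)
t_ambient = 300        # K, roughly room temperature (assumed)

ideal = carnot_efficiency(t_combustion, t_ambient)
print(f"ideal limit: {ideal:.0%}")
```

Even this idealized engine tops out at 70 percent, and real engines, with friction and incomplete combustion, fall to the one-third figure the text cites or below.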

  Thermoelectrics make use of the same physics that enables solid-state thermometers to record a temperature without glass containers of mercury. Consider two different metals brought into contact. We have argued that metals can be viewed as lecture halls where only half of the possible seats are occupied, so that there are many available empty seats that can be occupied if the electrons absorb energy from light, applied voltages, or heat. Different metals will have different numbers of electrons in the partially filled lower band. Think about two partially filled auditoriums, each with different numbers of people sitting in the seats, separated by a removable wall, as in some hotel ballrooms. One auditorium has two hundred people, while the other has only one hundred. Now the wall separating them is removed, creating one large auditorium. As everyone wants to sit closer to the front, fifty people from the first room move into vacant seats in the other, until each side has one hundred and fifty people sitting in it. But both metals were electrically neutral before the wall was removed. Adding fifty electrons to the second room creates a net negative charge, while subtracting fifty electrons from the first room yields a net positive charge. A voltage thus develops at the juncture between the two metals, just by bringing them into electrical contact. If there are significant differences in the arrangements of the rows of seats on each side, then as the temperature is raised the number of electrons on each side may vary, leading to a voltage that changes with temperature. In this way, by knowing what voltage measured across the junction corresponds to what temperature, this simple device, called a “thermocouple,” can measure the ambient temperature.
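In practice the junction voltage grows roughly linearly with the temperature difference, V = S × ΔT, where S is called the Seebeck coefficient. The value used below (about 41 microvolts per kelvin, typical of a common chromel-alumel "type K" pair) is a standard reference number, not one from the text:

```python
# Sketch of a thermocouple readout: voltage proportional to the
# temperature difference between the junction and the reference end.
SEEBECK_V_PER_K = 41e-6    # ~41 microvolts per kelvin, typical type-K value

def thermocouple_voltage(t_junction_k, t_reference_k):
    """Approximate open-circuit voltage for a given temperature difference."""
    return SEEBECK_V_PER_K * (t_junction_k - t_reference_k)

# A junction 100 kelvin hotter than the reference end:
v = thermocouple_voltage(400, 300)
print(f"{v * 1000:.1f} mV")
```

A few millivolts is a small signal, but one that inexpensive electronics can measure precisely, which is why thermocouples are everywhere from furnaces to laboratory ovens.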

  Thermoelectrics perform a similar feat using a nominally homogeneous material, typically a semiconductor. If one end of the solid is hotter than the other, then the warmer side will have more electrons promoted from the full lower band up into the mostly empty conducting band than will be found at the cooler end. In some materials the holes generated in the nearly filled lower-energy orchestra move much more slowly than the electrons in the higher-energy balcony, so we can focus on the electrons alone. The electrons promoted at the hot side will diffuse over to the cooler end, where they will pile up, creating a voltage that repels any additional electrons from moving across the semiconductor. This voltage can then be used to run any device, acting as a battery does. To make an effective thermoelectric device, one wants a material that is a good conductor of electricity (so that the electrons can easily move across the solid) but a poor conductor of heat (so that the temperature difference can be maintained across the length of the solid). Research into materials well suited to thermoelectric applications is under way at many laboratories. Commercially viable devices could find application in, for example, hybrid automobiles, taking waste heat from the engine and converting it into a voltage to charge the battery. In the world of the future, thanks to solid-state devices made possible through our understanding of quantum mechanics, the cars may not fly, but they may get much better mileage.
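The trade-off just described, good electrical conduction paired with poor heat conduction, is captured by the standard thermoelectric figure of merit ZT = S²σT/κ. The material numbers below are illustrative values in the ballpark of bismuth telluride, a common thermoelectric, and are my assumptions rather than figures from the text:

```python
# Thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa
def figure_of_merit(seebeck, elec_cond, thermal_cond, temp_k):
    """Dimensionless ZT: higher means a better thermoelectric material."""
    return seebeck**2 * elec_cond * temp_k / thermal_cond

zt = figure_of_merit(
    seebeck=200e-6,     # Seebeck coefficient, volts per kelvin (assumed)
    elec_cond=1e5,      # electrical conductivity, siemens per meter (assumed)
    thermal_cond=1.5,   # thermal conductivity, watts per meter-kelvin (assumed)
    temp_k=300,         # room temperature
)
print(f"ZT = {zt:.1f}")
```

A ZT near or above 1 marks a practically useful material; the laboratory research the text mentions is largely a hunt for solids that push ZT higher by suppressing κ without hurting σ.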

  Electrical power can also be extracted from random vibrations, using nanogenerators. These consist of special wires only a few nanometers in diameter, composed of zinc oxide or other materials that are termed “piezoelectric.” In these compounds a mechanical stress causes a slight shift in the crystal structure, which in turn generates a small voltage. Progress has been made in fabricating arrays of nanoscale wires of these piezoelectric materials. Any motion or vibration will cause the tiny filaments to flex and bend, thereby creating an electric voltage that can be used to power another nanoscale machine or device.

  Finally, we ask, can quantum mechanics do anything to develop small, lightweight batteries to power a personal jet pack? The answer may lie in the developing field of “nanotechnology.” “Nano” comes from the Greek word for “dwarf,” and a nanometer is one billionth of a meter—approximately the length of three atoms placed end to end. First let’s see how normal batteries operate, and then I’ll discuss why nanoengineering may lead to more powerful energy-storage devices.

  In an automobile engine the electrical energy from the spark plug induces the chemical combustion of gasoline and oxygen. Batteries employ a reverse process, where chemical reactions are used to generate voltages.

  In an electrolysis reaction, an electrical current passes through reactants (often in liquid form) and provides the energy to initiate a chemical reaction. For example, one way to generate hydrogen gas (that does not involve the burning of fossil fuels) is to break apart water molecules. To do this we insert two electrodes in a beaker of water and attach them to an external electrical power supply, passing a current through the fluid. One electrode will try to pull electrons out of the water (pure water is a very good electrical insulator), while the other will try to shove them in. The input of electrical energy overcomes the binding energy holding the water molecule together, and positively charged hydrogen ions (H+) are attracted to the electrode trying to give up electrons, while the negatively charged hydroxides (OH- units) move toward the electrode trying to accept electrons. The net result is that H2O molecules break into gaseous hydrogen and oxygen molecules.
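The hydrogen yield of such a cell follows from Faraday's law of electrolysis: each H2 molecule requires two electrons from the electrode. The one-ampere, one-hour scenario below is a made-up illustration, not a figure from the text:

```python
# Faraday's law: moles of product are set by the total charge passed
FARADAY = 96485          # coulombs per mole of electrons

current_A = 1.0          # assumed cell current
time_s = 3600            # one hour (assumed)
charge_C = current_A * time_s          # total charge through the cell

moles_electrons = charge_C / FARADAY
moles_h2 = moles_electrons / 2         # two electrons per H2 molecule
grams_h2 = moles_h2 * 2.016            # molar mass of H2, grams per mole

print(f"{grams_h2 * 1000:.0f} mg of hydrogen per hour")
```

A few tens of milligrams per ampere-hour: modest, which is why large-scale hydrogen production by electrolysis demands enormous currents and, ultimately, enormous power supplies.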

  In a battery, making use of essentially a reverse electrolysis process, different metals are employed for the electrodes (such as nickel and cadmium); they are chosen specifically because they undergo chemical reactions with certain liquids, leaving the reactant either positively or negatively charged. Where the metal electrode touches the chemical fluid (though batteries can also use a porous solid or a gel between the electrodes), electrical charges are either taken from the metal or added to it, depending on the chemical reaction that proceeds.76 A barrier is placed between the two electrodes, preventing the fluid from moving from one electrode to the other, so that negative charges (that is, electrons) pile up on one electrode and an absence of electrons (equivalent to an excess of positive charges) accumulates at the other.

  The only way the excess electrons on one electrode, which are repelled from each other and would like to leave the electrode, can move to the positively charged electrode is if a wire is connected across the two terminals of the battery. The stored electrical charges can then flow through a circuit and provide the energy to operate a device. In an alkaline battery, once the chemical reactants in the fluid are exhausted, the device loses its ability to charge up the electrodes. Certain metal-fluid chemical reactions can proceed in one way when current is drawn from the battery, and in the reverse direction with the input of an electrical current (as in the water electrolysis example earlier), restoring the battery to its original state. Such batteries are said to be “rechargeable,” and it is these structures that have exhibited the greatest increases in energy-storage capacity of late.

 
