Asimov's New Guide to Science


by Isaac Asimov


  The matter waves had important consequences for theory, too. For one thing, they cleared up some puzzles about the structure of the atom.

  In 1913, Niels Bohr had pictured the hydrogen atom, in the light of the recently propounded quantum theory, as consisting of a central nucleus surrounded by an electron that could circle that nucleus in any one of a number of orbits. These orbits are in fixed positions; if a hydrogen electron drops from an outer orbit to an inner one, it loses energy, emitting that energy in the form of a quantum possessing a fixed wavelength. If the electron were to move from an inner orbit to an outer one, it would have to absorb a quantum of energy, but only one of a fixed size and wavelength, just enough to move it by the proper amount. Hence, hydrogen can absorb or emit only certain wavelengths of radiation, producing characteristic lines in its spectrum. Bohr’s scheme, which was made gradually more complex over the next decade—notably by the German physicist Arnold Johannes Wilhelm Sommerfeld, who introduced elliptical orbits as well—was highly successful in explaining many facts about the spectra of various elements. Bohr was awarded the Nobel Prize in physics in 1922 for his theory. The German physicists James Franck and Gustav Ludwig Hertz (the latter a nephew of Heinrich Hertz), whose studies on collisions between atoms and electrons lent an experimental foundation to Bohr’s theories, shared the Nobel Prize in physics in 1925.
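
  The arithmetic can be checked directly. The short Python sketch below is only a rough illustration, using the textbook Bohr energy levels for hydrogen, E_n = -13.6 eV / n² (a formula and constants assumed here, not quoted in the passage above); it computes the quanta emitted when the electron drops to the second orbit, and the wavelengths that emerge are the familiar visible lines of the hydrogen spectrum.

```python
# Illustrative sketch (assumed textbook values): visible hydrogen lines from the Bohr levels.
h = 6.626e-34       # Planck's constant, J*s
c = 2.998e8         # speed of light, m/s
eV = 1.602e-19      # one electron-volt in joules

def level(n):
    """Energy of the nth Bohr orbit of hydrogen, in joules (textbook formula)."""
    return -13.6 * eV / n**2

for n_outer in (3, 4, 5, 6):
    quantum = level(n_outer) - level(2)   # energy of the quantum released in dropping to orbit 2
    wavelength = h * c / quantum          # the fixed wavelength that corresponds to that quantum
    print(f"drop {n_outer} -> 2: {wavelength * 1e9:.1f} nm")
# Prints roughly 656.5, 486.3, 434.2, and 410.3 nm: the red, blue-green,
# and violet lines actually observed in the hydrogen spectrum.
```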

  Bohr had no explanation of why the orbits were fixed in the positions they held. He simply chose the orbits that would give the correct results, so far as absorption and emission of the actually observed wavelengths of light were concerned.

  In 1926, the Austrian physicist Erwin Schrödinger decided to take another look at the atom in the light of the de Broglie theory of the wave nature of particles. Considering the electron as a wave, he decided that the electron does not circle around the nucleus as a planet circles around the sun but constitutes a wave that curves all around the nucleus, so that it is in all parts of its orbit at once, so to speak. It turned out that, on the basis of the wavelength predicted by de Broglie for an electron, a whole number of electron waves would exactly fit the orbits outlined by Bohr. Between the orbits, the waves would not fit in a whole number but would join up out of phase; and such orbits could not be stable.
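
  Schrödinger's observation can be checked numerically. The sketch below assumes the standard textbook values for Planck's constant, the electron mass, and the Bohr orbits (none of which are quoted in the passage) and counts how many de Broglie wavelengths fit around each orbit.

```python
# Illustrative sketch (assumed textbook constants): de Broglie waves around the Bohr orbits.
import math

h = 6.626e-34        # Planck's constant, J*s
m_e = 9.109e-31      # electron mass, kg
a0 = 5.292e-11       # Bohr radius, m
alpha = 1 / 137.036  # fine-structure constant
c = 2.998e8          # speed of light, m/s

for n in range(1, 6):
    r = n**2 * a0                # radius of the nth Bohr orbit
    v = alpha * c / n            # electron speed in that orbit
    wavelength = h / (m_e * v)   # de Broglie wavelength of such an electron
    waves = 2 * math.pi * r / wavelength
    print(f"orbit {n}: {waves:.3f} wavelengths fit its circumference")
# Each orbit holds almost exactly n whole wavelengths; an orbit in between
# would leave the wave out of phase with itself and could not persist.
```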

  Schrödinger worked out a mathematical description of the atom called wave mechanics or quantum mechanics, which became a more satisfactory method of looking at the atom than the Bohr system had been. Schrödinger shared the Nobel Prize in 1933 with Dirac, the author of the theory of antiparticles (see chapter 7), who also contributed to the development of this new picture of the atom. The German physicist Max Born, who contributed further to the mathematical development of quantum mechanics, shared the Nobel Prize in physics in 1954.

  THE UNCERTAINTY PRINCIPLE

  By this time the electron had become a pretty vague “particle”—a vagueness soon to grow worse. Werner Heisenberg of Germany proceeded to raise a profound question that projected particles, and physics itself, almost into a realm of the unknowable.

  Heisenberg had presented his own model of the atom. He had abandoned all attempts to picture the atom as composed either of particles or of waves. He decided that any attempt to draw an analogy between atomic structure and the structure of the world about us is doomed to failure. Instead, he described the energy levels or orbits of electrons purely in terms of numbers, without a trace of picture. Since he used a mathematical device called a matrix to manipulate his numbers, his system was called matrix mechanics.

  Heisenberg received the Nobel Prize in physics in 1932 for his contributions to quantum mechanics, but his matrix system was less popular with physicists than Schrödinger’s wave mechanics, since the latter seemed just as useful as Heisenberg’s abstractions, and it is difficult for even physicists to force themselves to abandon the attempt to picture what they are talking about.

  By 1944, physicists seemed to have done the correct thing, for the Hungarian-American mathematician John von Neumann presented a line of argument that seemed to show that matrix mechanics and wave mechanics are mathematically equivalent: everything demonstrated by one could be equally well demonstrated by the other. Why not, therefore, choose the less abstract version?

  After having introduced matrix mechanics (to jump back in time again), Heisenberg went on to consider the problem of describing the position of a particle. How can one determine where a particle is? The obvious answer is: Look at it. Well, let us imagine a microscope that could make an electron visible. We must shine a light or some appropriate kind of radiation on it to see it. But an electron is so small that a single photon of light striking it would move it and change its position. In the very act of measuring its position, we would have changed that position.

  This is a phenomenon that occurs in ordinary life. When we measure the air pressure in a tire with a gauge, we let a little air out of the tire and change the pressure slightly in the act of measuring it. Likewise, when we put a thermometer in a bathtub of water to measure the temperature, the thermometer’s absorption of heat changes the temperature slightly. A meter measuring electric current takes away a little current for moving the pointer on the dial. And so it goes in every measurement of any kind that we make.

  However, in all ordinary measurements, the change in the subject we are measuring is so small that we can ignore it. The situation is quite different when we come to look at the electron. Our measuring device now is at least as large as the thing we are measuring; there is no usable measuring agent smaller than the electron. Consequently our measurement must inevitably have, not a negligible, but a decisive, effect on the object measured. We could stop the electron and so determine its position at a given instant. But, in that case, we could not know its motion or velocity. On the other hand, we might record its velocity, but then we could not fix its position at any given moment.

  Heisenberg showed that there is no way of devising a method of pinpointing the position of a subatomic particle unless you are willing to be quite uncertain about its exact motion. And, in reverse, there is no way of pinpointing a particle’s exact motion unless you are willing to be quite uncertain about its exact position. To calculate both exactly, at the same instant of time, is impossible.
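
  In modern notation the trade-off is usually written Δx · Δp ≥ ħ/2, where ħ is Planck’s constant divided by 2π. As a rough illustration (the particular numbers below are assumed, not taken from the text), pinning an electron down to about the width of an atom forces an enormous blur in its velocity:

```python
# Illustrative sketch: the position-momentum trade-off, delta_x * delta_p >= hbar / 2.
hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg

delta_x = 1e-10    # suppose the electron is located to within about one atomic width, in metres
delta_p = hbar / (2 * delta_x)   # the smallest momentum uncertainty the principle allows
delta_v = delta_p / m_e          # the corresponding blur in velocity
print(f"velocity uncertain by at least {delta_v:.1e} m/s")   # about 5.8e5 m/s

# Halving delta_x doubles delta_v: sharpening the position blurs the motion, and vice versa.
```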

  If Heisenberg was right, then even at absolute zero, there cannot be complete lack of energy. If energy reached zero and particles became completely motionless, then only position need be determined since velocity could be taken as zero. It would be expected, therefore, that some residual zero-point energy must remain, even at absolute zero, to keep particles in motion and, so to speak, uncertain. It is this zero-point energy, which cannot be removed, that is sufficient to keep helium liquid even at absolute zero (see chapter 6).

  In 1930, Einstein showed that the uncertainty principle, which states that it is impossible to reduce the error in position without increasing the error in momentum, implies that it is also impossible to reduce the error in the measurement of energy without increasing the uncertainty of the time during which the measurement can take place. He thought he could use this idea as a springboard for the disproof of the uncertainty principle, but Bohr proceeded to show that Einstein’s attempted disproof was wrong.

  Indeed, Einstein’s version of uncertainty proved very useful, since it meant that in subatomic processes, the law of conservation of energy can be violated for very brief periods of time, provided all is brought back to the conservational state by the end of those periods: the greater the deviation from conservation, the briefer the time-interval allowed. (Yukawa used this notion in working out his theory of pions; see chapter 7.) It was even possible to explain certain subatomic phenomena by assuming that particles are produced out of nothing in defiance of energy conservation, but cease to exist before the time allotted for their detection, so that they are only virtual particles. The theory of virtual particles was worked out in the late 1940s by three men: the American physicists Julian Schwinger and Richard Phillips Feynman, and the Japanese physicist Sinitiro Tomonaga. The three were jointly awarded the 1965 Nobel Prize in physics in consequence.
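
  The energy-time version of the principle, roughly ΔE · Δt ≈ ħ, sets the scale of such borrowings. The sketch below illustrates Yukawa’s case, assuming the standard pion rest energy of about 140 MeV (a figure not given in this passage): borrow enough energy to make a pion, and the loan must be repaid so quickly that the particle can travel only about the width of an atomic nucleus.

```python
# Illustrative sketch: energy-time uncertainty applied to a virtual pion (assumed pion rest energy).
hbar = 1.055e-34                          # reduced Planck constant, J*s
c = 2.998e8                               # speed of light, m/s
pion_rest_energy = 139.6e6 * 1.602e-19    # about 139.6 MeV, expressed in joules

delta_t = hbar / pion_rest_energy         # longest time the borrowed energy may remain unpaid
max_range = c * delta_t                   # farthest the virtual pion could travel in that time
print(f"lifetime ~ {delta_t:.1e} s, range ~ {max_range:.1e} m")
# About 4.7e-24 s and 1.4e-15 m: roughly the observed range of the nuclear force.
```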

  There have even been speculations, since 1976, that the universe began as a tiny, but massive, virtual particle that expanded with extreme quickness and remained in existence. The universe, in this view, formed itself out of Nothing, and we may wonder about there possibly being an infinite number of universes forming (and eventually ending) in an infinite volume of Nothing.

  The uncertainty principle has profoundly affected the thinking of physicists and philosophers. It had a direct bearing on the philosophical question of causality (that is, the relationship of cause and effect). But its implications for science are not those that are commonly supposed. One often reads that the principle of indeterminacy removes all certainty from nature and shows that science after all does not and never can know what is really going on, that scientific knowledge is at the mercy of the unpredictable whims of a universe in which effect does not necessarily follow cause. Whether or not this interpretation is valid from the standpoint of philosophy, the principle of uncertainty has in no way shaken the attitude of scientists toward scientific investigation. If, for instance, the behavior of the individual molecules in a gas cannot be predicted with certainty, nevertheless on the average the molecules do obey certain laws, and their behavior can be predicted on a statistical basis, just as insurance companies can calculate reliable mortality tables even though it is impossible to predict when any particular individual will die.

  In most scientific observations, indeed, the indeterminacy is so small compared with the scale of the measurements involved that it can be neglected for all practical purposes. One can determine simultaneously both the position and the motion of a star, of a planet, of a billiard ball, or even of a grain of sand, with completely satisfactory accuracy.
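
  A quick comparison makes the point (the figures below are illustrative, not from the text): locate an electron and a grain of sand each to within a micrometre, and the unavoidable blur in velocity differs by about twenty-four orders of magnitude.

```python
# Illustrative sketch: the same uncertainty floor for an electron and for a grain of sand.
hbar = 1.055e-34   # reduced Planck constant, J*s
delta_x = 1e-6     # position known to within one micrometre, in metres

for name, mass in [("electron", 9.109e-31), ("grain of sand", 1e-6)]:   # masses in kg (sand ~1 mg)
    delta_v = hbar / (2 * mass * delta_x)
    print(f"{name}: velocity uncertain by at least {delta_v:.1e} m/s")
# electron: about 58 m/s -- enormous on the atomic scale.
# grain of sand: about 5e-23 m/s -- utterly negligible in any practical measurement.
```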

  As for the uncertainty among the subatomic particles themselves, this does not hinder but actually helps physicists. It has been used to explain facts about radioactivity and about the absorption of subatomic particles by nuclei, as well as many other subatomic events, more reasonably than would have been possible without the uncertainty principle.

  The uncertainty principle means that the universe is more complex than was thought, but not that it is irrational.

  Chapter 9

  * * *

  The Machine

  Fire and Steam

  So far in this book, I have been concerned almost entirely with pure science: that is, science as an explanation of the universe about us. Throughout history, however, human beings have been making use of the workings of the universe to increase their own security, comfort, and pleasure. They used those workings, at first, without any proper understanding of them but gradually came to command them through careful observation, common sense, and even hit-and-miss experimentation. Such an application of the workings to human uses is technology, and it antedates science.

  Once science began to grow, however, it became possible to advance technology at ever increasing speed. In modern times, science and technology have grown so intertwined (science advancing technology as it elucidates the laws of nature, and technology advancing science as it produces new instruments and devices for scientists to use) that it is no longer possible to separate them.

  EARLY TECHNOLOGY

  If we go back to the beginning, consider that though the first law of thermodynamics states that energy cannot be created out of nothing, there is no law against turning one form of energy into another. Our whole civilization has been built upon finding new sources of energy and harnessing it for human use in ever more efficient and sophisticated ways. In fact, the greatest single discovery in human history involved methods for converting the chemical energy of a fuel such as wood into heat and light.

  It was perhaps half a million years ago that our hominid ancestors “discovered” fire long before the appearance of Homo sapiens (modern man). No doubt they had encountered—and been put to flight by—lightning-ignited brush fires and forest fires before that. But the discovery of fire’s virtues did not come until curiosity overcame fear.

  There must have come a time when an occasional primitive—perhaps a woman or (most likely) a child—may have been attracted to the quietly burning remnants of such an accidental fire and been amused by playing with it, feeding it sticks, and watching the dancing flames. Undoubtedly, elders would put a stop to this dangerous game until one of them, more imaginative than most, recognized the advantages of taming the flame and turning a childish amusement into adult use. A flame offered light in the darkness and warmth in the cold. It kept predators away. Eventually, people may have found that its heat softened food and made it taste better. (It killed germs and parasites, too, but prehistoric human beings could not know that.)

  For hundreds of thousands of years, human beings could only make use of fire by keeping it going constantly. If a flame accidentally went out, it must have been equivalent to an electrical blackout in modern society. A new flame had to be borrowed from some other tribe, or one had to wait for the lightning to do the job. It was only in comparatively recent times that human beings learned how to make a flame at will where no flame had previously existed, and only then was fire truly tamed (figure 9.1). It was Homo sapiens who accomplished that task in prehistoric times, but exactly when, exactly where, and exactly how we do not know and may never know.

  Figure 9.1. Early firemaking methods.

  In the early days of civilization, fire was used not only for light, warmth, protection and cooking but also eventually for the isolation of metals from their ores and for handling the metals thereafter; for baking pottery and brick; and even for making glass.

  Other important developments heralded the birth of civilization. About 9000 B.C., human beings began to domesticate plants and animals, beginning the practices of agriculture and herding, and thus increased the food supply and, in animals, found a direct energy source. Oxen, donkeys, camels, and eventually horses (to say nothing of reindeer, yaks, water buffalo, llamas, and elephants in various corners of the world) could bring stronger muscles to bear on necessary tasks while using, as fuel, food too coarse for human beings to eat.

  Sometime about 3500 B.C., the wheel was invented (possibly, to begin with, as a potter’s wheel for the molding of pottery). Within a few centuries, certainly by 3000 B.C., wheels were placed on sledges, so that loads that had had to be dragged could now be rolled. Wheels were not a direct source of energy, but they made it possible for far less energy to be lost in overcoming friction.

  By that time, too, primitive rafts or dugouts were being used to allow the energy of running water to transport loads. By 2000 B.C. perhaps, sails were used to catch the wind, so that moving air could hasten the transport or even force the ship to move against a slow current. By 1000 B.C., the Phoenicians in their ships were plowing the full length of the Mediterranean Sea.

  In 50 B.C. or thereabouts, the Romans began to make use of waterwheels. A quickly running stream could be made to turn a wheel, which could in turn be made to turn other wheels that would do work—grind grain, crush ore, pump water, and so on. Windmills also began to come into use at this time, devices in which moving air rather than moving water turns the wheel. (Quickly running streams are rare, but wind is almost everywhere.) In medieval times, windmills were an important source of energy in western Europe. It was in medieval times, too, that human beings first began to burn the black rock called coal in metallurgical furnaces, to employ magnetic energy in the ship’s compass (which eventually made possible the great voyages of exploration), and to use chemical energy in warfare.

  The first use of chemical energy for destruction (past the simple technique of firing flame-tipped arrows) came about in A.D. 670, when a Syrian alchemist, Callinicus, is believed to have invented Greek fire, a primitive incendiary bomb composed of sulfur and naphtha, which was credited with saving Constantinople from its first siege by the Moslems in 673. Gunpowder arrived in Europe in the thirteenth century. Roger Bacon described it about 1280, but it had been known in Asia for centuries before that and may have been introduced to Europe by the Mongol invasions beginning in 1240. In any case, artillery powered by gunpowder came into use in Europe in the fourteenth century, and cannons are supposed to have appeared first at the battle of Crécy in 1346.

  The most important of all the medieval inventions is the one credited to Johann Gutenberg of Germany. About 1450, he cast the first movable type and thereby introduced printing as a powerful force in human affairs. He also devised printer’s ink, in which carbon black was suspended in linseed oil rather than, as hitherto, in water. Together with the replacement of parchment by paper (which had been invented by a Chinese eunuch, Ts’ai Lun—according to tradition—about A.D. 50 and which reached medieval Europe, by way of the Arabs, in the thirteenth century), these inventions made possible the large-scale production and distribution of books and other written material. No invention prior to modern times was adopted so rapidly. Within a generation, 40,000 books were in print.

  The recorded knowledge of mankind was no longer buried in royal collections of manuscripts but was made accessible in libraries available to all who could read. Pamphlets began to create and give expression to public opinion. (Printing was largely responsible for the success of Martin Luther’s revolt against the papacy in 1517, which might otherwise have been nothing more than a private quarrel among monks.) And it was printing that created one of the prime instruments that gave rise to science as we know it. That indispensable instrument is the wide communication of ideas. Science had been a matter of personal communications among a few devotees; now it became a major field of activity, which enlisted more and more workers into an eventually worldwide scientific community, elicited the prompt and critical testing of theories, and ceaselessly opened new frontiers.

 
