Asimov's New Guide to Science
Page 52
THE NATURE OF LIGHT
When light enters glass, or some other transparent substance, obliquely—that is, at an angle to the vertical—it is always refracted into a path that forms a smaller angle to the vertical. The exact relationship between the original angle and the refracted angle was first worked out in 1621 by the Dutch physicist Willebrord Snell. He did not publish his finding, and the French philosopher René Descartes discovered the law independently in 1637.
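Snell's relationship can be put in modern terms: the sine of the original angle bears a constant ratio to the sine of the refracted angle, the ratio being the refractive index of the substance. A minimal sketch, assuming a typical refractive index of 1.5 for glass (a value not given in the text):

```python
import math

def refract(angle_in_deg, n1=1.0, n2=1.5):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Angles are measured from the vertical (the normal), in degrees."""
    s = n1 * math.sin(math.radians(angle_in_deg)) / n2
    return math.degrees(math.asin(s))

# A ray striking glass at 45 degrees to the vertical bends toward it:
print(round(refract(45.0), 1))  # about 28.1 degrees
```

The refracted path always forms the smaller angle to the vertical, just as the text describes.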
The first important experiments on the nature of light were conducted by Isaac Newton in 1666, as I have already mentioned in chapter 2. He let a beam of sunlight, entering a dark room through a chink in a blind, fall obliquely on one face of a triangular glass prism. The beam was refracted when it entered the glass and then refracted still farther in the same direction when it emerged from a second face of the prism. (The two refractions in the same direction arose because the two sides of the prism met at an angle instead of being parallel, as would have been the case in an ordinary sheet of glass.) Newton caught the emerging beam on a white screen to see the effect of the reinforced refraction. He found that, instead of forming a spot of white light, the beam was spread out in a band of colors—red, orange, yellow, green, blue, and violet, in that order.
Newton deduced that ordinary white light is a mixture of different kinds of light which, separately, affect our eyes so as to produce the sensation of different colors. This band of colors, though it looks real enough, is immaterial, as immaterial as a ghost; and, indeed, Newton’s name for it—spectrum—comes from a Latin word meaning “ghost.”
Newton decided that light consisted of tiny particles (corpuscles) traveling at enormous speed. This would explain why light travels in straight lines and casts sharp shadows. It is reflected by a mirror because the particles bounce off the surface, and it is bent on entering a refracting medium (such as water or glass) because the particles travel faster in such a medium than in air.
Still, there were awkward questions. Why should the particles of green light, say, be refracted more than those of yellow light? Why can two beams of light cross without affecting each other—that is, without the particles colliding?
In 1678, the Dutch physicist Christiaan Huygens (a versatile scientist who had built the first pendulum clock and done important work in astronomy) suggested an opposing theory, namely, that light consists of tiny waves. If it is made up of waves, there is no difficulty about explaining the different amount of refraction of different kinds of light through a refracting medium, provided it is assumed that light travels more slowly through the refracting medium than through air. The amount of refraction would vary with the length of the waves: the shorter the wavelength, the greater the refraction. Hence violet light (the most refracted) would have a shorter wavelength than blue light, blue shorter than green, and so on. It is this difference in wavelength, Huygens thought, that distinguishes the colors to the eye. And, of course, if light consists of waves, two beams could cross without trouble. (After all, sound waves and water waves cross without losing their identity.)
But Huygens’s wave theory was not very satisfactory either. It did not explain why light rays travel in straight lines and cast sharp shadows, nor why light waves cannot go around obstacles, as water waves and sound waves can. Furthermore, if light consists of waves, how can it travel through a vacuum as it certainly seemed to do in coming to us through space from the sun and stars? What medium was it waving?
For about a century, the two theories contended with each other. Newton’s corpuscular theory was by far the more popular, partly because it seemed on the whole more logical, and partly because it had the support of Newton’s great name. But, in 1801, an English physician and physicist, Thomas Young, performed an experiment that swung opinion the other way. He projected a narrow beam of light through two closely spaced holes toward a screen behind. If light consisted of particles, presumably the two beams emerging through the holes would simply produce a brighter region on the screen where they overlapped and less bright regions where they did not. But this was not what Young found. The screen showed a series of bands of light, each separated from the next by a dark band. It seemed that in these dark intervals, the light of the two beams together added up to darkness!
The wave theory would easily explain this effect. The bright band represented the reinforcement of waves of one beam by waves of the other; in other words, the two sets of waves were in phase, both peaks together and strengthening each other. The dark bands, on the other hand, represented places where the waves were out of phase, the trough of one canceling the peak of the other. Instead of reinforcing each other, the waves at these places interfered with each other, leaving the net light energy there zero.
From the width of the bands and the distance between the two holes through which the beams issued, it was possible to calculate the length of light-waves—say, of red light or violet or colors between. The wavelengths turned out to be very small indeed. The wavelength of red light, for example, came to about 0.000075 centimeter, or 0.000030 inch. (Eventually the wavelengths of light were expressed in a convenient unit suggested by Ångström. The unit, called the angstrom—abbreviated Å—is one hundred-millionth of a centimeter.
Thus, the wavelength of red light at one end of the spectrum is about 7,500 angstrom units; the wavelength of violet light at the other end is about 3,900 angstrom units; and the color wavelengths of the visible spectrum lie between these numbers.)
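The geometry behind Young's calculation can be sketched with illustrative figures (the hole separation and screen distance below are assumptions, not Young's actual numbers). The spacing between bright bands equals the wavelength times the screen distance divided by the hole separation, so measuring the bands and the geometry yields the wavelength:

```python
# Illustrative numbers, not Young's own: hole separation d, screen distance L.
d = 0.2e-3           # 0.2 millimeter between the two holes (assumed)
L = 1.0              # screen 1 meter behind the holes (assumed)
wavelength = 7.5e-7  # red light, 7,500 angstroms, in meters

fringe_spacing = wavelength * L / d  # distance between bright bands
print(f"{fringe_spacing * 1000:.2f} mm")  # 3.75 mm

# Run the formula backward to recover the wavelength from the bands:
recovered = fringe_spacing * d / L
print(f"{recovered * 1e10:.0f} angstroms")  # 7500 angstroms
```

The bands are millimeters apart even though the waves themselves are ten-thousandths of a millimeter long, which is what made the measurement feasible.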
The shortness of the wavelengths is very important. The reason light-waves travel in straight lines and cast sharp shadows is that they are incomparably smaller than ordinary objects; waves can curve around an obstruction only when that obstruction is not much larger than the wavelength. Even bacteria, for instance, are vastly wider than a wavelength of light, so light can define them sharply under a microscope. Only objects somewhere near a wavelength of light in size (for example, viruses and other submicroscopic particles) are small enough for light-waves to pass around them.
It was the French physicist Augustin Jean Fresnel who showed (in 1818) that, if an interfering object is small enough, a light-wave will indeed travel around it. In that case, the light produces what is called a diffraction pattern. For instance, the very fine parallel lines of a diffraction grating act as a series of tiny obstacles that reinforce one another. Since the amount of diffraction depends on the wavelength, a spectrum is produced. From the amount by which any color or portion of the spectrum is diffracted, and from the known separation of the scratches on the glass, the wavelength can again be calculated.
Fraunhofer pioneered in the use of such diffraction gratings, an advance generally forgotten in the light of his more famous discovery of spectral lines. The American physicist Henry Augustus Rowland invented concave gratings and developed techniques for ruling them with as many as 20,000 lines to the inch. It was his work that made it possible for the prism to be supplanted in spectroscopy.
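The grating calculation can be sketched with the standard grating relation (groove spacing times the sine of the diffraction angle equals the wavelength, for the first-order spectrum), using Rowland's figure of 20,000 lines to the inch from the text:

```python
import math

lines_per_inch = 20_000      # Rowland's finest rulings (from the text)
d = 0.0254 / lines_per_inch  # groove spacing in meters, about 1.27 micrometers

def diffraction_angle(wavelength_m, order=1):
    """Grating equation d * sin(theta) = m * lambda, solved for theta in degrees."""
    return math.degrees(math.asin(order * wavelength_m / d))

red = 7.5e-7     # 7,500 angstroms
violet = 3.9e-7  # 3,900 angstroms
print(round(diffraction_angle(red), 1))     # about 36.2 degrees
print(round(diffraction_angle(violet), 1))  # about 17.9 degrees
```

Since the two colors emerge at measurably different angles, the grating spreads white light into a spectrum, and a measured angle plus the known groove spacing gives back the wavelength.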
Between such experimental findings and the fact that Fresnel systematically worked out the mathematics of wave motion, the wave theory of light seemed established and the corpuscular theory smashed—apparently for good.
Not only were light waves accepted as existing, their length was measured with increasing precision. By 1827, the French physicist Jacques Babinet was suggesting that the wavelength of light—an unalterable physical quantity—be used as the standard for measurement of length, instead of the various arbitrary standards that were then used. This suggestion did not become practicable, however, until the 1880s, when the German-American physicist Albert Abraham Michelson invented an instrument called the interferometer, which could measure the wavelengths of light with unprecedented accuracy. In 1893, Michelson measured the wavelength of the red line in the cadmium spectrum and found it to be 1/1,553,164 meter long.
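Michelson's figure is easy to check against the angstrom unit:

```python
# Michelson's result: the red cadmium line fits 1,553,164 times into a meter.
waves_per_meter = 1_553_164
wavelength_m = 1 / waves_per_meter
wavelength_angstroms = wavelength_m * 1e10  # 1 angstrom = 1e-10 meter
print(f"{wavelength_angstroms:.2f} angstroms")  # about 6438.47
```

The result, about 6,438 angstroms, falls where it should: in the red region of the spectrum, between the 7,500- and 3,900-angstrom limits quoted above.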
A measure of uncertainty still existed when it was discovered that elements consist of different isotopes, each contributing a line of slightly different wavelength. As the twentieth century progressed, however, the spectral lines of individual isotopes were measured. In the 1930s, the lines of krypton 86 were measured. This isotope, being that of a gas, could be dealt with at low temperatures, where atomic motion is slowed, with less consequent thickening of the line.
In 1960, the krypton-86 line was adopted by the General Conference of Weights and Measures as the fundamental standard of length. The meter has been redefined as equal to 1,650,763.73 wavelengths of this spectral line. This standard has increased the precision of measurement of length a thousandfold. The old standard meter bar could be measured, at best, to within one part in a million, whereas the light wave can be measured to within one part in a billion.
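The definition can be turned around to give the wavelength of the standard line itself:

```python
# The 1960 definition: 1 meter = 1,650,763.73 wavelengths of the krypton-86 line.
waves_per_meter = 1_650_763.73
wavelength_angstroms = 1e10 / waves_per_meter
print(f"{wavelength_angstroms:.1f} angstroms")  # about 6057.8
```

The standard line thus has a wavelength of about 6,058 angstroms, in the orange-red part of the visible spectrum.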
THE SPEED OF LIGHT
Light obviously travels at tremendous speeds. If you put out a light, it gets dark everywhere at once, as nearly as can be made out. Sound does not travel as fast. If you watch a man in the distance chopping wood, you do not hear the stroke until some moments after the ax has struck. Sound has clearly taken a certain amount of time to travel to the ear. In fact, its speed of travel is easy to measure: 1,090 feet per second, or about 750 miles per hour, in the air at sea level.
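The two figures for the speed of sound agree, as a quick conversion shows:

```python
feet_per_second = 1_090
mph = feet_per_second * 3600 / 5280  # 3,600 seconds per hour, 5,280 feet per mile
print(round(mph))  # 743, i.e., "about 750 miles per hour"
```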
Galileo was the first to try to measure the speed of light. Standing on one hill while an assistant stood on another, he would uncover a lantern; as soon as the assistant saw the flash, he would signal by uncovering a light of his own. Galileo did this at greater and greater distances, assuming that the time it took the assistant to make his response would remain uniform, and therefore that any increase in the interval between his uncovering his own lantern and seeing the responding flash would represent the time taken by the light to cover the extra distance. The idea was sound, but light travels much too fast for Galileo to have detected any difference by this crude method.
In 1676, the Danish astronomer Olaus Roemer did succeed in timing the speed of light—on an astronomical distance scale. Studying Jupiter’s eclipses of its four large satellites, Roemer noticed that the interval between successive eclipses became longer when the earth was moving away from Jupiter, and shorter when it was moving toward Jupiter in its orbit. Presumably the difference in eclipse times reflected the difference in distance between the earth and Jupiter: that is, it would be a measure of that difference in distance in terms of the time light takes to travel between Jupiter and the earth. From a rough estimate of the size of the earth’s orbit, and from the maximum discrepancy in the eclipse timing, which Roemer took to represent the time it takes light to cross the full width of the earth’s orbit, he calculated the speed of light. His estimate came to 132,000 miles per second, remarkably close to the actual speed for what might be considered a first try, and high enough to evoke the disbelief of his contemporaries.
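Roemer’s method amounts to dividing the width of the earth’s orbit by the maximum timing discrepancy. A sketch using modern round figures (assumed values; the text gives neither):

```python
orbit_diameter_miles = 186_000_000  # modern round figure for the orbit's full width (assumed)
speed_of_light = 186_282            # miles per second, modern value

# With modern values, light takes about a quarter hour to cross the orbit:
crossing_seconds = orbit_diameter_miles / speed_of_light
print(f"{crossing_seconds / 60:.1f} minutes")  # about 16.6

# Roemer's low figure of 132,000 mi/s implies he took the crossing time to be longer:
roemer_seconds = orbit_diameter_miles / 132_000
print(f"{roemer_seconds / 60:.1f} minutes")  # about 23.5
```

A timing discrepancy of minutes, spread over months of eclipse observations, was large enough to measure even with seventeenth-century clocks, which is why the method worked where Galileo’s lanterns could not.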
Roemer’s results were, however, confirmed a half-century later from a completely different direction. In 1728, the British astronomer James Bradley found that stars seem to shift position because of the earth’s motion—not through parallax, but because the velocity of the earth’s motion about the sun is a measurable (though small) fraction of the speed of light. The analogy usually used is that of a man under an umbrella striding through a rainstorm. Even though the drops are falling vertically, the man must tip the umbrella forward, for he is stepping into the drops. The faster he walks, the farther he must tip the umbrella. Similarly, the earth moves into the light rays falling from the stars, and the astronomer must tip the telescope a bit, and in different directions, as the earth changes its direction of motion. From the amount of tip (the aberration of light), Bradley could estimate the value of the speed of light at 176,000 miles a second—a higher, and more accurate, value than Roemer’s, though still about 5.5 percent too low.
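The umbrella analogy can be put in numbers. Assuming the modern figure of about 18.5 miles per second for the earth’s orbital speed (a value not given in the text), the tilt comes out to the tiny angle Bradley had to measure:

```python
import math

earth_orbital_speed = 18.5  # miles per second, modern round figure (assumed)
speed_of_light = 186_282    # miles per second, modern value

# The umbrella analogy in numbers: tilt angle = arctan(v / c)
tilt_radians = math.atan(earth_orbital_speed / speed_of_light)
tilt_arcseconds = math.degrees(tilt_radians) * 3600
print(f"{tilt_arcseconds:.1f} arcseconds")  # about 20.5, tiny but measurable
```

Turning the calculation around, a measured tilt plus the known orbital speed yields the speed of light, which is how Bradley arrived at his estimate.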
Eventually, scientists obtained still more accurate measurements by applying refinements of Galileo’s original idea. In 1849, the French physicist Armand Hippolyte Louis Fizeau set up an arrangement whereby a light was flashed to a mirror 5 miles away and reflected back to the observer. The elapsed time for the 10-mile round trip of the flash was not much more than 1/20,000 of a second, but Fizeau was able to measure it by placing a rapidly rotating toothed wheel in the path of the light beam. When the wheel turned at a certain speed, the flash going out between the two teeth would hit the next tooth when it came back from the mirror, and so Fizeau, behind the wheel, would not see it. When the wheel was speeded up, the returning flash would not be blocked but would come through the next gap between teeth (figure 8.1). Thus, by controlling and measuring the speed of the turning wheel, Fizeau was able to calculate the elapsed time, and therefore the speed of travel, of the flash of light. He found it to be 196,000 miles a second, which was 5.2 percent too high.
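The arithmetic of the toothed wheel can be sketched with the figures commonly quoted for Fizeau’s setup (assumed values; the text gives only the 5-mile distance):

```python
# Commonly quoted figures for Fizeau's apparatus (assumed, not from the text):
distance_m = 8_633  # one-way path in meters (about 5 miles)
teeth = 720         # teeth on the rotating wheel
rev_per_sec = 12.6  # rotation rate at which the return flash was first blocked

# At extinction, the round trip takes as long as the wheel needs to advance
# from the middle of a gap to the next tooth: 1/(2 * teeth) of a turn.
round_trip_time = 1 / (2 * teeth * rev_per_sec)
speed = 2 * distance_m / round_trip_time
print(f"{speed / 1000:,.0f} km/s")  # roughly 313,000 km/s
```

The result, roughly 313,000 kilometers per second, corresponds to the somewhat-too-high figure of about 196,000 miles per second quoted above.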
Figure 8.1. Fizeau’s arrangement for measuring the speed of light. Light reflected by the semitransparent mirror near the source passes through a gap in the rapidly spinning toothed wheel to a distant mirror (right) and is reflected back to the next tooth or the next gap.
A year later, Jean Foucault (who was soon to perform his pendulum experiment; see chapter 4) refined the measurement by using a rotating mirror instead of a toothed wheel. Now the elapsed time was measured by a slight shift in the angle of reflection by the rapidly turning mirror (figure 8.2). Foucault’s best measurement, in 1862, was 185,000 miles per second for the speed of light in air—only 0.7 percent too low. In addition, Foucault used his method to determine the speed of light through various liquids. He found the speed to be markedly less than the speed of light in air. This finding fitted Huygens’s wave theory, too.
Figure 8.2. Foucault’s method. The amount of rotation of the mirror, instead of Fizeau’s toothed wheel, gave the speed of the light’s travel.
Still greater precision in the measurement of light’s velocity came with the work of Michelson, who—over a period of more than forty years, starting in 1879—applied the Fizeau-Foucault approach with ever greater refinement. He eventually sent light through a vacuum rather than through air (even air slows it up slightly), using evacuated steel pipes up to a mile long for the purpose. He measured the speed of light in a vacuum to be 186,271 miles per second—only 0.006 percent too low. He was also to show that all wavelengths of light travel at the same speed in a vacuum.
In 1972, a research team under Kenneth M. Evenson made still more precise measurements and found the speed of light to be 186,282.3959 miles per second. Once the speed of light was known with such amazing precision, it became possible to use light, or at least forms of it, to measure distance. (It was practical to do so even when the speed was known less precisely.)
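Taking Evenson’s figure as the yardstick, the percentage errors quoted above for the earlier measurements can be checked:

```python
# Checking the percentage errors quoted in the text against Evenson's 1972 figure.
true_c = 186_282.3959  # miles per second

measurements = {
    "Bradley (1728)":  176_000,
    "Fizeau (1849)":   196_000,
    "Foucault (1862)": 185_000,
    "Michelson":       186_271,
}
for name, value in measurements.items():
    error = (value - true_c) / true_c * 100
    print(f"{name}: {error:+.3f}%")
# Bradley about -5.5%, Fizeau +5.2%, Foucault -0.7%, Michelson -0.006%,
# matching the figures given in the text.
```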
RADAR
Imagine a short pulse of light moving outward, striking some obstacle, being reflected backward, and being received at the point where it has issued forth an instant before. What is needed is a wave form of low enough frequency to penetrate fog, mist, and cloud, but of high enough frequency to be reflected efficiently. The ideal range was found to be in the microwave region, with wavelengths of from 0.2 to 40 inches. From the time lapse between emission of the pulse and return of the echo, the distance of the reflecting object can be estimated.
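The range calculation is simple: multiply the speed of light by the echo delay, then halve the result, since the pulse travels out and back. A minimal sketch:

```python
SPEED_OF_LIGHT = 186_282  # miles per second; microwaves travel at light speed

def radar_range_miles(echo_delay_seconds):
    """Distance = speed * time, halved because the pulse goes out and back."""
    return SPEED_OF_LIGHT * echo_delay_seconds / 2

# An echo returning after one-thousandth of a second:
print(round(radar_range_miles(0.001), 1))  # about 93.1 miles away
```

Because the pulse moves so fast, even distant targets answer within milliseconds, which is why the timing electronics, not the principle, were the hard part.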
A number of physicists worked on devices making use of this principle, but the Scottish physicist Robert Alexander Watson-Watt was the first to make it thoroughly practicable. By 1935, he had made it possible to follow an airplane by the microwave reflections it sent back. The system was called radio detection and ranging, the word range meaning “to determine the distance of.” The phrase was abbreviated to ra-d-a-r, or radar. (A word, such as radar, that is constructed out of the initials of a phrase is called an acronym. Acronyms have become common in the modern world, particularly in science and technology.)
The world first became conscious of radar when it was learned that, by using that device, the British had been able to detect oncoming Nazi planes during the Battle of Britain, despite night and fog. To radar therefore belongs at least part of the credit of the British victory.
Since the Second World War, radar has had numerous peacetime uses. It has been used to detect rainstorms and has helped weather forecasters in this respect. It has turned up mysterious reflections called angels, which turned out to be, not heavenly messengers, but flocks of birds, so that now radar is used in the study of bird migrations.
And, as I described in chapter 3, it was radar reflections from Venus and Mercury that gave astronomers new knowledge concerning the rotations of those planets and, with regard to Venus, information about the nature of the surface.
LIGHT-WAVES THROUGH SPACE
Through all the mounting evidence of the wave nature of light, a nagging question continued to bother physicists. How is light transmitted through a vacuum? Other kinds of wave—sound, for instance—require a material medium. We derive the sensation of sound by the vibration, back and forth, of the atoms or molecules of the medium through which it travels. (From our observation platform here on Earth, we can never hear an explosion, however loud, on the moon or anywhere else in space because sound waves cannot travel across empty space.) Yet here were light-waves traveling through a vacuum more easily than through matter, and reaching us from galaxies billions of light-years away, although there was nothing there to wave.
Classical scientists were always uncomfortable about the notion of “action at a distance.” Newton, for instance, worried about how the force of gravity could operate through space. As a possible explanation, he revived the Greeks’ idea of an ether filling the heavens and speculated that perhaps the force of gravity might somehow be conducted by the ether. He avoided the light problem by supposing light to consist of speeding particles, but that idea fell through when light was eventually found to be a wave phenomenon.