
Asimov's New Guide to Science


by Isaac Asimov


  Naturally, Galileo’s findings met with considerable opposition; for by the older view, they seemed blasphemous. A German astronomer, Christoph Scheiner, who also observed the spots, suggested they were not part of the sun but were small bodies that orbited about the sun and showed up darkly against its glowing disk. Galileo won that debate, however.

  In 1774, a Scottish astronomer, Alexander Wilson, noted that a large sunspot near the edge of the sun, when it was seen sideways, looked concave, as though it were a crater on the sun. This point was taken up in 1795 by Herschel, who suggested that the sun was a dark, cold body with a flaming layer of gases all about it. The sunspots, by this view, were holes through which the cold body below could be seen. Herschel speculated that the cold body might even be inhabited by living beings. (Note how even brilliant scientists can come up with daring suggestions that seem reasonable in the light of the knowledge of the time, but nevertheless turn out to be ludicrously wrong as further evidence on the subject accumulates.)

  Actually, sunspots are not really black. They are areas of the solar surface that are cooler than the rest so that they look dark in comparison. If, however, Mercury or Venus moves between us and the sun, each shows up on the solar disk as a small, really black circle; and if that circle moves near a sunspot, one can then see that the spot is not truly black.

  Still, even totally wrong notions can be useful, for Herschel’s idea served to increase interest in the sunspots.

  The real breakthrough, however, came with a German pharmacist, Samuel Heinrich Schwabe, whose hobby was astronomy. Since he worked all day, he could not sit up all night looking at the stars. He cast about for a daytime task and decided to observe the solar disk and look for planets near the sun that might demonstrate their existence by crossing in front of it.

  In 1825, he started observing the sun, and could not help noting the sunspots. After a while, he forgot about the planets and began sketching the sunspots, which changed in position and shape from day to day. He spent no less than seventeen years observing the sun on every day that was not completely cloudy.

  By 1843, he was able to announce that the sunspots did not appear utterly at random: there was a cycle. Year after year, there were more and more sunspots till a peak was reached. Then the number declined until there were almost none; whereupon a new cycle started. We now know that the cycle is somewhat irregular but averages out to about eleven years. Schwabe’s announcement was ignored (he was only a pharmacist, after all) until the well-known scientist Alexander von Humboldt mentioned the cycle in 1851 in his book Kosmos, a large overview of science.

  At this time, the Scottish-German astronomer Johann von Lamont was measuring the intensity of Earth’s magnetic field and found that it was rising and falling in regular fashion. In 1852, a British physicist, Edward Sabine, pointed out that this cycle kept time with the sunspot cycle.

  It thus appeared that sunspots affect Earth, and they began to be studied with intense interest. Each year came to be given a Zurich sunspot number according to a formula first worked out in 1849 by a Swiss astronomer, Rudolf Wolf, who worked in Zurich. (He was the first to point out that the incidence of auroras also rose and fell in time to the sunspot cycle.)

  The sunspots seem to be connected with the sun’s magnetic field and to appear at the point of emergence of magnetic lines of force. In 1908, three centuries after the discovery of sunspots, G. E. Hale detected a strong magnetic field associated with sunspots. Why the magnetic field of the sun should behave as it does, emerging from the surface at odd times and places and increasing and decreasing in intensity in a somewhat irregular cycle, still remains among the solar puzzles that have so far defied solution.

  In 1893, the English astronomer Edward Walter Maunder was checking through early reports in order to set up data for the sunspot cycle in the first century after Galileo’s discovery. He was astonished to find that there were virtually no reports on sunspots between the years 1645 and 1715. Important astronomers, such as Cassini, looked for them and commented on their failure to see any. Maunder published his findings in 1894, and again in 1922, but no attention was paid to his work. The sunspot cycle was so well established that it seemed unbelievable that there could have been a seven-decade period in which hardly any appeared.

  In the 1970s, the American astronomer John A. Eddy came across this report and, checking into it, discovered that there actually seemed to have been what came to be called a Maunder minimum. He not only repeated Maunder’s researches but investigated reports of naked-eye sightings of particularly large sunspots from many regions, including the Far East—data that had been unavailable to Maunder. Such records go back to the fifth century B.C. and generally yield five to ten sightings per century. There are gaps, and one of those gaps spans the Maunder minimum.

  Eddy checked reports on auroras, too. These rise and fall in frequency and intensity with the sunspot cycle. It turned out there were many reports after 1715, and quite a few before 1645, but just about none in between.

  Again, when the sun is magnetically active and there are many sunspots, the corona is full of streamers of light and is very beautiful. In the absence of sunspots, the corona seems a rather featureless haze. The corona can be seen during solar eclipses; and while few astronomers traveled to view such eclipses in the seventeenth century, such reports as existed during the Maunder minimum were invariably of coronas of the kind associated with few or no sunspots.

  Finally, at the time of sunspot maxima, there is a chain of events that succeeds in producing carbon-14 (a variety of carbon that I shall mention in the next chapter) in smaller quantities than usual. It is possible to analyze tree rings for carbon-14 content and to judge the existence of sunspot maxima and minima by fall and rise of carbon-14 content, respectively. Such analysis also produced evidence for the existence of the Maunder minimum and, indeed, numerous Maunder minima in earlier centuries.

  Eddy reported that there seem to have been some twelve periods over the last five thousand years in which there were Maunder minima enduring from fifty to a couple of hundred years each. There was one such between 1400 and 1510, for instance.

  Since sunspot cycles have an effect on Earth, we might ask what effect Maunder minima have. It may be that they are associated with cold periods. The winters were so cold in Europe in the first decade of the 1700s that it was called the little ice age. It was also cold during the 1400-1510 minimum, when the Norse colony in Greenland died out because the weather simply got too bad for survival.

  The Moon

  When, in 1543, Copernicus placed the sun at the center of the solar system, only the moon was left to owe allegiance to Earth, which, for so long previously, had been assumed to be the center.

  The moon circles Earth (relative to the stars) in 27.32 days. It turns on its own axis in precisely that same period. This equality between its period of revolution and rotation results in its perpetually presenting the same face to Earth. This equality of revolution and rotation is not a coincidence. It is the result of Earth’s tidal effect on its moon, as I shall explain later.

  The moon’s revolution with respect to the stars is the sidereal month. However, as the moon revolves about Earth, Earth revolves about the sun. By the time the moon has made one revolution about Earth, the sun has moved somewhat in its sky because of Earth’s motion (which has dragged the moon with it). The moon must continue its revolution for about 2½ days before it catches up with the sun and is back in the same spot relative to the sun it was in before. The moon’s revolution about Earth with respect to the sun is the synodic month, which is 29.53 days long.
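  As a rough check of these figures, the two months are tied together by the length of the year, about 365.25 days, since the extra angle the moon must cover is just the angle the sun appears to move in one sidereal month:

\[
\frac{1}{T_{\text{synodic}}} = \frac{1}{T_{\text{sidereal}}} - \frac{1}{T_{\text{year}}} = \frac{1}{27.32} - \frac{1}{365.25} \approx \frac{1}{29.53}\ \text{days}^{-1},
\]

which reproduces the 29.53-day synodic month quoted above.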

  The synodic month was more important to humanity than the sidereal, for as the moon revolves about Earth, the face we see experiences a steadily changing angle of sunlight, and that angle depends on its revolution with respect to the sun. It undergoes a succession of phases. At the beginning of a month, the moon is located just east of the sun and appears as a very thin crescent visible just after sunset. From night to night it moves farther from the sun, and the crescent thickens. Eventually, the lighted portion of the moon is a semicircle, and then it moves beyond that. When the moon has moved so that it is in that portion of the sky directly opposite to that of the sun, the sunlight shines upon the moon over Earth’s shoulder (so to speak) and the entire visible face of the moon is lit up: that full circle of light is the full moon.

  Next the shade encroaches from the side of the moon where the crescent first appeared. Night after night, the moon’s lighted portion shrinks, until it is a half-moon again, with the light on the side opposite to where it was on the earlier half-moon. Finally, the moon ends up just west of the sun and appears in the sky just before dawn as a crescent curving in the opposite direction from that which it had formed at first. The moon then moves past the sun and shows up as a crescent just after sunset, and the whole set of changes starts over.

  The entire cycle of phase change lasts 29½ days, the length of the synodic month, and formed the basis of humanity’s earliest calendars.

  Human beings first assumed that the moon was really waxing and waning, growing and fading as the phases changed. It was even assumed that, each time a crescent appeared in the western sky after sunset, it was literally a new moon, and it is still called that today.

  The ancient Greek astronomers realized, however, that the moon must be a globe, that the changes in phase arose from the fact that it shone only by reflecting sunlight, and that the changing position of the moon in the sky with respect to the sun accounted for the phases exactly. This was a most important fact. The Greek philosophers, notably Aristotle, tried to differentiate Earth from the heavenly bodies by demonstrating that the properties of Earth were altogether different from those the heavenly bodies held in common. Thus, Earth was dark and gave off no light, while the heavenly bodies all gave off light. Aristotle thought the heavenly bodies were made of a substance he called aether (from a Greek word for “glowing” or “blazing”), which was fundamentally different from the materials that made up Earth. And yet the cycle of the phases of the moon showed that the moon, like Earth, gave off no light of its own and glowed only because it reflected sunlight. Thus, the moon at least was Earthlike in this respect.

  What’s more, occasionally the sun and the moon were so precisely on opposite sides of Earth that the sun’s light was blocked by Earth and could not reach the moon. The moon (always at full moon) passed into Earth’s shadow and was eclipsed.

  In primitive times, it was thought the moon was being swallowed by some malign force and would disappear altogether and forever. It was a frightening phenomenon; and it was an early victory of science to be able to predict an eclipse and to show that it was a natural phenomenon with an easily understood explanation. (It is thought by some that Stonehenge was, among other things, a primitive Stone Age observatory which could be used to predict the coming of lunar eclipses by the shifting of positions of the sun and the moon relative to the regularly placed stones of the structure.)

  In fact, when the moon is a crescent, it is sometimes possible to see its remainder dimly outlined in ruddy light. It was Galileo who suggested that Earth, like the moon, must reflect sunlight and shine, and that the portion of the moon unlit by the sun was dimly lit by Earthlight. This would be visible only when so little of the sunlit portion could be seen that its light would not wash out the much dimmer Earthlight. Not only, then, was the moon non-luminous like Earth, but Earth reflected sunlight and would show phases like the moon (if viewed from the moon).

  Another supposed fundamental difference between Earth and the heavenly bodies was that Earth was flawed, imperfect, and forever changing while the heavenly bodies were perfect and unchanging.

  Only the sun and the moon appear to the unaided eye to be anything more than dots of light. Of the two, the sun appears to be a perfect circle of perfect light. The moon, however—even discounting the phases—is not perfect. When the full moon shines, and the moon seems a perfect circle of light, it is nevertheless clearly not perfect. There are smudges upon its softly glowing surface, which detract from the notion of perfection. Primitive man made pictures out of the smudges, each different culture coming up with a different picture. Human self-love is such that people frequently saw the smudges as forming the picture of a human being, and we still speak of the “man in the moon.”

  It was Galileo who, in 1609, looked through a telescope at the sky for the first time and turned it on the moon to see mountains, craters, and flat areas (which he took to be seas or, in Latin, maria). This was the final indication that the moon was not a “perfect” heavenly body, fundamentally different from Earth, but was an Earthlike world.

  This realization did not in itself totally demolish the older view, however. The Greeks had noted that there were several objects in the sky that steadily shifted position against the stars generally, and that, of them all, the moon shifted position most rapidly. They assumed that it did so because it was closer to Earth than any other heavenly body was (and in this the Greeks were right). It might be argued that the moon, because of its closeness to Earth, was somewhat polluted by Earth’s imperfections, that it suffered from proximity. It was not till Galileo discovered spots on the sun that the notion of heavenly perfection really shivered.

  MEASURING THE MOON

  But if the moon was the closest body to Earth, how close was it? Of the ancient Greek astronomers who tried to determine that distance, Hipparchus worked out essentially the right answer. Its average distance from Earth is now known to be 238,900 miles, or about 9.6 times Earth’s circumference.
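  That ratio is easy to verify roughly: Earth’s circumference is close to 24,900 miles, and

\[
\frac{238{,}900\ \text{miles}}{24{,}900\ \text{miles}} \approx 9.6 .
\]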

  If the moon’s orbit were circular, that would be its distance at all times. The moon’s orbit, however, is somewhat elliptical, and Earth is not at the center of the ellipse but at one of the foci, which are off-center. The moon approaches Earth slightly in one-half of its orbit and recedes from it in the other half. At its closest point (perigee), the moon is but 221,500 miles from Earth, and at its farthest point (apogee), 252,700 miles.

  The moon is, as the Greeks surmised, by far the closest to Earth of all the heavenly bodies. Even if we forget the stars and consider only the solar system, the moon is, relatively speaking, in our backyard. The moon’s diameter (judging from its distance and its apparent size) is 2,160 miles. Earth’s globe is 3.65 times as broad, and the sun’s is 412 times as broad. It just happens that the sun’s distance from Earth is about 390 times that of the moon on the average, so that differences in distance and diameter nearly cancel out, and the two bodies, so different in real size, appear almost equally large in the sky. It is for this reason that, when the moon gets in front of the sun, the smaller, nearer body can so nearly fit over the larger, farther one, making the total eclipse of the sun the wonderful spectacle it is. It is an astonishing coincidence from which we benefit.
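  The near-cancellation can be put in rough numbers. An object’s apparent size in the sky is proportional to its diameter divided by its distance, so with the figures just given,

\[
\frac{\text{apparent size of sun}}{\text{apparent size of moon}} \approx \frac{412\,d/(390\,D)}{d/D} = \frac{412}{390} \approx 1.06,
\]

where \(d\) and \(D\) stand for the moon’s diameter and distance. On average, then, the sun’s disk appears only about 6 percent wider than the moon’s, which is why the fit during a total eclipse is so nearly exact.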

  GOING TO THE MOON

  The comparative nearness of the moon and its prominent appearance in the sky has long acted as a spur to the human imagination. Was there some way of reaching it? (One might equally wonder about reaching the sun, but the sun’s obviously intense heat served to cool one’s desire to do so. The moon was clearly a much more benign target as well as a much closer one.)

  In early times, reaching the moon did not seem an insuperable task, since it was assumed that the atmosphere extended up to the heavenly bodies, so that anything that lifted you up in the air might well carry you up to the moon in extreme cases.

  Thus, in the second century A.D., the Syrian writer Lucian of Samosata wrote the first story of space travel that we know of. In it, a ship is caught in a waterspout which lifts it high into the air, high enough to reach the moon.

  Again, in 1638, there appeared Man in the Moone by an English clergyman, Francis Godwin (who died before its publication). Godwin has his hero carried to the moon in a chariot pulled by large geese that migrate to the moon annually.

  In 1643, however, the nature of air pressure came to be understood, and it was rapidly seen that Earth’s atmosphere could not extend more than a comparatively few miles above its surface. Most of the space between Earth and the moon was vacuum into which waterspouts could not penetrate and across which geese could not fly. The problem of reaching the moon was suddenly much more formidable, yet still not insuperable.

  In 1657, there appeared (again posthumously) Voyage to the Moon by the French writer and duelist Cyrano de Bergerac. In his tale, Cyrano lists seven ways by which it might be possible to reach the moon. Six of them were quite wrong for one reason or another, but the seventh method was through the use of rockets. Rockets were indeed the one method then known (or now, for that matter) whereby the vacuum could be crossed.

  It was not till 1687, however, that the rocket principle was understood. In that year, Newton published his great book Principia Mathematica in which, among other things, he listed his three laws of motion. The third law is popularly known as the law of action and reaction: when a force is applied in one direction, there is an equal and opposite force in the other. Thus, if a rocket ejects a mass of matter in one direction, the rest of the rocket moves in the other, and will do so in a vacuum as well as in air. In fact, it will do so with greater ease in a vacuum where there is no air resistance to motion. (The general feeling that a rocket must need “something to push against” is wrong.)
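  A rough momentum balance makes the point concrete (an illustrative estimate with arbitrary numbers, not a figure from the text). If a rocket of total mass \(M\) ejects a small mass \(m\) backward at speed \(v\) relative to itself, conservation of momentum gives the rest of the rocket a forward speed of about

\[
\Delta V \approx \frac{m}{M}\,v ,
\]

so that ejecting 1 kilogram of exhaust at 2,000 meters per second from a 100-kilogram rocket speeds it up by roughly 20 meters per second, with no air needed to push against.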

  ROCKETRY

  Nor were rockets a matter of theory only. They were in existence centuries before Cyrano wrote and Newton theorized.

  The Chinese, as long ago as the thirteenth century, invented and used small rockets for psychological warfare—to frighten the enemy. Modern Western civilization adapted rockets to a bloodier purpose. In 1801, a British artillery expert, William Congreve, having learned about rockets in the Orient, where Indian troops used them against the British in the 1780s, devised a number of deadly missiles. Some were used against the United States in the War of 1812, notably at the bombardment of Fort McHenry in 1814, which inspired Francis Scott Key to write the “Star-Spangled Banner,” singing of “the rockets’ red glare.” Rocket weapons faded out in the face of improvements in range, accuracy, and power of conventional artillery. However, the Second World War saw the development of the American bazooka and the Soviet “Katyusha,” both of which are essentially rocket-propelled packets of explosives. Jet planes, on a much larger scale, also make use of the rocket principle of action and reaction.

 
