The Greatest Story Ever Told—So Far


by Lawrence M. Krauss


  The religiosity of the early scientific pioneers is also cited today by sophists who claim that science and religious doctrine are compatible, but who confuse science and scientists. In spite of frequent appearances to the contrary, scientists are people. And like all people they are capable of holding many potentially mutually contradictory notions in their heads at the same time. No correlation between divergent views held by any individual is representative of anything but human foibles.

  To claim that some scientists are or were religious is like saying some scientists are Republicans or some are flat-earthers or some are creationists. It doesn’t imply causality or consistency. My friend Richard Dawkins has told me of a professor of astrophysics who, during the day, writes papers that are published in astronomical journals assuming that the universe is more than 13 billion years old, but then goes home and privately espouses the literal biblical claim that the universe is six thousand years old.

  What determines intellectual consistency or lack thereof in the sciences is a combination of rational arguments with subsequent evidence and continued testing. It is perfectly reasonable to claim that religion, in the Western world, may be the mother of science. But as any parent knows, children rarely grow up to be models of their parents.

  Newton may, following tradition, have been motivated to look at light because it was a gift from God. But we remember his work not because of his motivation, but because of what he discovered.

  Newton was convinced that light was made of particles, which he referred to as corpuscles, while Descartes, and later Newton’s nemesis Robert Hooke, and still later the Dutch scientist Christiaan Huygens, all claimed that light was a wave. One of the key observations that appeared to support the wave theory was that white light, such as light from the Sun, could split into all the colors of the rainbow when passed through a prism.

  As was often the case during his life, Newton believed that he was correct and several of his most famous contemporaries (and competitors) were wrong. To demonstrate this, he devised a clever experiment using prisms that he first performed while at home in Woolsthorpe, to escape the bubonic plague ravaging Cambridge. As he reported at the Royal Society in 1672, on the forty-fourth try, he observed precisely what he hoped he would see.

  Advocates of the wave theory had argued that light waves were made of white light and that the light split into colors when it passed through a prism because of “corruption” of the rays as they traversed the glass. In this case, the more glass, the more splitting.

  Newton reasoned that this was not the case, but that light is made of colored particles that combine together to appear white. (With a nod to his occult fascination, Newton classified the colored particles of the spectrum—a term he coined—into seven different types: red, orange, yellow, green, blue, indigo, and violet. From the time of the Greeks, the number seven had been considered to possess mystical qualities.) To demonstrate that the wave/corruption picture was incorrect, Newton passed a beam of white light through two prisms held in opposite orientations. The first prism split the light into its spectrum, and the second recomposed it back into a single white light beam. This result would have been impossible if the glass had corrupted the light. A second prism would have simply made the situation worse and would not have caused the light to revert back to its original state.

  This result does not in fact disprove the wave theory of light (it actually supports it, because light slows down as it bends upon entering the prism, just as waves would do). But since the advocates of that theory had argued (incorrectly) that the spectral splitting was due to corruption, Newton’s demonstration that this was not the case struck a significant blow in favor of his particle model.

  Newton went on to discover many other facets of light that we use today in our understanding of the wave nature of light. He showed that every color of light has a unique bend angle when passing through a glass prism. He also showed that all objects appear to be the same color as the color of the light beam illuminating them. And he showed that colored light will not change its color no matter how many times it is reflected by or passes through a prism.

  All of these results, including his original result, can be explained simply if white light is indeed composed of a collection of different colors—that much he got right. But they can’t be explained if light is made of different-colored particles. Rather, white light is composed of waves of many different wavelengths.

  Newton’s opponents did not give up easily, even in the face of Newton’s rising popularity and the death of his chief opponent, Hooke. They did not give up even after Newton’s election as president of the Royal Society in 1703, the year before he actually published his research on light in his epic Opticks. Indeed, the debate on the nature of light continued to rage on for over a century.

  Part of the problem with a wave picture of light was the question “What is it that light is a wave of, exactly?” And if it is a wave, then since all known waves require some medium, what medium does it travel in? These questions were sufficiently perplexing that practitioners of the wave theory had to resurrect an invisible substance permeating all space: the ether.

  The resolution of this conundrum came, as such resolutions often do, from a totally unexpected corner of the physical world, one full of sparks and spinning wheels.

  When I was a young professor at Yale—in the ancient but huge office I was lucky enough to commandeer when an equally ancient colleague retired—there was left hanging for me a copy of a photograph of Michael Faraday taken in 1861. I have treasured it ever since.

  I don’t believe in hero worship, but if I did, Faraday would be up there with the best. Perhaps more than any other scientist of the nineteenth century, he is responsible for the technology that powers our current civilization. Yet he had little formal education and at age fourteen became a bookbinder’s apprentice. Later in his career, after achieving world recognition for his scientific contributions, he insisted on keeping to his humble roots, turning down a knighthood and twice turning down the presidency of the Royal Society. Later on he refused to advise the British government on the production of chemical weapons for use in the Crimean War, citing ethical reasons. And for more than thirty-three years he gave a series of Christmas lectures at the Royal Institution to excite young people about science. What’s not to like?

  Much as one might admire the man, it is the scientist who matters here for our story. Faraday’s first scientific lesson is one I tell my students: always suck up to your professors. At the age of twenty, after completing seven years of apprenticeship as a bookbinder, Faraday attended the lectures of the famous chemist Humphry Davy, then the head of the Royal Institution. Afterward Faraday presented Davy with a three-hundred-page, beautifully bound book containing the notes he had taken during the lectures. Within a year, Faraday was appointed Davy’s secretary and shortly thereafter got an appointment as chemical assistant in the Royal Institution. Later on, Faraday learned the same lesson, but with the opposite result. In his excitement over some early, quite significant experiments he had performed, Faraday neglected to acknowledge work he had done with Davy in his published results. This accidental snub probably led Davy to reassign him to other activities, delaying his world-changing research by several years.

  Before he was reassigned, Faraday had been working on the “hot” area of scientific research, the newly discovered connections between electricity and magnetism, driven by the results of the Danish physicist Hans Christian Oersted. These two forces seem quite different, yet they have odd similarities. Electric charges can attract or repel. So can magnets. Yet magnets always seem to have two poles, north and south, which cannot be isolated, while electric charges can individually be positive or negative.

  For some time, scientists and natural philosophers had wondered if the two forces might have some hidden connection, and the first empirical clue came to Oersted by accident. In 1820, while delivering a lecture, Oersted saw that a compass needle was deflected when an electric current from a battery was switched on. A few months later he followed up on this observation and discovered that a flow of moving electric charges, which we now commonly call an electric current, produced a magnetic force that caused compass needles to point in a circle around the wire.
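
  In modern notation, which Oersted himself never used, the magnetic field produced by a long straight wire carrying a current I wraps in closed circles around the wire, with a strength that falls off with the distance r from it:

  \[ B(r) \;=\; \frac{\mu_{0}\,I}{2\pi r}, \]

  where \( \mu_{0} \) is a constant of nature. That circular pattern is why the compass needles arrange themselves in a ring around the wire.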

  He had blazed a new trail. Word spread quickly among scientists, through the Continent and across the English Channel. Moving electric charges produced a magnetic force. Could there be other connections? Could magnets in turn influence electric charges?

  Scientists searched for such a possibility, without success. Davy and another colleague tried to build an electric motor based on the connection discovered by Oersted, but failed. Faraday ultimately got a wire with a current in it to move around a magnet, which did form a crude sort of motor. It was this exciting development that he reported without citing Davy’s name.

  Partly this was mere gamesmanship. No new fundamental phenomenon was being uncovered. Perhaps this was the rationale for one of my favorite (likely apocryphal) stories about Faraday. It is said that William Gladstone, later to be British prime minister, heard of Faraday’s laboratory, full of weird devices, and asked in 1850 what the practical value of all this study into electricity was. Faraday was purported to have replied, “Why, sir, there is every probability that you will soon be able to tax it.”

  Apocryphal or not, both great irony and truth are in that witty comeback. Curiosity-driven research may seem self-indulgent and far from the immediate public good. However, essentially all of our current quality of life, for people living in the first world, has arisen from the fruits of such research, including all the electric power that drives almost every device we use.

  Two years after Davy’s death in 1829, and six years after Faraday had become director of the laboratory of the Royal Institution, he made the discovery that cemented his reputation as perhaps the greatest experimental physicist of the nineteenth century—magnetic induction. Since 1824, he had tried to see if magnetism could alter the current flowing in a nearby wire or otherwise produce some kind of electric force on charged particles. He primarily wanted to see if magnetism could induce electricity, just as Oersted had shown that electricity, and electric currents in particular, could produce magnetism.

  On October 28, 1831, Faraday recorded in his laboratory notebook a remarkable observation. While closing the switch to turn on a current in a wire wound around an iron ring to magnetize the iron, he noticed a current flow momentarily in another wire wrapped around the same iron ring. Clearly the mere presence of a nearby magnet could not cause an electric current to flow in a wire—but turning the magnet on or off could. Subsequently he showed that the same effect occurred if he moved a magnet near a wire. As the magnet came closer or moved away, a current would flow in the wire. Just as a moving charge created a magnet, somehow a moving magnet—or a magnet of changing strength—created an electric force in the nearby wire and produced a current.

  If the profound theoretical implication of this simple and surprising result is not immediately apparent, you can be forgiven, because the implication is subtle, and it took the greatest theoretical mind of the nineteenth century to unravel it.

  To frame it properly, we need a concept that Faraday himself introduced. Faraday had little formal schooling, was largely self-taught, and thus was never comfortable with mathematics. In another probably apocryphal story, Faraday boasted of having used a mathematical equation only once in all of his publications. Certainly, he never described his important discovery of magnetic induction in mathematical terms.

  Because of his lack of comfort with formal mathematics, Faraday was forced to think in pictures to gain intuition about the physics behind his observations. As a result he invented an idea that forms the cornerstone of all modern physics theory and resolved a conundrum that had puzzled Newton until the end of his days.

  Faraday asked himself, How does one electric charge “know” how to respond to the presence of another, distant electric charge? The same question had been posed by Newton in terms of gravity, where he had earlier wondered how the Earth “knew” to respond as it did to the gravitational pull of the Sun. How was the gravitational force conveyed from one body to another? To this, he gave his famous response “Hypotheses non fingo,” “I frame no hypotheses,” suggesting that he had worked out the force law of gravity and shown that his predictions matched observations, and that was good enough. Many of us physicists have subsequently used this defense when asked to explain various strange physics results—especially in quantum mechanics, where the mathematics works, but the physical picture often seems crazy.

  Faraday imagined that each electric charge would be surrounded by an electric “field,” which he could picture in his head. He saw the field as a bunch of lines emanating radially outward from the charge. The field lines would have arrows on them, pointing outward if the charge was positive, and inward if it was negative:

  He further imagined that the number of field lines increased as the magnitude of the charge increased:

  The utility of this mental picture was that Faraday could now intuitively understand both what would happen when another test charge was put near the first charge and why. (Whenever I use the colloquial why, I mean “how.”) The test charge would feel the “field” of the first charge wherever the second charge was located, with the strength of the force being proportional to the number of field lines in the region, and the direction of the force being along the direction of the field lines. Thus, for example, the test charge in question would be pushed outward in the direction shown:

  One can do more than this with Faraday’s pictures. Imagine placing two charges near each other. Since field lines begin at a positive charge and end on a negative charge and can never cross, it is almost intuitive that the field lines in between two positive charges should appear to repel each other and be pushed apart, whereas between a positive and a negative charge they should connect together:

  Once again, if a test charge is placed anywhere near these two charges, it would feel a force in the direction of the field lines, with a strength proportional to the number of field lines in that region.

  Faraday thus pictured the nature of electric forces between particles in a way that would otherwise require solving the algebraic equations that describe electrical forces. What is most amazing about these pictures is that they capture the mathematics exactly, not merely approximately.
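
  To see how exact the correspondence is, consider a single charge in modern notation, which Faraday never used. If N field lines leave the charge, the same N lines must cross every imaginary sphere drawn around it, so the density of lines, and with it the force felt by a test charge, falls off as the sphere’s surface area grows:

  \[ \text{line density at radius } r \;=\; \frac{N}{4\pi r^{2}} \qquad\longleftrightarrow\qquad F \;\propto\; \frac{1}{r^{2}}, \]

  which is precisely the inverse-square law governing the electric force between charges.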

  A similar pictorial view could be applied to magnets and magnetic fields, reproducing the force law between magnets, experimentally verified by Coulomb, and the force law between current-carrying wires, derived by André-Marie Ampère. (Up until Faraday, all the heavy lifting in discovering the laws of electricity and magnetism was done by the French.)

  Using these mental crutches, we can then reexpress Faraday’s discovery of magnetic induction as follows: an increase or decrease in the number of magnetic field lines going through a loop of wire will cause a current to flow in the wire.
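
  In the mathematical language Faraday avoided, this is his law of induction: the voltage (the electromotive force \( \mathcal{E} \)) driving the current around the loop equals the rate at which the magnetic flux \( \Phi_{B} \), in essence the number of field lines threading the loop, changes with time:

  \[ \mathcal{E} \;=\; -\,\frac{d\Phi_{B}}{dt}. \]

  A magnet sitting motionless nearby gives no change in flux and hence no current; only a field that is switched on or off, or moved, drives one.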

  Faraday recognized quickly that his discovery would allow the conversion of mechanical power into electrical power. If a loop of wire was attached to a blade that was made to rotate by, say, a flow of water, such as a waterwheel, and the whole thing was surrounded by a magnet, then as the blade turned the number of magnetic field lines going through the wire would continuously change, and a current would continuously be generated in the wire. Voilà, Niagara Falls, hydroelectricity, and the modern world!
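
  To put this in modern terms, as an illustration rather than a calculation from Faraday’s own notebooks: a loop of area A rotating with angular speed \( \omega \) in a field of strength B has a flux \( \Phi_{B} = BA\cos(\omega t) \) threading it, so the induced voltage oscillates as

  \[ \mathcal{E} \;=\; -\,\frac{d\Phi_{B}}{dt} \;=\; BA\,\omega\,\sin(\omega t), \]

  an alternating voltage whose size grows with the strength of the magnet, the area of the loop, and the speed at which the wheel turns it. That, in essence, is every generator from Faraday’s bench to Niagara Falls.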

  This alone might be good enough to cement Faraday’s reputation as the greatest experimental physicist of the nineteenth century. But technology wasn’t what motivated Faraday, which is why he stands so tall in my estimation; it was his deep sense of wonder and his eagerness to share his discoveries as broadly as possible that I admire most. I am convinced that he would agree that the chief benefit of science lies in its impact in changing our fundamental understanding of our place in the cosmos. And ultimately, this is what he did.

  I cannot help but be reminded of another, more recent great experimental physicist, Robert R. Wilson—who, at age twenty-nine, was head of the Research Division at Los Alamos, which developed the atomic bomb during the Manhattan Project. Many years later he was the first director of the Fermi National Accelerator Laboratory in Batavia, Illinois. When Fermilab was being built, Wilson was summoned before Congress in 1969 to defend the expenditure of significant funds on this exotic new accelerator, which was to study the fundamental interactions of elementary particles. Asked if it contributed to national security (which would have easily justified the expenditure in the eyes of the congressional committee members), he bravely said no. Rather:

  It only has to do with the respect with which we regard one another, the dignity of men, our love of culture. . . . It has to do with, are we good painters, good sculptors, great poets? I mean all the things that we really venerate and honor in our country and are patriotic about. In that sense, this new knowledge has all to do with honor and country, but it has nothing to do directly with defending our country except to help make it worth defending.

  Faraday’s discoveries allowed us to power and create our civilization, to light up our cities and our streets, and to run our electric devices. It is hard to imagine any discovery that is more deeply ingrained in the workings of modern society. But more profoundly, what makes his contribution to our story so remarkable is that he discovered a missing piece of the puzzle that changed the way we think about virtually everything in the physical world today, starting with light itself. If Newton was the last of the magicians, Faraday was the last of the modern scientists to live in the dark regarding light. After his work, the key to uncovering the true nature of our main window on the world lay in the open, waiting for the right person to find it.

 
