
An Incomplete Education


by Judy Jones


  If you want to speak well of an ancient scientist, though, the one always to go with is Archimedes, whose virtues include not just the formulation of the principles of buoyancy and the lever and the finest mathematical mind before Newton but a whole lot of method in the absence of any system or metaphysics. IS IT TRUE THAT THE ARABS KEPT SCIENCE ALIVE DURING THE MIDDLE AGES, WHILE EUROPE SLUMBERED?

  Strictly speaking, this is not true. It wasn’t the Arabs who kept science alive: it was the Arabic language, which played the same role in the vast spaces from Spain to India that Latin played farther to the north. Arabic was in effect the great switching station—or to put it another way, Islam made science international. Here’s what the medieval Muslims did: They translated Galen, Ptolemy, Aristotle, Euclid, Hippocrates, and Dioscorides. Al-Razi, a Persian physician, wrote The Comprehensive Book, whose title suggests the overall range of the effort: It summed up everything that had been known of medicine in Greece, India, and the Middle East and some of what had been known of medicine in China.

  At the battle of Samarkand in A.D. 704, the Muslims got their hands on some Chinese papermakers. They then built paper mills of their own at Samarkand and at Baghdad, and they passed the process along to Europe by way of Spain, another medieval switching station. Al-Khwarazmi, a Baghdad mathematician, borrowed our numeral system from the Hindus and then went on to develop algebra, without which the complex weight distribution of Gothic cathedrals probably couldn’t have been pulled off. Someone called Leonardo of Pisa introduced Arabic numerals into Europe in 1202.

  Europe, meanwhile, sat on its thumbs only in the universities. Away from the gownies, things were cracking; even before the Renaissance happened along, Northerners had substituted trousers for the Roman toga. They had invented skis and the stirrup and the spinning wheel and the heavy-duty plough and mechanical clocks, and they had figured out how to cast iron and harness horses and make a barrel. This was the practical, nonacademic tradition that would sooner or later give the world the Connecticut Yankee, the Wizard of Menlo Park, and better living through chemistry. DID GALILEO REALLY DROP A COUPLE OF LEAD WEIGHTS FROM THE LEANING TOWER OF PISA, THEREBY PROVING SOMETHING OR OTHER?

  By common consent, this most famous of all experiments (or, more properly, demonstrations) was a nice one, if it occurred. Aristotle had claimed that objects of different weights fall at different speeds—so, anyway, the sixteenth-century Aristotelians believed he’d claimed; and anyone with a little equipment at the top of the Leaning Tower could have shown that the Aristotelians were wrong as usual. Whether or not Galileo did drop the weights, however, has been in dispute for most of this century.
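What the demonstration would have shown is easy to check on paper: in a vacuum, fall time depends on the height and on gravity, not on the weight of what you drop. A minimal sketch (the 56-meter tower height and g = 9.81 m/s² are assumed round figures, not anything Galileo measured):

```python
import math

def fall_time(height_m, g=9.81):
    """Time for any object to fall height_m metres in a vacuum.

    Mass appears nowhere: from h = (1/2) g t^2 it follows that
    t = sqrt(2h / g), the same for a cannonball and a musket ball.
    """
    return math.sqrt(2 * height_m / g)

# From roughly the top of the Leaning Tower:
t = fall_time(56)
print(f"Fall time from ~56 m: {t:.2f} s")
```

Drop a cannonball and a musket ball together and the function returns the same number for both, which is the whole point the Aristotelians missed.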

  Galileo

  The story was first told by Galileo’s last pupil, Vincenzo Viviani, who said that Galileo dropped the weights in front of an audience of the entire faculty and student body of Pisa University. But if Galileo had done so and if Aristotelian mechanics had been shown up for the fantasy it was, why isn’t there a single independent account of the event in university records or in letters or memoirs? This is the question as modern scholarship puts it.

  And it leads to a corollary question. If Galileo didn’t drop weights from the Leaning Tower, what did he do? What he did was propagandize for the Copernican view that the earth voyages around the sun, not vice versa, and, what was worse, he did so in modern Italian, not in Latin, thereby stirring up trouble. In The Dialogue Concerning the Two Chief Systems of the World, the Ptolemaic and the Copernican, moreover, he made the Ptolemaic lunkhead Simplicius sound suspiciously like Pope Urban VIII.

  As it happens, Galileo also constructed a telescope through which he discovered spots on the sun, mountains on the moon, satellites in the orbit of Jupiter, and phases of Venus; and he laid the groundwork for a modern mechanics. But if you want to remember just one thing about Galileo, remember that he was the first scientist-philosopher who routinely approached problems mathematically, by quantifying them—and also the first to get a bad case of the creeps at the thought of colors, tastes, odors, and anything else he believed to be nonquantifiable. DID NEWTON REALLY WATCH AN APPLE FALL—AND IF SO, SO WHAT?(AND IF NOT, SO WHAT?)

  Isaac Newton

  Here, obviously, we are onto another one of those Galileo-and-the-lead-weight questions. Sir David Brewster, one of Newton’s nineteenth-century biographers, claims actually to have visited the tree the apple fell from and to have walked away with a piece of the root. Brewster also claims that in 1820, six years after his pilgrimage, the tree had decayed to such an extent that it was chopped down, the wood, however, like any other holy relic, being “carefully preserved”—in this case, by a Mr. Turnor of Stoke Rocheford. The story of course is that it was the falling apple that suggested to Newton the theory of universal gravitation, and the story, as Brewster acknowledges, was spread by Voltaire, who got it from Newton’s niece, Catherine Barton. Voltaire, who wrote about Newton in detail, in fact never got to see the elderly mathematician-physicist, though, like any good journalist assigned to a palace press room, he did spend a lot of time hanging out in his subject’s antechambers gathering scabrous anecdotes. The net of it is that the apple story is probably false and that the apple itself belongs in a barrel along with the one that Eve gave to Adam and the one that William Tell shot off his son’s head and the golden one, inscribed “For the fairest,” that Eris, the goddess of discord, tossed in the direction of Juno, Venus, and Minerva. On the other hand, how important is it whether he did or didn’t watch an apple fall? Plenty of people other than Newton watched apples fall, and the theory of universal gravitation didn’t occur to them.

  The theory of gravitation, by the way, however obvious it may seem to every seven-year-old today, is on the face of it no less implausible than the much earlier theory that the planets and the stars are attached to clear rotating spheres that make music (if only we could hear it); and, when you think about it, it’s not really a lot less occult. You know, what is this spooky stuff called gravity that holds everything in place without glue, paste, screws, nails, or even spit? Newton himself said:

  That gravity should be innate, inherent, essential to matter, so that one body may act upon another at a distance through a vacuum, without the mediation of anything else, by and through which the action and force may be conveyed from one to another, is to me so great an absurdity that I believe no man who has in philosophical matters a competent faculty of thinking can ever fall into it.

  Newton’s law was nevertheless the E = mc2 of its day. What it says is that the force of attraction between any two bodies will be directly proportional to the product of their masses and inversely proportional to the square of the distance between them. HOW COME CHEMISTRY TOOK SO LONG TO COME UP FROM THE DARK AGES?
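In modern notation (Newton stated the law as a proportion; the constant G that makes it an equation was measured only later), that sentence reads:

```latex
F \;=\; G\,\frac{m_1 m_2}{r^2}
```

where F is the attractive force, m1 and m2 are the two masses, and r is the distance between their centers.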

  An eighteenth-century chemistry lab

  Chemistry took so long to come up from the Dark Ages because even after physics and astronomy and anatomy had assumed something vaguely like their modern contours, instruments didn’t exist to isolate gases and no one troubled much to weigh or measure anything. Then a German named Stahl came along and carried the science down the dark, lonely dead-end road of the phlogiston theory. This theory held that all combustible substances had a physical component—namely, phlogiston—that was released on burning; new findings were wrenched about to fit the theory (for example, the discovery that the ash of a piece of firewood weighed more than the unburnt firewood led to the conclusion not that oxygen had been absorbed, but that phlogiston had negative weight). Which is to say that at the time of the American Revolution, chemists still believed that of the four elements of the ancients—earth, air, fire, and water—all but earth were irreducible. H2O? Not yet. Water.

  It was Antoine Lavoisier who got chemistry out of the fix. Lavoisier saw the implications of the findings of other men and worked them into a unified system. Both Joseph Priestley, an Englishman, and Karl Scheele, a Swede, had already isolated oxygen, but they failed to get straight the relationship between oxygen and either fire or water (Priestley, in fact, never gave up on phlogiston). Lavoisier saw that air was made up of two elements, of which one contributed to combustion and one, nitrogen, did not. Henry Cavendish, an Englishman, combined hydrogen and oxygen to get water but concluded that water was hydrogen minus phlogiston. Lavoisier came up with the conclusion that satisfied your high-school chemistry teacher. WHO GOT TO THE EVOLUTION THEORY FIRST, DARWIN OR THIS ALFRED RUSSEL WALLACE? AND ONCE YOU’VE ANSWERED THAT ONE, WHAT’S AN ENLIGHTENED MODERN PERSON SUPPOSED TO THINK OF POOR VILIFIED LAMARCK?

  Darwin

  Wallace

  The first of these questions was somewhat baffling to Darwin himself. Darwin kind of thought that the evolution theory was his own intellectual property, but at home one day in June 1858, tinkering away at the future Origin of Species, he received from Wallace (who was in Malaya, recovering from malaria) a paper setting forth not just a theory of evolution but a theory of evolution by natural selection. And talk about spooky: The evolutionary process had been suggested to both men in exactly the same way. They had both read Malthus’ Essay on Population, according to which an expanding human population is always pressing on food supply with the result that poverty and death are inevitable. N.B.: It was neither Darwin nor Wallace but the philosopher Herbert Spencer who coined the Victorian catchphrase “survival of the fittest,” and Alfred Tennyson, the poet laureate, who added the cheerful chorus, “For nature is one with rapine, a harm no preacher can heal.”

  Actually, what was at issue between the two naturalists was not who got to the theory first, but who had title to it. Darwin had started keeping his Notebook on Transmutation of Species in 1837. He had a first intimation of the role of adaptation in early 1838, and he added Malthus to his stewpot later that year. Wallace didn’t begin to think of the transmutation of species until 1845, after he’d read Chambers’ Vestiges of the Natural History of Creation. Wallace, however, announced his theory to Darwin before Darwin had published anything on evolution anywhere. This, then, would probably have been one for the lawyers, had both men not done honor to science by behaving like gentlemen. They actually praised each other.

  As for poor Jean-Baptiste-Pierre-Antoine de Monet, chevalier de Lamarck, the pity is he’s remembered for his zany ideas about how an animal that stretches its neck often enough to get at leaves on high branches will sooner or later become a giraffe and—why stop there?—the acquired characteristic of longneckedness will be passed along to its offspring. In fact, Lamarck not only got the idea of evolution out into the intellectual atmosphere, he also invented the categories “vertebrate” and “invertebrate” and he turned “biology” into an early-nineteenth-century buzzword. HONEST, NOW, WAS LOBACHEVSKY THE GREATEST MATHEMATICIAN EVER TO GET CHALK ON HIS COAT?

  This, to be sure, is what Tom Lehrer alleged of Lobachevsky in the famous song that also advises, “Plagiarize, plagiarize, why do you think the good Lord made your eyes?” Nikolai Lobachevsky not only ran the University of Kazan, he also found time to squeak through Euclid’s undefended window of vulnerability, namely, the fifth axiom, which states that through a point P not on a given line l, only one line l1 can be drawn so that l and l1 are parallel and so that they will never meet no matter how far they are extended in either direction.

  Lobachevsky demonstrated that if you throw out the fifth axiom, which was never proven, even by Euclid, who didn’t try, you can construct a non-Euclidean geometry in which more than one parallel line will pass through the point P and all the angles of a triangle will add up to less than 180°. Lobachevsky did not apparently mean his geometry as a description of the real universe, yet it is not only as internally consistent as Euclid’s, it also has implications for geodesics and great circle navigation. In any case, as Einstein was later to demonstrate, the real universe is both weird and non-Euclidean, regardless of what Lobachevsky thought.
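The shortfall, moreover, is not arbitrary: in Lobachevsky’s geometry the amount by which a triangle’s angles fall short of 180° grows with the triangle’s area. In modern notation (with α, β, γ the three angles, A the area, and k a constant length scale characteristic of the geometry):

```latex
\pi - (\alpha + \beta + \gamma) \;=\; \frac{A}{k^{2}}
```

Tiny triangles look almost Euclidean; huge ones are flagrantly not, which is one reason nobody noticed the alternative for two thousand years.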

  There is just one problem with naming Lobachevsky the greatest mathematician, etc. Carl Gauss had gotten to non-Euclidean geometry before him, concluding that geometry, “the theory of space,” was no longer on a level with arithmetic, the latter being exclusively a product of mind, the former, empirical. Most people, in fact, will tell you that not only was Gauss greater, so was Newton, who invented calculus, and Archimedes, who almost invented calculus with less to go on. Archimedes, though, doesn’t pass the chalk-on-his-coat test. Archimedes didn’t work in chalk. He worked in sand, ash, or, occasionally, olive oil.

  Nikolai Ivanovitch Lobachevsky WHAT DOES PLANCK’S CONSTANT HAVE TO DO WITH HEISENBERG’S UNCERTAINTY?

  In a mutable world where yesterday’s $1-a-slice pizza parlor becomes tomorrow’s $13-a-radicchio-salad watering hole, a constant is always a nice thing to happen on; and Planck’s, though it was scoffed at when it was first offered to the scientific public in 1900, has turned out over the years to be one of those rare harborages that you can tie up at with confidence. Its numerical value, always represented by the letter h, is 0.000000000000000000000000006547—a mite of a thing, to be sure; but it has been confirmed by many hundreds of methods, and in a world like this one, you take what constants you can get and you put your arms around them. What Max Planck discovered is that radiant energy is given off in particles or, as he called them, quanta; it is not given off continuously. You want to find the energy quantum of light, for example—namely, the photon? You multiply h by the frequency of the radiation, represented by the Greek letter ν (nu), and voilà—there you’ve got it. Historians have taken the quantum theory to be the dividing line between “classical” and “modern” physics.
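Planck’s recipe can be carried out in a couple of lines. The sketch below uses the modern SI value of h in joule-seconds (the long decimal quoted above is the older figure in erg-seconds) and an illustrative frequency for green light:

```python
# Planck's relation: E = h * nu
PLANCK_H = 6.626e-34  # joule-seconds, modern SI value

def photon_energy(frequency_hz):
    """Energy of a single quantum (photon) at the given frequency."""
    return PLANCK_H * frequency_hz

# Green light, frequency roughly 5.6e14 Hz (an illustrative figure):
e = photon_energy(5.6e14)
print(f"One green photon carries about {e:.2e} J")
```

A mite of a thing indeed, which is why nobody tripped over the granularity of light for so long.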

  Now, constancy is a lot less modern than uncertainty, so it’s not surprising that, come 1927, Werner Heisenberg took Planck’s constant and used it to elevate uncertainty beyond mere pose: Heisenberg made uncertainty into a basic principle of the cosmos. He observed that the closer you get to observing the velocity of a particle, the further you inevitably get from measuring its position, and vice versa. Indeterminacy of velocity times indeterminacy of position will equal roughly—you guessed it—the famous h. For this, Heisenberg won the 1932 Nobel Prize. But, because it seemed like a mockery of the law of cause and effect, even Einstein didn’t like it much, and he went on record with the remark: “I shall never believe that God plays dice with the universe.” WAIT, YOU’RE FORGETTING THE DANES. DIDN’T THEY CONTRIBUTE ANYTHING?
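The modern statement sharpens the “roughly h” above: the indeterminacies in position and momentum (Δx and Δp) trade off against each other, with h setting the floor on their product. In the form usually written today:

```latex
\Delta x \,\Delta p \;\geq\; \frac{h}{4\pi}
```

Pin down the position and the momentum blurs, and vice versa; the inequality says you can never win on both fronts at once.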

  It’s hard to know exactly how you mean this, but let’s assume you’re serious. In fact, the Danes are not just blue cheese and breakfast pastries. Denmark may even have been the first country to have something like a national science policy. King Frederick II, in order to avoid brain drain to Germany, set up the astronomer Tycho Brahe as feudal lord of the island of Hveen and provided him with the wherewithal to establish what in effect was the world’s first observatory and think tank. Brahe spent twenty-odd years there making the most complete and precise survey of the heavens since the Greeks closed up shop. But when the new administration of Christian IV came along and cut off funding, he migrated anyway, to Bohemia, where he made his greatest contribution: He provided the data on the basis of which Johannes Kepler, a German and his successor as Imperial Mathematician, calculated the elliptical orbits of the planets.

  Let’s raise a tankard to the memory of Olaus Roemer. Roemer, in the Tycho tradition, traced the path of the planet Jupiter carefully, then used his results to calculate the speed of light, thereby beating out Aristotle, who thought that the speed of light was infinite, and Galileo, who tried to calculate a finite speed by trading lantern signals with an assistant on a nearby hilltop.
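The arithmetic behind Roemer’s feat is back-of-the-envelope stuff (the figures below are illustrative round numbers, not his exact data): eclipses of Jupiter’s moons run late by roughly twenty-two minutes as the Earth moves from the near side of its orbit to the far side, i.e. as the light crosses one extra orbital diameter.

```python
# Roemer-style estimate of the speed of light, with assumed round figures.
ORBIT_DIAMETER_M = 2 * 1.496e11   # two astronomical units, in metres
DELAY_S = 22 * 60                 # reported eclipse lag, in seconds

speed_of_light = ORBIT_DIAMETER_M / DELAY_S
print(f"Estimated c: {speed_of_light:.2e} m/s")
```

The result comes out near 2.3 × 10⁸ m/s, low against the true 3.0 × 10⁸ but in the right ballpark, and infinitely better than “infinite.”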

  Hoist another tankard for Thomas Bartholin, who discovered the lymphatic system, and one more for his brother Erasmus, who discovered the double refraction of light through a piece of Icelandic crystal. And “skoal” to Hans Christian Oersted, who brought a compass needle close to an electrical wire, thereby stumbling on electromagnetism.

  Niels Bohr won the Nobel Prize in 1922 for developing a new model for the hydrogen atom. Then—another forward march in the evolution of Danish science policy—he opened the Copenhagen Institute for atomic studies and drew theoretical physicists from all over the world, as Tycho had once drawn astronomers. Whereas Tycho’s operation had been bankrolled by the Crown, however, the Bohr Institute was funded by the Carlsberg brewery. IS SCIENCE WORTH DYING FOR?

  This is a good question, and though the evidence is inconclusive, the answer is probably no. Life is sweet, and over the long haul the truth outs anyway. You can’t kill a good idea whose time has come, etc. Be that as it may, the evidence strongly suggests that the average ecclesiastic does enjoy getting his hands on a scientist with a good idea (this is particularly true if the idea is expressed in such a way that laymen can understand it) and wringing the life out of it or, better yet, him.

  In favor of dying for science, partisans of martyrdom used to invoke Giordano Bruno. The Inquisition gave him a chance to recant, but he wound up burning at the stake anyway. Since Socrates, no man had fought less to save his own life. Or at least that’s the way Victorian press agentry told the story. Bruno had been an adherent of Copernicanism (the earth moved; the universe was infinite) but he was also of a mystical stripe (other worlds were inhabited; the infinite universe was indistinguishable from infinite God). As a case for martyrdom for science, therefore, this can be thrown out on technical grounds: By modern criteria, Bruno wasn’t a scientist at all. Rather, he used scientific ideas to dress up a system of hermetic magic. And, in any case, the Inquisition didn’t indict him for Copernicanism; they indicted him for his lukewarm acceptance of the doctrine of the Trinity.

 
