Death By Black Hole & Other Cosmic Quandaries


by Neil deGrasse Tyson


  Just when we thought we had the size and shape of our presumably eternal cosmos figured out, Edwin Hubble went on to discover that the universe was expanding and that the galactic universe extended as far as the largest telescopes could see. One consequence of this discovery was that the cosmos had a beginning—an unthinkable notion to all previous generations of scientists.

  Just when we thought that Albert Einstein’s relativity theories would enable us to explain all the gravity of the universe, the Caltech astrophysicist Fritz Zwicky discovered dark matter, a mysterious substance that wields 90 percent of all the gravity of the universe but emits no light and has no other known interactions with ordinary matter. The stuff is still a mystery. Zwicky went on to identify and characterize a class of objects in the universe called supernovas, which are single, exploding stars that temporarily emit the energy equivalent of a hundred billion suns.

  Not long after we figured out the ways and means of supernova explosions, somebody discovered bursts of gamma rays from the edge of the universe that temporarily outshined all the energy-emitting objects of the rest of the universe combined.

  And just as we were growing accustomed to living in our ignorance of dark matter’s true nature, two research groups working independently, one led by the Berkeley astrophysicist Saul Perlmutter and the other by the astrophysicists Adam Riess and Brian Schmidt, discovered that the universe is not just expanding, it’s accelerating. The cause? Evidence indicates a mysterious pressure within the vacuum of space that acts in the opposite direction of gravity and remains even more of a mystery than dark matter.

  These are, of course, just an assortment of the countless mind-bending and brain-boggling phenomena that have kept astrophysicists busy for the past hundred years. I could stop the list here, but I would be remiss if I did not include the discovery of neutron stars, which pack the mass of the Sun within a ball that measures barely a dozen miles across. To achieve this density at home, just cram a herd of 50 million elephants into the volume of a thimble.
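  A rough back-of-the-envelope check of that claim, sketched here in Python with round numbers (the solar mass, a one-cubic-centimeter thimble, and a six-tonne elephant are my assumed values, not figures from the text):

```python
# Rough check of the neutron-star density claim: the Sun's mass packed into
# a ball roughly a dozen miles across, then re-expressed as elephants per thimble.
import math

SOLAR_MASS_KG = 1.99e30
RADIUS_M      = 0.5 * 12 * 1609.0   # half of roughly a dozen miles, in meters
THIMBLE_M3    = 1.0e-6              # about one cubic centimeter
ELEPHANT_KG   = 6.0e3               # a large elephant, very roughly

volume_m3  = (4.0 / 3.0) * math.pi * RADIUS_M ** 3
density    = SOLAR_MASS_KG / volume_m3          # about 5e17 kg per cubic meter
thimble_kg = density * THIMBLE_M3
elephants  = thimble_kg / ELEPHANT_KG

print(f"density ~ {density:.1e} kg/m^3; elephants per thimble ~ {elephants:.1e}")
# Lands in the tens of millions of elephants, the same order of magnitude as the
# figure quoted above; the exact count depends on how heavy you take an elephant to be.
```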

  No doubt about it. My mind is wired differently from that of a biologist, and so our different reactions to the evidence for life in the Mars meteorite were understandable, if not entirely expected.

  Lest I leave you with the impression that the behavior of research scientists is indistinguishable from that of freshly beheaded chickens running aimlessly around the coop, you should know that the body of knowledge about which scientists are not baffled is impressive. It forms most of the contents of introductory college textbooks and comprises the modern consensus of how the world works. These ideas are so well understood that they no longer form interesting subjects of research and are no longer a source of confusion.

  I once hosted and moderated a panel discussion on theories of everything—those wishful attempts to explain under one conceptual umbrella all the forces of nature. On the stage were five distinguished and well-known physicists. Midway through the debate I nearly had to break up a fight as one of them looked like he was ready to throw a punch. That’s okay. I didn’t mind it. The lesson here is if you ever see scientists engaged in a heated debate, they are arguing because they are all baffled. These physicists were arguing on the frontier about the merits and shortcomings of string theory, not whether Earth orbits the Sun, or whether the heart pumps blood to the brain, or whether rain falls from clouds.

  THIRTY-SEVEN

  FOOTPRINTS IN THE SANDS OF SCIENCE

  If you visit the gift shop at the Hayden Planetarium in New York City, you’ll find all manner of space-related paraphernalia for sale. Familiar things are there—plastic models of the space shuttle and the International Space Station, cosmic refrigerator magnets, Fisher space pens. But unusual things are there too—dehydrated astronaut ice cream, astronomy Monopoly, Saturn-shaped salt-and-pepper shakers. And that’s not to mention the weird things such as Hubble telescope pencil erasers, Mars rock super-balls, and edible space worms. Of course, you’d expect a place like the planetarium to stock such stuff. But something much deeper is going on. The gift shop bears silent witness to the iconography of a half-century of American scientific discovery.

  In the twentieth century, astrophysicists in the United States discovered galaxies, the expanding of the universe, the nature of supernovas, quasars, black holes, gamma-ray bursts, the origin of the elements, the cosmic microwave background, and most of the known planets in orbit around stars other than the Sun. Although the Russians reached one or two places before us, we sent space probes to Mercury, Venus, Jupiter, Saturn, Uranus, and Neptune. American probes have also landed on Mars and on the asteroid Eros. And American astronauts have walked on the Moon. Nowadays most Americans take all this for granted, which is practically a working definition of culture: something everyone does or knows about, but no longer actively notices.

  While shopping at the supermarket, most Americans aren’t surprised to find an entire aisle filled with sugar-loaded, ready-to-eat breakfast cereals. But foreigners notice this kind of thing immediately, just as traveling Americans notice that supermarkets in Italy display vast selections of pasta and that markets in China and Japan offer an astonishing variety of rice. The flip side of not noticing your own culture is one of the great pleasures of foreign travel: realizing what you hadn’t noticed about your own country, and noticing what the people of other countries no longer realize about themselves.

  Snobby people from other countries like to make fun of the U.S. for its abbreviated history and its uncouth culture, particularly compared with the millennial legacies of Europe, Africa, and Asia. But 500 years from now historians will surely see the twentieth century as the American century—the one in which American discoveries in science and technology rank high among the world’s list of treasured achievements.

  Obviously the U.S. has not always sat atop the ladder of science. And there’s no guarantee or even likelihood that American preeminence will continue. As the capitals of science and technology move from one nation to another, rising in one era and falling in the next, each culture leaves its mark on the continual attempt of our species to understand the universe and our place in it. When historians write their accounts of such world events, the traces of a nation’s presence on center stage sit prominently in the timeline of civilization.

  MANY FACTORS INFLUENCE how and why a nation will make its mark. Strong leadership matters. So does access to resources. But something else must be present—something less tangible, but with the power to drive an entire nation to focus its emotional, cultural, and intellectual capital on creating islands of excellence in the world. Those who live in such times often take for granted what they have created, on the blind assumption that things will continue forever as they are, leaving their achievements susceptible to abandonment by the very culture that created them.

  Beginning in the 700s and continuing for nearly 400 years—while Europe’s Christian zealots were disemboweling heretics—the Abbasid caliphs created a thriving intellectual center of arts, sciences, and medicine for the Islamic world in the city of Baghdad. Muslim astronomers and mathematicians built observatories, designed advanced timekeeping tools, and developed new methods of mathematical analysis and computation. They preserved the extant works of science from ancient Greece and elsewhere and translated them into Arabic. They collaborated with Christian and Jewish scholars. And Baghdad became a center of enlightenment. Arabic was, for a time, the lingua franca of science.

  The influence of these early Islamic contributions to science remains to this day. For example, so widely distributed was the Arabic translation of Ptolemy’s magnum opus on the geocentric universe (originally written in Greek in A.D. 150), that even today, in all translations, the work is known by its Arabic title Almagest, or “The Greatest.”

  The Iraqi mathematician and astronomer Muhammad ibn Musa al-Khwarizmi gave us the words “algorithm” (from his name, al-Khwarizmi) and “algebra” (from the word al-jabr in the title of his book on algebraic calculation). And the world’s shared system of numerals—0, 1, 2, 3, 4, 5, 6, 7, 8, 9—though Indian in origin, was neither common nor widespread until Muslim mathematicians exploited it. The Muslims furthermore made full and innovative use of the zero, which did not exist among Roman numerals or in any established numeric system. Today, with legitimate reason, the ten symbols are internationally referred to as Arabic numerals.

  PORTABLE, ORNATELY ETCHED, brass astrolabes were also developed by Muslims, from ancient prototypes, and became as much works of art as tools of astronomy. An astrolabe projects the domed heavens onto a flat surface and, with layers of rotating and nonrotating dials, resembles the busy, ornate face of a grandfather clock. It enabled astronomers, as well as others, to measure the positions of the Moon and the stars on the sky, from which they could deduce the time—a generally useful thing to do, especially when it’s time to pray. The astrolabe was so popular and influential as a terrestrial connection to the cosmos that, to this day, nearly two-thirds of the brightest stars in the night sky retain their Arabic names.

  The name typically translates into an anatomical part of the constellation being described. Famous ones on the list (along with their loose translations) include: Rigel (Al Rijl, “foot”) and Betelgeuse (Yad al Jauza, “hand of the great one”—in modern times drawn as the armpit), the two brightest stars in the constellation Orion; Altair (At-Ta’ir, “the flying one”), the brightest star in the constellation Aquila, the eagle; and the variable star Algol (Al-Ghul, “the ghoul”), the second brightest star in the constellation Perseus, referring to the blinking eye of the bloody severed head of Medusa held aloft by Perseus. In the less-famous category are the two brightest stars of the constellation Libra, although identified with the scorpion in the heyday of the astrolabe: Zubenelgenubi (Az-Zuban al-Janubi, “southern claw”) and Zubeneschamali (Az-Zuban ash-Shamali, “northern claw”), the longest surviving star names in the sky.

  At no time since the eleventh century has the scientific influence of the Islamic world equaled what it enjoyed during the preceding four centuries. The late Pakistani physicist Abdus Salam, the first Muslim to win a Nobel Prize in science, lamented:

  There is no question [that] of all civilizations on this planet, science is the weakest in the lands of Islam. The dangers of this weakness cannot be overemphasized since honorable survival of a society depends directly on strength in science and technology in the conditions of the present age. (Hassan and Lui 1984, p. 231)

  PLENTY OF OTHER nations have enjoyed periods of scientific fertility. Think of Great Britain and the basis of Earth’s system of longitude. The prime meridian is the line that separates geographic east from west on the globe. Defined as zero degrees longitude, it bisects the base of a telescope at an observatory in Greenwich, a London borough on the south bank of the River Thames. The line doesn’t pass through New York City. Or Moscow. Or Beijing. Greenwich was chosen in 1884 by an international consortium of longitude mavens who met in Washington, DC, for that very purpose.

  By the late nineteenth century, astronomers at the Royal Greenwich Observatory—founded in 1675 and based, of course, in Greenwich—had accumulated and catalogued a century’s worth of data on the exact positions of thousands of stars. The Greenwich astronomers used a common but specially designed telescope, constrained to move along the meridional arc that connects due north to due south through the observer’s zenith. Because such a telescope does not track the general east-to-west motion of the stars, the stars simply drift by as Earth rotates. Formally known as a transit instrument, such a telescope allows you to mark the exact time a star crosses your field of view. Why? A star’s “longitude” on the sky is the time on a sidereal clock the moment the star crosses your meridian. Today we calibrate our watches with atomic clocks, but back then there was no timepiece more reliable than the rotating Earth itself. And there was no better record of the rotating Earth than the stars that passed slowly overhead. And nobody measured the positions of passing stars better than the astronomers at the Royal Greenwich Observatory.
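  A minimal sketch of that sidereal bookkeeping, with a star and a clock reading invented purely for illustration: at the instant of meridian transit, the local sidereal time equals the star’s catalogued right ascension, so each transit observation reveals how far the clock has drifted.

```python
# Toy illustration: at meridian transit, local sidereal time = the star's right ascension,
# so comparing the clock reading with the catalogued right ascension gives the clock error.

def clock_error_hours(star_ra_hours: float, clock_reading_hours: float) -> float:
    """Sidereal-clock error in hours (positive means the clock runs fast)."""
    error = clock_reading_hours - star_ra_hours
    return (error + 12) % 24 - 12       # wrap into (-12, +12] hours

# Made-up example: a star of right ascension 5h 55m is observed to transit
# when the observatory clock reads 5h 56m, so the clock is about one minute fast.
print(clock_error_hours(5 + 55 / 60, 5 + 56 / 60) * 60)  # ~1.0 minute
```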

  During the seventeenth century Great Britain had lost many ships at sea due to the navigational hazards of not knowing your longitude with precision. In an especially tragic disaster in 1707, the British fleet, under Vice Admiral Sir Clowdesley Shovell, ran aground on the Scilly Isles, west of Cornwall, losing four ships and 2,000 men. Spurred by such losses, England finally commissioned a Board of Longitude, which offered a fat cash award—£20,000—to the first person who could design an ocean-worthy chronometer. Such a timepiece was destined to be important in both military and commercial ventures. When synchronized with the time at Greenwich, such a chronometer could determine a ship’s longitude with great precision. Just subtract your local time (readily obtained from the observed position of the Sun or stars) from the chronometer’s time. The difference between the two is a direct measure of your longitude east or west of the prime meridian.
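  The subtraction reduces to a single conversion factor, since Earth turns through 15 degrees of longitude per hour. A minimal sketch, using an invented chronometer reading:

```python
# Longitude from a chronometer: Earth rotates 360 degrees in 24 hours (15 degrees per hour),
# so the gap between Greenwich time and local solar time converts directly into longitude.

def longitude_from_times(greenwich_hours: float, local_hours: float) -> float:
    """Degrees of longitude west of Greenwich (a negative result means east)."""
    diff = greenwich_hours - local_hours     # chronometer time minus local time
    diff = (diff + 12) % 24 - 12             # wrap into (-12, +12] hours
    return diff * 15.0

# Invented example: local noon by the Sun while the chronometer reads 16:40 Greenwich time
# puts the ship 4 hours 40 minutes behind Greenwich, i.e. 70 degrees west of the prime meridian.
print(longitude_from_times(greenwich_hours=16 + 40 / 60, local_hours=12.0))  # 70.0
```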

  The Board of Longitude’s challenge was ultimately met by John Harrison, an English mechanic who built his first seagoing clock in 1735 and whose prize-winning, palm-sized chronometer followed a quarter-century later. Declared to be as valuable to the navigator as a live person standing watch at a ship’s bow, Harrison’s chronometer gave renewed meaning to the word “watch.”

  Because of England’s sustained support for achievements in astronomical and navigational measurements, Greenwich landed the prime meridian. This decree fortuitously placed the international date line (180 degrees away from the prime meridian) in the middle of nowhere, on the other side of the globe in the Pacific Ocean. No country would be split into two days, leaving it beside itself on the calendar.

  IF THE ENGLISH have forever left their mark on the spatial coordinates of the globe, our basic temporal coordinate system—a solar-based calendar—is the product of an investment in science by the Roman Catholic Church. The incentive was not cosmic discovery itself but the need to keep the date for Easter in early spring. So important was this need that Pope Gregory XIII established the Vatican Observatory, staffing it with erudite Jesuit priests who tracked and measured the passage of time with unprecedented accuracy. By decree, the date for Easter had been set to the first Sunday after the first full moon after the vernal equinox (preventing Holy Thursday, Good Friday, and Easter Sunday from ever falling on a special day in somebody else’s lunar-based calendar). That rule works as long as the first day of spring stays in March, where it belongs. But the Julian calendar of Julius Caesar’s Rome was sufficiently inaccurate that by the sixteenth century it had accumulated 10 extra days, shifting the first day of spring to March 11 instead of March 21. The four-year leap day, a principal feature of the Julian calendar, had slowly overcorrected the time, pushing Easter later and later relative to the actual seasons.

  In 1582, when all the studies and analyses were complete, Pope Gregory deleted the 10 offending days from the Julian calendar and decreed the day after October 4 to be October 15. The Church thenceforth adjusted the leap-year rule itself: any century year not evenly divisible by 400 omits the leap day it would otherwise have counted, thereby correcting for the Julian calendar’s overcorrection.
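  That correction amounts to a three-line rule. A minimal sketch of the Gregorian leap-year test (the rule is the standard one described above; the code itself is only an illustration):

```python
def is_gregorian_leap_year(year: int) -> bool:
    """Julian rule: every 4th year is a leap year.
    Gregorian fix: century years keep their leap day only when divisible by 400."""
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False            # century year not divisible by 400: leap day omitted
    return year % 4 == 0

# 1700, 1800, and 1900 lose the leap day the Julian calendar would have given them;
# 1600 and 2000 keep theirs.
print([y for y in (1600, 1700, 1800, 1900, 2000) if is_gregorian_leap_year(y)])  # [1600, 2000]
```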

  This new “Gregorian” calendar was further refined in the twentieth century to become even more precise, preserving the accuracy of your wall calendar for tens of thousands of years to come. Nobody else had ever kept time with such precision. Enemy states of the Catholic Church (such as Protestant England, and its rebellious progeny, the American colonies) were slow to adopt the change, but eventually everyone in the civilized world, including cultures that traditionally relied on Moon-based calendars, adopted the Gregorian calendar as the standard for international business, commerce, and politics.

  EVER SINCE THE BIRTH of the industrial revolution, the European contributions to science and technology have become so embedded in Western culture that it may take a special effort to step outside and notice them at all. The revolution was a breakthrough in our understanding of energy, enabling engineers to dream up ways to convert it from one form to another. In the end, the revolution would serve to replace human power with machine power, drastically enhancing the productivity of nations and the subsequent distribution of wealth around the world.

  The language of energy is rich with the names of those scientists who contributed to the effort. James Watt, the Scottish engineer who perfected the steam engine in 1765, has the moniker best known outside the circles of engineering and science. Either his last name or his monogram gets stamped on the top of practically every lightbulb. A bulb’s wattage measures the rate at which it consumes energy, which correlates with its brightness. Watt worked on steam engines while at the University of Glasgow, which was, at the time, one of the world’s most fertile centers for engineering innovation.

  The English physicist Michael Faraday discovered electromagnetic induction in 1831, the principle that makes electric generators possible. The farad, a measure of a device’s capacity to store electric charge, probably doesn’t do full justice to his contributions to science.

  The German physicist Heinrich Hertz discovered electromagnetic waves in 1888, which enabled communication via radio; his name survives as the unit of frequency along with its metric derivatives “kilohertz,” “megahertz,” and “gigahertz.”

  From the Italian physicist Alessandro Volta we have the volt, a unit of electric potential. From the French physicist André-Marie Ampère, we have the unit of electric current known as the ampere, or “amp” for short. From the British physicist James Prescott Joule, we have the joule, a unit of energy. The list goes on and on.

  With the exception of Benjamin Franklin and his tireless experiments with electricity, the U.S. as a nation watched this fertile chapter of human achievement from afar, preoccupied with gaining its independence from England and exploiting the economies of slave labor. About the best we could do was pay homage in the original Star Trek television series: Scotland is the country of origin of the industrial revolution, and of the chief engineer of the starship Enterprise. His name? “Scotty,” of course.

 
