
Waters of the World


by Sarah Dry


  Such “imaginary” uses were implicit in the early plans for the application of electronic computing to numerical weather prediction. Weather forecasts—the task the computers were initially conceived to perform—are, after all, imagined futures. The difference between numerical weather forecasts and so-called “weather models” is that models were intended as tools for understanding weather processes, while forecasts addressed much more immediate and practical questions.

  In Woods Hole, Malkus was applying these new ideas and new computing power to the tricky task of describing the growth of individual clouds. Using data she had gathered in the bumpy PBY flights, she created the first numerical cloud model, which described, using a series of physical equations, how clouds grew and developed.40 This was groundbreaking work—no one had yet attempted to reduce cloud growth to a series of equations. But it was only a start. Her studies of individual clouds only deepened her curiosity about how the process of convection acted on the larger scale. Both the temporal and spatial scale of regional or even global atmospheric motions required much more computing power than was available. And even if the computing power did become available, the problem remained much too complex to solve fully using the brute force of numerical computation. Better physical understanding was needed before more complex models could be contemplated.

  Malkus took another tack. First, she convinced Herbert Riehl, her former supervisor, to join her as a partner on the project. Together, they began looking not just at data about tropical clouds but at a vastly increased scale: the entire tropical zone, which stretched ten degrees on either side of the equator all the way around the planet. This was a scale of investigation that had previously been impossible. Now, thanks to the data streaming in from airplanes and radiosondes and plotted on global maps, Malkus and Riehl were able to form a clearer picture of how the atmosphere was moving around the entire planet—and to identify a gaping hole in theories of the general circulation. They identified this hole by tracking the movement of the sun’s energy around the planet, and in the process they stumbled on an unexplained gap in that transfer—like a missing participant in a game of telephone. Somehow energy was moving around the planet, but the details of where and how remained fuzzy.

  The sun is the source of all the energy on our planet. When its rays hit the earth, the angle and shape of the planet determine how much light is received by different parts of the planet. At latitudes higher than thirty-eight degrees in both hemispheres, the earth loses heat. Only between thirty-eight degrees and the equator—roughly the latitudes occupied by the African continent—is the radiation balance positive. But the planet generally maintains its average temperature at a fairly stable level. So the planet as a whole must act to transfer heat from the region around the equator to the poles; otherwise it would begin to cool off. To complicate matters, sea-level winds at the equator—the so-called trade winds upon which sailors had long depended—blow very consistently toward the equator. While it was generally accepted that the heat was carried aloft over the equatorial regions, and then transported poleward at higher altitudes, it was unclear what the precise mechanism for this transfer was. Somehow heat must be getting from the surface of the equatorial ocean—where the heat so efficiently absorbed by the water was in turn radiated upward—to the upper levels of the troposphere, where the winds blew toward the poles. But measurements had shown that the middle layers of the atmosphere—between the surface of the ocean and the upper troposphere—did not have nearly enough energy to transfer the heat upward. The middle layers were a kind of dead zone, energetically speaking. This left a mystery. How was the hot air getting from the sea surface up to the troposphere?

  In addition to the “weather models” in which an increasingly complex earth was constructed out of equations representing physical phenomena, a new breed of studies had been developing since 1920 or so.41 It was to these studies that Malkus and Riehl now turned. Called bookkeeping studies, they worked on the principle that in order to understand the planet, it was sometimes advisable to (momentarily) set physics aside. Just as an accountant processes transactions in order to balance a business’s account books, so too was it possible to “process” the heat in the earth’s climate in order to reach a certain balance, or equilibrium. In these studies, all that mattered was the increase or decrease of a chosen variable, be it heat, angular momentum, carbon dioxide, or any number of other quantities (such as, for example, ice, ozone, tritium, methane, and sulfur).

  As suggestive as these papers were about the role played by small-scale phenomena such as eddies in larger-scale atmospheric features, no one had yet considered that cumulus clouds could play a role in large-scale circulations. This was the task Malkus and Riehl now set themselves. Their guess, based on their study of the radiosonde data and a rather bold intuitive leap made in the absence of other evidence, was that the heat was traveling upward from the ocean’s surface in narrow regions of exceptionally buoyant air. These columns, or “hot towers,” in which water vapor was condensed into droplets and released its heat, were the “overgrown brothers” of ordinary trade-wind cumulus clouds. They were giant—commonly reaching up to 35,000 feet high and sometimes as high as 50,000 feet—but relatively sparse. At any given moment, there might be only a few thousand active across the entire planet. These could serve as escalators for an enormous amount of heat, which was thereby able to bypass the lower layers of the atmosphere in which the winds were blowing back toward the equator. The lopsidedness of the situation struck Malkus and Riehl forcefully. “The most striking conclusion from this work,” they summarized, was the fact that “only about 1500–5000 active giant clouds are needed to maintain the heat budget of the equatorial trough and thus, implicitly, to provide for much of its poleward energy transport!”42

  The hypothesis—and it remained only a hypothesis, with little direct evidence to support it—solved the mystery of how heat from the surface of the tropical oceans could be transported high enough in the atmosphere to be carried on winds blowing away from the equator. It also linked the energetics of the ocean and the atmosphere in a way that very few meteorologists (or oceanographers) had yet done. Hot towers showed that the atmospheric circulation could only be understood in relation to the ocean that supplied most of its heat. Clouds could play an outsized role in the climate system, just as the impresarios of climate control had imagined. Even without proof, and with a rather wide degree of uncertainty, the hypothesis was suggestive enough that Malkus and Riehl did not hesitate to publish.43 Cumulonimbus clouds of the necessary height—some 40,000 to 50,000 feet—had been observed. The question remained whether there were enough of them transporting enough heat to resolve the paradox. They ended their paper calling for more observations, during the upcoming International Geophysical Year, to enable their theory to be refined and tested.

  The challenge for Malkus and Riehl, and for others, was to generate not separate meteorologies—of fronts, of hot towers, of the tropics, and of cyclones—but a single science that could, somehow, describe the linkages between these scales. The theory of hot towers had seemed to solve the mystery of how energy was transferred from lower to higher altitudes in the tropics, but it had created another. How would it be possible to characterize (or understand) a “system” in which large-scale regularities—the general circulation—were in part determined by the most seemingly ephemeral and fickle of phenomena?

  “What we were doing that no one else had ever done before was to put in the cloud systems as a key part of tropical energetics,” explained Malkus, “and thus have energy move up scale in the circulation size hierarchy, rather than down as in classical hydrodynamics.”44 She was keenly aware of how strange, and theoretically complex, such a system was. “It is no small wonder,” wrote Malkus, “that the global circulation system operates in fits and starts, with its evanescent cylinders, of transient numbers, whose very existence depends upon the vagaries of the flow itself!”45 Malkus and Riehl had put their finger on a weather trigger, just the sort of thing the promoters of weather control were after. But what good was an evanescent, difficult-to-locate trigger that seemed, somewhat paradoxically, to depend for its existence on the large-scale phenomena that it might be said to affect? Here was a confoundingly circular and dynamic world of multiple scales that seemed to lack any reassuring hierarchy. The prospect of control seemed remote.

  FIG. 5.8. Joanne and Herbert Riehl puzzling over hurricanes, in a cartoon by Margaret LeMone. Note the “real hand-analyzed data” and the big question: “Why are there so few hurricanes?” Credit: Margaret LeMone.

  Malkus and Riehl’s first impulse, after publishing this suggestive paper, was to do some more observing. Given the complex relationship between wildly disparate scales, they thought the only way to make further advances would be to “study these widely different scales of motion in their context to each other.” They set out to make the “first attempt, largely descriptive, to relate synoptic and cloud scale phenomena.” With this research program, Malkus was moving a step toward her goal of linking small-scale phenomena, such as clouds, with larger-scale phenomena such as storms, hurricanes, and finally the general circulation of the entire atmosphere. It was an exciting time. The long era during which the cry had always been “we need more observations” seemed to be finally coming to a close, as airplanes and radiosondes began gathering more data in more places than ever before.46 Drawing on the Wyman and Woodcock trade-wind expedition, Stommel’s 1947 entrainment paper, and a series of other papers demonstrating how important the surrounding atmosphere was in the formation of clouds, Malkus and Riehl summarized their findings in a book titled Cloud Structure and Distributions over the Tropical Pacific Ocean. In it, they demonstrated why it was no longer possible to look at the tropics and see a boring, steady-state atmosphere. Henceforth, the tropics would be seen as a tumultuous place, far more variable than it was stable.47 Rain fell in the tropics far more erratically than anyone had imagined. In regions where the majority of rain fell on just two or three days a month and even annual averages varied significantly, averages were not merely unhelpful but actively misleading.48

  * * *

  Underlying this optimism, for those who cared to notice, was a groundswell of doubt and uncertainty. It was one thing to have observations and the means to make calculations based upon them, but would “mere” observations ever really be enough to crack the atmospheric code? Physical insights, not just observations, were required to reduce a potential deluge of data into a usable current. “It is only through the leaven of some purely physical hypothesis,” cautioned Victor Starr, “that we are guided to the appropriate mathematical use of these principles.”49 Where to find that leaven? The most useful tool in the scientific arsenal for reducing complexity was the experiment, a controlled intervention that enabled a researcher to isolate and test aspects of an otherwise overwhelmingly complex problem. Computers had raised the possibility of identifying likely points for experimental intervention. But the ability to perform controlled physical (rather than computer) experiments in which certain variables were held stable while others were manipulated had long eluded atmospheric scientists, partly for the reasons that Victor Starr had underlined. The atmosphere was so big, so unruly, and so “essentially one” that it was almost impossible to render it a pliable experimental subject.

  Clouds could be—and had been—reproduced in the laboratory, including memorably by John Tyndall himself, but these miniature artificial clouds failed to capture all of the salient features of natural clouds. The motions of fluids more generally had been fruitfully investigated by Dave Fultz in a laboratory at the University of Chicago beginning in 1950. There he’d set up what were affectionately called rotating dishpan experiments. By heating a round tank of water, rotating it, and then dropping dyes into it, Fultz captured pictures of changes in the flow that reproduced some of the large-scale features of the general circulation of the atmosphere and ocean, such as the jet stream and other atmospheric waves. Using this apparatus, Fultz and others were able to reproduce some atmospheric phenomena artificially.50

  The laboratory work by Fultz and others was useful but also frustrating, precisely because of how important scale was to matters both oceanographic and atmospheric. Much could be learned from reducing the ocean or atmosphere to a dishpan-sized model, but much was inevitably missed from such a set-up. The only way to truly understand the atmosphere, many felt, would be to experiment on it directly. The idea of an atmospheric experiment was almost unavoidable in these years, following a war that had been brought to a close by a grand and terrible atmospheric experiment that had produced, in the skies over Hiroshima and Nagasaki, an entirely new cloud.

  As darkly potent as the radioactive clouds released by atomic weapons were, other, less obviously powerful technologies made surprising and important contributions to the growing sense that experimenting on the planet was not only inevitable but a necessary part of the progress of human knowledge. It was specifically the domestic freezer, a new appliance designed by General Electric to meet the growing demand of America’s housewives for convenient and nutritious food with which to feed the postwar baby boom, that heralded a transformation in meteorological practice.

  In 1946, in the laboratories of GE, a young engineer named Vincent Schaefer had been playing around with creating supercooled clouds inside one of these consumer freezers. After generating clouds of supercooled droplets from the moist air expelled from his lungs, he experimented with dropping bits of dry ice into them. Immediately, and dramatically, the clouds precipitated into snow. His colleague Irving Langmuir predicted that atmospheric clouds found outside GE freezers would respond in the same way. Bernard Vonnegut (brother of author Kurt) then demonstrated that silver iodide could be a very effective cloud seeder (more effective, per gram, than dry ice). Later in 1946, Schaefer succeeded for the first time in seeding a natural cloud in situ with dry ice. It was the beginning of a bonanza of cloud seeding, in which states across America (mainly in the dry Western states) sought to solve their agricultural worries with the expeditious application of a few kilograms of silver iodide.

  In 1947, under the auspices of Project Cirrus, Langmuir seeded the first hurricane using this technique. The effects were disastrous. The storm, which had been headed northeast over the Atlantic off the coasts of Florida and Georgia, abruptly reversed track and headed west, making landfall in Georgia and South Carolina. Though the observers aboard the aircraft that had seeded the storm did not measure any changes in the structure or intensity of the storm (which might have indicated that the seeding had been the cause of the change in its direction), Langmuir nevertheless could not resist claiming “success” in this instance, even though the landfall had resulted in damage.51 The local towns sued, and cloud seeding was flagged not as a source of knowledge but as one of potentially limitless liability.

  Such episodes demonstrated how strong was the desire to exploit what remained a little-understood aspect of cloud physics—the role played by seeds, or nucleators, in prompting precipitation. Bernard Vonnegut’s brother, Kurt, was inspired by these events to write Cat’s Cradle. Ice-9, Kurt Vonnegut’s imaginary corollary to silver iodide, turned everything it touched not to water, but to ice. The consequences were terrible, and the message of the tale was as clear as the destructive ice: Interfere with the dynamics of nature at your peril.

  The distance between visionary dreams and inadvertent consequences was shorter than most imagined. In 1957, Roger Revelle and Hans Suess published an article in which they described the widespread emission of carbon dioxide via the burning of fossil fuels as a “large scale geophysical experiment.”52 This now-famous sentence is often presented as a prescient call to arms, one of the first to alert humanity to the risks of an uncontrolled intervention into the planet’s climate system. Revelle and Suess did emphasize the novelty of the situation, noting that this experiment “could not have happened in the past nor be reproduced in the future.” But rather than warning of the danger of unchecked emissions, Revelle and Suess were urging their fellow scientists to take advantage of an unprecedented opportunity to study the ocean, much as Rossby had mused on the possibility of covering the polar caps with coal. They used the term experiment in the classical sense of a scientific test designed to eliminate as much uncertainty as possible. “This experiment, if adequately documented, may yield a far-reaching insight into the processes determining weather and climate.” Careful measurement and observation, in other words, could transform a merely unwitting (and uncontrolled) intervention into a proper scientific experiment. Revelle and Suess accordingly urged, as Malkus and Riehl had, that data be collected during the International Geophysical Year which could be used to track the path taken by this excess carbon dioxide as it traveled through the “atmosphere, the oceans, the biosphere and the lithosphere.”53

 
