
The Theory That Would Not Die


by Sharon Bertsch McGrayne


  As naval commanders came and went during the Mizar’s five cruises to the search site, Bayes’ rule became the group memory of the search and its coordinating principle. For the first time, the rule was used from beginning to end of a long search. Unfortunately, not everyone saw the Monte Carlo prior map as a powerful tool for directing Mizar’s search. It took almost a month to get the map to the scene of operations for its Bayesian updating. Communications between land and sea were so poor that Stone finally hand-carried the map to the Azores on August 12. Not until the research ship’s fourth and fifth cruises in October—five months after Scorpion disappeared—did the detailed prior distributions based on Craven’s and Andrews’ scenarios become available.

  The Mizar’s fifth and last cruise was originally planned to test the sensors, refine the underwater tracking system, and study the contours of the ocean bottom. By this time Craven had organized acoustical studies to more precisely calibrate the location of the blip recorded by the supersecret sensors. Small depth charges were exploded in the ocean at precisely known positions, and their sounds were used to refine the information recorded by naval listening posts during the Scorpion’s last moments. Every day Craven’s acoustical analyses edged the most likely spot for the Scorpion closer to Buchanan’s shining piece of metal.

  In late October the increasingly impatient and by now heavily bearded Buchanan finally got approval to investigate the shiny metal. As Mizar’s sled made its 74th run over the ocean floor, its magnetometer spiked high at several anomalies in cell F6. Returning to the area on October 28, Mizar struggled to pinpoint the spot again. Finally its cameras revealed, lying on the sea bottom, partially buried in sand, the submarine Scorpion. A poorly functioning sonar detector had previously passed right over the sub without finding it. Word that Buchanan was shaving his beard spread rapidly to the United States.

  Richardson was back in the States when he got the phone call. “They gave me the location in code,” he said, “and I plotted it up, and at first I thought it was going to plot right smack in the middle of that high probability square, and I was really excited.” Instead, it was 260 yards away, close to the mysterious piece of shiny metal found at the beginning of the search. The scrap was later determined to be a Scorpion fragment. Still, Richardson joked ruefully, 260 yards off in a 140-square-mile area of open sea was “close enough for government work.”

  Years later, Captain Andrews argued that Bayes was only one day and a half mile behind Buchanan. If Buchanan had not returned the Mizar to the shiny metal that day, Craven’s presearch probabilities, updated with his later acoustical studies, would have found Scorpion first.

  On November 1, five months after the search began, Rosenberg, the co-op student from Drexel, hand-carried the photos of Scorpion back to the United States. Excluding time spent studying a misleadingly magnetic, hull-shaped rock, the search sled had located the submarine after scanning 1,026 miles of ocean bottom at a speed of one knot for the equivalent of 43 days, two days earlier than Bayesian predictions.

  President Johnson was told “the highest probability was that the sinking was caused by an accident on board the submarine.”14 This time he may have listened to probabilities.

  Analyses of the sounds made by the Scorpion suggested that it had been traveling east, rather than west, when it sank. Twenty years later Craven learned that the sub could have been destroyed by a “hot-running torpedo.” Other subs in the fleet had replaced their defective torpedo batteries, but the navy wanted Scorpion to complete its mission first. If Scorpion had fired a defective torpedo, it would have missed its target and probably turned back and struck the sub that had launched it.

  Anxious to document the methods used in the search for Scorpion, the Office of Naval Research commissioned Stone to write Theory of Optimal Search. Published in 1975, it is an unabashedly Bayesian book incorporating applied mathematics, statistics, operations research, optimization theory, and computer programs. Cheaper and more powerful computers were transforming Bayesian searches from mathematical and analytical problems to algorithms for software programs. Stone’s book became a classic, important for the military, the Coast Guard, fishermen, police, oil explorers, and others.

  While Stone was writing his book, the United States agreed to help Egypt clear the Suez Canal of unexploded ammunition from the Yom Kippur war with Israel in 1973. The explosives made dredging dangerous. Using the SEPs developed at Palomares, it was possible to measure the search effectiveness to get the probability that, if a bomb had been there, it would have been spotted. But how could anyone estimate the number of bombs remaining in the canal when no one knew how many were there to begin with? Wagner, Associates chose three priors with different probability distributions to express high, middle, and low numbers. Next, using the handy system of conjugate priors described by Raiffa and Schlaifer in 1961, they declared that each prior would have a posterior with the same class of probability distributions. This produced three tractable distributions (Poisson, binomial, and negative binomial) complete with those statistical desiderata, mean values and standard deviations. Computing became “a piece of cake,” Richardson reported, but it proved impossible to explain the system to hardened ordnance-disposal specialists with missing fingers. In the end, no one talked about Bayes at Suez.
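The conjugate-prior bookkeeping Raiffa and Schlaifer described can be sketched in a few lines. The example below does not use the actual Suez figures or the three distributions the Wagner team chose; it substitutes the standard Gamma-Poisson conjugate pair, with invented prior parameters and counts, purely to show why the computing became “a piece of cake”: the posterior stays in the same family, so updating is arithmetic on two numbers.

```python
def gamma_poisson_update(a, b, counts):
    """Conjugate update: a Gamma(a, b) prior on a Poisson rate,
    updated with observed counts, remains a Gamma distribution."""
    return a + sum(counts), b + len(counts)

# Three hypothetical priors expressing high, middle, and low
# expected numbers of munitions per canal segment.
priors = {"high": (20.0, 1.0), "middle": (8.0, 1.0), "low": (2.0, 1.0)}
counts = [3, 1, 2]  # hypothetical munitions found in three swept segments

for name, (a, b) in priors.items():
    a2, b2 = gamma_poisson_update(a, b, counts)
    print(name, "posterior mean rate:", round(a2 / b2, 2))
```

The same two-number update runs for all three priors at once, which is what made comparing the high, middle, and low scenarios cheap.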

  Up to this point, postwar Bayes had searched only for stationary objects like bombs in a canal, or H-bombs and submarines on the ocean floor. Technically, these were simple problems. But shortly after the Suez Canal was cleared and Theory of Optimal Search was published, intensive efforts were made to adapt Bayesian methods to moving targets: civilian boats adrift in predictable currents and winds.

  The technology was a perfect match for U.S. Coast Guard rescue coordinators like Joseph Discenza, whose job in the late 1960s was to answer the telephone when someone called saying, “My husband went out fishing with my son, and they’re not back.”15 After checking area ports of call for the boat, he used a Coast Guard Search and Rescue Manual to estimate by hand the target’s location and its probable drift.

  “Like a dog with a bone in his teeth,” Discenza started to computerize the Coast Guard’s manual.16 He studied search theory and earned a master’s degree at the Naval Postgraduate School in Monterey, California, and a Ph.D. at New York University. Along the way Discenza discovered that ever since the Second World War the Coast Guard had been using the Bayesian search theory developed by Koopman to find U-boats in the open ocean. Discenza filled in the corporate memory gap between the 1940s and the 1970s. “The Coast Guard was very Bayesian. Even when they’re doing it manually, they’re doing Bayes,” Stone said. But until Discenza, they were like early casualty actuaries, using Bayes’ rule without realizing it.

  Joining forces with Discenza, Wagner’s company designed a computerized search system based on Bayesian principles for the Coast Guard. A natural outgrowth of the H-bomb and Scorpion searches, it combined clues about a vessel’s original location and subsequent movements into a series of self-consistent scenarios and then weighted them as to their likelihood.

  The Coast Guard ruled that estimating probabilities and weights should be a group decision. Each individual involved should weight the scenarios privately before they were averaged or combined by consensus. Above all, no scenario should be discarded. “To leave out subjective information is to throw away valuable information because there is no unique or ‘scientific’ way to quantify it,” Stone urged.17

  What if a ship in distress radioed its position but a small plane reported seeing it an hour later a hundred miles away? One or the other had made an error in position, but neither report should be ignored; both should be assigned relative reliabilities. As Stone commented, “Discarding one of the pieces of information is in effect making the subjective judgment that its weight is zero and the other weight is one.”
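Stone’s point about never assigning a report a weight of zero can be made concrete with a standard precision-weighted fusion of two Gaussian position estimates. The distances and uncertainties below are invented for illustration; the actual Coast Guard reliability weightings are not given in the text.

```python
def fuse_positions(x1, sigma1, x2, sigma2):
    """Precision-weighted (Bayesian) fusion of two independent Gaussian
    position reports; neither report's weight is forced to zero."""
    w1, w2 = 1 / sigma1**2, 1 / sigma2**2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)       # weighted mean position
    sigma = (w1 + w2) ** -0.5                 # fused uncertainty shrinks
    return x, sigma

# Hypothetical: radioed fix at mile 0 (good to ±10 mi), aircraft
# sighting at mile 100 (good to ±30 mi) along the same track.
x, s = fuse_positions(0.0, 10.0, 100.0, 30.0)
print(round(x, 1), round(s, 1))
```

The fused estimate lands near the more reliable report but is still pulled toward the sighting; discarding either report would amount to setting its weight to exactly zero or one.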

  Bayesian updating and, at Richardson’s insistence, Monte Carlo techniques were incorporated into the Coast Guard system in 1972, almost two decades before university theorists popularized the method or the term “filters.” The Monte Carlo methods estimated an enormous number of possible latitudes, longitudes, velocities, times, and weights for each lost ship to pinpoint 10,000 possible target locations.
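A minimal sketch of such a Monte Carlo scheme: each of 10,000 particles carries a hypothetical position, drift velocity, and weight, and the whole cloud is pushed forward through an assumed current. The drift speeds and uncertainties below are made up; the real system’s drift models, drawn from the Coast Guard manual, were far richer.

```python
import random

random.seed(0)
N = 10_000  # the Coast Guard system tracked ~10,000 possible locations

# Each particle: [north offset nm, east offset nm, eastward drift kn,
# northward drift kn, weight]. All numeric values are hypothetical.
particles = []
for _ in range(N):
    lat = random.gauss(0.0, 5.0)   # uncertainty in last known position
    lon = random.gauss(0.0, 5.0)
    ve = random.gauss(1.5, 0.5)    # eastward set of current + leeway, kn
    vn = random.gauss(0.3, 0.5)
    particles.append([lat, lon, ve, vn, 1.0 / N])

def drift(particles, hours):
    """Advance every hypothesis through the assumed current field."""
    for p in particles:
        p[1] += p[2] * hours       # east offset grows with eastward drift
        p[0] += p[3] * hours       # north offset grows with northward drift
    return particles

drift(particles, 24.0)
mean_lon = sum(p[1] * p[4] for p in particles)   # weighted mean east offset
```

After a day of simulated drift the weighted cloud centers roughly a day’s current-set to the east, and its spread is the search planner’s uncertainty.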

  Stone also used a Bayesian procedure, an early version of a Kalman filter, to separate and concentrate the data or signals according to specified criteria and to weigh each possible path of a target’s motion according to its believability. The technique did not become popular among academics until the 1990s, but it saved military and space contractors immense amounts of time in the 1960s because their computers had little memory or power. Before Rudolf E. Kalman and Richard Bucy invented the procedure in 1961, each original observation had to be completely recalculated every time a new one appeared; with the filter, new observations could be added without having to do everything all over again. Kalman vehemently denied that Bayes’ theorem had anything to do with his invention, but Masanao Aoki proved mathematically in 1967 that it can be derived directly from Bayes’ rule. Today, it is known as a Kalman or a Kalman-Bucy filter.
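The recursion Kalman and Bucy introduced can be shown in one dimension. This sketch uses hypothetical noisy fixes on a static target and illustrates only the core point made above: each new observation is folded into the running estimate without recomputing all the old ones.

```python
def kalman_step(x, P, z, R, Q=0.0):
    """One recursive update: fold measurement z (variance R) into the
    running estimate x (variance P) without revisiting past data."""
    P = P + Q                  # predict: model uncertainty grows by Q
    K = P / (P + R)            # Kalman gain: how much to trust z
    x = x + K * (z - x)        # correct the estimate toward z
    P = (1 - K) * P            # uncertainty shrinks after the update
    return x, P

x, P = 0.0, 1e6                # vague initial belief about position
for z in [10.2, 9.8, 10.1, 9.9]:   # hypothetical noisy fixes, R = 1
    x, P = kalman_step(x, P, z, R=1.0)
print(round(x, 2))
```

With a vague prior and a static model, the recursion converges to the running average of the fixes, using constant memory per step; that constant cost is what saved 1960s contractors so much computing time.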

  Once Monte Carlo methods and filters were adopted, even untenable and highly improbable paths produced valuable information and helped searchers determine which of the remaining paths were more likely. As more information arrived from sensors, Coast Guard aircraft, weather reports, tide tables, and charts of prevailing currents and winds, the data were converted into likelihood functions and then combined with priors about the target’s movements to predict its probable location. As data accumulated with each iteration, the filter concentrated the probability on a relatively small number of highly probable paths.
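The update cycle described here, multiplying a prior map by a likelihood and renormalizing, is the same machinery that ran through the Scorpion search. A toy version over four cells (the cell labels echo the text; the probabilities and detection rate are invented) shows how even an unsuccessful sweep is informative: it shifts probability onto the unsearched cells.

```python
def bayes_update(prior, searched_cell, p_detect):
    """Posterior over cells after an unsuccessful search of one cell:
    multiply that cell's prior by P(miss) = 1 - p_detect, renormalize."""
    post = dict(prior)
    post[searched_cell] *= (1 - p_detect)
    total = sum(post.values())
    return {c: p / total for c, p in post.items()}

# Hypothetical four-cell probability map for a stationary target.
prior = {"E5": 0.4, "E6": 0.3, "F5": 0.2, "F6": 0.1}
post = bayes_update(prior, "E5", p_detect=0.8)
print(max(post, key=post.get))
```

After one fruitless sweep of the most probable cell, the map’s peak moves to a neighboring cell, which is exactly how a negative search result redirects the next day’s effort.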

  The Coast Guard’s system was up and running in 1974 when a tuna boat sank off Long Beach, California. Two days later, purely by chance, a freighter found 12 of its survivors in a lifeboat. Using their new technology, the Coast Guard calculated backward from the chance rescue to the tuna boat’s probable capsize point and then forward again using probability maps of ocean currents and Bayesian updating. Armed with a Bayesian probability map, the Coast Guard rescued three more men the next day. Another successful search took place two years later after a ship capsized and sank while crossing the Pacific. Five sailors took to sea in two life rafts. Twenty-two days later, also by chance, two survivors were found in one of the rafts. Six days after that, the same Coast Guard program found a third survivor, who had been adrift for 28 days.

  Bayes had found stationary objects lodged on the seafloor and had tracked boats drifting with predictable ocean currents and the wind. But what about locating and following evasive prey, a Soviet submarine, say, or a moving target operated by human beings? Could Bayes accommodate human behavior?

  “It’s the Cold War, and there are submarines out there that are a threat to the U.S.,” recalled Richardson. “They’re moving targets, so why not do something in antisubmarine warfare. . . . It started in the 1970s and continued for two decades, and I personally did a lot of work searching for subs in the Atlantic Ocean and the Mediterranean.”

  When the future vice admiral John “Nick” Nicholson took command of the U.S. submarine fleet in the Mediterranean in 1975, he secured a $100,000 grant from the ONR to bring Richardson to Naples for a year. It was one of ONR’s biggest contracts, and navy accountants considered it a waste of money. But the Mediterranean was full of Soviet and NATO ships and submarines eyeing one another; the Soviets alone had 50 vessels, including ten submarines. By the early 1970s the “Med” was so crowded that the U.S. and Soviet governments signed a pact to reduce collisions. When the U.S. Navy began routine tracking of Soviet subs in the Mediterranean in 1976, Nicholson thought Richardson and Bayes’ rule could help.

  Starting from scratch, Richardson cranked intelligence information into an antiquated computer in Naples: previous submarine tracks; particular types of Soviet sub that were apt to take a particular route and perform certain maneuvers; and reports from sonobuoys, passive acoustic listening devices dropped by aircraft, and from fixed underwater surveillance systems. Unlike Koopman’s purely objective analysis of radio transmissions and submarine tracks during the Second World War, Richardson was making subjective assessments of the behavior of Soviet officers. He was also using real-time feedback of actual search results, something that submarine hunters in the Second World War would have regarded as science fiction. To all this intelligence data Richardson added the islands, sea mounts, and tight passages in the region’s constricted geography. These natural obstacles became surprisingly helpful features.

  By definition, tracking involves uncertainties and estimations that are far from ideal. Parameters change as new data appear, and “to make matters worse, the data can be remarkably uninformative and obtained from a number of different sources,” as Stone wrote.18 An optical scanner might spot a distant periscope emerging a foot above the horizon for 10 seconds but fail to identify it as a submarine. Operators watching radar signals on their computer screens could not always distinguish a sub from a surface ship. Arrays of acoustic hydrophones extended over the seafloor for hundreds of miles to detect low-frequency acoustic signals emitted by submarines, but their data were often highly ambiguous. Different targets, for example, radiated acoustic signals at the same or nearly the same frequency. Even the ocean distorted sounds. Sound waves bend with every change in water temperature, and the roar of breaking waves affects noise-to-signal ratios. Bayes’ common currency—probabilities—fused information gathered from these various sources. Amid such vague and ephemeral reports, Bayes’ rule was in its element.

  One summer day in 1976 a Soviet nuclear-powered submarine slipped through the Strait of Gibraltar and entered the Mediterranean. It was a 5,600-ton Echo II class vessel, armed with cruise missiles that could be fired from the surface. The U.S. fleet tracked it as far as Italy before losing it. No one could tell when it would pass through the Sicily Straits into the eastern Mediterranean.

  Besides the submarines under his command, Nicholson had been assigned four antisubmarine destroyers that pulled experimental sleds packed with trailing-wire sonar detectors. Arranging his forces across the Sicily Straits so the destroyers would have a chance to detect the Soviet sub passing through, Nicholson waited tensely. “The wait went on longer and longer than all of our operations [intelligence] people were expecting,” Nicholson related years later. “Tony kept working his program and he kept saying, ‘I still think there’s an x percent possibility that it hasn’t gone through yet.’”19

  Nicholson’s superiors were pressuring him to move his submarines and surface ships over to the eastern Mediterranean to look for the sub there. But he was an old hand at pressure; he had been executive officer and navigator of the second nuclear submarine to go under the Arctic ice cap to the North Pole and had commanded the first nuclear sub to go from the Pacific to the North Pole in the winter.

  Richardson, still poring over the old computer, urged Nicholson to ignore his commander. “I think we should take at least one to two more days,” he said. He estimated the probability that the sub had not yet slipped through the straits at about 55%. Nicholson didn’t know how much confidence to place in Richardson’s system. It was new, it included subjective assessments of human behavior, and it was being used for real-time decision making. But Richardson was filling a void other intelligence and operations experts could not. Making a bold decision that could have destroyed his career, Nicholson decided to wait. “And, lo and behold, we made contact and were able to track the guy through the Strait.” The Sixth Fleet was jubilant, and, as Richardson described the reaction of the brass, “most everyone became a believer” in Bayesian search methods.

  Thanks to the trailing-wire sonar detectors, every time the Soviet submarine came to the surface in the eastern Mediterranean one of Nicholson’s destroyers was cruising nearby. Their skippers were under orders not to come too close to the sub, but, as Nicholson says, “These destroyer guys don’t listen very well.”

  One rather clear Sunday morning the Soviet submarine surfaced with its sail (a metallic structure covering periscopes and masts) about four feet out of the water. Waiting nearby was one of Nicholson’s destroyers, the 3,400-ton Voge. To everyone’s surprise, the Soviet sub turned toward the Voge and charged at full speed.

  As Nicholson recounts the story, “Everybody on the ship was taking pictures of this thing, the submarine moving along at 20 knots with its sail out of the water, when the skipper of the surface ship slowed for some reason. The submarine skipper apparently didn’t keep his eye right on it, and the first thing you know, the submarine rammed right into the Voge. We believe the Soviet captain was trying to cut the trailing-wire sonar that had given him such fits.”

  The sub was badly damaged, and its captain was relieved of his command that same night. The Voge was towed to France for repairs. The incident proved the value of Richardson’s tracking and of trailing-wire sonar systems on surface ships. Later, Bayesian methods tracked Soviet submarines in the Atlantic and Pacific, although after the breakup of the USSR in 1991 Russian out-of-area submarine deployments were greatly reduced.

  “The antisub warfare work was pretty much the highlight of things that were really Bayesian,” Richardson reflected. “. . . It was like being back in Spain again. I was ten or fifteen years older, sitting up all night, running my computer and briefing the admiral in the morning. . . . That’s kind of the ultimate happiness, when you can make things move around in the world based on your ideas.”

  Meanwhile, the military that had been so slow to embrace Bayesian search theory was exploring its use for identifying asteroids speeding toward Earth and for locating Soviet satellites as they orbited in space. In 1979 NATO held a symposium in Portugal to encourage the solution of “real problems” with Bayesian methods.

 
