Code Warriors


by Stephen Budiansky


  Programming the early machines involved a host of highly specialized tricks to synchronize computational steps with fetching data from memory and other input-output functions, all of which were opaque to the cryptanalysts, who “come to regard programmers as very obstinate and undependable people who have no grasp of the problem, bother a great deal about petty details, appear crushed by the slightest change of plans, and never get jobs done on time.” The programmer in turn found himself bewildered by the cryptanalysts, “notoriously inarticulate even in their own idiom,” throwing out “terms like ‘stecker,’ ‘three wheel cycle,’ ‘isomorphic poker hand,’ and ‘wheels not on a true base’ ” that he “has never heard…in his life.”48

  Because of the considerable individual differences among wired-rotor cipher machines, each of which presented its own potential weaknesses around which an attack could be fashioned, the programming of research jobs aimed at recovering the basic configuration and setup of rotor machines proved the greatest challenge, even as they potentially made the best use of the digital computer’s inherent flexibility. The easiest jobs to program, by contrast, were the massive, highly repetitive streaming runs to match messages in depth, hunt for busts, or laboriously “set” individual messages against known key sequences in cases where the indicator system of an otherwise solved system could not be broken. The teleprinter scramblers, with their often exceptionally long key sequences, typically involved huge data processing runs as well. These types of jobs were also the most processor-intensive: a search devised to locate a bust in messages produced by a Soviet military cipher machine known to NSA as Silver called for more than 10¹⁷ tests, far beyond the processing capacity of the most advanced computers at that time. Yet with a powerful enough machine it might be possible even to attempt something approaching a brute-force “exhaustive key search,” simply trying every single possible key to unlock a message setting.49

  In the wake of the Clark committee report, Canine had seen an opportunity to make a pitch for a huge project to make an unprecedented advance in computer speeds in a single leap. Given the “very high standards of security techniques and discipline” the Soviets had introduced in their high-level systems, “the single most important project to improve NSA’s capability on the Russian high-level problems is the development of super high-speed machinery,” argued a May 1956 NSA proposal titled “Recommendations for a Full-Scale Attack on the Russian High-Level Systems.” It recommended hiring nine hundred additional staff and buying $16 million in new computers to tackle the “three main high-level problems,” which included both machine ciphers and manual one-time-pad systems. (The proposal also recommended providing a subsidy to GCHQ “to derive full advantage of their established technical competence.”)50

  Kullback, Sinkov, Snyder, and other NSA cryptanalytic veterans in the meantime had been pushing a proposal from IBM that promised a hundredfold increase in speed over the IBM 705, which they felt would offer the breakthrough required. Negotiating with IBM was a bit like dealing with the court of Louis XIV. Although the company was determined to become the industry leader in computers, it was determined to do so on its terms. By the late 1950s, half of NSA’s computers were being supplied by IBM, which was taking in more than $4 million a year in rental fees from the cryptanalytic agency. The company repeatedly held out the promise that its next computer would be “a machine designed primarily for Agency needs,” but invariably once it had a contract in hand the design would drift back to a general-purpose machine that could be sold to other customers, none of which had NSA’s unique needs for massive data handling and specialized streaming processing. The IBM 701, which IBM originally called the “Defense Calculator,” was much more of a number-cruncher designed to meet the needs of Los Alamos’s nuclear weapons designers, meteorologists at the U.S. Weather Bureau, and ballistics engineers at the Army’s ordnance labs. The new IBM machine that the company was now proposing was turning into the same bait and switch. In the summer of 1955, NSA agreed to provide IBM the $800,000 in funding it needed to develop the high-speed core memory that was to be the heart of the new “Stretch” computer. But meanwhile IBM also negotiated a deal with the Atomic Energy Commission to supply Los Alamos with a Stretch computer, too, for a fixed price of $4.3 million; then the company’s top management began to insist that whatever the final design, it had to be marketable to commercial users as well.51

  “As usual the agency has a firm hold on the IBM leash and is being dragged down the street,” an NSA engineer assigned to keep tabs on the company’s work reported as the project progressed. “If you want to control an R/D contract you should pick a company other than IBM. If you pick IBM sit back and wait to get something like the equipment you ordered at a premium price. Don’t try to direct, you’re only kidding yourself.”

  By the time the first machine was delivered to NSA in 1962, the price of the project had ballooned to $19 million, which did not include $1 million for supplies such as magnetic tapes and cartridges; $4.2 million for training, additional personnel, and software development; $196,045 for “installation”; and $765,000 a year in rental fees. IBM had resolved the problem of building a computer that could simultaneously serve scientific, cryptanalytic, and commercial customers by designing a flexible central processor, a high-speed arithmetic add-on unit for the AEC, and an add-on streaming unit for NSA, modeled on Abner’s “Swish” function. The special NSA add-on was called “Harvest,” which eventually became the name of the whole system; its official designation was the IBM 7950.52

  Canine pushed for an even more ambitious research project in the wake of the Clark report. At an NSA cocktail party the director was talking to several of the agency’s senior computer managers about the seeming impossibility of ever building a machine fast enough to get ahead of the relentlessly growing data processing load. Harvest, they noted, was to have a 10-megacycle processor speed. Canine exploded in frustration, “Build me a thousand megacycle machine! I’ll get the money!”

  Canine went to President Eisenhower and did just that, securing for NSA a special exception from the rule that all basic research in the Department of Defense had to be funded through a central agency. The Lightning project, as it was called, was strongly pushed inside the agency by Howard Engstrom, who, though in ill health, agreed in 1956 to return from Sperry Rand, where he was a vice president, to become NSA’s deputy director. With a budget of $5 million a year for five years, NSA contracted with IBM, RCA, UNIVAC, Philco, and MIT to carry out basic research on microcircuitry and component fabrication.53

  By 1960, NSA had spent $100 million on computers and special-purpose analyzers. The basement of the Operations Building at Fort Meade held more than twenty general-purpose machines, one of the largest complexes of computing power in the world. There had been a brief flurry of excitement when two significant busts were found in Silver messages in 1956, leading to the hope that a general solution would soon follow. But in spite of a $20 million crash program to quickly build a series of special-purpose analyzers called Hairline, the Soviet machine cipher resisted regular solution; only about 3 percent of the traffic was exploitable by late 1957. A later agency review, apparently referring to the same machine, noted that even though NSA’s cryptanalysts had devised a method for reconstructing the machine’s internal configuration “halfway through the typical two-year crypto period, on average,” reading individual messages still was limited to instances of operator carelessness or machine malfunction. “While valuable, Silver’s messages contained rather low-level information,” an agency history of NSA computers acknowledged. It would take more than that to bring back the golden age of World War II codebreaking.54

  —

  Even with all the computing power in the world, the classical methods of cryptanalysis were also hitting fundamental limits. “To rush a computer to completion by an extravagant expenditure of both money and of our technical resources” was to put brawn over the brains actually required, the Baker Panel critically observed, all the more so since it was clear now that “the goal of the 1,000-megacycle repetition rate can no longer be regarded as a near magic solution to the problem of breaking” Soviet high-level ciphers. The panel pointed out that in the case of the most secure Soviet machine systems, an exhaustive key search was beyond the bounds set by the laws of physics.

  “There is not nearly enough energy in the universe to power the computer” that could test every setting of such a rotor machine, which had an effective cryptanalytic keyspace on the order of 10⁴⁴. Even the “more modest undertaking” of recovering the setting of an individual message enciphered on such a machine whose internal configuration had already been recovered, which would involve testing about 10¹⁶ possibilities, would cost $2,000,000,000,000,000,000,000 per message for the electricity required to power any known or projected computing devices.55 (In 1998 a $250,000 machine built with 1,856 custom-made chips successfully carried out an exhaustive key search on the 56-bit key DES encryption system—a keyspace slightly greater than 10¹⁶—in two days. But a 128-bit key, with a keyspace of the order 10³⁸, can be shown to resist an exhaustive search even by the most theoretically energy-efficient computer that the laws of physics permit.)
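
  The arithmetic behind the panel’s warning is easy to reproduce. The short Python sketch below is an illustration added here, not anything from NSA or from the book; the key-test rates are assumptions chosen purely for scale. It compares the 56-bit DES keyspace, a modern 128-bit keyspace, and the 10⁴⁴ effective settings the Baker Panel cited, and shows why a brute-force search that was barely feasible in 1998 remains hopeless a few dozen powers of ten later.

# Back-of-the-envelope sketch of exhaustive key search times.
# The key-test rates below are assumptions for illustration, not historical figures.

KEYS_DES    = 2**56     # ~7.2e16 keys: the 56-bit DES keyspace
KEYS_128BIT = 2**128    # ~3.4e38 keys: a modern 128-bit keyspace
KEYS_ROTOR  = 10**44    # effective keyspace the Baker Panel cited for the Soviet rotor machine

SECONDS_PER_YEAR = 3.156e7

def years_to_exhaust(keyspace, tests_per_second):
    """Time to try every key in the space at a given test rate."""
    return keyspace / tests_per_second / SECONDS_PER_YEAR

# The 1998 DES cracker tried very roughly 9e10 keys per second (an approximation);
# it found its key in about two days, partway through the space.
print(f"DES, full sweep at 9e10 keys/s:  {years_to_exhaust(KEYS_DES, 9e10) * 365:.1f} days")

# Even an imagined machine a trillion times faster makes no dent in larger keyspaces.
print(f"128-bit key at 9e22 keys/s:      {years_to_exhaust(KEYS_128BIT, 9e22):.1e} years")
print(f"10^44 settings at 9e22 keys/s:   {years_to_exhaust(KEYS_ROTOR, 9e22):.1e} years")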

  More to the point, the Baker Panel questioned the entire thrust of NSA’s reliance on massive data processing, a legacy of a traditional approach to both cryptanalysis and signals intelligence that no longer made sense: “In the past, an overwhelming emphasis has been put on volume and completeness of interception. Today, volume of intercept is out of proportion to the value of its content.” Trying to chase the bygone World War II successes of its predecessors, NSA was becoming a “Frankenstein-like monster which amasses constantly greater heaps of material which a dozen or 20 cryptanalysts…cannot even lift, let alone survey.”

  The massive searches for coincidences and depths and letter-frequency counts that computers had been pressed into service to carry out were in fact just mechanized versions of the same old cryptanalytic tricks that had long predated the computer era. Engstrom was one who agreed that these traditional methods of bust searches, brute-force data processing, and simple statistical tests—which had become enshrined as the canonical tools of NSA cryptanalysts, and the entire aim of the agency’s multimillion-dollar computer development—were never more than “expedients,” which “had to be replaced” by a more mathematically sophisticated approach.56

  In fact, many of these standard cryptanalytic tests had been developed precisely because they were all that manual methods or punch card machines could do: they really were tricks rather than fundamental mathematical solutions to the problem. Looking for double hits in enciphered codes or counting the number of E’s in a message were only very crude approximations of the real hypotheses cryptanalysts were seeking to test as they tried to reconstruct an unsolved code or cipher.

  The habit of “muddling through, laboriously bludgeoning answers out of a problem with outmoded techniques,” in Frank Lewis’s words, was inescapably reinforced by NSA’s culture as it had evolved from “a small, more personalized, and highly motivated fraternity” to a large, overorganized, and ever more security-conscious bureaucratic giant. Need-to-know and compartmentalization were the very antitheses of the kind of open inquiry, freewheeling exchange of ideas, and give-and-take bull sessions that academic researchers took for granted as a mainspring of creative advances. Lewis regretted the tendency for ever-higher walls to prevent the exchange of technical ideas even within the agency, and his humorous description of the “conservative supervisor who likes to survey a nice quiet roomful of deeply concentrating, strong-but-silent types” captured an all too serious problem in the agency’s work culture.57

  A striking sign of how far NSA’s cryptanalysts had fallen behind developments in academic mathematics, the Baker Panel noted, was the complete absence of any research by the agency on communication theory, linguistic structure, or higher-order language statistics. “This is a field of great subtlety and challenge, affording many opportunities,” they wrote. “However, NSA appears to have no work going on in this or related information-theoretic directions.”

  Several seminal papers published in the late 1940s and early 1950s by Claude E. Shannon, a researcher at Baker’s own Bell Labs—he was also a member of the Special Cryptologic Advisory Group—had laid the foundation of the new field of information theory and set off an explosion of work of considerable significance to mathematical cryptanalysis. But none of it seemed to have made much of an impression on NSA’s business-as-usual approach.

  Shannon’s key insight was that written language is highly redundant: rules of spelling, syntax, and logic all constrain the ways letters are combined into words, words into sentences, sentences into longer passages. The result is that any written text contains far less actual information than a string of characters of equal length in theory could contain. For example, certain letters regularly follow others as a result of particular grammatical constructions or spelling conventions (such as, in English, the combinations TH, ED, LY, EE, IE, OU, ING) while others are rare or impossible (XR, QA, BG, KK).

  The uneven frequency with which individual letters appear in normal text, that mainstay of paper-and-pencil cryptanalysts from time immemorial, is merely the simplest, lowest-order manifestation of this redundancy. Shannon calculated that English is in fact about 75 percent redundant: in other words, written language betrays a huge amount of extra clues beyond what is strictly needed to convey meaning. He pointed out that it is often possible to guess the next letter or word in a sentence as a result of this high degree of excess signal to information; one can, for example, eliminate vowels altogether and still usually understand the meaning of a typical sentence (T B R NT T B, THT S TH QSTN for “To be or not to be, that is the question”). Higher-level redundancies in the structure of language likewise make it easy to predict which words follow others, even over long intervals of text; in a phrase such as “There is no way that she possibly could have misunderstood me,” many of the words can be omitted with no loss of meaning: “No way she misunderstood me.”58
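
  The lowest-order version of Shannon’s measurement can be sketched in a few lines of Python. The snippet below is a hypothetical illustration added here, not Shannon’s own procedure or anything described in the book: it compares the entropy of observed single-letter frequencies in a small English sample against the roughly 4.7 bits a letter could carry if all twenty-six letters were equally likely. Because it ignores digrams, words, and grammar, this crude first-order estimate comes out far below Shannon’s 75 percent figure, which rests on much higher-order statistics.

import math
from collections import Counter

def letter_entropy(text):
    """First-order entropy, in bits per letter, of the letters in a text sample."""
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

# Tiny sample used purely for illustration; a real estimate needs a large corpus.
sample = "TO BE OR NOT TO BE THAT IS THE QUESTION WHETHER TIS NOBLER IN THE MIND"
h = letter_entropy(sample)          # bits per letter from single-letter frequencies only
h_max = math.log2(26)               # ~4.70 bits if every letter were equally likely
print(f"entropy {h:.2f} bits/letter, first-order redundancy {1 - h / h_max:.0%}")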

  The relevance to cryptanalysis was that information theory likewise implied that much of the redundant statistical structure of language must persist in any message encrypted with a finite key, and detecting those ghosts of plaintext could reveal significant information about the cipher machine employed or its setting. Traditional cryptanalytic tests such as searching for unrandomly frequent repeats of particular short character strings or code groups at significant intervals were elementary examples of how these persistent statistical structures could be exploited. But they barely scratched the surface of what information theory could do.59
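
  One elementary instance of such a test, the coincidence count used to spot two messages in depth, is simple enough to sketch; the code below is a generic illustration, not a reconstruction of any NSA program, and the rates and threshold are approximations. Slide two ciphertexts past one another and count the positions at which their letters agree: if both were enciphered with the same stretch of key, agreements occur at roughly the 6.7 percent rate of English plaintext, while unrelated ciphertexts agree only about one position in twenty-six.

def coincidence_rate(a, b):
    """Fraction of aligned positions at which two texts carry the same letter."""
    n = min(len(a), len(b))
    hits = sum(1 for x, y in zip(a[:n], b[:n]) if x == y)
    return hits / n

def likely_depths(a, b, threshold=0.055, min_overlap=200):
    """Yield offsets of a against b whose coincidence rate sits well above the ~0.038 random baseline."""
    for shift in range(max(0, len(a) - min_overlap)):
        rate = coincidence_rate(a[shift:], b)
        if rate > threshold:
            yield shift, rate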

  Friedman and Kullback, who wrote important papers on the index of coincidence and other fundamental statistical tests of cryptanalysis in the 1930s, were solid statisticians but never at the forefront of the field, and by the late 1950s they were unmistakably behind in embracing the latest research. The entire cryptanalytic leadership of NSA remained in the hands of Friedman’s original protégés. Even their admirers had begun to feel that the old guard—Kullback, Sinkov, and others from the World War II generation—had stayed too long, and that NSA had notably failed to bring in truly “high-calibre research mathematicians” or provide the working conditions amenable to the kind of research that the problems demanded, as the agency’s Scientific Advisory Board noted in 1957.60

  The failure to develop methods that put the new insights of information theory to use to exploit the higher-level statistical structure of language in cryptanalysis was the most glaring evidence of this. William Friedman had seen cryptanalysis as a unique profession, demanding a peculiar kind of puzzle-solving mentality combined with patience, a solid but not brilliant mathematical mind, and an apprenticeship in its arcane mysteries. But the Baker Panel pointed out that the world had changed: “The required skills and ability of a new generation of cryptanalysts may…be somewhat different from that of the old generation.” What it would take to break the most difficult problems now facing NSA were skills and techniques that went beyond the rote application of the same old “bag of tricks” that the “traditional cryptanalyst could rely entirely on.”61

  The Baker Panel’s tough words were effectively an indictment of the entire culture of extreme secrecy that had left the agency increasingly isolated and cut off from the advances in higher mathematics that modern cryptanalysis depended on, and whose wartime legacy of individualism and innovation was rapidly succumbing to the paralysis of bureaucratic sclerosis. “Everything secret degenerates,” the British statesman Lord Acton had warned a century earlier. Now approaching only its second decade, NSA was encountering the truth of that observation with a vengeance.

  *

  *1 Theremin continued to suffer from the paranoia of the Soviet regime over the following decades. In 1967 the New York Times music critic Harold Schonberg was astonished to encounter Theremin at the Moscow Conservatory; his American acquaintances, who had never heard from him again after his disappearance, were convinced he had died during the war. But after Schonberg’s article appeared in the Times revealing that the famous inventor was not only alive and well but continuing to create new electronic musical instruments, he was abruptly fired by nervous conservatory officials.

 
