There is a quantum computing arms race of sorts under way. China, Israel, and Russia have advanced quantum computing programs with the direct aim of gaining geopolitical advantage, as does DARPA itself. The U.S. government monitors the activities of physicists and mathematicians who work on the subject, and the government even went so far as to require scientists who worked on quantum computing for Bell Labs when it dissolved into Baby Bells to remain in the United States if they wanted to pursue the subject.
Tether told Wired magazine that quantum computing is one reason he does not always agree with the nostrum that the best science is done openly, with results shared with everyone. It would be catastrophic, he believes, if someone else got their hands on this technology before the United States does, much like it would have been a crippling blow to U.S. military hegemony had another country figured out stealth bomber technology before we did. The United States, he argues, has to keep the bulk of its efforts secret, lest we allow any enemy, perceived or real, to gain an advantage. Quantum computing would seem to fit squarely into the category of legitimate secrets.
But science is irrepressible. Legions of scientists work openly on quantum computing efforts; rarely does an issue of Nature print without the report of some small advance. In 1994, mathematician Peter Shor published an algorithm that a quantum computer could use to break any cryptographic system whose security rests on the difficulty of factoring the product of two incredibly large prime numbers. For now, that is academic, because no computer capable of running the algorithm exists yet.10 Encryption systems that depend on large-number factoring are called asymmetric; the most common is the RSA public key,∗ which is central to the way the Internet encodes and transmits data packets.11 Many banks and financial exchange mechanisms use public key cryptography to protect their control systems. According to a study for computer security firm SANS Institute, written by Bob Gourley, who would later serve as chief technology officer for the Defense Intelligence Agency, a working quantum computer in existence would suddenly mean that “encryption algorithms such as RSA which rely on the difficulty of factoring large primes will suddenly be obsolete, and everything ever encrypted by RSA will be at risk. If quantum computers become functional very little on the current day internet would be safe from cracking.”12
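To see why factoring is the linchpin, consider a toy RSA sketch with deliberately tiny, illustrative numbers (the primes, exponent, and message below are assumptions for demonstration; real keys use primes of a thousand bits or more, and the code assumes Python 3.8+ for the modular-inverse form of `pow`). Brute-force factoring is instant at this scale, which is roughly what Shor’s algorithm would make true for real key sizes:

```python
# Toy RSA with deliberately tiny primes. Security rests entirely on the
# difficulty of factoring the public modulus n = p * q; at this scale the
# "hard" problem is trivial, which is what Shor's algorithm would make
# true for real key sizes on a working quantum computer.
p, q = 61, 53            # secret primes (real keys use primes of ~1024 bits)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient, known only to the key holder
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent; computing it requires phi

msg = 42
cipher = pow(msg, e, n)  # anyone can encrypt with the public pair (e, n)

# An attacker who factors n recovers phi, then d, then every message:
for candidate in range(2, n):
    if n % candidate == 0:
        p2, q2 = candidate, n // candidate
        break
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == msg  # plaintext recovered without the private key
```

The entire attack reduces to the `for` loop that factors `n`; everything after it is bookkeeping. Shor’s algorithm, in effect, replaces that loop with something exponentially faster.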
Likewise, there is speculation that a quantum computer might be able to break Pretty Good Privacy (PGP) encryption. To put that in perspective, theoretically PGP could be bombarded with keys and ultimately penetrated. But even under the best, nonexistent, and likely impossible conditions, this would require constant bombardment for ten trillion years. That comes out roughly to “a thousand times the age of the known universe.”13 (The Utah Data Center, a secret facility run by the NSA, has a prototype quantum computer dubbed VESUVIUS, estimated to be capable of performing 100,000,000,000,000,000,000,000,000,000,000,000,000 computations at once.)14
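The “ten trillion years” figure is easy to sanity-check with back-of-envelope arithmetic. The attacker speed below is a hypothetical assumption, far beyond any existing classical hardware, chosen only to show the order of magnitude:

```python
# Back-of-envelope check on the "ten trillion years" claim: brute-forcing
# a 128-bit keyspace at a (hypothetical, wildly optimistic) rate of
# 10**18 guesses per second on classical hardware.
keys = 2**128                        # size of a 128-bit keyspace
guesses_per_second = 10**18          # assumed attacker speed
seconds = keys / guesses_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.2e} years")          # on the order of 1e13, i.e. ten trillion
```

Even granting the attacker a billion billion guesses per second, exhausting the keyspace takes roughly 10^13 years, which is where estimates like “a thousand times the age of the known universe” come from.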
Such raw computing power is one reason most discussions of quantum computing focus on the cryptography issue. It’s scary and interesting to think about all the information in the world suddenly flowing free. A country whose computers are enriched by quantum processing could overwhelm virtually every piece of defensive technology we employ, using unstoppable viruses to cripple financial markets and missile defense systems and power grids. Scary stuff, though we stress the conditional. This is the quantum world, where in order to be exploited the information bit must be as perfectly contained as possible. In classical physics, the moon is the moon—something tangible, solid, always there. In quantum physics, there is a tiny but nonzero chance that if you look up at the sky and look in the direction where the moon is supposed to be, you won’t find it. That’s because the indivisible bits of matter envisioned by scientists from Aristotle to Einstein are more aptly described as mathematical wave-function equations, where a certain something has at least some probability of being anywhere in the universe at a certain point, and might also be spookily entangled with something else that is farther away. But why do we see the moon if this is true? Because these fuzzy equations bump into each other and suddenly the bits of information decohere into things that much better approximate the solid stuff we’re used to dealing with. This is an extremely simplified, largely misleading way of saying that anything suspended in a quantum state has to be free from error—that is, the chance that it will decohere has to be very low.
And so the first thing a working quantum computer will do, as Christopher Monroe, the Bice Zorn Professor of Physics at the University of Maryland’s Joint Quantum Institute, put it to a curious senior Bush administration official in 2009, “is spend 99 percent of its time correcting errors and the other one percent of its time on computation.”15 Right now, scientists have managed to get the error rates down to about 0.1 percent. That sounds impressive, but in order for a quantum computer to work, scientists need to reduce the error to a level of 10−6, or 0.0001 percent. “We have a ways to go,” as Monroe put it. The next thing a quantum computer will do, one scientist working on the problem told the authors, is “build a better version of itself.” That is, the first thing any smart person would want to do with a quantum computer is to use it to make a better one, because of all the computational time and energy spent building the first one.
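The overhead Monroe describes comes from redundancy. Quantum error correction is far subtler than the classical case (qubits cannot simply be copied), but as a rough, hedged analogy, a classical three-bit repetition code shows how redundancy drives error rates down:

```python
# Classical analogy only: encode one bit as three copies and decode by
# majority vote. The encoded bit is wrong only if two or three copies flip.
p = 0.001                                  # per-bit error rate (~0.1 percent)
p_logical = 3 * p**2 * (1 - p) + p**3      # probability the majority vote fails
print(f"{p_logical:.3e}")                  # roughly 3e-06
```

Even this crude scheme triples the storage just to buy about three orders of magnitude; real quantum codes require many physical qubits per logical qubit, which is why so much of a quantum computer’s capacity would go to correcting its own errors.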
Tether was at first very reluctant to talk to the authors about his quantum concerns. He said he did not want to reveal the degree to which the U.S. government was worried about the problem. Here is his case for more secrecy: “Having something other than the United States get a quantum computer would be an enormously big deal,” he says. While some of his colleagues liken the advent of quantum to the development of the nuclear bomb—its disruptive effects are that significant—Tether remembers back to the development of the solid state computer in the 1960s, which suddenly allowed millions of transistors to be placed on chips the size of a thumbnail (no more vacuum tubes). “People back then could not imagine the applications of an integrated circuit with billions of transistors in the chip.”
For Tether, it is the economic impact of quantum that worries him the most. “Forget the cryptography. Imagine our ability to model things down at the atomic level and get back an answer in seconds. We could make a new metal, much stronger than steel, that is transparent and incredibly thin. You could solve all sorts of complex biochemistry equations and make new medicines. If we had a quantum computer in 1939, the atomic bomb could have been designed in a week.” The United States, he insists, must be “first to market” with a quantum computer. “You really have to have two or three years’ lead time in the market. That’s all we’re really talking about. You’re not going to keep in front of anyone else forever. We need to have it so that we as a corporation can come up with new products and new solutions that we can sell on the world market that will increase our economic strength.” Tether is advancing quite deliberately the model of a country as a corporation because, he says, that’s how our rival nations look at themselves—especially China.
DARPA, its intelligence cousin IARPA, and the NSA refused to discuss quantum computing with the authors, as did Microsoft, which has a quantum research program under way behind locked doors at its Santa Barbara campus.
The government is working on a solution to a potentially nearer-term problem: it needs to develop a cryptographic system that would withstand a quantum assault. To that end, the NSA and other government laboratories are partnering with the private sector to rapidly understand the “major ramifications” that the ability to quickly factor large numbers would have, according to a National Research Council white paper on the frontiers of quantum science.16 In a vaultlike series of rooms at SRI International, the research institute that began life at Stanford University, mathematicians and engineers are trying to develop an impregnable form of what’s known as elliptic curve cryptography, which serves as the basis for most U.S. government cryptographic research efforts today, and which is currently vulnerable to a quantum computer.17
There are classified and semiclassified DARPA and Army/Air Force/Navy Research Lab programs for the potential uses of quantum cryptography for a set of select defense technologies, including:
The ability to design a perfect sensing laser for drones or satellites;
The ability to decrypt, in real time, RSA-based public key systems (the NSA can usually begin real-time decryption as soon as it breaks a key or a code);
The ability to design radar that can defeat counterstealth techniques the Chinese and Russians are working on;
The ability to design coatings for aircraft that truly are stealth, owing to the exploitation of quantum fluid dynamics;
Advances in acoustical detection technology;
Nuclear weapon dispersal and damage simulations.
Quantum computing has its skeptics, including many scientists who believe that building an operating computer is impossible. The transition from a world of normal cryptography to a world of quantum cryptography would impose significant costs on the first country to try it, which is one counterweight to the security concerns driving the race to master it. Even a basic system to use quantum encryption to encode, say, a message from the White House to NORAD inside Cheyenne Mountain, Colorado, would require a quantum repeater infrastructure that no one knows how to build.18
Most of what DARPA does is unclassified by design, owing to the principle that transparency and efficiency and collaboration will produce the best results. But Tether had a bad experience that leaves him worried about the future of a quantum free-for-all. He was the main driver of the Total Information Awareness (TIA) project, the first major DOD research effort to envision using bulk data collection and mining for counterterrorism purposes. In Tether’s mind, there were two problems with the idea: that Vice Admiral John Poindexter, an Iran-Contra figure, was too controversial a figure to lead a project that would itself become so controversial; and that he did not classify it from the start. “The reason for that program is that I watched the Twin Towers come down and I knew that we were going to find that we had all the data and had trouble connecting the dots—that type of thing. So we started the program to develop the technology and give the intelligence community a chance to do this better. I put John in charge, and he was a lightning rod. The program being unclassified meant that all of the privacy people had access to it. A couple of them became alarmed and started talking about it to reporters, and then someone went to William Safire, and that was the end of it.” The New York Times columnist wrote about TIA in November 2002, casting DARPA’s $200 million program as a totalitarian effort to create psychographic “dossiers” on all American citizens. Congress got involved and held hearings, canceled the program, and prevented DOD from engaging in mass data mining.
The research programs that TIA funded were farmed out and put to use. One version of TIA migrated over to the NSA, where it was classified. (The DOD funding provision apparently had no effect on the NSA’s bulk data collection and mining program, in part because the armed services committees did not know about it.) TIA’s technologies are ubiquitous now; every counterterrorism entity in the government uses social network analysis, evidence extraction and link discovery, instantaneous speech translation software, and more. “I assumed that no one would make a big fuss about it,” said Poindexter. “We did all the right things. We brought people in. But I guess I was wrong. I thought we were at war, and when you’re at war, everyone works together and plays by the rules.”
If Tether were to do it over, he says, he would not have accepted the project’s Orwellian name, would not have appointed Poindexter (not because Poindexter did a bad job, but rather because he had too much baggage), and “I would have classified it. There was a clear national security interest there. I actually think that the TIA thing made it much more difficult for these types of programs to be created with the type of privacy safeguards that we had. But we could never convince anyone that we never had any intentions of using these tools on Americans.”
Tether is obviously defensive about the program, which is part of his legacy; the chronology he shares is not universally accepted, and civil libertarians who knew about the project say they objected to it precisely because safeguards were nonexistent.19 He blames civil libertarians for the demise of another program that he thinks could have saved lives. It was called COMBAT ZONES THAT SEE, and its purpose was to set up a networked series of rugged cameras in Baghdad that could track cars and trucks. The camera feeds would be recorded 24/7. “If we saw a car blow up, we could roll back the tape and see exactly where it came from. But Congress killed it after one year.”
Why?
“I was told that, well, if you can do this in Baghdad, the Bush administration is going to figure out how to do it in the U.S., and we can’t have that. Congress would have rather killed the baby in the crib. They were more concerned with unintended consequences of having this capability. They were more concerned about the rights of a private citizen than they were about capturing a terrorist.”
Tether sets up a neat dichotomy with his critics, and both sides are partly correct. Eventually, versions of the technology did wind up being used to find the makers of bombs in Iraq, even as it crept its way into local law enforcement pilot projects in the United States. As much as Tether may disagree, DARPA’s—and science’s—predilection for transparency did not completely curtail the use of promising technology. But to Tether’s point, some degree of discretion might have helped those technologies save lives earlier than they did.
No technology is born classified; it took the government a long time to figure out it needed to put controls on the transmission of information about nuclear weapons, and it was probably too late in doing so, at least from the perspective of wanting to preserve the strategic advantage of building the bomb. Today, the U.S. Patent Office will automatically assume that something related to a cryptanalytic or cryptographic technique ought to be classified unless the government says otherwise; the same rule now applies to nuclear weapons technology. In 2001, the Department of Energy forced an Australian company working in the United States to classify a promising technology to use lasers to separate isotopes for uranium enrichment—only the second time since the dawn of the nuclear age that the government used an obscure power to retroactively classify a technology.20 As of 2011, GE was on the verge of using the technology to develop a much more efficient process to enrich uranium at a plant in Wilmington, North Carolina.21 GE insists it developed the technology to facilitate its civilian nuclear power research, but in the wrong hands, it could help a country like Iran more quickly develop enough highly enriched uranium to pack into the core of a nuclear bomb. The government is so sensitive about the spread of uncontrolled information that it once tried to classify a student’s research dissertation.
George Mason University’s Sean Gorman culled information from open, unclassified sources to map the nation’s technological infrastructure. The government, when it became aware of the project, saw two things it wanted kept quiet: the fact that America hosts transit communications for many other nations, and the frailty of the fiber optic network that underlay America’s digital commerce. “Burn it,” the former senior National Security Council official Richard Clarke told the Washington Post. In the end, the university caved: it agreed to keep the paper under lock and key.22
But eventually, as these things do, the paper got out there, somehow.
∗RSA derives its name from the first initials of the three scientists who invented it.