by Shane Harris
Tether liked what he heard. All across the government, agencies were revving up for a new kind of war. Previously “risk averse” organizations, as critics liked to think of the CIA and others, were now desperate for ideas. Tether would back Poindexter’s program, he said, but on one condition: Either he or Sharkey had to come back to government to run it. This was far too complicated a task to leave to anyone other than the experts, and clearly, they were Poindexter and Sharkey.
Sharkey left the luncheon and relayed the bad news to his friend. Neither of them wanted to return to public life. Since his speech in Denver, Sharkey had left DARPA and joined a boutique technology and consulting firm, Hicks & Associates. He enjoyed private-sector work, both for the intellectual freedom and the financial reward. Hicks’s parent company, Science Applications International Corporation, was one of the largest employee-owned companies in the country, and Sharkey had amassed considerable stock options. The company’s business would boom as government agencies planned to spend billions on new technology for counterterrorism. If Sharkey returned to public service, he might have to give up his options. He’d certainly have to take a substantial pay cut.
Sharkey was a younger man than Poindexter. He had different obligations. He just wasn’t prepared to come back, and Poindexter didn’t want to force him. Besides, Poindexter was the mastermind. He could see the system from end to end. And he knew how the bureaucracy worked. Few DARPA program managers had ever been national security adviser to the president. Poindexter’s return was the agency’s gain.
All that was true, but Poindexter had his own reasons to turn the job down. Along with his résumé came his baggage. How big a distraction might that be? He and Linda weighed the options together. He had the bureaucratic expertise, the technological fluency, the vision. He also still had the highest level of security clearance, owing to his work at Syntek.
He also weighed the financial concerns. Poindexter was not a rich man. But his children were grown. He owned his home. The boat was the biggest expense, yet a manageable one. There was really only one question: Was John Poindexter his plan’s best asset or its worst liability?
Though he’d been a master of casuistry in his dealings with Congress, Poindexter had always been honest with himself. His association with an ambitious and controversial counterterrorism program would cause trouble. He had no doubts about that. The moment he stepped back onto the public stage he could become a lightning rod atop his provocative venture.
Poindexter had long thought that if he had it to do over again, he would have devised a public relations strategy for the “Iran-Contra business,” as he preferred to call it. He had warned his cohort then that if the convoluted operation ever became public, they’d have a very hard time explaining it.
Washington was a town of institutional memories. Many of his oldest allies in government were back in power, in senior positions within the Bush administration. And his oldest foes had remained in Congress. Poindexter wasn’t sure now how, or if, he could avoid the lightning. But he figured it would come eventually.
He thought about it for a few days, then picked up the phone and called Sharkey. Okay. Let’s do it.
A month later—a quick turnaround by government standards—Poindexter found himself sitting in Tether’s office, suit-clad, a PowerPoint briefing on his laptop, ready to explain TIA. He’d taken to calling total information awareness by an acronym, which he pronounced like a woman’s name, “Tia.”
Poindexter had always prided himself as a superb briefer, and this time he’d come armed with the goods. Tether donned his big wire-rimmed glasses and pulled up to Poindexter’s laptop. It wouldn’t be hard to sell him on the technology. He had a PhD in electrical engineering from Stanford and had spent a career in the private sector as a high-level program manager for defense contractors. Rather, Poindexter wanted to impress upon his future boss, from the outset, that this was no ordinary research project.
Poindexter called up the title slide: “A Manhattan Project on Countering Terrorism.”
He couldn’t have chosen a more powerful allusion. Likening TIA to the construction of the first atomic bomb, Poindexter immediately conjured up the forces of war, ingenuity, and dynamism. He was telling Tether that he sensed a historic opportunity, and that, like the original Manhattan Project, TIA would require an extraordinary combination of research, science, and money. DARPA was the only agency Poindexter could imagine taking on such a futuristic and risky concept.
Beneath the title, Poindexter excerpted a passage from a New York Times article that had run a few days earlier, which crystallized the nascent critique of the government’s pre-9/11 failures now taking hold.
In hindsight, it is becoming clear that the CIA, FBI and other agencies had significant fragments of information that, under ideal circumstances, could have provided some warning if they had all been pieced together and shared rapidly.
That was TIA in a nutshell. And from then on, even if Tether didn’t grasp the technical esoterica that riddled Poindexter’s presentation, he would get the essence of it.
The quote was appropriate for another reason. The article from which it came led with news of a bracing memo sent to senior officials at the CIA from the director of central intelligence, George Tenet, only five days after 9/11. Tenet had made it clear that the government was less interested in assigning blame for the attacks than in radically departing from the culture that had allowed them.
“The agency must give people the authority to do things they might not ordinarily be allowed to do,” Tenet commanded. “If there is some bureaucratic hurdle, leap it.” Tenet titled his memo simply “We’re at War.”
Poindexter showed Tether the leap he wanted to make. He called up a world map peppered with red dots and accompanying text labels. Each one hovered over some disaster-prone part of the world and asked an expansive question. Over the Middle East, “What do we know about terrorists like Osama Bin Laden?” Next to New York, “How do we prevent another World Trade Center Disaster?”
But elsewhere around the map, Poindexter placed questions that had nothing to do with religious terrorism. “What should the U.S. do about China?” “How can we fix Colombia?” Clearly he thought TIA had applications beyond just the current crisis. Poindexter had been traveling this road for decades, and in a way, 9/11 was merely a catalyzing event.
“It’s all about information,” Poindexter told Tether. “We are swimming in data but still need more.” But not general information, and not the kind that was easy for the intelligence community to get. Right now they needed transactional information—about people. What they bought. What they read. Where they traveled. Whom they talked to, and what they said.
Terrorists worked in cells, and those cells represented nodes in a large global network. The government might not always understand its reach, but it could grasp its structure by pulling apart the transactional bonds between nodes and among clusters of them. This was the best way to disrupt terrorists and prevent their attacks: to unlock the design of the network.
What Poindexter was proposing now went far beyond the traditional boundaries of law and policy. “Our focus has to be on individual people,” he told Tether.
“Focusing on People . . . Extremely Sensitive,” Poindexter titled a subsequent slide. It featured a flow chart, bland and undecipherable at first, but that, upon closer examination, revealed a basic structure for how information about people might flow through a TIA system. First, Poindexter listed a dizzying array of sensitive data to access. “Transactions: Communications, Financial, Education, Travel, Country Entry, Place or Event Entry, Medical, Veterinary, Transportation, Housing, Critical Resource, Government.”
This was the raw material of a total information awareness system. But it was just a first step. The chart showed how this stream, along with covertly collected photographs and “biometric” information such as fingerprints and image scans, would comprise “dossiers” used by multiple government agencies to develop models of terrorist attacks. The TIA system would use this information to help analysts predict “plausible futures,” or terrorist scenarios that were likely to occur. The analysts, not the system, would then suggest to national leaders how they might mitigate or prevent that crisis.
Poindexter described the entire process in four discrete steps: “Detect, Classify, ID, Track.” And, he told Tether, an analog for this kind of hunt already existed—antisubmarine warfare.
“This is not business as usual,” Poindexter said. “We must put introduction of new technology on a wartime basis.” He wanted only the brightest program managers and researchers. He told Tether he’d like to cordon them all in a secret facility outside Washington, some massive complex surrounded by high fences and concertina wire with “No Trespassing” signs prominently displayed. This was mostly theater, he explained, set dressings meant to impress upon those inside the gravity and consequence of their historic mission. It was a “patriotic duty,” Poindexter said. “We should not waste time.”
Poindexter would only design TIA. It was up to the intelligence agencies, or maybe the FBI, to use it once the prototype had been built. That was not his call. In the meantime, though, Poindexter wanted to do something that DARPA managers had never tried before—he wanted to ignite a policy debate.
Poindexter suffered no illusions that a large portion of the public—maybe the overwhelming majority—would find his ideas not only distasteful but unconscionable. He wanted a chance to change their minds. And he would do it, unsurprisingly, in a technical fashion.
Poindexter told Tether that he would build “privacy-protection” technologies into TIA’s design. He showed how the system could encrypt each one of the millions, perhaps billions of discrete data points it inspected, so that all a human analyst ever saw was a series of numbers—no names, no faces, no identifying information. The identifier could contain all the information an analyst needed, including its source, which agency collected it, its relationship to other data, and the time and place it was captured. But it need not contain a name or any other clue that would give away the true identity of the person who had created that tiny ripple in the digital ocean. As his thinking evolved, Poindexter imagined an entire “privacy appliance” built into the system that would lock private information away in a kind of electronic safe that might only be opened upon order of a judge. The judge would have to find that the government had a reason for thinking this anonymous person might actually be a terrorist. Poindexter called this case-by-case approach to putting names to data “selective revelation.”
Poindexter emphasized that the research in this area was distressingly scant. He wanted Tether to include funding in the TIA budget for new research, which he also hoped might inspire discussion about the merits of the concept, and therefore the entire system. Poindexter believed that if he could prove a privacy appliance worked, people might strike a bargain between their privacy and their safety.
That meant taking his idea public. He’d first imagined TIA as a “black program,” veiled behind a classified budget and rings of secrecy. But he wondered if this would slow his progress and lead people to think he had something to hide.
Tether agreed that keeping TIA classified would slow Poindexter down. Time was not on his side. Poindexter wanted to assemble a team quickly, and he wanted to gather ideas from thinkers and tinkerers well beyond the security-cleared confines of Washington. If he kept TIA under wraps, he was bound to get many of the same tired ideas he’d seen in the past five years. He wanted to blow the doors open.
“Don’t worry about the dollars,” Tether assured him. “I’ll give you all the money you need.” DARPA would pool all the existing funds for the various programs Poindexter planned to acquire, the ones he’d mentioned to Sharkey when they first talked after 9/11. And Tether would throw in some start-up capital. The pool came to about $100 million. And Tether agreed to increase that budget to at least $150 million the following fiscal year.
Poindexter had gotten what he asked for. He had planned. He had executed. He had succeeded. Strong winds portended smooth sailing. He hadn’t any clue that thirty miles away, at the headquarters of the greatest signals-catching apparatus ever devised, another fleet was preparing to sail.
CHAPTER 13
THE BAG
It was October 6, two days before Columbus Day. The holiday passed unnoticed and uncelebrated by most, but Washingtonians coveted it: For thousands of federal employees and Capitol Hill staffers, it was the one day each year when they got the day off and their kids still had to go to school.
Mike Wertheimer was looking forward to some quiet time with his wife. They both needed the break. Wertheimer, a PhD mathematician and the National Security Agency’s top technologist, had spent the past few weeks climbing out of a profound depression.
When the planes hit the Twin Towers and the Pentagon, the shock wave resonated uniquely with Wertheimer and his colleagues. It was their job to build the tools that the NSA’s terrorist hunters needed to stop a cataclysm like this. The agency had paid for Wertheimer’s university education, and in return, he’d given the place twenty-one years of his life, putting his prodigious technical skills to work as an electronic monitor in his country’s secret service. Yet he seemed to have missed the signals emitted right in front of him. The 9/11 hijackers, it turned out, had been plotting their attack within miles of the NSA’s headquarters at Fort Meade, Maryland. They’d been here all along, and Wertheimer never knew it.
Wertheimer had risen through the ranks to become the agency’s top techie. On paper, he fit the stereotype captured in an old joke traded among the NSA lifers: “How can you tell an extroverted analyst? He’s the one who looks at your shoes when he’s talking.” But Wertheimer, over six feet tall with deep, searching eyes, defied the image. He looked in people’s faces when he spoke, undistracted. The arch of his eyebrows and the glint in his eyes gave everything away—when he was listening, when he disagreed, when he was afraid.
And so he couldn’t conceal the depth of his regret from his wife, his family, or his coworkers. He was convinced that he hadn’t fought hard enough for those things that might have made a difference. Haunted by three thousand ghosts, he wondered what he had missed. What he had left undone. What trick he hadn’t tried or what argument he’d conceded when he should have shouted louder. Wertheimer drifted about his house, inconsolable, until one day his eighty-year-old father pulled him aside.
“What are you doing? I lived through World War II, through the Depression, through Vietnam,” Wertheimer’s father said. And now terrorism. He looked his son in the eyes, those eyes that revealed all. “I don’t hold you responsible,” he said. “I sleep better with you back at work trying to solve these problems.”
Wertheimer had long since concluded that he and his father had little in common beyond lineage. But now the man was breaking through to him in a way no one else had.
The lecture was like an antidote. It snapped him to life. Wertheimer rededicated himself to the fight, but he vowed never to let it into his home. He would do his job, and keep his family safe.
His newfound resolve was tested that Saturday before Columbus Day, as Mr. and Mrs. Wertheimer made plans for a solitary retreat. The phone rang. An emergency call from the office. “You need to come in.”
Wertheimer found himself in a large conference room in the NSA’s headquarters building, along with about eighty of his colleagues. Standing before them was a man who had made his own resolution in the weeks since 9/11: The next attack was not going to happen on his watch.
Wertheimer had worked closely with Mike Hayden since he took over the agency in March 1999. They and a band of senior officials had been shaking up the place from the inside out. Hayden, a lieutenant general in the Air Force and a career intelligence officer, knew that the NSA was being overtaken by the digital revolution. The biggest threat came from global telecommunications networks, which the agency’s eavesdropping systems, largely designed to intercept satellite transmissions, were not prepared to address. Hayden, Wertheimer, and others had done yeoman’s work bringing the agency into the twenty-first century and realigning its post-cold war mission to deal with terrorism and other asymmetric threats. But now they had to wonder if it was all for naught.
Hayden was a football fanatic and often liked to tell stories about the NSA using sports metaphors. After 9/11 it had become clear to him that the agency played too much defense. President Bush agreed. The NSA had a formidable capacity to detect terrorist threats, but it had gone largely unused in the United States, owing largely to legal restrictions over when and how the agency could monitor targets inside the country. Now it was clear that the strategy of fighting a war “over there” had been upended. The targets the NSA needed to track now were in the United States, or on their way. It was time to change the game plan.
Hayden explained to his employees that four days earlier the president had granted the agency new authorities that allowed the NSA to greatly expand its surveillance net. The agency could now target the communications of anyone reasonably suspected of being a terrorist, or those associated with them, without a warrant. Whereas once the agency would have had to get permission from the secretive Foreign Intelligence Surveillance Court to conduct intelligence surveillance inside the United States, now they could bypass that requirement, so long as the surveillance complied with a few simple yet utterly new rules.
First, there was a question of geography. Say someone in Pakistan called a phone number in San Francisco. That would be considered a foreign communication, well within the agency’s traditional surveillance domain. If NSA analysts reasonably believed the target in Pakistan was a terrorist or associated with terrorists, then the agency’s eavesdroppers could monitor both the target in Pakistan and the person he called in San Francisco. Under ordinary circumstances the NSA would have either stopped that surveillance or erased records of the San Francisco person’s name and identity. But now the analysts could listen in and determine if the conversation, or the parties involved, had any “nexus to terrorism.”