Lights Out


by Ted Koppel


  This monitoring process, while routine, also creates a dangerous point of vulnerability. If someone were able to hack into an RTO or ISO and deliberately overload the lines, the impact would be swift and physical. The lines would start to droop from the heavy load. They would overheat. “When the lines dip,” said Clarke, “they can set a tree on fire, or they can melt the line.” There are built-in controls to ensure that such an overload never happens, but if a hacker got into the system and targeted those controls, Clarke explained, so that “the guy sitting in the operations center doesn’t see it—he sees that everything is in the green,” there would be no relationship between the operations center dashboard and reality. Such a situation could quickly escalate out of control. If you can break key transmission lines, said Clarke, you can produce cascading, potentially catastrophic outages.
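  To see why a few broken lines can snowball, consider a deliberately simplified illustration of the feedback loop Clarke describes. The toy model below (a sketch in Python, with invented line names, loads, and capacities; real grid studies solve power-flow equations, which this does not attempt) shifts the load of each failed line onto the surviving lines and checks which of them trips next:

```python
# Toy model of a cascading overload; NOT a power-flow calculation.
# Load from each failed line is split evenly among surviving lines,
# which is crude, but it captures the feedback loop described above.

def simulate_cascade(lines, first_failure):
    """lines: dict mapping line name -> {"load": MW, "capacity": MW}."""
    failed = {first_failure}
    stranded = lines[first_failure]["load"]  # load that must move elsewhere
    while stranded > 0:
        survivors = [name for name in lines if name not in failed]
        if not survivors:
            return failed, "total blackout"
        share = stranded / len(survivors)    # crude redistribution
        stranded = 0
        for name in survivors:
            lines[name]["load"] += share
            if lines[name]["load"] > lines[name]["capacity"]:
                failed.add(name)                 # this line trips too...
                stranded += lines[name]["load"]  # ...and sheds its load
    return failed, "cascade contained"

grid = {
    "A": {"load": 80, "capacity": 100},
    "B": {"load": 90, "capacity": 100},
    "C": {"load": 70, "capacity": 100},
    "D": {"load": 60, "capacity": 150},
}
print(simulate_cascade(grid, "B"))  # in this example, every line fails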

  Is it doable? It is anything but simple. It would require detailed mapping and lengthy reconnaissance to determine what to target and how to find a critical point of failure in the system. But it is, technically speaking, more plausible today than ever before. Deregulation of the power industry has created a system with more vulnerable points of entry than ever existed previously, and much of the equipment is controlled by aging, standardized computer systems used around the world and familiar to many of America’s enemies.

  Many of the old power stations were operated by manual controls. If they had any computer software at all, it was unique to that company. These days, within any one of the three U.S. grids, almost all operational phases of thousands of power companies are interconnected. Coordinating operations run on the same supervisory control and data acquisition (SCADA) systems. Most of those systems are manufactured by a relative handful of companies, and while they are not quite interchangeable, there are similarities in programming and structure. The result is a web of pathways connecting the thousands of power companies and enabling transactions such as that Florida-to-Chicago transfer. The overall system has been designed for maximum efficiency, eliminating waste while establishing a precise balance between the power needed and the power generated.
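  At its core, a SCADA system is a supervisory loop: field devices report telemetry, and the control center verifies that generation and demand remain in balance. Here is a minimal sketch of that idea (illustrative only; the readings and the tolerance are invented, and real systems poll remote terminal units over industrial protocols such as DNP3 rather than reading Python dictionaries):

```python
# Minimal sketch of a SCADA-style balance check; illustrative only.
# Real systems poll remote terminal units over protocols such as DNP3.

TOLERANCE_MW = 5.0  # hypothetical acceptable generation/load imbalance

def check_balance(telemetry):
    """Compare total generation against total load and raise an alarm flag."""
    generation = sum(r["mw"] for r in telemetry if r["kind"] == "gen")
    load = sum(r["mw"] for r in telemetry if r["kind"] == "load")
    imbalance = generation - load
    return imbalance, abs(imbalance) > TOLERANCE_MW

readings = [  # invented telemetry from generators and load centers
    {"id": "plant-1", "kind": "gen",  "mw": 520.0},
    {"id": "plant-2", "kind": "gen",  "mw": 310.0},
    {"id": "city-A",  "kind": "load", "mw": 490.0},
    {"id": "city-B",  "kind": "load", "mw": 348.0},
]
imbalance, alarm = check_balance(readings)
print(f"imbalance: {imbalance:+.1f} MW, alarm: {alarm}")  # -8.0 MW, alarm
```

  That balance check is what an operations center dashboard renders “in the green.” Corrupt the telemetry feeding it, as Clarke warned above, and the operator is flying blind.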

  Craig Fugate, administrator of the Federal Emergency Management Agency, is concerned that we have sacrificed resiliency in the interest of achieving efficiency. “We have created a system,” Fugate told me, “where we generate power in very efficient quantities at specific locations, and then we have to move that power, oftentimes at great distances, to where it’s being consumed.” If someone were knowledgeable about the functioning of a SCADA system and succeeded in hacking into it, that individual could engineer “a series of events that seem totally unrelated” but which could, according to Fugate, “turn the lights out very quickly over large areas.”

  Richard Clarke agrees. “If you go into a big, modern power station in Shanghai, or a big, modern power station in California, you’re going to find the same SCADA software.” SCADA systems were, for the most part, designed and installed before the notion of cyberattacks had even occurred to anyone. The Internet itself was not designed to keep anybody out. It was created to be universally accessible.

  There’s a small historical irony in the fact that the extreme vulnerability of SCADA systems to cyber sabotage was first demonstrated, and dramatically so, during the administration of George W. Bush. At about the same time that Richard Clarke was serving as White House advisor on cybersecurity, U.S. computer specialists at the National Security Agency and their counterparts at the Israeli military’s Unit 8200 launched a cyberattack against a critical element of Iran’s nuclear program. Iran’s nuclear program was set back by as much as two years, according to some estimates.

  The attack, code-named Olympic Games, targeted an array of several thousand nuclear centrifuges located at Natanz, Iran’s main enrichment center. These centrifuges spin uranium gas at the high speeds necessary to refine the uranium used to fuel both nuclear reactors and bombs. With the introduction of a computer worm code-named Stuxnet, the cyber saboteurs were able to alter the speed of those centrifuges, undermining the refinement process.

  Sending the centrifuges into a destructive spiral would have been only marginally damaging had the Iranians recognized what was happening. They could have responded in time to mitigate the attack. The genius of the U.S.-Israeli attack lay in its ability to conceal the sabotage. According to David Sanger, who reported on Olympic Games for the New York Times, Stuxnet “also secretly recorded what normal operations at the nuclear plant looked like, then played those recordings back to plant operators, like a pre-recorded security tape in a bank heist, so that it would appear that everything was operating normally while the centrifuges were actually tearing themselves apart.”
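  Sanger’s bank-heist analogy maps naturally onto code. The sketch below is purely illustrative, and bears no relation to Stuxnet’s actual implementation, but it shows the structure of that style of deception: record a window of healthy sensor readings, then answer the operator’s queries from the recording while the real values diverge:

```python
# Illustrative sketch of replay-style concealment, in the spirit of the
# "pre-recorded security tape" analogy. NOT a model of Stuxnet's code.

from itertools import cycle

class ReplayingSensorProxy:
    """Sits between a sensor and the operator console, replaying old data."""

    def __init__(self, sensor, recording_length=60):
        self.sensor = sensor
        # Phase 1: record what "normal" looks like.
        self.recording = [sensor.read() for _ in range(recording_length)]
        self.playback = cycle(self.recording)

    def read(self):
        real_value = self.sensor.read()  # the true (deteriorating) state
        self.discard(real_value)         # the attacker lets reality vanish...
        return next(self.playback)       # ...while the operator sees the tape

    def discard(self, value):
        pass  # placeholder: the true reading simply never reaches anyone

class FakeCentrifugeSensor:
    """Stand-in sensor whose readings drift steadily out of the safe range."""
    def __init__(self):
        self.rpm = 63000.0
    def read(self):
        self.rpm += 50.0  # creeping toward destructive speeds
        return self.rpm

proxy = ReplayingSensorProxy(FakeCentrifugeSensor(), recording_length=5)
print([round(proxy.read()) for _ in range(8)])  # the console sees a calm loop
```

  However far the real readings drift, the console keeps cycling through the same five recorded values; everything appears to be operating normally.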

  The SCADA system controlling those nuclear centrifuges in Iran was manufactured by Siemens, as is much of the SCADA software used by the electric power industry in the United States. All of that Siemens software has a built-in access point accessible only to Siemens engineers employing a closely held password. Or so Siemens believed. As Richard Clarke recounted to me, one attendee got up at the 2011 Black Hat hackers conference in Las Vegas and announced: “Here’s that password.” Siemens had to go to every deployment of their software worldwide and change the password.

  Getting into a piece of critical infrastructure is one thing, but it’s worth repeating that navigating an electric grid is a highly complex operation. The reconnaissance required to understand the system sufficiently to compromise it can take years, challenging the skills of even the most cyber-competent nation-states. We’ll get into what the experts call “preparing the battlefield” in a later chapter. (Several nation-states, most prominently the Russians and the Chinese, have already spent years conducting just such reconnaissance.) For the moment, suffice it to say that it’s difficult to keep hackers out of the system.

  Analogies are imperfect, but they can be instructive. During the fall of 2014 we were reminded incessantly that terms we have grown accustomed to associating with the Internet were, in fact, borrowed from the field of medicine. The shared terminology is not accidental. How better to describe the spread of an alien intruder through a computer program than as a “virus”? “Hygiene,” “anti-virus protection,” “immunity,” and “vaccination” are terms commonly used in both medicine and cyberspace; each refers to a defense against “bugs.” Searching for the origin of a virus, for “patient zero,” is strikingly similar to examining the outer edges of a network that has been infected by malware. One failure to disinfect, to follow the proper protocol, and a virus spreads from carrier to carrier or program to program.

  When Thomas Eric Duncan, infected with the Ebola virus, was finally admitted to Dallas’s Texas Health Presbyterian Hospital in the fall of 2014, he was placed in isolation, where he was treated by nurses Nina Pham and Amber Vinson. Each nurse wore two gowns, two pairs of gloves, shoe covers, a surgical mask, and a face shield. It was not enough. Duncan was in the worst throes of the disease, subject to explosive diarrhea and projectile vomiting. Either because some of Duncan’s bodily fluid splashed onto an exposed portion of their necks or because they inadvertently touched some of the fluid while taking off their gloves, Pham and Vinson became infected with the Ebola virus. Both women ultimately recovered, but in each case, the tiniest margin of error provided an opportunity for infection. In the context of cyberattacks, that vulnerable spot, the equivalent of the small exposed portion of the neck, is what specialists call an “attack surface”—a vulnerable entry point to what is believed to be a secure operating system.

  In terms of the power grid, the number of attack surfaces has increased exponentially with the integration of everyday devices on the Internet. Whereas a prospective hacker in the past might have had to go after a server or a desktop computer to gain access to an electric company’s corporate network, now he can do it by way of the devices that enable a consumer to program the lighting or heating and air-conditioning in his home remotely or automatically. The “smart” thermostat that automatically lowers the temperature in a customer’s home at night or warms his kitchen before he gets up in the morning has to be connected to the company’s billing department, which in turn needs to be connected to whatever department actually conveys electricity to the home. Each connection provides another potential attack surface.
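  The chain just described can be restated as a graph problem: every link between systems is an edge, and every internet-facing device is a possible point of entry. In the hypothetical sketch below (the system names are invented for illustration), a standard breadth-first search asks whether an intruder starting at a consumer device can ever reach the operational network:

```python
from collections import deque

# Hypothetical map of connections inside a utility; every edge is a pathway.
connections = {
    "smart_thermostat":   ["billing_dept"],
    "customer_portal":    ["billing_dept"],
    "billing_dept":       ["corporate_network"],
    "corporate_network":  ["operations_network"],  # the link that shouldn't exist
    "operations_network": [],
}

def reachable(graph, start, target):
    """Breadth-first search: can an intruder at `start` reach `target`?"""
    queue, seen = deque([start]), {start}
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

for device in ("smart_thermostat", "customer_portal"):
    print(device, "->", reachable(connections, device, "operations_network"))
```

  Every edge added to that map, each new app, vendor link, or customer convenience, gives the search more paths to find. That is what it means for attack surfaces to multiply.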

  In theory, the administrative network is “air-gapped” from the operational side of each power company, meaning that there is no physical connection between the two. Power companies insist that those two networks are absolutely separate and not connected. Whenever Homeland Security or the Federal Energy Regulatory Commission has hired computer forensic experts to investigate this claim, however, they have found minute connections. A Verizon/Secret Service study concluded that two-thirds of companies across a spectrum of industries didn’t realize they had been breached until someone outside the company informed them. Another study, conducted by the cybersecurity firm FireEye, found that it took on average 279 days before companies that had been breached came to realize it or were told by someone else.

  The problem with air-gapping, one academic specialist warned me, is that it fails to take the human factor into account: “Every time a worker brings in a thumb drive or laptop from home and hooks it up to an ‘isolated’ system, the mobility of workers bridges the air gap.” As workers and users of the two systems transfer work to their personal computers at home, or from their smartphones or other interconnected networks back to what is supposed to be an isolated, secure system, they run the risk of infecting the operational network. Would-be hackers, operating on what is sometimes referred to as the “Sneakernet,” can introduce their malware, their viral programs, by way of an employee’s insecure iPhone or thumb drive. The insecure thumb drive becomes the cyber equivalent of the tiny exposed portion of the nurse’s neck. Anything less than absolute hygiene provides a potential attack surface.

  Absolute hygiene may be theoretically possible, but it would be prohibitively expensive—not just for smaller, local companies but also for regional authorities tasked with monitoring vast areas. PJM, for example, is a regional transmission organization that handles the output of more than thirteen hundred generating sources, which it then distributes to more than six thousand transmission substations, serving fourteen states on the Eastern Interconnection grid. George Cotter, former chief scientist for the NSA, explained that the larger RTOs use literally tens of thousands of devices, many of which are insecure, and all of which are interconnected. “A company like PJM,” said Cotter, “isn’t going to buy seventy thousand crypto devices. They’ll wait until the companies that manufacture them build cryptography into them, and that’s decades.”

  Time is not on our side.

  5

  Guardians of the Grid

  We can’t defend against everything, but right now we’re vulnerable to just about everything.

  — MAJOR GENERAL BRETT WILLIAMS, FORMER DIRECTOR OF OPERATIONS, U.S. CYBER COMMAND

  Certainly no one has a greater interest in protecting the security of the electric power industry than the industry itself—if only cost were not a factor and profit were not an essential ingredient of staying in business. It is not altogether reassuring, then, to consider that the only institution with real power to decide how the power industry is protected is the power industry. In evaluating industry regulations in 2012, the nonpartisan Congressional Research Service questioned the entire arrangement, calling it “unusual” and observing that it “may potentially be a conflict of interest” for an industry to legislate its own standards. Corporate leaders dismiss this, arguing that government lacks the expertise to run the industry. On the other hand, they are open to working with government agencies in an advisory or supporting role, given the government’s undeniable expertise in computing technology and security. It is, say industry representatives, a perfect partnering opportunity.

  Three times a year, the Electricity Sector Coordinating Council, led by chief executive officers of the power industry, meets with senior administration officials. The council arranges for the deployment of security tools developed by the Defense Department, the National Security Agency, the Pacific Northwest National Laboratory, and the Department of Homeland Security’s Science and Technology Directorate. While these agencies do pass along new technology when available, in large part what they convey to the power industry is information—reports on risk assessment designed to improve the industry’s situational awareness and to help it detect as quickly as possible the presence of anomalies in the system.

  It sounds encouraging: making sure that the top people in industry are getting the most up-to-date information at the right time. But as Major General Brett Williams explained, progress is halting. As the director of operations for U.S. Cyber Command until his retirement, Williams speaks with considerable authority. He warns that while the United States has the best cyber offense in the world, the same is not true of the nation’s cyber defense. When he left Cyber Command in the spring of 2014, threat information was still being communicated between U.S. intelligence agencies and the electric power industry the “old-fashioned” way, via phone calls and emails. On the cyber battlefield, Williams explained, information needs to be communicated instantaneously. It’s not that the technological ability to deliver automated, machine-to-machine warnings doesn’t exist. Indeed, in 2013 President Obama issued an executive order making information sharing between critical industry and government cybersecurity services a priority. The order did not spell out the means by which information is to be communicated, however, nor is making something a “priority” a mandate. Not surprisingly, industry and government have different definitions of what constitutes a “priority.”

  On the intelligence side of the equation, the obstacle is protecting sources and methods. Williams explained it this way: “If you look back at classic signals intelligence, you’ve got a bug in some foreign leader’s office and you’ve got a piece of information there. If you use that information to make a decision, they’re going to know that the only place that information could have come from is a bug in the office. We apply that same logic to things in cyberspace.” We simply don’t have the luxury, Williams argued, of continuing to place the protection of sources and methods ahead of instant response. Cyber defense demands speed. “I would argue we just don’t have time to go through the same process that we go through with legacy types of intelligence. Obviously, we can’t defend against everything; but right now we’re vulnerable to almost everything.”

  Industry and government leaders alike recognize the threat, according to Williams. “There’s plenty of people that understand what needs to be done. It’s policy and it’s money” standing in the way. The policy discussion currently under way, Williams explained, is establishing how far the Defense Department’s role extends in defending the homeland against cyberattack. As things now stand, he told me, Cyber Command doesn’t defend private industry; “the policy hasn’t even matured enough to defend critical infrastructure.” For any sort of cyber defense system to efficiently protect the electric power industry, information sharing has to be a two-way street. Corporations will have to get over their privacy and liability concerns and give government agencies the security data those agencies say they need in order to be effective. The military and intelligence agencies, in turn, need to make information relating to cyber threats available in real time, setting aside worries about jeopardizing sources and methods.

  Scott Aaronson, national security director for Edison Electric Institute, the industry’s trade organization, made it appear that the policy of sharing security data in real time was up and running. The program, as he later conceded, is actually in its infancy. Aaronson cited an initiative called CRISP, the Cyber Risk Information Sharing Program. All network traffic coming into the program from the outside, Aaronson explained, is fed through an ISD, or information sharing device. It doesn’t block information; it simply analyzes the traffic, comparing it to classified and unclassified government databases. It can identify the IP addresses that need to be blocked and the malware that threatens secure operations. What’s more, said Aaronson, it does all of this in real time.

  A senior power company executive described what sounded like the same program in different terms. Cyber traffic designated as coming from “friendly” servers is “whiteboarded.” Servers found to be connected with unfriendly foreign states, criminals, or hackers are tagged and blocked, or “blackboarded.”
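  Stripped of the jargon, both descriptions amount to the same pipeline: compare each flow of traffic against threat-intelligence indicators and tag the matches. The sketch below is entirely hypothetical (the indicator values are placeholders drawn from documentation address ranges, and the actual ISD compares traffic against classified databases in ways far more sophisticated than simple set membership), but it conveys the basic shape, analyze and flag rather than block:

```python
# Hypothetical sketch of indicator matching; all values are placeholders.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}   # from threat databases
KNOWN_MALWARE_HASHES = {"9f2b...", "c41d..."}      # truncated placeholders

def analyze(flow):
    """Tag a traffic flow against threat intelligence without blocking it."""
    alerts = []
    if flow["src_ip"] in KNOWN_BAD_IPS:
        alerts.append(f"known-bad source {flow['src_ip']}")
    if flow.get("payload_hash") in KNOWN_MALWARE_HASHES:
        alerts.append("payload matches a known malware signature")
    return alerts

flows = [
    {"src_ip": "192.0.2.10", "payload_hash": "ab12..."},
    {"src_ip": "203.0.113.7", "payload_hash": "9f2b..."},
]
for flow in flows:
    for alert in analyze(flow):
        print("ALERT:", alert)  # passed along to analysts in near-real time
```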

  If that sounds a little too good to be entirely true, it is, just a little. If and when CRISP is deployed across the entire electric power industry, it will be an enormous step forward. But, as Aaronson acknowledged, creating a private-sector model around a government technology is not a simple task. As of early 2015, CRISP was operating at what Aaronson called “near-real time.” Machine-to-machine information was actually being shared between the government and fifteen companies—fifteen out of roughly three thousand, or about half of 1 percent of the entire industry.

  How long, I asked, before CRISP is deployed across the entire industry?

  “We will double this year. We will get to at least thirty [companies] by the end of 2015. I think there’s a goal to be at forty.” The information gathered by that tiny fragment is being “socialized” with thousands of other companies, but this is not happening in anything approaching real time and has not yet reached the point where it can be described as offering a serious defense of the industry at large.

 
