Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon


by Kim Zetter


  23 Dan Morain, “Hackers Victimize Cal-ISO,” Los Angeles Times, June 9, 2001.

  24 A previous SCADA testing program had launched at Sandia National Laboratories in 1998, but didn’t involve vendors. INL is the Department of Energy’s lead lab for nuclear energy research and runs the largest test reactor in the world. The Atomic Energy Commission took over the land in Idaho after World War II to build a nuclear research lab. Over the years, the lab’s work expanded to include research on the electric grid and, after the Bush administration released its National Strategy to Secure Cyberspace in February 2003, the security of industrial control systems. That strategy called for the Department of Energy and the Department of Homeland Security (DHS) to partner with private industry to address the security of control systems.

  25 Department of Homeland Security, “The National Strategy for the Physical Protection of Critical Infrastructures and Key Assets” (report, The White House, February 2003), 9. Available at dhs.gov/xlibrary/assets/Physical_Strategy.pdf.

  26 Ibid., 39.

  27 One problematic loophole in the test program, however, is that the vulnerability reports vendors receive are covered by a nondisclosure agreement, and the vendors are not required to tell customers about the flaws found in their systems.

  28 Kim Zetter, “Hard-Coded Password and Other Security Holes Found in Siemens Control Systems,” Wired.com, August 3, 2011, available at wired.com/2011/08/siemens-hardcoded-password.

  29 Joe Weiss estimates that more than half of all control systems have a back door embedded in them by the vendor.

  30 Beresford also found one more surprise—an “Easter egg” that a Siemens programmer had hidden in the firmware. Easter eggs are inside jokes that coders bury in their programs for users to find. Often they can be seen only if a user types a specific sequence of keys or accesses an obscure part of the program. In Siemens’s case, the Easter egg consisted of an animated image of dancing chimpanzees that appeared on-screen with a German proverb. Loosely translated, the proverb said, “All work and no play makes Jack a dull boy.” Though the Easter egg wasn’t malicious, it raised serious concerns about Siemens’s security. If a programmer had slipped the joke past the company’s internal code reviewers, what else might have slipped by them?

  31 In 2013, two researchers found problems with a popular protocol used by control centers to communicate with PLCs and RTUs installed at substations. An intruder who couldn’t gain direct access to the control-center machine via the internet could compromise the communication device at a remote substation—either by accessing it physically or hacking into the wireless radio network it uses to communicate with the control center—and exploit a vulnerability in the protocol to send malicious commands to the control center. In this way an attacker could either crash the control-center machine or use it to distribute malicious commands to all of the substations with which that machine communicates, potentially taking out dozens or even hundreds of substations at a time, depending on the size of the utility. See Kim Zetter, “Researchers Uncover Holes That Open Power Stations to Hacking,” Wired.com, October 16, 2013, available at wired.com/2013/10/ics.

  32 Jordan Robertson, “Science Fiction–Style Sabotage a Fear in New Hacks,” Associated Press, October 23, 2011, available at news.yahoo.com/science-fiction-style-sabotage-fear-hacks-120704517.html.

  33 In 2003, according to Joe Weiss, when the SQL Slammer worm hit the internet, one control-system supplier warned its customers not to install a patch released by Microsoft to combat the worm, because the patch would shut down their system.

  34 Safety systems at nuclear plants, luckily, are still controlled by analog means, according to Joe Weiss, and the risk of a core meltdown being caused by a cyber incident is very low for existing nuclear plants. But that could change, he says, since designs for next-generation plants include digital, networked systems that could make it easier to attack such plants.

  35 Kim Zetter, “10K Reasons to Worry About Critical Infrastructure,” Wired.com, January 24, 2012, available at wired.com/2012/01/10000-control-systems-online. Researcher Eireann Leverett was unable to determine how many of the control systems were working systems as opposed to demo systems, and couldn’t tell how many of them were critical systems as opposed to simply an office heating system at a plant. But he did identify control systems for water facilities in Ireland and sewage facilities in California among them, and even controls for a heating system can sometimes be leveraged by attackers to access other parts of a network. And only 17 percent of the 10,000 systems he found required authorization to connect to them. In some cases, the owners weren’t even aware their systems were accessible online.

  36 Paul F. Roberts, “Hacker Says Texas Town Used Three Character Password to Secure Internet Facing SCADA System,” Threatpost blog, November 20, 2011, available at threatpost.com/blogs/hacker-says-texas-town-used-three-character-password-secure-internet-facing-scada-system-11201/75914.

  37 His statements appeared on the Pastebin site on November 18, 2011. See pastebin.com/Wx90LLum.

  38 Ken Dilanian, “Virtual War a Real Threat,” Los Angeles Times, March 28, 2011.

  39 Kim Zetter, “Chinese Military Linked to Hacks of More Than 100 Companies,” Wired.com, February 19, 2013, available at wired.com/2013/02/chinese-army-linked-to-hacks. For more information on the specifics of the Telvent hack, see also Kim Zetter, “Maker of Smart-Grid Control Software Hacked,” Wired.com, September 26, 2012, available at wired.com/2012/09/scada-vendor-telvent-hacked.

  40 “Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack,” April 2008, available at empcommission.org/docs/A2473-EMP_Commission-7MB.pdf. See also footnote 25 for a description of an intentional electromagnetic pulse attack plan.

  41 A 1996 RAND study titled “The Day After … in Cyberspace” was one of the first to imagine the consequences of a multipronged attack that targeted planes, trains, phone systems, and ATMs on a number of continents. See Robert H. Anderson and Anthony C. Hearn, “An Exploration of Cyberspace Security R&D Investment Strategies for DARPA: The Day After … in Cyberspace II,” RAND, 1996, available at rand.org/pubs/monograph_reports/MR797.html.

  42 Bill Gertz, “Computer-Based Attacks Emerge as Threat of Future, General Says,” Washington Times, September 13, 2011.

  43 Joe P. Hasler, “Investigating Russia’s Biggest Dam Explosion: What Went Wrong,” Popular Mechanics, February 2, 2010.

  44 “Pipeline Rupture and Subsequent Fire in Bellingham, Washington June 10, 1999,” National Transportation Safety Board, 2002, available at ntsb.gov/doclib/reports/2002/PAR0202.pdf.

  45 “Pacific Gas and Electric Company Natural Gas Transmission Pipeline Rupture and Fire,” National Transportation Safety Board, September 9, 2010, available at ntsb.gov/investigations/summary/PAR1101.html.

  46 J. David Rogers and Conor M. Watkins, “Overview of the Taum Sauk Pumped Storage Power Plant Upper Reservoir Failure, Reynolds County, MO,” presented at the 6th International Conference on Case Histories in Geotechnical Engineering, Arlington, VA, August 11–16, 2008, available at web.mst.edu/~rogersda/dams/2_43_rogers.pdf.

  47 Emitt C. Witt III, “December 14th, 2005 Taum Sauk Dam Failure at Johnson’s Shut-Ins Park in Southeast Missouri,” National Oceanic and Atmospheric Administration, available at crh.noaa.gov/lsx/?n=12_14_2005.

  48 Lyndsey Layton, “Metro Crash: Experts Suspect System Failure, Operator Error in Red Line Accident,” Washington Post, June 23, 2009.

  49 Graeme Baker, “Schoolboy Hacks into City’s Tram System,” Telegraph, January 11, 2008.

  50 From author interview, August 2012.


  51 A YouTube video of the simulation can be seen online at youtube.com/watch?v=kc_ijB7VPd8. Or see links to Davis’s presentation slides and two other smart meter simulations at ioactive.com/services_grid_research.html.

  52 NERC has cyber security regulations that utilities are supposed to follow, but they apply only to bulk electric systems (defined as facilities and systems that operate at or above 100 kilovolts) and compliance doesn’t guarantee a system won’t get hacked. Security is an evolving condition, not a static one, and can change anytime new equipment is installed or configurations are changed.

  53 US-Canada Power System Outage Task Force, “Final Report on the August 14th Blackout in the United States and Canada,” April 2004, available at https://reports.energy.gov/BlackoutFinal-Web.pdf.

  54 Kevin Poulsen, “Software Bug Contributed to Blackout,” SecurityFocus.com, February 11, 2004, available at securityfocus.com/news/8016.

  55 Rebecca Smith, “U.S. Risks National Blackout from Small-Scale Attack,” Wall Street Journal, March 12, 2014.

  56 The scenario was similar to a real-life incident that occurred at a Coors bottling plant in 2004, when an employee mistakenly changed the settings on a system responsible for greasing the bearings on a bottling line. Instead of greasing the bearings every twenty minutes, he set it to grease them every eight hours, and eventually the bottling line seized up.

  57 Justin Blum, “Hackers Target US Power Grid,” Washington Post, March 11, 2005.

  58 Florida Power and Light, “FPL Announces Preliminary Findings of Outage Investigation,” February 29, 2008, available at fpl.com/news/2008/022908.shtml.

  59 From an undated DHS slide presentation obtained through a FOIA request made by the author. The slide presentation is titled “Control Systems Vulnerability—Aurora.”

  60 The figure comes from a cost assessment developed for the Aurora test and released by DHS in the author’s FOIA request.

  61 As an example of what can happen when the coupling on a turbine is damaged: in 2011, a steam turbine generator at a power plant in Iranshahr, Iran, exploded, and the blast was attributed to a coupling failure. The explosion was so forceful that investigators couldn’t even find the power turbine after the accident. Couplings need to be inspected regularly for signs of wear and need to be lubricated to maintain operations and prevent accidents. The plant in Iranshahr had three oil burners in the room where the generator was installed, which likely exacerbated the explosion when it occurred. The explosion could indeed have been the result of a badly maintained coupling or faulty installation, but there were some at the time who thought it might have been the result of sabotage on par with the Aurora attack.

  62 Author interview, August 2012.

  63 Ibid.

  64 60 Minutes, “Cyber War: Sabotaging the System,” original air date November 8, 2009, CBS.

  CHAPTER 10

  PRECISION WEAPON

  Ralph Langner sat in his Hamburg office and watched as his two engineers fed a stream of artful lies to the Stuxnet code they had installed on their test machine. Langner, an expert in the arcane field of industrial-control-system security, had been working with his colleagues for days to identify and re-create the precise conditions under which the stubborn code would release its payload to their PLC, but it was proving to be more difficult than they’d expected.

  Days earlier, Langner’s team had set up a computer with the Siemens Step 7 software installed and connected it to a Siemens PLC they happened to have on hand. They also installed a network analyzer to watch data as it passed between the Step 7 machine and the PLC. Unlike the Symantec researchers, Langner and his team worked with PLCs all the time and knew exactly what kind of traffic should pass between the Step 7 machine and the PLC; as a result, they assumed it would be easy to spot any anomalies in the communication. But when they initially infected their Step 7 system with Stuxnet, nothing happened. Stuxnet, they discovered, as others had before, was on the hunt for two specific models of Siemens PLC—the S7-315 and S7-417—and they didn’t have either of these models on hand.

  So they installed a Windows debugger on their test machine to observe the steps Stuxnet took before releasing its payload and devised a way to trick the code into thinking it had found its target. Stuxnet ran through a long checklist pertaining to the target’s configuration, each item seemingly more specific than the last. Langner and his colleagues didn’t know what exactly was on the checklist, but they didn’t need to know. As Stuxnet queried their system for each item on the list, they fed it a series of manufactured responses, until they landed on the answers Stuxnet wanted to hear. It was a crude, brute-force method of attack that took several days of trial and error. But when they finally got the right combination of answers and ran the code through its paces one last time, they saw exactly what the Symantec researchers had described: Stuxnet injected a series of rogue code blocks into their PLC. “That’s it,” Langner recalls thinking. “We got the little motherfucker.”1

  They noticed the rogue code going into the PLC only because the blocks of code were slightly larger than they should have been. Before infecting their Step 7 system with the malware, they had transferred blocks of code to the PLC and captured them with the analysis tool to record their basic size and characteristics. After infecting the machine with Stuxnet, they transferred the same blocks of code again and saw that they had suddenly grown.

  They couldn’t yet see what the Stuxnet code was doing to the PLC, but the injection itself was big news. It was way beyond anything they’d ever warned customers about and way beyond anything they expected to see in the first known attack against a PLC.

  WHEN SYMANTEC HAD revealed on August 17 that Stuxnet was bent on sabotaging PLCs, it might have seemed to Chien and Falliere that no one was listening. But six thousand miles away, Langner was sitting in his small office in a leafy suburb of Hamburg, Germany, reading Symantec’s words with great interest. Langner had been warning industrial clients for years that one day someone would devise a digital attack to sabotage their control systems, and now, it appeared, the day had finally arrived.

  Langner was the owner of a three-man boutique firm that specialized in the security of industrial control systems. It was the only thing his company did. He had no interest in general computer security and couldn’t care less about announcements warning of the latest viruses and worms infecting PCs. Even zero-day exploits held no allure for him. So when Stuxnet first made headlines in the technology press and became the subject of extensive chatter on security forums, he paid it little notice. But when Symantec wrote that Stuxnet was sabotaging Siemens PLCs, Langner was immediately intrigued.

  Symantec didn’t reveal what Stuxnet was doing to the PLCs, only that it was injecting code into the so-called ladder logic of the PLC—whether that meant bringing the PLC to its knees, or worse, the antivirus firm didn’t say.2 But it struck Langner that thousands of Siemens customers, including many of his own clients, were now facing a potential killer virus and were waiting anxiously for Siemens or Symantec to tell them what exactly Stuxnet was doing to their PLCs. But, oddly, after making their startling announcement, the Symantec researchers had gone quiet.

  Langner suspected the researchers had hit a wall, due to their lack of expertise with PLCs and industrial control systems. But curiously, Siemens had also gone silent. This was strange, Langner thought. It was, after all, Siemens controllers that were being attacked; the company had an obligation to analyze the malevolent code and tell customers what it might be doing to their systems. But after a couple of brief announcements in July, the German company had gone mum.3

  Langner was incensed. Although Stuxnet appeared to be targeting only Siemens Step 7 machines, no one really knew what the malicious code was capable of doing or if it might be laced with bugs that could damage other PLCs. And there was one more important concern: the vulnerability that let Stuxnet inject its malicious code into the ladder logic of a Siemens PLC also existed in other controllers.4 Samples of Stuxnet were already available for download on the internet; any random hacker, criminal extortionist, or terrorist group could study the code and use it as a blueprint to devise a more wide-scale and destructive attack against other models of PLCs.

  This made the silence of two other parties even more perplexing—the CERT-Bund, Germany’s national computer emergency response team; and ICS-CERT in the United States. Both organizations were tasked with helping to secure critical infrastructure systems in their respective countries, but neither party had said much about Stuxnet. There was no talk in an ICS-CERT alert about injecting ladder logic into the Siemens PLCs, or even any mention of sabotaging them. There was also nothing at all about the dangers that Stuxnet presented for future attacks.5 The silence of German authorities was even stranger, since Siemens controllers were installed in almost every German plant or factory Langner could name.

  Langner talked it over with his two longtime engineers, Ralf Rosen and Andreas Timm. None of them had any experience reverse-engineering viruses or worms, but if no one else was going to tell them what Stuxnet was doing to the Siemens PLCs, then they would have to take it apart themselves. It would mean days of doing pro bono work squeezed in between other assignments from paying customers, but they concluded they didn’t have a choice.

  Langner and his colleagues made an odd but effective team. In a profession sometimes characterized by frumpy, pale, and ponytailed engineers, Langner, a vigorous fifty-two-year-old with short dark hair minus any gray, sported crisp business suits and finely crafted leather shoes. He had piercing blue eyes in a vacation-tanned face and the trim, toned frame of a seasoned mountaineer—the by-product of ski excursions in the Alps and rugged hikes in the hills. If Langner’s dapper appearance didn’t set him apart from the pack, his brusque and bold manner did. He had a reputation for being an outspoken maverick, and often made provocative statements that riled his colleagues in the control-system community. For years he had railed about security problems with the systems, but his blunt, confrontational manner often put off the very people who most needed to listen. Rosen and Timm, by contrast, were both graybeard engineers in their forties who had a more relaxed approach to dress and fitness and took a quieter, backseat role to Langner’s conspicuous one.

 
