
Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon


by Kim Zetter


  Ulasen tried to wrap his head around the number of machines that were at risk of infection from this. But then something equally troubling struck him. The malicious driver module, and another driver module that got dropped onto targeted machines as part of the malicious cargo, had installed themselves seamlessly on their test machine, without any warning notice popping up on-screen to indicate they were doing so. Windows 7 had a security feature that was supposed to tell users if an unsigned driver, or one signed with an untrusted certificate, was trying to install itself on their machine. But these two drivers had loaded with no problem. That was because, Ulasen realized with alarm, they were signed with what appeared to be a legitimate digital certificate from a company called RealTek Semiconductor.10

  Digital certificates are trusted security documents, like digital passports, that software makers use to sign their programs to authenticate them as legitimate products of their company. Microsoft digitally signs its programs and software updates, as do antivirus firms. Computers assume that a file signed with a legitimate digital certificate is trustworthy. But if attackers steal a Microsoft certificate and the private cryptographic “key” that Microsoft uses with the certificate to sign its files, they can fool a computer into thinking their malicious code is Microsoft code.
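  The asymmetry described here (only the holder of the private key can produce a signature, while anyone holding the certificate's public key can check one) can be sketched in a few lines of Python. The toy below uses textbook RSA with tiny demonstration primes; it is not real cryptography and not how Windows actually validates driver signatures, just an illustration of why possession of the private key lets an attacker sign anything.

```python
# Toy sketch of the asymmetry behind code signing: a private exponent signs,
# the public exponent (shipped inside the vendor's certificate) verifies.
# Textbook RSA with tiny demo primes (p=61, q=53). NOT real cryptography,
# and not Windows Authenticode; the numbers are purely illustrative.

n, e = 3233, 17      # public: modulus and exponent, published in the cert
d = 2753             # private: the secret the Stuxnet attackers needed

def sign(file_hash: int) -> int:
    """Only the private-key holder can produce this value."""
    return pow(file_hash, d, n)

def verify(file_hash: int, signature: int) -> bool:
    """Anyone holding the certificate can check it."""
    return pow(signature, e, n) == file_hash

legit_hash, evil_hash = 1234, 999   # stand-ins for file digests
sig = sign(legit_hash)
print(verify(legit_hash, sig))      # True:  genuine file accepted
print(verify(evil_hash, sig))       # False: signature doesn't transfer
print(verify(evil_hash, sign(evil_hash)))  # True: stolen key signs anything
```

  In real code signing the signed value is a cryptographic hash of the whole file, the modulus is thousands of bits long, and the certificate binds the public key to the vendor's name; but the principle is the same, which is why a stolen key and cert defeat the entire scheme.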

  Attackers had used digital certificates to sign malicious files before. But they had used fake, self-signed certificates masquerading as legitimate ones, or had obtained real certificates through fraudulent means, such as creating a shell company to trick a certificate authority into issuing them a certificate under the shell company’s name.11 In both scenarios, attackers ran the risk that machines would view their certificate as suspicious and reject their file. In this case, the attackers had used a valid certificate from RealTek—a trusted hardware maker in Taiwan—to fool computers into thinking the drivers were legitimate RealTek drivers.

  It was a tactic Ulasen had never seen before and it raised a lot of questions about how the attackers had pulled it off. One possibility was that they had hijacked the computer of a RealTek software developer and used his machine and credentials to get their code secretly signed.12

  But it was also possible the attackers had simply stolen the signing key and certificate, or cert. For security reasons, smart companies store their certs and keys on offline servers or in hardware security modules that offer extra protection. But not everyone does this, and there were possible clues to suggest that RealTek’s cert had indeed been nabbed. A timestamp on the certificates showed that both of the drivers had been signed on January 25, 2010. Although one of the drivers had been compiled a year earlier on January 1, 2009, the other one was compiled just six minutes before it was signed. The rapid signing suggested the attackers might have had the RealTek key and cert in their possession.

  The implications were disturbing. The use of a legitimate digital certificate to authenticate malicious files undermined the trustworthiness of the computer world’s signing architecture and called into question the legitimacy of any file signed with digital certificates thereafter. It was only a matter of time before other attackers copied the tactic and began stealing certificates as well.13 Ulasen needed to get the word out.

  Responsible disclosure dictates that researchers who find vulnerabilities in software notify the relevant vendors before going public with the news to give the vendors time to patch the holes, so Ulasen dashed off e-mails to both RealTek and Microsoft, notifying them of what his team had found.

  But after two weeks passed with no response from either company, Ulasen and Kupreev decided they couldn’t keep quiet.14 The rest of the security community needed to know about the .LNK exploit. They had already added signatures to VirusBlokAda’s antivirus engine to detect the malicious files and were seeing infections pop up on machines all over the Middle East and beyond. The worm/virus was on the loose and spreading quickly. They had to go public with the news.15

  So on July 12, Ulasen posted a brief announcement about the zero-day to his company’s website and to an online English-language security forum, warning that an epidemic of infections was about to break out.16 He divulged few details about the hole it was attacking, to avoid giving copycat hackers information that would help them exploit it. But members of the forum grasped the implications quickly, noting that it had the potential to be “deadly to many.”

  Three days later, tech journalist Brian Krebs picked up the announcement and wrote a blog post about it, summarizing what little was known about the vulnerability and exploit at the time.17 The news raced through the security community, causing everyone to brace for a wave of assaults expected to come from the worm and copycat attacks using the same exploit.18 In the meantime, the head of an institute in Germany that researched and tested antivirus products brokered an introduction between Ulasen and his contacts at Microsoft, prompting the software company to begin work on a patch.19 But with news of the vulnerability already leaked, Microsoft decided to release an immediate advisory about the critical flaw to customers, along with a few tips advising them how to mitigate their risk of infection in the meantime. In the absence of a patch, however, which wouldn’t be released for another two weeks, it was far from a cure.20

  The computer security industry also rumbled into action to address the worm that now had a name—“Stuxnet,” an alias Microsoft conjured from letters in the name of one of the driver files (mrxnet.sys) and another part of the code. As security companies added signatures to their engines to detect the worm and its exploit, thousands of malicious files started showing up on the machines of infected customers.21

  Almost immediately, another surprise emerged. On July 17, an antivirus firm in Slovakia named ESET spotted another malicious driver that appeared to be related to Stuxnet. This one was also signed with a digital certificate from a company in Taiwan, though not from RealTek. Instead, it came from a company called JMicron Technology, a maker of integrated circuits.

  The driver was discovered on a computer by itself, without any of Stuxnet’s other files, but everyone assumed it must be related to Stuxnet since it shared similarities with the other drivers that VirusBlokAda had found.22 There was something notable about the compilation date of this driver, however. When hackers ran their source code through a compiler to translate it into the binary code that a machine could read, the compiler often placed a timestamp in the binary file. Though attackers could manipulate the timestamp to throw researchers off, this one appeared to be legitimate. It indicated that the driver had been compiled on July 14, two days after VirusBlokAda had gone public with news of Stuxnet. Had the Stuxnet hackers unleashed the driver in a new attack, completely oblivious to the fact that an obscure antivirus firm in Belarus had just blown their cover? Or had they known their stealth mission was about to be exposed and were racing to get Stuxnet onto more machines before it would be blocked? There were clues that the attackers had missed a few steps while signing the driver with the JMicron cert, which suggested they may indeed have been in a hurry to get their attack code out the door and onto machines.23 One thing was clear, though: the attackers had needed this new certificate to sign their driver because the RealTek certificate had expired a month earlier, on June 12. Digital certificates have a limited life-span, and once RealTek’s expired, the attackers could no longer use it to sign new files. The certificate was also revoked by certificate authorities once Stuxnet was exposed, which meant that Windows machines would now reject or flag any files that had already been signed with it.24
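  The compiler timestamp the researchers relied on is a real field in the Windows executable format: the PE/COFF file header stores a 32-bit Unix timestamp, TimeDateStamp, eight bytes past the "PE\0\0" signature, whose own position is given by the 32-bit value at byte 0x3C of the file. A minimal sketch in Python (the header bytes below are fabricated for demonstration, not taken from a real binary):

```python
# Sketch: extracting the compile timestamp from a Windows PE file.
# The 32-bit value at offset 0x3C points to the "PE\0\0" signature;
# the COFF header's TimeDateStamp field sits 8 bytes past that signature.
import struct
from datetime import datetime, timezone

def pe_compile_time(data: bytes) -> datetime:
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("not a PE file")
    ts = struct.unpack_from("<I", data, pe_offset + 8)[0]
    return datetime.fromtimestamp(ts, tz=timezone.utc)

# Minimal fabricated header for demonstration (not a runnable binary):
blob = bytearray(0x60)
struct.pack_into("<I", blob, 0x3C, 0x40)          # signature lives at 0x40
blob[0x40:0x44] = b"PE\x00\x00"
struct.pack_into("<I", blob, 0x48, 1279065600)    # 2010-07-14 00:00:00 UTC
print(pe_compile_time(bytes(blob)))               # 2010-07-14 00:00:00+00:00
```

  Because the field is just four writable bytes, attackers can stamp any date they like, which is why researchers treat such timestamps as a clue rather than proof.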

  The discovery of the second certificate led to more speculation about how the hackers had obtained these security documents. RealTek and JMicron were both headquartered just two blocks away from each other in the Hsinchu Science and Industrial Park in Hsinchu City, Taiwan. Given their geographic proximity, some speculated that the attackers may have physically broken into the two offices to steal the digital signing keys and certs. Others speculated that the People’s Republic of China was behind the Stuxnet attack and had hacked the two Taiwanese companies to get their digital signing keys and certificates.

  Whatever the scenario, it meant the attackers likely had other stolen digital certificates in their arsenal. And if they had gone to this much trouble to make sure their attack would work, it likely meant they had a serious goal and considerable means at their disposal. Many in the security community were left feeling very uneasy and perplexed. “We rarely see such professional operations,” ESET researcher Pierre-Marc Bureau remarked online.25

  As antivirus firms examined the Stuxnet files pouring in from customers, they got another surprise. Based on dates in some of the files, it appeared that Stuxnet had been launched in the wild as early as June 2009, which meant it had been lurking on machines for at least a year before VirusBlokAda discovered it. It also appeared that the attackers had unleashed their attack in three different waves—in June 2009, and in March and April 2010—changing the code slightly in each of these waves.

  One thing that was still a mystery, though, was Stuxnet’s intention. Researchers could find no sign in any of the files that Stuxnet was stealing bank account passwords or other credentials the way so much other malware was designed to do. Neither could they find signs of any other obvious motive in the code. That is, until a researcher in Germany found one possible clue suggesting Stuxnet’s aim.

  “Hi guys,” Frank Boldewin wrote to the online forum where Ulasen had first published his notice about Stuxnet, “has anyone … taken a deeper look at the malware?” Boldewin had unwrapped the first layer of covering on one of Stuxnet’s files and found unusual references inside to software made by the German firm Siemens. The attackers appeared to be searching for computers that had one of two Siemens proprietary software programs installed—either Siemens SIMATIC Step 7 software or its SIMATIC WinCC program. Both programs are part of an industrial control system (ICS) designed to work with Siemens programmable logic controllers (PLCs)—small computers, generally the size of a toaster, that are used in factories around the world to control things like the robot arms and conveyor belts on assembly lines.

  Boldewin had never seen malware targeting an industrial control system before. There was no obvious financial gain to be made from hacking factory equipment like PLCs, at least not the kind of quick cash that could be made from hacking bank accounts and credit card systems. It could mean only one thing to him. “Looks like this malware was made for espionage,” he wrote.26 The attackers must have been looking to steal a competitor’s factory design or their product blueprints.

  It was an assessment that many in the tech community were all too happy to embrace. Stuxnet appeared to be targeting only systems with the Siemens software installed, which meant that any computer not using the Siemens programs was presumably safe, and their owners could relax. The systems in Iran that were caught in the reboot loop didn’t have the Siemens software installed, Ulasen discovered, and aside from the system crashes they experienced, it appeared that Stuxnet had caused them no lingering harm.

  So within a week or so after the mysterious worm’s brief brush with fame, it appeared that Stuxnet was on its way out the door to lasting obscurity. Microsoft was still working on a patch to close the security hole the .LNK exploit attacked, but as far as most security companies were concerned, once they added signatures to their scanners to detect the worm’s malicious files, Stuxnet held no further interest.

  The story of the world’s first digital weapon might well have ended here, except that a few security researchers weren’t quite ready to let it go.

  * * *

  1 Ulasen and his team encountered the malware the week of June 24, 2010.

  2 Ulasen has never disclosed the name of the reseller, but a link on VirusBlokAda’s website for its distributor in Iran points to vba32-ir.com, a site owned by the Deep Golden Recovery Corporation, a data-recovery firm in Iran.

  3 Information about VirusBlokAda’s encounter with the malware comes from interviews with Sergey Ulasen and Oleg Kupreev, as well as from an account published by Kaspersky Lab in 2011, after the Russian antivirus firm hired Ulasen away from VirusBlokAda. That interview, “The Man Who Found Stuxnet—Sergey Ulasen in the Spotlight,” was published November 2, 2011, at eugene.kaspersky.com/2011/11/02/the-man-who-found-stuxnet-sergey-ulasen-in-the-spotlight.

  4 A module is a stand-alone component. It is often interchangeable and can be used with various programs.

  5 Drivers are software programs that are used as interfaces between a device and a computer to make the device work with the machine. For example, a driver is required to allow a computer to communicate with a printer or digital camera that is connected to it—different drivers are available for different operating systems so that the same device will work with any computer. In this case the drivers were actually rootkits designed to install and conceal malicious files on the machine.

  6 The reboot problem didn’t occur on other machines later found to be infected by the malware. So some researchers suspect the problem may have been an incompatibility between one of the malware’s drivers and VirusBlokAda’s antivirus software. The malware used the driver to install itself, and researchers at Kaspersky Lab in Russia suspected that when the driver injected the malware’s main file into the memory of the machines in Iran, this caused some machines to crash. Researchers at Kaspersky Lab later tried to reproduce the problem but got inconsistent results—sometimes a machine crashed, sometimes it didn’t. The irony is that the attackers had put a lot of effort into testing their malware against antivirus scanners from Kaspersky, Symantec, McAfee, and others, precisely to make sure their code wouldn’t be detected by the scanners or crash machines. But they apparently hadn’t tested it against VirusBlokAda’s scanning software. So if VBA’s scanner was the problem, it meant this tiny Belarusian firm had been their undoing in more ways than one.

  7 Autorun is a convenience feature in Windows that allows programs on a USB flash drive, CD-ROM, or DVD, to automatically launch when the devices are inserted into a computer. It’s a known security risk, however, because any malicious program on the device will automatically launch as well.

  8 If Autorun is disabled for security reasons, then the malicious code on the flash drive that exploits this feature will not be able to launch automatically but will launch only if users specifically click on the file to open it.

  9 The exploit worked against seven versions of Windows: Windows 2000, Windows XP, Windows 2003, Vista, Windows Server 2008, Windows 7, and Windows Server 2008 R2.

  10 With Windows Vista and Windows 7, a driver that isn’t signed with a trusted digital certificate that Microsoft recognizes will have trouble installing on the machine. On 32-bit Windows machines that have Vista or Windows 7 installed, a warning will display, telling the user the file is not signed or is not signed with a trusted certificate, forcing the user to make a decision about whether to let it install. On 64-bit Windows machines using either operating system, a file not signed with a trusted certificate simply won’t install at all. The malware VirusBlokAda found only worked on 32-bit Windows machines.

  11 Certificate authorities dole out the signing certificates that companies use to sign their code and websites. The CAs are supposed to verify that an entity requesting a certificate has the authority to do so—to prevent someone other than Microsoft from obtaining a code-signing certificate in Microsoft’s name, for example—and to ensure that if someone applies for a signing certificate for a company they claim is theirs, it’s a real company producing real code. Some certificate authorities don’t do due diligence, however, and certificates are sometimes issued to malicious actors. There are also companies that, for a fee, will use their key and certificate to sign code for others. Hackers have used these companies in the past to sign their malware.

  12 In September 2012, this is exactly what happened to Adobe. The software giant, which distributes the popular Adobe Reader and Flash Player programs, announced that attackers had breached its code-signing server to sign two malicious files with an Adobe certificate. Adobe stored its private signing keys in a device called a hardware security module, which should have prevented the attackers from accessing the keys to sign their malicious files. But they compromised a build server—a server used for developing software—which had the ability to interact with the code-signing system and get it to sign their files.

  13 Ironically, on July 12, 2010, the day Ulasen went public with news about the malware, a researcher with the Finnish security firm F-Secure published a conference presentation about digital certificates, stating that, as of then, malware using stolen certificates had yet to be discovered. He noted, however, that this would inevitably happen now that new versions of Windows treated unsigned drivers with suspicion, pushing hackers to steal legitimate certificates to sign their malware. (See Jarno Niemela, “It’s Signed, Therefore It’s Clean, Right?” presented at the CARO conference in Helsinki, Finland; available at f-secure.com/weblog/archives/Jarno_Niemela_its_signed.pdf.) Indeed, not long after VirusBlokAda’s discovery of the RealTek certificate, other hackers were already attempting to use the same tactic. In September 2010, antivirus firms discovered Infostealer.Nimkey, a Trojan horse specifically designed to steal private key certificates from computers. This was followed over the next two years by a number of malicious programs signed with certificates apparently stolen from various trusted companies.

  14 Ulasen contacted Microsoft through a general e-mail address used for its security team. But Microsoft’s security response team receives more than 100,000 e-mails a year, so it was understandable that an e-mail sent to its general mailbox from an obscure antivirus firm in Belarus got lost in the queue.

  15 The malware, researchers would later discover, was a combination of a worm and virus. The worm portion allowed it to spread autonomously without user action, but once it was on a system, other components infected files, like a virus would, and required user action to spread.

 
