Stuxnet, it turns out, was a two-headed beast: an original attack routine that could have laid waste to the Natanz nuclear facility it infected, and that, by expert accounts, would have remained undetected if not for the noisier yet less complex second attack routine now familiar to the world.

Industrial control system and SCADA expert Ralph Langner published an article in Foreign Policy magazine and a paper on his website this week that pull back the covers on an older, more complex and stealthier version of the malware, markedly different from the second attack routine that emerged in 2010.

“It turns out that it was far more dangerous than the cyberweapon that is now lodged in the public’s imagination,” Langner wrote. “Without that later and much simpler version, the original Stuxnet might still today sleep in the archives of antivirus researchers, unidentified as one of the most aggressive cyberweapons in history.”

Langner said the older, lesser-known Stuxnet—put in place in 2007—targeted the protection systems around the cascades of centrifuges used to enrich uranium at the plant. The attackers were keenly aware of weaknesses in plant design and process execution. They knew the Iranians were content to accept a percentage of faulty centrifuges because they had designed a protection system that let enrichment continue even as centrifuges broke, Langner said.

“The system might have kept Natanz’s centrifuges spinning, but it also opened them up to a cyberattack that is so far-out, it leads one to wonder whether its creators might have been on drugs,” Langner wrote.

Ingeniously, the malware could record 21 seconds of activity from the protection system’s sensors showing a healthy stream of readings. Those 21 seconds were looped over and over on monitoring screens while the attack executed. Engineers thought they were watching an enrichment process humming along as designed when it was instead spinning out of control. The malware attacked industrial controllers built by Siemens, closing crucial valves so that pressure rose, gases collected, and centrifuges were—figuratively—blown up.
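The replay trick described above is conceptually simple: capture a fixed window of normal readings, then feed that recording to the operators’ screens in a loop while the real process is manipulated. A minimal Python sketch of the idea—the sensor source, sampling rate, and function names here are entirely hypothetical, not taken from any Stuxnet analysis:

```python
from collections import deque

RECORD_SECONDS = 21          # length of the "healthy" window the article describes
SAMPLES_PER_SECOND = 10      # hypothetical sampling rate for illustration

def record_healthy_window(read_sensor):
    """Capture a fixed window of normal readings before the attack starts."""
    buffer = deque(maxlen=RECORD_SECONDS * SAMPLES_PER_SECOND)
    for _ in range(buffer.maxlen):
        buffer.append(read_sensor())
    return list(buffer)

def replay_loop(recording, ticks):
    """Yield the recorded readings over and over, masking live values."""
    for i in range(ticks):
        yield recording[i % len(recording)]

# Stand-in sensor that always reads a "healthy" 42.0; the monitoring
# screens see only the looped recording, never the live values.
recording = record_healthy_window(lambda: 42.0)
shown = list(replay_loop(recording, 500))
```

The point of the sketch is the decoupling: once the display path consumes the loop instead of the live feed, nothing the operators see reflects the actual state of the centrifuges.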

The attackers, Langner said, could have let them literally blow up, causing catastrophic destruction. They didn’t, keeping their cover as a result, he said. “The implementation of the attack with its extremely close monitoring of pressures and centrifuge status suggests that the attackers instead took great care to avoid catastrophic damage,” Langner wrote.

Langner’s analysis called the attack over-engineered for the task and noted that any slip-up would have risked detection by the Iranians. Two years after the first Stuxnet was in place, in 2009, the second phase was introduced.

The variant attacked a different process control system, one governing rotor speeds in the centrifuges, and was a self-replicating worm that moved within the plant’s network and on portable USB drives; the older version, Langner said, was deliberately installed on plant computers, likely by an agent of the attackers.

“This suggests that the attackers had lost the capability to transport the malware to its destination by directly infecting the systems of authorized personnel, or that the centrifuge drive system was installed and configured by other parties to which direct access was not possible,” Langner wrote.

This later version of Stuxnet has been well documented, from its use of a number of Windows zero-day exploits to its drivers signed with stolen digital certificates. Langner said this version was written by hackers skilled in malicious code, while the first attack was coded alongside experts adept in industrial control systems, not IT. Langner points a finger at the National Security Agency as the author of Stuxnet, calling it the only logical home for its development.

This version and approach to attacking the Iranians’ nuclear capabilities left fingerprints—strange behavior in the industrial processes that could, and would, be detected. And while the attackers could have caused catastrophic destruction at any time, Langner estimates they instead set the country’s nuclear program back by only two years.

“The attackers were in a position where they could have broken the victim’s neck, but they chose continuous periodical choking instead,” Langner wrote. “Stuxnet is a low-yield weapon with the overall intention of reducing the lifetime of Iran’s centrifuges and making the Iranians’ fancy control systems appear beyond their understanding.”

Langner also speculates that Stuxnet was not built to escape beyond Natanz’s walls, yet it did, likely through contractors who worked at the plant leaving with laptops infected with Stuxnet and plugging them in at other industrial facilities where they were contracted. Stuxnet was designed to spread only on local networks, or via portable drives, Langner said.

He also wrote that it was likely the attackers’ intention to allow Stuxnet to spread, since the malware reports the IP addresses and hostnames of infected systems to a command infrastructure. The attackers could monitor the movement of contractors, likely in the hope of spotting other Iranian nuclear facilities operating under the radar, he wrote.
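The beaconing pattern Langner describes—each infection phoning home with identifiers that let the attackers map where the code had traveled—reduces to collecting a hostname and local IP and reporting them upstream. A hedged sketch of just the collection step, in Python; the field names and payload shape are placeholders, not Stuxnet’s actual reporting format:

```python
import json
import socket

def build_beacon():
    """Collect the identifiers the article says infections reported home:
    the hostname and local IP address of the infected machine."""
    hostname = socket.gethostname()
    try:
        ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        # Name resolution can fail on isolated hosts; fall back gracefully.
        ip = "0.0.0.0"
    return json.dumps({"host": hostname, "ip": ip})
```

Aggregated over time, even this minimal payload traces which networks a contractor’s laptop has touched—which is exactly the reconnaissance value Langner attributes to letting the worm spread.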

The danger, too, is that future weaponized attacks like Stuxnet can follow the same path into a facility: as Langner put it, contractors are good at engineering tasks but lousy at cybersecurity, and could be unwitting pawns in deploying another such weapon at any time.

Langner estimates that 50 percent of the investment in Stuxnet went into hiding the attack; future attacks may not require the same kind of investment, and therefore may not need the nation-state resources Stuxnet did, he wrote.

“And unlike the Stuxnet attackers, these adversaries are also much more likely to go after civilian critical infrastructure. Not only are these systems more accessible, but they’re standardized,” he wrote.

Ultimately, Langner said, Stuxnet may have served two purposes: 1) disrupt the Iranian nuclear program; and 2) allow the attacker to flex its cyberweaponry muscle.

“Operation Olympic Games started as an experiment with an unpredictable outcome. Along the road, one result became clear: Digital weapons work. And different from their analog counterparts, they don’t put military forces in harm’s way, they produce less collateral damage, they can be deployed stealthily, and they are dirt cheap,” Langner wrote. “The contents of this Pandora’s box have implications much beyond Iran; they have made analog warfare look low-tech, brutal, and so 20th century.”
