Two things worth noting from yesterday’s unveiling of the Bureau of Industry and Security’s proposed Wassenaar rules for the U.S. weren’t so overt: a) the U.S. generally leads the way in implementing Wassenaar changes, but this time it was beaten to the punch by the EU by almost 18 months; and b) requests for comments, such as the 60-day period that opened yesterday, are uncommon.

“I think this means [BIS] had trouble understanding fully the scope and understanding potentially negative repercussions for overregulating,” said Collin Anderson, an independent security researcher who has spent many hours studying the Wassenaar controls. “I think it means they’re still trying to figure out what to do with this rule.”

BIS, a bureau of the U.S. Commerce Department, published rules yesterday that left some scratching their heads and others scurrying for cover because of the potential implications for legitimate vulnerability research and exploit development, as well as for the use of commercial penetration testing and other dual-use technologies.

What has experts such as Anderson concerned is the rules’ broad definition of “intrusion software,” which is at the center of the document. As defined in the rules, intrusion software is:

“Software ‘specially designed’ or modified to avoid detection by ‘monitoring tools,’ or to defeat ‘protective countermeasures,’ of a computer or network-capable device, and performing any of the following:

(a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or

(b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.”

Anderson said this definition paves the way for an expansion of the rules as implemented by the EU, beyond their original intent of imposing export controls and licensing requirements on spying software such as FinFisher and Hacking Team’s surveillance tools, which reportedly have been sold to and used by oppressive regimes around the world. Previous language in the rules protected some off-the-shelf commercial malware, and dual-use tools available to researchers, from export controls. Anderson’s interpretation is that this is no longer the case.

“What it looks like is that the rules are not going to provide that exception; now everything except open source is controlled,” Anderson said. “That was not anticipated. For them to consider zero days and exploits as commodities and as controlled, was not expected, and something I’ve argued against in two papers. That’s an expansion.”

Randy Wheeler, director of BIS, confirmed during a teleconference yesterday afternoon that the development, testing, evaluation and productization of exploits, zero days and intrusion software would be controlled, but that the same would not apply to vulnerability research.

“Vulnerability research is not controlled nor would the technology related to choosing, finding, targeting, studying and testing a vulnerability be controlled,” she said.

With the devil buried in many of the details, Anderson said it’s important to note that there have been no discussions on the issue of export controls on exploits and zero days, nor on how they’re interpreted in the rules. He hopes the security community will take full advantage of the two-month comment period and contribute to the process. Anderson said he anticipates that he, other researchers, and civil society organizations will make themselves available to collect comments from researchers and submit a larger comment on behalf of the many.

“There’s a large topic covered here, and it’s problematic,” Anderson said. “Is the sale of a vulnerability in a bug bounty considered controlled? It’s a problem since that’s a major source of security research and funding for security research globally. These sorts of things, we don’t have a real solid answer for, and that’s problematic.”


Anderson also wonders how the transfer of knowledge applies under Wassenaar. For example, would Iranian engineers and security researchers be excluded from projects under the rules? “That’s less likely,” Anderson said, “but there’s no answer to that, especially when it’s considered a subset of a completely separate problem.”

Runa A. Sandvik, a privacy and security researcher, said that this implementation of Wassenaar would put export control authorities in a position where they would be the ones directing and driving security research.

“The broad definition of intrusion software could mean that we end up with control of commonplace research, as opposed to the technologies the [Wassenaar Arrangement] set out to control originally,” Sandvik said. “This attempt to define which technologies to regulate reminds me of the whitelist/blacklist approach in computer security.”

Dual-use technologies such as Metasploit from Rapid7 and Core Security’s pen-testing tools would also have to be sorted out (a request for comment from Rapid7 was not returned in time for publication).

“A hammer, for example, is a tool that can be used for both good and bad. If the authorities do not fully understand every single use case of said hammer, they will fail to properly regulate its use. Not to mention the fact that use cases change and new ones develop; who knows how people will use a hammer ten, twenty or thirty years from now,” Sandvik said. “BIS has said that defensive pen-testing tools could be re-appropriated for offensive purposes and therefore be in line with control, but this would likely require researchers to go through the whole process again to get these tools approved as ‘good tools.’

“If this passes, we may find ourselves in a future where revisions attempt to expand the already broad definitions. This seems like a slippery slope.”
