Government Promises Comment Period on Next Wassenaar Draft

A National Security Council member promised Rep. Jim Langevin that a final U.S. rule on the Wassenaar Arrangement would not happen without another public comment period.

It’s been months since the U.S. Commerce Department’s Bureau of Industry and Security pulled the U.S. implementation of the Wassenaar Arrangement off the table for an unusual rewrite of the rules governing so-called intrusion software.

The overly broad rule drew the ire of security and privacy experts because its vague language would put a serious crimp in legitimate white-hat security research and even threatened to require export control licenses for the use of popular penetration-testing tools vital to that work.

Until last week, little had been heard from the government about the status of the rewrite, but a letter from Caroline A. Tess of the National Security Council to Rep. James Langevin of Rhode Island indicates that a final rule will not be issued without another public comment period.

“The Administration is committed to taking into account the impact that any export control rule relating to cyber technology may have on our national security and adequately considering the burden that such a rule may place on legitimate cybersecurity activities,” Tess wrote. “To that end, we have intensified our engagement with experts and stakeholders from the U.S. government and industry on how to mitigate the national security risks posed by the proliferation of cyber tools in a manner consistent with promoting cybersecurity.”

The initial public comment period ended July 20, with more than 300 individual researchers and technology companies submitting objections to the rules as written and their potential chilling effect on research. The fear is that not only would pen-testing tools and other legitimate security software be in the rules’ crosshairs, but the development of proof-of-concept exploits would also fall under Wassenaar and require an export license to be shared. Such exploits are crucial for vendors as they examine vulnerabilities in their products and try to reproduce the conditions that could put data at risk.

High costs and fear of potential legal trouble, researchers cautioned, would not only stymie innovation but also harm product security, as known and unknown vulnerabilities would be left unpatched.

Katie Moussouris, chief policy officer at HackerOne, said the pressure is on to get the rules passed before June; otherwise, with the Republican and Democratic conventions convening and the presidential election kicking into high gear over the summer, the chances of anything happening lessen considerably.

“We have only a couple of opportunities left to pass it before other things take priority and get in the queue first,” Moussouris said of the election season. “At this point, there is no other draft, and Commerce has promised another 60-day review period. As the clock runs down, I don’t know if this will hit this session of Congress.”

Moussouris has been one of the most outspoken voices of the security community against the previous draft of the U.S. rules. She points out that in addition to the vague language, the U.S. rules were also shy on exemptions that would allow legitimate research to continue unscathed.

“Exemptions like the ability to do technology transfers inside organizations with a multinational presence where you would not have to apply for an export license to pass vulnerability information,” Moussouris said. “Other countries had things in place, but the U.S. did not. [The U.S.] could not agree on exemptions.”

Even with certain exemptions in place, Moussouris said, it would still be difficult to transfer proof-of-concept exploits and coordinate between vendors, as was necessary with Heartbleed and other Internet-wide bugs.

“It would break the ability for global vendors to [do] vendor vulnerability coordination,” Moussouris said.

One key sticking point in the U.S. is that the government views intrusion software with an eye toward its threat to national security, while other countries see the potential for privacy and human rights violations. Software such as Gamma International’s FinFisher or Hacking Team’s surveillance tools was sold to oppressive regimes to keep tabs on activists, journalists and others viewed as threatening to those governments. After the Hacking Team data dump last summer, it was learned that the company was selling its tools to sanctioned nations to monitor the activities of their citizens, putting personal safety at risk in some instances.
