SAN FRANCISCO–Security people are, by nature, cautious and methodical, and that is even more true of cryptographers. And in the current environment, when new adversaries seem to emerge on a daily basis and cryptographic standards are under intense scrutiny, a panel of some of the biggest names in cryptography said more conservatism and caution in the development and deployment of encryption are warranted.

In most years, the cryptographers’ panel at the RSA Conference here is a deep discussion of crypto standards, key lengths and the relative merits of various hash functions. But the bright light that has been shone on the NSA’s activities recently gave the panelists quite a bit more to discuss this year. The panelists, who included Adi Shamir of the Weizmann Institute, Ron Rivest of MIT, Whit Diffie of SafeLogic and Brian LaMacchia of Microsoft Research, had plenty to say about the revelations of the NSA’s reported efforts to undermine crypto algorithms and influence technical standards.

“I was most surprised by the Americans’ deep involvement in this,” said Shamir.

“We’ve had a loss of innocence as we’ve seen what goes on behind the curtains of government,” said Paul Kocher of Cryptography Research, who moderated the panel.

Some of the most damaging and concerning revelations to come from the Edward Snowden leaks have been about the agency’s alleged efforts to weaken some technical standards and crypto algorithms. There are also reams of documents showing the NSA’s work at getting around SSL in various ways, which Shamir said is actually a good sign.

“In all of the documents, there isn’t any indication that they managed to break the mathematics,” he said.

Still, the panelists agreed that the NSA revelations should serve as a reminder to cryptographers and product designers to err on the side of caution when it comes to design choices.

“We should really be putting a hefty degree of conservatism in our standards,” said Rivest, who, along with Shamir and Len Adleman, designed the RSA algorithm.

As the events of the last year have shown, standards and technologies that seem to be on solid footing one day can be revealed as weak or compromised the next. LaMacchia, of Microsoft Research, said that the prudent thing is to work under the assumption that at some point, the algorithm you’re designing or using will fail.

“You have to plan for your algorithm to fail. Early on I think we underestimated the effort it takes to move to a new cipher suite,” he said.

 
