‘Why is security so hard?’ That’s the question that a new conference focused on “usable security” will attempt to answer by bringing together experts on security, software development and human-computer interaction.
The Software and Usable Security Aligned for Good Engineering (SAUSAGE) conference is scheduled for April 5 and 6, 2011, in Gaithersburg, Maryland. The conference is sponsored by the Institute for Information Infrastructure Protection at Dartmouth College and the National Institute of Standards and Technology (NIST). The organizers have issued a call for papers to academics, industry researchers, and practitioners in computer security, HCI, and software engineering, said Serge Egelman of NIST.
Egelman said the idea behind the SAUSAGE conference is to address a gap between security and usability when it comes to software development.
“In the past ten years, a lot of attention has been paid to integrating security into the software development cycle…Usability also has a development lifecycle, but the two were heretofore disjointed processes,” Egelman said.
In fact, the two “have many parallels and a synergistic relationship. We need to consider whether they should be integrated to create a ‘usable security development lifecycle,’” Egelman wrote in an e-mail message.
Usable security is an area of research that looks at the intersection of computer security and human computer interaction (HCI) – analyzing the ways that human behavior can work against (or enhance) the impact of security features.
Overly technical warning messages, opaque graphical interfaces and draconian password creation policies are just a few of the ways that security runs up against human limitations – or merely human nature.
“Many security problems are caused by a failure to account for human behavior,” Egelman said. For example “users may share or write down passwords due to an onerous policy for how passwords must be composed,” degrading the security offered by strong passwords in the first place, he wrote.
In other cases, software developers who are responsible for creating and implementing security features fail to understand the perspectives and capabilities of users who will be asked to interact with those features.
“Is it reasonable to expect the user to notice lock icons every time they believe they are on a secure website?” Egelman asks. “Is it reasonable to expect the user to understand (or care) what a self-signed certificate is?”
In most cases the answer is “no.” Computers are productivity tools and most users interact with them in order to do work unrelated to security.
“No one sits down at the computer solely to ‘not get phished,’” Egelman wrote. “Thus, security is often seen as a hindrance to getting something else done…time spent interacting with security mechanisms is time spent not doing what the user really wanted to do.”
SAUSAGE hopes to explore how developers and usability designers can do a better job of making security transparent, while also encouraging users to embrace security features rather than disregard them.
The goal is to avoid bolted-on solutions that try to make security features usable – or usable products secure – after the fact, Egelman wrote. Papers could straddle fields like computer science and psychology, cognitive science and sociology.