The findings from a recent study carried out by Microsoft Research and the University of South Carolina suggest that we should be asking ourselves when to require authentication rather than whether to require authentication.
The research puts forth the idea of tailoring authentication requirements on mobile devices, per application or otherwise, so that users are prompted for a password or other authentication method only when it is actually necessary. In this way, the study’s authors believe, users would have to authenticate less often, thereby lowering the barrier to entry for those who currently use no authentication at all.
For example, the authors argue that if a user hangs up after a call and places the device in their pocket, where it remains, then they should not be required to re-authenticate the next time they access the device, because they never truly lost contact with it. On the other hand, if a user puts the phone down somewhere, actually losing contact with it, then they should be required to re-authenticate before accessing the device again. Furthermore, the authors propose assigning value levels to the content on a device. If the system has strong confidence in a user’s authenticity, it should allow that user to access any content without authentication. If the system has little confidence, the user should still be able to access low-value content (news, weather, etc.) without authentication, but should be required to authenticate in order to view high-value content (email, banking, etc.).
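The confidence-based policy described above could be sketched roughly as follows. The threshold value and the content classification here are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of a confidence-based access policy.
# The 0.9 threshold and app classifications are assumptions for illustration.

HIGH_VALUE = {"email", "banking"}
LOW_VALUE = {"news", "weather"}

def requires_authentication(app: str, confidence: float) -> bool:
    """Return True if the user must authenticate before opening `app`.

    `confidence` is the system's current belief (0.0-1.0) that the
    device is still in the hands of its legitimate owner.
    """
    if confidence >= 0.9:   # strong confidence: everything opens freely
        return False
    if app in LOW_VALUE:    # weak confidence: low-value content stays open
        return False
    return True             # weak confidence + high-value content: prompt

print(requires_authentication("weather", 0.3))  # False: no prompt
print(requires_authentication("banking", 0.3))  # True: prompt required
```

The key design point is that the decision depends on two inputs at once, the system’s confidence and the value of the content, rather than a single global lock.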
The paper, titled “Progressive authentication: deciding when to authenticate on mobile phones,” was written by Oriana Riva, Karin Strauss and Dimitrios Lymberopoulos of Microsoft Research and Chuan Qin of the University of South Carolina.
In order to work effectively, a progressive authentication system would have to draw on a number of different signals. The study recommends voice recognition to establish identity, then an accelerometer to detect continuity of contact and extend the validity of a successful authentication. Another signal would be proximity: whether a given device is near another device belonging to the same user.
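One way to picture how such signals might work together is as weak evidence combined into a single score. This is a hypothetical sketch; the signal names and weights are illustrative assumptions, not the paper’s actual model:

```python
# Hypothetical sketch of combining weak sensor signals into a confidence score.
# Weights are arbitrary illustrative values, not taken from the paper.

from dataclasses import dataclass

@dataclass
class Signals:
    voice_match: bool         # voice recognition recently confirmed the owner
    continuous_contact: bool  # accelerometer saw no loss of contact (e.g. pocket)
    near_paired_device: bool  # close to another device owned by the same user

def confidence(s: Signals) -> float:
    """Combine individual weak signals into one confidence score (0.0-1.0)."""
    score = 0.0
    if s.voice_match:
        score += 0.5
    if s.continuous_contact:
        score += 0.3
    if s.near_paired_device:
        score += 0.2
    return score

# After a call ends and the phone goes straight into the owner's pocket:
s = Signals(voice_match=True, continuous_contact=True, near_paired_device=False)
print(confidence(s))
```

No single signal is decisive on its own; the point is that several weak signals together can sustain confidence between explicit authentications.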
Of course, there remains the problem of false rejections caused by background noise and other factors. To mitigate this, the authors cite machine learning techniques and rigorous cross-checking as ways to combine weak signals and reduce false rejections. Continuously processing sensor data in this way, however, could pose serious energy concerns for devices with limited power supply, so the authors suggest offloading these processes to cloud servers.
In the experiments underlying the research, the system produced a 42 percent decrease in authentication prompts with no unauthorized accesses. Beyond that, users’ authenticity was overestimated in just eight percent of the trials.
On a more human level, the study’s respondents who already use authentication indicated that they would prefer about half of their applications to be accessible without a lock. Those who use no authentication responded to the contrary, saying they would prefer locks on about half of their applications. In the end, though, most participants preferred a three-tiered authentication system: a highest-value tier requiring complicated authentication, a mid-level tier requiring simple authentication, and a bottom tier requiring no authentication at all.
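The three-tiered scheme participants preferred could be sketched as a simple mapping from application to required authentication strength. The tier assignments and the default below are illustrative assumptions, not survey results from the study:

```python
# Hypothetical sketch of the three-tiered scheme most participants preferred.
# App-to-tier assignments here are assumptions for illustration only.

TIERS = {
    "banking": "complicated",  # highest-value tier: strong authentication
    "email":   "complicated",
    "photos":  "simple",       # mid-level tier: lightweight check
    "news":    "none",         # bottom tier: no lock at all
    "weather": "none",
}

def required_auth(app: str) -> str:
    """Return the authentication level required to open `app`."""
    # Unlisted apps default to the middle tier (an assumption of this sketch).
    return TIERS.get(app, "simple")

print(required_auth("banking"))  # complicated
print(required_auth("news"))     # none
```

In practice users would presumably assign tiers themselves, which matches the survey finding that preferences differ sharply between those who do and do not already lock their phones.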