It’s not that users “don’t want to keep their data safe”. They do. Most corporate users don’t want their private personal or corporate information available to someone else. They don’t want their email stolen or their contacts pillaged. So why do people insist on ignoring the multitude of security recommendations on how to maintain a more secure mobile work environment? The answer is that, deep down, users really just don’t care.
The average corporate user of a mobile device has a litany of reasons why they think they don’t need to listen to the advice of their security organization. Some of these reasons are more legitimate than others, but they all boil down to a lack of economic incentive relative to the required effort. The end result is that the user just doesn’t care about the problem.
- There are so many phones out there, it won’t happen to me. The chances are too slim.
- I don’t understand the danger here. I mean, it’s a smartphone; nobody attacks phones.
- What do you mean I have to act in a secure manner? How do I do that?
- But I downloaded this app from the official marketplace. What do you mean it’s not secure?!
- You put firewalls and antivirus garbage on my laptop and it’s slow as heck, and I STILL get infected. Security doesn’t work.
In economics, the concept of incentives is defined (Wikipedia) as: “Any factor (financial or non-financial) that enables or motivates a particular course of action, or counts as a reason for preferring one choice to the alternatives.” Incentives are typically broken down into four main categories of social incentives:
- Financial Incentives: A material reward for one’s actions.
- Moral Incentives: It’s just the “right thing to do”. A failure to act in a certain way is frowned upon by society at large.
- Coercive Incentives: Failure to act in a certain way will result in a negative physical act. This could take the form of jail, pain, loss of possessions, etc.
- Natural Incentives: Curiosity, imagination, admiration, fear, anger, pain, joy, truth, etc.
Let’s analyze the incentives that a corporate user has with regards to security policy and implementation of the security recommendations that are mandated to them by the IT department:
There are most certainly no positive financial incentives for a corporate citizen to follow security best practices. They don’t receive any bonus or extra salary for properly locking their phone with a strong password. They don’t get extra money or time off for installing, and not disabling, the corporate mobile AV solution. From a financial perspective, incentives exist to ignore corporate security requirements as long as possible. Time is money, and a slower, less responsive phone is the time-for-money trade-off that occurs when security is implemented on a device.
Society as a whole doesn’t really see the value of mobile device security. There is no societal expectation that you lock your phone. If you were to go on vacation and leave your front door and windows open, there wouldn’t be much sympathy if your house were broken into. However, if you implement no security mechanisms and grant full permissions to every application on your phone, society would still feel sorry for you when you were compromised. The average person would feel bad that your email and contacts were harvested and posted to the Internet. Society places no value on the security of mobile devices, so moral incentives to fix the problem don’t exist.
This leaves coercive incentives: make the user do what the security team says or they will be PUNISHED. Rule with an iron fist! An author recently suggested that coercive incentives were the only way to successfully implement security in the mobile environment. He suggested that if you fall victim to a mobile phishing attack, you should lose your phone for a week; if you get trapped by a social engineering attack, perhaps you should receive a cut in pay or a demotion. Coercive incentives aren’t currently implemented for a reason: they won’t work in the long term, and they breed resentment and anger throughout the organization. In the short term you may see strong results for your efforts, but in the long run the “slavery” model of work will not hold and the organization will revolt.
Among the more diligent (or paranoid) users there will be a natural incentive to secure their mobile devices. Some corporate users will take a natural interest in the security of corporate data, and they might fear the results of a compromise enough to implement the recommended best practices. Fear, however, is generally not the best motivator. When you weigh the level of rational fear (remember, by sheer odds the chance of being attacked is minimal) against the level of effort required to secure the device, many people will override this incentive and still ignore the recommendations of the security team.
Where do we go from here?
So where exactly does this leave us? Our corporate users have no incentives to follow the advice of the security organization. If we dug more deeply into this exercise, we could come up with a number of disincentives that cause people to actively ignore the security of their devices. Without proper incentives and motivation to secure the corporate environment, we will never see a reasonable security adoption rate. If the end user really isn’t going to do what we as corporate security teams require, we will have to do it for them, in a manner that is as invisible to them as possible. We will have to harden phones before we hand them out, without any slowdown in device performance. We will have to security-test applications before users download them, give users a whitelist of applications that are known to be safe, and try to instill the natural incentive of wanting a secure mobile environment. If end users choose not to be active in their own security, and we’ve shown they won’t be, we must proactively implement security best practices on their behalf.
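To illustrate the whitelist idea described above, here is a minimal sketch of an install-time check against a list of vetted applications. The app identifiers, version numbers, and the `is_install_allowed` function are hypothetical examples for illustration only, not the API of any real MDM product:

```python
# Minimal sketch of an application-whitelist check (hypothetical data).
# The security team vets each app and records the minimum approved version;
# anything not on the list, or older than the vetted version, is blocked.

APPROVED_APPS = {
    "com.example.mail": "2.1.0",      # app id -> minimum vetted version
    "com.example.calendar": "1.4.2",
}

def version_tuple(version: str) -> tuple:
    """Convert a dotted version string like '2.1.0' into (2, 1, 0)."""
    return tuple(int(part) for part in version.split("."))

def is_install_allowed(app_id: str, version: str) -> bool:
    """Allow installation only if the app is whitelisted and the
    requested version is at least the vetted minimum."""
    minimum = APPROVED_APPS.get(app_id)
    if minimum is None:
        return False  # not vetted: deny by default
    return version_tuple(version) >= version_tuple(minimum)
```

The deny-by-default stance is the point: the user never has to make a security decision, which is exactly the invisibility the paragraph above argues for.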
*This article originally appeared on Veracode’s ZeroDay Labs blog.