One of the more trite and oft-repeated maxims in the software industry goes something like this: We’re not focusing on security because our customers aren’t asking for it. They want features and functionality. When they ask for security, then we’ll worry about it. Not only is this philosophy doomed to failure, it’s now being repeated in the realm of privacy, with potentially disastrous effects.
One problem with the “our customers aren’t asking for security” mantra is that, even if it’s true right now, it won’t be true forever. And when your customers do come knocking on the door complaining about insecure products and demanding that you do better, you’ll likely be in no position to respond.
Security isn’t an ingredient that developers can add to an application on demand, like adding an extra dash of Cholula to a plate of enchiladas. Security is a property of well-designed and well-executed software and hardware systems. It’s a result of careful planning, development and review, not to mention a lot of training. In other words, security is hard. And it’s time-consuming. And it’s expensive.
Plenty of software companies have found this out the hard way, with Microsoft being the most obvious example. But there are others: Adobe, EMC, Intuit. It’s a long road, but they’re learning, and that’s progress.
Of course, the other problem with the above philosophy is that it is predicated upon the company in question actually talking to its customers. That’s not as much of a no-brainer as it would seem. Sure, enterprise software companies go to great pains to get feedback from their customers and (usually) try to respond accordingly. It’s just good business.
But that clearly is not what’s happening in the world of privacy.
A quick search of recent news on the privacy front reveals that just about all of it is bad. Facebook is exposing users’ live chat sessions and other data to third parties. Google has been caught not only recording MAC address and SSID information from public Wi-Fi hotspots, but also storing data from those networks.
Unfortunately, none of this is shocking or even surprising. It’s what we’ve come to expect on the Web. We assume that big companies are playing fast and loose with our personal information and that there’s little we can do about it. And this notion is reinforced by the companies themselves, especially social networking companies such as Facebook and Twitter that have built their entire business models on their users sharing personal information. It’s their lifeblood, and their goal is to make it as easy as possible for users to bring more and more people into their networks and share ever-increasing amounts of information.
Likewise Google. If no one searches for information on alcoholism or cancer treatments, then they can’t sell ads against those search terms and they lose money. How this business model leads to Google fingerprinting every Wi-Fi hotspot its Street View cars come across and recording data that crosses those hotspots is another matter. But the prevailing attitude among corporate executives in these cases seems to be summed up by Google CEO Eric Schmidt, who famously said this not too long ago: “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”
If you look beyond the patent absurdity of Schmidt’s statement for a minute, you’ll find another old maxim hiding underneath: Blame the user.
You want privacy? Don’t use our search engine/photo software/email application/maps. That’s our data now, thank you very much.
Oh, you don’t want your private chats exposed to the world? Sorry, you never told us that.
The blame-the-user defense likely has been around since the first time someone complained to the Roman authorities about the raw sewage running through the streets. No problem, just stop eating and drinking and they’ll dry up in no time.
The point, of course, is that while these companies (and centurions) may be hearing the complaints, they’re not listening to them. Little is changing. They seem to be saying that customers don’t really want privacy, they just get angry when something goes wrong and then go right back to over-sharing once the furor dies down.
That certainly seems to have been the case until very recently. Users of social networks blithely share details of their job searches, illnesses, vacation plans, financial problems, love lives and anything else that strikes their fancy at the moment. But all of that is information that users share voluntarily and consciously. Maybe it’s not the best idea to tell your 1,337 Twitter followers that you’re leaving the country for 10 days and hope your house will be OK, but it’s your choice. No one should be making those decisions for you, whether through technical glitches or bad policies.
For most people, privacy, if they thought about it at all, was pretty far down the list of priorities. The recent spate of news coverage about these privacy debacles seems to have raised awareness levels among the general public about these problems, and even prompted Facebook CEO Mark Zuckerberg to pen a mea culpa column in The Washington Post today, promising his company would do better.
That’s a start. But in order for real change to occur, companies are going to have to take this problem seriously and not treat it like an afterthought, the way that security once was treated by software companies. Sure, privacy is a nebulous concept that’s famously hard to define and difficult to protect, but that doesn’t mean it’s not worth the effort.
Privacy, like security, has a million different pieces. And all the pieces matter.
Blame the user? Here’s a better idea: Listen to the user.
Anonymous:
I'd like to add that most of the world doesn't have a population so rabidly afraid of a powerful government that it will happily let unaccountable commercial parties do much the same sort of thing instead.
And so we get governments that treat their very own citizens as criminals, above and beyond what the TSA does for a living. And a lot of quangos to make up the icing. We already have to have ID cards and increasingly often have to show them at the drop of a hat. The British started fingerprinting kids from the age of six. For lunch money. And to keep track of future criminals, that too. Just to name one out of many, many examples. And then just about everything gets spun together and pinned right back onto you. Just recently an unfortunate convergence of knee-jerk, slapdash rules meant that visitors to a very private kinky rubber club had to have their IDs scanned at the door. So much for your freedom to blow off some steam in the manner of your choosing.
Before you know it that information will end up on a USB stick lost on the tube or a CD lost in the mail, together with "vulnerable children" records, tax records, bank details, and ghod knows what else.
My point? Our governments are completely failing privacy. In fact, they are writing the very destruction of any chance at a reasonable identity-protecting infrastructure into law, and all they have to say about it is "you'll just have to trust us, because we can't trust you."
How are they going to protect even their own citizens from the Eric Schmidts and Mark Zuckerbergs of this world? There's only so much protection you can spin out of the First and the Fourth Amendments, provided your country has them. Ninety-five-odd percent of the world doesn't.