Privacy Goal: More Controls in Users’ Hands

The chief privacy officers of Microsoft, Facebook and Google discussed today at RSA Conference how their respective companies want to put more privacy controls in users’ hands.

SAN FRANCISCO – The same companies that brought, among other things, facial recognition into your living rooms and the ability to record video with your eyewear swear the next big thing in privacy is putting more controls in your hands.

The privacy officers of Microsoft, Google and Facebook said Wednesday at RSA Conference 2015 that disruptive trends such as the growing connectivity of home devices, wearables and more will continue to complicate privacy and force technology providers to reconsider how much control is transferred to the user.

“We are going to see technology continue to challenge traditional notions of privacy,” said Keith Enright, director of the privacy legal team at Google. “We always struggle to keep up and protect traditional norms. I think we’ll see smart companies double down and invest in user controls and keep users empowered about the way information about them is being used.”

For Microsoft, its most prominent example of privacy disruption was the introduction of the Kinect technology for Xbox One, which allows gamers to control their platform using voice commands and gestures. Kinect required considerable privacy discussions during design and implementation, privacy officer Brendon Lynch said, because the technology brought facial recognition to gaming and other activities on the platform. Microsoft’s privacy team, which includes a number of privacy professionals embedded with designers and engineers, helped assess any risks introduced by the new product.

“The privacy team was at the forefront of determining what data collection makes sense, for example. Can we still provide something that’s cool and fun, but with a minimal amount of data coming to us or third parties?” Lynch said. “Applying core privacy processes led to a lot of design decisions that limited the privacy impact.”

For example, Lynch said, the decision was made early on not to capture images, but instead to map geometric points on the face and use an algorithm to create a unique hash for each player, most of whom are children.

“And that stayed local to the device, and not on our server,” Lynch said. “Examples like that are the front line that our privacy team is dealing with. They set standards that can apply broadly to other parts of the company leveraging similar technology.”
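To make that design choice concrete, below is a minimal Python sketch of the general pattern Lynch describes: deriving a salted, non-reversible signature from facial landmark geometry and keeping it on the device. The function names, the exact-match lookup and the use of SHA-256 are illustrative assumptions, not Microsoft’s actual implementation.

```python
import hashlib
import json

def face_signature(landmarks, device_salt):
    """Turn facial landmark geometry into a non-reversible identifier.

    `landmarks` is a list of (x, y, z) points and `device_salt` is a
    per-console secret (bytes); both are hypothetical names. Rounding
    keeps the hash stable across small measurement jitter.
    """
    rounded = [(round(x, 1), round(y, 1), round(z, 1)) for x, y, z in landmarks]
    payload = json.dumps(rounded).encode("utf-8")
    return hashlib.sha256(device_salt + payload).hexdigest()

# Profiles live only on the device in this sketch; no image or raw
# geometry is ever transmitted.
local_profiles = {}

def recognize_or_enroll(landmarks, device_salt, player_name=None):
    sig = face_signature(landmarks, device_salt)
    if sig in local_profiles:
        return local_profiles[sig]            # returning player
    if player_name is not None:
        local_profiles[sig] = player_name     # enroll a new player locally
    return player_name
```

A real matcher would compare geometry within a tolerance rather than by exact hash; the point of the sketch is only that what gets stored is a derived identifier rather than an image, and that it never has to leave the console.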

For Facebook, the introduction of its News Feed functionality, which streams updates from a user’s friends onto their page, along with the launch of Graph Search two years ago, forced the company to provide users with the ability to manage what information is streamed out, and to whom. The new search capability, for example, makes it simple to find information or photographs related to an individual.

“We had to engineer the capability to do privacy checks, build those types of controls and bring them inline,” said chief privacy officer Erin Egan, who explained that Facebook has a cross-functional review team in place that trains every engineer and product manager on the privacy commitments Facebook has made to its users. “We created a means where, with one switch, users could limit the data sent to friends or un-tag themselves [in a photo]. You have to put controls in place to do that.”
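What such a single switch has to do can be sketched in a few lines of Python. The settings object, field names and audience labels below are hypothetical illustrations, not Facebook’s API; the idea is simply that one control is consulted everywhere content is streamed out.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # One master switch that limits everything shared to friends, plus
    # per-photo tag removals; both fields are illustrative names.
    limit_audience_to_friends: bool = False
    removed_tags: set = field(default_factory=set)

def audience_for(post, settings):
    """Resolve the effective audience before a post is streamed out."""
    if settings.limit_audience_to_friends:
        return "friends"
    return post.get("audience", "public")

def visible_tags(photo_id, tags, settings):
    """Drop any tags the user has removed from a given photo."""
    return [t for t in tags if (photo_id, t) not in settings.removed_tags]
```

Flipping `limit_audience_to_friends` once changes the outcome of every subsequent `audience_for` call, which is the kind of single-switch behavior Egan describes.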

As for Google Glass, internally, engineers and privacy experts believed at first that the disruption potential of the wearable technology was overstated. Enright said that Google learned plenty during the launch of Glass and how to balance user concerns and diverse global privacy laws with the urge to push cool technology to market.

“Glass was, for me and Google, a tremendous learning experience. It taught us a lot about launching innovative products and helping users integrate technology into their lives in a responsible way, cognizant that technology may impact other people,” Enright said. He added that Google, too, embeds legal and compliance professionals with product engineers to identify privacy concerns, drive consistent privacy documentation and conduct privacy impact assessments for each new product design.

“A privacy design document is filled out when a new product is conceived where the team describes how the information collected by the new product will be processed, shared and deleted,” Enright said. “That is at work from conception and it’s ongoing in development and implementation so that they’re cognizant about privacy as the product evolves.”
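As a rough illustration of what such a document might capture, here is a hypothetical, heavily simplified record in Python, with a helper a review process might use; the field names and required sections are assumptions, not Google’s internal format.

```python
# Hypothetical privacy design record, filled out at conception and kept
# current as the product evolves.
privacy_design_doc = {
    "product": "example-wearable-companion-app",
    "data_collected": ["head-pose telemetry", "voice commands"],
    "processing": "on-device transcription; only aggregate usage metrics computed",
    "sharing": "no third-party sharing; aggregate metrics visible to the product team",
    "retention": {"voice commands": "deleted after transcription",
                  "usage metrics": "90 days"},
    "reviewed_by": ["privacy legal", "product engineering"],
}

REQUIRED_SECTIONS = ("data_collected", "processing", "sharing", "retention")

def missing_sections(doc, required=REQUIRED_SECTIONS):
    """Return the sections a privacy review would still need before launch."""
    return [section for section in required if not doc.get(section)]

assert missing_sections(privacy_design_doc) == []
```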


