FTC Urges IoT Privacy, Security-by-Design at CES


The head of the Federal Trade Commission warned of a future where data collection has gone awry, and urged the industry to avoid it, at the Consumer Electronics Show.

In her keynote address yesterday at the Consumer Electronics Show in Las Vegas, Federal Trade Commission Chairwoman Edith Ramirez imagined the dystopic convergence of big data conglomerates and a ceaseless information gathering machine fueled by the constant connectivity ushered in by the so-called “Internet of Things.”

As the nation’s top regulatory agency responsible for protecting consumers, the FTC is in a unique position to discourage the creation of an environment where data collection and brokering run awry. Accordingly, Ramirez followed her grim portrayal with a series of suggestions for how we can reap the many benefits of nearly ubiquitous connectivity without suffering its pitfalls.

“In the not too distant future, many, if not most, aspects of our everyday lives will leave a digital trail,” Ramirez said. “That data trove will contain a wealth of revealing information that, when patched together, will present a deeply personal and startlingly complete picture of each of us – one that includes details about our financial circumstances, our health, our religious preferences, and our family and friends.”

That trove, she said, is being driven by the rapid introduction of sensors and devices into personal spaces like our homes, cars and bodies. She went on to explain that such devices will give companies the power to monitor otherwise private activities.

“Moreover,” she reasoned, “the sheer volume of granular data that a small number of devices can generate allows those with access to the data to perform analyses that would not be possible with less rich data sets, providing the ability to make additional sensitive inferences and compile even more detailed profiles of consumer behavior.”

Ramirez expressed concern about how all this data would be used, particularly whether it would be used to provide services to consumers or, more nefariously, in ways that are inconsistent with customers’ expectations of the companies handling the data.

She wondered whether TV watching habits could be shared with employers or universities, and whether that information would be combined with data from parking lot security gates, heart monitors and smartphones.

“And will this information be used to paint a picture of you that you will not see but that others will – people who might make decisions about whether you are shown ads for organic food or junk food, where your call to customer service is routed, and what offers of credit and other products you receive?” Ramirez asked the audience before suggesting that such activities could exacerbate existing socio-economic disparities.

Security, Ramirez explained, poses as much of a risk as privacy as more and more things come online, particularly when those things directly affect physical safety. Unlike the world of traditional computing, many developers of so-called Internet of Things devices are relatively new to the game. While traditional computer companies have been warding off hackers and thwarting attacks for years, these new developers simply will not have that experience. Also unlike traditional computers, many smaller devices will not have the processing power to implement encryption, nor will they have the requisite memory to apply software patches or even inform consumers of vulnerabilities.

In the latter half of her keynote, Ramirez turned to how we can address these risks by incorporating security by design, through data minimization and by providing customers with clear notice and simple choices.


In order to build security into the design of new devices, Ramirez believes companies will need to conduct a privacy and security risk assessment during the design process, test security measures before products launch, require consumers to change default passwords during set-up, deploy encryption, and monitor products for vulnerabilities throughout their life cycle.
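For illustration only, here is a minimal Python sketch of what one of those practices, refusing to complete set-up until a factory-default password is replaced, might look like in a device's onboarding logic. The class, password rule and field names are hypothetical and are not drawn from Ramirez's remarks.

```python
import hashlib
import secrets

DEFAULT_PASSWORD = "admin"  # hypothetical factory default

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 so the stored credential cannot simply be read back off the device.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

class DeviceSetup:
    """Hypothetical onboarding state for a connected device."""

    def __init__(self) -> None:
        self.salt = secrets.token_bytes(16)
        self.password_hash = hash_password(DEFAULT_PASSWORD, self.salt)
        self.default_credentials = True

    def complete_setup(self, new_password: str) -> None:
        # Refuse to finish onboarding until the factory default is replaced.
        if new_password == DEFAULT_PASSWORD or len(new_password) < 8:
            raise ValueError("choose a new password of at least 8 characters")
        self.salt = secrets.token_bytes(16)
        self.password_hash = hash_password(new_password, self.salt)
        self.default_credentials = False

    def network_enabled(self) -> bool:
        # Keep the device off the network while default credentials remain.
        return not self.default_credentials
```

The design choice the sketch tries to capture is that the password change is enforced by the device itself rather than suggested in a manual, which is the spirit of building security in at the design stage rather than bolting it on later.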

On the point of minimization, Ramirez explained that companies should limit data collection to only the information they absolutely need for specific purposes. Once they’ve used that information, they should dispose of it.

“We often hear the argument that to realize the benefits of big data, businesses should not face limits on the collection and retention of data because the value lies in its unanticipated uses,” Ramirez said. “But I question the notion that we must put sensitive consumer data at risk on the off-chance a company might someday discover a valuable use for the information.”

Furthermore, when a company does collect data, she said, that data should be scrubbed of identifying features; many of the benefits of big data, she noted, can still be realized with information that remains anonymous.
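As a rough illustration of that kind of scrubbing, the hypothetical sketch below drops direct identifiers from a smart-thermostat reading and replaces the device ID with a salted hash before the record is stored or shared. The field names and the approach are assumptions for the sake of example, not anything the FTC prescribed.

```python
import hashlib

# Hypothetical direct identifiers to strip from a reading; not from the keynote.
IDENTIFYING_FIELDS = {"name", "email", "street_address", "phone"}

def deidentify(record: dict, salt: str) -> dict:
    """Drop direct identifiers and replace the device ID with a salted hash."""
    cleaned = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    if "device_id" in cleaned:
        digest = hashlib.sha256((salt + cleaned["device_id"]).encode()).hexdigest()
        cleaned["device_id"] = digest[:16]  # pseudonym, not the raw identifier
    return cleaned

reading = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "device_id": "thermostat-0042",
    "temperature_f": 68,
    "timestamp": "2015-01-06T09:00:00Z",
}
print(deidentify(reading, salt="per-deployment-secret"))
```

A salted hash is pseudonymization rather than true anonymization; rich sensor streams can still enable the kind of re-identification and sensitive inferences Ramirez warned about, so real deployments would likely need stronger measures such as aggregation.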

The third part of Ramirez’s solution involves informing customers about what their devices do.

“Consumers know, for instance, that a smart thermostat is gathering information about their heating habits, and that a fitness band is collecting data about their physical activity. But would they expect this information to be shared with data brokers or marketing firms? Probably not. In these and similar cases, consumers should be given clear and simple notice of the proposed uses of their data and a way to consent. This means notice and choice outside of lengthy privacy policies and terms of use.”

In closing, the FTC chair acknowledged that providing notice and choice to every consumer for every device is easier said than done, but she challenged the tech world to harness its “ingenuity, design acumen and technical know-how” in order to provide consumers with easy-to-understand choices.
