News Wrap: Amazon Echo Privacy, Facebook FTC Fines and Biometrics Regulation


On this week’s Threatpost news wrap, the team discusses Facebook’s FTC fine for its data security practices, a report that Amazon is collecting Echo users’ geolocation data, and more.

Data privacy was a standout theme this past week, and the Threatpost team discussed the biggest privacy-related news. In the news wrap podcast for April 26, the team discussed the backstories behind several of the week's reports.

Below is a lightly edited transcript of the podcast.

Lindsey O’Donnell: Welcome to the Threatpost podcast. The Threatpost team is all here this Friday morning. You’ve got Lindsey O’Donnell, and I’m here with Tara Seals and Tom Spring. Hey, everyone.

Tara Seals: Hey, Lindsey.

Tom Spring: How’s it going, Lindsey? How’s it going, Tara?

Lindsey: Good. So, privacy has really been kind of the name of the game this week, in terms of all the stories that we’ve written. And I know, we had a lot of data privacy type stories, everything from Amazon Echo privacy issues to facial recognition. But if we’re talking about data privacy, I think we should really start by bringing Facebook into the conversation here, as we usually do.

Tara: Yeah, that seems to have been a top theme of the week for sure. And you did a ton of reporting on that this week.

Lindsey: Yeah, so, the big news this week was that Facebook may be facing a fine of between $3 billion and $5 billion from the FTC, related to the Cambridge Analytica incident last year and all of the data privacy issues they’ve had since then. Facebook had its earnings call and disclosed this amount of money that is set aside as a contingency expense. And I feel like we keep hearing about reports of Facebook having all these data-sharing incidents, or having all these crazy data practices, but now we’re really looking at the consequences. And everyone’s wondering how data collection and sharing will be regulated and what kind of fines we’ll see. So it should be interesting to keep an eye on how this actually plays out in the coming months.

Tara: Yeah, and I wonder in terms of all of that, when we talk about the GDPR, over in Europe, and how it has really stringent requirements for explicit consent before somebody harvests your data, which obviously is not something that Facebook adheres to, for U.S. citizens anyway – have there been any rumblings out there in terms of whether or not Facebook might face future regulation?

Lindsey: I think there’s been a lot of discussion about it. Obviously, Mark Zuckerberg has appeared in front of Congress, and it’s definitely been at the forefront of discussion. But beyond some state-level data privacy regulations, it’s something that people are still trying to figure out. So I think that’s kind of why this FTC fine is at the center of attention. There was news today, actually, that the New York Times was talking to sources who said that the FTC is discussing stronger monitoring of Facebook’s privacy policies, as well as direct punishment of Mark Zuckerberg. So that raises questions about how to deal with data sharing, whether it’s hitting at the CEO or just imposing bigger fines. But Tara, I know you listened to the actual earnings call. Were there any special callouts about the fine or data security in general? I’m curious if they talked about it at all.

Tara: They studiously avoided talking about the fine specifically. It’s a charge-off: they added $3 billion, and they said it could go up to as much as $5 billion, so that ate into their profit. Which is kind of interesting, because they reported, I think, I don’t have it in front of me, but I think it was around $2.3 billion in profit for the quarter.

And that is taking into account that $3 billion contingency fine. They didn’t really specifically discuss it. But they did say they expected profits to continue to waver a little bit going forward, due to regulatory headwinds as well as advertising-related falloff, because they’re not sure they can make the same amount of revenue off of ad targeting that they have in the past.

So that sort of in a roundabout way speaks to the fact that they’re looking into making some changes in terms of how they collect and use user data. But that’s sort of reading between the lines, and they certainly didn’t say anything explicit about it, unfortunately.

Lindsey: Right. Well, I know one big point of discussion was: is this enough? How does this compare to past fines? Because Facebook has faced various fines in the past, which, Tara, you have actually written about. I think in December it was fined like $11 million, and then in October it was fined $645,000. Obviously, those pale in comparison to $5 billion, but I think people are still kind of asking, how does this compare to Facebook’s kind of overall –

Tara: Yeah, their overall annual profit. You know, $3 to $5 billion is significant for them, actually.

Tom: Well, I just looked it up. Facebook made more than $40 billion in revenue in 2017.

Tara: What’s the profit? That’s the real marker, right?

Tom: Well, it is the real marker.

Lindsey:  I’m curious what will come out of it. But I do know that everyone’s really looking at this as some sort of precedent for how Facebook will be regulated in the future, if it continues with the data security issues that have been happening over the past year, since Cambridge Analytica.

Tara: One of the things, too, Lindsey, that I wanted to ask you about was the poll that we did on attitudes toward Facebook. But also, their earnings showed an 8 percent year-over-year subscriber jump. So the headlines, even though people are sort of horrified by them, aren’t really dissuading people from actually using the platform, which I think is interesting. And then their stock price skyrocketed after they reported earnings, even with the charge-off for the fine. So I don’t know what’s going to happen in the future, and whether any of this is going to make a difference in terms of whether or not it’s successful as a company.

Tom: I was just thinking, it’d be interesting to watch the regulatory space to see what the U.S. does, especially with GDPR and what’s going on in Europe, and really a constant march of bad news in terms of privacy, and also the breaches that are taking place, not only at Facebook but at a ton of other companies. I think what we’re setting up for in 2020 and beyond is some new rules and regulations around privacy. Because, as you just pointed out, Tara, fines and threats and punishments are not really impacting the way Facebook’s doing business or hurting its business model.

Lindsey: Right. I don’t think at all that people are going to stop using Facebook. And to be totally honest, even if they do adopt some sort of model where you pay to use the platform without advertising or without your data being collected and shared, I’m not sure how many people would even opt in for that. I mean, I could be completely wrong. But I don’t know if people are going to pay an extra $5 a month or something to use a social media platform that’s already free.

Tom: Yeah, I don’t think anybody’s going to be paying. But I think what you’ll see is probably some government intervention. That’s my prediction. I mean, these companies, whether it be Amazon, Google, or Facebook, they’ve basically had a clear runway to do whatever they wanted for I don’t know how many years. And if you think about all the different things that we regulate in this country, privacy really isn’t one of them right now, but it certainly is a ripe target for legislators to focus on.

Lindsey: Right. That’s a good point. Speaking of Amazon: Tara, I know you covered a really interesting story this week about their auditing program for Echo devices, which had already been reported. But now a new report said the program also exposes geolocation data, in addition to voice data. Can you add some color there?

Tara: Sure. So, this story was really interesting to me. And it’s not just the Echo, either. It’s also the other Alexa devices, including the Fire TV devices, and there are tons of third-party gadgets that have Alexa built in now. So this is kind of a broad-reaching story from an Internet of Things perspective. But yeah, as you pointed out, this is something that Bloomberg had broken a story on about three weeks ago, talking about the fact that Amazon has a team of people in place that may manually audit Alexa interactions to make sure that the AI is learning appropriately, and that it’s been effective and accurate and returning good results for users, and all that kind of thing.

What’s interesting is that this data is supposed to be anonymous, right? It’s just sort of random snippets. Human reviewers will listen to them, see what Alexa’s response was, make sure that it’s accurate, and do whatever secret sauce they have to do with the algorithm and the AI to fix it, or to make her smarter. But in the process of this, apparently, geolocation data gets scooped up, too. Because when people ask, “Alexa, tell me what the weather forecast is,” or “Alexa, I’m feeling like Chinese, is anybody delivering to my house?” those local results obviously have to be tied to geolocation data. So they’re scooping up and harvesting and storing and logging GPS coordinates, in addition to these random snippets.

And five different employees within Amazon who are working on this program basically came forward and said they feel that nobody gave their consent for this, and that it’s too broad an access for them to have. And then, on a whim, they plugged some of these coordinates into Google Maps and found that they could actually track somebody’s place of business or their house, and even bring up a picture of that house. And through other means, actually identify who lives there, and then tie all this other information together and be able to create a very detailed profile.

Tom: I agree with you, Tara. I think we need to be more concerned about the privacy that we hand over to these types of digital devices. And I’m even more concerned now about the privacy issues that surround geo-specific apps, where you’re using an app and it understands where you’re at and gives you context-relevant information, and how that data is being used, and who’s using it, and who’s collecting it. When you think about Amazon, it’s a potentially much more powerful company, considering all the tentacles it has into my buying, my data, and my home with its Alexa speakers.

Lindsey: Yeah, that’s a really good point. And I’m curious, too, about the consent and notification side of all of this. I mean, did Amazon have any response, Tara, about whether it gave any notification that it was doing any of this at all? Is there anything on Amazon’s website about this program?

Tara: No, no, this was completely in the background until Bloomberg came forward with their report; Amazon didn’t acknowledge that it exists. They just put out a statement saying, you know, we take privacy seriously; we limit the number of people that have access to this to those who are tasked with doing it as part of their job; and they’re bound by all kinds of restrictions and things like that; it’s highly controlled.

Tom: I gotta come back to the point where I feel like this is an area ripe for regulation. I’m not pro-regulation, but if this is something that consumers are outraged about, I think there’s got to be GDPR-type regulation that we’re going to see here in the U.S. that’s going to impact the Facebooks and the Amazons of the world.

Tara: Right. And now we have other types of potentially intrusive privacy issues to worry about, too. Lindsey, going back to some of the reporting you did this week with the facial recognition stuff that’s happening, that seems like sort of the Wild West out there. There’s no regulation around that, right?

Lindsey: Well, yeah, exactly. And the scary thing about that, too, is that a lot of the facial recognition applications out there are actually being used by the government, by the Department of Homeland Security and by police and whatnot. But yeah, facial recognition came up in the headlines a bunch this week, because there were two different incidents. The first: you guys may have heard that the EU last week approved a massive biometrics database that would combine data from law enforcement, from border control, and more, for both EU and non-EU citizens. So there was that. And then there was another incident this week, where a JetBlue passenger was boarding a flight and noticed that instead of scanning her boarding pass or taking a look at her passport, she was directed to look into a camera before being allowed onto the jet bridge. She was confused about what was going on, and so tweeted at JetBlue. And it turns out this was part of a Customs and Border Protection program that’s used in, I think, 17 airports, where facial recognition is used to identify passengers and let them through the gateway onto the plane. Her tweet went viral and kind of started this massive conversation about facial recognition and, you know, whether you can consent, where the data is coming from, and how it’s being shared. So that’s been a really interesting story to cover, and to see the backlash and reaction to both of these incidents.

Tom: I can relate to that. I recently traveled to Mexico for a little vacation, and I am seeing facial recognition more and more in my life. I think the interesting thing about your story, Lindsey, was also that you wrote about consent, whether or not all of these facial recognition systems actually ask for consent and get consent, which they don’t. When we flew into Mexico and went through customs, Mexico had immigration kiosks where they asked for facial recognition and fingerprints, and to scan our passports, which really creeped me out. My son, who’s 14 years old, is now probably part of a government database of fingerprints and facial recognition. It was kind of weird. Considering he’d been off the grid for a while, now he’s part of the system. And then we flew back into the United States. There were these huge immigration lines at the Boston airport, and one of the things we were able to do was cut the line by using what was called a mobile passport app. I didn’t realize it, but when you use the app, you get to skip this huge, onerous line that goes to basically more facial recognition kiosks for people coming into the United States. And the app itself was pretty slick. It’s kind of funny, because I felt really good about using the app, because it allowed me to cut in line. But the app basically did facial recognition, had me input my passport information, and basically took my identity into this app. And I was so eager to cut the line, I gotta admit, I kind of skipped over a lot of the terms of service. It saved me about 45 minutes, for the price of handing over my biometric data to the government and to this app.

Lindsey: That experience brings up a really good point, because I think there definitely are benefits to facial recognition. It’s not all about this dire Orwellian society; I think it makes these processes so much more efficient. But I do think there are also a bunch of privacy concerns that people expressed to me over the past week. And, Tom, like you were saying, consent and notification, but then also how the data is being secured, how it’s being shared, and who’s gaining access to it. So I think there’s a lot that goes into it. We actually did a Threatpost poll, and half of the respondents, which kind of surprised me, said that they don’t believe consent is realistically possible when it comes to facial recognition. I thought that was interesting, too, because if you think about some of the use cases where biometrics and facial recognition exist, if you have a security or surveillance camera that is using facial recognition, there’s not a lot you can do to opt out of that except avoid that area.

Tom: Well, I think you mentioned that the White House now has a zone where they use facial recognition. And right there, there’s no way you can say no. You walk into that zone, and you basically get put into a big database, and they cross-reference it and figure out who you are.

Lindsey: Right. So there’s a lot that goes into that. And when I was talking to a bunch of security people at the Electronic Frontier Foundation as well, they were mentioning that there really needs to be regulation for all of this. There is one law in Illinois that basically regulates the collection of biometric data without consent. But they think there needs to be more, and in particular regulation that impacts law enforcement, as opposed to just businesses, which is all that law covers. There’s also been a new bill introduced in March, the Commercial Facial Recognition Privacy Act, that would have more widespread implications for businesses in terms of what kind of notification and consent they would need when they use facial recognition. So I think that’s a step in the right direction, but something to be looking out for.

Tom: Yeah, facial recognition has been a creepy topic for a long time. But as the GPUs get better, and these computers get better, and the efficiency of the compute behind them gets better, it just becomes even creepier. I don’t even think the tin-foil hats will help protect you.

Lindsey: So Tom, you also had an interesting story this week. I think it was about 2 million passwords being exposed.

Tom: Yeah. So, we hear about these breach stories all the time. I mean, since we’ve been talking, there have probably been like three breaches, or should I say leaky servers and insecure data on the internet. One of the things I think is kind of interesting about this story is that the leaky data was tied to a China-based maker of an Android app called WiFi Finder. Researchers at the GDI Foundation found 2 million hotspots, and the passwords for those hotspots, on the app’s servers. Essentially, it’s pretty straightforward: the app is Android-based, you can get it on Google Play, and it’s one of many apps that do the same thing, which is crowdsource Wi-Fi hotspot data and pair that information with passwords. The idea is that if the dataset is big enough and you’re wandering around with this app on your phone, you can find a hotspot and authenticate to it without having to ask anybody for a Wi-Fi password. Now, the data found on the servers was pretty extensive, in the sense that it wasn’t just commercial businesses. You know, you go to Starbucks, your local gym, or a bookstore, and these public Wi-Fi hotspots have a password that you may have to ask for or look for. What was happening was that people were also crowdsourcing hotspots at private companies that were not generally publicly accessible. And for some odd reason, which really wasn’t explained very well in the reporting or the research, there was a massive amount of Wi-Fi hotspots owned by home users, by consumers. So you basically had a lot of password information and a lot of hotspots belonging to consumers in their homes.
The concern there is that in a commercial setting, or even in a publicly accessible business hotspot, there are protections in place to prevent people from messing with the router configuration and accessing some of the settings within the router. But if you have access to a home router, those security measures are not in place. There were no documented cases of hacking, but the concern was there regarding that type of information being available to anybody who had access to this leaky server.

Lindsey: I feel like we keep seeing this issue of insecure databases and these accidental exposures, which obviously are different from a malicious breach. I’m curious if there’s something that can be done to prevent this by the people who own these databases. I mean, Tom, did you talk to any experts who had recommendations about how to better secure databases, and what the underlying problem is here?

Tom: I did talk to a couple of experts on this one. And, you know, the advice is always the same. In terms of leaky data on servers, it doesn’t change much: just make sure you configure your servers correctly and make sure they’re not accessible to the public. There are a couple of strategies you can apply to that. The other suggestion was to change the way some of these publicly accessible sites offer Wi-Fi: rather than open, insecure Wi-Fi, use systems that hand out access via tokens, giving each individual a time-delineated username and a unique password. That would basically render all of these apps useless, because there would be a unique username and a unique password that would time out within a certain period, which would create a much more secure public Wi-Fi experience. That was really the suggestion from the experts I talked to regarding the blowback on this story.
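The token-based scheme the experts describe can be sketched roughly as follows. This is a minimal illustration, not anything the experts specified: the function names, the one-hour expiry, and the in-memory storage are all assumptions. A real deployment would issue these credentials through the venue's captive portal or a RADIUS-style backend.

```python
import secrets
import time

# Assumed policy: each guest credential expires after one hour.
TTL_SECONDS = 3600


def issue_guest_credentials(now=None):
    """Generate a unique, expiring username/password pair for one guest.

    Because every guest gets a different pair, a crowdsourced copy of
    someone's credentials is useless to anyone else.
    """
    now = time.time() if now is None else now
    return {
        "username": "guest-" + secrets.token_hex(4),   # e.g. guest-9f2a1c07
        "password": secrets.token_urlsafe(12),          # unguessable one-off password
        "expires_at": now + TTL_SECONDS,
    }


def credentials_valid(creds, now=None):
    """Check whether a credential pair is still within its validity window."""
    now = time.time() if now is None else now
    return now < creds["expires_at"]
```

Even if an app like WiFi Finder harvested such a pair, it would stop working within the hour, which is exactly why per-user, time-limited credentials defeat password-crowdsourcing.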

Lindsey: Well, I’m feeling sufficiently like I need more privacy right now. Maybe we should wrap up. Tom and Tara, thanks for taking the time, and really interesting discussion today.

Tara: Yeah. Thanks, Lindsey. Thanks, Tom.

Tom: Yeah, have a great weekend.

Lindsey: Catch us next week on the Threatpost podcast.

