After Google Home users started receiving mysterious alerts when their fire alarms went off or their plates smashed in their homes, Google acknowledged that it had accidentally rolled out a feature causing the smart devices to record sounds without a voice prompt.
Reports of the privacy faux pas surfaced earlier in August, when one Reddit user said that Google had sent him a phone notification saying that the smoke detector in his home had been triggered. Others said they were notified of possible glass breaking when some dishes broke or when they were watching television. These incidents surprised Google Home users, as the devices are only supposed to record audio when prompted with specific wake phrases such as “Hey Google” or “OK Google.”
Google, for its part, said that the alerts are part of a subscription service called “Nest Aware,” first launched in May. As part of this service, users of Google cameras, speakers and displays can pay for the devices to detect any “critical” sound in their home and send an alert to their phones while they are away. But while users can sign up for the service, Google said that the feature was inadvertently turned on for some users who had not.
“We are aware of an issue that inadvertently enabled sound detection alerts for sounds like smoke alarms or glass breaking on speakers that are not part of a Nest Aware subscription,” a Google spokesperson told Threatpost. “The issue was caused by a recent software update and only impacted a subset of Google Home, Google Home Mini, and Google Home Max speakers. We have since rolled out a fix that will automatically disable sound detection on devices that are not part of Nest Aware.”
According to reports, the recording policy change is part of a recently announced long-term strategic partnership between Google and ADT, a U.S. security and home automation provider. As part of the partnership, Google will combine its Nest devices, services and technology with ADT’s leadership position in providing security solutions for millions of homes and small businesses in the U.S.
“Over time, Nest’s devices, powered by Google’s machine learning capabilities will enhance ADT’s security monitoring and become the cornerstone of ADT’s smart home offering,” according to Google in its announcement of the partnership. “The goal is to give customers fewer false alarms, more ways to receive alarm events, and better detection of potential incidents inside and around the home.”
A similar feature exists in Alexa-powered Amazon Echo smart home devices. To turn on this feature, called Alexa Guard, an Echo user tells the device, “Alexa, I’m leaving.” The device then begins listening for sounds such as smoke alarms, glass breaking or carbon monoxide alerts, and sends smart alerts to users via phone notification.
While these types of services are valuable for users when it comes to home security services, the privacy implications concern cybersecurity experts – particularly as voice assistants such as Amazon Alexa and Google Home have come under increased scrutiny about how much data is being collected, what that data is, how long it’s being retained and who accesses it.
“Voice assistant technology, including that used in smart speakers designed for the home, has long been suspected of, and in some cases confirmed to be, recording more than voice commands,” Wendy Foote, senior contracts manager at WhiteHat Security, told Threatpost. “Assuming [Google] dedicates significant resources to this product line, it is surprising to learn the inadvertent recording of private interactions in people’s homes was caused by a recent software update that turned on advanced sound detection that bypassed ‘wake up’ words that activated Google’s listening feature.”
Mohit Tiwari, Co-Founder and CEO at Symmetry Systems, told Threatpost that voice assistant device developers need a tool that can run through a check-list of safety issues, so that if an error occurs, it is automatically caught and the developers can fix it as soon as possible, either in pre-production or early in production.
“However, such tools aren’t available at scale today — automated pre-production tools check for close to machine-level errors (like ‘memory safety’) but not for privacy or ‘information flow’ errors,” he said.
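The kind of pre-production safety check-list Tiwari describes can be illustrated with a minimal sketch: a release gate that scans a device fleet and fails if a privacy-sensitive feature is enabled for anyone who has not opted in. All names here (`Device`, `sound_detection_enabled`, `has_subscription`) are hypothetical and do not reflect Google’s actual systems or APIs.

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Hypothetical record of a smart speaker's feature state."""
    device_id: str
    has_subscription: bool        # opted in to paid sound detection
    sound_detection_enabled: bool # always-on listening for critical sounds

def privacy_violations(fleet):
    """Return IDs of devices where sound detection is on without consent."""
    return [d.device_id for d in fleet
            if d.sound_detection_enabled and not d.has_subscription]

# Example fleet: a release gate would block the rollout if this check
# returns a non-empty list, catching the bug before production.
fleet = [
    Device("home-mini-01", has_subscription=True,  sound_detection_enabled=True),
    Device("home-max-02",  has_subscription=False, sound_detection_enabled=True),  # the bug
    Device("home-03",      has_subscription=False, sound_detection_enabled=False),
]
print(privacy_violations(fleet))  # -> ['home-max-02']
```

A check like this treats consent as an invariant to be verified automatically, rather than something discovered after users report mystery alerts.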
Google and Amazon’s virtual assistants drew criticism last year after they were caught eavesdropping without permission. Google Home devices captured and recorded highly personal audio, including instances of domestic violence and confidential business calls, as well as users asking their smart speakers to play porn on their connected mobile devices. In April 2019, Amazon was thrust into the spotlight for a similar reason, after a report revealed the company employs thousands of auditors to listen to Echo users’ voice recordings.
“The latest major privacy failure at Google is a reminder that when you have a microphone nearby, it is likely recording,” Joseph Carson, chief security scientist and Advisory CISO at Thycotic, said in a comment to Threatpost. “The important message to any vendor with active smart microphones is that transparency and consent for the users when the device is recording is critical, especially at a time when many employees are working from home and sensitive business details might be leaking via nearby smart devices. The good news is that Google reported the privacy incident and made an improvement to notify and alert the user when a recording has been made.”