The potential for digital-home assistants like Amazon Alexa to infringe on user privacy by making and saving voice recordings of them is already widely known. Now researchers have discovered that the devices also may be able to “hear” and record what people are typing on nearby smartphones, even amid background noise.
The microphones on digital assistants are sensitive enough to record the taps people make on a mobile device from up to a foot and a half away, according to a team of researchers from the University of Cambridge. The researchers constructed an attack that uses this capability to identify PINs and text typed into a smartphone.
“Given just 10 guesses, five-digit PINs can be found up to 15 percent of the time, and text can be reconstructed with 50 percent accuracy,” the team, Almos Zarandy, Ilia Shumailov and Ross Anderson, wrote in a paper published online, “Hey Alexa, What Did I Just Type?” [PDF].
The same group of researchers had already discovered ways that various forms of technology can potentially violate user privacy through what they call “acoustic snooping.” Last year, they published research on how a smartphone app can record sound from the device’s own microphones and deduce from it what someone has typed, giving it the potential to steal PINs and passwords.
The new research also builds on earlier work showing that voice assistants can record keystrokes on a physical computer keyboard and determine what was typed, Anderson wrote in a blog post.
“We knew that voice assistants could do acoustic snooping on nearby physical keyboards, but everyone had assumed that virtual keyboards were so quiet as to be invulnerable,” he wrote.
It turns out that they are not, researchers found. Because modern voice assistants like Alexa have two to seven microphones, they can do directional localization, just as human ears do but with even greater sensitivity, the researchers discovered.
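To illustrate the general principle of directional localization with a microphone array, the sketch below estimates the bearing of a sound source from the time delay between two channels of a recording using GCC-PHAT, a standard cross-correlation technique. This is only a rough example of the concept; the sample rate, microphone spacing and function names are assumptions, not details taken from the paper.

```python
# Minimal sketch: estimate where a sound came from using two microphone
# channels, via GCC-PHAT time-delay estimation. Illustrative only.
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the time delay (seconds) of `sig` relative to `ref`."""
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    # Phase transform: keep only phase information, then back to time domain.
    cc = np.fft.irfft(R / (np.abs(R) + 1e-15), n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(fs)

def bearing_from_delay(tau, mic_spacing, c=343.0):
    """Convert a time delay into an angle of arrival, given mic spacing in metres."""
    return np.degrees(np.arcsin(np.clip(c * tau / mic_spacing, -1.0, 1.0)))

# Example usage with two channels of a 16 kHz recording and mics 5 cm apart:
# tau = gcc_phat(channel_a, channel_b, fs=16000, max_tau=0.05 / 343.0)
# angle = bearing_from_delay(tau, mic_spacing=0.05)
```

With more than two microphones, the same pairwise delays can be combined to triangulate a source, which is what gives a smart-speaker array its directional sensitivity.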
“We assess the risk and show that a lot more work is needed to understand the privacy implications of the always-on microphones that are increasingly infesting our work spaces and our homes,” they wrote.
Researchers based their attack on the fact that microphones located close to the screen can hear screen vibrations and use them to successfully reconstruct the tap location, they said.
“Physical keyboards emit sound on key presses,” they wrote. “Recent research shows that acoustic side channels can also be exploited with virtual keyboards such as phone touchscreens, which despite not having moving parts still generate sound.”
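A hedged sketch of the general idea, not the researchers’ pipeline: isolate the short acoustic bursts that taps produce in a recording, then classify each burst against examples captured at known on-screen positions. The energy threshold, window length and use of a k-nearest-neighbors classifier below are illustrative assumptions.

```python
# Hypothetical tap-recovery sketch: find candidate tap bursts in an audio
# signal, then match them to training taps recorded at known key positions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_tap_windows(audio, fs, threshold=0.02, win_ms=20):
    """Return fixed-length windows starting at energy spikes (candidate taps)."""
    win = int(fs * win_ms / 1000)
    hits = np.flatnonzero(np.abs(audio) > threshold)
    taps, last = [], -win
    for i in hits:
        if i - last > win:              # debounce: at most one window per tap
            seg = audio[i:i + win]
            if len(seg) == win:
                taps.append(seg)
            last = i
    return np.array(taps)

# Training data would pair tap windows with the keys that produced them:
# clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
# guesses = clf.predict(extract_tap_windows(recording, fs=16000))
```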
For their attack, researchers assumed that an attacker had access to the microphones on a smart speaker near a target and aimed to steal PINs and passwords entered on the target’s touchscreen.
To construct the attack, researchers used a Raspberry Pi with a ReSpeaker six-microphone circular array to collect data. The Pi was running a simple TCP server that could be told to start and stop recording and save the audio to a six-channel .wav file. They also set up a Wi-Fi hotspot on the Pi so that participating devices could connect to it.
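The paper describes only the behavior of that server, not its code. The sketch below shows one way such a TCP-controlled recorder could look, assuming the ReSpeaker array is exposed as a six-channel input device and that the `sounddevice` and `soundfile` Python libraries are available; the port number, commands and filename are illustrative.

```python
# Sketch of a TCP-controlled six-channel recorder, in the spirit of the
# setup described in the paper. Not the researchers' code.
import socket
import numpy as np
import sounddevice as sd
import soundfile as sf

FS, CHANNELS, PORT = 16000, 6, 5000
frames = []

def callback(indata, n_frames, time_info, status):
    frames.append(indata.copy())        # buffer incoming 6-channel audio

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", PORT))
srv.listen(1)
conn, _ = srv.accept()

stream = None
while True:
    cmd = conn.recv(64).decode().strip()
    if cmd == "start":
        frames.clear()
        stream = sd.InputStream(samplerate=FS, channels=CHANNELS, callback=callback)
        stream.start()
    elif cmd == "stop" and stream is not None:
        stream.stop(); stream.close()
        if frames:                      # write the capture as a 6-channel .wav
            sf.write("capture.wav", np.concatenate(frames), FS)
    elif cmd in ("quit", ""):
        break
conn.close()
```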
The “victim devices” used in the experiments were an HTC Nexus 9 tablet, a Nokia 5.1 smartphone and a Huawei Mate 20 Pro smartphone, all running Android 7 or above and having at least two microphones, they said.
While only one of each device type was used, it’s likely that a second identical device also could be attacked using training from the first device, researchers noted.
Alexa the Threat
Aside from making voice recordings, it’s becoming increasingly clear that digital assistants have other ways of accessing their users’ sensitive data that can be exploited for nefarious purposes.
A team of researchers last year also discovered that attackers can potentially use light to manipulate the microphones of digital assistants like the Amazon Echo, which interpret it as sound, allowing them to attack not only the device itself but also others connected to it.
Their research delved into how the ecosystem of devices connected to a voice-activated assistant, such as smart locks, home switches and even cars, could be hacked by using a device like an Amazon Echo as the gateway to take control of them.
Other security holes in digital assistants also put people’s personal information at risk. Earlier this year, researchers found flaws in Alexa that could allow attackers to access users’ personal information, like home addresses, simply by persuading them to click on a malicious link.