CAMBRIDGE, Mass. – Chris Valasek and Charlie Miller’s car hacking research brought a harsh dose of reality to Internet of Things security, moving the conversation beyond almost clichéd discussions of smart refrigerators leaking inconsequential data to hackers remotely manipulating a car’s brakes.
But Furby hacking matters too.
Valasek made it clear during a keynote at the Security of Things Forum here Thursday that the connectivity of things is a great unknown, and that today’s low-impact vulnerability in a processor, connector or CAN bus is tomorrow’s high-impact issue inside a power plant or the brains of a Jeep Cherokee. His favorite example of low-impact research is work done by Azimuth Security’s Michael Coppola, a recent Northeastern University graduate, who reverse-engineered a Furby, the popular children’s toy from the 1990s. Coppola discovered vulnerabilities in the way the toy communicates with other Furbys and with its mobile app.
“We did high-impact car hacking research over a cell network that instituted a massive recall,” Valasek said. “But low-impact research cannot be dismissed either. Not every IoT vulnerability is going to be high impact. You have to judge how technology that might be vulnerable today will be used in the future.”
“There are processors and communications channels everywhere, and purchasers buy these things in bulk,” Valasek said. “Something that does communications in a Furby may be in a SCADA system as well. Don’t dismiss small things that could have a high impact.”
The talk was Valasek’s first public appearance since the remote car hacking research dominated headlines over the summer; he and Miller have since joined Uber’s Advanced Technology Center. Valasek had stern reminders about the opportunity in front of researchers and manufacturers to secure devices by design, and about the need to implement processes for updating connected devices already in the field that are likely vulnerable.
But unlike software, which can be updated monthly or on the fly if need be, IoT devices have hardware dependencies that make patching challenging.
“There are a lot of complexities these companies have that regular software people don’t. Microsoft can refactor software and not care about the hardware it’s running on. The makers of things like cars cannot do that,” he said. Valasek and Miller were able to attack critical systems on the Jeeps they tested by finding connections via a CAN bus that talked to the vehicle’s entertainment system as well as its steering, acceleration and braking systems. Fiat Chrysler Automobiles immediately issued a recall of 1.4 million vehicles to apply patches.
“They can’t just refactor,” Valasek said. “They have to replace hardware, which is impossible for a lot of large companies.”
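To give a rough sense of what “talking on a CAN bus” looks like, the minimal sketch below writes a single frame onto a bus using the python-can library. It assumes a Linux SocketCAN interface; the arbitration ID and payload are made-up placeholders, not the actual message identifiers Miller and Valasek reverse-engineered.

```python
# Minimal, hypothetical sketch of transmitting one CAN frame with python-can.
# The arbitration ID and payload below are placeholders for illustration only.
import can

# Connect to a virtual SocketCAN interface (Linux), e.g. created with:
#   sudo ip link add dev vcan0 type vcan && sudo ip link set up vcan0
bus = can.interface.Bus(channel="vcan0", bustype="socketcan")

# A classic CAN frame is just an 11-bit (or 29-bit) arbitration ID plus up to
# eight data bytes; any node on the bus can transmit, and every node sees it.
msg = can.Message(
    arbitration_id=0x123,           # placeholder ID, not a real ECU message
    data=[0x01, 0x02, 0x03, 0x04],  # placeholder payload
    is_extended_id=False,
)

bus.send(msg)
print(f"Sent frame {msg.arbitration_id:#x} on {bus.channel_info}")
bus.shutdown()
```

The lack of sender authentication in the frame itself is the point: once an attacker can inject frames onto the right bus, the receiving ECUs have no built-in way to tell a legitimate module from an interloper.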
The researcher urged that responsibility for security be shared among a number of parties; in the case of the car hacking research, that means parts manufacturers, OEMs and carriers.
In Valasek and Miller’s car hacking, the researchers found a vulnerability in Uconnect, a communications module manufactured by Harman. Complicating matters was shoddy network segmentation on Sprint’s cellular network, which allowed the researchers to use a burner phone purchased at Wal-Mart as a hotspot from which to launch the remote attacks.
Sprint’s closing of a number of open ports did more to mitigate potential attacks than the Fiat Chrysler patch, which shut down a connection, supposedly air-gapped, between the CAN buses managing the vehicle’s entertainment and acceleration/braking systems, Valasek said. But the key is that the parties, minus Harman, were talking.
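Why open ports on a carrier network matter is easy to illustrate: on a poorly segmented network, any subscriber’s device can probe services listening on any other subscriber’s device. The sketch below is purely hypothetical; the address and port are placeholders, not the actual Uconnect service details.

```python
# Hypothetical sketch: checking whether a TCP port on another device sharing
# the same (poorly segmented) network is reachable. Address and port are
# placeholders for illustration only.
import socket


def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    target = "10.0.0.42"   # placeholder address of another device on the network
    service_port = 9999    # placeholder port
    reachable = port_is_open(target, service_port)
    print(f"{target}:{service_port} reachable -> {reachable}")
```

Closing such ports at the carrier level, as Sprint did, removes the remote path entirely, regardless of whether the device behind it has been patched.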
“These parties need to communicate and work to ensure networks used for their products are aware of each other,” Valasek said. “What we should do is put forth an effort to secure things when we design them, have design, implementation and remediation reviews. OTA (over-the-air) updates are a must. If something runs code, it will have to be fixed. Researchers have to keep researching.”