FDA: Software Failures Responsible for 24% of All Medical Device Recalls

Software failures were behind 24 percent of all the medical device recalls in 2011, according to data from the U.S. Food and Drug Administration, which said it is gearing up its labs to spend more time analyzing the quality and security of software-based medical instruments and equipment.

The FDA’s Office of Science and Engineering Laboratories (OSEL) released the data in its 2011 Annual Report on June 15, amid reports of a compromise of a Web site used to distribute software updates for hospital respirators. The absence of solid architecture and “principled engineering practices” in software development affects a wide range of medical devices, with potentially life-threatening consequences, the agency said. In response, FDA told Threatpost that it is developing tools to disassemble and test medical device software and to locate security problems and design weaknesses.

In an e-mail statement, the agency said that it is developing “techniques and laboratory expertise to assist our review staff in identifying potential vulnerabilities and evaluating risk mitigation measures” similar to those used in “regulated industry.”

The agency is also acquiring expertise in areas like “detecting malware inside device designs…(and) reverse engineering certain types of malware to best identify the specific protective practices which manufacturers should be employing,” the report reads. (PDF)

The statement is the clearest indication to date that FDA is making software quality and security an area of focus. The agency has come under fire in recent years for not holding manufacturers accountable for insecure or poorly written software, and there is mounting evidence that software security and integrity are a problem in the medical field. In October, for example, security researcher Barnaby Jack demonstrated a remote, wireless attack on an implantable insulin pump from the firm Medtronic. The attack could have enabled a remote assailant to command the pump to release a fatal dose of insulin to a diabetic. That presentation was similar to one in August 2011 at the DEFCON hacking conference in Las Vegas, where researcher Jerome Radcliffe, himself a diabetic, demonstrated how he could remotely manipulate the dosage levels delivered by his own insulin pump from up to 300 feet away. The demonstration prompted calls from two U.S. Congressmen for a review of wireless medical device security.

FDA says it “shares the concern” of security researchers about the security and privacy of medical devices. FDA “emphasizes security as a key element in device design. Any system with wireless communication can be subject to interception of data and compromised privacy as well as interference with performance that can compromise the safety and effectiveness of the device,” according to an e-mail statement from the agency.

“We continue to closely monitor for safety or security problems,” FDA wrote.

Security researchers point out that many medical devices now rely on some form of embedded software for management and diagnostics. However, that software is often rife with vulnerabilities and other exploitable security holes. Moreover, medical device makers don’t deploy tools and strategies long used by commercial software developers to ensure the integrity of software and communications. Those include the use of cryptographic hashes to verify the authenticity of software updates.
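
As a rough illustration of the kind of check researchers are describing, here is a minimal Python sketch that verifies a downloaded update against a SHA-256 digest published by the manufacturer over a separate, trusted channel. The function names, file names, and out-of-band digest workflow are assumptions for illustration, not any vendor’s actual update tooling.

```python
# Illustrative sketch only: integrity check of a downloaded update file
# against a SHA-256 digest obtained out of band (e.g., from the vendor's
# documentation). Names, paths, and digest values are hypothetical.
import hashlib
import hmac

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def update_is_intact(update_path: str, published_digest: str) -> bool:
    """Accept the update only if its digest matches the published value."""
    actual = sha256_of_file(update_path)
    # Constant-time comparison avoids leaking how many characters matched.
    return hmac.compare_digest(actual, published_digest.strip().lower())

if __name__ == "__main__":
    # Placeholder file name and digest, shown only to demonstrate usage.
    ok = update_is_intact("vent_firmware_2.1.bin",
                          "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08")
    print("update verified" if ok else "digest mismatch -- do not install")
```

A bare digest only covers integrity; it becomes useful for authenticity when the digest itself arrives over a trusted, authenticated channel or inside a signed manifest, as in the example further below.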

Recent research on medical device security identified vulnerabilities in the software controlling an Automated External Defibrillator (AED), a device used to treat cardiac arrhythmias (PDF). The researchers also found that the device would accept unsigned, counterfeit software updates.

Research by University of Massachusetts researchers Shane Clark and Kevin Fu found that securing the software that runs medical devices is complicated by the tension between the need for devices to operate securely and the need for many of them to operate swiftly and reliably in an emergency. Software coding problems and fuzzier “human factors,” such as the tendency to ignore or incorrectly use security features, were also identified as confounding factors in medical device security.

Those questions were brought into sharp relief this week with news that a Web site operated by the firm CareFusion Inc. was hacked and infected with malicious code. The site in question was used to distribute software updates for CareFusion’s AVEA and VELA brand respirators. In a statement, CareFusion said that the infection lasted around two weeks and did not affect any downloadable software on the site.

Fu, of the University of Massachusetts, said that device manufacturers operate in a manner that’s almost “medieval” compared to mainstream technology vendors.
“There are some very basic problems right now, like merely recognizing that it’s not safe to distribute software without cryptographic protection for authenticity, integrity, and freshness. It’s not part of the manufacturing vocabulary,” he wrote. “Some companies might do a good job, but I have yet to find a single company that issues digital signatures or hashes of medical device downloads.”
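
The three properties Fu names map onto concrete checks. The sketch below, which builds on the digest routine above, assumes a hypothetical JSON manifest signed with an Ed25519 key using the open-source cryptography package; the manifest fields, the one-week freshness window, and the key handling are illustrative assumptions, not any device maker’s actual scheme.

```python
# Illustrative sketch only: checking authenticity (signature), integrity
# (digest recorded in the signed manifest), and freshness (signed timestamp)
# before accepting an update. The manifest format and freshness window are
# hypothetical; the signature API comes from the "cryptography" package.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

MAX_AGE_SECONDS = 7 * 24 * 3600  # assumed freshness window: one week

def verify_update(manifest_bytes: bytes, signature: bytes,
                  vendor_public_key: bytes, downloaded_sha256: str) -> dict:
    """Raise ValueError unless the manifest is authentic, matches the
    downloaded file, and is recent enough to rule out replay of old updates."""
    public_key = Ed25519PublicKey.from_public_bytes(vendor_public_key)
    try:
        public_key.verify(signature, manifest_bytes)           # authenticity
    except InvalidSignature:
        raise ValueError("manifest signature does not verify")

    manifest = json.loads(manifest_bytes)
    if manifest.get("sha256") != downloaded_sha256:            # integrity
        raise ValueError("downloaded file does not match signed manifest")
    if time.time() - manifest.get("issued_at", 0) > MAX_AGE_SECONDS:  # freshness
        raise ValueError("manifest is stale; possible replay of an old update")
    return manifest
```

The downloaded_sha256 argument could come from a digest routine like the one sketched earlier; the point is that a digest is only trustworthy once it is bound to the vendor’s signature and a recent, signed timestamp.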

And that doesn’t even address the (bigger) problem of software quality. However, there’s evidence that the FDA is beginning to turn its attention to that topic. In the latest OSEL report, FDA describes efforts by its Division of Electrical and Software Engineering (DESE) to identify deficiencies in a medical device manufacturer’s software quality inspection processes. In one instance, a DESE biomedical software engineer identified a pattern of customer complaints about incorrect or missing notifications to clinicians when test results were out of range. The engineer linked the reports to what FDA said were “several coding defects which directly caused many of (the) customer complaints. Some defects were basic violations of software coding practices while others were new defects that were introduced during correction of the previous defects.” The audit resulted in the manufacturer issuing two FDA-mandated “Correction and Removal” notices and 11 Class II recalls of its products.

Still, the FDA maintains that it is the job of manufacturers to secure their products and make sure they stay secure.

“Manufacturers are responsible for identifying risks and hazards associated with medical device software (or) firmware, including risks related to security, and are responsible for putting appropriate mitigations in place to address patient safety,” the agency said in an e-mail statement. “Information related to theoretical device security problems is helpful. However, it is very important that the agency receive reports of devices that have had security breaches.”
