of safety, effectiveness, usability, reliability, dependability, security, privacy, availability, and maintainability, Fu said.
There can be overconfidence in the functioning of software, Fu said. Complacency can stem from the belief that if software appears to function, nothing can go wrong; this is not always the case. Fu cited one example from the late 1980s involving the Therac-25, one of the first linear accelerators to use software aggressively in the control of radiation treatments. After reports from health-care professionals of injuries and deaths caused by machine malfunctions (which resulted in radiation overdoses), the manufacturer investigated and reported that the machine could not possibly overtreat a patient (Leveson and Turner, 1993).
Since then, the number of devices using software and the number of devices recalled for software-related issues have been increasing. Fu reported that 6% of all device recalls issued by the Food and Drug Administration (FDA) from 1983 to 1997 cited software as the reason. The proportion nearly doubled from 1999 to 2005, when 11.3% of device recalls were attributed to software. In 1983–1997, 24% of recalled devices relied on software in some way, and this increased to 49% during 1999–2005. In 2006, it was reported that over half the medical devices on the US market involved software in their function. In 2002–2010, there were more than 537 recalls of devices that used software, affecting over 1.5 million devices in use in the United States.
Software in a device differs from hardware for two reasons, Fu asserted. First, software is discrete, rather than continuous. For example, there would be little concern if a manufacturer of 1-inch nails produced a product ranging from 0.9999 inch to 1.0001 inch; that small error is usually tolerable. However, a single error in a computer system, such as a 20-mL entry for an infusion pump becoming a 200-mL entry, can have potentially catastrophic consequences. There is generally no analogous notion of a safety margin for software. Second, software is extremely difficult to test for every possible complication.
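As a minimal sketch of the point about discrete errors (not from the workshop; the function name and dose limits are illustrative assumptions), a mistyped extra digit is a large discrete jump rather than a small continuous deviation, so an explicit range check, not a manufacturing tolerance, is the relevant safeguard:

```python
# Hypothetical illustration only: limits and names are assumed for this sketch.
SAFE_DOSE_RANGE_ML = (0.1, 50.0)  # assumed safe dosing range for the example

def validate_dose(dose_ml: float) -> float:
    """Reject any dose entry outside the configured safe range."""
    low, high = SAFE_DOSE_RANGE_ML
    if not (low <= dose_ml <= high):
        raise ValueError(
            f"dose {dose_ml} mL is outside the safe range {low}-{high} mL"
        )
    return dose_ml

validate_dose(20.0)       # intended entry: accepted
try:
    validate_dose(200.0)  # extra digit typed: rejected, not silently infused
except ValueError as err:
    print(err)
```

The check illustrates why software safety hinges on catching discrete faults outright: a value 10 times too large is not "within tolerance" of the intended one.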
Fu noted that software can itself constitute a device, for example, an electronic health record. Electronic health records, if designed correctly, could reduce errors substantially, especially errors of patient misidentification. But electronic health records will need very strong integrity guarantees and strong security and privacy, and there is an issue of interoperability among hospitals and systems. System complexities involving the collation of vast amounts of information could introduce risks. When asked whether a paper medical record would be a reasonable predicate, especially in considering security and privacy, Fu stated that software behaves differ-