The Institute of Medicine (IOM) report To Err Is Human estimated that 44,000-98,000 lives are lost every year due to medical errors in hospitals and led to the widespread recognition that health care is not safe enough, catalyzing a revolution to improve the quality of care.1 Despite considerable effort, patient safety has not yet improved to the degree hoped for in the IOM report Crossing the Quality Chasm. One strategy the nation has turned to for safer, more effective care is the widespread use of health information technologies (health IT).2 The U.S. government is investing billions of dollars toward meaningful use of effective health IT so all Americans can benefit from the use of electronic health records (EHRs) by 2014.
Health IT is playing an ever-larger role in the care of patients, and some components of health IT have significantly improved the quality of health care and reduced medical errors. Continuing to use paper records can place patients at unnecessary risk for harm and substantially constrain the country’s ability to reform health care. However, concerns about harm from the use of health IT have emerged. To protect America’s health, health IT must be designed and used in ways that maximize patient safety while minimizing harm. Information technology can better help patients if it becomes more usable, more interoperable, and easier to implement and maintain. This report explains the potential benefits and risks of health IT and calls for greater transparency, accountability, and reporting.
1 The IOM identified six aims of quality improvement, stating that health care should be safe, effective, patient-centered, timely, efficient, and equitable.
2 Health IT can also be referred to as health information systems and health information and communications technology, among others. This report employs the term health IT but recognizes that these other, broader terms are also used.
In this report, health IT includes a broad range of products, including EHRs,3 patient engagement tools (e.g., personal health records [PHRs] and secure patient portals), and health information exchanges; excluded is software for medical devices. Clinicians expect health IT to support delivery of high-quality care in several ways, including storing comprehensive health data, providing clinical decision support, facilitating communication, and reducing medical errors. Health IT is not a single product; it encompasses a technical system of computers and software that operates in the context of a larger sociotechnical system—a collection of hardware and software working in concert within an organization that includes people, processes, and technology.
It is widely believed that health IT, when designed, implemented, and used appropriately, can be a positive enabler to transform the way care is delivered. Designed and applied inappropriately, health IT can add an additional layer of complexity to the already complex delivery of health care, which can lead to unintended adverse consequences, for example dosing errors, failure to detect fatal illnesses, and delayed treatment due to poor human-computer interactions or loss of data.
In recognition of the rapid adoption of health IT, the Office of the National Coordinator for Health Information Technology (ONC) asked the IOM to establish a committee to explore how private and public actors can maximize the safety of health IT-assisted care. The committee interpreted its charge as making health IT-assisted care safer so the nation is in a better position to realize the potential benefits of health IT.
EVALUATING THE CURRENT STATE OF PATIENT SAFETY AND HEALTH IT
The expectations for safer care may be higher in a health IT-enabled environment than in a paper-based environment because the opportunity to improve patient care is much greater. The evidence in the literature about the impact of health IT on patient safety, as opposed to quality, is mixed but shows that the challenges facing safer health care and safer use of health IT involve people and clinical implementation as much as the technology. The literature describes significant improvements in some aspects of care in health care institutions with mature health IT. For example, the use of computerized prescribing and bar-coding systems has been shown to improve medication safety. But the generalizability of the literature across the health care system may be limited. While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT-associated harm have been reported, but little published evidence quantifies the magnitude of the risk.
3 “Electronic health records” is used as the desired term because it is more inclusive of the way electronic records are being used currently than “electronic medical records.” EHRs include clinical decision support tools, computerized provider order entry systems, and e-prescribing systems.
Health IT-related safety data are lacking for several reasons, including the absence of measures and of a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to the safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure and confidentiality clauses) that can prevent users from sharing information about health IT-related adverse events. These barriers limit users’ ability to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts that allows them to escape responsibility for errors or defects in their software (i.e., “hold-harmless” clauses). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT-related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.
EXAMINING THE CURRENT STATE OF THE ART IN SYSTEM SAFETY
Software-related safety issues are often ascribed to software coding errors or to human errors in using the software. It is rarely that simple. Many problems with health IT relate to usability, implementation, and how software fits with clinical workflow. Focusing on coding or human errors often leads to neglect of other factors (e.g., usability, workflow, interoperability) that may increase the likelihood of a patient safety event. Furthermore, software—such as an EHR—is neither inherently safe nor unsafe, because the safety of health IT cannot exist in isolation from its context of use. Safety is an emergent property of a larger system that takes into account not just the software but also how it is used by clinicians.
The larger system—often called a sociotechnical system—includes technology (e.g., software, hardware), people (e.g., clinicians, patients), processes (e.g., workflow), organization (e.g., capacity, decisions about how health IT is applied, incentives), and the external environment (e.g., regulations, public opinion). Adopting a sociotechnical perspective acknowledges that safety emerges from the interaction among various factors. Comprehensive safety analyses consider these factors taken as a whole and how they affect each other in an attempt to reduce the likelihood of an adverse event, rather than focusing on eliminating one “root cause” and ignoring other possible contributing factors.
OPPORTUNITIES TO BUILD SAFER SYSTEMS FOR HEALTH IT
Merely installing health IT in health care organizations will not result in improved care. Together, the design, implementation, and use of health IT affect its safe performance. Safer implementation and use of health IT is a complex, dynamic process that requires a shared responsibility between vendors and health care organizations.
Features of Safer Health IT
Safely functioning health IT should provide easy entry and retrieval of data, have simple and intuitive displays, and allow data to be easily transferred among health professionals. Many features of software contribute to its safe use, including usability and interoperability. Although definitive evidence is hard to produce, the committee believes poor user-interface design, poor workflow, and complex data interfaces are threats to patient safety.
Similarly, lack of system interoperability is a barrier to improving clinical decisions and patient safety, as it can limit the data available for clinical decision making. Laboratory data have been relatively easy to exchange because good standards, such as Logical Observation Identifiers Names and Codes (LOINC), exist and are widely accepted. However, important information such as problem lists and medication lists is not easily transmitted to and understood by the receiving health IT product because existing standards have not been uniformly adopted. Interoperability must extend throughout the continuum of care; standards need to be developed and implemented to support interaction between health IT products that contain disparate data.
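The contrast between well-standardized laboratory data and free-text lists can be illustrated with a small sketch. The dictionary shapes and helper function below are hypothetical illustrations, not actual HL7 or FHIR structures; only the LOINC code itself (2345-7, glucose in serum or plasma) is real:

```python
# Illustrative sketch: a lab result carried with a standard LOINC code
# versus a medication entry sent as free text. The data shapes here are
# hypothetical, chosen only to make the interoperability point.

# 2345-7 is the LOINC code for "Glucose [Mass/volume] in Serum or Plasma".
lab_result = {
    "code_system": "LOINC",
    "code": "2345-7",   # machine-resolvable: any receiver that speaks
    "value": 105,       # LOINC knows exactly what this observation is
    "unit": "mg/dL",
}

# A free-text medication entry: the receiving system must guess what
# "ASA 81 qd" means (aspirin 81 mg daily? something else?).
medication_entry = {"text": "ASA 81 qd"}

def is_machine_interpretable(entry):
    """A receiver can act on an entry only if it carries a recognized code."""
    return entry.get("code_system") == "LOINC" and "code" in entry

print(is_machine_interpretable(lab_result))        # True
print(is_machine_interpretable(medication_entry))  # False
```

The coded entry can be validated, trended, and fed to decision support by any receiving system; the free-text entry cannot, which is the gap the report attributes to nonuniform standards adoption.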
Opportunities to Improve the Design and Development of Technologies
Application of quality management practices needs to be a high priority for design and development activities. Creating safer systems begins with user-centered design principles and continues with adequate testing and quality assessments conducted in actual and/or simulated clinical environments. Vendors should not only create useful functions in their software but also understand how user-interface design affects the clinical settings and workflows where the applications are to be used and how the software supports activities within a health professional’s scope of practice.
Opportunities to Improve Safety in the Use of Health IT
Safety considerations need to be embedded throughout the implementation process, including the stages of planning and goal setting, deployment, stabilization, optimization, and transformation. Selecting the right software
requires a comprehensive understanding of the data and information needs of the organization and the capabilities of the system. Vendors take primary responsibility for the design and development of technologies, ideally with iterative feedback from users. Users assume responsibility for safe implementation and work with vendors throughout the health IT implementation process. The partnership to develop, implement, and optimize systems is a shared responsibility where vendors and users help each other achieve the safest possible applications of health IT.
It is important to recognize that health IT products generally cannot be installed out of the box. Users need to customize products judiciously to appropriately match their needs and capabilities—in both functionality and complexity of operation. The process of implementing software is critical to optimizing value and mitigating patient safety risks. A constant, ongoing commitment to safety—from acquisition to implementation and maintenance—is needed to achieve safer, more effective care. Testing at each of these stages is needed to ensure successful use of health IT.
Responsible use requires diligent surveillance for evolving needs, gaps, performance issues, mismatches between user needs and system performance, unsafe conditions, and adverse events. The committee believes certain actions by private and public entities are required to monitor safety in order to protect the public’s health, and it provides the following recommendations to improve health IT safety nationwide, optimizing the use of these technologies to achieve national health goals while reducing the risk of inadvertent harm.
Recommendation 1: The Secretary of Health and Human Services (HHS) should publish an action and surveillance plan within 12 months that includes a schedule for working with the private sector to assess the impact of health IT on patient safety and to minimize the risks of its implementation and use. The plan should specify that:
a. The Agency for Healthcare Research and Quality (AHRQ) and the National Library of Medicine (NLM) should expand their funding of research, training, and education of safe practices as appropriate, including measures specifically related to the design, implementation, usability, and safe use of health IT by all users, including patients.
b. The Office of the National Coordinator for Health Information Technology should expand its funding of processes that promote safety that should be followed in the development of health IT products, including standardized testing procedures to be used by manufacturers and health care organizations to assess the safety of health IT products.
c. The ONC and AHRQ should work with health IT vendors and health care organizations to promote postdeployment safety testing of EHRs for high-prevalence, high-impact EHR-related patient safety risks.
d. Health care accrediting organizations should adopt criteria relating to EHR safety.
e. AHRQ should fund the development of new methods for measuring the impact of health IT on safety using data from EHRs.
PATIENTS’ AND FAMILIES’ USE OF HEALTH IT: CONCERNS ABOUT SAFETY
Health IT products are also being developed to engage and support patients and their families in decision making and management of their own personal health information. Examples of electronic patient engagement tools include PHRs (both integrated and freestanding), mobile applications, and tools for assessing day-to-day health status (e.g., weight loss); these tools continue to evolve rapidly. The increasing use of health IT by consumers, patients, and families creates an urgent need for the development and support of a research agenda for these tools.
A SHARED RESPONSIBILITY FOR IMPROVING HEALTH IT SAFETY
Health IT safety is contingent on how the technology is designed, implemented, used, and fits into clinical workflow, requiring the cooperation of both vendors and users. In the absence of a single accountable party, policy makers need to act on behalf of the public good to promote and monitor health IT safety. The committee believes this is best accomplished through collaboration between the private and public sectors.
The private sector must play a major role in making health IT safer, but it will need support from and close collaboration with the public sector. Currently, there is no systematic regulation or sense of shared accountability for product functioning, liability is shifted primarily onto users, and there is no way to publicly track adverse outcomes. Therefore, when incidents that cause or could result in harm occur, there is no authority to collect, analyze, and disseminate the lessons learned. The lack of sufficient vendor action to build safer products, or of regulatory requirements to do so, threatens patient safety. Access to details of patient safety risks is essential to a properly functioning market in which users can identify the product that best suits their needs. Users need to share information about risks and adverse events with other users and vendors, but legal clauses shifting liability from vendors to users discourage such sharing.
Recommendation 2: The Secretary of HHS should ensure insofar as possible that health IT vendors support the free exchange of information about health IT experiences and issues and not prohibit sharing of such information, including details (e.g., screenshots) relating to patient safety.
Once information about patient safety risks is available, comparative user experiences can be shared. Currently, users cannot communicate effectively their experiences with health IT. In other industries, product reviews are available where users can rate their experiences with products and share lessons learned. A consumer guide for health IT safety could help identify safety concerns, increasing system transparency.
To gather objective information about health IT products, researchers should have access to both test versions of software provided by vendors and software already integrated in user organizations. Users should be able to compare and share their experiences and other measures of safety from health IT products.
Recommendation 3: The ONC should work with the private and public sectors to make comparative user experiences across vendors publicly available.
Another area necessary for making health IT safer is the development of measures. Inasmuch as the committee’s charge is to recommend policies and practices that lead to safer use of health IT, the nation needs reliable means of assessing the current state and monitoring for improvement. Currently, no entity is developing such measures; Recommendation 1 calls for AHRQ, the NLM, and the ONC to fund development of these measures. The lack of measures and the diversity of involved parties suggest a coordinating body is needed to oversee the development, application, and evaluation of measures of the safety of health IT use. Best practices will also be needed to ensure health IT is developed and implemented with safety as a priority.
Recommendation 4: The Secretary of HHS should fund a new Health IT Safety Council to evaluate criteria for assessing and monitoring the safe use of health IT and the use of health IT to enhance safety. This council should operate within an existing voluntary consensus standards organization.
This function could be housed within existing organizations, such as the National Quality Forum.
Because threats to health IT safety can arise before, during, and after implementation, it is also useful to design methods to monitor health IT
safety. Standards development organizations such as the American National Standards Institute and the Association for the Advancement of Medical Instrumentation could seek input from a broad group of stakeholders when developing these standards, criteria, and tests. Additionally, attestation by vendors that they have addressed specific safety issues in the design and development of their products can be important. Best practices for acquisition and implementation of health IT need to be developed. Development of postimplementation tests would help users monitor whether their systems meet certain safety benchmarks. Applying these tests is also a way for users to work with vendors to ensure that products have been installed correctly; accreditation organizations, such as The Joint Commission, could require conduct of these safety tests as part of their accreditation criteria.
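A postimplementation test of the kind described above could, in a highly simplified form, look like an automated benchmark check. The drug pair, the `check_interaction` stub, and the benchmark itself are hypothetical illustrations; a real test would call the site’s actual clinical decision support interface:

```python
# Hypothetical post-deployment safety check: verify that a locally
# installed system still fires a drug-interaction alert for a known
# dangerous pair after customization or an upgrade.

# Stand-in knowledge base; a real system would query its own CDS engine.
KNOWN_INTERACTIONS = {frozenset({"warfarin", "aspirin"})}

def check_interaction(drug_a: str, drug_b: str) -> bool:
    """Stub CDS call: returns True if an interaction alert would fire."""
    return frozenset({drug_a.lower(), drug_b.lower()}) in KNOWN_INTERACTIONS

def run_safety_benchmark() -> bool:
    """One benchmark case: warfarin + aspirin must always alert."""
    return check_interaction("Warfarin", "Aspirin")

print(run_safety_benchmark())  # True
```

Running a suite of such cases after every configuration change is one concrete way a user organization could verify that customization has not silently disabled a safety feature.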
Finally, the committee found successful adoption of change requires education and training of the workforce. Basic levels of competence, knowledge, and skill are needed to navigate the highly complex implementation of health IT. Because health IT exists at the intersection of multiple disciplines, a variety of professionals will need training in a number of established disciplines such as health systems, IT, and clinical care.
The Role of the Public Sector: Strategic Guidance and Oversight
A shared learning environment should be fostered to the fullest extent possible by the private sector, but, in some instances, the government needs to provide guidance and direction to private-sector efforts and to correct misaligned market forces. An appropriate balance must be reached between government oversight and market innovation. To encourage innovation and shared learning environments, the committee adopted the following general principles for government oversight:
• Focus on shared learning,
• Maximize transparency,
• Be nonpunitive,
• Identify appropriate levels of accountability, and
• Minimize burden.
The committee believes HHS should take the following actions to improve health IT safety.
First, to improve transparency and safety, it is necessary to identify the products being used and to whom any actions need to be directed. Having a mechanism to accomplish this is important so that when new knowledge about safety or performance arises, other users and products that could also be vulnerable can be identified. The ONC employed a similar mechanism for EHR vendors to list their products in implementing the meaningful use
program. The committee supports continuation of the ONC’s efforts to list all products certified for meaningful use in a single database as a first step for ensuring safety.
Recommendation 5: All health IT vendors should be required to publicly register and list their products with the ONC, initially beginning with EHRs certified for the meaningful use program.
Second, by establishing quality management principles and processes in health IT, vendors can improve the safety of their product lines. Experiences from other industries suggest the best approach to proactively creating highly reliable products is not to certify each individual product but to make sure organizations have adopted quality management principles and processes in the design and development of products.
While many vendors already have some quality management principles and processes in place, not all do, and the standards to which they are held are unknown. An industry standard is needed to ensure comprehensive adoption. To this end, the committee believes adoption of quality management principles and processes should be mandatory for all health IT vendors. The ONC, the Food and Drug Administration (FDA), and health IT certification bodies are examples of organizations that could administer this function.
Recommendation 6: The Secretary of HHS should specify the quality and risk management process requirements that health IT vendors must adopt, with a particular focus on human factors, safety culture, and usability.
Third, to quantify patient safety risks, reports of adverse events need to be collected, supplementing private-sector efforts. High-priority health IT- related adverse events include death, serious injury, and unsafe conditions. Analyses of unsafe conditions would produce important information that could have a great impact on improving patient safety and enable adoption of corrective actions that could prevent death or serious injury.
Regular reporting of adverse events is widely used, for the purposes of learning, to identify and rectify vulnerabilities that threaten safety. However, learning about the safety of health IT is limited because there are currently no comprehensive analyses of health IT-related adverse events, no consequences for failing to discover and report evidence about harm, and no aggregation of data for learning. In other countries and industries, reporting systems differ in their design, but the majority employ reporting that is voluntary, confidential, and nonpunitive. Creating a nonpunitive environment is essential for the success of voluntary
reporting systems. Reports must be collected for the purpose of learning and should not be used to address accountability.
The committee believes reports of health IT-related adverse events and unsafe conditions that are verified and free of user-identifying information should be transparently available to the public. The committee believes reporting of deaths, serious injuries, or unsafe conditions should be mandatory for vendors. Direction will need to come from a federal entity with adequate expertise, capacity, and authority to act on reports of health IT-related adverse events. The Secretary of HHS should designate an entity and provide it with the necessary resources to do so.
Current user reporting efforts are generally not coordinated with one another and not collected in a systematic manner; a more streamlined reporting system is needed. AHRQ has developed a common format that precisely defines the components of a field report for health IT-related adverse events or risks. Reports by users should remain voluntary and the identities of reporters should not be discoverable under any circumstance. Patient Safety Organizations are examples of entities that can protect this information from discovery. User-reported health IT-related adverse events should be collected by a central repository and also be sent to the appropriate vendor.
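The reporting flow described in this section, where a de-identified user report goes both to a central repository and to the relevant vendor, might be sketched as follows. The field names are assumptions for illustration and do not reflect the actual AHRQ Common Formats schema:

```python
# Hypothetical sketch of the dual-routing reporting flow: one voluntary
# user report is de-identified, stored centrally, and copied to the vendor.
from dataclasses import dataclass

@dataclass
class AdverseEventReport:
    product: str        # health IT product involved
    vendor: str
    description: str
    event_type: str     # e.g., "unsafe condition", "serious injury"
    reporter: str = ""  # confidential; stripped before any sharing

central_repository = []
vendor_queues = {}

def submit_report(report: AdverseEventReport) -> None:
    """Route one report: de-identify, store centrally, copy to vendor."""
    report.reporter = ""  # reporter identity is not discoverable
    central_repository.append(report)
    vendor_queues.setdefault(report.vendor, []).append(report)

submit_report(AdverseEventReport(
    product="EHR-X", vendor="VendorA",
    description="Medication list truncated after upgrade",
    event_type="unsafe condition", reporter="nurse42"))

print(len(central_repository))               # 1
print(central_repository[0].reporter == "")  # True
```

The point of the sketch is the routing discipline: the central repository enables aggregation and learning, while the vendor copy enables the product fix, and the reporter’s identity never leaves the intake step.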
Recommendation 7: The Secretary of HHS should establish a mechanism for both vendors and users to report health IT-related deaths, serious injuries, or unsafe conditions.
a. Reporting of health IT-related adverse events should be mandatory for vendors.
b. Reporting of health IT-related adverse events by users should be voluntary, confidential, and nonpunitive.
c. Efforts to encourage reporting should be developed, such as removing the perceptual, cultural, contractual, legal, and logistical barriers to reporting.
However, reports of patient safety incidents are only one part of a larger solution to maximize the safety of health IT-assisted care. The power to improve safety lies not just with reporting requirements, but with the ability to act on and learn from reports. To this end, two distinct functions are also needed: (1) aggregating and analyzing reports and (2) investigating the circumstances associated with safety incidents to determine the conditions that contribute to those incidents. Through these processes, lessons learned can be developed so similar incidents will be less likely to occur in the future. To maximize the effectiveness of reports, the collection, aggregation and analysis, and investigation of reports should be coupled as closely as possible.
Ideally, all reports of health IT-related adverse events would be aggregated and analyzed by a single entity that would identify reports for immediate investigation. Reports to this entity have to include identifiable data to allow investigators to follow up in the event the reported incident requires investigation. The entity would investigate two categories of reports: (1) reports that result in death or serious injury and (2) reports of unsafe conditions. Prioritization among the reports should be determined on a risk-based hazard analysis. In keeping with the principle of transparency, reports and results of investigations should be made public. A feedback loop from the investigatory entity back to both vendors and users is essential to allow groups to rectify systemic issues found that introduce risk.
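The risk-based hazard analysis used to prioritize reports for investigation could, in a much-simplified form, be a severity-times-likelihood score. The weights and scales below are assumptions chosen for illustration, not a methodology endorsed by the committee:

```python
# Illustrative risk-based triage: score = severity weight x estimated
# likelihood of recurrence; investigate the highest-scoring reports first.

SEVERITY = {"death": 5, "serious injury": 4, "unsafe condition": 2}

def risk_score(event_type: str, likelihood: float) -> float:
    """Simple risk score for one report (weights are assumptions)."""
    return SEVERITY.get(event_type, 1) * likelihood

reports = [
    ("unsafe condition", 0.9),  # very likely to recur across sites
    ("serious injury", 0.2),
    ("death", 0.05),            # tragic but judged unlikely to recur
]

prioritized = sorted(reports, key=lambda r: risk_score(*r), reverse=True)
print(prioritized[0][0])  # "unsafe condition"
```

Note how a widespread unsafe condition can outrank a rarer but more severe event, which is exactly why the report argues that analyzing unsafe conditions can prevent future deaths and serious injuries.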
The committee considered a number of potential organizations that could objectively analyze reports of unsafe conditions, as well as conduct investigations into health IT-related adverse events in the way the committee envisions, including FDA, the ONC, AHRQ, and the private sector. The committee concluded that investigating patient safety incidents does not match the internal expertise of any existing entity, as the needed functions are under the jurisdiction of multiple federal agencies and efforts are generally uncoordinated and not comprehensive.
The committee believes an independent federal entity could perform the needed analytic and investigative functions in a transparent, nonpunitive manner. It would be similar in structure to the National Transportation Safety Board, an independent federal agency created by Congress to conduct safety investigations. The entity would make nonbinding recommendations to the Secretary of HHS. Nonbinding recommendations provide flexibility, allowing the Secretary, health care organizations, vendors, and external experts to collectively determine the best course forward. Because current federal agencies do not have this as their charge, nor the baseline funding to take on these activities, the committee believes an independent federal entity is the best option to provide a platform to support shared learning at a national level.
Recommendation 8: The Secretary of HHS should recommend that Congress establish an independent federal entity for investigating patient safety deaths, serious injuries, or potentially unsafe conditions associated with health IT. This entity should also monitor and analyze data and publicly report results of these activities.
When combined, removing contractual restrictions, promoting public reporting, and having a system in place for independent investigations can be a powerful force for improving patient safety.
Achieving transparency and safer health IT products and safer use of health IT will require the cooperation of all stakeholders. Without more information about the magnitude and types of harm, other mechanisms will be necessary to motivate the market to correct itself. The committee offers a two-stage approach, with its recommended actions as the first stage to provide a better understanding of the threats to patient safety.
The current state of safety and health IT is not acceptable; specific actions are required to improve the safety of health IT. The first eight recommendations are intended to create conditions and incentives to encourage substantial industry-driven change without formal regulation. However, because the private sector to date has not taken sufficient action on its own, the committee believes a follow-up recommendation is needed to formally regulate health IT.4 If the actions recommended to the private and public sectors are not effective as determined by the Secretary of HHS, the Secretary should direct FDA to exercise all authorities to regulate health IT.
The committee was of mixed opinion on how FDA regulation would affect the pace of innovation by industry but identified several areas of concern regarding immediate FDA regulation. The current FDA framework is oriented toward conventional, out-of-the-box, turnkey devices. Health IT, however, has markedly different characteristics, suggesting that a more flexible regulatory framework will be needed in this area to achieve the goals of product quality and safety without unduly constraining market innovation. For example, as a software-based product, health IT has a product life cycle very different from that of conventional technologies. These products exhibit great diversity in features, functions, and scope of intended and actual use, which tend to evolve over the life of the product. Taking a phased, risk-based approach can help address this concern. FDA has chosen not to exercise regulatory authority over EHRs, and controversy exists over whether some health IT products, such as EHRs, should be considered medical devices. If the Secretary deems it necessary for FDA to regulate EHRs and other currently nonregulated health IT products, clear determinations will need to be made about whether all health IT products classify as medical devices for regulatory purposes. If FDA regulation is deemed necessary, FDA will need to commit sufficient resources and add capacity and expertise to be effective.
The Secretary should report annually to Congress and the public on the progress of efforts to improve the safety of health IT, beginning 12 months from the release of this report. In these reports, the Secretary should make
clear the reasons why further oversight actions are or are not needed. In parallel, the Secretary should ask FDA to begin planning the framework needed for potential regulation consistent with Recommendations 1 through 8 so that, if she deems FDA regulation to be necessary, the agency will be ready to act, allowing for the protection of patient safety without further delay. The committee recognizes that not all of its recommendations can be acted on by the Secretary alone and that some will require congressional action.
Recommendation 9a: The Secretary of HHS should monitor and publicly report on the progress of health IT safety annually beginning in 2012. If progress toward safety and reliability is not sufficient as determined by the Secretary, the Secretary should direct FDA to exercise all available authorities to regulate EHRs, health information exchanges, and PHRs.
Recommendation 9b: The Secretary should immediately direct FDA to begin developing the necessary framework for regulation. Such a framework should be in place if and when the Secretary decides the state of health IT safety requires FDA regulation as stipulated in Recommendation 9a above.
FUTURE RESEARCH FOR CARE TRANSFORMATION
The committee identified a number of research gaps during its information gathering. Research is needed to continue building the evidence base for how to develop and adopt safer health IT most effectively. A greater body of conclusive research is needed to fully realize the potential of health IT for ensuring patient safety.
Recommendation 10: HHS, in collaboration with other research groups, should support cross-disciplinary research toward the use of health IT as part of a learning health care system. Products of this research should be used to inform the design, testing, and use of health IT. Specific areas of research include
a. User-centered design and human factors applied to health IT,
b. Safe implementation and use of health IT by all users,
c. Sociotechnical systems associated with health IT, and
d. Impact of policy decisions on health IT use in clinical practice.
Creating an infrastructure that supports learning about and improving the safety of health IT is needed to achieve better health care. Proactive steps must be taken to ensure that health IT is developed and implemented
with safety as a primary focus through the development of industry-wide measures, standards, and criteria for safety. Surveillance mechanisms are needed to identify, capture, and investigate adverse events to continually improve the safety of health IT. Transparency and cooperation between the private and public sectors are critical to creating the necessary infrastructure to build safer systems that will lead to better care for all Americans.