Creating Safety Systems in Health Care Organizations
Unsafe acts are like mosquitoes. You can try to swat them one at a time, but there will always be others to take their place. The only effective remedy is to drain the swamps in which they breed. In the case of errors and violations, the "swamps" are equipment designs that promote operator error, bad communications, high workloads, budgetary and commercial pressures, procedures that necessitate their violation in order to get the job done, inadequate organization, missing barriers, and safeguards . . . the list is potentially long but all of these latent factors are, in theory, detectable and correctable before a mishap occurs.1
Safety systems in health care organizations seek to prevent harm to patients, their families and friends, health care professionals, contract-service workers, volunteers, and the many other individuals whose activities bring them into a health care setting. Safety is one aspect of quality, where quality includes not only avoiding preventable harm, but also making appropriate care available—providing effective services to those who could benefit from them and not providing ineffective or harmful services.2
As defined in Chapter 3, patient safety is freedom from accidental injury. This definition and this report intentionally view safety from the perspective of the patient. Accordingly, this chapter focuses specifically on patient safety. The committee believes, however, that a safer environment for patients would also be a safer environment for workers and vice versa, because both
are tied to many of the same underlying cultural and systemic issues. As cases in point, hazards to health care workers because of lapses in infection control, fatigue, or faulty equipment may result in injury not only to workers but also to others in the institution.
This chapter introduces what has been learned from other high-risk industries about improving safety. It then discusses key concepts for designing systems and their application in health care. This is followed by a discussion of five principles to guide health care organizations in designing and implementing patient safety programs. Lastly, the chapter discusses a critical area of safety, namely, medication safety, and illustrates the principles with strategies that health care organizations can use to improve medication safety.
The committee is convinced that there are numerous actions based on both good evidence and principles of safe design that health care organizations can take now or as soon as possible to substantially improve patient safety. Specifically, the committee makes two overarching recommendations: the first concerns leadership and the creation of safety systems in health care settings; the second concerns the implementation of known medication safety practices.
RECOMMENDATION 8.1 Health care organizations and the professionals affiliated with them should make continually improved patient safety a declared and serious aim by establishing patient safety programs with a defined executive responsibility. Patient safety programs should: (1) provide strong, clear, and visible attention to safety; (2) implement nonpunitive systems for reporting and analyzing errors within their organizations; (3) incorporate well-understood safety principles, such as standardizing and simplifying equipment, supplies, and processes; and (4) establish interdisciplinary team training programs, such as simulation, that incorporate proven methods of team management.
Chief executive officers and boards of trustees must make a serious and ongoing commitment to creating safe systems of care. Other high-risk industries have found that improvements in safety do not occur unless there is commitment by top management and an overt, clearly defined, and continuing effort on the part of all personnel and managers. Like any other program, a meaningful safety program should include senior-level leadership,
defined program objectives, plans, personnel, and budget, and should be monitored by regular progress reports to the executive committee and board of directors.
According to Cook,3 "safety is a characteristic of systems and not of their components"; it is an emergent property of systems. In order for this property to arise, health care organizations must develop a systems orientation to patient safety, rather than an orientation that finds and attaches blame to individuals. It would be hard to overestimate the underlying, critical importance of developing such a culture of safety to any efforts that are made to reduce error. The most important barrier to improving patient safety is lack of awareness of the extent to which errors occur daily in all health care settings and organizations. This lack of awareness exists because the vast majority of errors are not reported, and they are not reported because personnel fear they will be punished.
Health care organizations should establish nonpunitive environments and systems for reporting errors and accidents within their organizations. Just as important, they should develop and maintain an ongoing process for the discovery, clarification, and incorporation of basic principles and innovations for safe design and should use this knowledge in understanding the reasons for hazardous conditions and ways to reduce these vulnerabilities. To accomplish these tasks requires that health care organizations provide resources to monitor and evaluate errors and to implement methods to reduce them.
Organizations should incorporate well-known design principles in their work environment. For example, standardization and simplification are two fundamental human factors principles that are widely used in safe industries and widely ignored in health care.
They should also establish interdisciplinary team training programs—including the use of simulation—for personnel in areas such as the emergency department, intensive care unit, and operating room, covering both trainees and experienced practitioners, and should incorporate proven methods of managing work in teams, as exemplified in aviation (where the approach is known as crew resource management).
RECOMMENDATION 8.2 Health care organizations should implement proven medication safety practices.
A number of practices have been shown to reduce errors in the medication process and to exemplify known methods for improving safety. The committee believes they warrant strong consideration by health care organizations, including hospitals, long-term-care facilities, ambulatory settings, and other health care delivery sites, as well as outpatient and community pharmacies. These methods include: reducing reliance on memory; simplification; standardization; use of constraints and forcing functions; the wise use of protocols and checklists; decreasing reliance on vigilance, handoffs, and multiple data entry; and differentiating among products to eliminate look-alike and sound-alike products.
Errors occur in all industries. Some industrial accidents involve one or a few workers. Others affect entire local populations or ecosystems. In health care, events are well publicized when they appear to be particularly egregious—for example, wrong-site surgery or the death of a patient during what is thought to be a routine, low-risk procedure. Generally, however, accidents are not well publicized; indeed, they may not be known even to the patient or to the family. Because the adverse effects may be separated in time or space from the occurrence, they may not even be recognized by the health care workers involved in the patient's care.
Nevertheless, we know that errors are ubiquitous in all health care settings.4 Harms range from high-visibility cases to those that are minimal but require additional treatment and time for the patient to recuperate or result in a patient's failure to receive the benefit of appropriate therapy. In aggregate, they represent a huge burden of harm and cost to the American people as described in Chapter 2.
To date, however, those involved in health care management and delivery have not had specific, clear, high-level incentives to apply what has been learned in other industries about ways to prevent error and reduce harm. Consequently, the development of safety systems, broadly understood, has not been a serious and widely adopted priority within health care organizations. This report calls on organizations and on individual practitioners to address patient safety.
Health care is composed of a large set of interacting systems—paramedic, emergency, ambulatory, inpatient care, and home health care; testing and imaging laboratories; pharmacies; and so forth—that are connected in loosely coupled but intricate networks of individuals, teams, procedures, regulations, communications, equipment, and devices that function with diffused management in a variable and uncertain environment.5 Physicians in community practice may be so tenuously connected that they do not even
view themselves as part of a system of care. They may see the hospitals in which they are attendings as platforms for their work. In these and many other ways, the distinct cultures of medicine (and other health professions) add to the idiosyncrasy of health care among high-risk industries.
Nevertheless, experience in other high-risk industries has provided well-understood illustrations that can be used in improving health care safety. Studies of actual accidents, incident-reporting systems, and research on human factors (i.e., the interface of human beings and machines and their performance in complex working environments) have contributed to our growing understanding about how to prevent, detect, and recover from accidents. This has occurred because, despite their differences from health care, all systems have common characteristics that include the use of technologies, the users of these technologies, and an interface between the users and the technologies.6 The users of technology bring certain characteristics to a task such as the quality of their knowledge and training, level of fatigue, and careful or careless habits. They also bring characteristics that are common to everyone, including difficulty recalling material and making occasional errors.
Safety Systems in High-Risk Industries
The experience in three high-risk industries—chemical manufacturing, materials manufacturing, and defense—provides examples of the information and systems that can contribute to improved safety and of the safety achievements that are possible. Claims that health care is unique and therefore not susceptible to a transfer of learning from other industries are not supportable. Rather, the experiences of other industries provide invaluable insight about how to begin the process of improving the safety of health care by learning how to prevent, detect, recover from, and learn from accidents.
E.I. du Pont de Nemours and Company
E.I. du Pont de Nemours and Company has one of the lowest rates of occupational injury of any company, a record that substantiates its 11-point safety philosophy, which includes the tenets that all injuries are preventable; that management is responsible and accountable for preventing injury; that safety must be integrated as a core business and personal value; and that deficiencies must be corrected promptly. In 1994, Conoco Refining, a subsidiary, reported only 1.92 work-loss days per 200,000 hours of exposure. In 1998,
this rate was further reduced to 0.39. Some of DuPont's plants with more than 2,000 employees have operated for more than 10 years without a lost-time injury, and one plant producing glycolic acid celebrated 50 years without a lost workday.7 DuPont credits its safety record, at least in part, to its implementation of a nonpunitive system that encourages employees to report near-miss incidents without fear of sanctions or disciplinary measures, and to its objective of creating an all-pervasive, ever-present awareness of the need to do things safely.8,9
Another industry example is Alcoa, which is involved in mining, refining, smelting, fabricating, and recycling aluminum and other materials. Alcoa uses a worldwide on-line safety data system to track incidents, analyze their causes, and share preventive actions throughout all of its holdings. One of its principles is that all incidents, including illnesses, injuries, spills, and excursions, can be prevented whether they are immediate, latent, or cumulative. Although Alcoa reduced its international lost-workday rate per 200,000 hours worked from 1.87 in 1987 to 0.42 in 1997, it has recently gone even further and announced a plan to eliminate fatalities and reduce the average injury rate by 50 percent by the end of the year 2000.10
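The normalization behind these figures can be made concrete with a small sketch. This is not Alcoa's or DuPont's actual system; the incident fields, function names, and sample numbers below are illustrative assumptions only.

```python
# A minimal sketch of an organization-wide incident log and the
# lost-workday rate normalization (cases per 200,000 hours worked)
# cited in the DuPont and Alcoa figures above.
from dataclasses import dataclass

@dataclass
class Incident:
    site: str
    category: str        # e.g., "injury", "illness", "spill"
    lost_workdays: int

def lost_workday_rate(incidents, hours_worked, per_hours=200_000):
    """Lost-time cases per `per_hours` hours worked."""
    lost_time_cases = sum(1 for i in incidents if i.lost_workdays > 0)
    return lost_time_cases * per_hours / hours_worked

incidents = [
    Incident("smelter-A", "injury", 3),
    Incident("smelter-A", "spill", 0),
    Incident("refinery-B", "injury", 1),
]
# Two lost-time cases over 1,000,000 hours worked
rate = lost_workday_rate(incidents, hours_worked=1_000_000)
print(rate)  # 0.4
```

A shared log of this kind is what allows a company to track its own performance over time and compare itself to others, as the examples above describe.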
Several aspects of these two examples are striking. In comparison to the health care industry, DuPont, Alcoa, and others systematically collect and analyze data about accidents. They have been tracking their own performance over time and are able to compare themselves to others in their industries. They are willing to publish their results as information to which stockholders and employees are entitled and as a source of pride, and their efforts have achieved extremely low and continuously decreasing levels of injury. The importance of a strong culture of safety, as nurtured by both DuPont and Alcoa, is viewed by many in the safety field as being the most critical underlying feature of their accomplishments.
U.S. Navy: Aircraft Carriers
People are quick to point out that health care is very different from a manufacturing process, mostly because of the huge variability in patients and circumstances, the need to adapt processes quickly, the rapidly changing knowledge base, and the importance of highly trained professionals who must use expert judgment in dynamic settings. Although an aircraft carrier is not a biological system, the performance of its crews and flight personnel provides an example whose features are closer to those of health care environments than manufacturing is.
On an aircraft carrier, fueling aircraft and loading munitions are examples of the risks posed when performing incompatible activities in close proximity. On the flight deck, 100 to 200 people fuel, load munitions, and maintain aircraft that take off and are recovered at 48- to 60-second intervals. The ability to keep these activities separate requires considerable organizational skill and extensive ongoing training to avoid serious injury to flight and nonflight personnel, the aircraft, and the ship. Despite extremely dangerous working conditions and restricted space, the Navy's "crunch rate" aboard aircraft carriers in 1989 was only 1 per 8,000 moves, which makes the carrier a highly reliable, though complex, social organization.*
Students of accident theory emphasize how the interactive complexity of an organization using hazardous technologies seems to defy efforts of system designers and operators to prevent accidents and ensure reliability. In part, this is because individuals are fallible, and in part because unlikely and rare (and thus unanticipated) failures in one area are linked in complex systems and may have surprising effects in other systems—the tighter the "coupling," generally, the more likely it is that failure in one part will affect the reliability of the whole system. Nevertheless, even in such systems, great consistency is achievable using four strategies in particular: the prioritization of safety as a goal; high levels of redundancy; the development of a safety culture that involves continuous operational training; and high-level organizational learning.11
Weick and Roberts12 have studied peacetime flight operations on aircraft carriers as an example of organizational performance requiring nearly continuous operational reliability despite complex patterns of interrelated activities among many people. These activities cannot be fully mapped out beforehand because of changes in weather (e.g., wind direction and strength), sea conditions, time of day and visibility, the arrival of returning aircraft, and so forth. Yet, surprisingly, these generally mapped-out sequences can be carried out with very high reliability in novel situations through improvisation and adaptation by personnel who are highly trained but not highly educated.
*A crunch occurs when two aircraft touch while being moved, either on the flight or hangar deck, even if damage is averted.
Naval commanders stress the high priority of safety. They understand the importance of a safety culture and use redundancy (both technical and personnel) and continuous training to prepare for the unexpected. The Navy also understands the need for direct communication and adaptability. Because errors can arise from a lack of direct communication, the ship's control tower communicates directly with each division over multiple channels.
As in health care, it is not possible in such dynamic settings to anticipate and write a rule for every circumstance. Once-rigid orders that prescribed how to perform each operation have been replaced by more flexible, less hierarchical methods. For example, although the captain's commands usually take precedence, junior officers can, and do, change these priorities when they believe that following an order will risk the crew's safety. Such an example demonstrates that even in technologically sophisticated, hazardous, and unpredictable environments it is possible to foster real-time problem solving and to institute safety systems that incorporate a knowledge of human factors.
In summary, efforts such as those described in the three examples have resulted neither in stifled innovation nor in loss of competitive benefit; nor have they resulted in unmanageable legal consequences. Rather, they are a source of corporate and employee pride. Characteristics that distinguish successful efforts in other industries include the ability to collect data on errors and incidents within the organization in order to identify opportunities for improvement and to track progress. The companies make these data available to outsiders. Other notable features of these efforts include the importance of leadership and the development of a safety culture, the use of sophisticated methods for the analysis of complex processes, and a striving for balance between standardization where appropriate and the freedom for individuals to solve problems creatively.
Key Safety Design Concepts
Designing safe systems requires an understanding of the sources of errors and how to use safety design concepts to minimize these errors or allow detection before harm occurs. This field is described in greater detail in Chapter 3 which includes an error taxonomy first proposed by Rasmussen13 and elaborated by Reason14 to distinguish among errors arising from (1) skill-based slips and lapses; (2) rule-based errors; and (3) knowledge-based mistakes.
Leape has simplified this taxonomy to describe what he calls "the pathophysiology of error." He differentiates between the cognitive mechanisms
used when people are engaging in well-known, oft-repeated processes and their cognitive processes when problem solving. The former are handled rapidly, effortlessly, in parallel with other tasks, and with little direct attention. Errors may occur because of interruptions, fatigue, time pressure, anger, anxiety, fear, or boredom. Errors of this sort are expectable, but conditions of work can make them less likely. For example, work activities should not rely on weak aspects of human cognition such as short-term memory. Safe design, therefore, avoids reliance on memory.
Problem-solving processes, by contrast, are slower, are done sequentially (rather than in parallel with other tasks), are perceived as more difficult, and require conscious attention. Errors are due to misinterpretation of the problem that must be solved, lack of knowledge to bring to bear, and habits of thought that cause us to see what we expect to see. Attention to safe design includes simplification of processes so that users who are unfamiliar with them can understand quickly how to proceed, training that simulates problems, and practice in recovery from these problems.
As described in Chapter 3, instances of patient harm are usually attributed to individuals "at the sharp end" who make the visible error. Preventing such harm, however, requires systems that are designed for safety—that is, systems in which the sources of human error have been systematically recognized and minimized.15,16
In recent years, students of system design have looked for ways to avoid error using what has been called by Donald Norman17 "user-centered design." This chapter draws on six strategies that Norman outlines. They are directed at the design of individual devices so that they can be used reliably and safely for their intended purposes. Although these strategies are aimed at the human-machine interface, they can also be usefully applied to processes of care.
The first strategy is to make things visible—including the conceptual model of the system—so that the user can determine what actions are possible at any moment—for example, how to turn off a piece of equipment, how to change settings, and what is likely to happen if a step in a process is skipped. The second strategy is to simplify the structure of tasks so as to minimize the load on working memory, planning, or problem solving.
A third strategy is what Norman calls the use of affordances and natural mappings. An affordance is a characteristic of equipment or workspace that communicates how it is to be used, such as a push bar on an outward opening door that indicates where to push. Another example is a telephone handset that is uncomfortable to hold in any position but the correct one.
Natural mapping refers to the relationship between a control and its movement; for example, in steering a car to the right, one turns the wheel right. Natural mapping takes advantage of physical analogies and cultural knowledge to help users understand how to control devices. Other examples of natural mapping are arranging light switches in the same pattern as lights in a lecture room; arranging knobs to match the arrangement of burners on a stove; or using louder sound, an increasingly brighter indicator light, or a wedge shape to indicate a greater amount.
A fourth important strategy is the use of constraints or "forcing functions" to guide the user to the next appropriate action or decision. A constraint makes it hard to do the wrong thing; a forcing function makes it impossible. A classic example of a forcing function is that one cannot start a car that is in gear.
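The distinction between a constraint and a forcing function can be sketched in code. The following is a hypothetical fragment of a medication-order entry routine, not any real system; the drug name, dose ceiling, and function names are assumptions for illustration.

```python
# Illustrative sketch: a constraint makes the wrong action hard,
# a forcing function makes it impossible.
MAX_SINGLE_DOSE_MG = {"morphine": 30}   # hypothetical hard ceiling

def submit_order(drug, dose_mg, allergies, acknowledged_warning=False):
    ceiling = MAX_SINGLE_DOSE_MG.get(drug)
    # Forcing function: an order above the hard ceiling cannot be
    # entered at all; the system refuses to proceed.
    if ceiling is not None and dose_mg > ceiling:
        raise ValueError(f"{drug} {dose_mg} mg exceeds the {ceiling} mg maximum")
    # Constraint: a documented allergy requires an explicit
    # acknowledgment, making the risky order hard (but not
    # impossible) to place.
    if drug in allergies and not acknowledged_warning:
        print(f"WARNING: patient has a documented {drug} allergy")
        return None
    return {"drug": drug, "dose_mg": dose_mg}

order = submit_order("morphine", 10, allergies=set())
```

The car-in-gear example works the same way: the interlock does not warn the driver, it simply makes starting impossible until the precondition is met.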
Norman's fifth strategy is to assume that errors will occur and to design and plan for recovery by making it easy to reverse operations and hard to carry out nonreversible ones. An example is the Windows® computer operating system that asks if the user really intends to delete a file, and if so, puts it in a "recycle" folder so that it can still be retrieved.
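This design-for-recovery idea can be sketched in a few lines: deletion asks for confirmation and is staged through a recoverable area rather than performed irreversibly. The class and method names below are illustrative assumptions, not any real operating system's interface.

```python
# Minimal sketch of reversible deletion: confirm first, then move
# the item to a recycle area from which it can still be restored.
class FileStore:
    def __init__(self):
        self.files = {}        # name -> contents
        self.recycle_bin = {}  # deleted files remain retrievable

    def delete(self, name, confirmed=False):
        if not confirmed:                  # ask before acting
            return False
        # Reversible stage: the file leaves view but is not destroyed.
        self.recycle_bin[name] = self.files.pop(name)
        return True

    def restore(self, name):               # the recovery path
        self.files[name] = self.recycle_bin.pop(name)

store = FileStore()
store.files["notes.txt"] = "draft"
store.delete("notes.txt", confirmed=True)
store.restore("notes.txt")                 # the mistaken deletion is undone
print("notes.txt" in store.files)          # True
```

The key property is that the error (a mistaken deletion) is assumed to be inevitable, and the system is arranged so that it is cheap to undo.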
Finally, Norman advises that if applying the earlier strategies does not achieve the desired results, designers should standardize actions, outcomes, layouts, and displays. An example of standardization is the use of protocols for chemotherapy. An example of simplification is reducing the number of dose strengths of morphine in stock.
Safety systems can be both local and organization wide. Local systems are implemented at the level of a small work group—a department, a unit, or a team of health care practitioners. Such local safety systems should be supported by, and consistent with, organization-wide safety systems.
Anesthesiology is an example of a local, but complex, high-risk, dynamic patient care system in which there has been notably reduced error. Responding to rising malpractice premiums in the mid-1980s, anesthesiologists confronted the safety issues presented by the need for continuing vigilance during long operations but punctuated by the need for rapid problem evaluation and action. They were faced with a heterogeneity of design in anesthesia devices; fatigue and sleep deprivation; and competing institutional, professional, and patient care priorities. By a combination of technological advances (most notably the pulse oximeter), standardization of equipment, and changes in training, they were able to bring about major, sustained, widespread reduction in morbidity and mortality attributable to the administration of anesthesia.18
Organization-wide systems, on the other hand, are implemented and monitored at the level of a health care organization. These include programs and processes that cross departmental lines and units. In hospitals, infection control and medication administration are examples of organization-wide systems that encompass externally imposed regulations, institutional policies and procedures, and the actions of individuals who must provide potentially toxic materials at the right time to the right patient.
Principles for the Design of Safety Systems in Health Care Organizations
Hospitals and other institutions have long-standing efforts to ensure patient safety in a variety of areas. Appendix E provides an overview of some of these efforts in hospitals. Some have been very effective in certain units or certain hospitals. These activities have not, however, succeeded in eliminating error or injury, and they have not been part of national or even institution-wide, high-priority efforts.
In out-of-hospital care—whether in institutions, homes, medical offices, or other settings—both the knowledge of the kind and magnitude of errors and the development of safety systems are rudimentary compared with hospital care. Safety tends to be addressed narrowly, through reliance on education and training, policies, and procedures. There are undoubtedly many reasons for this lack of attention to safety, including: small staff size; lack of technical knowledge of effective ways to improve quality, or of an infrastructure to support deploying this knowledge; lack of recognition of error (because the harm is removed in time or space from the error and because individuals are unharmed); lack of data systems to track and learn from error (most studies of adverse drug events use emergency visits or hospital admissions to establish a denominator); the speed of change and the introduction of new technologies; and, clearly, the same cultural barriers that exist in hospitals—namely, the high premium placed on medical autonomy and perfection and a historical lack of interprofessional cooperation and effective communication.
With the rise in outpatient and office-based surgery, attention is turning to anesthesia safety in settings such as private physician, dental, and podiatry offices. For example, guidelines for patient assessment, sedation, monitoring, personnel, emergency care, discharge evaluation, maintenance of equipment, infection control, and the like have been developed by an ad hoc committee for New York State practitioners.19
After reviewing what has been learned from other high-risk industries as well as the evidence of practices that can improve health care safety, the committee has identified a set of five principles that it believes can be usefully applied to the design of safe health care, whether in a small group practice, a hospital, or a large health care system. These principles are: (1) providing leadership; (2) respecting human limits in the design process; (3) promoting effective team functioning; (4) anticipating the unexpected; and (5) creating a learning environment.
Principle 1. Provide Leadership
• Make patient safety a priority corporate objective.
• Make patient safety everyone's responsibility.
• Make clear assignments for and expectations of safety oversight.
• Provide human and financial resources for error analysis and systems redesign.
• Develop effective mechanisms for identifying and dealing with unsafe practitioners.
Make Patient Safety a Priority Corporate Objective
The health care organization must develop a culture of safety such that an organization's design processes and workforce are focused on a clear goal—dramatic improvement in the reliability and safety of the care process. The committee believes safety must be an explicit organizational goal that is demonstrated by clear organizational leadership and professional support as seen by the involvement of governing boards, management, and clinical leadership. This process begins when boards of directors demonstrate their commitment to this objective by regular, close oversight of the safety of the institutions they shepherd.
Reviews of progress in reaching goals and system design should be repeated, detailed, quantitative, and demanding. Ways to implement this at the executive level include frequent reports highlighting safety improvement and staff involvement, regular reviews of safety systems, "walk-throughs" to evaluate hazardous areas and designs, incorporation of safety improvement goals into annual business plans, and providing support for sensible forms of simplification.
Recommendations 5.1 and 7.1 also address institutional accountability for safety. Recommendation 5.1 calls for mandatory reporting of serious adverse events by health care organizations. Recommendation 7.1 urges regu-
lators to focus greater attention on patient safety by requiring health care organizations to implement meaningful patient safety programs with defined executive responsibility and for public and private purchasers to provide incentives to health care organizations to demonstrate continuous improvement in patient safety.
Make Patient Safety Everyone's Responsibility
Messages about safety must signal that it is a serious priority of the institution, that there will be increased analysis of system issues with awareness of their complexity, and that these messages are backed by nonpunitive solutions that encourage the involvement of the entire staff. The messages must be well conceived, repeated, and consistent across health care systems, and should stress that safety problems are quality problems. Establishing and clearly conveying such aims are essential in creating safety systems.
All organizations must allocate resources to both production and safety. Although these goals are compatible in the long run, they often conflict in the short run, which results in considerable tension. Health care institutions must be both accountable to the public for safety and able to address error and improve their performance without unreasonable fear of the threat of civil liability. This, too, creates tension between ensuring the transparency that allows institutions to be viewed publicly as trustworthy and the confidence that their workers need in order to identify and address error without fear of formal or informal reprisal.
The committee recommends that health care professionals as well as health care organizations make safety a specific aim. Many, if not most, physicians in community practice view organizations such as hospitals primarily as platforms for their work and do not see themselves as being part of these larger organizations. Nevertheless, their participation in the safety efforts of these organizations is crucial. Health care practitioners should seek to affiliate themselves with organizations that embrace such aims, whether the organizations are hospitals, managed care organizations, medical societies, medical practice groups, or other entities. Rather than treating each error and hazard as a unique, surprising, separate, and sometimes tragic event, they should view the entire organization as a safety system and the search for improved safety and its associated design principles as a lifelong, shared journey.20 Health professionals should also participate in new efforts that may be undertaken by groups such as a medical practice and the professional groups to which they belong.
Make Clear Assignments and Set Expectations for Safety
Health care organizations should establish meaningful patient safety programs with defined executive responsibility that support strong, clear, visible attention to safety. Most hospitals have safety programs for workers, as required by the Occupational Safety and Health Administration (OSHA), but few have patient safety programs. The committee emphasizes that by health care organizations, it intends such safety programs to be established not only by hospitals, but also by other organizations, including managed care organizations and the delivery sites with which they contract. Other industries have found that improvements in safety do not occur unless there are both a commitment by top management and an overt, clearly defined, and continuing effort on the part of all personnel, workers, and managers. As with any other program, a meaningful safety program should include senior-level leadership; defined program objectives and plans; personnel and budget; the collection and analysis of data; and monitoring through regular progress reports to the executive committee and board of directors. Although safety can never be delegated, there should be clear accountability for safety, a budget, a defined program, and regular reporting to the board.
Provide Human and Financial Resources for Error Analysis and Systems Redesign
Responsibility for management and improvement of risky systems (e.g., medication) as a whole should be clearly located in individuals or cross-functional, cross-departmental teams given the time to discharge this duty. Typically, individuals or departments "own" pieces of the medication system, but as a rule, no one manages the medication system as a whole. Oversight of a hospital's medication system as a whole, including its safety and improvement, might be placed under a single clinician, with 50 percent or more of his or her time devoted to this role.
In managed care organizations, quality improvement activities, whether or not developed by accreditation bodies, should focus on patient safety activities and set an expectation of major improvements in safety. Although data from ambulatory settings are very limited, the committee believes that such improvement could be on the order of a 50 percent reduction in errors in hospital environments, and that errors could be greatly reduced in outpatient settings as well.
Develop Effective Mechanisms for Identifying and Dealing with Unsafe Practitioners
Although almost all accidents result from human error, it is now recognized that these errors are usually induced by faulty systems that "set people up" to fail. Correction of these systems failures is the key to safe performance of individuals. Systems design—how an organization works, its processes and procedures—is an institutional responsibility. Only the institution can redesign its systems for safety; the great majority of effort in improving safety should focus on safe systems, and the health care organization itself should be held responsible for safety.
The committee recognizes, however, that some individuals may be incompetent, impaired, uncaring, or may even have criminal intent. The public needs dependable assurance that such individuals will be dealt with effectively and prevented from harming patients. Although they represent a small proportion of health care workers, such individuals are unlikely to be amenable to the kinds of approaches described in detail in this chapter. Discipline by registration and licensure boards is appropriately reserved for those rare individuals identified by organizations as a threat to patient safety, whom organizations are already required by state law to report.
Historically, the health system has not had effective ways of dealing with dangerous, reckless, or incompetent individuals and ensuring they do not harm patients. Although the health professions have a long history of work in this area, current systems do not, as a whole, work reliably or promptly. The lack of timeliness has been a special problem. Numerous reasons have been advanced for the lack of more timely and effective response by professions and institutions. Requirements posed by legal due process can be very slow and uncertain; the need for, but difficulty in arranging, excellent supervision has stymied efforts at retraining; and matching individual needs to adult learning principles and retraining that is tailored to specific deficits has been problematic. With this acknowledged, the committee believes that health care organizations should use and rely on proficiency-based credentialing and privileging to identify, retrain, remove, or redirect physicians, nurses, pharmacists, or others who cannot competently perform their responsibilities. With effective safety systems in place, the committee believes it will be easier for those within organizations to identify and act on information about such individuals. If these systems are working properly, unsafe professionals will be identified and dealt with before they cause serious patient injury.
Principle 2. Respect Human Limits in Process Design
• Design jobs for safety.
• Avoid reliance on memory.
• Use constraints and forcing functions.
• Avoid reliance on vigilance.
• Simplify key processes.
• Standardize work processes.
Human beings have many intellectual strengths, such as their large memory capacity; a large repertory of responses; flexibility in applying these responses to information inputs; and an ability to react creatively and effectively to the unexpected. However, human beings also have well-known limitations, including difficulty in attending carefully to several things at once, difficulty in recalling detailed information quickly, and generally poor computational ability.21 Respecting human abilities involves recognizing the strengths of human beings as problem solvers, but minimizing reliance on weaker traits. Several strategies are particularly important when considering such human factors: designing jobs for safety; avoiding reliance on memory and vigilance; using constraints and forcing functions; and simplifying and standardizing key processes.
Design Jobs for Safety
Designing jobs with attention to human factors means attending to the effect of work hours, workloads, staffing ratios, sources of distraction, and inversions in assigned shifts (which affect workers' circadian rhythms) and their relationship to fatigue, alertness, and sleep deprivation. Designing jobs to minimize distraction may, for example, mean setting aside times, places, or personnel for specific tasks such as calculating doses or mixing intravenous solutions. Designing jobs for safety also means addressing staff training needs and anticipating harm that may accompany downsizing, staff turnover, and the use of part-time workers and "floats" who may be unfamiliar with equipment and processes in a given patient care unit. To the extent that the barriers presented by departmental affiliation and disciplinary training prevent caregivers from working cooperatively and developing new safety systems, job design requires attention not only to the work of the individual but also to the work and training of multidisciplinary teams.
Avoid Reliance on Memory
Health care organizations should use protocols and checklists wisely and whenever appropriate. Examples of the sensible design and use of protocols and checklists include ensuring their routine updating and constructing checklists so that the usual state is answered "yes." Protocols for the use of heparin and insulin, for example, have been developed by many hospitals.22 An Institute of Medicine report on the development of clinical guidelines suggests features for assessing guidelines that address both their substance and their process of development. Examples of attributes concerning the substance of guidelines are their validity and clinical applicability. Examples of attributes concerning the process of development are its clarity and its documentation of the strength of the evidence.23
For medications, ways to reduce reliance on memory are the use of drug-drug interaction checking software and dosing cards (e.g., laminated cards that can be posted at nursing stations or carried in the pocket) that include standard order times, doses of antibiotics, formulas for calculating pediatric doses, and common chemotherapy protocols.24
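The interaction-checking idea can be sketched minimally; the drug pairs, warning messages, and function names below are hypothetical illustrations, not clinical data:

```python
# Minimal sketch of drug-drug interaction checking (hypothetical data;
# a real system would draw on a maintained clinical knowledge base).

# Known interacting pairs, stored order-independently as frozensets.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"sildenafil", "nitroglycerin"}): "severe hypotension",
}

def check_interactions(current_meds, new_drug):
    """Return warnings for any known interaction between the new
    drug and the patient's current medication list."""
    warnings = []
    for med in current_meds:
        note = INTERACTIONS.get(frozenset({med, new_drug}))
        if note:
            warnings.append(f"{new_drug} + {med}: {note}")
    return warnings
```

Because such a lookup replaces recall from memory with a table consultation, it works only as well as the knowledge base is maintained, which is why routine updating (discussed above for protocols) applies here as well.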
The caution to use protocols wisely derives from the need to generalize and simplify while recognizing that not all steps of a protocol may be appropriate in every case. Rapid increases in knowledge and changing technology mean that a system for regularly updating protocols should be built into their production.
Use Constraints and Forcing Functions
Constraints and forcing functions are employed to guide the user to the next appropriate action or decision and to structure critical tasks so that errors cannot be made. They are important in designing defaults for devices and for processes such as diagnostic and therapeutic ordering. When a device fails, it should always default to the safest mode; for example, an infusion pump should default to shutoff, rather than free flow.
Examples of the use of constraints in ordering medications are pharmacy computers that will not fill an order unless allergy information, patient weight, and patient height are entered. Another forcing function is the use of special luer locks for syringes and indwelling lines that have to be matched before fluid can be infused. Removal of concentrated potassium chloride from patient floor stock is a (negative) forcing function.25 Less restrictive but still user-oriented approaches to design are the use of affordances and natural mappings.
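A forcing function of the pharmacy-computer kind can be sketched as a validation step that refuses to proceed; the field names and exception type are invented for illustration:

```python
# Sketch of a forcing function: the order cannot be filled until
# required safety fields are present (field names are illustrative).

REQUIRED_FIELDS = ("allergies", "weight_kg", "height_cm")

class IncompleteOrderError(Exception):
    """Raised when an order lacks required safety information."""

def fill_order(order: dict) -> str:
    missing = [f for f in REQUIRED_FIELDS if order.get(f) is None]
    if missing:
        # Constraint: refuse to proceed rather than warn and continue.
        raise IncompleteOrderError(f"missing required fields: {missing}")
    return f"order filled for {order['drug']}"
```

The design choice that makes this a forcing function rather than a mere reminder is the exception: the workflow physically cannot continue past the check, which is what distinguishes constraints from vigilance-dependent alerts.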
Avoid Reliance on Vigilance
Human factors research has taught us that individuals cannot remain vigilant for long periods during which little happens that requires their action, and it is unreasonable to expect them to do so. Health care has many examples of automation used to reduce reliance on vigilance: using robotic dispensing systems in the pharmacy and infusion pumps that regulate the flow of intravenous fluids. Although automation is intended to reduce the need for vigilance, there are also pitfalls in relying on automation if a user learns to ignore alarms that are often wrong or becomes inattentive or inexpert in a given process, or if the effects of errors remain invisible until it is too late to correct them. Well-designed pumps give information about the reason for an alarm, have moderate sensitivity, and prevent free flow when the unit is turned off or fails.
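The fail-safe pump behavior described here (default to shutoff, report the reason for an alarm) can be sketched as a small state model; the class, states, and fault reasons are assumptions for illustration, not a real device interface:

```python
# Sketch of fail-safe infusion pump behavior: on any fault the pump
# defaults to shutoff rather than free flow, and records *why* the
# alarm fired so the user is not left guessing (states illustrative).

class InfusionPump:
    def __init__(self):
        self.state = "off"
        self.alarm_reason = None

    def start(self, rate_ml_per_hr: float):
        if rate_ml_per_hr <= 0:
            self.fault("invalid flow rate")
            return
        self.state = "running"

    def fault(self, reason: str):
        # Safe default: shut off and explain, rather than continuing
        # in an undefined (free-flow) mode.
        self.state = "off"
        self.alarm_reason = reason
```

The essential property is that every failure path converges on the safest state; there is no code path that leaves the pump flowing after a fault.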
Other approaches for accommodating the need for vigilance have been developed. These include providing checklists and requiring their use at regular intervals, limiting long shifts, and rotating staff who must perform repetitive functions.26
Simplify Key Processes
Simplifying key processes can minimize problem solving and greatly reduce the likelihood of error. Simplifying includes reducing the number of handoffs required for a process to be completed (e.g., decreasing multiple order and data entry). Examples of processes that can usually be simplified are writing an order and then transcribing and entering it into a computer, or having several people record and enter the same data in different databases. Other examples of simplification include limiting the choice of drugs available in the pharmacy, limiting the number of dose strengths, maintaining an inventory of frequently prepared drugs, reducing the number of times per day a drug is administered, keeping a single medication administration record, automating dispensing, and purchasing equipment that is easy to use and maintain.27
Standardize Work Processes
Standardization reduces reliance on memory. It also allows newcomers who are unfamiliar with a given process or device to use it safely. In general, standardizing device displays (e.g., readout units), operations (e.g., location of the on-off switch), and doses is important to reduce the likelihood of
error. Examples of standardization include not stocking look-alike products; using standard order forms, administration times, and prescribing conventions; following protocols for complex medication administration; reducing the number of available dose strengths and of drug administration times; and standardizing the placement of supplies and medications and the types of equipment.28
Sometimes devices or medications cannot be standardized. When variation is unavoidable, the principle followed should be to differentiate clearly. An example is to identify look-alike, but different, strengths of a narcotic by labeling the higher concentration with bright orange tape.
Principle 3. Promote Effective Team Functioning
• Train in teams those who are expected to work in teams.
• Include the patient in safety design and the process of care.
Train in Teams Those Who Are Expected to Work in Teams
People work together in small groups throughout health care, whether in a multispecialty group practice, in interdisciplinary teams assembled for the care of a specific clinical condition (e.g., teams that care for children with congenital problems, oncology teams, end-of-life care), in operating rooms, or in ICUs. However, members of the team are typically trained in separate disciplines and educational programs. They may not appreciate each other's strengths or recognize weaknesses except in crises, and they may not have been trained together to use new or well-established technologies.
The committee believes that health care organizations should establish team training programs for personnel in critical care areas (e.g., the emergency department, intensive care unit, operating room) using proven methods such as the crew resource management techniques employed in aviation, including simulation. People make fewer errors when they work in teams. When processes are planned and standardized, each member knows his or her responsibilities as well as those of teammates, and members ''look out" for one another, noticing errors before they cause an accident. In an effective interdisciplinary team, members come to trust one another's judgments and attend to one another's safety concerns.
The risk in adopting such training from fields such as aviation lies in borrowing these training technologies too literally. Although the team issues associated with performance in aviation have strong parallels in medical settings, effective training must be based not on wholesale transfer of the technologies but on adapting them to the practices and personnel of the new setting.
Include the Patient in Safety Design and the Process of Care
The members of a team are more than the health care practitioners. A team includes the practitioners, patients, and technologies used for the care of these patients. Whenever possible, patients should be a part of the care process. This includes attention to their preferences and values, their own knowledge of their condition, and the kinds of treatments (including medications) they are receiving. Patients should also have information about the technologies that are used in their care, whether for testing, as an adjunct to therapy, or to provide patient information. Examples of ways to share such information with patients include reviewing with patients a list of their medications, doses, and times to take them; how long to take them; and precautions about interactions with alternative therapies or with alcohol, possible side effects, and any activities that should be avoided such as driving or the use of machinery. Patients should also receive a clearly written list of their medications and instructions for use that they can keep and share with other clinicians.29
Principle 4. Anticipate the Unexpected
• Adopt a proactive approach: examine processes of care for threats to safety and redesign them before accidents occur.
• Design for recovery.
• Improve access to accurate, timely information.
Adopt a Proactive Approach: Examine Processes of Care for Threats to Safety and Redesign Them Before Accidents Occur
Technology is ubiquitous in acute care, long-term care, ambulatory surgical centers, and home care. The value of automating repetitive, time-consuming, and error-prone tasks has long been understood and embraced in health care. The increasing use of technologies goes well beyond bedside or operating room devices. It includes emerging technologies that range from molecular, cellular, genetic, and pharmaceutical interventions; to patient-administered technologies (e.g., prescribed medications, monitors, patient-controlled analgesia); to robotic and remote technologies such as remote ICU and telemedicine, Internet-based systems, and expert systems.30,31,32,33
At the same time, the human-machine interface is a focus of much preventive effort. Indeed, many technologies are engineered not only for safe operation in the care process, but specifically for the purpose of preventing error. Such technologies include automated order entry systems; pharmacy software to alert about drug interactions; and decision support systems such as reminders, alerts, and expert systems.
Health care organizations should expect any new technology to introduce new sources of error and should adopt the custom of automating cautiously, alert to the possibility of unintended harm. Despite the best intentions of designers, the committee emphasizes that all technology introduces new errors, even when its sole purpose is to prevent errors. Therefore, as change occurs, health systems should anticipate trouble. Indeed, Cook emphasizes that future failures cannot be forestalled simply by providing another layer of defense against failure.34 Rather, safe equipment design and use depend on a chain of involvement and commitment that begins with the manufacturer and continues with careful attention to the vulnerabilities of a new device or system. Prevention requires the continuous redesign and implementation of safe systems to make error increasingly less likely, for example:
• using order entry systems that provide real-time alerts if a medication order is out of range for weight or age, or is contraindicated;
• using bar coding for positive identification and detection of misidentified patients, records, and so forth;
• using "hear back" for oral orders and instructions—for example, having a pharmacist repeat a phoned-in prescription to the caller; and
• monitoring vital signs, blood levels, and other laboratory values for patients receiving hazardous drugs.
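The first item in the list above, real-time range checking, might be sketched as follows; the drug name and mg-per-kg limit are arbitrary illustrative values, not clinical guidance:

```python
# Sketch of a weight-based dose range alert (the mg-per-kg limit below
# is an arbitrary illustrative number, not clinical guidance).

MAX_DOSE_PER_KG = {"exampledrug": 10.0}  # mg per kg, hypothetical

def check_dose(drug: str, dose_mg: float, weight_kg: float):
    """Return an alert string if the ordered dose exceeds the
    weight-adjusted maximum, or None if it is in range."""
    limit = MAX_DOSE_PER_KG.get(drug)
    if limit is None:
        return None  # no rule on file for this drug
    max_dose = limit * weight_kg
    if dose_mg > max_dose:
        return (f"ALERT: {dose_mg} mg of {drug} exceeds the "
                f"{max_dose:.0f} mg maximum for a {weight_kg} kg patient")
    return None
```

Note that, unlike the forcing function described under Principle 2, this check merely alerts; whether an out-of-range order should be blocked outright or overridable with documented justification is itself a safety design decision.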
Double-checking for particularly vulnerable parts of the system is another approach to preventing patient injury. One approach could be the use of tiger teams. The military phrase tiger team originated with a group whose purpose is to penetrate security and test security measures. Professional tiger teams are now used to test corporate systems for vulnerability, particularly to hackers. The idea of using teams with sophisticated knowledge of technical systems to test and anticipate the ways health systems can go wrong could well be adopted by health care organizations.
The effects on patient safety, as well as on business outcomes, should be anticipated when reorganizations, mergers, and other organization-wide changes in staffing, responsibilities, workloads, and relationships among caregivers result in new patterns of care. Such major changes often have safety implications that can be anticipated and tracked.
Design for Recovery
Prevention is one way to reduce error, but once the error rate and the transmission of the error to patients become very small, incremental gains are increasingly difficult to achieve. Another approach is to work on the processes of recovery when an error occurs. Designing for recovery means making errors visible, making it easy to reverse operations and hard to carry out nonreversible ones, duplicating critical functions or equipment as necessary to detect error, and intercepting error before harm occurs. Although errors cannot be reduced to zero, we should strive to reduce to zero the instances in which error harms a patient. A reliable system has procedures and attributes that make errors visible to those working in the system so that they can be corrected before causing harm.
Examples of procedures to mitigate injury are
• keeping antidotes for high-risk drugs up-to-date and easily accessible;
• having procedures in place for responding quickly to adverse events, such that these processes are standardized across units and personnel are provided with drills to familiarize them with the procedures and the actions each person should take;
• equipment that defaults to the least harmful mode in a crisis; and
• simulation training.
Simulation training, listed above, merits elaboration as a way both to prevent and to mitigate harm. Simulation is a training and feedback method in which learners practice tasks and processes in lifelike circumstances using models or virtual reality, with feedback from observers, other team members, and video cameras to assist improvement of skills.35 Simulation for modeling crisis management (e.g., when a patient goes into anaphylactic shock or a piece of equipment fails) is sometimes called "crew resource management," an analogy with airline cockpit crew simulation.36,37,38,39,40,41 Such an approach carries forward the tradition of disaster drills in which organizations have long participated. In such simulation, small groups that work together—whether in the
operating room, intensive care unit, or emergency department—learn to respond to a crisis in an efficient, effective, and coordinated manner.
In the case of the operating room (OR) this means attempting to develop simulation that involves all key players (e.g., anesthesia, surgery, nursing) because many problems occur at the interface between disciplines.42 Although a full OR simulator has been in operation for some years at the University of Basel (Switzerland), the range of surgical procedures that can be simulated is limited. It will be a great challenge to develop simulation technology and simulators that will allow full, interdisciplinary teams to practice interpersonal and technical skills in a non-jeopardy environment where they can receive meaningful feedback and reinforcement.
Improve Access to Accurate, Timely Information
Information about the patient, medications, and other therapies, whether routinely or rarely used, should be available at the point of patient care. Examples of ways to make such information available include the following:
• Have a pharmacist available on nursing units and on rounds.
• Use computerized lab data that alert clinicians to abnormal lab values.
• Place lab reports and medication administration records at the patient's bedside.
• Place protocols in the patient's chart.
• Color-code wristbands to alert of allergies.
• Track errors and near misses and report them regularly.
• Accelerate laboratory turnaround time.
Organizations can improve access to up-to-date information about infrequently used drugs by distributing newsletters and drug summary sheets, and by ensuring access to Internet-based web sites, the Physicians' Desk Reference, formularies, and other resources for ordering, dispensing, and administering medications.
Clearly, any discussion of the availability of accurate, timely information for patient care must stress the need for electronic databases and interfaces to allow them to be fully integrated, and the committee underscores the need for data standards and the development of integrated computer-based databases and knowledge servers.
Health care organizations should join other groups in contributing to the development of standardized data sets for patient records. Uniform standards for connectivity, terminology, and data sharing are critical if the creation and maintenance of health care databases are to be efficient and their information is to be accurate and complete. National standards for the protection of data confidentiality are also needed. The committee urges that health care organizations join payers, vendors, quasi-public standard-setting bodies (such as the National Institute of Standards and Technology (NIST) and American National Standards Institute (ANSI)), federal agencies, and advisory groups in working to facilitate standards-setting efforts and otherwise become full participants in the multidisciplinary effort that is now under way.
Despite having been "almost here" for 45 years, the computer-based patient record has still not arrived. Its advantages are clear: computer-based patient records and other systems give physicians and other authorized personnel the ability to access patient data without delay at any time and in any place (e.g., in an emergency or when the patient is away from home); to ensure that services are obtained and to track outcomes of treatment; and to aggregate data from large numbers of patients, both to measure outcomes of treatment and to recognize promptly the complications of new drugs, devices, and treatments.43
The committee also believes that organizations, individually and in collaboration, must commit to using information technology to manage their knowledge bases and processes of care. Doing so will require integrating systems that are patient specific, that allow population-based analyses, and that manage the care process through reminders, decision support, and guidance grounded in evidence-based knowledge.
Principle 5. Create a Learning Environment
• Use simulations whenever possible.
• Encourage reporting of errors and hazardous conditions.
• Ensure no reprisals for reporting of errors.
• Develop a working culture in which communication flows freely regardless of authority gradient.
• Implement mechanisms of feedback and learning from error.
Use Simulations Whenever Possible
As described under Principle 4, health care organizations and teaching institutions should participate in the development and use of simulation for training novice practitioners, problem solving, and crisis management, especially when new and potentially hazardous procedures and equipment are introduced. Crew resource management techniques, combined with simulation, have substantially improved aviation safety and can be modified for health care use. Early successful experience in emergency department and operating room use indicates they should be more widely applied.44
As noted, health care—particularly in dynamic settings such as operating rooms and emergency departments—involves tightly coupled systems. For this reason, crew resource management can be very valuable in reducing (though probably not eliminating) error. For such programs to achieve their potential, however, organizations need a thorough understanding of the nature of team interactions, the etiology and frequency of errors, and the culture of each organization into which the programs are introduced.
Encourage Reporting of Errors and Hazardous Conditions
The culture of a health care organization plays a critical role in how well errors are detected and handled. Medical training and the culture it instills have considerable strengths, emphasizing autonomy of action and personal responsibility. That culture has also led, however, to hierarchy and authority in decision making and to a belief that mistakes should not be made. When mistakes do occur, they are typically treated as a personal and professional failure.45 Because medical training is typically isolated from the training of other health professionals, people have not learned to work together to share authority and collaborate in problem solving. Attempting to change such a culture to accept error as normal is difficult, and accepting the occurrence of error as an opportunity to learn and improve safety is perhaps even more difficult. As noted at the beginning of this chapter, doing so requires at a minimum that members of the organization believe that safety is really a priority in their organization, that reporting will really be nonpunitive, and that improving patient safety requires fixing the system, not fixing blame. It will almost surely require changes in the way health care professionals are trained, in terms not only of their own professional work but also of how they learn to work together.
Ensure No Reprisals for Reporting of Errors
Health care organizations should establish nonpunitive environments and systems for reporting errors and accidents. The most important barrier to improving patient safety is lack of awareness of the extent to which errors occur daily in all health care organizations. It is difficult to remedy problems that you do not know exist. This lack of awareness occurs because in most cases, errors are not reported.
Studies have shown that typically less than five percent of known errors are reported, and many are unknown.46 When punishment is eliminated, reporting soars.
Important characteristics of reporting systems within organizations include that they be voluntary, have minimal restrictions on acceptable content, include descriptive accounts and stories (i.e., not be a simple checklist), be confidential, and be accessible for contributions from all clinical and administrative staff. Once submitted, reports should be de-identified as to reporter and analyzed by experts. Finally, staff should be given timely feedback on the results and on how problems will be addressed.47
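The de-identification step can be sketched very simply; the field names below are assumptions for illustration, not a standard report schema:

```python
# Sketch of de-identifying an error report before analysis: fields
# identifying the reporter are stripped, while the descriptive
# narrative (the "story") is preserved for expert review.

IDENTIFYING_FIELDS = {"reporter_name", "reporter_id", "reporter_email"}

def deidentify(report: dict) -> dict:
    """Return a copy of the report without reporter-identifying
    fields; all other content is kept intact."""
    return {k: v for k, v in report.items() if k not in IDENTIFYING_FIELDS}
```

Keeping the narrative intact while removing identity supports both goals above: analysts retain the rich account they need, and staff can trust that reporting carries no personal exposure.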
Develop a Working Culture in Which Communication Flows Freely Regardless of Authority Gradient
Organizations also have to foster a management style for dealing with error that supports the voluntary reporting and analysis of errors, so that there are no reprisals and no impediments to information flowing freely against a power gradient.
Techniques for such communication can be taught. Military and civilian aviation have taught senior pilots to respect and listen to junior colleagues, and have taught copilots and junior officers that they have the responsibility to communicate their safety concerns clearly. Superiors have the responsibility to reply to these concerns according to the "two-challenge rule," which states that if a pilot is clearly challenged twice about an unsafe situation during a flight without a satisfactory reply, the subordinate is empowered to take over the controls. During military briefings and debriefings, attendees are also expected to express their concerns about the safety aspects of an operation.
Bringing about such change in communication patterns within the health care environment, particularly in teaching environments, is without question a major undertaking that begins at least with medical residency training and nursing training. For the leaders of health care teams, it requires learning leadership behavior that encourages and expects all members of the team to internalize the need to be alert to threats to patient safety and to feel that their contributions and concerns are respected.
Implement Mechanisms of Feedback and Learning from Error
In order to learn from error, health care organizations will have to establish and maintain environments and systems for analyzing errors and accidents so that the redesign of processes is informed rather than an act of tampering. There are five important phases to improving safety. The first is the reporting of events in sufficiently rich detail to create a "story" about what occurred. The second is understanding the story in order to make its meaning clear. The third is to develop recommendations for improvement. The fourth is implementation, and the fifth is tracking the changes to learn what new safety problems may have been introduced.
Organizations should develop and maintain an ongoing process for the discovery, clarification, and incorporation of basic principles and innovations for safe design, and should use this knowledge to understand the reasons for hazardous conditions and ways to reduce these vulnerabilities. Organizations require sound, scientifically grounded theories about error and safety. They should draw on the health care industry, other industries, and research on human factors and engineering, organizational and social psychology, and cognitive psychology for useful ideas. Analysis of events leading to error should draw on this knowledge base. Organizational expertise may have to be augmented by external technical assistance, especially in small institutions without the resources to support such activities and expertise internally. Such assistance might come from academically based research centers, trade associations, and professional groups.
Research and analysis are not luxuries in the operation of safety systems. They are essential steps in the effective redesign of systems, because analysis provides the information needed for effective prevention. As safety research in other fields has taught us, when a major event occurs that results in patient harm or death, both active and latent errors were present. Investigation of active errors has focused on the individuals present and the circumstances immediately surrounding the event. However, such an explanation is often not only premature and uninformed, but usually unhelpful in preventing future events. Understanding the latent errors whose adverse consequences may lie dormant within the system requires considerable knowledge of the technical work involved and of the way organizational factors play out in that work. It also requires understanding the roles of resource limitations, conflicts, uncertainty, and complexity.
Two other ways in which organizations can improve their performance through shared learning are by benchmarking and collaboration. Benchmarking is a way to compare oneself or one's organization against the "best in class." While learning about and finding ways to implement the best practices they can identify, organizations can implement sets of practical, time-series measures that can help them learn whether the steps they have taken are improving safety.48 Organizations can also collaborate with other facilities, even within their market areas, to understand patterns of error and new approaches to prevention. For example, the New England Cardiovascular Project, the Vermont-Oxford Neonatal Network, and multisite research on the organization and delivery of care in intensive care units have demonstrated the gains that are possible from such collaborative work.49, 50
The committee strongly encourages organizations to participate in voluntary reporting systems. Chapter 5 provides descriptions of some voluntary reporting systems available in the health care industry, and the committee has recommended that voluntary reporting initiatives be encouraged and expanded.
As described in Chapter 2, a good deal of research has identified medication error as a substantial source of preventable error in hospitals. In addition, organizations and researchers have paid considerable attention to methods of preventing such errors, and there is reasonable agreement about useful approaches. For this reason, the remainder of this chapter focuses on medication administration to illustrate how the principles for creating safety systems might be applied, including the need for a systems approach. It focuses on hospitals because most of the research in this area and virtually all the data are hospital-based but recognizes that many of the strategies apply to ambulatory and other settings as well.
Errors increase with complexity. Complexity in the medication system arises from several sources: the extensive knowledge and information that are necessary to correctly prescribe a medication regimen for a particular patient; the intermingling of medications of varying hazard in the pharmacy, during transport, and on the patient care units; and the multiple tasks performed by nurses, of which medication preparation and administration are but a few. Because the burden of harm to patients is great, the
cost to society is large, and knowledge of how to prevent the most common kinds of errors is well known, the committee singles out medication safety as a high priority area for all health care organizations.
A number of practices have been shown to reduce errors in the medication process and should be in place in all hospitals and other health care organizations in which they are appropriate.51,52,53
Selected Strategies to Improve Medication Safety
• Adopt a system-oriented approach to medication error reduction.
• Implement standard processes for medication doses, dose timing, and dose scales in a given patient care unit.
• Standardize prescription writing and prescribing rules.
• Limit the number of different kinds of common equipment.
• Implement physician order entry.
• Use pharmaceutical software.
• Implement unit dosing.
• Have the central pharmacy supply high-risk intravenous medications.
• Use special procedures and written protocols for the use of high-risk medications.
• Do not store concentrated solutions of hazardous medications on patient care units.
• Ensure the availability of pharmaceutical decision support.
• Include a pharmacist during rounds of patient care units.
• Make relevant patient information available at the point of patient care.
• Improve patients' knowledge about their treatment.
Several organizations have recently focused attention on medication safety, and a number have compiled recommendations for safe medication practices, particularly in the inpatient environment. Most recently, these include the National Patient Safety Partnership,54 the Massachusetts Coalition for the Prevention of Medical Errors (1999),55 the Institute for Healthcare Improvement (1998),56 the National Coordinating Council for Medication Error Reporting and Prevention (NCCMERP), and the American Society of Health-System Pharmacists.57
As illustrated in Table 8.1, most of the groups' recommendations are consistent with one another. Although each has been implemented by a large number of hospitals, none has been universally adopted, and some are not in
place in even a majority of hospitals. Based on evidence and drawing on the principles described in this chapter, this IOM committee joins other groups in calling for implementation of proven medication safety practices as described below.
Adopt a System-Oriented Approach to Medication Error Reduction
Throughout this chapter, emphasis is put on the development of a system-oriented approach that prevents and identifies errors and minimizes patient harm from errors that do occur. It involves a cycle of anticipating problems (for example, with changes in staffing or the introduction of new technologies); adopting the five principles described; tracking and analyzing data as errors and near misses occur; and using those data to modify processes to prevent further occurrences. None of these steps is useful alone. When taken together with strong executive leadership in a nonpunitive environment and with appropriate resources, they become extremely powerful in improving safety.
Implement Standard Processes for Medication Doses, Dose Timing, and Dose Scales in a Given Patient Care Unit
One of the most powerful means of preventing errors of all kinds is to standardize processes. If doses, times, and scales are standardized, it is easier for personnel to remember them, check them, and cross-check teammates who are administering the medications.
Standardize Prescription Writing and Prescribing Rules
A host of common shortcuts in prescribing have frequently been found to cause errors. Abbreviations are the major offender because they can have more than one meaning. Other "traps" include the use of "q" (as in qid, qod, qd, qh), which is easily misread, and the use of the letter "u" for "unit." Failure to specify all of the elements of an order (form, dose, frequency, route) also leads to errors. Putting such information in computerized order entry forms can help eliminate such errors.
Limit the Number of Different Kinds of Common Equipment
Simplification—reducing the number of options—is almost as effective as standardization in reducing medication errors. Just as limiting
medications to one dose decreases the chance of error, limiting the types of equipment (e.g., infusion pumps) available on a single patient care unit will improve safety. Unless all such equipment has the same method of setup and operation, having several different types of infusion pumps and defibrillators increases the likelihood of misuse, sometimes with disastrous consequences.
Implement Physician Order Entry
Having physicians enter and transmit medication orders on-line (computerized physician order entry) is a powerful method for preventing medication errors due to misinterpretation of hand-written orders. It can ensure that the dose, form, and timing are correct and can also check for potential drug-drug or drug-allergy interactions and patient conditions such as renal function. In one before-and-after comparison,58 nonintercepted serious medication errors decreased by more than half (from 10.7 to 4.86 events per 1,000 patient-days).
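The kind of cross-checking such a system performs at the moment of prescribing can be sketched in a few lines. This is an illustrative sketch only; the drug names, dose limits, and interaction pairs below are hypothetical placeholders, not clinical data.

```python
# Illustrative sketch of the checks a computerized order entry system
# performs at prescribing time. All drug names, dose limits, and
# interaction pairs here are hypothetical placeholders, not clinical data.

MAX_DOSE_MG = {"drug_a": 100, "drug_b": 50}
INTERACTING_PAIRS = {frozenset({"drug_a", "drug_b"})}

def check_order(drug, dose_mg, current_meds, allergies):
    """Return a list of warnings for a proposed medication order."""
    warnings = []
    # Drug-allergy check against the patient's recorded allergies.
    if drug in allergies:
        warnings.append(f"ALLERGY: patient is allergic to {drug}")
    # Out-of-range dose check against the reference table.
    limit = MAX_DOSE_MG.get(drug)
    if limit is not None and dose_mg > limit:
        warnings.append(f"DOSE: {dose_mg} mg exceeds maximum {limit} mg")
    # Drug-drug interaction check against the patient's active medications.
    for other in current_meds:
        if frozenset({drug, other}) in INTERACTING_PAIRS:
            warnings.append(f"INTERACTION: {drug} interacts with {other}")
    return warnings

# An excessive dose of a drug that interacts with a current medication
# triggers two warnings.
warnings = check_order("drug_a", 150, ["drug_b"], allergies=["drug_c"])
```

In a real system the reference tables would be a maintained drug knowledge base rather than in-code dictionaries, and the checks would also draw on patient conditions such as renal function, as the text notes.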
Direct order entry reduces errors at all stages of the medication process, not just in prescribing,60 and it has been recommended by the National Patient Safety Partnership, a coalition of health care organizations.*
One study estimated cost savings attributable to preventable adverse drug events (ADEs) at more than $4,000 per event. Direct savings from reduction of ADEs were estimated to be more than $500,000 annually at one teaching hospital, with an overall savings from all decision support interventions related to order entry of between $5 million and $10 million per year.61 A computerized system costing $1 million to $2 million could pay for itself in three to five years, while preventing injury to hundreds of patients each year.
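The payback arithmetic implied by these estimates can be made explicit. The sketch below simply restates the cited figures; it ignores ongoing operating and maintenance costs, which would lengthen the payback period toward the report's three-to-five-year estimate.

```python
# Restating the cited estimates: direct savings from ADE reduction alone
# were roughly $500,000 per year at one teaching hospital, against a
# system cost of $1 million to $2 million.
system_cost_low, system_cost_high = 1_000_000, 2_000_000
annual_ade_savings = 500_000

# Simple payback period (years) using direct ADE savings alone,
# before operating and maintenance costs are considered.
payback_low = system_cost_low / annual_ade_savings    # 2.0 years
payback_high = system_cost_high / annual_ade_savings  # 4.0 years
```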
Until computerized order entry is implemented, much of the safety benefit may be realized by manual systems that use standard order forms for highly prevalent circumstances (e.g., myocardial infarction, use of heparin), if the forms are used as completed by clinicians and not transcribed.
Computerized order entry can be a valuable safety adjunct for laboratory and radiology ordering as well as for medications and, to achieve the
*Member organizations include the American Hospital Association, American Medical Association, American Nurses Association, Association of American Medical Colleges, Agency for Healthcare Research and Quality, Food and Drug Administration, Health Care Financing Administration, Joint Commission on the Accreditation of Healthcare Organizations, Institute for Healthcare Improvement, National Institute for Occupational Safety and Health, National Patient Safety Foundation, Department of Defense (Health Affairs), and Department of Veterans Affairs.
most benefit, should be linked with these databases. Such systems should provide relevant information about the patient and his or her medications to anyone who needs them. Bates et al.62 report on the ability of computerized information systems to identify and prevent adverse events using three hierarchical levels of clinical information. Using only what they call Level 1 information (demographic information, results of diagnostic tests, and current medications), 53 percent of adverse events were judged identifiable. Using Level 2 (as well as Level 1) information (physician order entry), 58 percent were judged identifiable. Using Level 3 (as well as Levels 1 and 2) information that included additional clinical data such as automated problem lists, the authors judged that 89 percent of adverse events were identifiable. In this study a small but significant proportion of adverse events (5, 13, and 23 percent, respectively) was judged preventable by using such techniques as guided-dose, drug-laboratory, and drug-patient characteristic software algorithms.
As with any new technology, implementing any of these practices requires attention to the user-system interface to minimize the introduction of new problems. It is helpful if these systems have a clearly designated "process manager." It is also important to remember that on-line computer entry does not eliminate all errors associated with prescribing drugs. For example, if allergic reactions to a medication are not entered in the database for a given patient, the order entry system cannot alert the prescriber when the same medication (or one in the same class) is prescribed. Other errors such as transcription errors can remain if they are within an expected range.
Use Pharmaceutical Software
Pharmacies in health care organizations should routinely use reliable computer software programs designed to check all prescriptions for duplicate drug therapies; potential drug-drug and drug-allergy interactions; and out-of-range doses, timing, and routes of administration.
Software is available that permits pharmacists to check each new prescription at a minimum for dose, interactions with other medications the patient is taking, and allergies. Although not as sophisticated as computerized physician order entry, until the latter is in place, pharmacy computerized checking can be an efficient way to intercept prescribing errors. The committee cautions, however, that many pharmacy computer systems today are of limited reliability when used to detect and correct prescription errors, most notably serious drug interactions.63 At a minimum, such systems should screen for duplicate prescriptions, patient allergies, potential drug-
drug interactions, out-of-range doses for patient weight or age, and drug-laboratory interactions. Because such pharmacy software may not be programmed to detect all, or even most, dangers, pharmacists and other personnel should not rely on these systems exclusively nor, on the other hand, habitually override alerts.
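One of the screens listed above, the out-of-range dose check against patient weight, can be sketched as follows. The per-kilogram limit used is a hypothetical placeholder, not a clinical reference value.

```python
# Sketch of a weight-based out-of-range dose screen of the kind a
# pharmacy system applies to each new prescription. The limit value
# is a hypothetical placeholder, not a clinical reference.
def screen_weight_based_dose(dose_mg, weight_kg, max_mg_per_kg):
    """Return (acceptable, limit_mg) for a dose against a per-kg limit."""
    limit_mg = weight_kg * max_mg_per_kg
    return dose_mg <= limit_mg, limit_mg

# A 900 mg dose for a 70 kg patient exceeds a hypothetical
# 10 mg/kg limit (700 mg) and would be flagged for review.
ok, limit = screen_weight_based_dose(dose_mg=900, weight_kg=70, max_mg_per_kg=10)
```

A flagged result would route the prescription to a pharmacist for review rather than block it outright, consistent with the caution above against both over-reliance on and habitual overriding of alerts.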
Implement Unit Dosing
If medications are not packaged in single doses by the manufacturer, they should be prepared in unit doses by the central pharmacy. Unit dosing—the preparation of each dose of each medication by the pharmacy—reduces handling as well as the chance of calculation and mixing errors. Unit dosing can reduce errors by eliminating the need for calculation, measurement, preparation, and handling on the nursing unit and by providing a fully labeled package that stays with the medication up to its point of use.
Unit dosing was a major systems change that significantly reduced dosing errors when it was introduced nearly 20 years ago. Unit dosing has been recommended by the American Society of Health-System Pharmacists, JCAHO, NPSF, and the MHA in their "Best Practice Recommendations." Unfortunately, some hospitals have recently returned to bulk dosing as a cost-cutting measure, a change that is bound to increase dosing errors.
Have the Central Pharmacy Supply High-Risk Intravenous Medications
Having the pharmacy place additives in IV solutions, or purchasing them already mixed, rather than having nurses prepare IV solutions on patient care units, reduces the chance of calculation and mixing errors. For example, one study found error rates in mixing IV drugs of 20 percent for nurses, 9 percent for pharmacies, and 0.3 percent for manufacturers. This recommendation is supported by the American Society of Health-System Pharmacists, the Institute for Safe Medication Practices, and the experience reported by Bates et al.64
Use Special Procedures and Written Protocols for the Use of High-Risk Medications
A relatively small number of medications carry a risk of death or serious injury when given in excessive dose. However, these include several of the
most powerful and useful medications in the therapeutic armamentarium. Examples are heparin, warfarin, insulin, lidocaine, magnesium, muscle relaxants, chemotherapeutic agents, potassium chloride (see below), dextrose injections, narcotics, adrenergic agents, theophylline, and immunoglobulin.65,66 Both to alert personnel to be especially careful and to ensure that dosing is appropriate, special protocols and processes should be used for these "high-alert" drugs. Such protocols might include written and computerized guidelines, checklists, preprinted orders, double-checks, special packaging, and labeling.
Do Not Store Concentrated Potassium Chloride Solutions on Patient Care Units
Concentrated potassium chloride (KCl) is the most potentially lethal chemical used in medicine. It is widely used as an additive to intravenous solutions to replace potassium loss in critically ill patients. Each year, fatal accidents occur when concentrated KCl is injected because it is confused with another medication. Because KCl is never intentionally used undiluted, there is no need to have the concentrated form stocked on the patient care unit. Appropriately diluted solutions of KCl can be prepared by the pharmacy and stored on the unit for use.
After implementing its sentinel event reporting system, JCAHO found that eight of ten reported incidents of patient death resulting from administration of KCl involved infusion of concentrated KCl that was available as a floor stock item.67 This has also been reported as a frequent cause of adverse events by the U.S. Pharmacopoeia (USP) Medication Errors Reporting Program.68
Ensure the Availability of Pharmaceutical Decision Support
Because of the immense variety and complexity of medications now available, it is impossible for nurses or doctors to keep up with all of the information required for safe medication use. The pharmacist has become an essential resource in modern hospital practice. Thus, access to his or her expertise must be possible at all times.69,70 Health care organizations would greatly benefit from pharmaceutical decision support. When possible, medications should be dispensed by pharmacists or with the assistance of pharmacists. In addition, a substantial number of errors are made when nurses or other nonpharmacist personnel enter pharmacies during off hours to obtain drugs. Although small hospitals cannot afford and do not need to have a
pharmacist physically present at all times, all hospitals must have access to pharmaceutical decision support, and systems for dispensing medications should be designed and approved by pharmacists.
Include a Pharmacist During Rounds of Patient Care Units
As the major resource for drug information, pharmacists are much more valuable to the patient care team if they are physically present at the time decisions are being made and orders are being written. For example, in teaching hospitals, medical staff may conduct "rounds" with residents and other staff. Pharmacists should actively participate in this process and be present on the patient care unit when appropriate. Such participation is usually well received by nurses and doctors, and it has been shown to significantly reduce serious medication errors. Leape et al.71 measured the effect of pharmacist participation on medical rounds in the intensive care unit of one large, urban teaching hospital. They found that the rate of preventable adverse drug events related to prescribing decreased significantly, by 66 percent, from 10.4 per 1,000 patient-days before the intervention to 3.5 after; the rate in the control group was unchanged.
Make Relevant Patient Information Available at the Point of Patient Care
Many organizations have implemented ways to make information about patients available at the point of patient care, as well as ways to ensure that patients are correctly identified and treated. For medication administration, one inexpensive but useful strategy is the use of colored wristbands (or their functional equivalent), which can alert personnel who encounter a patient anywhere in a hospital to check for an allergy before administering a medication. Using computer-generated medication administration records (MARs) can minimize transcription errors and legibility problems as well as provide flow charts for patient care.
Improper doses, mix-ups of drugs or patients, and inaccurate records are common causes of medication errors in daily hospital practice. Bar coding (or an electronic equivalent) is an effective remedy.72 It is a simple way to ensure that the identity and dose of the drug are as prescribed, that it is being given to the right patient, and that all of the steps in the dispensing and administration processes are checked for timeliness and accuracy. Bar
coding can be used not only by drug manufacturers, but also by hospitals to ensure that patients and their records match. The Colmery-O'Neil VA Medical Center in Topeka, Kansas, reports, for example, a 70 percent reduction in medication error rates between September 1995 and April 1998 by using a system that included bar coding of each dose, use of a hand-held laser bar code scanner, and a radio computer link.73
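The check a bedside bar-coding system performs is essentially a three-way match among the patient's scanned wristband, the scanned dose label, and the active order. A minimal sketch, using hypothetical identifiers and order records, follows.

```python
# Sketch of the bedside bar-code check: the scanned wristband, the
# scanned dose label, and the active order must all agree before the
# dose is given. All identifiers here are hypothetical.
def verify_administration(wristband_id, dose_label, orders):
    """Match a scanned dose against an active order for the scanned patient.

    orders maps patient_id -> set of (drug, dose) tuples currently ordered.
    dose_label is the (patient_id, drug, dose) tuple encoded on the package.
    """
    label_patient, drug, dose = dose_label
    if label_patient != wristband_id:
        return "WRONG PATIENT: dose was dispensed for another patient"
    if (drug, dose) not in orders.get(wristband_id, set()):
        return "NO MATCHING ORDER: drug or dose does not match the record"
    return "OK"

# A matching patient, drug, and dose passes; any mismatch is flagged
# before administration.
orders = {"P123": {("heparin", "5000 units")}}
result = verify_administration("P123", ("P123", "heparin", "5000 units"), orders)
```

Logging each scan as a side effect is what allows such systems to check the timeliness and accuracy of the dispensing and administration steps, as described above.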
Improve Patients' Knowledge About Their Treatment
A major unused resource in most hospitals, clinics, and practices is the patient. Patients not only have a right to know the medications they are receiving, the reasons for them, and their expected effects and possible complications; they should also know what the pills or injections look like and how often they are to receive them. Patients should be involved in reviewing and confirming allergy information in their records.
Practitioners and staff in health care organizations should take steps to ensure that, whenever possible, patients know which medications they are receiving, the appearance of these medications, and their possible side effects.74 They should be encouraged to notify their doctors or staff of discrepancies in medication administration or the occurrence of side effects. If they are encouraged to take this responsibility, they can be a final "fail-safe" step.
At the time of hospital discharge, patients should also be given both verbal and written information about the safe and effective use of their medications in terms and in a language they can understand.
Patient partnering is not a substitute for nursing responsibility to give the proper medication properly or for physicians to inform their patients, but because no one is perfect, it provides an opportunity to intercept the rare but predictable error. In addition to patients' informing their health care practitioner about their current medications, allergies, and previous adverse drug experiences, the National Patient Safety Partnership has recommended that patients ask the following questions before accepting a newly prescribed medication:75
• Is this the drug my doctor (or other health care provider) ordered? What are the trade and generic names of the medication?
• What is the drug for? What is it supposed to do?
• How and when am I supposed to take it and for how long?
• What are the likely side effects? What do I do if they occur?
• Is this new medication safe to take with other over-the-counter or prescription medication or with dietary supplements that I am already taking? What food, drink, activities, dietary supplements, or other medication should be avoided while taking this medication?
This chapter has proposed numerous actions based on both good evidence and principles of safe design that health care organizations could take now or as soon as possible to substantially improve patient safety. These principles include (1) providing leadership; (2) respecting human limits in process design; (3) promoting effective team functioning; (4) anticipating the unexpected; and (5) creating a learning environment.
The committee's recommendations call for health care organizations and health care professionals to make continually improved patient safety a specific, declared, and serious aim by establishing patient safety programs with defined executive responsibility. The committee also calls for the immediate creation of safety systems that incorporate principles such as (1) standardizing and simplifying equipment, supplies, and processes; (2) establishing team training programs; and (3) implementing nonpunitive systems for reporting and analyzing errors and accidents within organizations. Finally, drawing on these principles and on strong evidence, the committee calls on health care organizations to implement proven medication safety practices.
1. Reason, James T. Foreword to Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, p. xiv. Though ecologically unsound, the analogy is apt.
2. Chassin, Mark R.; Galvin, Robert W., and the National Roundtable on Health Care Quality. The Urgent Need to Improve Health Care Quality. Institute of Medicine National Roundtable on Health Care Quality. JAMA. 280:1000–1005, 1998.
3. Cook, Richard I. Two Years Before the Mast: Learning How to Learn About Patient Safety. Invited presentation. "Enhancing Patient Safety and Reducing Errors in Health Care," Rancho Mirage, CA, November 8–10, 1998.
4. Senders, John. "Medical Devices, Medical Errors and Medical Accidents," in Human Error in Medicine, Marilyn Sue Bogner, ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
5. Van Cott, Harold. "Human Errors: Their Causes and Reduction," in Human Error in Medicine, Marilyn Sue Bogner, ed., Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
6. Van Cott, 1994.
7. MacCormack, George. Zeroing in on Safety Excellence—It's Good Business. http://www.dupont.com/safety/esn97-1/zeroin.html 5/27/99.
8. DuPont Safety Resources. Safety Pays Big Dividends for Swiss Federal Railways. http://www.dupont.com/safety/ss/swissrail22.html 5/3/99.
9. From "Executive Safety News" DuPont Safety Resources. http://www.dupont.com/safety/esn98-3.html 5/3/99.
10. Alcoa. Alcoa Environment, Health and Safety Annual Report. 1997.
11. Sagan, Scott D. The Limits of Safety. Organizations, Accidents, and Nuclear Weapons. Princeton, N.J.: Princeton University Press, 1993.
12. Weick, Karl E. and Roberts, Karlene H. Collective Mind in Organizations: Heedful Interrelating on Flight Decks. Administrative Science Quarterly. 38:357–381, 1993.
13. Rasmussen, Jens. Skills, rules, Knowledge: Signals, Signs, and Symbols and Other Distinctions in Human Performance Models. IEEE Transactions: Systems, Man & Cybernetics (SMC-13): 257–267, 1983.
14. Reason, James. Human Error. New York: Cambridge University Press, 1990.
15. Moray, Nevill. "Error Reduction as a Systems Problem," in Human Error in Medicine, ed., Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
16. Van Cott, 1994.
17. Norman, Donald A. The Design of Everyday Things. NY: Doubleday/Currency, 1988.
18. Gaba, David; Howard, Steven K., and Fish, Kevin J. Crisis Management in Anesthesiology. NY: Churchill-Livingstone, 1994.
19. Committee on Quality Assurance in Office-Based Surgery. A Report to New York State Public Health Council and New York State Department of Health, June, 1999.
20. Berwick, Donald M. "Taking Action to Improve Safety: How to Increase the Odds of Success," Keynote Address, Second Annenberg Conference, Rancho Mirage, CA, November 8, 1998.
21. Haberstroh, Charles H. "Organization, Design and Systems Analysis," in Handbook of Organizations, J.J. March, ed. Chicago: Rand McNally, 1965.
22. Leape, Lucian L.; Kabcenell, Andrea; Berwick, Donald M., et al. Reducing Adverse Drug Events. Boston: Institute for Healthcare Improvement, 1998.
23. Institute of Medicine. Guidelines for Clinical Practice. From Development to Use. Marilyn J. Field and Kathleen N. Lohr, eds. Washington, D.C.: National Academy Press, 1992.
24. Leape, et al., 1998.
25. Leape, et al., 1998.
26. Leape, et al., 1998.
27. Leape, et al., 1998.
28. Leape, et al., 1998.
29. Hwang, Mi Y. JAMA Patient Page. Take Your Medications as Prescribed. JAMA. 282:298, 1999.
30. Blumenthal, David. The Future of Quality Measurement and Management in a Transforming Health Care System. JAMA. 278:1622–1625, 1997.
31. Sheridan, Thomas B. and Thompson, James M. "People Versus Computers in Medicine," in Human Error in Medicine, Marilyn Sue Bogner, ed., Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
32. Hyman, William A. "Errors in the Use of Medical Equipment," in Human Error in Medicine, Marilyn Sue Bogner, ed., Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
33. Senders, John W. "Medical Devices, Medical Errors, and Medical Accidents," in Human Error in Medicine, Marilyn Sue Bogner, ed., Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.
34. Cook, Richard I. Two Years Before the Mast: Learning How to Learn About Patient Safety. "Enhancing Patient Safety and Reducing Errors in Health Care," Rancho Mirage, CA, November 8–10, 1998.
35. Leape, et al., 1998.
36. Helmreich, Robert L.; Chidester, Thomas R.; Foushee, H. Clayton, et al. How Effective is Cockpit Resource Management Training? Flight Safety Digest. May:1–17, 1990.
37. Chopra,V.; Gesink, Birthe J.; deJong, Jan, et al. Does Training on an Anaesthesia Simulator Lead to Improvement in Performance? Br J Anaesthesia. 293–297, 1994.
38. Denson, James S. and Abrahamson, Stephen. A Computer-Controlled Patient Simulator. JAMA. 208:504–508, 1969.
39. Howard, Steven K.; Gaba, David, D.; Fish, Kevin J., et al. Anesthesia Crisis Resource Management Training: Teaching Anesthesiologists to Handle Critical Incidents. Aviation, Space, and Environmental Medicine. 63:763–769, 1992.
40. Spence, Alastair. A. The Expanding Role of Simulators in Risk Management. Br J Anaesthesia. 78:633–634, 1997.
41. Inaccurate Reporting of Simulated Critical Anaesthetic Incidents. Br J Anaesthesia. 78:637–641, 1997.
42. Helmreich, Robert L. and Davies, Jan M. Anaesthetic Simulation and Lessons to be Learned from Aviation. [Editorial]. Canadian Journal of Anaesthesia. 44:907–912, 1997.
43. Institute of Medicine. The Computer-Based Patient Record. An Essential Technology for Health Care. Revised Edition. Washington, DC: National Academy Press, 1997.
44. Tuggy, Michael L. Virtual Reality Flexible Sigmoidoscopy Simulator Training: Impact on Resident Performance. J Am Board Fam Pract. 11:426–433, 1998.
45. Leape, Lucian, L.; Woods, David D.; Hatlie, Martin, J., et al. Promoting Patient Safety and Preventing Medical Error. JAMA. 280:1444–1447, 1998.
46. Leape, Lucian L. Error in Medicine. JAMA 272:1851–1857, 1994.
47. Kizer, Kenneth W. VHA's Patient Safety Improvement Initiative, presentation to the National Health Policy Forum, Washington, D.C., May 14, 1999.
48. Nolan, Thomas. Presentation, IHI Conference, Orlando, FL, December, 1998.
49. Zimmerman, Jack E.; Shortell, Stephen M., et al. Improving Intensive Care: Observations Based on Organizational Case Studies in Nine Intensive Care Units: A Prospective, Multicenter Study. Crit Care Med. 21:1443–1451, 1993.
50. Shortell, Stephen M.; Zimmerman, Jack E.; Gillies, Robin R., et al. Continuously Improving Patient Care: Practical Lessons and an Assessment Tool From the National ICU Study. QRB Qual Rev Bull. 18:150–155, 1992.
51. Manasse, Henri R. Jr. Toward Defining and Applying a Higher Standard of Quality for Medication Use in the United States. Am J Health System Pharm. 55:374–379, 1995.
52. Lesar, Timothy S.; Briceland, Laurie; and Stein, Daniel S. Factors Related to Errors in Medication Prescribing. JAMA. 277:312–317, 1997.
53. Avorn, Jerry. Putting Adverse Drug Events into Perspective. JAMA. 277:341–342, 1997.
54. Healthcare Leaders Urge Adoption of Methods to Reduce Adverse Drug Events. News Release. National Patient Safety Partnership, May 12, 1999.
55. Massachusetts Hospital Association (Massachusetts Coalition for the Prevention of Medical Errors). ''MHA Best Practice Recommendations to Reduce Medication Errors," Kirle, Leslie E.; Conway, James; Peto, Randolph, et al. http://www.mhalink.org/mcpme/recommend.htm.
56. Leape, et al., 1998.
57. Consensus Statement. American Society of Health-System Pharmacists. Top-Priority Actions for Preventing Adverse Drug Events in Hospitals. Recommendations of an Expert Panel. Am J Health System Pharm. 53:747–751, 1996.
58. Bates, David W.; Leape, Lucian L.; Cullen, David J., et al. Effect of Computerized Physician Order Entry and a Team Intervention on Prevention of Serious Medical Error. JAMA. 280:1311–1316, 1998.
59. Bates, 1998.
60. Evans, R. Scott; Pestotnik, Stanley L.; Classen, David C., et al. A Computer-Assisted Management Program for Antibiotics and Other Anti-Infective Agents. N Engl J Med. 338(4):232–238, 1997. See also: Schiff, Gordon D.; and Rucker, T. Donald. Computerized Prescribing: Building the Electronic Infrastructure for Better Medication Usage. JAMA. 279:1024–1029, 1998.
61. Bates, David W.; Spell, Nathan; Cullen, David J., et al. The Costs of Adverse Drug Events in Hospitalized Patients. JAMA. 277:307–311, 1997.
62. Bates, David W.; O'Neil, Anne C.; Boyle, Deborah, et al. Potential Identifiability and Preventability of Adverse Events Using Information Systems. J Am Med Inform Assoc. 5:404–411, 1994.
63. Institute for Safe Medication Practices. Over-reliance on Pharmacy Computer Systems May Place Patients at Great Risk. http://www.ismp.org/ISMP/MSAarticles/Computer2.html 6/01/99.
64. Bates, David W.; Cullen, David J.; Laird, Nan, et al. Incidence of Adverse Drug Events and Potential Adverse Drug Events. JAMA. 274:29–34, 1995.
65. Leape, Lucian L.; Kabcenell, Andrea; Berwick, Donald M., et al. Reducing Adverse Drug Events. Boston: Institute for Healthcare Improvement, 1998.
66. Cohen, Michael; Anderson, Richard W.; Attilio, Robert M., et al. Preventing Medication Errors in Cancer Chemotherapy. Am J Health System Pharm. 53:737–746, 1996.
67. Sentinel Event Alert. The Joint Commission on Accreditation of Healthcare Organizations, Oakbrook Terrace, IL: JCAHO, 1998.
68. Cohen, Michael. Important Error Prevention Advisory. Hosp Pharmacists. 32:489–491, 1997.
69. ASHP Guidelines on Preventing Medication Errors in Hospitals. Am J Hospital Pharmacists. 50:305–314, 1993.
70. Crawford, Stephanie Y. Systems Factors in the Reporting of Serious Medication Errors. Presentation at Annenburg Conference, Rancho Mirage, CA, November 8, 1998.
71. Leape, Lucian L.; Cullen, David J.; Clapp, Margaret D., et al. Pharmacist Participation on Physician Rounds and Adverse Drug Events in the Intensive Care Unit. JAMA. 282(3):267–270, 1999.
72. Top Priority Actions for Preventing Adverse Drug Events in Hospitals. Recommendations of an Expert Panel. Am J Health System Pharm. 53:747–751, 1996.
73. Gebhart, Fred. VA Facility Slashes Drug Errors Via Bar-Coding. Drug Topics. 1:44, 1999.
74. Joint Commission on Accreditation of Healthcare Organizations. 1998 Hospital Accreditation Standards. Oakbrook Terrace, IL: Joint Commission, 1998.
75. Healthcare Leaders Urge Adoption of Methods to Reduce Adverse Drug Events. News Release. National Patient Safety Partnership, May 12, 1999.