
To Err Is Human: Building a Safer Health System (2000)


3—
Why Do Errors Happen?

The common initial reaction when an error occurs is to find and blame someone. However, even apparently single events or errors are due most often to the convergence of multiple contributing factors. Blaming an individual does not change these factors, and the same error is likely to recur. Preventing errors and improving safety for patients require a systems approach in order to modify the conditions that contribute to errors. People working in health care are among the most educated and dedicated workforce in any industry. The problem is not bad people; the problem is that the system needs to be made safer.

This chapter covers two key areas. First, definitions of several key terms are offered. This is important because there is no agreed-upon terminology for talking about this issue.1 Second, the emphasis in this chapter (and in this report generally) is on how to make systems safer; its primary focus is not on "getting rid of bad apples," or individuals with patterns of poor performance. The underlying assumption is that lasting and broad-based safety improvements in an industry can be brought about through a systems approach.

Finally, it should be noted that although the examples may draw more from inpatient or institutional settings, errors occur in all settings. The concepts presented in this chapter are just as applicable to ambulatory care, home care, community pharmacies, or any other setting in which health care is delivered.

This chapter uses a case study to illustrate a series of definitions and concepts in patient safety. After presentation of the case study, the chapter will define what comprises a system, how accidents occur, how human error contributes to accidents and how these elements fit into a broader concept of safety. The case study will be referenced to illustrate several of the concepts. The next section will examine whether certain types of systems are more prone to accidents than others. Finally, after a short discussion of the study of human factors, the chapter summarizes what health care can learn from other industries about safety.


Why Do Accidents Happen?

Major accidents, such as Three Mile Island or the Challenger accident, grab people's attention and make the front page of newspapers. Because they usually affect only one individual at a time, accidents in health care delivery are less visible and dramatic than those in other industries. Except for celebrated cases, such as Betsy Lehman (the Boston Globe reporter who died from an overdose during chemotherapy) or Willie King (who had the wrong leg amputated),2 they are rarely noticed. However, accidents are a form of information about a system.3 They represent places in which the system failed and the breakdown resulted in harm.

The ideas in this section rely heavily upon the work of Charles Perrow and James Reason, among others. Charles Perrow's analysis of the accident at Three Mile Island identified how systems can cause or prevent accidents.4 James Reason extended the thinking by analyzing multiple accidents to examine the role of systems and the human contribution to accidents.5 "A system is a set of interdependent elements interacting to achieve a common aim. The elements may be both human and non-human (equipment, technologies, etc.)."

Systems can be very large and far-reaching, or they can be more localized. In health care, a system can be an integrated delivery system, a centrally owned multihospital system, or a virtual system comprised of many different partners over a wide geographic area. However, an operating room or an obstetrical unit is also a type of system. Furthermore, any element in a system probably belongs to multiple systems. For example, one operating room is part of a surgical department, which is part of a hospital, which is part of a larger health care delivery system. The variable size, scope, and membership of systems make them difficult to analyze and understand.

When large systems fail, it is due to multiple faults that occur together in an unanticipated interaction,6 creating a chain of events in which the faults grow and evolve.7 Their accumulation results in an accident. "An accident is an event that involves damage to a defined system that disrupts the ongoing or future output of that system."8

The Challenger failed because of a combination of brittle O-ring seals, unexpected cold weather, reliance on the seals in the design of the boosters, and change in the roles of the contractor and NASA. Individually, no one factor caused the event, but when they came together, disaster struck. Perrow uses a DEPOSE (Design, Equipment, Procedures, Operators, Supplies and materials, and Environment) framework to identify the potential sources of failures. In evaluating the environment, some researchers explicitly include organizational design and characteristics.9


The complex coincidences that cause systems to fail could rarely have been foreseen by the people involved. As a result, they are reviewed only in hindsight; however, knowing the outcome of an event influences how we assess past events.10 Hindsight bias means that things that were not seen or understood at the time of the accident seem obvious in retrospect. Hindsight bias also misleads a reviewer into simplifying the causes of an accident, highlighting a single element as the cause and overlooking multiple contributing factors. Given that the information about an accident is spread over many participants, none of whom may have complete information,11 hindsight bias makes it easy to arrive at a simple solution or to blame an individual, but difficult to determine what really went wrong.

Although many features of systems and accidents in other industries are also found in health care, there are important differences. In most other industries, when an accident occurs the worker and the company are directly affected. There is a saying that the pilot is always the first at the scene of an airline accident. In health care, the damage happens to a third party: the patient is harmed, while the health professional or the organization is only rarely affected. Furthermore, harm occurs to only one patient at a time, not to whole groups of patients, making the accident less visible.*

In any industry, one of the greatest contributors to accidents is human error. Perrow has estimated that, on average, 60–80 percent of accidents involve human error. There is reason to believe that this is equally true in health care. An analysis of anesthesia found that human error was involved in 82 percent of preventable incidents; the remainder involved mainly equipment failure.12 Even when equipment failure occurs, it can be exacerbated by human error.13 However, saying that an accident is due to human error is not the same as assigning blame. Humans commit errors for a variety of expected and unexpected reasons, which are discussed in more detail in the next two sections.

*Public health has made an effort to eliminate the term "accident," replacing it with unintentional injuries, consistent with the nomenclature of the International Classification of Diseases. However, this report is not focused specifically on injury, since an accident may or may not result in injury. See Institute of Medicine, Reducing the Burden of Injury, eds. Richard J. Bonnie, Carolyn Fulco, and Catharyn Liverman. Washington, D.C.: National Academy Press, 1999.

Understanding Errors

The work of Reason provides a good understanding of errors. He defines an error as the failure of a planned sequence of mental or physical activities to achieve its intended outcome when these failures cannot be attributed to chance.14 It is important to note the inclusion of "intention." According to Reason, error is not meaningful without the consideration of intention. That is, it has no meaning when applied to unintentional behaviors because errors depend on two kinds of failure: either actions do not go as intended or the intended action is not the correct one. In the first case, the desired outcome may or may not be achieved; in the second case, the desired outcome cannot be achieved.

Reason differentiates between slips or lapses and mistakes. A slip or lapse occurs when the action conducted is not what was intended. It is an error of execution. The difference between a slip and a lapse is that a slip is observable and a lapse is not. For example, turning the wrong knob on a piece of equipment would be a slip; not being able to recall something from memory is a lapse.

In a mistake, the action proceeds as planned but fails to achieve its intended outcome because the planned action was wrong. The situation might have been assessed incorrectly, and/or there could have been a lack of knowledge of the situation. In a mistake, the original intention is inadequate; a failure of planning is involved.

In medicine, slips, lapses, and mistakes are all serious and can potentially harm patients. For example, in medicine, a slip might be involved if the physician chooses an appropriate medication but writes 10 mg when the intention was to write 1 mg. The original intention is correct (the correct medication was chosen given the patient's condition), but the action did not proceed as planned. On the other hand, a mistake in medicine might involve selecting the wrong drug because the diagnosis is wrong. In this case, the situation was misassessed and the action planned is wrong. If the terms "slip" and "mistake" are used, it is important not to equate slip with "minor." Patients can die from slips as well as mistakes.

For this report, error is defined as the failure of a planned action to be completed as intended (e.g., error of execution) or the use of a wrong plan to achieve an aim (e.g., error of planning). From the patient's perspective, not only should a medical intervention proceed properly and safely, it should be the correct intervention for the particular condition. This report addresses primarily the first concern, errors of execution, since they have their own epidemiology, causes, and remedies that are different from errors in planning. Subsequent reports from the Quality of Health Care in America project will consider the full range of quality-related issues, sometimes classified as overuse, underuse and misuse.15

Latent and Active Errors

In considering how humans contribute to error, it is important to distinguish between active and latent errors.16 Active errors occur at the level of the frontline operator, and their effects are felt almost immediately. This is sometimes called the sharp end.17 Latent errors tend to be removed from the direct control of the operator and include things such as poor design, incorrect installation, faulty maintenance, bad management decisions, and poorly structured organizations. These are called the blunt end. For example, in aviation, the active error is that the pilot crashed the plane. The latent error is that a previously undiscovered design malfunction caused the plane to roll unexpectedly in a way the pilot could not control, and the plane crashed.

Latent errors pose the greatest threat to safety in a complex system because they are often unrecognized and have the capacity to result in multiple types of active errors. Analysis of the Challenger accident traced contributing events back nine years. In the Three Mile Island accident, latent errors were traced back two years.18 Latent errors can be difficult for the people working in the system to notice since the errors may be hidden in the design of routine processes, in computer programs, or in the structure or management of the organization. People also become accustomed to design defects and learn to work around them, so they are often not recognized.

In her book about the Challenger explosion, Vaughan describes the "normalization of deviance" in which small changes in behavior became the norm and expanded the boundaries so that additional deviations became acceptable.19 When deviant events become acceptable, the potential for errors is created because signals are overlooked or misinterpreted and accumulate without being noticed.

Current responses to errors tend to focus on the active errors: punishing individuals (e.g., firing or suing them), retraining, or other responses aimed at preventing recurrence of the active error. Although a punitive response may be appropriate in some cases (e.g., deliberate malfeasance), it is not an effective way to prevent recurrence. Because large system failures represent latent failures coming together in unexpected ways, they appear to be unique in retrospect. Since the same mix of factors is unlikely to occur again, efforts to prevent specific active errors are not likely to make the system any safer.20

Focusing on active errors lets the latent failures remain in the system, and their accumulation actually makes the system more prone to future failure.21 Discovering and fixing latent failures, and decreasing their duration, are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.


Understanding Safety

Most of this chapter thus far has drawn on Perrow's normal accident theory, which holds that accidents are inevitable in certain systems. Although they may be rare, accidents are "normal" in complex, high technology industries. In contrast to studying the causes of accidents and errors, other researchers have focused on the characteristics that make certain industries, such as military aircraft carriers or chemical processing, highly reliable.22 High reliability theory holds that accidents can be prevented through good organizational design and management.23 Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.24 Correct performance and error can be viewed as "two sides of the same coin."25 Although accidents may occur, systems can be designed to be safer so that accidents are very rare.

The National Patient Safety Foundation has defined patient safety as the avoidance, prevention and amelioration of adverse outcomes or injuries stemming from the processes of health care.26 Safety does not reside in a person, device or department, but emerges from the interactions of components of a system. Others have specifically examined pharmaceutical safety and defined it to include maximizing therapeutic benefit, reducing risk, and eliminating harm.27 That is, benefit relates to risk. Other experts have also defined safety as a relative concept. Brewer and Colditz suggest that the acceptability of an adverse event depends on the seriousness of the underlying illness and the availability of alternative treatments.28 The committee's focus, however, was not on the patient's response to a treatment, but rather on the ability of a system to deliver care safely. From this perspective, the committee believes that there is a level of safety that can and should be ensured. Safety is relative only in that it continues to evolve over time and, when risks do become known, they become part of the safety requirements.


Safety is more than just the absence of errors. Safety has multiple dimensions, including the following:

• an outlook that recognizes that health care is complex and risky and that solutions are found in the broader systems context;

• a set of processes that identify, evaluate, and minimize hazards and are continuously improving; and

• an outcome that is manifested by fewer medical errors and minimized risk or hazard.29

For this report, safety is defined as freedom from accidental injury. This simple definition recognizes that from the patient's perspective, the primary safety goal is to prevent accidental injuries. If an environment is safe, the risk of accidents is lower. Making environments safer means looking at processes of care to reduce defects in the process or departures from the way things should have been done. Ensuring patient safety, therefore, involves the establishment of operational systems and processes that increase the reliability of patient care.

Are Some Types of Systems More Prone to Accidents?

Accidents are more likely to happen in certain types of systems. When they do occur, they represent failures in the way systems are designed. The primary objective of systems design ought to be to make it difficult for accidents and errors to occur and to minimize damage if they do occur.30

Perrow characterizes systems according to two important dimensions: complexity and tight or loose coupling.31 Systems that are more complex and tightly coupled are more prone to accidents and have to be made more reliable.32 In Reason's words, complex and tightly coupled systems can "spring nasty surprises."33

In complex systems, one component of the system can interact with multiple other components, sometimes in unexpected or invisible ways. Although all systems have many parts that interact, the problem arises when one part serves multiple functions because if this part fails, all of the dependent functions fail as well. Complex systems are characterized by specialization and interdependency. Complex systems also tend to have multiple feedback loops and to receive information indirectly, and because of specialization, there is little chance of substituting or reassigning personnel or other resources.

In contrast to complex systems, linear systems contain interactions that are expected in the usual and familiar production sequence. One component of the system interacts with the component immediately preceding it in the production process and the component following it. Linear systems tend to have segregated subsystems, few feedback loops, and easy substitutions (less specialization).

An example of complexity is the concern with year 2000 (Y2K) computer problems. A failure in one part of the system can unexpectedly interrupt other parts, and all of the interrelated processes that can be affected are not yet visible. Complexity is also the reason that changes in long-standing production processes must be made cautiously.34 When tasks are distributed across a team, for example, many interactions that are critical to the process may not be noticed until they are changed or removed.

Coupling is a mechanical term meaning that there is no slack or buffer between two items. Large systems that are tightly coupled have more time-dependent processes and sequences that are more fixed (e.g., y depends on x having been done). There is often only one way to reach a goal. Compared to tightly coupled systems, loosely coupled systems can tolerate processing delays, can reorder the sequence of production, and can employ alternative methods or resources.

All systems have linear interactions; however, some systems additionally experience greater complexity. Complex interactions contribute to accidents because they can confuse operators. Tight coupling contributes to accidents because things unravel too quickly and prevent errors from being intercepted or prevent speedy recovery from an event.35 Because of complexity and coupling, small failures can grow into large accidents.


Although there are not firm assignments, Perrow considered nuclear power plants, nuclear weapons handling, and aircraft to be complex, tightly coupled systems.36 Multiple processes are happening simultaneously, and failure in one area can interrupt another. Dams and rail transportation are considered tightly coupled because the steps in production are closely linked, but linear because there are few unexpected interactions. Universities are considered complex, but loosely coupled, since the impact of a decision in one area can likely be limited to that area.

Perrow did not classify health care as a system, but others have suggested that health care is complex and tightly coupled.37 The activities in the typical emergency room, surgical suite, or intensive care unit exemplify complex and tightly coupled systems. Therefore, the delivery of health care services may be classified as an industry prone to accidents.38

Complex, tightly coupled systems have to be made more reliable.39 One of the advantages of having systems is that it is possible to build in more defenses against failure. Systems that are more complex and tightly coupled, and therefore more prone to accidents, can reduce the likelihood of accidents by simplifying and standardizing processes, building in redundancy, developing backup systems, and so forth.

Another aspect of making systems more reliable has to do with organizational design and team performance. Since these are part of activities within organizations, they are discussed in Chapter 8.

Conditions That Create Errors

Factors can intervene between the design of a system and the production process that create conditions in which errors are more likely to happen. James Reason refers to these factors as psychological precursors or preconditions.40 Although good managerial decisions are required for safe and efficient production, they are not sufficient. There is also a need to have the right equipment, well maintained and reliable; a skilled and knowledgeable workforce; reasonable work schedules; well-designed jobs; clear guidance on desired and undesired performance; et cetera. Factors such as these are the precursors or preconditions for safe production processes.

Any given precondition can contribute to a large number of unsafe acts. For example, training deficiencies can show up as high workload, undue time pressure, inappropriate perception of hazards, or motivational difficulties.41 Preconditions are latent failures embedded in the system. Designing safe systems means taking into account people's psychological limits and either seeking ways to eliminate the preconditions or intervening to minimize their consequences. Job design, equipment selection and use, operational procedures, work schedules, and so forth, are all factors in the production process that can be designed for safety.

One specific type of precondition that receives a lot of attention is technology. The occurrence of human error creates the perception that humans are unreliable and inefficient. One response to this has been to find the unreliable person who committed the error and focus on preventing him or her from doing it again. Another response has been to increase the use of technology to automate processes so as to remove opportunities for humans to make errors. The growth of technology over the past several decades has contributed to system complexity, so this particular issue is highlighted here.

Technology changes the tasks that people do by shifting the workload and eliminating human decision making.42 Where a worker previously may have overseen an entire production process, he or she may intervene now only in the last few steps if the previous steps are automated. For example, flying an aircraft has become more automated, which has helped reduce workload during nonpeak periods. During peak times, such as take-off and landing, there may be more processes to monitor and information to interpret.

Furthermore, the operator must still do things that cannot be automated. This usually involves having to monitor automated systems for rare, abnormal events43 because machines cannot deal with infrequent events in a constantly changing environment.44 Fortunately, automated systems rarely fail. Unfortunately, this means that operators do not practice basic skills, so workers lose skills in exactly the activities they need in order to take over when something goes wrong.

Automation makes systems more "opaque" to people who manage, maintain, and operate them.45 Processes that are automated are less visible because machines intervene between the person and the task. For example, automation means that people have less hands-on contact with processes and are elevated to more supervisory and planning tasks. Direct information is filtered through a machine (e.g., a computer), and operators run the risk of having too much information to interpret or of not getting the right information.


One of the advantages of technology is that it can enhance human performance to the extent that the human plus technology is more powerful than either is alone.46 Good machines can question the actions of operators, offer advice, and examine a range of alternative possibilities that humans cannot possibly remember. In medicine, automated order entry systems or decision support systems have this aim. However, technology can also create new demands on operators. For example, a new piece of equipment may provide more precise measurements, but also demand better precision from the operator for the equipment to work properly.47 Devices that have not been standardized, or that work and look differently, increase the likelihood of operator errors. Equipment may not be designed using human factors principles to account for the human-machine interface.48
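To make the decision-support idea concrete, the sketch below shows one way an order-entry system might question an implausible dose before it reaches the patient. It is a minimal illustration, not drawn from the report; the drug name, dose limits, and check_order function are hypothetical.

```python
# A minimal, hypothetical sketch (not from the report) of a dose-range check
# that an order-entry or decision-support system might run so the machine can
# "question the actions of operators." Drug names and limits are illustrative only.

TYPICAL_SINGLE_DOSE_MG = {
    # drug -> (usual minimum, usual maximum) single dose in milligrams (made up)
    "exampledrug": (0.5, 2.0),
}

def check_order(drug, dose_mg):
    """Return a list of warnings for a proposed order; an empty list means nothing was flagged."""
    warnings = []
    limits = TYPICAL_SINGLE_DOSE_MG.get(drug.lower())
    if limits is None:
        warnings.append("No reference range on file for '%s'; verify manually." % drug)
        return warnings
    low, high = limits
    if dose_mg > high:
        warnings.append(
            "%s mg exceeds the usual maximum of %s mg; possible slip "
            "(e.g., 10 mg entered where 1 mg was intended)." % (dose_mg, high)
        )
    elif dose_mg < low:
        warnings.append("%s mg is below the usual minimum of %s mg." % (dose_mg, low))
    return warnings

# The slip described earlier, 10 mg entered where 1 mg was intended, would be
# flagged and returned to the prescriber to confirm or correct.
print(check_order("ExampleDrug", 10.0))
```

A real system would draw its reference ranges from a maintained formulary and route such warnings back to the prescriber rather than printing them, but the basic idea, a machine that double-checks a human action, is the same.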

Technology also has to be recognized as a "member" of the work team. When technology shifts workloads, it also shifts the interactions between team members. Where processes may have been monitored by several people, technology can permit the task to be accomplished by fewer people. This affects the distributed nature of the job in which tasks are shared among several people and may influence the ability to discover and recover from errors.49

In this context, technology does not involve just computers and information technology. It includes "techniques, drugs, equipment and procedures used by health care professionals in delivering medical care to individuals and the systems within which such care is delivered."50 Additionally, the use of the term technology is not restricted to the technology employed by health care professionals. It can also extend to people at home, of different ages, visual abilities, and languages, who must use different kinds of medical equipment and devices. As more care shifts to ambulatory and home settings, the use of medical technology by non-health professionals can be expected to take on increasing importance.

Research on Human Factors

Research in the area of human factors is just beginning to be applied to health care. It borrows from the disciplines of industrial engineering and psychology. Human factors is defined as the study of the interrelationships between humans, the tools they use, and the environment in which they live and work.51

In the context of this report, a human factors approach is used to understand where and why systems or processes break down. This approach examines the process of error, looking at the causes, circumstances, conditions, associated procedures and devices, and other factors connected with the event. Studying human performance can result in the creation of safer systems and the reduction of conditions that lead to errors. However, not all errors are related to human factors. Although the design of equipment and materials should take into account the way people use them, human factors may not resolve instances of equipment breakdown or material failure.

Much of the work in human factors is on improving the human-system interface by designing better systems and processes.52 This might include, for example, simplifying and standardizing procedures, building in redundancy to provide backup and opportunities for recovery, improving communications and coordination within teams, or redesigning equipment to improve the human-machine interface.

Two approaches have typically been used in human factors analysis. The first is critical incident analysis. Critical incident analysis examines a significant or pivotal occurrence to understand where the system broke down, why the incident occurred, and the circumstances surrounding the incident.53 Analyzing critical incidents, whether or not the event actually leads to a bad outcome, provides an understanding of the conditions that produced an actual error or the risk of error and contributing factors.

A critical incident analysis in anesthesia found that human error was involved in 82 percent of preventable incidents. The study identified the most frequent categories of error and the riskiest steps in the process of administering anesthesia. Recommended corrective actions included such things as labeling and packaging strategies to highlight differences among anesthesiologists in the way they prepared their workspace, training issues for residents, work-rest cycles, how relief and replacement processes could be improved, and equipment improvements (e.g., standardizing equipment in terms of the shape of knobs and the direction in which they turn).

Another analytic approach is referred to as "naturalistic decision making."54 This approach examines the way people make decisions in their natural work settings. It considers all of the factors that are typically controlled for in a laboratory-type evaluation, such as time pressure, noise and other distractions, insufficient information, and competing goals. In this method, the researcher goes out with workers in various fields, such as firefighters or nurses, observes them in practice, and then walks them through to reconstruct various incidents. The analysis uncovers the factors weighed and the processes used in making decisions when faced with ambiguous information under time pressure.

In terms of applying human factors research, David Woods of Ohio State University describes a process of reporting, investigation, innovation, and dissemination (David Woods, personal communication, December 17, 1998). Reporting or other means of identifying errors tells people where errors are occurring and where improvements can be made. The investigation stage uses human factors and other analyses to determine the contributing factors and circumstances that created the conditions in which errors could occur. The design of safer systems provides opportunities for innovation and working with early adopters to test out new approaches. Finally, dissemination of innovation throughout the industry shifts the baseline for performance. The experience of the early adopters redefines what is possible and provides models for implementation.

Aviation has long analyzed the role of human factors in performance. The Ames Research Center (part of the National Aeronautics and Space Administration) has examined areas related to information technology, automation, and the use of simulators for training in basic and crisis skills, for example. Other recent projects include detecting and correcting errors in flight; interruptions, distractions and lapses of attention in the cockpit; and designing information displays to assist pilots in maintaining awareness of their situation during flight.55

Summary

The following key points can be summarized from this chapter.

1. Some systems are more prone to accidents than others because of the way the components are tied together. The delivery of health care services is a complex, technological industry prone to accidents.

2. Much can be done to make systems more reliable and safe. When large systems fail, it is due to multiple faults that occur together.

3. One of the greatest contributors to accidents in any industry, including health care, is human error. However, saying that an accident is due to human error is not the same as assigning blame because most human errors are induced by system failures. Humans commit errors for a variety of known and complicated reasons.

4. Latent errors or system failures pose the greatest threat to safety in a complex system because they lead to operator errors. They are failures built into the system and present long before the active error. Latent errors are difficult for the people working in the system to see since they may be hidden in computers or layers of management, and people become accustomed to working around the problem.

5. Current responses to errors tend to focus on the active errors. Although this may sometimes be appropriate, in many cases it is not an effective way to make systems safer. If latent failures remain unaddressed, their accumulation actually makes the system more prone to future failure. Discovering and fixing latent failures and decreasing their duration are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.

6. The application of human factors in other industries has successfully reduced errors. Health care has to look at medical error not as a special case of medicine, but as a special case of error, and to apply the theory and approaches already used in other fields to reduce errors and improve reliability.56

References

1. Senders, John, "Medical Devices, Medical Errors and Medical Accidents," in Human Error in Medicine, ed., Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

2. Cook, Richard; Woods, David; Miller, Charlotte, A Tale of Two Stories: Contrasting Views of Patient Safety, Chicago: National Patient Safety Foundation, 1998.

3. Cook, Richard and Woods, David, "Operating at the Sharp End: The Complexity of Human Error," in Human Error in Medicine, ed., Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

4. Perrow, Charles, Normal Accidents, New York: Basic Books, 1984.

5. Reason, James, Human Error, Cambridge: Cambridge University Press, 1990.

6. Perrow, 1984; Cook and Woods, 1994.

7. Gaba, David M.; Maxwell, Margaret; DeAnda, Abe, Jr. Anesthetic Mishaps: Breaking the Chain of Accident Evolution. Anesthesiology. 66(5):670–676, 1987.

8. Perrow, 1984.

9. Van Cott, Harold, "Human Errors: Their Causes and Reductions," in Human Error in Medicine, ed., Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994. Also, Roberts, Karlene, "Organizational Change and A Culture of Safety," in Proceedings of Enhancing Patient Safety and Reducing Errors in Health Care, Chicago: National Patient Safety Foundation at the AMA, 1999.

10. Reason, 1990. See also Cook, Woods and Miller, 1998.

11. Norman, Donald, Things That Make Us Smart, Defending Human Attributes in the Age of Machines, Menlo Park, CA: Addison-Wesley Publishing Co., 1993.

12. Cooper, Jeffrey B.; Newbower, Ronald; Long, Charlene, et al. Preventable Anesthesia Mishaps: A Study of Human Factors. Anesthesiology. 49(6):399–406, 1978.

13. Cooper, Jeffrey B. and Gaba, David M. A Strategy for Preventing Anesthesia Accidents. International Anesthesia Clinics. 27(3):148–152, 1989.

14. Reason, 1990.

15. Chassin, Mark R.; Galvin, Robert W., and the National Roundtable on Health Care Quality. The Urgent Need to Improve Health Care Quality, JAMA. 280(11):1000–1005, 1998.

16. Reason, 1990.


17. Cook, Woods and Miller, 1998.

18. Reason, 1990.

19. Vaughan, Diane, The Challenger Launch Decision, Chicago: The University of Chicago Press, 1996.

20. Reason, 1990.

21. Reason, 1990.

22. Roberts, Karlene, 1999. See also: Gaba, David, "Risk, Regulation, Litigation and Organizational Issues in Safety in High-Hazard Industries," position paper for "Workshop on Organizational Analysis in High Hazard Production Systems: An Academy/Industry Dialogue," MIT Endicott House, April 15–18, 1997, NSF Grant No. 9510883-SBR.

23. Sagan, Scott D., The Limits of Safety, Princeton, NJ: Princeton University Press, 1993.

24. Sagan, Scott D., 1993 and Roberts, Karlene, 1999.

25. Reason, James, "Foreword," in Human Error in Medicine, ed., Marilyn Sue Bogner, Hillsdale, NJ: Lawrence Erlbaum Associates, 1994.

26. "Agenda for Research and Development in Patient Safety," National Patient Safety Foundation at the AMA, http://www.ama-assn.org/med-sci/npsf/research/research.htm. May 24, 1999.

27. Dye, Kevin M.C.; Post, Diana; Vogt, Eleanor, "Developing a Consensus on the Accountability and Responsibility for the Safe Use of Pharmaceuticals," Preliminary White Paper prepared for the National Patient Safety Foundation, June 1, 1999.

28. Brewer, Timothy; Colditz, Graham A. Postmarketing Surveillance and Adverse Drug Reactions, Current Perspectives and Future Needs. JAMA. 281(9):824–829, 1999.

29. VHA's Patient Safety Improvement Initiative, presentation to the National Health Policy Forum by Kenneth W. Kizer, Under Secretary for Health, Department of Veterans Affairs, May 14, 1999, Washington, D.C.

30. Leape, Lucian L. Error in Medicine. JAMA. 272(23):1851–1857, 1994.

31. Perrow, 1984.

32. Cook and Woods, 1994.

33. Reason, 1990.

34. Norman, 1993.

35. Perrow, 1984.

36. Perrow, 1984.

37. Cook, Woods and Miller, 1998.

38. On the other hand, in some places, the health system may be complex, but loosely coupled. For example, during an emergency, a patient may receive services from a loosely networked set of subsystems—from the ambulance to the emergency room to the outpatient clinic to home care. See Van Cott in Bogner, 1994.

39. Cook and Woods, 1994.

40. Reason, 1990.

41. Reason, 1990.

42. Cook and Woods, 1994.

43. Reason, 1990.

44. Van Cott, 1994.

45. Reason, 1990.

46. Norman, 1993.


47. Cook and Woods, 1994.

48. Van Cott, 1994.

49. Norman, 1993.

50. Institute of Medicine, Assessing Medical Technologies, Washington, D.C.: National Academy Press, 1985.

51. Weinger, Matthew B; Pantiskas, Carl; Wiklund, Michael; Carstensen, Peter. Incorporating Human Factors Into the Design of Medical Devices. JAMA. 280(17):1484, 1998.

52. Reason, 1990. Leape, 1994.

53. Cooper, Newbower, Long, et al., 1978.

54. Klein, Gary, Sources of Power: How People Make Decisions, Cambridge, MA: The MIT Press, 1998.

55. "Current Projects," Human Factors Research and Technology Division, Ames Research Center, NASA, http://human-factors.arc.nasa.gov/frameset.html

56. Senders, 1994.

