empowered to call a halt to an unsafe practice that was putting the patient at risk. Finally, the nurse’s statement that she felt ashamed and afraid indicates that the workplace lacked a culture of safety that would encourage the reporting, analysis, and remediation of error-producing situations. Because the nurse did not come forward, none of these latent conditions were recognized as threats to patient safety, and the potential remained that future patients admitted to this unit would face similar risks. Indeed, latent conditions such as these are present in all organizations and have been identified as posing the greatest risk to safety in complex or high-technology systems because of their capacity to produce multiple types of active failures. Their impact spreads throughout an organization, creating error-producing factors within individual workplaces (Reason, 1990).

Unfortunately, when errors are discovered, attention tends to focus on the more visible “sharp end” of the activity (the person associated with the error) because latent conditions are less visible, often hidden in routine practices or in the structure or management of an organization. As a result, responses to errors tend to focus on retraining, “discipline” (reprimanding, firing, or suing), or other responses aimed at specific individuals. Although a punitive response may be appropriate in cases of willful wrongdoing, evidence has shown that it is not an effective way to prevent subsequent errors. Focusing only on the sharp end allows latent conditions to remain undetected in the system, and their accumulation makes the system more prone to additional accidents and errors in the future.

Efforts to discover and fix latent system conditions are more likely to result in safer systems than attempts to minimize active errors at the point at which they occur (Institute of Medicine, 2000). Reason (2000:769) uses the analogy of mosquito control to illustrate this argument: “Active failures are like mosquitoes. They can be swatted one by one, but they will still keep coming.” The best remedies involve creating more effective defenses to target and prevent the conditions that allow them to breed and flourish in the first place.

However, viewing errors as arising solely from either individual failings or system flaws has its dangers. Attributing errors predominantly to the deficiencies of individuals fails to recognize the findings of safety studies estimating that the majority of unsafe acts—90 percent or more—arise from system failures in which individuals are not to blame (Reason, 1997). Focusing exclusively on individuals misses an essential part of the error story and blocks the path to effective remediation.

On the other hand, an extreme systems perspective that recognizes no individual contributions to patient safety presents problems such as “learned helplessness” and failure to address instances of individual deficits in competencies or willful wrongdoing. With regard to the phenomenon of

The National Academies of Sciences, Engineering, and Medicine
500 Fifth St. N.W. | Washington, D.C. 20001

Copyright © National Academy of Sciences. All rights reserved.