Active error—an error involving frontline clinicians (sometimes referred to as an error occurring at the “sharp end” of patient safety) (IOM, 2000).
Adverse event—“an event that results in unintended harm to the patient by an act of commission or omission rather than by the underlying disease or condition of the patient” (IOM, 2004, p. 32).
Burnout—a condition resulting from occupational stress, particularly the demanding and emotional relationships between health care professionals and patients, marked by emotional exhaustion, a negative attitude toward one’s patients, and the belief that one is no longer effective at work with patients (Bakker et al., 2005).
Calibration—the process by which a clinician becomes aware of his or her diagnostic abilities and limitations through feedback.
Clinical decision support (CDS)—a health information technology component that “provides clinicians, staff, patients or other individuals with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care. CDS encompasses a variety of tools to enhance decision making in the clinical workflow. These tools include computerized alerts and reminders to care providers and patients; clinical guidelines; condition-specific order sets; focused patient data reports and summaries; documentation templates; diagnostic support; and contextually relevant reference information, among other tools” (HealthIT.gov, 2014).
Clinical reasoning—“the cognitive process that is necessary to evaluate and manage a patient’s medical problems” (Barrows, 1980, p. 19).
Clinician survey—a questionnaire (written, telephone, interview, Web-based) that obtains clinicians’ self-reports about diagnostic errors they have made or what they know about diagnostic errors made by other clinicians.
Cognitive autopsy—a form of cognitive and affective root cause analysis that focuses on factors that can affect cognition such as ambient conditions, physical state (fatigue), and cognitive heuristics (Croskerry, 2005).
Cognitive bias—a predisposition to think in a way that leads to systematic failures in judgment. Cognitive biases often result from heuristics that fail in a predictable manner, but they can also be caused by affect and motivation (Kahneman, 2011).
Communication and resolution program (CRP)—a program that encourages “the disclosure of unanticipated care outcomes to affected patients and their families and proactively seek[s] resolutions, which may include providing an apology; an explanation; and, where appropriate, an offer of reimbursement, compensation, or both” (Mello et al., 2014, p. 20).
Defensive medicine—“occurs when doctors order tests, procedures, or visits, or avoid high-risk patients or procedures, primarily (but not necessarily solely) to reduce their exposure to malpractice liability” (OTA, 1994, p. 13).
Diagnosis—the explanation of a patient’s health problem.
Diagnostic error—the failure to (a) establish an accurate and timely explanation of the patient’s health problem(s) or (b) communicate that explanation to the patient.
Diagnostic management team—a group of diagnostic specialists (pathologists, radiologists, and other diagnosticians) that offers participating health care professionals assistance in selecting appropriate diagnostic tests and interpreting diagnostic test results (Govern, 2013).
Diagnostic process—a complex, patient-centered, collaborative activity that involves information gathering and clinical reasoning with the goal of determining a patient’s health problem.
Dual process theory—a model of cognition proposing that two types of processes—fast, intuitive system 1 processes and slow, analytic system 2 processes—are responsible for human reasoning and decision making.
Electronic health record (EHR)—a real-time, patient-centered record that contains information about a patient’s medical history, diagnoses, medications, immunization dates, allergies, radiology images, and lab and test results (HealthIT.gov, 2013).
Error—“failure of a planned action to be completed as intended (i.e., error of execution) and the use of a wrong plan to achieve an aim (i.e., error of planning) [commission]. It also includes failure of an unplanned action that should have been completed (omission)” (IOM, 2004, p. 330).
Error recovery—the early identification of an error so that actions can be taken to reduce or avert negative effects resulting from the error (IOM, 2000).
Feedback—information on the accuracy of diagnosis and diagnostic performance that is provided to individual health care professionals, care teams, or organizational leaders.
Harm—“hurtful or adverse outcomes of an action or event, whether temporary or permanent” (IOM, 2011, p. 240).
Health information technology (health IT)—“a technical system of computers and software that operates in the context of a larger sociotechnical system; that is, a collection of hardware and software working in concert within an organization that includes people, processes, and technology” (IOM, 2012, p. 2).
Health literacy—“the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health care decisions and services needed to prevent or treat illness” (HRSA, 2015).
Heuristic—a special type of system 1 process that can facilitate decision making but can also lead to errors. Sometimes referred to as cognitive strategies or mental shortcuts, heuristics are automatically and unconsciously employed during reasoning and decision making (Cosmides, 1996; Cosmides and Tooby, 1994; Gigerenzer, 2000; Gigerenzer and Goldstein, 1996; Klein, 1998, 2003; Lipshitz et al., 2001).
Human factors (or ergonomics)—“the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimize human well-being and overall system performance. Ergonomists contribute to the design and evaluation of tasks, jobs, products, environments, and systems in order to make them compatible with the needs, abilities, and limitations of people” (IEA, 2000).
Integrated practice unit—a group of clinicians and non-clinicians who are responsible for the comprehensive care of a specific medical condition and the associated complications, or a set of closely related conditions (Porter, 2010).
Interoperability—the ability of different information technology systems and software applications to communicate, exchange data, and use the information that has been exchanged (HIMSS, 2014).
Inter-rater reliability—the degree to which two or more independent raters can consistently and systematically assign scores to observations (or participants) based on a preestablished scoring protocol or rubric (Stemler, 2007).
Intra-rater reliability—the degree of agreement among multiple repetitions of a scoring protocol performed by a single rater.
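Inter-rater agreement is often quantified with statistics such as Cohen’s kappa, which corrects raw percent agreement for the agreement two raters would reach by chance alone. The following is a minimal illustrative sketch in Python; the two lists of rater scores are hypothetical, not drawn from this report:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of observations scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of each
    # rater's marginal frequency for that category.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no scores assigned by two raters to ten observations.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 2))  # 8/10 observed agreement -> kappa 0.58
```

Here the raters agree on 8 of 10 observations (0.80), but because both assign “yes” 60 percent of the time, chance agreement is 0.52, yielding a kappa of about 0.58—moderate agreement despite high raw agreement.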
Latent error—“errors in the design, organization, training, or maintenance that lead to operator errors and whose effects typically lie dormant in the system for lengthy periods of time” (IOM, 2000, p. 210). Latent errors are more removed from the control of frontline clinicians and can include failures in organizations and design that enable active errors to cause harm (often called the “blunt end” of patient safety) (AHRQ, 2015; IOM, 2000).
Learning sciences—the multidisciplinary science that studies how people learn in order to optimize education and training.
Morbidity and mortality (M&M) conferences—forums that allow clinicians to discuss and learn from errors that have occurred within an organization.
Near miss—a failure in the diagnostic process that does not lead to a diagnostic error.
Overdiagnosis—“when a condition is diagnosed that would otherwise not go on to cause symptoms or death” (Welch and Black, 2010, p. 605).
Patient portal—“Secure, online patient access to health information and serves as an interface to provide useful information to both patients and health professionals” (IOM, 2012, p. 118).
Patient safety—“freedom from accidental injury; ensuring patient safety involves the establishment of operational systems and processes that minimize the likelihood of errors and maximize the likelihood of intercepting them when they occur” (IOM, 2000, p. 211); the prevention of harm caused by errors of commission and omission (IOM, 2004).
Patient survey—a questionnaire (written, telephone, interview, Web-based) that obtains patients’ self-reports about diagnostic errors they have experienced or their awareness of diagnostic errors experienced by others.
Postmortem examination (autopsy)—“an external and internal examination of the body after death using review of medical records, surgical techniques, microscopy, and laboratory analysis. It is performed by a pathologist, a medical doctor specially trained for the procedure who is able to recognize the effects of disease on the body” (CAP, 2014).
Quality of care—“degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge” (IOM, 1990, p. 128).
Root cause analysis—“a structured method used to analyze serious adverse events. Initially developed to analyze industrial accidents, [root cause analysis] is now widely deployed as an error analysis tool in health care” (AHRQ, 2012).
Safe—“avoiding injuries to patients from the care that is intended to help them” (IOM, 2001, p. 39).
Safe care—“involves making evidence-based clinical decisions to maximize the health outcomes of an individual and to minimize the potential for harm. Both errors of commission and omission should be avoided” (IOM, 2004, p. 334).
Second review—a process used in pathology and radiology in which a second health care professional reviews the same information as the first health care professional in order to detect discrepancies in results that may be indicative of error.
Simulation—“allows researchers and practitioners to test new clinical processes and enhance individual and team skills before encountering patients. Many simulation applications involve mannequins that present with symptoms and respond to the simulated treatment, analogous to flight simulators used by pilots” (AHRQ, 2014).
Standardized patient—“a person carefully recruited and trained to take on the characteristics of a real patient thereby affording the student an opportunity to learn and to be evaluated on learned skills in a simulated clinical environment” (Johns Hopkins, 2015).
System—“set of interdependent elements interacting to achieve a common aim. These elements may be both human and nonhuman (equipment, technologies, etc.)” (IOM, 2000, p. 211).
System 1—fast (nonanalytical, intuitive) automatic cognitive processes that require very little working memory capacity and are often triggered by stimuli or result from overlearned associations or implicitly learned activities.
System 2—slow (analytical, reflective) cognitive processes that place a heavy load on working memory and involve hypothetical and counterfactual reasoning (Evans and Stanovich, 2013; Stanovich and Toplak, 2012).
Usability—“the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” (ISO, 1998).
Voluntary reporting—“those reporting systems for which the reporting of patient safety events is voluntary (not mandatory). Generally, reports on all types of events are accepted” (IOM, 2004, p. 335).
Workflow—the sequence of physical and cognitive tasks performed by various people within and between work environments (Carayon et al., 2010).
AHRQ (Agency for Healthcare Research and Quality). 2012. Patient safety primers: Root cause analysis. http://psnet.ahrq.gov/primer.aspx?primerID=10 (accessed April 11, 2014).
AHRQ. 2014. Improving patient safety through simulation research. www.ahrq.gov/research/findings/factsheets/errors-safety/simulproj11/index.html (accessed April 10, 2015).
AHRQ. 2015. Patient safety network: Patient safety primers. Systems approach. http://psnet.ahrq.gov/primer.aspx?primerID=21 (accessed May 8, 2015).
Bakker, A. B., P. M. Le Blanc, and W. B. Schaufeli. 2005. Burnout contagion among intensive care nurses. Journal of Advanced Nursing 51(3):276–287.
Barrows, H. S. 1980. Problem-based learning: An approach to medical education. New York: Springer Publishing Company.
CAP (College of American Pathologists). 2014. Autopsy. www.cap.org/apps//cap.portal?_nfpb=true&cntvwrPtlt_actionOverride=%2Fportlets%2FcontentViewer%2Fshow&_windowLabel=cntvwrPtlt&cntvwrPtlt%7BactionForm.contentReference%7D=committees%2Fautopsy%2Fautopsy_index.html&_state=maximized&_pageLabel=cntvwr (accessed April 10, 2015).
Carayon, P., B. T. Karsh, R. Cartmill, et al. 2010. Incorporating health information technology into workflow redesign: Request for information summary report. Rockville, MD: Agency for Healthcare Research and Quality.
Cosmides, L. 1996. Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition 58(1):1–73.
Cosmides, L., and J. Tooby. 1994. Better than rational: Evolutionary psychology and the invisible hand. American Economic Review 84(2):327–332.
Croskerry, P. 2005. Diagnostic failure: A cognitive and affective approach. Advances in Patient Safety 2:241–254.
Evans, J. S. B. T., and K. E. Stanovich. 2013. Dual-process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science 8(3):223–241.
Gigerenzer, G. 2000. Adaptive thinking: Rationality in the real world. New York: Oxford University Press.
Gigerenzer, G., and D. G. Goldstein. 1996. Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review 103:650–669.
Govern, P. 2013. Diagnostic management efforts thrive on teamwork. http://news.vanderbilt.edu/2013/03/diagnostic-management-efforts-thrive-on-teamwork (accessed February 11, 2015).
HealthIT.gov. 2013. Learn EHR basics. www.healthit.gov/providers-professionals/learn-ehr-basics (accessed March 11, 2014).
HealthIT.gov. 2014. Clinical decision support. www.healthit.gov/policy-researchers-implementers/clinical-decision-support-cds (accessed April 9, 2014).
HIMSS (Healthcare Information and Management Systems Society). 2014. What is interoperability? www.himss.org/library/interoperability-standards/what-is-interoperability (accessed February 9, 2015).
HRSA (Health Resources and Services Administration). 2015. About health literacy. www.hrsa.gov/publichealth/healthliteracy/healthlitabout.html (accessed August 10, 2015).
IEA (International Ergonomics Association). 2000. The discipline of ergonomics. www.iea.cc/whats/index.html (accessed April 10, 2015).
IOM (Institute of Medicine). 1990. Medicare: A strategy for quality assurance, Volume II. Washington, DC: National Academy Press.
IOM. 2000. To err is human: Building a safer health system. Washington, DC: National Academy Press.
IOM. 2001. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press.
IOM. 2004. Patient safety: Achieving a new standard for care. Washington, DC: The National Academies Press.
IOM. 2011. Finding what works in health care: Standards for systematic reviews. Washington, DC: The National Academies Press.
IOM. 2012. Health IT and patient safety: Building safer systems for better care. Washington, DC: The National Academies Press.
ISO (International Organization for Standardization). 1998. Ergonomic requirements for office work with visual display terminals (VDTS)—Part 11: Guidance on usability. www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-1:v1:en (accessed February 25, 2015).
Johns Hopkins. 2015. Standardized patient program. www.hopkinsmedicine.org/simulation_center/training/standardized_patient_program (accessed April 10, 2015).
Kahneman, D. 2011. Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Klein, G. 1998. Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Klein, G. 2003. The power of intuition. New York: Doubleday.
Lipshitz, R., G. Klein, J. Orasanu, and E. Salas. 2001. Taking stock of naturalistic decision making. Journal of Behavioral Decision Making 14(5):331–352.
Mello, M. M., R. C. Boothman, T. McDonald, J. Driver, A. Lembitz, D. Bouwmeester, B. Dunlap, and T. Gallagher. 2014. Communication-and-resolution programs: The challenges and lessons learned from six early adopters. Health Affairs 33(1):20–29.
OTA (Office of Technology Assessment). 1994. Defensive medicine and medical malpractice, OTA-H-602. Washington, DC: U.S. Government Printing Office.
Porter, M. E. 2010. What is value in health care? New England Journal of Medicine 363(26): 2477–2481.
Stanovich, K. E., and M. E. Toplak. 2012. Defining features versus incidental correlates of type 1 and type 2 processing. Mind & Society 11(1):3–13.
Stemler, S. E. 2007. Interrater reliability. In N. J. Salkind (ed.), Encyclopedia of Measurement and Statistics. Thousand Oaks, CA: Sage Publications.
Welch, H. G., and W. C. Black. 2010. Overdiagnosis in cancer. Journal of the National Cancer Institute 102(9):605–613.