Section II
Keynote Speakers

The Opportunity of Precursors

JAMES P. BAGIAN

U.S. Department of Veterans Affairs National Center for Patient Safety

One difficulty in identifying vulnerabilities in a system, sometimes called the precursor problem, is hindsight bias. After a big, smoking hole appears in the ground, it is very easy to say someone should have taken the problem seriously. That bias certainly appeared in the wake of the Three Mile Island incident and the Challenger and Columbia space shuttle disasters. But, as people with operational or hands-on managerial experience know, in any large, complex project, people often bare their souls and express their uncertainties about many aspects of the project at the last minute. Often these last-minute revelations are attempts to avoid being held responsible for a bad outcome—in the case of the space shuttle, the deaths of seven people who were strapped in and launched that day. The manager, at whose desk the buck finally stops, has to ask what data support these last-minute concerns.

Even if the data are not very good, decisions must be made. Concerns about possible negative outcomes, although they must be taken into account, should not inordinately influence a final decision, which should be based on facts and not emotions. Every project entails risks, which can never be eliminated entirely. Nevertheless, when a bad outcome occurs, the knee-jerk response is to equate it with a bad decision. When the causes have been analyzed, however, they may very well show that the decisions leading up to the bad outcome were entirely appropriate.

The real challenge we face is how to go from theory to practice. In making that practical transition, it is essential that we first determine the ultimate goal. Unless a goal is clearly understood—not in tactical terms but in terms of the end result—confusion and ineffective actions are likely to result. For instance, we might ask what the ultimate goal of manufacturing buggy whips was: to make transportation using horses more efficient, or to enable people, merchandise, and information to move over large distances as quickly as possible. If we understand that the latter was the goal, then clearly other modalities, such as cars, trucks, and airplanes, should have been pursued as they became available. The changeover to more effective modalities is not always instantaneous, though, because it is easy to become enamored of a particular technique, hobby, or traditional way of achieving a goal. So we must always ask ourselves what the overarching goal is and how we can best reach it.

Systems in the health-care field are not always clearly aligned with the goals of the overall system. If we ask about the overarching goal of patient safety initiatives, the answer is usually to eliminate errors. The problem with this answer is that eliminating errors is a tactic, not a goal. Of course, it is impossible to eliminate all errors; therefore, adopting a goal of eliminating all errors is tantamount to declaring that a project is doomed to fail. It may sound simplistic, but failing to distinguish between goals and tactics can result in efforts that do not lead to solutions to the problems at hand. Activities should be measured against the yardstick of whether they really contribute to meeting strategic objectives.

In medicine, our goal should be to prevent unnecessary harm to the patient, not to eliminate errors. People involved in health care may disagree about standards of care or about what constitutes an error. But when patients are avoidably injured or killed, everyone agrees that the outcome is bad.

RECOGNIZING OUTCOMES

The aftermath of the Three Mile Island incident is instructive for understanding the perception of outcomes. Two polar-opposite views of the incident emerged, raising questions about how we define success. One side fretted that the accident demonstrated that nuclear power was dangerous. People who drew this conclusion tended to view any risk to life as undesirable or unacceptable. The other side felt that the plant’s safety systems had achieved their goals by preventing a disaster. Both views are worthy of consideration, but in fairness, the yardstick for evaluation must be performance measured against design specifications. If, in hindsight, the design specifications are determined to be inadequate, then the specifications should be revised; but the performance of the system on the day in question is not the primary issue.

The traditional approach to recognizing a problem is reactive and retrospective, which appears to be a natural human response. Unfortunately, the perceptions created by this less-than-scientific, or hindsight-biased, approach can unduly influence behavior. September 11, 2001, for instance, made a huge impression on the way people think about terrorism. After that day, many people refused to fly on airplanes—some still refuse today. Arguably, flying is no riskier today than it was before September 11, but people perceive it differently now, and those perceptions govern their decisions.

Similar rethinking about aviation has happened before. Until the late 1940s, airplane crashes were in many ways regarded as “the cost of doing business.” Statistics from World War II show that the number of planes lost as a result of normal, noncombat activities was staggering compared to the number lost to enemy fire. At the time, this was not regarded as abnormal, but as the way of the world. Some would call the acceptance of the risk of airplane crashes the “normalization of deviance.”

The attitude toward plane crashes began to change in the 1950s. Accident investigation data from the U.S. Navy (similar to data from other military services) show that the aircraft loss rate has dramatically dropped since then. In 1950, there were approximately 54 losses per 100,000 flying hours; 776 aircraft were lost in 1954 alone. By 1996, the figure had dropped to approximately two aircraft per 100,000 flying hours. That’s a 96 percent reduction, even though the physical environment in which pilots operated (i.e., low-level, high-speed, all-weather, and night flights) presented many more objective hazards than before. This reduction was accomplished through the institution of a proactive and systematic approach to safety.
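
The size of that improvement can be checked directly from the two loss rates quoted above; here is a minimal sketch of the arithmetic, using the approximate figures given in the text:

```python
# Approximate U.S. Navy aircraft loss rates quoted above (losses per 100,000 flying hours).
rate_1950 = 54.0
rate_1996 = 2.0

# Percent reduction from 1950 to 1996.
reduction = (rate_1950 - rate_1996) / rate_1950 * 100
print(f"reduction: {reduction:.0f}%")  # prints "reduction: 96%"
```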

REPORTING SYSTEMS

We study precursors because we want to take a new, proactive approach to system safety that emphasizes prevention. To become proactive, however, we must first identify problems that could lead to bad outcomes. One of the tools for becoming proactive is an effective reporting system. But because there can be tremendous barriers to reporting, it is essential that the ultimate goals of the reporting system be clearly defined. One of the first decisions that must be made is whether the purpose of reporting is organizational learning or accountability. Safety systems that have a goal of preventing harm in a sustainable way must emphasize organizational learning.

Another important decision is what should be reported. Is the purpose of the reporting system to look only at things that have caused an undesirable outcome, or is it also to scrutinize other things, such as close calls that almost resulted in undesirable outcomes but did not, either because of a last-minute “good catch” or because of plain good fortune? Close calls are extremely important areas of study because they are much more common than actual bad outcomes. Thus, close calls provide repeated opportunities to learn without first having to experience a tragic outcome. In addition, because close calls do not result in harm, people are generally more willing to discuss them openly and candidly; they are less fearful of retribution for the part they played in the event. Also, people are often more motivated to analyze close calls if understanding them is considered an opportunity to act proactively to prevent undesirable outcomes in the future.

However, in medicine as in industry, close calls are often ignored. For instance, the Joint Commission on Accreditation of Healthcare Organizations, a quasi-regulatory body, recognizes the value of analyzing close calls but does not require that they be investigated. In fact, if someone submits a voluntary report of a close call with an associated root-cause analysis (RCA) to the Joint Commission, it will not be accepted. This policy sends a mixed message that appears to emphasize learning only from events in which patients have been injured rather than from close calls where learning can take place without first having to hurt a patient.

Great care must be taken in using data from reporting systems. By its very nature, the self-reporting that populates most reporting systems cannot be used to estimate the true incidence of events. This fact is often overlooked, as has been demonstrated by erroneous incidence statistics published after analysts have “tortured the data” from a variety of reporting systems. We must remember that reports are simply reports; they do not necessarily reflect reality. Trends and rates based on them are simply trends and rates of what was reported, which may bear no relation to the trends and rates in the system as a whole. The best use of reporting systems is for identifying potential system vulnerabilities.
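
To make the point concrete, here is a minimal, entirely hypothetical simulation (the event counts and reporting propensities below are invented for illustration; they are not VA figures). If staff willingness to report grows over time, reported counts can rise even while true incidence falls, which is why report-based rates say little about the underlying system:

```python
import random

random.seed(0)

# Hypothetical figures for illustration only: true events per year fall,
# while the chance that any given event is reported rises (e.g., growing
# trust in the reporting system).
true_events   = {2000: 1000, 2001: 900, 2002: 800, 2003: 700}
report_chance = {2000: 0.05, 2001: 0.10, 2002: 0.20, 2003: 0.40}

for year in sorted(true_events):
    reported = sum(random.random() < report_chance[year]
                   for _ in range(true_events[year]))
    print(year, "true events:", true_events[year], "reports received:", reported)

# The reported counts climb year over year even though true incidence is falling:
# useful for surfacing vulnerabilities, useless for estimating rates.
```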

We had a report, for example, that identified a significant problem. A patient was physically pinned to a magnetic resonance imaging (MRI) scanner by the “sandbags” that had been used to stabilize him. MRI scanners have very strong magnets, and sandbags are sometimes inappropriately filled with ferromagnetic particles rather than sand. Had we relied on the so-called rate and incidence statistics culled from our reporting system, we would have concluded that this was not an important problem, because we had never received previous reports of such problems in MRI suites. However, we thought this represented a real vulnerability, so we went out to medical centers, both inside and outside the VA, to observe, talk to people, and learn what was happening on a daily basis. We found that similar system issues were quite common. For example, MRIs often caused pens and paper clips to fly out of shirt pockets, sometimes striking patients.

As a result of this fieldwork, we implemented a system-wide alert with instructions for mitigating these risks to our patients. If we had relied on misleading statistics based on reports, we would have ignored the single report and decided that it was not worth studying the problem.

A reporting system is essential, but it is also essential not to become a slave to it. Reporting systems and self-reports never totally reflect reality, but they are valuable for identifying system vulnerabilities. But if we only sit in our offices counting and sorting reports, it is unlikely that anything will get better.

ANALYZING PRECURSORS

Reports can be thought of as fuel for the safety-improvement engine. How then do we get this fuel? A mandatory reporting system is not the answer. Although it may seem like a simple solution, it ignores real issues concerning the effective relationship that must be developed with those from whom you wish to receive reports. Anyone who thinks people will report adverse events just because there is a regulation requiring them to do so is living in a dream world. As Dr. Charles Billings, the father of the Aviation Safety Reporting System, stated, “in the final analysis all reporting is voluntary” (statement at a meeting of the Advisory Panel on Patient Safety Systems, Washington, D.C., March 12, 1998). In other words, there is no such thing as mandatory reporting. Billings says people only report what they care to report, either because there are penalties for not reporting and the event was witnessed by others or because they feel there is some intrinsic value in reporting an event to improve the system. People will not report simply because there is a rule that they do so. Senate Bill 720 and House Bill 663 recognize the fallacy of mandatory reporting requirements and endorse voluntary reporting.

There are numerous sources of information about hazards and risks. The challenge becomes determining how to prioritize reports and what to do with the information. The VA has developed a prioritization methodology based on the severity of an event and its probability of occurrence; we assign each event a safety assessment code, which determines if a detailed RCA is required.
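
The talk does not reproduce the VA's actual scoring tables, so the sketch below only illustrates the general idea: cross severity with probability to get a score, and require a detailed RCA above some threshold. The category names, matrix values, and threshold are assumptions made for illustration, not the VA's published safety assessment code:

```python
# Illustrative severity-by-probability prioritization matrix. The categories,
# scores (1 = lowest priority, 3 = highest), and RCA threshold are assumed for
# illustration; they are not the VA's published safety assessment code values.
SCORE = {
    "minor":        {"remote": 1, "uncommon": 1, "occasional": 1, "frequent": 2},
    "moderate":     {"remote": 1, "uncommon": 1, "occasional": 2, "frequent": 2},
    "major":        {"remote": 2, "uncommon": 2, "occasional": 2, "frequent": 3},
    "catastrophic": {"remote": 3, "uncommon": 3, "occasional": 3, "frequent": 3},
}

RCA_THRESHOLD = 3  # assumption: only the highest-priority events get a full RCA


def safety_assessment_code(severity: str, probability: str) -> int:
    """Look up the priority score for a reported event."""
    return SCORE[severity][probability]


def requires_rca(severity: str, probability: str) -> bool:
    """Decide whether the event triggers a detailed root-cause analysis."""
    return safety_assessment_code(severity, probability) >= RCA_THRESHOLD


print(requires_rca("catastrophic", "remote"))   # True under these assumptions
print(requires_rca("moderate", "occasional"))   # False under these assumptions
```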

In determining what action to take, it is essential to look at the root causes and contributing factors that led to an undesirable condition or event. There is seldom a single cause. A thorough analysis of underlying causes can provide insight into the problem and a basis for taking steps to correct or prevent it. For instance, we looked at a collection of RCAs dealing with cases in which incorrect surgical procedures were performed or incorrect sites were operated on. The RCAs revealed that the problem was quite different from what had been thought.

It had been generally accepted that the problem was mainly one of identifying the correct side of the body to operate on. After looking at the real situation, we found that marking to establish laterality was an issue, but almost as big an issue was that the incorrect patient was operated on because of inadequate patient identification methods. In many cases, the incorrect patient was scheduled to undergo a similar procedure the same day by the same physician, but on another part (or side) of the body. Only by understanding the underlying causes could we take appropriate countermeasures and implement preventive strategies.

CREATING LEARNING SYSTEMS

There are many accountability systems in medicine but very few learning systems. Most medical problems and errors that occur today have happened before and will continue to happen in the future unless we do something differently. It is naïve to expect that the traditional approach of punishing the individual(s) involved in an incident will make the world safer. Rather than assigning blame, we must start learning from mistakes and translating the lessons into system-based solutions.

One of the most important steps in creating a learning system is demonstrating to participants in the system that the objective is not punishment but systematic improvements that will prevent undesirable events from occurring in the future. For a learning system to be trusted, it must be considered fair, in plan and in practice. This does not mean that it must be a blame-free system.

We have found that frontline staff do not want a blame-free system; they want a fair system. Therefore, we established a ground rule that the results of an RCA could not be used to punish an individual. However, because we did not want the safety system to be used, or appear to be used, to hide events that all parties agree require disciplinary action, we decided to define events that were “blameworthy.” We did not use legal terms, such as reckless or careless, that have been interpreted differently in different jurisdictions. Instead, we defined a blameworthy act as an “intentionally unsafe act,” that is, a criminal act, an act committed under the influence of drugs or alcohol, or a purposefully unsafe act.

Blameworthy acts are not examined as part of the safety system, which is strictly for learning. They are passed on to the administrative system where, besides learning, punishment may be an outcome; in addition, the proceedings are discoverable. Thus, those who wish to report events clearly understand under what circumstances they can be subject to punitive action. Although I cannot prove causality, I can state that, after we instituted these definitions, reporting went up 30-fold in an organization that already had a good reporting culture.
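
Expressed as a minimal routing sketch (the data shape and function names below are hypothetical, not the VA's implementation), the rule is simply that a report leaves the learning-only safety system for the administrative system only when it describes an intentionally unsafe act as defined above:

```python
from dataclasses import dataclass


@dataclass
class EventReport:
    """Hypothetical report shape, for illustration only."""
    description: str
    criminal_act: bool = False
    impaired_by_drugs_or_alcohol: bool = False
    purposefully_unsafe: bool = False


def is_blameworthy(report: EventReport) -> bool:
    """An 'intentionally unsafe act' as defined above: criminal, impaired, or purposeful."""
    return (report.criminal_act
            or report.impaired_by_drugs_or_alcohol
            or report.purposefully_unsafe)


def route(report: EventReport) -> str:
    """Blameworthy acts go to the administrative system; everything else stays in the learning system."""
    return "administrative system" if is_blameworthy(report) else "safety system (learning only)"


# Example: an ordinary close call stays in the learning system.
print(route(EventReport("close call: mislabeled syringe caught before administration")))
```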

A learning system must be shown to be useful. People in the organization must understand why the system is necessary and what its benefits might be. People will not waste time reporting if their participation makes no difference. If the system becomes a black hole into which learning and energy disappear, people will not participate.

The individual is the most important component of an effective learning system. The system is absolutely necessary to make things work, but it ultimately depends on people. Systems that work best are the ones people internalize as their own. A successful system creates an environment that makes people want to do the right thing. In systems that work, people speak up and communicate with each other.

In a study last year, pilots and board-certified physicians were asked if they would protest if their superiors told them to do something with which they disagreed. Virtually all of the pilots said yes, but only about half of the physicians did. This shows vastly different thresholds for communicating critical information.

TAKING THE LEAD

This workshop is not about management; it is about leadership. Successful leaders must be willing to take on risk. One example of leaders assuming risk is the way the VA issues patient safety alerts. Alerts identify a discrete problem, describe its solution, and set a time by which the solution must be implemented. We identify so many problems through our reporting system that it would be easy to issue 100 alerts a day. But we know that issuing too many alerts would ultimately make people indifferent to alerts, thus creating new risks. Therefore, we prioritize potential alerts through a scoring mechanism, and we issue only the most critical alerts for national dissemination.

So far, this approach has resulted in an average of three or four alerts per month. We realize that this approach could put leadership in a politically awkward position someday if a vulnerability for which we chose not to issue an alert resulted in a patient injury. But rather than taking a self-serving, risk-averse position and sending out many more alerts, thus passing the responsibility and risk to the front line, VA leadership believes we can more effectively help patients by issuing alerts judiciously, even though the leadership is placed in greater personal/professional jeopardy. Leaders should be willing to accept the risk of being criticized in exchange for a safer system for patients.

Leaders must demonstrate priorities by their actions, as well as their words. Paying them lip service is not enough. The old aphorism, lead by example, is still true. Leaders must maintain a relentless drumbeat that safety-related activities are an inextricable part of everything we do.

There is altruism out there, and people will participate in a reporting system that they feel is fair and that provides a safe environment for them. People cannot be forced to participate; they must be invited to play. In the VA patient safety program, we have demonstrated that the best way of getting people to participate initially is to make program adoption and implementation voluntary. In a pilot test, this approach attracted dedicated volunteers, and within a few weeks, the response to the program was so favorable that the remaining facilities asked to implement the program. We completed the pilot test and rolled the program out nationally in just nine months. In fact, facilities that had initially been reluctant to adopt the new system became impatient when they were told they would have to wait their turn to be trained in the use of the new system. It is possible to win enthusiastic acceptance, but this requires patience and not trying to force acceptance.

SUMMARY

The purpose of a reporting system is to identify vulnerabilities; the desired result should be preventing harm to patients. Once the causes of underlying vulnerabilities have been determined, corrective action can be implemented. System improvements should be made based on the study of causes and vulnerabilities. However, if the reporting system does not result in actions that mitigate the vulnerabilities locally or throughout a system, then the entire effort is for naught.

Perhaps the most important resource for learning is close calls. Under the old VA reporting system, reports of close calls accounted for only 0.05 percent of all reports. In the current system, which emphasizes close calls, 95 percent of reports are about close calls. Another important breakthrough was ensuring that events that were reported resulted in action being taken. Under the old system, less than 50 percent of all events that received in-depth analysis resulted in any action being taken. This created great cynicism. With the new system, less than 1 percent of RCAs do not result in corrective action(s). Not every action is effective, of course, but if no action is taken, it is certain that the situation will not improve.

We typically use a different team for every RCA. In this way, we provide experiential learning for staff members, who come to appreciate the value and details of the safety system. When they return to their jobs, their view of the world is very different. The response to this changing-team approach has been almost uniformly positive.

In the end, success is not about counting reports. It is about identifying vulnerabilities and precursors to problems and then formulating and implementing corrective actions. Analysis and action are the keys, and success is manifested by changes in the culture of the workplace. Change does not happen overnight; it takes time.

As Einstein said, “The significant problems we face cannot be solved with the same level of thinking we were at when we created them.” His corollary to this was that “Insanity is doing the same thing over and over and expecting different results.” But probably most important of all for us today is something Margaret Mead said, “Never doubt that a small group of thoughtful, committed people can change the world. Indeed it’s the only thing that ever has.”
