F
Quality Improvement and Proactive Hazard Analysis Models: Deciphering a New Tower of Babel

Commissioned Paper by John E. McDonough, Milbank Memorial Fund; Ronni Solomon and Luke Petosa, ECRI

ABSTRACT

Health care leaders seeking to improve quality and prevent harm to patients have an array of tools to help them in this task. We propose dividing these tools into two categories: (1) quality improvement tools—including Continuous Quality Improvement, Six Sigma, and the Toyota Production System—which can be applied to many organizational challenges, including but not limited to safety concerns; and (2) proactive hazard analysis tools—including Healthcare Failure Mode and Effect Analysis, Hazard Analysis and Critical Control Points, Hazard and Operability Studies, and Probabilistic Risk Assessment—which are designed specifically to identify hazards and to prevent harm. Each tool has common ancestry in the application of the scientific method to process analysis pioneered by Shewhart and Deming; each has unique attributes and advantages. This report explains each model in the context of patient safety. We recommend establishment of a clearinghouse to enable physicians and other practitioners to learn from experimentation with these models and to establish a common analytic framework. We also recommend use of models for personal health information as a methodology for medical specialties to address patient safety concerns.



I. INTRODUCTION

Two Institute of Medicine (IOM) reports, To Err Is Human (1999) and Crossing the Quality Chasm (2001), brought public and health care industry concerns about quality, patient safety, and hazard analysis to greater visibility. As concerns about patient safety and hazard analysis have risen, health industry leaders have sought tools to address these challenges more effectively. Many tools exist; the quality improvement and hazard analysis models that offer methodologies to make medicine safer include Six Sigma, Hazard Analysis and Critical Control Points (HACCP), Failure Mode and Effect Analysis/Healthcare Failure Mode and Effect Analysis (FMEA/HFMEA™), the Toyota Production System (TPS), Hazard and Operability Studies (HAZOP), Total Quality Management/Continuous Quality Improvement (TQM/CQI), Root Cause Analysis (RCA), and Probabilistic Risk Assessment (PRA). Each approach has champions, supported by consultants ready to train managers and frontline workers in its rollout. Competing terms, acronyms, symbols, and techniques suggest a Tower of Babel—health leaders speaking different languages and using tools that do not resemble one another.

As demands for improvements in patient safety escalate, the IOM's Patient Safety Data Standards Committee seeks a framework for understanding these approaches and for identifying the principles necessary for any quality improvement (QI) or proactive hazard analysis (PHA) methodology to succeed. This paper provides an overview of key features of prominent methodologies, offers a framework for understanding each, and shows how each relates to the others. We outline principles for creating effective hazard analysis in health care organizations, and we identify conceptual and methodological considerations in the design and evaluation of risk/hazard identification. We relate hazard analysis to adverse event prevention and discuss strategies for applying this approach to health care. Finally, we discuss the data requirements and measurement tools needed to support this approach.

As a caveat, we recall the words of Avedis Donabedian, who devised our modern framework for understanding quality in health care: "If we are truly committed to quality, almost any mechanism will work. If we are not, the most elegantly constructed of mechanisms will fail." While today's quality leaders may dispute the first sentence, all affirm the validity of the second. QI and PHA tools can assist any health care organization committed to making care safer, but none will succeed in the absence of deep and sustained leadership commitment.

II. OVERVIEW OF EXISTING QUALITY IMPROVEMENT/HAZARD PREVENTION MODELS

Essential Features of Health Care Quality

The Chasm report identifies six attributes of a quality health care system: (1) safe, (2) effective, (3) patient centered, (4) timely, (5) efficient, and (6) equitable.1 Safety is a preeminent feature of health care quality, first on the list, though not the only one. Health care quality may be thought of as a circle, with each of the six essential features forming smaller overlapping circles within the larger whole.

Over the past 60 years, many models have been developed to help organizations improve quality and enhance safety. Among the methodologies discussed here, we distinguish between tools that address all six aspects of quality and tools with an explicit focus on safety and hazard analysis. General QI tools can be used to improve timeliness, efficiency, and other goals in addition to safety. PHA tools are more prescriptive and require more steps, including documentation; in cases where a tool is applied to an ongoing service operation (e.g., HACCP), it becomes part of a firm's daily functioning.2 This distinction provides the framework for this paper's discussion of the various methodologies:

Quality Improvement Tools (QI)
- Total Quality Management (TQM)
- Continuous Quality Improvement (CQI)
- Toyota Production System
- Six Sigma

Proactive Hazard Analysis Tools (PHA)
- Failure Mode and Effect Analysis (FMEA)
- Healthcare Failure Mode and Effect Analysis (HFMEA™)
- Hazard Analysis and Critical Control Points (HACCP)
- Hazard and Operability Studies (HAZOP)
- Probabilistic Risk Assessment (PRA)

For comparative purposes, we also include a discussion of Root Cause Analysis under PHA tools. Following is a brief outline of each approach, describing its purpose and features, a thumbnail history, and key applications. Tables D-1 and D-2 summarize key points.

1. Quality Improvement Tools: TQM/CQI, Toyota Production System, Six Sigma

The three approaches described here can be used to improve all aspects of quality and are not targeted specifically at hazard prevention.

TABLE D-1 Quality Improvement Approaches

Continuous Quality Improvement (Total Quality Management)
  Origin: TQM: Japanese and U.S. manufacturing, 1950s/1980s; CQI: Berwick and Batalden, 1980s
  Purpose: Continuously improve quality by relentless focus on customer satisfaction
  Core methodology:
    1. Plan a process improvement.
    2. Do the intervention.
    3. Study the results from the intervention.
    4. Act on the results—if favorable, by institutionalizing; if unfavorable, by testing another intervention.
  Key example: Ford Motor Company
  Health care example: Institute for Healthcare Improvement; JCAHO accreditation requirement
  Strength: Most widely dispersed and recognized improvement methodology

Toyota Production System (Lean Production)
  Origin: Toyota
  Purpose: Lean production; endlessly reduce costs and lead time through elimination of waste
  Core methodology:
    Rule 1. All work is highly specified as to content, sequence, timing, and outcome.
    Rule 2. Every customer-supplier connection is direct, with an unambiguous yes-or-no way to send requests and receive responses.
    Rule 3. The pathway for every product and service must be simple and direct.
    Rule 4. Improvement must be made in accord with the scientific method, under the guidance of a teacher, at the lowest possible level in the organization.
  Key example: Toyota, Alcoa
  Health care example: Pittsburgh Regional Healthcare Initiative
  Strength: Focus on elimination of waste and empowerment of frontline workers

Six Sigma
  Origin: Motorola in 1984
  Purpose: Achieve near zero defects (3.4 per million opportunities)
  Core methodology:
    1. Define: Identify problems, clarify scope, define goals.
    2. Measure: Measure performance against requirements, gather data, refine goals.
    3. Analyze: Develop hypotheses, identify root causes, analyze best practices.
    4. Improve: Conduct experiments to remove root causes, test solutions, measure results, standardize solutions, implement new processes.
    5. Control: Establish standard measures to maintain performance and correct problems as needed.
  Key example: General Electric
  Health care example: University of Virginia Health System; Virtua Health, New Jersey
  Strength: Focus on near zero defects and control of gains once achieved

TABLE D-2 Proactive Hazard Analysis Approaches

Healthcare Failure Mode and Effect Analysis (adapted from FMEA)
  Origin: U.S. military, 1949, and NASA, 1960s
  Purpose: To evaluate potential failures and their causes, pointing to actions to eliminate or reduce them
  Core methodology:
    1. Define the HFMEA™ topic.
    2. Assemble the HFMEA™ team.
    3. Describe the process.
    4. Conduct a failure analysis.
    5. Evaluate actions and outcome measures.
  Key example: U.S. auto manufacturing (FMEA)
  Health care example: VHA
  Strength: Adapted specifically for health care; model for JCAHO proactive risk assessment requirement

Hazard Analysis and Critical Control Points
  Origin: Pillsbury for NASA, 1959, to ensure safe food for astronauts
  Purpose: A systematic approach to the identification, assessment, and control of hazards
  Core methodology:
    1. Conduct a hazard analysis.
    2. Identify critical control points (CCPs).
    3. Establish critical limits for each CCP.
    4. Establish monitoring requirements.
    5. Establish corrective actions when a CCP deviation occurs.
    6. Establish ongoing verification procedures.
    7. Establish record-keeping procedures.
  Key example: Food manufacturing and services
  Health care example: Medical device manufacturing
  Strength: International standard in the food sector; close interface with public-sector regulation; empirical evidence of effectiveness

Hazard and Operability Studies
  Origin: United Kingdom—chemical industry, 1960s
  Purpose: A team-based, systematic, and qualitative method to identify hazards (or deviations in design) in process industries
  Core methodology:
    1. Will someone be harmed? Who? In which way? How severely?
    2. Will process performance be reduced? In which way? How severely? What will the impact be?
    3. Will costs increase? If so, by how much?
    4. Will there be cascading effects, where one deviation leads to other deviations? If so, what are they?
  Key example: Chemical industry
  Health care example: Telemedicine in the European Union
  Strength: Compels parties to assess potential difficulties and devise mutually agreeable solutions

Probabilistic Risk Assessment
  Origin: Aviation industry
  Purpose: A tool to assess the contribution of multiple failures, and combinations of failures, that may lead to catastrophic occurrences
  Core methodology:
    1. Develop a fault tree to visualize risk, built from three elements: basic events, "AND" gates, and "OR" gates.
    2. Add probability predictions to the fault tree.
  Key example: Aviation, nuclear power
  Health care example: Environmental health risk assessment
  Strength: Models all combinations of failures that may lead to adverse events
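To make the fault-tree logic in the PRA entry of Table D-2 concrete, the following sketch combines hypothetical basic-event probabilities through "AND" and "OR" gates to obtain a top-event probability. The events, the numbers, and the independence assumption are illustrative only and are not drawn from this paper.

    # Minimal fault-tree sketch: combine basic-event probabilities through
    # AND/OR gates. Event names and probabilities are hypothetical and
    # assume independence; real PRA models also treat common-cause failures.

    def and_gate(*probs: float) -> float:
        """Top event occurs only if ALL inputs occur (product of probabilities)."""
        result = 1.0
        for p in probs:
            result *= p
        return result

    def or_gate(*probs: float) -> float:
        """Top event occurs if ANY input occurs (1 minus product of complements)."""
        result = 1.0
        for p in probs:
            result *= (1.0 - p)
        return 1.0 - result

    # Hypothetical basic events for an infusion-related adverse event.
    pump_setting_error = 1e-3    # per administration
    alarm_failure = 1e-2         # alarm does not sound
    alarm_missed = 5e-2          # alarm sounds but is not acted on

    # Harm requires a setting error AND a failed detection; detection fails
    # if the alarm fails OR the alarm is missed.
    detection_failure = or_gate(alarm_failure, alarm_missed)
    top_event = and_gate(pump_setting_error, detection_failure)

    print(f"P(detection failure) = {detection_failure:.4f}")
    print(f"P(top event)         = {top_event:.2e}")

Attaching probability predictions to the gates in this way is what allows PRA to rank the combinations of failures that contribute most to a catastrophic outcome.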

Still, advocates of each approach can point to examples where it has been used to achieve safety improvements. TQM is the earliest of the three approaches; the other two, the Toyota Production System and Six Sigma, acknowledge their debts to TQM/CQI principles and techniques.

Total Quality Management/Continuous Quality Improvement

TQM/CQI is the earliest application of the scientific method to process improvement. TQM techniques have been applied widely in U.S. and Japanese manufacturing, and in other organizations facing competitive challenges, as a disciplined approach to enhancing customer satisfaction. CQI is TQM applied within health care. The method requires organizational leaders to establish improvement goals and to choose projects that can achieve specific improvements. Cross-functional teams devise a flow chart of the process under study and use data to understand variations in quality. The methodology regards errors as products of poorly designed systems, not as the fault of individual workers or "bad apples." Once teams have developed a sophisticated understanding of a process, they begin a four-step practice:

1. Plan an intervention/experiment to improve the process.
2. Do the hypothesized intervention.
3. Study the results from the intervention.
4. Act on the results—if favorable, by institutionalizing the intervention; if unfavorable, by testing another intervention.3

Organizations with robust CQI programs have many improvement teams working at all times.

TQM was introduced to Japanese manufacturers in the 1950s by Deming, Juran, and others and to U.S. manufacturers in the late 1970s and 1980s. Berwick and others proposed TQM as an alternative to traditional quality assurance under the term "Continuous Quality Improvement." CQI may be used to improve many organizational features beyond clinical quality, including patient satisfaction, error rates, waste, unit production costs, productivity, market share, and more. In the early 1990s, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) embraced this new paradigm and included CQI activities in its requirements for accredited institutions.

Fifteen years after its introduction, however, CQI has not lived up to its promise to "cure health care." Reviewing CQI's history in 1998, Blumenthal and Kilo found both accomplishments and disappointments.4

Among the former are a changed mind-set from assurance to continuous improvement, the abandonment of blame in favor of a focus on system defects, the creation of a customer focus, the motivation of improvement projects across the nation, and the education of thousands of health care workers in improvement techniques. Among its shortcomings are the inability to point to a dramatically changed health care institution despite manufacturing examples such as Toyota, the failure to make deep inroads into clinical quality, and a scant literature documenting sustained improvement. CQI has been eclipsed by other methodologies, including the Toyota Production System, Six Sigma, and reengineering. Still, CQI remains the predominant quality improvement philosophy and methodology in the health care industry today.

Toyota Production System

Lean production focuses on the elimination of waste—of materials, time, idle equipment, and inventory—to improve productivity and profits by improving material handling, inventory, quality, scheduling, personnel, and customer satisfaction. The core methodology as applied at Toyota is captured in four rules:

Rule 1: All work is highly specified as to content, sequence, timing, and outcome.
Rule 2: Every customer-supplier connection must be direct, and there must be an unambiguous yes-or-no way to send requests and receive responses.
Rule 3: The pathway for every product and service must be simple and direct.
Rule 4: Any improvement must be made in accordance with the scientific method, under the guidance of a teacher, at the lowest possible level of responsibility.

A key feature is the empowerment of line workers to implement design changes and to halt a process to avoid errors—turning workers into problem solvers. Although some initially attributed Toyota's success to cultural differences between Japan and the United States, the company's success in implementing the strategy in its North American plants neutralized that criticism.

Alcoa used the process to achieve one of the safest manufacturing sites for workers in the nation. Its head, former U.S. Treasury Secretary Paul O'Neill, helped establish the Pittsburgh Regional Healthcare Initiative in 1998, bringing stakeholders together to pursue perfecting the region's health care system using the Toyota Production System (see the case studies section later in this appendix).

The initiative focuses on three goals:

- Patient safety: reducing hospital-acquired infections and medication errors to zero.
- Clinical initiatives: achieving breakthrough performance in cardiac surgery, depression, diabetes, orthopedics, and obstetrics.
- Perfecting patient care: redesigning organizations to allow everyone to learn from errors and problems.5

Six Sigma

Six Sigma is a quality program that seeks to improve processes so that no more than 3.4 mistakes occur per million opportunities. One commentator describes its approach as "much like that of Total Quality Management, perhaps with a more aggressive goal." Proponents suggest that the relentless focus on error reduction provides a structure and focus missing from other QI techniques. Six Sigma has a five-step improvement cycle, corresponding to the acronym DMAIC, with the aim of continuously reducing defects:

1. Define: identify problems, clarify scope, define goals.
2. Measure: measure performance against requirements, gather data, refine problems and goals.
3. Analyze: develop hypotheses, identify root causes, analyze best practices.
4. Improve: conduct experiments to remove root causes, test solutions, measure results, standardize solutions, implement new processes.
5. Control: establish standard measures to maintain performance and correct problems as needed.

In 1984, Motorola engineers invented Six Sigma, named for a statistical measure of variation (1 sigma reflects 690,000 defects per million opportunities; 2 sigma, 308,000; 3 sigma, 66,800; 4 sigma, 6,210; 5 sigma, 230; and 6 sigma, 3.4). The strategy achieved prominence in the 1980s at IBM and became widely known in the 1990s at General Electric, which claims dramatic error reduction and savings from its Six Sigma program. GE has applied Six Sigma to its medical device manufacturing division and to its employee health benefits program, and it has launched its own Six Sigma health care consulting organization.6 The University of Virginia Health System and Virtua Health in New Jersey are two examples of health care adopters.
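The sigma-level defect rates quoted above follow the usual Six Sigma convention of allowing a 1.5-sigma long-term shift in the process mean. The paper does not spell out that convention, so the short sketch below is offered only as an illustration, under that assumption, of where the familiar figures (roughly 690,000 down to 3.4 defects per million opportunities) come from.

    # Sketch of the conventional sigma-level-to-DPMO conversion. Assumes the
    # standard Six Sigma practice of a 1.5-sigma long-term shift: defects are
    # the upper-tail area beyond (sigma_level - 1.5) on a standard normal.
    import math

    def defects_per_million(sigma_level: float, shift: float = 1.5) -> float:
        z = sigma_level - shift
        upper_tail = 0.5 * math.erfc(z / math.sqrt(2))  # 1 - Phi(z)
        return upper_tail * 1_000_000

    for level in range(1, 7):
        dpmo = defects_per_million(level)
        print(f"{level} sigma: about {dpmo:,.1f} defects per million opportunities")

On this scale, the beta-blocker example cited in the next paragraph (790,000 missed opportunities per million) indeed falls short of even one-sigma performance.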

Chassin provides examples where medical care performs below one sigma, as in the documented 79 percent—or 790,000 out of 1 million—of eligible heart attack survivors who do not receive beta-blockers. He also notes that the anesthesia community reduced death rates from 50 per million in the 1980s to as few as 5 per million today. A proposed advantage of Six Sigma over TQM is the former's focus on deviations from perfection versus the latter's focus on reducing variation around a mean.7

2. Proactive Hazard Analysis Tools: FMEA/HFMEA™, HACCP, HAZOP, PRA, RCA

PHA tools tend to be more prescriptive and to have more record-keeping and other requirements than QI tools. These requirements, justifiable when the objective is safety, are more onerous than needed for nonsafety improvement projects. Root Cause Analysis, though not explicitly proactive, is described here for comparative purposes.

Failure Mode and Effect Analysis/Healthcare Failure Mode and Effect Analysis

FMEA is a tool used in manufacturing to evaluate potential failures and their causes and to prioritize potential failures according to risk, pointing to actions to eliminate or reduce the likelihood of occurrence. The Veterans Health Administration (VHA) pioneered the adaptation of FMEA and other industrial process control tools to patient safety, developing HFMEA™ for use in health care settings. This section describes HFMEA™ more than FMEA. An HFMEA™ analysis involves five steps:

1. Define the HFMEA™ topic, including a clear definition of the process to be studied.
2. Assemble the HFMEA™ team, which should be multidisciplinary and include subject matter experts and an adviser.
3. Graphically describe the process with a flow diagram, numbering each step, identifying the area on which to focus, and identifying all subprocesses.
4. Conduct a failure analysis, listing all possible failure modes, determining the severity and probability of each, using a decision tree to determine whether the failure mode warrants further action, and listing all failure modes for which the decision is to proceed.
5. Evaluate actions and outcome measures, determining which failure modes to eliminate, control, or accept, and identifying an action for each failure mode to be eliminated or controlled.
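As a rough illustration of the failure analysis in step 4, the sketch below scores hypothetical failure modes by severity and probability and flags those that warrant further action. The four-point scales, the scoring threshold, and the example failure modes are assumptions chosen for illustration; the VA's published HFMEA™ materials should be consulted for the actual hazard scoring matrix and decision tree.

    # Illustrative failure-mode scoring for HFMEA step 4 (failure analysis).
    # The 1-4 scales, the threshold, and the example failure modes are
    # assumptions for illustration only, not the VA's published values.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        description: str
        severity: int      # 1 (minor) to 4 (catastrophic), assumed scale
        probability: int   # 1 (remote) to 4 (frequent), assumed scale

        @property
        def hazard_score(self) -> int:
            return self.severity * self.probability

    ACTION_THRESHOLD = 8  # assumed cutoff for carrying a failure mode forward

    failure_modes = [
        FailureMode("Bar-code label unreadable at bedside", severity=3, probability=3),
        FailureMode("Backup medication record older than 24 hours", severity=4, probability=2),
        FailureMode("Wristband printer out of stock", severity=2, probability=2),
    ]

    for fm in sorted(failure_modes, key=lambda f: f.hazard_score, reverse=True):
        decision = ("carry forward to actions and outcome measures"
                    if fm.hazard_score >= ACTION_THRESHOLD else "accept or monitor")
        print(f"{fm.description}: hazard score {fm.hazard_score} -> {decision}")

Failure modes that clear the threshold would then move to step 5, where actions and outcome measures are defined.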

Healthcare Failure Mode and Effect Analysis

Failure mode and effects analysis helps to anticipate what can go wrong with a high-risk health care process and to apply measures to prevent error. Industries such as aviation, aerospace, and automotive manufacturing have long used failure mode and effects analysis to prevent accidents, but there is only one model specific to health care. That model, Healthcare Failure Mode and Effects Analysis, was developed by the Department of Veterans Affairs (VA) National Center for Patient Safety and first put into practice in 2001. The VA's HFMEA™ model is a five-step process that involves selecting a topic for analysis, selecting a team to do the analysis, mapping a flow chart of the high-risk process, identifying failure modes within the process, and, if necessary, redesigning the process.

In its first application of HFMEA™, the VA asked its 163 medical centers to use the method to analyze their contingency plans for their computerized, bar-code medication administration systems in the event of a power failure or other interruption to the system. The process was a valuable exercise, VA officials say: it revealed vulnerabilities in facilities' contingency plans and prompted facilities to make changes to prevent problems from occurring. For example, some facilities learned that they had wrongly assumed that data backups of their computerized bar-code systems were performed more frequently than every 24 hours. In the event of a power failure, newly entered data, such as a change in a patient's medication, might not be included in the data backup, and the patient could be at risk of receiving an incorrect medication order. HFMEA™ teams recommended redesigning the process by requiring more frequent data backups of facilities' electronic medication records and providing a mechanism to let staff know when the backup is completed.

The HFMEA™ process helped the teams identify other gaps in the contingency plans by asking the following questions:

- Do caregivers know how to access and use their contingency plans for the medication administration system?
- Is a process in place to stop new referrals to a unit, if necessary, when the electronic medication administration system is unavailable?
- Is there a procedure to request additional staff if necessary to help implement the contingency measures?
- What process is in place to ensure that once the electronic system is restored, any information about a patient's medications that was recorded manually during a power failure is available to caregivers?
- How are new medication orders recorded while the electronic system is unavailable, and how are they entered into the system when it is restored?
- How much data from the patient's medication history should be provided when paper backup records are needed?

Without some parameters on the amount of information needed from paper backup records, several facilities realized they could end up with complete paper records of 100 or more pages for some patients.

Although the HFMEA™ teams addressed the same topic, each designed its own solutions to the questions raised by the analysis. VA facilities are now on their own to select topics for a proactive risk assessment in 2003. Topics selected include reporting of laboratory or radiology results, patient identification procedures, and patient backlogs for procedures.

The VA's first experience with HFMEA™ also provided the agency with additional lessons to improve the process for proactive risk analysis. These lessons include the following:

- Assign an HFMEA™ team member the task of mapping the flow diagram before the team's first meeting. This ensures that the team moves in the right direction from the start.
- Ensure that the steps of a process are numbered and the subprocesses are lettered. These simple measures help keep the HFMEA™ team organized and prevent it from overlooking potential failure modes.
- Limit the flow diagram of the process to no more than 10 to 12 steps; otherwise the diagram becomes too large.
- Make testing of proposed changes a formal part of the HFMEA™ process. Testing can evaluate whether any of the proposed changes introduce unintended consequences.

Additional information and tools for HFMEA™ are available from the VA National Center for Patient Safety Web site (http://www.patientsafety.gov).

Probabilistic Risk Assessment

In the only published study of Probabilistic Risk Assessment and patient safety that we could identify, Dr. Elisabeth Paté-Cornell extended PRA—called "engineering risk analysis"—to the study of anesthesia patient risk, showing how this tool can incorporate human and organizational factors to support patient safety decisions before complete datasets can be gathered and in cases where key factors are not directly observable.29

In assessing the risk of severe anesthesia accidents, technical failures such as machine malfunctions can be easily identified and corrected.

Indeed, most of the progress in improving anesthesia safety over the past 25 years has been attributed to identifying and correcting technical risks. Risks attributable to human error are more difficult to detect, characterize, and anticipate because statistical samples are seldom available; consider, for example, the risk of injury due to substance abuse by the anesthesiologist. Gathering these data is difficult.

In this approach, accidents are divided into scenarios formed of "basic events," and a Bayesian approach is used to assess the probabilities and consequences of each type. Probabilities are developed from three sources: existing datasets, analysis of basic engineering properties of the systems, and expert opinions. Expert opinions, when well defined and encoded, provide essential information that could not otherwise be obtained in time to support urgent decisions.

Seven initiating events were identified: breathing circuit disconnect, esophageal intubation, nonventilation, malignant hyperthermia, inhaled anesthetic overdose, anaphylactic reaction, and severe hemorrhage. Probabilities per operation were assessed. Experts identified the types of problems that could affect the performance of anesthesiologists and their rates of occurrence. Analysts then recomputed the probability of each patient accident for each problem type. The estimated probability of each state of the anesthesiologist per operation was as follows:

- Problem-free: 0.53
- Fatigue: 0.10
- Cognitive problems: 0.04
- Personality problems: 0.04
- Severe distraction: 0.03
- Drug abuse: 0.03
- Alcohol abuse: 0.04
- Aging/neurological problems: 0.03
- Lack of training: 0.12
- Lack of supervision: 0.04

Experts identified policies to decrease the probability of each problem. The distribution of practitioner problems was then used to compute the anticipated benefits from each measure.
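The recomputation described above amounts to the law of total probability: the overall accident risk per operation is the sum, across practitioner states, of the probability of each state multiplied by the accident probability given that state, and the anticipated benefit of a policy is the change in that sum. In the sketch below, the state distribution is the one reported in the text; the conditional accident probabilities and the example policy effect are hypothetical placeholders, since the paper does not reproduce them.

    # Sketch of the PRA recomputation: marginal accident risk as a weighted
    # sum over practitioner states (law of total probability). The state
    # probabilities are those reported in the text; the per-state accident
    # probabilities and the policy effect below are hypothetical.

    state_probability = {
        "problem-free": 0.53, "fatigue": 0.10, "cognitive problems": 0.04,
        "personality problems": 0.04, "severe distraction": 0.03,
        "drug abuse": 0.03, "alcohol abuse": 0.04,
        "aging/neurological problems": 0.03, "lack of training": 0.12,
        "lack of supervision": 0.04,
    }

    # Hypothetical P(severe accident per operation | state), placeholders only.
    accident_given_state = {
        "problem-free": 1e-5, "fatigue": 4e-5, "cognitive problems": 8e-5,
        "personality problems": 5e-5, "severe distraction": 6e-5,
        "drug abuse": 1e-4, "alcohol abuse": 9e-5,
        "aging/neurological problems": 7e-5, "lack of training": 6e-5,
        "lack of supervision": 5e-5,
    }

    def marginal_risk(p_state: dict) -> float:
        return sum(p_state[s] * accident_given_state[s] for s in p_state)

    baseline = marginal_risk(state_probability)

    # Hypothetical policy: closer supervision of residents halves the
    # "lack of supervision" state, shifting that mass to "problem-free".
    with_policy = dict(state_probability)
    shifted = with_policy["lack of supervision"] * 0.5
    with_policy["lack of supervision"] -= shifted
    with_policy["problem-free"] += shifted

    print(f"Baseline risk per operation:  {baseline:.2e}")
    print(f"Risk with supervision policy: {marginal_risk(with_policy):.2e}")
    print(f"Anticipated benefit:          {baseline - marginal_risk(with_policy):.2e}")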

Whereas alcohol and drug problems had been at the forefront of concerns at the outset of the study, the more immediate and less visible problems were supervision of residents and problems of incompetence. The results were "interesting" because they did not correspond to the initial motivation of the sponsors (the Anesthesia Patient Safety Foundation), who were concerned about drug abuse and behavioral problems: "The major contributors to the problems are much closer to home and the most beneficial measures are mundane, such as better supervision of residents and periodic retraining of all practitioners so that they get familiar again with situations that they may have forgotten because they only rarely occur."

Root Cause Analysis

A recent article in the Quality Grand Rounds series, published in the September 3, 2002, issue of the Annals of Internal Medicine, describes a patient who suffered multiple adverse events consistent with cascade iatrogenesis. The case raises two important quality questions: Can health care improve the reliability and accuracy of interpretations of diagnostic tests, and should health care regulate the introduction and use of new technologies? It also brings to light the limitations of routine use of RCA to identify remediable errors, or to prevent system errors, when the causal pathway to an apparent adverse medical outcome has not been definitively established. In this case, it is questionable whether RCA would yield improved systems for patient care. Despite multiple opportunities to identify errors in the patient's care, the decisions or circumstances associated with these adverse events contributed to the outcomes in uncertain ways and are not easily classified as clear-cut errors. If the recommendations of an ill-conceived RCA are based on an unreliable assessment of causality, a Root Cause Analysis can do more harm than good.

In the case, a 40-year-old woman with a history of type B aortic dissection, renal insufficiency, poorly controlled hypertension, erratic adherence to her prescribed treatment regimen, and cocaine use was evaluated for dyspnea and swelling of her left breast and arm. At initial presentation, the findings seemed consistent with deep vein thrombosis of the upper left extremity and pulmonary embolism associated with a hypercoagulable state due to possible left-sided breast cancer. In contrast to the initial reading (by a radiology resident) of a spiral computed tomography (CT) scan as negative for pulmonary emboli, the attending radiologist identified segmental emboli in the lungs, chronic type B aortic dissection, and a large pericardial effusion when reading the scan the next morning. Based on this reading, the patient was treated with intravenous heparin and oral warfarin. Mammography revealed no evidence of breast cancer, and ultrasonography of the left arm found no deep venous thrombosis.

After one week of hospitalization, another attending radiologist, one with expertise in imaging pulmonary emboli, reevaluated the original CT scan and found it to be negative for pulmonary emboli—a reading consistent with the initial one by the radiology resident. The authors point to this portion of the case as highlighting the need for a general strategy to improve the reliability of radiographic interpretation and to govern the introduction of new medical technologies (i.e., spiral CT) in place of better-studied, albeit more resource-intensive, diagnostics such as ventilation-perfusion scanning or pulmonary arteriography. The authors see the diagnostic uncertainty surrounding the spiral CT scan as pointing to a broader concern: the appropriateness of integrating new health care technologies before sufficient supporting evidence is available.

With pulmonary embolism ruled out, physicians debated whether pericardiocentesis under echocardiographic guidance should be performed in an effort to explain the patient's dyspnea and arm and breast swelling. Unfortunately, the patient's anticoagulant therapy had not been discontinued in time to permit the procedure to be performed on the preferred day, Thursday. Instead, the pericardiocentesis was performed on a Friday evening by another competent cardiologist with a full complement of catheterization laboratory personnel. Because of some of the patient's preexisting complications and the formation of a hemopneumothorax during the procedure, the patient went into cardiac arrest with pulseless electrical activity. The patient was successfully resuscitated after 10 minutes, and a pericardial window and pleural and pericardial drains were surgically inserted.

Using RCA, one is inclined to examine the decision to perform pericardiocentesis. Was it wrong to perform the procedure? Was it wrong to perform it on a Friday evening? The authors suggest that the decision to go ahead with the pericardiocentesis, even if problematic in retrospect, does not suggest a clear preventive solution to the breakdown in decision making. In contrast, the failure to discontinue the anticoagulation therapy in a timely manner is an error of omission. In retrospect, and knowing the outcome, an observer could be tempted to label the pericardiocentesis an error of commission, arguing that watchful waiting would have been a more reasonable alternative because the patient's symptoms were stable. But watchful waiting could still have led to cardiac arrest due to tamponade over the weekend, implicating an error of omission. This is a good example of an RCA influenced by hindsight bias, and a case in which the overall outcome for the patient might not have been improved by any intervention that would have prevented the decision to conduct pericardiocentesis.

Several evenings after the patient seemingly recovered from the cardiac arrest and pericardial window insertion, she developed right-sided pleuritic chest pain and relative hypotension.

Two days earlier, based on the unlikelihood of recurrent pericardial effusion (with the pericardial window in place), the patient's mediastinal drain had been removed. Again considering the possibility of pulmonary embolism, and in an effort to reach a diagnosis, the residents initiated intravenous heparin and a repeat spiral CT scan. Later that same morning, the patient's attending physician discontinued the anticoagulant medication. Emergency echocardiography revealed a large thrombus in the pericardium compressing the left atrium of the heart. The patient subsequently suffered a second cardiac arrest with pulseless electrical activity while undergoing the echocardiography. An emergency sternotomy was performed; the pericardial clot was evacuated and a laceration of the left ventricle was repaired. On the second day in the intensive care unit, the patient developed the R-on-T phenomenon, followed by torsade de pointes and subsequent pulseless ventricular tachycardia, requiring intubation, defibrillation, and amiodarone therapy. Laboratory results revealed that the patient's renal function and metabolic acidosis had worsened, requiring dialysis.

Although the authors indicate that the decision to discount tamponade and restart anticoagulation therapy may have been the worst decision of the case, it may be difficult even here to reach a consensus on whether the decision was an "error" and whether such a system error could have been prevented under the circumstances. The authors suggest that the resident's error more likely lay in not knowing his own skill limitations and not seeking a competent supervisor to help in making the decision, which represents an important policy issue throughout health care. The patient eventually recovered and was discharged after a 27-day hospital stay, with more than $200,000 in hospital charges and a need for long-term dialysis.

Six Sigma

Virtua Health, a not-for-profit community hospital system in southern New Jersey, adopted Six Sigma in 2000 to achieve operational goals. One of its first six projects, conducted between January and June 2001, sought improvements and error reduction in anticoagulation therapy. Specifically, the hospital sought to reduce errors related to incorrect pump settings, incorrect use of pumps, delays in obtaining and reacting to activated partial thromboplastin time (aPTT) results, dosing errors, and mixing errors. Other QI activities, including RCA, had failed to address the overall performance of the anticoagulation process in quantitative terms.

The improvement team used the Six Sigma DMAIC process: define the process to address, measure how the current process is performing, analyze the key factors driving the process, improve the process, and control the process to sustain progress. The team defined safe and effective anticoagulation capability as the project goal:

- First aPTT after bolus above the therapeutic threshold;
- aPTT in therapeutic range at 24 hours;
- Interval between aPTTs until two consecutive values are in range;
- Low platelet counts noted and addressed;
- Low hemoglobins noted and addressed.

The team's analysis revealed that 92 steps were required to reach completion of the first dose adjustment—and that system complexity hampered staff performance, with few elements in place to prevent errors by staff. The team determined that simplifying and error-proofing the process offered the greatest opportunities to increase safety. The following chart shows the steps taken in the improvement phase.

Six Sigma Anticoagulation Improvements: Virtua Health

Weighing patients
  Deficiency: Done on admission only 48 percent of the time
  Intervention: Bed scales purchased
  Anticipated benefit: Easier to weigh patients

Lab–pharmacy data link
  Deficiency: No prior system to monitor efficiency
  Intervention: All patients on heparin included in automated review, with manual review of charts identified
  Anticipated benefit: Detection of otherwise silent process failures; ongoing comparison to target performance

Heparin hold for aPTT >240 seconds
  Deficiency: Unclear definition of start time for the 6-hour interval
  Intervention: Clarification with physicians
  Anticipated benefit: Decreased process variation

Physician called for aPTT >240 × 3
  Deficiency: Unclear which physician group to call
  Intervention: Identification of the physician group responsible for the heparin order on the initial order sheet
  Anticipated benefit: Decreased miscommunication

Preheparin lab studies
  Deficiency: Inconsistency among nurses and physicians on holding heparin until results are received
  Intervention: Clarification with physicians; default is not to wait for labs, with a hold option for the physician
  Anticipated benefit: Decreased variation in nursing practice

Infusion pumps
  Deficiency: Occasional incorrect settings leading to dosage errors
  Intervention: Programmable pumps with drug personalities and maximum drip-rate settings
  Anticipated benefit: Avoidance of extreme overdosage due to pump-setting errors

Use of unfractionated heparin
  Deficiency: Complex process with complexity-related failures
  Intervention: Substitution of low-molecular-weight heparin
  Anticipated benefit: Fewer complexity-related errors
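The lab–pharmacy data link in the chart above can be pictured as a small automated screen applied to the project's capability criteria. In the sketch below, the record fields, the therapeutic range, and the flagging rules are assumptions for illustration; actual targets are laboratory- and protocol-specific and are not given in this paper.

    # Illustrative automated review for the lab-pharmacy data link. The
    # therapeutic range and record fields are assumptions for illustration.
    from dataclasses import dataclass, field

    THERAPEUTIC_RANGE = (60.0, 100.0)  # assumed aPTT range, in seconds

    @dataclass
    class HeparinPatient:
        patient_id: str
        first_aptt_after_bolus: float   # seconds
        aptt_at_24_hours: float         # seconds
        notes: list = field(default_factory=list)

    def needs_manual_review(p: HeparinPatient) -> bool:
        low, high = THERAPEUTIC_RANGE
        if p.first_aptt_after_bolus < low:
            p.notes.append("first aPTT after bolus below therapeutic threshold")
        if not (low <= p.aptt_at_24_hours <= high):
            p.notes.append("aPTT not in therapeutic range at 24 hours")
        return bool(p.notes)

    patients = [
        HeparinPatient("A", first_aptt_after_bolus=72.0, aptt_at_24_hours=85.0),
        HeparinPatient("B", first_aptt_after_bolus=48.0, aptt_at_24_hours=55.0),
    ]

    for p in patients:
        if needs_manual_review(p):
            print(f"Patient {p.patient_id}: flag for manual chart review -> {p.notes}")

Patients flagged by such a screen go to manual chart review, which is how otherwise silent process failures become visible and comparable to target performance.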

The control phase includes the creation of visible metrics used by process owners to ensure that gains are sustained. The study authors note that their work "is not a research methodology, and the findings of this project should not be interpreted in the same light as a rigorous clinical research paper. The focus of this paper is to describe an approach for identifying opportunities for improvement and taking action that leads to results that matter to patients in a framework that is achievable in the typical community hospital setting."30

Toyota Production System

The Pittsburgh Regional Healthcare Initiative (PRHI) is a collaborative effort by institutions and individuals that provide, purchase, insure, and support health care services in southwestern Pennsylvania. The initiative aims to achieve "perfect patient care" in six counties of the Pittsburgh Metropolitan Statistical Area, with the following goals:

- Zero medication errors
- Zero health care–acquired (nosocomial) infections
- Perfect clinical outcomes, measured by complications, readmissions, and other patient outcomes, in the following areas:
  - Invasive cardiac procedures (cardiac bypass surgery, angioplasty, and diagnostic catheterization)
  - Hip and knee replacement surgery
  - Repeat cesarean sections for women with no clinical indication for them
  - Depression
  - Diabetes

The initiative calls these goals "the most aggressive and ambitious performance goals in American health care." It seeks to redefine the patient as the "client" in health care, as opposed to the physician, the insurer, or the payer in the current environment, by reallocating resources based on each patient's needs. "In effect, the patient 'pulls' the resources he or she needs. This system—derived from the Toyota Production System—is capable of adjusting to and meeting varying patient needs quickly and flawlessly."31

A Learning Line is a small hospital unit organized around the principles of TPS. At the point of patient care, the people doing the work are the experts and focus on the shared goal of meeting patient needs, one patient at a time. When a problem hinders work, the full-time team leader takes the lead, researching the problem by first determining what happened and then asking the question "why" five times to determine the root cause. As the origins become known, the workers closest to the problem design solutions immediately, testing them with scientific methods. The team leader is free to pull assistance as needed to the point of patient care, from the manager, the director, the chief executive officer, even trustees. Proponents suggest that this approach enables health care professionals to spend more time providing frontline care by wringing inefficiency out of the system; inefficiency is estimated to consume 33 to 50 cents of every health care dollar.

At the Veterans Administration Pittsburgh Healthcare System, one Learning Line team addressed the issue of antibiotic-resistant infection by attempting to increase compliance with procedures to halt the spread of infection and to act on PRHI's goal of zero nosocomial infections. In seeking to understand the root cause of infections—asking "why" five times and observing workers at close range—the team leader discovered one reason workers had trouble complying with infection control procedures: some rooms had gowns and some did not, and stock-outs occurred daily. Workers on the Learning Line established who would be responsible for restocking gloves and gowns, how often supplies would be checked (daily), and how the cupboards would be labeled so that any deficiency would immediately become obvious. Within days, gloves and gowns that workers had stashed away became available as the system supported the workers; glove consumption and costs dropped 15 percent as "stashes" disappeared.

The unit believes it has already gained ground on the Centers for Disease Control and Prevention's goal of improved compliance, as hand hygiene compliance has risen.32

Special thanks to the following individuals for their advice and comments on this project: Judene Bartley, Paul Batalden, Donald Berwick, David Blumenthal, Mark Brulin, Mark Chassin, Richard Croteau, Edward Dunn, Karen Feinstein, Robert Galvin, Doris Hanna, Brent James, Molly Joel Coye, Lucian Leape, Tammy Lundstrom, Thomas Massero, Julie Mohr, Thomas Nolan, Elisabeth Paté-Cornell, Paul Schyve, Ethel Seljevold, Kimberly Thompson, Mark Van Kooy, Cindy Wallace, and Jonathan Wilwerding.

REFERENCES

1. Committee on Quality of Health Care in America. 2001. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press. Pp. 41–42.
2. For a discussion of HFMEA/FMEA and HACCP, see McDonough J. 2002. Proactive Hazard Analysis and Health Care Policy. New York: Milbank Fund and ECRI.
3. Nolan T. 1996. The Improvement Guide. New York: Jossey-Bass.
4. Blumenthal D, Kilo C. 1998. A report card on Continuous Quality Improvement. Milbank Quarterly 76(4):625–648.
5. See http://www.prhi.org/ [accessed March 27, 2003].
6. Welch J. 2001. Jack: Straight from the Gut. New York: Warner Business Books.
7. Chassin M. 1998. Is health care ready for Six Sigma quality? Milbank Quarterly 76(4):565–591.
8. Military Procedure MIL-P-1629. 1949 (November 9). Procedures for Performing a Failure Mode, Effects and Criticality Analysis.
9. Marx D, Slonim A. 2003. Assessing patient safety risk before the injury occurs: An introduction to socio-technical probabilistic risk modeling in healthcare. Quality and Safety in Health Care 12(Suppl 2):ii33–ii38.
10. Green L, Crouch E. 1997. Probabilistic risk assessment: Lessons from four case studies. Annals of the New York Academy of Sciences 837:387–396.
11. Op. cit., p. 10.
12. See McDonough, op. cit., for detailed references.
13. Personal communication, Mark Van Kooy, M.D., Virtua Health Master Black Belt, March 26, 2003.
14. Ackoff R. 1994. The Democratic Corporation. New York: Oxford University Press. Pp. 18–21.
15. Batalden PB, Mohr JJ. 1997. Building knowledge of health care as a system. Quality Management in Health Care 5(3):1–12.
16. Personal communication, Dr. Brent James, March 3, 2003.

17. Gaba D. 2000. Anesthesiology as a model for patient safety in health care. British Medical Journal 320:785–788.
18. Lagasse R. 2002. Anesthesia safety: Model or myth? A review of the published literature and analysis of current original data. Anesthesiology 97(6):1609–1617.
19. Cooper J, Gaba D. 2002. No myth: Anesthesia is a model for addressing patient safety. Anesthesiology 97(6):1335–1337.
20. Brennan T. 2002. Physicians' professional responsibility to improve the quality of care. Academic Medicine 77:973–980.
21. Croteau R, Schyve P. 2000. Proactively error-proofing health care processes. In: Spath P, ed. Error Reduction in Health Care. New York: Jossey-Bass. P. 184.
22. Chang A. 2003. Joint Commission Benchmark 5(2).
23. Marguerez G, Erbault E, Terra JL, Maisonneuve H, Matillon Y. 2001. Evaluation of 60 continuous quality improvement projects in French hospitals. International Journal for Quality in Health Care 13(2):89–97.
24. Solberg L, et al. 2000. Failure of a continuous quality improvement intervention to increase the delivery of preventive services: A randomized trial. Effective Clinical Practice May/June:105–115.
25. Contributed by Mary Ivins, Morrison Management Specialists.
26. Gran BA, Winther R, Johnsen OA. 2001. Security assessment of safety critical systems using HAZOPs. In: Proceedings of SAFECOMP 2001, Budapest, Hungary, September 26–28, 2001. Stolen K. 2001. A framework for risk analysis of security critical systems. In: Supplement of the 2001 International Conference on Dependable Systems and Networks, Gothenburg, Sweden, July 2–4, 2001. Pp. D4–D11.
27. CORAS is a European research and development project funded by the European Commission's 5th Framework Programme on Information Society Technologies. The project began in 2001 and will last through 2003. Eleven partners are involved: five from Norway, three from Greece, two from England, and one from Germany. One Norwegian participant is the National Centre for Telemedicine (NST), whose mission is to contribute to making effective health services available to all. Formerly the Norwegian Centre for Telemedicine, NST was designated the first World Health Organization Collaborating Center for Telemedicine in July 2002.
28. Personal communication, e-mail from Eva Skipenes, Security Adviser, Norwegian Centre for Telemedicine, to Luke Petosa, Director, ECRI's Center for Healthcare Environmental Management, March 17, 2003.
29. Paté-Cornell E. 1999. Medical application of engineering risk analysis and anesthesia patient risk illustration. American Journal of Therapeutics 6(5):245–255.
30. Van Kooy M, Edell L, Scheckner HM. 2002. Use of Six Sigma to improve the safety and efficacy of acute anticoagulation with heparin. Journal of Clinical Outcomes Management 9(8):445–453.
31. Pittsburgh Regional Healthcare Initiative. 2001. PRHI Scorecard 2001–2003. [Online]. Available: http://www.prhi.org/publications/member_pubs.htm [accessed April 25, 2003].
32. Pittsburgh Regional Healthcare Initiative. 2002. On the Learning Line: Case Studies from the Perfecting Patient Care Learning Lines in Pittsburgh-Area Hospitals. [Online]. Available: http://www.prhi.org/pdfs/Learning_Line_Booklet.pdf [accessed April 25, 2003].