Risk of Vessel Accidents and Spills in the Aleutian Islands

APPENDIX D

Human Reliability Analysis

Human error can originate or surface in all phases of a system's life cycle, including design, construction, operation, and management. Since risk management requires an understanding of the causes of and contributors to risk so that effective safeguards can be developed, human causes of accidents must be explicitly considered in the analysis of risk (see Box D-1). Human risk factors are defined as those factors that can be attributed to the people in the system and "include both factors that cannot be directly changed (e.g., age, gender, personality, information processing, cognitive ability) and those that can (e.g., experience levels; training, education, and qualifications; substance use; compliance; peer pressure)" (TRB 2002, 118). One technical discipline that provides the methods and tools for modeling and analyzing human contributions to risk is known as human reliability analysis (HRA). The objectives of HRA are to (a) identify human failure events in the context of risk scenarios, (b) estimate human error probabilities, and (c) provide a causal explanation for the errors to support the development of preventive or mitigating measures. As a multidisciplinary domain, HRA draws on techniques and insights from cognitive psychology, the behavioral sciences, human factors engineering, organizational behavior, and historical event records.

BOX D-1
The Role of the Human Element in Maritime Safety

The importance of the human element in maritime safety is increasingly being recognized by the shipping and offshore communities and is receiving increased attention through the efforts of organizations such as the United States Coast Guard, the United Kingdom's Health and Safety Executive, and the International Maritime Organization (IMO). IMO's primary efforts have concentrated on human element issues relating to management, training, and personnel, as reflected by the International Management Code for the Safe Operation of Ships and for Pollution Prevention in 1993 and the update of the International Convention on Standards of Training, Certification, and Watchkeeping for Seafarers in 1995. The human element, however, includes other areas of application, which, if systematically considered and treated, will decrease the potential for human error and improve safety, productivity, and efficiency.

Elements that affect safety and efficiency in job performance include vessel design and layout considerations, workplace ambient environmental elements, management and organizational issues related to operations, and the personnel who operate the vessel or offshore installation. Insufficient attention to any of these elements may adversely affect safety, productivity, and efficiency. . . . The workplace design may increase the likelihood of human error. Additional training, operations, and maintenance manuals and more detailed written procedures cannot adequately compensate for human errors induced by poor design.

Source: ABS 2003.

HRA methods (e.g., Bieder et al. 1998; Macwan and Mosleh 1994; Swain and Guttman 1983; Swain 1987) generally identify errors through some type of task analysis, preferably done in the context of the risk scenarios being considered. In the absence of a widely accepted causal model of human error, HRA methods typically rely on performance shaping factors (PSFs) to relate context and conditions to human errors. Examples are environmental factors (e.g., limited visibility), the physical state of the crew (e.g., fatigue), and the psychological state of the crew (e.g., stress, high workload). The set of PSFs is sometimes extended to include organizational factors (e.g., poor quality of procedures, lack of training, poor safety culture). Depending on the industry and the level of fidelity of HRA models, error probabilities are estimated on the basis of field data, simulator experiments, quantification models, and expert judgment. A recent report (Kolaczkowski et al. 2005) identifies good practices associated with quality HRA, and a follow-on study (Forester et al. 2006) evaluates the extent to which some of the more popular methods incorporate those good practices.

A common view today is that human error is often a symptom of trouble deeper within a system (Dekker 2002; Reason 1997). From this point of view, in fact, "error" is not the right term: these human actions are seen as reasonable responses given the context in which people find themselves. Reason speaks of "organizational accidents" that occur in complex systems possessing a wide variety of technical and procedural safeguards. These occurrences arise from the insidious accumulation of delayed-action failures lying mainly in the managerial and organizational spheres. Such latent conditions (or latent failures) are like resident pathogens within the system. Organizational accidents can result when these latent conditions combine with active failures (errors or violations at the "sharp end") and local triggering factors to breach or bypass the system defenses.
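The PSF-based quantification described above can be illustrated in code. The following is a minimal sketch loosely patterned on the SPAR-H method (Gertman et al. 2005, listed under Additional Sources), in which a nominal human error probability (HEP) is scaled by a composite PSF multiplier; the scenario and the multiplier values used here are illustrative assumptions, not SPAR-H's published tables.

```python
# Illustrative PSF-based human error probability (HEP) quantification,
# loosely following the SPAR-H approach. Multiplier values below are
# placeholders for illustration, not the method's published tables.

def adjusted_hep(nominal_hep, psf_multipliers):
    """Scale a nominal HEP by a composite PSF multiplier.

    When several PSFs degrade performance, the raw product
    nominal_hep * composite can exceed 1.0, so SPAR-H applies a
    normalizing adjustment (used when three or more PSFs are
    negative) to keep the result a valid probability.
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1.0)
    if negative >= 3:
        # SPAR-H adjustment for three or more negative PSFs
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)
    return min(nominal_hep * composite, 1.0)

# Hypothetical scenario: a diagnosis task (nominal HEP 0.01 in SPAR-H)
# performed under limited visibility, fatigue, and high workload, each
# treated here as a PSF with an assumed multiplier of 5.
hep = adjusted_hep(0.01, [5.0, 5.0, 5.0])
print(round(hep, 3))  # prints 0.558
```

Without the normalizing adjustment, the example above would yield 0.01 × 125 = 1.25, which is not a valid probability; the adjustment caps the result while preserving the relative influence of the PSFs.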
These events have the following characteristics:

• The accident sequence begins with the negative consequences of organizational processes (i.e., decisions concerned with planning, forecasting, designing, managing, communicating, budgeting, monitoring, auditing, and the like). Another influential factor is the system's safety culture.
• The latent conditions thus created are transmitted along departmental and organizational pathways to the various workplaces, where they manifest as conditions that promote errors and violations (e.g., high workload, time pressures, inadequate skills and experience, poor and unreliable equipment).
• At the level of the individual at the "sharp end," these local latent conditions combine with psychological error and violation tendencies to create unsafe acts. Many unsafe acts will be committed, but few of them will penetrate the numerous defenses and safeguards to produce bad outcomes.
• The fact that engineered safety features, standards, administrative controls, procedures, and the like can be deficient because of latent conditions as well as active failures reflects the connection between organizational processes and defenses.

Some more recent methods attempt to address many of these issues from a more fundamental perspective that views the identification of "errors" and the determination of their causes as two sides of the same coin. Some HRA methods offer a taxonomy of error types that, depending on the orientation and origin of the method, may cover such issues as action failures (e.g., failing to start an engine in a timely manner or skipping a procedural step), cognitive errors (e.g., misdiagnosis), and violations (e.g., violation of rules and regulations, sometimes with the best of intentions).

REFERENCES

Abbreviations

ABS  American Bureau of Shipping
TRB  Transportation Research Board

ABS. 2003. ABS Guidance Notes for the Application of Ergonomics to Marine Systems. Houston, Tex.

Bieder, C., P. Le-Bot, E. Desmares, J.-L. Bonnet, and F. Cara. 1998. MERMOS: EDF's New Advanced HRA Method. In Probabilistic Safety Assessment and Management (PSAM 4) (A. Mosleh and R. A. Bari, eds.), Springer-Verlag, New York.

Dekker, S. 2002. The Field Guide to Human Error Investigations. Ashgate Publishing, Burlington, Vt.

Forester, J., A. Kolaczkowski, E. Lois, and D. Kelly. 2006. Evaluation of Human Reliability Analysis Methods Against Good Practices. NUREG-1842. U.S. Nuclear Regulatory Commission, Washington, D.C.

Kolaczkowski, A., J. Forester, E. Lois, and S. Cooper. 2005. Good Practices for Implementing Human Reliability Analysis (HRA). NUREG-1792. U.S. Nuclear Regulatory Commission, Washington, D.C.
Macwan, A., and A. Mosleh. 1994. A Methodology for Modeling Operator Errors of Commission in Probabilistic Risk Assessment. Reliability Engineering and System Safety, Vol. 45, pp. 139–157.

Reason, J. 1997. Managing the Risks of Organizational Accidents. Ashgate Publishing, Burlington, Vt.

Swain, A. D. 1987. Accident Sequence Evaluation Program Human Reliability Analysis Procedure. NUREG/CR-4772/SAND86-1996. Sandia National Laboratories for the U.S. Nuclear Regulatory Commission, Washington, D.C., Feb.

Swain, A. D., and H. E. Guttman. 1983. Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. NUREG/CR-1278/SAND80-0200. Sandia National Laboratories for the U.S. Nuclear Regulatory Commission, Washington, D.C., Aug.

TRB. 2002. Special Report 269: The Relative Risks of School Travel: A National Perspective and Guidance for Local Community Risk Assessment. National Academies, Washington, D.C.

ADDITIONAL SOURCES

Embrey, D. E., P. C. Humphreys, E. A. Rosa, and K. Rea. 1984. SLIM-MAUD: An Approach to Assessing Human Error Probabilities Using Structured Expert Judgment, Vols. I and II. NUREG/CR-3518. Brookhaven National Laboratory, Upton, N.Y.

Forester, J., A. Kolaczkowski, S. Cooper, D. Bley, and E. Lois. 2007. ATHEANA User's Guide Final Report. NUREG-1880. U.S. Nuclear Regulatory Commission, Washington, D.C., May.

Gertman, D. I., H. S. Blackman, J. Byers, L. Haney, C. Smith, and J. Marble. 2005. The SPAR-H Method. NUREG/CR-6883. U.S. Nuclear Regulatory Commission, Washington, D.C., Aug.

Hollnagel, E. 1998. Cognitive Reliability and Error Analysis Method (CREAM). Elsevier Science, New York.

Julius, J., E. Jorgenson, G. W. Parry, and A. M. Mosleh. 1995. A Procedure for the Analysis of Error of Commission in a Probabilistic Safety Assessment of a Nuclear Power Plant at Full Power. Reliability Engineering and System Safety, Vol. 50, pp. 189–201.

Reer, B., V. N. Dang, and S. Hirschberg. 2004. The CESA Method and Its Application in a Plant-Specific Pilot Study on Errors of Commission. Reliability Engineering and System Safety, Vol. 83, pp. 187–205.