The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




CHAPTER 2

Understanding Rules Noncompliance

A significant body of research exists regarding the factors that influence safety-related rules noncompliance, including the reasons underlying errors and violations and ways to mitigate noncompliance. This chapter summarizes the factors and mitigation strategies that are applicable to public transit operations. It describes the factors from a bottom-up perspective, concluding with a discussion of the role of safety culture and safety management in the rules compliance process.

Framework for Understanding Noncompliance

Just as there is no single cause of an accident, the reasons for noncompliance are multifaceted. Noncompliance can be willful (a violation) or unintentional (the result of human error). Numerous error and violation taxonomies differentiate among the underlying causes of noncompliant behavior. A popular error classification system, known as the skill-, rule-, knowledge-based (SRK) approach, is grounded in information processing models and is described in a number of publications (Rasmussen 1979, 1980, 1986; Reason 1990). Figure 1 presents an adaptation of the SRK model and places other types of error classifications in the context of human information processing.

Knowledge-based errors occur when someone does not have the correct mental model or information to assess a situation, resulting in the formation of an incorrect plan of action. These errors often arise when an individual has to work out a solution to a problem from scratch. High-pressure situations exacerbate the difficulty by reducing the cognitive resources available for problem solving, and inexperienced employees are especially prone to these errors. In contrast, rule-based errors occur when an employee has a clear understanding of the situation but either chooses an incorrect plan to deal with it or mis-executes a well-chosen plan, producing a poor outcome.
These types of decision-based errors arise when an employee has not been adequately trained, through classroom and field exposure, to handle unexpected situations; the employee does not possess the strategies needed to address low-frequency events. Note that rule-based errors do not refer to an organization's rules. Rather, rules in the context of the SRK model refer to the decision-making strategies a person holds. (To prevent confusion over the term rules, the term strategy-based processing/errors is used in lieu of rule-based for the remainder of the report.)

Skill-based errors, also known as slips, occur when performance is highly automatic (as indicated by the dashed line in Figure 1) and a cue in the operational environment triggers the behavior at an inappropriate time (Norman 1981). While slips are errors of commission, lapses are errors of omission resulting from memory failure. For example, DiFiore and Cardosi (2006) found pilot reports of air traffic control (ATC) personnel who forgot that aircraft were holding in position on a runway (a lapse) and cleared another aircraft to land on the same runway. Employee distraction, workload, and fatigue are among the risk factors for these types of incidents and are discussed in subsequent subsections.
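The error categories above can be sketched as a small taxonomy in code. This is an illustration only: the category definitions follow the SRK discussion in this chapter, and the report itself names training as a remedy for knowledge-based errors, but the remedy pairings for the other categories are this sketch's own plausible assumptions, not prescriptions from the source.

```python
from enum import Enum


class ErrorType(Enum):
    """Unintentional error categories adapted from the SRK model."""
    KNOWLEDGE_BASED = "incorrect mental model; plan worked out from scratch is wrong"
    STRATEGY_BASED = "situation understood, but wrong strategy chosen or mis-executed"
    SKILL_BASED_SLIP = "automatic behavior triggered at the wrong time (commission)"
    SKILL_BASED_LAPSE = "required step omitted because of memory failure (omission)"


# Illustrative pairings: only the first (training for knowledge-based
# errors) comes from the report; the rest are assumptions of this sketch.
SUGGESTED_REMEDY = {
    ErrorType.KNOWLEDGE_BASED: "classroom and field training",
    ErrorType.STRATEGY_BASED: "scenario practice for low-frequency events",
    ErrorType.SKILL_BASED_SLIP: "manage distraction and workload",
    ErrorType.SKILL_BASED_LAPSE: "checklists and fatigue management",
}


def suggest_remedy(error: ErrorType) -> str:
    """Return a candidate countermeasure for a classified error."""
    return SUGGESTED_REMEDY[error]
```

A supervisor-facing tool built on such a taxonomy would look up a countermeasure once the underlying error type had been determined, e.g. `suggest_remedy(ErrorType.KNOWLEDGE_BASED)`.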

Figure 1. Information processing model of human error.

Reason (1997) makes the important point that "a purely cognitive analysis of error mechanisms fails to capture some of the more important human contributions to [accidents]" (p. 204). An examination of violations, that is, deliberate acts of noncompliance, fills this gap. The University of Texas has developed a methodology to examine flight operations, the Line Operational Safety Audit (Helmreich 2000). Data from these studies indicate that more than one-half of the in-flight noncompliance observed was intentional.

Lawton (1998) distinguishes among three types of violations: situational, exceptional, and routine. The classification is based on data obtained from a survey of United Kingdom (UK) rail shunters' experience with rule noncompliance. (In the UK, shunters are the people who work on the ground switching cars in a railroad yard.) For the most part, Lawton argues, violations tend to reflect well-intentioned desires to get the job done. Situational violations result from the motivation to keep the job going under adverse conditions. Organizations often respond to these violations inconsistently: when the job is completed without incident, the employee is rewarded, but if an accident occurs, the response is often disciplinary action. This inconsistency does little to curb such violations. Situational violations are frequently observed in the public transportation industry because of the time pressure employees feel trying to adhere to schedules. Exceptional violations occur when unusual circumstances call for an unusual response and the employee knowingly departs from the organization's rules in favor of an alternative action. Routine violations occur when a shortcut presents itself and is taken regularly.
This often happens when an employee no longer thinks a rule applies, either because of a lack of supervisory enforcement or because the employee is overconfident in his or her skill.

Incident taxonomies provide a framework for identifying the reasons for noncompliance with safety-related rules and for guiding appropriate countermeasures. The Human Factors Analysis and Classification System (HFACS) is a well-known error and violation taxonomy used in aviation; it served as the basis for the pilot safety reporting system in the railroad industry that is reviewed in Appendix C (Wiegmann and Shappell 2001). With an understanding of the underlying reason, a public transit agency supervisor or safety officer can determine the appropriate strategy to correct or manage noncompliance. For example, did noncompliance occur because the employee did not understand the situation and unintentionally failed to comply (a knowledge-based error)? If so, training is a possible remedy. Was there a willful decision by the employee to disobey the rules because the organization placed more emphasis on getting