The goal of this chapter is to summarize approaches and concepts from occupational safety research and practices that are particularly relevant for improving the safety of academic chemistry laboratories. The chapter begins by tracing the development of modern safety practice and the emergence of the concept of safety culture. Next, consideration is given to several different industries that have made good use of modern safety concepts and practices in the face of obvious and significant hazards to people and property. The chapter concludes with a brief discussion of organizational change processes.
The evolution of modern safety management practice is often described in terms of three somewhat overlapping periods or epochs.1,2 The first phase of development is referred to as the technology period, in which attention was focused on finding and applying engineering or other technological measures to control hazards and prevent work-related injuries.
1 Hale, A. R., and J. Hovden. Management and culture: The third age of safety. A review of approaches to organizational aspects of safety, health and environment. Occupational Injury: Risk, Prevention and Intervention, A. M. Feyer and A. Williamson, eds. Taylor & Francis, London and Bristol, PA, 2003: 129-165.
2 Hudson, P. Implementing a safety culture in a major multi-national. Safety Science 2007; 45(6): 697-722.
The hierarchy of hazard controls3,4 is one of the most enduring products of this period. Within this framework, hazard controls are organized with the highest priority assigned to actions that eliminate the hazard entirely, followed by those that control or otherwise contain the hazard. Lowest priority is assigned to strategies that may be helpful but do not directly remove or alter the hazard, such as warnings or the provision of personal protective equipment (PPE). This basic hierarchy is also reflected in how the Occupational Safety and Health Administration (OSHA) approaches hazard control. OSHA standards typically give preference to engineering controls, followed by administrative controls (training, work rules, etc.), and lastly to the provision of PPE.
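The priority ordering can be made concrete with a short sketch. The tier ordering below follows the hierarchy and OSHA preference just described; the candidate control measures are invented examples for a chemistry laboratory.

```python
# Sketch of the hierarchy of hazard controls described above.
# Tier ordering follows the text (eliminate > engineer > administer > PPE);
# the candidate measures are invented examples for a chemistry lab.
from enum import IntEnum

class ControlTier(IntEnum):
    """Lower value = higher priority in the hierarchy."""
    ELIMINATION = 1     # remove the hazard entirely
    ENGINEERING = 2     # control or contain the hazard (e.g., fume hood)
    ADMINISTRATIVE = 3  # training, work rules, warnings
    PPE = 4             # personal protective equipment, the last resort

def rank_controls(candidates):
    """Order candidate measures from most to least preferred."""
    return sorted(candidates, key=lambda pair: pair[1])

candidates = [
    ("require goggles and nitrile gloves", ControlTier.PPE),
    ("run the reaction inside a fume hood", ControlTier.ENGINEERING),
    ("redesign the procedure to avoid the hazardous reagent", ControlTier.ELIMINATION),
    ("write and enforce a standard operating procedure", ControlTier.ADMINISTRATIVE),
]
for measure, tier in rank_controls(candidates):
    print(f"{tier.name:>14}: {measure}")
```

The point of the sketch is only that PPE sorts last: it is considered after every option that removes or contains the hazard itself has been examined.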
As industrial and work systems became more complex, the limits of simple technological solutions quickly became apparent. Many of today’s work environments are highly complex, making it difficult to anticipate all possible interactions and failures among multiple components and human operators.5 The traditional view that accidents can be understood as simple linear chains of events has gradually been replaced by a broader systems perspective.6
The systems perspective represents the second epoch of safety and views accidents and other losses as arising from causal factors that reside at multiple levels within complex sociotechnical systems.7,8 The concept of human–systems integration (HSI) is central to the systems perspective. HSI focuses on the interaction of people, tasks, and equipment and technology in the pursuit of some goal or set of goals.9,10 This interaction occurs within, and is influenced by, the broader environmental context.
3 Barnett, R. L., and D. B. Brickman. Safety hierarchy. Journal of Safety Research 1986; 17(2): 49-55.
4 Haddon, W. Energy damage and the ten countermeasure strategies. Human Factors: The Journal of the Human Factors and Ergonomics Society 1973; 15(4): 355-366.
5 Leveson, N. A new accident model for engineering safer systems. Safety Science 2004; 42(4): 237-270.
6 Perrow, C. Normal Accidents: Living with High Risk Technologies (Updated). Princeton University Press, Princeton, NJ, 2011.
7 Rasmussen, J. Risk management in a dynamic society: A modelling problem. Safety Science 1997; 27(2): 183-213.
8 Reason, J. Human error: Models and management. BMJ 2000; 320(7237): 768-770.
9 Booher, H. R. Handbook of Human Systems Integration, Vol. 23. John Wiley & Sons, Hoboken, NJ, 2003.
10 Czaja, S. J., and S. N. Nair. Human factors engineering and systems design. Handbook of Human Factors and Ergonomics, 3rd Ed., John Wiley & Sons, Hoboken, NJ, 2006: 32-49.
HSI acknowledges that people differ in their cognitive, perceptual, and physical capabilities, and that these capabilities influence how they interact with tasks, equipment, and technology. These interactions take place in some larger environment or set of environments, whose characteristics can either facilitate or impede the successful use of equipment and technology and the completion of tasks.
Work systems are basically open systems; that is, they can be influenced by both internal and external factors. For example, external factors such as economic conditions or competitive pressures have the potential to impact safety, either positively or negatively. Also inherent in the systems perspective is the idea that some systems may require defenses in depth or redundant controls at different points or levels within the system. The distinction between active and latent failures is also pertinent to the systems approach.11 Errors or mistakes made by frontline workers or researchers are frequently referred to as active failures, and these active failures are often the result of actions or decisions taking place (or not taking place) at higher levels of the organization (latent failures). Effective and permanent solutions to safety problems at the lab bench level often require the identification and elimination of these latent failures, which sometimes are hidden or lie dormant within organizations for long periods of time before contributing to adverse events.
An important feature of the systems approach is the acknowledgment that entire systems can degrade subtly or drift toward failure. Various specialized analytic tools or techniques have been developed to help identify and guard against such shifts in system integrity. These specialized tools include a variety of different risk assessment methodologies such as fault tree analysis and failure modes and effects analysis. Most of these techniques can be used to analyze system vulnerabilities or to reconstruct and understand why failures occurred. Systems safety also makes use of safety audits and other techniques that can be used to monitor system performance and provide early detection of changes in key system parameters.
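As a minimal illustration of one such technique, a fault tree estimates the probability of a top-level failure by combining the probabilities of basic events through AND/OR gates. The sketch below assumes independent events; the event names and probabilities are invented.

```python
# Minimal fault-tree calculation: the probability of a top event is
# computed from independent basic events combined through AND/OR gates.
# Event names and probabilities are illustrative only.
from math import prod

def p_and(*probs):
    """AND gate: the output fails only if every input fails (independence assumed)."""
    return prod(probs)

def p_or(*probs):
    """OR gate: the output fails if at least one input fails (independence assumed)."""
    return 1 - prod(1 - p for p in probs)

# Top event: "uncontrolled vapor release at the bench"
# = (ventilation fails OR sash left open) AND (solvent spill occurs)
p_vent_fail = 0.01
p_sash_open = 0.05
p_spill = 0.02
p_top = p_and(p_or(p_vent_fail, p_sash_open), p_spill)
print(f"P(top event) = {p_top:.5f}")
```

Working the tree backward from an actual incident, rather than forward from assumed probabilities, gives the complementary reconstruction use mentioned above.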
The emphasis on culture, specifically safety culture, represents the third epoch of modern safety management. This shift or expansion came about from the realization that it is not enough to provide safe equipment, systems, and procedures if the culture of the organization does not encourage and support safe working. Hudson12 argues that safety culture is probably the most important issue in modern thinking and practice in safety. The investigative report13 that followed the Chernobyl nuclear disaster is usually credited with introducing the concept of safety culture. Since that time, safety culture has been a prominent feature in the investigation and analysis of most major or catastrophic accidents, including, for example, the recent Deepwater Horizon oil spill. In essence, safety culture forms the organizational context in which all actions pertinent to safety occur.

11 Reason, J. Achieving a safe culture: Theory and practice. Work & Stress 1998; 12(3): 293-306.
Although there is no uniform definition offered in the literature, “safety culture” arose from a more general understanding of organizational culture. Edgar Schein, a psychologist credited with pioneering the field of organizational culture, explains that culture embodies values, beliefs, and underlying assumptions.14 Schein describes culture as something that is developed over time by a group as it “solves its problems of external adaptation and internal integration, which has worked well enough to be considered valid, and therefore to be taught to new members as the correct way to perceive, think, and feel in relation to those problems.”15 Table 2-1 presents a summary of three models of organizational culture generally accepted by behavioral and social scientists. Taken further, safety culture is most often identified by an organization’s response to, or prevention of, workplace accidents.
Safety culture, as typically defined, refers to the organization’s shared values, assumptions, and beliefs specific to workplace safety, or more simply, the relative importance of safety within the organization.
Numerous attempts have been made to identify the key attributes or characteristics of a positive safety culture, and although the various frameworks differ in the details, there are clearly more similarities than differences.16,17,18 For example, virtually all discussions of safety culture highlight the fundamental importance of management commitment and active involvement. Frameworks also emphasize the importance of communication and the free exchange of safety-related information, especially the freedom of all members to report hazards and to be heard on matters involving safety. Positive safety cultures also place high importance on hazard identification and control, as well as on continuous learning and improvement. To a considerable extent, achieving a safety culture that emphasizes learning and improvement requires a culture that seeks and values information and that assigns greater importance to problem solving than to assigning blame. A positive safety culture is one in which high relative importance is assigned to safety all the time, not just when it is convenient or does not threaten personal or institutional productivity goals. The strongest, most positive safety culture is established when all members at all levels of the organization agree on the importance of safety. Particularly within large or loosely structured organizations, however, there are many opportunities for “disconnects” to occur as to the primacy of the safety mission. Such variability or heterogeneity can easily undermine safety performance. Disconnects also can occur in work situations where individual members

12 Hudson, P. Implementing a safety culture in a major multi-national. Safety Science 2007; 45(6): 697-722.

13 International Atomic Energy Agency. Safety Culture: A Report by the International Nuclear Safety Advisory Group. Safety Series No. 75-INSAG-4. Vienna, Austria, 1991.

14 Schein, E. H. Organizational Culture and Leadership. John Wiley & Sons, Hoboken, NJ, 2010.

15 Id., p. 18.

16 DeJoy, D. M. Behavior change versus culture change: Divergent approaches to managing workplace safety. Safety Science 2005; 43(2): 105-129.

17 Hopkins, A. Studying organisational cultures and their effects on safety. Safety Science 2006; 44(10): 875-889.

18 Wiegmann, D. A., H. Zhang, T. L. von Thaden, G. Sharma, and A. Mitchell Gibbons. Safety culture: An integrative review. International Journal of Aviation Psychology 2004; 14(2): 117-134.

TABLE 2-1 Three Models of Organizational Culture

| Model | Surface level | Intermediate level | Deepest level |
|---|---|---|---|
| Schein (1985) | Behaviors and artefacts | Beliefs and values | Underlying assumptions |
| Rousseau (1988, 1990) | Observable artefacts (e.g., company logo); observable patterns of behavior | Behavioral norms, which can be inferred from observed behaviors; values, as expressed consciously by organization members | Fundamental assumptions: core values that may not be articulated |
| Deal and Kennedy (1986); Lundberg (1990) | Manifest level: symbolic artefacts, language, stories, rituals, normative behaviors | Strategic level: strategic beliefs | Core level: ideologies, values, assumptions |

Adapted from Glendon, A. I., and N. A. Stanton. Perspectives on safety culture. Safety Science 2000; 34(1): 193-214.
24 Westrum, R. A typology of organisational cultures. Quality and Safety in Health Care 2004; 13(S2): ii22-ii27.
25 Parker, D., M. Lawrie, and P. A. Hudson. A framework for understanding the development of organisational safety culture. Safety Science 2006; 44(6): 551-562.
Besides identifying the core traits of positive safety cultures, other researchers have sought to create taxonomies of safety culture types. These taxonomies can be used by organizations for purposes of self-assessment and change or they can be used to help verify and refine the key attributes of safety culture. Westrum developed a taxonomy consisting of three types of cultures that were distinguished primarily in terms of how information is handled.24 His three culture types were pathological, bureaucratic, and generative. Pathological cultures are basically power-oriented and information is viewed as a personal resource to be guarded. Bureaucratic cultures are heavily rule-oriented, and information is often not welcome or is ignored. Generative cultures, on the other hand, are more performance-oriented. In such cultures, information is welcomed, and efforts are made to get the right information to the right person at the right time. The pathological culture is a blame-type culture, the bureaucratic culture is a compliance-type culture, and a generative culture is a more proactive and positive culture.
Others have extended this basic typology. Parker and colleagues, in particular, describe five culture types: pathological, reactive, calculative, proactive, and generative. They summarize the five cultures as follows: pathological, “who cares as long as we are not caught”; reactive, “safety is important: we do a lot every time we have an accident”; calculative, “we have systems in place to manage all hazards”; proactive, “we try to anticipate safety problems before they arise”; and generative, “health, safety, and environment is how we do business around here.”25 These authors also outline how each culture type would likely handle various
aspects of safety management, such as safety audits and reviews, work planning, and handling contractors.
Mindfulness is a psychological quality that involves bringing one’s complete attention to the present experience on a moment-to-moment basis in a nonjudgmental way. The mindless following of routine and other automatic behaviors leads to error, pain, and a predetermined course of life. Mindfulness, in contrast, stresses process over outcomes, allows free rein for intuition and creativity, and opens us to new information and perspectives. When applied to safety, the concept of mindfulness extends to groups as well as individuals. Indeed, collective mindfulness is an important factor in achieving high levels of safety in high-hazard situations.26
The development of situational awareness requires mindfulness. While there are many definitions of situational awareness, Endsley’s is probably the most commonly used: “the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.”27 The concept is most often applied in complex domains, such as air traffic control or surgery, and is called upon in time-critical situations in which choices must be made quickly by decision makers, with the support of other team members and a myriad of information coming from other sources. Situational awareness relates more to achieving immediate tactical objectives than to long-term objectives.
The development of sense-making requires situational awareness. Sense-making addresses more long-term strategic issues than situational awareness. Klein and colleagues define sense-making as “a motivated, continuous effort to understand connections (which can be among people, places and events) in order to anticipate their trajectories and act effectively.”28 It is a constant process of acquisition, reflection, and action.
Their view of the process is one shared by many organizational theorists (e.g., Westrum29): in a large organization, various people may hold different pieces of data, and different levels of awareness of events, all of which are critical to the success of a given project. Sense-making is deeply
26 Langer, E. J. Mindfulness. Addison Wesley Longman, Boston, MA, 1989.
27 Endsley, M. R. Measurement of situation awareness in dynamic systems. Human Factors: The Journal of the Human Factors and Ergonomics Society 1995; 37(1): 65-84.
28 Klein, G., B. M. Moon, and R. R. Hoffman. Making sense of sensemaking, 1: Alternative perspectives. IEEE Intelligent Systems 2006; 21(4): 70-73.
29 Westrum, R. A typology of organisational cultures. Quality and Safety in Health Care 2004; 13(S2): ii22-ii27.
related to a process of “socialization,” whereby those with ideas and data share them with others in an effort to actively disseminate information and build consensus. In Klein and colleagues’30 view, sense-making is a process that is both personal and shared, one that takes place over a long period of time, and one that is heavily dependent on a perspective or point of view.31
Promoting workers’ involvement at all levels can be an effective way to help build and sustain a positive safety culture.32 It is especially important for improving the exchange of safety-related information, fostering collective mindfulness and sense-making, empowering workers to speak up and share what they know, and creating a learning and improvement focus. Involvement also can facilitate the successful implementation of new programs and initiatives.33 In many work situations, managers or other leaders are often unaware of frontline safety problems. Getting frontline workers involved in thinking and talking about safety is one way to address this problem and leverage the expertise that these workers possess. As Susan Silbey argues, “Lower-level actors are often repositories of critical information, yet are often unable to persuade higher-ups in the organization of either the credibility of their knowledge or the relevance of their perspective.”34 Worker involvement has been linked to better safety outcomes in a number of work settings, including chemical plants,35 oil and gas extraction,36 and health care.37 Health and safety committees are perhaps the most frequently employed worker involvement
30 Klein, G., B. M. Moon, and R. R. Hoffman. Making sense of sensemaking, 1: Alternative perspectives. IEEE Intelligent Systems 2006; 21(4): 70-73.
31 Kolko, J. Sensemaking and Framing: A Theoretical Reflection on Perspective in Design Synthesis. frog design & Austin Center for Design, Austin, TX, 2010. Available at http://www.designresearchsociety.org/docs-procs/DRS2010/PDF/067.pdf.
32 Simard, M., and A. Marchand. Workgroups’ propensity to comply with safety rules: The influence of micro-macro organisational factors. Ergonomics 1997; 40(2): 172-188.
33 Lawler, E. J. Affective attachments to nested groups: A choice-process theory. American Sociological Review 1992; 57(3): 327-339.
34 Silbey, S. S. Taming Prometheus: Talk about safety and culture. Annual Review of Sociology 2009; 35 (2009): 341-369.
35 Hofmann, D. A., and A. Stetzer. A cross-level investigation of factors influencing unsafe behaviors and accidents. Personnel Psychology 1996; 49(2): 307-339.
36 Mearns, K., S. M. Whitaker, and R. Flin. Safety climate, safety management practice and safety performance in offshore environments. Safety Science 2003; 41(8): 641-680.
37 Singer, S., S. Lin, A. Falwell, D. Gaba, and L. Baker. Relationship of safety climate and safety performance in hospitals. Health Services Research 2009; 44(2 Pt 1): 399-421.
strategy specific to safety,38 but employee involvement can take many different forms.
Behavioral and organization scientists have devoted considerable attention to various types of worker involvement approaches. High-performance work systems and high-involvement work processes (HIWPs) are two approaches that have received considerable research attention. Both approaches involve sets of work practices designed to leverage employee motivation and creativity, and in some sense represent reactions against scientific management and its centralization of decision making and problem solving at the management level.39 Using HIWPs as an example, Edward Lawler proposed a framework consisting of four processes: power (P), information (I), reward (R), and knowledge (K).40 These four processes (PIRK) are intended to be mutually reinforcing. HIWPs empower workers to make more decisions on the job, provide them with the information and knowledge they need for decision making, and reward them for doing so.
Employee empowerment is a central feature of most high-performance and high-involvement models and frameworks. Social support within the workgroup and from managers and supervisors is important in empowering employees and giving them “voice” in safety matters. This permits them to speak out and to modify or halt work that they consider too risky.41,42 Empowerment is also a key attribute of high-reliability organizations (HROs). HROs strive for constant safety mindfulness, and there is deference to expertise whereby authority migrates down the command structure to whoever has the most pertinent knowledge or the best perspective for understanding and solving a problem.43,44 This priority on safety mindfulness often extends to recognizing and rewarding people
38 Dunlop Commission. Report and Recommendations of the Commission on the Future of Worker-Management Relations. U.S. Department of Labor and Department of Commerce, Washington, DC, 1994: Section II.
39 Boxall, P., and K. Macky. Research and theory on high-performance work systems: Progressing the high-involvement stream. Human Resource Management Journal 2009; 19(1): 3-23.
40 Lawler, E. E. High-Involvement Management. Jossey-Bass, San Francisco, CA, 1986.
41 Conchie, S. M., P. J. Taylor, and I. J. Donald. Promoting safety voice with safety-specific transformational leadership: The mediating role of two dimensions of trust. Journal of Occupational Health Psychology 2012; 17(1): 105-115.
42 Tucker, S., N. Chmiel, N. Turner, M. S. Hershcovis, and C. B. Stride. Perceived organizational support for safety and employee safety voice: The mediating role of coworker support for safety. Journal of Occupational Health Psychology 2008; 13(4): 319-330.
43 Roberts, K. H. Some characteristics of one type of high reliability organization. Organization Science 1990; 1(2): 160-176.
44 Rochlin, G. I. Safe operation as a social construct. Ergonomics 1999; 42(11): 1549-1560.
even when their safety-related concerns prove to be inaccurate or not well founded.
Forming workgroups or teams is another strategy for increasing employee involvement and empowerment. Effective teams have shared mental models and group situation awareness; they also efficiently process, share, and use information. These attributes are especially important in emergency and high-stress situations where performance must be adapted to cope with rapidly changing or unexpected conditions. Simulations and other interactive training activities allow team members to operate as a team while training, engage in the social, cognitive, and behavioral processes of team performance, and receive feedback based on their performance. Training approaches such as Cockpit Resource Management (CRM) in aviation45 directly foster team skills, including assertiveness, maintaining shared situation awareness, and communication. Considerable research, much involving cockpit crews, but also some in health care, underscores the importance of group processes and group cohesion in overall safety.46,47
This section describes and discusses several examples of industries that have adopted many of the principles and approaches described above with the primary goal of improving safety performance. Many of these industries involve complex technologies, obvious inherent hazards, and the potential for serious or even catastrophic losses. The experiences of these industries may be relevant to experimental research with chemicals.
Aviation safety was given a boost by a National Aeronautics and Space Administration (NASA)-sponsored workshop, “Resource Management on the Flightdeck,” in 1979. This conference was the outgrowth of NASA’s research into the causes of commercial air transport accidents. The research presented at this meeting identified the human error aspects of the majority of air crashes as failures of communication, decision making, and leadership. At this meeting, the label Cockpit Resource Management (CRM) was applied to the process of training crews to reduce pilot error by making better use of the human resources on the flightdeck.48

45 Wiener, E. L., B. G. Kanki, and R. L. Helmreich, eds. Cockpit Resource Management. Gulf Professional Publishing, Houston, TX, 1993.

46 Clarke, S. The contemporary workforce: Implications for organisational safety culture. Personnel Review 2003; 32(1): 40-57.

47 Helmreich, R. L., and A. C. Merritt. Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences. Ashgate Publishing, Surrey, UK, 2001.
The first comprehensive cockpit or CRM program was initiated by United Airlines (UAL) in 1981 following a devastating UAL accident in Portland, Oregon. As CRM developed, training emphasis was placed increasingly on group dynamics. The new courses dealt with more specific aviation concepts related to flight operations, became more modular, and became more team-oriented in nature. Basic training conducted in intensive seminars included concepts such as team building, briefing strategies, situational awareness, and stress management. Specific modules addressed decision-making strategies and breaking the chain of errors that can result in catastrophe. Much of CRM addresses communication processes and power and knowledge differentials within interdependent work groups.
Fortunately, actual incidents involving injury and damage to property are relatively rare, even in very high hazard environments. Much more common, however, are near misses. Near misses are incidents or events that could have resulted in injuries or other adverse consequences, but fortunately did not.49,50 The loss potential of a near miss is quite real; the difference between a near miss and an actual accident often amounts to a fraction of a second or a fraction of an inch. With a near miss, some combination of unsafe conditions and/or behaviors existed and a sequence of events unfolded that could have led to adverse outcomes. Although often ignored, near misses represent an important data source for learning and prevention. Near misses are often symptomatic of some type of system vulnerability or degradation, which, if uncorrected, may cause serious problems in the future. They might best be viewed as instructive. The importance and usefulness of reporting and tracking near misses has gained broad recognition in many areas of safety practice. Near miss reporting can be a useful part of the surveillance and monitoring component of a comprehensive safety management system.
48 Helmreich, R. L., A. C. Merritt, and J. A. Wilhelm. The evolution of Crew Resource Management training in commercial aviation. International Journal of Aviation Psychology 1999; 9(1): 19-32.
49 Jones, S., C. Kirchsteiger, and W. Bjerke. The importance of near miss reporting to further improve safety performance. Journal of Loss Prevention in the Process Industries 1999; 12(1): 59-67.
50 Wright, L., and T. Van der Schaaf. Accident versus near miss causation: A critical review of the literature, an empirical test in the UK railway domain, and their implications for other sectors. Journal of Hazardous Materials 2004; 111(1): 105-110.
The Federal Aviation Administration (FAA) has perhaps the best-known near-miss reporting system in the United States. This system, the Aviation Safety Reporting System (ASRS), allows pilots and other personnel to confidentially report near misses and other close calls. The reporting system was first established in 1976. The ASRS is confidential and independent, and near-miss reports cannot normally be used in any FAA enforcement actions. Independence is achieved by having the system maintained by NASA. Those making reports do not have to (but may) provide their name and contact information. Once staff analysts are satisfied with the information contained in a report, contact information is removed from the report. ASRS analysts may identify hazardous situations from reports and issue “Alert Messages” to organizations within the aviation sector. The database of reports is also used for research (modeling, trending, root-cause taxonomies, etc.) and other purposes intended to better inform the aviation community and benefit safety. ASRS reports are available from NASA’s ASRS website. The database is searchable and available to the public.
Near-miss reports can be submitted electronically or by mail. The report form includes space for describing the event or situation. Cues are provided on the form encouraging reporters to address causal and contributing factors, the sequence of events involved, and any human performance factors involved. Much of the remainder of the form consists of sets of checkboxes that collect information specific to different contributing factors and conditions. Since its inception, over 1 million reports have been submitted, including over 70,000 reports in 2012. The system is promoted as confidential, voluntary, and nonpunitive. The National Firefighter Near-Miss Reporting System is of more recent origin. This system is based closely on the ASRS and is supported and funded by the U.S. Department of Homeland Security as part of the Assistance to Firefighters Grants Program. Near-miss reporting systems have been advocated for a number of other industries, including the chemical process industry51 and the health care industry.52
A commitment by the entire scientific community to promote an effective near-miss reporting system might ultimately be productive in practice. The difficulty lies in the level of detail required in reporting the chemicals and materials involved in the potential hazard, as well as in documenting the level of experience of those involved. However, the
51 Phimister, J. R., U. Oktem, P. R. Kleindorfer, and H. Kunreuther. Near-miss incident management in the chemical process industry. Risk Analysis 2003; 23(3): 445-459.
52 Institute of Medicine. Patient Safety: Achieving a New Standard of Care. The National Academies Press, Washington, DC, 2004.
chemistry community, with all its levels of expertise, has a great opportunity to optimize an anonymous near-miss reporting system.
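To make the idea concrete, a minimal anonymous near-miss record for a chemistry lab might look like the sketch below, loosely modeled on the ASRS form fields described earlier. All field names and the example report are hypothetical.

```python
# Hypothetical sketch of an anonymous near-miss report record for a
# chemistry lab, loosely modeled on the ASRS fields described above.
# Field names are invented for illustration only.
from dataclasses import dataclass, field, asdict

@dataclass
class NearMissReport:
    narrative: str                            # free-text account of the event
    contributing_factors: list = field(default_factory=list)
    chemicals_involved: list = field(default_factory=list)
    reporter_experience: str = "unspecified"  # e.g., "second-year graduate student"
    # Deliberately no name or contact field: reports stay anonymous by design.

report = NearMissReport(
    narrative="Capped a warm flask of waste solvent; it vented unexpectedly.",
    contributing_factors=["time pressure", "no SOP for solvent waste"],
    chemicals_involved=["acetone", "dichloromethane"],
    reporter_experience="second-year graduate student",
)
print(asdict(report))
```

In practice, records of this kind would feed the sort of searchable, trend-analyzable database that the ASRS maintains for aviation.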
Since the Institute of Medicine’s (IOM’s) 2000 publication of To Err Is Human,53 the health care community has given a great deal of attention to patient safety. To Err Is Human and its 2001 follow-up publication, Crossing the Quality Chasm,54 both concluded that health care is not as safe as it should be, estimating that between 44,000 and 98,000 patients die in U.S. hospitals every year as a result of preventable medical errors. The highest error rates with serious consequences are most likely to occur in intensive care units, operating rooms, and emergency departments.
The authors hypothesize that error rates are so high because of the decentralized and fragmented nature of health care in the United States. They propose that medical errors are not the result of individual recklessness, but of faulty systems and of pressures that lead people to make mistakes or fail to prevent them.
Crossing the Quality Chasm recommends redesigning the American health care system and provides specific direction for policy makers, health care leaders, clinicians, regulators, purchasers, and others. Health care providers are asked to adopt a shared vision of six specific aims for improvement. These aims are built around the core need for health care to be
- Safe: avoiding injuries to patients from the care that is intended to help them.
- Effective: providing services based on scientific knowledge to all who could benefit, and refraining from providing services to those not likely to benefit.
- Timely: reducing waits and sometimes harmful delays for both those who receive and those who give care.
- Efficient: avoiding waste, including waste of equipment, supplies, ideas, and energy.
- Equitable: providing care that does not vary in quality because
53 Institute of Medicine. To Err Is Human: Building A Safer Health System. National Academy Press, Washington, DC, 2000.
54 Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. National Academy Press, Washington, DC, 2001.
of personal characteristics such as gender, ethnicity, geographic location, and socioeconomic status.55
In response to the IOM and Congress, in 2001 the Agency for Healthcare Research and Quality (AHRQ) renamed its Center for Quality Measurement and Improvement as the Center for Quality Improvement and Patient Safety. This was the first step in AHRQ’s effort to refocus and concentrate in one unit its research and implementation activities devoted to safety in health care.
In 2008, AHRQ published Becoming a High Reliability Organization: Operational Advice for Hospital Leaders, which spelled out Weick and Sutcliffe’s56 high-reliability characteristics:
- Sensitivity to operations. HROs recognize that manuals and policies constantly change and are mindful of the complexity of the systems in which they work. HROs work quickly to identify anomalies and problems in their system to eliminate potential errors. Maintaining situational awareness is important for staff at all levels because it is the only way anomalies, potential errors, and actual errors can be quickly identified and addressed.
- Reluctance to simplify. HROs refuse to simplify or ignore the explanations for difficulties and problems that they face. These organizations accept that their work is complex and do not accept simplistic solutions for challenges confronting complex and adaptive systems. They understand that their systems can fail in unexpected ways that have never happened before and that they cannot identify all the ways in which their systems could fail in the future.
- Preoccupation with predicting potential failures. HROs are focused on predicting and preventing catastrophes rather than reacting to them. These organizations constantly entertain the thought that they may have missed something that places patients at risk. Near misses are viewed as opportunities to improve current systems by examining strengths, determining weaknesses, and devoting resources to improve and address them.
- Deference to expertise. HROs cultivate a culture in which team members and organizational leaders defer to the person with the most knowledge relevant to the issue they are confronting. The most experienced person or the person highest in the
56 Weick, K. E., and K. M. Sutcliffe. Managing the Unexpected: Assuring High Performance in an Age of Complexity. Wiley India, New Delhi, 2006.
organizational hierarchy does not necessarily have the information most critical to responding to a crisis.
- Resilience. HROs pay close attention to their ability to quickly respond to and contain errors and recover when difficulties occur. Thus, systems can function despite setbacks.57
The health care industry also promotes patient safety through organizations such as the National Patient Safety Foundation and its Lucian Leape Institute. Have these activities improved patient safety? It is hard to know because, as is true for aviation, the obvious indicators, such as morbidity and mortality, are influenced by many variables. Unlike aviation, however, there are enough incidents to obtain reliable statistical metrics.
Stelfox and co-workers searched MEDLINE for articles on patient safety and medical error from November 1, 1994, to November 1, 2004, and examined federal funding of patient safety research from 1995 to 2004.58 The rate of publication of patient safety research was significantly (p < .01) higher after publication of To Err Is Human than before. Prior to the book’s publication, patient safety publications were overwhelmingly about malpractice; however, after publication, they were overwhelmingly about culture. Research support was also higher after the book’s publication. Many of the activities described are relevant to the academic chemistry community and are worth consideration.
Foundations in Regulation

Mandatory Safety and Health Standards
Because of their size and scope of work, industrial research facilities are most often subject to OSHA regulations. As such, their fundamental laboratory safety principles are driven by 29 CFR § 1910.1450, the Occupational Exposure to Hazardous Chemicals in Laboratories standard (the “Laboratory Standard”), and various other hazard-specific standards, such as 29 CFR § 1910.1030, Bloodborne pathogens, and § 1910.101, Compressed gases. These
57 Hines, S., K. Luna, J. Lofthus, M. Marquardt, and D. Stelmokas. Becoming a High Reliability Organization: Operational Advice for Hospital Leaders. AHRQ Publication No. 08-0022. Agency for Healthcare Research and Quality, Rockville, MD, April 2008. Available at http://www.ahrq.gov/professionals/quality-patient-safety/quality-resources/tools/hroadvice/hroadvice.pdf.
58 Stelfox, H. T., S. Palmisani, C. Scurlock, E. J. Orav, and D. W. Bates. The “To Err Is Human” report and the patient safety literature. Quality and Safety in Health Care 2006; 15(3): 174-178.
standards target individual hazards and follow a basic logic: identify the hazard, evaluate the hazard, train workers, and control the hazard.
Industrial research facilities are often associated with production facilities. These facilities are also required to follow OSHA’s § 1910.119, Process safety management of highly hazardous chemicals (also known as the Process Safety Standard), if they utilize processes that involve specific chemicals above the threshold quantities listed in the standard. While the purpose of the standard is to prevent or minimize the consequences of catastrophic chemical releases, its requirements have the added benefit of increasing individual safety. Because bench-scale research facilities are used to test production ideas prior to scale-up, these facilities often utilize modified versions of the Process Safety Standard requirements. Examples of standard requirements that can be modified for research activities include the following:
- Hazard analyses are conducted at the process level utilizing methods similar to those prescribed by the standard (i.e., what-if, checklist, hazard and operability study, failure modes and effects analysis, or fault tree analysis).
- The analysis addresses hazards of the process, the identification of previous incidents that could lead to catastrophic consequences, the identification of engineering and administrative controls, the consequences of failure, and human factors.
- Employees are involved in the hazard analysis.
- Systems are designed to comply with code requirements and generally accepted good engineering practices.
- Written operating procedures are utilized to provide clear instructions for safely conducting an activity.
- Employees are trained in the safe conduct of the process as well as the emergency procedures required should a failure occur.
- The hazard analysis is periodically revisited and modified if changes are anticipated in the process.
- Inspections and testing are used to identify drift from expected performance.
Research institutions that incorporate these principles into their research and development activities move from using a predefined set of controls and standard laboratory practices to incorporating a systematic approach to recognition, evaluation, and control of high-hazard activities into their way of doing business.
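One of the hazard analysis methods listed above, failure modes and effects analysis, includes a simple risk-ranking step that can be sketched briefly. In the common formulation, severity, occurrence, and detection are each scored on a 1-10 scale and multiplied into a risk priority number (RPN) used to prioritize controls. The process steps and scores below are invented for illustration.

```python
# Minimal sketch of the FMEA risk-ranking step. Severity (S), occurrence
# (O), and detection (D) are scored 1-10 and multiplied into a risk
# priority number (RPN = S * O * D). All entries are hypothetical.
failure_modes = [
    # (process step, failure mode, severity, occurrence, detection)
    ("solvent transfer", "spill of flammable solvent", 7, 4, 3),
    ("reaction quench",  "runaway exotherm",           9, 2, 5),
    ("waste disposal",   "incompatible waste mixed",   8, 3, 4),
]

# Rank failure modes from highest to lowest RPN to set control priorities.
ranked = sorted(
    ((s * o * d, step, mode) for step, mode, s, o, d in failure_modes),
    reverse=True,
)
for rpn, step, mode in ranked:
    print(f"RPN {rpn:3d}  {step}: {mode}")
```

The point of the exercise is not the arithmetic but the discipline: every step of a process is examined, scored, and prioritized before resources are allocated to controls.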
Growth through Adoption of Consensus Safety and Health Standards
In 1989, OSHA announced its intent to publish voluntary guidelines that employers could use to develop safety and health management programs. The guidelines were not well received by the public, and OSHA ultimately withdrew the proposal. Since then, consensus standards have been developed that incorporate many of the principles laid out by OSHA in its proposed rulemaking. Most notably, the American Industrial Hygiene Association served as secretariat, in cooperation with the American Society of Safety Engineers, for the publication of ANSI Z10, American National Standard for Occupational Health and Safety Management Systems, and a number of cooperating national standards bodies from around the world assisted in the development of OHSAS 18001, Occupational Health and Safety Management Systems.59 Many industrial research facilities have voluntarily adopted these standards because (1) they find value in applying the management system approach to improve organizational performance, (2) the basic structure of the consensus standards creates a framework with which the institution can demonstrate compliance with a number of specification standards, and (3) they see implementation as a competitive advantage in the international marketplace.
The standards incorporate principles designed to integrate health and safety into the fabric of an organization rather than leaving them as a standalone set of processes or standards. Marked differences between these standards and more traditional regulations include
- Management leadership and commitment;
- Clearly defined roles, responsibilities, accountabilities, and authorities;
- Identification of institutional risks, followed by performance objective and resource allocations;
- Incident investigation;
- Focus on preventive actions;
- Clear involvement by management in the review of system performance; and
- Voluntary assessment by external registration bodies.
Institutions that voluntarily follow these standards are consciously or subconsciously agreeing to modify their culture.
59 BS OHSAS 18001 Occupational Health and Safety Management. http://www.bsigroup.com/en-GB/ohsas-18001-occupational-health-and-safety/. Accessed July 28, 2014.
Safety has been a primary consideration in the nuclear industry from the very start, beginning with the Manhattan Project during World War II. Part of this concern was obviously related to the magnitude of the hazards involved and the potential for serious or catastrophic harm, not just to workers, but to the general public and the environment. The multidisciplinary nature of the enterprise also contributed to a heightened focus on safety: harnessing nuclear power and building reactors required scientists and engineers from multiple disciplines to work together and meld their different perspectives on design and construction. Events such as Three Mile Island and Chernobyl have served to reinforce these concerns.60 The nuclear industry has been highly regulated from the very beginning, and the safety of nuclear energy remains a visible and sometimes volatile public policy issue.
Perrow emphasizes that some technological systems possess certain characteristics that make them inherently hazardous.61 From his perspective, two dimensions are particularly important: complexity and tight coupling. Complex systems, defined as those involving multiple interactions and many different components, are inherently more susceptible to unanticipated outcomes and mistakes than operations involving simple linear interactions. Tight coupling exists when there is little opportunity to correct or counteract errors or malfunctions once they occur. In a tightly coupled system, minor errors or failures can rapidly cascade out of control and produce serious consequences before corrective measures can be taken. Nuclear power plants are both very complex and tightly coupled. The typically simple task of keeping track of system status can be a challenge in such systems. Indeed, this very problem was an important contributing factor in the Three Mile Island incident.
Initial approaches to controlling risk in the nuclear industry focused primarily on providing defense in depth, redundancies, and wide safety margins. These actions were soon supplemented by the application of quality assurance techniques in design and manufacture and the use of continuous testing, inspection, and maintenance to keep system performance within design limits. As the industry continued to develop, systems safety techniques such as fault tree analysis and event trees were utilized to estimate risk and identify system weaknesses and vulnerabilities.
60 Keller, W., and M. Modarres. A historical overview of probabilistic risk assessment development and its use in the nuclear power industry: A tribute to the late Professor Norman Carl Rasmussen. Reliability Engineering & System Safety 2005; 89(3): 271-285.
61 Perrow, C. Normal Accidents: Living with High Risk Technologies (Updated). Princeton University Press, Princeton, NJ, 2011.
Probabilistic risk assessment has become an important component of safety management within this industry. The Three Mile Island incident provided an important stimulus for increased attention to issues such as operator training and human factors. It also led to increased application of accident scenarios, simulation techniques, and the monitoring and investigation of near misses and other precursor events. Greater acceptance was given to the idea that even minor events can cause major losses. As the industry has matured, there has been increased acknowledgment that each power plant is unique and may have its own specific vulnerabilities. The nuclear industry, along with other high-hazard industries, has also come to realize the importance of “upstream” organizational and managerial factors in accident causation and safety performance.62,63
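The basic arithmetic behind fault tree analysis can be sketched briefly: basic-event probabilities combine through AND gates (multiplication) and OR gates (the complement of the product of complements), under an assumption of independent events. The toy system, event names, and probabilities below are hypothetical.

```python
# Sketch of fault tree gate arithmetic, assuming independent basic events.

def and_gate(*probs):
    """All inputs must fail for the gate output to occur."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """Any single input failing causes the gate output to occur."""
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical top event: loss of cooling. It occurs if the primary pump
# AND its backup both fail, OR if the power supply fails.
p_pump_a, p_pump_b, p_power = 0.01, 0.02, 0.001
p_top = or_gate(and_gate(p_pump_a, p_pump_b), p_power)
print(f"P(top event) = {p_top:.6f}")
```

Even this toy calculation shows the value of the technique: the redundant pump pair contributes far less to the top-event probability than the single unprotected power supply, pointing directly at the system's weakest element.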
The safety culture concept originated in the nuclear industry in the aftermath of the Chernobyl disaster in 1986.64 As discussed previously, in 1991, the International Atomic Energy Agency (IAEA) issued a comprehensive report on safety culture, defining it for the nuclear industry. Safety culture was defined as “that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance.”65 The definition was crafted to emphasize both organizational and individual commitment: management’s responsibility for policy and the operational framework, and staff’s responsibility for commitment and competence. The IAEA also offered quite detailed guidance for establishing and managing a positive safety culture. The U.S. Nuclear Regulatory Commission has also issued several reports and statements pertinent to safety culture. Some of the earlier documents focused on assigning top priority to safety and making sure that employees can raise safety concerns without fear of retaliation. In 1998, the Nuclear Regulatory Commission initiated its Reactor Oversight Process.66 This process included three cross-cutting themes that were intended to apply to all aspects of safety: human performance, management attention to safety
62 Flin, R., K. Mearns, P. O’Connor, and R. Bryden. Measuring safety climate: Identifying the common features. Safety Science 2000; 34(1): 177-192.
63 Weick, K. E., K. M. Sutcliffe, and D. Obstfeld. Organizing for high reliability: Processes of collective mindfulness. Crisis Management, Vol. 3, A. Boin, ed. Sage, London, UK, 2008: 81-123.
64 Nuclear Energy Agency. Chernobyl and the Safety of Nuclear Reactors in OECD Countries: Report. Organisation for Economic Co-operation and Development, 1987.
65 International Nuclear Safety Advisory Group. Management of Operational Safety in Nuclear Power Plants. INSAG Series 13. International Atomic Energy Agency, 1999.
66 U.S. Nuclear Regulatory Commission. Reactor Oversight Process. NUREG-1649. USNRC, Rockville, MD, 2006.
and workers’ ability to raise safety issues, and finding and fixing problems. The Nuclear Regulatory Commission recently published a more definitive safety culture policy statement in the Federal Register. This statement includes a definition of safety culture and enumerates nine traits of a positive safety culture. Nuclear safety culture was defined as “the core values and behaviors resulting from a collective commitment of leaders and individuals to emphasize safety over competing goals to ensure protection of people and the environment.”67 The nine traits were (1) leadership safety values and actions, (2) problem identification and resolution, (3) personal accountability, (4) work processes, (5) continuous learning, (6) environment for raising concerns, (7) effective safety communication, (8) respectful work environment, and (9) questioning attitude. Some have criticized the early discussions of safety culture in the nuclear industry for being too narrowly focused on administrative procedures and individual attitudes at the expense of broader organizational considerations.68 This most recent statement seems generally consistent with current thinking on safety culture.
Most organizational change efforts occur in response to some type of failure or poor performance.69,70 It follows that organizations seeking to change their safety culture are often doing so because of some significant safety-related problem or perceived vulnerability. In some instances, the actual problem may have occurred elsewhere, but the visibility and notoriety were such that other organizations were prompted or called upon to examine and reassess their own vulnerabilities.
Schein71 presents a general culture change model that builds on the three basic steps or phases of Lewin’s 1951 classic change model.72 Schein describes the three stages as follows: (1) unfreezing and creating the motivation for change; (2) learning new concepts and new meanings for old concepts; and (3) refreezing or internalizing new concepts, meanings,
67 U.S. Nuclear Regulatory Commission. Final safety culture policy statement. Federal Register June 14, 2011; 76(114): 34773-34778.
68 Pidgeon, N., and M. O’Leary. Man-made disasters: Why technology and organizations (sometimes) fail. Safety Science 2000; 34(1): 15-30.
69 Dunphy, D. Organizational change in corporate settings. Human Relations 1996; 49(5): 541-552.
70 Weick, K. E., and R. E. Quinn. Organizational change and development. Annual Review of Psychology 1999; 50(1): 361-386.
71 Schein, E. H. Organizational Culture and Leadership. John Wiley & Sons, Hoboken, NJ, 2010.
72 Lewin, K. Field Theory in Social Science. Harper & Row, New York, 1951.
and standards. These stages reflect the fact that change involves unlearning as well as relearning. In essence, planned organizational change is a conscious learning process.
In the first phase, Schein emphasizes the importance of presenting enough disconfirming data to cause people to be uncomfortable with the current state. Moreover, these data should be linked to important organizational goals and ideals. The free and open exchange of information is a key attribute of a positive safety culture; it is also an important aspect of successful culture change. However, these disconfirming data, although valuable and useful, are really more symptomatic than diagnostic. At this point, further work is needed to take a detailed look at current safety systems, practices, and accountabilities to identify needs and set priorities. A multilevel systems perspective or HSI perspective can be useful to capture both the human and technical aspects of the work situation. Schein argues that change goals should be defined in concrete terms about specific problems that need to be solved and not as “culture change” per se.
Much of the success of the change process involves the creation of psychological safety. People need to feel secure and supported as the change and learning process proceeds. Unfortunately, too often, employees are viewed simply as passive recipients of change activities and other new initiatives.73 Employee involvement can improve the fit and acceptance of new policies, practices, and routines by creating a sense of ownership and procedural fairness. From a culture-change perspective, involvement practices can help produce a push-pull situation where support for change is generated from both the top and the bottom of the organization. However, some organizational research has shown that employees are not always automatically ready to participate at the levels required, and efforts may be needed to build capacity in order to achieve the level of participation desired.74,75,76 Indeed, the perceived lack of psychological safety can easily create anxiety and resistance among employees concerning anticipated changes, discourage them from participating, and ultimately defeat the entire change process.
The second stage of the change process focuses on learning and behavior change. Desired new behaviors can be coerced temporarily through
73 Nielsen, K., T. W. Taris, and T. Cox. The future of organizational interventions: Addressing the challenges of today’s organizations. Work & Stress 2010; 24(3): 219-233.
75 DeJoy, D. M., M. G. Wilson, R. J. Vandenberg, A. L. McGrath-Higgins, and C. S. Griffin-Blake. Assessing the impact of healthy work organization intervention. Journal of Occupational and Organizational Psychology 2010; 83(1): 139-165.
76 LaMontagne, A. D., T. Keegel, A. M. Louie, A. Ostry, and P. A. Landsbergis. A systematic review of the job-stress intervention evaluation literature, 1990–2005. International Journal of Occupational and Environmental Health 2007; 13(3): 268-280.
the use of various enforcement protocols, but these behaviors are not likely to last if they are not accompanied by cognitive restructuring. The goal here is to change how people think about safety in their workplace, to change group norms, and reshape employee behavior-outcome expectations. Changing values, norms, and expectations is the essence of culture change. This is not easily accomplished through any single strategy or action. Consistent top management expectations and support are very important, but this change process almost always requires a well-executed, multicomponent plan that involves consistent messages through multiple channels, well-designed training activities, employee involvement, new methods and standards of evaluation, investment in new equipment and systems, and the use of role models or program champions. Changing safety culture involves altering the process of social exchange between employees and the organization. Social exchange theory77 basically argues that employees evaluate their treatment by the organization and respond proportionally; this notion of reciprocity has been applied to workplace safety.78,79,80 When managers and supervisors demonstrate their commitment and support for safety, employees reciprocate by expending greater effort to follow safe work practices and other safety recommendations.
To a considerable extent, the refreezing or internalization stage needs to show members that the new policies, programs, and behaviors are important and do produce the desired results.81 Consistent with the learning perspective, this is a process of reinforcement and strengthening. The sharing of relevant information about safety performance is important, but even more important is showing that safety goals can be achieved without compromising other important outputs. Of course, the best situation is being able to show that improving safety actually improves other valued outputs. At this point, safety culture surveys, success stories, and employee interviews can be used to help sustain and reinforce the change process and provide additional evaluative data. Andrew Hopkins argues that where safety is a top priority, “the organization will aim to assemble
77 Blau, P. M. Exchange and Power in Social Life. Transaction, Piscataway, NJ, 1964.
78 DeJoy, D. M., L. J. Della, R. J. Vandenberg, and M. G. Wilson. Making work safer: Testing a model of social exchange and safety management. Journal of Safety Research 2010; 41(2): 163-171.
79 Mearns, K. J., and T. Reader. Organizational support and safety outcomes: An uninvestigated relationship? Safety Science 2008; 46(3): 388-397.
80 Neal, A., and M. A. Griffin. Safety climate and safety at work. The Psychology of Workplace Safety, J. Barling and M. R. Frone, eds. American Psychological Association, Washington, DC, 2004.
81 Schein, E. H. Organizational Culture and Leadership. John Wiley & Sons, Hoboken, NJ, 2010.
as much relevant information as possible, circulate it, analyze it, and apply it.”82
Much of the knowledge and experience gained in developing strong safety cultures in other areas can be transferred to academic chemistry research labs. Industrial research facilities, aviation, health care, nuclear power generation, and process safety all provide important examples of best practices that can be applied to high-risk activities. The development of strong safety cultures in these fields demonstrates that training and reporting, peer communication, and hazard assessment are all key elements of a strong safety culture in any environment. For any of these practices to be adopted, however, organizational change must take place. To do this, one must understand the details and dynamics of the institution, the subject of the next chapter.
82 Hopkins, A. Studying organisational cultures and their effects on safety. Safety Science 2006; 44(10): 875-889.