3
Review and Evaluation of Metrics Currently Used at Chemical Agent Disposal Facilities

When representatives of the chemical agent disposal facilities (CDFs) and the Chemical Materials Agency (CMA) gave their presentations, it became apparent that the terminology used to name and describe the metrics was not the same at all sites. Committee members were aware that they, too, might hold still other definitions for a given metric. The same diversity in definitions appeared in much of the information reviewed by the committee. For example, the incidents, actions, and conditions that were categorized as near misses varied considerably from site to site and did not conform to the CMA definition or to definitions found in reference materials. Thus, at one site an unsafe act was categorized as a near miss, while at another it was categorized as an at-risk behavior but not a near miss. Because of this inconsistency in terminology, the committee developed a glossary (Appendix A) for use with this report. While the definitions in the glossary may not agree with those used by all, some, or even any of the CDFs or external organizations, the committee believes that they afford the reader a clear and consistent basis for understanding the terminology used in this report.

SAFETY METRICS

Not surprisingly, each CDF has its own approach to safety and environmental programs, including the metrics it employs. All facilities, however, employ the two metrics that are specifically referenced in the Army’s award fee criteria: the recordable injury rate (RIR) and lost workday cases (LWCs). The latter are cases with days away from work, as specified in the criteria document.
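The award fee criteria document is not reproduced here, so the following is only a minimal sketch of how such rates are typically computed, assuming the standard OSHA incidence-rate normalization of 200,000 exposure hours (100 full-time workers for one year); the Army's criteria may specify a different formula, and the site totals shown are hypothetical.

```python
# Sketch: recordable injury rate (RIR) and LWC rate, assuming the standard
# OSHA incidence-rate normalization of 200,000 exposure hours (100 full-time
# employees working a full year). The actual award fee formula may differ.

OSHA_NORMALIZATION_HOURS = 200_000

def incidence_rate(case_count: int, hours_worked: float) -> float:
    """Cases per 200,000 hours worked (i.e., per 100 full-time-equivalent workers)."""
    if hours_worked <= 0:
        raise ValueError("hours_worked must be positive")
    return case_count * OSHA_NORMALIZATION_HOURS / hours_worked

# Hypothetical site totals for one reporting period.
recordable_injuries = 4
lost_workday_cases = 1
site_hours_worked = 1_250_000

rir = incidence_rate(recordable_injuries, site_hours_worked)      # 0.64
lwc_rate = incidence_rate(lost_workday_cases, site_hours_worked)  # 0.16
print(f"RIR = {rir:.2f}, LWC rate = {lwc_rate:.2f}")
```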

Beyond these two core injury metrics, each site accumulates data and develops metrics in accordance with its particular site safety program. The terminology employed at the various sites relates to

  • Injuries (including illnesses),

  • Incidents,

  • Observations, and

  • Miscellaneous metrics and activities.

In general, injuries and incidents are viewed as lagging indicators, observations are viewed as leading indicators, and miscellaneous metrics and activities can be viewed as one or the other. The types of metrics and activities encountered at the five CDFs are presented in Table 3-1.

The Occupational Safety and Health Administration (OSHA) requires that all injury data be captured at all sites. Even so, not all of the accumulated data are converted into metrics or used as such. All the injury metrics are lagging indicators and are useful for tracking performance and taking corrective action. Some, however, can be used to enhance and sustain awareness and could be viewed as leading indicators. For example, all of the CDFs use the injury metric “hours since the last LWC,” whereas only the Tooele Chemical Agent Disposal Facility (TOCDF) utilizes the metric “time since the last injury”—specifically, days since last recordable injury (RI). The hours since the last LWC metric certainly enhances awareness and site pride with respect to one type of injury, but it does not speak to other types of injuries, which in a true safety culture are equally important.
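As an illustration of how such awareness counters might be maintained, here is a small sketch built on a hypothetical event log; it treats "days since last RI" as calendar days since the most recent recordable injury and "hours since last LWC" as exposure hours accumulated since the most recent lost workday case. The dates and hours are invented, and the sites may count these quantities differently.

```python
# Sketch: awareness counters from a hypothetical event log. Assumes "days since
# last RI" is calendar days since the most recent recordable injury and
# "hours since last LWC" accumulates exposure hours logged after the most
# recent lost workday case; actual site practice may differ.
from datetime import date

recordable_injury_dates = [date(2008, 3, 11), date(2008, 7, 2)]
last_lwc_date = date(2007, 11, 19)
daily_exposure_hours = {date(2008, 7, 1): 3200.0, date(2008, 7, 2): 3150.0}  # example entries

def days_since_last_ri(today: date) -> int:
    return (today - max(recordable_injury_dates)).days

def hours_since_last_lwc() -> float:
    return sum(h for d, h in daily_exposure_hours.items() if d > last_lwc_date)

print(days_since_last_ri(date(2008, 7, 30)))  # 28
print(hours_since_last_lwc())                 # 6350.0
```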



TABLE 3-1 Types of Safety Metrics Employed at Chemical Agent Disposal Facilities

  Injuries and Illnesses: LWC; LWC rate; hours since last LWC; restricted work case; restricted work case rate; medical treatment case; medical treatment case rate; total RIs; RI rate; days since last RI; 12-month rolling RIR; first aid case (FAC); FAC rate; days between FACs; days since last injury.

  Incidents: near misses; at-risk behaviors; unsafe acts; injury near misses (incidents that nearly resulted in an injury but did not); unsafe conditions; substandard conditions.

  Observations: safety observations; employee observations; Safety Department observations; management observations; supervisor safety observations; stop-work orders; safety inspections; safety assessments; housekeeping.

  Miscellaneous Metrics and Activities: Safety Trained Supervisor Certification (the number of supervisors who have attained this certification at some facilities, the percentage who have done so at others); corrective action and closure tracking; aborts of entries while wearing demilitarization protective ensemble; lessons learned; safe behavior ratio; total safe behaviors; compliance with OSHA Voluntary Protection Programs (VPP); injury analysis; incident analysis; program audits.

All of the CDFs engage in injury reporting, investigation, and analysis, but not all develop metrics for preventive strategies based on the analyses. The data collected do not include much description of the facility location or the task being carried out at the time of injury, either or both of which could lead to identification of causal factors. Also, typical analyses rely on one-dimensional classifications of the outcome data based on conventional variables such as body part, injury type, and day of the week. There is no evidence of an active search for patterns in the data to find common causes that would lead directly to management action.

While all CDFs collect incident data as well as injury data, there does not appear to be an incident investigation system to gain the same depth of information as from injury investigations. Furthermore, it is unclear what triggered different depths of incident investigation. A classification system for incidents would help to ensure that precursors of injuries are better controlled and could lead to the development of additional metrics. Incidents can be leading indicators of injuries, although they are lagging indicators of the conditions and behaviors that can lead to injuries.

The safety observation programs in place at all CDFs vary in form and content. The five types of observation programs used across the facilities are

  • Safety observations,

  • Employee observations,

  • Safety Department observations,

  • Management observations, and

  • Supervisor safety observations.

All facilities employed at least one type of program, but only one facility employed all five types. Interestingly, not one of the facilities said it had developed consolidated metrics from its multiple observation programs. Additionally, there was no evidence that any CDF had validated its observation methodologies as a means of identifying precursors of incidents and injuries. All facilities characterized their observation programs as leading indicators, but this would be true only if the observation methodologies had been validated. Safety assessments, inspections, audits, and the like need to focus more on their findings than on the number of activities conducted.

Miscellaneous metrics are quite diverse, and not all have been included in Table 3-1. As was noted for metrics in the observations category, many of these are simple enumerations of actions or activities. For example, Safety Trained Supervisor Certification at one site is the number of individuals who have been trained in safety; at another site, the term refers to the percentage of individuals who have been so trained. While such metrics are very good measures for assessing performance and/or compliance, they are not necessarily indicative of the actions a facility needs to take to improve the safety culture, remedy the physical conditions that create safety hazards, and/or modify behaviors that place personnel at increased risk of injury. Furthermore, a simple count of the number of supervisors who have been certified could be a misleading metric. Instead, metrics are needed that directly measure the extent to which a supervisor has established a strong safety culture and a safe work environment.
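To illustrate the multi-dimensional pattern search described above, in contrast to one-dimensional tallies by body part or day of the week, the following sketch cross-tabulates a hypothetical injury log by location and task to surface recurring combinations; the records and field names are illustrative only, not taken from any CDF's actual data.

```python
# Sketch: a multi-dimensional look at injury records, cross-tabulating by
# location and task to surface recurring combinations (possible common causes).
# The records and field names are hypothetical; real logs would add more fields
# (shift, equipment, protective ensemble, etc.) to the grouping key.
from collections import Counter

injury_records = [
    {"location": "munitions demilitarization building", "task": "glovebox maintenance", "body_part": "hand"},
    {"location": "munitions demilitarization building", "task": "glovebox maintenance", "body_part": "forearm"},
    {"location": "container handling building",         "task": "pallet moving",        "body_part": "back"},
    {"location": "munitions demilitarization building", "task": "glovebox maintenance", "body_part": "hand"},
]

by_location_and_task = Counter((r["location"], r["task"]) for r in injury_records)

# Report the most frequent location/task combinations rather than a single
# one-dimensional tally such as injuries per body part.
for (location, task), count in by_location_and_task.most_common():
    print(f"{count:2d}  {location} / {task}")
```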

Overall, while lagging indicators dominate the safety metrics at all sites, they have been useful in advancing safety performance to its present state of excellence. Even so, sufficient data are accumulated to enable the development of additional metrics of both lagging and leading varieties to further improve the safety and environmental programs. This is especially true for leading indicators and metrics.

While the CDFs said they are using some leading indicators and metrics and are actively working to identify others, the committee believes that in general meaningful leading indicators are lacking. Most of the metrics that were proffered as leading indicators were little more than simple enumerations of actions and activities.

ENVIRONMENTAL METRICS

The CMA headquarters tracks several environmental metrics, including the number of environmental enforcement actions (EEAs), exceedances of chemical agent release limits, and stop-work orders. Most are lagging metrics that do not appear to characterize the violations, although a few are leading metrics. The facilities have developed their own sets of metrics that in some cases correspond to CMA headquarters' metrics.

EEAs are defined as formal, written notifications that an applicable statutory or regulatory requirement promulgated by the Environmental Protection Agency (EPA) or another authorized federal, state, interstate, regional, or local environmental regulatory agency has been violated. The incidence of EEAs was highest at the Umatilla facility, closely followed by the Anniston facility. Again, no details were provided on the character of the actions. No specific information was provided on the thresholds at which chemical agent releases are documented as environmental metrics, although information is routinely gathered from the EPA Toxics Release Inventory for the various facilities. Stop-work orders are emergency orders to cease or reduce activities for the purpose of protecting the environment. Only orders imposed by the EPA or state environmental agencies are counted as reportable metrics; decisions by Department of Defense or U.S. Army officials, site managers, and workers to stop work are not included in the metric, but such reports are encouraged by CDF management.

Measures taken by CMA headquarters to promote environmental stewardship include quarterly environmental data calls at which EEAs, regulatory inspections and permits, solid waste annual reports, and environmental performance audit systems are discussed.

While the facilities tracked different environmental metrics, most tracked Resource Conservation and Recovery Act remedial actions and self-reported occurrences of noncompliance. That having been said, the level of detail in the information provided to the committee seemed to vary. For example, TOCDF provided information about the number of actions and also defined targets for the key metrics, while other sites provided less information. Environmental metrics that were reported as being tracked include (typically at a specific facility) these:

  • Furnace utilization (for efficiency assessment),

  • Bulk and shredded paper for recycling (an EPA-inspired metric), and

  • Remediation work orders.

In response to written questions, TOCDF reported tracking the following nonregulatory metrics: green purchasing, recycling of scrap metal and paper, secondary waste processing and disposal, fuel usage, and water consumption (personal communication between Trace Salmon, Deputy Site Project Manager, TOCDF, and Monroe Keyserling, committee member, October 14, 2008).

The metrics are communicated periodically (in some cases weekly) to several communities, including the facility's general workforce and management, by means of review meetings, training sessions, and newsletters. The apparent goal of these communication efforts is to provide evidence of improvements.