3 Methods for Assessing Effectiveness

Control is a vital management function by which operations are brought into compliance with predetermined standards, established through planning and implemented through systems designed to achieve the goals of an organization. It is axiomatic that to control, one must first measure. To measure, one must know the characteristics of the parameters being measured, and if measurements are to be made reliably, the influences that affect them must be known. Operational results, causes, and effort can all be measured. The data so acquired must then be evaluated for their impact on performance, which is a measure of the effectiveness of the actions taken. Decisions about effectiveness are therefore quite complex: they involve judgments about assessment methods and about the evaluation of data from operations, and the complexity grows with the complexity of the system being evaluated.

The Safety and Environmental Management Systems (SEMS) regulations require operators to develop and submit a SEMS plan to the Bureau of Safety and Environmental Enforcement (BSEE). Assessment of the effectiveness of an operator's SEMS program is an essential step toward improving the quality of SEMS application in practice. SEMS regulations prescribe specific audit requirements: a comprehensive audit 2 years after the initial implementation of the SEMS program and at least once every 3 years thereafter.

Potential Assessment Methods

The breadth and depth of SEMS require that several methods be used to assess its effectiveness on an ongoing basis for continuous improvement in development and implementation. Operators, who are responsible
for the development of a SEMS program, must develop a plan for assessing the implementation and performance of the program at the same time. The Committee on the Effectiveness of Safety and Environmental Management Systems for Outer Continental Shelf Oil and Gas Operations (the committee) has identified nine methods that may be used to assess the effectiveness of an operator's SEMS program:

1. Compliance inspections,
2. Audits,
3. Peer reviews and peer assists,
4. Key performance indicators,
5. Whistleblower programs,
6. Periodic lessee reports,
7. Tabletop exercises or drills,
8. Monitoring sensors, and
9. Calculation of risk with SEMS in place.

Some of these methods can be further subdivided. The nine methods are not mutually exclusive, and elements of each could be combined to develop the most effective evaluation program for a given operator. Table 3-1 summarizes the nine methods, which are discussed below, and notes pros and cons for each.

Compliance Inspections

Compliance inspection is one of the simplest forms of SEMS verification. The intent is to verify, with little time and minimal inspector training, that at least portions of the SEMS program are operating. The compliance inspection is not meant to be a comprehensive audit such as that described below; rather, it provides a general indication of the state of the SEMS program by verifying specific components. Checklists may be used to conduct compliance inspections to ensure that documentation complies with the regulations. For example, the inspector may use a brief checklist to verify that SEMS items such as training certificates, operating procedures, and emergency response plans are in place and that staff are familiar with the use of the latter two. Carefully crafted interviews of operational personnel can be very effective in determining whether workers understand how and why their actions lead to safer
TABLE 3-1 Summary of Methods for Assessing the Effectiveness of SEMS Programs

1. Compliance inspection
   Description: Onboard SEMS check by day-to-day BSEE inspectors; regional inspectors can also perform the SEMS check.
   Pros: Maintains minimal compliance; provides regulatory presence at the operations level.
   Cons: Scope of the SEMS check is limited because of responsibilities for inspections of all other mandatory requirements.

1a. Checklist
   Description: Checklist to ensure SEMS is in place on the platform; checklist scope and details may vary.
   Pros: Simple to implement with minimal training; may quickly identify deficiencies with the SEMS program and its implementation.
   Cons: May only assess compliance with paperwork or system, with limited assessment of the effectiveness of the SEMS program; platform specific, not a corporatewide check; content and quality can vary extensively; checklists must be developed.

1b. Interviews, witnessing, and so forth
   Description: Interviews or other communication with platform personnel to determine whether they understand the SEMS program, including possible test drills; may be concurrent with administering checklists.
   Pros: Can provide information to assess whether platform personnel are knowledgeable and use SEMS.
   Cons: Can be subjective; reliant on interviewer skills; additional (perhaps substantial) SEMS training required; time consuming.
   Notes: The California State Lands Commission program is an example.

2. Audit
   Description: Review of implementation and quality of SEMS at both corporate and platform level; platform level may be all platforms or a sampling; scope (e.g., comprehensive or selected components) and details (time interval, auditing protocols) can vary.
   Pros: Proven method; established auditing protocols available for process safety management (e.g., API, American Institute of Chemical Engineers).
   Cons: Can only provide reasonable assurance that the system is effective; specific protocols need to be developed for the defined scope; auditor required to be expert at SEMS; several auditors may be required in order to look at all SEMS areas; scope and details can vary.

2a. Periodic audit
   Description: Planned in advance on a regular basis, typically at 2- to 3-year intervals.
   Pros: Can be scheduled to meet BSEE requirements; can be a comprehensive audit.
   Cons: Cost and time; need to develop specific protocols for the SEMS audit.
   Notes: Guidelines for meeting BSEE audit requirements.

2b. Surprise or random audit
   Description: Unannounced; a combination of randomly selected SEMS across all owners.
   Pros: Instantaneous assessment of the state of SEMS implementation.
   Cons: May disrupt normal activities (e.g., drilling or testing); may not be comprehensive.
   Notes: "Surprise" means several days' notice, not instantaneous.

2c. Event-driven audit
   Description: Triggered by events such as injury or death, pollution, a near miss, or noncompliance.
   Pros: Immediately corrects SEMS issues, if applicable.
   Cons: Reactive, lagging assessment; may not reflect processes in place prior to the incident.
   Notes: May be required in any case by regulations.

3. Peer review, peer assist
   Description: Assessment of SEMS implementation by a team composed of peers from the industry.
   Pros: Team is qualified and experienced in SEMS; nonthreatening identification of catastrophic weaknesses and opportunities to improve; good potential to learn from each other's SEMS.
   Cons: Independence may be questioned; potential conflicts of interest and confidentiality concerns; potential legal liability issues related to discoverability of recommendations and to recommendations given in good faith that have poor outcomes.

4. Key performance indicators
   Description: Use metrics from corporate- or platform-specific data to assess SEMS effectiveness; metrics can be currently reported ones (e.g., INCs, spills, accidents, near misses) or expressly developed new ones [e.g., number of changes (i.e., MOC), SEMS INCs].
   Pros: Quantitative; easy to implement; can be automated and reported to BSEE regularly (quarterly); could be used to identify specific problem platforms; BSEE databases available for analysis.
   Cons: Unclear how current metrics relate to SEMS effectiveness; new metrics may need to be developed; if metrics do not accurately reflect safe conditions, they could create complacency; lagging indicator of problems.
   Notes: BSEE can establish specific SEMS INCs.

5. Whistleblower program
   Description: Owner's policy and programs for anonymous reporting of events or situations by employees or other persons, to complement normal reporting and communication channels, that would lead to better SEMS implementation.
   Pros: Proactive for identifying corrective actions; evidence of management's commitment to SEMS; engages staff day to day; easy to implement; keeps SEMS relevant; may be already in place.
   Cons: Disgruntled persons can report false information; dependent on culture and on fast and transparent follow-up by the owner; requires a follow-up program.
   Notes: May be available in other industries (e.g., nuclear, aviation).

6. Periodic lessee report
   Description: Quarterly, biannual, or yearly specific report from the lessee on the status and effectiveness of its SEMS program; scope and details of these voluntary reports can vary.
   Pros: Current and recent in terms of the operator's processes; as voluntary submissions, these may be useful when performing mandatory SEMS audits.
   Cons: Accuracy of a self-report can be questioned; can be onerous on the operator; scope and detail are not defined and may need to be developed.
   Notes: Report context and content are current and relevant; may be at the corporate level rather than platform specific.

7. Tabletop exercise or drill
   Description: Planned or surprise drill with specific actions to test SEMS, similar to spill drills; can vary from simple to complex exercises, depending on the scope of SEMS tested.
   Pros: Can become a subset of existing drills; true reflection of SEMS in action.
   Cons: Cannot test all of SEMS; a selection would have to be made; would require much preplanning by owner and BSEE; can only be applied to a limited number of facilities; time consuming; may require dedicated BSEE personnel and skill set.

8. Monitoring sensors
   Description: Tracking onboard sensors to establish specific metrics for SEMS purposes; can send data back to shore for evaluation.
   Pros: Quantitative SEMS measure; possible future development of SEMS-specific sensors.
   Cons: Need to identify how these sensors may reflect SEMS issues.

9. Calculation of risk with SEMS in place (QRA)
   Description: Specific quantitative methods that use the owner's SEMS program as well as statistics from platform operations to determine the effectiveness of SEMS.
   Pros: Measurable; can show changes in performance over time.
   Cons: Results can vary between QRA approaches; need data over time to see trends; need baseline data for statistical analysis over time; output depends on model assumptions and details.

Note: API = American Petroleum Institute; INC = incident of noncompliance; MOC = management of change; QRA = quantitative risk assessment.
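To give a concrete flavor of method 4, a key performance indicator built from currently reported metrics might normalize incident counts by exposure hours and track the resulting rate over quarters. The sketch below is purely illustrative: the incident counts, hours worked, and 200,000-hour normalization constant are assumptions for the example, not values drawn from BSEE data or from this report.

```python
# Hypothetical sketch of a KPI computation (Table 3-1, method 4).
# Incident counts and hours worked are invented example data.

def incident_rate(incidents, hours, per_hours=200_000):
    """Incident rate normalized to a fixed number of hours worked."""
    return incidents / hours * per_hours

# Quarterly (incidents, hours worked) for one hypothetical platform.
quarters = [(4, 310_000), (3, 298_000), (5, 305_000), (2, 312_000)]

rates = [incident_rate(i, h) for i, h in quarters]
trend = rates[-1] - rates[0]  # negative difference suggests improvement

print([round(r, 2) for r in rates])
print("improving" if trend < 0 else "flat or worsening")
```

A regulator aggregating such rates across operators could flag platforms whose trend runs counter to the industry's, which is the kind of problem-platform identification the table anticipates.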
operations and can lead to an understanding of the underlying safety and environmental culture of the organization. These types of interviews are also part of normal audit procedures.

Audits

An audit of a SEMS program should be a classic audit: a comprehensive, systematic collection and review of information to ensure that the program is being maintained and operated as intended. Where possible, the audit should verify objective evidence that shows conformance with the SEMS program. The audit can be performed by one or more internal staff (a first-party audit), by an associated outside organization (a second-party audit), or by a completely independent organization (a third-party audit). Audits may be periodic, surprise or random, or event driven. Event-driven audits are particularly effective in leading to an understanding of what went wrong and why, and they are often the impetus for major changes in industry approaches and regulatory oversight. The current BSEE SEMS regulation, which went into effect November 14, 2011, allows first-, second-, and third-party audits, but the pending SEMS II regulation, as proposed in the September 2011 notice of proposed rulemaking (BOEMRE 2011a), authorizes only independent third-party audits. Complete or partial audits of an operator's SEMS program could be conducted, as justified by reports from inspectors, reviews of operators' audit reports, incidents, or events.

Peer Review and Peer Assist

Often simply referred to as "peer assist," this method of assessing effectiveness engages several respected industry peers from outside the organization, including other operators, in reviewing the company's compliance performance and SEMS implementation. The reviewers then suggest helpful ideas for improvement. There may or may not be formal documentation.
Peer assists are a common intracompany and intercompany activity for technical and economic issues and have been found to work well in other contexts. There are different protocols for this method (e.g., different
levels of required response to peer recommendations). For example, a peer assist can be

• An informal process with no formal recommendations or written record,
• A formal process with formal recommendations and written responses to the recommendations, or
• Some variant in between.

One goal of the peer review or peer assist method is to have an independent set of eyes focus on a company's operations with the sole purpose of helping that company improve. To ensure confidentiality, members of the team could be asked to sign a confidentiality agreement before serving. This method is based on the premise of promoting a "don't blame, let's improve" culture. The aviation industry is one in which the peer assist approach is employed.1

1 See http://www.nasa.gov/offices/oce/appel/ask/issues/40/40i_peer_assist.html.

Key Performance Indicators

Key performance indicators (KPIs) are commonly used to evaluate a program's success or the success of a particular activity. KPIs work well when there are clear, objective metrics that can be quantified, such as barrels of oil produced or the number of lost-time incidents. A difficulty in using KPIs to assess the effectiveness of a SEMS program lies in determining the specific metrics that will be used to measure the program's effectiveness. The process used by the Petroleum Safety Authority (PSA) Norway, called Risikonivå i norsk petroleumsvirksomhet, is one approach that would be a useful starting point for BSEE KPIs. This approach is described more fully in the section on PSA Norway in Chapter 4.

Whistleblower Programs

A whistleblower program provides a means for an internal or external person (or organization) with knowledge that the SEMS program, or some of its components, is not being implemented correctly or is being
falsified, to bring this information to the attention of the proper authority. In most cases such a program must protect the identity of the informant and guarantee that no repercussions, such as the loss of an employee's job, will be forthcoming. Many industries use whistleblower programs, so there are many examples that can be drawn on in conjunction with SEMS programs.

Periodic Lessee Reports

Operators or lessees may generate periodic reports describing the effectiveness of their SEMS programs. Although perhaps open to questions about impartiality and accuracy, such reports do force the operator to take an active approach to SEMS implementation and monitoring. The contents of the report can range from an open format defined by the operator to a specific format and content required by the regulator.

Tabletop Exercises or Drills

Special drills or tests of an operator's SEMS program can be performed on a planned or surprise basis. Similar drills related to issues of life, safety, and environmental releases are already performed on offshore facilities. Because tabletop drills are not commonplace for SEMS, considerable planning by both the operator and the regulator would be needed to make a drill specific to testing the effectiveness of a SEMS program.

Monitoring Sensors

Mechanical sensors that monitor pressures, temperatures, flow rates, and related data could be used in developing metrics to determine the effectiveness of the SEMS program. The specific monitors, their relation to SEMS, and how such a system would work have yet to be determined. Some of these monitors may already be in place as part of normal production operations, while new monitoring devices specific to SEMS metrics may need to be developed. Ideally, these systems would be able to send information directly back to shore for real-time SEMS monitoring.
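One way such sensor streams might be reduced to a SEMS-relevant metric, sketched here purely as an illustration, is to count operating-limit exceedances per reporting period. The channel names, limits, and readings below are invented; real limits would come from a platform's operating envelope.

```python
# Hypothetical sketch: deriving a simple SEMS metric from sensor data.
# Channel names, thresholds, and readings are invented for illustration.

OPERATING_LIMITS = {"pressure_psi": 2500.0, "temp_f": 350.0}

def exceedance_count(readings, limits=OPERATING_LIMITS):
    """Count readings above the allowed operating limit, per channel."""
    counts = {}
    for channel, values in readings.items():
        limit = limits[channel]
        counts[channel] = sum(1 for v in values if v > limit)
    return counts

# One day of downsampled hypothetical readings.
day = {
    "pressure_psi": [2310, 2490, 2525, 2480, 2610],
    "temp_f": [330, 342, 351, 349, 340],
}

print(exceedance_count(day))  # → {'pressure_psi': 2, 'temp_f': 1}
```

Exceedance counts reported back to shore each period would give the kind of quantitative, automatable measure the committee envisions, though relating such counts to SEMS effectiveness remains the open question noted above.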
Calculation of Risk with SEMS in Place

A formal quantitative risk assessment (QRA) for a platform based on SEMS-specific data can be used to monitor the effectiveness of a SEMS program. The change in the QRA risk level when the SEMS program is modified or updated will show how effective the program is, although it is a computed, theoretical effectiveness. One advantage of this method is that the owner can use the QRA risk level to evaluate the effectiveness of alternative SEMS-related modifications and upgrades, which helps in determining the best approach from a SEMS perspective.

Measuring Trends

The methods identified above directly assess the effectiveness of specific operator SEMS programs. However, BSEE could aggregate the data across operators to monitor trends and provide input to operators on specific improvements or areas of concern. Continuous improvement programs (CIPs), which are common in the offshore oil and gas industry, are one example of such an approach. In a CIP, employees typically submit suggestion slips or other forms of corporate feedback (sometimes anonymously) related to improvements to operations, including SEMS-type activities. Monitoring and reporting these suggestions and how they change over time (e.g., an increasing or decreasing number of SEMS suggestions and the focus and types of suggestions) can be informative and can lead to improvements in the industry's overall safety record. Another example is the industrywide collection and evaluation of SEMS-related data, such as data on safety and release incidents. Such data collections will improve understanding of the effectiveness of SEMS across the industry and will identify specific operators that have issues (or, conversely, that do not have issues) with their SEMS programs in comparison with their peers.
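A toy version of the QRA comparison described under "Calculation of Risk with Sems in Place" is sketched below. Every number is an assumption made up for the example: the scenario frequencies, the consequence weights, and the fractional frequency reductions attributed to a SEMS upgrade. A real QRA would derive these from platform statistics and a defensible risk model.

```python
# Toy sketch of computed risk with and without SEMS-related upgrades.
# All frequencies (events/year), consequence weights, and reduction
# factors are invented for illustration only.

baseline = {
    "loss_of_containment": (0.020, 100.0),  # (frequency, consequence weight)
    "gas_release":         (0.050, 40.0),
    "lost_time_injury":    (0.300, 5.0),
}

# Assumed fractional frequency reduction attributed to a SEMS upgrade.
sems_reduction = {
    "loss_of_containment": 0.4,
    "gas_release": 0.5,
    "lost_time_injury": 0.2,
}

def total_risk(scenarios, reduction=None):
    """Sum of frequency x consequence, optionally applying reductions."""
    total = 0.0
    for name, (freq, consequence) in scenarios.items():
        if reduction:
            freq *= 1.0 - reduction[name]
        total += freq * consequence
    return total

before = total_risk(baseline)
after = total_risk(baseline, sems_reduction)
print(f"risk before: {before:.2f}, after: {after:.2f}")
```

The difference between the two totals is the computed, theoretical effectiveness the text refers to, and running the same comparison for alternative upgrade packages is how an owner might rank them.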
Summary

Each of the methods described above could have a role in assessing both the progress being made in implementing SEMS and the effectiveness of SEMS. Evaluating SEMS is a continuous activity and
therefore could include, at appropriate times and at appropriate levels of the organization, a selection of the methods outlined above. An audit is a periodic activity. Operating management, from first-line supervisors to top management, might find it useful to assess progress toward improved safety and environmental conditions on an ongoing basis with a combination of SEMS monitoring sensors, KPIs, records of potential incidents of noncompliance, interviews, and other methods. Periodic assessments with drills, peer reviews, and lessee SEMS reports might have a broader scope than operational aspects and operating management. The methods that the committee recommends are presented in Chapter 6.