ANNEX B

Using Assessment Tables

These assessment tables are designed to organize and consolidate the information and observations collected during an SMS assessment. The tables can be detached from the guidebook and used during the assessment. They are based on OHSAS 18002 and the Canadian CSA Standard Z, were adapted using ICAO Doc 9859 (2006) guidelines, and were reviewed to address FAA AC 150/5200-37 (2007).

B.1 How to Use

The worksheets are a series of tables that contain all of the expectations associated with each SMS pillar and element. Each table provides space for the assessment team to record references, scores, and observations (the justification for each score). The tables shown in Annex A (Tables 32 through 36) are organized as follows:

- The first column contains the SMS expectations;
- The second column is for references, i.e., the sources of the observations or information collected during the document review or interviews;
- The third column is for the score (from 0 through 5) assigned to each expectation; and
- The fourth column is for the observations/information collected by the assessment team that justify the assigned score.
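As an illustration only (the guidebook prescribes paper worksheets, not software), a worksheet row could be modeled as a simple record mirroring the four columns above; the field names below are assumptions, not terms defined by the guidebook.

    # Illustrative sketch: one row of an expectations worksheet, mirroring
    # the four columns described above. Field names are assumptions.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ExpectationRow:
        expectation: str                                        # column 1: SMS expectation
        references: List[str] = field(default_factory=list)     # column 2: sources of observations
        score: Optional[int] = None                             # column 3: 0 through 5
        observations: List[str] = field(default_factory=list)   # column 4: justification for the score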
References

During the assessment, team members should collect as much information as possible to reference their observations, including the name, position, and department of the person being interviewed; the location and time of the observation; and the document title, publication date, and reference number. If the expectations worksheets are going to be part of the final deliverable to the client, complete references may be omitted from the final version to protect the privacy of individuals. However, this information should be available to the assessment team during the scoring process and for future reference, if required.

Scoring

Scoring should be conducted by all members of the assessment team according to the methodology outlined in Section 6.6. Pillar and element scoring should be done after all the team members' observations have been recorded on the worksheet (Annex C, Table 37).

Observations

Members of the assessment team should transfer their observations from their notebooks to the worksheets and score each expectation. There need only be one working copy of this document, which is passed among team members. This may be done at the end of each day of the site visit.
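The reference details listed above lend themselves to a structured note that can be redacted before delivery. The sketch below is illustrative only; every field name is an assumption, and the guidebook does not prescribe any particular format.

    # Illustrative sketch: capturing the reference details named above so that
    # personal identifiers can be stripped from the client deliverable while
    # the full record stays available to the assessment team.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Reference:
        source_type: str                      # assumed values: "interview", "document", "observation"
        person_name: Optional[str] = None     # interviewee (omit from deliverable)
        position: Optional[str] = None
        department: Optional[str] = None
        location: Optional[str] = None
        time: Optional[str] = None
        document_title: Optional[str] = None
        publication_date: Optional[str] = None
        reference_number: Optional[str] = None

        def for_deliverable(self) -> "Reference":
            # Keep document metadata; drop anything identifying an individual.
            return Reference(
                source_type=self.source_type,
                document_title=self.document_title,
                publication_date=self.publication_date,
                reference_number=self.reference_number,
            )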
B.2 SMS Scoring Methodology

Once all of the observations have been recorded, the team should score the SMS elements as a group. The team should reach consensus on the score assigned to each element; in the event of a disagreement, the Team Leader makes the final decision. The following criteria should be used.

Step 1--Score Expectations

Expectations are scored first. They are given a score of "MEETS Expectation" or "BELOW Expectation" based on the information collected during the assessment. This is done to remove subjectivity. The score may have one or more comments associated with it that provide justification and context. Expectations may be scored by individual team members. Another team member may override the initial score if he/she can provide the information required to justify a change. Team members should come to a consensus on the score assigned to each expectation.

Step 2--Score Elements

Sub-elements and elements are scored next. The assigned score is not based on a mathematical average of the expectation scores; however, the expectation scores serve as a guide for pillar and element scoring. Sub-elements and elements are given a score of 0 through 4 as follows:

- "0" is given when none of the expectations under the element are met (comments/justification required);
- "1" is given when some of the expectations under the element are met (comments/justification required);
- "2" is given when all expectations under the element are met (no comment required);
- "3" is given when all the expectations are met or exceeded; it may be assigned if the assessment team believes that the organization has done an exceptional job, meeting or exceeding all the expectations under this element, and deserves extra mention (comments/justification required); and
- "4" is given in the event that the organization exhibits best practice for this element (rare, extremely subjective, and may only be assigned by an SMS expert with particular industry experience; comments/justification required).

Add the sub-element scores to assign element scores, as required. Again, the score is not based on a mathematical average; the sub-element scores serve as a guide for the element scores. Elements can only be assigned whole numbers--no decimals. All team members should agree on the element scores before assigning pillar scores. Element scores should be recorded on the scoring table presented in Annex C.

Step 3--Score Pillars

Pillars are scored last, following a similar process to element scoring. The assigned score is not based on a mathematical average of the element scores; element scoring serves as a guide for pillar scoring. Pillars are given a score of 0 through 4, following the same criteria used for the sub-element scores. Pillars can only be assigned whole numbers. All team members should agree on the final pillar scores. Pillar scores should be recorded on the scoring table presented in Annex C.
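To make the three-step flow concrete, here is a minimal sketch of how Step 1 expectation scores guide a Step 2 element score. It is an illustration under assumed names, not an implementation prescribed by the guidebook; the final score always comes from team consensus, and 3 or 4 require explicit justification.

    # Illustrative sketch only: deriving a guideline element score from the
    # Step 1 expectation scores. Names and values are assumptions.
    from typing import List

    MEETS = "MEETS Expectation"
    BELOW = "BELOW Expectation"

    def guideline_element_score(expectation_scores: List[str]) -> int:
        """Suggest a 0-4 element score from the Step 1 expectation scores.

        The result is only a guide, not a mathematical average: the team
        assigns the final whole-number score by consensus.
        """
        met = sum(1 for s in expectation_scores if s == MEETS)
        if met == 0:
            return 0    # none met (justification required)
        if met < len(expectation_scores):
            return 1    # some met (justification required)
        return 2        # all met (no comment required); the team may raise
                        # this to 3 (exceptional) or 4 (best practice, SMS
                        # expert only) with justification

A pillar score follows the same pattern in Step 3, using the agreed element scores as the guide rather than any mathematical average.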