but it does not indicate how these 37 were selected. Each session lasted 90 minutes, was led by a professional facilitator, and involved between 2 and 15 pilots. Pilots were encouraged to mention as many different types of events as possible, including any events that should not occur during normal operations. All sessions were recorded and later transcribed. The specific questions posed to the groups are in Appendix 6 of Battelle’s final report.2 The NAOMS team also conducted nine one-on-one interviews to identify additional events that did not surface in the focus groups.

The consolidated list of safety-related topics, as generated by the focus-group sessions and the interviews, is in Appendix 7 of Battelle’s final report. Decisions on which topics to include in Section B of the NAOMS survey were based on “a desire to select events serious enough to be good indicators of the safety performance of the aviation system, yet not so serious that they would occur too rarely to be captured in the survey.”3 Some rare events were included in Section B because of strong industry interest in these specific topics.

The NAOMS team structured the organization of questions in Section B based on the team’s research on how pilots organize their memories. The advice of “accomplished survey methodologists and aviation subject matter experts” was used “to craft questions responsive to each topic.”4

  • Section C—Special Topics—The questions in Section C on special focus topics were intended to be asked only over a few months or years and then replaced by new topics. Three different Section C question sets were developed for the AC questionnaire: one concerning minimum equipment lists, a second addressing in-close approach changes, and a third, requested by a Commercial Aviation Safety Team subgroup of the Joint Implementation Measurement Data Analysis Team (CAST-JIMDAT), focusing on “the development of baseline aviation system performance measures.”5

  • Section D—Questionnaire Feedback—The questions in Section D gave respondents a chance to provide feedback about their survey experience to the interviewers who called. While not directly applicable to safety event rates or trends, some answers to these questions have suggested possible topics for future surveys, should any be conducted.


On the basis of its assessment of the AC and GA questionnaires, the committee found four types of problems that reduced the usefulness of the data collected in the NAOMS survey:

  1. The questionnaires were designed so that events and experiences from markedly different segments of the aviation industry were aggregated (and cannot be disaggregated).

  2. Some of the questions asked pilots for information that they would likely not have had without a post-flight analysis.

  3. Some of the questions had vague or ambiguous definitions of what constituted an event to be measured.

  4. Some of the questions did not have a clear link between the measured event and aviation safety.

These problems are discussed in detail in the following subsections. (While the examples shown below come primarily from the AC questionnaire, the general problems discussed exist in both the AC and GA questionnaires, unless otherwise specified.)


Ibid., p. 19.




Ibid., p. 20.




