An Assessment of NASA’s National Aviation Operations Monitoring Service

1 Introduction and Overview

1.1 ORIGINS OF THE NAOMS SURVEY

NASA’s NAOMS project was a survey administered to pilots from April 2001 through December 2004. The origins of NAOMS can be traced to the White House Commission on Aviation Safety and Security, commonly referred to as the Gore Commission. That commission was created to “study matters involving aviation safety and security, including air traffic control and to develop a strategy to improve aviation safety and security, both domestically and internationally.”[1] In its report, the commission touched on almost all aspects of the aviation industry and made 57 recommendations in the areas of improving aviation safety, making air traffic control safer and more efficient, improving security for travelers, and responding to aviation disasters. Several of these recommendations can be linked to the NAOMS project, but recommendation 1.1 is of particular relevance:

    Government and industry should establish a national goal to reduce the aviation fatal accident rate by a factor of five within ten years and conduct safety research to support that goal.[2]

This recommendation set in motion a process to create methods for monitoring the National Airspace System (NAS). Existing aviation-data-collection tools for the NAS were limited in scope and did not address all of the areas for which monitoring was believed to be useful.
Thus, the NAOMS survey was developed with the expectation that it would be a new tool within the aviation safety field, one that could generate statistically valid rates of events (such as bird strikes or rejected takeoffs) and track trends over time for the entire NAS.[3] The NAOMS survey was not intended to provide an absolute measure of aviation safety;[4] it was intended to support aviation safety policy “in conjunction with the many other data resources available to decision-makers.”[5] The use of sample surveys is new in the field of aviation safety, so NAOMS was also viewed as a research study to assess the usefulness of sample surveys in this context.

[1] White House Commission on Aviation Safety and Security, Final Report to President Clinton, Al Gore, chairman, Washington, D.C., February 12, 1997.
[2] Ibid., p. 9.
[3] Linda Connell, presentation to the Workshop on the Concept of the National Aviation Operations Monitoring Service (NAOMS), Washington, D.C., May 11, 1999, pp. 6-9.
[4] Loren Rosenthal, Manager of NAOMS, Battelle Memorial Institute, “NAOMS Program Overview,” presentation to the NRC Committee on NASA’s National Aviation Operations Monitoring Service (NAOMS) Project, Washington, D.C., June 9, 2008, p. 5.
[5] Battelle Memorial Institute, NAOMS Reference Report: Concepts, Methods, and Development Roadmap, Battelle Memorial Institute, Columbus, Ohio, November 30, 2007, p. 6.
The NAOMS survey was originally designed to collect information on safety-related events as experienced by all of the frontline operators of the NAS, including pilots, controllers, mechanics, flight attendants, and others. However, the survey of pilots appears to have taken longer and required more resources than expected,[6] and surveys of the other groups never reached the development stage. Air carrier (AC) and general aviation (GA) pilots were surveyed from April 2001 through December 2004. NASA decided to discontinue support for NAOMS toward the end of this period, and the survey was transformed into a Web-based tool and handed off to the Air Line Pilots Association (ALPA).[7]

1.2 BRIEF OVERVIEW OF THE SURVEY

The NAOMS project was jointly managed by NASA and the Battelle Memorial Institute, NASA’s subcontractor for the project. A total of 29,882 pilots were surveyed as part of the study over the period from April 2001 to December 2004.[8] Of these pilots, 25,105 participated in the AC survey. The survey of GA pilots was conducted only for a brief period (approximately 9 months) and involved 4,777 completed interviews. The FAA’s Airmen Certification Database (ACD) was the source from which the sample of pilots to be surveyed was selected.

Each pilot who responded to the survey was asked a set of questions covering background information, the number of hours and flights that the pilot had flown, the number of events that the pilot had observed from among numerous possible safety-related events, some topic-specific questions, and feedback about the survey. All questions related to events that had occurred within a specific time range (the recall period). For most, but not all, of NAOMS, this recall period was 60 days. Both the AC and GA versions of the survey had the same basic structure, with four sections (see Appendixes G and H in this report). Participation in the NAOMS project was completely voluntary.
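The basic quantity such a survey aims to produce, a rate of events per unit of flying exposure, can be sketched with a simple ratio estimator: total reported events divided by total reported exposure over the recall period. This is only an illustration of the general technique; the data, field layout, and function name below are hypothetical and do not reproduce NAOMS's actual estimation procedure.

```python
# Illustrative ratio estimator for a safety-event rate from survey
# responses. Each record is (events_observed, flight_hours) reported
# for the recall period. Hypothetical data; NOT the NAOMS procedure.

def event_rate_per_1000_hours(responses):
    """Estimate events per 1,000 flight hours as a ratio of totals."""
    total_events = sum(events for events, hours in responses)
    total_hours = sum(hours for events, hours in responses)
    if total_hours == 0:
        raise ValueError("no flying exposure reported")
    return 1000.0 * total_events / total_hours

# Four hypothetical pilot reports: 3 events over 425 total hours.
responses = [(0, 120.0), (1, 95.0), (2, 150.0), (0, 60.0)]
rate = event_rate_per_1000_hours(responses)  # about 7.06 per 1,000 hours
```

A ratio of totals (rather than an average of per-pilot rates) weights each response by its exposure, which is the usual choice when pilots fly very different amounts.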
All data provided by NAOMS respondents were held in confidence. NAOMS maintained records of survey participants, but there is no linkage in NAOMS data repositories between the names of past respondents and the data that they provided.[9] To maintain the confidentiality of the survey participants, NASA released only redacted versions of the survey data. Only these redacted data sets were made available to the National Research Council’s (NRC’s) Committee on NASA’s National Aviation Operations Monitoring Service (NAOMS) Project: An Independent Assessment. More detailed descriptions of the sampling design, the survey questionnaires, the redacted data, and other features of the NAOMS project are provided in the following chapters.

1.3 OVERVIEW OF THE REPORT AND SUMMARY OF THE FINDINGS

The rest of this report is organized in six chapters that address the various charges in the committee’s statement of task (presented in the Preface).

Chapter 2 discusses ways of measuring aviation safety and describes sources of data on aviation safety that existed before NAOMS or that became available after NAOMS was developed. While fatalities and accident rates are the ultimate measures of aviation safety, the committee notes that there is considerable value in collecting and examining certain types of operational data. There were few sources of aviation data when NAOMS was conceived, and many more have become available since then; however, each of these data sources has its advantages and limitations. The FAA’s Aviation Safety Information Analysis and Sharing System (ASIAS) is intended to allow easy access to a wide variety of existing databases and is being developed as a source of data for the entire aviation system.

Chapter 3 assesses the usefulness of sample surveys in providing information on aviation safety. A sample survey is a scientifically valid and effective way to collect data and track trends about events that are potentially related to aviation safety.
[6] See NASA, Final Memorandum on the Review of the National Aviation Operations Monitoring Service, NASA, Washington, D.C., March 31, 2008, p. 3 (also known as the NASA Inspector General’s Report).
[7] Ibid., p. 38.
[8] Battelle, NAOMS Reference Report, 2007, p. 13. Other sources provide slightly different numbers, in part because of reclassifications, different data releases, and so on.
[9] Ibid., p. 8.

It can be used to collect reliable information about all segments of civilian aviation,
especially to characterize the safety of general aviation flights and those of other segments of aviation. Further, government-sponsored surveys can provide data that are accessible to the public and can be analyzed independently. However, past experience in the government sector indicates that successful large-scale surveys require a substantial commitment of time and resources to develop, refine, and improve the survey methodology and to ensure that the survey provides useful and high-quality data.

Chapter 4 provides an assessment of the NAOMS sample design. Several aspects of the design, such as the use of a cross-sectional design, the computer-assisted telephone interview (CATI) method, and professionally trained interviewers, are consistent with generally accepted practices in survey design. A CATI system has the potential to incorporate checks for unlikely or implausible values during the interview process. However, the committee found that substantial fractions of responses had implausibly large values for reported hours and flight legs flown, as well as for event counts, suggesting that the NAOMS survey did not take full advantage of this feature of CATI. The NAOMS team also faced substantial challenges in the choice of the sampling frame and had to make compromises at several stages. Unfortunately, these compromises appear to have led to biases in the sample. While the choices and compromises may have been made for good reasons, the NAOMS team should have investigated the potential impact and magnitude of the biases resulting from failure to locate sampled pilots and from other forms of nonresponse. In the committee’s view, the collection and analysis of supplemental data during the early phase of the survey would have provided a reliable assessment of the various biases and might have led, if necessary, to the development of alternative design strategies.
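The kind of in-interview plausibility check that a CATI system can apply is easy to sketch. Everything below is an invented illustration: the field names and thresholds are assumptions for the example, not NAOMS's actual interview software or limits.

```python
# Sketch of a CATI-style plausibility check applied as an answer is
# keyed in. All field names and thresholds are hypothetical, chosen
# only to illustrate the technique, not NAOMS's actual rules.

PLAUSIBLE_RANGES = {
    "flight_hours_60_days": (0, 360),   # roughly 6 hours/day over a 60-day recall period
    "flight_legs_60_days": (0, 400),
    "bird_strikes_60_days": (0, 10),
}

def check_response(field, value):
    """Return None if the value is plausible; otherwise return a
    prompt asking the interviewer to confirm with the respondent."""
    low, high = PLAUSIBLE_RANGES[field]
    if low <= value <= high:
        return None
    return (f"Value {value} for {field} is outside the expected range "
            f"[{low}, {high}]; please confirm with the respondent.")
```

A check like this does not reject the answer outright; it flags an implausible entry for immediate confirmation while the respondent is still on the line, which is exactly the opportunity the committee notes was not fully exploited.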
In Chapter 5, the committee’s analysis of the survey questionnaires identified four types of problems that reduced the usefulness of the data collected: (1) the questions went beyond the scope of the intended AC and GA operations, resulting in the aggregation of data from markedly different segments of the aviation industry; (2) some of the questions asked pilots for information that they would likely not have had without a post-flight analysis (for example, the origin of smoke in the aircraft or verification of an uncommanded control surface movement); (3) some of the questions had vague or ambiguous definitions of what constituted an event to be measured; and (4) some of the questions did not have a clear link between the measured event and aviation safety.

As noted above, the committee had access only to the redacted data that were made available to the public. NASA released redacted versions of the survey data starting in December 2007. Chapter 6 describes these redactions and discusses how they further constrain the usefulness of the data and the ability to conduct an analysis to meet the study objectives. One important problem is the grouping of the survey data into whole years. This limits the ability to track changes in event rates over shorter timescales, to determine the effects of changes in the aviation system on event rates, and to assess seasonal and similar types of effects. Issues associated with preserving respondents’ anonymity and confidentiality and with the public release of data are common to other government surveys. Such issues should have been anticipated and addressed at the design stage of the project, avoiding the need for after-the-fact, ad hoc redaction methods and the resulting loss of information.

Chapter 7 discusses the potential utility of the NAOMS data.
There are several problems with the quality of these data: substantial fractions of responses had implausibly large values, and respondents often rounded their answers to convenient numbers. The extent and magnitude of these problems raise serious concerns about the accuracy and reliability of the data. Further, the committee’s limited comparison of NAOMS data with other sources indicates that some groups were over-represented and others under-represented in the NAOMS survey. Such sampling biases must be addressed in the estimation of event rates and trends. There are many approaches in the survey sampling literature for addressing such biases, but they require detailed information on how the survey was implemented, including the type and nature of the problems that were experienced, and access to the original data.

In the committee’s view, some of these problems could have been reduced substantially if more effort had been spent on ensuring data accuracy during the interview and data-entry stages and if respondents had been asked to refer to their logbooks when possible. The committee does note that many of the biases that affect the estimation of event rates would be mitigated for trend analysis to the extent that they remain relatively constant over time; however, the degree of mitigation might vary substantially across event types. The committee did not find any evidence that the NAOMS team had developed or documented data analysis plans or conducted preliminary analyses as initial data became available in order to identify early problems and
refine the survey methodology. These activities are part of a well-designed survey, especially one conducted as a research study to assess the feasibility of survey methodology in aviation safety.

Based on the analyses and findings of the committee, the publicly available NAOMS data should not be used for generating rates, or trends in rates, of safety-related events in the National Airspace System. The data could, however, be used in developing a set of lessons learned from the project.

The committee encountered several challenges in assessing the NAOMS survey methodology and the potential utility of the data. The committee did not have access to the original, unredacted data. Assessing the utility of the NAOMS data based on heavily redacted data is not an easy task, and it is further complicated by NASA’s release of multiple data sets under different redaction methods. Further, as pointed out by the Government Accountability Office’s review of NAOMS: “The project staff and contractors did not assemble a coordinated, clear history detailing the project’s management that would facilitate evaluation of the overall air carrier pilot survey.”[10] The lack of documentation and the delays in obtaining some documents made it difficult for the committee to obtain a full understanding of the steps taken and decisions made (including their rationale) during the NAOMS project. The committee frequently had to rely on statements based on the memory of those involved in the project and, therefore, received a variety of recollections. Had decisions been more clearly documented, there might have been fewer divergent views regarding the various decisions and processes that occurred during the project.
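The rounding-to-convenient-numbers problem noted above (often called "heaping" in the survey literature) is straightforward to screen for in preliminary analysis. The sketch below, with invented data, shows one common screen: comparing the observed fraction of responses that fall on round values against what smoothly distributed answers would produce.

```python
# Sketch of a screen for digit "heaping": respondents rounding
# reported quantities to convenient values such as multiples of 10.
# The data are invented for illustration; this is not a NAOMS analysis.

def heaping_fraction(values, multiple=10):
    """Fraction of responses that are exact multiples of `multiple`.

    For smoothly distributed integer answers, roughly 1/multiple of
    responses would land on such values by chance; a much larger
    fraction suggests widespread rounding by respondents."""
    hits = sum(1 for v in values if v % multiple == 0)
    return hits / len(values)

# Hypothetical reported flight hours: 8 of these 10 values are
# multiples of 10, far above the ~10% expected by chance.
reported_hours = [100, 150, 200, 87, 120, 60, 300, 250, 110, 95]
frac = heaping_fraction(reported_hours)  # 0.8
```

A preliminary check of this kind, run as initial data came in, is an example of the early diagnostic analysis the committee found no evidence the NAOMS team performed.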
[10] Government Accountability Office, Aviation Safety: NASA’s National Aviation Operations Monitoring Service Project Was Designed Appropriately, But Sampling and Other Issues Complicate Data Analysis, Report to Congressional Requesters, GAO-09-112, Washington, D.C., March 2009, pp. 34-35, 54.