Detection of Explosives for Commercial Aviation Security (1993)

Suggested Citation:"2 TESTING PROTOCOLS AND PERFORMANCE CRITERIA." National Research Council. 1993. Detection of Explosives for Commercial Aviation Security. Washington, DC: The National Academies Press. doi: 10.17226/2107.

2
TESTING PROTOCOLS AND PERFORMANCE CRITERIA

A. INTRODUCTION

Effective, unbiased testing of explosive detection equipment is an essential element of the FAA's Security Technology Program. Only a well-planned test and evaluation (T&E) program can assure realistic evaluations and demonstrations of the ability of equipment to satisfy FAA-mandated explosive detection requirements. By the time a deployment decision is made, T&E should encompass all of the elements of the explosive detection system: equipment, software, facilities, personnel, and test procedures.

The various types of testing mirror the development phases. Developmental testing occurs primarily during the engineering prototype phase. Its goals include: verification of the performance characteristics of a particular device; input into the engineering design and development process; demonstration that design risks are manageable; estimation of utility; evaluation of compatibility and interoperability with existing/planned systems; and assurance that the equipment/system is ready to proceed to testing in the operational environment. Additional limited developmental testing can be conducted after devices have been fielded. Typical reasons include: verification of the effectiveness of product improvements and demonstration of the adequacy of redesigns to solve field or production problems.

Testing of full configuration equipment in an operational environment can be conducted to characterize a device, or to qualify a system for certification testing. The goals are to assess functional characteristics and to demonstrate the equipment's operational effectiveness and suitability. The items tested must represent production models so that a valid assessment can be made. This testing is usually conducted by a group that is separate and independent from both the developer and the user.

Field level verification/validation testing is normally conducted at operational sites on deployed equipment. The goal is to measure the continued level of effectiveness of the equipment under normal conditions, using regular operators and maintenance personnel.

This report addresses generic test protocols required to verify functional characteristics of EDD and EDS equipment, and certification testing of an EDS. It does not address specific protocols required for developmental testing, field level verification/validation testing, or for an Explosive Detection Operation that includes activities other than those accomplished by the EDS, such as profile analyses and terrorist intelligence. Further, it does not address total Airport Security Operations, e.g., access control systems.

B. ROLE OF TESTING IN THE FAA REGULATORY PROCESS

The FAA plays a complex role both in stimulating the development of explosive detection devices and systems, and in regulating their use. A well planned test and evaluation (T&E) program should provide realistic evaluations and demonstrations of the capability of equipment. The remainder of this discussion principally focuses on T&E required to support the FAA's regulatory process, although much of the approach applies to developmental testing of engineering prototypes as well.

As required by Public Law 101-604, Section 107, the FAA reviews the threats to civil aviation, with particular focus on the explosive materials that present the most severe threat. This review results in a list of the minimum amounts, configurations, and types of explosive materials that would reasonably be expected to cause catastrophic damage to commercial aircraft. In its regulatory role, the FAA must prepare and issue guidelines for the use of explosive detection devices, and requirements for explosive detection systems, to the U.S. air carriers. The FAA must issue guidelines relating to the performance characteristics of prospective devices (EDDs) that would allow them to be configured in different system architectures. Regulations should define specific operational requirements for systems, including the probability of detection for those types and quantities of explosives that comprise the terrorist threat. The testing program must support both system and device evaluation, validation, and qualification for certification.

An important distinction is made regarding the categorization of the equipment being tested. An Explosive Detection Device (EDD), which employs a particular instrumental method to detect the presence of explosive material, would be tested by the FAA for qualification/verification purposes. The objective of this testing is to verify that the device's performance matches the data provided by the manufacturer or, in some instances, simply to determine its performance. The result would be a set of parameters that characterize the operational performance of the device. An example of parametric testing is the generation of a family of curves relating probability of detection to false alarm rate for different sensitivity settings of the equipment.
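
The family of curves described above can be sketched in code. The following minimal illustration uses invented alarm scores and sensitivity settings (none of it drawn from actual FAA test data) to show how sweeping a device's alarm threshold generates paired estimates of detection probability and false alarm rate:

```python
# Hypothetical sketch of parametric testing: sweep a detector's
# sensitivity threshold over recorded alarm scores to relate probability
# of detection (Pd) to false alarm rate (FAR). All values are invented.

def roc_points(threat_scores, benign_scores, thresholds):
    """For each sensitivity setting, estimate (threshold, Pd, FAR)."""
    points = []
    for t in thresholds:
        pd = sum(s >= t for s in threat_scores) / len(threat_scores)
        far = sum(s >= t for s in benign_scores) / len(benign_scores)
        points.append((t, pd, far))
    return points

# Illustrative alarm scores from a notional test run (not real data).
threats = [0.91, 0.84, 0.77, 0.95, 0.62]   # bags containing a threat
benigns = [0.10, 0.35, 0.48, 0.22, 0.55, 0.15, 0.40, 0.30]  # clean bags

for t, pd, far in roc_points(threats, benigns, [0.3, 0.5, 0.7]):
    print(f"threshold={t:.1f}  Pd={pd:.2f}  FAR={far:.2f}")
```

As the threshold rises, both Pd and FAR fall; the resulting curve is the parametric characterization a verification test would record.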

On the other hand, an explosive detection system (EDS), which could comprise one or more integrated EDDs, would be tested for certification. The objective of this testing is to certify that these instruments, together with their underlying system architectures, meet the FAA-required EDS performance standards under realistic airport operating conditions. The actual testing would be conducted similarly to parametric testing, but rather than generating a family of curves, only the test conditions mentioned in the specification would be measured.1 The test results would be pass/fail.

Only certified EDS equipment would be available for installation at airports. It is probable that EDSs will be offered by systems integrators who would "package" instruments produced by others, together with the necessary baggage transport mechanisms, computer hardware and software, etc. The "package" as a whole would be certified; whoever submits the EDS for testing will hold the certificate.

FAA's requirements for an EDS certification program should encompass more than certification testing of an initial EDS. The certification program should include requirements that the EDS vendor address key aspects required to provide assurance that each EDS unit sold would perform over a period of time in the field at least as well as the one that passed the certification test. Additional certification factors would include: accountability for manufacturing quality control; configuration control of hardware and software; calibration and procedural data; maintenance requirements; personnel training;2 etc. The certification process should allow for incorporation of operational improvements into devices and systems, as well as cost and size reductions. The certification program must encourage participation by air carriers, airport operators, and equipment vendors, along with the FAA, in the rule making process. A comprehensive certification program is strongly recommended by the committee.

Establishment of the explosive detection system (EDS) performance standard3 should include a limited number of minimum technical performance specifications for the following parameters: detection probability; false alarm rate; and throughput rate. Size, weight, support requirements and other important operational parameters should be specified by an allowable range of values. This range can be narrowed for specific airport sites. Once the FAA is certain that at least one EDS can meet the EDS performance standard (having more than one available EDS is highly desirable), a schedule of required implementation dates by airport location can be established. The EDS standard developed by the FAA should also describe countermeasures that terrorists are likely to employ once the general nature of the EDS equipment is known.
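
To illustrate how a pass/fail determination against such a performance standard might be structured, the sketch below compares measured results with minimum specifications. The numeric thresholds are hypothetical placeholders, not the FAA's actual requirements:

```python
# Sketch of a pass/fail certification check against an EDS performance
# standard. All threshold values below are assumed for illustration.

STANDARD = {
    "min_detection_probability": 0.95,  # assumed, not the FAA value
    "max_false_alarm_rate": 0.05,       # assumed, not the FAA value
    "min_bags_per_hour": 300,           # assumed, not the FAA value
}

def certify(measured):
    """Return (passed, failures) for a set of measured test results."""
    failures = []
    if measured["detection_probability"] < STANDARD["min_detection_probability"]:
        failures.append("detection probability below standard")
    if measured["false_alarm_rate"] > STANDARD["max_false_alarm_rate"]:
        failures.append("false alarm rate above standard")
    if measured["bags_per_hour"] < STANDARD["min_bags_per_hour"]:
        failures.append("throughput below standard")
    return (not failures, failures)

passed, why = certify({"detection_probability": 0.97,
                       "false_alarm_rate": 0.08,
                       "bags_per_hour": 350})
print("PASS" if passed else "FAIL", why)
```

A system strong on detection but weak on false alarms still fails, reflecting the report's point that all minimum specifications must be met simultaneously.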

The FAA will be responsible for routine and random testing of operating systems in the field to ensure that deployed systems are being properly maintained and are operating in compliance with established standards. The section on the testing organization discussed below does not address the enforcement area.

1  

Parametric testing could be performed on EDS equipment to fully characterize the operational characteristics prior to the formal certification process.

2  

Human factors considerations must be an integral part of the overall security program. See Technology Against Terrorism—Structuring Security, OTA-511, Chapter 5, "Human Factors in Aviation Security."

3  

Proposed criteria for the Certification of Explosive Detection Systems were published in the Federal Register, Volume 57, Number 214, Notices, pp. 52698-52702, Nov. 4, 1992. It was assigned Notice Number 92-16, Docket Number 27026.


C. TESTING ORGANIZATION

The future market potential for explosive detection devices and systems in commercial aviation is quite large. Also, the FAA R&D program has actively supported the development of certain instrumental methods. For these and other reasons, one can readily envision pressures being exerted on participants involved in the FAA certification process. There are at least two ways in which the FAA can minimize the potential influence of these pressures.

  • The first approach, and the one recommended by the committee, would require that an independent, well-respected organization outside of the FAA conduct all certification testing. This outside organization would report directly to the FAA Administrator. Candidate organizations include federal laboratories with current testing responsibilities for other federal entities, and independent research institutions with existing testing expertise. This outside organization must not be directly involved in the development or manufacture of explosive detection devices.

  • The second approach involves an organization within the FAA that would conduct the testing. This organization must have established expertise in testing matters using specified protocols and sound scientific principles. Because of the importance and impact of testing, it is essential that it be independent of, and insulated from, pressures, whether real or perceived, arising from sources internal or external to the FAA. For example, the FAA must not appear to give preference to the testing of an EDS which employs a technology developed as part of an FAA-sponsored R&D effort.

The activity of this independent testing organization must be overseen by an advisory board with expertise in technology and testing whose composition is independent of the FAA. This advisory board would report directly to the FAA Administrator. For each test, the board would review the test plans, the test data and the test findings and interpretations, and attest to the accuracy of the report's findings and conclusions.

D. TEST PLANNING AND PREPARATION

The important first step in planning any test is the definition of requirements. These requirements must be based on key technical performance indicators derived from the EDS performance specification. These indicators should encompass the most significant elements contributing to the performance of the equipment and be directly measurable.4

Activities involved in test planning should include:5

  • Definition and schedule of all test requirements.

4  

Systems Engineering Management Guide, Defense Systems Management College, Dec 1986, pages 14-1 to 14-19.

5  

B. S. Blanchard, Logistics Engineering and Management, Prentice-Hall, 1986, pages 238–267.

  • Definition of the test management approach.

  • Definition of test conditions and logistic resource requirements; this includes the test environment, facilities, support equipment, test personnel, test procedures, etc.

  • Description of the test preparation phase; this includes test method, training of test personnel, preparation of test facilities, etc.

  • Description of the formal test phase; this includes test procedures, test data collection and analysis.

  • Test documentation.

  • Test funding requirements.

Rather than every test individually addressing each of the above areas, it is usually much more efficient to abstract the common elements that address the technical aspects of a test into a general testing protocol. A general protocol ensures that all tests consider the same critical factors in a consistent way. The test director, together with a team of experts, would use the framework of the general protocol to prepare for a particular test, including the detailed test specification. This results in a uniform, fair testing approach.

The committee recommends that standard test protocols and procedures be prepared for:

  • Test and Evaluation (T&E) required for the verification of the functional characteristics of an explosive detection device (EDD) or system (EDS). Analysis of the types and causes of false alarms should be included.

  • T&E required for the certification testing of an EDS.

  • Field level verification/validation testing of EDSs installed at airports to assure that deployed systems continue to meet the standards set for certification.

The committee recommends that for certification testing, as well as for field level verification/validation testing conducted at various airport facilities, actual explosives be used in the tests as opposed to simulants until simulant technology further advances. The use of simulants for bulk detection systems is technologically achievable, but the simulants must be developed and validated.

The remaining discussion in this chapter assumes the following scenario for all categories of testing:

  • The device or system presented to the FAA for testing would first have been tested by the manufacturer using, to the extent possible, the same protocols and performance criteria that the FAA will use. These procedures and results would be reviewed by the FAA before a candidate device or system would be accepted for test. The device or system would have been subjected to an analysis regarding susceptibility to likely countermeasures, and a determination made that the device or system could not be readily defeated.

  • An independent test director with assistance from the test team and the FAA would prepare a detailed test procedure. This procedure would meet all the requirements of the appropriate General Test Protocol and, as applicable, the FAA Certification Program.

  • The independent test director would be responsible for conducting the test, making and documenting any deviations from the test procedure that are necessary during the test, and preparing the final report.

  • Distribution of the test results would be the responsibility of the FAA, but the results would be available to the vendor.

As mentioned previously, testing can take the form of pass/fail determinations in which the vendor's device performance specifications are measured against fixed criteria. Testing can also involve parametric measurements in which characteristics of importance are verified at specified levels of statistical confidence. It is important that the test team retain all EDD test data so that any potentially relevant probabilities can be estimated. In this manner, parametric measurements of EDDs would allow a system architecture to be analyzed in several different configurations and overall system performance to be calculated.
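
One simple way such system-level calculations might proceed, sketched below under the strong simplifying assumption that the component devices respond independently (an assumption of this illustration, not a claim of the report), is to combine per-device parametric data for candidate architectures:

```python
from math import prod

# Sketch: combine parametric EDD data into system-level estimates for
# two simple architectures, assuming statistically independent devices.

def or_combination(probs):
    """System alarms if ANY device alarms: 1 - product(1 - p_i)."""
    return 1 - prod(1 - p for p in probs)

def and_combination(probs):
    """System alarms only if ALL devices alarm: product(p_i)."""
    return prod(probs)

# Illustrative per-device values (hypothetical, not measured data).
pd = [0.90, 0.85]    # detection probabilities of two EDDs
far = [0.20, 0.15]   # false alarm rates of the same EDDs

print(f"OR  architecture: Pd={or_combination(pd):.3f}  FAR={or_combination(far):.3f}")
print(f"AND architecture: Pd={and_combination(pd):.3f}  FAR={and_combination(far):.3f}")
```

The example shows the basic trade an integrator faces: an OR architecture raises detection probability at the cost of more false alarms, while an AND architecture does the reverse.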

The FAA asked the committee to provide a general test protocol that could be used for evaluating instrumental equipment that detects explosives via bulk properties6 as well as vapor properties. However, a general protocol for vapor devices was not possible.

In contrast to bulk detection, the presence of an explosive is only inferred by vapor detection. From vapor detection alone, nothing can be said about the amount of the explosive present. For instance, vapor in sufficient quantity to cause the current generation of detectors to alarm can come from crumbs of explosive material. Conversely, amounts of explosive ten times the lethal threat quantity, if properly encapsulated, would not necessarily cause these detectors to alarm because there would be no vapor for detection. Aside from the special cases of complete containment and the availability of particulate material, the amount of vapor available for sensing from a given quantity of a concealed explosive is not known. At this time, there is no certified vapor generator available to produce a known small quantity of an explosive's vapor, and there is no reference instrument available to check these generators or the background contamination at low levels.

The committee began work on a general protocol for verification and certification testing of bulk EDS systems. An acceptable general protocol must specify the main factors and considerations required to develop detailed test procedures, conduct the test, analyze test data, and evaluate the tested equipment. The protocol must be general enough to accommodate the testing of the various instrumental devices and systems, yet it must encompass all the important factors to be considered, so that use of the protocol will ensure that all equipment is tested in a consistent and fair manner.

6  

In this context, bulk detection refers to the sensing of some physical or chemical property of the object under examination.

For testing of explosive detection equipment, assuming an independent test team that follows accepted test protocols and employs a standard threat, a standard bag set, and a well-characterized test facility, the key performance indicators (sometimes referred to as Critical Operational Issues) are:

  • Probability of detection.

  • False alarm rate.

  • Bag processing rate.

The following two sections present the results of the committee's efforts on a general test protocol for bulk detection equipment, and a test procedure for vapor devices. The role of the general testing protocol in the context of the certification testing is shown in Figure 2.1.

E. A GENERAL TESTING PROTOCOL FOR BULK EXPLOSIVE DETECTION SYSTEMS

A draft of a protocol for bulk EDS testing had been prepared for the FAA Technical Center by a group of independent consultants in 1991.7 Committee members informally reviewed this draft, using the criteria discussed in the previous section as a guideline. Extensive discussions and individual written critiques resulted in changes that were reflected in the consultants' final report to the FAA.

In late 1991, the report was distributed by the FAA to industrial contractors involved in the development and/or production of explosive detection equipment with the comment that this protocol should be used as a guide in their testing efforts since a version of it would be used to develop the FAA's test plans. This protocol requires the formation of an independent test team, clear definition of the performance to be verified, and the creation of a test plan for each system or device. Statistical considerations are discussed. In addition, a standard set of bags and threat types, and the creation of a dedicated test site are recommended.

The "General Testing Protocol for Bulk Detection Systems" presented in Appendix A, has been developed in consultation with the committee. It is a guide in designing detailed verification and certification test and data analysis plans.

The general testing protocol in Appendix A differs somewhat from that distributed by the FAA. Within the protocol, the committee has differentiated between those aspects that pertain to certification testing and those that pertain to verification testing, added to the discussion regarding the use and composition of a standard set of baggage for certification tests, expanded on the issues involved in selecting the test location, included provisions to revise the protocol as additional testing experience is gained, added some discussion on testing with countermeasures, required analysis of false alarm data for verification testing, and required the documentation of deviations from the protocol.

7  

Dr. Joseph A. Navarro; Mr. Donald A. Becker; Dr. Bernard T. Kenna; and Dr. Carl F. Kossack.

Figure 2.1. EDS Certification Testing

The protocol allows for the results to be reported as pass/fail as well as statistically based parametric data. A synopsis of the protocol's primary test conditions is shown in Table 2.1. This general protocol is to be used to plan specific procedures to test the performance of an automatic explosive detection system or device in finding bulk explosives concealed in hand carried or checked baggage. This protocol should not be used for testing instruments based on vapor or particulate detection. The protocol is specific to testing of production hardware, as opposed to proof of principle or developmental testing of early prototypes.

TABLE 2.1 Certification Versus Verification Operational Testing

EDS Certification Testing
  Test Outcome: Pass/Fail
  Type of Equipment: Low rate or full-scale production units
  Test Location: FAA Dedicated Site
  Threat Package: Live Explosives, types and quantities in the FAA's EDS Requirements Specification
  Bag Population: FAA Standard Set
  Test Time: Limited Duration

EDD Performance Verification Testing
  Test Outcome: Parametric Data on Functional Characteristics
  Type of Equipment: Low rate or full-scale production units
  Test Location: FAA Dedicated Site, or Airport Environment
  Threat Package: Live Explosives, or Simulants (at Airport Sites)
  Bag Population: FAA Standard Set, or Actual Passenger Bags
  Test Time: Limited Duration, or Extended Duration (at Airport Site)

The protocol identifies and addresses the significant factors that must be considered in:

  • Designing a test plan that provides data to allow for a fair and consistent verification of the operational functional characteristics of the tested detection devices, or certification of a bulk detection system. Specifically, the protocol identifies the factors that can influence the actual conduct of the test and bias the measurements, thus affecting the results. At a minimum, these factors include: the identification of the characteristics of the explosive material that is being measured by the equipment; the identification of the threat explosives and the representative bag population that will be used for the test; and selection of the test location, equipment calibration plan, and description of the roles of the various participants before, during and after the conduct of the test.

  • Developing the data analysis plan, which should always be completed prior to the conduct of the test. Key factors include: planning for data collection and recording; selection procedures for specific test bags (composition and quantity); and identification of the specific threats that will be tested in which bags.


  • Conducting the tests. Areas addressed in the protocol include: a description of how the tests should be conducted to ensure that the equipment manufacturer cannot affect the test conduct or results; procedures to test for reproducibility of results during the test runs; and specific details of the test team roles and duties during the test period.

  • Analyzing the test data and preparing the report.
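
One statistical consideration in sizing the test and planning the data analysis can be made concrete. A common approach (an illustration here, not a procedure prescribed by the protocol) uses the exact binomial result that if every one of n concealed threats is detected, the one-sided lower confidence bound on the detection probability is alpha**(1/n); the target values below are hypothetical:

```python
from math import ceil, log

# Sketch: how many zero-miss threat presentations are needed to
# demonstrate a target Pd at a given confidence level? If all n trials
# detect, the exact lower confidence bound on Pd is alpha**(1/n), so we
# need the smallest n with alpha**(1/n) >= pd_target.

def trials_for_zero_miss(pd_target, confidence):
    """Smallest n such that n detections in n trials demonstrate
    Pd >= pd_target at the given one-sided confidence level."""
    alpha = 1 - confidence
    return ceil(log(alpha) / log(pd_target))

n = trials_for_zero_miss(0.95, 0.95)
print(f"{n} threat presentations, all detected, demonstrate "
      f"Pd >= 0.95 at 95% confidence")
```

Calculations of this kind determine how many threat bags the test plan must schedule before the test begins, which is why the committee insists the data analysis plan precede the conduct of the test.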

The committee recommends that this protocol be reviewed by outside experts and updated at appropriate intervals, incorporating knowledge gained as more testing is accomplished, so that it becomes "a living document." Over the course of a year, draft versions of the protocol assisted in the planning of verification tests, and that experience was very useful in pointing out areas of the protocol that needed revision. This process should be continued in the future.

F. THE STATUS OF VAPOR AND PARTICLE DETECTION TEST PROTOCOLS

The previous NMAB report8 recommended that a clear definition of performance criteria for detector systems be developed, and that standard vapor generators be built to deliver known amounts of vapor to devices under test. This has not been done for vapor systems; therefore, a general testing protocol cannot yet be developed. A detailed technical understanding of the mechanisms involved in the evolution, and subsequent behavior, of vapor leaving an explosive device is lacking. The false alarms caused by unintentional airborne interferents require that special facilities and additional detectors be available for monitoring during tests. The protocol required for bulk explosive detection (Appendix A) is different from that required for vapor systems, and our understanding of what should be tested is too limited to establish a generic test protocol for vapor devices or systems.

The measurement problem is made more difficult by the lack of a standard vapor generator to present a known amount of material to a device for testing. No reference instrument has been demonstrated that can detect explosive vapor at the lower limits (50 femtograms) to assure the vapor standard's performance during tests. The consistent complaint of vapor device manufacturers, that background contamination unfairly affects test results, can only be settled by the use of such a reference instrument. The reference instrument must have a limit of detection that is lower than that of the devices being tested, and a chemical specificity that exceeds that of the candidate systems.

In an attempt to increase the understanding of vapor detection, the FAA is establishing a test facility at the Idaho National Engineering Laboratory (INEL). Experience in this facility could provide basic knowledge of vapor detection mechanisms under different scenarios of sample preparation, handling, and concealment. Knowledge of the amount of material available for sampling would allow a rational assessment of the technology needed. If detectors were successful in detecting this level, a general test protocol could then be written to address the need.

8  

National Research Council. 1990. Reducing the Risk of Explosives on Commercial Aircraft (U), NMAB-463. National Materials Advisory Board, Washington, DC.

The committee recommends that, if vapor detection systems are to be certified, a test facility must be maintained that has a standard vapor generator and a reference instrument. The facility will have to be free from contamination and capable of remaining in that condition after the tests.
