Glossary
B
Baseline.
The term baseline was originally used in engineering surveying to define an established line with fixed direction and end points such that further extensions into unmapped areas could be made. In configuration management, a baseline is a document, formally designated and fixed at a specific time during a configuration item's life cycle. From this baseline, the development of the system can be extended from specifications into design documentation and ultimately into hardware and software items (Buckley, 1993).
Baseline management.
Configuration control of the identified baseline.
Baseline test.
A test to determine the baseline performance of noncertified explosives-detection equipment. This test will be conducted at the FAA Technical Center with the primary standard bag set.
Bulk explosives-detection equipment.
Any explosives detection device or system that remotely senses some physical or chemical property of an object under investigation to determine if it is an explosive.
C
Certification baseline.
Definition of the configuration baseline of explosives-detection equipment (including individual configuration items) at the time of certification.
Certification test.
A test conducted at the FAA Technical Center with the primary standard bag set to evaluate the functional and performance capabilities of an explosives-detection system, under realistic operating conditions, against the FAA's certification criteria. An explosives-detection system that meets the FAA's certification criteria is certified and referred to as a certified explosives-detection system.
Change management.
The set of management functions necessary to ensure that compatibility is maintained between all items of a system whenever any single item is changed (Blanchard, 1986). This includes change control of a configuration item (e.g., an x-ray detector) after establishment of the configuration baseline.
Computer software configuration item (CSCI).
A configuration item that is specific to system software. Each software module, for example, may constitute a separate CSCI.
Configuration auditing.
Checking a configuration item or system for compliance with the identified baseline configuration.
Configuration baseline.
A document or a set of documents, formally designated and fixed at a specific time and constituting the approved configuration identification of a configuration item. Documents usually refer to specifications and drawings for hardware, firmware, and software and may include listings, flow charts, decision trees, and so on.
Configuration control (change control).
The systematic proposal, justification, evaluation, coordination, and approval or disapproval of proposed changes, as well as the implementation of all approved changes to the baseline configuration of a configuration item and its identification documentation. A major function of this element is the administration of a configuration control board.
Configuration control board.
A board composed of technical and administrative representatives who recommend approval or disapproval of proposed engineering changes to a configuration item's current approved configuration (DOD, 1995).
Configuration identification.
Selection of configuration items and maintenance of the documents that identify and define the baseline of a configuration item or the overall system (e.g., an explosives-detection system). This includes the determination of the types of configuration documentation for each configuration item, the issuance of unique identifiers (e.g., serial numbers) affixed to each configuration item, and the technical documentation that defines the configuration item's configuration.
Configuration item (CI).
A collection of hardware, software, and firmware that is a uniquely identifiable subset of the system configuration and that represents the smallest portion of the system subject to independent configuration management change control procedures (DOD, 1995; Buckley, 1993). The CIs may differ widely in complexity, size, and kind. During development and initial production, CIs are those specification items whose functions and performance parameters must be defined and controlled to achieve the overall end-use function and performance (DOD, 1995).
Configuration management.
A process that identifies the functional and physical characteristics of a software, firmware, or hardware item during its life cycle, controls changes to those characteristics, and records and reports change processing and implementation status.
Configuration management plan.
A document that defines how configuration management will be implemented for a particular acquisition, program, or system (DOD, 1995).
Configuration status accounting.
Recording and reporting the implementation of changes to the baseline configuration of a configuration item and its identification documents.
Criticality (of change).
Refers to the "importance" of the item being changed to system performance.
D
Degree (of change).
Refers to the extent of a change (e.g., localized versus all encompassing).
E
Explosive.
A chemical compound that reacts rapidly, generating substantial amounts of heat and pressure.
Explosives-detection device.
An instrument that incorporates a single detection method to detect one or more explosive material categories.
Explosives-detection equipment.
Any equipment, certified or otherwise, that can be used to detect explosives.
Explosives-detection system (EDS).
A self-contained unit composed of one or more devices integrated into a system that has passed the FAA's certification test.
L
Life cycle.
The total phases through which an item passes from the time it is initially developed until the time it is either consumed in use or disposed of as being excess to all known materiel requirements (DOD, 1995).
Life-cycle management plan.
As used in this report, a plan that will reside with and be maintained by the FAA, defining and documenting the FAA's configuration management, performance-verification, and quality-assurance requirements for
- the FAA during certification and field testing of explosives-detection equipment (including control of test articles, procedures (documentation), and test results)
- explosives-detection equipment manufacturers during the engineering, manufacturing, and operational life cycles
- the airlines and other end users, with regard to deployed explosives-detection equipment, during the operational life cycle (including control of operating and maintenance procedures)
M
Monitoring.
Observation of critical system parameters to determine if the performance of explosives-detection equipment has changed. Monitoring would normally be done in the airport at specified intervals using test articles to demonstrate to the user and the FAA that equipment performance has not changed.
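As a purely illustrative sketch (the report does not prescribe an implementation, and the parameter names and tolerances below are hypothetical), a monitoring check of this kind amounts to comparing measured critical system parameters against baseline tolerances:

```python
# Hypothetical sketch: compare measured critical system parameters against
# baseline tolerances to flag a possible change in equipment performance.

BASELINE = {
    # parameter: (nominal value, allowed deviation) -- illustrative numbers only
    "tube_voltage_kv": (160.0, 2.0),
    "detector_current_ma": (5.0, 0.25),
}

def out_of_tolerance(measurements):
    """Return the parameters whose measured values fall outside tolerance."""
    flagged = []
    for name, measured in measurements.items():
        nominal, tolerance = BASELINE[name]
        if abs(measured - nominal) > tolerance:
            flagged.append(name)
    return flagged

# A reading within tolerance produces no flags; a drifted reading is flagged.
print(out_of_tolerance({"tube_voltage_kv": 160.5, "detector_current_ma": 5.1}))
print(out_of_tolerance({"tube_voltage_kv": 164.0, "detector_current_ma": 5.1}))
```

In practice the flagged result, not the raw readings, is what would be reported to the user and the FAA.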
P
Performance verification.
As used in this report, the process to verify that explosives-detection equipment complies with the requirements allocated to it.
Precertification testing.
The precertification test provides quantitative evidence that an explosives-detection system meets (or fails to meet) the FAA's performance requirements prior to certification testing. This test is used to determine if an explosives-detection system is ready for certification testing.
Primary standard.
In this report, refers to any explosive material identified by the FAA that must be detected by an explosives-detection system for such a system to be certified.
Primary standard bag set.
In this report, refers to the standard bag set that the FAA uses for certification testing of every explosives-detection system submitted. The primary standard bag set consists of representative passenger bags, some of which contain explosives at threat quantity.
Q
Qualification test.
A test to verify the performance of an individual explosives-detection system unit in order to qualify that unit for deployment.
Quality standard.
Defines the requirements of a quality system, for example, ISO 9001.
Quality system.
The total quality system is the agreed company-wide and plant-wide operations work structure, documented in effective, integrated, technical, and managerial procedures, for guiding the coordinated actions of the work force, the machines, and the information of the company and plant in the best and most practical ways to assure customer quality satisfaction and economical costs of quality (Feigenbaum, 1983).
R
Regression testing.
The process of validating modified parts of a software program and ensuring that no new errors are introduced into previously tested code. Although software may have been completely tested during its development, program changes during maintenance require that parts of the software be retested by a regression test.
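The idea can be sketched in miniature: a fixed suite of previously passing checks is re-run after every maintenance change, and any failure signals a newly introduced error. The function and checks below are hypothetical, for illustration only:

```python
# Hypothetical sketch of regression testing: a fixed suite of previously
# passing checks is re-run after every modification to the code.

def threat_mass_grams(density_g_cm3, volume_cm3):
    """Compute the mass of a detected object from density and volume."""
    return density_g_cm3 * volume_cm3

def run_regression_suite():
    """Re-run the previously passing checks; return True only if all pass."""
    checks = [
        threat_mass_grams(1.6, 100.0) == 160.0,   # nominal case
        threat_mass_grams(0.0, 50.0) == 0.0,      # zero-density edge case
    ]
    return all(checks)

print(run_regression_suite())  # any False after a change signals a regression
```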
S
Secondary standard.
In this report, a nonexplosive material that simulates the physical characteristics (e.g., average atomic number, density, etc.) of an explosive such that when characterized by a particular explosives-detection technology it appears to be an explosive.
Secondary standard bag set.
In this report, a secondary standard bag set consists of a number of representative international passenger bags that do not contain threat objects and a number of bags containing simulated explosives at an amount that represents a threat quantity of explosives.
Self-diagnosis test.
A test to determine if components or subsystems of an explosives-detection system are functional. Self-diagnosis includes the continuous measurement of subsystem parameters (e.g., voltages and currents) during routine operation as well as self-diagnostic routines on machine start-up.
Status accounting.
Recording and reporting proposed and approved changes to the baseline configuration of a configuration item and its identification documents. This includes a record of the approved configuration item documentation and identification numbers, the status of proposed changes to configuration item configuration, and the implementation status of approved changes.
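As a purely illustrative sketch (the record fields and identifiers below are hypothetical), a status-accounting log ties each proposed change to its configuration item and tracks its approval and implementation status:

```python
# Hypothetical sketch of configuration status accounting: a log of proposed
# changes, each tied to a configuration item and an approval/implementation status.
from dataclasses import dataclass

@dataclass
class ChangeRecord:
    change_id: str
    config_item: str        # e.g., a serial-numbered x-ray detector
    status: str             # "proposed", "approved", or "implemented"

log = [
    ChangeRecord("ECP-001", "detector-SN1234", "implemented"),
    ChangeRecord("ECP-002", "detector-SN1234", "proposed"),
]

def pending_changes(records):
    """Report the changes not yet implemented, for status reporting."""
    return [r.change_id for r in records if r.status != "implemented"]

print(pending_changes(log))  # -> ['ECP-002']
```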
T
Test articles.
Individual articles, including items such as simulants, the InVision IQ simulants test bag, and ASTM (1993) standardized step wedges and computed tomography phantoms. The purpose of these articles is to elicit a predetermined response in order to test critical system parameters.
Test objects.
Any object used to test the performance of an explosives-detection system, for example, the primary standard bag set, the secondary standard bag set, or simulated explosives.
Trace explosives-detection device.
A device that detects explosives through direct chemical identification of particles or vapors given off by explosive materials.
V
Validation.
Confirmation that the specified requirements (for an explosives-detection system) satisfy stakeholder needs.
Verification.
Confirmation that an explosives-detection system fulfills the specified requirements of the stakeholders.
Verification testing.
Determines if a deployed explosives-detection system meets its performance requirements. Verification testing would normally be performed in the airport at initial deployment and at specified intervals using a secondary standard bag set to demonstrate to the user and the FAA that the unit is functioning as specified.
Version control.
Documentation and control of individual versions of objects such as software source code, executables, graphics, x-ray sources, and detectors.
References
ASTM (American Society for Testing and Materials). 1993. Design and Use of Ionizing Radiation Equipment for the Detection of Items Prohibited in Controlled Access Areas. F792-88. West Conshohocken, Pa.: American Society for Testing and Materials.
Blanchard, B.S. 1986. Logistics Engineering and Management, 3rd ed. Englewood Cliffs, N.J.: Prentice-Hall.
Buckley, F.J. 1993. Implementing Configuration Management: Hardware, Software, and Firmware. Piscataway, N.J.: IEEE Press.
DOD. 1995. Configuration Management. MIL-STD-973. Washington, D.C.: U.S. Department of Defense.
Feigenbaum, A.V. 1983. Total Quality Control, 3rd ed. New York: McGraw-Hill.
Technical Report Documentation Page

1. Report No.: DOT/FAA/AR-98/51
2. Government Accession No.:
3. Recipient's Catalog No.:
4. Title and Subtitle: Configuration Management and Performance Verification of Explosives-Detection Systems
5. Report Date: September 1998
6. Performing Organization Code: National Research Council
7. Author(s): Committee on Commercial Aviation Security, Panel on Technical Regulation of Explosives-Detection Systems
8. Performing Organization Report No.: NMAB-482-3
9. Performing Organization Name and Address: National Materials Advisory Board, National Research Council, 2101 Constitution Avenue, NW, Washington, DC 20418
10. Work Unit No. (TRAIS):
11. Contract or Grant No.: DTFA03-94-C-00068
12. Sponsoring Agency Name and Address: Federal Aviation Administration Technical Center, Aviation Security Research and Development Service, Atlantic City International Airport, NJ 08405
13. Type of Report and Period Covered: Final Report
14. Sponsoring Agency Code: AAR-500
15. Supplementary Notes: The FAA's Contracting Officer's Technical Representatives (COTR) are Alan Novakoff and Paul Jankowski.
16. Abstract: Aviation security is being expanded to include commercial equipment that is capable of detecting the presence of explosives in passenger baggage. The FAA is developing procedures to ensure that explosives-detection equipment is manufactured, maintained, and operated in a manner that will sustain an acceptable level of performance. This report examines options for verifying the performance of this equipment and for ensuring that changes or upgrades to such equipment are properly managed.
17. Key Words: explosives-detection systems, aviation security, bulk explosives-detection technology, trace explosives-detection technology, configuration management, performance verification, quality control
18. Distribution Statement:
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 88
22. Price:

Form DOT F 1700.7 (8-72). Reproduction of completed page authorized.