
Engineering Aviation Security Environments—Reduction of False Alarms in Computed Tomography-Based Screening of Checked Baggage (2013)

Chapter: 4 Incentivizing Research and Development to Decrease False Alarms in an Airport Setting

Suggested Citation:"4 Incentivizing Research and Development to Decrease False Alarms in an Airport Setting." National Research Council. 2013. Engineering Aviation Security Environments—Reduction of False Alarms in Computed Tomography-Based Screening of Checked Baggage. Washington, DC: The National Academies Press. doi: 10.17226/13171.

4 Incentivizing Research and Development to Decrease False Alarms in an Airport Setting

Improvements in technology for reducing false alarms in checked-baggage screening in U.S. airports are discussed in previous chapters of this report. The committee believes that, in addition to such improvements, adjusting the structure that the Transportation Security Administration (TSA) uses for contracts with equipment manufacturers can advance technology development and strengthen the mechanism by which improvements are implemented. Although the discussion in this chapter focuses on for-profit companies that seek to make sound financial decisions about investing their research and development (R&D) funds, academic groups also require long-term planning before establishing research work in a specific area. A long-term strategy for improving explosive detection system (EDS) performance in an airport setting would benefit all of the stakeholders involved and might encourage participation by others not yet engaged in improving checked-baggage screening.

ADDRESSING CONCERNS OF EXPLOSIVE DETECTION SYSTEM VENDORS

The Need for a Long-Term Transportation Security Administration Plan

The process of developing, testing, and implementing improvements in technology is a long-term investment for a company. Technologists with “a good idea” must convince their management not only that the idea has merit in terms of fixing a known problem or improving the performance of an existing technology, but also that the company will eventually earn back the money spent to bring the idea to fruition and will make some profit on top of that. This kind of long-term planning cannot take place in an atmosphere in which the goals of the potential buyer are unclear or apt to change quickly.
The committee heard from EDS vendors (see the section entitled “Study Process” in Chapter 1 of this report) that the TSA provides them with few incentives to improve the performance of their equipment. Additionally, although the Department of Homeland Security (DHS) aims to improve the false alarm performance of EDSs for baggage screening, the committee was made aware of no clear TSA plan to implement improvements in the performance of fielded systems. Vendors variously heard that they should be working on improvements ranging from reducing false alarms, to reducing operational costs, to increasing the time between planned or unplanned maintenance events. The result of these mixed signals is that companies may invest in projects that save the companies money but do not improve the performance of fielded equipment. Without changes to current TSA policy, vendors have no incentive to spend money developing improvements beyond the necessary fixes for known problems. Creating incentives for vendors and the technical community to develop improvements will require an organizational framework that includes a known path for the deployment of technology, a realistic strategy for fielding proven improvements, and specific incentives for vendors to provide equipment that performs better than would be necessary to meet baseline requirements.1

The committee believes that the DHS and the TSA, in cooperation with the equipment vendors, must develop a realistic, long-term strategy for the performance improvement of EDS equipment in an airport setting. One of the most successful demonstrations of how improvements can be driven by a long-term plan is the semiconductor industry’s International Technology Roadmap for Semiconductors (ITRS).2 This 15-year roadmap was developed with the participation of chip makers, equipment suppliers, and research entities, and over the years it has laid out the generational technology requirements for the industry to continue to realize Moore’s law. The roadmap is updated annually by the appropriate teams, which meet each year in a public meeting. In the competitive semiconductor market, this roadmap has spurred development and manufacturing activities by individual companies and allowed them to remain competitive. For the DHS and the TSA, a similar approach could produce a consensus on future requirements. Although incentives for participation would differ from those in the competitive market of private industry, and although priorities in a long-term plan involving EDS equipment would necessarily change with the threat environment and other outside influences, a long-term plan developed cooperatively would allow companies to evaluate their risk-and-reward strategy in a more stable investment environment. In support of this, the committee re-endorses the following recommendation.

Recommendation: “Within one year, in cooperation with the other stakeholders, the FAA [Federal Aviation Administration] should develop a five-year joint-development plan that includes cost, stakeholder responsibilities, quality measures, and other important factors. This plan should be a living document that is formally updated annually. Buy-in from all stakeholders will be necessary for the plan to be effective.” (National Research Council, Assessment of Technologies Deployed to Improve Aviation Security, National Academy Press, Washington, D.C., 1999, p. 5.)

Changes Needed for Dealing with Technological Improvements

Technical Review of Changes

A second area of vendor frustration, described to the committee by company representatives, concerned whether vendors could realistically expect their improvements to be purchased by the TSA for use in fielded equipment. Each vendor that the committee heard from3 described improvements that could be fielded now but that were being hindered by TSA testing requirements or by a lack of guidance on how to evaluate or implement these changes. As with the long-term strategy, companies will invest in technology improvements that can reasonably be expected to generate a return on the investment. If a company pays for development but then has to wait for the next procurement cycle to see any payback, there is little incentive to improve its product continuously or to evaluate third-party improvements. The committee does not believe that the TSA should spend money fielding every suggested change. Instead, it should create a framework by which reasonable changes can be evaluated against the claimed improvements and implemented in a sensible way. A first step in that process could be the development of a group of individuals knowledgeable about the technology and with broad experience in the technology, testing, and field requirements (see the section below entitled “Technical Review Board”). Such technical review boards—with a charter to evaluate potential changes and to identify what testing would be required to ascertain whether a change had the intended effect and what processes would be required to implement the change—are common in industry. A body within the TSA with a similar charter could add some certainty to proposed changes by articulating testing and implementation requirements before money was spent on an idea.

1 Beyond the obvious contracting mechanisms, “incentives” could come in the form of such things as extended patent protection. See, for example, Francesca Cornelli and Mark Schankerman, “Patent renewals and R&D incentives,” RAND Journal of Economics 30(2):197-213, 1999, which describes an “incentive effect” for R&D that comes from giving firms with R&D capabilities the option of choosing longer patent lives.
2 Available at http://www.itrs.net/, accessed December 28, 2010.
3 Representatives of General Electric (GE) Security and of L-3 Communications, presentations to the committee, February 12, 2009, San Francisco, Calif.

The Need for Testing Capabilities

Once a path to implementation for a claimed improvement has been determined, testing capabilities are needed. For software improvements, such testing might require image data from several hundred bags to demonstrate improved detection; hardware changes might require actually scanning bags in an airport setting to confirm a lower false alarm rate. Each potential improvement would have to be evaluated for risk and reward, and each would require particular testing facilities. The TSA has a variety of testing capabilities now, including the Transportation Security Laboratory (TSL) for certification testing and the TSA’s individual laboratories. The TSA can also benefit from the vendors’ in-house testing facilities and the availability of realistic explosive simulants. The recently opened TSA Systems Integration Facility at Ronald Reagan Washington National Airport outside Washington, D.C., may add the capability of doing testing that involves actual passenger bags (as compared to “test sets”). All of these resources should be considered when determining how to test proposed improvements.

The Need to Identify Bottlenecks in the Certification Process

A reduction in the time that it takes vendors to complete the certification process would allow improvements to be deployed more rapidly in an airport setting. To address this, the Transportation Security Laboratory will need to examine its certification process for EDSs with the goal of identifying potential bottlenecks.
One approach to this issue might be the development of a method to test systems in an airport setting in parallel with extant systems, allowing data on the same passenger bags within a single airport to be compared.

Incentives for Vendors

A third area of change in the TSA’s contracting processes would be to provide vendors with the opportunity to receive performance bonuses if their equipment exceeds the required baseline performance. This type of incentive could encourage vendors to work collaboratively with researchers to identify improvements that directly impact the desired performance. It would also make it more attractive for vendors to seek out third parties whose research could lead to a better automated threat-recognition algorithm or other improvement. To implement such a change, the TSA would have to modify its current contracting approach and determine a method both to reward performance that exceeds the baseline and to encourage collaboration. One model might be found in the Department of Defense (DOD), which is moving toward a “performance-based logistics” (PBL) contracting program that creates incentives for vendors to determine the best improvement path and to implement it. The section below entitled “Performance-Based Logistics” describes the DOD approach in more detail.

Finally, the committee believes that the TSA’s current plan to replace all the fielded end-of-life EDSs in a single purchase defeats the goal of continuous improvement and could lock the TSA into years of trying to improve fielded equipment through incremental changes. The ability to purchase new equipment periodically could be a strategic path toward improved performance of EDSs in an airport setting. This aspect of continuous learning reinforces the recommendation made in Chapter 2:

Recommendation: The TSA should not fund an overall replacement of fielded explosive detection systems, because replacing all the units in service with currently available technology would not allow for learning in an airport setting to inform future performance improvements. Instead, the TSA should plan its capital spending for explosives detection improvements over a period of time sufficient to allow several generations of technology to be fielded on a limited basis, evaluated, and iteratively improved—thus leading to a gradual improvement in the overall field performance of CT-based explosives detection systems.

COLLABORATIVE CONTRACTING METHODOLOGY

According to information available to the committee, the current contracting methodology utilized by the TSA for airport security equipment employs three types of government funding: procurement; operations and maintenance (O&M); and research, development, testing, and evaluation (RDT&E). In this system, the TSA purchases EDSs from the vendor (procurement) and installs and operates the equipment in the airport (O&M). The TSA also pays the vendor or another contractor to provide equipment maintenance in an airport setting (O&M). If development money can be obtained (RDT&E), then system improvements can be implemented.
Another way of looking at these streams of funding is as follows:

• Procurement funding covers the purchase of security systems in limited or full-rate production (e.g., 10 CT systems meeting a given performance specification);
• O&M funding covers the original equipment manufacturer (OEM) field service support and the TSA operating costs; and
• RDT&E funding covers the new development costs associated with technology investigations, new design activities, and the funding of third-party technology research (done, for example, at universities and laboratories).

The limitations of RDT&E funding in the typical government procurement cycle often severely constrain the ability of the TSA to fund new product improvements: as ideas for new technology insertions emerge from the OEMs and from academia, the scarcity and timing of this form of funding can inhibit continuous process and product improvement. When funding streams are separated in the way that they are in the EDS procurement model, there is little incentive for a vendor to provide equipment upgrades that might improve field performance. From the operational point of view, the TSA does not have money to test and field equipment upgrades that have the potential to reduce false alarm rates, to increase the time between required maintenance events, and to reduce the failure rate of EDSs—and ultimately to reduce operating costs. This gap in funding for continuous improvement has resulted in frustration on both sides: the TSA cannot always field the best and most up-to-date equipment, and the equipment vendors cannot benefit from their investments in EDS performance improvements. Changing the approach to procurement and operations could provide the TSA with the flexibility necessary to reap the benefits of investments in performance improvement while offering the vendors an incentive for continuously improving their products.
This approach, based on the recent shift in the DOD procurement process known as performance-based logistics, is described in the next section.

The major incentive that the TSA can offer equipment vendors to improve the performance of their equipment is the purchase (procurement) of new products that must meet improved (more rigorous) system specifications. Such a process would, by its nature, also require the TSA to have a clear set of defined and measurable standards for performance.

PERFORMANCE-BASED LOGISTICS

Performance-based logistics refers to the purchase of support as an integrated, affordable performance package designed to optimize system readiness and meet performance goals for a system through long-term support arrangements with clear lines of authority and responsibility. The essence of PBL is buying performance outcomes, not individual parts and repair actions; the contract line item (CLIN) structure is therefore designed around the desired performance. Under a PBL-based contract, the purchaser (the government) and the provider (the equipment vendor) work together to determine key performance indicators for the equipment, and the purchaser provides incentives for the vendors and other contractors to invest in improvements with a reasonable expectation that these improvements will be evaluated and implemented if successful. This method has been employed successfully by DOD contractors.

Overview of Performance-Based Logistics

The Office of the Secretary of Defense has defined performance-based logistics as “a strategy for weapon system product support that employs the purchase of support as an integrated performance package designed to optimize system readiness. It meets performance goals for a weapon system through a support structure based on performance agreements with clear lines of authority and responsibility.”4 When employed in the context of the total life cycle of a product, a PBL approach to major system fielding has resulted in superior system performance, operational readiness, and continuous product improvement, which directly impacts incentivized contractor profit.
The TSA would benefit greatly from implementing a contracting approach that provides an incentive to the contractor to design and field system improvements that positively impact performance parameters determined to be significant indicators of success. Figure 4-1 shows six steps in a PBL flow, along with the responsibilities of the TSA, the responsibilities of the OEM or vendor, and the joint responsibilities, as suggested by the committee.

FIGURE 4-1 The steps in a performance-based logistics contract flow, and the committee-proposed responsibilities for the Transportation Security Administration (TSA), for the original equipment manufacturer (OEM) or vendor, and for joint cooperation.

Advantage of Performance-Based Logistics

The primary goal of a PBL program is to provide logistics services in a contracting structure that offers incentives for continuous improvement in key measures throughout the life cycle of the product. As implemented by the DOD, the purpose of this contracting structure is to allow the procuring agency and the contractor to select for implementation those system improvements that would positively impact the incentivized key measures. The contractor is funded to develop or acquire product improvements, and the government reaps the benefit of higher reliability, improved system performance, improved system readiness, and the implementation of system modifications that accommodate a changing threat level. An example of this is the DOD RQ-7 Shadow Tactical Unmanned Air Vehicle program (Shadow program), which consists of a series of production awards and a companion PBL contract with an incentive plan; this arrangement has significantly improved system availability and reliability, reduced operating cost per unit, decreased the logistics footprint (inventory and support services), and improved the logistics response time.5

Implementation Considerations for Performance-Based Logistics

Many aspects of the PBL process can be applied to the acquisition of and logistics support for airport security screening equipment; one example is outlined in Box 4-1. Like major security systems deployed by the DOD, TSA screening equipment is vital to the U.S. national defense and addresses evolving threat conditions. Additionally, in both cases the procurer desires a means to continually improve its threat-recognition capability—be it in an airport, at the airport perimeter, at a train station, at a shipping port, or for the U.S. military on foreign soil.

A typical DOD major system procurement is driven by a statement of objectives that provides the contractor with threshold operational performance requirements. The PBL contract is a “companion” contract (or set of contract line items) that provides life-cycle support for the fielded systems. Performance is measured by a variety of indicators (parameters) that will change with the product life cycle, the threat, the situational environment, and other factors. As noted earlier, the Shadow program employs a service contract with a fee based on performance metrics that measure results in customer (logistics) support. The customer procures the system, and the contractor provides the full integrated logistics and sustainment support. All spares, repairs, field service representative support, and management are provided under the PBL incentive program. The Shadow contract is a cost-plus-incentive-fee contract and is subject to federal acquisition regulations, DOD directives, and specific contract requirements. Key parameters in the category of system readiness may include minimum (threshold) and desired (objective) performance for operational availability, mean time to repair, and mean time between operational mission failures.

4 ADUSD (Logistics Plans & Programs), Total Life Cycle System Management (TLCSM): Plan of Action and Milestones, updated January 6, 2003, p. 2, available at http://www.acq.osd.mil/log/sci/exec_info/sm_milestone_plan010603.pdf, accessed June 3, 2011.
5 Performance Based Logistics (PBL) Contract W58RGZ-08-C-0016, U.S. Army Aviation and Missile Command, Redstone Arsenal, Ala.

TECHNICAL REVIEW BOARD

As noted earlier in this chapter, the committee believes that it would be useful for the TSA to establish a review board whose members represent a broad range of interests, for the purposes of evaluating potential improvements, outlining testing and fielding requirements, and weighing the cost of implementation against the potential performance gain. Such a review board would not only enable vendors to establish a stake in the outcome of fielded changes but would also give them a clear path to the implementation of improvements. The board should also review and evaluate methods to identify and mitigate risks, which would help vendors make more informed decisions on how to spend their internal R&D (IR&D) funds.

FIELDING CONSIDERATIONS

In addition to establishing a technical review board that could define testing and implementation requirements, the TSA might also establish a capability to review and validate test conditions and results in order to determine whether a specific change meets the criteria set out. Although vendors indicated to the committee that this evaluation is being done, the committee believes that formalizing this role would provide needed structure for decisions about changes to fielded equipment. This review and validation could be another function of the technical review board, or a separate entity could be established to carry it out. A testbed for evaluating potential improvements would consist of the following elements (listed after Box 4-1):

BOX 4-1
Applying the Performance-Based Logistics Process to Explosives Detection Systems

Below is an example developed by the committee of the types of performance data on explosive detection systems that the Department of Homeland Security might choose to incorporate into a performance-based logistics contract for CT-based explosives detection systems. The numbers are notional only.

Selected Key Performance Indicators

a. System Readiness (Up Time) (25%)
—Assessed periodically and rolled up for all fielded systems.
—Defined as: System Readiness = Up Time / (Up Time + Down Time)

b. False Alarms—Current Year (25%)

c. System Maintenance Cost (20%)1

d. Reliability Factor (MTBF) (15%)
—The contractor shall achieve a reliability factor goal of 60 days from dock to stock.
—The goal of the metric is to improve the time it takes for depot repair of assemblies.
—Defined as: Reliability Factor = Total days of down-time / Number of open and closed work orders

e. Operational Reliability Growth Factor (30%)2
—Aimed at improving operational reliability by reducing the false alarm rate.
—Contractor and government must plan for investments that will improve the false alarm rate.
—Defined as: Operational Reliability Growth Factor = Cost of False Alarm Resolution (Current Year) / Previous Year Investments in System Improvements

The minimum and maximum fee table can be determined as follows:

Incentive Score   Fee
95-100            15.0%
90-94             13.0%
85-89             10.0%
80-84              7.0%
75-79              5.0%
70-74              4.0%
65-69              3.5%
<65                3.0%

Results will be indexed in a table specific to each parameter, yielding a score for each:

System Readiness (Up Time) = 80
False Alarms = 50
System Maintenance Cost = 75
Reliability Factor = 80
System Manning Cost (including clearing alarms) = 30

Based on the results table and the weights for each performance indicator, a quarterly calculation of the performance fee would be as follows:

Incentive Score = (System Readiness Score (80) × 20%) + (False Alarm Score (50) × 25%) + (System Maintenance Cost (75) × 15%) + (Reliability Factor (80) × 15%) + (System Manning Cost (30) × 25%) = 59.25

Indexing this score into the fee table yields a fee of 3.0 percent of the available fee pool.

___________________________
1 A scale must be developed to determine the allocation of points for this metric (e.g., a lower system maintenance cost earns more points, thereby providing incentive to the contractor to institute methods to improve reliability and maintainability).
2 The assumption is that one year is required to realize the benefit of funds spent on system improvements. The expectation is that the ratio should be greater than or equal to 1.0 in order to justify the investment. A scale must be developed to determine the allocation of points for this metric, depending on the value of the ratio (with a higher ratio earning more points).
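The weighted-score and fee-lookup arithmetic in Box 4-1 can be checked with a short script. This is only a sketch of the committee's notional example: the function names and data layout are illustrative, and the scores, weights, and fee tiers are the notional values from the box (the weights follow the quarterly score calculation rather than the percentages listed beside the indicators, which Box 4-1 itself presents inconsistently).

```python
# Sketch of the Box 4-1 quarterly incentive calculation using the
# notional (score, weight) pairs and fee table from the box.
# Function names and data layout are illustrative only.

RESULTS = {
    "system_readiness":    (80, 0.20),
    "false_alarms":        (50, 0.25),
    "system_maintenance":  (75, 0.15),
    "reliability_factor":  (80, 0.15),
    "system_manning_cost": (30, 0.25),
}

# Fee table: (minimum incentive score for the tier, fee as % of pool).
FEE_TABLE = [(95, 15.0), (90, 13.0), (85, 10.0), (80, 7.0),
             (75, 5.0), (70, 4.0), (65, 3.5), (0, 3.0)]

def incentive_score(results):
    """Weighted sum of the indexed parameter scores."""
    return sum(score * weight for score, weight in results.values())

def fee_percent(score, table):
    """Look up the fee tier for a given incentive score."""
    for floor, fee in table:
        if score >= floor:
            return fee
    return 0.0

score = incentive_score(RESULTS)
print(round(score, 2))            # 59.25
print(fee_percent(score, FEE_TABLE))  # 3.0 (the "<65" tier)
```

Because the fee is a step function of the weighted score, small improvements in a heavily weighted metric (here, false alarms at 25 percent) can move the contractor into a higher fee tier, which is exactly the incentive the PBL structure is meant to create.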

1. Access to images from scans of bags within an airport setting,
2. Technical and financial requirement specifications,
3. Realistic explosive simulants,
4. Methods to identify and retire risk, and
5. Timely discussions with the evaluation board.

Ultimately this review process should lead to faster certifications at the TSL, to the benefit of both vendors and the TSA.

DATA COLLECTION AND ANALYSIS

In a typical PBL program, the government—with contractor input—establishes the data collection structure, processes, data repository, and training required for the implementation of data collection. For example, in the case of the Shadow program, the government-controlled Unmanned Aircraft System Performance Assessment System is the source-data repository for metric performance evaluation. The contractor participates in the training required to maintain the data collection processes and ensures that the data collection disciplines are implemented. The government maintains responsibility for the central data repository and provides the contractor with the levels of data access required to use the system for maintenance management, supply chain management, asset visibility, and data analysis. The contractor ensures that accurate data collection and analysis are entered into the data collection system to determine metric performance.

PROGRAM MANAGEMENT

The contractor provides management personnel for planning, organizing, scheduling, controlling, and directing all activities in a manner that supports the performance metrics contained in the contract. A PBL program plan is developed by the contractor to address schedules, resources, budgets, and other information required for program management.
In addition, the PBL program plan includes management planning, executive management summaries, change logs, functional budget allocation, contract data, program schedules, contract line item numbers, work-breakdown structure, control account managers, organizational charts, procurement planning, subcontract planning, facilities and capital-equipment planning, a work-breakdown-structure dictionary, cost performance forecasts, a cash-flow schedule, engineering planning, post-deployment software support planning, personnel planning, maintenance of action-item logs, security and safety requirements, project directives, risk management planning, and various program records. Box 4-2 describes how the PBL model might be employed in the aviation security setting.

Recommendation: In order to better capitalize on improvements and provide vendors with the necessary incentives to invest in research that will lead to better performance metrics, the TSA should consider adopting a different contract structure for the procurement and maintenance of the computed tomography-based explosive detection systems used for checked baggage, as well as for other screening technologies. One approach worth considering is performance-based logistics contracting, which is currently used by the Department of Defense.

BOX 4-2
Applying the Performance-Based Logistics Model in an Aviation Security Setting

Suppose that the combination of Transportation Security Administration (TSA) operating costs and the original equipment manufacturer (OEM) maintenance contract (together known as operation and maintenance [O&M] funding) covering the O&M of security screening equipment across domestic airports is approximately $100 million per year.1 Now suppose the OEM has a concept for system improvements that would reduce the false alarm rate by 5 percent. This reduction would decrease the need for personnel to clear false alarms, saving $10 million per year. The OEM has estimated the cost of the design, certification tests, and fielding of the modification to be $17 million, indicating a payback of 1.7 years. Based on the guidelines for selecting technologies for insertion, the TSA decides to fund the contractor to implement this improvement using O&M funding, because the change reduces sustainment costs.

A more distant future state of contracting for airport security services might evolve into a fee-for-service arrangement. In this contracting model, the government would own the security equipment and the contractor would operate it. Security concerns might limit the degree to which the government chose to implement this arrangement (for example, implementing it only at airports with lower threat ratings).

1 The numbers in this case have been made up to demonstrate how the model would work.

APPROACHES OTHER THAN PERFORMANCE-BASED LOGISTICS FOR PRODUCT DEVELOPMENT AND SYSTEM IMPROVEMENT

Original Equipment Manufacturer Research and Development

Traditionally, original equipment manufacturers have invested corporate profits and/or internal research and development (IR&D) funds in equipment modernization and reliability improvements. This funding is generally very limited, untimely, and difficult to secure.
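The payback figure in Box 4-2 follows from a simple, undiscounted calculation. As a minimal sketch (the dollar figures are the box's illustrative, made-up numbers, not real TSA data):

```python
# Undiscounted payback period for the hypothetical Box 4-2 upgrade.
# All dollar figures are illustrative, per the box's footnote.

def payback_years(upgrade_cost_m: float, annual_savings_m: float) -> float:
    """Years for annual savings to recoup a one-time upgrade cost."""
    return upgrade_cost_m / annual_savings_m

annual_savings_m = 10.0   # $M/year saved by clearing fewer false alarms
upgrade_cost_m = 17.0     # $M one-time: design, certification tests, fielding

print(f"{payback_years(upgrade_cost_m, annual_savings_m):.1f} years")  # 1.7 years
```

A real business case would discount future savings and price in certification risk; the sketch shows only how the 1.7-year figure in the box arises.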
The availability of this type of funding is generally contingent on approval of a business case for recouping the investment through subsequent sales to the government for fielding the modifications. The committee heard from various manufacturers6 that there have been many instances in which the government has not shown interest in fielding their corporately funded upgrades.

Corporate IR&D funding is generally a component of the overhead (burden) structure incorporated into a company's billing rates. The U.S. government recognizes the need to provide contractors with incentives to invest in product improvements, and the extent to which a company can include IR&D in its bid rates (while still remaining competitive) is pre-negotiated with the contractor. Therefore, in situations in which a company has elected to spend part of its IR&D funding on new baggage-screening technologies for use in airports, it is to the government's benefit to provide feedback regarding these initiatives so that the companies have some motivation to bring these investments to a successful conclusion. A joint long-term development plan between government and industry allows for systematic planning of upgrades to both existing systems and new technology developments.

6 Speakers from L-3 Communications and General Electric (GE) Security on February 12, 2009, and a speaker from Reveal Imaging on April 28, 2009.

University and Laboratory Research and Development

In parallel with investments by EDS manufacturers, researchers at universities, government laboratories, and other industrial companies have been studying improvements in automated threat-recognition algorithms, supported by private funding, government grants, and other contract sources. This research is not being conducted in coordination with any product development, and the committee saw no structure in place for these researchers to partner with either the government or manufacturers for the testing, evaluation, and, ultimately, fielding of these improvements. With the appropriate incentive system in place, it would be possible to foster continuous improvement by the EDS manufacturers by removing the impediments to cooperation with researchers.

Conclusion: The TSA lacks a structured plan for implementing improved EDSs that would give vendors an opportunity to plan research funding and priorities in accordance with the TSA plan.

Recommendation: The TSA should develop a plan to provide appropriate incentives not only for EDS vendors but also for third parties and researchers in academia in order to improve the overall performance of computed tomography-based EDSs, including their rates of false alarms. Incentives should be provided for both short- and longer-term improvements.

Recommendation: The TSA should develop a long-term strategy for the continuous improvement of performance. Involving all interested parties, including EDS vendors and users, would increase the probability that all stakeholders work toward the same goals.


On November 19, 2001, the Transportation Security Administration (TSA) was created as a separate entity within the U.S. Department of Transportation through the Aviation and Transportation Security Act. The act also mandated that all checked baggage on U.S. flights be scanned by explosive detection systems (EDSs) for the presence of threats. These systems needed to be deployed quickly and universally, but could not be made available everywhere. As a result, the TSA emphasized the procurement and installation of certified systems where EDSs were not yet available. Computed tomography (CT)-based systems became the certified method, or place-holder, for EDSs. CT systems cannot detect explosives but instead create images of potential threats that can be compared to criteria to determine whether they are real threats. The TSA has placed great emphasis on high detection rates in order to reduce false negatives, or missed detections. As a result, there is an abundance of false positives, or false alarms.

In order to get a better handle on these false positives, the National Research Council (NRC) was asked to examine the technology of current aviation-security EDSs and the false positives produced by this equipment. The ad hoc committee assigned to this task examined and evaluated the causes of false positives in the EDSs, assessed the impact of false-positive resolution on personnel and resource allocation, and made recommendations on reducing false positives without increasing false negatives. To complete its task, the committee held four meetings, during which it observed security measures at the San Francisco International Airport and heard from employees of the Department of Homeland Security (DHS) and the TSA.
Engineering Aviation Security Environments—Reduction of False Alarms in Computed Tomography-Based Screening of Checked Baggage is the result of the committee's investigation. The report includes key conclusions and findings, an overview of EDSs, and recommendations made by the committee.