Applicability of Portable Explosive Detection Devices in Transit Environments (2004)
National Academies of Sciences, Engineering, and Medicine. Washington, DC: The National Academies Press. doi: 10.17226/23367.

CHAPTER 3

TEST PROCEDURES AND RESULTS

3.1 FIELD-OPERATIONAL TESTING

Procedures were used in the preliminary field trials to test and evaluate the performance specifications, including the following:

• Detection probability;
• False alarm rate;
• Throughput rate; and
• Size, weight, and support requirements.

The goal of the field evaluation was to measure the effectiveness of the EDD under normal conditions and to determine whether it was suitable for the transit environment. Ease of use, operator interface, throughput, sensitivity, and reliability were important attributes that were monitored. A key goal of this study was to examine the false alarm rate when the system was deployed in a transit environment. During field testing, a false negative was recorded when a contaminated sample did not alarm; conversely, a false positive was recorded when an alarm sounded on an uncontaminated sample.

Sampling included examining handles and zippers that had been contaminated with trace explosives. During these preliminary field tests, the detection system was challenged with articles contaminated with explosive simulant and articles not contaminated with trace explosives. A record was made of the detection/no-detection reading, as well as the humidity, temperature, location, and amount of time taken to examine the articles.
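The false negative and false positive counts map directly onto the performance measures listed above. As a minimal illustration (a Python sketch, not the study's own analysis tooling), the two headline measures can be computed from raw counts; the example figures are the Test Site A results reported later in Section 3.2.1:

    def detection_probability(correct_alarms: int, contaminated_samples: int) -> float:
        """Fraction of contaminated samples that correctly produced an alarm."""
        return correct_alarms / contaminated_samples

    def false_alarm_rate(false_positives: int, samples_tested: int) -> float:
        """Fraction of all swabbed samples that alarmed incorrectly."""
        return false_positives / samples_tested

    # Test Site A (Section 3.2.1): 43 correct alarms on 51 contaminated
    # samples; 3 false positives across 510 swabbed samples.
    print(f"Detection probability: {detection_probability(43, 51):.0%}")  # 84%
    print(f"False alarm rate: {false_alarm_rate(3, 510):.1%}")            # 0.6%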
3.1.1 Demonstration Team

The test team consisted of two people. The team leader was responsible for recording all data, including the times, and for contaminating the samples. The device operator's role was to set up the uncontaminated samples, perform the tests, and operate the portable EDD.

3.1.2 Training

A manufacturer's representative trained the field test team 2 months before field testing began. The training session lasted roughly 4 hours and covered a basic overview of IMS and trace-particle detection technologies, demonstrations of the device, and instructions for maintenance. The manufacturer's representative performed demonstrations of the device using swabs contaminated with the supplied verification standard stick. Following these demonstrations, the field test team performed multiple tests using the recommended procedures. The instructor then gave instructions for basic maintenance. The device is designed for modular repair by the user: it has nine modules, all of which can be repaired by the user except the IMS module, which contains the radioactive source and must be sent back to the company for repair. At the conclusion of training, the instructor was briefed on the purpose of the study and answered questions regarding disruptions that might occur during testing, including possible false positives from combusted diesel fuel.

The organized training offered by the manufacturer for operating the device is a 1-day, instructor-led course. The course is divided into 12 individual modules covering such topics as a review of narcotics and explosives, trace-particle detection technology, IMS, setup and start-up, collection and analysis, analysis of results, and basic maintenance of the device.

3.1.3 Field Test Sites

The onsite testing was conducted at three major transit locations within the United States, referenced here as Test Sites A, B, and C. The criteria for transit site selection included system age, location, climate, and the types of available systems (bus, light rail, subway, regional rail, and so forth). Collectively, the transit sites used to test portable EDDs were representative of the range of potential applications and reflected the nature of the perceived threat to transit systems. Specifically, the selected sites included diesel and compressed-natural-gas bus maintenance yards; diesel and electric rail (including regional rail, subway, light rail, and street trolley); parking facilities (including garage and underground facilities); access points to transit operations (e.g., turnstiles, escalators, tunnels, and platforms); and other areas that might harbor suspicious packages.

All of the test sites provided the test team access to their facilities with escorts, who were transit system employees, including field personnel and law enforcement. The transit agency escorts provided timely and knowledgeable information about topics such as typical bomb procedures, identification of test sites, entry to secure areas, and possible deployment uses of portable EDDs. Work permits and/or badges were also provided to the test team to ensure access and safety.

The team tested a minimum of 50 sites within each of the three systems. A single station typically contained multiple test sites, such as platform, street, and mezzanine locations. Testing was done primarily in public areas, although the transit agency escorts allowed the test team access to restricted areas such as maintenance closets, control offices, and areas along the tracks. The restricted sites were viewed as potential hiding places for suspicious packages and were therefore included in the testing. Sites with the potential to adversely affect the results of the detection device were of particular interest and were widely represented (e.g., sites near cleaning closets, bus stops, exhaust vents, and so forth). The test team drew on the transit agency representatives' knowledge of their facilities when selecting test sites within each agency. The hours of testing ranged from 8 a.m. to 6 p.m.

3.1.4 Dry Transfer Strips

In this study, test articles were contaminated with small, yet quantifiable, quantities of actual explosives. This was accomplished using dry transfer strips prepared by the FAA William J. Hughes Technical Center. These strips consist of Teflon coated with very small but precisely known amounts of actual explosives. The strips were prepared by dissolving known quantities of the explosives of interest in a solvent, pipetting the liquid onto the surface of a Teflon strip, and allowing the solvent to evaporate. To contaminate a test article, the dry transfer strip is rubbed along the surface of the article. Because the explosive rests on Teflon, the trace material transfers easily; an estimated 95% of the trace explosive is transferred to the test article. The dry transfer strips were prepared with 10-ng and 50-ng samples of Semtex (a mixture of PETN and RDX) and ammonium nitrate. The sample quantity was based on the advertised instrument sensitivity of 10 ng. Although the test strips are coated with real explosives, the technique is safe in the field because of the extremely low quantities involved (1 ng = 10⁻⁹ g); the strips are harmless and pose no threat to the test team or to commuters.
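As a quick arithmetic check on what these strips actually deliver, the sketch below assumes the simple proportional model implied by the quoted 95% transfer rate:

    # Estimated explosive mass deposited on a test article, assuming the
    # simple proportional model implied by the quoted ~95% transfer rate.
    TRANSFER_RATE = 0.95

    for strip_ng in (10, 50):
        deposited_ng = TRANSFER_RATE * strip_ng
        print(f"{strip_ng} ng strip -> ~{deposited_ng:.1f} ng on the article")

    # A 10-ng strip therefore deposits roughly 9.5 ng, already below the
    # device's advertised 10-ng sensitivity before any collection losses.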
3.1.5 Warm-Up

The operator's manual for the portable EDD being tested states that the warm-up time is less than 15 minutes, and the test team confirmed that this was true most of the time. At the last test site, however, most mornings required an extra 10 minutes (a total of 20-25 minutes) to clear nitrate (NO3) from the system before the EDD reached READY mode. The presence of nitrate could have been due to leftover contamination from the previous day's testing or to an abundance of nitrate in the air.

3.1.6 Verification

The test leader conducted operational checks of the equipment prior to testing using the verification standard stick provided with the device. The stick, resembling a crayon, consists of a wax-based substance containing trace amounts of various explosives (TNT, RDX, and PETN). A clean swab was first inserted into the system to ensure that the system was free of any contamination. If the clean swab did not cause the device to alarm, the device was considered ready for the confidence test. A small amount of the verification standard was then applied directly to a blank swab, which was placed into the device. If a "Verific" alarm was observed, the device was considered operational, and a clean blank swab was inserted to clear the device. If the "Verific" alarm failed to appear, the process was repeated; if the expected result still did not appear, the device was recalibrated and the process repeated. During the field testing, a "Verific" alarm was always observed on the first try.
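This daily check amounts to a simple check-and-retry loop. The sketch below summarizes it; the Device class is a hypothetical stand-in, not the instrument's actual interface:

    import random

    class Device:
        """Hypothetical stand-in for the portable EDD (not the real API)."""
        def analyze(self, swab: str) -> str:
            # Stub behavior: the verification standard should trigger "Verific".
            if swab == "verification":
                return "Verific" if random.random() < 0.95 else "NO ALARM"
            return "NO ALARM"  # clean and blank swabs should not alarm

        def recalibrate(self) -> None:
            pass  # placeholder for the device's recalibration routine

    def confidence_test(device: Device, max_attempts: int = 3) -> bool:
        # Step 1: a clean swab must not alarm before the confidence test.
        if device.analyze("clean") != "NO ALARM":
            return False  # residual contamination; clean the system first
        # Step 2: a swab carrying the verification standard should alarm.
        for _ in range(max_attempts):
            result = device.analyze("verification")
            device.analyze("blank")  # blank swab inserted to clear the device
            if result == "Verific":
                return True  # device considered operational
            device.recalibrate()  # recalibrate and repeat on failure
        return False

    print("Ready for field testing:", confidence_test(Device()))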

3.1.7 Field Test Supplies and Setup

The following supplies were used during the field tests:

• Portable EDD
• Isopropyl alcohol swabs (for cleaning)
• Verification standard stick
• Swabs
• Latex gloves
• Extension cords
• Thermometer and barometer
• Handles and zippers
• Plastic bags
• Stopwatch and clock

In false alarm testing using X-ray detection systems, the threat trace explosives are often moved from bag to bag within the sample set. This procedure ensures that the operator never knows which bag contains the threat, which reduces bias in the tests. However, particle- and vapor-based detection systems look for trace amounts of residual material, so moving the trace explosives from bag to bag is not a recommended procedure: the check source can leave a residue in a bag from which it has been removed, and all the test items could rapidly become contaminated. Therefore, for this study, the test area was prepared by placing two large plastic bags on the ground to ensure that the test area itself was not contaminated. On the first bag, the team placed nine uncontaminated samples. The contaminated samples used for the tests were created by wiping a handle or zipper with a dry transfer strip that had been treated with a small amount (10 ng or 50 ng) of Semtex or ammonium nitrate (see Figure 10). The two sets of samples were kept at least 2 feet apart to avoid cross contamination.

[Figure 10. Dry transfer strip contaminating a handle.]

At the request of the hosting transit authority, the device was checked before each day of testing to ensure that the audio alarm was disabled and only the visual alarm was enabled. This was done to avoid alarming commuters by having the device set off audible alarms throughout the system.

3.1.8 Test Procedure

The testing procedure began with the setup of the site as depicted in Figure 11. Two bags were placed on the ground; nine uncontaminated samples were placed on one bag and one contaminated sample on the other. The contaminated samples were created by wiping them with a dry transfer strip onto which a small amount (10 ng or 50 ng) of Semtex or ammonium nitrate had been evaporated. Setup time was not included in the overall test duration.

[Figure 11. Typical setup for field testing.]

Upon completion of setup, the test leader signaled the beginning of time one (t1), and the device operator, wearing latex gloves, randomly selected one of the 10 samples to swab. The standard swabbing technique was for the device operator to hold the swab in one hand and place the middle finger over the center of the notches cut out of the swab along three of its edges. The majority of the handle or zipper was rubbed with pressure to remove particles from the surface of the sample; this is similar to the screening the TSA does at airports. There were four separate instrument operators across the three site visits, each using different pressure and varying speed during swabbing. Multiple operators were used to better represent real-world conditions, in which a single EDD may have multiple users. Next, the swab was inserted into the detection device's sampling slot, thereby ending time one (t1). The side of the swab on which the sample was collected must face the unit. Time two (t2) began directly after t1, when the swab was inserted into the device; it represents the time the EDD takes to analyze the swab, purge the air inside, and display the READY signal, at which point t2 ends. The total test duration was the time it took to complete all 10 tests.

As shown in the decision tree in Figure 12, the NO ALARM and ALARM results invoked different procedural actions depending on whether the sample was contaminated. The four results recorded during testing were ALARM, NO ALARM, FALSE NEGATIVE, and FALSE POSITIVE.

[Figure 12. Procedural decision tree for field testing.]

3.2 TEST RESULTS

Once a result was determined and displayed by the EDD, the data were recorded. Documentation of the test included detailed descriptions of test procedures; devices tested; test sets used; ambient conditions (e.g., temperature, humidity, and physical features of the test environment); and test results. The test team created a data collection spreadsheet in which the information was recorded manually in the field.
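The decision tree in Figure 12 reduces to a two-factor classification of each swab. A minimal sketch (not the team's actual logging tool):

    def classify_result(contaminated: bool, alarmed: bool) -> str:
        """Score one swab against the Figure 12 decision tree."""
        if contaminated:
            return "ALARM" if alarmed else "FALSE NEGATIVE"
        return "FALSE POSITIVE" if alarmed else "NO ALARM"

    # Example: a contaminated zipper that failed to trigger the device.
    print(classify_result(contaminated=True, alarmed=False))  # FALSE NEGATIVE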

An example of the data collection spreadsheet is included in Appendix C. The data from the spreadsheet were entered into a test database so that the results from the test sites could be synthesized and analyzed. The analysis focused on identifying statistically significant factors that affect the performance of portable EDDs in transit applications and on assessing the potential of these devices for operational deployment, including potential effects on routine transit operations. For assistance in deciphering the results, consult the glossary of explosives acronyms in Appendix A.

A total of 156 tests were performed across the three test sites, each test consisting of nine uncontaminated samples and one contaminated sample (i.e., handles or zippers). In the "Notes" section of the data collection spreadsheet, the test leader recorded the general location of the test and a description of the environment, paying specific attention to any conditions that could cause false alarms. Possible disruption sites were identified by the observation team in consultation with local transit officials, based solely on the sites' proximity to areas that could have excessive levels of certain chemicals, such as nitrates. Nitrates may originate from sources such as cleaning agents; hair products; ink from printers and copiers; paints; shoe polish; combusted diesel fuel; and fertilizers. Each of these possible nitrate sources was observed at at least one of the three transit agencies. Possible areas of disruption were identified so that the team could determine whether those areas caused an abundance of false alarms. Descriptions of the general locations, along with a summary of areas that could cause disruptions, are listed below.

• Platform—the landing alongside railroad tracks where commuters convene to wait for ground electric rail or diesel-powered regional rail.
  Possible disruption areas: on the platform next to a maintenance room or cleaning closet, or near any cleaning agents identified by smell or sight; also, any testing performed on a regional rail platform where a diesel engine train was idling.

• Mezzanine—the area located before the platform, usually separated by turnstiles, stairs, or escalators; this area may include fare machines, telephones, or public service areas.
  Possible disruption areas: on the mezzanine next to a maintenance room or cleaning closet, or near any cleaning agents identified by smell or sight.

• Street level—the area outside the transit system, usually near the entry point from street level.
  Possible disruption areas: close to a street with heavy vehicle traffic or near active construction sites.

• Bus depot/stop—any bus stop, depot, or maintenance yard.
  Possible disruption areas: locations where combusted diesel fumes are expelled from idling buses.

• On board a train—any publicly accessible portion of a ground electric rail train car.
  Possible disruption areas: none of the tests on board trains were designated as adverse testing environments.

• Other—most places not covered by the previous locations, such as public waiting areas, public buildings, courtyards, hallways/tunnels, loading docks, trackside areas, emergency exits, and storage or maintenance rooms within the system.
  Possible disruption areas: near hair salons, copy centers, or construction areas using heavy equipment.
Table 7 shows the number of tests performed at each of the locations listed above, the number of those tests that were performed in areas of possible disruption, the total number of false positives, and the number of false positives within areas of possible disruption. Sections 3.2.1, 3.2.2, and 3.2.3, which discuss the results at Test Sites A, B, and C, respectively, include detailed discussion of false positive alarms.

TABLE 7 Summary of general test locations within the three test sites

Location         | # of Tests | # with Possible Disruptions | % with Possible Disruptions | Total False Positives | False Positives at Possible-Disruption Locations
Platform         |  56 | 11 |  19.6% | 9 | -
Mezzanine        |  35 |  8 |  22.9% | 3 | 3
Street Level     |  27 | 10 |  37.0% | 5 | 5
Bus Depot/Stop   |  11 | 11 | 100.0% | 1 | 1
On Board a Train |   4 |  0 |   0.0% | - | -
Other            |  23 |  9 |  39.1% | - | -
TOTAL            | 156 | 49 |  31.4% | - | -
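The following sketch shows how per-test records of this kind could be rolled up into a Table 7-style summary (the rows are illustrative; the study's actual database schema is not described):

    from collections import defaultdict

    # Illustrative per-test records: (location, near possible disruption, false positives)
    tests = [
        ("Platform", False, 0),
        ("Platform", True, 1),
        ("Mezzanine", True, 1),
        ("Street Level", False, 0),
    ]

    summary = defaultdict(lambda: {"tests": 0, "disrupted": 0, "fp": 0})
    for location, disrupted, false_positives in tests:
        row = summary[location]
        row["tests"] += 1
        row["disrupted"] += int(disrupted)
        row["fp"] += false_positives

    for location, row in summary.items():
        pct = 100.0 * row["disrupted"] / row["tests"]
        print(f"{location}: {row['tests']} tests, {pct:.1f}% near possible "
              f"disruptions, {row['fp']} false positives")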

3.2.1 Test Site A

Test Site A was a relatively new and clean system. The test locations included compressed-natural-gas bus transit and maintenance yards, above- and below-ground electric rail transit, diesel regional rail, parking facilities, subway turnstiles, escalators, tunnels, platforms, mezzanines, and street-level entries to the subway.

Table 8 provides a summary of Test Site A results. The Number of Samples row shows the total number of contaminated and uncontaminated samples (i.e., handles or zippers) tested at the first test site. In total, 51 tests were completed, each consisting of one contaminated sample and nine uncontaminated samples. The test team used 10-ng and 50-ng strips of Semtex, as well as the verification standard stick provided by the device manufacturer, as contaminants for the test samples. The Alarms row shows the number of correct alarms out of the total number of samples that were expected to alarm (i.e., contaminated samples); a breakdown of the individual alarm types is also listed. As mentioned, false negatives occur when a contaminated sample does not alarm; conversely, false positives occur when an alarm trips on an uncontaminated sample. A false positive could occur on any of the contaminated or uncontaminated tests, but not on the blank swabs inserted to clean the device after an alarm. The Time 1 Average row shows the average time elapsed from swabbing the sample to receiving a result from the device. The Time 2 Average row shows the average time it took after t1 ended for the device to be ready to analyze the next sample. The Total Average Test Time row shows the sum of t1 and t2. Test Site A had an average test duration of 14 minutes; the average test duration is the time required to test all 10 samples and does not include setup time.

TABLE 8 Summary information for Test Site A

Number of Samples:       350 handles and 160 zippers
# of Test Locations:     51 locations
Sample Preparation:      10-ng & 50-ng Semtex dry transfer strips; verification standard stick
Alarms:                  43 of 51 (84%)
Alarm results:           C4/RDX - 16 alarms; SEMTEX - 12 alarms; PETN - 10 alarms; NG & PETN - 3 alarms; TNT & PETN - 2 alarms
False Negatives:         8 of 51 contaminated samples (16%)
False Positives:         3 of 510 tests (0.6%)
Time 1 Average:          27 seconds
Time 2 Average:          15 seconds
Total Average Test Time: 44 seconds
Average Test Duration:   14 minutes
Average Temp:            73.4°F
Average Humidity:        49.7%

3.2.1.1 False Negatives

Eight false negatives were recorded during testing at Test Site A. As noted in Table 9, six of the eight false negatives (75%) were recorded during testing with zippers. The provider of the dry transfer strips, the FAA William J. Hughes Technical Center, states that the transfer rate of a dry transfer strip is about 95%. However, because the zipper's material was much smoother than the handle's, it is possible that the transfer rate of the trace explosive from the dry transfer strip to the zipper was much lower than the transfer rate to the coarse handle. A decreased transfer rate of the trace explosive to the zippers may have contributed to an increase in the number of false negatives. The total percentage of false negatives for the 10-ng tests was 67%, compared with only 13% for the 50-ng tests. As mentioned, the advertised sensitivity limit of the device is 10 ng. The team presumes that three factors may have contributed to the false negatives: (1) the known loss of explosive particles in the transfer from the dry transfer strip to the sample; (2) the possibility that the smooth surface of the zipper did not collect explosives as well as the handle during the contamination process; and, most notably, (3) the demonstration team was using the advertised minimum detection level of the device during some of the testing (i.e., 10 ng of Semtex).

TABLE 9 False negative summary for Test Site A

Sample Description    | # of Samples Contaminated | # of False Negatives | % False Negatives
10-ng Semtex - Handle |  3 | 1 |  33%
10-ng Semtex - Zipper |  3 | 3 | 100%
TOTAL - 10 ng         |  6 | 4 |  67%
50-ng Semtex - Handle | 17 | 1 |   6%
50-ng Semtex - Zipper | 13 | 3 |  23%
TOTAL - 50 ng         | 30 | 4 |  13%
Stick - Handle        | 15 | 0 |   0%
Stick - Zipper        |  0 | 0 |   0%
TOTAL - Stick         | 15 | 0 |   0%

3.2.1.2 False Positives

In total, three false positives out of 510 tests (0.6%) were recorded during testing at Test Site A, all of which were nitroglycerin alarms at the same test location within the system. This location was on the mezzanine level, in the middle of a long hallway between the street and platform escalators. Testing took place underneath a fire-extinguisher case and next to a water drain. There was sporadic commuter traffic as subway trains entered and exited, but there were no unique characteristics of this test location that the team felt would warrant false positives. However, the test team was using its last pair of latex gloves for this test and therefore was not able to follow the standard procedure of changing gloves after each alarm. After purchasing new latex gloves, the test team returned to the same location to perform another test to see if the false positives continued. The second test at this location returned no false positives.

3.2.2 Test Site B

Test Site B included some of the oldest and most diverse testing locations. Specifically, this site included diesel-bus transit and maintenance yards; diesel and electric rail (including regional rail, subway, light rail, and street trolley); parking facilities (including garage and underground facilities); access points to transit operations (e.g., turnstiles, escalators, tunnels, and platforms); and other areas surrounded by merchants, restaurants, and other public services. As shown in Table 10, the test team used 10 ng and 50 ng of Semtex to contaminate samples while testing at Test Site B. Fifty tests were completed, each including nine uncontaminated samples and one contaminated sample. There was an increase in the number of false negatives at this test site, most of which occurred while testing at the device's minimum sensitivity level of 10 ng. Five false positives were recorded, all at the same location. The average test times and durations were consistent with those recorded at the first test site.

TABLE 10 Summary information for Test Site B

Number of Samples:       410 handles and 90 zippers
# of Test Locations:     50 locations
Sample Preparation:      10-ng & 50-ng Semtex dry transfer strips
Alarms:                  35 of 50 (70%)
Alarm results:           C4/RDX - 24 alarms; SEMTEX - 11 alarms
False Negatives:         15 of 50 contaminated samples (30%)
False Positives:         5 of 500 tests (1.0%)
Time 1 Average:          28 seconds
Time 2 Average:          14 seconds
Total Average Test Time: 43 seconds
Average Test Duration:   14 minutes
Average Temp:            68.3°F
Average Humidity:        39.1%

The transit authority at Test Site B provided the test team with an opportunity to observe its canine explosive detection capabilities. The canine testing took place in an enclosed transit authority locker room within the system. Handles and zippers, provided by the test team, were contaminated with 10 ng and 50 ng of Semtex and placed in locations such as closed lockers, closed desk drawers, and countertops. Along with the contaminated samples, the test team's zip-locked trash bag of used samples and dry transfer strips was also hidden. Two canine units were brought in at different times shortly after the items were hidden. Amid odors such as cigarette smoke and cooked food, the trainers instructed each of the dogs to search for explosives. Not only did each dog signal the presence of explosives by passively sitting at the locations of the hidden items, but, unexpectedly, the dogs also signaled the presence of explosives at the desktop where the test team had contaminated the handles and zippers minutes before. Even though the test team was not testing the "sniffing" capability of the technology-based detection device, it is believed to be unlikely that the device would have performed as accurately as the canine units.

3.2.2.1 False Negatives

As shown in Table 11, 10 of the 15 false negatives (67%) occurred while testing a sample contaminated with 10 ng of Semtex. Since 10 ng is the advertised minimum sensitivity of the machine, this outcome was expected. The analysis must also consider that the transfer rate from the dry transfer strip to either the zipper or the handle is not 100%. Once again, the numbers show that the zipper had a higher false negative percentage. The 15 tests that resulted in false negatives occurred at various locations: platforms, street and mezzanine levels, and inside tunnel walkways within the system. None of the locations appeared to pose environmental conditions that might have adversely challenged the device during testing. It can be concluded that many of the false negatives were due to the same issues discussed in Section 3.2.1, the most apparent being that the 10 ng of explosive used to contaminate the samples was pushing the minimum detection capabilities of the device.

TABLE 11 False negative summary for Test Site B

Sample Description    | # of Samples Contaminated | # of False Negatives | % False Negatives
10-ng Semtex - Handle | 11 |  8 |  73%
10-ng Semtex - Zipper |  2 |  2 | 100%
TOTALS - 10 ng        | 13 | 10 |  77%
50-ng Semtex - Handle | 30 |  3 |  10%
50-ng Semtex - Zipper |  7 |  2 |  29%
TOTALS - 50 ng        | 37 |  5 |  14%

3.2.2.2 False Positives

The device performed well at Test Site B with respect to false positives: 5 false positives were recorded out of the 500 samples tested (1.0%). All of these occurred at the same test location, during two separate test times, and all had the same false positive result of triacetone triperoxide (TATP). The test location was an outdoor street trolley stop next to a well-traveled intersection. Directly behind the covered trolley stop was an automobile shop, whose oil and transmission fluid dump tanks were roughly 15 feet from the test location. The team's escorts mentioned that the trolley stops are usually power washed every couple of weeks, but because of the extended winter and freezing temperatures, the stops may not have been cleaned for a couple of months. The trolley stops may also have had a buildup of deicer used for winter ice. Any one of these environmental abnormalities may have contributed to the false positive readings.

3.2.3 Test Site C

Test Site C included diesel and compressed-natural-gas bus transit and maintenance yards; diesel and electric rail (including regional rail, subway, light rail, and street trolley); parking facilities (including garage and underground facilities); access points to transit operations (e.g., turnstiles, escalators, tunnels, and platforms); and other areas that might harbor suspicious packages.

At Test Site C, on the final day of testing, the team decided to test the two identical loaned explosive detection devices together. This testing was not scheduled in the initial project scope, but it was seen as possibly answering a question that would factor into analyzing the final data. The purpose of the "dual" testing was to check the sensitivity variance between the two machines to ensure that one machine was not alarming more than the other. The dual device-testing operations were identical to the single-device testing, and overall the dual testing returned similar results from the two machines. Of the 55 tests performed, 10 were performed using both machines. Six tests resulted in false negatives: one by each machine during the same test, at three different locations. In conclusion, the devices were found to be calibrated similarly and produced similar results.

As shown in Table 12, the test team used 10-ng and 50-ng strips of ammonium nitrate and 50-ng strips of Semtex to contaminate samples while testing at Test Site C. Fifty-five tests were completed, each including nine uncontaminated samples and one contaminated sample. There was an increase in the number of false negatives at this test site, most of which occurred while testing with ammonium nitrate, for which the device's effective detection threshold is considerably higher than 10 ng (see Section 3.2.3.1). Ten false positives were recorded, six of them at the same general location (see Section 3.2.3.2). The average test times and durations were consistent with those recorded at the first test site.

TABLE 12 Summary information for Test Site C

Number of Samples:       500 handles and 50 zippers
# of Test Locations:     50 locations
Sample Preparation:      10-ng & 50-ng ammonium nitrate dry transfer strips; 50-ng Semtex dry transfer strips
Alarms (correct alarms): 26 of 55 (47%)
Alarm results:           C4/RDX - 5 alarms; SEMTEX - 6 alarms; PETN - 1 alarm; NITRATE - 14 alarms
False Negatives:         28 of 55 (51%)
False Positives:         10 of 550 tests (1.8%)
Time 1 Average:          32 seconds
Time 2 Average:          17 seconds
Total Average Test Time: 49 seconds
Average Test Duration:   15 minutes
Average Temp:            70.0°F
Average Humidity:        55.6%

The hosting transit authority at this test site also provided the team with an opportunity to challenge its canine unit's detection capabilities. This second instance of canine testing took place at an open-air bus stop in the middle of a breezy day. Multiple handles and zippers contaminated with 50 ng of ammonium nitrate, as well as uncontaminated samples, were hidden from the canine unit. Samples were hidden on support beams, behind and under benches, and underneath small rocks. After the test team signaled that it was ready, the trainer instructed the canine to begin its search for the contaminated samples. Once again, the canine unit passively alarmed correctly on each of the contaminated samples, this time in breezy outdoor conditions.

3.2.3.1 False Negatives

Table 13 shows that 23 of the 28 false negatives (82%) occurred while testing a sample contaminated with ammonium nitrate. As mentioned, the sensitivity levels of the instrument are around 10 ng for most plastic explosives. However, nitrate is a common compound, found in cleaning agents; hair products; ink from printers and copiers; shoe polish; combusted diesel fuel; and fertilizers. To avoid erroneous alarms, the manufacturer sets the device's alarm threshold for nitrates much higher to compensate for the abundance of nitrates in everyday environments. In other words, the device's minimum detection capability for ammonium nitrate is closer to 50 ng or 60 ng rather than the typical 10 ng. This explains the many false negatives recorded while testing with ammonium nitrate.

TABLE 13 False negative summary for Test Site C

Sample Description      | # of Samples Contaminated | # of False Negatives | % False Negatives
50-ng Semtex - Handle   | 12 |  3 | 25%
50-ng Semtex - Zipper   |  4 |  2 | 50%
2x50-ng Semtex - Handle |  1 |  0 |  0%
TOTALS - Semtex         | 17 |  5 | 29%
50-ng AN - Handle       | 32 | 21 | 66%
50-ng AN - Zipper       |  1 |  0 |  0%
2x50-ng AN - Handle     |  5 |  2 | 40%
TOTALS - AN             | 38 | 23 | 61%

3.2.3.2 False Positives

There were 10 false positives recorded out of the 550 samples tested (1.8%), 6 of which occurred in the same general location during the first two tests at this site, all with the same test result of TATP. The first two tests were performed at opposite ends of the same platform. There was no unique characteristic of the location that the team felt would warrant a false positive. The team revisited the site 4 days later to see if the same results would occur; this time there were no false positives. During the second visit to this location, the team ventured outside to further investigate the environment for clues to the initial false positive results. Directly above the station was a large field that had recently been mowed, and in the middle of this field were air exchange vents for the subway platform. It is assumed that nitrates from the landscape flowed to the platform through the air exchange vents after the field was mowed. During testing at street level next to a bus stop at this location, the team recorded one additional false positive of TATP. This false positive occurred while testing a sample contaminated with 50 ng of ammonium nitrate; however, because the TATP result was incorrect, the test was scored as a false positive.
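Pulling the three site summaries together, the sketch below computes per-site and overall detection and false-positive rates from the figures in Tables 8, 10, and 12 (an illustrative roll-up, not the study's own analysis code):

    # Headline figures from Tables 8, 10, and 12.
    sites = {
        "A": {"alarms": 43, "contaminated": 51, "false_pos": 3,  "samples": 510},
        "B": {"alarms": 35, "contaminated": 50, "false_pos": 5,  "samples": 500},
        "C": {"alarms": 26, "contaminated": 55, "false_pos": 10, "samples": 550},
    }

    for name, s in sites.items():
        print(f"Site {name}: detection {s['alarms'] / s['contaminated']:.0%}, "
              f"false positives {s['false_pos'] / s['samples']:.1%}")

    # Aggregate across all three sites: 104/156 detections (~67%) and
    # 18/1,560 false positives (~1.2%).
    totals = {k: sum(s[k] for s in sites.values()) for k in next(iter(sites.values()))}
    print(f"Overall: detection {totals['alarms'] / totals['contaminated']:.0%}, "
          f"false positives {totals['false_pos'] / totals['samples']:.1%}")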
