APPENDIX B

Summary Report for the CAPTA Pilot Test with Maryland DOT, October 17, 2007

Attendees:
Kevin Duffy, SAIC
John Contestabile, MDOT
Matthew Basset, MDOT

Summary of Activity

On October 17, 2007, a pilot test of the CAPTA methodology was conducted with the Maryland Department of Transportation (MDOT), using asset data supplied by MDOT. MDOT had previously been provided with a pilot test schedule describing the content of the test and the anticipated feedback. Using the data provided, MDOT was led through the fourteen-step process. As designed, CAPTA winnowed the data fields via user-selected consequence thresholds, and the summary report listed assets and asset classes that were likely to be of value to the agency.

User Feedback

The MDOT users suggested the alterations, additions, and modifications to the CAPTA model listed in the table on the following page. The comments appear in the center column; the right column indicates the status of each comment after discussion by the project team.

Summary Result

The MDOT pilot test site was intentionally chosen because of the state's high degree of involvement in the asset vulnerability area and because Maryland is one of the few states in which the DOT controls all transportation aspects, including airline terminals. The lengthy comment and recommendation list from Maryland was therefore expected. Some comments, such as #1 through #7, were already planned for the final revision of the CAPTA model. Comment #14 was intriguing, as the same issue had been discussed by the project team and tabled pending completion of the pilot tests.
CAPTA Final Report

No. | Comment Description | Current Status

1 | Need to relate and make consistent all screen backgrounds and colors. Recommended screen colors of grey background and yellow categories, as in the slide labeled Results Summary. | Accept
2 | Need to install pop-up screens (comments) to explain all terms in the spreadsheets. They would appear when the cursor passed over a term and disappear when the cursor moved on. | Accept
3 | Need to spell out all cells rather than use shorthand or just letters. Desired to see the whole screen being used. | Accept
4 | Need to use consistent orientation of threat/hazards and asset categories: assets should always appear on either the x axis or the y axis. Currently, they change from sheet to sheet. | Accept
5 | Need to insert the directions box in a consistent place across sheets, such as always placing it in the top left corner. | Accept
6 | Add a line to instruction boxes noting the purpose of the screen. | Accept
7 | Need to array all buttons along the top right of the screen. | Accept
8 | Add a mission importance toggle to Ferries, Buildings, and Fleets. The recommendation has merit given that some agencies have only bus fleets to operate, with limited hard infrastructure. | Accept
9 | Add an asset category for "Operations Control Centers" to set them apart from plain office buildings. | Accept
10 | Clarify the definition of mission importance as importance to the agency or to the state. This change can be explored in the paper writeup. | Accept
11 | Buildings should be broadened to include airport terminals. | Accept
12 | Consider CAPTA for use by State Homeland Security Administrators seeking a way to determine funding across modes. | Accept
13 | Change colors for countermeasure effectiveness: use red for highly effective and orange for medium effective. | Accept
14 | Break CAPTA into two sections. The first piece would move the user through only the following screens: 1. Relevant Risks; 2. Threshold; 3. Yellow input tabs; 4. Critical Assets; 5. CM Opportunity; 6. Results Summary. All other screens can be reached through user-pressed buttons, if the user chooses to go into that amount of detail. The idea is that the user does a quick run-through of CAPTA, accepting all of the calculations embedded in the system. After this first pass, the user can go back and tinker with the explanations. | Accept