1 Introduction
The U.S. Congress asked the National Research Council (NRC) of the National Academies to review and assess the activities of the Department of Homeland Security (DHS) related to risk analysis (P.L. 110-161, Consolidated Appropriations Act of 2008). Subsequently, a contract featuring the Statement of Task in Box 1-1 was agreed upon by the National Academies and DHS officials to support this study. A committee was appointed in October 2008 to carry out the study.
Elements (a)-(c) of this task are intertwined, because the capability of risk analysis methods to “represent and analyze risks” and to “support … decision-making” is inherent in any evaluation of the quality of risk analysis. Therefore, the committee addressed these three task elements as multiple lenses through which to examine the “committee-selected sample of models and methods,” and it interpreted task (a) as the overarching goal of the study, with tasks (b) and (c) as particular points of emphasis. All three of these task elements were addressed through careful examination of an illustrative set of models and methods (see below), because it would be impossible to review the scores of DHS risk models and processes in a timely fashion. Through this sampling approach, the committee was exposed to major risk analysis activities across DHS and saw many commonalities. Although DHS is responsible for a range of threats to homeland security, including terrorism, natural disasters, and pandemics, its risk analysis efforts are heavily weighted toward terrorism, and that balance is reflected in this report.

BOX 1-1
Statement of Task

The study will review how DHS is building its capabilities in risk analysis to inform decision-making. More specifically, the study will address the following tasks:
Since its formation in 2002, DHS has espoused the principle of risk-informed decision making. The current DHS Secretary underscored the importance of risk analysis as follows:
Development and implementation of a process and methodology to assess national risk is a fundamental and critical element of an overall risk management process, with the ultimate goal of improving the ability of decision makers to make rational judgments about tradeoffs between courses of action to manage homeland security risk.1
For the purposes of this study, the committee accepted the definition of “risk analysis” found in the glossary of the Society for Risk Analysis:
A detailed examination including risk assessment, risk evaluation, and risk management alternatives, performed to understand the nature of unwanted, negative consequences to human life, health, property, or the environment; an analytical process to provide information regarding undesirable events; the process of quantification of the probabilities and expected consequences for identified risks.2
In contrast to some definitions, this version does not explicitly include risk perception and risk communication, though both are clearly important elements if risk analysis is to be effective.
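The quantitative core of this definition, the “quantification of the probabilities and expected consequences for identified risks,” can be sketched in a few lines. The scenarios and numbers below are hypothetical and serve only to illustrate the arithmetic; they are not drawn from any DHS analysis:

```python
# Minimal illustration of risk as probability-weighted consequence.
# All scenario probabilities and consequence values are invented.
scenarios = {
    "flood":      {"annual_probability": 0.01,  "consequence": 50_000_000},
    "earthquake": {"annual_probability": 0.002, "consequence": 200_000_000},
}

def expected_consequence(s):
    """Expected annual loss for one scenario: probability times consequence."""
    return s["annual_probability"] * s["consequence"]

total_risk = sum(expected_consequence(s) for s in scenarios.values())
for name, s in scenarios.items():
    print(f"{name}: expected annual loss = ${expected_consequence(s):,.0f}")
print(f"total expected annual loss = ${total_risk:,.0f}")
```

Summing expected losses across identified scenarios gives an aggregate risk figure, which is the simplest form of the quantification the definition calls for.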
THE DECISION-MAKING CONTEXT FOR THIS STUDY
The Statement of Task emphasizes the role of risk analysis as support for decision making. Risk analysis is not done in a vacuum; it is framed according to the decisions it will inform, and the results are made available in the form needed by the decision makers.
At DHS, risk analysis is used to inform decisions ranging from high-level policy choices to fine-scale protocols that guide the minute-by-minute actions of DHS employees. To illustrate these different levels of decision making, a policy decision related to our borders might call for strengthening the borders. With that policy in place, decisions might include setting the level of resources to be allocated to U.S. Customs and Border Protection (CBP) and deciding which border segments require extra attention. Finer-scale decisions might choose from among different options for upgrading enforcement along those segments—choosing, for example, from among different combinations of staffing, surveillance, biometrics, and so on. Finally, decision rules must be created for triggering extra checks and deciding when to pursue enforcement actions.

1. Janet A. Napolitano, Terms of Reference, 2009 Quadrennial Homeland Security Review.
2. See http://www.sra.org/resources_glossary_p-r.php. Accessed January 22, 2010.
The committee focused its attention on risk analysis that informs the middle part of that spectrum, whether the decision making is done within DHS or at a DHS partner entity that actually “owns” and manages a given risk. This focus is in large part because the risk analyses that contribute to decision rules for routine operations and for major policy choices are especially tempered by non-technical aspects such as public perception and privacy, which, while not at all undermining the importance of solid risk analyses, do complicate an external review of the process that led to those rules. By contrast, the range of decisions on which the study focused could see the greatest improvement if risk analysis is strengthened. Improving the quality of risk analysis in general will also lead to better inputs for policy and decision rules for routine operations.
Non-routine decisions, such as how to respond to a particular threat situation or how to prepare security for a major national event, usually are unique in character, requiring special approaches that cannot be anticipated. These preparations are driven more by the experience of security experts than by any risk analysis that the committee would be able to examine a priori. Nevertheless, some of the principles set forth in this report should be of value to those decisions as well.
Risk analysis is just one input to decision making, although it is an essential one. Yet ultimately decisions are made by risk managers, who must overlay the analyses with considerations of a pragmatic, political, or other character. Risk analysis does not make decisions, it informs them: the analysts cannot build a calculus that balances all relevant considerations. However, this is not to say that risk analysis and risk management (decision making) are, or should be, in separate compartments or stovepipes. Instead, those functions should engage in back-and-forth interplay. Analysts need to have a clear understanding of the decisions to be made and the considerations beyond analysis that will be folded in. Decision makers must have a good understanding of the capabilities and limitations of risk analysis: indeed, it is the responsibility of risk analysts to ensure that they do. The emphasis of the Statement of Task on informing and supporting decision making, and its mention of “outreach and communications” reflect that interplay. Whether management of a given risk is vested within DHS or handled elsewhere, it is essential that DHS risk analysis reach out to effect good risk management.
RISK MODELS AND METHODS EXAMINED IN DETAIL TO CARRY OUT THIS STUDY
Chapter 2 gives an overview of risk analysis at DHS, which is carried out with the help of—by DHS’s count—some 60 risk models and processes. At its first two meetings, the committee was briefed on approximately a dozen of those models and processes. Based on those briefings and the experience of its members, several site visits were planned, at which subsets of the committee learned about some of those models and processes, plus additional ones, in more detail. DHS acknowledged that some of the models and processes in its count were at an early stage of development and therefore not good illustrations of DHS’s capability in risk analysis. As stipulated in the Statement of Task, the committee selected an illustrative sample of risk models and methods to examine in detail in order to carry out the study’s evaluation. Its criteria were that the models and processes selected be at least somewhat mature; be documented to some extent; be used for a major DHS purpose rather than a niche application; and that the set collectively span the major DHS functions of infrastructure protection, support to first responders, transportation risks, and understanding the risks of weapons of mass destruction and of natural disasters. Guided by those criteria, the committee selected the following sample set:
- Risk analysis done by DHS with respect to natural hazards, as exemplified by the flood frequency estimates and floodplain maps that the Federal Emergency Management Agency (FEMA) produces to inform the National Flood Insurance Program.3 This is a mature process grounded in extensive historical data and commonly accepted statistical models.
- Threat, vulnerability, and consequence analyses performed for the protection of critical infrastructure and key resources (CIKR). This work is done in or for the DHS Office of Infrastructure Protection (IP), which carries out these analyses to inform decision making with respect to the nation’s CIKR assets. This is one of the major responsibilities assigned to DHS when it was established. The management of risks for CIKR is the responsibility of particular DHS components, other federal agencies, and many private owners and operators. Perhaps for that reason, IP does not generally integrate the pieces to develop risk analyses, but instead produces these component analyses. Much of the vulnerability analysis within IP is handled by Argonne National Laboratory, and much of the consequence analysis is handled by the National Infrastructure Simulation and Analysis Center (NISAC), a joint program of Sandia National Laboratories and Los Alamos National Laboratory.
- Risk models used to underpin those DHS grant programs for which allocations are based on risk. These programs are administered by FEMA within constraints set by Congress, and they have a broad and widespread effect on the nation’s preparedness. Key elements of the modeling are done by a contractor.
- The Terrorism Risk Assessment and Management (TRAM) tool. This is a mature software-based method for performing risk analysis primarily in the transportation sector. It has been in use at the Port Authority of New York and New Jersey for about a decade, so there is a good deal of experience to inform the committee’s evaluation of its quality and capabilities. The development of TRAM was initiated by the Port Authority, with initial funding from the Department of Justice, and the work has been done by a contractor. TRAM appears to share the general structure of other risk analysis tools such as the Maritime Security Risk Analysis Model (MSRAM) and the Risk Analysis and Management for Critical Asset Protection (RAMCAP) and RAMCAP Plus, although the committee did not examine the particular details of those tools.
- The Biological Threat Risk Assessment (BTRA). This is a large-scale, complex event-tree formulation created by a contractor with DHS funding. It is meant to inform decision making by the White House Homeland Security Council, the Department of Health and Human Services, and others. The general BTRA approach appears to be similar to the approach used for DHS’s Chemical Terrorism Risk Assessment (CTRA) model and its Integrated Chemical, Biological, Radiological, Nuclear (iCBRN) assessment, though the committee did not examine the details of these models to determine the degree of similarity. BTRA was the subject of a thorough NRC review (2008), captured in Department of Homeland Security Bioterrorism Risk Assessment: A Call for Change, from which the committee drew heavily.
- The Integrated Risk Management Framework (IRMF). The IRMF is not a particular risk model, but it fits within the category of “methods” in element (a) of the Statement of Task. The committee examined the IRMF as it is being developed by DHS’s Office of Risk Management and Analysis (RMA), which is working to coordinate risk analysis across the department. RMA’s development of the IRMF and supporting elements generally follows implementations of Enterprise Risk Management (ERM) in the private sector, aligning most closely with ERM practices found in nonfinancial services companies. The committee was told by RMA that the U.S. Coast Guard and Immigration and Customs Enforcement also practice, or are developing, similar ERM approaches within their component agencies, but it did not examine those efforts.
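The threat, vulnerability, and consequence analyses described above for critical infrastructure are often combined, in generic treatments of security risk, as the product R = T × V × C. The sketch below is a minimal illustration of that common decomposition, with invented assets and numbers; it is not the method actually used by IP, Argonne, or NISAC:

```python
# Generic threat x vulnerability x consequence decomposition, a common
# (and much simplified) form of infrastructure risk scoring.
# All assets and numbers are hypothetical.
assets = [
    # (name, P(attack attempted), P(success | attempt), loss if successful)
    ("bridge",     0.05, 0.30, 1_000_000_000),
    ("substation", 0.10, 0.60,   100_000_000),
]

def risk_score(threat, vulnerability, consequence):
    """Expected loss under the multiplicative T x V x C model."""
    return threat * vulnerability * consequence

# Rank assets by expected loss to prioritize protective investments.
ranked = sorted(assets, key=lambda a: risk_score(*a[1:]), reverse=True)
for name, t, v, c in ranked:
    print(f"{name}: expected loss = ${risk_score(t, v, c):,.0f}")
```

Ranking by expected loss is what makes such scores usable for allocation decisions, though the multiplicative form embeds strong independence assumptions among the three factors.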
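The event-tree formulation noted for the BTRA can be illustrated in miniature. An event tree enumerates branching outcomes, multiplies conditional probabilities along each path, and weights each end state’s consequence by its path probability. The toy tree below uses invented probabilities and consequences and is vastly simpler than the actual BTRA:

```python
# Toy event tree: each branch is a (label, conditional probability, subtree)
# triple; a leaf is a bare consequence value. All numbers are invented.
# Path probability = product of conditional probabilities along the path.

def enumerate_paths(tree, prob=1.0, path=()):
    """Yield (path, probability, consequence) for every leaf of the tree."""
    if isinstance(tree, (int, float)):        # leaf: consequence value
        yield path, prob, tree
        return
    for label, p, subtree in tree:
        yield from enumerate_paths(subtree, prob * p, path + (label,))

tree = [
    ("attack attempted", 0.1, [
        ("interdicted", 0.7, 0),              # no consequence
        ("not interdicted", 0.3, [
            ("small release", 0.8, 1_000),
            ("large release", 0.2, 100_000),
        ]),
    ]),
    ("no attack", 0.9, 0),
]

expected = sum(p * c for _, p, c in enumerate_paths(tree))
for path, p, c in enumerate_paths(tree):
    print(" -> ".join(path), f"p={p:.4f}", f"consequence={c}")
print(f"expected consequence = {expected:.1f}")
```

Because every end state is reached by exactly one path, the path probabilities sum to 1, and the probability-weighted sum of consequences gives the overall expected consequence for the tree.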
Collectively, this sample captures models and processes spanning a range of maturity—some predating the establishment of DHS, up to the IRMF, which is still under development. The sample includes programs such as those for infrastructure protection and grant allocation that are major activities of DHS informing billions of dollars of outlays. This sample of models and methods exposed the committee to the work of a broad range of DHS risk experts, including major contractors who contribute to DHS’s risk analyses. Through the BTRA, the committee examined a major, high-profile effort to assess the risks of weapons
of mass destruction, and through TRAM, the committee saw how DHS works with an experienced quasi-governmental entity. This sample captures methods that are influential in DHS and that collectively inform a very broad range of homeland security decisions. During the course of the study, the committee was also exposed to other risk analysis efforts within DHS, such as an agent-based simulation tool being developed by the Transportation Security Administration and one of its contractors and the Coast Guard’s Maritime Security Risk Analysis Model. The committee did not attempt to draw inferences from those limited exposures.
HOW THE STUDY WAS CONDUCTED
To carry out its charge, the full committee met five times and subgroups of the committee went on 11 site visits (see Appendixes C and D). The breadth of DHS precluded an exhaustive examination of risk analysis across the department, so the committee relied on RMA to identify topics and speakers for its first two meetings. Those meetings provided an introductory survey. The committee examined RMA’s inventory of some 60 risk models and practices across DHS, familiarized itself with studies from the Government Accountability Office (GAO) and the Congressional Research Service and with DHS publications, and used the committee members’ knowledge of DHS and suggestions from congressional staff to decide on other topics to explore or programs to examine in more detail. The committee decided to focus on risk analysis that was mature enough for some degree of sophistication to be expected. It put its emphasis on risk analysis programs with high visibility or that contribute to major parts of DHS’s counterterrorism and natural disasters missions. Because DHS risk analysis practices have evolved from different roots—some building on practices in the security community, some emulating business risk management practices, and some adapting concepts and tools from engineering—the committee explored the range of risk cultures within DHS. It also made special efforts to discern how well DHS risk analyses and tools support risk management outside DHS.
The site visits enabled subsets of the committee to engage in in-depth interactions with staff members from several DHS offices and programs and also to collect insights from some of DHS’s risk management partners. Site visits were made to the following operations:
- Port Authority of New York and New Jersey
- Environmental Protection Agency (EPA) headquarters office in charge of the agency’s homeland security activities
- EPA National Homeland Security Research Center
- Department of Health and Human Services offices that deal with preparedness
- DHS’s IP and Homeland Infrastructure Threat and Risk Analysis Center (HITRAC) programs
- NISAC
- North Carolina Department of Homeland Security
- FEMA’s Grant Program Directorate
- Naval Postgraduate School Department of Operations Research
- A Fusion Center Conference
STRUCTURE OF THIS REPORT
Chapter 2 describes DHS’s current systems for risk analysis. Chapter 3 discusses some of the general challenges facing DHS risk analysis, and Chapter 4 provides the committee’s evaluation of those capabilities and makes recommendations for improvements. Chapter 5 provides more general recommendations for moving forward to create a strong culture of risk analysis throughout DHS.
Elements (a)-(c) of the Statement of Task are covered in Chapter 4, which presents evaluations of illustrative DHS risk analysis models and evaluations of cross-cutting issues that affect their quality; the dimensions of quality referred to in task elements (b)-(c) are reflected in the overall assessments of quality. Element (b) is addressed in a more targeted fashion in the sections of Chapter 5 that deal with the basic structure of DHS risk models, the need for strong scientific practice, and the need for improving the technical capabilities of DHS staff with respect to risk analysis. Element (c) is addressed in a more targeted way in the subsection of Chapter 4 titled “The Assumptions Embedded in Risk Analyses Must Be Visible to Decision Makers.” However, the emphasis on risk analysis serving the needs of decision makers is discernible throughout this report. The several questions raised in element (d) of the Statement of Task are addressed in Chapter 4’s subsections “Comparing Risks Across DHS Missions” and “Toward Better Risk Communication.” Task (e) is addressed by the entirety of Chapter 5. In particular, a necessary (though not sufficient) step for DHS risk analyses to be validated and to provide better decision support is for work to begin on characterizing the uncertainties in all the models and processes. The committee’s overall evaluation of the quality of DHS risk analysis capabilities is provided in the last conclusion of Chapter 4.