In response to a request of the U.S. Congress (P.L. 110-161, Consolidated Appropriations Act of 2008), the National Research Council (NRC) established the Committee to Review the Department of Homeland Security’s Approach to Risk Analysis to assess how the Department of Homeland Security (DHS) is building its capabilities in risk analysis to inform decision making. The specific tasks undertaken as the basis for the committee’s assessment are listed in Box S-1. This summary presents the principal conclusions and the recommendations of the committee’s full report.
SCOPE AND ROLE OF RISK ANALYSIS AT DHS
The scope of responsibilities of DHS is large, ranging over most, if not all, aspects of homeland security and supporting in principle all government and private entities that contribute to homeland security. For some functions, DHS is responsible for all of the elements of risk analysis. For other functions for which the responsibility is shared, effective coordination is required with owners and operators of private facilities; with state, territorial, and local departments of homeland security and emergency management; and with other federal agencies such as the Department of Health and Human Services, the Environmental Protection Agency, or the Department of Agriculture. While DHS is responsible for mitigating a range of threats to homeland security, including terrorism, natural disasters, and pandemics, its risk analysis efforts are weighted heavily toward terrorism, and that balance is reflected in this report.
Although risk analysis is just one input to decision making, it is an essential one. At DHS, risk analysis is used to inform decisions ranging from high-level policy choices to fine-scale protocols that guide the minute-by-minute actions of DHS employees. The committee focused its attention on risk analysis that informs the middle part of that spectrum, because it is for that range of decisions that technical improvements in risk analysis could have the greatest impact. Good risk analysis is also essential to creating decision rules for routine operations and for major policy choices, but in those cases non-technical considerations such as public acceptability can limit the potential value from improving capabilities for risk analysis. However, the recommendations offered in this report should also lead to improved inputs for those types of decisions.
Statement of Task
The study will review how DHS is building its capabilities in risk analysis to inform decision making. More specifically, the study will address the following tasks:
EVALUATION OF DHS RISK ANALYSIS CAPABILITIES
Approach to Study and Outline of Results
Based on its examination of six illustrative risk analysis models and processes—risk analysis of natural hazards, for critical infrastructure protection, and for allocation of homeland security grants; the Terrorism Risk Assessment and Management (TRAM) and Biological Threat Risk Assessment (BTRA) models; and DHS’s Integrated Risk Management Framework—the committee came to the following primary conclusion:
Conclusion: DHS has established a conceptual framework for risk analysis (risk is a function of threat (T), vulnerability (V), and consequence (C), or R = f(T, V, C)) that, generally speaking, appears appropriate for decomposing risk and organizing information, and it has built models, data streams, and processes for executing risk analyses for some of its various missions. However, with the exception of risk analysis for natural disaster preparedness, the committee did not find any DHS risk analysis capabilities and methods that are yet adequate for supporting DHS decision making, because their validity and reliability are untested. Moreover, it is not yet clear that DHS is on a trajectory for development of methods and capability that is sufficient to ensure reliable risk analyses other than for natural disasters.
Recommendation: To develop an understanding of the uncertainties in its terrorism-related risk analyses (knowledge that will drive future improvements), DHS should strengthen its scientific practices, such as documentation, validation, and peer review by technical experts external to DHS. This strengthening of its practices will also contribute greatly to the transparency of DHS’s risk modeling and analysis. DHS should also bolster its internal capabilities in risk analysis as part of its upgrading of scientific practices.
A focus on characterizing sources of uncertainty is of obvious importance to improving the reliability of risk models and analysis as a basis for sound decision making. Uncertainties arise from missing or incomplete observations and data, imperfect understanding of the physical and behavioral processes that determine the response of natural and built environments and the people within them, subjectivity embedded within analyses of threat and vulnerability and in the judgments of what to measure among consequences, and the inability to synthesize data and knowledge into working models able to provide predictions where and when they are needed.
Proper recognition and characterization of both variability and uncertainty are important in all elements of a risk analysis, including effective interpretation of data as they are collected over time on threats, vulnerability, consequences, intelligence, and event occurrence. While some DHS work on risk does evaluate uncertainty, the uncertainties in its models and analyses were rarely mentioned by DHS risk analysts during the committee’s meetings and site visits, and DHS appears to be at a very immature state with respect to characterizing uncertainty and considering its implications for ongoing data collection and the prioritization of efforts to improve methods and models. Closely tied with the topic of characterizing uncertainty is that of representing properly the precision of risk analyses.
The conclusion above about the capability of DHS risk analysis methods to support decision making is based on the committee’s assessment of the quality of those methods, in response to element (c) of the statement of task. Quality was evaluated in two ways, in accordance with elements (a) and (b) of the task, which overlap. The committee interpreted the first element (“Evaluate the quality of the current DHS approach to estimating risk and applying those estimates…”) as calling for an assessment of general frameworks and the second (“Assess the capability of DHS risk analysis methods to appropriately represent and analyze risks …”) as requiring an assessment of actual implementations. The committee concluded that the basic framework of risk analysis used by DHS is sound but that the operationalization of that framework is in many cases seriously deficient, as indicated in more detail below in this Summary and as supported by Chapters 4 and 5 of the report.
The committee reviewed the feasibility of creating integrated risk analyses covering all of DHS’s program areas (element (d) of the task) and concluded that it is not advisable. Instead, the committee recommends in the section “Integrated Risk Analyses,” below, that DHS perform comparative risk analyses. The distinction is explained in that section of this Summary and in Chapter 4 of the report. The final element of the task calls for the committee to recommend steps for improvement, and these are captured in recommendations throughout this Summary and in the main text of the report.
Natural Hazards Risk Analyses
There is a solid foundation of data, models, and scholarship to underpin DHS’s risk analyses for natural hazards such as flooding. Although models are constantly being developed and improved, risk analysis associated with natural hazards is a mature activity—compared to risk analysis related to terrorism—in which analytical techniques are subject to adequate quality assurance and quality control, and verification and validation procedures are commonly used.
Conclusion: DHS’s risk analysis models for natural hazards are near the state of the art. These models—which are applied mostly to earthquake, flood, and hurricane hazards—are based on extensive data, have been validated empirically, and appear well suited to near-term decision needs.
Recommendation: DHS’s current natural hazard risk analysis models, while adequate for near-term decisions, should evolve to support longer-term risk management and policy decisions. Improvements should be made to take into account the consequences of social disruption caused by natural hazards; address long-term systemic uncertainties, such as those arising from effects of climate change; incorporate diverse perceptions of risk impacts; support decision making at local and regional levels; and address the effects of cascading impacts across infrastructure sectors.
Infrastructure Risk Analyses
The risk analyses that DHS conducts in support of infrastructure protection generally decompose risk into threat (T), vulnerability (V), and consequences (C). With respect to risk from terrorism, defining the threat and estimating probabilities are inherently challenging because of the lack of experience with such events; the associated absence of data on which to base reliable estimates of probabilities; and the effects of an intelligent adversary that may seek to defeat preparedness and coping measures, which causes T, V, and C to be interdependent. There are various methods to compensate for the lack of historical data, including “red team” analyses (in which experts are charged with trying to overcome risk-mitigation measures), scenario analysis, and subject-matter expert (SME) estimates, and DHS has pursued most of these, although not as consistently as would be desired. There are also multiple methods for combining estimates of threats, vulnerabilities, and consequences and dealing with the dependencies of T, V, and C to estimate risk, such as Bayesian analysis, multi-attribute models, attacker-defender models, or game theoretic calculations. DHS has generally not applied these methods.
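The attacker-defender approach mentioned above can be sketched as a minimal max-min calculation: the defender chooses a protection, anticipating that the adversary will then attack wherever expected damage remains highest. The targets, loss figures, and protection effect below are entirely hypothetical and are not drawn from any DHS model.

```python
# Toy attacker-defender (max-min) sketch with invented numbers.
# The defender protects one target; protection halves its expected damage.
# The attacker then strikes the target with the highest remaining damage.

damages = {"port": 10.0, "bridge": 6.0, "grid": 8.0}  # hypothetical losses
protection_factor = 0.5  # assumed effect of protecting a target

def attacker_best_response(damages, protected):
    """Return the attacker's best target and payoff, given the defense."""
    d = {t: (v * protection_factor if t == protected else v)
         for t, v in damages.items()}
    best_target = max(d, key=d.get)
    return best_target, d[best_target]

# Defender picks the protection that minimizes the attacker's best payoff.
best = min(damages, key=lambda p: attacker_best_response(damages, p)[1])
print(best)  # → port
```

In this toy instance, protecting the highest-value target ("port") is optimal because it lowers the damage ceiling the attacker can reach; with fixed (non-adaptive) probabilities, that interaction would be invisible.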
DHS’s analyses of vulnerabilities have focused primarily on physical vulnerabilities. There is significant room to expand this approach to cover the span of threats more consistently. For example, DHS’s vulnerability analyses rarely address coping capacity and resilience (or long-term adaptation). People-related factors, a major part of coping capacity, have been largely overlooked.
Similarly, DHS analyses of consequences have tended to focus on the outcomes that are most readily quantified. Little attention has been paid to secondary economic effects or to an attack’s effects on personal and group behaviors—impacts that could be significant and may be the primary goals of terrorists. Some relevant research is being conducted in DHS’s University Centers of Excellence, and a small amount is funded by the Human Factors and Behavioral Sciences program within DHS’s Science and Technology Directorate, but much more is needed. In addition, efforts must be made to incorporate the results of such research into DHS risk analyses and to heighten risk analysts’ awareness of the importance of social and economic impacts.
Recommendation: DHS should have a well-funded research program to address social and economic impacts of natural disasters and terrorist attacks and should take steps to ensure that results from the research program are incorporated into DHS’s risk analyses.
Based on its study, the committee concluded that DHS’s risk analyses for infrastructure protection might be useful but certainly can be improved. Improvements can be made by considering the adaptability of intelligent adversaries, consistently including evaluation of non-physical vulnerabilities, characterizing sources of uncertainty, working toward verification and validation of models, improving documentation, and by submitting models and analyses to external peer review.
Recommendation: DHS should consider alternatives to modeling the decisions of intelligent adversaries with fixed probabilities. Models that incorporate game theory, attacker-defender scenarios, or Bayesian methods to predict threat probabilities that evolve over time in response to observed conditions and monitored behavior provide more appropriate ways of representing the decisions of intelligent adversaries and should be explored.
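One way to make threat probabilities "evolve over time in response to observed conditions," as the recommendation suggests, is Bayesian updating. The sketch below uses a Beta-Bernoulli model with invented prior parameters and a hypothetical monitoring stream; it illustrates the technique, not any DHS procedure.

```python
# Illustrative sketch: Bayesian updating of a per-period threat probability
# using a Beta-Bernoulli model. All numbers here are hypothetical.

def beta_update(alpha, beta, observed):
    """Update a Beta(alpha, beta) prior with one Bernoulli observation."""
    return (alpha + 1, beta) if observed else (alpha, beta + 1)

# Weakly informative prior over the per-period probability of an attempt.
alpha, beta = 1.0, 9.0  # prior mean = 0.10

# Hypothetical monitoring stream: True = threat indicator observed.
observations = [False, False, True, False, True]

for obs in observations:
    alpha, beta = beta_update(alpha, beta, obs)

posterior_mean = alpha / (alpha + beta)  # updated threat estimate T
print(f"posterior mean threat probability: {posterior_mean:.3f}")  # → 0.200
```

The point of the sketch is that the threat term T is a distribution that sharpens as evidence accumulates, rather than a fixed constant chosen once.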
Recommendation: DHS should ensure that vulnerability and consequence analyses for infrastructure protection are documented, transparent, and repeatable. DHS needs to agree on the data inputs, understand the technical approaches used in models, and understand how the models are calibrated, tested, validated, and supported over the life cycle of use.
Homeland Security Grants
The committee’s evaluation of the risk-based homeland security grant programs administered by the Federal Emergency Management Agency (FEMA) within DHS determined that they are reasonably executed, given political and other practical considerations. Population counts serve, for the most part, as the surrogate measure for risk. Some of the grants programs are moving toward risk-based decision support, but the various approaches and formulas are still evolving.
Recommendation: FEMA should undertake an external peer review by technical experts outside DHS of its risk-informed formulas for grant allocation to identify any logical flaws with the formulas, evaluate the ramifications of the choices of weightings and parameters in the consequence formulas, and improve the transparency of these crude models of risk.
Recommendation: FEMA should be explicit about using population density as the primary determinant for grant allocations.
Terrorism Risk Assessment and Management Model
DHS’s Terrorism Risk Assessment and Management (TRAM) model is held up as a successful instantiation of risk analysis, and the Port Authority of New York and New Jersey (which initiated TRAM’s development even before the establishment of DHS) is a satisfied user. The committee has concerns, however, owing to the model’s unjustified complexity and lack of validation.
Recommendation: DHS should seek expert, external peer review of the TRAM model in order to evaluate its reliability and recommend steps for strengthening it.
Biological Threat Risk Assessment Model
DHS’s Biological Threat Risk Assessment (BTRA) model, which is used to create biennial assessments of the risks of biological terrorism, was thoroughly reviewed in an NRC report published in 2008.1 The primary recommendation of that report reads as follows:
The BTRA should not be used as a basis for decision making until the deficiencies noted in this report have been addressed and corrected. DHS should engage an independent, senior technical advisory panel to oversee this task. In its current form, the BTRA should not be used to assess the risk of biological, chemical, or radioactive threats. (p. 5)
The committee was told by DHS that it is addressing most of the recommendations of the 2008 NRC review, but in the committee’s view the response has been incremental, and a much deeper change is necessary. DHS’s proposed responses will do little to reduce the BTRA model’s great complexity, which requires many more SME estimates than can be supported by the limited base of knowledge about biological terrorism. That complexity also precludes transparency, adequate sensitivity analysis, and validation.
Integrated Risk Management Framework
The establishment of the Integrated Risk Management Framework (IRMF) across DHS is going in the right direction, but it is far too early to know if the IRMF will provide real value. Similar integrated or enterprise-level risk management processes in industry typically require several years before their benefits begin to appear. The committee did not observe any improvements to DHS’s risk analysis that could be attributed to these early steps, and so it concludes that integrated risk management may be on the right track but is early in development.
Crosscutting Modeling Issues
Transparency is always important in risk analysis, and especially so when analysts and decision makers must contend with great uncertainty, as is the case with the risks posed by terrorism. The committee found that most DHS risk models and analyses are quite complex and poorly documented, and thus are not transparent to decision makers or other risk analysts. Moreover, some of those models imply false precision, which can give the impression of certainty when it does not exist. Security restrictions are another contributor to poor transparency in some cases.
Recommendation: To maximize the transparency of DHS risk analyses for decision makers, DHS should aim to document its risk analyses as clearly as possible and distribute them with as few constraints as possible. Further, DHS should work toward greater sharing of vulnerability and consequence assessments across infrastructure sectors so that related risk analyses are built on common assessments.
DHS’s IRMF and National Strategy for Information Sharing documents focus on sharing information with decision makers. However, it is essential that communication with stakeholders and the general public also be included in a comprehensive risk communication strategy. For risks to be truly managed, DHS needs to provide not only information but also analysis and aids to thinking that prepare all affected audiences to cope better with the risks that events might entail. As DHS moves to the next stages of risk communication—which will have to go far beyond information sharing and include a capability for understanding the perceptions and needs of the recipients of various risk-related communications and for translating that understanding to specifically tailored messages—a well-developed risk communication strategy document and program, adequately staffed and funded, will be needed.
Integrated Risk Analyses
DHS is working toward risk analyses that are increasingly comprehensive, in an attempt to enable comparison of the diverse risks under the department’s purview. The committee evaluated the feasibility of creating integrated risk analyses that span all of DHS’s areas of responsibility. An integrated risk analysis collects analyses for all potential risks that an entity, here DHS, is charged with assessing and combines those risks into one complete analysis based on a common metric. A comparative risk analysis, by contrast, omits that last step. In comparative risk analysis, potential risks to an entity from many different sources are analyzed and the risks then compared (or contrasted), but no attempt is made to assess them against a common metric.
Qualitative risk analysis includes methods for formally eliciting advice (such as Delphi analysis and expert judgment) for use in decision making. Such advice can be used to compare risks of very different types. There is a well-established literature on comparative risk analysis that can be used to apply the TVC approach to different types of risk.2 Importantly, the results of such analysis are likely to involve substantially different metrics that cannot be directly compared. In addition, the degree and the extent of uncertainty are likely to be very different across the various risk sources. Nonetheless, the scope and diversity in the metrics can be very informative for decision making as well.
Conclusion: A fully integrated analysis that aggregates widely disparate risks by use of a common metric is not a practical goal and in fact is likely to be inaccurate or misleading given the current state of knowledge of methods used in quantitative risk analysis. The risks presented by terrorist attack and natural disasters cannot be combined in one meaningful indicator of risk, and so an all-hazards risk assessment is not practical. The science of risk analysis does not yet support the kind of reductions in diverse metrics that such a purely quantitative analysis would require. Qualitative comparisons can help illuminate the discussion of risks and thus aid decision makers.
Recommendation: DHS should not attempt an integrated risk assessment across its entire portfolio of activities at this time because of the heterogeneity and complexity of the risks within its mission.
The committee is more optimistic about using an integrated approach if the subject of the analysis is a set of alternative options for managing risk—for example, if the analysis is of alternative investments for improving resilience. In such cases, the same option might prove able to reduce risks arising from a number of sources such as natural hazards and terrorism. The analysis of alternative risk management options for mitigating risks to a set of activities or assets could then be accomplished through a single quantitative model in much the same way that cost-effectiveness analysis can be used to select a least-cost investment even when the benefits of the various options are incommensurate.
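The cost-effectiveness analogy in the paragraph above can be sketched with a small calculation: each option's total expected-loss reduction, summed across hazard sources, is divided by its cost, and options are ranked on that ratio. The options and figures below are invented for illustration only.

```python
# Sketch (hypothetical figures): ranking risk-management options that each
# reduce expected losses from several hazard sources, as in
# cost-effectiveness analysis. Nothing here comes from a DHS model.

options = {
    # option name: (cost, {hazard source: expected-loss reduction})
    "harden_structures": (4.0, {"earthquake": 3.0, "terrorism": 1.5}),
    "backup_power":      (2.0, {"hurricane": 2.0, "terrorism": 1.0}),
    "improved_sensors":  (3.0, {"terrorism": 2.5}),
}

def cost_effectiveness(cost, reductions):
    """Total expected-loss reduction per unit cost, summed across hazards."""
    return sum(reductions.values()) / cost

ranked = sorted(options.items(),
                key=lambda kv: cost_effectiveness(*kv[1]),
                reverse=True)

for name, (cost, reductions) in ranked:
    print(name, round(cost_effectiveness(cost, reductions), 3))
```

Note that the ranking requires commensurate loss reductions only within each option's benefit tally, not a single metric spanning all hazards in the abstract, which is why the committee finds this framing more tractable than a fully integrated risk assessment.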
THE PATH FORWARD—RECOMMENDED ACTIONS
Improve the Way Models Are Developed and Used
The committee observed a tendency across most of DHS to build and use complex quantitative models in the apparent belief that such models are the best way to approach risk analysis. Effective risk analysis need not always be quantitative. In particular, the generation and analysis of scenarios is an important component of risk assessment and management in a number of fields. In some cases, improved understanding of risks hinges on improved communication, organizational design, and so on.
The multiple dimensions of risk associated with natural hazards and terrorism are now widely recognized in the risk literature. These include public health and safety, as well as social, psychological, economic, political, and strategic aspects. The desire to quantify, compare, and rank risks arising from different sources can lead to characterizations that simplify or ignore many of these dimensions. In several of the risk studies presented to it, the committee observed omissions and oversimplifications of this type, reflecting a tendency to ignore non-quantifiable risks and to combine non-commensurate attributes into single measures of consequence. Even though DHS is not responsible for managing all aspects of risk—for example, the Department of Health and Human Services has the primary responsibility for managing public health risks—it is appropriate and necessary to consider the full spectrum of consequences when performing risk analyses intended to inform constructive and effective decision making.
Recommendation: In characterizing risk, DHS should consider a full range of public health, safety, social, psychological, economic, political, and strategic outcomes. When certain outcomes are deemed unimportant in a specific application, reasons for omitting this aspect of the risk assessment should be presented explicitly. If certain analyses involve combining multiple dimensions of risk (e.g., as a weighted sum), estimates of the underlying individual attributes should be maintained and reported.
The committee observed that DHS relies heavily on quantitative models for its risk analysis activities. This approach reflects an outdated and oversimplified view of risk analysis and is certain to result in underemphasizing many attributes of risk that cannot be readily quantified, such as differences in individual values. Instead, risk analysis should be regarded as having both quantitative and non-quantitative attributes, and it should be recognized that narrative descriptions of non-quantitative information about risk are often as important to decision makers as is the more fully quantitative information. Although there are certainly decisions that can be fully informed by the use of simple, quantitative models, many important decisions require an understanding of the multiple attributes integral to risk. This last point emphasizes that careful delineation of the different types of decisions that DHS has to make is an important precursor to understanding the types of risk analyses appropriate for informing those decisions.
Recommendation: DHS should prepare scientific guidelines for risk analyses, recognizing that different categories of decisions require different approaches to risk analysis; strict reliance on quantitative models is not always the best approach.
To start, DHS should examine the basic structure of its risk analysis approach. Currently, DHS seems to use the special case formula Risk = T × V × C very broadly for both terrorism and natural hazards applications. DHS needs to be very careful in documenting assumptions and understanding when the multiplicative formula is appropriate and when it is not.
Risk as a function of interdependent variables T, V, and C is a reasonable problem decomposition for analysis of risks posed by both terrorism and natural hazards. In the natural hazards domain, independence can sometimes be assumed to hold among components, and the formula can be reduced to Risk = T × V × C. In the more general case for natural hazards, the three components may not be independent but the nature of their interdependence may be reasonably known and subject to analysis. In the terrorism domain, however, it is often the case that T, V, and C are functionally interdependent, so that the simple risk function R = T × V × C does not apply and should not be used. In particular, DHS must examine carefully whether the variables T, V, and C are actually independent and must guard against the errors that can occur when independence is wrongly assumed.
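This caution can be made concrete with simple arithmetic. The probabilities and loss value below are invented for illustration: when an intelligent adversary conditions the decision to attack on the target's vulnerability, the unconditional threat estimate in the multiplicative formula understates the risk.

```python
# Hypothetical numbers only: the multiplicative formula Risk = T * V * C
# (valid when T, V, and C are independent) versus a case where an
# intelligent adversary makes the threat T depend on the vulnerability V.

V = 0.5                 # probability an attack on this target succeeds
C = 100.0               # loss if the attack succeeds (arbitrary units)
T_unconditional = 0.10  # attack probability if chosen independently of V

# Independent case (a reasonable assumption for many natural hazards):
R_independent = T_unconditional * V * C   # 0.10 * 0.5 * 100 = 5.0

# Dependent case: the adversary is more likely to attack a target it
# perceives as vulnerable, so the threat is conditioned on V.
T_given_V = 0.30
R_dependent = T_given_V * V * C           # 0.30 * 0.5 * 100 = 15.0

print(R_independent, R_dependent)  # → 5.0 15.0
```

In this toy case the independence assumption understates the expected loss by a factor of three; the same structural error, at scale, is what the committee warns against.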
The basic risk framework of Risk = f(T,V,C) used by DHS is sound and in accord with accepted practice in the risk analysis field.
DHS’s operationalization of that framework—its assessment of individual components of risk and their integration into a measure of risk—is in many cases seriously deficient and is in need of major revision.
More attention is urgently needed at DHS to assessing and communicating the assumptions underlying and the uncertainties surrounding analyses of risk, particularly those associated with terrorism.
Until these deficiencies are improved, only low confidence should be placed in most of the risk analyses conducted by DHS.
Follow Time-Tested Scientific Practices
DHS has not been following the critical scientific practices of documentation, validation, peer review by technical experts external to DHS, and publishing. Given the lack of that disciplined approach, it is very difficult to know precisely how DHS risk analyses are being done and whether their results are reliable and useful in guiding decisions. There is little understanding of the uncertainties in DHS risk models other than those for natural hazards, and in addition there is a tendency toward false precision. It is one thing to evaluate whether a risk model has a logical purpose and structure—the kind of information that can be conveyed through a briefing—but quite another to really understand the critical inputs and sensitivities that determine whether or not it truly produces reliable outputs. The latter understanding comes from scrutiny of the mathematical model, evaluation of a detailed discussion of the model’s implementation, and review of some model results, preferably when compared against simple bounding situations and potentially retrospective validation. It is not adequate to simply ask subject-matter experts whether they see anything odd about a model’s outcomes.
The committee found that in general the models and methods it reviewed did not have the capability to appropriately represent and analyze risks from across the department’s spectrum of activities and responsibilities. As part of its review, the committee addressed what was lacking in the models and methods. It often found that little direct, and more importantly, little effective attention was paid to the features of the risk problem that are fundamental to the homeland security modeling purview. For example, throughout its review, the committee was concerned about the lack of state-of-the-art risk modeling in addressing key homeland security issues such as vulnerability, intelligent adversaries, and the range of socioeconomic consequences.
As a result, the committee questions whether the creation of the department from many existing organizations with long-standing approaches to risk analysis might have anchored DHS to the legacy models of its components. In such cases, and with no de novo process to develop methods and models that specifically focus on the new factors characterizing homeland security risks, it would not be surprising to find a poor fit between legacy models and the demands of a substantially new application. Moreover, if legacy modeling is in fact the source of the capability deficiency, then the committee found little evidence in the materials reviewed that DHS has considered, much less rigorously addressed, this general issue of model design.
Recommendation: DHS should adopt recognized scientific practices for its risk analyses:
DHS should create detailed documentation for all of its risk models, including rigorous mathematical formulations, and subject them to technical and scholarly peer review by experts external to DHS. Documentation should include simple worked-out numerical examples to show how a methodology is applied and how calculations are performed.
DHS should consider creating a central repository to enable DHS staff and collaborators to access model documentation and data.
DHS should ensure that models undergo verification and validation—or sensitivity analysis at the least. Models that do not meet traditional standards of scientific validation through peer review by experts external to DHS should not be used or accepted by DHS.
DHS should use models whose results are reproducible and easily updated or refreshed.
DHS should continue to work toward a clear, unambiguous risk lexicon.
Discard the Idea of a National Risk Officer
The director of DHS’s Office of Risk Management and Analysis (RMA) suggested to the committee that the DHS Secretary, who already serves as the Domestic Incident Manager during certain events, could serve as the “country’s chief risk officer,” establishing policy and coordinating and managing national homeland security risk efforts.3 A congressional staff member supported the concept of establishing a chief risk officer.4 The committee has serious reservations about this idea. Risk is assessed for many issues across many federal agencies, to address disparate concerns such as health effects, technology impacts, and the safety of engineered systems. The approaches taken differ depending on the issues and the agency missions, and they require disciplinary knowledge ranging from detailed engineering and physical sciences to social sciences and law. For a single entity to wisely and adequately bring to bear such a broad range of expertise to address wide-ranging issues would require a large, perhaps separate agency. In addition, as other NRC studies have concluded, risk analysis is done best as a result of interactions between the risk analysts and the stakeholders, including the involved government agencies. To be effective, such interactions require that the federal agents have an understanding of the issues and of the values of the stakeholders. An attempt to locate all this expertise and experience in one department and to require that the personnel stay current in many different areas is unlikely to succeed.
Build a Strong Risk Analysis Culture at DHS
The long-term effectiveness of risk analysis throughout DHS and the improvement of scientific practice to enable such success both depend on the continued development of an adequate in-house workforce of well-trained risk analysis experts. As DHS expands its commitment to risk analysis, personnel who are up-to-date on scientifically grounded methods for carrying out such analyses will be in increasing demand. At present, DHS is heavily dependent on private contractors, academic institutions, and government laboratories for the development, testing, and use of models; acquisition of data for incorporation into models; interpretation of results of modeling efforts; and preparation of risk analyses. Although there are advantages to relying on expertise that is not available within DHS, in-house specialists should be fully aware of the technical content of such work. In particular, in-house DHS personnel need to ensure the scientific integrity of the approaches and understand the uncertainties inherent in the data, the risk models, and the products of those models. Contractor support will remain essential, but the direction and application of such work should be under the tight control of in-house staff.
Recommendation: DHS should have a sufficient number and range of in-house experts, who also have adequate time, to define and guide the efforts of external contractors and other supporting organizations. DHS’s internal technical expertise should encompass all aspects of risk analysis, including the social sciences. DHS should also evaluate its dependence on contractors and the possible drawbacks of any proprietary arrangements.
Recommendation: DHS should convene an internal working group of risk analysis experts to work with its Risk Management and Analysis and Human Resource offices to develop a long-term plan for the development of a multidisciplinary risk analysis staff throughout the department and practical steps for ensuring such a capability on a continuing basis. The nature and size of the staff, and the rate of staffing, should be matched to the department’s long-term objectives for risk-based decision making.