Conclusions and Recommendations
The behavioral and social sciences provide a foundation of knowledge and continuous learning that the intelligence community needs in order to provide the highest level of analysis, with applications that can be implemented now at modest cost and with minimal disruption.
As the intelligence community (IC) seeks to reduce uncertainty and provide warning about potential threats to the national security of the United States, it faces increasing demands for analyses that are accurate, actionable, and properly qualified, so that decision makers know how much confidence the analyses warrant. Producing those analyses requires great institutional and intellectual agility as threats emerge from new quarters and require different kinds and combinations of expertise.
Today’s rapidly changing conditions have also created new opportunities for data collection, both classified (e.g., electronic surveillance) and open (e.g., chat rooms, public calls to action). Furthermore, after years of limited hiring following the end of the Cold War, the significant influx of new employees to the IC after 9/11 has created a major workforce transition, with new analysts bringing diverse skills, backgrounds, and experiences. In order to fulfill its mission, the IC leadership must successfully train, motivate, and retain that workforce, as well as continue to recruit and select new analysts with needed skills.
The conditions the IC faces involve issues that have been long studied in the behavioral and social sciences, particularly the behavior of individuals and groups and the working conditions that foster effective analysis. Although that work has yielded significant, usable findings, little of that knowledge has found a place in the IC. As a result, there is a large body of scientific theory, method, and results that could—and should—be applied to IC tasks.
The committee concludes that the IC can derive great benefit, in short time and at relatively low cost, by building on available behavioral and
social science knowledge. As a result, the committee’s recommendations focus on strengthening the scientific foundations of the IC’s analytical methods and the organizational processes needed to support them.
The committee recommends that the IC adopt a two-fold strategy to take full advantage of existing behavioral and social science research. First, it should review its current analytic methods and procedures, in terms of how compatible they are with what is known about how people think and work, as individuals and groups. Second, it should conduct systematic empirical evaluations of current and proposed procedures, assessing their efficacy under normal working conditions as much as possible. Those assessments will allow the IC to know how much confidence to place in these procedures and where to focus its efforts on developing improved ones. These evaluations will not only strengthen the evidentiary base of the IC’s analytical work, but also provide the feedback necessary for continuous learning and improvement.
Over time, this strategy will provide a powerful impetus to basic research critical to the IC’s needs. The former head of a major research unit in the United Kingdom has argued that basic science advances through integrated programs of applied basic and basic applied research (Baddeley, 1979). The former tests how well basic research generalizes to different applied settings. The latter identifies new theoretical questions and then translates them into terms suited to basic research (e.g., experiments, modeling).
Such an integrated research strategy will derive the full benefit of the behavioral and social sciences for the IC’s analytical enterprise. In some cases, the resulting research will be on topics unique to the IC, such as the linguistic conventions of violent extremists. In other cases, it will be on general topics that are central to the IC’s needs, such as electronic collaboration among analysts with heterogeneous information.
The committee’s recommendations are designed to deliver maximum improvement with minimal disruption, helping analysts to do their normal work better. We believe that dramatic improvements in the analytic process are possible within existing organizational constraints. We recognize that many people in the IC feel reorganization fatigue, so we propose ways of working more effectively within whatever structure the IC assumes. We also know that all organizations succeed, in part, by allowing their staff to learn how to work around their inevitable imperfections. Achieving such mastery takes time. If an organization changes too rapidly, its staff cannot function effectively. Thus, we emphasize orderly, measured improvements.
Because they build on existing technologies and organizational structures, our recommendations should not be expensive to implement. They do require both deeply knowledgeable scientists and strong leaders. The scientists will need to know the existing research and ensure its faithful application to the IC’s circumstances. The leaders will need to ensure that
enhancing the IC’s human capital is seen as central to the IC’s success and, therefore, to the nation’s security.
Intelligence analysis is, at its heart, an intensely individual intellectual effort, as analysts synthesize facts of diverse origins. As a result, one focus of our report and recommendations is research regarding how individuals think. However, individual analysts do not operate in a vacuum; they have to collaborate with other analysts. As a result, a second focus of our report and recommendations is the support that they need from their organizations. We thus recommend asking the same questions about collaboration, workforce development, communication, and analytical methods:
What does the science say about current and proposed methods?
How do those methods fare when evaluated under the IC’s conditions?
The committee’s study has determined that there is knowledge from the behavioral and social sciences that is ready for application within the IC. That claim invites an explanation of why the opportunity exists. We believe that it reflects properties of the behavioral and social sciences and of the IC.
During much of its life, the IC has been intensely concerned with questions of military materiel, standing armies, and large-scale weapons. Its behavioral foci have been fairly narrow, such as the notoriously difficult task of reading leaders’ intentions and the somewhat more tractable tasks of interpreting national and international politics. As a result, the IC has developed little internal expertise on many behavioral and social science issues. Indeed, the IC has so little expertise in some areas that it sometimes struggles to recruit needed scientists, although efforts like its IC Associates Program can provide partial solutions. The computationally intensive demands of many IC analyses have also contributed to its paying relatively little attention to the human side of the analytical enterprise.
For its part, the behavioral and social science community has been a distant, sometimes reluctant partner for the IC. Its science has often involved controlled experiments that foster the discovery of basic behavioral principles, while discouraging study of applications. Social conflicts in the second half of the 20th century have also distanced the academic and intelligence communities from one another in the United States. Fortunately, these barriers have fallen with the rise in national unity following the 9/11 attacks.
A noteworthy exception to this historic pattern has been the landmark work of Richards Heuer, Jr., whose Psychology of Intelligence Analysis
(1999) demonstrates the relevance of behavioral research to the work of the IC. Equally remarkable has been the success of Heuer and his associates in getting structured analytical techniques (SATs), based on behavioral research, accepted in the IC and even having versions of SATs installed on IC computer systems. However, the IC has not pursued this effort through to the point of performing systematic empirical evaluation of SATs. There are theoretical reasons for predicting both that SATs improve analysis and that they interfere with it. Without empirical evaluation, one can only speculate about when improvement or interference will dominate under different conditions and analytic questions.
A BEHAVIORAL AND SOCIAL SCIENCES FOUNDATION
Traditionally, the IC has adopted a practice-based approach to analysis. It has relied primarily on apprenticeship-like processes to train new analysts in methods that have evolved almost exclusively through intensive attempts to learn from experience. We propose a complementary commitment to evidence-based analysis, drawing on the behavioral and social sciences to evaluate current approaches and to create new ones for analysis, collaboration, workforce development, and communication. Such an evidence-based approach should be used both to examine existing and proposed methods, in order to determine their compatibility with the science, and to study their actual performance empirically, under normal working conditions.
Although this recommendation reflects Enterprise Objectives 2 and 5 of the National Intelligence Strategy (Office of the Director of National Intelligence, 2009; see Box 1-1 in Chapter 1), conducting such evaluations is a brave step for any organization, because the evidence needed for internal learning can also be used for external criticism. It is, therefore, critical that the Director of National Intelligence (DNI) be the sponsor for the initiative and the audience for its results, in order to demonstrate commitment, at the highest level, to evidence-based analytical methods.
Evaluation research is methodologically demanding. Poor evaluations can undermine good initiatives (e.g., by failing to recognize that they have been poorly implemented) or promote poor ones (e.g., by subjecting them to soft tests). Poor evaluations can even undermine performance (e.g., if meeting paperwork requirements comes to dominate attention to analytical accuracy). A central task in evaluation research is assessing how well a program has been implemented. Unless a program has been properly implemented, it will not receive a fair test (although a program that cannot be implemented faithfully has little value).
In order to meet these commitments, the IC needs staff qualified to identify, implement, and evaluate the best opportunities for improving its analytical processes. To that end, we propose designating a senior officer,
reporting to the DNI and supported by an independent advisory panel of behavioral and social scientists with strong basic and applied credentials. Such individuals are in short supply. The IC’s ability to recruit and retain expert advisors will provide a measure of its success in strengthening the scientific base of its analyses.
The Director of National Intelligence should ensure that the intelligence community applies the principles, evidentiary standards, and findings of the behavioral and social sciences to its analytic methods, workforce development, collaborations, and communications. Success will require strong leadership, active engagement with the academic community, and the creation of a robust reporting mechanism (such as a biennial report from each agency) to identify residual problems and plans to remedy them. The Director of National Intelligence should be supported by a senior officer and an independent advisory committee with appropriate scientific expertise.
Use the Intergovernmental Personnel Act to embed independent experts in the IC for limited terms.
Embed IC analysts in academic research environments to participate in research and to network with scientists who can be consulted later.
Develop specialized behavioral and social science expertise cells across the IC, coordinated through the Office of the Director of National Intelligence (ODNI).
Ensure that the IC Associates Program actively uses behavioral and social science expertise.
Create and widely disseminate an Analytical Methods Resource Guide that introduces key methods, shows how to choose methods suited to specific intelligence questions, and identifies experts who can apply each method, from inside and outside the IC.
The conditions that support learning are among the best understood aspects of human behavior. Those conditions include large quantities of unambiguous feedback, with properly aligned incentives. Achieving these conditions is consistent with Enterprise Objectives 5 and 7 of the National Intelligence Strategy (Office of the Director of National Intelligence,
2009; see Box 1-1 in Chapter 1), as well as with the IC’s tradition of lessons-learned and after-action reports and with the voluminous literature produced by government commissions, former intelligence officers, and academic researchers (see Chapter 2).
Unambiguous feedback requires predictions that can be evaluated in light of subsequent history. A straightforward and necessary step is attaching numeric probabilities to explicitly defined events (e.g., “There is a 75 percent chance that country A has a stockpile of biological weapons”; “There is a 90 percent chance of X being in power at year’s end”). Significant amounts of feedback are needed to provide stable performance measures. In order to create such feedback, we recommend compiling a database of assessments and predictions, indexed by properties that might affect their quality (e.g., the analysts’ background and analytical method), and further annotating the analyses archived in the Library of National Intelligence. Doing so would also facilitate research on confounding factors (such as self-fulfilling and self-defeating prophecies) whereby analyses lead to (political or military) actions that change the world, so that it is no longer possible to evaluate their accuracy.
We recognize that there has historically been resistance to numeric probability estimates from analysts who believe that they imply artificial precision. However, as discussed in Chapter 2, the scientific evidence, including Canada’s real-world success with numeric probabilities in intelligence analysis (Mandel, 2009), suggests that, with proper training and feedback, such judgments could substantially improve analytic products and customer understanding of them. Proper incentives seek to encourage learning, not to determine culpability. They reward positive performance and cultivate the natural desire to do well, a desire that is especially prevalent in the IC. In addition, numeric probabilities allow feedback that is essential to learning. Proper incentives discourage both overconfidence (intended perhaps to carry an argument) and underconfidence (intended perhaps to avoid responsibility). They encourage good calibration: being as confident as one’s understanding warrants. Thus the DNI must ensure that numeric probabilities are implemented in a constructive way, using them for useful feedback, not destructive criticism.
The Director of National Intelligence should ensure that the intelligence community adopts scientifically validated analytical methods and subjects all of its methods to performance evaluation. To that end, each analytical product should report, in a standardized format, the elements necessary for such evaluation, including its analytical method, domain, conclusions, analysts’ background, and the collaborations that
produced it. Analyses must include quantitative judgments of the probability and uncertainty of the events that they forecast. These reports should be archived in a database that is routinely used to promote institutional learning and individual training and as input to the Director of National Intelligence’s ongoing review efforts of analytic shortfalls and plans to address them.
Institutionalize an “Analytical Olympics,” with analysts and analytical methods competing to provide the best calibrated probabilities (i.e., showing appropriate levels of confidence) in assessments and predictions made for well-specified outcomes that have occurred or will occur in the near future.
Begin assessing how well-calibrated individual analysts are, using the results as personal feedback that will allow analysts to improve their own performance and the IC to learn how this performance is related to workforce factors, such as personal capabilities, training, and incentives.
Create a research program that reviews current and historical assessments, looking for correlates of accuracy and calibration, considering properties such as the method used, collaboration process, classification level, substantive domain, and team composition.
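The calibration feedback envisioned above can be sketched concretely. In the following illustration (the forecast records and function names are hypothetical, not an existing IC system), each archived assessment is reduced to a stated probability paired with whether the forecast event occurred. The Brier score summarizes overall forecast accuracy, and a simple calibration table shows whether, say, an analyst’s 70 percent forecasts come true about 70 percent of the time, the kind of scoring an “Analytical Olympics” might use.

```python
from collections import defaultdict

# Hypothetical forecast records: (stated probability, did the event occur?)
forecasts = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False),
    (0.3, False), (0.3, False), (0.3, True),
    (0.1, False), (0.1, False),
]

def brier_score(records):
    """Mean squared error between stated probability and outcome (0 or 1).
    Lower is better; an uninformative 50 percent forecast scores 0.25."""
    return sum((p - (1.0 if hit else 0.0)) ** 2 for p, hit in records) / len(records)

def calibration_table(records):
    """For each stated probability, the observed frequency of the event.
    A well-calibrated forecaster's 70 percent forecasts come true ~70 percent of the time."""
    buckets = defaultdict(list)
    for p, hit in records:
        buckets[p].append(1.0 if hit else 0.0)
    return {p: sum(hits) / len(hits) for p, hits in sorted(buckets.items())}

print(f"Brier score: {brier_score(forecasts):.3f}")
for p, freq in calibration_table(forecasts).items():
    print(f"  stated {p:.0%} -> observed {freq:.0%}")
```

The same bookkeeping scales from this toy list to a database of archived assessments, and more refined scoring rules can separate calibration from discrimination; the essential point is that once probabilities are numeric, feedback becomes computable rather than a matter of impression.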
The quality of the human resource pool places greater constraints on an organization’s human capital than any other single factor. It is the focus of Enterprise Objective 6 of the National Intelligence Strategy (Office of the Director of National Intelligence, 2009; see Box 1-1 in Chapter 1). Currently, the IC typically recruits analysts on the basis of their substantive expertise and rewards them on the basis of process-based performance (e.g., workflow). Research finds that both practices are inadequate by themselves. We recommend a systematic review of the theoretical soundness of current practices, followed by empirical evaluation of the efficacy of current practices and alternative ones.
Clearly, the IC needs analysts with deep substantive knowledge of countries, cultures, transnational relations, and myriad other issues. However, it also needs analysts capable of integrating knowledge across domains, working with experts from other fields, and coping with shifting assignments. As a result, the IC needs analysts with both the intellectual capacity for synthetic thinking and substantive familiarity with the full range of analytical methods. The former is a stable individual trait, which must
be pursued in the IC’s recruitment and selection processes. The latter is a malleable individual skill that can be acquired through training. The goal of such training is not mastery of alternative methods; rather, the goal is enough familiarity to recognize different kinds of problems and to work with others who have technical mastery of the methods.
Thus, every analyst should have a basic understanding of the fundamental ways of thinking captured by probability theory, game theory, operations research, qualitative analysis, and other analytic methods (see Chapter 3). Each method provides a different way to look at the world and organize data. Each has been refined through decades (even centuries) of rigorous peer review and has well-understood strengths and limitations. Making them part of IC analysts’ basic intellectual repertoire will increase analysts’ ability to address their customers’ needs.
The Director of National Intelligence should ensure that intelligence community agencies use evidence-based methods to recruit, select, train, motivate, and retain an adaptive workforce able to achieve the performance levels required by intelligence community missions. On the basis of that research:
The intelligence community should recruit and select individuals who have the stable individual attributes (e.g., cognitive ability, personality, values) known through research to be associated with better performance.
The intelligence community’s training, motivation, and performance feedback should focus on improving malleable individual attributes (e.g., job-specific skills) associated with better performance.
The intelligence community should expand opportunities for continuous learning that will enhance collaboration, innovation, and growth in the application of analytical skills.
Create a course to provide all IC analysts with basic familiarity with the full range of analytical methods with strong scientific foundations (e.g., probability theory, statistics, game theory, qualitative analysis).
Create an inventory of psychometrically validated measures of intellectual ability that can be administered to current and prospective analysts, in order to study which abilities are related to analytical performance.
Convene an independent working group of human resource scientists to review current recruitment, selection, motivation, and retention practices in light of the relevant behavioral and social science.
Develop on-the-job training programs to cultivate a culture of continuous learning, whereby the entire workforce is actively involved as both teachers and students.
Recognizing that essential information is often scattered across individuals and units, the IC has made collaboration central to its current efforts. The need for collaboration is recognized in Enterprise Objectives 1, 2, and 4 of the National Intelligence Strategy (Office of the Director of National Intelligence, 2009; see Box 1-1 in Chapter 1) and has been the motivation for such innovations as A-Space, Intellipedia, the Analytical Resources Catalogue (ARC), the Library of National Intelligence, and joint IC duty positions (see Intelligence Community Directive 601 [Office of the Director of National Intelligence, 2006]). All of these innovations are intended to familiarize intelligence officers with a wide variety of intelligence requirements, methods, users, and capabilities (see Intelligence Reform and Terrorism Prevention Act of 2004, P.L. 108-458). These innovations seek to create the agility needed to cope with adversaries who have rapidly shifting identities and operations.
Behavioral and social science findings provide reason to believe that these innovations do, in fact, promote the collaboration that the IC seeks. They are flexible, allowing analysts to adapt them to their own purposes. They are open, allowing analysts to create self-organizing groups that are adapted to specific tasks. They are complementary, allowing analysts to choose the methods best suited to their needs. At the same time, however, those behavioral and social science findings also provide reasons to question the efficacy of these innovations. For example, these methods can be time consuming and provide information from unfamiliar sources, with uncertain quality. Given these contradictory possibilities, we recommend that the IC undertake systematic empirical study of these “natural experiments,” assessing the impacts of the various methods for different uses and users. Although not expensive, such evaluations require methodological sophistication in order to create fair tests and to provide useful guidance on how the innovations could be improved.
The Director of National Intelligence should require systematic empirical evaluation of current and proposed procedures for enhancing the collaboration that is essential to fulfilling the intelligence community’s mission. That evaluation should be based on scientific principles that prescribe the extent and form of collaborative methods for effective performance under the intelligence community’s operating conditions. This approach will require ongoing innovation, evaluation, and learning about collaborative methods.
Conduct field evaluations of at least two collaborative methods, assessing their uses, users, and impacts. Create and implement an evaluation methodology that can then be used more broadly.
Collaborative aids like A-Space should be subjected to rigorous evaluation of what they do well and poorly. That evaluation should examine the possibility of enhancing A-Space with programs that prompt collaboration between analysts who are working on related problems, but are unaware of their mutual interests.
Develop a database, or modify the Library of National Intelligence, to characterize analyses in terms of features that might be related to their effectiveness, such as the methods used, the contacts consulted, and the collaborations undertaken.
Effective communication is essential to ensuring that analysts understand their customers’ information needs and that their customers understand the analysts’ conclusions and confidence levels. These needs are recognized in Enterprise Objective 2 of the National Intelligence Strategy (Office of the Director of National Intelligence, 2009; see Box 1-1 in Chapter 1). However, there are many potential barriers to effective communication, including the natural tendency to exaggerate how well one communicates, the frequent lack of direct contact between analysts and their customers, and the many steps in the IC review, coordination, and approval process, each capable of improving or degrading how well the resulting report conveys the original analysts’ intent.
The clarity demanded by the evaluation processes that the committee proposes in Recommendation 2 will provide a foundation for better communication, by requiring analysts to be explicit about their terms, predictions, uncertainty, rationale, and conditions of validity. Standardization
will facilitate creating communication protocols that convey the intended meaning of the analyses to customers, as well as permitting the elicitation of customers’ needs in clear terms. These kinds of protocols can build on the science of communication and use its methods to evaluate analysts’ success in establishing the needed situational awareness. Over time, a disciplined approach to communication will make analysts and customers more sophisticated about one another’s worlds, improving the collaboration between them.
The Director of National Intelligence should implement scientific, evidence-based protocols for ensuring that analysts and customers understand one another. Achieving this goal will require standard protocols for communicating the uncertainties and limitations of analyses, expanded opportunities for analysts to learn about customers’ needs, and feedback evaluating the usefulness and presentation of analyses.
Develop and evaluate standard protocols for communicating the confidence that should be placed in analytic judgments (following Recommendation 2).
Evaluate the efficacy of current methods for requesting analyses in terms of how well they convey customers’ intentions to analysts.
Evaluate the impact of internal review processes on how well the resulting reports convey analysts’ intended meaning.
The IC has recently undergone its most sweeping structural changes since 1947, including the creation of the ODNI. The IC is also undergoing a demographic transition, with new analysts bringing different backgrounds and capabilities into the community. At the same time, new technologies offer new capabilities for data gathering, data sharing, and collaboration that may either aid or distract analysts. The IC has received additional resources, along with growing public awareness of its importance.
Taking full advantage of these opportunities will require carefully planned strategies. New analysts must be trained in tradecraft, rewarded for high-quality performance, and provided access to veteran analysts’ wisdom and tacit knowledge. New methods and technologies have to be designed with analysts in mind, subjected to rigorous evaluation, and kept from interfering with normal individual and collective thought processes.
Even if the world were static, complex activities that involve people, like intelligence analysis, would never be perfect. As a result, the IC must continually evaluate its own performance, both to learn from its experience and to provide policy makers with realistic expectations of its capabilities.
The committee’s recommendations are interdependent. Without appropriate human resource policies, analysts cannot create the communication networks needed to share information. Without regularly updating their theoretical knowledge, analysts cannot take advantage of the evidence and information sources available to them. Without sound, informative performance evaluation, no one can know how well any methods are working.
Any change involves a gamble, sacrificing the relative stability of current practices in return for the promise of improved future performance. Change is necessary today, because traditional analytical methods and institutional arrangements are increasingly challenged by the demands on IC analysis. Pursuing such disciplined, evidence-based change will require strong leadership. Analysts need to know that their organization will support them if they innovate and if they rigorously evaluate their own performance.
Strong leadership is needed to acknowledge that intelligence analysis is inherently imperfect, then to create realistic standards of accountability, demanding the best feasible performance. Leadership is needed to recognize that even the best systems lose their efficacy if the world changes faster than they do. That leadership must be manifested externally, by subjecting performance to well-designed tests and rejecting unfair ones; it must be manifested internally, by showing that “evaluation” denotes learning and not fixing blame. The behavioral and social sciences provide a foundation for taking the best possible gambles, regarding analytical and management processes, then objectively evaluating their success.
Baddeley, A.D. (1979). Applied cognitive and cognitive-applied psychology: The case of face recognition. In L.G. Nilsson, ed., Perspectives on Memory Research (pp. 367-388). Hillsdale, NJ: Lawrence Erlbaum.
Fischhoff, B. (2008). Assessing adolescent decision-making competence. Developmental Review, 28, 12-28.
Heuer, R.J., Jr. (1999). Psychology of Intelligence Analysis. Washington, DC: Center for the Study of Intelligence, Central Intelligence Agency.
Mandel, D. (2009). Canadian Perspectives: Applied Behavioral Science in Support of Intelligence Analysis. Paper presented at the meeting of the Committee on Behavioral and Social Science Research to Improve Intelligence Analyses for National Security, Washington, DC. May 14. Available: http://individual.utoronto.ca/mandel/nas2009.pdf [October 2010].
Office of the Director of National Intelligence. (2006). Human Capital Joint Intelligence Community Duty Assignments. Intelligence Community Directive Number 601 (Effective: May 16, 2006). Available: http://www.dni.gov/electronic_reading_room/ICD_601.pdf [April 2010].
Office of the Director of National Intelligence. (2009). National Intelligence Strategy of the United States of America. Available: http://www.dni.gov/reports/2009_NIS.pdf [June 2010].