The Intelligence Community (IC) and the social and behavioral sciences (SBS) research community have had many opportunities to work jointly since the early 20th century, and several structures are currently in place to facilitate that collaboration (see Chapter 9 and Appendix A). Nonetheless, each of these communities has a distinct culture and has developed specialized ways of working. For readers not already deeply familiar with the way both academic researchers and IC analysts work, it may be useful to learn more about the contexts in which their work is carried out. This chapter provides a brief introduction to the two communities and offers some observations about similarities and differences in the challenges of research and analysis they face.
The United States devotes considerable resources to intelligence and national security (see Box 2-1). In 2017, the total spent on intelligence was $70.3 billion, of which $53.5 billion was for national intelligence (the remainder was for military intelligence) (Miles, 2016). This work is carried out by what is loosely known as the IC, defined by the Office of the Director of National Intelligence (ODNI) as “a federation of executive branch agencies and organizations that work separately and together to conduct
intelligence activities necessary for the conduct of foreign relations and the protection of the national security of the United States.”1
The work of the 17 agencies that make up the IC2 (see Box 2-2) is not readily understood by outside observers. Much of that work is classified, and those who perform it rely on complex procedures and traditions that have developed over decades. During the past few decades, the public and the academic community have sought greater understanding of how the national security infrastructure—whose function is widely recognized as critical—operates. Aspects of the agencies’ work have become better understood, in part because of journalists’ and scholars’ pressure for access to both historical and current information, and in part as a result of the work of a growing number of academic researchers drawn from such fields as history, international studies, and political science who have focused on the study of intelligence gathering and assessment (Monaghan, 2009), contributing interdisciplinary understanding of the role and functioning of spy agencies, intelligence abuses, and other issues.3
1 Available: https://www.dni.gov/files/documents/ICD/ICD%20203%20Analytic%20Standards.pdf [January 2019].
2 The terms “national security,” “defense,” “intelligence,” and “Intelligence Community” are used somewhat informally to refer to related and sometimes overlapping functions associated with protecting the United States from domestic and external threats. In this report we focus on the work of intelligence analysts—those employed by the 17 IC agencies to analyze intelligence; when we refer to the IC, we mean that workforce and the agency leadership that oversees it.
3 See, for example, Intelligence and National Security, the Journal of National Security Law & Policy, and the Journal of Global Security Studies.
The focus of this report is on the work of the intelligence analyst, carried out in the IC context. In this section, we provide an overview of the primary functions of the agencies dedicated to protecting the nation’s security, based on information made publicly available through official documents and websites produced by the IC agencies.
The government officials responsible for the nation’s security rely on information collected by the IC. Each of the 17 IC agencies has a particular focus and mission, but together they provide support to decision makers (often referred to as “customers” or “clients” because it is they who act
on the intelligence provided) in the executive and legislative branches of government, law enforcement, and the military.
The National Security Act of 19474 launched the U.S. national security establishment. The changing nature of both foreign and domestic threats, as well as differing views about such issues as the purposes for which intelligence may be collected and used, the proper leadership and oversight for agencies involved in collecting it, and protections for U.S. citizens’ privacy, has influenced numerous reorganizations of the IC in the course of the nation’s history (Federation of American Scientists, 1996; Rosenwasser and Warner, 2017). As new agencies involved in collecting and analyzing intelligence have emerged, coordinating their work has become an increasing challenge. The Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA)5 established ODNI to lead the IC, and charged it with synchronizing the collection, analysis, and counterintelligence carried out across the IC agencies (Office of the Director of National Intelligence, 2013).6
While the structure of the IC has evolved in response to the nation’s changing needs, its overall responsibility has not changed: it is collectively responsible for identifying issues that need to be investigated, collecting and analyzing relevant information of many kinds, and conveying the information and analysis in a timely manner to those who need them while also keeping the information secure and complying with laws regarding its collection and handling.
Of the 17 IC agencies, 8 reside within the U.S. Department of Defense (DoD). While ODNI has a coordination function, it is also an agency itself, as is the Central Intelligence Agency (CIA).7 Five departments of the executive branch have their own intelligence agencies as well: the Federal Bureau of Investigation (FBI) and the Drug Enforcement Administration (DEA) under the Department of Justice and agencies under the Departments of Energy, Homeland Security, State, and Treasury (refer to Box 2-2).
The IC agencies are guided by “a multitude of laws, executive orders, policies, and directives,” and their missions overlap to some extent (Federation of American Scientists, 1996). The entire IC is guided by Executive Order 12333 (as amended),8 but each of the IC agencies also contains many offices, directorates, and other units, each with its own areas of authority, methods, and responsibilities. For example, the CIA integrates intelligence
4 National Security Act of 1947, Public Law No. 80-253, 61 Stat. 495 (1947).
5 Intelligence Reform and Terrorism Prevention Act of 2004, Public Law No. 108-458, 118 Stat. 3638 (2004).
8 Exec. Order No. 12,333, 3 C.F.R. 200 (1981), reprinted in 50 U.S.C. § 401 app. at 44-51 (1982).
from all sources into a body of “national intelligence” that supports the development of foreign and domestic policy; it is also the principal collector of certain types of intelligence and has responsibility for covert action. The FBI’s mission is to “protect the American people and uphold the Constitution of the United States,” but it also has agents in other nations and collects intelligence of many sorts (Federal Bureau of Investigation, 2018). The Department of Homeland Security has an almost identical mission—to “safeguard the American people, our homeland, and our values”—although it focuses on threats to aviation and border security, the integrity of cyber networks, and violent extremism (U.S. Department of Homeland Security, 2018). The Defense Intelligence Agency (DIA) has a different focus: integration of intelligence collected by the various service branches into “defense intelligence” used to support the missions of U.S. military forces (Defense Intelligence Agency, 2018).
Many different types of intelligence exist, reflecting the purposes for which they are collected. While no canonical definitions exist for these various types of intelligence, ODNI defines the foundational intelligence mission of the IC as the collection of the following:
- strategic intelligence—“to inform and enrich understanding of enduring national security issues”;
- anticipatory intelligence—“to detect, identify, and warn of emerging issues and discontinuities”; and
- current operations—“to support ongoing actions and sensitive intelligence operations.”9
Observers of intelligence work also commonly distinguish among strategic intelligence, used by “policymakers to make policy and military decisions at the national and international level” (Rosenbach and Peritz, 2009, p. 10); operational intelligence, used by “leaders to plan and accomplish strategic objectives within the operational area” (Rosenbach and Peritz, 2009, p. 10); and tactical intelligence, best known as the information provided to military leaders in the field to support them in planning and executing battles and engagements, helping them “accomplish immediate tactical objectives” (Rosenbach and Peritz, 2009, p. 10). As the ODNI definition indicates, strategic intelligence deals with long-range issues and needs related to the development of national strategy and policy. It is a primary source of insights into international situations, and supports the
9 Available: https://www.dni.gov/index.php/what-we-do/what-is-intelligence [January 2019].
development of military plans and strategic operations (Clark, 2013; Miles, 2016). Operational intelligence concerns the capabilities and intentions of adversaries, and is used both in military contexts (e.g., to monitor events or support military campaigns) and in diplomatic contexts (e.g., to support negotiation of an arms reduction treaty). Law enforcement also may use operational intelligence to plan an operation, such as mass arrest of members of an organized crime syndicate (Clark, 2013; Miles, 2016). Tactical intelligence provides commanders with information regarding imminent threats to their forces and changes in the operational environment (Miles, 2016). Law enforcement uses tactical intelligence in a similar manner.
These categories of intelligence encompass a wide range of information needs and possible sources of information. It is important to note as well that intelligence includes more than information that is gathered through clandestine means or needs to be classified; it also includes many types of readily available information, and a key function of the analyst is to identify which information will be useful to the client. In developing strategic plans, for example, policy makers may need analyses of developments in a geographic area that are based on historical and political science research, as well as continuous monitoring of groups and trends, but that also reflect up-to-the-minute intelligence about fast-moving events. Decision makers may need support in anticipating immediate or longer-range threats or opportunities, in making sense of unexpected developments, or in responding to crises. The six basic types of raw information that make up most intelligence are described in Box 2-3. They are distinguished primarily by the methods used to collect the information, and multiple agencies may contribute to the collection of each type (Krizan, 1999; Office of the Director of National Intelligence, 2019; Rosenbach and Peritz, 2009).
The IC agency tasked with providing the intelligence and the client (i.e., user of the intelligence) must work together: the analyst must understand the client’s intelligence need to identify and efficiently synthesize the information that will meet that need. As discussed in greater detail in Chapter 4, the analyst’s responsibility is to provide information that is as complete and accurate as possible, but often in an extremely tight timeframe. Frequently, the client’s need is both complex and urgent: the questions requiring answers may require both deep understanding of social, cultural, and historical trends and the integration of complex technologically generated information, all in the context of fast-moving developments in which lives may hang on a policy maker’s decision.
The set of academic disciplines generally referred to as the social and behavioral sciences (SBS) is large and diverse. Researchers in these fields use
a wide range of scientific methods, research strategies, and tools and rely on diverse theoretical approaches,10 but what they share is “their analytic focus on the behavior, attitudes, beliefs, and practices of people and their organizations, communities, and institutions” (National Research Council, 2012, p. 10). SBS researchers inquire into people and what they do from many different stances and ask a wide variety of questions about individuals, groups, communities, societies, and nations. They may examine, for example, individual mental processes that guide behavior, ways in which cultural practices and attitudes are shared and evolve across generations, or how water shortages are influencing political developments in a particular region. Within each SBS discipline, moreover, there are multiple subspecialties that have developed their own research approaches and methodologies. This is a large research community: by one estimate it encompasses 35,490 social scientists in the United States alone, and that number does not include researchers employed outside of academic institutions (U.S. Bureau of Labor Statistics, 2018). Research collaboration and sharing of knowledge across international borders are commonplace in SBS disciplines as well; U.S. researchers expand their knowledge through international journals and academic conferences.
While it is not simple to develop a comprehensive list of SBS fields, they include areas as diverse as demography and social statistics, sociology, economics, linguistics, social anthropology, international relations, and psychology. Most—though not all—of these fields have applications important for national security. The challenges and risks discussed in this report reflect the importance of analyses that, for example, reveal implications of the aging of populations in certain countries, support forecasts of political crises, or indicate shifts in the narratives of extremist groups that provide indicators of objectives or targets.
Researchers across SBS fields use many different research approaches, including varieties of experimental and observational studies, evaluation, meta-analysis, and qualitative and mixed-methods research.11 As discussed below, significant advances have recently been made possible by expanded computing power, increased capacity for work with large-scale datasets, and improved methods of analysis (National Research Council, 2012). A comprehensive grounding in the methods and approaches used across the
10 This report addresses many aspects of SBS research. The term “research tools” is used to refer to specific means of collecting data, including both traditional means, such as questionnaires or survey instruments, and such technology-supported tools as software for use with large-scale datasets. The terms “research approach” and “theoretical approach” are used to refer to the researcher’s theoretically based ideas about how to apply tools and methodology to answer a defined question.
SBS is beyond the scope of this report, but before turning to key technological developments, we note three general trends that apply across SBS fields.
One important trend is an increase in work that cuts across disciplines, in some cases yielding new hybrid disciplines (transdisciplinary research) (Scott et al., 2015). Integrating research methods and types of data from different disciplines, including many of the physical and life sciences, has proven critical for satisfactorily investigating complex problems. For example, researchers interested in environmental sustainability have recognized that to develop a complete picture, it is necessary to combine understanding of such areas as human cognition, risk perception, economic behavior, and problem solving and decision making with understanding of climate change and its impacts on weather and infrastructure (see, e.g., Tainter et al., 2015). Interdisciplinary work is particularly important in the context of the technological advances discussed below.
Also of note—and an important intellectual backdrop for much of the research discussed in this report—is a significant evolution in thinking that has grown out of work involving not only such SBS fields as neuro- and cognitive psychology, social psychology, and sociology but also physical and life science disciplines, including neuroscience and genetics. Building on work by Bronfenbrenner and others (e.g., Bronfenbrenner, 1977, 1994) that elaborated on how the individual is influenced by family, community, and the broader social context, researchers have developed a much more sophisticated picture of these interactions.12 Emerging work has demonstrated the dynamic interplay between human development and the environment—from the genetic and epigenetic levels to the broadest sociocultural level. Insights about the importance of culture and environment have profoundly influenced the study of many aspects of the human experience, including learning and development (see NASEM, 2018) and the factors that influence mental and behavioral disorders in young people (see NASEM, forthcoming). This rapidly developing area of study has clear implications for security-related topics, from the growth of extremist ideology in an individual to the influences that shape the beliefs and norms of a particular group or society. Application of these emerging ideas in the context of national security concerns is a promising frontier for further research.
A third trend that cuts across SBS fields is growing understanding that a significant majority of SBS research has relied on information collected within cultures that are Western, educated, industrialized, rich, and democratic (WEIRD). Conclusions drawn from research conducted with samples that are limited in this way cannot readily be applied beyond the context in which the data were collected; this is known as the WEIRD problem (see, e.g., Henrich et al., 2010; Nielsen et al., 2017; see also the
discussion of study populations in NASEM, 2018). Researchers across numerous fields have become increasingly careful to consider these characteristics in the design of studies and the analysis and reporting of results. This issue has particular relevance for the IC, not least because the IC is concerned with understanding the behavior and motivations of people and societies around the globe.
The accuracy and validity of research are of paramount importance for both the SBS researchers whose calling is to produce it and the intelligence analysts who rely on this and other research while also engaging in many forms of research in their own work. Moreover, understanding human behavior is, in a sense, the primary objective for each of these professional communities, and they draw on many of the same resources in their quests for that understanding. At the same time, however, very different pressures, constraints, and cultures have shaped these two professions. A brief look at an emerging phenomenon—big data—that has had a significant impact on both SBS research and intelligence analysis highlights the overlap between these two endeavors; we also look briefly at interdisciplinary collaborations and at key differences between the two communities.
The digitization of society has resulted in an exponential increase in the amount of data available to analyze, and has made access to new sorts of data easy and widespread—a phenomenon loosely referred to as “big data.”13 While there is no one best definition of the term “big data,” it is generally used to refer to extremely large sets of digital data that cannot be digested without advanced analytic techniques. This development has had a profound influence on the kinds of work that can be done by both SBS researchers and the IC. It has led to increased interdisciplinary collaboration (see below), increased use of mixed methods, increased reliance on computational techniques and models, and increased reliance on online environments and publicly available data for research (Scott et al., 2015).
Digital data fall into two categories. Most broadly, virtually all existing forms of data are now routinely archived digitally. Historical information that would be of interest to the IC and was originally stored in analog form is being converted to digital form through scanning and optical character recognition. These data are stored in digital repositories accessible through
websites. A second, more rapidly growing category of digital data is information being generated continuously on digital platforms supported by Internet and web protocols. These data include the content—text, graphics, audio, and video—resulting from actions, interactions, and transactions that occur in digital space. In addition to this content, digital trace data provide “metadata” about, for instance, who interacts with whom, about what, and at what time.
All of these data offer unprecedented opportunities for both scholarly researchers and IC analysts. Beginning approximately a decade ago, both groups began to encounter exponentially increased volumes of data potentially relevant to their work, although much of the newer data is what statisticians term “noisy” (data in which random variation and irrelevant information obscure the meaningful signal). This relatively new data glut has precipitated considerable progress in descriptive, predictive, and prescriptive analytic techniques as researchers have pursued ways to use newly available types of data:
- Descriptive analytic techniques are means of summarizing large tracts of data to discern patterns, often using exploratory data analyses. These techniques can be used, for instance, to detect a significant and abrupt growth in messaging within an adversarial social network.
- Predictive analytic techniques are means of discerning associations among variables, with the goal of predicting certain variables (e.g., civil unrest) based on their association with others (e.g., use of certain terms in social media chatter). Machine learning techniques are valuable for this sort of analysis: computers are “trained” with known data (e.g., available historical data about the variable of interest) to recognize the sorts of patterns in which the analyst is interested, and the resulting algorithms are used to make predictions about the variable of interest (NASEM, 2018).
- Prescriptive analytic techniques are particularly useful for identifying which variables can be manipulated to bring about a desired change in the variable of interest. Here, recent advances in what are called optimization techniques can leverage insights from machine learning models to go beyond predicting what might happen and offer insights on what to manipulate to produce a desired outcome. A helpful, although not entirely accurate, analogy can be found in weather forecasting: descriptive analytics indicate the current weather conditions, while predictive analytics provide the forecast. Prescriptive analytics might result in recommending the use of cloud seeding to trigger precipitation.
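To make the first of these categories concrete, consider the example above of detecting a significant and abrupt growth in messaging within a network. A minimal descriptive-analytics sketch of that idea—using entirely synthetic data and an invented `detect_spike` helper, not any tool actually used by analysts—might flag days whose message volume rises several standard deviations above a recent rolling baseline:

```python
import numpy as np

def detect_spike(counts, window=7, threshold=3.0):
    """Flag days whose message volume jumps well above the recent baseline.

    counts: daily message counts (sequence of numbers)
    window: number of preceding days used as the baseline
    threshold: z-score above which a day is considered anomalous
    Returns a list of indices of anomalous days. Illustrative only.
    """
    counts = np.asarray(counts, dtype=float)
    anomalies = []
    for t in range(window, len(counts)):
        baseline = counts[t - window:t]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma == 0:
            sigma = 1.0  # avoid division by zero on a perfectly flat baseline
        if (counts[t] - mu) / sigma > threshold:
            anomalies.append(t)
    return anomalies

# Synthetic example: ~20 messages/day of steady chatter, then an abrupt surge.
rng = np.random.default_rng(0)
volume = rng.poisson(20, size=30).tolist() + [95, 110, 120]
print(detect_spike(volume))  # the surge beginning at index 30 is flagged
```

Real pattern-detection work on communication data is of course far more sophisticated, but the basic move—summarizing a large data stream against its own history to surface departures from normal—is the essence of the descriptive techniques described above.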
Advances in large-scale data collection and computational analytic techniques, as well as the continuing growth of computer power, have opened up promising new frontiers in SBS research (National Research Council, 2012). Large-scale datasets—including social media, events data, and geographic information system (GIS) data—are sources of information that analysts and policy makers can use to better monitor and understand global events in real time and at a finer granularity than was previously possible. Advances in machine learning, robust optimization, computer-assisted content analysis, social network analysis, agent-based modeling, and other emerging computational approaches are facilitating the generation of reliable and robust SBS insights using big data.
Machine learning tools such as Random Forest—an algorithm used for decision making and classification (see Breiman and Cutler, n.d.)—for example, have the potential to provide more accurate predictions of rare events, such as the outbreak of civil war, relative to traditional statistical methods (Muchlinski et al., 2016). Combining automated textual analysis and social network analysis, researchers have shown that it is possible to “analyze hacker chats and other data faster and more efficiently than had previously been possible,” enabling them to identify hackers’ intentions (NASEM, 2017b, p. 17). This work may improve forecasts of hacker threats (NASEM, 2017b). And recent research at the intersection of cognitive and computer science is generating new visualization tools that facilitate human exploration and understanding of the complex, multiscalar phenomena that are the subjects of computational social science (National Science Foundation, 2011).
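As an illustrative sketch of the random-forest approach mentioned above—using scikit-learn on invented, class-imbalanced synthetic data rather than the actual conflict data analyzed by Muchlinski et al.—one might train a forest on a rare outcome and rank held-out cases by predicted event probability. The feature names in the comments are hypothetical stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# Invented covariates loosely analogous to those used in conflict forecasting.
X = np.column_stack([
    rng.normal(size=n),   # e.g., economic growth rate (hypothetical)
    rng.normal(size=n),   # e.g., regime-durability index (hypothetical)
    rng.uniform(size=n),  # e.g., prior-instability score (hypothetical)
])
# Rare binary outcome, partly driven by the covariates via a logistic model.
logits = -5.0 + 3.0 * X[:, 2] - X[:, 0]
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# class_weight="balanced" reweights the rare class so the forest does not
# simply predict "no event" for every case.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0)
clf.fit(X_tr, y_tr)

# Rank held-out cases by predicted event probability rather than hard labels;
# for rare events, the ranking is usually more useful than 0/1 predictions.
risk = clf.predict_proba(X_te)[:, 1]
top_decile = np.sort(risk)[-len(risk) // 10:]
print(f"base rate: {y.mean():.3f}, top-decile mean risk: {top_decile.mean():.2f}")
```

The design choice worth noting is the handling of class imbalance: rare events such as civil-war onset defeat naive classifiers, and reweighting (or resampling) is one standard remedy that the comparative studies cited above examine.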
Research at these new frontiers is often interdisciplinary, drawing on multiple methodological, theoretical, and empirical approaches. This is the case because the intellectual advances in these areas are most likely to occur at the nexus of advances in data science, computer science, and SBS research. Traditionally, SBS researchers have relied most heavily on theory-driven approaches that are either deductive (hypotheses derived from a priori theories) or inductive (theories inferred from data, sometimes referred to as “grounded theory”). In contrast, data scientists and computer scientists rely on data-driven approaches often focused primarily on prediction, even at the cost of theoretical explanation or understanding (Anderson, 2008). This artificial and unhelpful methodological divide can be transcended by interdisciplinary approaches that generate insights by iterating between theory-driven and data-driven approaches. The result has been renewed interest in what social scientists characterize as the abductive approach, to distinguish it from the deductive or inductive approach (Haig, 2005).
Indeed, recent assessments of the state and future of SBS research have focused on the advantages of interdisciplinary work in this area. The National Institutes of Health’s Office of Behavioral and Social Research has highlighted the promise that interdisciplinary research exploiting new technologies and advances in data analytics holds for SBS knowledge (National Institutes of Health, 2017). And as emphasized by the National Science Foundation’s Rebuilding the Mosaic report, which is based on a synthesis of more than 200 white papers from the SBS research community, the future of SBS research is “interdisciplinary, data-intensive, and collaborative” (National Science Foundation, 2011).
Another development is a recent move to thinking in terms of “broad” data, which offers researchers the unprecedented ability to fuse diverse sources of data, yielding information that is much more valuable than an undifferentiated stream of unstructured data. An example is the juxtaposition of neuroimaging and behavioral data on humans. This juxtaposition has demonstrated the potential for new theoretical advances in predicting, for instance, the efficacy of strategies designed to influence individuals’ attitudes and behaviors based on activation of specific areas in the brain (Estrada et al., 2017). Another promising avenue has been the use of physical activity data on team members collected from wearable devices to offer new insights into team performance through real-time assessment of cognitive functioning (Kim et al., 2012; Pentland, 2012).
Cognitive neuroscience and related technologies are expected to continue to converge with and in some respects transform SBS research, drawing on advances in psychopharmacology, functional neuroimaging, and computational biology, among other fields. Models of psychological states and intentions are being drawn from increasingly sophisticated neurophysiological assessment technology. Noninvasive functional brain imaging technologies have been progressing rapidly (National Research Council, 2008), and may be applied to measurements of neural degeneration related to cognitive function and assessments of readiness for complex tasks.
These examples illustrate the significant potential of big data and related developments for both SBS researchers and intelligence analysts, but also highlight challenges. The scale of the available information continues to grow exponentially, and the pace of cyber-based developments also demands much faster responses. The Internet has often been described as the “wild west” because it is largely unregulated, although recent efforts to undermine democratic processes and other security threats are leading to an examination of possibilities for policing aspects of this realm. In any case, the openness and rapid pace of change characteristic of the Internet pose challenges for the IC and for researchers who hope to study the data it yields.
Although challenges and opportunities presented by big data highlight possibilities that can benefit both the SBS research community and the IC, they also point to significant differences that have implications for the work the two communities do together. Many of these differences stem from the fundamentally different purposes of the researcher and the analyst. The researcher’s basic purpose is to add to the sum of human knowledge using the methods of scholarship, including forming a hypothesis based on data collected, testing that hypothesis, analyzing the results, and drawing inferences based on the evidence obtained. Researchers are trained to use the methods honed within their specialty, and to apply and contribute to theoretical models designed to explain observable phenomena. They are encouraged to pursue questions because those questions are interesting and unanswered, even if the practical application of the answers is not immediately apparent. Although resource limitations—and the imperative to produce regular publications—are natural constraints, researchers generally feel relatively little pressure to produce results within a given timeframe.
Analysts, by contrast, are focused on objectives related to national security and foreign policy. As discussed in Chapter 4, they may have a range of specific assignments, which may require them to seek deep understanding of a region, historical trends, and other more general knowledge quite similar to that sought by academic researchers. But their attention does not waver from the potential application of that knowledge to security concerns: answering policy makers’ questions about the world and alerting them to possible risks and opportunities. Typically, this means providing analysis that is a best guess based on the information available at the time. Researchers have the luxury of time to complete a systematic process of collecting and analyzing information and testing their conclusions, whereas analysts must often provide rapid-fire, definitive answers in urgent circumstances.
The academic researcher also generally has greater latitude than the analyst to plan a systematic approach to the collection of evidence. Few research projects afford the opportunity to collect all the data that could be of value, but means of addressing limitations can be part of the research design. Analysts, by contrast, must generally rely on data that are acquired opportunistically and are therefore not representative, unbiased samples. Bias and questions about representativeness also affect SBS research—and researchers, too, may take advantage of available data—but the data available to analysts may also be compromised by the efforts of adversaries to disguise the data or otherwise deceive, which adds a layer of questions about validity that do not affect most academic research.
Curiosity and educated guesswork likely play an important role in both academic research and intelligence analysis,14 but a researcher is ultimately judged by his or her peers on the basis of the documented methods used in the research and the extent to which the work is substantiated by that of others. The analyst, on the other hand, is judged in practice primarily by the extent to which he or she has provided valuable information and answers to the decision makers who rely on that information.15 The researcher’s conclusions may be described in an article or book dozens or hundreds of pages long, and filled with contextual information and detailed discussion of how conclusions are supported. The analyst’s results, by contrast, may be presented in a very brief text or PowerPoint presentation, or some other format, as needed by a client who must digest the results quickly and has very limited time for subtleties related to multiple alternatives or other complexities.
It is perhaps natural, then, that tensions have sometimes arisen when these two communities have collaborated, as discussed in Chapter 9. But it is also clear that the two share important interests and methods.
14 For discussion of the development of expertise and how it shapes an individual’s capacity to integrate new information and recognize significant patterns and anomalies, see NASEM (2018); National Research Council (2000).
15 There are analytic standards for the IC; see https://www.dni.gov/files/documents/ICD/ICD%20203%20Analytic%20Standards.pdf [January 2019].
Anderson, C. (2008). The end of theory: The data deluge makes the scientific method obsolete. WIRED, June 23. Available: https://www.wired.com/2008/06/pb-theory [November 2018].
Breiman, L., and Cutler, A. (n.d.). Random Forests. Available: https://www.stat.berkeley.edu/~breiman/RandomForests/cc_home.htm#workingstheory [November 2018].
Bronfenbrenner, U. (1977). Toward an experimental ecology of human development. American Psychologist, 32(7), 513–531.
Bronfenbrenner, U. (1994). Ecological models of human development. In International Encyclopedia of Education (2nd ed., Vol. 3). Oxford, UK: Elsevier.
Clark, R. (2013). Intelligence Collection. Washington, DC: CQ Press.
Estrada, M., Schultz, P.W., Silva-Send, N., and Boudrias, M.A. (2017). The role of social influences on pro-environment behaviors in the San Diego region. Journal of Urban Health, 94(2), 170–179.
Federal Bureau of Investigation. (2018). Mission & Priorities. Available: https://www.fbi.gov/about/mission [November 2018].
Federation of American Scientists. (1996). An Overview of the Intelligence Community. Available: https://fas.org/irp/offdocs/int023.html [November 2018].
George, R., and Rishikof, H. (2017). The National Security Enterprise: Navigating the Labyrinth. Washington, DC: Georgetown University Press.
Haig, B.D. (2005). An abductive theory of scientific method. Psychological Methods, 10(4), 371–388.
Henrich, J., Heine, S.J., and Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2-3), 61–135. doi:10.1017/S0140525X0999152X.
Kim, T., McFee, E., Olguin, D.O., Waber, B., and Pentland, A. (2012). Sociometric badges: Using sensor technology to capture new forms of collaboration. Journal of Organizational Behavior, 33(3), 412–427.
Krizan, L. (1999). Intelligence Essentials for Everyone. Washington, DC: Joint Military Intelligence College. Available: http://www.dtic.mil/dtic/tr/fulltext/u2/a476726.pdf [November 2018].
Miles, A.D. (2016). Intelligence Community Programs, Management, and Enduring Issues. Washington, DC: Congressional Research Service. Available: https://fas.org/sgp/crs/intel/R44681.pdf [November 2018].
Monaghan, P. (2009). Intelligence studies. The Chronicle of Higher Education, March 20. Available: https://www.chronicle.com/article/Intelligence-Studies/33353 [February 2019].
Muchlinski, D., Siroky, D., He, J., and Kocher, M. (2016). Comparing random forest with logistic regression for predicting class-imbalanced civil war onset data. Political Analysis, 24(1), 87–103.
National Academies of Sciences, Engineering, and Medicine (NASEM). (2017a). Strengthening Data Science Methods for Department of Defense Personnel and Readiness Missions. Washington, DC: The National Academies Press. doi:10.17226/23670.
NASEM. (2017b). The Value of Social, Behavioral, and Economic Sciences to National Priorities: A Report for the National Science Foundation. Washington, DC: The National Academies Press. doi:10.17226/24790.
NASEM. (2018). How People Learn II: Learners, Contexts, and Cultures. Washington, DC: The National Academies Press. doi:10.17226/24783.
NASEM. (forthcoming). Report of the Committee on Fostering Healthy Mental, Emotional, and Behavioral Development Among Children and Youth; see http://sites.nationalacademies.org/dbasse/bcyf/meb-health-promotion/index.htm.
National Academy of Sciences. (2018). The Frontiers of Machine Learning: 2017 Raymond and Beverly Sackler U.S.–U.K. Scientific Forum. Washington, DC: The National Academies Press. doi:10.17226/25021.
National Institutes of Health. (2017). The Office of Behavioral and Social Sciences Research Strategic Plan 2017–2021. Available: https://obssr.od.nih.gov/wp-content/uploads/2016/12/OBSSRSP-2017-2021.pdf [May 2017].
National Research Council. (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academy Press.
National Research Council. (2008). Emerging Cognitive Neuroscience and Related Technologies. Washington, DC: The National Academies Press.
National Research Council. (2012). Using Science as Evidence in Public Policy. Committee on the Use of Social Science Knowledge in Public Policy. K. Prewitt, T.A. Schwandt, and M.L. Straf (Eds.). Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Science Foundation. (2011). Rebuilding the Mosaic: Fostering Research in the Social, Behavioral, and Economic Sciences at the National Science Foundation in the Next Decade. Arlington, VA: National Science Foundation. Available: https://www.nsf.gov/pubs/2011/nsf11086/nsf11086.pdf [November 2018].
Nielsen, M., Haun, D., Kärtner, J., and Legare, C.H. (2017). The persistent sampling bias in developmental psychology: A call to action. Journal of Experimental Child Psychology, 162, 31–38. doi:10.1016/j.jecp.2017.04.017.
Office of the Director of National Intelligence. (2013). U.S. National Intelligence: An Overview. Available: https://www.dni.gov/files/documents/USNI%202013%20Overview_web.pdf [November 2018].
Office of the Director of National Intelligence. (2019). What Is Intelligence? Available: https://www.dni.gov/index.php/what-we-do/what-is-intelligence [May 2019].
Pentland, A. (2012). The new science of building great teams. Harvard Business Review, April. Available: https://hbr.org/2012/04/the-new-science-of-building-great-teams [November 2018].
Rosenbach, E., and Peritz, A. (2009). Confrontation or Collaboration? Congress and the Intelligence Community. Cambridge, MA: Harvard University, Belfer Center for Science and International Affairs. Available: https://www.belfercenter.org/sites/default/files/legacy/files/energy-security.pdf [November 2018].
Rosenwasser, J., and Warner, M. (2017). History of the interagency process for foreign relations in the United States: Murphy’s Law? In R. George and H. Rishikof (Eds.), The National Security Enterprise: Navigating the Labyrinth (2nd ed.) (pp. 13–31). Washington, DC: Georgetown University Press.
Scott, R.A., Buchmann, M.C., and Kosslyn, S.M. (2015). Emerging Trends in the Social and Behavioral Sciences. Hoboken, NJ: John Wiley & Sons. doi:10.1002/9781118900772.
Tainter, J.A., Taylor, T.G., Brain, R.G., and Lobo, J. (2015). Sustainability. Available: https://onlinelibrary.wiley.com/doi/pdf/10.1002/9781118900772.etrds0326 [November 2018].
U.S. Bureau of Labor Statistics. (2018). Occupational Employment and Wages, May 2017: 193099 Social Scientists and Related Workers, All Other. Available: https://www.bls.gov/oes/current/oes193099.htm [November 2018].
U.S. Department of Homeland Security. (2018). Office of Intelligence and Analysis Mission. Available: https://www.dhs.gov/office-intelligence-and-analysis-mission [November 2018].