An important part of this committee’s charge was to propose ways to “facilitate productive interchange between the security community and the external social science research community” (refer to Box 1-1 in Chapter 1). In this chapter, we offer reflections on how these ties can be strengthened. We begin with what we learned about the relationship between the two communities from the process of conducting this decadal survey, and then explore past collaborations between them, drawing lessons from both successful and less successful experiences and offering our conclusions about ways to strengthen these ties.
The committee was asked to reflect on the process of conducting this decadal survey and identify insights and practices that could be useful for any future such studies in social and behavioral sciences (SBS) fields. As noted in Chapter 1, this study marked the first time the National Academies of Sciences, Engineering, and Medicine’s decadal process was applied to SBS fields. The decadal process was chosen because the Office of the Director of National Intelligence (ODNI) recognized the need for a comprehensive look across this research landscape: the process offered a way to address the urgency of integrating SBS research into intelligence analysis while also opening the door to a wide array of ideas. Our observations about this process may bear on planning for any future decadal studies in this universe but may also help ODNI to continue capitalizing on SBS research in a systematic and continuous way, apart from any future decadal surveys.
The committee’s charge did not identify a precise objective to be met within 10 years, akin to the development of a space telescope. Rather, it directed us to look for opportunities that would “contribute to the IC’s analytic responsibilities,” a task that by its nature is ongoing, not one that might be complete at the end of 10 years. The breadth of this charge, which required us to look across a very wide research landscape, initially appeared to be the greatest challenge, but it turned out to have significant advantages.
It was clear from the start that the processes used in prior decadal surveys would be valuable but not easy to apply to this committee’s work.1 In particular, we recognized that, while surveying the research community for ideas was a key element of the process for this study, there was no practical way to survey such a broad community systematically. Indeed, although there is in a sense a community of SBS researchers—in that researchers in these fields share many common interests—SBS is by no means a single discipline. No institution or entity links all members of this set of disciplines; rather, the various SBS disciplines form an abstract community that encompasses a wide range of theories and methods. Further, the breadth of our charge and the importance of representing such a diverse array of work had a cost in terms of the level of depth with which we could explore particular research areas. The product of our deliberations, then, certainly is not exhaustive, and it would be impossible to forecast precisely which areas of research across all SBS fields will make the most important contributions to the IC in the coming decade.
Despite these constraints, however, there were distinct benefits to casting a wide net in seeking intersections between the needs of the IC and the available SBS research. One benefit was that, because a multidisciplinary approach was necessary, our work was not driven by the perspectives of a single discipline, and we had no preconceptions about where to look for relevant work. The process we developed to pursue understanding of the needs of the IC and merge that understanding with input on potentially relevant research exposed us to new ideas while also supporting some of our hypotheses.
1 For example, prior decadal committees relied on the work of subpanels, designated from the start of the project to gather information in particular relevant disciplines. This approach was not practicable for our decadal study because we could not assume that a particular set of individual disciplines would have the most important contributions to make in fulfilling our charge. See Chapter 1 and Appendix B for a description of the process used for this study.
Although the committee’s charge was to examine SBS research, exploiting most of the opportunities identified in this report will depend on integrating SBS research with work in fields as disparate as neuroscience, engineering, and computer science. There are always challenges when researchers from different fields work together. Researchers from different domains may sometimes view their counterparts as naïve, but here we focus on some errors that can occur when technically based research is conducted without the benefit of deep SBS knowledge.
One potential problem is rediscovery, or the reinvention of wheels. For example, early roboticists interested in the coordination of groups of agents “discovered” a finding already well established by organizational science—that teams adapt more rapidly than do hierarchies (Bersin, 2017). Of greater concern are cases in which researchers unaware of well-established findings make significant errors. One example involves claims about networks, based on mathematical analysis (Barabási and Bonabeau, 2003), that contradicted earlier work in sociology and demography (Heathcote et al., 2000; Jones and Handcock, 2003) and were later disproved (Broido and Clauset, 2018). Such errors can be serious. For instance, when artificial intelligence (AI) techniques that are not grounded in computational models driven by SBS theory are used in ways that have concrete impacts for individuals, the result can be unintended discrimination or other harms (Hauch et al., 2015; Siegel, 2018).
SBS researchers and engineers, computer scientists, and physicists take different approaches even when addressing the same phenomena (Borgatti et al., 2009), and the benefit of integrating SBS methods and findings has been noted in such realms as health care delivery (Burger, 2017) and the marketing of technology (Brookey, 2007; Grindley, 1990). More generally, technological developments occur in a social and economic context, and SBS research is essential to understanding technology’s potential applications and benefits, risks, and long-term effects (Smith and Stirling, 2007). The industrial revolution was not driven by improvements in manufacturing and engineering alone; SBS research was needed to support the development of applications of these advances (Porter, 1986). As discussed throughout this report, SBS perspectives are essential to the development of sound research and applications involving sophisticated technology, and direct collaboration across SBS and technological fields is necessary for that to happen.
Adapting the decadal process to a new context also required that the committee simultaneously conduct this study and continuously take stock of the effectiveness of the study process. Doing so allowed us to see firsthand some of the obstacles to integration and collaboration between the two communities on which we were focusing. One such obstacle is that even within the IC, and in the context of the numerous mechanisms it already has in place for drawing specifically on SBS research, there appears to be less coordination between the two than would be optimal. Existing IC entities were developed to pursue particular missions, not necessarily with the goal of advancing the integration of SBS knowledge. This reality, along with the need to keep some projects or information classified, is likely to work against coordination of these communities, to say nothing of funding and political considerations that were beyond our purview.
Another obstacle to integration is that awareness of potential applications of research to IC needs is highly uneven across relevant SBS fields. It is notable, for example, that this report contains little discussion of developments in such fields as political science and international relations. These fields make extremely important contributions to national security, and have done so for decades. Yet there was little need to address them here because for the most part, scholars in these fields are highly attuned to security issues, and the IC is highly attuned to their findings. Methodological and technological breakthroughs of which the IC may not be fully aware seemed far more likely in other areas of the SBS terrain.
Applying the decadal survey process to the IC context had another significant benefit. The committee cast the widest possible net in seeking white papers and other input from the SBS community (refer to Chapter 1 and Appendix B). The results, while valuable and intriguing, also clearly demonstrate that there is a long way to go in building awareness within the SBS research community of the potential application of its work to national security. At the same time, our iterative process revealed elements that would likely have emerged even if a different set of committee members had embarked on this study and devised a different method for applying the decadal survey process in the IC context. Without a doubt, for example, any attempt to fulfill our charge would highlight the importance of learning more about human–machine interactions. Likewise, emerging research in data science has many potential applications to national security work that would surely have been included in some form in any report such as this. It is similarly difficult to imagine that the critical importance of integrating insights about human behavior and group functioning into the pursuit of cybersecurity would not have been recognized.
Finally, an issue that was a key challenge for this committee may be relevant to any future efforts to cull information from this broad research landscape. We struggled to find the best way to take stock of the diverse knowledge and expertise brought by the 16 committee members, and to find an optimal way to take advantage of our own knowledge base while also extending our reach widely in areas not well known to any of us. Our own knowledge base was a valuable foundation, but it was also limited, as were our resources for supplementing it. The rapid project schedule required us to quickly assess promising areas in order to make decisions about how to use our six workshops and other information-gathering strategies; there was no established process on which we could rely for this purpose. This procedural challenge mirrors a challenge faced by the IC: to systematically utilize an ever-expanding base of foundational SBS research while also identifying new work in unexpected areas that may prove equally valuable.
As noted in the overview of the SBS community and the IC in Chapter 2, the objectives and perspectives of these two communities are not always aligned, but the two have always had much to learn from one another. The relevance of SBS research to national security challenges has been apparent to both communities at least since researchers first worked with the U.S. military during World War I.2 The first division of the National Academy of Sciences devoted to SBS research, the Division of Anthropology and Psychology, formed committees to explore military issues as early as 1919. Collaborations between the security and SBS communities began to play a critical and sustained role in military operations, and to expand to intelligence issues beyond military concerns, once the United States became involved in World War II. Since then, research partnerships between the two communities have generated important scientific insights and provided valuable support for intelligence and security activities, although the relationship has not always been smooth.
2 In 1916, the leadership of the National Academy of Sciences formally “place[d] itself at the disposal of the Government for any services within its scope.” This involvement led directly to the establishment of the National Research Council, the operating arm of the Academy, which would “advise the nation on matters of science and engineering.” For more information about this history, see https://sites.nationalacademies.org/PGA/PGA_180900 [November 2018] and http://www.nasonline.org/about-nas/history/archives/milestones-in-NAS-history/organization-of-the-nrc.htm [November 2018]. The institution formally changed its name to the National Academies of Sciences, Engineering, and Medicine in 2015.
Successful collaborations between SBS researchers and the IC have run the gamut from fundamental research into human–computer teaming and human cognition to applied work that facilitates cross-cultural and wartime operations. During World War II, the Office of Strategic Services—the nation’s first foreign intelligence agency and the precursor to the Central Intelligence Agency (CIA)—hired political scientists, psychologists, anthropologists, sociologists, and economists to support such functions as analyzing foreign intelligence, assessing enemy and allied morale, screening and training intelligence operatives, calculating the enemy’s military capacity, and identifying optimal bombing routes and payloads (O’Rand, 1992). Similarly, anthropologists and psychologists working at the Office of War Information (OWI) provided insights into the cultures and values of foreign populations relevant to the war effort. Anthropologist Ruth Benedict’s groundbreaking and best-selling study of Japanese culture, The Chrysanthemum and the Sword, which began as an OWI report, influenced the values and design of the United Nations when it was established at war’s end (Mandler, 2013).
Wartime projects benefited SBS scholarship even as they served national security. For example, political scientist Harold Lasswell’s studies of international political opinion and morale for the Wartime Communications Research Project were widely recognized as proving the value of large-scale, quantitative methods for studying communication (Backhouse and Fontaine, 2010; Rohde, 2013). Likewise, during and after World War II, the Office of Naval Research helped support the development of field theory in social psychology through its financial support for the Massachusetts Institute of Technology (MIT) because it hoped to learn about group dynamics and the ways people identify with each other as members of a particular group (O’Rand, 1992, p. 190).
Federal investments in SBS research expanded in scale and scope in the second half of the 20th century. These investments demonstrate that the intelligence and security communities have consistently supported a spectrum of SBS research, from fundamental scientific investigations to scholarship with direct applications to intelligence and related security activities. In the applied domains, for example, research at the RAND Corporation facilitated new understandings of human decision making derived from game theory, social psychology, and systems analysis. These insights fostered and supported nuclear deterrence strategies that helped keep the Cold War cold in the United States and Europe.
Such efforts have parallels today. For example, SBS researchers work in partnership with intelligence and military agencies to enhance cultural and linguistic knowledge. Drawing on basic research in communications, cultural anthropology, political science, social psychology, and sociology, for instance, the Marine Corps Center for Advanced Operational Culture and Language provides the security community with concepts and skills that facilitate cross-cultural understanding (National Academies of Sciences, Engineering, and Medicine, 2017, p. 23).
Recognizing a need for more sustained investment in SBS research relevant to counterterrorism and counterinsurgency in the 21st century, the secretary of defense created the Minerva Research Initiative in 2008. This initiative bridges basic and applied SBS research by supporting unclassified social science research that improves “basic understanding of the social, cultural, behavioral, and political forces that shape” strategically important regions of the globe.3 Funded projects include studies of social, cultural, economic, and psychological factors that affect radicalization; the role of cybermedia in state stability; and the impacts of environmental, economic, social, and political factors on conflict and instability among both state and nonstate actors.
The Defense Advanced Research Projects Agency (DARPA) began funding research on machine-aided cognition and human–computer communication in the early 1960s. The first director of DARPA’s behavioral sciences and computer science division, psychologist and computer scientist J.C.R. Licklider, articulated a vision of human–computer symbiosis in 1960 that is still relevant today. He wrote, “in not too many years, human brains and computing machines will be coupled together very tightly . . . the resulting partnership will think as no human brain has ever thought” (Licklider, 1960, p. 4; Norberg et al., 1996). DARPA has funded research that has produced major advances. For example, its funds contributed to the creation of PLATO (Programmed Logic for Automatic Teaching Operations), which, first released in 1972, harnessed research in psychology to revolutionize computer-based education (Dear, 2017). More recently, IC investments in social network analysis have generated methodological breakthroughs.
Together with the Intelligence Advanced Research Projects Activity (IARPA)—created by ODNI in 2006—DARPA continues to support research that combines computational tools with SBS knowledge to develop social forecasting techniques and other approaches to improved human and organizational decision making. This research has been foundational to the development of such systems as the Worldwide Integrated Crisis Early Warning System, which uses natural language processing, modeling, and other methods to track international events and forecast political instability (NASEM, 2017, p. 22).
The research portfolios of DARPA and IARPA also include support for research in the decision sciences, cognitive science, and other SBS areas with the potential to enhance basic and applied knowledge that can contribute to both national security and scientific knowledge (Defense Advanced Research Projects Agency, 2018; Office of the Director of National Intelligence, 2018). In addition to providing new computational tools for monitoring and forecasting of global events, these research portfolios advance SBS knowledge about human judgment and decision making in high-stakes and rapidly changing environments.
One recent example of such contributions, discussed in Chapter 7, is the Good Judgment Project (Office of the Director of National Intelligence, 2015). Developed by scholars at the University of Pennsylvania and the University of California, Berkeley, as part of IARPA’s forecasting tournaments (which ran annually from 2011 to 2015), the project generated results valued by both IARPA and SBS researchers and has been adopted by the National Intelligence Council. The project demonstrated the validity of forecasting tournaments as predictive tools and provided IARPA with insights into best practices for designing and running such tournaments. It also produced results directly relevant to the broader SBS research community, including the identification of mechanisms for quantifying good judgment and methods for facilitating it, such as cognitive debiasing, providing incentives for accuracy, and designing predictive questions that facilitate accuracy. Notably, both researchers and project managers have been careful not to oversell their findings (Rohde, 2017, pp. 792–813). Previous predictive projects had claimed more accuracy and foresight than they could deliver. By contrast, researchers involved in the Good Judgment Project have stressed that their work shows prediction is currently most accurate for approximately 1 year into the future, rather than longer timespans (Chen, 2015; Tetlock et al., 2014, pp. 290–295).
The above research investments parallel research in other high-stakes professional domains, such as medicine and finance, in yielding promising insights that can improve judgment and performance. SBS research has shown that human cognitive fallibilities, such as hindsight bias (the unsubstantiated belief that one could have predicted an event) and outcome bias (the tendency to judge decisions by how they turned out rather than by how thoughtfully they were made), hinder learning in workforces. These findings have led to the development of new methods for reducing biases in thinking in high-stakes situations by better identifying sources of failure in judgment and decision making (National Research Council, 2012, pp. 15–16).
The examples of productive collaboration discussed in this report demonstrate that the partnership between the IC and the SBS community has often been successful when research, whether basic or applied, has advanced knowledge of mutual interest to both communities. The relationship between SBS researchers and the national security community has not always operated as smoothly as in these examples, however. We examine here a few cases in which problems arose so as to draw lessons for future collaborations.
Careful attention to transparency in funding relationships on both sides has helped restore the public reputation of joint efforts, as well as protect the integrity of the products of joint research.
During the early years of the Cold War, the need for research-based information was great, but security concerns were also heightened. On some occasions, normal protocols for transparency and accountability were not observed, and financial and other ties between SBS researchers and the IC were hidden from the public (Rohde, 2013). This lack of transparency caused problems within both communities, as illustrated by the experience of MIT’s prestigious Center for International Studies. The center, which had the mission of applying “social science to problems bearing on the peace and development of the world community” (Gilman, 2004, p. 159), was established in 1951 in large part with secret CIA funds. For the next 15 years, the center’s researchers produced some of the most influential work in modernization theory, international communications, and development studies without disclosing their relationship to the IC. Despite being questioned, they denied any ties to the CIA until investigative journalists produced unequivocal proof in 1966, at which time MIT terminated its relationship with the agency (Rohde, 2013). The American Political Science Association’s integrity was similarly compromised when social scientists learned that the association’s longtime executive director disguised ideological statements as scholarship and founded a private research corporation that secretly took CIA money (Oren, 2002).
While less common, programs that used or appeared to use SBS as a cover for IC programs also contributed to public suspicion of SBS–IC collaboration. From 1955 to 1962, for example, a Michigan State University program that allegedly provided public administration training in South Vietnam served as cover for a CIA-funded counterespionage training program that was implicated in accusations of torture and assassination in South Vietnam.
SBS researchers and the IC have taken these experiences to heart. It has been decades since collaborations between the two communities have been marred by evidence of covert funding.
Explicit attention to balancing scientific norms and procedures with the need to protect security in the design and execution of research benefits both the researchers and security agencies involved.
The agencies that fund and use SBS research face the challenge of balancing their own needs with the desire of academic researchers to advance intellectual developments in their fields. As noted in a recent report by the Institute for Defense Analyses, when the goals of a research program are ambiguous, or when the uses to which funding agencies will put the research are unclear, particularly in relation to the program’s security or intelligence mission, the program is more likely “to sow seeds of mistrust” between the two communities (Koonin et al., 2013, p. 12). While the Good Judgment Project, discussed above, provides an example of successful balancing of agency and SBS needs and interests, examples of failure provide instructive lessons for future collaboration.
One such example is Project Camelot, an ambitious, unclassified, multidisciplinary study of political revolution in Latin America during the Cold War. The project, funded by the U.S. Army and DARPA, was designed to identify the causes of communist insurgency to facilitate prediction of the onset of revolution. The project attracted international criticism in 1965 when press accounts in several Latin American countries where the work was being carried out revealed that researchers affiliated with the work had misrepresented it to potential research partners in those countries as a foundation-supported effort. Journalists in several of the countries involved argued that these events proved that the U.S. government used SBS research as a cover for intelligence gathering and espionage (Rohde, 2013).
The National Research Council concluded that Project Camelot demonstrated “that military sponsorship of social science research pertaining to the internal politics of other nations may have adverse repercussions on American foreign policy” (National Research Council, 1971, p. 7). The accusations directed at the project triggered congressional hearings into the military’s research programs, prompting the Army to cancel the project abruptly so as to avoid further scrutiny and embarrassment (Rohde, 2013; Zehfuss, 2012).
Project Camelot also was emblematic of the failure of SBS researchers and their sponsors to balance scientific and national security goals. Critics in the SBS community argued that the project’s core problem was that its researchers converted SBS research to militarized language and goals, thereby subverting the natural direction of research and introducing politics into science. These critics charged that researchers and their sponsors had brought “the whole of social sciences under the heading of counter-insurgency” by framing the research questions in military terms—as studies of counterinsurgency, guerrilla warfare, and the like (Rohde, 2013, p. 84). As the National Academy of Sciences concluded at the time, government agencies and researchers could improve their relationship by assuming more responsibility for stating needs in terms that are meaningful to the investigator rather than the military (National Research Council, 1971).
The funders of Project Camelot had reasons to be critical of their SBS partners as well. For example, they had to intervene during the study’s planning phase when social scientists proposed including studies of the French Canadian separatist movement in their investigation (Rohde, 2013, pp. 70–71). Experts on counterinsurgency research at DARPA and members of the Defense Science Board pointed out that the project would likely have failed on its own merits. Months before it attracted public attention, government officials worried that the researchers directing the project had not produced a clear research plan, but instead listed only “generalities . . . about research hypotheses” and “vague and formless” descriptions of the project’s methodology (Deitchman, 1976, p. 146).
SBS research projects during the Vietnam War era also include cases in which government funders were let down by SBS researchers who failed to provide rigorous and relevant expertise. In 1967, for example, DARPA hired the research group Simulmatics Corporation—which was led by an MIT political scientist and staffed by reputable SBS scholars—to study social relationships in South Vietnamese hamlets and determine what motivated support for the North and South Vietnamese causes. Government experts found that the studies seemed as if “someone had taken a book of rules about scientific methodology, then systematically violated each one” (Weinberger, 2017, p. 179). A study of communications and propaganda, for example, yielded results that DARPA officials found were riddled with contaminated variables and violations of basic rules of inference.
Camelot and Simulmatics were military-sponsored, not IC, projects. Nevertheless, their histories point to the fact that the partnership between the SBS community and the IC may be weakened when research priorities, methods, and administration fail to meet the needs, standards, and values of both collaborators. Achieving that balance can be difficult even when research is carried out with scholarly integrity and careful management.
Recent examples indicate that members of both communities have become more successful in balancing their mutual needs, standards, and values. For example, SBS researchers criticized the U.S. Department of Defense (DoD)–funded Minerva Research Initiative in its first year for defining research areas that leaned too heavily toward national security concerns and thus failed to appeal to many relevant researchers (Gearty, 2008). They also criticized the program’s grant review process, which initially was conducted internally within DoD rather than through peer review. In subsequent years, Minerva program managers responded to these concerns and incorporated substantial scholarly input into the processes for setting research priorities and reviewing grants (Krebs, 2008).
Relationships between research universities and the IC can be complicated in other ways as well. A 2017 book documents cases in which members of the IC enrolled in university programs in an undercover capacity, so that at least their fellow students were unaware of their official role, as well as cases in which students who were foreign nationals took advantage of their access to research for the benefit of their governments (Golden, 2017).
It is important to recognize that not all SBS researchers view partnerships between their community and the IC in the same way. Researchers may have differing expectations for fundamental and applied research conducted in this context, or for classified as opposed to unclassified research. Furthermore, researchers’ interpretation of the appropriate balance between scholarship and application can differ across and within scholarly disciplines. While many scholars see DARPA and Minerva programs as well balanced, some criticize such efforts for implicitly favoring narrow and short-term American interests and unwittingly supporting “non-democratic actions or governments” (Koonin et al., 2013, p. 11). Scholars in the fields of anthropology, political science, and international relations have been particularly critical of security-funded research, while scholars in other fields have embraced collaboration more fully (Zehfuss, 2012).
The SBS–IC relationship can become strained when clarity or consensus with respect to values and the ethics of research projects and programs is lacking. Respecting ethical norms for research will require that members of the SBS community and the IC engage in ongoing dialogue concerning research ethics in new research domains.
Research endeavors in which both academics and members of the IC take part or have a stake can highlight differences between the two cultures (see Chapter 2) and raise sometimes challenging questions. The development of complex and sophisticated technologies adds another layer of complexity to many questions about research protocols and ethics. We look briefly here at three key issues: ethics and values in a research context, emerging ethical standards in a world of big data, and the reproducibility of research findings.
Ethics and Values in a Research Context
Neither SBS research nor intelligence analysis is a value-free enterprise. Like all researchers, those in SBS fields must be aware of and articulate the influence of values in their work. The values of scientific objectivity and rigor are paramount for most researchers, but values come into play as well in the selection and definition of research questions to pursue. Scholars investigating the effects of poverty on children, for example, recognize
that they regard promoting children’s welfare as an undisputed good, just as medical researchers regard protecting or restoring health as a noncontroversial objective.
More complex and subtle values and assumptions may arise in research that is relevant to national security issues. For example, researchers working on national security issues agree that protecting democracy is a positive good, but they may disagree about whether certain research programs or policies embody those values. Social scientists who assisted stability operations of the U.S. military in Iraq and Afghanistan, for instance, argued that their research made Americans, Iraqis, and Afghans safer, but other social scientists argued that the research only facilitated military operations and did not serve science or democracy (Zehfuss, 2012). While researchers will likely continue to disagree about national security policy, clear and open communication about objectives and the relationship between research and application may help build and maintain trust between SBS researchers and the IC even in the face of policy disagreement.
If history is a good indicator, scrutiny of SBS research and its relationship to government is more likely to arise in contexts of heightened political and other sensitivities. During some periods—the 1940s and 1950s, for example—there has been broad public consensus about American strategic and security interests. At such times, collaborations between the SBS community and the IC have attracted very little attention or concern. During the 1960s, however, especially as dissent over U.S. policies in Vietnam grew, IC–SBS collaboration faced greater public scrutiny. During congressional hearings in 1968, for example, Senator J. William Fulbright linked U.S. failures in the Vietnam War to the security community’s SBS research investments. As a result, congressional appropriations for security community–funded SBS research dropped from $40 million in 1967 to $13.7 million in 1969 (Rohde, 2013).
Similar challenges are apparent today. Revelations about controversial intelligence practices in the first decade of the 21st century—from extraordinary rendition;4 to the National Security Agency’s bulk data collection programs; to harsh interrogation methods, including methods based in psychological research (Voosen, 2015)—also have fostered concerns about intelligence and security practices among the public. While these practices do not pertain directly to research relevant to the capabilities of intelligence analysts, they may seed mistrust in or heightened scrutiny of the SBS–security community relationship (Goolsby, 2005; National Research
4 Extraordinary rendition is a policy first used by the United States in the early 1990s in which foreign nationals who are suspected of involvement in terrorist-related activities are detained on foreign soil for interrogation in U.S. facilities or those of another country; see https://www.aclu.org/other/fact-sheet-extraordinary-rendition [January 2019].
Council and National Academy of Engineering, 2014; Zehfuss, 2012). This report appears at a time when questions related to terrorism, immigration, cybersecurity, and many other issues keep those concerns very current. Practices that build trust, including transparency in funding and clarity about research methods, goals, and applications, may help mitigate such concerns and strengthen ties. Examination of the relationship between academics and the IC has prompted several social science disciplines to embrace the recommendation that unless a national emergency presents an overriding need, the following activities should be avoided (Johnson, 2019, p. 17):
- agency covert recruitment on campus;
- covert research relationships;
- the use of academic cover by intelligence officers;
- the tasking of faculty and students to collect intelligence; and
- the tapping of academicians for counterintelligence or covert action operations.
It is also important to recognize that SBS research is valuable to the IC, as well as to other sectors such as education, finance, and medicine, because of its capacity to expand analytic understanding of human emotions, motivations, and actions. But because that understanding can facilitate the shaping, and perhaps even the manipulation, of emotional responses and perceptions, government-funded SBS research may generate public concern.
One example of such concern arises from the dramatic development of new data sources, such as social media mining, and of new computational social science techniques that can be used to analyze and possibly shape population sentiments. In 2018, controversy was sparked by revelations that the private research firm Cambridge Analytica had improperly obtained data on Facebook users and deployed nontransparent, proprietary behavioral technologies to design information campaigns intended to influence human attitudes and actions. Whether Cambridge Analytica’s psychographic techniques are scientifically sound or effective is an open question (Gibney, 2018). But scholars, journalists, and public officials have expressed grave concern that the firm’s services were or could be purchased by governmental and nongovernmental clients (Cadwalladr and Graham-Harrison, 2018; Lewis et al., 2018; Shaw, 2018; Wildermuth, 2018). This episode demonstrates the sensitivities that can accompany the deployment of behavioral technologies. These concerns are likely to remain salient with the growth of computational social science and big data research, including some of the research areas endorsed in this report, because of their valuable applications for protecting security.
Changing norms with regard to research ethics also require attention from SBS researchers and their partners in the IC. The U.S. military’s aboveground nuclear exercises, carried out in Nevada from 1951 to 1957 and code-named Desert Rock, illustrate the way changing norms have complicated the relationship. As part of this effort, psychologists seeking to understand soldiers’ ability to function in the tactical atomic environment and gauge the risk of panic on the atomic battlefield studied 600 soldiers to assess the psychological impact of witnessing an atomic explosion. While the soldiers were informed of the risks of radiation exposure and reassured that the exercises were systematic, the research design and scientific utility of the tests were the subject of internal disagreement among military officials. Military officials also disagreed about an emerging ethical question: whether the test subjects should be considered volunteers to whom risks were disclosed or soldiers involved in training (Advisory Committee on Human Radiation Experiments, 1995).
At the time of this research, the 1950s, attitudes in the country with respect to large institutions, including both the military and scientific establishments, were in a state of transition. There was little public comment on this research at the time. Public controversy about the physical effects of the blasts on the “atomic soldiers” emerged only decades later as norms for human subjects protections changed.
A more contemporary example is DARPA’s Brain Machine Interface program, which has generated debate among neuroscientists (Moreno, 2012). Some pointed to the program’s potential medical benefits, including treatments for degenerative neurological diseases. Others argued that DARPA-funded research would support the development of human enhancements, such as “brain–machine interfaces,” intended to boost performance on the battlefield, an application many regard as unethical (Hoag, 2003; National Research Council, 2008, 2009; Rudolph, 2003; Silence of the Neuroengineers, 2003). Another example is so-called enhanced interrogation: in the wake of the September 11, 2001, attacks, psychologists helped the CIA establish and carry out a program of interrogation of suspected terrorists that included methods regarded by many as torture. The program led to a highly critical investigation by the Senate Select Committee on Intelligence (U.S. Senate Select Committee on Intelligence, 2014; see also Pfiffner, 2010), as well as to policy statements by the American Psychological Association. One of those methods, waterboarding, had been used by Japanese soldiers interrogating captured U.S. soldiers during World War II, for which they were later imprisoned (Johnson, 2018).
Such examples demonstrate how important it is that research sponsors consider researchers’ positions and evolving perspectives so as to avoid proposing and supporting projects that cross ethical lines. Clear statements of
mission intent also help sponsors and researchers avoid misunderstandings about the implications or goals of research programs (Moreno, 2012). SBS researchers are concerned about ensuring that their research does not cause unnecessary harm, but the relationship between research and potential harm is often unclear—and itself is the subject of unresolved debate. These episodes are instructive because this report is emerging at a time when the ethics of research using big data are also under scrutiny and in flux.
Emerging Ethical Standards in a World of Big Data
Advances in computational social science are offering exciting new possibilities for IC-related SBS research, including enhanced analysis of open-source intelligence such as social media data, data collected from sensors, and other digital information produced by routine human actions and behaviors (Harman, 2015). This information is often granular, is durable, and can be shared across institutional and national borders at high speeds. Research conducted using such data has the potential to cause harm to the individuals whose information or attributes are collected, including the loss of privacy and of individuals’ autonomous control over their personal information.
While SBS researchers have long been accustomed to addressing research ethics via the Common Rule (a federal policy statement regarding the protection of human subjects involved in research), computational social science research transcends traditional human subjects protections and raises a number of new ethical questions. For example: Are data subjects the same as human subjects, or are they different? What reasonable expectations of privacy do people have with respect to their digital traces, and how do those expectations change in different digital venues? Is informed consent possible, realistic, or required in big data research? Researchers and ethicists stress that these questions do not have straightforward answers (Buchanan, 2017; Zook et al., 2017). Furthermore, because digital collection tools are proliferating, because digital methods are changing rapidly, and because machine learning tools create decision rules that may not be transparent or intuitive, digital research may generate new and unexpected ethical questions.
These issues are discussed in detail in Appendix D, but we note here that they are especially salient when data are collected or analyzed in a national security context. Internet users typically do not know what traces of their everyday lives are monitored; what happens to their information; or what, if anything, happens because of it. The potential for surveillance by the security community afforded by digital data compounds the power
imbalances already present in digital spaces. As discussed above, failure to address ethical concerns can have a chilling effect on research; the public may lose its trust in the SBS research community, as well as in the government agencies that sponsor such research, including the security community. In the digital context, widespread concern that the Internet is a space where actions are monitored, stored, and analyzed rather than a site of free information exchange also may cause people to censor their online behavior, with a chilling effect on Internet use itself (Brunton and Nissenbaum, 2015; Mayer-Schönberger, 2011; Penny, 2016). Researchers and the IC will continue to grapple with the need to balance privacy and autonomy on the one hand and security on the other (Walsh and Miller, 2016). These domains both overlap and conflict, and navigating this terrain will require careful attention to evolving ethical norms and values.
Reproducibility of Research Findings
Questions about the reproducibility of research results, not only in SBS fields but also across the sciences, have implications for intelligence analysis. The reproducibility of results and testing for generality are key ways in which researchers confirm that their findings are valid; these steps are regarded as keystones of the scientific method. Studies completed over the past decade, however, have pointed to the widespread difficulty of replicating results in numerous fields, such as medicine, big data and computation, and biometrics and behavior metrics. In science, as in national security, any index or finding may fail to replicate precisely because of intrinsic limits in measurement; in some cases, more valid conclusions may be derived from measures other than those originally used. The research community is focused on identifying best practices for enhancing both reproducibility and validity (see Appendix C for a more detailed discussion).
A consensus committee of the National Academies has examined this issue,5 but it is already clear that understanding and quantifying the reliability of information, and determining how best to process, display, and integrate or aggregate information from multiple sources, will shape emerging SBS research in the coming decade. The robustness and validity of new data, modeling, and theory development are also important in answering researchable questions related to national security, and it will therefore be important for the IC to consider these issues carefully in planning its own empirical efforts.
5 Information about the study can be found at http://sites.nationalacademies.org/dbasse/bbcss/reproducibility_and_replicability_in_science/index.htm [December 2018].
The committee saw ample evidence of the productive potential of collaboration between the IC and the SBS research community. We offer three conclusions regarding the elements needed for productive collaboration.
CONCLUSION 9-1: Explicit attention to the respective intellectual goals, values, and perspectives of members of the Intelligence Community (IC) and academic researchers is a prerequisite for productive collaboration. Collaborations between the two have yielded important scientific and analytic insights, and have functioned well when funding sources and agency goals have been transparent, when social and behavioral sciences research questions and agency missions and goals have been harmonized and clear, and when ethical and value-based concerns have been treated with sufficient care. Conversely, the relationship has fractured in the past when funding sources have been kept secret or misrepresented, researchers and government agencies have struggled to balance research and agency needs, and research has touched on broader ethical or value-based disagreements.
CONCLUSION 9-2: Ethical issues may arise at all steps of the research process, from planning, to dissemination of findings, to the operationalization of digital tools in analytic contexts. Because standards with respect to some ethical issues—particularly those concerning the use of large-scale digital datasets—are developing, and because these issues are context-sensitive, ethical assessments require careful attention throughout the research process.
CONCLUSION 9-3: Meticulous clarity and openness about the approaches taken to ensure the reproducibility and validity of the evidence generated in the course of research conducted by or with the support of the Intelligence Community (IC) are critical to the utility of the research results. The IC can promote this standard by requiring researchers to identify project components that incorporate assessments of reproducibility, replication, and validity.
Advisory Committee on Human Radiation Experiments. (1995). Final Report of the Advisory Committee on Human Radiation Experiments (No. 061-000-00-848-9). Washington, DC: U.S. Government Printing Office. Available: https://www.osti.gov/opennet/servlets/purl/120931/120931.pdf [December 2018].
Backhouse, R., and Fontaine, P. (2010). The History of the Social Sciences since 1945. Cambridge, UK: Cambridge University Press.
Barabási, A.L., and Bonabeau, E. (2003). Scale-free networks. Scientific American, 288(5), 60–69.
Bersin, J. (2017). Robotics, AI and cognitive computing are changing organizations even faster than we thought. Forbes, March 9. Available: https://www.forbes.com/sites/joshbersin/2017/03/09/robotics-ai-and-cognitive-computing-are-changing-organizations-even-faster-than-we-thought/#2b7db67fa3f4 [December 2018].
Borgatti, S.P., Mehra, A., Brass, D.J., and Labianca, G. (2009). Network analysis in the social sciences. Science, 323(5916), 892–895.
Broido, A.D., and Clauset, A. (2018). Scale-Free Networks Are Rare. arXiv:1801.03400. Available: https://arxiv.org/abs/1801.03400v1 [December 2018].
Brookey, R.A. (2007). The format wars: Drawing the battle lines for the next DVD. Convergence: The International Journal of Research into New Media Technologies, 13(2), 199–211.
Brunton, F., and Nissenbaum, H. (2015). Obfuscation: A User’s Guide for Privacy and Protest. Cambridge, MA: MIT Press.
Buchanan, L. (2017). Brief Overview of the Revised Common Rule and Subpart B—Pregnant Women. Washington, DC: Office for the Human Research Protections. Available: https://www.nichd.nih.gov/sites/default/files/2017-11/4-OverviewNew_Rule_SubpartB.pdf [December 2018].
Burger, J. (2017). The Health of People: How the Social Sciences Can Improve Population Health. London, UK: SAGE Publications.
Cadwalladr, C., and Graham-Harrison, E. (2018). 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian, March 17. Available: https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election [December 2018].
Chen, A. (2015). Seeing into the future: Does Philip Tetlock hold the key to accurate predictions? The Chronicle Review, October 5. Available: https://www.chronicle.com/article/Philip-Tetlock-s-Tomorrows/233507 [December 2018].
Dear, B. (2017). The Friendly Orange Glow: The Untold Story of the Rise of Cyberculture. New York: Penguin Random House.
Defense Advanced Research Projects Agency. (2018). Our Research. Available: https://www.darpa.mil/our-research [December 2018].
Deitchman, S. (1976). The Best-Laid Schemes: A Tale of Social Research and Bureaucracy. Cambridge, MA: The MIT Press.
Gearty, C. (2008). Skewing scholarship. The Social Science Research Council, October 9. Available: http://essays.ssrc.org/minerva/2008/10/09/gearty [November 2018].
Gibney, E. (2018). The scant science behind Cambridge Analytica’s controversial marketing techniques. Nature News Explainer, March 29. Available: https://www.nature.com/articles/d41586-018-03880-4?utm_source=twt_nnc&utm_medium=social&utm_campaign=naturenews&sf185785067=1 [November 2018].
Gilman, N. (2004). Mandarins of the Future: Modernization Theory in Cold War America. Baltimore, MD: Johns Hopkins University Press.
Golden, D. (2017). Spy Schools: How the CIA, FBI, and Foreign Intelligence Secretly Exploit America’s Universities. New York: Henry Holt and Company.
Goolsby, R. (2005). Ethics and defense agency funding: Some considerations. Social Networks, 27(2), 95–106.
Grindley, P. (1990). Winning standards contests: Using product standards in business strategy. Business Strategy Review, 1(1), 71–84.
Harman, J. (2015). Disrupting the Intelligence Community: America’s spy agencies need an upgrade. Foreign Affairs, March/April. Available: https://www.foreignaffairs.com/articles/united-states/2015-03-01/disrupting-intelligence-community [November 2018].
Hauch, V., Blandón-Gitlin, I., Masip, J., and Sporer, S.L. (2015). Are computers effective lie detectors? A meta-analysis of linguistic cues to deception. Personality and Social Psychology Review, 19(4), 307–342.
Heathcote, A., Brown, S., and Mewhort, D.J.K. (2000). The power law repealed: The case for an exponential law of practice. Psychonomic Bulletin and Review, 7(2), 185–207.
Hoag, H. (2003). Remote control. Nature, 423(6942), 796–798. doi:10.1038/423796a.
Johnson, L.K. (2018). Spy Watching: Intelligence Accountability in the United States. New York: Oxford University Press.
Johnson, L.K. (2019). Spies and scholars in the United States: Winds of ambivalence in the groves of academe. Intelligence and National Security, 34(1), 1–21. doi:10.1080/02684527.2018.1517429.
Jones, J.H., and Handcock, M.S. (2003). Sexual contacts and epidemic thresholds. Nature, 423(6940), 605–606.
Koonin, S., Keller, S.A., Shipp, S.S., Allen, T.W., and Walejko, G.K. (2013). Pathways to Cooperation between the Intelligence Community and the Social and Behavioral Science Communities (IDA Paper P-5000). Alexandria, VA: Institute for Defense Analyses.
Krebs, R.R. (2008). Minerva: Unclipping the owl’s wings. The Social Science Research Council, November 19. Available: http://essays.ssrc.org/minerva/2008/11/19/krebs [November 2018].
Lewis, P., Grierson, J., and Weaver, M. (2018). Cambridge Analytica academic’s work upset university colleagues. The Guardian, March 24. Available: https://www.theguardian.com/education/2018/mar/24/cambridge-analytica-academics-work-upset-university-colleagues [December 2018].
Licklider, J.C.R. (1960). Man–computer symbiosis. IRE Transactions on Human Factors in Electronics, HFE-1(1), 4–11. doi:10.1109/THFE2.1960.4503259.
Mandler, P. (2013). Return from the Natives: How Margaret Mead Won the Second World War and Lost the Cold War. New Haven, CT: Yale University Press.
Mayer-Schönberger, V. (2011). Delete: The Virtue of Forgetting in the Digital Age. Princeton, NJ: Princeton University Press.
Minerva Research Initiative. (2018). Minerva Research Initiative. Available: https://minerva.defense.gov [November 2018].
Moreno, J. (2012). Mind Wars: Brain Science and the Military in the 21st Century. New York: Bellevue Literary Press.
National Academies of Sciences, Engineering, and Medicine. (2017). The Value of Social, Behavioral, and Economic Sciences to National Priorities: A Report for the National Science Foundation. Washington, DC: The National Academies Press.
National Research Council. (1971). Behavioral and Social Science Research in the Department of Defense: A Framework for Management. Washington, DC: The National Academies Press.
National Research Council. (2008). Emerging Cognitive Neuroscience and Related Technologies. Washington, DC: The National Academies Press.
National Research Council. (2009). Opportunities in Neuroscience for Future Army Applications. Washington, DC: The National Academies Press.
National Research Council. (2012). Using Science as Evidence in Public Policy. Washington, DC: The National Academies Press.
National Research Council and National Academy of Engineering. (2014). Emerging and Readily Available Technologies and National Security: A Framework for Addressing Ethical, Legal, and Societal Issues. Washington, DC: The National Academies Press. doi:10.17226/18512.
Norberg, A.L., O’Neill, J.E., and Freedman, K. (1996). Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986. Baltimore, MD: Johns Hopkins University Press.
O’Rand, A.M. (1992). Mathematizing social science in the 1950s: The early development and diffusion of game theory. History of Political Economy, 24, 177–204.
Office of the Director of National Intelligence. (2015). The Good Judgment Project. Available: https://www.iarpa.gov/index.php/newsroom/iarpa-in-the-news/2015/439-the-good-judgment-project [November 2018].
Office of the Director of National Intelligence. (2018). Getting Started with IARPA. Available: https://www.iarpa.gov/index.php [November 2018].
Oren, I. (2002). Our Enemies and Us: America’s Rivalries and the Making of Political Science. Ithaca, NY: Cornell University Press.
Penny, J.W. (2016). Chilling effects: Online surveillance and Wikipedia use. Berkeley Technology Law Journal, 31(1), 117. doi:10.15779/Z38SS13.
Pfiffner, J.P. (2010). Torture as Public Policy: Restoring U.S. Credibility on the World Stage. Boulder, CO: Paradigm.
Porter, T.M. (1986). The Rise of Statistical Thinking, 1820–1900. Princeton, NJ: Princeton University Press.
Rohde, J. (2013). Armed with Expertise: The Militarization of American Social Research During the Cold War. Ithaca, NY: Cornell University Press.
Rohde, J. (2017). Pax Technologica: Computers, international affairs, and human reason in the Cold War. ISIS, 108(4), 792–813.
Rudolph, A. (2003). Military: Brain machine could benefit millions. Nature, 424(6947), 369. doi:10.1038/424369b.
Shaw, T. (2018). The new military-industrial complex of big data psy-ops. The New York Review of Books, March 21. Available: https://www.nybooks.com/daily/2018/03/21/the-digital-military-industrial-complex [December 2018].
Siegel, E. (2018). Blatantly discriminatory machines: When algorithms explicitly penalize. Predictive Analytics World, September 25. Available: https://www.predictiveanalyticsworld.com/patimes/blatantly-discriminatory-machines-when-algorithms-explicitly-penalize/9697 [December 2018].
Silence of the Neuroengineers. (2003). Nature, 423, 787. doi:10.1038/423787b.
Smith, A., and Stirling, A. (2007). Moving outside or inside? Objectification and reflexivity in the governance of socio-technical systems. Journal of Environmental Policy and Planning, 9(3-4), 351–373.
Tetlock, P.E., Mellers, B.A., Rohrbaugh, N., and Chen, E. (2014). Forecasting tournaments: Tools for increasing transparency and improving the quality of debate. Current Directions in Psychological Science, 23(4), 290–295.
U.S. Senate, Select Committee on Intelligence. (2014). Committee Study of the Central Intelligence Agency’s Detention and Interrogation Program, 113th Cong., 2d Sess., December 3, 2014.
Voosen, P. (2015). Damning revelations prompt social science to rethink its ties to the military. Chronicle of Higher Education, July 15. Available: https://www.chronicle.com/article/Damning-Revelations-Prompt/231591 [July 2018].
Walsh, P.F., and Miller, S. (2016). Rethinking ‘five eyes’ security intelligence collection policies and practice post Snowden. Intelligence and National Security, 31(3), 345–368.
Weinberger, S. (2017). The Imagineers of War: The Untold Story of DARPA, the Pentagon Agency That Changed the World. New York: Knopf Doubleday.
Wildermuth, J. (2018). California elected officials dismayed at use of Facebook data. San Francisco Chronicle, March 20. Available: https://www.sfchronicle.com/politics/article/California-elected-officials-dismayed-at-use-of-12769078.php [December 2018].
Zehfuss, M. (2012). Culturally sensitive war? The human terrain system and the seduction of ethics. Security Dialogue, 43(2), 175–190.
Zook, M., Barocas, S., Boyd, D., Crawford, K., Keller, E., Gangadharan, S.P., Goodman, A., Hollander, R., Koenig, B.A., Metcalf, J., Narayanan, A., Nelson, A., and Pasquale, F. (2017). Ten simple rules for responsible big data research. PLoS Computational Biology, 13(3), e1005399. doi:10.1371/journal.pcbi.1005399.