
Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions (2017)

Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.

2

The Role of Social, Behavioral, and Decision Sciences in Security Science

Technical approaches alone will not suffice for cybersecurity insofar as humans play roles in systems as developers, users, operators, or adversaries. Major security breakdowns have occurred because individuals misplace trust, organizations create perverse incentives, or adversaries discover and exploit design flaws. Meeting security needs effectively requires understanding that human context. How does cybersecurity affect the real-world business or organization? Does it drain human or other resources, or does it reflect a balance between keeping the business or organization secure and keeping it economically viable? What investments does it deserve and receive? How does the perceived value of possible practices compare with their demonstrated efficacy? What evidence would help to make that assessment?

Social, behavioral, and decision sciences provide the reservoir of knowledge for addressing some of these questions and for making other research more useful for those responsible for vulnerable systems. Such expertise can also be vital, especially during design, in revealing any disconnects between intention and actual use and in articulating the variety of potential users and their contexts. Relevant fields include economics (e.g., incentives, resources), sociology (e.g., social networks, norms, criminology), psychology (e.g., motivation, perception, user interfaces), decision science (e.g., risk assessment, communication), linguistics (e.g., framing and conveying information), organizational psychology (e.g., multi-team systems and information sharing), political science (e.g., deterrence and international norms), and organizational behavior (e.g., recruitment, retention, reporting procedures). The law, although not a science, often synthesizes results from the sciences, with the broad integrative perspective of the humanities (see Box 2.1).

On the whole, the traditional cybersecurity community lacks expertise in the analysis tools and methods of the social sciences. As a result, it not only relies on intuition, even when relevant science is available, but also struggles to validate its approaches, falling short of scientific standards for experimental design, control of confounding variables, recognition of the classes of confounds that arise in different domains or environments, and so on. Collaborating with social scientists and understanding their standards for research and publication would bring new tools to bear and yield new insights. The primary psychological barrier to such collaborations is that system designers and operators have unwarranted confidence in their intuitive theories regarding others’ behavior (or neglect to consider their implications fully). Indeed, the human–computer interaction research community has a motto, “You are not the user,” to remind researchers and practitioners not to assume that other people share their perceptions and motivations.

The primary institutional barrier to utilizing social, behavioral, and decision science expertise is that these disciplines are largely absent from the cybersecurity research community. Indeed, the community often lacks even the absorptive capacity to identify these needs, recruit the expertise needed to address them, and critically evaluate claims about system security that can be compromised by human failures in design, deployment, training, or management. Without that expertise, the cybersecurity community must improvise its own versions of the theories and research methods that are central to those sciences: experimental design, identification of confounds, meta-analysis, sensitivity analysis, and so on. Conversely, because those sciences have not been drawn into addressing the unique challenges of cybersecurity, they have limited experience in applying (and explaining) their research to those domains and in identifying the unique foundational research questions that those challenges pose.

To create more effective collaborations, it is essential to find ways to foster interactions that address the needs of both of these disciplines and the cybersecurity community of researchers and practitioners. Part of what cross-disciplinary researchers can do is verify that use of a result in one context still holds true in another context; a result described in one environment may not be found in a different environment—and the environmental differences and their importance may not be obvious. Chapter 5 offers some programmatic suggestions for how such interactions can be encouraged. The cybersecurity community has long acknowledged and understood the gravity of issues such as insider threat and unrecognized design flaws (e.g., those inherited from legacy systems). Involving the social, behavioral, and decision sciences directly should be more effective than attempting to create cybersecurity versions of those disciplines from scratch.1

Approaches to security and technology need to be seen in the larger context of all that a user or organization must accomplish in the socio-technical domain in which it operates. For instance, how can users and organizations be enabled to perform in ways that maintain security, especially when security-related tasks and activities often compete with other tasks for time and effort? Social science can illuminate what effectiveness means in terms of enabling users to get to their needed activities and accomplishments without significant time and effort spent on compliance. More importantly, rather than focus only on security mechanisms, researchers can begin or extend this kind of research by getting to the heart of why users and organizations are expected to perform security-related tasks. For example, users need to be authenticated, so what are the most effective authentication mechanisms? And which mechanisms are most effective in which contexts? The answers to these questions depend in large part on how users and organizations perceive the need for authentication, the time and effort required to enact the authentication, and the trade-offs between authentication effort and accomplishment of the primary tasks (i.e., the actual tasks for which the users are being rewarded).

___________________

1 For more examples, see the July-August 2015 issue of IEEE Security and Privacy, which explored the topic of learning from other disciplines and the importance of a multidisciplinary perspective and described three case studies in which application of another discipline’s techniques led to important security and privacy insights (IEEE Security and Privacy, 13(4), July-August 2015). See also F. Stajano and P. Wilson, Understanding scam victims: Seven principles for systems security, Communications of the ACM 54(3), 2011.

Human behavior affects all stages of the life cycle of a cybersecurity system: design, implementation, evaluation, operation, maintenance, monitoring, revision, replacement, and training.2 Each stage offers opportunities to increase or reduce vulnerability: how design teams are constituted and managed, how procedures and interfaces are tested for usability,3 what incentives and resources are provided for security, how operators are trained and their performance evaluated, and how the business case is made and understood. Stakeholder conflicts are a well-known problem in software engineering, and there are methods for detecting and managing them that could be adapted and evaluated for conflicts involving utility, security, and usability.4

In the modern threat environment, cybersecurity researchers and practitioners also need to anticipate that hostile adversaries will adopt social science-based approaches of their own, and they need to be ready to respond. Box 2.2 describes the emerging Internet of Things as an example of the multifaceted and multidisciplinary nature of the cybersecurity challenge.

One example of a project that integrated social and organizational analysis with technical research analyzed the “spam value chain.”5 Researchers discovered that a significant majority of transactions for spam-advertised products were handled by a very small number of companies. As a result, they identified a nontechnical chokepoint in the system where a combination of policy and financial mechanisms (refusing to authorize credit card payments for those companies) significantly reduced the amount of spam on the Internet. Spam up to that point had been seen primarily as a technical identification and filtering problem.

___________________

2 Caputo et al. specifically studied software development and barriers to attending to security and usability needs (D.D. Caputo, S.L. Pfleeger, M.A. Sasse, P. Ammann, J. Offutt, and L. Deng, Barriers to usable security? Three organizational case studies, IEEE Security and Privacy 14(5): 22-32, 2016).

3 Ivan Flechais developed the AEGIS method and UML extension to do this (I. Flechais, C. Mascolo, and M.A. Sasse, Integrating security and usability into the requirements and design process, International Journal of Electronic Security and Digital Forensics 1(1):12-26, 2007, doi:10.1504/IJESDF.2007.013589).

4 In one of his last lectures at the Royal Society in 2002, Roger Needham raised the point that the security goals of the owner who pays for the system are the ones that get implemented, even if they run counter to the interests of the other stakeholders (R. Needham, Computer security? Philosophical Transactions of the Royal Society, Series A 361:1549-1555, 2003).

5 K. Levchenko, A. Pitsillidis, N. Chachra, B. Enright, M. Félegyházi, C. Grier, T. Halvorson, C. Kanich, C. Kreibich, H. Liu, D. McCoy, N. Weaver, V. Paxson, G.M. Voelker, and S. Savage, “Click Trajectories: End-to-End Analysis of the Spam Value Chain,” in Proceedings of IEEE Symposium on Security and Privacy, https://cseweb.ucsd.edu/~savage/papers/Oakland11.pdf, 2011. Stefan Savage, one of the researchers, reported on outcomes at a 2015 workshop, summarized in National Academies of Sciences, Engineering, and Medicine, Continuing Innovation in Information Technology: Workshop Report, The National Academies Press, Washington, D.C., 2016, p. 58:

Once alerted to the penalties of working with spammers, the banks quickly dropped these accounts, leaving spammers with no way to monetize their sales. While switching e-mails or domains is easy, switching banks is far more difficult for spammers, and this strategy has proved effective in shutting down certain types of spammers. As a result, there has been a substantial drop in sales of pirated software as a whole.

The rest of this chapter discusses how social, behavioral, and decision sciences can contribute to research in cybersecurity and to security science; offers a preliminary list of areas that present opportunities for collaboration; and examines in more detail two specific topics of particular importance to cybersecurity: (1) incentives and resources and (2) risk analysis. The chapter concludes with a discussion of opportunities to improve prospects for interdisciplinary work in cybersecurity.

CONTRIBUTIONS FROM SOCIAL, BEHAVIORAL, AND DECISION SCIENCES

Collaborating with social, behavioral, and decision scientists would put their substantive theories and methodological procedures at the service of the cybersecurity community. Achieving that integration will, however, require a sustained commitment. Without it, these scientists will not develop the working relationships needed to achieve and deserve the trust of cybersecurity researchers and practitioners. Nor will they have the incentive to broaden their traditional pursuits in order to master cybersecurity topics and translate them into terms recognizable by their colleagues. Unless researchers trained in these disciplines become part of the cybersecurity research community, its members will be forced to continue to do the best they can to find and interpret the relevant literature.6

Results and approaches from other disciplines can help improve cybersecurity research and ultimately cybersecurity outcomes. For instance, research in other domains has documented that people do not have a perfect understanding of threats and can be oblivious to some kinds of threats, and has shown how to provide them with the mental models and incentives needed for better risk perception. Social scientists study the human determinants of trust. They find, for example, that people tend to anthropomorphize technical systems and look for cues similar to what they would get in a human interaction. Research into e-mail and “flaming” in the 1980s, when e-mail became popular, revealed this.7 People did not appreciate how much information was eliminated when voice and in-person interactions were restricted to plain text. There is research available for many of those sorts of human topics and behaviors. In this example, what are the aspects of trust in human–system interactions that would never have occurred to the people who study trust in full-bandwidth, interpersonal situations? Researchers studying interpersonal communication over e-mail asked questions that had not occurred to social psychologists before, because the new environment manipulated or constrained people’s behavior in ways the field had not previously encountered. Researchers might not have been able to make progress on the question of interpersonal trust without technology. And new technology-mediated forms of communication continue to pose new challenges to trust between humans. This illustrates a way in which cybersecurity challenges can offer new opportunities for social scientists to explore their own theories.8

___________________

6 The Computing Research Association recognized the value and importance of these disciplines to computing broadly in a 2015 letter to the House Science Committee (available at http://cra.org/govaffairs/blog/2015/04/cra-statement-opposing-america-competesreauthorization-act/).

7 S. Kiesler, J. Siegel, and T.W. McGuire, Social psychological aspects of computer-mediated communication, American Psychologist 39:1123-1134, 1984.

With regard to large-scale systems such as today’s cloud-based services, an example of the cross-cutting nature of the cybersecurity problem relates to a broader sense of trust. How should trust and expectations regarding trust, as they exist in an enterprise or between institutions, be realigned in a digital context? Many information systems treat trust as transitive (if entity A trusts B, and B trusts C, then A also trusts C), but, of course, trust does not work the same way or mean the same thing in interpersonal, social, or organizational contexts. Moreover, users of such systems may have little idea what string of relationships underlies any direct relationship.
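
To make that contrast concrete, the following is a minimal, hypothetical sketch (in Python; it is not drawn from the report) of how a system that treats trust as transitive comes to trust entities with which it has no direct relationship. The entities and trust edges are invented for illustration.

    # Illustrative only: a system that treats trust as transitive computes the
    # full set of trusted entities by following chains of trust relationships.
    from collections import deque

    def transitively_trusted(trust_edges, root):
        """Return every entity reachable from `root` along trust edges."""
        graph = {}
        for truster, trustee in trust_edges:
            graph.setdefault(truster, set()).add(trustee)
        trusted, queue = set(), deque([root])
        while queue:
            entity = queue.popleft()
            for nxt in graph.get(entity, ()):
                if nxt not in trusted:
                    trusted.add(nxt)
                    queue.append(nxt)
        return trusted

    # A trusts B, B trusts C, and C trusts D; the system concludes that A
    # trusts D, even though A may know nothing about C or D.
    edges = [("A", "B"), ("B", "C"), ("C", "D")]
    print(sorted(transitively_trusted(edges, "A")))  # ['B', 'C', 'D']

A person in A’s position would typically not extend trust this way, which is one reason the string of relationships underlying a seemingly direct digital relationship matters.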

The lack of bandwidth mentioned above does not provide a complete explanation, however. Even in non-computer-mediated interactions, people look for efficiencies and use trust cues as shortcuts to carrying out a full risk and benefit assessment. For instance, consumers do not read all reviews of a store they might visit. Instead, they quickly scan whether it is well stocked, whether staff are appropriately helpful and trained (all signs of investment), and, last, whether the other customers are similar to themselves. Once they have had a successful experience, they expect that store to deliver in the future. This approach works reasonably well in the physical world, but it is less reliable online, where it is inexpensive for adversaries to mimic trust cues and impersonate others. People bring their trust models from the physical world and do not always realize that those models do not apply in the online environment. The recently revealed hacks of major political figures show that even (or perhaps especially) prominent individuals are vulnerable and suffer the consequences.

___________________

8 See, for instance, J. Steinke, B. Bolunmez, L. Fletcher, V. Wang, A.J. Tomassetti, K.M. Repchik, S.J. Zaccaro, R.S. Dalal, and L.E. Tetrick, Improving cybersecurity incident response team effectiveness using teams-based research, IEEE Security and Privacy 13:20-29, 2015. This work applies theories about multi-team systems to cybersecurity incident response teams. Not only has that work tested existing theories; it has also led to new processes for documenting multi-team systems, whether or not they involve cybersecurity.

Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×

BUILDING COLLABORATIVE COMMUNITIES

Designing competent socio-technical systems will necessarily encompass multiple disciplines (see Box 2.3 for a list of such topics). Although there are promising subcommunities within computer science and related disciplines,9 there are significant barriers to such collaboration. Crossing disciplinary boundaries requires deep, sustained participation from researchers in the relevant disciplines, not just expecting experts in one area to cover topics outside their home discipline or to work in parallel, hoping that the pieces mesh.

If the cybersecurity community is to engage social scientists, there must be compelling reasons why research in cybersecurity would enhance their careers. These incentives may include providing a pathway for publication in social science venues that can lead to tenure and promotion, as well as framing projects so that graduate students can be involved and use the results for doctoral dissertations. Such collaborations are not natural acts for researchers in most communities. As a result, it may be useful to engage researchers who are already well versed in several disciplines to act as “translators” on cross-disciplinary projects. Thus, cybersecurity researchers should not be expected to become experts in social science research, but they need to understand the kinds of controls and quality assurance mechanisms that it requires.

Moreover, it is not likely to be fruitful to expect cybersecurity researchers to learn about the findings of other disciplines on their own. Indeed, it can be dangerous if they read a single paper (or worse, a secondary account) without knowing the context for the reported finding: how strong it is and how consistent it is with other research. More productive would be to find ways to foster interactions with other disciplines so that cybersecurity researchers and practitioners can describe the elements of the problems needing solutions, and researchers from other disciplines can identify those aspects of their disciplines that might be useful in providing insights or solutions. It is the job of a researcher in a given discipline to know the literature and implications of that discipline. This avoids the risk of cybersecurity researchers reading small bits of that literature, potentially out of context, and applying results inappropriately. Part of what cross-disciplinary teams do is verify that a result observed in one context still holds true in another, considering differences that may not be obvious to those without professional training. The institutional challenge is to support social, behavioral, and decision science experts as equal partners when working on cybersecurity research efforts.10 The point of a meaningful and sustained cross-disciplinary effort is collaboration (not consultation).

___________________

9 See, for example, the Symposium on Usable Privacy and Security (https://cups.cs.cmu.edu/soups/), the Workshop on the Economics of Information Security (http://econinfosec.org/), and the Privacy Enhancing Technologies Symposium (https://petsymposium.org/).

INCENTIVES, RESOURCES, AND RISK IN CYBERSECURITY

This section examines in more detail two specific topics of particular importance to cybersecurity practices in real-world environments: (1) incentives and resources and (2) risk analysis.

___________________

10 The research community in the United Kingdom has endeavored to develop such a community in the Research Institute in Science of Cyber Security: Researchers present results to practitioners, and practitioners present challenges and questions. If the questions cannot be addressed with existing results, a discussion ensues that in many cases leads to new research studies or projects to investigate those questions.

Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×

Understanding Incentives and Resources

Incentives for attackers and, separately, for the industry and marketplace as a whole are constantly in flux. Resources are limited, and investments need to be made carefully. One challenge is to understand when cybersecurity measures are worth investing in and when investments in other capabilities would improve security for an organization. Insights from the social sciences can help in making this determination. For instance, an analysis of airport security efforts suggests that investing in improving non-security aspects of a system can sometimes improve security more than investing in explicit security measures does.11 Even considering the narrower challenge of specifically cyber-related attacks and defenses, examining resource availability for each can be instructive. That examination should consider the incentives shaping the cybersecurity actions of all actors, from the most casual “script kiddies” to the most competent agencies of first-rate powers. Every attack has desired outcomes, and the more resources (money, time, opportunity costs) required to achieve those outcomes, the fewer of those outcomes there will be. Understanding resource constraints on the part of attackers can help with planning and decision making, and can help focus research activity.

A commonly repeated claim is that defenders have to defend everything all the time, but the attackers only have to be successful once, meaning that defenders must expend significantly more resources than attackers, especially if attackers are trading information and tools. However, there is a symmetric proposition that sometimes applies: Defenders only have to catch an attack once and make changes that thwart it worldwide, after which the attacker’s work to exploit that vulnerability becomes much more difficult, not just in terms of the expense of the immediate engagement, but possibly also in terms of consumption of “zero days.”12

But for this to be true, the defenders have to understand the specific techniques used in the attack (so-called indicators of compromise), develop and implement protective measures, and share that information with people and organizations capable of using it. And those individuals and organizations have to be incentivized to apply the appropriate remediations. Construction of defenses can include mechanisms whereby defenders not only repel an attack, but also engage with sensors, deflection, and other processes that enable them to gain useful information related to the methods, motives, and identities of attackers through the process of an engagement. Unfortunately, too many organizations do not (or cannot) benefit from such knowledge when it is developed and are thus incapable of contributing to the common defense.

___________________

11 H. Molotch, Against Security: How We Go Wrong at Airports, Subways, and Other Sites of Ambiguous Danger, Princeton University Press, Princeton, N.J., 2012.

12 A “zero day” is a vulnerability that has not been publicly disclosed or reported before being exploited. The term refers to the amount of time (zero days) available to create fixes or work-arounds before the exploit becomes active.

Another example of how incentives affect cybersecurity, and might benefit from such research, is the question of how to secure an infrastructure that depends on components developed by the open-source community. How can open-source developers be incentivized to use better tools and practices in the development of software, particularly safety- and security-critical software? More specifically, how can they be incentivized to audit or rewrite older code as new classes of vulnerabilities come to light? One potential advantage of open source is the tremendous sharing of tools and practices within the community. As a result, investments from industry and government stakeholders are likely to yield broad benefit and also to influence vendor organizations, commercial tool developers, and so on. What should be the obligations of those who use open-source software in their systems?

One effort under way to address open-source software security is the Linux Foundation’s Core Infrastructure Initiative, which was set up in the aftermath of the discovery of the Heartbleed vulnerability in OpenSSL. The processes in place at the time allowed a single developer to make updates over a holiday with just a cursory review by a “committer” before the changes were added to the codebase. Only a handful of developers were responsible for software that underpins much online commerce activity. In addition, major corporations were running systems that depended on that component, and the extent of that dependence either was not realized or was not seen as a significant risk by risk analysts or auditors in those corporations. That vulnerability revealed a number of ways in which a combination of technical, social, and organizational decisions and practices led to a significant problem.

The initiative brings organizations that depend on open-source software together with open-source development communities, with the aim of improving practice. It provides funding for individual developers to work on improving the security of their projects and is also grappling with the problem of how to improve practices more broadly. This initiative and its results are an opportunity to better understand what processes, technologies, and structures make efforts to maintain and improve open-source projects effective.

Risk Analysis

A formal risk analysis provides a way to organize information about different kinds of weaknesses a system may have (from implementation errors to inadequate backup and recovery mechanisms) and the kinds of threats that may exploit those weaknesses. It helps indicate whether sufficient resources, or too many resources, are being applied to mitigating vulnerabilities, preparing for recovery, and so on. Assessments of vulnerabilities and the likely costs of an effective response are both made in the context of threat assumptions, which also need to be characterized as part of the analysis, as do consequences.

There are essentially two ways to conduct a risk analysis.13 One is to evaluate relative risk of competing procedures and operational decisions in a given system and to understand implications of changes in the system (e.g., design, operation, maintenance) and its operating environment (e.g., tasks, adversaries, threats). This evaluation can clarify how the system works, how to manage it, and how to evaluate the quality of competing approaches. The toolkit for such analyses includes systematic analyses of existing performance data, translation of existing research into model terms, and disciplined expert elicitation.14 A much more ambitious way of using risk analysis is to try to assess the absolute level of risk in a system. Often, the latter is unproductive or even counterproductive, as it leads to large, complicated, unreviewable projects with an incentive to leave out things that are not easily quantified. Indeed, in 2013, Langner and Pederson argued that without effective ways to calculate risk, managers will always underestimate it and choose to underinvest in security.15 Instead, they will invest in other areas that they understand better and that will lead to clear payoffs (e.g., profits, better service, or improved efficiency, depending on the kind of organization).
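
As a concrete illustration of the first, relative use of risk analysis, the sketch below uses a deliberately simple expected-loss model with invented numbers; it is not a method prescribed by the report, but it shows how elicited inputs for competing options can be put on a common footing. A real analysis would also vary these inputs systematically in a sensitivity analysis.

    # Illustrative only: all rates, probabilities, and losses are invented.
    # Relative comparison of competing mitigations via a simple expected-loss model.
    def expected_annual_loss(attack_rate, p_success, loss_if_success):
        """Expected yearly loss = attacks/year * P(attack succeeds) * loss per success."""
        return attack_rate * p_success * loss_if_success

    # Hypothetical elicited estimates for the status quo and two competing options.
    scenarios = {
        "status quo":                 dict(attack_rate=50, p_success=0.04, loss_if_success=200_000),
        "option A (prevention)":      dict(attack_rate=50, p_success=0.01, loss_if_success=200_000),
        "option B (faster recovery)": dict(attack_rate=50, p_success=0.04, loss_if_success=60_000),
    }

    baseline = expected_annual_loss(**scenarios["status quo"])
    for name, inputs in scenarios.items():
        eal = expected_annual_loss(**inputs)
        print(f"{name}: expected annual loss ${eal:,.0f} ({eal / baseline:.0%} of status quo)")

What matters in such a comparison is the ranking of options and the ratios between them under shared threat assumptions, not the absolute dollar figures, which are precisely what the more ambitious, absolute form of risk analysis struggles to defend.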

Conducted appropriately, risk analysis provides an orderly way of putting diverse kinds of information on a common footing. Risk analysis can inform system design by creating a transparent platform for sharing assumptions about the performance of the system’s elements and their interdependencies. It forces thinking about the metrics that are relevant to a decision maker. Often the things that are readily countable are not the things that are most important. And, as will be discussed in Chapter 4, metrics regarding cybersecurity have been notoriously difficult to develop. Effective risk analysis can provide tools for structuring and understanding what information can be collected. Risk analysis requires inputs from experts in all the factors affecting system performance, integrated with the analytical tools of decision science. The result is a transparent platform for summarizing, evaluating, and communicating the state of system knowledge. This type of risk analysis is not done widely in cybersecurity, and research here has high potential payoff in taking full advantage of knowledge in security science.

___________________

13 See B. Fischhoff, The realities of risk-cost-benefit analysis, Science 350(6260):527, 2015; and B. Fischhoff and J. Kadvany, Risk: A Very Short Introduction, Oxford University Press, Oxford, U.K., 2011.

14 M.G. Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences 111(20):7176–7184, 2014.

15 R. Langner and P. Pederson, “Bound to Fail: Why Cyber Security Risk Cannot Be ‘Managed’ Away,” Brookings Institution, 2013, https://www.brookings.edu/research/bound-to-fail-why-cyber-security-risk-cannot-be-managed-away/.

MAKING PROGRESS

The committee identified barriers to collaboration both within and among the disciplines and recommends strategic institutional initiatives to overcome them. It also identified some promising initiatives to integrate social and behavioral sciences into cybersecurity. One is the National Science Foundation’s (NSF’s) Secure and Trustworthy Cyberspace portfolio, a cross-cutting effort that explicitly includes social and behavioral science. Another recent NSF effort laid out a vision for experimental cybersecurity research that urges a culture of experimentation, an emphasis on multidisciplinarity, the grounding of basic work in real-world systems, and recognition of the range of human actors who shape system performance.16

More can be done to improve prospects for and outcomes of interdisciplinary research. There are many barriers to effective and ongoing integration of disparate research cultures. In particular, although individual researchers and research projects can look for ways to work across disciplines, sustained long-term efforts that build on results over time and continue to integrate new results from other disciplines will require support and commitment from both the research community and research funders. Knowledge regarding user and organizational incentives for, or obstacles to, implementing changes in practice and policies is needed for such changes to be put into place. An assertive approach to multidisciplinary integration could lead to a culture of foundational research that involves a conscious and sustained interplay between technical advances and the incorporation of results from the social and behavioral sciences about how to change systems, developer practices, user expectations, and institutional policies.

__________________

16 D. Balenson, L. Tinnel, and T. Benzel, Cybersecurity Experimentation of the Future (CEF): Catalyzing a New Generation of Experimental Cybersecurity Research, 2015, http://www.cyberexperimentation.org/files/5514/3834/3934/CEF_Final_Report_20150731.pdf.

Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 21
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 22
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 23
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 24
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 25
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 26
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 27
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 28
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 29
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 30
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 31
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 32
Suggested Citation:"2 The Role of Social, Behavioral, and Decision Sciences in Security Science." National Academies of Sciences, Engineering, and Medicine. 2017. Foundational Cybersecurity Research: Improving Science, Engineering, and Institutions. Washington, DC: The National Academies Press. doi: 10.17226/24676.
×
Page 33
Next: 3 Engineering, Operational, and Life-Cycle Challenges in Security Science »
  1. ×

    Welcome to OpenBook!

    You're looking at OpenBook, NAP.edu's online reading room since 1999. Based on feedback from you, our users, we've made some improvements that make it easier than ever to read thousands of publications on our website.

    Do you want to take a quick tour of the OpenBook's features?

    No Thanks Take a Tour »
  2. ×

    Show this book's table of contents, where you can jump to any chapter by name.

    « Back Next »
  3. ×

    ...or use these buttons to go back to the previous chapter or skip to the next one.

    « Back Next »
  4. ×

    Jump up to the previous page or down to the next one. Also, you can type in a page number and press Enter to go directly to that page in the book.

    « Back Next »
  5. ×

    Switch between the Original Pages, where you can read the report as it appeared in print, and Text Pages for the web version, where you can highlight and search the text.

    « Back Next »
  6. ×

    To search the entire text of this book, type in your search term here and press Enter.

    « Back Next »
  7. ×

    Share a link to this book page on your preferred social network or via email.

    « Back Next »
  8. ×

    View our suggested citation for this chapter.

    « Back Next »
  9. ×

    Ready to take your reading offline? Click here to buy this book in print or download it as a free PDF, if available.

    « Back Next »
Stay Connected!