Crafting policies that encourage scientific openness and transparency while, in appropriate cases, limiting the dissemination of research results that might be applied to harmful ends is a difficult challenge.1 There is significant debate among and within different communities about whether (and, if so, how) to limit the dissemination of such research results.2 Numerous questions inform the debate. What types of information could be employed to cause harm? What criteria should be met before potentially harmful information is restricted? How widely should information be shared? What damage might particular information cause in the wrong hands? What materials and resources are necessary to translate information into a harmful application? Could the information be obtained through other means? What benefits might be foregone by not sharing information with those who could use it for legitimate purposes? What does the scientific community lose in quality control or follow-on work by withholding information from a wider audience?3
To assist it in answering questions such as these, the committee invited presentations and commissioned papers to explore options for the management of dual use research of concern (DURC). These papers are available at https://www.nap.edu/catalog/24761 under the Resources tab. The committee gathered information both at a public information-gathering meeting on July 11-12, 2016, and at a public workshop on January 4, 2017. The July meeting consisted of presentations by invited experts, and the January workshop consisted
1 For a discussion of this “wicked” problem, see, e.g., G. D. Koblentz, “Dual Use Research as a Wicked Problem,” Frontiers in Public Health, August 2014, Vol. 2, Article 113, doi:10.3389/fpubh.2014.00113.
2 Journal Editors and Authors Group, “Statement on Scientific Publication and Security,” Science Online, 2003, Vol. 299, No. 5610, p. 1149; David A. Relman, Stanford University and VA Palo Alto Health Care System, Presentation to the committee, July 11, 2016, New York, NY; and Carrie Wolinetz, National Institutes of Health, Presentation to the committee, July 11, 2016, New York, NY.
3 Gerald L. Epstein, White House Office of Science and Technology Policy, Presentation to the committee, July 11, 2016, New York, NY.
of presentations by the authors of the commissioned papers. At each event, members of the committee engaged presenters and members of the public in open discussions.
Presenters widely recognized the inherent tension between conducting life sciences research for the public good (e.g., to achieve economic, environmental, and public health benefits) and limiting the risks associated with the fraction of life sciences research that could be directly misapplied to cause great harm. Acknowledging the limitations of current mechanisms for the management of life sciences research of concern, many expressed support for more effective policies and guidance for researchers, research institutions, journal editors, and funders with regard to the conduct and dissemination of such research.4
FOUNDATIONAL U.S. POLICIES ON THE DISSEMINATION OF RESEARCH
National Security Decision Directive 189 (NSDD-189)
As noted in Chapter 2, NSDD-189 provides the foundation for current U.S. policy on the dissemination of scientific information. It states that “no restrictions may be placed upon the conduct or reporting of federally funded research that has not received national security classification, except as provided in applicable U.S. statutes.”
In their paper commissioned for the committee, Michael Imperiale (University of Michigan) and David A. Relman (Stanford University and VA Palo Alto Health Care System) argued that much research currently falls into a gray area between fundamental research, which is to be shared openly, and sensitive research warranting classification.5 They noted that the Corson Report, whose recommendations were the basis for the creation of NSDD-189, identified four criteria for identifying types of research that fell into neither the fundamental research nor the sensitive research categories. Research in this category was identified as potentially needing some sort of restriction, including voluntary restraint.6
4 Other researchers believe that too many rules and regulations apply to (and ultimately hinder) the conduct of research. Many such requirements are documented in the 2016 National Academies’ report, Optimizing the Nation’s Investment in Academic Research: A New Regulatory Framework for the 21st Century [National Academies of Sciences, Engineering, and Medicine, Optimizing the Nation’s Investment in Academic Research: A New Regulatory Framework for the 21st Century (Washington, DC: The National Academies Press, 2016), doi:https://doi.org/10.17226/21824]. The report notes, for example, that the lack of harmonization of select agent regulations across agencies decreases efficiency (see p. 185). It suggests that improvements to the current export control regime could “bring significant benefits to national security, to commerce, and to the economy, as well as to federally funded university research” (see p. 191).
Imperiale observed that the international threat environment has changed since NSDD-189 was adopted. At that time, the United States enjoyed a significant technological advantage over its adversaries. “Now that the research playing field has leveled out,” he asked, how will the United States “stay one step ahead of those with nefarious tendencies?”7 He suggested that “slowing down access to dangerous information temporarily” is an alternative to having such information “freely accessible immediately.”8
Other workshop participants made the counter-argument that more emphasis should be placed on determining what is actually dangerous and on what can be addressed through regulatory or technical means, rather than on keeping information secret without taking counter-actions. Further, they suggested that, now that the playing field has leveled out, other nations also are funding and publishing (or not publishing) information that has the potential to be misused.9
In certain circumstances, U.S. export control regulations can impose restrictions on the flow of potentially dangerous biological information and materials. Voluntary acceptance of restrictions on communications could make fundamental research results that might otherwise be freely disseminated subject to export controls.10 The International Traffic in Arms Regulations (ITAR), administered by the U.S. Department of State Directorate of Defense Trade Controls, and the Export Administration Regulations (EAR), administered by the U.S. Department of Commerce, Bureau of Industry and Security, apply to the dissemination of life sciences research.11 Of course, export controls are
5 Michael Imperiale and David A. Relman, Options for Management of Potentially Dangerous Information Generated by Life Science Research (commissioned paper available at https://www.nap.edu/catalog/24761 under the Resources tab).
6 NSDD-189 did not address research in this “gray area.” Imposing the type of restrictions described would require a revision to NSDD-189 or the adoption of a specific statute.
7 Michael Imperiale, University of Michigan, Presentation to the committee, July 11, 2016, New York, NY.
8 Imperiale and Relman, p. 10.
9 In the particular cases of the mousepox and botulinum papers, some have suggested that scientific/technical information later came to light that demonstrated that both papers may not have been as useful for misuse as they first seemed. This knowledge would not have come to light had the information not been available to a diverse technical audience.
10 Specifically, the subject arose during the controversy over the publication of the H5N1 avian influenza papers in 2011.
11 Kimberly Strosnider, Doron Hindin, and Peter D. Trooboff, The Role of Export Controls in Regulating Dual Use Research of Concern: Striking a Balance Between Freedom of Fundamental Research and National Security (commissioned paper available at https://www.nap.edu/catalog/24761 under the Resources tab).
inherently limited as a control mechanism because they can result in restrictions only for interactions that constitute exports. Exchanges of information among U.S. citizens within the United States do not constitute exports and are beyond the reach of the export control system.
In their presentation to the committee at its January 2017 workshop, Kimberly Strosnider and Doron Hindin (Covington & Burling LLP) discussed the ITAR’s U.S. Munitions List and the EAR’s Commerce Control List as they relate to controls on pathogens and toxins. Prior to December 31, 2016, Category XIV of the Munitions List controlled all “[b]iological agents and biologically derived substances specifically developed, configured, adapted, or modified for the purpose of increasing their capability to produce casualties in humans or livestock, degrade equipment or damage crops,” as well as related technical data and defense services. However, effective December 31, 2016, this language was replaced, clarifying that the ITAR control only the most highly sensitive pathogens that have been effectively weaponized through gain-of-function (GOF) intervention. Other pathogens are subject to U.S. export controls through the EAR.12 The EAR’s Commerce Control List includes “dozens of microbes, including all 15 DURC agents and those regulated by the Federal Select Agents Program, as well as certain related vaccines, immunotoxins, medical products, etc. and related technology.”13 Strosnider and Hindin described how export controls cover cross-border activity, such as transfers from the United States or transfers between foreign countries, as well as the release of controlled information within the United States or abroad to non-U.S. persons.
Both regulations complement NSDD-189 in that they have carve-outs for fundamental research. EAR exemptions include “published” information. They allow internet upload and prepublication review by co-authors, other researchers, and conference organizers. In their paper commissioned for the committee, Strosnider, Hindin, and Trooboff stated that, under the EAR, researchers can also “freely share their research with the public, such as by uploading their research results to the Internet, unabated by EAR controls.”14 This applies to research that is not otherwise restricted, such as, for example, through government security classification or by the terms of a federally funded research grant. The ITAR have a public domain exemption and do not cover unrestricted information released into the public domain via eight specific modes of release, including libraries, newsstands, open conferences in the United States, and others. A federal appeals court in September 2016 ruled that, unlike the EAR,
12 Ibid, p. 5.
13 Kimberly Strosnider and Doron Hindin, Covington & Burling LLP, Presentation to the committee, Washington, DC, January 4, 2017.
14 Strosnider, Hindin, and Trooboff, p. 14.
the ITAR do not recognize an exemption for information released through the internet without government approval, where the information does not otherwise qualify for the public domain exemption.15
At the January workshop, Strosnider and Hindin argued that there is incongruity between the aims of export controls and the consequences of their being applied to the dissemination of research results. As an example, they explained that export controls do not apply to the importation of DURC into the United States or to release to U.S. persons within the United States, despite national security risks that can arise, and have in fact arisen, in such contexts. Moreover, export controls typically regulate items based on technical specifications or sensitivity of the item and, in some cases, based upon the purposes for which the item was developed. When export controls might apply to the results of unrestricted research, a key issue is whether the information has been or will be released to the public through a designated means. A researcher’s decision to release such unrestricted and unclassified information to the public can remove EAR controls on publicly available material, regardless of the content of the communication. As a result, the sensitivity of the information does not determine whether it is subject to export controls.
Thus, Strosnider and Hindin suggested that export controls are poorly suited to protect national security with respect to DURC that is imported into the United States or released within the United States to U.S. nationals, or with respect to unrestricted research results that are or will be properly released to the public.
Strosnider and Hindin further explained that broadening U.S. export controls to apply to privately generated research results could expose regulatory authorities to constitutional First Amendment claims. Such claims have been litigated in the past with respect to the ITAR, though the government has thus far prevailed—albeit by a narrow margin at times—with federal district
15 In their paper (see p. 8), Strosnider, Hindin, and Trooboff address the particular case of publication of material on the internet and describe the 2016 ruling:
“a federal appeals court ruled that the Department of State, through the ITAR, could deny a U.S. citizen the ability to share privately-generated, unclassified information with the public through the Internet. The information at issue was a computer-aided-design file of a gun that would allow anyone to produce firearms using commercially available 3-D printers. In a 2-1 ruling, the court sided with the State Department, determining that the government’s ‘exceptionally strong interest in national defense and national security outweighs Plaintiff[’s] very strong constitutional rights under these circumstances.’ The court accepted the State Department’s position that the ITAR’s public domain exception was not available because the intangible and informal mode of dissemination of the computer file had failed to correspond to any of the ITAR’s eight enumerated public domain provisions; had the data been published in materials available at a public library or newsstand, instead of the Internet, the ITAR public domain provision likely would have applied. The majority decision prompted a vigorous dissent,” which “argued that the government’s attempt to restrict uploading such information to the Internet ‘appears to violate the [ITAR’s] governing statute, represents an irrational interpretation of the [ITAR], and violates the First Amendment.’”
and appellate courts prioritizing national security interests over litigants’ First Amendment rights.
Finally, Strosnider and Hindin suggested that asserting more stringent export control rules on research results could erode incentives for collaboration between the scientific research community and the U.S. government. As an example, they explained that increased enforcement or stricter export control rules might encourage researchers to avoid important prepublication national security reviews by relying on the EAR and ITAR public domain exceptions.
LIMITATIONS OF DURC POLICY
The 2012 and 2014 U.S. government policies for the oversight of life sciences DURC apply to U.S. government-funded research that involves the use of one of the 15 specified agents or toxins and one of the 7 categories of experiments. Several presenters noted that the result is policy that is simultaneously too broad and too narrow—not all research involving the identified pathogens and experiment types is research of concern, while experiments outside the listed categories may still raise dual use concerns. At the January workshop, Imperiale observed, “It is obvious that manuscripts involving pandemics are important, but what about new technologies—synthetic biology, gene drives, etc.? What about technologies that we haven’t thought of yet?” As we move forward, he noted, there will always be new things to be concerned about.
Determinations regarding the appropriate scope of restraints on dissemination are complicated by the need to weigh biosecurity risks against the benefits of free and open communication. Such assessments currently are made on a case-by-case basis by a variety of agencies and organizations. With no agreed-upon common standard or process, the tradeoffs between biosecurity risks and the benefits of open communication continue to be debated. Under the current system, DURC policies could, in certain instances, place constraints on research that exceed the level of control necessary to serve legitimate biosecurity goals. On the other hand, current DURC policies might not constrain research that arguably should be subject to restriction.
Current DURC policies do not apply to classified research, to research that does not involve one of the 15 specified agents and particular types of experiments, or to research at institutions that do not receive U.S. government funding.16 Non-compliance carries the potential risk of withdrawal of federal funding, but it is not clear whether other sanctions would, in fact, be imposed.
Much of the current policy is focused on formal publication. However, as noted in statement 4 of the journal editors’ “Statement on Scientific Publication and Security” in 2003, scientific information is communicated by many other
16 Elisa D. Harris, University of Maryland, Presentation to the committee, July 11, 2016, New York, NY.
means: “seminars, meetings, electronic posting, etc.”17 Traditionally, scientific results were published primarily in printed journals, which were readily accessible only to subscribers or at institutions holding subscriptions. Even when journal articles began to be posted online, they were accessible only to those with subscriptions. The open access movement and the proliferation of public digital libraries have subsequently made much scientific literature widely available. Researchers also increasingly choose to post pre-prints of their research to internet servers before peer review.18 In addition, blogs, conferences, and widespread internet communication mean that research results are often available long before formal publication. Moreover, while “journals and scientific societies can play an important role in encouraging investigators to communicate results of research in ways that maximize public benefits and minimize risks of misuse,”19 there is no mutually agreed-upon approach to decisions surrounding the publication of DURC findings. The diverse channels through which information may be shared present challenges for the development of any policy attempting to manage dissemination.
At the committee’s July 2016 meeting, Philip Campbell (Nature) reported that, in 2012, the editors of Nature decided that, as a general policy, the journal would not redact key findings or distribute information only to selected recipients. He suggested that redacting key data or methods disables subsequent research and peer review and that the distribution of redacted information to a select group of people on a need-to-know basis is practically infeasible because of questions such as: “Who holds the data?”; “Which criteria are used to determine who is allowed to see the redacted information?”; “Who decides by which mechanisms the information is then made accessible?”; and “How can information distributed to a university or public health laboratory remain confidential?” He suggested that biosecurity constraints on publication risk eroding the robustness of the field if reproducibility is not tested. Further, he suggested that delays and uncertainty about the ability to publish and to get credit may discourage young scientists from entering the field.
Campbell said that Nature has had a few papers of dual use concern since the 2011 GOF controversy. “There are six examples of such papers from 2015
17 Journal Editors and Authors Group, “Statement on Scientific Publication and Security.” See also A. S. Fauci and F. S. Collins, “Publishing Risky Research,” Nature, May 2, 2012, Vol. 485, No. 5.
18 For a discussion of the use of pre-prints in the biological sciences, see P. D. Schloss, Preprinting Microbiology, doi:https://doi.org/10.1101/110858. Available at http://www.biorxiv.org/content/early/2017/03/15/110858.article-info.
19 Journal Editors and Authors Group, “Statement on Scientific Publication and Security.”
and 2016” for which a technical assessment was seen as needed. In each case, he said, the outcome was “that no paper was rejected on the basis of risk.”20
Inder Verma (Proceedings of the National Academy of Sciences of the United States of America) raised the issue of pre-print servers: what, he asked, should be done with material deposited on such servers, where it is publicly available before receiving external review?21
Imperiale stated that he believes that neither reviewers nor editors are properly trained to identify DURC. He suggested that few individuals are qualified to make appropriate decisions about DURC and argued that it is unfair to ask journals to screen manuscripts for DURC, as they do not have the proper experience.22 He suggested that we “change the status quo and encourage [funding agency] responsibility by identifying potential DURC projects upfront and com[ing] up with a proactive plan.”23 In his paper commissioned for the committee, Tim Stearns (Stanford University) stated that he also believes
20 Philip Campbell, Nature, Presentation to the committee, July 11, 2016, New York, NY.
A wider literature illustrates how few journals have policies to address dual use research. A 2009 survey of 28 major life sciences journals found that “few of the English-language publishers and none of the Russian and Chinese publishers surveyed implement formal biosecurity policies or inform their authors and reviewers about potentially sensitive issues in this area.” See J. van Aken and I. Hunger, “Biosecurity Policies at International Life Science Journals,” Biosecurity and Bioterrorism, April 2009, Vol. 7, No. 1. In 2011, an even wider survey found that, “of the 155 journals that responded” (a 39% response rate), “only 7.7% stated that they had a written dual-use policy and only 5.8% said they had experience reviewing dual-use research in the past 5 years.” See D. B. Resnik, D. D. Barner, and G. E. Dinse, “Dual-Use Review Policies of Biomedical Research Journals,” Biosecurity and Bioterrorism, March 2011, Vol. 9, No. 1. A 2012 survey “of 127 chief editors of life sciences journals in 27 countries to examine their attitudes toward and experience with the review and publication of dual-use research of concern” found that “very few editors (11) had experience with biosecurity review, and no editor…reported having ever refused a submission on biosecurity grounds. Most respondents (74.8%) agreed that editors have a responsibility to consider biosecurity risks during the review process, but little consensus existed among editors on how to handle specific issues in the review and publication of research with potential dual-use implications.” See D. Patrone, D. Resnik, and L. Chin, “Biosecurity and the Review and Publication of Dual-Use Research of Concern,” Biosecurity and Bioterrorism, September 2012, Vol. 10, No. 3.
21 Inder Verma, Proceedings of the National Academy of Sciences of the United States of America, Presentation to the committee, July 11, 2016, New York, NY.
22 M. Imperiale, Presentation to the committee, July 11, 2016, New York, NY.
23 Ibid. Prior to the promulgation of the United States Government Policy for Oversight of Life Sciences Dual Use Research of Concern in 2012, a number of agencies (e.g., the National Institutes of Health, the Department of Homeland Security, the Department of Defense) had policies in place to review their intramural research for potential DURC. The 2012 policy required all “federal departments and agencies that conduct or fund life sciences research” to “conduct a review to identify all current or proposed, unclassified intramural or extramural, life sciences research projects” involving specific select agents and types of experiments to determine whether they met the definition of DURC. See United States Government Policy for Oversight of Life Sciences Dual Use Research of Concern.
that there “are relatively few practicing scientists with sufficient background to assess [. . . DURC], and to engage with the relevant government personnel.”24
Were a federal agency to determine that specific research poses a risk, current DURC policy requires it to work with the researcher or institution to formulate a risk mitigation plan, which may include “determining the venue and mode of communication (addressing content, timing, and possibly the extent of distribution of the information) to communicate the research responsibly.”25 At the committee’s July 2016 meeting, Harris noted that government agencies have different requirements for when risk-benefit assessments must be carried out and risk mitigation plans developed.26
Carrie Wolinetz (National Institutes of Health) noted that the U.S. Department of Health and Human Services has developed a framework to guide funding decisions on proposals for research anticipated to generate Highly Pathogenic Avian Influenza (HPAI) H5N1 viruses.27 The framework requires multi-disciplinary, department-level, pre-funding review and approval for research that is reasonably anticipated to generate certain avian influenza viruses that are transmissible in mammals via the respiratory route.28
EDUCATION AND TRAINING
U.S. guidance policies place substantial responsibility on Principal Investigators (PIs) and institutional review entities (IREs). Fulfilling these responsibilities requires significant awareness of the relevant issues, but there are serious questions as to whether these individuals and entities have opportunities to gain the expertise necessary to identify, assess, and mitigate communication risks. Furthermore, systematic mechanisms for sharing best practices and lessons learned do not exist.
In his paper, Stearns provided a description of the lack of awareness among his peers about DURC and security issues. Citing his own academic experience in a highly regarded biology department, he relayed how few faculty are familiar with the work of the National Science Advisory Board for Biosecurity (NSABB) or with DURC and security concerns in general: “it is possible to function at a very high level in the research community with essentially no engagement with this issue.” Anecdotally, he noted that many of his colleagues believe that DURC “regulations are for researchers working on explicitly ‘concerning’
24 Tim Stearns, Moving Beyond Dual Use Research of Concern Regulation to an Integrated Responsible Research Environment (commissioned paper available at https://www.nap.edu/catalog/24761 under the Resources tab).
25 National Science Advisory Board for Biosecurity. Responsible Communication of Life Sciences Research with Dual-Use Potential (Washington, DC, 2007). See item 4.1.e.vii., p. 3.
26 E. D. Harris, Presentation to the committee.
28 C. Wolinetz, Presentation to the committee.
problems” and have the “tendency to view the government simultaneously as a welcome source of research funding, and an unwelcome source of burdensome regulations.”29 He also believes that many life scientists have little to no knowledge about the history of the development of nation-state biological weapons programs.
Stearns cited an inadequate articulation of the risk. “Neither ‘adversary’ nor ‘malevolent purposes,’” he said, “is well-defined in most scenarios, and are taken to mean different things in different contexts, and by people with differing knowledge of the capabilities and intent of various potential adversaries.”32
In their paper commissioned for the committee, Duane Lindner and Winalee Carter of Sandia National Laboratories described the laboratories’ methodology for assessing possible risks associated with information generated by their research programs. They recognized the challenge posed by the rapid pace of change in science and biotechnology, “which can affect the risk/benefit calculus in sudden and discontinuous ways,” and emphasized that “attention to establishing a culture that is aware of the risks and ready to help manage them is essential.”33
Lindner and Carter acknowledged that, for researchers, the “natural enthusiasm about the benefit of specific work can lead to amplification, while lack of specific information about risk—information about actions by adversaries or careful and thoughtful assessment of potential negative consequences—can lead one to minimize or discount risk.” They acknowledged that this dynamic may be less severe at institutions like theirs where most researchers have access to
29 Stearns, p. 5.
30 S. Whitby and M. Dando, “Effective Implementation of the BTWC: The Key Role of Awareness Raising and Education,” Bradford Review Conference Paper No. 26, November 2010. Available at http://www.brad.ac.uk/acad/sbtwc/briefing/RCP_26.pdf.
31 E. D. Harris, Presentation to the committee.
Similar findings were reported by the National Research Council [see National Research Council, A Survey of Attitudes and Actions on Dual Use Research in the Life Sciences: A Collaborative Effort of the National Research Council and the American Association for the Advancement of Science (Washington, DC: The National Academies Press, 2009), doi:https://doi.org/10.17226/12460] and in B. Rappert, ed., “Education and Ethics in the Life Sciences” (Australian National University E Press, 2010), available at http://press-files.anu.edu.au/downloads/press/p51221/pdf/book.pdf?referer=190).
32 Stearns, p. 8.
33 Duane Lindner and Winalee Carter, Sandia National Laboratories, Control of Sensitive Information: Policy, Procedure, and Practice in a National Security Context (commissioned paper available at https://www.nap.edu/catalog/24761 under the Resources tab).
fuller details about potential threats: “knowledge can help engender a culture of caution as appropriate.”34
Several presenters suggested that a key way to increase the scientific community’s awareness of DURC is to integrate national security issues into student and staff training, potentially through expanded programs covering biosafety and/or responsible conduct of research. Lindner and Carter noted that “training can ensure that personnel can understand why information is sensitive, how to identify sensitive information, and what policies and procedures should be followed.”35 At the January 2017 workshop, Joseph Kanabrocki (The University of Chicago) urged training programs for researchers and staff.36 In his paper commissioned for the committee, Sam Weiss Evans (Harvard Kennedy School of Government) recommended including training on biosecurity from the outset of students’ careers, “incorporated within a broader curriculum on responsible research and innovation.”37
Evans believes that the strongest change will come from efforts to promote, among the next generation of academic leaders in emerging technologies, the view that science and security are not mutually exclusive, and then to support efforts to achieve institutional change in the training of students.38 He described three programs in synthetic biology that train students and early career professionals in responsible innovation in general or biosecurity in particular. The Synthetic Biology Leadership Excellence Accelerator Program (Synbio LEAP) brings a network of people from academia, industry, and government into broad discussions about responsible innovation and stewardship within synthetic biology. The Emerging Leaders in Biosecurity Initiative of the Johns Hopkins Bloomberg School of Public Health’s Center for Health Security focuses more specifically on biosecurity. In the International Genetically Engineered Machine (iGEM) competition, in which more than 6,000 students from 40 countries compete each year, the Human Practices committee has worked closely with the Federal Bureau of Investigation and other national organizations, industry, and academia to design a range of methods both to make students aware of security concerns in their work and to steer the work they are allowed to do away from the most likely security-sensitive areas, for example through a policy on the development of gene drives. Together these initiatives constitute models for providing biosecurity training to students and researchers in the life sciences.
34 Ibid., p. 7.
35 Ibid., p. 7.
36 Joseph Kanabrocki, The University of Chicago, Presentation to the committee, January 4, 2017, Washington, DC.
38 Ibid., p. 6.
Reflecting on the “maker” community of life sciences practitioners, Stearns stated in his paper that he believes that the number of those “who don’t pass through the standard college or university training . . . [and] who are capable and came upon that capability independent of such training is relatively small.” He suggested, however, that the number “is likely growing as part of the growth of the maker community.” He added that “there are many opportunities to interact with this and related communities, and [that] there are some excellent examples of individual efforts in the government [that] can have a large effect.”39
In his paper commissioned for the committee, Joseph Kanabrocki discussed how, at The University of Chicago, “all research staff involved with” select agent research “are committed to the ethical and responsible conduct of science.”40 He described the institution’s code of conduct, one element of a culture of awareness about biosafety, which life sciences researchers who work with select agents sign annually. Kanabrocki suggested that similar codes of conduct be expanded to embrace biosecurity concerns related to DURC. He argued that such codes would heighten researchers’ awareness of DURC and would encourage them to explore alternative experimental approaches that generate the desired information through less risky means.41
A general code of conduct used at Kanabrocki’s institution includes items relating to scientific and personal integrity (intellectual honesty, transparency about conflicts of interest, fairness in peer review, etc.) and to technical responsibilities around safety protocols and various training requirements. Unlike the code of conduct for those working with select agents, the general code does not require researchers’ signatures. Kanabrocki believes that signatures should be required for it as well.
One item of the code relates to research with dual-use potential: “responsibilities include protection of potentially sensitive information and awareness of reporting and publication requirements associated with research with dual use potential.”42 Kanabrocki noted that the code of conduct is taken seriously by researchers and is embedded in a research culture in which regularly scheduled meetings offer a place for questions and discussion of potential biosafety issues. An expansion of this point could integrate biosecurity concerns into a proven structure already in place for biosafety. Kanabrocki noted, however, that not all institutions have such systems in place.
39 Stearns, p. 10. Stearns cited, as an example, the efforts of the Federal Bureau of Investigation to connect with the “homebrew” communities, through, for instance, its sponsorship of and involvement in the International Genetically Engineered Machine (iGEM) competition. Stearns used the terms “maker community” and “homebrew community” as synonyms for the DIYbio community (see Box 2-1); technically, these terms describe different DIY communities.
41 Ibid., p. 11.
42 J. Kanabrocki, Presentation to the committee.
With regard to laboratory practice, Kanabrocki believes that “lab accidents and laboratory-acquired infections [are] under-reported and the opportunities for sharing our best practices are missed as a result.” Further, “Lessons that are learned through the investigations of accidents and injuries or illnesses should be shared so that we can learn from each other’s mistakes.”
While data on laboratory safety are incomplete,43 biosafety and biosecurity data are available for research involving select agents. Those data show that where laboratory workers are provided with rigorous training, the safety and security record is very good. From 2004 to 2010, there were approximately 10,000 select agent investigators and among them there were: (1) no reports of theft; (2) one lost shipment (out of 3,412); and (3) 11 laboratory-acquired infections, with no fatalities or secondary infections. Of the 11 infections, the majority occurred outside of high-containment facilities in laboratories that did not customarily work with highly pathogenic organisms (i.e., diagnostic and BSL-2 laboratories) and where workers may lack training in handling such organisms.44
Recent biosafety incidents have sparked efforts to improve biosafety training and practices. A series of biosafety lapses at U.S. government laboratories in 2014, for example, led to the creation of a trans-federal task force that issued an array of recommendations designed to improve biosafety practices and foster a strong culture of responsibility.45 The implementation of those recommendations engaged multiple agencies in addressing the problems.
Biosafety and biosecurity experts nonetheless continue to express concern about the lack of consistent and systematic reporting of biosafety and biosecurity lapses.46 The reporting of biosafety errors and accidents provides a basis for continual learning and improvement. A lack of national requirements for reporting and sharing data about such “near misses” represents a lost opportunity to promote the best possible biosafety practices. Indeed, the NSABB called for the creation of a national database to house such data.47 As part of its review of the 2014 biosafety lapses in federal laboratories, the Federal Experts Security Advisory Panel (FESAP) recommended the creation of “a new voluntary, anonymous, non-punitive incident-reporting system for research laboratories that would ensure the protection of sensitive and private information, as necessary.” At the time of this report, that system was being developed and implemented in stages, beginning with a pilot by the U.S. Department of Health and Human Services.48
44 J. Kanabrocki, Presentation to the committee.
45 See Federal Experts Security Advisory Panel, Report of the Federal Experts Security Advisory Panel (Washington, DC, December 2014) and White House, Next Steps to Enhance Biosafety and Biosecurity in the United States (Washington, DC, 2015). Both documents are available at https://www.phe.gov/s3/Pages/default.aspx.
46 In addition to the discussion at the committee’s January 4, 2017, workshop, see the accounts of the two international symposia on the GOF controversy organized by the Academies at the request of the White House: Institute of Medicine and National Research Council, Potential Risks and Benefits of Gain-of-Function Research: Summary of a Workshop (Washington, DC: The National Academies Press, 2015), doi:https://doi.org/10.17226/21666 and National Academies of Sciences, Engineering, and Medicine, Gain-of-Function Research: Summary of the Second Symposium, March 10-11, 2016 (Washington, DC: The National Academies Press, 2016), doi:https://doi.org/10.17226/23484.
Those working with select agents are required to report “theft, loss, and release” of agents from laboratories registered with the program. Some argue, however, that the punitive tone of select agent reporting requirements discourages individuals from sharing biosecurity “near misses” that would be valuable learning tools if they were more widely available.49
Lindner and Carter highlighted the importance of a culture of awareness about sensitive information at Sandia National Laboratories. While acknowledging that policies and procedures are important for ensuring effective risk assessment in a changing environment, they emphasized that these policies and procedures must be embedded in a culture of awareness about information management in order to be effective. The authors noted that “all institutions have policy, procedures, and cultures that control sensitive information of other types” and suggested that those structures be expanded to include information generated from DURC. Elements of their culture of awareness include signs throughout the laboratory spaces reminding workers of the presence of sensitive information, the risks inherent in mishandling it, and their individual responsibility; the regular dissemination of information about malicious attempts to gain access to sensitive information; and briefings to researchers about threats to sensitive information generated at the laboratories.50
Local, national, and international approaches can provide awareness-raising, education and training, and ongoing guidance and opportunities to share best practices and develop common approaches to manage the dissemination of scientific information.51 Measures can include biosecurity modules in training courses for graduate students and others; frequent review of guidelines and the framework for oversight and regulation; and careful monitoring and reporting of situations in which misuse of biological materials might occur.52
47 See Recommendations for the Evaluation and Oversight of Proposed Gain-of-Function Research.
48 Report of the Federal Experts Security Advisory Panel, p. 4, and Implementation of Recommendations of the Federal Experts Security Advisory Panel (FESAP) and the Fast Track Action Committee on Select Agent Regulations (FTAC-SAR), October 2015, p. 8 (available at https://www.phe.gov/s3/Documents/fesap-ftac-ip.pdf).
49 See, e.g., the presentations of Barbara Johnson in Institute of Medicine and National Research Council, Potential Risks and Benefits of Gain-of-Function Research: Summary of a Workshop (Washington, DC: The National Academies Press, 2015), doi:https://doi.org/10.17226/21666, and of Gavin Huntley-Fenner in National Academies of Sciences, Engineering, and Medicine, Gain-of-Function Research: Summary of the Second Symposium, March 10-11, 2016 (Washington, DC: The National Academies Press, 2016), doi:https://doi.org/10.17226/23484.
50 Lindner and Carter, p. 6, and Duane Lindner, Sandia National Laboratories, Presentation to the committee, January 4, 2017, Washington, DC.
Stearns suggested that it is necessary to “develop a better understanding of the effectiveness of the measures already taken to educate [researchers] about dual use issues” and of measures “that might be taken in the future.” “There is very little data,” he said, “about what scientists, broadly considered, and the public really understand in this domain, and how they think about some important issues.”53
51 A general introduction and overview may be found in National Research Council, Challenges and Opportunities for Education About Dual Use Issues in the Life Sciences (Washington, DC: The National Academies Press, 2011), doi:https://doi.org/10.17226/12958. The National Academies have carried out an extensive program of international activities on dual use issues supported by the Department of State, including Education Institutes in Responsible Science in the Middle East/North Africa and South and Southeast Asia (see Responsible Conduct in the Global Research Enterprise: An Educational Guide, available at: http://www.interacademies.net/File.aspx?id=19789). An international version of the Academies’ On Being a Scientist released in 2016 by the InterAcademy Partnership called Doing Global Science includes a discussion of dual use and biosecurity issues (see http://www.interacademycouncil.net/24026/29429.aspx). From 2013 to 2015, the European Union Chemical Biological Radiological and Nuclear Risk Mitigation Centres program ran a project to create networks of universities to raise awareness on dual use concerns (see http://www.cbrn-coe.eu/Projects/TabId/130/ArtMID/543/ArticleID/46/Project-18-International-Network-of-universities-and-institutes-for-raising-awareness-on-dual-use-concerns-in-bio-technology.aspx). For examples of activities in the United States, see a series of collaborative activities among the American Association for the Advancement of Science (AAAS) and the Federal Bureau of Investigation that can be found on the AAAS website at https://www.aaas.org/oisa/aaas-fbi.
52 See, e.g., the findings and recommendations in several reports from the National Academies, including National Research Council, Science and Security in a Post 9/11 World: A Report Based on Regional Discussions Between the Science and Security Communities (Washington, DC: The National Academies Press, 2007), doi:https://doi.org/10.17226/12013; National Research Council, Responsible Research with Biological Select Agents and Toxins (Washington, DC: The National Academies Press, 2009), doi:https://doi.org/10.17226/12774; and National Research Council, Challenges and Opportunities for Education about Dual Use Issues in the Life Sciences (Washington, DC: The National Academies Press, 2011), doi:https://doi.org/10.17226/12958.
53 Stearns, p. 10.
INTERNATIONAL OVERSIGHT OF DURC
In his paper commissioned for the committee, Piers D. Millett (Biosecure Ltd.) discussed international perspectives on DURC and potential channels for expanding discussions around DURC management. His overall view was that no international consensus exists on the need to address DURC. Indeed, he said, the subject has been largely ignored at the international level in recent years; expanding the discussion will require a concentrated effort. Millett’s analysis was based on a review of work done as part of the Biological Weapons Convention (BWC) and by the World Health Organization (WHO), as well as a small, informal survey of national experts who have been involved in DURC discussions at an international level.54
Millett attributed low levels of engagement with DURC to: (1) limited awareness of the issue, (2) competing demands on countries’ limited resources, (3) a sense that the issue is not relevant to most countries, and (4) suspicions in many developing countries that U.S. motivations for raising the issue of DURC are to protect its technological lead in the life sciences and deny access to a broad category of technologies and knowledge to other nations.
In light of this context, Millett described several starting points for discussion that he considered most productive for broadening international engagement with DURC. First, he urged that the issue not be portrayed as a zero-sum game in which every security benefit comes with a development cost; rather, the relationship between development and security should be highlighted. Second, the role of biosecurity in safeguarding the bioeconomy, an area of increasing interest and investment globally, should be emphasized. Third, he encouraged expanding the discussion of DURC so that it becomes part of the entire spectrum of measures used to address biosecurity risks.
Speaking at the January 2017 workshop, Millett argued that, independent of policy decisions, technical discussions also need to take place. He specified that these need to be in good faith: “one thing that the United States could do to show leadership would be to express very early on a willingness to listen to the output of that discussion.” He stated that the United States must recognize that its own policies may need to be revised as a result of further international discussions, “and saying very clearly at the start of that process a willingness to do that would . . . help to engender a real sense of buy-in and to demystify some of what the U.S. motivations might be.”55
Millett described the past and present roles of the BWC and the WHO and considered their and other organizations’ possible roles as future homes for discussions of DURC. In the early 2000s, the BWC began to address research, in addition to its long-time focus on the development and acquisition of biological weapons. At the 2008 Meeting of States Parties to the BWC, parties were encouraged to “be alert to potential misuse of research, and assess their own research for dual-use potential,” “seek to stay informed of literature, guidance, and requirements related to dual-use research,” and “provide concise, practical guidelines, including criteria to define sensitive research and identify areas of greatest risk.”56
54 Piers D. Millett, Biosecure Ltd., Gaps in the International Governance of Dual-Use Research of Concern (commissioned paper available at https://www.nap.edu/catalog/24761 under the Resources tab). The survey was circulated to experts from BWC delegations in 28 countries (Australia, Belgium, Canada, Finland, France, Georgia, Germany, Hungary, India, Indonesia, Ireland, Italy, Japan, Kenya, Liberia, Malaysia, Mali, Mexico, the Netherlands, Norway, Pakistan, Portugal, Russia, Sierra Leone, Spain, Switzerland, Ukraine, and the United Kingdom). In total, eight responses were received, from Canada, Germany, the Netherlands, Pakistan, Portugal, Spain, Switzerland, and the United Kingdom.
55 Piers D. Millett, Biosecure Ltd., Presentation to the committee, January 4, 2017, Washington, DC.
The BWC engaged more directly with DURC from 2012 to 2014, following the controversy about the two papers describing GOF mutations of H5N1 avian influenza viruses. The report from the 2012 meeting “express[ed] support for ‘enhanced national oversight of dual use research of concern without hampering the fullest possible exchange of knowledge and technology for peaceful purposes’,” and the 2013 report again articulated “the value of increased national oversight of DURC and highlighted the possibility of developing international approaches” to DURC management. This report also outlined a possible role for the BWC in facilitating the exchange of national experiences as a foundation for expanded international harmonization. The 2014 report summarized nations’ common understandings of DURC and described key areas for future work; an appendix also included proposals for national measures for dealing with DURC, which had not found consensus.57 Since that time, less attention has been paid to these issues.
Millett discussed the WHO’s key role in the 2012 GOF controversy and the consultation on DURC that it hosted in 2013. This consultation identified key concerns surrounding DURC, gaps in existing management systems, and potential ways these gaps might be addressed. It concluded that DURC is an issue of relevance to all countries, affirmed the importance of oversight mechanisms, noted that oversight pertains to the entire research cycle, and considered that while the “establishment of a legally binding global agreement or regulation is theoretically possible, such an approach would be expensive, slow, likely impractical and would not necessarily yield the desired benefits,” recommending instead that “guiding principles, toolkits, best practices and other forms of technical assistance would help countries formulate their own policies and procedures for managing DURC.” The WHO highlighted that “communication and continuing dialogue across a broad range of sectors and stakeholders are essential to create a culture of responsibility, cooperation and trust,” including an exploration of different ways of assessing risk. Lastly, the WHO’s consultation found that “awareness-raising, education and training on biosafety, biosecurity and DURC are essential not only for researchers but also for all sectors and stakeholders.”58 Millett noted that the findings of the 2013 consultation have not been followed up, although the WHO has been active in discussions of DURC at BWC reviews.
56 Quoted in Millett, p. 2.
57 Millett, pp. 3-4.
58 Quoted in Millett, p. 5.
A third organization mentioned by Millett as a possible home for international discussions around DURC was the United Nations Educational, Scientific and Cultural Organization (UNESCO), which, he noted, seems aware of dual use issues in the life sciences and carries out work on responsible research and innovation. Nonetheless, UNESCO has not assumed a leadership role in this area, and its interest in these issues has been inconsistent.
THE NATIONAL SCIENCE ADVISORY BOARD FOR BIOSECURITY
While the numerous reports of the NSABB offer recommendations to the broader community for the oversight of DURC, the board’s role is to advise the government.59
Wolinetz remarked that, if an “investigator had no awareness of DURC policies . . . googled DURC” and emailed the National Institutes of Health Office of Science Policy to say, I have this manuscript “and am worried that it has DURC concerns,” the manuscript could end up in “internal biosecurity committees and discussion groups and interagency groups that have dealt with DURC policies” and “potentially trigger an NSABB review.” But she said this would “very much depend on the manuscript.” If additional expertise were needed from other parts of the government, she said, it could be brought in, but the process is not transparent. Wolinetz said that this would “be an extremely rare situation” and that she does not “know that it makes sense to create a bureaucratic process for that situation.”60
As noted in Chapter 1, since 2004, the NSABB has reviewed six manuscripts of dual use concern. The board’s review of the two controversial H5N1 GOF manuscripts in 2011 led to the development of a framework for reviewing DURC that is based on risk-benefit analysis.61 According to Wolinetz, the framework seeks to address several questions:
- Are there reasonably anticipated risks to public health and safety from direct misapplication of this information, i.e., is novel scientific information provided that could be intentionally misused to threaten public health or safety?
- Are there reasonably anticipated risks to public health and safety from direct misapplication of this information, i.e., does the information point out a vulnerability in public health and/or safety preparedness?
59 At the time of the renewal of its charter following the 2012 GOF controversy, the NSABB was given a reduced advisory role.
60 C. Wolinetz, Presentation to the committee.
61 This is encapsulated in the United States Government Policy for Oversight of Life Sciences Dual Use Research of Concern.
- Is it reasonably anticipated that this information could be directly misused to pose a threat to agriculture, plants, animals, the environment, or materiel?
- If a risk has been identified, in what timeframe (e.g., immediate, near future, years from now) might this information be used to pose a threat to public health and/or safety, agriculture, plants, animals, the environment, or materiel?
- If the information were to be broadly communicated “as is,” what is the potential for public misunderstanding, that is, what might be the implications of such misunderstandings (e.g., psychological, social, health/dietary decisions, economic, commercial, etc.)? For sensationalism?62
Imperiale and Relman argued that the criteria for triggering special consideration of research results need to be broader than those currently articulated by the NSABB in its May 2016 guidelines for GOF research, which focus on pathogenic infectious agents, and should encompass as-yet-unknown future situations in other research areas, for example, synthetic biology and systems biology. They observed that the 2005 paper that modeled an introduction of botulinum toxin into the milk supply63 provided a particularly “important case study because it did not involve wet lab research, but rather was a theoretical modeling study, and can be viewed as representative of an increasingly common type of research involving ‘big data’ and data mining tools.” “Work of this type,” they said, “typically arises outside of science research settings routinely subjected to biosafety and biosecurity oversight, and is typically undertaken by individuals unfamiliar with the history of biosafety guidelines.”64
At the January 2017 workshop, Stearns agreed that the NSABB’s definition of DURC fails to embrace all research that could potentially be of concern, including unpredictable developments in the life sciences, such as the genome editing tool CRISPR/Cas9 or research in gene therapy.
OPTIONS FOR THE FUTURE MANAGEMENT OF THE DISSEMINATION OF DURC
As mentioned in Chapter 1, the discussion of specific options for managing the dissemination of DURC takes place within the larger context of changing perceptions in the international scientific community about the appropriate balance between scientific freedom and the broader social responsibilities of science. Historically, freedom of inquiry has been an absolute value. It remains so for an important part of the scientific community.65 However, given the complex ethical, legal, social—and security—issues posed by continuing scientific advances, there is increasing support for the view that scientific research must operate within a broader social context and that scientific freedom comes with important responsibilities. The struggle to develop effective policies for GOF research illustrates how these issues are playing out in the life sciences.66
62 C. Wolinetz, Presentation to the committee.
63 L. M. Wein and Y. Liu, “Analyzing a Bioterror Attack on the Food Supply: The Case of Botulinum Toxin in Milk,” Proceedings of the National Academy of Sciences of the United States of America, July 12, 2005, Vol. 102, No. 28, pp. 9984-9989.
64 Imperiale and Relman, p. 5.
Several presenters acknowledged the difficulty of arriving at clear criteria for what constitutes DURC, but identified elements that they believe are important. Imperiale and Relman, for example, suggested that criteria should be able to encompass all areas of the life sciences and that the “line will undoubtedly be context-dependent in many dimensions, including the area of the work, the availability of countermeasures against any potential dangers and the means to use them, and even the socio-political environment of the world at the time the work is performed.” They acknowledged that “it is difficult to develop clear criteria that broadly define a line that ought not be crossed” but identified one experiment that should not be performed: a deliberate attempt to isolate a mutant form of human immunodeficiency virus (HIV) that can be transmitted by the aerosol route. They stated that, as in their hypothetical example, when risks are potentially high and the benefits nonexistent, “an experiment should not be performed solely because someone finds it intellectually interesting.”67
65 A striking example comes from the International Council for Science (ICSU), for decades one of the staunchest advocates for the primacy of scientific freedom:
“To address and promote both aspects [freedom and responsibility], ICSU established the Committee on Freedom and Responsibility in the conduct of Science (CFRS) in 2006. This Committee differs significantly from its predecessors that, since 1963, had focused on scientific freedom, in that it is explicitly charged with also emphasizing scientific responsibilities.” [ICSU, Freedom, Responsibility, and Universality of Science (Paris: International Council for Science, 2014), p. 3. Available at http://www.icsu.org/publications/cfrs/freedom-responsibility-and-universality-of-science-booklet-2014/CFRS-brochure-2014.pdf.]
66 Finding 5 of the NSABB’s Recommendations for the Evaluation and Oversight of Proposed Gain-of-Function Research states:
Finding 5. There are life sciences research studies, including possibly some GOF research of concern [GOFROC], that should not be conducted because the potential risks associated with the study are not justified by the potential benefits. Decisions about whether specific GOFROC should be permitted will entail an assessment of the potential risks and anticipated benefits associated with the individual experiment in question. The scientific merit of a study is a central consideration during the review of proposed studies but other considerations, including legal, ethical, public health, and societal values are also important and need to be taken into account.
European Academies Scientific Advisory Council (EASAC), Gain-of-Function: Experimental Applications Relating to Potentially Pandemic Pathogens (EASAC Policy Report 27) (Halle: EASAC, 2015) offers the following findings:
3.2 Self-regulation and harmonisation
Self-regulation means that there are checks and balances on research agreed within the scientific community and does not mean that each researcher is free to decide for themselves what procedures to follow. (p. 17)
3.7 Publication of sensitive information
Scientific freedom is not absolute and the scientific community recognises that some information is sensitive. (p. 19)
Evans outlined two paradigms for understanding the relationship between the scientific enterprise and society. One holds that they are separate, where science produces objective knowledge that will lead to societal benefit and society intervenes only occasionally in order to regulate where there is a clear likelihood of research having harmful impacts. The other holds that science and society are mutually constitutive; science is not separate from society, and decisions about whether science benefits or harms society are often contested and irresolvable. He likened the two perspectives to a “Newtonian versus a Quantum view of biosecurity.” In a Newtonian perspective, he said, “you have discrete, fully characterized entities that you can control in their movements based on a set of simple laws.” With the Quantum perspective, “you have an entangled set of systems where measurement of the system changes the system itself, and therefore control is very often an indirect process.” In a “Quantum” biosecurity environment, in other words, the processes by which we determine whether knowledge is a security concern heavily structure which concerns we are able to see, and we can never fully know whether, at a particular point in time, a particular piece of knowledge is a concern or not. Such determinations, Evans said, depend on the context, on who is using the knowledge, and how the knowledge interacts with other pieces of knowledge, resources, and intentions. Further, he said, indirect governance of a system like this means giving those who construct and use dual use knowledge (not just DURC) the tools to make their own determinations of how concerned we should be about potential security issues.68
Evans supported the creation of “networks for constructing security concerns” that would provide flexible governance for emerging DURC concerns and include the scientific community, government, nongovernmental organizations, and industry. He also noted that, among the recommendations issued in past reports from the National Academies of Sciences, Engineering, and Medicine on DURC, such as the Corson Report and the Fink Report, those that often went unimplemented were the ones that treat science and society as mutually constitutive: “our institutional structures don’t have the capacity to see the world in this way.”69
67 Imperiale and Relman, pp. 5-6.
68 Sam Weiss Evans, Harvard Kennedy School of Government, Presentation to the committee, January 4, 2017, Washington, DC.
In his commissioned paper, Evans described the eight communication principles put forth in the NSABB’s Responsible Communication of Life Sciences Research with Dual Use Potential (see Table 3-1) as “emblematic of a way of constructing threats in biology that works well for DURC, but provide little help when considering dual use issues in the rest of the life sciences.” He noted how principles one through three assume a “linear model of innovation” in which societal concerns are raised only at specific points in the process, if at all. He criticized the “combination of a linear model of innovation and a hard line between academic freedom and national security” because it leads to the viewpoint that the security concerns of life sciences research are “a zero-sum game between freedom and security” and can “be resolved by drawing a line in the innovation process where societal concerns like security can come in.” He also critiqued principle four’s focus on the technical elements of risk assessment and its exclusion of political and broader public concerns.70
In place of the NSABB’s eight principles, Evans suggested seven new ones—“principles for crafting new objects of security concern within the life sciences” (see Table 3-1)—noting that they share many elements with the 2006 NRC report Globalization, Biosecurity, and the Future of the Life Sciences.71 These seven principles assert that “decisions about which [research] lines to pursue, as well as the actual conduct of research, are inextricably embedded in cultural, economic, political, and technical systems” and that “communities, not individuals, are best placed to determine the level of security concern around an area of research.”
Imperiale and Relman observed that “there are events that may occur in the near or long term that could force a reactive response and a scheme for managing information that may not be productive.”72 They called for a “thoughtful, deliberate plan for managing information that will inevitably arise and pose major risks to humans, other animals, plants and their supporting ecosystems,”73 enumerated several common arguments against proactive action, and identified flaws in each. Some have argued that, because no act of terrorism has, to date, used biological materials, malicious actors lack interest in biological attacks. This, Imperiale and Relman argue, is incorrect: biological materials have indeed been used for nefarious purposes, for example in the Rajneeshees’ 1984 attempt to alter the outcome of an Oregon election by contaminating salad bars with Salmonella bacteria, and in the mailing of live anthrax spores in the aftermath of the September 11 attacks. They also addressed the argument that, since only a few experiments will generate dangerous information, those experiments can be dealt with individually as they arise, observing that “the consequences of just one episode of deliberate misuse of information could be enormous.”74 Imperiale and Relman likewise countered the argument that full control of information is currently impossible: while this may be true, they said, it is possible to create policies that discourage people with malicious intent or slow their progress. Finally, they responded to the argument that sensitive information already in the published literature has not been misused: this, they suggested, “is akin to someone in 2000 stating that since no one has ever deliberately flown a commercial jet into a large building, we don’t have to worry about it.” In light of this, they supported “a more proactive strategy for addressing what is already a clear and pressing set of challenges.”75
69 S. W. Evans, Presentation to the committee.
70 S. W. Evans, The Construction of New Security Concerns in the Life Sciences, p. 2.
71 Institute of Medicine and National Research Council, Globalization, Biosecurity, and the Future of the Life Sciences (Washington, DC: The National Academies Press, 2006), doi: https://doi.org/10.17226/11567.
72 These include accidental or deliberate release of an agent from a laboratory; a bioterrorist attack; an unexpected zoonosis by a highly virulent pathogen; or development of an additional transformative bioengineering technology. See Imperiale and Relman, p. 7.
73 Imperiale and Relman, p. 7.
74 Imperiale and Relman, p. 11.
TABLE 3-1 NSABB Principles for the Responsible Communication of Research with Dual Use Potential | Evans’ Principles for Crafting New Objects of Security Concern within the Life Sciences
a S. Rayner and R. Cantor, “How Fair Is Safe Enough? The Cultural Approach to Societal Technology Choice,” Risk Analysis, 1987, Vol. 7, No. 1, pp. 3–9.
b M. J. Palmer, F. Fukuyama, and D. A. Relman, “A More Systematic Approach to Biological Risk,” Science, December 18, 2015, Vol. 350, Issue 6267, pp. 1471–1473.
Several presenters favored a broad approach in two respects: the process of establishing guidelines for DURC management should be inclusive, and whatever process is ultimately chosen should draw on broad, diverse input in determining what information is sensitive and in defining the level of risk.
Evans recommended that a “relationship of mutual trust and shared expertise should be fostered in particular between the life science and intelligence communities.” He recommended that the NSABB resume its efforts to build a network between the intelligence community and journal editors. He noted that the relationship between law enforcement and the scientific community has not always been optimal but highlighted the Federal Bureau of Investigation’s Weapons of Mass Destruction Directorate, which has become a resource for the scientific community’s security concerns; he urged that it be strengthened, institutionalized, and studied for how its model might be shared more broadly. He also urged that the Department of Commerce Bureau of Industry and Security’s Emerging Technology Research Advisory Committee strengthen its focus on the life sciences, noting its current lack of life sciences expertise.76
Both Stearns and Relman noted that, in the past 20 years, efforts have been made to stimulate discussion between scientists and national security experts.77 Relman noted, however, that the nation has not undertaken bridge-building between the two communities in a strategic, coordinated, or thoughtful way.78
Imperiale and Relman examined several possible mechanisms for controlling the dissemination of research results. One was controlled unclassified information (CUI), a category established by President George W. Bush in 2008 in response to a proliferation of designations for sensitive but unclassified information. They found the mechanism inadequate, citing a lack of clarity about who would authorize and then manage CUI, but believed that some features of the concept could be useful.
Lindner and Carter described their procedures for protecting sensitive information at a research enterprise in which the generation of sensitive research results is commonplace. Their point of departure was the observation that all people, in their daily lives, have access to sensitive information of various sorts and routinely carry out risk/benefit analyses to decide when and where to make it public. In professional settings, common sense is often supplemented by policies and training: “laws, policies, and procedures create a framework for management of sensitive information. Training and situational awareness—especially awareness of risk—help create an environment that establishes norms and practices for assessing sensitivity of specific information and for managing it.”79 They discussed the laws and policies that guide their actions concerning classified and controlled unclassified information, situations without strong parallels in the broader scientific community, and described their “explicit attempt to create and re-enforce a culture in which our staff are equipped to make appropriate decisions as they handle and manage sensitive information,” which does have parallels.80
75 Imperiale and Relman, p. 11.
76 S. W. Evans, Presentation to the committee.
77 D. A. Relman, Stanford University School of Medicine and VA Palo Alto Health Care System, Presentation to the committee, January 4, 2017, Washington, DC, and T. Stearns, Stanford University, Presentation to the committee, January 4, 2017, Washington, DC.
78 D. A. Relman, Presentation to the committee.
Self-regulation is valued by the scientific community. In self-regulation, Imperiale and Relman suggested, each researcher would recognize what information is “of unusual risk” and would “somehow either decide not to disseminate or self-censor in some way.” “Scientists,” they said, “always do this. We deliberately choose . . . we put certain things in papers and put things on blogs or not.” However, structures to support self-regulation in the realm of DURC are lacking, meaning that “a well-meaning person runs into a lot of problems when the system is not set up to deal with this sort of circumstance.”81
Relman described the properties of a new system, whose “purpose is to simply guide research towards mitigating the risks that have now been revealed by this information so that it no longer is so risky, so that this inevitably failing effort to fully prevent dissemination can then be released and information made available to everyone deliberately.” A new system “would enhance the dissemination of the information to those that were so designated as in need of access, could make good use of access, and . . . slow access to everyone else.” It would apply to publicly and privately funded research and would be transparent, deliberative, standardized, international, and adaptive. It would rely on the expertise of people in the sciences, public health, security, policy, and ethics.82
Imperiale and Relman suggested the formation of a diverse group of people to manage DURC when research generating potentially sensitive results emerges, asking, “if information needs to be controlled, who controls it?” as risk mitigation measures are created and deployed. They highlighted the importance of participation and buy-in from critical constituencies, including “respected members of the scientific, policy, and security communities, as well as other representatives of the general public,” who would solicit the input of scientific experts as needed. The group would be agile and responsive yet forward-looking: “Ideally, this group would appreciate the need in some cases for taking action far in advance of the generation of the information.” The authors recommended that one or more entities take responsibility for controlling sensitive information, suggesting that scientific societies (and the InterAcademy Partnership in particular) could fulfill this role.83
79 Lindner and Carter, p. 1.
80 Lindner and Carter, p. 2.
81 D. A. Relman, Presentation to the committee.
Several options for managing the dissemination of DURC emerged in the papers commissioned by the committee and through the course of its discussions and consideration of relevant external materials. Of all the issues related to the oversight of DURC, questions about limiting the dissemination of research are the most controversial, as many in the scientific and policy communities believe that any restrictions placed on scientific pursuits could harm the research enterprise by limiting knowledge that might have value in efforts to respond to significant public health crises. The committee recognized that the presented options would be contentious.
During the committee’s discussions and review of materials, the following elements were raised as important in the effective management of DURC:
- Ongoing, interactive education and training of individuals in the broader life sciences community;
- Engagement with advisory bodies with monitoring and/or enforcement capabilities;
- International harmonization of policies and approaches;
- Engagement with extant or newly convened international entities;
- Uniform roles and responsibilities for publishers;
- Legislative, regulatory, or policy mechanisms positioned at critical stages of the dissemination process; and
- Increased engagement with the public.
Implementation may necessitate additional resources, the establishment of best practices, refinement of policies and guidance, adoption of new laws, broader stakeholder engagement, and appropriately positioned and empowered advisory bodies. A clearer understanding of risk, benefit, and the tradeoffs associated with these options is necessary before policy can be successfully implemented. To aid this process, in the following chapter the committee provides specific findings that may serve to advance discussions on approaches for the future management of DURC.
83 Imperiale and Relman, pp. 9-10.
Founded in 1993 and expanded and re-launched in 2016, the InterAcademy Partnership (IAP) is a global partnership of more than 130 merit-based national and regional academies of science, engineering, and health, which aims to maximize the contributions of science toward understanding and solving the world’s most challenging problems. Through this structure, IAP and its members are active in countries that constitute 95 percent of the world’s population. For more information, see http://www.interacademies.org/.