In the workshop’s first session, Roger Brent, Member of the Division of Basic Sciences and Adjunct Member of the Division of Public Health Sciences, The Fred Hutchinson Cancer Research Center and Affiliate Professor, Department of Genome Sciences, University of Washington, provided an overview of the major developments in molecular biology over the past 40 years. Brent highlighted key developments in scientists’ ability to deconstruct and recombine DNA and RNA, beginning with their capacity to remove and make copies of bacterial DNA and reinsert it into organisms of a different species, and extending to scientists’ current capacity to engineer viral genomes. (For a timeline of events and related publications, see Appendix A.)
The development of recombinant DNA technology in the early 1970s marked the beginning of technical capabilities that would, within the next three decades, enable the scientific community to move genetic material between species, induce bacteria to synthesize new proteins using foreign genes inserted for that purpose, and build new genomes, including those of pathogens of great concern to those charged with protecting the public health. Between 1973 and 1978, scientific advances led to the ability to compel E. coli bacteria to produce complex recombinant proteins. In 1978, researchers engineered a bacterial species to produce human insulin, and in 1982, researchers successfully transferred bacterial genes into plants, thereby conferring new traits to agriculturally important species.
The 1980s brought further developments in genetic manipulation and gene replacement, most importantly the ability to reverse engineer a virus. The first complete clone of an animal virus genome, that of a plus-strand virus, was synthesized and expressed in 1981 (bacterial viruses had been genetically manipulated previously). In 1993, scientists synthesized and expressed a negative-strand virus (influenza is also a negative-strand virus), and by 2007 they had synthesized and expressed a double-stranded virus.
By the 2000s, it was possible to add or remove biological functions genetically to examine the effect on a pathogen’s virulence or transmissibility. This capability allowed laboratory scientists to investigate evolutionary questions in a manner that had never before been possible. A common experimental design involved creating an environment hospitable only to organisms possessing a specific trait—for example, virulence or transmissibility. Genetic material from surviving organisms would then be sequenced to identify the mutation(s) responsible for the selected trait. The genetic material associated with those mutations would be extracted and inserted into new viruses to determine whether it conferred the trait.
In parallel with these developments, ever larger numbers of people gained access to the tools and information needed to manipulate potentially lethal viruses. The equipment necessary to rapidly sequence and reconstruct genomes, for instance, has become affordable, and knowledge of genomics and of how to use the relevant equipment has become readily available. As a result, access to potentially dangerous information has expanded well beyond the boundaries of what has traditionally been considered the scientific community, both in the United States and internationally.
With particular regard to influenza viruses, Brent noted that researchers have publicly stated, since at least 2004, the goal of constructing human-transmissible H5N1. The rationale behind the goal is that the relative ease or difficulty of the task will provide an indicator of the relative risk posed by the H5N1 virus to public health.
Regulatory Developments Prompted by These Advances
In 1974, in light of the development of recombinant DNA technology and the uncertainties surrounding its safety, the scientific community imposed a moratorium on further research, and in 1975 convened a conference at the Asilomar Conference Center in California for the purpose of defining a framework for governing recombinant DNA technology and its products. The Asilomar Conference (see Box 2-1) was followed in 1976 by NIH’s regulatory framework for recombinant DNA, which included local control (in the form of Institutional Biohazard Committees) and national oversight (in the form of the Recombinant DNA Advisory Committee1). In 1977, public representatives were given seats on the Recombinant DNA Advisory Committee. According to Brent, between 1977 and 1982 the combination of directed experimentation, a lack of evidence of harm, and a deeper understanding of the pertinent scientific questions to ask or address resulted in more flexible oversight and an exemption of most experiments from NIH guidelines.

BOX 2-1 The Asilomar Conference on Recombinant DNA

As recombinant DNA techniques became more widespread in the 1970s, concern grew in the scientific community and among the public that microbes manipulated through recombinant DNA techniques could endanger the health of humans and the environment.

In 1971, researchers inserted the genome of a tumor-causing virus, SV40, into a bacterial plasmid that could reproduce in E. coli. As the research proceeded, concerns arose that if this engineered strain of E. coli were accidentally released into the human population, it could cause a cancer epidemic. Scientists voluntarily halted the experiments until a determination could be made regarding the risk of the experimental plasmid spreading to strains of E. coli that exist naturally in the human body.

A group of leading scientists asked the National Academy of Sciences (NAS) to assess the concerns and provide recommendations on how to proceed. The resulting NAS Committee on Recombinant DNA Molecules issued a letter endorsing a voluntary moratorium on specific types of recombinant DNA research “until the potential hazards of such recombinant DNA molecules have been better evaluated or until adequate methods are developed for preventing their spread.” In the letter, the committee acknowledged that it was difficult to estimate risk and recommended that an international conference of involved scientists be held to examine the question more closely.a

The resulting Asilomar Conference on Recombinant DNA took place in February 1975 in Pacific Grove, California. Its purpose was to make recommendations on whether to end the moratorium, and if so, under what circumstances. One hundred fifty participants gathered, including biologists, lawyers, physicians, and journalists. The discussions were vigorous and contentious, and encompassed views ranging from the insistence that no limits be placed on scientists’ freedom to the view that limits should be entertained. Participants from the scientific community felt strongly that if they did not arrive at a path forward, that path would likely be determined by others.

The outcome of the conference was a nearly unanimous agreement to lift the moratorium and to require that recombinant DNA research be carried out according to yet-to-be-determined guidelines that would define levels of physical and biological containment based upon the potential risk posed by a given research project.b

a Committee on Recombinant DNA Molecules, “Potential Biohazards of Recombinant DNA Molecules,” Proceedings of the National Academy of Sciences 71, no. 7 (July 1974): 2593-2594.

b Paul Berg, et al., “Summary Statement of the Asilomar Conference on Recombinant DNA Molecules,” Proceedings of the National Academy of Sciences 72, no. 6 (June 1975): 1981-1984; Susan Wright, Molecular Politics: Developing American and British Regulatory Policy for Genetic Engineering, 1972-1982 (Chicago: University of Chicago Press, 1994).
Dr. Brent concluded his remarks by offering observations about the wisdom of attempting to regulate research. He stated that any regulation places burdens on researchers; great care, he argued, should therefore be taken before attempting to hinder the unfettered pursuit of research. Brent then observed that, if research with potential benefits as well as dangers to public health is performed, there is no system in place for releasing the experimental details selectively, and he believes it would be very difficult to devise such a system. Brent further remarked upon the difficulty of formulating workable policies regarding research funding or the publication of research when no scientific consensus exists about what right behavior is. He observed that weighing foreseeable benefits against risks to public health requires an omniscience that humans do not possess. Brent also observed that knowledge, once obtained, cannot be undone. For Brent, the fact that knowledge about the virulence and transmissibility of, for example, lethal human-transmissible influenza strains would be available from the point of production onward is an important consideration. Individual experts, he remarked, are less likely to predict a distinct benefit or risk from their own research. With regard to the H5N1 controversy itself, Brent viewed the incident as an ethical failure and as indicative of a far too fragmented scientific community.
1 The Recombinant DNA Advisory Committee (RAC) is a federal advisory committee that “issues recommendations to the NIH Director that are conveyed through the NIH Office of Biotechnology Activities (OBA), which is responsible for the NIH system of oversight of recombinant DNA research.” The RAC developed and suggests changes to a set of NIH guidelines (now known as the NIH Guidelines for Research Involving Recombinant DNA Molecules) to “govern the safe conduct of recombinant DNA research by outlining appropriate biosafety practices and containment measures.” It is important to note that compliance with these guidelines “is mandatory for investigators at institutions receiving NIH funds for research involving recombinant DNA” [emphasis added], but voluntary for institutions, companies, or individuals not subject to NIH requirements. See the website of the Office of Biotechnology Activities, “About Recombinant DNA Advisory Committee (RAC),” http://oba.od.nih.gov/rdna_rac/rac_about.html.
The moderator of the workshop’s opening session, David Baltimore, Robert Andrews Millikan Professor of Biology and President Emeritus, California Institute of Technology, stated that his view about the Asilomar Conference diverged in one respect from Brent’s, namely, that, while Brent considered the conference and period immediately afterwards to be a lamentably short period of self-governance, Baltimore considered the period to be the first step in a process of continued self-governance. He also expressed the opinion that the Asilomar model is an appropriate one for the situation at hand.
Brent clarified his larger point, stating that he sees the Asilomar/governance process as a manifestation of what works well in our national culture, which is that an external entity is given regulatory power and the people whose activities it regulates lobby that entity; in other words, “a representative democracy is easier on us all.”
David Relman, Thomas C. and Joan M. Merigan Professor, Departments of Medicine, and of Microbiology and Immunology, Stanford University and Chief, Infectious Disease Section, VA Palo Alto Health Care System, discussed developments in national security and responses to dual-use research, highlighting the tension between the long-established value of openness in science (particularly strong in the life sciences) and the ever-changing needs of national security. Relman reiterated that concerns about dual-use research rose to the fore with weapons research in the Cold War era and with questions about what should be done with the information and materials generated.
Cold War Deliberations
In 1982, the National Research Council (NRC) released the report Scientific Communication and National Security (known as the Corson Report after the authoring panel’s chair, Dale Corson of Cornell University). The Corson panel defined three categories of university research. “The first, and by far the largest share,” the panel observed, “are those activities in which the benefits of total openness overshadow their possible near-term military benefits to the Soviet Union. There are also those areas of research for which classification is clearly indicated [and] between the two lies a small ‘gray area’ of research activities for which limited restrictions short of classification are appropriate.”2
Following release of the Corson Report and intergovernmental discussions, in 1985 the Reagan administration issued National Security Decision Directive 189 (NSDD-189), which declared that, “to the maximum extent possible,” it was the policy of the administration that “the communication of the products of [federally-funded] fundamental research [should] remain unrestricted,” but “where the national security requires control, the mechanism for control of information generated … is classification.”3
The Dawn of the 21st Century: New Scientific and Political Developments
During the late 1990s and early 2000s, attention focused on the potential for using life sciences research to deliberately cause large-scale harm. This was due, in part, to advances in life sciences and growing concern over increases in the numbers of people seeking to do harm (as exemplified, for instance, by the anthrax mailings in the fall of 2001). In 2004, the NRC published Biotechnology Research in an Age of Terrorism (known as the Fink Report after the authoring committee’s chair Gerald Fink of the Massachusetts Institute of Technology). This report recognized the potential for misuse of knowledge in the biological sciences, described seven classes of experiments of concern, and recognized the need for oversight throughout the life cycle of research to protect against misuse. The report looked to journal editors rather than the government as gatekeepers for decisions about publication and recommended creating a national science advisory board for biodefense to provide advice, guidance, and leadership for the review and oversight of research of concern. While the report emphasized the importance of self-governance by the scientific community, the report’s authors recognized a need for the development of federal guidelines through a process similar to that adopted with recombinant DNA.
In 2004, in response to the Fink Report, the National Science Advisory Board for Biosecurity (NSABB) was created. The NSABB was to be an advisory board to the U.S. Department of Health and Human Services and all other federal agencies that support life sciences research. Its purpose “is to provide … advice, guidance, and leadership regarding biosecurity oversight of dual use research” and to “provide advice on and recommend specific strategies for the efficient and effective oversight of federally conducted or supported dual use biological research.” The NSABB was tasked to advise on (1) “policies governing publication, public communication, and dissemination of dual use research methodologies and results,” (2) “programs for outreach, education and training in dual use research issues for scientists, laboratory workers, students, and trainees in relevant disciplines,” and (3) “the development, utilization, and promotion of codes of conduct.” It was also to “recommend strategies for fostering international engagement on dual use biological research issues.”4

2 Panel on Scientific Communication and National Security, Scientific Communication and National Security (Washington, DC: National Academy Press, 1982).

3 National Security Decision Directive 189 (NSDD-189): National Policy on the Transfer of Scientific, Technical, and Engineering Information, September 21, 1985.
Following the creation of the NSABB, both the NRC and the NSABB released additional biosecurity reports. The NRC, for instance, published Globalization, Biosecurity, and the Future of the Life Sciences (2006), Science and Security in a Post 9/11 World (2007), and Review of the Scientific Approaches Used During the FBI’s Investigation of the 2001 Anthrax Letters (2011). In 2011, the NSABB released its Recommendations on Communication of Experimental Adaptation of Avian Influenza A/H5N1.
Dual-Use Research in the Life Sciences
Dr. Relman cited the NSABB criterion for identifying dual-use research of concern. That criterion is “research that, based on current understanding, can be reasonably anticipated to provide knowledge, products, or technologies that could be directly misapplied by others to pose a threat to public health and safety, agricultural crops and other plants, animals, the environment or materiel”5 [emphasis added by the speaker]. He observed that two principles—the scope (magnitude) and the immediacy of the impact of the research—were an intrinsic part of this definition and emphasized that the evaluation of dual-use potential is seen to be based upon: (1) a current understanding regarding the implications of the research results, and (2) a reasonable anticipation that research results could be misapplied.
Relman discussed how the concept of dual-use research of concern originally pertained to science and technology that could be applied to both civilian and military purposes (helicopters and satellite technology, for example). He noted that in recent years the concept has broadened and shifted: it now signals research intended for beneficial purposes that also has the potential to be misused for harm. Regarding research on infectious agents specifically, Relman referred to research that the NSABB specifically identified as worthwhile but that may also need special review.
4 U.S. Department of Health and Human Services, “Charter, National Science Advisory Board for Biosecurity,” as renewed April 4, 2012.
5 National Science Advisory Board for Biosecurity, “Proposed Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research,” (June 2007):17.
Of greatest concern are those experiments that have the potential to produce information, products, or technologies that could:
• Enhance the harmful consequences of a biological agent or toxin;
• Disrupt immunity or the effectiveness of an immunization without a clinical and/or agricultural justification;
• Confer to a biological agent or toxin resistance to clinically and/or agriculturally useful prophylactic or therapeutic interventions against that agent or toxin, or facilitate its ability to evade detection methodologies;
• Increase the stability, transmissibility, or the ability to disseminate a biological agent or toxin;
• Alter the host range or tropism of a biological agent or toxin;
• Enhance the susceptibility of a host population; or
• Generate a novel pathogenic agent or toxin, or reconstitute an eradicated or extinct biological agent.6
Continuing the discussion on biosecurity, Lawrence Kerr, Deputy Director for Global Biological Threats, Office of the Director of National Intelligence, spoke about government regulations and recommendations, with a focus on defining threat, risk, vulnerability, and consequence. Kerr began his remarks with a historical overview, citing concern within the U.S. national security community during World War II that the country could lose its military superiority. At that time, Kerr stated, the President’s Scientific Research Board strongly advised that “security regulations … should not attempt to cover basic principles of fundamental knowledge.”7 In 1949, Kerr continued, President Truman told a panel convened by the American Association for the Advancement of Science that “[c]ontinuous research by our best scientists is the key to American leadership and true national security.”
Kerr noted that a later Executive Order stated that “basic science research information not clearly related to the national security may not be classified.” He elaborated on the Corson Report’s argument against “security by secrecy” and observed that, at the time of the report’s 1982 publication, there was no practical way to restrict international scientific exchange without also hindering communication within U.S. borders. Following the Corson Report, Kerr said, those in governmental, nongovernmental, national, and international circles attempted unsuccessfully to find frameworks for handling research in a “gray zone”—research not immediately
6 Ibid., 18-21.
7 The President’s Scientific Research Board was established by President Truman in October 1946.
related to national security (and therefore not classifiable) but with possible national security implications. In 1985, Kerr noted, NSDD-189 declared that “to the maximum extent possible, the [communication of] products of fundamental research [should] remain unrestricted.” Kerr stated that federal agencies were responsible for reviewing research projects at the time of a funding decision and for periodically reviewing research findings. Shortly after the events of September 11, 2001 and the subsequent anthrax mailings, Kerr said, National Security Advisor Condoleezza Rice reaffirmed that NSDD-189 would remain in effect.
Kerr proceeded to discuss “risk” as a function of threat, vulnerability, and consequence. Kerr noted that vulnerability and consequence (or impact) can be discussed and sometimes measured. Regarding infectious diseases, vulnerability can be measured on the basis of a population’s past experience (or lack of experience) with a particular pathogen. Vulnerability, he said, can be mitigated by the public health community’s possession of countermeasures and its ability to deliver them effectively. Consequence, he continued, is the magnitude of the damage caused by a specific type of attack on a specific target at a specific time. Threat, Kerr observed, is more difficult to forecast: it is the probability that a specific target will be attacked in a certain way during a specific time period, and it includes a consideration of both the intent of a person seeking to do harm and that person’s capability to do such harm. Threat, he argued, is therefore extremely difficult to predict because intent is an emotional state, and thus the assessment of “threat” is beyond the purview of most scientists.
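Kerr's framing can be summarized in the notation commonly used in the risk-assessment literature. The multiplicative form below is a frequently used convention in that literature, not a formula Kerr himself stated:

```latex
\mathrm{Risk} = f(\mathrm{Threat},\ \mathrm{Vulnerability},\ \mathrm{Consequence}),
\qquad \text{often approximated as} \qquad
R = T \times V \times C
```

where \(T\) is the probability that a specific target is attacked in a specific way during a specific period, \(V\) is the probability that such an attack succeeds against the target, and \(C\) is the magnitude of the resulting damage. On Kerr's account, \(V\) and \(C\) can sometimes be estimated, while \(T\) resists quantification because it depends on an adversary's intent.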
Kerr discussed levels of risk associated with three situations and suggested how risk might be mitigated in each case: (1) unintentional risk associated with the research and technologies themselves (e.g., accidental exposure, contamination, accidental release) and intentional risk from outside of the laboratory (e.g., people gaining access to the pathogens) and inside the laboratory (e.g., lab workers being bribed); (2) risk associated with the information obtained from research (e.g., ill-willed people using that information for nefarious purposes or loss of public trust in the government and scientific establishment); and (3) risk associated with the withholding of dual-use research and information (that might, for example, hinder planning and implementation of preparedness and response plans, impede surveillance activities and the development of countermeasures, or harm international relationships of the United States).
How will federal agencies and departments acquire the expertise necessary to make the regulatory decisions with which they will be tasked?
Individual panel members pointed to the expertise of individual scientists working outside of government and expressed hope that these scientists will take up the task. They also suggested that effective practices across institutions should be assimilated into a “toolkit” created for the purpose of assisting all institutions with this regulatory task.
Should current policy discussions be organized around the life sciences overall, or are the regulatory questions best applied on a case-by-case basis?

Dr. Baltimore expressed the opinion that in many cases, for example, in the case of H5N1 research, policy decisions must be based on an assessment of the specific hazards associated with a specific pathogen and that it would be difficult to arrive at general guiding principles.
To what degree should those in other disciplines be paying attention to these deliberations? How cross-disciplinary are these issues?

Many panelists suggested that people in numerous related fields pay close attention to the H5N1 controversy and associated discussions, because similar dual-use questions are pertinent to synthetic biology, systems biology, biological engineering, chemistry, physics, and many types of engineering.