KEY CONSIDERATIONS IN RISK ASSESSMENT
In a talk intended to introduce some of the key considerations that might be included in the risk assessment that the National Institutes of Health (NIH) plans to commission, Dr. David Relman identified six such considerations, several of which he covered in his presentation and several of which were addressed in other presentations and discussion. These were as follows:
- The properties of newly created strains, their consequences, and alternative approaches;
- Science and technology (S&T) trends over time;
- The global distribution of risks and benefits, their relative weights, and questions of justice;
- The types of possible misuse, in particular safety and security;
- Moral and ethical responsibilities of scientists and issues of public trust; and
- Risk assessment and mitigation.
But he began by stressing that he believed there was substantial agreement on a number of points, both by many at the symposium and within the larger scientific community.
Relman emphasized that he believed greater clarity, specificity, and precision were needed in identifying what aspects of Gain-of-Function (GoF) research should be the source of the most concern. In his view, it
was the creation of certain combinations of those properties that relate to high degrees of pathogenicity and high degrees of transmissibility, perhaps with or without properties that would allow the infectious agent to become impervious to currently available countermeasures. He stated that he saw a major difference between deliberately creating agents that are not now believed to exist in nature with these combinations of properties and research to understand properties that have arisen naturally in particular viruses. He commented that he took to heart the comments that had been made about the importance of being proactive, that he absolutely subscribed to its importance in doing good science, and that he recognized that GoF can result from knockout mutations. Thus, he thought the discussion was about properties and not experimental approaches per se. Powerful selective conditions and screens are frequently applied, for example, that can be reasonably anticipated to yield agents with the properties that these selective conditions and screens are meant to identify.
In his view, risk assessments should be focused on intentional work that can be reasonably anticipated to produce these problematic properties. Such research is consequential because in some cases there are not adequate countermeasures available to thwart or contain these agents, and the results could be used in further research in less secure settings.
He acknowledged that there are also potential risks in not doing experiments, but argued that the question is how large those risks are and that there are ways to both anticipate and address this concern. Relman also stated that, in almost every case, other experimental approaches will provide the knowledge that scientists agree is critical and many of the benefits that they agree are needed, without necessarily undertaking this very small subset of experiments to create the specific agents with these combined properties.
These alternative approaches include knockout approaches, GoF experiments in altered genetic backgrounds, and much more aggressive surveys of what already exists in nature. Relman accepted that there may be circumstances in which these approaches would not yield what has been learned from the riskier GoF experiments, which Dr. Yoshihiro Kawaoka had clearly described in his talk (see Chapter 3). There are uncertainties in those results just as there are in the results of these alternative approaches. There is always uncertainty about whether or not the results are truly relevant to the circumstance that scientists most wish to understand.
Looking to the future, the technology of reverse genetics developed by Peter Palese in the 1990s (e.g., Palese et al., 1996) is becoming easier to undertake, more efficient, and less expensive. Today, it is not known how many people could download an influenza virus sequence and remake the pathogen in their own laboratories. But one can say with confidence that the number of people with such capabilities will grow as the capacity to do this work continues to expand. This is a good thing, but it has consequences and implications. Not only is the information now in a digital form, but the procedures are also now digitized and rendered into protocols that can be uploaded to robots. Companies are emerging, for example in Silicon Valley, that will carry out, for a fee, research of whatever type and specificity is indicated in the experimental protocol.
For Relman, these trends mean that one cannot simply talk about risk and benefit at the site where the original information is produced or the site at which the original experiment takes place. It is more than a question of biosafety. The work is now a distributed effort across the globe, and many people are interested in it for a wide variety of reasons, some of which the participants in this meeting may not be able to fathom. Those people were not represented in the room, nor have they been included so far in discussions about benefits and risks. He argued that this also means that governance and oversight should be distributed in the same fashion that the scientific capability is becoming distributed.
Relman regards the potential risk of deliberate misuse of GoF research as a plausible scenario in today’s world. In misuse, he also included people willing to commit irresponsible, if not deliberately mischievous, acts, as well as those who show callous disregard or accidental or benign neglect of proper procedures and mindfulness. The discussion of potential risk is thus about both safety and security. He underscored the range of motivations in the life sciences beyond a quest for knowledge or a desire to help people. Some people now undertaking life sciences research are just curious, some are interested in fame, and some are interested in fortune and economic benefit. None of these is bad, but the variety of potential motives and actors has serious implications for how to think and talk about risk.
Relman also noted that part of the diffusion of capacity was the rapid expansion of the capabilities of individuals in life sciences, which are now comparable to those of large organizations 10-20 years ago. In addition, in the current geopolitical climate, individuals and small groups with mal-intent are easier to muster and “radicalize.” When added to other new participants in life science research from alternative backgrounds and with different motivations, this suggests that the risks of misuse are increasing and that these issues must be included in the assessments for GoF research.
Relman began his remarks about moral and ethical principles in science with the statement that scientists have an obligation to think about public beneficence, which means maximizing benefits and minimizing harm.
Everyone has a responsibility to be a good steward of the planet’s ecosphere for the betterment of all, especially those without representation who lack access to the kinds of routine public health measures generally available in developed countries.
Scientists have an important commitment to intellectual freedom and responsibility in science, and the two go hand in hand. This should include support for democratic and deliberative processes of decision-making. Furthermore, modern science, much of which is funded by the public, demands consideration of morality because of the increasingly blurred line between basic and applied science. Scientists have to talk about the common good and, as Joseph Rotblat said, “spend a little bit of time thinking about the consequences of our work” (Mertl, 2000). This is scientists’ obligation to society. It leads to the question of whether there are any experiments that ought not to be undertaken because the risks outweigh the benefits or because the benefits will only be realized in the indefinite future. For Relman, the answer is clearly “yes.” In addition, it would be possible to triangulate on exactly what those experiments might be. They should be defined carefully today, mindful that the definitions might have to be altered very quickly.
In summary, Relman offered several points about how to move forward:
- As others have said, we need narrow and specific definitions of what research we are concerned about.
- We need to come to some agreement, if we can, about whether there are experiments that will not be funded and whether there are, morally, experiments that should not be undertaken for now. He thinks these experiments are very few and far between and should not scare anyone away from addressing this question.
- We should also give guidance about which experiments others might view as very closely related but that we think are acceptable and should be funded, perhaps under certain degrees of oversight. But we need to give positive messages as well as negative messages.
- We need standardized reviews, standardized assessment approaches, and flexible mechanisms.
- We need democratic, deliberative, and iterative processes that have neutral sponsors and hosts.
- Finally, as others had already said, this effort must be international and must be collaborative.
During the discussion of the papers by Kawaoka and Dr. Ronald Fouchier, Dr. Robert Webster from St. Jude Children’s Research Hospital
and a member of the symposium planning committee, asked whether the sequences of pandemic pathogens should continue to be released now that research has moved into the genomic era. Kawaoka answered that until mechanisms to restrict access and keep information only within certain groups are in place, it will be difficult to use that information in a useful way. Relman added that a distinction must be made between information about agents that come directly from nature and information about agents deliberately created to have a combination of properties that renders them unusually dangerous to humans.
Dr. Harvey Fineberg noted that Relman’s talk stressed the danger of creating in the laboratory any agent that is simultaneously transmissible, highly pathogenic, and resistant to countermeasures, and asked whether his conclusion would stop short of considering transmissibility alone. Relman specified that he is not concerned about the deliberate enhancement of only one of the properties cited earlier in an agent and, therefore, he thinks that the broad term GoF does not specify what people are most concerned about.
Webster stated that in the influenza field, the goal is to predict which influenza viruses in birds have the potential to shift their host range to humans. He then asked whether this kind of information can be obtained without the GoF studies. Kawaoka and Relman agreed that researchers need to understand more than is currently known and that they are just beginning to learn what is involved in transmissibility and how to make predictions about it.
Dr. Susan Wolf, University of Minnesota, commented, in reaction to Relman’s statement, that even a careful analysis of the benefits and the risks will not be enough, because the analysis will also need to make value judgments about which risks are related to which benefits. She asked how the National Science Advisory Board for Biosecurity could structure a more capacious and robust risk/benefit analysis that would capture some of these additional value considerations. Relman suggested that particular pieces of useful experience from other circumstances in which scientists had to deal with difficult-to-quantify potential risks, such as the nuclear power industry, the creation of airplanes, or even variability in the infectious disease community, are all worth capturing.
Dr. Gerald Epstein asked Relman whether there are meritorious scientific experiments we should not do. Relman thinks there are experiments that would yield interesting information that may have some value but may fail on the question of appropriateness and prudence because of the magnitude of risk (e.g., creating a strain of Ebola virus capable of respiratory transmission).
Alta Charo, from the University of Wisconsin (UW)-Madison and a member of the symposium planning committee, moderated Session 6 on Biosafety, with four speakers serving on a panel: Dr. Barbara Johnson (Biosafety Biosecurity International), Dr. Rob Weyant (U.S. Centers for Disease Control and Prevention), Rebecca Moritz (UW), and Dr. Marc Lipsitch (Harvard University). Johnson began the session with a presentation that highlighted examples of the kinds of accidents that have occurred recently in the United States as well as in other countries. Some of these accidents involved shipping of incompletely inactivated pathogens that should have been harmless but were not, improper handling of contaminated wastes, and “inventory holdovers,” such as the recent discovery of viable smallpox in an NIH storage area. Such incidents can result in Laboratory Acquired Infections (LAIs) among laboratory workers, and these have occurred all over the world. Johnson presented data on LAIs from Europe, Asia, the Middle East, New Zealand, and Africa that occurred between 2000 and 2011. Causes of these accidents included noncompliance with biosafety procedures, human error, and equipment failures. Fortunately among the recent mishaps in the United States, there have been no LAIs, but this is not the case elsewhere in the world.
The causes of LAIs may be difficult to recognize at the time of the exposure. There may be no definitive moment that indicates an LAI potential, such as a needle stick, animal bite, or dropped pipette. It is estimated that only 20 percent of the causes of LAIs are actually recognized. To add to this problem, many countries under-report accidents or may claim to have never had an LAI. Johnson noted that to say that a lab has never had an LAI may mean that the operators have never been able to account for it, that it has never been investigated, or that it has never been reported. Johnson believes that under-reporting is harmful in that it prevents us from learning from our mistakes—we are losing the opportunity to benefit from lessons learned.
Johnson provided information on the actions that had been taken in the United States in response to the recent spate of incidents. At the CDC, these included suspension of activities, review and remediation of all procedures, verification of adequate inactivation procedures, strengthening of biosafety agency-wide, the formation of an external group of experts to review and advise CDC, improvement of management of internal incidents, investigation of root causes and personnel issues, and a new requirement for single point of contact to be established for biosafety issues. At the NIH, “Operation Clean Sweep” was initiated. This was a “top-to-bottom” inventory of all NIH laboratories. The NIH also declared a “National Biosafety Stewardship Month.” Finally, on August 18, 2014, the White House issued a memorandum titled “Enhancing Biosafety and
Biosecurity in the United States,” urging all federal government agencies that work with pathogens to “take immediate and long-term steps to enhance safety and security in research facilities to minimize the potential for biosafety and biosecurity incidents.” Agencies were urged to institute a “Stand-Down” to include an “immediate sweep of their facilities to identify Biological Select Agents and Toxins (BSAT) and ensure proper registration, safe stewardship, and secure storage or disposal.”
Johnson noted that in the United States there are numerous layers of regulation and oversight for pathogen research. At the institutional level, there are Institutional Biosafety Committees (IBCs) and Institutional Animal Care and Use Committees (IACUCs) that must approve proposed research. There are also Environmental, Health, and Safety units that provide oversight of ongoing research at the local level. At the national level, researchers are expected to follow the requirements in the publication Biosafety in Microbiological and Biomedical Laboratories (BMBL), and there are requirements for the use of appropriate levels of biocontainment for work with pathogens, culminating in the mandatory Select Agent Regulations for the most hazardous infectious organisms. H5N1 avian influenza and SARS-CoV are both classified as Select Agents and are subject to stringent regulation, but MERS-CoV has not yet been brought under this particular regulatory umbrella. Finally there are also the Recombinant DNA Advisory Committee at the NIH, which has oversight for certain kinds of experimental approaches, and the NSABB, which advises the NIH on policy. The NIH has had additional guidelines for certain types of GoF research related to highly pathogenic avian influenza since 2012.
Johnson pointed out, however, that the extensive regulatory framework in the United States is not replicated in many places elsewhere in the world. The World Health Organization’s (WHO’s) Western Pacific regional office investigated a number of biosafety incidents that occurred in countries under their jurisdiction. They found problems with laboratory management and lack of biosafety policies, procedures, training, Personal Protective Equipment (PPE), and supervision of less experienced lab personnel. Many laboratories do not have occupational health and safety organizations or programs, and there is also a need for greater quality control and assurance. Many nations do not have codified standards for laboratory work on pathogens. The WHO recommended the development of legislation for national biosafety standards, procedures for timely reporting and follow-up of accidents, worker health monitoring and countermeasures, accreditation or certification of Biosafety Level (BSL)-3 labs, and inventories of infectious agents. Johnson has seen many improvements made in the past 5 years following the issuance of the WHO recommendations, mostly in the more advanced of the developing nations, although many developing nations still have work to do.
In global terms, the proliferation of BSL-3 labs has been widespread. The good news, said Johnson, is that the number of biosafety associations globally has recently increased to about 30, quadrupling since 2005. These nascent oversight organizations are, however, struggling to find expertise and volunteers. Most focus only on genetically modified materials, not on the pathogens of interest in this symposium. Even nations that lack codified standards tend to rely on the U.S. BMBL, the WHO standards, or the United Kingdom’s Health and Safety Executive guidelines, but the reality is that many labs in developing countries barely meet BSL-2 standards in terms of infrastructure, training, PPE, and supplies. A safety culture is incremental, hard to develop, and takes time. Many senior scientists do not embrace change quickly. Johnson’s last presentation slide contained the following quote:
[M]any accidents are caused not by a lack of physical barriers or regulations, but by the absence of a strong biosafety culture in labs and their oversight bodies.
Nature editorial, July 29, 2014
But, as noted by a questioner from the audience later, although many guidance documents suggest that the development of a culture of biosafety and biosecurity is important, there is little practical advice available to implement this suggestion.
Weyant, the next speaker in Session 6, described the Federal Select Agent Program, which is jointly administered by CDC’s Division of Select Agents and Toxins and the Agricultural Select Agent Program of the Animal and Plant Health Inspection Service (APHIS). The Program oversees the possession, use, and transfer of biological select agents and toxins, which have the potential to pose a severe threat to public, animal, or plant health or to animal or plant products.1
Weyant explained that the Select Agents and Toxins regulations require that any theft, loss, release causing an occupational exposure, or release outside of primary biocontainment barriers must immediately be reported to CDC or APHIS as well as to the appropriate federal, state, or local law enforcement agencies. This reporting requirement provides the CDC with a database on releases of the Select Agents under its jurisdiction. Weyant noted that if a laboratory does not intend to possess a Select Agent but encounters one (e.g., in a sample provided to a hospital), the laboratory is not required to register with CDC but must fulfill the reporting requirement. Between 2005 and 2012, there were 1,059 release reports
with 3,780 potential worker exposures. Of this number, only 10 LAIs2 resulted, and there was no evidence of transmission to the general public. These data allow the calculation of an approximate annual rate of LAIs among workers at registered entities of less than 0.0005 (5 confirmed LAIs per 10,000 workers over 7 years).
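The arithmetic behind Weyant's figure can be sketched as follows; the worker population of 10,000 is taken from the parenthetical above and is used here only for illustration:

```python
# Back-of-the-envelope check of the LAI rate Weyant described:
# 5 confirmed LAIs among workers at registered entities, across
# roughly 10,000 workers, over the 7-year span 2005-2012.
confirmed_lais = 5
workers = 10_000       # illustrative figure from the talk
years = 7

annual_rate_per_worker = confirmed_lais / (workers * years)
print(f"annual LAI rate per worker ~ {annual_rate_per_worker:.6f}")

# Consistent with the "less than 0.0005" bound quoted in the talk.
assert annual_rate_per_worker < 0.0005
```

Note that even the undiscounted ratio, 5 LAIs per 10,000 workers over the whole period, stays below the 0.0005 bound once annualized.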
Moritz next reported on the processes by which UW reviews and oversees the GoF research of Kawaoka. She noted that “Select Agent research at the University of Wisconsin is considered a privilege, not a right.” The UW IBC had a thorough discussion of Kawaoka’s research before it was funded and approved the risk mitigation measures he proposed. Since the beginning of 2006, his influenza research protocol has been reviewed more than 40 different times. Following the issuance of the U.S. government’s March 2012 dual use research of concern (DURC) policy, UW chose to form an IBC subcommittee to review research for DURC, and this policy has been revised to account for subsequent changes to the regulations. The DURC subcommittee reviews grants, experiments, and manuscripts for DURC and passes its findings on to the full IBC for additional discussion. The IBC’s findings are, in turn, passed on to an additional committee, one that is considered a best practice by multiple federal agencies, including the U.S. Federal Bureau of Investigation. This is the Biosecurity Task Force, which is composed of key individuals and experts from UW’s campuses as well as all responsible university officials (e.g., biosafety officer, director of environmental health and safety, communications, health services) and law enforcement. The mission of this broad group is to ensure the highest level of safety and security for UW’s Select Agent laboratories. The Biosecurity Task Force, with the IBC and the DURC subcommittee, helps ensure that the facilities, practices, and procedures meet the stringent biosafety and biosecurity standards set by these committees. While the merits of Kawaoka’s research are decided by the agencies that fund it, it is imperative that these UW committees evaluate the risks to the institution and the public so as not to jeopardize the institution’s $1 billion research enterprise.
Moritz stated that the risk assessment performed by UW included extensive risk mitigation measures that have been put in place and sets forth the principles for a culture of safety. Examples of the risk mitigation measures at UW include the following:
- The Influenza Research Institute (IRI) facility is a stand-alone structure that houses only Kawaoka’s research group, which allows university officials to control all aspects of the building and the activities inside it. Entry to the building is strictly controlled. Annual preventive maintenance is performed to ensure the IRI is functioning optimally.
2 Five of the 10 LAIs were from registered laboratories, and 5 were from labs exempt from registration. The calculation made by Dr. Weyant uses only the figure for the registered laboratories.
- Kawaoka’s transmission experiments are done in a BSL-3 agricultural suite, which is essentially a BSL-4 facility minus the positive-pressure suits and chemical shower. Basically, BSL-3 labs are boxes within boxes that allow for open animal holding. But Kawaoka takes this one step further and uses High-Efficiency Particulate Air (HEPA) filters on the animals’ cages for primary containment.
- Personnel wear PPE when working inside the high containment facility and follow numerous procedures for entering, exiting, and conducting research in the laboratory. All researchers at IRI must undergo extensive hands-on training with a mentor and must pass proficiency testing before they are allowed to work in the containment laboratories. In addition, they must undergo hands-on laboratory refresher training as well as regular training updates. GoF transmission experiments are only performed by Kawaoka’s most senior research staff.
- Like all Select Agent programs, IRI is required to have incident response and security plans for any number of potential events. Kawaoka and his researchers participate in hands-on and scenario drills with a dedicated trainer who runs these programs. They train for security threats and for natural and human-made disasters, such as a fire in a containment laboratory. Imagine telling the fire department that it cannot fight a fire; that is exactly what was done when the IRI was being commissioned. If there is a fire inside the facility, the fire department has orders to let it burn and only prevent the fire from spreading beyond the laboratory perimeter. There are also routine drills for an exposure or potential exposure or for researchers who exhibit influenza-like symptoms.
- Kawaoka has an elaborate exposure control plan that was developed in conjunction with UW health services and infectious disease physicians and with state, county, and city public health agencies. All researchers receive seasonal influenza vaccines and are given thermometers to keep at home. All of the influenza strains used inside the high containment laboratories are sensitive to antiviral drugs.
- Possibly the most important part of this process is communication. If anything out of the ordinary happens in one of the containment labs or with regard to the building, then the researchers are obligated to notify the responsible official or the alternate
responsible officials, who form a task force to determine whether there is a risk to be addressed.
- Lastly, there have been questions about research transparency. The UW is a public institution that worked hard to engage the local community prior to building the IRI. From the beginning, UW has been clear about the laboratory, its mission, and the research conducted there.
Later in the discussion, Lamb, who spoke at the very end of Session 8, commented that IBCs vary enormously in competence. He noted that while the UW has a highly competent system for oversight of GoF research, many other universities have much less expertise with designing safety protocols for investigators.
Lipsitch presented the final talk in the Biosafety Session, focusing on pandemic strains of influenza and GoF research that increases their transmissibility among mammals. He noted that the safety record for biocontainment lab research is good, but not perfect, and pointed to data from the literature that indicate that there were 1,141 LAIs and 24 deaths worldwide in the 1979-2005 period. He stated that research projects that pose risks to public health, such as those that make pandemic influenza strains more transmissible, appropriately face greater restrictions. He stated that the public health risks of such research affect a broader, potentially global public and that even low accident rates may be unacceptable given the larger number of people who could be affected.
Lipsitch presented data from Henkel et al. (2012) and from an Environmental Impact Statement for the U.S. Department of Homeland Security’s National Bio and Agro-Defense Facility (2007) that provide much higher rates of LAIs for Select Agent labs in the United States and for laboratories belonging to the National Institute of Allergy and Infectious Diseases (NIAID). For the former, the rate he presented was 0.2 percent per lab per year, and for the latter it was 1 percent per full-time BSL-3 lab worker per year. He went through a series of calculations that he believes demonstrate that a release of an H5N1 or other pandemic influenza strain enhanced through GoF research to increase its transmissibility among mammals could result in a 0.01 percent to 0.1 percent chance of 2 million to 1.4 billion fatalities, or an expected death toll of 2,000 to 1.4 million per BSL-3 laboratory-year based on the Select Agent data LAI rate. Using the NIAID data, each full-time person-year of GoF research in a BSL-3 lab could produce a death toll of 10,000 to 10 million. (He noted that these calculations have recently been published, see Lipsitch and Inglesby, 2014.) While acknowledging that these numbers represent “a small probability of a very severe outcome” because the most likely outcome is no pandemic from the work at any lab per year, he emphasized that the problem
is that such levels of risk should not be ignored when the consequences of an accident are on such a scale.
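The expected-value structure of Lipsitch's argument (probability of a release-driven pandemic per laboratory-year multiplied by the fatality toll of such a pandemic) can be sketched as follows. The corner products below are illustrative only: the published analysis (Lipsitch and Inglesby, 2014) combines its parameter ranges in more detail, so these products bracket, rather than exactly reproduce, the 2,000 to 1.4 million range quoted above.

```python
# Sketch of the expected-value arithmetic in Lipsitch's estimate:
#   expected toll per lab-year = P(pandemic per lab-year) x fatalities.
# Parameter ranges as quoted in the talk.
p_pandemic = (1e-4, 1e-3)    # 0.01% to 0.1% chance per BSL-3 lab-year
fatalities = (2e6, 1.4e9)    # 2 million to 1.4 billion deaths

# All four corner products of the two ranges.
corners = [p * f for p in p_pandemic for f in fatalities]
print(f"expected toll per lab-year: {min(corners):,.0f} to {max(corners):,.0f}")

# The upper corner matches the 1.4 million figure cited above.
assert abs(max(corners) - 1.4e6) < 1
```

The point of the structure, as Lipsitch argued, is that even a very small probability multiplied by a very large consequence yields an expected toll that is hard to dismiss.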
Lipsitch went on to argue that the choice should not be viewed as GoF research versus no GoF research, but rather that there are alternative ways to obtain the knowledge needed about potential pandemic pathogens (PPPs). (He referred the audience to Lipsitch and Galvani for a list of such alternatives with citations.) He believes that the alternatives he suggests create no significant health risks while maintaining the ability of science to pursue countermeasures to PPPs. For Lipsitch, this absence of pandemic risk tips the scales in favor of the alternative approaches.
During the question-and-answer period at the end of the session as well as in the final discussion on the second day, Nicholas Evans, University of Pennsylvania, noted that biosafety data are not well collected in the United States or in many other countries. He pointed to a Government Accountability Office (GAO) report (2009) that stated that the number of BSL-3 laboratories in the United States is not known and there is no central data collection point either inside or outside the United States for biosafety data. Jim Welch of the Elizabeth R. Griffin Foundation also agreed that there is a need to develop a repository for biosafety best practices, particularly with regard to LAIs, beyond simply Select Agents, where the option of noncompliance does not exist because you will be shut down by the CDC. Weyant also noted that although the labs that are performing GoF research are well run and overseen, there are only a small number of them at present. He asked what will be needed to ensure that additional work will also be adequately overseen if the number of such labs grows by one or two orders of magnitude. Charo also pointed out that case-by-case evaluations of particularly hazardous experiments require knowing who is going to be executing them and what their capabilities are.
Another topic that received attention was Lipsitch’s presentation of calculations on the risk of GoF research. Fouchier disputed the calculations, stating, “I prefer no numbers rather than ridiculous numbers that make no sense….” Lipsitch responded that he invites anyone with better information to challenge his numbers and provide better ones so that the calculations on the risks of GoF research can be as realistic and refined as possible. He stated that “we should incorporate all the data available and the more relevant the better.” Fouchier’s own calculations, which he described briefly in the final session and which showed a far lower risk of accidents or pandemic outcomes, have been published since the symposium (Fouchier, 2015).
Over the course of the 2-day meeting a number of participants, such as Gregory Koblentz and Koos van der Bruggen of the Royal Netherlands Academy of Arts and Sciences, commented on the apparent shift over the past 3 years in the international discourse regarding the potential risks of GoF studies from one focused on biosecurity to one focused on biosafety. Nonetheless, Carol Linden of the U.S. Biomedical Advanced Research and Development Authority (BARDA) noted that while biosafety and biosecurity are inextricably linked, they remain distinctly different with different legal, policy, and regulatory regimes. Both aim to keep dangerous pathogens safely and securely inside the areas where they are used and stored, yet they mitigate against different risks. Biosafety provides policies and practices to prevent the unintentional or accidental release of specific biological agents and toxins, whereas biosecurity provides policies and practices to prevent the intentional or negligent release of biological materials or the acquisition of knowledge, tools, or techniques that could be used to cause harm. Thus, while providing a foundation upon which to build biosecurity capacity, biosafety measures, in and of themselves, cannot fully address biosecurity risks.
Consequently, some participants, such as Ed You of the Federal Bureau of Investigation, noted that both the biosafety and biosecurity environments are important to consider in any risk assessment, taking into account the organism/pathogen and its weaponization potential, the capability (including scientific knowledge, tacit knowledge, and technological know-how) and intent of an adversary, and the potential consequence of intentional release or misuse. Such an assessment would also require the expertise of scientists and security experts to fully address the range of potential risks.
These components necessarily require additional considerations as well. Some participants noted that current nomenclature, whether dual use, DURC, or GoF, complicates the focus of any assessment. Several argued for definitions identifying organisms/agents and experiments of substantial concern that would be more precise yet flexible enough to adapt as the science advances. (This point is made several times elsewhere in this report.) Gigi Gronvall of the UPMC Center for Health Security and Koblentz also pointed out that the capability of an adversary is equally, if not more, difficult to assess because understanding of who would act in such a manner (e.g., state actors, non-state actors, or lone wolves) and of their capacity for acting is limited. Furthermore, our understanding of the consequences of a deliberate release or use of a biological weapon in any given set of circumstances is also limited. This includes knowledge of the availability and effectiveness of appropriate measures and infrastructure to respond to such an
event, although we know that capacity varies, in some cases quite significantly, from country to country or even within countries.
Peter Hale of the Foundation for Vaccine Research also noted that a one-time risk assessment is not appropriate, calling for periodic review and updating of any assessment based upon scientific advances, technological know-how, improvements in country capacity and infrastructure, and increases in the number of laboratories undertaking research of particular concern on the one hand and the changing security threat environment on the other. Taken together, this means that any assessment of potential biosecurity risks would have to acknowledge and deal with substantial uncertainties across most, if not all, major parameters.
SPECIFIC TOPICS FOR CONSIDERATION
Koblentz noted that today’s concerns regarding possible biosecurity risks associated with GoF research in the United States are rooted in biosecurity risk discussions that stemmed from several events that occurred in the mid-1990s: the sarin gas attack in the Tokyo subway in 1995; the bombing of the federal building in Oklahoma City in 1995; and the 1995 attempts by Larry Wayne Harris to acquire plague samples through the mail. Koblentz offered an approach for assessing biosecurity risk, stating that risk assessment is a judgment about the likelihood that an event will occur and the consequence of that event:
Risk = Threat (Capability + Intent) × Vulnerability × Consequences – Mitigation Countermeasures
Looking at just one component of the risk calculation is not sufficient, Koblentz cautioned.
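Koblentz’s relation can be illustrated in code. The sketch below is purely notional: the function names and the 0–1 scores are hypothetical placeholders introduced for illustration, not values drawn from the symposium, and the formula itself is a qualitative heuristic rather than a validated quantitative model.

```python
# Illustrative sketch of Koblentz's qualitative risk relation:
#   Risk = Threat (Capability + Intent) x Vulnerability x Consequences - Mitigation
# All scores are hypothetical 0-1 placeholders chosen for illustration only.

def risk_score(capability, intent, vulnerability, consequences, mitigation):
    """Combine the components into a single relative score.

    Threat is modeled as the sum of capability and intent, following the
    formula's grouping; mitigation countermeasures are subtracted last.
    """
    threat = capability + intent
    return threat * vulnerability * consequences - mitigation

# Hypothetical example: moderate capability and intent, high vulnerability,
# severe consequences, partial mitigation in place.
score = risk_score(capability=0.4, intent=0.5,
                   vulnerability=0.8, consequences=0.9, mitigation=0.3)
print(round(score, 3))  # (0.4 + 0.5) * 0.8 * 0.9 - 0.3 = 0.348
```

The point of the exercise matches Koblentz’s caution: changing any single component (for example, setting intent to zero) changes the overall score, so no one component can be examined in isolation.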
One of the similarities between the current debate and the debate of the late 1990s is the lack of good data, meaning there are very few incidents. While this is good news, the lack of data leaves room for speculation and uncertainty, creating three distinct schools of thought about the nature of the threat: Optimists, Pessimists, and Pragmatists. Koblentz elaborated on each of these as follows:
Optimists: Individuals who adhere to this school of thought generally believe that the risks of bioterrorism are exaggerated. They note several factors:
- The number of attempts by terrorists to acquire and use biological weapons has been minuscule compared to the overall number of terrorist attacks, which for Optimists implies that terrorists simply do not want or need these types of weapons to achieve their goals;
- Terrorists tend to be conservative and choose weapons that are readily available and have a proven track record, such as guns and bombs;
- There is a stigma against the use of biological weapons, and groups that are capable of pursuing such weapons have not done so; and
- Technical hurdles are significant. Dual use research requires a set of skills and tacit knowledge that is acquired through hands-on training and trial and error.
Pessimists: Individuals who hold this perspective believe that it is a matter of when, not if, a biological attack will occur and the consequences will be catastrophic. Pessimists hold this view because:
- the small number of incidents is a harbinger of the future and provides evidence that terrorists are innovative and not averse to failure;
- terrorists take risks and experiment with new ways to cause harm and death;
- terrorists have employed increasingly lethal measures over time;
- terrorists are motivated by a rise in religious conviction, which tends to place fewer constraints on causing mass casualties; and
- advances in science and technology are decreasing technical hurdles, while diffusion of knowledge and technologies are increasing global access.
Pragmatists: Individuals who adhere to this view believe that bioterrorism is a low-probability, low-consequence event. Pragmatists worry about the emergence of terrorist groups with the intent and capability to acquire and use biological weapons, but they consider the likelihood of such groups emerging to be quite low. Pragmatists pay less attention to probability and consequence and focus more on understanding how and why terrorist groups pursue these types of weapons. Instead of thinking that the past is a predictor of the future, as do the Optimists, or predicting the future based on current trends, Pragmatists focus on the conditions or variables that might lead certain groups to seek such weapons. Finally, Pragmatists share the Pessimists’ view of the growing capability of certain groups to acquire biological materials and resources but at the same time share the Optimists’ perspective that it takes substantially more knowledge, know-how, and skill to pull off such an attack than these groups have or will soon acquire.
These three perspectives lead to risk assessments that would weigh
benefits and risk quite differently. And while a risk assessment may not explicitly state which of these views is driving the analysis, the values embedded in these perspectives will nonetheless find their way into the process. Koblentz thus argued it was essential to surface these assumptions early in any risk assessment process.
Gronvall offered six key points to consider in any assessment of biosecurity risks stemming from GoF research:
- There are few big surprises in life sciences research—science is incremental and rarely presents a clear dividing line that, if crossed, triggers an immediate biosecurity concern;
- It cannot be denied that publishing sequences of new, potentially pandemic-causing strains of any pathogen lowers barriers to weaponization;
- While the current debate is focused on flu viruses, SARS, and MERS, these join a host of other pathogens that could be misused;
- With a plethora of pathogens available for misuse, many paths can be pursued toward weaponization that do not require GoF experiments;
- Assessing who would likely take steps to weaponize pathogens is incredibly difficult and dependent upon experts with widely diverging views; and
- Assessing GoF research must include not only the biosecurity risks but also the biosecurity benefits for countermeasures that new knowledge acquired from such research may provide.
Linden expanded the discussion of biosecurity to include the public health aspect of biosecurity. She noted that over the past few decades there has been a growing understanding within the federal government that biosafety will not address all of the biosecurity concerns. Yet, in the end, concerns about both biosafety and biosecurity lead to a discussion of the need for a strong and sustained culture of responsibility in science that involves both individuals and institutions and that includes federal laws and regulations. Through a series of education and training protocols, efforts have been made to enhance the understanding of biosecurity by laboratory personnel so that work is done safely and securely. Similarly, facilities have been assessed and requirements have been implemented to enhance the physical security of laboratories. Additionally, regulations governing transportation of pathogens contribute to the overall soundness of laboratory and personnel handling of dangerous materials. While this web of education, training, regulations, and requirements contributes to overall improvements, we must be mindful that these efforts do not have unintended consequences. A measured approach that recognizes
that zero risk is not achievable can provide layers of protection to address legitimate safety and security concerns while allowing continued scientific progress.
Several participants and webcast viewers offered additional comments during the final session of the symposium. Van der Bruggen commented on the shift in the GoF debate from security to biosafety. To some this implies that, if the biosafety issues are solved, which they believe is not difficult, then the biosecurity issues are solved as well. He did not agree: while he considered the biosecurity risk very small, it nonetheless exists. Laboratory biosecurity certainly overlaps with biosafety. However, there are possible biosecurity threats that call for separate attention and often the inclusion of other experts to give advice on these issues. That makes regulating and decision-making more complicated, but it is essential.
Epstein offered an observation about the different ways in which different communities will look at this problem. One difference in these communities’ perceptions should be highlighted and that is the difference between what is considered tangible and what is considered speculative or hypothetical. He said he was exaggerating for effect, but for the scientific community the benefits of fundamental research are tangible, while the risks of bioterrorism are speculative. But the security community comes at this the other way around: for many in this group, human intent to do harm is very tangible and real, even if the risk of bioterrorism might be uncertain, while the benefits of fundamental research are seen as speculative.
Piers Millett, Biosecure Ltd., commented that, while he was impressed with the discussion of bioterrorism, he thought that in moving forward consideration should be given to the history of state-level activities and whether something as sophisticated as GoF research was more likely to be misused by a state or a non-state actor. He also suggested looking beyond the WHO to consider the convening capacity of the Biological Weapons Convention and the relevance of the Convention on Biological Diversity for some issues.
Evans commented that any deliberative process going forward would need to recognize the possibility of incommensurate values regarding the bounds of public health value, the value of innovation, and the value of security. All three have to be weighed against each other, and there may be fundamental disagreements about all of them. He believed that the symposium had not adequately discussed the distinction between the proposed restrictions on GoF studies and the funding of other research that may offer more or different benefits to society. Those are two different types of regulatory effects, and different considerations apply in each case. This dovetails with the larger issue of the importance of institutional memory. The controversy over GoF research is not a new issue, and although the dual use aspects of GoF have dropped off the agenda to a certain extent, considerable progress has been made in the discussions regarding the issue, and that should be remembered.
Kavita Berger, American Association for the Advancement of Science, commented that the types of risks being discussed had not been well defined. There had been general discussions of biosafety, laboratory biosecurity, personnel security, and bioterrorism, but almost nothing, for example, about the issue that someone might use information that is publicly available. Going forward, it would be important to be clear about what specific risks are the major concerns. She also commented that, although not perfect, guidance, regulation, and infrastructure are in place dealing with security and biosafety from a laboratory perspective. Too often much of the focus is on the scientist who is actually following the rules, as opposed to the individual who will not follow the rules, who should be of greatest concern. The core of the GoF issue, she noted, is the possibility that someone might use the results to cause harm, and the question is how to include that risk in the overall risk assessment. Her other comment was that in the 10 years since the dual use issue emerged, institutions in the United States, Europe, Canada, and a growing number in other parts of the world have gained experience in reviewing and overseeing research. Again, these efforts are not perfect, but there are best practices that can be shared and examples and models that can be highlighted as ways to start.