In the workshop’s third session, five panelists discussed the notion of a social contract between scientists and society. The panel considered: (1) how the rapid pace of advances in the life sciences changes how we weigh risk versus benefit, (2) how research that possesses some element of risk is regulated, (3) how mechanisms currently in place can guide how scientists weigh risks/benefits and whether these mechanisms are appropriate, (4) whether there are types of research that should not be pursued or supported by the federal government or research whose findings should be restricted, (5) whether there is a place for classification in life sciences research, and (6) the role of the public in scientific decision-making. Each panelist was given an opportunity to offer opening remarks on the topic.
Robert Cook-Deegan, Director for Genome Ethics, Law, and Policy, Institute for Genome Sciences and Policy, Duke University, reflected on the distinction between biosafety and biosecurity and, with regard to the latter, on efforts to thwart the deliberate misuse of biological knowledge. Cook-Deegan observed that a major difference between biosafety and biosecurity is the degree to which politics and policy making will be influenced by external events. A more readily apparent difference is the addition, on institutional biosafety committees, of the expertise necessary to assess the level of threat and to anticipate and prevent deliberate misuse of research materials or information. Cook-Deegan acknowledged, however, that the influence of external events cannot be avoided, particularly events that are unpredictable, unprecedented, or violent.
Cook-Deegan observed that many past events were not preventable. He cited Sverdlovsk1 and Amerithrax2 as examples and argued that neither situation could have been prevented by the measures under discussion at the workshop. A workshop participant pointed out that the Sverdlovsk incident was partly accidental and was complicated by the failure to admit what had happened and by a government cover-up. Cook-Deegan concurred and observed that personnel decisions and laboratory oversight are critical. In Cook-Deegan’s view, complete risk characterization will never be possible, regardless of the level of risk-assessment expertise. Nevertheless, he observed that the twin questions of whether to “perform” or to “publish” research will continue to arise for the foreseeable future and that solid procedures must be in place for making such determinations. Cook-Deegan noted that the Fink Report recommended developing a mechanism for this purpose: expanding the purview of existing institutional biosafety committees to include determinations about the threat of a research project’s potential misuse.
Cook-Deegan observed that emotions run so high in the current debate because there are deeply held opinions about scientific freedom. Given the strong culture of openness in the biological sciences, Cook-Deegan believes that the current deliberations should be based on a presumption that the strong tendency toward openness in life sciences research will remain. He added that this tendency will serve the U.S. well in its international relationships, as access to information will assist other countries in protecting themselves.
Ruth Berkelman, Rollins Professor and Director, Center for Public Health Preparedness and Research, Emory University, and Director, Emory Preparedness and Emergency Response Research Center, defined the social contract as the giving up of certain rights in exchange for the benefit of the protection of society or the community. She discussed the example of policies surrounding clinical research studies on human subjects, policies that, while imposing a “burden” on the research enterprise, have ensured that such research proceeds in an atmosphere of public support and cooperation. In contrast, regulatory systems, whether national or international, to protect populations of people (and ecosystems overall) have not developed in tandem. Berkelman emphasized the need for local, national, and international mechanisms that are developed transparently, critiqued regularly, and include public representation. She concluded her remarks by noting that scientific freedom and the public interest rarely come into conflict, but when they do, the public interest must come first. When scientists put the public’s interest first, she observed, the public will be in the best position to continue offering its trust and support.

1 In 1979, an anthrax outbreak occurred in the city of Sverdlovsk (now named Ekaterinburg) in the Soviet Union. Anthrax infections occurred in livestock and in humans who had eaten contaminated meat. The source of the pathogen was determined to be an accidental release of Bacillus anthracis (anthrax) spores from a military-run microbiology laboratory.

2 In September and October 2001, letters containing anthrax spores were sent through the U.S. postal system. As a result, twenty-two people are known to have been infected with anthrax; five died. The FBI investigation of the mailings is known as Amerithrax.
Greg Kaebnick, Research Scholar, The Hastings Center, stated, “I don’t think there is any such thing” as a contract between society and scientists. He noted, however, that there is a “nested” relationship in which science occurs within a “social milieu” and is subject to social values and norms, just as all of us are. A social contract that is seen as a contract between co-equals, between a group and society, he observed, tends to be framed in terms of the group securing social privileges—in this case, the privilege of self-governance. Kaebnick encouraged the audience to think of the relationship in nested terms wherein the group is part of society and has certain obligations.
Kaebnick discussed how, for many people within the scientific community, science is considered technical and value-free. When this perspective is in play, he noted, the concept of a social contract between science and society becomes an effort by scientists either to keep broader social values at bay or to protect scientific values from the social values that surround them.
Given this, Kaebnick saw a system of pure self-governance as cause for concern. He recommended that any such system have a broad range of social inputs. Kaebnick acknowledged that a system may need to rely somewhat on self-governance simply because the field is highly technical and changing rapidly. He also acknowledged that it is quite difficult to decide who among the public would be involved and how, and he urged that such a system ensure that the participants understand the science well. He envisioned, rather than an open consultation with the public, a mechanism whereby existing scientific bodies (such as institutional biosafety committees or the National Science Advisory Board for Biosecurity) are enriched by the presence of non-scientists: people who are committed to the process and to learning the science.
Daniel Kevles, Stanley Woodward Professor of History and Professor of History of Medicine, American Studies, and Law (adjunct), Yale University, discussed the H5N1 controversy in the context of two historical examples—nuclear weapons in the early 1940s and recombinant DNA in the 1970s. He noted that, in both cases, scientists acted responsibly. Nuclear physicists halted the publication of research on nuclear fission in the early stages of the nuclear era, and in the 1970s biologists imposed a moratorium on recombinant DNA research while the biosafety questions could begin to be addressed. Kevles also addressed the role of the public.
He noted ways in which public involvement in both situations was critical. In the case of recombinant DNA specifically, he noted the importance of the public role in both the deliberations about and establishment of regulations regarding recombinant DNA. The public, he observed, was part of state and local governmental processes, advisory committees, and institutional review boards, for example.
In the case of both recombinant DNA and nuclear fission, consideration was given both to the organisms/technologies themselves and to information about them. In the case of recombinant DNA, Kevles noted that the danger was primarily associated with the organisms themselves and not their possible misuse as weapons. Containment measures therefore had to do with securing laboratory facilities to prevent the organisms’ escape. In contrast, with nuclear fission, the threat was not in the experimental material itself (though fissionable material was a critical ingredient), but rather in the technological knowledge that enabled its manipulation.
Following World War II, scientists considered how nuclear science could be advanced while protecting national security. Upon the release of the Smyth Report,3 they drew a “bright line” between research that would be available publicly and research that would be classified. A group of scientists also led an effort to establish international control of nuclear research.
Regarding the question of whether public policy should be formulated in a general or abstract way or formulated specifically for a given research area or project, Kevles favored specific oversight of particular knowledge or technology.
Journalist Carl Zimmer has written about the H5N1 controversy for the New York Times and other publications. He spoke about the role of journalism in scientific debates and deliberations, affirmed the importance of journalism in bringing scientific debates to the public’s attention, and discussed journalism’s function in distinguishing real risk from baseless fear. In the case of the H5N1 papers, Zimmer noted, journalists found themselves writing about research that they did not have access to: the relevant research papers had not been released, and although many scientists were quite open with journalists, a number were unwilling to talk with the media.
Zimmer reflected on a commonly heard accusation that, in the uproar over the Fouchier and Kawaoka papers, journalists exaggerated the risks of conducting the research. While acknowledging that some irresponsible coverage of the controversy took place, Zimmer pointed out that in most cases journalists based their analyses on information provided directly by scientists. He admonished scientists who unjustly use journalists as scapegoats.

3 Henry DeWolf Smyth, “Nuclear Energy: A General Account of Methods of Using Atomic Energy for Military Purposes Under the Auspices of the United States Government 1940-1945.”
In future instances involving scientific controversy, Zimmer called for more transparency on the part of the scientific community. Zimmer concluded his remarks by observing that there are cases where scientists have done an exceptionally good job of communicating with the public on, for example, public health topics. He urged the broader scientific community to learn from such examples.
Session moderator Harold T. Shapiro, President Emeritus and Professor of Economics and Public Affairs, Princeton University, agreed with Cook-Deegan that the Fink Report offered practical suggestions. Shapiro also agreed that the advancement of knowledge has attendant risks and that a workable process will need to be flexible.
In the discussion period, panelists and participants also considered the forms that public input might take, reasons why scientists were hesitant to talk with journalists, laboratory culture and its effect on public safety, methods of estimating risk, and the possible inclusion of fields of study not typically overseen by institutional biosafety committees.
At the workshop, public participation was a topic of vigorous discussion. Three themes emerged: how and whether to solicit the input of members of the general public in deliberations about regulating life sciences research of concern; how to communicate most effectively with the public about the regulatory oversight and review of research; and how best to create regulatory frameworks that warrant and inspire public confidence.
Some discussants raised the question of how the public should be involved in the development of regulatory schemes. How, if the public were to serve on regulatory bodies, would individuals be selected and how might a group with admittedly mixed expertise most efficiently deliberate on highly technical issues?
Concerning how to communicate most effectively with the public, an audience member stressed the importance of communicating in calm, rational terms. A panel member added that positive hyperbole, or over-optimism, should be avoided as well. Another audience member expressed the opinion that the media should reconsider the use of inflammatory terms such as “doomsday virus,” “biological weapons,” and “biowarfare” and instead use more restrained descriptive terms such as “public health research” and “concern.” Zimmer agreed and noted his own frustration with inflammatory headlines. He observed, however, that some of the apocalyptic language used by the media came from quotes by scientists themselves.
Joe Palca, Science Correspondent, National Public Radio, spoke about what scientists think the public wants to know about science and the types of questions the public actually asks. Palca observed that scientists are often accustomed to communicating with a degree of subtlety and nuance lost on the public. He suggested that the scientific community needs to better understand the mechanics of providing accessible answers to questions that the public wants answered. Palca acknowledged that the media is often an intermediary in this process and reminded scientists that journalists often have final responsibility for what is communicated.
Throughout the workshop, a recurring theme involved the question of how best to create a regulatory framework in which the public has confidence. Suggestions included having the public decide which persons or what types of expertise should be represented in regulatory discussions.
Dr. Shapiro suggested that the group consider the role of private corporations in life sciences research. Dr. Kevles discussed his experience in 2001 at the 25th anniversary of the Asilomar Conference. He noted that, unlike at the time of the original conference, many life scientists now have ties to industry. He suggested that such relationships would pose a barrier to the creation of a disinterested set of recommendations similar to those put forth at the Asilomar Conference. Kevles noted, however, that it is essential to involve corporations in governance discussions, especially in light of many corporations’ positions as international actors. Dr. Cook-Deegan agreed and noted that many corporations devote a much more significant portion of their budgets to life sciences research than was the case in 1975.