Foundational technologies (more properly, foundational science and technologies) are by definition those that can enable progress and applications in a variety of problem domains. Even in a military or national security context, it is rare for research on foundational technologies to be entirely classified: most such work is unclassified, or classified and unclassified work proceed contemporaneously. Lastly, useful applications based on a foundational technology often take a long time to emerge, and even then, a foundational technology may be used in combination with other technologies, both foundational and specialized, to create useful applications.
Each of the three main sections of this chapter addresses the scientific and technological maturity, describes some possible military applications, and discusses some illustrative ELSI questions that may be associated with each of the three technologies selected by the committee for examination (or applications that might be enabled through the technologies). The reader is cautioned that ELSI concerns related to these technologies— information technology, synthetic biology, and neuroscience—are not handled uniformly from section to section, reflecting the fact that different kinds of ethical, legal, and societal issues arise with different foundational technologies and the applications they enable. This chapter and Chapter 3 (on application domains) provide case studies for empirically grounding the framework of ELSI-related questions laid out in Chapter 5.
In general, information technology is designed to store, process, manipulate, and communicate information rendered in digital form. Information technology includes computing and communications technology. Both hardware and software fall under the rubric as well. The academic disciplines of computer science and computer engineering provide much of the intellectual underpinning of information technology.
Information technology as a field is simultaneously mature, in the sense that the underlying technologies of information technology are sufficiently stable and well understood to support useful applications, and also newly emerging, in the sense that innovation and invention in information technology continue apace as they have for several decades.
The fundamental trends underlying advances in information technology hardware have for several decades been characterized by exponential growth in processor power and storage capacity, with doubling times measured in periods ranging from 9 months to 2 years. And there has been a corresponding flowering of applications resulting from the general public’s easy access to computing power.
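The practical force of these doubling times can be checked with a line of arithmetic. As a minimal sketch (the function simply evaluates the standard exponential-growth formula; the 9-month and 2-year figures are the ranges cited above):

```python
# Growth implied by a fixed doubling time: size(t) = size(0) * 2**(t / T).
# Illustrative arithmetic only, using the doubling times cited in the text.

def growth_factor(years, doubling_time_years):
    """Multiplicative growth over `years` for a given doubling time."""
    return 2 ** (years / doubling_time_years)

decade_slow = growth_factor(10, 2.0)    # 2-year doubling: 32-fold in a decade
decade_fast = growth_factor(10, 0.75)   # 9-month doubling: more than 10,000-fold
```

The spread between the two doubling times is thus not incremental but orders of magnitude over a single decade.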
The same is true for communications technologies. These technologies support increasingly ubiquitous interconnectivity between computing devices, and it is not an exaggeration to suggest that most computing devices in the world are connected—although perhaps with a significant time lag—to most other computing devices. Such connectivity has led to exponential increases in the numbers of computers (and individuals) that communicate with each other.
With respect to the “packaging” of the fundamental hardware components of information technology, there are three hardware trends of note today.
• Mobile computing and communications. To an ever-increasing degree, users are demanding and vendors are supplying a wide variety of mobile computing and communications platforms, ranging from smart phones and tablet computing devices that are familiar to many consumers to ubiquitous sensor networks that are physically distributed over wide areas. Wireless data services needed to support mobile applications are proliferating as well. One form of mobile computing of particular note is wearable computing, as discussed in Box 2.1.
Box 2.1 Wearable Computing
In contrast to handheld computing devices such as smart phones and personal digital assistants, wearable computing devices are generally integrated into a human’s clothing or accessories (e.g., watches, glasses, belts). As such, they are less conspicuous as they are carried or used, and onlookers are more likely to miss them in casual observation. Moreover, the placement of these devices means that computing capability and large volumes of information are nearly instantaneously available.
Such devices have both military and civilian applications. Wearable computing and communications are already used to provide tactical information to soldiers in the field, and the canonical person in the street can often make use of instantaneous knowledge about geography (mapping), products on sale, and a wide variety of other consumer applications. Advances in such technology can also be used to provide bidirectional real-time translation between English and other languages.
But the inconspicuousness of wearable computing also raises many privacy issues. For example, one oft-raised privacy concern involves the possibility of instant facial recognition. A camera mounted on a user’s glasses connected to a computer can allow the user to look at another person, capture an image of his or her face, and using facial recognition software, identify that person—along with any other information associated with that identity. Especially in an environment in which everyone does not have equal access to such capabilities, the potential for information asymmetry is large.
Another wearable computing application is the electronic capture of everything that a person can see or hear. Under most circumstances, video and audio events are fleeting—and people’s memories of these events are known to be of questionable reliability under many circumstances. Those participating in such events often count on some degree of transience to make it safer for them to engage in such participation. The availability of potentially permanent records of previously transient phenomena thus has a potential for inhibiting a large range of behavior, most of which is not illegal. Again, the possibility of such an outcome most certainly carries ELSI implications.
• Cloud computing. For reasons of efficiency and economy, cloud computing is becoming increasingly popular among corporate users. Cloud computing provides computing power on demand, and because cloud computing is managed centrally, important IT support functions, such as security and maintenance, are simpler for many enterprises to obtain.
• Embedded computing. A modern automobile today has several dozen central processing units that control the braking, navigation, steering, entertainment, and power-train systems. (Indeed, safe computer-controlled driving has been demonstrated in a number of instances, and driving laws in some states are being updated to allow for this possibility.1) Computing power is also increasingly embedded in myriad devices and artifacts such as refrigerators and watches to make more effective use of the physical resources at hand and to provide desired services.
In the applications space, one of the most significant trends is the emergence of social computing and networking. Broadly speaking, social computing and networking support cooperative relationships for sharing information, and they take advantage of such shared information. In addition, information technology today is such that end users find it easier than ever before to assemble do-it-yourself applications for their own purposes.
Another important trend is the increasing use of a “big data” approach to solving a broad class of computational problems. Data storage capacity has grown even more rapidly than the processing power gains described by Moore’s law. And as technology is increasingly used in everyday life, more data can be and are collected. When such data are appropriately represented and structured, obtaining value from large data collections is often possible.
Processing these large data sets has required many additions to traditional algorithms and software engineering paradigms (e.g., the object-oriented practice in which programmers abstract, encapsulate, and reuse software objects). In particular, computer scientists now apply machine learning and knowledge discovery algorithms to large data sets and continually refine these algorithms based on evaluation of their results; certain branches of computer science today thus have a substantial empirical basis.
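As an illustrative sketch of the data-parallel style commonly applied to large data sets (the map-reduce pattern shown here is a representative idiom chosen for illustration, not a framework named in the text), a toy word count can be split into a “map” phase that emits key-value pairs and a “reduce” phase that aggregates them:

```python
# Toy map-reduce word count, sketching the data-parallel processing style
# often used on large data sets. In a real system the map and reduce phases
# would run across many machines; here they run in one process.
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) pairs from each input record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Aggregate counts by key."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = ["alpha beta", "beta gamma beta"]   # hypothetical input records
counts = reduce_phase(map_phase(logs))
```

Because each phase touches records independently, the same structure scales from two strings to billions of records by distributing the phases.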
Roughly speaking, machine learning involves methods that allow computers to make inferences from known relationships and patterns. For example, machine learning can be involved when a computer looks at many pictures of vehicles and identifies which pictures contain tanks. Here, the presumption is that tanks have distinguishing characteristics (e.g., vehicles with a gun sticking out of a turret that is mounted on a tracked chassis).
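A minimal sketch of the kind of inference just described, assuming hypothetical hand-picked features (a real vision system would learn its features from imagery rather than receive them as labeled numbers):

```python
# Toy nearest-centroid classifier: "learn" the average feature vector of each
# class from labeled examples, then label a new example by the closest average.
# The features (has_turret, has_tracks, length_m) are hypothetical stand-ins
# for what a real system would extract from pictures.

def train(examples):
    """examples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is nearest to `features`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], features))

training = [((1, 1, 9.8), "tank"), ((1, 1, 7.9), "tank"),
            ((0, 0, 5.2), "truck"), ((0, 0, 6.0), "truck")]
model = train(training)
```

The “presumption that tanks have distinguishing characteristics” appears here as the assumption that the two classes occupy separable regions of feature space.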
Knowledge discovery seeks to identify previously unknown relationships hidden in large volumes of heterogeneous data collected from myriad sources (text-based databases, video surveillance cameras, and so on). For example, knowledge discovery can be involved when a computer looks at a large volume of phone call records to identify networks of frequent communicators with geographical locations in Yemen or Somalia. Machine learning may, of course, be used in knowledge discovery—for example, systems can be “trained” on many examples and then asked to identify new patterns consistent with the examples in those training sets.
1 See, for example, Maggie Clark, “States Take the Wheel on Driverless Cars,” USA Today, July 29, 2013, available at http://www.usatoday.com/story/news/nation/2013/07/29/states-driverless-cars/2595613/.
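The phone-record example above can be sketched in a few lines. The record format, names, and frequency threshold below are hypothetical:

```python
# Toy "knowledge discovery" pass over call records: count calls between each
# pair of parties and surface pairs that communicate unusually often.
# Record format and threshold are hypothetical.
from collections import Counter

def frequent_pairs(call_records, threshold):
    """call_records: iterable of (caller, callee).
    Returns {sorted pair: count} for pairs with count >= threshold."""
    counts = Counter(frozenset(pair) for pair in call_records)
    return {tuple(sorted(pair)): n for pair, n in counts.items() if n >= threshold}

records = [("A", "B"), ("B", "A"), ("A", "B"), ("C", "D"), ("A", "C")]
links = frequent_pairs(records, threshold=3)
```

Using `frozenset` treats a call from A to B and a call from B to A as the same link, which is the relationship of interest when mapping communication networks.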
The result has been programs that are highly adaptive, even to the point of being able to learn. Direct consumer impact has occurred in everything from search engines, to classic artificial intelligence applications like speech recognition and translation, to modern e-commerce applications like interest-based advertising.
Finally, one of the most important truths about developments in information technology is that despite the origins of modern information technology in military R&D, advances in IT for the last few decades have been driven primarily by the private sector. This is not to deny the role of military R&D for certain very specialized technologies, but increasingly the military (and intelligence) communities seek ways of adapting commercially developed technologies for their own purposes, rather than building those base technologies from scratch. Such adaptations take advantage of an extensive IT R&D infrastructure developed in the civilian sector.
For example, scientists and engineers from the Massachusetts Institute of Technology Artificial Intelligence Laboratory founded a company in 1990 to commercialize their expertise in robotics—the fruits of their work include both bomb disposal robots and robotic vacuum cleaners. And this example is just one of the myriad developments originating in the private sector, including information retrieval and ubiquitous information, three-dimensional modeling, the “internet of things,”2 and natural language and image understanding.
U.S. military forces are highly dependent on information technology in a wide variety of contexts. To take the most basic example of such dependence, much of the IT used by DOD personnel for administrative and management purposes is essentially technology that can be obtained more or less unadorned from commercial vendors. But the DOD also has specialized needs for weaponry, command and control, training, and intelligence analysis.
• Modern military forces use systems and equipment that are controlled by computer for navigation, propulsion, communications, surveillance, fire control, and so on. One of the most significant examples of applying information technology to military problems in the past several decades is the trend toward “smarter” guided munitions. IT is used to guide such a weapon after release directly to its target, thus vastly increasing the probability of a hit. IT is also used to effectuate smart fusing (e.g., optimal timing for when an explosive should detonate), thus increasing the probability that a hit will actually destroy the target. Another advantage is that the use of such weapons instead of “dumb” munitions potentially reduces the collateral damage of certain kinds of military operations by orders of magnitude.
2 The “internet of things” refers to a densely connected array of objects imbued with computing power that share information to work more effectively and efficiently together.
• The movements and actions of military forces are increasingly coordinated through IT-based systems for command, control, communications, and intelligence (C3I) that allow information and common pictures of the battlefield to be shared and through analytical tools that help commanders make better decisions. C3I is an enabler for commanders to place (and thus use) their forces where and when they are needed, multiplying the operational effectiveness of those forces. Smaller forces are thus needed to create the same military effects.
• Training of U.S. military forces at many levels relies heavily on simulation, from training of individual soldiers to large-scale exercises that bring together many units. By definition, a simulation is a computer-generated representation of parts of a real environment. The use of simulation reduces costs of training and limits risks to individuals (e.g., from training accidents) but obviously does not substitute entirely for “live” training. In many cases, training simulations have their roots in gaming applications from the civilian sector.
• Intelligence analysis is based on finding connections in large disparate data sets. For example, machine learning and big data applications may be able to help predict major impending events, such as an assault or a jump in insurgent activity. Analysis of surveillance videos may identify an individual leaving a bomb in a public place or about to conduct a suicide attack. Authoritarian nations may use such technologies to identify dissidents. Adversaries might use predictive data mining to uncover putatively secret information, such as operational deployments of U.S. military units or identities of U.S. undercover operatives. High-quality facial recognition that can operate on degraded or obscured signals or can penetrate attempts at disguise has obvious value, especially in environments in which surveillance cameras are plentiful.3
As an illustration of some of the applications that new trends in computing might enhance, a presentation to the committee by Peter Lee from Microsoft Research suggested three important classes of application that have military or security implications and also present ethical, legal, and societal issues.
• Prediction. Large volumes of data can often be used to make predictions about future events (e.g., human behavior, outcomes of processes), the paradigm known as “big data” mentioned above. For example, based on the data routinely collected by electronic medical records systems in hospitals, it is possible to predict quite accurately the likelihood that a discharged patient will be readmitted. Prediction has also been demonstrated in software development (predicting the most likely locations of software defects and schedule delays); in Web browsing (predicting the Web pages that a user is likely to access); and in consumer buying behavior (predicting buying decisions in the near future).
• Extraction of information from degraded sensor data. In many sensing applications, data streams are highly redundant. Such redundancy can be used to compensate for missing or degraded data. For example, a group at the University of Illinois at Urbana-Champaign has applied the technique of compressive sensing to face recognition4 and has achieved a success rate for correct face recognition above 80 percent even operating on a severely degraded signal. (Compressive sensing is a signal-processing technique for reconstructing in certain contexts a relatively complete signal from relatively sparse measurements.)
• Behavioral inference. Computers increasingly can infer meaning from data that originate from people, whether such data take the form of physical gestures, words, pictures, and so on. Even today, software can scan e-mail (for example) and make inferences about one’s schedule and travel plans. Microsoft’s Kinect uses multiple cameras to observe a human’s movements and gestures, along with specialized software that interprets those gestures. Kinect has also been used in a number of applications, including the use of gestures to direct the music of a computerized orchestra and enabling a small drone to avoid obstacles in its immediate surroundings.5 Other commercial applications have emerged: helping shoppers find the right size of clothing; assisting drivers in parallel parking; spotting suspicious human behavior in a casino. Intent-inferring technologies may be able to assist in situations relevant to national security as well—recognizing when a person is about to give a package to another person, when someone is pulling out a gun, or what events are being planned from a trail of e-mail.
4 John Wright, Allen Y. Yang, Arvind Ganesh, S. Shankar Sastry, and Yi Ma, “Robust Face Recognition via Sparse Representation,” IEEE Transactions on Pattern Analysis and Machine Intelligence 31(2):210-227, 2009.
5 Rob Walker, “Freaks, Geeks and Microsoft,” New York Times, May 31, 2012, available at http://www.nytimes.com/2012/06/03/magazine/how-kinect-spawned-a-commercial-ecosystem.html.
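The compressive-sensing example above rests on the idea that a signal known to be sparse can be pinned down by far fewer measurements than its length. The toy below conveys only that principle: real compressive sensing uses L1-minimization or greedy solvers rather than brute force, and the measurement matrix here is invented for illustration.

```python
# Toy illustration of the idea behind compressive sensing: a length-8 signal
# known to have exactly one nonzero entry is recovered from only 3 linear
# measurements. (Real compressive sensing uses L1-minimization or greedy
# solvers; this brute-force search shows the principle only.)

A = [  # 3 x 8 measurement matrix (made-up fixed entries)
    [1, 2, 0, 1, 3, 1, 2, 1],
    [0, 1, 2, 2, 1, 3, 1, 1],
    [2, 1, 1, 0, 2, 1, 3, 2],
]

def measure(x):
    """Return the 3 linear measurements y = A x."""
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def recover_1sparse(y, n=8):
    """Find the 1-sparse signal x (one nonzero entry) consistent with y = A x."""
    for pos in range(n):
        col = [row[pos] for row in A]
        ref = next(i for i, c in enumerate(col) if c != 0)
        value = y[ref] / col[ref]          # candidate nonzero value
        if all(abs(c * value - yi) < 1e-9 for c, yi in zip(col, y)):
            x = [0.0] * n
            x[pos] = value
            return x
    return None

true_x = [0, 0, 0, 0, 0, 7.0, 0, 0]   # sparse signal (one nonzero)
y = measure(true_x)                    # only 3 numbers are observed
```

Eight unknowns are recovered from three observations only because the sparsity assumption rules out almost all candidate signals, which is the same leverage that lets real systems reconstruct a usable image from a severely degraded one.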
Two of the most prominent application domains involving IT for military purposes—autonomous military systems and cyber weapons—are discussed in Chapter 3.
Information technology alters many traditional concepts and activities by separating out and amplifying the information dimensions of such concepts and activities. IT is often used in situations and problem domains for which there is no accepted law, policy, or ethical stance. Moreover, these situations and problem domains themselves evolve and change at a very rapid rate. To understand what ethical behavior is when IT is involved, traditional principles of ethics are relevant but often not sufficient by themselves, and considerable interpretation and analogical thinking are needed to understand how those principles apply in any given situation.6
In a civilian context, some of the ethical, legal, and societal issues raised with IT concern privacy; intellectual property; accountability; trust; loss of control; and software dependability, including safety and reliability. In a military or national security context, each of these issues can sometimes play out differently than it might in a civilian context.
In the United States, many individuals place a significant value on privacy, especially privacy against government intrusion.7 Privacy is often an issue in the context of certain national security applications of IT. When contemplating the use of some IT application against an adversary, it is not so much the privacy rights of adversaries at issue (they have few or none), but rather the possibility that a given application may compromise the privacy rights of innocent individuals (that is, ordinary citizens).
6 These ideas are explored in two papers written in 1985 and 1998 by James Moor (a member of the committee): James H. Moor, “What Is Computer Ethics?,” pp. 266-75 in Computers and Ethics, Terrell Ward Bynum, ed., Blackwell Publishers, Ltd., 1985, published as the October 1985 issue of Metaphilosophy; and James H. Moor, “Reason, Relativity, and Responsibility in Computer Ethics,” Computers and Society 28(1):14-21, March 1998.
7 National Research Council, Engaging Privacy and Information Technology in a Digital Age, James Waldo, Herbert S. Lin, and Lynette I. Millett (eds.), The National Academies Press, Washington, D.C., 2007.
Much of the tension regarding privacy and national security applications of IT focuses on managing the tradeoff between the intended security benefits of an IT application and the unintended “collateral damage” to the privacy of innocent citizens.
To the extent that privacy exists as an enforceable right, privacy rights of individuals have been protected in the past both by law and by the practical difficulty of finding certain kinds of personal information. However, information technology reduces the practical difficulties of finding information, and much of what might have previously been hard to learn about an individual can in fact be learned by analyzing large amounts of data that reside in a number of different places. Protecting privacy through obscurity is increasingly difficult.
Considering the big data applications described above, one might note that with compressive sensing, the task of automating facial recognition in noisy environments (e.g., where cameras might not be able to obtain unobstructed images) will become easier. Compressive sensing would thus be an important component of a system capable of tracking the public movement of individuals on a large-scale basis. Intent detection potentially turns innocent movements into suspicious events, perhaps unjustly singling out individuals for examination and possible detention. Predictive analysis thus raises privacy concerns, because it requires the collection of data about an individual’s behaviors and history to make inferences about that person’s intent when he or she does something anomalous. Furthermore, privacy concerns—which themselves may evolve as people become more familiar with new technologies—may be accentuated if or when individuals improperly suffer negative consequences (e.g., arrest, loss of jobs) because putatively private information is revealed.
An extended discussion of privacy impacts of information technology can be found in the 2007 National Research Council report Engaging Privacy and Information Technology in a Digital Age.8
With modern information technology, the cost of replicating digital property (sometimes also known as digital objects) is essentially zero. Replications of digital property can be perfect, unlike replications of material property. These two aspects of digital property upend many traditional understandings of property, such as ownership, that have been developed primarily for property manifested as tangible objects consisting of arrangements of atoms.
8 National Research Council, Engaging Privacy and Information Technology in a Digital Age, 2007.
Traditional concepts associated with intellectual property may have to be modified (and in many cases simply recognized as inapplicable) when “property” is manifested as arrangements of binary digits (bits). For example, information “objects” such as data files are much more easily transported than physical objects. Although much more convenient to store and search, information objects are also much easier to misappropriate—and in the civilian world, a wide range of economic and societal interests have a stake in striking the right balance between how to protect and how to provide access to information objects.
Issues of intellectual property protection have also become important for national security in three ways:
• The use of various IT applications to create, manage, and store digitally represented intellectual property of all kinds has proliferated tremendously in the past 50 years—and so have the opportunities for misappropriation of such property. In this context, intellectual property is construed broadly to include product information, software, business plans, proprietary R&D, and economic forecasts—and when competitors are able to misappropriate such information, individual U.S. firms can be placed at a significant disadvantage. In recent years, the scale of the problem has expanded in such a way that the inability to keep such intellectual property secure and confidential is no longer just an issue for individual companies but has also become a national security concern because of how it threatens U.S. economic leadership and primacy.
• The misappropriation of intellectual property specifically related to national security (e.g., weapons blueprints and specifications, military plans, and so on) creates direct risks to national security. Adversaries may learn of vulnerabilities in U.S. weapons or operational procedures, may be able to anticipate U.S. military moves, and so on—all such information in the wrong hands constrains the freedom of action that is otherwise enjoyed by U.S. military forces.
• Adversaries exploring the IT systems and networks controlling critical infrastructure facilities could acquire certain kinds of intellectual property (e.g., facility configurations, communications links between parts of a plant, and so on) that would help them to attack these facilities.
Today’s computers can process inputs and then take different actions based on the specific inputs received. In common parlance and understanding, such computers are making decisions—choosing between alternative courses of action. Information technology underlies increasing automation of many functions previously delegated to people,9 but today and more so in the future, computers will make decisions that have traditionally been made by responsible humans in positions of authority. This phenomenon is not limited to civilian systems, and there are many pressures today toward increasing the role of computer-based decision making in operational military scenarios, especially those that involve highly compressed timelines.
Notions of accountability and responsibility, as applied to individuals, have focused on the ability of humans to make appropriate decisions under various circumstances. How and to what extent, if any, are such notions applicable to computers? This question is especially complicated in light of three facts: humans program computers (or program computers to program other computers); an IT system is sometimes so complex that no single individual can have a complete understanding of it; and users of such programs often have less understanding of the program than do the creators. In a military context, such facts call into question traditional notions of command and accountability, and thus the organizational structures built around these notions.
Many human relationships (e.g., commercial relationships) are built on trust. But trust relationships can be difficult to establish at a distance, and a great deal of information technology is used to enable connections at a distance. Information technologists have developed a wide variety of mechanisms for developing and enhancing trust, which in this context refers in part to assurances that an asserted identity does indeed correspond to an actual identity. Personal trust that depends on face-to-face interaction cannot be fully accommodated by technologies that connect individuals over long physical distances.
Even so, some of the limitations of long-distance interaction can be mitigated by technical improvements, such as increased bandwidth. Larger bandwidth is an enabler for video and audio connections with higher fidelity, making it easier for individuals on both ends of a connection to see and hear subtleties in the expressions of their counterparts. Reputations and social networks can also facilitate the establishment of trust. For example, John may assert that X is true. I may not know John, but if I know that he is friends with and trusted by Bob and Mary, whom I know well and trust, then I might infer with greater accuracy that John is trustworthy than I could in the absence of my own connections to Bob and Mary.
9 For example, the World War II Baltimore-class cruiser (CA-68) displaced 13,600 tons and carried a crew ranging from 1650 to 1950 individuals. By contrast, the planned DDX Zumwalt-class destroyer (DDG-1000) is expected to displace approximately 14,500 tons and carry a crew of 140 individuals.
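The reputational reasoning just described can be sketched as a two-hop trust computation. The names, graph, and numeric trust values (on a 0-to-1 scale) below are hypothetical, and real reputation systems are considerably more elaborate:

```python
# Toy two-hop trust inference: my confidence in a stranger's trustworthiness
# is estimated as (my trust in a mutual friend) * (that friend's trust in the
# stranger), taking the best such path. All names and values are hypothetical.

my_trust = {"Bob": 0.9, "Mary": 0.8}   # people I know and trust directly
their_trust = {                         # whom my friends trust, and how much
    "Bob": {"John": 0.7},
    "Mary": {"John": 0.9},
}

def inferred_trust(stranger):
    """Best two-hop trust estimate for someone I do not know directly."""
    scores = [my_trust[friend] * their_trust.get(friend, {}).get(stranger, 0.0)
              for friend in my_trust]
    return max(scores, default=0.0)

score = inferred_trust("John")   # best of 0.9*0.7 and 0.8*0.9
```

Multiplying along the path captures the intuition that trust attenuates with each hop: a strong endorsement from a weakly trusted friend counts for less than the same endorsement from a strongly trusted one.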
In an operational military context, consider that trust is an essential element that binds commanders and the troops that they command. In many instances, command relationships cannot be reduced simply to superiors passing orders to subordinates and subordinates passing information to superiors. Commanders need to know, for example, that a subordinate is apprehensive about an upcoming operation—and information technology systems for command and control that do not allow for direct, unmediated communication between commander and subordinate may well be less effective operationally than systems that do.
Loss of Control
The possibility of excessive automation leading to a loss of human control in nuclear weapons systems has been particularly problematic. Much of nuclear strategy has focused on ensuring retaliation against an adversary, regardless of what that adversary might attempt to do. A “launch on warning” strategy—rejected by most strategists as being too risky—was based in part on the idea that a largely automated system of sensors could provide highly reliable warning about a nuclear attack in progress and thus enable nuclear missiles to be launched before they were destroyed on the ground.
According to a 2007 NRC report, a system is dependable when users can rely on it to produce the consequences for which it was designed, and no adverse effects, in its intended environment.10 Although information technology hardware has been characterized for several decades by exponential growth in its sophistication, advances in software technology and in the corresponding ability to build complex networked computer systems have come much more slowly. Today, it is a given that any complex computer system will not be entirely dependable under all possible circumstances of operation.
The 2007 NRC report Software for Dependable Systems argues that demonstrating software dependability is essentially a social process—that a developer must convince the user of such software that the software is dependable, using both technical and nontechnical evidence. A software system should be regarded as dependable only if the developer has made a credible case for its dependability, which includes a compilation and presentation of relevant evidence that the software behaves as it is expected to behave. Further, the level of dependability required for any given software system is not a technical matter alone, but is determined instead by a mix of factors, some of which are societal (and sometimes ethical) in nature. As one example, software developers may have to make tradeoffs between increased software functionality and the increased difficulty of making an adequate case for the software’s dependability.
10 National Research Council, Software for Dependable Systems: Sufficient Evidence?, The National Academies Press, Washington, D.C., 2007.
The DOD has special needs in software, such as the need for software dependability in the presence of highly sophisticated adversaries; manageability of the complex architectures needed to fulfill mission requirements; criticality with respect to safety, availability, and responsiveness; and overall complexity and scale.11 Thus, software dependability is particularly significant in a military context.
As an illustration, consider a 1998 computing failure aboard the USS Yorktown, an Aegis cruiser designated as an information technology testbed for the U.S. Navy, that disabled all onboard propulsion systems.12 Such a glitch, occurring in the midst of battle, could well have had catastrophic consequences.
As is true of much research in genetic engineering and recombinant DNA, research in synthetic biology is in general concerned with the design and construction of biological systems not found in nature. Synthetic biology and these other approaches to construction of new systems offer the hope of new drugs, materials, and fuels. They may also lead to the creation of new organisms with dangerous properties that might be harmful to the public and/or the environment. In addition, adversaries have pursued biological weapons for use against the United States, despite international agreements prohibiting the development and use of biological weapons.13
11 National Research Council, Critical Code: Software Producibility for Defense, The National Academies Press, Washington, D.C., 2010.
13 For example, the Director of Central Intelligence testified to the Senate Select Committee on Intelligence on February 6, 2002, that “documents recovered from al-Qa’ida facilities in Afghanistan show that Bin Laden was pursuing a sophisticated biological weapons research program.” See https://www.cia.gov/news-information/speeches-testimony/2002/senate_select_hearing_03192002.html.
Synthetic biology, as a member of a family of genetic engineering technologies, thus implicates many of the ethical, legal, and societal issues that arise in the context of such technologies.
A field with a coherent set of research objectives and methodologies, synthetic biology uses design principles from engineering, such as standardization, decoupling, and abstraction, to understand, take apart, rebuild, and construct new biological systems.14 Synthetic biologists are working to construct and catalog a set of biological components with known and predictable properties and performance qualities. When assembled on a “chassis” into a functional cellular or acellular “machine,” these standard biological parts then are expected to act and interact predictably, even when used in varying combinations, thus reducing the cost of designing new biological systems.
The cost of the technological infrastructure needed to conduct serious work in synthetic biology—technologies for DNA sequencing and synthesis—has followed an exponentially decreasing cost curve similar to Moore’s law (although with different time constants). Consequently, technological capabilities for such work are much more widespread than ever before.
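The compounding effect of such a cost curve can be made concrete with a simple exponential-decay model. The sketch below is illustrative only: the $1.00-per-base starting cost and the 2-year halving time are assumed for the sake of the example, not measured industry figures.

```python
# Sketch of an exponentially declining cost curve, Moore's-law style.
# The $1.00 starting cost and 2-year halving time are illustrative
# assumptions; real DNA sequencing/synthesis costs and time constants
# differ (and have at times fallen faster than Moore's law).

def projected_cost(c0, halving_years, years_elapsed):
    """Cost after years_elapsed, given an assumed halving time."""
    return c0 * 0.5 ** (years_elapsed / halving_years)

# After 14 years (7 halvings), a $1.00 cost falls to 1/128 of its
# starting value -- under a penny in this toy scenario.
print(projected_cost(1.00, 2.0, 14))  # 0.0078125
```

The point of the sketch is only that a fixed halving time compounds quickly: seven halvings cut costs by roughly two orders of magnitude, which is why the technological capabilities for such work spread so rapidly.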
A major goal of synthetic biology is the construction of “minimal cells” possessing only the genetic program necessary to sustain essential cellular functions.15 In a minimal cell, the functional redundancy and complexity arising from the long evolutionary history of natural organisms might be eliminated through reverse engineering. In fact, a synthetic minimal cell need not be built from the same “parts” as natural cells at all. For example, the genetic instructions encoded in a product could be entirely different from natural genetic codes. Downstream, the instructions could specify the assembly of a protein from custom amino acids that do not occur in natural systems. Such a product could then serve as a cellular chassis to which genetic applications could be added, for example to produce a hydrocarbon or an enzyme of choice.
In 2010, Science published a paper from the J. Craig Venter Institute describing the construction of the first self-replicating, synthetic bacterial
14 Steven A. Benner and Michael A. Sismour, “Synthetic Biology,” Nature Reviews Genetics 6(7):533-543, 2005.
15 Anthony C. Forster and George M. Church, “Toward Synthesis of a Minimal Cell,” Molecular Systems Biology 2:45, 2006.
cell.16 The institute reported the synthesis, assembly, cloning, and successful transplantation of the 1.08 million base pair Mycoplasma mycoides JCVI-syn1.0 genome to create a new cell controlled by this synthetic genome and capable of replication. In the words of an accompanying press release, the synthetic cell provided “the proof of principle that genomes can be designed in the computer, chemically made in the laboratory and transplanted into a recipient cell to produce a new self-replicating cell controlled only by the synthetic genome.”17
By late 2011, another group of scientists, working with yeast chromosomes that are larger, more complex, and harder to synthesize than bacterial chromosomes, announced that they had replaced all of the DNA in one “arm” of a yeast chromosome with synthetically produced, computer-designed DNA that is structurally distinct from the original, producing a healthy yeast cell.18
Such advances, coupled with federal and private investments in research and development, are helping synthetic biology to develop into an ever more promising field. The many potential applications of synthetic biology include production of pharmaceuticals and biofuels, specialty chemicals and enzymes, and customized synthetic DNA sequences as well as minimal cell chassis. The real and/or perceived efficacy of the synthetic biology paradigm for these applications has led to the growth of a new bioengineering sector. The global market for this synthetic biology sector was $1.6 billion in 2011 and is forecast to exceed $10 billion within 5 years.19
Nevertheless, at the time of this writing, synthetic biology has yielded few commercially viable products, and it is fair to say that synthetic biology is not a mature technology. However, given that the barriers to entry for R&D in synthetic biology are so low, the field may mature quite rapidly and unexpectedly.
16 Daniel G. Gibson et al., “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome,” Science 329(5987):52-56, 2010, available at http://www.sciencemag.org/content/329/5987/52.full.
18 Jessica S. Dymond, Sarah M. Richardson, Candice E. Coombes, Timothy Babatz, Heloise Muller, Narayana Annaluru, William J. Blake, Joy W. Schwerzmann, Junbiao Dai, Derek L. Lindstrom, Annabel C. Boeke, Daniel E. Gottschling, Srinivasan Chandrasegaran, Joel S. Bader, and Jef D. Boeke, “Synthetic Chromosome Arms Function in Yeast and Generate Phenotypic Diversity by Design,” Nature 477(7365):471-476, 2011.
19 John Bergin, “Synthetic Biology: Emerging Global Markets,” BCC Research, November 2011.
In September 2011, DARPA issued a broad agency announcement (DARPA-BAA-11-60) soliciting innovative research proposals to develop new tools, technologies, and methodologies to transform biology into an engineering practice. The resulting Living Foundries program is intended to revolutionize manufacturing by enabling the rapid development of previously unattainable technologies and products. In 2012, DARPA awarded $15.5 million to six organizations to carry out research projects intended eventually to create new on-demand production capabilities, thus providing the military with access to “new materials, novel capabilities, fuel and medicines.”20
Many of the civilian applications imagined for synthetic biology would be useful to the military as well. The Presidential Commission for the Study of Bioethical Issues identified several broad application domains for synthetic biology: renewable energy sources, health care, food and agriculture, and environmental remediation.21
• Renewable energy sources. Synthetic biology researchers hope to develop organisms that can produce alcohols, oils, and hydrogen gas, all of which can be used for fuel. The U.S. military is a prodigious user of fuel and would benefit from technologies that could help to secure its access to sources of such fuels.
• Health care. Synthetic biology researchers hope to develop the means for improved production of drugs and vaccines, advanced mechanisms for personalized medicine, and novel, programmable drugs and devices for prevention and healing. Again, the U.S. military provides a very large volume of health care services, both for active duty personnel and for their families, and improvements in health care technology will have a significant effect on the services thus provided. In addition, the U.S. military has specialized medical needs, because it must cope with a variety of injuries and ailments that are not common among civilians. As-yet-unimagined applications of synthetic biology may provide new treatments for such conditions.
• Food and agriculture. Synthetic biology researchers hope to develop crops that produce higher yields, are more disease-resistant, or have higher levels of food-grade protein. To the extent that troops in the field have specialized nutritional needs, synthetic biology may be able to speed the development of foods that are better able to meet these needs.
21 Presidential Commission for the Study of Bioethical Issues, New Directions: The Ethics of Synthetic Biology and Emerging Technologies, Washington, D.C., December 2010.
• Environmental remediation. Synthetic biology researchers have focused on developing organisms capable of performing certain clean-up functions, such as the digestion of oil slicks and the removal of heavy metals from soil. A military application of clean-up organisms might be the removal of nerve gas residues from contaminated surfaces or the use of enzymes that can neutralize nerve agents if the human body is exposed to them.
Many of the ELSI concerns raised by synthetic biology are quite similar to those raised earlier in considerations of recombinant DNA technology—R&D on both technologies seeks to create biological entities that are not found in nature. In both cases, these issues involve safety construed broadly (applications of synthetic biology or recombinant DNA getting out of control or harming the environment), undesirable side effects when such applications are used, and malicious use.22
What sets synthetic biology apart from other technologies developed with similar intent is the approach it takes to creating these new biological entities. Modularization of biological components with predictable behavior is intended to make creation of such entities easier, less expensive, and more reliable. These properties are expected to enable a broad spectrum of work in synthetic biology—much broader than what might be possible in the absence of these properties.
Recognizing the ethical and societal issues that might arise from its investment in synthetic biology, DARPA in 2011 created an advisory committee, modeled after its Privacy Panel, to advise the Living Foundries program staff. Members of the advisory committee receive compensation from DARPA and are leading authorities in diverse fields including ethics, biosecurity, intellectual property, and environmental risk and regulation. The advisory committee reviews all proposals and highlights potential areas of concern, which may include how the research is conducted and disseminated as well as how the research might be used.23 Additional discussion of the advisory committee is provided in Chapter 7.
The discussion below of ethical, legal, and societal issues draws heavily on two sources: a 2009 report from the Hastings Center and the Woodrow Wilson Center titled Ethical Issues in Synthetic Biology: An
22 See, for example, Jonathan Tucker and Raymond Zilinskas, “The Promise and Perils of Synthetic Biology,” The New Atlantis 12(Spring):25-45, 2006, available at http://www.thenewatlantis.com/publications/the-promise-and-perils-of-synthetic-biology.
23 Conversation with Alicia Jackson, DARPA, Ken Oye, and Anne-Marie Mazza, June 25, 2012.
Overview of the Debates24 and the 2010 report New Directions: The Ethics of Synthetic Biology and Emerging Technologies,25 issued by the Presidential Commission for the Study of Bioethical Issues. (Shortly after the announcement by the Venter Institute of its successful construction of a synthetic bacterial cell, President Obama asked the Commission to review the emerging field of synthetic biology and to address the ethical issues associated with this new field so as to maximize public benefits and minimize risks.)
Environmental and Safety Risks
As with other genetic engineering technologies, synthetic biology raises concerns about how new biological entities will interact with and affect human beings and the natural environment:
• Engineered microbes introduced into the human body may trigger unanticipated adverse effects, such as infections or unexpected immune responses, or may displace the natural microbiome.
• New organisms that escape into the environment may pose novel risks resulting from their potential to reproduce or evolve. Such organisms may alter the ecology of areas into which they are inadvertently introduced, affecting local food webs and perhaps displacing natural species, including animals and plants as well as microbes. In addition, because organisms produced by synthetic biology may have entirely novel genetic makeups, they may have altered rates of evolution and may adapt to new environments in unpredictable ways. Synthetic organisms may transfer one or more engineered genes to naturally occurring species, with unknown and perhaps irreversible consequences.26
• In the case of engineered organisms for the production of renewable energy, concerns arise from the need to dedicate large amounts of land and other natural resources to the production of biomass as feedstock for biofuels. Such use could crowd out other uses of land, affecting food production, communities, and current ecosystems.
Furthermore, because the evolutionary or ecological history of a
24 The full report can be found at http://www.synbioproject.org/process/assets/files/6334/synbio3.pdf.
26 Genya V. Dana, Todd Kuiken, David Rejeski and Allison Snow, “Four Steps to Avoid a Synthetic Biology Disaster,” Nature 483:29, 2012; Markus Schmidt, Agomoni Ganguli-Mitra, Helge Torgersen, Alexander Kelle, Anna Deplazes, and Nikola Biller-Andorno, “A Priority Paper for the Societal and Ethical Aspects of Synthetic Biology,” Systems and Synthetic Biology 3(1-4):3-7, 2009.
novel organism will likely be incompletely known or entirely nonexistent, risks of escape and contamination may be extremely difficult to assess in advance.
ELSI concerns in this category appear to relate to both civilian and military applications of synthetic biology equally.
Humanity and the Sanctity of Life
Different religious groups may have different answers to the question of whether there is an inherent sanctity of life or of living systems, and whether this sanctity is violated by the construction of novel life-forms. The Wilson Center report addresses this issue under the heading of “nonphysical” harms, which are primarily “concerns about the appropriate attitude to adopt toward ourselves and the rest of the natural world.”27
The report notes that these concerns involve “the possibility of harm to deeply held (if sometimes hard-to-articulate) views about what is right or good, including … the appropriate relationship of humans to themselves and the natural world.”
Further, the Wilson Center report argues, many people disagree about “whether a particular activity threatens these values, how we should reduce nonphysical harm, who should be responsible and what may be sacrificed along the way…. We do not always agree about what counts as a nonphysical harm, because we disagree about what is human well-being … [and this is because we embrace] different ethical frameworks.”
The Wilson Center report cites work by Boldt and Müller28 as the most ambitious attempt to date to articulate these concerns in the synthetic biology literature. Boldt and Müller argue that
if we begin to create lower forms of life and to think of them as “artifacts” (as researchers in synthetic biology propose), then we “may in the (very) long run lead to a weakening of society’s respect for higher forms of life.” That is, if we continue down this road, we risk undermining our respect for animals and, ultimately, humans as they naturally occur. They [Boldt and Müller] also argue that when creatures like us adopt the attitude of creators, we are making a category mistake—a mistake about the sorts of beings we really are. Less self-conscious, nonacademic authors would have used an unfashionable phrase about “playing God” to describe this mistake.
As in the earlier category of environmental and safety risks, ELSI
28 Joachim Boldt and Oliver Müller, “Newtons of the Leaves of Grass,” Nature Biotechnology 26(4):387-389, 2008.
concerns related to humanity and the sanctity of life appear to relate to both civilian and military applications of synthetic biology equally.
New Adversary Threats
All of the risks described above are framed as inadvertent and unintentional. But some biological research conducted in the 2000s—including the laboratory creation of infectious polio virus,29 the creation of a cell with a synthesized mycoplasma genome,30 the re-creation of the 1918 strain of influenza virus,31 and the creation of a highly transmissible avian flu32—led to concerns that an adversary could have undertaken
29 Jeronimo Cello, Aniko V. Paul, and Eckard Wimmer, “Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template,” Science 297(5583):1016-1018, 2002. One member of the research team argued that the experiment demonstrated the risk of further viruses being created from just their genetic code—by bioterrorists, for example. See http://www.nature.com/news/2002/020712/full/news020708-17.html.
30 Daniel G. Gibson et al., “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome,” Science 329(5987):52-56, 2010. Concerns were raised about bioterrorism and environmental disaster, as discussed in http://www.jyi.org/issue/synthetic-biology-an-era-of-promised-uncertainty/.
31 Terrence M. Tumpey et al., “Characterization of the Reconstructed 1918 Spanish Influenza Pandemic Virus,” Science 310(5745):77-80, 2005. On October 17, 2005, in a New York Times op ed, Ray Kurzweil and Bill Joy, both information technologists, called the publication of this paper a “recipe for destruction” and characterized the genome of the virus as the design of a weapon of mass destruction whose realization would be easier than that of an atomic bomb. See http://www.nytimes.com/2005/10/17/opinion/17kurzweiljoy.html.
32 Masaki Imai et al., “Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets,” Nature 486:420-428, 2012; Sander Herfst et al., “Airborne Transmission of Influenza A/H5N1 Virus Between Ferrets,” Science 336(6088):1534-1541, 2012. In the lead-up to publication, the National Science Advisory Board for Biosecurity (NSABB) expressed security concerns to two journals, Science and Nature, about unrevised versions of these manuscripts, and requested that these papers be published only with the redaction of certain “experimental details and mutation data that would enable replication of the experiments.” These papers demonstrated the isolation of highly pathogenic avian H5N1 viruses that were capable of aerosol transmission between mammals. This research went through both scientific peer review and programmatic review at the NIH, as well as review by local institutional biosafety committees; none of these reviews were designed to consider ethics apart from issues of safety or misuse. It was not until the manuscripts were submitted for publication that any concerns arose, and even then these were largely about biosafety and biosecurity, the fear being that accidental or deliberate release of an agent with >50 percent mortality could cause a severe pandemic. Their authors subsequently submitted revised manuscripts (cited above), and the papers were published in full with the support of the NSABB. The NSABB cited two reasons for its reversal. First, it noted that “[t]he data described in the revised manuscripts do not appear to provide information that would immediately enable misuse of the research in ways that would endanger public health or national security,” and
these experiments with the deliberate intent of creating harmful organisms, even though these experiments were not in fact performed with any harmful intent. To varying degrees, these experiments used traditional recombinant DNA techniques, although arguably some used techniques from synthetic biology when they employed synthesized DNA.
As in the previous category, ELSI concerns in this category appear to relate to both civilian and military applications of synthetic biology equally. Nonetheless, the notion of adversary threats based on synthetic biology is relevant to national security.
Impact of Classification
A recommendation of the President’s Commission was that the federal government should start to coordinate and oversee agency activities in synthetic biology.33 It called for no new oversight function at that time but rather recommended that the government stay abreast of any major advances in the field, especially those that offer potential benefits and risks to the public.
But the commission was not charged specifically with addressing the oversight of classified research in synthetic biology, should any such research be contemplated. (The committee does not know of classified research in synthetic biology, but it undertook its information-gathering efforts in an entirely unclassified environment.) Some of the issues that arise when research is classified include the degree of coordination that is feasible when there may be different levels of secrecy associated with the research, and how to establish effective oversight in these environments. Staying abreast of developments and the associated benefits and risks can also be difficult because the research, by definition, is shielded from public view.
ELSI concerns related to classification appear to relate primarily to military applications of synthetic biology.
The term “neuroscience” refers to the interdisciplinary study of the nervous system. The Society for Neuroscience describes neuroscience as
it cited new evidence “that understanding specific mutations may improve international surveillance and public health and safety.” See http://oba.od.nih.gov/oba/biosecurity/PDF/NSABB_Statement_March_2012_Meeting.pdf. More recently, however, there has been a call to broaden the discussion about this type of gain-of-function experiments to include ethics. See Simon Wain-Hobson, “H5N1 Viral-Engineering Dangers Will Not Go Away,” Nature 495(7442):411, 2013.
the entire range of scientific research endeavors aimed at understanding the nervous system and translating this knowledge to the treatment and prevention of nervous system disorders. It fosters the broad interdisciplinarity of the field that uses multiple approaches (e.g., genetic, molecular, cellular, anatomical, neurophysiological, system, comparative, evolutionary, computational, and behavioral) to study the nervous system of organisms ranging from invertebrates to humans across various stages of development, maturation, and aging.34
In its 2008 report Emerging Cognitive Neuroscience, the National Research Council describes neuroscience as “includ[ing] the study of the central nervous system and somatic, autonomic, and neuroendocrine processes,” and defines the term “cognitive” as covering “psychological and physiological processes underlying human information processing, emotion, motivation, social influence, and development…. It [neuroscience] includes contributions from behavioral and social science disciplines as well as contributing disciplines such as philosophy, mathematics, computer science, and linguistics.”35
Modern neuroscience is thus an interdisciplinary field that combines new knowledge of molecules, cells, neural circuits, and cognition; is allied with clinical medicine; and uses methodologies of mathematics, molecular biology, genomics, neuroendocrinology, neuroimaging, and the social and behavioral sciences. Some important achievements and ongoing goals of neuroscience are the mathematical modeling of systems of electrical signals and of electrochemical transmission from one neuron to another via synapses, and of the ways that brain cells store memories.
Acknowledging the importance of this emerging field, both the United States and the European Union have launched large-scale science programs in neuroscience. In April 2013, the Obama Administration committed $100 million in the FY 2014 budget to the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative.36 The White House fact sheet on this initiative notes that its ultimate aim is to “help researchers find new ways to treat, cure, and even prevent brain disorders, such as Alzheimer’s disease, epilepsy, and traumatic brain injury.” Further, the fact sheet says, the initiative will
accelerate the development and application of new technologies that will enable researchers to produce dynamic pictures of the brain that show how individual brain cells and complex neural circuits interact at the
34 Society for Neuroscience, Strategic Plan, available at http://www.sfn.org/index.aspx?pagename=strategicPlan. Last updated September 30, 2010.
35 National Research Council, Emerging Cognitive Neuroscience and Related Technologies, The National Academies Press, Washington, D.C., 2008.
speed of thought. These technologies will open new doors to explore how the brain records, processes, uses, stores, and retrieves vast quantities of information, and shed light on the complex links between brain function and behavior.
In January 2013, the European Union announced that as part of its effort to advance future and emerging technologies, it was proposing to devote €1 billion over 10 years to the Human Brain Project,37 which is intended to create the world’s largest experimental facility for developing the most detailed model of the brain for “studying how the human brain works and ultimately to develop personalized treatment of neurological and related diseases.”
One measure of the field’s maturation is the growth in the annual number of neuroscience publications, which has increased by a factor of 8 to 10 over the past 20 years.38 In that period the membership of the Society for Neuroscience more than doubled, from 18,976 in 1991 to 42,576 in 2011, and annual meeting attendance increased from 16,447 in 1991 to 32,357 in 2011.39
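The growth figures cited above reduce to a compound annual rate with a line of arithmetic. The sketch below uses only the membership numbers quoted in the text; the resulting rate is a derived illustration, not a figure from the Society for Neuroscience.

```python
# Compound annual growth rate (CAGR) implied by the Society for
# Neuroscience membership figures cited above: 18,976 members in 1991
# growing to 42,576 in 2011.

def cagr(start, end, years):
    """Annualized growth rate over the given number of years."""
    return (end / start) ** (1.0 / years) - 1.0

factor = 42576 / 18976           # overall growth factor, about 2.24
annual = cagr(18976, 42576, 20)  # roughly 4.1% per year, compounded
print(f"{factor:.2f}x overall, {annual:.1%} per year")
```

A steady rate of about 4 percent per year, sustained for two decades, is what “more than doubled” amounts to, consistent with the report’s characterization of a steadily maturing field.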
Advances in the neuroscience of memory (with ramifications for some of the applications discussed below) provide one illustration of scientific progress in the field. The neuroscience of memory addresses the neurological processes for encoding information for storage and future retrieval. It is understood today that short-term memory resides in the hippocampus, encoded by the measurable strengthening or weakening of synapses (long-term potentiation and long-term depression). Components of memories are transferred to cortical structures, where they are consolidated into their long-term, stable, protein-synthesis-dependent form during sleep and rest. Neuroscience research using functional magnetic resonance imaging (fMRI) has demonstrated functional connections between the hippocampus and the medial prefrontal cortex. Genetic knock-out studies
38 This factor is derived from data extracted by the committee from the Web of Knowledge/Web of Science with the following query:
Topic=(neuroscience); Refined by: Research Areas=(NEUROSCIENCES NEUROLOGY) AND Document Types=(ARTICLE OR MEETING OR CASE REPORT OR ABSTRACT OR REFERENCE MATERIAL OR REPORT)
39 UN International Bioethics Committee, “Initial Reflections on the Principle of Nondiscrimination and Nonstigmatization,” Unesco.org, August 23, 2012, available at unesdoc.unesco.org/images/0021/002174/217421e.pdf.
in mice have found that memory depends on a wide variety of receptors, enzymes, and proteins.40
Possible applications of neuroscience can be divided roughly into two classes—those that help humans to recover normal functionality and those that help humans change normal functionality.
In the first category (recovery of normal functionality), humans sometimes lose neurological functionality through accident or birth defects. For example, boxers and football players are known to suffer neurological damage in playing their sports, as do people who are victims of car accidents. In a military context, traumatic brain injuries (incurred, e.g., as a result of soldiers being exposed to explosions) have been described as the “signature injury” of the wars in Iraq and Afghanistan,41 and advances in neuroscience may be able to help wounded soldiers recover from such injuries.
In the second category (changing normal functionality), neuroscience could be used to enhance or to diminish normal functionality. For example, through neuroscience-based applications, individuals might be able to operate equipment through a direct brain-machine interface rather than manipulating a joystick or typing commands on a keyboard. Workers in high-stress occupations, such as air traffic control, might be able to process larger amounts of information more quickly. Individuals with needs for the selective enhancement or inhibition of learning and memory might meet those needs with the administration of designer drugs based on neuroscience research. Antisocial tendencies of certain criminals, such as sexual offenders, could be diminished. Psychological traumas might be reduced for victims of abuse, torture, or other horrific events.
Enhancements of the types described in the previous paragraph have obvious military applications for soldiers operating weapons or commanders coordinating battles. Much more controversial from an ELSI standpoint are other proposals suggesting that false human memories can be created and different emotional states induced (e.g., reduced or increased fear, feelings of anger or calm) and that degrading the performance of adversaries in military contexts may be possible—applications that are generally not associated with civilian use.
40 For example, Ramirez et al. have demonstrated the insertion of false memories into mice. See Steve Ramirez et al., “Creating a False Memory in the Hippocampus,” Science 341(6144):387-391, 2013, available at http://www.sciencemag.org/content/341/6144/387.
Enhancement may be defined as raising performance above a physiological or statistical norm in healthy persons. For example, transcranial magnetic stimulation (TMS) may suppress the effects of sleep deprivation and enable individuals to perform above their baseline capability at specialized tasks, both of which would have obvious advantages for warfighters. Repetitive TMS (rTMS) might also serve to improve learning and working memory, for example, increasing the ability of an operative to speak a native dialect or to recall complicated instructions. Some believe that near-infrared spectroscopy could detect deficiencies in a warfighter’s neurological processes and feed that information into a device utilizing in-helmet or in-vehicle TMS to suppress or enhance individual brain functions, such as mood and social cognition. A 2009 National Research Council report titled Opportunities in Neuroscience for Future Army Applications recommended that the Army increase its investment in TMS research.42 That committee estimated the development timeframe for using TMS to enhance attention at 5 to 10 years, and for in-vehicle deployment at 10 to 20 years.
A different kind of cognitive enhancement involves mitigating the effects of sleep deprivation, which is the source of much error in civilian as well as military life. Historically, fatigue has been mitigated with stimulants such as cocaine, nicotine, and caffeine. More recently, amphetamines (“speed”) have gained popularity and have been used by both students and warfighters (especially, it seems, air force pilots) in the form of “go pills.” Modern pharmaceutical technologies may be entering new and somewhat more efficacious territory, with evidence that modafinil (originally approved for the treatment of narcolepsy) may reduce fatigue-related cognitive decline, and may even outperform methylphenidate (Ritalin) in healthy persons. Short-term memory enhancement may also be achieved through nasally delivered orexin-A, as shown in a DARPA-sponsored study of sleep-deprived monkeys.43
Neurological processes may be modified without the kind of open-skull neurosurgical procedures that have been so important in the history of neuroscience. For example, TMS uses electromagnetic induction to penetrate the skull and modulate the electrical activity of the cerebral cortex. Another method, transcranial direct current stimulation (tDCS), may be safer than TMS but is used less often.
42 National Research Council, Opportunities in Neuroscience for Future Army Applications, The National Academies Press, Washington, D.C., 2009.
43 S.A. Deadwyler et al., “Systemic and Nasal Delivery of Orexin-A (Hypocretin-1) Reduces the Effects of Sleep Deprivation on Cognitive Performance in Nonhuman Primates,” Journal of Neuroscience 27(52):14239-14247, 2007.
To perform TMS, a technician holds an iron-core insulated coil on one side of a patient’s head while a large, brief current is passed through the coil. The current generates a magnetic pulse that painlessly penetrates the layers of skin, muscle, and bone covering the brain and induces weak, localized electrical currents in the cerebral cortex. It is believed that the induced electrical field triggers the flow of ions across neuronal membranes and causes the cells to discharge, resulting in a chain reaction of neuronal interactions. TMS offers hope for individuals suffering from major depression, Parkinson’s disease, and treatment-resistant migraine headaches, and it is under investigation for the treatment of post-traumatic stress disorder. TMS has also helped to map brain circuitry and connectivity.
Neuroscience technologies are often “dual use,” having both military/counterintelligence and medical/scientific applications. Brain-computer interfaces, for example, can control prosthetic limbs and communication devices, and thus may benefit both patients and warfighters or other security personnel. These applications also illustrate technological convergence: during the past two decades, laboratory experiments have shown that simple movements of both rodents and nonhuman primates may be controlled, and that primates can be trained to manipulate robotic arms through neural activity alone.44 The same principle of remote control of a robotic prosthesis has been applied to human patients suffering from tetraplegia, by means of an implanted intracortical electrode array.
Technological refinements suggest that, for some purposes at least, brain-computer interfaces need not be invasive. In the past, electroencephalogram-sensitive caps, which help control artificial joints during rehabilitation, were expensive and also required the application of a gel. Recent designs for such caps dispense with the gel and are far less expensive. They are now being produced for commercial application to computer gaming, with the potential for control over environmental conditions like room lighting, door locks, and window shades. DARPA has been interested in new, noninvasive ways to gather neurological information to help adapt a pilot’s brain to inputs from a cockpit array, reducing “noise” and distraction for the operator depending on what information is required for specific circumstances. Similarly, the Cognitive Threat Warning System seeks to convert unconscious human neurological responses into usable information, as in a pair of binoculars
44 L.M. Dauffenbach, “Simulation of the Primate Motor Cortex and Free Arm Movements in Three-Dimensional Space: A Robot Arm System Controlled by an Artificial Neural Network,” Biomedical Sciences Instrumentation 35:360-365, 1999.
that cue the viewer to certain portions of the visual field.45 In time, a true feedback loop that also helps adjust the computer to the human user may also be practical.
Interventions intended as therapy may in some cases enhance normal function. Brain-computer interfaces that control advanced prostheses that render the user faster or stronger would be one example, although perhaps an exoskeleton would be a nearer-term example of the same phenomenon. Dual-use considerations apply to this technology, just as they would for drugs intended to enhance cognitive performance (such as methylphenidate—marketed as Ritalin—which is often believed to help academic performance).
Deception Detection and Interrogation
Traditional measures of deception have relied on physiological correlates of stress, such as blood pressure and heart and breathing rates, but these are at best physiological proxies of intentional deception. One system known as the “brain fingerprinter” uses an EEG measure to detect an event-related potential called the P300 wave, which is associated with the recognition of a stimulus, such as a photograph of a certain location of interest. Services based on functional magnetic resonance imaging are being offered by companies such as No Lie MRI and CEPHOS, which market their products to governmental and nongovernmental organizations.
A 2008 NRC report entitled Emerging Cognitive Neuroscience and Related Technologies stated that “traditional measures of deception detection technology have proven to be insufficiently accurate,” recommending that research be pursued “on multimodal methodological approaches for detecting and measuring neurophysiological indicators of psychological states and intentions….”46 The report cautioned, however, that neurological measurements do not directly reveal psychological states, and so there is a distinct risk of over-interpretation of results, leading to both false negatives and false positives.
Another possible approach to deception detection involves the brain hormone oxytocin, which has been shown to be associated with a wide variety of social impulses. In the laboratory, subjects exposed to oxytocin via the nasal route have behaved in a more trusting and generous manner. The National Research Council’s 2008 report on emerging neuroscience identified oxytocin as a “neuropeptide of interest.”47 However, the notion
46 National Research Council, Emerging Cognitive Neuroscience and Related Technologies, The National Academies Press, Washington, D.C., 2008.
47 National Research Council, Emerging Cognitive Neuroscience and Related Technologies, 2008.
that oxytocin could be useful in interrogation requires extrapolating from laboratory experiments conducted under highly specified conditions with subjects whose background and motivation differed from those of likely interrogation targets.
In addition to the potential for advances in neuroscience to enhance the performance of one’s own forces, these developments also offer possible opportunities to inhibit or reduce the performance of adversaries. At present, the primary focus for such efforts to support military missions and law enforcement goals—as well as applications in areas such as counterterrorism or counterinsurgency where the lines between the two domains are often blurred—is on so-called incapacitating chemical agents (ICAs). The ethical and societal issues associated with ICAs are discussed in Chapter 3; this section briefly introduces the relevant scientific and technological developments. A number of recent reviews have addressed S&T potentially relevant to ICAs.48
As an example of these technical reviews, a 2012 Royal Society report, part of a larger Brain Waves project on the implications of developments in neuroscience for society and public policy,49 identifies two particularly prominent areas of relevant research.50 These are neuropharmacology, which studies the effects of drugs on the nervous system and the brain, and advances in drug delivery methods. A number of pharmaceutical agents, which are primarily chemicals, have at least the theoretical potential to provide the basis for ICAs. Current research on ICAs tends to focus on agents that offer a combination of rapid-action and short-duration effects and thus on those that “reduce alertness and, as the dose increases,
48 International Committee of the Red Cross, “Incapacitating Chemical Agents: Implications for International Law,” Expert meeting, Montreux, Switzerland, March 24-26, 2010, available at http://www.icrc.org/eng/resources/documents/publication/p4051.htm; Stefan Mogl, ed., Technical Workshop on Incapacitating Chemical Agents, Spiez Laboratory, Federal Department of Defence, Civil Protection and Sports, DDPS, Federal Office for Civil Protection, Spiez, Switzerland, September 8-9, 2011, available at http://www.labor-spiez.ch/de/dok/hi/pdf/web_e_ICA_Konferenzbericht.pdf; Scientific Advisory Board, Organization for the Prohibition of Chemical Weapons, “Report of the Scientific Advisory Board on Developments in Science and Technology for the Third Special Session of the Conference of States Parties to Review the Operation of the Chemical Weapons Convention,” RC-3/DG.1, 2012, available at http://www.opcw.org/documents-reports/conference-states-parties/third-review-conference/; Royal Society, “Brain Waves Module 3: Neuroscience, Conflict, and Security,” Royal Society, London, 2012.
49 Information about the Brain Waves project is available at http://royalsociety.org/policy/projects/brain-waves/.
50 Royal Society, “Brain Waves Module 3: Neuroscience, Conflict, and Security,” 2012.
produce sedation, sleep, anaesthesia, and death.” Some of the classes of pharmaceutical agents under consideration are opioids, benzodiazepines, alpha2 adrenoreceptor agonists, and neuroleptic anaesthetics.51
In addition to these chemical agents, bioregulators—biochemical compounds that occur naturally and control vital functions such as temperature, heart rate, and blood pressure—have also been the subject of military research. Advances in the synthesis of bioregulatory peptides appear to offer the promise of overcoming some of the problems that have so far limited therapeutic applications, and they could potentially enable national security applications as well.
Advances in medical research are also yielding more effective means of delivering drugs into the central nervous system, including across the blood-brain barrier. With regard to ICAs, advances in aerosol delivery are of particular interest because inhalation seems the most plausible dissemination mode for military and law enforcement purposes. At the same time, nanotechnology is offering significant potential to provide more effective, targeted delivery to the brain. To date, however, with some exceptions for veterinary applications, the two streams of research have focused on delivering doses to individuals.52
A number of recent technical reviews have concluded that, in spite of the advances in several fields, the current state of S&T does not provide the basis for safe delivery of ICAs for law enforcement purposes, given all the challenges of delivering nonlethal doses in a variety of settings to groups that would vary by characteristics such as age, health status, and individual sensitivity to the chosen agent(s).53 In its report on S&T developments in advance of the third review conference of the Chemical Weapons Convention, the Scientific Advisory Board (SAB) of the Organization for the Prohibition of Chemical Weapons commented that “in
51 Morphine is the primary example of an opioid, but the search for novel agents with fewer side effects continues. Fentanyl, the agent reportedly used as part of the aerosol compound piped into the ventilation system to break the Moscow theater siege in October 2002, is an opioid. Benzodiazepines are used to treat anxiety and also as part of general anesthesia. Alpha2 adrenoreceptor agonists, which reduce alertness and wakefulness and can also increase the effects of local and general anesthesia, have been the subject of U.S. Army research as a potential ICA. Neuroleptic anesthetics are able to induce unconsciousness without significant effects on reflexes or muscle tone.
52 National Research Council, Life Sciences and Related Fields: Trends Relevant to the Biological Weapons Convention, The National Academies Press, Washington, D.C., 2011.
53 International Committee of the Red Cross, “Incapacitating Chemical Agents: Implications for International Law,” Expert meeting, Montreux, Switzerland, March 24-26, 2010, available at http://www.icrc.org/eng/resources/documents/publication/p4051.htm; Royal Society, “Brain Waves Module 3: Neuroscience, Conflict, and Security,” Royal Society, London, 2012; Michael S. Franklin et al., “Disentangling Decoupling: Comment on Smallwood (2013),” Psychological Bulletin 139(3):536-541, 2013.
the view of the SAB, the technical discussion on the potential use of toxic chemicals for law enforcement purposes has been exhaustive.”54 The associated ethical and societal issues related to military and law enforcement applications are taken up in Chapter 3.
Informed and Voluntary Consent to Use
The widely accepted moral principle of autonomy prohibits nonvoluntary neurotechnological interventions without informed consent or its moral equivalent. Nonetheless, it is clear that some feel impelled to accept such interventions regardless of the low likelihood that their personal goals would be realized. For example, there is little evidence that drug therapies for conditions like ADHD improve academic performance, although the off-label use of medications like Ritalin by college students surely has much to do with the notion that their performance might be improved.
The very term “human enhancement” could beg the question of the actual net benefits of claimed “enhancements.” Their social implications need to be examined on a case-by-case basis. Exaggerated claims about cognitive enhancement, or even accurate statements about short-term benefits, could lead to an increase in addictions due to competitive pressures. Differences in socioeconomic status related to contingent advantages like opportunities for acquiring new skills could be exacerbated by unequal access to enhancing technologies.
In the military, both competitive and coercive pressures are uniquely pronounced. In general, persons in uniform are required to accept interventions that commanders believe will maintain their fitness for duty or enable them to return to duty. In some circumstances, warfighters might even be required to accept medical interventions otherwise regarded as “experimental,” or at least not validated for a particular purpose, if there is a sound basis for believing that they could be of benefit if forces are threatened. A real-world example is described in Box 2.2.
As useful military technologies proliferate, including those that in some sense enhance normal cognitive functions, veterans may face the prospect of adjusting to civilian life without those advantages. The tragic
54 Scientific Advisory Board, Organization for the Prohibition of Chemical Weapons, “Report of the Scientific Advisory Board on Developments in Science and Technology for the Third Special Session of the Conference of States Parties to Review the Operation of the Chemical Weapons Convention,” RC-3/DG.1, 2012, p. 21, available at http://www.opcw.org/documents-reports/conference-states-parties/third-review-conference/.
experience of many returning veterans, especially those who have faced the stresses of combat, demonstrates that this adjustment is already difficult enough.
A separate but important issue concerns the proliferation of these technologies—in civilian life and in the likely access that unfriendly persons, groups, organizations, and nations will gain to them. Is the potential for gain in U.S. military capabilities sufficient to overcome these potential negative effects? Or is it likely that civilian access to these technologies will precede their presence in military contexts?
Longstanding, ill-defined but persistent worries and rumors about “brain-washing” and “mind control” will surely be reinforced by advances in neuroimaging, which is an excellent example of a technology that has both military and civilian applications. But do these advances raise valid privacy concerns? Besides issues of harm resulting from false positives and negatives, the extent to which brain imaging raises issues of privacy depends of course on the ultimate accuracy of the technology in revealing psychological states—and how such accuracy is perceived by users of the technology. Exaggerated notions of technological capacity can also have adverse social consequences, such as the premature admission of imaging data into courts of law. Constitutional barriers may also be insurmountable if these data are found to violate guarantees against self-incrimination or unreasonable search and seizure.
Privacy challenges are emerging in many fields, including genetics and information technology, and brain imaging may or may not create unique ethical or policy issues. Even relatively simple technologies currently claimed to improve on traditional “lie detector” results have limited accuracy, require a cooperative subject, and may not be more efficient (or more cost-effective) than a simple interview with a skilled interrogator.
ELSI concerns in this category appear to relate to both civilian and military applications of neuroscience. However, in a military or national security context, it is easy to imagine that such applications raise particular concerns when they are applied to innocent bystanders—as they would inevitably be in any kind of counterintelligence investigation.
The safety of neuroscience-based interventions, whether drugs or devices, is of course a threshold concern. For example, external neuromodulatory systems like tDCS and TMS are generally considered to present a low risk, but safety studies have generally been performed on healthy, normal subjects rather than persons with neurological or major psychiatric illnesses. There is a potential for seizures, although less than with conventional electroconvulsive therapy (ECT). However, the longer-term risks of repeated use of external neuromodulation are not known. The larger the populations exposed, the greater the likelihood of untoward results.
ELSI concerns in this category appear to relate to both civilian and military applications of neuroscience equally.
Box 2.2 Military Use in Combat of Drugs Not Approved by the Food and Drug Administration
The 1991 Gulf war raised a number of ethical and policy questions regarding the use of investigational new drugs (INDs)—drugs that have not yet received Food and Drug Administration (FDA) approval for use in particular applications but that are currently being investigated for such use—to defend troops against the possibility that they might be attacked by chemical and biological warfare agents. As a matter of policy, the Department of Defense (DOD) has complied with all FDA requirements concerning the development and use of new drugs, including the requirement to obtain informed consent before administering INDs to research subjects.
At the time of the Gulf war, two INDs were promising candidates for drugs to defend against certain chemical and biological warfare agents. To comply with FDA regulations, the DOD would have had to obtain informed consent for the use of these drugs from every service member deployed to the Persian Gulf. Allowing deployed troops to refuse drugs intended for their own protection could, however, have jeopardized the combat mission. Accordingly, the DOD requested that the FDA both establish authority to waive informed consent requirements and grant waivers for administration of those particular drugs. The FDA agreed that obtaining informed consent might not be feasible “in certain combat-related situations” and that withholding potentially life-saving INDs in such situations would be “contrary to the best interests of military personnel involved,” and subsequently granted the DOD the waivers it sought.1
This decision led to controversy, much of it focused on the difference between research (in which case informed consent must be obtained for administering a drug to research subjects) and treatment (in which case no such requirement obtains in a military context). Those opposed to the waivers argued that the use of any IND was, by definition, “research” because the consequences, risks, and benefits of use were unknown, and thus informed consent was required under all circumstances. They pointed to a long line of ethical guidelines, such as the guidelines in the Belmont report,2 which make no exception for waiving informed consent for research conducted under wartime conditions. They further argued that the mere intent to use an IND to provide medical benefit could not transform an experimental investigation into therapy—otherwise, researchers could simply change their stated intentions and redefine an experimental intervention as treatment, thereby evading informed consent requirements.
Proponents of the waivers argued that the DOD had an ethical responsibility to protect its service members to the greatest extent possible. During the Gulf war, the best protection the DOD could offer its personnel included use of the INDs in question. Proponents further argued that despite their status as “investigational,” the drugs were neither remarkably novel nor experimental in a scientific or medical sense because they had already been subjected to “extensive research”; one drug had also been approved for uses that were similar to those that the DOD proposed. Moreover, prior ethical guidelines had been written with human experimentation in mind, in which the outcome of the research was in doubt and could result in serious harm to the subject. But the guidelines had not anticipated the ethical issues surrounding the use of drugs that would provide the only available means of avoiding death or serious disability under combat situations. Finally, the proponents noted that, under the doctrine of military command authority, the DOD could justifiably have chosen to act on its own, without FDA approval, but sought waivers to avoid even the appearance of impropriety.
1 Food and Drug Administration, “Informed Consent for Human Drugs and Biologics; Determination That Informed Consent Is Not Feasible; Interim Rule and Opportunity for Public Comment,” 21 CFR Part 50, Federal Register 55(246):52814-52817, December 21, 1990, available at http://archive.hhs.gov/ohrp/documents/19901221.pdf.
2 The Belmont report can be found at http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
SOURCE: Adapted in large part from RAND, Waiving Informed Consent: Military Use of Non-FDA-Approved Drugs in Combat, 2000, available at http://www.rand.org/pubs/research_briefs/RB7534/index1.html.
Responsibility and Loss of Control
Some of the most challenging societal questions relate to the possibility that techniques or drugs derived from neuroscience may be used to alter trust and moral judgment. For example, as noted above, administration of oxytocin to humans has the effect of increasing trust toward individuals shown to be untrustworthy.55 A TMS disruption of the right temporo-parietal junction was shown to increase the likelihood that individuals would forgive an unsuccessful murder attempt, as compared with a control group,56 raising the possibility that such disruptions affect moral judgments. In the absence of such manipulations of trust and moral judgment, individuals are ordinarily held accountable for their behavior. What remains of the notion of individual responsibility when individuals are subject to such manipulations?
55 Thomas Baumgartner et al., “Oxytocin Shapes the Neural Circuitry of Trust and Trust Adaptation in Humans,” Neuron 58:639-650, 2008.
In a military context, one might imagine the use of such techniques to reduce the qualms and inhibitions of soldiers about morally suspect or questionable activities. How and under what circumstances might neurally manipulated soldiers be accountable for activities that violate the laws of war?
Impact of Classification
As with synthetic biology, issues arise regarding coordination of neuroscience research in a classified environment and how to establish effective oversight in these environments. Staying abreast of developments and the associated benefits and risks can also be difficult because the research, by definition, is shielded from public view. As one example, the draft agenda for a conference titled “Evolving Neuro-Cyber Technologies and Applications and the Threats Within,” held at Fort McNair in Washington, D.C., on March 14, 2012, included a panel to discuss the ethics of such technologies and applications; the session itself was classified top secret.
56 Liane Young et al., “Disruption of the Right Temporoparietal Junction with Transcranial Magnetic Stimulation Reduces the Role of Beliefs in Moral Judgments,” Proceedings of the National Academy of Sciences 107(15):6753-6758, 2010, available at http://www.pnas.org/content/early/2010/03/11/0914826107.full.pdf+html. (One of the investigators in this study, Marc Hauser, was found to have committed scientific misconduct in the falsification of data associated with a number of other experiments, leading to a number of retractions of published papers involving such data. However, there is no indication that the paper cited in this footnote has been similarly discredited. See http://www.boston.com/whitecoatnotes/2012/09/05/harvard-professor-who-resigned-fabricated-manipulated-data-says/UvCmT8yCcmydpDoEkIRhGP/story.html.)