tion, and in some cases have the potential for doing harm to U.S. interests on a large scale.

In 2010, the Defense Advanced Research Projects Agency asked the National Academies to develop a framework for policy makers, institutions, and individual researchers to use in thinking through ethical, legal, and societal issues as they relate to research and development (R&D) on ERA technologies with military or other national security relevance. What are the ethics of using autonomous weapons that may be available in the future? How should we think about the propriety of enhancing the physical or cognitive capabilities of soldiers with drugs or implants or prostheses? What limits, if any, should be placed on the development of cyber weapons, given the nature and extent of the economic damage that they can cause? Such questions illustrate the general shape of ethical, legal, and societal issues considered in this report.

This report begins with the assumption that defending and protecting national security against external threats are morally sound and ethically supportable societal goals. A related premise is that individuals who are part of the national security R&D establishment want to behave ethically.

That said, the notion of deliberately causing death and destruction, even in defense of the nation from external threats, raises ethical, legal, and societal issues for many. Those who engage in combat, those who support combatants, directly or indirectly, and those whom they defend—that is, the American public at large—all have a considerable stake in these issues and the questions they raise.

Knowledge regarding the ethical, legal, and societal issues (ELSI) associated with R&D on technology intended for military purposes is not nearly as well developed as the corresponding knowledge for civilian science and technology, especially the life sciences. (This is true even recognizing that the line between military and civilian technologies is not always entirely clear.) Some of the important differences between the two contexts include the following:

• Unlike civilian technologies, some military technologies are designed with the explicit purpose of causing harm to people and to property.

• Civilian technologies and products may unexpectedly turn out to be relevant to a military need, and in that context they may raise heightened or entirely new ELSI concerns.

• Technologies developed in a military context may turn out to have significant ELSI implications when applied in a civilian context.

• Advancing military technologies may also outpace the evolution of the laws designed to govern their use. For example, cyber weapons offer


