The United States faces a broad and complex array of challenges to its national security. Its potential adversaries range from nations large and small to organized terrorist groups, drug cartels, organized crime, and even individual terrorists. The weapons they use (or wish to use) are equally varied and include conventional military weapons, weapons of mass destruction and disruption, improvised explosive devices, and cyber/information warfare. Moreover, the scope and nature of the threats facing the nation are constantly evolving.
The armed forces of the United States exist to deter its adversaries from threatening action against it, its allies, and its interests more broadly. In the words of the National Security Strategy 2010, “We are strengthening our military to ensure that it can prevail in today’s wars; to prevent and deter threats against the United States, its interests, and our allies and partners; and prepare to defend the United States in a wide range of contingencies against state and nonstate actors.”1 In the event that deterrence fails, the United States structures and equips its armed forces with the personnel and tools they need to defeat adversary threats, although U.S. policy calls for a military approach only when other approaches, such as diplomacy, are unsuccessful in resolving disagreements between nations or controlling threats to U.S. national security.
America’s experience at war and in planning for war since the end of World War II has persuaded military planners that technological military superiority is the best way to approach this goal. That is, the U.S. approach to national security emphasizes technologically derived qualitative advantages over its adversaries, and technology is an integral aspect of national security. (By contrast, the U.S. approach to armed conflict during World War II generally placed much greater emphasis on the large-scale production of weapons rather than technological superiority.)
Technology supports a number of military functions. For example, weapons are the tools that cause direct effects against an adversary, such as when a bomb explodes on the battlefield. Technologies for command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) help decision makers ensure that these effects occur when and where they are intended to occur; for example, a system for data analysis identifies an important target that would otherwise be overlooked or collateral damage that might result from an attack. Countermeasures seek to frustrate an adversary’s use of weapons and C4ISR systems. Logistics provide indirect support for the personnel involved, such as food, fuel, transportation, and medical assistance.
Adversaries also seek technologically enabled capabilities for their own purposes, and sometimes they are influenced by demonstrations that a given technology has proven useful in practice. Indeed, sometimes the utility of such a technology is demonstrated by the United States itself. Those adversaries can acquire and adapt for their own use the technologies that the United States develops, can find alternative technologies that are more available or less expensive (e.g., commercial products), and can identify ways to negate U.S. technological advantages. They may also give the technology or the ability to create the technology to others for use against the United States.
To enhance and expand technological superiority, the Department of Defense (DOD) and other government agencies invest in science and technology (S&T) on an ongoing basis. (Investment in technologies for military purposes sometimes has benefits for the civilian world as well.) These investments cover a broad range from fundamental science that might eventually support national security needs, broadly defined, to specific development and eventual production efforts intended to address particular national security problems. (In some cases, the national security problem for the United States is the possibility that an actual or potential adversary will develop a new capability.) In addition, the U.S. government adapts technologies originating in the civilian sector, without national security in mind, to national security needs.
The development of technology for national security needs is a complex endeavor, given the strategy of technological superiority as well as changes in the technological and societal environment. These changes are discussed in greater detail in Section 1.4, “Emerging and Readily Available Technologies of Military Significance.”
The U.S. Office of Management and Budget uses the following definitions for research and development:2
• Basic research is defined as “systematic study directed toward fuller knowledge or understanding of the fundamental aspects of phenomena and of observable facts without specific applications toward processes or products in mind. Basic research, however, may include activities with broad applications in mind.” An example might be research in quantum computing, a field at the forefront of basic research whose potential for revolutionary advances in computing is acknowledged even though no specific applications are yet in mind.
• Applied research is defined as “systematic study to gain knowledge or understanding necessary to determine the means by which a recognized and specific need may be met.” An example is research to improve flight control for remotely piloted aircraft.
• Development is defined as “systematic application of knowledge or understanding, directed toward the production of useful materials, devices, and systems or methods, including design, development, and improvement of prototypes and new processes to meet specific requirements.” An example is technical work needed to meet a particular range requirement for a particular remotely piloted aircraft.
The categories of activity described above speak to how the DOD may invest in S&T research, from which may emerge findings and results that can lead to military applications. But, of course, the DOD does not operate in a closed environment, and today it also keeps track of civilian S&T that might have military application. Indeed, civilian S&T is sometimes more mature than S&T developed overtly for military purposes, and civilian science and technology may be introduced at any appropriate stage.
U.S. investment in science and engineering research and development (R&D) has been substantial, and its results have helped to shape physical and social landscapes throughout the world. Policy makers seek new science and technology largely because of the larger range of policy and programmatic options they afford. But efforts to develop new S&T have also raised concerns about a variety of ethical, legal, and societal issues (ELSI).3 Furthermore, many such concerns emerge from the increasingly global scope of certain new technologies and the applications these technologies enable.
2 Executive Office of the President, Office of Management and Budget Circular No. A–11 (2012), Section 84, page 11, available at http://www.whitehouse.gov/sites/default/files/omb/assets/a11_current_year/a_11_2012.pdf.
This report uses the adjective “ethical” to describe issues that are matters of principle (what people regard as right). By contrast, “social” or “societal” is used in reference to issues that are matters of interest (what people regard as desirable). The two often overlap: presumably, people desire the things that they believe are right, but they may also desire things without invoking any moral principle. Both kinds of issue can concern how choices are made, which actions are taken, and what outcomes arise. Ethical issues are often illuminated by analysis (e.g., philosophy) and social issues by empirical research (e.g., psychology, sociology). However, each can inform the other, as when analysis suggests topics for empirical research or when such research identifies behavior worth analyzing.
As for the relationship between law and ethical/societal issues, law is intrinsically a part of those issues. Law establishes authority to decide questions (who decides), sets substantive limits on the content of decisions (what gets decided), and creates processes or procedures for decision making (how decisions get made). Law can channel how policy makers make decisions when ethical or societal consensus is lacking, and indeed law is often the essential point of departure for a consideration of ethical or societal issues. Legal concerns often become more salient as a given weapons concept moves from R&D to deployment to use.
However, against the backdrop of an evolving legal context and understanding is the reality that law and ethics are not identical, and even well-established law cannot be the final word on ethical and societal issues for several reasons:
• Established law may not even address ethical or societal issues that are important in any given instance. The relationship of legal, ethical, and societal factors is not always straightforward, although they do overlap in some cases. In general, the law is supposed to reflect the ethical, as well as the practical, values of the community to which it applies. Law can thus be an expression of both ethical and societal concerns, but it is not always so. By contrast, ethical and societal considerations are not bounded by their expressions in law; indeed, some are not captured by law at all, perhaps because it may not be possible to condense an ethical or societal concern into a simple expression of black-letter law. Most importantly, in many cases the emergence of ethical and societal concerns leads the development of law. In this interval, decision makers have to cope with such concerns and the controversies they may engender in the absence of formal (e.g., legal) guidance for their decisions.
3 The acronym ELSI stands for “ethical, legal, and societal issues” and is strictly speaking a noun. However, this report uses the acronym as an adjective.
• The interpretation of established law may depend on the particular facts and circumstances of any research problem. For example, a law may prohibit the use of human subjects under conditions that expose those subjects to significant danger. What counts as “significant” danger? Resolving this question is, by definition, not a matter for law unless the law provides some specific definition for “significant”—which it often does not. Moreover, there is often profound disagreement about what is ethical, a disagreement often reflected in laws that are ambiguous or incomplete. Law, which is usually designed to withstand rapid changes in popular opinion, may be unclear in its practical application. Thus, a debate rages today within the United States about the scope of constitutional protections when drones are used to carry out targeted killings, and disagreement about the morality or “rightness” of that use is even more heated. That is, new circumstances may highlight tensions between ethical and legal constructs that might otherwise be overlooked.
• The ethical and societal environment extant at the time a law might be applied could be very different from that at the time the law was formulated. Although some degree of ethical or societal consensus may have to be present when a given law is enacted or otherwise goes into force, that consensus may no longer be operative at the moment policy makers must make a decision about a given research effort. That is, laws themselves are sometimes overtaken by events that call into question some of their underlying but unstated ethical assumptions. Similar considerations apply for new technological capabilities that may not have been anticipated in the initial formulation of a law.
• Strategic or tactical concerns also may not line up well with ethical considerations. For example, a decision to develop a new weapon system for use under particularly exigent circumstances might be considered by some to be ethically objectionable (e.g., because of the bad precedents its use might set) and by others to be tactically necessary (e.g., because of the lives its use might save in a particular situation).
If any of these reasons is relevant to a given decision-making situation, the law by itself may not be final or dispositive. In such cases, decision makers have no choice but to refer to the ethical principles that they believe were inherent in the initial formulation of the law.
Research and deliberation can guide the examination of ethical, legal, and societal concerns. Without such examination, public policies and programs may not be stable and sustainable. Law and regulation are expressions of public policy that reflect societal concerns and establish norms or standards regarding how to address those concerns.
ELSI concerns regarding S&T are not new.4 For example, since the end of World War II, governments have made efforts to come to grips with some of the ethical concerns raised by developments and research practices in S&T. These efforts span a broad range and include (but are not limited to) the following:
• In 1946, the postwar Nuremberg trials resulted in the convictions of a number of German physicians and bureaucrats who conducted or facilitated horrific medical experiments on concentration camp prisoners. These trials have become an important point of departure for international discussions on bioethics issues.
• In 1972, the United States signed the Convention on the Prohibition of the Development, Production and Stockpiling of Biological and Toxin Weapons and on Their Destruction (usually known as the Biological Weapons Convention (BWC)),5 in part for ethical reasons.6 The BWC bans “the development, production, stockpiling, acquisition and retention of microbial or other biological agents or toxins, in types and in quantities that have no justification for prophylactic, protective or other peaceful purposes,” and “weapons, equipment or means of delivery designed to use such agents or toxins for hostile purposes or in armed conflict.” The actual use of biological weapons is prohibited by the 1925 Geneva Protocol.7
• In 1979, the Belmont report of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research presented three basic ethical principles regarding the conduct of biomedical research involving human subjects: respect for persons (e.g., research subjects should be treated as autonomous), beneficence (e.g., research subjects should not be harmed), and justice (benefits and costs of research should be shared equitably).8 The Belmont report and other reports by the National Commission formed the basis of regulations implementing these principles that govern the conduct of most federally supported research involving human subjects. These regulations, usually known collectively as the Common Rule, require institutions to establish institutional review boards (IRBs) that approve, modify, or reject such research.
4 An overview of this subject can be found in Carl Mitcham, Encyclopedia of Science, Technology, and Ethics, Macmillan Reference, Detroit, Mich., 2005.
6 The U.S. decision to sign the BWC was also influenced by the conclusion of the U.S. military that biological weapons had little military utility and that signing the convention would not deprive the United States of a significant military capability.
7 The 1925 protocol is formally known as the Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare. See http://www.un.org/disarmament/WMD/Bio/1925GenevaProtocol.shtml.
• In the late 1980s, the Human Genome Project (HGP) established a program of research on ethical, legal, and societal issues associated with sequencing the human genome. Such issues include questions of how genetic information should be interpreted and used, who should have access to it, and how people could be protected from the harm that might result from the improper disclosure or use of such information.
• In 1993, the United States signed the Chemical Weapons Convention (CWC),9 in part for ethical reasons. The CWC bans the development, production, stockpiling, and use of chemical weapons, although the CWC acknowledges the benefits of peaceful chemistry and the desire to promote free trade in chemicals and international cooperation in chemical activities not prohibited by the convention.
• In 2001, the National Nanotechnology Initiative (NNI) was launched. One of the NNI’s goals is promoting the responsible development of nanotechnology, an important component of which is the consideration of the ethical, legal, and societal implications associated with nanotechnology research and development, and the development of plans for addressing environmental, health, and safety implications as well. Some of the issues include how applications of nanotechnology research are introduced into society; how transparent the related decision-making processes are; and how sensitive and responsive policies are to the needs of the full range of stakeholders. To help explore the ethical, legal, and societal issues associated with nanotechnology research, NNI agencies support two centers for nanotechnology in society, at Arizona State University and the University of California, Santa Barbara, and also incorporate ELSI components in their new nanotechnology R&D programs.
Nongovernmental organizations and individuals have also mounted important efforts, which include the following:
• In 1955, the Russell-Einstein manifesto addressed the dangers of nuclear war, arguing that the use of nuclear weapons threatened the continued existence of mankind.
8 The Belmont report can be found at http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.
• In 1964, the Declaration of Helsinki was adopted by the World Medical Association as a statement of ethical principles for medical research involving human subjects, including research on identifiable human material and data. Since then, the declaration has undergone several revisions and clarifications.
• In 1974, the paramountcy clause was first included in a code of engineering ethics. It obligates engineers to “hold paramount the safety, health and welfare of the public and protect the environment in performance of their professional duties.”10
• In 1983, the U.S. Catholic Bishops issued their Pastoral Letter on War and Peace, a document that spoke to the dangers of nuclear war from an ethical perspective grounded in Catholic theology.11
• In 2005, the National Council of Churches issued an open letter titled “God’s Earth Is Sacred: An Open Letter to Church and Society in the United States,” an ecumenical statement on the environment that argued “the central moral imperative of our time is the care for Earth as God’s creation.”12
• In 2009, the National Academies issued the third edition of On Being a Scientist, which notes that “the standards of science extend beyond responsibilities that are internal to the scientific community. Researchers also have a responsibility to reflect on how their work and the knowledge they are generating might be used in the broader society.”13
In some instances, the efforts of government and nongovernment bodies have been intimately intertwined. A well-known example is the story of the National Institutes of Health (NIH) Recombinant DNA Advisory Committee and the Asilomar conference in the early 1970s. In 1973, a letter published in Science described the recommendations of the National Academy of Sciences’ Committee on Recombinant DNA Molecules,14 including a recommendation that life scientists voluntarily refrain from conducting certain kinds of experiments involving recombinant DNA until the potential hazards were better understood. Largely in response to this letter, the NIH in 1974 established the Recombinant DNA Advisory Committee to address public concerns regarding the safety of manipulating genetic material through the use of recombinant DNA techniques.15 In 1975 and with the support of the NIH and others, the Asilomar conference brought together many of the world’s leading researchers on recombinant DNA to consider the hazards of such research. One key outcome of the conference was the establishment of voluntary guidelines to improve the safety of recombinant DNA technology.16
10 See Carl Mitcham, Encyclopedia of Science, Technology, and Ethics, Macmillan Reference, Detroit, Mich., 2005, p. 265; and Charles E. Harris, Jr., Michael S. Pritchard, and Michael Jerome Rabins, Engineering Ethics: Concepts and Cases, Wadsworth Publishing, Belmont, Calif., 1995.
11 The letter can be found at http://old.usccb.org/sdwp/international/TheChallengeofPeace.pdf.
12 The letter can be found at http://www.ncccusa.org/news/godsearthissacred.html.
13 The second edition of On Being a Scientist, issued in 1995, said:
Even scientists conducting the most fundamental research need to be aware that their work can ultimately have a great impact on society … [and] tremendous societal consequences. The occurrence and consequences of discoveries in basic research are virtually impossible to foresee. Nevertheless, the scientific community must recognize the potential for such discoveries and be prepared to address the questions that they raise. If scientists do find that their discoveries have implications for some important aspect of public affairs, they have a responsibility to call attention to the public issues involved…. science and technology have become such integral parts of society that scientists can no longer isolate themselves from societal concerns.
See National Research Council, On Being a Scientist, National Academy Press, Washington, D.C., 1995, pp. 20-21.
The development of any new science or technology often raises ELSI concerns. But the scope and nature of these concerns depend on the specific science or technology in question and the context in which it is found. This report focuses on the ethical, legal, and societal issues that may be associated with science and technology (S&T) of relevance to military problems.
The report assumes that defending national security and protecting the individuals involved are widely regarded as morally sound and ethically supportable societal goals. A related premise is that individuals who are part of the national security establishment (that is, those who make decisions for the government relevant to national security) want to behave ethically.
As noted at the outset of this chapter, technology plays a critical role in the U.S. approach to national security, and technologically derived advantages can help both to defeat adversaries and to reduce friendly and noncombatant casualties. At the same time, individuals may disagree about what national security requires and how best to promote and achieve it. Some of those disagreements are ethical in origin. That is, a nation that behaves ethically has to find an appropriate balance between national security and the protection of “national rights” on the one hand and the protection of individual rights and other ethical norms on the other.
14 Paul Berg, David Baltimore, Herbert W. Boyer, Stanley N. Cohen, Ronald W. Davis, David S. Hogness, Daniel Nathans, Richard Roblin, James D. Watson, Sherman Weissman, and Norton D. Zinder, “Potential Biohazards of Recombinant DNA Molecules,” Science 185(4148):303, 1974, available at https://www.mcdb.ucla.edu/Research/Goldberg/HC70A_W11/pdf/BergLetter.pdf.
16 Paul Berg et al., “Summary Statement of the Asilomar Conference on Recombinant DNA Molecules,” Proceedings of the National Academy of Sciences 72(6):1981-1984, 1975, available at http://authors.library.caltech.edu/11971/1/BERpnas75.pdf.
Still, the notion of deliberately causing death and destruction, even in defense against external threats, gives many people pause. How much death or destruction? Whose death and destruction? What kinds of destruction and death (e.g., quick and painless death versus slow and painful death)? Under what circumstances? At their core, such questions are ethical questions, and those who engage in combat, those who support combatants, directly or indirectly, and the citizenry whom they defend have a considerable stake in the answers to these questions.
Ethical concerns about military technology are not new. Deuteronomy 20:19 says that one should not cut down fruit trees in preparing for the siege of a city. Daniel Headrick notes that in 1139 Pope Innocent II banned as a religious matter the use of crossbows because they were so devastating, even in the hands of an untrained fighter, against the powerful, noble, and revered knight in plate armor.17 (This ban applied only to use against Christians.18) In the wake of World War I, the London Naval Treaty of 1930 outlawed unrestricted submarine warfare, the practice of sinking civilian ships without warning and without providing for the safety of their crews.19
As a more recent example of ELSI concerns regarding science and technology for military and national security use, it is instructive to consider the revelations of Senate committee hearings in the 1970s, which showed that the CIA had been conducting experiments involving the administration of hallucinogenic drugs to nonconsenting subjects who were U.S. citizens. According to the 1977 Senate Report of the Select Committee on Intelligence and Committee on Human Resources,20
17 Daniel R. Headrick, Technology: A World History, Oxford University Press, New York, 2009. Cited in Patrick Lin, “Robots, Ethics, & War,” Stanford Law School, 2010, available at http://cyberlaw.stanford.edu/blog/2010/12/robots-ethics-war.
18 Bernard Brodie and Fawn M. Brodie, From Crossbow to H-Bomb: The Evolution of the Weapons and Tactics of Warfare, Indiana University Press, Bloomington, Ind., 1973.
19 See http://www.microworks.net/pacific/road_to_war/london_treaty.htm. In the case of both crossbows and submarines, these bans were subsequently ignored as the military value of using these weapons in the forbidden ways became more important.
20 U.S. Senate Select Committee on Intelligence and Committee on Human Resources, “Project MKUltra, The CIA’s Program of Research in Behavioral Modification,” Joint Hearing before the Committee on Intelligence and Committee on Human Resources, 95th Congress, 1st session, August 3, 1977, available at http://www.intelligence.senate.gov/pdfs/95mkultra.pdf.
[CIA] research and development programs to find materials which could be used to alter human behavior were initiated in the late 1940s and early 1950s. These experimental programs originally included testing of drugs involving witting human subjects, and culminated in tests using unwitting, nonvolunteer human subjects. These tests were designed to determine the potential effects of chemical or biological agents when used operationally against individuals unaware that they had received a drug….
The research and development program, and particularly the covert testing programs, resulted in massive abridgments of the rights of American citizens, sometimes with tragic consequences. The deaths of two Americans can be attributed to these programs; other participants in the testing programs may still suffer from the residual effects. While some controlled testing of these substances might be defended, the nature of the tests, their scale, and the fact that they were continued for years after the danger of surreptitious administration of LSD to unwitting individuals was known, demonstrate a fundamental disregard for the value of human life.
The report noted that the original rationale for this and other similar programs was based on U.S. concern over the use of chemical and biological agents by the Soviet Union and the People’s Republic of China in interrogations, brainwashing, and in attacks designed to harass, disable, or kill Allied personnel. Such concerns created pressure for “a ‘defensive’ program to investigate chemical and biological agents so that the intelligence community could understand the mechanisms by which these substances worked and how their effects could be defeated.”
But the 1977 report went on to note that “the defensive orientation soon became secondary. Chemical and biological agents were to be studied in order ‘to perfect techniques … for the abstraction of information from individuals whether willing or not’ and in order to ‘develop means for the control of the activities and mental capacities of individuals whether willing or not.’”
According to the 1977 report, the program of clandestine testing of drugs on U.S. citizens is believed to have been suspended in 1963. Then-CIA Director Richard Helms argued that
because of the suspension of covert testing, the Agency’s “positive operational capability to use drugs is diminishing, owing to a lack of realistic testing. With increasing knowledge of the state of the art, we are less capable of staying up with Soviet advances in this field. This in turn results in a waning capability on our part to restrain others in the intelligence community (such as the Department of Defense) from pursuing operations in this area.” Helms attributed the cessation of the unwitting testing to the high risk of embarrassment to the Agency as well as the
“moral problem.” He noted that no better covert situation had been devised than that which had been used, and that “we have no answer to the moral issue.”
The national security context for S&T has a number of characteristics that differentiate it from the civilian environment for S&T. These differences raise the question of the extent to which insights about ethical, legal, and societal issues accumulated in the civilian S&T context apply in a military context. For example:
• The nature of destructive military technologies. Whereas civilian technologies are usually designed not to do harm, certain military technologies are designed with the explicit purpose of reducing the capabilities and willingness of adversaries to fight further, and are often intended to cause harm to people and property. In the context of nonpacifist responses to threats, the goal becomes to design technologies that do the least harm to innocent parties and that do not inflict unnecessary harm on the adversary.
• Civilian casualties. The use of many military technologies can result in civilian deaths (e.g., “collateral damage” from military operations), and at times civilian casualties may outnumber military casualties.21 During armed conflict, the laws of war acknowledge that some degree of collateral damage is inevitable and that it is unrealistic to expect zero collateral damage from military operations. Controversy regarding civilian casualties often arises over whether an “armed conflict” (in the legal sense of the term) is indeed underway and the magnitude of collateral damage that is regarded as legally acceptable in any given military operation in an armed conflict.
• Technologies and products developed by the private sector for civilian use. These technologies and products may prove relevant to a military need and in the latter context raise heightened and/or new ethical, legal, and societal issues for policy makers to address. Because these technologies and products are developed by the private sector, there are few opportunities for addressing or even characterizing ethical or societal issues before they are adopted for military use. One example is the adversary use of cell phones as remote detonators of improvised explosive devices. A second example is the military/intelligence use of data mining techniques, developed first in the context of analyzing large data sets for commercial purposes.
21 Taylor B. Seybolt, Jay D. Aronson, and Baruch Fischhoff, eds., Counting Civilian Casualties: An Introduction to Recording and Estimating Nonmilitary Deaths in Conflict, Oxford University Press, Oxford, 2013.
• Civilian adaptation of military technologies. Military technologies may be adapted for use in civilian contexts (e.g., surveillance, drones) and in those contexts raise issues such as privacy and civil liberties. Such dual use is part of the calculus for examining ethical, legal, and societal issues that arise from military R&D.22
• Time urgency. The timelines available for developing military technologies for specific applications may be compressed for a variety of reasons. For example, urgent military needs may emerge under the pressure of operations (e.g., new adversary weapons or tactics), and R&D is sometimes needed quickly to develop an appropriate response.23 (The same considerations apply, with somewhat less force, to new intelligence that may indicate that an adversary is close to deploying a new weapon or employing new tactics that might undermine U.S. military capabilities.)
Also, when time is limited (as is often the case during times of crisis or actual conflict), policy makers are likely to consider long-term ethical or societal considerations to a lesser degree if they believe that such considerations may delay a useful response.
• Rapid changes in militarily relevant technologies. Given rapid technological change in some of the tools of warfare, the nature of conflict can also be expected to change rapidly. But because international law (especially the laws of war) is built on social consensus, definitions and understandings of what is and is not justified during conflict may change on much longer time scales. In the current “war on terrorism,” legal matters are further complicated by a lack of consensus as to whether countering terrorism is subject to the international law of armed conflict (LOAC), international humanitarian law, or domestic law enforcement principles—or some combination thereof. For example, the United States has asserted that its use of armed drones complies with LOAC.24 Competing positions are put forward by those who assert that a blend of international humanitarian/human rights law and the principles of domestic law enforcement should govern the use of drones employed outside a “hot” battlefield to kill Al-Qaeda leaders, and by those who argue that even if LOAC is the correct framing, U.S. policy is not compliant. Advocates of these positions argue that the present strategy causes unnecessary suffering,25 violates national sovereignty, and amounts to extrajudicial killing.26 In general, these advocates would tend to prefer a “capture and detain” strategy, which they would regard as more humane. Other concerns point to the frequency of civilian deaths and the asymmetric military advantage that use of this technology creates.27
22 This report adopts a “traditional” definition of dual-use technology that has both civilian and military application that is consistent with the usage of the U.S. government (www.gpo.gov/fdsys/pkg/CFR-2010-title15-vol2/xml/CFR-2010-title15-vol2-sec730-3.xml) and the European Commission (http://ec.europa.eu/trade/creating-opportunities/trade-topics/dual-use/). Other reports and analysts define dual-use technology as technology intended for beneficial purposes that can also be misused for harmful purposes. For this latter usage, see National Research Council, Biotechnology Research in an Age of Terrorism, The National Academies Press, Washington, D.C., 2004, and Seumas Miller and Michael J. Selgelid, “Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences,” Science and Engineering Ethics 13(4):523-580, 2007.
23 For example, military commanders during the first Gulf war realized that they needed the capability to destroy deeply buried Iraqi bunkers, and existing ordnance was inadequate for this task. Texas Instruments and Lockheed mounted an effort that resulted in the first combat use of the GBU-28 laser-guided bomb 17 days after the initiation of the development effort. See “Guided Bomb Unit-28 (GBU-28),” available at http://www.globalsecurity.org/military/systems/munitions/gbu-28.htm.
Emerging and readily available (ERA) technologies are the primary focus of this report. Such technologies are important for three essential reasons. First, the pathways on which these technologies will evolve (and the applications that may be enabled) are much less predictable than would be the case if access to these technologies were more limited. Second, these technologies are more readily available to a much wider array of nations and possibly subnational groups than many of the traditionally important, militarily relevant technologies. Third, international legal regimes that may affect how nations use such technologies must reflect the reality that by definition, ERA technologies are readily available to nonstate entities—and to the extent that nonstate entities can use these technologies to cause significant effects, such use perturbs at least some of the traditional understandings underlying international law.
24 See, for example, John Brennan, Assistant to the President for Homeland Security and Counterterrorism, “The Efficacy and Ethics of U.S. Counterterrorism Strategy,” Wilson Center, April 30, 2012, available at http://www.wilsoncenter.org/event/the-efficacy-and-ethics-us-counterterrorism-strategy.
25 For example:
If the State is not operating within the self-defense or armed conflict paradigms, it must be operating in the human rights paradigm. Simply put, if a State does not meet the legal criteria of self-defense or armed conflict, but uses force without Security Council authorization, it is doing so unlawfully. Thus, it becomes imperative for a State utilizing military force to justify and legitimize its actions as either a lawful right to self-defense or engagement in an armed conflict.
See Molly McNab and Megan Matthews, “Clarifying the Law Relating to Unmanned Drones and the Use of Force: The Relationships Between Human Rights, Self-Defense, Armed Conflict, and International Humanitarian Law,” Denver Journal of International Law and Policy 39(4, Fall):665, 2011; and Mary Ellen O’Connell, “Remarks: The Resort to Drones Under International Law,” Denver Journal of International Law and Policy 39(4, Fall):585, 2011.
26 See, for example, Philip Alston’s statement that “[m]y concern is that these drones, these Predators, are being operated in a framework which may well violate international humanitarian law and international human rights law.” “U.S. Warned on Deadly Drone Attacks,” BBC.com, October 28, 2009, available at http://news.bbc.co.uk/2/hi/americas/8329412.stm.
27 “Secrecy of U.S. Drone Strikes in Pakistan Criticized,” MSNBC.com, January 29, 2010, available at http://www.msnbc.msn.com/id/35149384/ns/world_news-south_and_central_asia/t/secrecy-us-strikes-pakistan-criticized/#.UJVVzsXR5go.
ERA sciences and technologies share most or all of the following basic characteristics:
• Low barriers to entry. At least by comparison to previous industrial-age technologies, advances in and exploitation of ERA technologies often do not require large investments or infrastructure. In other cases (in particular, in information technology), the incremental costs for developing any specific application are low because of significant investment in the commercial sector. That is, there are few or no technical chokepoints through which all necessary information or resources must pass, and thus access to these technologies is difficult or impossible to limit. The resources required for significant R&D efforts in these areas are relatively modest. Relevant specialized knowledge, once limited to articles published in paper-based journals, is now often accessible through the Internet, with little regard for national borders or distance. And whereas in the past building useful artifacts required great technical skill, kits are now often available that reduce the knowledge and skills needed to do so. A consequence is that advantages gained by the United States through a monopoly on military and other national security applications based on these technologies are likely to be transient. A second consequence is that nation-states themselves have less control over sensitive data regarding these technologies, data that might ultimately have military application. The bottom line is that both non-industrialized states and certain nonstate actors now have significantly greater access to ERA technologies, and these technologies can be used in ways that are contrary to U.S. national security interests.
• Rapid change. Again by comparison to most industrial-age technologies, advances in these technologies (especially information technology (IT), and those technologies that depend on IT) occur often, and significant advances on time scales measured in months are not uncommon. These time scales for advancement are short compared to the time scales on which nontechnical concerns such as law, policy, and ethics have traditionally been addressed, which means that advances associated with these technologies are likely to stress existing processes for policy formulation and/or arouse public concern. In short, fast, frequent, and significant
advances in a technology limit the time available for societal response and evaluation.
• Blurring of lines between basic research and applied research. As noted above, the OMB definitions draw categorical lines between basic and applied research: basic research develops fundamental knowledge without any specific application in mind, and subsequent applied research expands and applies that knowledge to develop useful materials, devices, systems, or methods, sometimes oriented ultimately toward the design, development, and improvement of prototypes. In some cases (notably software), what emerges from applied research may already be very close to an artifact with operational utility.
The model embodying a sharp distinction between basic and applied research captures some elements of scientific progress in some fields, but it is particularly inapplicable to ERA technologies. For example, in practice, and especially when ERA technologies are involved, “applied” research may uncover problems that require additional “basic” research to solve; such feedback loops are common rather than rare. In addition, that model overlooks an important mode of progress increasingly common in today’s R&D environment—what is often called “use-inspired” basic research. One canonical example of such work is Louis Pasteur’s; driven by concerns related to public health, his work laid many of the foundations of microbiology.28 In this context, the potential to solve a societal problem drives basic research in specific domains. The knowledge such research produces can be regarded as fundamental and is likely to be as broadly applicable to multiple problem domains as “pure” basic research.
• High uncertainty about how the future trajectories of ERA technologies will evolve and what applications will be possible. Rapid evolution of a field implies that the periods of time between fundamental research and potential applications are shorter. In addition, the underlying scientific paradigms exhibit considerable instability—new discoveries often cause researchers to question previously accepted basic understandings. Because of the interconnectedness of various technologies, no single discipline is “in charge,” and the influences on research direction and application are even more diverse than when only one discipline is involved. In turn, uncertainty about the future trajectories of a given technology is a significant contributor to the technological risk that may be faced by any particular applications-development effort involving that technology. Thus, empirical evidence can go only so far in mitigating such risk.
28 Donald E. Stokes, Pasteur’s Quadrant: Basic Science and Technological Innovation, Brookings Institution Press, Washington, D.C., 1997.
This report distinguishes between two categories of ERA science and technology: the category of foundational sciences and technologies and the category of specific application domains.
Foundational sciences and technologies enable progress and applications in a variety of application domains. They support, facilitate, drive, and may even be essential to other technologies, leading to a high degree of interconnectedness between many technologies. For example, many ERA technologies (e.g., neuroscience, cyber weaponry, synthetic biology, human enhancement technologies, robotics) depend on information technology to process and manipulate large amounts of data in short periods of time. Neuroscience is likely to be an enabler for robotics and prosthetics. The consequence of such interconnectedness is that advances in one area may in some cases help to stimulate advances or even eliminate severe bottlenecks in another. Furthermore, these fields share the characteristic that they are malleable—that is, they can be used in many different ways to address many different types of problems.29
Chapter 2 discusses three foundational sciences and technologies for illustrative purposes:
• Information technology. Nearly every aspect of military operations today depends on the effective processing of information, and visions for IT applications (if not the practicality of such applications) are limited primarily by the imagination of potential users of information. IT is the foundational and enabling technology underlying two application domains discussed in this report, autonomous military systems and cyber weapons. IT is also fundamental to various intelligence applications, such as predictive analysis.30
• Synthetic biology. Although there are today few civilian or military products with their origins in synthetic biology, the field holds great promise for new drugs, materials, and fuels. But the technology also may lead to the construction of new organisms with dangerous properties that might be harmful to the public and/or the environment.
• Neuroscience. Advances in neuroscience may be able to help wounded soldiers recover from traumatic brain injuries, but they may also be able to help uninjured soldiers process information more quickly, operate equipment through a direct brain-machine interface, and remember more information. Nor is neuroscience limited to these enhancements of normal function—various proposals have emerged suggesting that false human memories can be created and different emotional states induced (e.g., reduced or increased fear, feelings of anger or calm). It may also be possible to turn neuroscience-based applications on adversaries, and a number of such applications are conceptualized in particular as possibly effective nonlethal weapons.31
29 Notions of technological malleability and technology interconnectedness are further explored in James H. Moor, “Why We Need Better Ethics for Emerging Technologies,” Ethics and Information Technology 7:111-119, 2005.
30 Predictive analysis seeks to make predictions about significant events in the future based on correlations found in patterns of data. An introduction to predictive analytics can be found in Eric Siegel, Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die, John Wiley & Sons, Inc., Hoboken, N.J., 2013.
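The description of predictive analysis in footnote 30—making predictions about future events from correlations found in patterns of historical data—can be illustrated with a deliberately minimal sketch. All of the data, names, and the simple frequency-based model below are invented for illustration only; real predictive-analytic systems use far more sophisticated statistical and machine-learning methods.

```python
# A toy sketch of predictive analysis (hypothetical; see footnote 30):
# estimate the likelihood of a future "event" from the frequency with
# which similar patterns preceded that event in historical records.

from collections import defaultdict

def train(records):
    """Count, for each observed pattern, how often the event followed it."""
    counts = defaultdict(lambda: [0, 0])  # pattern -> [event, no-event]
    for pattern, event_occurred in records:
        counts[pattern][0 if event_occurred else 1] += 1
    return counts

def predict(counts, pattern):
    """Return the historical frequency of the event given this pattern."""
    event, no_event = counts.get(pattern, [0, 0])
    total = event + no_event
    return event / total if total else 0.5  # unseen pattern: uninformative prior

# Invented toy records: (observed pattern, did the event occur?)
history = [
    (("spike", "night"), True),
    (("spike", "night"), True),
    (("spike", "day"), False),
    (("quiet", "night"), False),
]
model = train(history)
print(predict(model, ("spike", "night")))  # 1.0 in this toy data
```

The sketch shows only the core idea—correlating past patterns with past outcomes and projecting forward—which is also why such methods inherit the biases and gaps of the data they are trained on.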
By contrast, an application domain is associated with a set of specific operational military problems, the solutions to which may draw on many different technologies. Four application domains are discussed in Chapter 3, again for illustrative purposes:
• Robotics and autonomous systems. In many conflict scenarios, unmanned weapons systems with varying degrees of autonomy are preferred for reasons of operational effectiveness and efficiency and to minimize casualties among noncombatants and friendly forces.
• Prosthetics and human enhancement. Human beings engage in combat with capabilities that are limited by biology and degraded through injury. The use of prostheses is one approach to restoring human capabilities lost through injury and enhancing human capabilities above and beyond biological limits.
• Cyber weapons. Given the increasing dependence of adversaries on computer and communications technology, cyber weapons provide a potentially important means by which adversary systems can be destroyed, degraded, disrupted, denied, and usurped.
• Nonlethal weapons. In many scenarios involving U.S. military forces engaged in military operations other than war (e.g., policing secured territory), it is desirable to have operational options other than the use of deadly force. Nonlethal weapons are often conceptualized as providing one such option.
The societal environment in which science and technology are embedded today has characteristics different from those in the past: increasing globalization and higher degrees of connectivity are two of its most prominent characteristics. Unlike the state of affairs immediately after World War II, the United States is no longer always and automatically the dominant and leading actor in all fields of S&T. Other nations have invested heavily in S&T, and foreign as well as domestic expertise drives important advancements in many fields. For example, in 2009, eight Asian countries had collectively caught up with U.S. investment in R&D.32 Nonetheless, the United States remains by far the largest national investor in R&D.
31 The Chemical Weapons Convention constrains the use of chemical nonlethal weapons in a military context. However, certain kinds of directed-energy weapons might be developed for the purpose of affecting neurological function in some (nonlethal) way. See http://royalsociety.org/policy/projects/brain-waves/conflict-security/.
The impact of globalization is magnified by the phenomenon of increasing connectivity throughout the world. An ever denser and faster global Internet connects more and more scientists and technologists of many nationalities, allowing them to learn from each other. Of equal or perhaps greater significance is the fact that rapid communications of all kinds between individuals and groups are increasingly possible, enabling small groups to reach large audiences with information that may affect public opinion and social movements, including information that governments might prefer to keep out of public view. Social media in particular provide unprecedented opportunities for groups to organize and grow, a fact that can create enormous public pressures on government policy makers.
Economic considerations—faced by all governments and nations today—also increase pressures on governments and nations to justify support for scientific research in terms of its potential payoff and on researchers to justify their efforts in terms of positive economic and social effects. In this environment, policy makers feel strong pressures to shorten the time from government-supported research to useful applications—and such pressures reduce the time available for thoughtful consideration of how these applications might fit into a larger societal context.
Furthermore, much of the progress in certain ERA technologies—information technology stands out as a notable example—is the result of private-sector activity. Thus, government controls and influences on technological trajectories are weaker than they have been in the past.33 And, of course, R&D conducted by the private sector must usually be justified on the basis of return-on-investment projections, which also inevitably emphasize nearer-term payoffs. Companies seek to gain a competitive advantage in the marketplace as a result of their R&D investment.
The parties that can take advantage of ERA technologies include parties that are neither wealthy nor technologically advanced. These parties span a wide range, including relatively poorer or less technologically advanced nation-states, private organizations such as nongovernmental welfare organizations but also crime cartels and well-funded terrorist organizations, small groups of independent “freelance” actors, and even individuals. Of course, within this wide range, there is significant variation in their ability to take advantage of these technologies—with greater ability being associated with greater access to resources and talent.
32 The United States, the largest single R&D-performing country, accounted for about 31 percent of the 2009 global total. Asian countries—including China, India, Japan, Malaysia, Singapore, South Korea, Taiwan, and Thailand—represented 24 percent of the global R&D total in 1999 but accounted for 32 percent in 2009. See http://www.nsf.gov/statistics/seind12/c4/c4h.htm.
33 Defense Science Board, “The Defense Science Board 1999 Summer Study Task Force on 21st Century Defense Technology Strategies, Volume 1,” U.S. Department of Defense, 1999, available at http://www.dtic.mil/docs/citations/ADA433941.
The conduct of war has always raised ethical and societal concerns—and to the extent that technology is an instrument of war, the use of military technologies raises such concerns as well. For example, international law (the law of armed conflict as expressed in the UN Charter and the Hague and Geneva Conventions as well as a number of other treaties) today governs the conduct of armed conflict. The UN Charter describes the circumstances under which nations are permitted to engage in armed conflict. The Hague and Geneva Conventions and associated protocols govern how states may use force once conflict has started. A number of other international agreements ban the use of certain weapons, such as chemical and biological weapons,34 land mines,35 and blinding lasers.36 These international conventions—and arms control agreements more generally—are motivated in part by ELSI considerations. Chapter 4 provides some history and discusses LOAC and other international law in greater detail.
The nature of conflict, technology, and the larger world environment have evolved over the last several decades, and a number of these changes pose new ethical challenges to existing international legal regimes and to our understanding of conflict. These changes include:
• Nonstate adversaries. State-on-state conflict, at least between industrialized nations, has given way to what some have called “violent peace.” Although nation-states are the primary focus of international treaties and agreements (and the Geneva Conventions bind nations), actual and potential adversaries of the United States include not only near-peer nation-states but also developing nations and terrorist groups that are not affiliated with any particular nation. Additional Protocol II of the Geneva Conventions (1977) fleshes out LOAC as it applies to non-international armed conflict (that is, armed conflict not involving two states). In addition, the United Nations acknowledges the significance of nonstate actors in United Nations Security Council Resolution 1540,37 which specifically obliges all states to “refrain from providing any form of support to non-State actors that attempt to develop, acquire, manufacture, possess, transport, transfer or use nuclear, chemical or biological weapons and their means of delivery.”
34 Geneva Protocol, 1925; Chemical Weapons Convention; Biological Weapons Convention.
35 The Ottawa Treaty, 1999. The United States is not a party to this treaty, although as a matter of policy, it has mostly complied with its main provisions.
36 Blinding Laser Protocol of the Convention on Conventional Weapons, 1995.
• Asymmetric warfare. U.S. advantages in conventional military power have led many adversaries to seek other ways to challenge the United States on the battlefield. Rather than seeking to overcome U.S. strengths, asymmetric tactics seek to take advantage of U.S. weaknesses, vulnerabilities, and dependencies—and one element of such tactics may be to ignore, disregard, or even take advantage of constraints imposed by traditional understandings of the laws of war. A historical example is that terrorists and insurgents may deliberately blend with noncombatant civilians on an expanded and nontraditional battlefield, and distinctions between the two categories are increasingly blurred in many situations of conflict. More recently, concerns have arisen that U.S. military forces may be excessively vulnerable to cyber threats because of their great dependence on information technology.
• Volunteer service in the armed forces. In the last 50 years, U.S. policy regarding military service has changed dramatically, from near-universal conscription of male citizens to all-volunteer armed forces. Today, an increasingly small fraction of the population has served directly in the armed forces. Most U.S. civilians lack firsthand knowledge of issues (some of them ethical) that may be associated with armed conflict, and fewer civilians know others who have served in the armed forces. Thus, many do not have a basis for making informed ELSI judgments about technologies that may be useful in modern warfare. In addition, current members of the armed forces have voluntarily relinquished certain rights to personal autonomy in choosing to be subject to a military chain of command, although the scope and nature of the rights they have surrendered are not necessarily clear in all cases. The fact of volunteering means that these individuals cannot say that they did not choose to be subject to military rules, which may require them to do things that they could not be required to do in civilian life.
In 2010, the Defense Advanced Research Projects Agency asked the National Academies to develop and articulate a framework for policy makers, institutions, and individual researchers to use to think through ethical, legal, and societal issues as they relate to democratized technologies with military relevance. In DARPA’s usage, “democratized technologies” are technologies with rapid rates of progress and low barriers to entry, as illustrated in Chapters 2 and 3. However, the committee believed that the term “democratized technologies” is easily misunderstood, and this report thus uses the term “emerging and readily available technologies” (ERA technologies).
37 The resolution can be found at http://www.un.org/en/ga/search/view_doc.asp?symbol=S/RES/1540%20(2004).
The ethical, legal, and societal scope of this report encompasses three categories of concern that in the committee’s judgment are central to any consideration of the ethics associated with new military technologies:
• The conduct of research. Conduct includes the selection of research areas, the design of particular research investigations (e.g., protocols, experiments), and the execution of those investigations. ELSI concerns relating to the conduct of research focus primarily on the impact of doing the research on the subjects that may be involved, whether by choice or by chance. “Subjects” here are defined broadly—communities, animals, the environment, and workers in addition to those parties that are explicitly acknowledged as being research subjects.38 (ELSI concerns related to acknowledged research subjects are important, but there is today a well-developed infrastructure to address such concerns.) In a military context, ethical, legal, and societal issues related to the conduct of research also include matters of classification and the impact that such classification may have on oversight and review.
• The applications of research as they relate to intended capabilities enabled by research. ELSI concerns associated with specified applications fall into two categories: concerns over the intended effects or purposes of the application and concerns over undesired effects (“side effects”) that might occur when the application has its intended effects. An example of the first category is R&D intended to develop a laser to blind soldiers on the battlefield—one ELSI concern relates to whether it is in fact ethical to develop a weapon for such a purpose. (Some of the history regarding an international ban on the use of lasers designed to blind soldiers is recounted in Chapter 3.) An example of the second category is R&D on a vaccine against a biological weapon. In this case, there is little ELSI controversy over the intended result, namely, some degree of immunity to that weapon. However, if the side effects of the vaccine (which might include severe allergic reactions, pain, or muscle weakness) were significant and widespread, ELSI concerns could arise over whether the benefits were worth the costs (e.g., how to account for benefits to the individual soldier versus benefits to the fighting force as a whole). Widely adopted applications may also require, impede, facilitate, or encourage institutional or organizational changes, and there may be ethical dimensions to such changes as well.
38 The term “subject” in this context is used informally, and in particular is not tied to any legal definition of the term, as might be provided (for example) by regulations of the Department of Health and Human Services.
ELSI concerns related to technologies that can be used for both military and civilian purposes are an important subset of the second category. A decision to pursue one technology for an application in one context (a military context) may well raise ELSI concerns about its use in another context (e.g., a civilian context) because of different societal norms and laws/regulations that might be operative in the latter. One contemporary example is the law enforcement use of surveillance drones developed for military purposes, a use that has raised public concerns about privacy.39
• Unanticipated, unforeseen, or inadvertent ELSI consequences of either research or applications. These consequences are usually manifested by something going awry, as when research does not proceed as expected (e.g., experimental control is lost) and thus causes harm outside the original bounds on the research or when unanticipated applications raise additional ELSI concerns.40 ELSI concerns in this domain often relate to applications that are not intended by the proponents of such research. For example, an application may be used in ways entirely unanticipated or unimagined by its creators, and thus bring into play a set of side effects that were also unanticipated. These concerns are thus particularly difficult to imagine ahead of time. After due diligence has been exercised, it is also necessary to put into place a process that monitors how applications are used and that can respond quickly when unanticipated ELSI side effects manifest themselves. Chapter 4 discusses approaches for reducing the likelihood of unpleasant surprises.
For these categories of concern, the committee sought to build on previous work that addresses ethical, legal, and societal issues associated with S&T and with the military. In many cases, however, the committee found little work at the nexus of ethics, emerging technologies, and military applications. Nevertheless, some relevant work includes the following:
40 “Unforeseen” in this context means unforeseen by the proponents or the performers of the research.
• Work sponsored by the International Society of Military Ethics (ISME), which was established to examine professional military ethics in all dimensions, including but not limited to military technology.41 Of particular note is a special issue of the Journal of Military Ethics (Volume 9, Issue 4, 2010), published by the ISME, entitled Ethics and Emerging Military Technologies, with articles such as:
—“Postmodern War,” by George R. Lucas, Jr.;
—“The Ethics of Killer Applications: Why Is It So Hard to Talk About Morality When It Comes to New Military Technology?,” by P.W. Singer;
—“Ethical Blowback from Emerging Technologies,” by Patrick Lin;
—“The Case for Ethical Autonomy in Unmanned Systems,” by Ronald C. Arkin;
—“Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles,” by Bradley Jay Strawser;
—“Saying ‘No!’ to Lethal Autonomous Targeting,” by Noel Sharkey;
—“The Ethics of Cyberwarfare,” by Randall R. Dipert; and
—“‘Cyberation’ and Just War Doctrine: A Response to Randall Dipert,” by Colonel James Cook.
• A February 2012 publication by the Royal Society entitled Neuroscience, Conflict and Security.42 This study, which was charged with reviewing the current policy, legal, and ethical frameworks governing military applications of neuroscience, examined the ethics of neuroscience for military purposes.
• A RUSI publication, circa 2008,43 which addressed the ethics and legal implications of military unmanned vehicles.
• A framework outlined by the Consortium for Emerging Technologies, Military Operations and National Security (CETMONS) for assessing the implications of emerging technologies for military capability and national security.44 This framework considers issues related to a technology’s implications for civil society; civil reaction affecting military missions or civil society; external threats to U.S. security; the impact on treaties and military law; and the impact on military doctrine, military culture, military education, and military operations.
43 Elizabeth Quintana, The Ethics and Legal Implications of Military Unmanned Vehicles, British Computer Society, Royal United Services Institute, available at http://www.rusi.org/downloads/assets/RUSI_ethics.pdf.
44 Consortium for Emerging Technologies, Military Operations, and National Security, “Framework for Assessing the Implications of Emerging Technologies for Military Capability and National Security,” 2013, available at http://lincolncenter-dev.asu.edu/CETMONS/index.php/research-areas/framework-assessment.
• A 2004 report of the National Research Council titled Biotechnology Research in an Age of Terrorism (aka the Fink report), which addressed “technologies [in the life sciences that] can be used legitimately for human betterment and [also] misused for bioterrorism [through the creation of biological weapons].”45 In this context, the 2004 report noted that “biological scientists have an affirmative moral duty to avoid contributing to the advancement of biowarfare or bioterrorism…. scientists can and should take reasonable steps to minimize this possibility [that knowledge they generate will assist in advancing biowarfare or bioterrorism].”
In addition, a 2008 report of the National Research Council, Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Assessment,46 developed a framework for the systematic assessment of information-based programs being considered or already in use for counterterrorist purposes. This framework posed a set of questions focused on the effectiveness, lawfulness, and consistency with U.S. values of such programs, the answers to which would be useful to those making decisions about such programs.
The committee notes that perspectives on ethical, legal, and societal issues related to science, technology, and military affairs are hardly unitary. Even within a single nation such as the United States, different constituencies are likely to have different ethical stances toward the same issue. Furthermore, perspectives on ethics may vary with military might. A nation that is accustomed to military superiority on the battlefield may well have an ethical perspective different from that of other nations without such power (Box 1.1). The ethical perspectives of allies, adversaries, and neutral observers may well be different from those of the United States; under some circumstances, the differences may have consequences for U.S. freedom of action.
Addressing differences in ethical perspectives has two aspects, only one of which is covered in any detail in this report. Chapters 2 through 5 of this report address the first aspect, namely, the identification and articulation of possibly competing ethical perspectives. To properly consider ethical, legal, and societal issues, decision makers must begin by understanding the scope and nature of those issues. Part of that understanding is an explicit decision regarding whose ethical perspectives should be considered and taken into account.
The second aspect of addressing differing ethical perspectives is just as important. Once competing ethical perspectives have been identified, how should they be weighed and who should weigh them? Furthermore, on what basis should a party whose ethical perspectives are not adequately included in any policy decision, however inclusive and honest the decision-making process, be expected to trust and acquiesce in that decision?
45 National Research Council, Biotechnology Research in an Age of Terrorism, The National Academies Press, Washington, D.C., 2004.
46 National Research Council, Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Assessment, The National Academies Press, Washington, D.C., 2008.
Box 1.1 Possible Ethical, Legal, and Societal Implications of Seeking Technological Superiority
As a matter of U.S. policy, superior military technology is
a cornerstone of the U.S. military’s strategic posture…. DOD Research and Engineering (R&E) programs are needed to create, demonstrate, and partner in the transition to operational use of affordable technologies that can provide a decisive military superiority to defeat any adversary on any battlefield…. [Furthermore] continued technology development should enable future military superiority.1
The U.S. declaratory policy of seeking technological military superiority over U.S. adversaries has an overarching impact on ethical, legal, and societal issues that involve the R&D associated with new technologies of military relevance. But a detailed examination of the ELSI implications of this policy is not within the scope of this project’s statement of task, which implicitly asks the committee to assume the validity of this policy. Some aspects of this policy that may have ELSI implications include the following:
• Weapons to implement the policy of technological superiority have to conform to the laws of war, but since technology often outstrips the laws of war, the laws of war per se may not be much of a constraint. Thus, the development of such weapons stresses existing understandings of law and ethics that may be operative before the introduction of such weapons.
• Technological superiority may provide transient rather than long-lasting advantage as adversaries learn to counter or obtain the technologies available to the United States. However, even transient advantages can be tactically significant in the short term (in terms of enabling U.S. forces to perform missions at lower human and economic cost), especially if they come as a surprise to an unprepared adversary.
• Adversaries, both real and potential, react to the introduction of new U.S. military technologies. The availability of such technologies to the United States may deter adversaries from taking hostile actions against U.S. interests, may cause adversaries to seek to adopt those technologies for their own use, or may cause them to seek to counter the advantages conferred by U.S. use. Indeed, the first successful uses of a technologically superior weapon may themselves be signals to adversaries about the utility of such weapons. Observed and anticipated adversarial responses to technological superiority and associated effects on stability may have ELSI implications.
• Because transient advantages dissipate (by definition), additional work is always needed to find new generations of technologically superior weapons—and enduring advantages can be secured only by making a commitment to constant reinvestment in technology.
• The first user of a weapon often sets precedents that other nations follow for the circumstances under which such a weapon can be used. Indeed, such precedents may be the initial seeds out of which international law and rules of the road governing such use can grow.
• A focus on technological superiority may cause the United States to neglect the “soft power” dimensions of its security strategy. In 2007, Secretary of Defense Robert Gates argued that in the future, “success [in asymmetric warfare] will be less a matter of imposing one’s will and more a function of shaping behavior—of friends, adversaries, and most importantly, the people in between.”2
• Presumptions of technological superiority may deflect attention from consideration of ethical, legal, and moral issues associated with military applications of technology. The prospect of reciprocal use has historically been a spur for reflection on the ethical implications of military applications of technologies, whereas asymmetric advantage has historically had the effect of deferring and diffusing ethical deliberation.
Because it treats the policy of seeking technological superiority over U.S. adversaries as a given, this report does not assess or even address the issues described above in any systematic way. Nevertheless, policy makers may wish to consider this policy as an area for future ELSI analysis, one with potential impacts on ELSI considerations of individual technologies or research projects.
1 Thomas M. McCann, Defense Manufacturing Management Guide for Program Managers, October 16, 2012, p. 230, available at https://acc.dau.mil/docs/plt/pqm/mfg-guidebook-10-16-12.pdf.
In both national and international law, legal practitioners and scholars have developed approaches to balancing competing or conflicting interests, even when those conflicting interests are well grounded and legitimate. Examples of such approaches include procedural requirements such as burden-of-proof obligations; criteria to ensure that the impact on the interests that are adversely affected is minimized to the extent feasible
given the conflicting interests at stake; and appeal to case law to identify binding or guiding precedents.
As a broad generalization, approaches to balancing competing ethical claims and to comparing the ethics of different courses of action are considerably less developed. As a practical matter, it is often true that individuals presented with an ethical dilemma in a specific case come to similar conclusions about the appropriate course of action, even if they would disagree vehemently on the underlying reasoning or ethical theories. And in some cases, examination of similar cases from the past may help to shed some light on ethical matters. But to the extent that any party’s ethical beliefs are deeply held, one might expect that party to be predisposed toward opposing any decision-making process that does not result in the accommodation of those beliefs.
In the end, if and when agreement cannot be found in contemplating any given dilemma, participants will usually engage in some ad hoc process that resolves it one way or another. It is not too strong to describe such a process as being political (and hence outside the scope of this report), and the political nature of this process serves as a reminder of the very complex milieu in which decision makers operate.
This report does not evaluate or assess the ethical, legal, and societal issues in any part of DARPA’s technology R&D portfolio. That is, although the report does identify ethical issues that are associated with some of the technologies of interest to DARPA, it does not come to any specific conclusions about the ethical, legal, or societal propriety of any particular research program or project in the DARPA portfolio.
Also, this report does not address specific operational programs. While research programs are supported because they might enable important capabilities (and thus an ELSI assessment of a given research effort necessarily entails a consideration of applications), it is rarely clear at the outset how those capabilities might be integrated into an operational program. The reason is that the latter involves many specific decisions about how the program must operate—specific personnel, specific logistics, specific command-and-control configurations, specific rules of engagement, specific mechanisms for oversight, and so on. There are of course ethical, legal, and societal issues associated with these arrangements (e.g., a given arrangement may or may not raise ELSI concerns), but because these arrangements cannot be anticipated at the research stage, addressing the ethical, legal, and societal issues associated with operational programs is not within the scope of this report.
Furthermore, research-supporting agencies have general counsels that are charged with ensuring that all programs and projects by those agencies, both external and internal, are conducted in accordance with all applicable legal requirements. Processes intended to fulfill this mandate are not addressed in this report, except insofar as they are points of
departure as mechanisms for considering ethical, legal, and societal issues more broadly.
Other topics not addressed in this report under the broad rubric “the ethics of science” include scientific misconduct (e.g., data falsification, plagiarism, improper allocation of publication credit), specific laws and regulations as they might apply to specific research projects, financial conflicts of interest, the perspectives of specific religions on matters of war and peace, and the impact of classification on intellectual inquiry and academic freedom.
Last and as noted above in this chapter, this report assumes that some precursor efforts (whether basic research or applied research/development efforts) that may lead to advanced military technologies are appropriate for the nation to pursue and can be morally justified. Thus, any debate over the fundamental ethics of doing military research at all is outside its scope.
So that it could base its analysis, findings, and recommendations on real-world trends, the committee examined seven illustrative S&T areas: information technology, synthetic biology, neuroscience, robotics, prosthetics, cyber weapons, and nonlethal weapons. Other relevant technology domains that the committee could have chosen to address include space technologies, geoengineering technologies, and nanotechnology.47
Chapter 2 addresses the first three, which are foundational sciences and technologies that enable progress and applications in a variety of problem domains. Chapter 3 addresses the last four, which are application domains associated with specific operational military problems. To varying degrees, each of the S&T areas above has many or most of the characteristics of ERA technologies in the sense defined above. That is, even without large investment, a multitude of state and nonstate actors, friendly or not, can adopt and adapt their results to a multitude of purposes. Chapters 2 and 3 examine each of these S&T areas from the perspective of technology maturity (that is, how close the science or technology in question is to producing useful applications) and possible military applications. Without attempting to be comprehensive, these chapters highlight some of the ELSI implications that emerge in each domain.
47 Although the statement of task mentioned nanotechnology as an illustrative technology for this report, the committee did not examine nanotechnology explicitly. The reason was that the U.S. government does support the National Nanotechnology Initiative—and, as noted above, within that initiative is embedded a significant ELSI component. That dedicated effort is well resourced and positioned to make meaningful statements about ethical, legal, and societal issues associated with nanotechnology.
Chapter 4 describes sources of ELSI insight, including a variety of theoretical and disciplinary approaches to ethics and insights from social sciences such as anthropology and psychology.
Chapter 5 uses the sources of Chapter 4 and ELSI commonalities that appear in many of the technologies discussed in Chapters 2 and 3 to articulate questions for various stakeholders that might be used when contemplating the development of a technology or an application. These questions are useful for identifying possible ethical, legal, and societal issues that might arise from such development, and they are the heart of the framework requested in DARPA’s charge to the committee.
Chapter 6 considers the limitations of a priori analysis and proposes two additional techniques for augmenting and increasing the value of what such analysis can provide. The chapter explores deliberative processes as a way to expand the scope of ELSI insights that might be relevant, and an adaptive approach to planning that can mitigate some of the ELSI uncertainties that can accompany any given development.
Chapter 7 describes various mechanisms that have been used to address ethical, legal, and societal issues arising from S&T endeavors, as well as considerations for the use of such mechanisms in a military context.
Chapter 8 provides the report’s findings and recommendations.