1

Framing the Issues

1.1 NATIONAL SECURITY AND THE ROLE OF TECHNOLOGY

The United States faces a broad and complex array of challenges to its national security. Its potential adversaries cover a broad range, including nations large and small, organized terrorist groups, drug cartels, organized crime, and even individual terrorists. The weapons they use (or wish to use) cover an equally broad range and include conventional military weapons, weapons of mass destruction and disruption, improvised explosive devices, and cyber/information warfare. Moreover, the scope and the nature of threats facing the nation are constantly evolving.

The armed forces of the United States exist to deter its adversaries from threatening action against it, its allies, and its interests more broadly. In the words of the National Security Strategy 2010, “We are strengthening our military to ensure that it can prevail in today’s wars; to prevent and deter threats against the United States, its interests, and our allies and partners; and prepare to defend the United States in a wide range of contingencies against state and nonstate actors.”1 In the event that deterrence fails, the United States structures and equips its armed forces with the personnel and tools they need to defeat adversary threats, although U.S. policy calls for a military approach only when other approaches, such as diplomacy, are unsuccessful in resolving disagreements between nations or controlling threats to U.S. national security.

___________________

1 See http://www.whitehouse.gov/sites/default/files/rss_viewer/national_security_strategy.pdf.



America’s experience at war and in planning for war since the end of World War II has persuaded military planners that technological military superiority is the best way to approach this goal. That is, the U.S. approach to national security emphasizes technologically derived qualitative advantages over its adversaries, and technology is an integral aspect of national security. (By contrast, the U.S. approach to armed conflict during World War II generally placed much greater emphasis on the large-scale production of weapons rather than on technological superiority.)

Technology supports a number of military functions. For example, weapons are the tools that cause direct effects against an adversary, such as when a bomb explodes on the battlefield. Technologies for command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) help decision makers ensure that these effects occur when and where they are intended to occur; for example, a system for data analysis might identify an important target that would otherwise be overlooked or reveal collateral damage that might result from an attack. Countermeasures seek to frustrate an adversary’s use of weapons and C4ISR systems. Logistics provide indirect support for the personnel involved, such as food, fuel, transportation, and medical assistance.

Adversaries also seek technologically enabled capabilities for their own purposes, and sometimes they are influenced by demonstrations that a given technology has proven useful in practice. Indeed, sometimes the utility of such a technology is demonstrated by the United States itself. Those adversaries can acquire and adapt for their own use the technologies that the United States develops, can find alternative technologies that are more available or less expensive (e.g., commercial products), and can identify ways to negate U.S. technological advantages. They may also give the technology, or the ability to create the technology, to others for use against the United States.

To enhance and expand technological superiority, the Department of Defense (DOD) and other government agencies invest in science and technology (S&T) on an ongoing basis. (Investment in technologies for military purposes sometimes has benefits for the civilian world as well.) These investments cover a broad range, from fundamental science that might eventually support national security needs, broadly defined, to specific development and eventual production efforts intended to address particular national security problems. (In some cases, the national security problem for the United States is the possibility that an actual or potential adversary will develop a new capability.) In addition, the U.S. government adapts technologies originating in the civilian sector, without national security in mind, to national security needs.

The development of technology for national security needs is a complex endeavor, given the strategy of technological superiority as well as changes in the technological and societal environment. These changes are discussed in greater detail in Section 1.4, “Emerging and Readily Available Technologies of Military Significance.”

The U.S. Office of Management and Budget uses the following definitions for research and development:2

• Basic research is defined as “systematic study directed toward fuller knowledge or understanding of the fundamental aspects of phenomena and of observable facts without specific applications toward processes or products in mind. Basic research, however, may include activities with broad applications in mind.” An example might be research in quantum computing, a field that is at the forefront of basic research even as its potential for revolutionary advancements in computing is acknowledged [even without specific applications in mind].

• Applied research is defined as “systematic study to gain knowledge or understanding necessary to determine the means by which a recognized and specific need may be met.” An example is research to improve flight control for remotely piloted aircraft.

• Development is defined as “systematic application of knowledge or understanding, directed toward the production of useful materials, devices, and systems or methods, including design, development, and improvement of prototypes and new processes to meet specific requirements.” An example is technical work needed to meet a particular range requirement for a particular remotely piloted aircraft.

The categories of activity described above speak to how the DOD may invest in S&T research, from which may emerge findings and results that can lead to military applications. But, of course, the DOD does not live in a closed environment, and today it also keeps track of civilian S&T that might have military application. Indeed, civilian S&T are sometimes more mature and developed than S&T overtly developed for military purposes. Civilian science and technology may be introduced at any appropriate stage.

___________________

2 Executive Office of the President, Office of Management and Budget Circular No. A–11 (2012), Section 84, page 11, available at http://www.whitehouse.gov/sites/default/files/omb/assets/a11_current_year/a_11_2012.pdf.

1.2 ETHICAL, LEGAL, AND SOCIETAL ISSUES IN SCIENCE AND TECHNOLOGY

U.S. investment in science and engineering research and development (R&D) has been substantial, and its results have helped to shape physical and social landscapes throughout the world. Policy makers seek new science and technology largely because of the larger range of policy and programmatic options they afford. But efforts to develop new S&T have also raised concerns about a variety of ethical, legal, and societal issues (ELSI).3 Furthermore, many such concerns emerge from the increasingly global scope of certain new technologies and the applications these technologies enable.

This report uses the adjective “ethical” to describe issues that are matters of principle (what people regard as right). By contrast, “social” or “societal” is used in reference to issues that are matters of interests (what people regard as desirable). Often the two will overlap. People should always desire the things that they believe are right. However, they may also desire things without invoking a moral principle. Both can refer to how choices are made, which actions are taken, and what outcomes arise. Ethical issues are often illuminated by analysis (e.g., philosophy) and social issues by empirical research (e.g., psychology, sociology). However, each can inform the other (as when analysis suggests topics for empirical research or when such research identifies behavior worth analyzing).

As for the relationship between law and ethical/societal issues, law is intrinsically a part of those issues. Law establishes authority to decide questions (who decides), sets substantive limits on the content of decisions (what gets decided), and creates processes or procedures for decision making (how decisions get made). Law can channel how policy makers make decisions when ethical or societal consensus is lacking, and indeed law is often the essential point of departure for a consideration of ethical or societal issues. Legal concerns often become more salient as a given weapons concept unfolds from R&D to deployment to use.

However, against the backdrop of an evolving legal context and understanding is the reality that law and ethics are not identical, and even well-established law cannot be the final word on ethical and societal issues for several reasons:

• Established law may not even address ethical or societal issues that are important in any given instance. The relationship of legal, ethical, and societal factors is not always straightforward, although they do overlap in some cases. In general, the law is supposed to reflect the ethical, as well as the practical, values of the community to which it applies. Law can thus be an expression of both ethical and societal concerns, but it is not always so. By contrast, ethical and societal considerations are not bounded by their expressions in law; indeed, some are not captured by law at all, perhaps because it may not be possible to condense an ethical or societal concern to a simple expression of black-letter law. Most importantly, in many cases the emergence of ethical and societal concerns leads the development of law. In this interval, decision makers have to cope with such concerns and the controversies they may engender in the absence of formal (e.g., legal) guidance for their decisions.

___________________

3 The acronym ELSI stands for “ethical, legal, and societal issues” and is strictly speaking a noun. However, this report uses the acronym as an adjective.

• The interpretation of established law may depend on the particular facts and circumstances of any research problem. For example, a law may prohibit the use of human subjects under conditions that expose those subjects to significant danger. What counts as “significant” danger? Resolving this question is, by definition, not a matter for law unless the law provides some specific definition for “significant”—which it often does not. Moreover, there is often profound disagreement about what is ethical, a disagreement often reflected in laws that are ambiguous or incomplete. Law, which is usually designed to withstand rapid changes in popular opinion, may be unclear in its practical application. Thus, a debate rages today within the United States about the scope of constitutional protections when drones are used to carry out targeted killings, and disagreement about the morality or “rightness” of that use is even more heated. That is, new circumstances may highlight tensions between ethical and legal constructs that might otherwise be overlooked.

• The ethical and societal environment extant at the time a law might be applied could be very different from that at the time the law was formulated. Although some degree of ethical or societal consensus may have to be present when a given law is enacted or otherwise goes into force, that consensus may no longer be operative at the moment policy makers must make a decision about a given research effort. That is, laws themselves are sometimes overtaken by events that call into question some of their underlying but unstated ethical assumptions. Similar considerations apply for new technological capabilities that may not have been anticipated in the initial formulation of a law.

• Strategic or tactical concerns also may not line up well with ethical considerations. For example, a decision to develop a new weapon system for use under particularly exigent circumstances might be considered by some to be ethically objectionable (e.g., because of the bad precedents its use might set) and by others to be tactically necessary (e.g., because of the lives its use might save in a particular situation).

If any of these reasons is relevant to a given decision-making situation, the law may not by itself be in any way final or dispositive. In such cases, decision makers have no choice but to refer to the ethical principles that they believe were inherent in the initial formulations of the law.

Research and deliberation can guide the examination of ethical, legal, and societal concerns. Without such examination, public policies and programs may not be stable and sustainable. Law and regulation are expressions of public policy that reflect societal concerns and establish norms or standards regarding how to address those concerns.

ELSI concerns regarding S&T are not new.4 In the years since World War II, for example, governments have made efforts to come to grips with some of the ethical concerns that result from developments and research practices in S&T. These efforts span a broad range, and they include (but are not limited to) the following:

• In 1946, the postwar Nuremberg trials began proceedings that resulted in the convictions of a number of German physicians and bureaucrats who conducted or facilitated horrific medical experiments on concentration camp prisoners. These trials have become an important point of departure for international discussions on bioethics issues.

• In 1972, the United States signed the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction (usually known as the Biological Weapons Convention (BWC)),5 in part for ethical reasons.6 The BWC bans “the development, production, stockpiling, acquisition and retention of microbial or other biological agents or toxins, in types and in quantities that have no justification for prophylactic, protective or other peaceful purposes,” and “weapons, equipment or means of delivery designed to use such agents or toxins for hostile purposes or in armed conflict.” The actual use of biological weapons is prohibited by the 1925 Geneva Protocol.7

• In 1979, the Belmont report of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research presented three basic ethical principles regarding the conduct of biomedical research involving human subjects: respect for persons (e.g., research subjects should be treated as autonomous), beneficence (e.g., research subjects should not be harmed), and justice (benefits and costs of research should be shared equitably).8 The Belmont report and other reports by the National Commission formed the basis of regulations implementing these principles that govern the conduct of most federally supported research involving human subjects. These regulations, usually known collectively as the Common Rule, require institutions to establish institutional review boards (IRBs) that approve, modify, or reject such research.

___________________

4 An overview of this subject can be found in Carl Mitcham, Encyclopedia of Science, Technology, and Ethics, Macmillan Reference, Detroit, Mich., 2005.
5 See http://www.opbw.org/.
6 The U.S. decision to sign the BWC was also influenced by the conclusion of the U.S. military that biological weapons had little military utility and that signing the convention would not deprive the United States of a significant military capability.
7 The 1925 protocol is formally known as the Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare. See http://www.un.org/disarmament/WMD/Bio/1925GenevaProtocol.shtml.
8 The Belmont report can be found at http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html.

• In the late 1980s, the Human Genome Project (HGP) established a program of research on ethical, legal, and societal issues associated with sequencing the human genome. Such issues include questions of how genetic information should be interpreted and used, who should have access to it, and how people could be protected from the harm that might result from the improper disclosure or use of such information.

• In 1993, the United States signed the Chemical Weapons Convention (CWC),9 in part for ethical reasons. The CWC bans the development, production, stockpiling, and use of chemical weapons, although the CWC acknowledges the benefits of peaceful chemistry and the desire to promote free trade in chemicals and international cooperation in chemical activities not prohibited by the convention.

• In 2001, the National Nanotechnology Initiative (NNI) was launched. One of the NNI’s goals is promoting the responsible development of nanotechnology, an important component of which is the consideration of the ethical, legal, and societal implications associated with nanotechnology research and development, and the development of plans for addressing environmental, health, and safety implications as well. Some of the issues include how applications of nanotechnology research are introduced into society; how transparent the related decision-making processes are; and how sensitive and responsive policies are to the needs of the full range of stakeholders. To help explore the ethical, legal, and societal issues associated with nanotechnology research, NNI agencies support two centers for nanotechnology in society, at Arizona State University and the University of California, Santa Barbara, and also incorporate ELSI components in their new nanotechnology R&D programs.

Nongovernmental organizations and individuals have also mounted important efforts, which include the following:

• In 1955, the Russell-Einstein manifesto addressed the dangers of nuclear war, arguing that the use of nuclear weapons threatened the continued existence of mankind.

___________________

9 See http://www.opcw.org/chemical-weapons-convention/about-the-convention/.

• In 1964, the Declaration of Helsinki was adopted by the World Medical Association as a statement of ethical principles for medical research involving human subjects, including research on identifiable human material and data. Since then, the declaration has undergone several revisions and clarifications.

• In 1974, the paramountcy clause was first included in a code of engineering ethics. It obligates engineers to “hold paramount the safety, health and welfare of the public and protect the environment in performance of their professional duties.”10

• In 1983, the U.S. Catholic Bishops issued their Pastoral Letter on War and Peace, a document that spoke to the dangers of nuclear war from an ethical perspective grounded in Catholic theology.11

• In 2005, the National Council of Churches issued an open letter titled “God’s Earth Is Sacred: An Open Letter to Church and Society in the United States,” an ecumenical statement on the environment that argued “the central moral imperative of our time is the care for Earth as God’s creation.”12

• In 2009, the National Academies issued the third edition of On Being a Scientist, which notes that “the standards of science extend beyond responsibilities that are internal to the scientific community. Researchers also have a responsibility to reflect on how their work and the knowledge they are generating might be used in the broader society.”13

___________________

10 See Carl Mitcham, Encyclopedia of Science, Technology, and Ethics, Macmillan Reference, Detroit, Mich., 2005, p. 265; and Charles E. Harris, Jr., Michael S. Pritchard, and Michael Jerome Rabins, Engineering Ethics: Concepts and Cases, Wadsworth Publishing, Belmont, Calif., 1995.
11 The letter can be found at http://old.usccb.org/sdwp/international/TheChallengeofPeace.pdf.
12 The letter can be found at http://www.ncccusa.org/news/godsearthissacred.html.
13 The second edition of On Being a Scientist, issued in 1995, said: “Even scientists conducting the most fundamental research need to be aware that their work can ultimately have a great impact on society . . . [and] tremendous societal consequences. The occurrence and consequences of discoveries in basic research are virtually impossible to foresee. Nevertheless, the scientific community must recognize the potential for such discoveries and be prepared to address the questions that they raise. If scientists do find that their discoveries have implications for some important aspect of public affairs, they have a responsibility to call attention to the public issues involved. . . . science and technology have become such integral parts of society that scientists can no longer isolate themselves from societal concerns.” See National Research Council, On Being a Scientist, National Academy Press, Washington, D.C., 1995, pp. 20-21.

In some instances, the efforts of government and nongovernment bodies have been intimately intertwined. A well-known example is the story of the National Institutes of Health (NIH) Recombinant DNA Advisory Committee and the Asilomar conference in the early 1970s. In 1974, a letter published in Science described the recommendations of the National Academy of Sciences’ Committee on Recombinant DNA Molecules,14 including a recommendation that life scientists voluntarily refrain from conducting certain kinds of experiments involving recombinant DNA until the potential hazards were better understood. Largely in response to this letter, the NIH in 1974 established the Recombinant DNA Advisory Committee to address public concerns regarding the safety of manipulating genetic material through the use of recombinant DNA techniques.15 In 1975, and with the support of the NIH and others, the Asilomar conference brought together many of the world’s leading researchers on recombinant DNA to consider the hazards of such research. One key outcome of the conference was the establishment of voluntary guidelines to improve the safety of recombinant DNA technology.16

1.3 ELSI CONSIDERATIONS FOR SCIENCE AND TECHNOLOGY IN A NATIONAL SECURITY CONTEXT

The development of any new science or technology often raises ELSI concerns. But the scope and nature of these concerns depend on the specific science or technology in question and the context in which it is found. This report focuses on the ethical, legal, and societal issues that may be associated with science and technology (S&T) of relevance to military problems.

The report assumes that defending and protecting national security and protecting the individuals involved are widely regarded as morally sound and ethically supportable societal goals. A related premise is that individuals who are part of the national security establishment (that is, those who make decisions for the government relevant to national security) want to behave ethically.

As noted at the outset of this chapter, technology plays a critical role in the U.S. approach to national security, and technologically derived advantages can help both to defeat adversaries and to reduce friendly and noncombatant casualties.

___________________

14 Paul Berg, David Baltimore, Herbert W. Boyer, Stanley N. Cohen, Ronald W. Davis, David S. Hogness, Daniel Nathans, Richard Roblin, James D. Watson, Sherman Weissman, and Norton D. Zinder, “Potential Biohazards of Recombinant DNA Molecules,” Science 185(4148):303, 1974, available at https://www.mcdb.ucla.edu/Research/Goldberg/HC70A_W11/pdf/BergLetter.pdf.
15 National Institutes of Health, “About Recombinant DNA Advisory Committee (RAC),” available at http://oba.od.nih.gov/rdna_rac/rac_about.html.
16 Paul Berg et al., “Summary Statement of the Asilomar Conference on Recombinant DNA Molecules,” Proceedings of the National Academy of Sciences 72(6):1981-1984, 1975, available at http://authors.library.caltech.edu/11971/1/BERpnas75.pdf.

At the same time, individuals may disagree about what national security requires and how best to promote and achieve it. Some of those disagreements are ethical in origin. That is, a nation that behaves ethically has to find an appropriate balance between national security and the protection of “national rights” versus the protection of individual rights and other ethical norms.

Still, the notion of deliberately causing death and destruction, even in defense against external threats, gives many people pause. How much death or destruction? Whose death and destruction? What kinds of destruction and death (e.g., quick and painless death versus slow and painful death)? Under what circumstances? At their core, such questions are ethical questions, and those who engage in combat, those who support combatants, directly or indirectly, and the citizenry whom they defend have a considerable stake in the answers to these questions.

Ethical concerns about military technology are not new. Deuteronomy 20:19 says that one should not cut down fruit trees in preparing for the siege of a city. Daniel Headrick notes that in 1139 Pope Innocent II banned as a religious matter the use of crossbows because they were so devastating, even in the hands of an untrained fighter, against the powerful, noble, and revered knight in plate armor.17 (This ban applied only to use against Christians.18) In the wake of World War I, the London Naval Treaty of 1930 outlawed unrestricted submarine warfare, a practice that allowed submarines to sink civilian ships without warning and without providing for the safety of their crews.19

As a more recent example of ELSI concerns regarding science and technology for military and national security use, it is instructive to consider the revelations of Senate committee hearings in the 1970s. These hearings revealed that the CIA had been conducting experiments involving the administration of hallucinogenic drugs to nonconsenting subjects who were U.S. citizens.

___________________

17 Daniel R. Headrick, Technology: A World History, Oxford University Press, New York, 2009. Cited in Patrick Lin, “Robots, Ethics, & War,” Stanford Law School, 2010, available at http://cyberlaw.stanford.edu/blog/2010/12/robots-ethics-war.
18 Bernard Brodie and Fawn M. Brodie, From Crossbow to H-Bomb: The Evolution of the Weapons and Tactics of Warfare, Indiana University Press, Bloomington, Ind., 1973.
19 See http://www.microworks.net/pacific/road_to_war/london_treaty.htm. In the case of both crossbows and submarines, these bans were subsequently ignored as the military value of using these weapons in the forbidden ways became more important.

According to the 1977 Senate Report of the Select Committee on Intelligence and Committee on Human Resources,20

[CIA] research and development programs to find materials which could be used to alter human behavior were initiated in the late 1940s and early 1950s. These experimental programs originally included testing of drugs involving witting human subjects, and culminated in tests using unwitting, nonvolunteer human subjects. These tests were designed to determine the potential effects of chemical or biological agents when used operationally against individuals unaware that they had received a drug. . . .

The research and development program, and particularly the covert testing programs, resulted in massive abridgments of the rights of American citizens, sometimes with tragic consequences. The deaths of two Americans can be attributed to these programs; other participants in the testing programs may still suffer from the residual effects. While some controlled testing of these substances might be defended, the nature of the tests, their scale, and the fact that they were continued for years after the danger of surreptitious administration of LSD to unwitting individuals was known, demonstrate a fundamental disregard for the value of human life.

The report noted that the original rationale for this and other similar programs was based on U.S. concern over the use of chemical and biological agents by the Soviet Union and the People’s Republic of China in interrogations, brainwashing, and in attacks designed to harass, disable, or kill Allied personnel. Such concerns created pressure for “a ‘defensive’ program to investigate chemical and biological agents so that the intelligence community could understand the mechanisms by which these substances worked and how their effects could be defeated.”

But the 1977 report went on to note that “the defensive orientation soon became secondary. Chemical and biological agents were to be studied in order ‘to perfect techniques . . . for the abstraction of information from individuals whether willing or not’ and in order to ‘develop means for the control of the activities and mental capacities of individuals whether willing or not.’”

According to the 1977 report, the program of clandestine testing of drugs on U.S. citizens is believed to have been suspended in 1963. Then-CIA Director Richard Helms argued that because of the suspension of covert testing, the Agency’s

positive operational capability to use drugs is diminishing, owing to a lack of realistic testing. With increasing knowledge of the state of the art, we are less capable of staying up with Soviet advances in this field. This in turn results in a waning capability on our part to restrain others in the intelligence community (such as the Department of Defense) from pursuing operations in this area.

Helms attributed the cessation of the unwitting testing to the high risk of embarrassment to the Agency as well as the

___________________

20 U.S. Senate Select Committee on Intelligence and Committee on Human Resources, “Project MKUltra, The CIA’s Program of Research in Behavioral Modification,” Joint Hearing before the Committee on Intelligence and Committee on Human Resources, 95th Congress, 1st session, August 3, 1977, available at http://www.intelligence.senate.gov/pdfs/95mkultra.pdf.

advanced nation-states, private organizations such as nongovernmental welfare organizations but also crime cartels and well-funded terrorist organizations, small groups of independent “freelance” actors, and even individuals. Of course, within this wide range, there is significant variation in their ability to take advantage of these technologies—with greater ability being associated with greater access to resources and talent.

1.5 ETHICS OF ARMED CONFLICT

The conduct of war has always raised ethical and societal concerns—and to the extent that technology is an instrument of war, the use of military technologies raises such concerns as well. For example, international law (the law of armed conflict (LOAC) as expressed in the UN Charter and the Hague and Geneva Conventions as well as a number of other treaties) today governs the conduct of armed conflict. The UN Charter describes the circumstances under which nations are permitted to engage in armed conflict. The Hague and Geneva Conventions and associated protocols govern how states may use force once conflict has started. A number of other international agreements ban the use of certain weapons, such as chemical and biological weapons,34 land mines,35 and blinding lasers.36 These international conventions—and arms control agreements more generally—are motivated in part by ELSI considerations. Chapter 4 provides some history and discusses LOAC and other international law in greater detail.

As the nature of conflict, technology, and the larger world environment have evolved over the last several decades, a number of changes have posed new ethical challenges to existing international legal regimes and to our understanding of conflict. These changes include the following:

• Nonstate adversaries. State-on-state conflict, at least between industrialized nations, has given way to what some have called “violent peace.” Although nation-states are the primary focus of international treaties and agreements (and the Geneva Conventions bind nations), actual and potential adversaries of the United States include not only near-peer nation-states but also developing nations and terrorist groups that are not affiliated with any particular nation. Additional Protocol II of the Geneva Conventions (1977) fleshes out LOAC as it applies to non-international armed conflict (that is, armed conflict not involving two states). In addition, the United Nations acknowledges the significance of nonstate actors in United Nations Security Council Resolution 1540,37 which specifically obliges states “to refrain from supporting by any means non-State actors from developing, acquiring, manufacturing, possessing, transporting, transferring or using nuclear, chemical or biological weapons and their delivery systems.”

___________________

34 Geneva Protocol, 1925; Chemical Weapons Convention; Biological Weapons Convention.
35 The Ottawa Treaty, 1999. The United States is not a party to this treaty, although as a matter of policy, it has mostly complied with its main provisions.
36 Blinding Laser Protocol of the Convention on Conventional Weapons, 1995.
37 The resolution can be found at http://www.un.org/en/ga/search/view_doc.asp?symbol=S/RES/1540%20(2004).

• Asymmetric warfare. U.S. advantages in conventional military power have led many adversaries to seek other ways to challenge the United States on the battlefield. Rather than seeking to overcome U.S. strengths, asymmetric tactics seek to take advantage of U.S. weaknesses, vulnerabilities, and dependencies—and one element of such tactics may be to ignore, disregard, or even take advantage of constraints imposed by traditional understandings of the laws of war. A long-standing example is the way terrorists and insurgents deliberately blend with noncombatant civilians on an expanded and nontraditional battlefield, so that distinctions between the two categories are increasingly blurred in many situations of conflict. More recently, concerns have arisen that U.S. military forces may be excessively vulnerable to cyber threats because of their great dependence on information technology.

• Volunteer service in the armed forces. In the last 50 years, U.S. policy regarding military service has changed dramatically, from near-universal conscription of male citizens to all-volunteer armed forces. Today, an increasingly small fraction of the population has served directly in the armed forces. Most U.S. civilians lack firsthand knowledge of issues (some of them ethical) that may be associated with armed conflict, and fewer civilians know others who have served in the armed forces. Thus, many do not have a basis for making informed ELSI judgments about technologies that may be useful in modern warfare. In addition, current members of the armed forces have voluntarily relinquished certain rights to personal autonomy in choosing to be subject to a military chain of command, although the scope and nature of the rights they have surrendered are not necessarily clear in all cases. The fact of volunteering means that these individuals cannot say that they did not choose to be subject to military rules, which may require them to do things that they could not be required to do in civilian life.

1.6 WHAT IS AND IS NOT WITHIN THE SCOPE OF THIS REPORT

In 2010, the Defense Advanced Research Projects Agency (DARPA) asked the National Academies to develop and articulate a framework for policy makers, institutions, and individual researchers to use to think through ethical, legal, and societal issues as they relate to democratized technologies with military relevance. In DARPA’s usage, “democratized technologies” are technologies with rapid rates of progress and low barriers to entry, as illustrated in Chapters 2 and 3. However, the committee believed that the term “democratized technologies” is easily misunderstood, and this report thus uses the term “emerging and readily available technologies” (ERA technologies).

The ethical, legal, and societal scope of this report encompasses three categories of concern that in the committee’s judgment are central to any consideration of the ethics associated with new military technologies:

• The conduct of research. Conduct includes the selection of research areas, the design of particular research investigations (e.g., protocols, experiments), and the execution of those investigations. ELSI concerns relating to the conduct of research focus primarily on the impact of doing the research on the subjects that may be involved, whether by choice or by chance. “Subjects” here are defined broadly—communities, animals, individuals concerned about the environment, and workers, in addition to those parties that are explicitly acknowledged as being research subjects.38 (ELSI concerns related to acknowledged research subjects are important, but there is today a well-developed infrastructure to address such concerns.) In a military context, ethical, legal, and societal issues related to the conduct of research also include matters of classification and the impact that such classification may have on oversight and review.

• The applications of research as they relate to intended capabilities enabled by research. ELSI concerns associated with specified applications fall into two categories: concerns over the intended effects or purposes of the application and concerns over undesired effects (“side effects”) that might occur when the application has its intended effects. An example of the first category is R&D intended to develop a laser to blind soldiers on the battlefield—one ELSI concern relates to whether it is in fact ethical to develop a weapon for such a purpose. (Some of the history regarding an international ban on the use of lasers designed to blind soldiers is recounted in Chapter 3.) An example of the second category is R&D on a vaccine against a biological weapon. In this case, there is little ELSI controversy over the intended result, namely, some degree of immunity to that weapon. However, if the side effects of the vaccine (which might include severe allergic reactions, pain, or muscle weakness) were significant and widespread, ELSI concerns could arise over whether the benefits were worth the costs (e.g., how to account for benefits to the individual soldier versus benefits to the fighting force as a whole). Widely adopted applications may also require, impede, facilitate, or encourage institutional or organizational changes, and there may be ethical dimensions to such changes as well. ELSI concerns related to technologies that can be used for both military and civilian purposes are an important subset of the second category. A decision to pursue one technology for an application in one context (a military context) may well raise ELSI concerns about its use in another context (e.g., a civilian context) because of different societal norms and laws/regulations that might be operative in the latter. One contemporary example is the law enforcement use of surveillance drones developed for military purposes, a use that has raised public concerns about privacy.39

___________________

38 The term “subject” in this context is used informally, and in particular is not tied to any legal definition of the term, as might be provided (for example) by regulations of the Department of Health and Human Services.
39 See http://www.cbsnews.com/8301-201_162-57521768/more-than-a-third-fear-drone-use-in-u.s.-poll/.

• Unanticipated, unforeseen, or inadvertent ELSI consequences of either research or applications. These consequences are usually manifested by something going awry, as when research does not proceed as expected (e.g., experimental control is lost) and thus causes harm outside the original bounds on the research, or when unanticipated applications raise additional ELSI concerns.40 ELSI concerns in this domain often relate to applications that are not intended by the proponents of such research. For example, an application may be used in ways entirely unanticipated or unimagined by its creators, and thus bring into play a set of side effects that were also unanticipated. These concerns are thus particularly difficult to imagine ahead of time. After due diligence has been exercised, it is also necessary to put into place a process that monitors how applications are used and that can respond quickly when unanticipated ELSI side effects manifest themselves. Chapter 4 discusses approaches for reducing the likelihood of unpleasant surprises.

___________________

40 “Unforeseen” in this context means unforeseen by the proponents or the performers of the research.

For these categories of concern, the committee sought to build on previous work that addresses ethical, legal, and societal issues associated with S&T and with the military. In many cases, however, the committee found little work at the nexus of ethics, emerging technologies, and military applications. Nevertheless, some relevant work includes the following:

• Work sponsored by the International Society of Military Ethics (ISME), which was established to examine professional military ethics in all dimensions, including but not limited to military technology.41 Of particular note is a special issue of the Journal of Military Ethics (Volume 9, Issue 4, 2010), published by the ISME, entitled Ethics and Emerging Military Technologies, with articles such as:

—“Postmodern War,” by George R. Lucas, Jr.;
—“The Ethics of Killer Applications: Why Is It So Hard to Talk About Morality When It Comes to New Military Technology?,” by P.W. Singer;
—“Ethical Blowback from Emerging Technologies,” by Patrick Lin;
—“The Case for Ethical Autonomy in Unmanned Systems,” by Ronald C. Arkin;
—“Moral Predators: The Duty to Employ Uninhabited Aerial Vehicles,” by Bradley Jay Strawser;
—“Saying ‘No!’ to Lethal Autonomous Targeting,” by Noel Sharkey;
—“The Ethics of Cyberwarfare,” by Randall R. Dipert; and
—“‘Cyberation’ and Just War Doctrine: A Response to Randall Dipert,” by Colonel James Cook.

• A February 2012 publication by the Royal Society entitled Neuroscience, Conflict and Security.42 This study examined the ethics of neuroscience for military purposes and was charged with reviewing the current policy, legal, and ethical frameworks governing military applications of neuroscience.

• A RUSI publication, circa 2008,43 which addressed the ethics and legal implications of military unmanned vehicles.

• A framework outlined by the Consortium for Emerging Technologies, Military Operations and National Security (CETMONS) for assessing the implications of emerging technologies for military capability and national security.44 This framework considers issues related to a technology’s implications for civil society; civil reaction affecting military missions or civil society; external threats to U.S. security; the impact on treaties and military law; and the impact on military doctrine, military culture, military education, and military operations.

___________________

41 See http://isme.tamu.edu. A European perspective on military ethics can be found at http://www.euroisme.org.
42 See http://royalsociety.org/policy/projects/brain-waves/conflict-security/.
43 Elizabeth Quintana, The Ethics and Legal Implications of Military Unmanned Vehicles, British Computer Society, Royal United Services Institute, available at http://www.rusi.org/downloads/assets/RUSI_ethics.pdf.
44 Consortium for Emerging Technologies, Military Operations, and National Security, “Framework for Assessing the Implications of Emerging Technologies for Military Capability and National Security,” 2013, available at http://lincolncenter-dev.asu.edu/CETMONS/index.php/research-areas/framework-assessment.

• A 2004 report of the National Research Council titled Biotechnology Research in an Age of Terrorism (aka the Fink report), which addressed “technologies [in the life sciences that] can be used legitimately for human betterment and [also] misused for bioterrorism [through the creation of biological weapons].”45 In this context, the 2004 report noted that “biological scientists have an affirmative moral duty to avoid contributing to the advancement of biowarfare or bioterrorism. . . . scientists can and should take reasonable steps to minimize this possibility [that knowledge they generate will assist in advancing biowarfare or bioterrorism].”

In addition, a 2008 report of the National Research Council, Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Assessment,46 developed a framework for the systematic assessment of information-based programs being considered or already in use for counterterrorist purposes. This framework posed a set of questions focused on the effectiveness, lawfulness, and consistency with U.S. values of such programs, the answers to which would be useful to those making decisions about such programs.

The committee notes that perspectives on ethical, legal, and societal issues related to science, technology, and military affairs are hardly unitary. Even within a single nation such as the United States, different constituencies are likely to have different ethical stances toward the same issue. Furthermore, perspectives on ethics may vary with military might. A nation that is accustomed to military superiority on the battlefield may well have an ethical perspective different from that of other nations without such power (Box 1.1). The ethical perspectives of allies, adversaries, and neutral observers may well be different from that of the United States; under some circumstances, the differences may have consequences for U.S. freedom of action.

Addressing differences in ethical perspectives has two aspects, only one of which is covered in any detail in this report. Chapters 2 through 5 of this report address the first aspect, namely, the identification and articulation of possibly competing ethical perspectives. To properly consider ethical, legal, and societal issues, decision makers must begin by understanding the scope and nature of those issues. Part of that understanding is an explicit decision regarding whose ethical perspectives should be considered and taken into account.

The second aspect of addressing differing ethical perspectives is just as important. Once competing ethical perspectives have been identified, how should they be weighed and who should weigh them? Furthermore, on what basis should a party whose ethical perspectives are not adequately included in any policy decision, however inclusive and honest the decision-making process, be expected to trust and acquiesce in that decision?

___________________

45 National Research Council, Biotechnology Research in an Age of Terrorism, The National Academies Press, Washington, D.C., 2004.
46 National Research Council, Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Assessment, The National Academies Press, Washington, D.C., 2008.

Box 1.1
Possible Ethical, Legal, and Societal Implications of Seeking Technological Superiority

As a matter of U.S. policy, superior military technology is a cornerstone of the U.S. military’s strategic posture. . . . DOD Research and Engineering (R&E) programs are needed to create, demonstrate, and partner in the transition to operational use of affordable technologies that can provide a decisive military superiority to defeat any adversary on any battlefield. . . . [Furthermore] continued technology development should enable future military superiority.1

The U.S. declaratory policy of seeking technological military superiority over U.S. adversaries has an overarching impact on ethical, legal, and societal issues that involve the R&D associated with new technologies of military relevance. But a detailed examination of the ELSI implications of this policy is not within the scope of this project’s statement of task, which implicitly asks the committee to assume the validity of this policy. Some aspects of this policy that may have ELSI implications include the following:

• Weapons to implement the policy of technological superiority have to conform to the laws of war, but since technology often outstrips the laws of war, the laws of war per se may not be much of a constraint. Thus, the development of such weapons stresses existing understandings of law and ethics that may be operative before the introduction of such weapons.

• Technological superiority may provide transient rather than long-lasting advantage as adversaries learn to counter or obtain the technologies available to the United States. However, even transient advantages can be tactically significant in the short term (in terms of enabling U.S. forces to perform missions at lower human and economic cost), especially if they come as a surprise to an unprepared adversary.

• Adversaries, both real and potential, react to the introduction of new U.S. military technologies. The availability of such technologies to the United States may deter adversaries from taking hostile actions against U.S. interests, may cause adversaries to seek to adopt those technologies for their own use, or may cause them to seek to counter the advantages conferred by U.S. use. Indeed, the first successful uses of a technologically superior weapon may themselves be signals to adversaries about the utility of such weapons. Observed and anticipated adversarial responses to technological superiority and associated effects on stability may have ELSI implications.

• Because transient advantages dissipate (by definition), additional work is always needed to find new generations of technologically superior weapons—and enduring advantages can be secured only by making a commitment to constant reinvestment in technology.

• The first user of a weapon often sets precedents that other nations follow for the circumstances under which such a weapon can be used. Indeed, such precedents may be the initial seeds out of which international law and rules of the road governing such use can grow.

• A focus on technological superiority may cause the United States to neglect the “soft power” dimensions of its security strategy. In 2007, Secretary of Defense Robert Gates argued that in the future, “success [in asymmetric warfare] will be less a matter of imposing one’s will and more a function of shaping behavior—of friends, adversaries, and most importantly, the people in between.”2

• Presumptions of technological superiority may deflect attention from consideration of ethical, legal, and moral issues associated with military applications of technology. The prospect of reciprocal use has historically been a spur for reflection on the ethical implications of military applications of technologies, whereas asymmetric advantage has historically had the effect of deferring and diffusing ethical deliberation.

Because it treats the policy of seeking technological superiority over U.S. adversaries as a given, this report does not assess or even address the issues described above in any systematic way. Nevertheless, policy makers may wish to consider this policy as an area for future ELSI analysis that may have impacts on ELSI considerations of individual technologies or research projects.

___________________

1 Thomas M. McCann, Defense Manufacturing Management Guide for Program Managers, October 16, 2012, p. 230, available at https://acc.dau.mil/docs/plt/pqm/mfg-guidebook-10-16-12.pdf.
2 See http://www.defense.gov/speeches/speech.aspx?speechid=1199.

In both national and international law, legal practitioners and scholars have developed approaches to balancing competing or conflicting interests, even when those conflicting interests are well grounded and legitimate. Examples of such approaches include procedural requirements such as burden-of-proof obligations; criteria to ensure that the impact on the interests that are adversely affected is minimized to the extent feasible given the conflicting interests at stake; and appeal to case law to identify binding or guiding precedents.

As a broad generalization, approaches to balancing competing ethical claims and to comparing the ethics of different courses of action are considerably less developed. As a practical matter, it is often true that individuals presented with an ethical dilemma in a specific case come to similar conclusions about the appropriate course of action, even if they would disagree vehemently on the underlying reasoning or ethical theories. And in some cases, examination of similar cases from the past may help to shed some light on ethical matters. But to the extent that any party’s ethical beliefs are deeply held, one might expect that party to be predisposed toward opposing any decision-making process that does not result in the accommodation of those beliefs.

In the end, if and when agreement cannot be found in contemplating any given dilemma, participants will usually engage in some ad hoc process that resolves it one way or another. It is not too strong to describe such a process as being political (and hence outside the scope of this report), and the political nature of this process serves as a reminder of the very complex milieu in which decision makers operate.

This report does not evaluate or assess the ethical, legal, and societal issues in any part of DARPA’s technology R&D portfolio. That is, although the report does identify ethical issues that are associated with some of the technologies of interest to DARPA, it does not come to any specific conclusions about the ethical, legal, or societal propriety of any particular research program or project in the DARPA portfolio.

Also, this report does not address specific operational programs. While research programs are supported because they might enable important capabilities (and thus an ELSI assessment of a given research effort necessarily entails a consideration of applications), it is rarely clear at the outset how those capabilities might be integrated into an operational program. The reason is that the latter involves many specific decisions about how the program must operate—specific personnel, specific logistics, specific command-and-control configurations, specific rules of engagement, specific mechanisms for oversight, and so on. There are of course ethical, legal, and societal issues associated with these arrangements (e.g., a given arrangement may or may not raise ELSI concerns), but because these arrangements cannot be anticipated at the research stage, addressing the ethical, legal, and societal issues associated with operational programs is not within the scope of this report.

Furthermore, research-supporting agencies have general counsels that are charged with ensuring that all programs and projects by those agencies, both external and internal, are conducted in accordance with all applicable legal requirements. Processes intended to fulfill this mandate are not addressed in this report, except insofar as they are points of departure as mechanisms for considering ethical, legal, and societal issues more broadly.

Other topics not addressed in this report under the broad rubric “the ethics of science” include scientific misconduct (e.g., data falsification, plagiarism, improper allocation of publication credit), specific laws and regulations as they might apply to specific research projects, financial conflicts of interest, the perspectives of specific religions on matters of war and peace, and the impact of classification on intellectual inquiry and academic freedom.

Last, and as noted above in this chapter, this report assumes that some precursor efforts (whether basic research or applied research/development efforts) that may lead to advanced military technologies are appropriate for the nation to pursue and can be morally justified. Thus, any debate over the fundamental ethics of doing military research at all is outside its scope.

1.7 A ROADMAP TO THIS REPORT

So that it could base its analysis, findings, and recommendations on real-world trends, the committee examined seven illustrative S&T areas: information technology, synthetic biology, neuroscience, robotics, prosthetics, cyber weapons, and nonlethal weapons. Other relevant technology domains that the committee could have chosen to address include space technologies, geoengineering technologies, and nanotechnology.47 Chapter 2 addresses the first three, which are foundational sciences and technologies that enable progress and applications in a variety of problem domains. Chapter 3 addresses the last four, which are application domains associated with specific operational military problems. To varying degrees, each of these S&T areas has many or most of the characteristics of ERA technologies in the sense defined above. That is, even without large investment, a multitude of state and nonstate actors, friendly or not, can adopt and adapt their results to a multitude of purposes. Chapters 2 and 3 examine each of these S&T areas from the perspective of technology maturity (that is, how close the science or technology in question is to producing useful applications) and possible military applications. Without attempting to be comprehensive, these chapters highlight some of the ELSI implications that emerge in each domain.

___________________

47 Although the statement of task mentioned nanotechnology as an illustrative technology for this report, the committee did not examine nanotechnology explicitly. The reason was that the U.S. government does support the National Nanotechnology Initiative—and, as noted above, within that initiative is embedded a significant ELSI component. That dedicated effort is well resourced and positioned to make meaningful statements about ethical, legal, and societal issues associated with nanotechnology.

Chapter 4 describes sources of ELSI insight, including a variety of theoretical and disciplinary approaches to ethics and insights from social sciences such as anthropology and psychology.

Chapter 5 uses the sources of Chapter 4 and ELSI commonalities that appear in many of the technologies discussed in Chapters 2 and 3 to articulate questions for various stakeholders that might be used when contemplating the development of a technology or an application. These questions are useful for identifying possible ethical, legal, and societal issues that might arise from such development, and they are the heart of the framework requested in DARPA’s charge to the committee.

Chapter 6 considers the limitations of a priori analysis and proposes two additional techniques for augmenting and increasing the value of what such analysis can provide. The chapter explores deliberative processes as a way to expand the scope of ELSI insights that might be relevant, and an adaptive approach to planning that can mitigate some of the ELSI uncertainties that can accompany any given development.

Chapter 7 describes various mechanisms that have been used to address ethical, legal, and societal issues arising from S&T endeavors, as well as considerations for the use of such mechanisms in a military context.

Chapter 8 provides the report’s findings and recommendations.