Responsibility and Creativity in Engineering

CAROLINE WHITBECK

Online Ethics Center for Engineering and Science

Case Western Reserve University

From Emerging Technologies and Ethical Issues in Engineering: Papers from a Workshop, October 14–15, 2003

Engineering ethics, like medical ethics, has become a branch of the larger field of practical and professional ethics. Engineers first formulated ethical norms specifically for engineering practice in the first half of the twentieth century, when many professional engineering societies developed codes of ethics for their members. Since the National Project on Philosophical Ethics and Engineering in 1978–1981, philosophers and other scholars in the humanities have also weighed in on the subject. This paper examines the notions of responsibility, which is central to engineering ethics and to professional ethics generally, and creativity, which is necessary for the exercise of responsibility. The topics addressed in this paper include: professions and professional ethics; the role of engineering experience in the development of ethical guidelines for engineers; the notion of responsibility per se; the role of synthetic or creative reasoning in the fulfillment of professional responsibility; limitations on the foresight necessary for the exercise of responsibility; and bringing engineering knowledge to bear on societal choices about technology. Some of these topics have been discussed in more detail elsewhere (Whitbeck, 1998).

PROFESSIONS AND PROFESSIONAL ETHICS

My first topic is a description of professional ethics and a discussion of how the moral requirements for engineers (and other professionals) differ from the requirements for everyone else. Two characteristics distinguish the practice of professions from other occupations: (1) the mastery of a specialized body of knowledge; and (2) the application of that knowledge to securing or preserving the well-being of others.



Copyright © National Academy of Sciences. All rights reserved.





Professional societies play a major role in the development of ethical norms (see Herkert's contribution to this workshop, pp. 107–114 in this volume). In fact, their role in developing ethical norms is what distinguishes professional societies from disciplinary, technical, scholarly, and "learned" societies. Whereas professional societies focus on professional practice, these other kinds of societies focus on technical or scholarly advances in a specific discipline or field. The National Society of Professional Engineers is a purely professional engineering society; other engineering societies, such as IEEE, are both disciplinary and professional.

Although professional engineering societies began to develop codes of ethics in the early decades of the twentieth century, and the American Chemical Society did so in the 1930s, the American Physical Society did not issue its first code of ethics until 1992 (that code deals exclusively with research ethics). Before 1992, physicists seem to have considered themselves practitioners of a discipline rather than a profession. However, there is a growing realization that research (especially publicly funded research) involves the welfare of many others, including, but not limited to, the subjects or participants in the research. The American Philosophical Association is a disciplinary or learned society, and philosophers have no code of ethics, reflecting a judgment that philosophy is a discipline and not a profession. Some philosophers seem confident that they affect nothing. Philosophers who are also teachers are members of the teaching profession, however, and university professors do have a code of ethics.

THE ROLE OF ENGINEERING EXPERIENCE IN THE DEVELOPMENT OF ETHICAL CODES FOR ENGINEERS

Next, we shall consider the role of engineering experience in the development of ethical guidelines for engineers.
Ethical codes and guidelines for engineers come from many sources. Philosophers have had a hand in some of them. However, the most interesting and most valuable codes are based on engineers' experiences, the problems and pitfalls they actually encounter in their professional practice. These codes and guidelines embody the profession's accumulated wisdom about its practice, the morally significant problems that arise, and appropriate limits, priorities, and prudent measures for avoiding potential moral pitfalls. They stand closest to the Aristotelian tradition of philosophical ethics, as contrasted with top-down Enlightenment theories of ethics that attempt to deduce ethical norms from a few general principles. Engineering ethics has paid much closer attention to practical experience than at least one influential wing of biomedical ethics, which early on attempted to formulate a few abstract principles and then fit all problems and issues to those principles.

Examples of experience-based rules that set prudent boundaries are rules that limit the value of gifts an engineer can accept from business associates and the warning against working on a commission basis because it might create a conflict of interest. Examples of guidance about setting priorities for responsibilities and obligations are rules that give public health and safety priority over other important values, such as maintaining client confidentiality. (Research integrity is of central concern in engineering research.) Ethical codes and guidelines are generally "living" documents in the sense that they are revised as engineers' understanding of a moral situation evolves or as conditions of practice and the moral situations themselves change with social or technological changes. However, as I will discuss later, the rapid rate of technological and social change has made it difficult for revisions to keep pace with new problems.

THE NOTION OF RESPONSIBILITY

The notion of "responsibility" is the central moral concept in engineering ethics, and in professional ethics generally. Responsibility in the moral or ethical sense is based on the ends to be achieved rather than the acts to be performed. Responsibility typically requires the application of the specialized knowledge that characterizes a profession. As we have seen, professions are distinguished from other occupations because the practice of a profession draws on a body of expert knowledge and is directed toward securing major aspects of well-being for others. Many ethical notions are applicable to professional ethics, but the notion that best captures the special moral situation of the practitioner of a profession is professional responsibility. Professional responsibilities—exemplified by statements such as "engineers are responsible for the public safety" or "research investigators are responsible for the integrity of research"—require that relevant expert knowledge be synthesized to achieve an end. They require judgments that only those who have mastered such knowledge can make.
Whatever moral lessons we may have learned in kindergarten, we did not and could not have learned as children how to fulfill professional responsibilities. Only adults with higher cognitive skills can learn to fulfill professional responsibilities. The exercise of professional responsibility requires both competence and concern. The ethical dimension of practicing competently is highlighted for engineers by the rule in many engineering codes of ethics that forbids engineers to accept assignments beyond their competence.[1] Engineers working beyond their competence are, for that reason alone, considered to be acting unethically. Responsible engineering practice requires both competence and the exercise of sufficient care to bring that competence to bear on a given problem.

[1] At the time of this writing, research investigators do not generally recognize a professional obligation to work only within the limits of their competence, and many represent the distinction between incompetent research and unethical research as an absolute distinction. However, in some areas of research, certain sorts of incompetence, such as incompetence that creates major safety hazards to the public or people working in the laboratory, have long been recognized as derelictions of responsibility by organizations such as the American Chemical Society. I am gratified to see that the federal definition of research misconduct now recognizes that reckless, as well as intentional, behavior may be considered research misconduct.

In contrast to specifications of ethical responsibilities, ethical rules and obligations (as well as legal and organizational rules and obligations) typically specify acts that are forbidden or required—for example, "do not offer or accept bribes" or "you are obligated to disclose any conflicts of interest to all parties to an agreement." Following ethical rules and meeting ethical obligations may not require the application of professional knowledge (either "knowing how" or "knowing that") other than perhaps the ability to recognize when the conditions mentioned in the rules apply—for example, what forms a bribe might take in a specific professional context. Rules that are specific to a profession derive their moral authority from their contribution to the fulfillment of the characteristic responsibilities of the profession. For example, the stricture against abandoning a patient is specific to medicine; an engineer might sometimes be wrong to leave a project without first being assured of the presence of another engineer, but there is no general stricture against an engineer leaving a project without finding a replacement. This is because the physician-patient relationship is itself an instrument of healing, and, therefore, rupturing that relationship without finding a replacement may damage the aspect of a client's welfare that is entrusted to physicians. In contrast, the people whose safety and health are the overriding responsibility of engineers, members of the public, are generally people the engineer will never meet. Thus, the interpersonal relationship between the engineer and the people whose needs he or she must consider is not itself an ethical consideration.

The notion of responsibility we have been considering, the notion exemplified in "engineers are responsible for the public safety" or, as Michael Loui (1998) has argued, "[e]ngineers have a responsibility for the quality of their products," is responsibility in the ethical or moral sense. As Kathryn Pyne Addelson first observed, moral or ethical responsibility is "prospective" or "forward-looking," in contrast to "blame" and other notions that are backward-looking in that they are concerned with situations that have already occurred (Kathryn Pyne Addelson, Mary Huggins Gamble Professor Emerita of Philosophy, Smith College, personal communication). A moral responsibility specifies the ends to be achieved.

Responsibility is sometimes used in a second, causal sense, as in "the storm was responsible for (i.e., caused) three deaths and millions of dollars in property damage." In this sense, responsibility may not have ethical significance—for example, when the causal agent is something, such as a storm, that is not a moral agent. If a causal agent is also a moral agent, that is, one who is capable of acting morally, then the causal agent usually bears some moral responsibility for dealing with the situation he or she has created. Considered by itself, however, responsibility in the causal sense is not an ethical notion.

Responsibility is also used in a third sense, as a synonym for being accountable; in this sense, responsibility simply specifies to whom a rational agent must answer. For example, "the CEO is responsible (i.e., accountable) to the board." (Notice that when responsibility is used in this sense the phrase is always "responsibility to.") Responsibility in the sense of accountability applies only to rational agents. It does not specify what is required of the agent, but only who will judge the adequacy of the agent's actions.

In addition, there is what John Ladd (1970) has called "official responsibility," which is limited to what one is charged to do as a result of holding a particular job or office within an organization. As Ladd has argued, official or organizational responsibilities differ significantly from professional responsibilities and other moral responsibilities in that official responsibilities attach to job categories and impersonal roles rather than to particular people in particular circumstances, with particular histories and human relationships, who are subject to the moral demands that they carry with them.
Furthermore, official or organizational responsibilities, unlike moral responsibilities, are "alienable"; that is, an official responsibility is fully transferable from one person to another so that the first no longer has it. Notice that the official responsibilities in a job description could conceivably require unethical behavior. Nevertheless, the person holding the job still has a professional responsibility to draw attention to safety problems.

Professions claim to be autonomous; that is, they claim that only members of the profession can establish and administer the standards that govern the practice of their profession, because people outside the profession cannot judge the quality of professional performance. For example, I may know that a surgeon ought not to leave surgical instruments inside patients, but even if I were able to monitor a surgeon's actions, even guide the surgeon's hand, it would not substitute for a trustworthy surgeon, because I would not know what to do. Because people outside the profession cannot judge professional performance, external regulation, although sometimes necessary, is a poor substitute for having trustworthy (i.e., responsible) professions and professionals.

RESPONSIBILITY AND CREATIVE/SYNTHETIC REASONING

The fourth topic is the role of synthetic or creative reasoning in the fulfillment of professional responsibility. In the exercise of professional responsibility, creative reasoning must be used to bring expert knowledge to bear on specific problems. In several respects, solving ethical problems, which is required for the exercise of professional responsibility, is analogous to solving problems of engineering design. Just as a designer needs creative abilities that a critic of design does not need, being a responsible professional requires more than judging ethical behavior.

Obeying moral rules, fulfilling obligations, and respecting others' rights typically require little professional knowledge because such rules provide explicit descriptions of the required actions. For example, the rule that "engineers have an obligation not to disclose a client or employer's proprietary information" specifies what engineers must avoid doing. (If other moral demands, such as ensuring public safety, justify disclosing proprietary information in a specific situation, the disclosing party should be able to produce that justification.) Obeying moral rules, fulfilling obligations, and respecting others' rights may require conscientiousness, even courage, but they seldom require creative thinking.

Statements of prospective responsibility, which specify the ends to be achieved rather than the acts to be performed, require more creative reasoning. For example, fulfillment of an engineer's responsibility for safety requires understanding the safety hazards posed by a given situation or technology and figuring out the best way to reduce or eliminate those hazards. An engineer's responsibility to promote the public understanding of technology requires knowledge of the technology, an assessment of (some segment of) the public's understanding of it, a knowledge of the opportunities available for improving that understanding, and finally the construction of a statement or presentation that fits the situation and is appropriate to the current level of public understanding.
Statements of prospective responsibility do not provide directions for doing what needs to be done or specify the order in which things should be done. The synthetic tasks of devising appropriate actions are usually not completed before action is taken; they are continually revised in light of changing circumstances and new discoveries. The exercise of creative or synthetic reasoning in fulfillment of professional responsibility does not necessarily require originality (a novel synthesis); it only requires a synthesis appropriate to the situation. However, an appropriate synthesis can be extremely challenging for the kinds of multiply constrained problems engineers typically face. I agree with Woodie Flowers that creation under multiple constraints is more challenging than artistic creation, which is typically less constrained (Woodie C. Flowers, Pappalardo Professor of Mechanical Engineering, Massachusetts Institute of Technology, personal communication). In some borderline cases, of course, obligation may shade into responsibility. For example, maintaining a client's confidentiality might require expert judgment to ensure that information is not disclosed; in that event, the requirement to maintain confidentiality is a borderline case.

Significant moral problems, such as how best to fulfill one's responsibilities, are like design problems, especially engineering design or experimental design problems (Whitbeck, 1998). Both sorts of problems require synthesis as well as analysis. Moral problems are not multiple-choice problems, "decision problems" (in the technical sense), or "dilemmas" (literally, multiple-choice problems in which all of the choices are unacceptable). In other words, they are not problems that require choosing between preexisting alternatives. Although it may be useful to assign multiple-choice ethical problems to teach certain lessons, understanding the differences between the structure of actual moral problems and multiple-choice problems (including dilemmas) is important for developing the full range of skills necessary for moral reasoning and moral problem solving.

Here are some common features of interesting or substantive engineering design problems and moral problems:

- There is rarely, if ever, a uniquely correct solution. If there is any solution, there is usually more than one.
- Although there is no uniquely correct solution, some possible responses are clearly unacceptable; there are wrong answers even if there is no unique right answer, and some solutions are better than others.
- Two solutions may have different advantages. Therefore, it is not necessarily true that, for any two candidate solutions, one must be incontrovertibly better than the other.

Any acceptable solution must do the following things:

- Achieve the desired end (e.g., design the requested item or fulfill one's responsibility).
- Conform to given specifications or explicit criteria for this act (e.g., for the designed item: meet size requirements; for the responsibility: not require an inordinate amount of time that causes one to forego other major responsibilities).
- Be reasonably secure against accidents and other mishaps.
- Be consistent with background constraints that are often unstated (e.g., a consumer item should be affordable and should not use very hazardous materials; the response to an ethical problem should not violate anyone's human rights).

The analogy between moral problems and design problems draws attention to several frequently neglected features of moral problems. First, morally relevant considerations, analogous to design constraints, should not be assumed to be opposed to each other. Second, satisfying one moral demand does not generally mean disregarding others. I emphasize this point because inexperienced teachers of professional ethics often simplify moral problems and present them as choices between two values—for example, between loyalty to one's employer and devotion to public safety, or between policies that protect the environment and policies that further job growth. Simplifying moral problems encourages stereotypic thinking, rather than critical thinking, and so closes students' minds to the possibility of satisfying multiple moral demands simultaneously. For example, one can further the welfare of one's client or employer by preventing a disastrous accident. Designers consider many design criteria simultaneously, and teachers of engineering ethics must foster similar skills.

LIMITS OF FORESIGHT AND RESPONSIBILITY

My fifth topic is limitations on the foresight necessary for the exercise of responsibility. Here I make contact with the themes of other papers in this volume: specifically, the concerns raised by Wm. A. Wulf about complexity and unpredictability (see pp. 1–6 in this volume) and Braden Allenby's discussion of the macro-effects of human action (see pp. 9–27 in this volume). The scope of engineering responsibility (what some have called the "problem space" of engineering) has expanded repeatedly in the course of the twentieth century as engineers have been called upon to consider a greater range of factors and to foresee a wider range of consequences. We must remember that there are limits to what engineers can foresee, and hence limits to how effectively they, individually or in teams, can achieve ethical ends, such as the safety of the public.

Henry Petroski (1985) has argued that engineering often advances by learning from failures and accidents and that those experiences have broadened the range of factors engineers must consider in fulfilling their responsibilities. Experience with the consequences of engineering design decisions has widened the scope of consequences responsible engineers are expected to foresee and the range of factors they are expected to consider in controlling those consequences. Not only the number of factors, but also the range of eventualities has increased. For example, automobiles are not intended to have collisions, but they can be expected to have them.
The goal of reducing injuries and damage from automobile accidents is, therefore, now recognized as a responsibility of automotive designers. The list of questions below illustrates how much the scope of engineering considerations has expanded in the crucial area of safety. (The responsibility for safety might be replaced here with other responsibilities in engineering practice, in engineering research, or in the ethical treatment of human and animal subjects.) The responsibility to ensure that a device or construction is safe in its intended use is only the beginning of what engineers must consider to fulfill that responsibility:

- Will the device or construction operate safely under the conditions of its intended use? Example: boiler explosions.
- Will the device or construction be safe in accidents that are likely to occur? Example: boating accidents.
- Will the device or construction be safe under conditions of common misuse? Example: children playing "house" in clothes dryers. (Attempting to forestall every possible harmful misuse may be self-defeating, as well as paternalistic, inasmuch as one may thereby block important beneficial uses that would have a net positive effect on health and safety.)
- Will the device or construction be safe if maintained in a way that may be improper but is likely to be a temptation given the design? Example: the 1979 American Airlines DC-10 crash, caused by cracks in the flange of the engine pylon's aft bulkhead resulting from time-saving shortcuts in maintenance procedures.
- Will the device or construction be safe in interactions with other technologies? Example: a patient's death that showed the need for an electrical ground isolation standard in medical devices. A patient who had survived a heart attack lay in his hospital bed with an electrocardiograph attached to his chest and plugged into the wall. He also had an internal heart-pressure catheter, which was plugged into the opposite wall. In the next room a janitor was operating a vacuum cleaner that had a near short. When the vacuum cleaner was plugged into the wall, it caused a current flow in the ground wire, killing the patient (Woodie C. Flowers, personal communication).

As we try to anticipate the problems engineers will face in the twenty-first century, we must consider not only how they will handle responsibilities in designing, testing, manufacturing, and recycling new technologies that raise considerations similar to those raised by previous technologies, but also how they will handle an expanded range of considerations.
Another major question is under what circumstances the possible effects of accidents might be so great that we cannot afford to "learn from experience."

Normal Accidents in Complex Systems

Although expanding the scope of design criteria based on lessons learned from failures and accidents has made the design of devices and components safer, these lessons may be of little use in addressing what Charles Perrow has called "normal accidents," to which technologically sophisticated, complex systems fall prey. Perrow (1984) coined the term "normal" or "system" accidents to describe accidents with the following characteristics:

- They arise in "tightly coupled" (time-constrained) complex systems.
- They involve the unanticipated interaction of multiple failures of components.
- They involve an interaction of component failures that neither the operator nor the designer could anticipate or comprehend.
- They are, nonetheless, often attributed to operator error.

Because normal accidents arise from complexity, they cannot be remedied with technical fixes, such as safety devices. In very complex systems, a safety device often creates another component subject to failure, a failure that could interact with other failures to produce situations that defy ready diagnosis. In addition, some safety devices may allow for riskier behavior—for example, safety devices for marine transport have allowed captains, spurred by competition, to increase their speed, so there has been no reduction in the accident rate. As the designer's adage reminds us, "the best part is no part at all." So, perhaps, redesign for safety may be better accomplished through a reduction in complexity than through the addition of safety devices.

Furthermore, as Michael Loui reminds us, ARPANET, a predecessor of the Internet, was specifically designed to withstand various kinds of failures (such as lost packets, noisy communication links, and failed nodes). Thus, in some cases, engineers have had great success in designing against failure. However, reducing complexity often requires broader collaboration and may not be within the control of engineering designers or design teams.

New Technologies

Since the late 1970s, Michael Martin and Roland Schinzinger (1983) have argued that technological innovation amounts to social experimentation and, therefore, requires informed consent analogous to the consent obtained from patients for the use of experimental therapies. Note that informed consent for the use of an experimental therapy is quite different from the consent given by human subjects in experimental studies. Martin and Schinzinger's analogy is not between engineering innovation and clinical experimentation, but between engineering innovation and the use of experimental medical treatment.
The purpose of technological innovation, like that of medical therapy, is to meet a practical need, not simply to acquire knowledge. The use of an experimental medical treatment is governed by standards of competent care and informed consent for care, rather than by the more stringent norms applied to clinical experiments. Because experimental therapies may pose significant unanticipated risks to health and safety or to other major aspects of well-being (such as financial security) that patients are best able to appreciate, it is widely agreed that patient consent should be obtained before such therapies are used. However, informed choice generally requires help from experts—for example, physicians must outline the possible risks and benefits and the therapeutic alternatives. Estimating the possible social, economic, political, and environmental consequences of developing or adopting a new technology also generally requires expert knowledge, often from engineering experts and experts in other disciplines. Therefore, to fulfill their responsibility for educating the public about a new technology, engineers will have to collaborate with experts in other disciplines.

Martin and Schinzinger proposed using "proxy groups" composed of people similar to those who would be greatly affected by a new technology to assess the extent of possible harm or benefits. (I understand that experiments with citizen panels are now being conducted.) The challenge is to describe the consequences of new technologies or new uses of technologies in a clear and convincing way. The consequences will presumably include not only health and safety risks, but also social, economic, political, and environmental risks. Engineering expertise will certainly be necessary to characterize many risks and consequences, but it will not be sufficient for characterizing all of them. Developing the interdisciplinary collaboration needed to estimate consequences in a rapidly evolving social and technological environment will present serious challenges.

RESPONSIBILITY, CREATIVITY, AND CHANGE

Finally, we must consider how engineering knowledge can be brought to bear on societal choices about technology in an age of technological complexity and rapid social and technological change. We have seen that technological innovation requires not only addressing multiple, sometimes competing, design constraints, but also extending foresight into new areas. For example, design considerations have been expanded to include how a design might provide incentives for improper and/or unsafe maintenance procedures. We live in a time of rapid social change, as well as technological innovation, and social change makes the consequences of cumulative technological change more difficult to foresee and predict. The earlier example of medical monitoring devices interacting with a shorted vacuum cleaner illustrates the interaction of technologies. Social change, particularly when a significant number of people begin to use technologies for new purposes, can increase risks significantly.
For example, there has recently been an increase in the risk of sabotage, first by computer hackers and more recently by terrorists. Rapid change makes it difficult to use prior engineering experience to guide current practice. In addition, we are confronted with the unpredictability of complex systems that Bill Wulf discussed in his paper for this workshop. Thus, we are left with two distinct, crucially important questions:

- What is the best way to prepare engineers to fulfill their responsibilities for consequences they can, in principle, foresee?
- What is the best way to integrate engineering expertise with non-engineering knowledge (both lay and expert) to define the scope and limits of the problems engineers are now being asked to solve?

Answering the second question will require determining on which problems engineering knowledge should be brought to bear, which risks are too great to allow a technology to be pursued, and how we can reduce the complexity that gives rise to normal accidents in inherently complex systems. Such delimitation cannot be accomplished by engineering design alone. Thus, answering the second question will require creative—and interdisciplinary—skills on the part of engineers.

REFERENCES

Ladd, J. 1970. Morality and the ideal of rationality in formal organizations. The Monist 54(4): 488–516.

Loui, M.C. 1998. The engineer's responsibility for quality. Science and Engineering Ethics 4(3): 347–350.

Martin, M.W., and R. Schinzinger. 1983. Ethics in Engineering. New York: McGraw-Hill.

Perrow, C. 1984. Normal Accidents. New York: Basic Books.

Petroski, H. 1985. To Engineer Is Human. New York: St. Martin's Press.

Whitbeck, C. 1998. Ethics in Engineering Practice and Research. Cambridge, U.K.: Cambridge University Press.