Fostering Responsible Conduct in Science and Engineering Research: Current University Policies and Actions
Nicholas H. Steneck
Modern universities are commonly seen as serving three main functions. They educate students. They foster research. Through education, research, and other activities, they serve society.
As both the place where future researchers are trained and the place where much of the nation's research is conducted, universities are vital to science and engineering. A century or two ago, science and engineering were not dependent on universities and higher education. Today they are. Were universities to abdicate their roles in science and engineering, society would have to invent new institutions to train future scientists and engineers and to conduct much of the research that has become vital to the future of society.
The role that universities play in science and engineering encompasses both privileges and responsibilities. Much of the financial and social support that universities enjoy today is based on their capacity to contribute to science and engineering. The support for science and engineering is, in turn, accompanied by a great deal of autonomy, accepting the premise that as professionals, scientists and engineers should be given intellectual or academic freedom. These are the privileges. In return, society assumes that university scientists and engineers will act in ways that serve the best interests of society, however those interests are defined.
This paper describes and analyzes some of the actions universities are taking to foster responsible conduct in science and engineering research, beginning with the most passive steps, those that simply seek to establish normative rules, and progressing through three degrees of proactive policies: monitoring research, promoting discussion, and undertaking institutional reform. Throughout, the term "responsibility" is taken in its most challenging sense. It is assumed, following the stated policies of many universities, that the sought-after goal is setting high standards, not minimum standards. That is to say, "responsibility" is taken to imply more than simply following the letter of the law or not engaging in blatant misconduct (plagiarizing, falsifying data, conflicts of interest, and so on). Responsibility is taken to imply discharging the duties or meeting the obligations of a professional in an exemplary way. It is this broad understanding of responsibility, rather than the narrow sense of avoiding fraud or misconduct, that is the main focus of this report, as applied to science and engineering research at universities.

Nicholas H. Steneck is professor of history and director of the Historical Center for the Health Sciences at the University of Michigan.
Although this report focuses on science and engineering, it is important to note that there is very little about science and engineering research that is truly unique, other than its subject matter and its particular research methods. Humanists engage in funded research projects; they collect, interpret, and publish data; and they train graduate students and postdoctoral fellows. Accordingly, it is not possible when discussing university policies and actions designed to foster responsible conduct in science and engineering research to focus exclusively or even mostly on actions and policies directed to scientists and engineers. The context of university policy and action is much broader than this. However, broader policies, when combined with policies and actions that do focus more on science and engineering, can potentially do a great deal to foster responsible conduct in science and engineering research. Some of that potential is now being realized on university campuses across the country. The ways in which it is being realized form the subject of this report.
Establishing Normative Rules

The least burdensome, but not necessarily the most effective, way to foster responsible conduct in science and engineering research is to establish and publicize norms of responsible behavior. Most professional organizations, including those for science and engineering, have published materials relating to professional conduct, such as Sigma Xi's influential Honor in Science1 or the National Institutes of Health's widely used Guidelines for the Conduct of Research.2 These materials have bearing on science and engineering research on university campuses and are commonly used (formally and informally) by universities for establishing standards for responsible behavior.
Universities have not been as eager as professional societies and the federal government to adopt comprehensive normative rules for responsible conduct in research.3 Most do have general codes of
conduct that apply broadly to faculty, administrators, staff, and/or students. However, their expectations for researchers are more commonly set out within the context of administrative policies dealing with specific problems, such as fraud or misconduct in research, conflict of interest, intellectual property rights, human and animal use in experimentation, computer use, and so on. Piece by piece, these policies provide normative rules that cover most of the major concerns regarding responsible conduct in science and engineering research.
In response to increasing concern over cases of research misconduct and spurred on by Public Health Service requirements in 1985,4 major research universities have adopted procedures for investigating allegations of misconduct. Although differing in detail, most follow a common format. First, the importance of integrity, the rarity of misconduct, and the need to maintain high standards are stressed. Then, definitions of fraud or misconduct are given. Finally, procedures for receiving allegations, conducting investigations, reaching conclusions, and, when called for, meting out punishment are discussed.
The normative portions of these policies are found in the definitions of misconduct. Some are very short—a sentence with a few examples: "The word fraud means serious misconduct with intent to deceive, for example, faking data, plagiarism, or misappropriation of ideas."5
Others include more extensive inventories of unacceptable behavior. The University of Michigan misconduct policy gives definitions for:
Falsification of data,
Abuse of confidentiality,
Dishonesty in publication,
Deliberate violation of regulations,
Property violations, and
Failure to report observed fraud.6
A similar list is given under "Definition of Academic Misconduct" in the University of Maryland misconduct policy:
Falsification of data,
Improprieties of authorship,
Misappropriation of the ideas of others,
Violation of generally accepted research practices,
Material failure to comply with federal requirements affecting research, and
Inappropriate behavior in relation to misconduct.7
Variations of these and other lists, along with explanations and examples, are found in most university policies for dealing with misconduct in research.
By inference, the bounds established for unacceptable behavior provide normative guidelines for acceptable behavior. For example, the Maryland misconduct policy defines "improprieties of authorship" as:
Improper assignment of credit, such as excluding others; misrepresentation of the same material as original in more than one publication; inclusion of individuals as authors who have not made a definite contribution to the work published; or submission of multi-authored publications without the concurrence of all authors.8
This statement could easily be rewritten as a set of normative rules for responsible behavior in research: researchers should properly assign credit to others for the work they have done; present original material in only one publication; include in publications only the names of those who have contributed to research; and include the names of coauthors in publications only after seeking permission to do so. In this way, the reactive misconduct policies in place in the major research universities can become proactive statements of expected or normative behavior in research.
Normative statements about research can also be found in conflict of interest policies. Again, as with the misconduct policies, the primary intent is to clarify what should not be done, but by inference or logical extension, proper conduct is also defined. The form of conflict of interest policies is not as uniform as that of misconduct policies, thus making it more difficult to identify the normative statements about research conduct. Nonetheless, these policies do provide another source of information that is applicable to scientific and engineering research.
Researchers at Ohio State University can receive guidance on honoring confidences gained during research if they know that their university's conflict of interest policy refers them to the State of Ohio government code of ethics, which states that:
No present or former public official or employee shall disclose or use, without appropriate authorization, any information acquired by him in the course of his official duties which is confidential because of statutory provisions, or which has been clearly designated to him as confidential when such confidential designation is warranted because of the status of the proceedings or the circumstances under which the information was received and preserving its confidentiality is necessary to the proper conduct of government business.9
If "university business" is construed as "government business," then researchers should understand that information given in confidence, such as information received when reviewing manuscripts for publication and grant requests for peer review, cannot be disclosed or used for personal gain. The State of Ohio statutes thus provide normative rules for handling manuscripts, shared data, student theses, and the like: researchers should honor confidences and not use or disclose information received in confidence without getting permission to do so.
Researchers at Pennsylvania State University can find normative rules for directing graduate students and postdoctoral fellows in their institution's conflict of interest policy, which states that it is wrong to "direct students into research activities that are designed primarily to serve personal interests rather than to further their [the students'] scholarly achievement."10 While not easy to apply in difficult cases, i.e., when there is a genuine conflict between the obligations to a grantor and to those hired under the grant, one normative rule that applies to such situations is again made clear: researchers who serve as mentors to students assume obligations to those students and should not compromise these obligations for personal gain or career advancement.
In the same vein, a set of questions set out in a Johns Hopkins University School of Medicine conflict of interest policy statement provides, again by inference, a fairly sophisticated set of guidelines to help researchers sort out responsibilities:
Does the secondary commitment detract from the ability of a faculty member to discharge his primary obligations to The Johns Hopkins University School of Medicine?
To what extent is the opportunity for outside commitment offered because of the University affiliation and thus, to what extent should the financial rewards be shared with the University?11
The normative rules inferred in these questions and the subsequent explanations help clarify for researchers how they should sort out their obligations when they have responsibilities to more than one constituency.
Miscellaneous Research Policies and Other Documents
Similar guidance can be found in other research policies and documents relating to the conduct of research on university campuses. At the University of Michigan, the Division of Research Development and Administration provides researchers with a document that describes, summarizes, or contains verbatim policies and procedures relating to:
conflict of interest,
the responsibilities of project directors,
account and grant administration,
export control restrictions,
the transfer of university equipment,
restrictions on lobbying, and
compliance with research regulations.

The latter refers to policies relating to human subjects research review, animal research, radiation safety, biological research review (recombinant DNA research), and occupational safety and environmental health.12 Subsequent forms and/or policy statements issued by the human- and animal-use committees, the radiation safety committee, and so on provide further guidance on responsible conduct in research, as, for example, questions and guidance on the humane use of animals (discussed below under monitoring).
The normative rules scattered throughout university policies and documents relating to science and engineering research are an important first step for promoting responsible conduct. In defining what is illegal, unethical, and irresponsible, they suggest what is legal, ethical, and responsible. They also provide guidance on fiscal responsibility, safety, the responsible use of human subjects, the humane treatment of animals, the use of computers, the handling of data, and other matters. Therefore, even those universities that do not have comprehensive codes of ethics for science and engineering research (and they are the majority) do provide researchers with guidelines for responsible conduct. If these "guidelines" are combined with the various federal regulations and professional statements about professional conduct in research, the total package does provide fundamental rules for determining what is responsible and irresponsible in the conduct of research.
As basic as this first step might seem, it is without question needed. There are researchers who do not know what is meant by plagiarism or the ownership of ideas. There are researchers who believe that words and phrases can be borrowed from someone else's publications without attribution as long as original ideas are not plagiarized.13 There are
researchers who believe that they "own" not only the data generated in their laboratories but also the ideas. Practicing researchers do not always understand the basic normative rules that help to determine responsible conduct in research. Students and beginning researchers may have less understanding. Publications such as Sigma Xi's Honor in Science, the codes of conduct published by professional societies, and the research policies of universities are thus useful documents for raising consciousness and establishing a knowledge base for fostering responsibility in research.
The effectiveness of normative rules in fostering responsibility is, however, limited. First, as disjointed and piecemeal as they are on most campuses, they do not make it easy for researchers to comprehend and consider all the responsibilities raised by modern science and engineering. As conditions exist on many campuses, the burden for integrating rules and resolving contradictions is often left to the individual. Given all of the other pressures on modern-day researchers, it may not be reasonable to expect them to read through three, four, or more policies to find out what they should or should not be doing.
In addition, simply stating how researchers should act in no way guarantees that they will act in this way. This is particularly true if the normative rules aim at unrealistically high standards. Researchers today are rarely able to meet all of their obligations in an exemplary way. More commonly obligations exceed the time available to meet them. Increasing competition for research funds means that more hours must be spent writing and submitting grant applications. More time spent on applications means less time working in the laboratory, advising or teaching students, and reviewing manuscripts. Corners have to be cut. What are needed, therefore, in addition to normative rules for ideal behavior, are guidebooks for how to survive in the increasingly competitive world of academic science and engineering research.
Policy statements about normative or ideal conduct become useful when they are explained, elaborated upon, and illustrated with examples. They also become useful when they deal with the difficult rather than the obvious. There seems to be little doubt that most researchers do not, and know that they should not, manufacture data or forge experimental results. It may be less clear, however, how results should be presented in grant applications, when "enough data" are needed to give confidence that a project will succeed but "enough work" remains to be done to justify getting the grant. How "preliminary" should "preliminary research" be?
To the extent that most normative rules leave many questions unanswered, they fall short of the fostering that is needed to render science and engineering research as responsible as it could be. They do
provide important general rules. They also satisfy legal requirements and soothe consciences. But this approach to fostering responsible conduct in research may not be effective, particularly if the rules are not accompanied by other actions. Given the pressures on researchers today, they often are not only busy but also cynical. When their laboratory space and salaries depend on the research dollars generated and their promotions on the number of articles published, they can have a hard time believing the normative rules are anything more than guidelines for staying out of trouble. If this is the case, the sense of responsibility that researchers have will be minimal at best. Recognition of this fact has prompted universities to take additional steps to foster responsible conduct in research.
Monitoring Research

Universities today routinely monitor their research programs, not least because they are required to do so. They must ensure fiscal responsibility. They must supervise the use and treatment of animals and human subjects. They must comply with environmental and workplace regulations. And they must enforce their own policies regulating such activities as classified and proprietary research. Monitoring is the second way universities foster responsible conduct in science and engineering research. It is an active rather than a passive way to foster responsibility.
At the University of Michigan, one monitoring process for research is triggered by an internal form that must be completed by all researchers prior to submitting projects for support (internal or external). The form lists 13 areas of concern that must be checked "yes" or "no."14 If "yes" is checked for an area, subsequent information or action is required. For the more important areas, such as the use of human subjects, vertebrate animals, and radioactive materials, the researcher is referred to a series of special peer-review committees for approval. These committees review the applications both for their compliance with specific laws and regulations and, in some cases, for problems that could raise questions about responsibility.
For example, researchers using vertebrate animals in research are required to submit an additional form to the University Committee on the Use and Care of Animals. This form asks researchers to explain why they must use animals in their research, why they cannot use "lower" animals or fewer animals, and why the amount of pain inflicted, if any, cannot be reduced. They are also required to identify the person in charge of the animals during experimentation, to give the latter's
qualifications, and to indicate how the work will be supervised. The form on which this information is recorded contains explanations of each of the questions, which, in essence, provide brief lessons in the responsible use of animals in research. If the answers given on the forms are not satisfactory or if they raise questions about the use of animals, the researcher is asked to appear before the Use and Care Committee to discuss the project. In this way, researchers are encouraged to think about and justify their responsibilities when they use animals in research and, simultaneously, their use of animals is monitored.
The same procedure is followed for the use of human subjects at Michigan, with the university having a total of twelve peer committees to review grant requests prior to submission.
Again, detailed questions are asked that compel researchers to think about their responsibilities before they begin their work. Medical researchers must tell whether they are using subjects that are:
Children (age < 18),
[Of] questionable state of mental competence or consciousness,
[A result of] human in vitro fertilization,
Prisoners or other institutionalized persons, or
Others who are likely to be vulnerable.15
If they are, they must provide a "rationale for and justify their [each subject's] involvement."16 Providing the rationale again compels researchers to think about their responsibilities. If a rationale is unclear or unsatisfactory, then the researcher must discuss the research with colleagues on a review committee. The human subjects committees also require justifications for the use of human subjects, explanations of the likely benefits to the subjects from the research, and a description of the steps that have been taken to minimize risks—requirements that again compel researchers to think about their responsibilities and, in gray areas, to discuss their responsibilities with colleagues.
Similar monitoring of responsibility in science and engineering research takes place in other ways on most university campuses. Researchers and universities are required by law to monitor the use of radioactive materials, some biological materials, and hazardous chemicals. Most universities also now routinely require researchers to file conflict of interest statements and property rights statements with every grant application or on some regular basis. The monitoring
inherent in these requirements forces researchers to think about their responsibilities in ways that might not otherwise occur to them and to think about relationships and obligations that might otherwise be ignored.
Responsibility is also routinely monitored through peer review for promotions or annual reviews for salary increases. These reviews provide faculty with opportunities to monitor the work of their colleagues, looking, for example, for the possibility of duplicate publication of the same material, misattribution of authorship, or the sloppy use/misuse of data. Similarly, student evaluations are routinely used to determine how well faculty are discharging their duties as teachers. Such evaluations are not used, but could be adapted, to determine how well faculty discharge their duties as research mentors.
Asking researchers in advance how they will exercise responsibility is intrusive. It requires an investment of time to answer questions for no apparent reason. Moreover, in subtle ways it represents a shift in burden. Rather than presuming that researchers act responsibly and then raising questions when there is reason to believe someone has acted irresponsibly, asking researchers to discuss their research conduct in advance or to be subjected to constant scrutiny during research places a burden on them to demonstrate that they will act or are acting responsibly. In other words, monitoring presumes guilt rather than innocence. It is also compulsory rather than voluntary. It requires that certain standards be met rather than making responsibility a matter of personal initiative. As such, monitoring does not find a comfortable home in professional communities that are accustomed to openness and trust.
Why, for example, should researchers be required to demonstrate in advance how they will comply with rules, regulations, and standards for responsible behavior, if those rules, regulations, and standards are clearly spelled out? We do not require the same researchers to file forms before leaving for work in the morning explaining that they will travel in a licensed car using seat belts and driving at safe, legal speeds. We presume that they know the laws and will obey them, intervening only when there is reason to believe that the law is not being obeyed. Similarly, for science and engineering to develop freely and in a collegial atmosphere, some degree of responsibility must be assumed. If every aspect of research were subject to monitoring, either in advance or in process, the burdens of time and cost could rapidly overwhelm the research enterprise.
For these reasons, it is unlikely that universities will or should use monitoring to any great extent to ensure that research is undertaken responsibly. At the present time, active monitoring is undertaken only
when it is required, e.g., in the use of animals, human subjects, dangerous chemicals, conflict of interest, and so on. If used sparingly, primarily as a tool to get researchers to think about particular issues such as the use of animals in research, monitoring can be an effective device for fostering responsible conduct in science and engineering research. If overused, monitoring and the enforcement of compulsory rules of behavior will rapidly become a burden that can destroy the freedom and collegiality that are essential to the vitality of science and engineering research in particular and all academic life in general.
Promoting Discussion

If universities do not directly check responsibility through monitoring, how else can responsibility be fostered? A third approach to encouraging responsibility is to ensure that researchers at least are aware of the normative rules for undertaking research by bringing the rules to their attention and promoting discussion. At the University of Michigan Medical School
all faculty receive and have the obligation to read Guidelines for the Responsible Conduct of Research (1989). … This document is also distributed to all Department administrators for subsequent distribution to all postdoctoral fellows, graduate students and research technical staff.17
If reading and being informed are all that are required for ensuring responsibility, then this simple policy will go a long way toward fostering responsible conduct in science and engineering research.
Increasing numbers of research universities have chosen to be more aggressive in bringing the responsibilities of researchers to their attention. Their approaches vary, depending on where within administrative structures initiatives derive and how they are most conveniently implemented. However, the goal of each is basically the same: to foster discussion.
Harvard University provides a good example of a top-down approach to promoting discussions of professional ethics, including research ethics. The former president of Harvard University, Derek Bok, has long been a proponent of fostering discussions of ethics in the university setting.18 He was instrumental in raising funds to establish two major professional ethics programs at Harvard, one in the Kennedy School of Government, the other a separate Program in Ethics and the Professions. The latter fosters scholarly research on professional ethics and serves as a resource for other units seeking to take steps to foster professional responsibility.19 These and other influences have prompted
the medical faculty to revise their rules for research conduct and to join with others in sponsoring symposia on research ethics.20 The result will undoubtedly be an increased level of discussion of the importance of and special problems pertaining to research conduct. How much impact this will have on students and faculty remains to be seen.
The University of Colorado, Boulder, has taken a different approach to fostering discussions of professional responsibility in research. The Regents of the University of Colorado system vested authority for dealing with research misconduct in a series of standing committees. Besides conducting investigations of "suspected or alleged misconduct," these committees are charged by the regents to "promote exemplary ethical standards for research and scholarship."21 The Boulder campus decided to form one joint Standing Committee on Research Misconduct and included "Education of Academic Community" in its charge. The written definition of this task reads:
Deans, directors, chairs and graduate advisors shall be reminded annually of University of Colorado Administrative Policy on Research Misconduct and Authorship and their responsibility to inform all faculty, students, and staff of (1) the need for integrity in research performance and (2) the role of the Standing Committee in considering allegations of research misconduct.22
In practice, the committee has adopted a much more ambitious role in fostering responsible conduct in research.
Under the leadership of Alan Greenberg, associate professor of mechanical engineering, the Boulder campus's Standing Committee on Research Misconduct is playing down its policing duties in favor of a more positive image. The committee plans to send a short, personal letter to all faculty members describing its goals and expressing a desire for dialogue. The letters are being sent to faculty because they are seen as the key to a responsible research environment. Later, through faculty and appropriate administrative units, the committee hopes to expand its reach to graduate education. In each case, the committee's main goal will be to make researchers (and future researchers) aware of and responsive to the existing normative rules for exercising responsibility in research. The committee is not seeking to write new rules; it is simply trying to make researchers more aware of the rules that are already in existence.23
The impetus for more discussion at Colorado comes from within. A supportive administration and an ambitious committee have determined that researchers should and hopefully will spend more time talking about their responsibilities as researchers. At other universities, there is more discussion today than a few years ago, in part as the result of an outside influence—the National Institutes of Health's new
requirement for the inclusion of some material on "the responsible conduct of research" in institutional training programs. The requirement states that
all competing National Research Service Award institutional training grant applications must include a description of the formal or informal activities related to the instruction about the responsible conduct of research that will be incorporated into the proposed research training program.24
Those universities that have training grants or are anticipating applying for them are now in the process of planning "formal and informal activities" that will meet this objective.25
One way to satisfy the new NIH requirement is to foster discussions about responsibility in research settings. Several years ago, Floyd Bloom of the Scripps Clinic and Research Foundation decided this was precisely what he needed in his laboratory and began planning special sessions to discuss problems that had arisen or could arise in the course of research. The special sessions were well received. Three have been turned into video tapes that are now circulated to others with similar needs.26 If other universities follow this lead, the new NIH training grant requirement should at a minimum serve to promote discussions of the normative rules for responsible conduct in science and engineering research. If the rules are rigorously enforced, the impact could be even greater.27
In evaluating the role of discussion in fostering responsibility, an important distinction needs to be made. "Responsibility" is both an academic subject and a matter of practical importance. As an academic subject, "responsibility" can be studied, researched, discussed, and written about in the same way as any other academic subject. There is more than enough that is controversial in the consideration of conflict of interest, the ownership of ideas, the responsible use of humans or animals for experimental purposes, or any other aspect of research to engage scholars who specialize in research ethics in discussion for years to come. However, "responsibility" is also a matter of practical importance. Every day, in small and large ways, individuals who engage in science and engineering research must decide for themselves what it means "to be responsible" and then act. For them, responsibility is not a matter of intellectual curiosity but of practical necessity.
At the present time, there is no lack of academic or scholarly discussions of research ethics, both in general and as applied to science and engineering.28 The major science and engineering journals routinely publish articles and editorials on the responsibilities of researchers. Most major scientific and engineering meetings have had sessions devoted to the responsibilities of researchers. Scholars who study the
social, ethical, and professional side of science and engineering publish articles on responsibility in research. Science educators discuss ways to foster responsibility through science education. The researcher who wants to become better educated on responsibility in science and engineering has no lack of material to consult. The problem that exists today, if there is a problem, is getting this material to researchers who barely have time to keep abreast of developments in their own fields.
It therefore seems logical to assume, for convenience if for no other reason, that the discussion of responsibility in science and engineering research should begin in the settings in which that research is undertaken, with mentors and their advisees talking about their work, the way it is being undertaken, and its consequences. It is in these settings that the norms of professional conduct are set and passed on. The discussions can be informal and personal. They can also be enriched by adding some organization and involving others, who bring different perspectives to bear on difficult problems. However they are planned or undertaken, the important point is that discussions of responsibility in research should begin in the laboratory and in the classroom. They should, however, not end there.
There are at least two problems that arise if the discussion of responsibility is left exclusively to research settings. First, relying on discussions in research settings to address problems of responsibility is not efficient. To get different points of view on difficult problems it is usually necessary to involve philosophers, social scientists, lawyers, theologians, and others who are removed from the problems and can bring special expertise to bear on them. Generally the number of "outsiders" who are prepared to discuss issues relating to responsibility in science and engineering research is limited. To ask them to come to every science and engineering laboratory or department on a campus is not realistic.
A second problem is that research settings may not be conducive to the discussion of some difficult problems that arise in these settings. Junior researchers or graduate students who feel their work is not being fairly cited in a publication may not feel comfortable discussing authorship with their mentors. Students who disagree with a mentor's way of interpreting data may have qualms about raising this issue in a laboratory meeting. Ideally, of course, discussions should be open to any questions or points of view, but settings in which there are problems associated with responsibility are not ideal.
For these and other reasons, more generic settings also need to be provided for discussions of responsibility in science and engineering research on university campuses. Departmental and university forums allow opportunities for researchers to consider and talk about their
responsibilities with colleagues in other fields. Lecture series are a useful device for raising consciousness. Orientation programs for new graduate students, postdoctoral fellows, and even faculty can provide information and along with that the message that responsibility in research is taken seriously at the university. There are many ways to promote discussion of issues associated with responsible conduct in science and engineering research. The more ways a university tries to promote discussion, the stronger the message it sends about its commitment to responsibility.
UNDERTAKING INSTITUTIONAL REFORM
As efforts to promote discussion have grown, new institutional arrangements have emerged for their support and coordination. The strategies employed differ significantly from campus to campus. Their goals are basically the same: to provide opportunities for the consideration of professional responsibility and related issues within the normal context of education and university life.29
It is impossible in this paper to discuss all of the different ways in which the professional responsibility of scientists and engineers is being addressed through institutional reform. Changes have been suggested for the entire spectrum of science education, from elementary schooling to postdoctoral studies, clinical training, and even continuing education. This section provides a few examples, focusing on advanced undergraduate education, graduate education, and two campuswide programs.
Beginning in the late 1960s and early 1970s, hundreds of courses were instituted at the undergraduate level to address what became known as STS (science, technology, and society) studies. In the 1980s, some of these courses added material dealing with professional responsibility.30 To provide additional support, a significant number of universities (over 100) developed STS programs. STS programs were and remain particularly popular at schools that train large numbers of scientists and engineers, such as MIT, the Illinois Institute of Technology, and Rensselaer Polytechnic Institute, to name only a few.31 For some students, the discussion of professional responsibility fostered by undergraduate STS programs begins their introduction to the norms
of professional life as scientists and engineers. For others, it may be not only their first but also their last formal contact with these issues.
A few schools have gone beyond the single-course/program approach and attempted to change completely the way undergraduates are educated. In 1986, the University of Minnesota College of Agriculture received a two-year grant from the Kellogg Foundation to formulate a curriculum that would provide students with "enhanced learning opportunities in leadership, communication, problem identification and solution, teamwork skills, interdisciplinary approaches, nutritional issues, environmental awareness, societal values and international perspectives." The resulting program, Project Sunrise, used the Kellogg funds to provide students with opportunities for discussion, personal interaction, and case-based learning throughout the curriculum. As with all such programs, the long-term effects will be difficult to measure. In the short term, Project Sunrise's directors are pleased enough with the results to heartily recommend their approach to others.32
Research, per se, is generally not a major component of undergraduate education. Some undergraduates have research experiences, but they usually do not start thinking seriously about research until graduate school and their first independent work as researchers. Nonetheless, attitudes and knowledge gained during the undergraduate years can play a major role in determining the future responsibility of scientists and engineers. Attitudes about personal and social responsibility gained during undergraduate years can be transferred to graduate work and the laboratory. Knowledge about professional life and its role in society can provide a framework for questioning and seeking solutions when potential problems arise in the research environment. Just as basic mathematics, chemistry, physics, or biology can be essential for careers in science and engineering, so too basic knowledge about the social and values dimensions of science and engineering can be an essential ingredient in being a responsible scientist or engineer. For many scientists and engineers, the only opportunity they have to gain such knowledge comes during their undergraduate years. This is particularly true for engineers, who can more easily engage in research without pursuing graduate studies.
Graduate education (including professional and postdoctoral studies) provides a second setting for formal instruction on professional responsibility, either in general or as related specifically to science and engineering research. As noted above, it is during these years that
scientists and engineers begin to think seriously about research.33 It is also during these years that they have increasing opportunities to consider questions of responsibility. At the present time, most instruction on responsibility at the graduate level takes place informally through discussions in laboratory settings and between mentors and their students (see ''Promoting Discussion," above). A few schools have instituted special programs, recognizing that graduate education provides an ideal atmosphere for more formal instruction on responsibility.
The University of Texas Health Science Center requires that all entering graduate students take a 17-week course titled "Philosophical Issues in Science." The course meets weekly for one hour, at lunchtime. To encourage participation, the Dean of the Graduate School of Biomedical Sciences, William Butcher, provides a free lunch and some course materials. The course covers a wide range of topics, from the history and philosophy of science to discussions of research techniques, honesty in science, animal and human experimentation, and laboratory safety. As currently taught by Stanley Reiser, M.D.-Ph.D., it continues to draw support from both students and the administration.34
Adding formal instruction on responsibility and related issues at the graduate level is problematic. It is at this level that educational paths start to diverge and specialize dramatically. For the most part students are no longer in large, common classes. Their programs are full, their time limited, and their needs more focused on particular problems. For these and other reasons, there has not been a parallel STS movement at the graduate level. Still, if the Texas experience is at all indicative, there clearly is room for some instruction in common about responsibility and related problems at the graduate level.
The promotion of the activities discussed in this section and previous sections can be accomplished more effectively if there is some coordination. It is for this reason that a few campuses have sought to establish campuswide programs aimed at one or more aspects of the problems and issues associated with professional responsibility.
Emory University has recently established its Center for Ethics and Public Policy and the Professions under the directorship of Robert DeHaan, professor of anatomy and cell biology. Similar programs have been or are being established on a number of campuses to encourage and support the discussion of professional ethics.35 Now fully operational, the Emory center has formulated a set of guidelines for responsible conduct in scholarship, which will include major
sections on scientific research. It is also planning major educational initiatives, working through a series of subcommittees of the center's main Steering Committee. One of the educational initiatives will likely be targeted at graduate education. Other initiatives will target specific audiences or problems, such as a program ("AIDS Training Network") designed to help physicians and researchers consider professional problems raised by the AIDS epidemic. Overall, the Emory center is focused squarely on fostering responsibility, based on the assumption that future scientists, physicians, and other professionals (Emory does not have a school of engineering) should have read and thought about their responsibilities both before and as they become professionals. The reforms anticipated will be campuswide.36
The Poynter Center for the Study of Ethics and American Institutions at Indiana University has for a number of years taken an active campus and national role in promoting discussions of professional ethics. In line with similar centers, it has sponsored courses; encouraged curricular innovation, both on its own campus and other campuses; and organized a number of national symposia. Its director, David Smith, is also the prime organizer of the new Association for Practical and Professional Ethics. The Poynter Center has recently begun a major new initiative, "Catalyst: Indiana University's Program on Ethics in Research," which is seeking to "increase awareness about research ethics issues among students and faculty, through discussion and through the introduction of course units on research ethics. …"37 The impact is intended to be campuswide, introducing the discussion of research ethics issues into as many different forums and settings as possible, but with some direction and coordination from a single program.
OBSERVATIONS AND CONCLUSIONS
The examples given in this report leave little doubt that universities are seeking to foster responsible conduct in research. The ways vary considerably, from simply publishing rules for appropriate and inappropriate conduct to bringing the discussion of responsibility into research settings, changing courses of study, and instituting campuswide programs. The variations in turn reflect differing commitments and opinions on need. There are those who believe that there is very little that universities can do to foster responsible conduct among scientists, engineers, or any of the other professionals they train or hire. There are others who believe that universities not only can make a difference but also have an obligation to do so. To test whether this range of
opinion exists, all one has to do is raise the question of making more room for ethics in the curriculum at a meeting of science or engineering faculty on any university campus.
Those who favor minimal involvement tend to believe that responsibility is learned early in life and outside the classroom, not in university settings. Norms such as honesty, integrity, and reliability, it is argued, are applicable to life in general and are therefore fostered (or not fostered) well before individuals make decisions to become researchers. For those individuals who do eventually become researchers, their sense of responsibility (of morality) adopted early in life may be all that matters when they become scientists and engineers—an assessment that leads some to conclude that responsible researchers are "born," or at least trained early, if not "made."
While it may be true that early education can guide scientists or engineers through some sticky professional problems, it certainly will not help them resolve problems that involve genuine ethical dilemmas. What should a researcher do if she believes she can see a pattern in data being collected but is not sure? What should an engineer do if he is asked to work on a project that might be injurious to the environment or put large numbers of persons out of work? What should clinical researchers or physicians do if they are concerned about the dangers of AIDS research? How should priorities be sorted out when an unread thesis, an unreviewed journal article manuscript, and an unwritten research proposal are all sitting on a scientist's or engineer's desk demanding attention and the time for that attention is limited? Even those who honestly want to act responsibly to follow cherished principles are at times put in situations where principles and general attitudes about responsibility give no clear answers.
Pressures on researchers are real. Data must be interpreted, written up, and published. Names must be included or not included on journal articles. Experimental results are property that someone owns. The ownership of ideas is important; it has a bearing on promotion, and ideas can sometimes be sold for profit. Conflicts of interest exist. Future scientists and engineers must be trained. Public and private interests do compete. Researchers have responsibilities to more than one constituency. Superiors do not always make responsible decisions. The modern practice of science and engineering is complex. It is unlikely that anyone can intuitively know how to act or will instinctively want to act responsibly in every situation. Therefore, even if it is true that basic moral character is set before students come to universities and that basic moral character is what determines whether scientists, engineers, and other researchers act responsibly in research settings,
there is still much that universities can do to remind and clarify for researchers what it means to be "responsible."
How much universities will ultimately do to foster responsible conduct in science and engineering research will, no doubt, remain proportional to perceived needs. As long as public concern continues about fraud in science, conflicts of interest among researchers, the questionable "good" of some projects, the high cost of research, and other problems, universities are likely to seek to do more to foster responsibility. Moreover, whether universities believe so or not, there can be no doubt that the public believes that universities have obligations to foster responsibility, including in science and engineering research.
The stance universities take on their obligations to foster responsibility will, in turn, ultimately determine how much is done. This fact became apparent in talking with colleagues on different campuses, some of whom had active programs to foster responsible conduct in research and others who had tried to develop such programs but failed. Where a supportive atmosphere existed, programs, courses, discussions, and so on flourished. Where one was lacking, some very well-intentioned efforts failed.
What are the ingredients of a supportive atmosphere? Ideally, an administration that is willing to devote some of its time, attention, and support to activities that foster responsible conduct in science, engineering, and scholarship in general, plus a faculty that is willing to devote some of its time and energy to students, campus service, and discussion of the role of science and engineering in modern society. Where either ingredient has been lacking, steps to foster responsibility have been slow in coming. The best-intentioned faculty have a difficult time making changes without administration support, and administrators cannot make changes without the support of faculty unless they have been able to raise large amounts of money.
The atmosphere present on any one campus is, of course, the product of many influences.38 The size of research budgets has a great deal to do with how much time researchers have to devote to students, to service, and to thinking about anything other than how to get the next grant. The type of research undertaken can influence the way groups of researchers act. The pressure or incentives for advancement, some of which are internal, others external, influence how researchers spend their time. For administrators, research is only one of their concerns. They have to listen to many voices and respond to many calls for action, some of which are louder than others. In sum, the amount that can be
done to foster responsible conduct in science and engineering research is dependent on many factors, not all of which can be controlled or predicted with any certainty.
Granting that there is uncertainty, it is nonetheless instructive, encouraging, and exciting to learn of and think about the variety of actions that faculties and administrators on university campuses are taking to ensure that science and engineering research will remain responsible activities in the future. Their efforts surely will not be irrelevant to the role science and engineering play in American society in the decades that lie ahead.