The Experiences and Challenges of Science and Ethics: Proceedings of an American-Iranian Workshop

Appendix N

Extract from the Report "Honor in Science"

WHY HONESTY MATTERS[1]

"The reason I stop at a traffic light is not because I have a commitment to social justice, but because there may be a cop at the light and if I don't he'll nail me."[2]

This remark was made by the president of a major hospital during a discussion of whistleblowing by scientists. It is a good place to begin, if only because there are several reasons why we stop at traffic lights: because obeying traffic lights is an effective solution to the problem of how to cross busy intersections; because the cop may be there to nail us if we do not stop; because we may be killed or injured by someone who is (legitimately) crossing the intersection in the other direction; or because we may kill or injure others, including pedestrians. For most of us, the risk of getting caught may not be the main reason that we do not run red lights, nor is it the primary incentive that keeps us honest in our scientific research.

[1] This text is extracted from "Honor in Science" (pp. 1-7), Sigma Xi, The Scientific Research Society, Research Triangle Park, NC, 2000.

[2] Foreman, Spencer, "Commentary: The Conflicting Missions of IRBs," in Swazey, Judith P., and Stephen R. Scher, eds., Whistleblowing in Biomedical Research, Government Printing Office, Washington, D.C., 1982, pp. 41-45.

But how much has honesty in science got to do with such mundane matters as traffic lights? Are we comparing apples and oranges? Since the hospital president used the traffic-light analogy in a discussion of integrity in science, he probably takes the view that the principles guiding a scientist in research are not significantly different from those affecting behavior in other facets of life. That is the position taken in this booklet, but it is not a universally held view. For example, some would argue that science requires higher standards of ethical behavior than can be expected in the world at large. Others prefer to believe that the nature of science is such that ethical questions are less important than in the rest of life: how we deal with traffic lights, or with our friends and enemies, involves moral judgments and ethical standards, but the structure of DNA and the origin of submarine canyons are not affected by the character of the scientists who study them.

The latter view misses the point. Scientific problems such as the structure of DNA or the origin of submarine canyons are investigated by scientists, who may be all too human in their capacity to make mistakes, to miss or misinterpret critical pieces of evidence, and, on occasion, deliberately to fake research results. Science may be morally neutral, but so is a traffic light; car drivers and scientists are not.

This does not mean that mistakes and omissions are frequent in science, still less that fraud and dishonesty are commonplace. Most of us follow the rules most of the time, in our daily lives as in our scientific activities. We make occasional scientific mistakes, and on deserted streets at four in the morning we may occasionally be tempted to run a red light.
But accuracy and responsible behavior are much more common than their opposites.

There are, nevertheless, many scientists who believe that stressing the fact that scientists are fallible human beings implies that mistakes, omissions, and unethical behavior are common in science. They feel that this is not merely bad for the image of science but simply untrue. Few of them, probably, believe that research scientists can somehow avoid the temptations and frailties that affect humanity in general, but they would argue that the scientific method has, over the centuries, come to incorporate so many checks and balances that the mistakes and misinterpretations which do occur are inevitably detected and corrected. Scientists may be fallible, but science is self-correcting.

Such contrasting attitudes are evident in the responses of different scientists to the instances of scientific fraud that have been exposed from time to time. To many people, such spectacular cases are probably the visible tip of an iceberg of unknown but substantial dimensions. However, to those who believe that the scientific method is effective in identifying mistakes and fraud, such exposures are proof that the system is working as it should: deliberate dishonesty is rare and quickly recognized, accidental errors are similarly corrected by subsequent research, and "there is no iceberg."

For the purposes of this booklet it is not necessary to resolve this question. In the last analysis, there is no means of knowing how much scientific research is inaccurate or fraudulent. Intuitively, it may be wise to assume that the iceberg is rather larger than some would like to think. Error and even unethical behavior may not be much less prevalent in science than in other aspects of human life, and detection of error may not be inevitable. Most of the best-known exposures of fraud have tended to be in areas of scientific research where there is vigorous activity (cancer research, for example) and where replication of experiments and critical reviews of earlier work are therefore more likely to happen. Most scientists work in fields where there is much less interest or competition, and the specialized character of most research is such that it may be a very long time before your errors are noticed.

Before going further, a word is necessary about the distinction between fraud and error. We all make mistakes from time to time, despite our best efforts to be accurate. In our daily lives, for example, practically all of us have driven through a red light unintentionally, simply because we did not see it. Surely this is very different from deliberate fraud or law-breaking? Is this booklet concerned only with the latter, or with both fraud and errors? Mainly, of course, it is concerned with unethical behavior rather than honest mistakes, but the distinction between them is not a simple one.
    The only ethical principle which has made science possible is that the truth shall be told all the time. If we do not penalise false statements made in error, we open up the way, don't you see, for false statements by intention. And of course a false statement of fact, made deliberately, is the most serious crime a scientist can commit.

    C.P. Snow, The Search, Charles Scribner's Sons, New York, revised edition, 1959

As the cop who stops you is liable to point out, you are as likely to be involved in an accident if you did not see the red light as if you deliberately decided to ignore it, and the scientific paper that includes an accidental error may be as unreliable as one that is based on deliberate fraud. It is not sufficient for the scientist to admit that all human activity, including research, is liable to involve errors; he or she has a moral obligation to minimize the possibility of error by checking and rechecking the validity of the data and the conclusions that are drawn from the data. Some would go further and argue that mistakes should be punished as severely as outright fraud, if only because it may be impossible for anyone but the scientist involved to know whether the error was accidental or deliberate. Not seeing the red light is no defense.

Some scientists may agree that carelessness deserves to be punished, but believe that to be equally severe on all types of error is to ignore one of the most important characteristics of science: that it is very difficult to know what is truth and what is not. Much research takes the form of questioning previous assumptions or "facts," and the results often show that these assumptions are invalid or are limited to certain situations. If, as Popper has suggested, we can only disprove theories, never prove them, surely science is full of uncertainties? If this is so, is it reasonable that scientists should be blamed for unintentional error?

It is, of course, precisely because of these uncertainties that accuracy in research and in reporting research results becomes so important. The attempt to draw general conclusions from limited data is basic to science: we cannot put every specimen under the microscope, nor can we put major weather systems into a test-tube.
If subsequent work, by ourselves or others, shows that our conclusions are not so general as we had hoped, that is no discredit, provided that the conclusions were not inherently unlikely and that the data on which they were based had been obtained and reported accurately. If our original investigation was flawed, however, that is another matter.

One objection to this booklet may be that it is likely to be read only by those who have no need of the advice it contains: those who are honest and accurate by nature and whose scientific research will therefore be reliable. Those who are unscrupulous are unlikely to be deterred by anything short of discovery and punishment. Certainly, neither codes of behavior nor statements of principles can prevent unethical behavior. They may even be endorsed enthusiastically by individuals who ignore them in practice, if only because many people are capable of rationalizing their own actions as justifiable exceptions. "Of course there needs to be a red light at that intersection, but in this particular situation, I was not in danger of harming myself or anyone else." Galileo was reputed to be better at devising scientific truths in his mind
than performing the tedious experiments that verified them, and since his time there have been many scientists with less ability who have followed his example.

But such statements of principle need not be useless, either. When the Founding Fathers of the American Republic held "these truths to be self-evident," they did not mean that there was no point in including the truths in the Declaration of Independence, only that the statements did not need to be argued or proved. This booklet is written for those who are honest and responsible; it is intended to give them practical advice, as well as reassurance that ethical issues are of vital importance.

Another type of objection is that advice on scientific research ethics ought to be unnecessary, simply because science is not different from the rest of human life. There may be rules of behavior to be learned to meet specific situations (e.g., "always quote exactly, even if you spot a misprint or an apparent minor error in the passage you are quoting"), but the basic principles are a matter of human experience and individual conscience. This may be true, but there are also many situations where ethical issues are not clear-cut, and may not even be perceived by everyone.

The following problems have not received much attention (or solution) since they were stated twenty years ago, yet they affect many scientists. Note that the author is not concerned with individuals who misuse their positions, but with how the position is liable to subvert the individual.

What is most alarming about the workings of the referee system is not the occasional overt lapse of honesty on the part of some referee who suppresses prompt publication of a rival's work while he harvests the fruit by quickly repeating it, perhaps even extending it, and rushing into publication with his own account.
What is far more dangerous, I believe, because it is far more insidious and widespread, is the inevitable subconscious germination in the mind of any referee of the ideas he has obtained from the unpublished work of another person. If we are frank with ourselves, none of us can really state where most of the seminal ideas that lead us to a particular theory or line of investigation have been derived….

What has been said about referees applies with even greater force to the scientists who sit on panels that judge the merit of research proposals made to government agencies or to foundations. The amount of confidential information directly applicable to a man's own line of work in the course of several years staggers the imagination…. This information consists not only of reports of what has been done in the recent past, but of what is still unpublished. It includes also the plans and protocols of work still to be performed, the truly germinal ideas that may occupy a
scientist for years to come…. One simply cannot any longer distinguish between what one properly knows, on the basis of published scientific information, and what one has gleaned from privileged documents. The end of this road is self-deception on the one hand, or conscious deception on the other, since in time, scientists who must make research proposals learn that it is better not to reveal what they really intend to do, or to set down in plain language their choicest formulations of experimental planning, but instead to write up as the program of their future work what they have in fact already performed. Again, the integrity of science is seriously compromised.[3]

If it is likely to be several years before you are invited to act as a referee or as a research award panel member, think instead about a situation that frequently arises, in which you intend to publish a paper jointly with an author from another discipline. Say that the paper is in mathematical biology and that you, as a biologist, have worked with a mathematician. You have done your work conscientiously, and you believe that your colleague is equally reliable, but you do not have the necessary knowledge to verify that the mathematical analysis is fair and accurate. Nor does the mathematician know much biology. Are your respective responsibilities for the paper limited to your specific contributions, so that it is the job of the journal editor, the referees, and, ultimately, the readers to assess the validity of the paper as a whole? Many would say so, would behave that way in their professional scientific careers, and would have no doubt that they have been honorable and responsible scientists.
Others would say that if you cannot understand every word and symbol in a paper of which you are a coauthor, it is your responsibility to have those sections read critically by someone who is not an author, and that whether you do this or not, you remain responsible for the entire paper, as do all the other coauthors.

Why does it all matter so much? Science may build on what others have already discovered, but surely an inaccurate or even forged piece of research can only delay other work: it will eventually be recognized as spurious, and science itself will not have been harmed. Similarly, if I do my research and "shade" my experimental results just a trifle towards the result that seems obvious and logical, who suffers? Even if we agree that shading and carelessness are wrong, are they any worse than the similar lapses that we observe continually in other aspects of life? After all, most of us are irritated by the car that runs a red light, but few of us are inclined to take the license number and report it to the police, unless some child or elderly person was endangered by the incident.

[3] Glass, Bentley, "The Ethical Basis of Science," Science 150 (3 December 1965), pp. 1257-1258.
There are many valid answers to such questions. First, however, it should be said that there are few situations, if any, in which there is no "victim." In some situations (medical research, for example) the victims may be very obvious: those who remained ill or died because fraud or carelessness diverted research away from the problems that should have been investigated. In any field, however, fraudulent or careless research is likely to benefit the perpetrator at the expense of others.

Take, as an extreme case, the example of the "scientist" whose extensive list of publications consisted almost entirely of articles by others that he copied word for word from obscure biomedical journals and then published under his own name in other obscure journals.[4] It could be argued that the original authors had gained the credit due to them when the articles were first published, and that scientific knowledge benefited through the wider dissemination of these research reports in other journals. Who suffered? The answer should be obvious: those scientists who did not get the academic appointments that the plagiarist obtained on the strength of his spurious list of publications. This example is an extension of the situation in which someone obtains a job by claiming a degree or other qualifications that he or she does not possess: it is unfair both to those who do not have the qualifications and are honest about it, and to those who earned those qualifications the hard and honorable way.

More fundamentally, however, scientific honesty is vital because there is no cop at the scientific research traffic light. Nor can there be, for scientific accuracy and honesty cannot normally be reduced to something as simple as whether the light was red or green.
The referee of a scientific journal, for example, is not a cop and should not be expected to determine whether a research report has been honestly produced. A referee is appointed to advise whether the results that are reported are sufficiently important to merit publication. Some errors are detected by referees, and others by readers, but neither referee nor reader can verify the critical elements of much scientific research except by doing the work over again.

It is because we cannot police scientific research as we do our highway intersections that thesis-writing is such a fundamental part of the work required for the Ph.D. and other research degrees. Those of us who have been through the experience, even if it was many years ago, can usually recall that frustrating and time-consuming period, after the research was done and the thesis had been drafted, when we had to go back and check the accuracy of quotations, page references, and other details, so that the thesis could not be faulted or sent back on such grounds.

[4] See Broad, William, and Nicholas Wade, Betrayers of the Truth, Simon & Schuster, New York, 1982, pp. 38-56.
The university was saying, in effect, "We put you through these hoops at this time, with everyone watching your performance very carefully, because in the research that you are likely to do in the future, we and other scientists need to be able to trust you to jump through the same hoops without being watched."

Graduate school is also the place to learn that one does not publish research results and conclusions until one is certain of their accuracy, and that this is why it is necessary to define one's problem sufficiently narrowly that one can gain the comprehensive knowledge and understanding that are essential. Inevitably, therefore, individual scientists tend to become fairly narrow specialists. Yet the progress of science as a whole depends on communication and integration of these individual specialized results: the loneliness of the individual scientist exists simultaneously with interdependence among all scientists. In Bronowski's words:

    All this knowledge, all our knowledge, has been built up communally; there would be no astrophysics, there would be no history, there would not even be language, if man were a solitary animal. What follows? It follows that we must be able to rely on other people; we must be able to trust their word. That is, it follows that there is a principle which binds society together, because without it the individual would be helpless to tell the truth from the false. This principle is truthfulness.[5]

[5] Quoted by Bentley Glass (see note 3 above) from J. Bronowski, Science and Human Values, Messner, New York, 1956, p. 73.