The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
OCR for page 109
Risk Communication

A Mental Model Approach to Risk Communication

M. Granger Morgan

When the word risk is mentioned, many of us first think in terms of a single value such as the expected number of deaths or injuries. However, a simple thought experiment can quickly convince us that things are more complicated. Suppose that we are considering introducing a new product. After careful market research we have determined that we can sell a number of them and make a profit, but there will be some net impact, D, on overall US mortality. What sorts of things do we need to know before we decide whether we are justified in introducing this product? In addition to the sign and magnitude of D, most people want to know things such as whether the risk is immediate or delayed, how equally or unequally it is distributed among different people, whether those at risk have any control over their exposure, whether there are intergenerational effects, how well the risk is understood, whether it is similar to other risks society already accepts, what the product does, who uses it, and how responsibility and liability will be distributed. As we pursue this simple question we quickly come to understand that risk is a multi-attribute concept. We care about more than just some measure of the number of deaths and injuries. Slovic, Fischhoff, and Lichtenstein82 have shown that one can group such attributes of risk into three broad factors which allow us to reliably sort risks into a "factor space." People's perceptions of risks, including their beliefs about the need for regulatory intervention, are a strong function of where a risk falls in this space. Risks posed by such common and well-known objects and activities as skiing, bicycles, and automobiles appear in the lower left corner

82Slovic, P, B Fischhoff, and S Lichtenstein (1980). Facts and fears: Understanding perceived risk, in Schwing, R and W Albers (eds.), Societal Risk Assessment. New York: Plenum.

112 BLOOD AND BLOOD PRODUCTS: SAFETY AND RISK

of the space. In contrast, risks such as those associated with nuclear power, asbestos, and pesticides lie in the upper right-hand corner of the space. Experimental psychologists have discovered that in making judgments, such as the number of deaths from a chance event, people use simple mental rules of thumb called "cognitive heuristics."83 In many day-to-day circumstances, these serve us very well, but in some instances they can lead to biases in the judgments we make. This can be a problem both for laypeople and for experts. Three such heuristics are particularly common:

Availability: the probability of an event is judged in proportion to the ease with which people can think of previous occurrences of the event or can imagine such occurrences.

Anchoring and adjustment: the probability judgment is driven by a starting value (anchor) from which people typically do not adjust sufficiently as they consider various relevant factors.

Representativeness: the probability that an object belongs to a particular class is judged in terms of how much it resembles that class.

The design of effective risk communication requires a recognition of the multi-attribute nature of risk, an awareness of the psychology of risk perceptions and judgment under uncertainty, and an analysis of the information needs of the people for whom the communication is intended. Developing a risk communication has traditionally been a two-step process. First, you find some health or safety specialist who knows a lot about the risk and you ask them what they think people should be told. Then you find someone who is called a "communications expert," who is usually either a writer or someone who works in public relations. You give them the information you got from the health or safety specialist and they decide how they think it should be packaged and delivered.
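The anchoring-and-adjustment heuristic described above can be caricatured in a few lines of code. The sketch below is purely illustrative; the anchor values and the adjustment fraction are invented, not drawn from the cited studies. It models a judge who moves only part of the way from an arbitrary starting value toward the evidence, so the final estimate stays biased toward wherever the questioning happened to start.

```python
def anchored_estimate(anchor, true_value, adjustment=0.4):
    """Model of insufficient adjustment: the judge closes only a
    fixed fraction of the gap between the anchor and the evidence."""
    return anchor + adjustment * (true_value - anchor)

# Two judges asked the same question from different starting points.
# The true value (hypothetical annual deaths) is 1,000 in both cases.
low_start = anchored_estimate(anchor=100, true_value=1000)
high_start = anchored_estimate(anchor=10_000, true_value=1000)

print(low_start)   # 460.0, dragged down toward the low anchor
print(high_start)  # 6400.0, dragged up toward the high anchor
```

The point of the toy model is only that two people given identical evidence but different anchors end up with very different judgments.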
If you think about it for a while, you will notice that two key things are missing from this traditional approach. First, it doesn't determine systematically what people already know about the risk. People's knowledge is important because they interpret anything you tell them in light of what they already believe. If some of those beliefs happen to be wrong, or misdirected, your message may be misunderstood. It may even lead people to draw conclusions that are exactly the opposite of what you intended. Second, the traditional method doesn't determine systematically the precise information that people need to make the decisions they face. There are formal methods,

83Kahneman, D, P Slovic, and A Tversky (eds.) (1982). Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press. Dawes, RM (1988). Rational Choice in an Uncertain World. Orlando, Florida: Harcourt Brace.

decision analysis, that can provide a precise answer to a question such as "What is the minimum set of information I need to make the decisions I care about?" Of course, most of us have never heard of decision analysis, and we don't use it in our daily decision making. For various reasons we typically require more than the minimum set of information to make the decisions we face. Yet, when you start reviewing traditional risk communication messages it is amazing how many of them fail to provide even this minimum set of information. Most of us already have some relevant knowledge and beliefs about any risk that a communication is designed to inform us about. Often we've already heard some specific things about the risk. If we haven't, we've heard about other risks which sound pretty similar. In any event, we have a lot of knowledge about the world around us. We have beliefs about how things work, and about which things are more and less important. When someone tells us about a risk we use all our previous knowledge and beliefs, called our "mental model," in order to interpret what we are being told. Finding out what someone already knows about a risk means learning about their "mental model." That's easier said than done. We could administer a questionnaire, but people aren't stupid. I have to ask questions about something. As soon as I start putting information in my questions, people are going to start using that information to make inferences and draw conclusions. Pretty soon I'm not going to know if the answers I am getting are telling me about the mental model that the person already had before I started quizzing them, or the new mental model that the person is building because of all the information I am supplying in my questions. To overcome these and other problems, we have developed a five-step method for creating, testing, and refining risk communication messages:84
1. Carefully review scientific knowledge about the risk, and summarize it in terms of a formal diagram called an "influence diagram."

2. Conduct open-ended elicitations of people's beliefs about the hazard, allowing expression of both accurate and inaccurate concepts. Use a "mental model interview protocol" that has been shaped by the influence diagram.

3. Administer structured questionnaires to a larger set of people in order to determine the prevalence of the beliefs encountered in the "mental model" interviews conducted in Step 2.

84Morgan, MG, B Fischhoff, A Bostrom, L Lave, and C Atman (1992). Communicating risk to the public, Environmental Science & Technology, 26: 2048-2056. Bostrom, A, B Fischhoff, and MG Morgan (1992). Characterizing mental models of hazardous processes: A methodology and an application to radon, Journal of Social Issues, 48: 85-100.

4. Develop a draft risk communication message based on both a decision analytic assessment of what people need to know in order to make informed decisions and a psychological assessment of their current beliefs.

5. Iteratively test and refine successive versions of the risk communication message using open-ended interviews, closed-form questionnaires, and various problem-solving tasks, administered before, during, and after people receive the message.

Suppose that I want to learn about the mental model that someone has for a risk such as the safety of the blood supply. In response to a question like "Tell me about the safety of the blood supply," most people can only talk for a few sentences before they run out of steam. However, those few sentences often contain five or ten different ideas. If the interviewer has been trained to keep track of all the things that are mentioned, they can then go on to ask questions that follow up on each one. For example, they might say "You mentioned that screening blood donors can improve safety. Tell me more about that. . ." By systematically following up on all the concepts that the subject introduces, a well-trained interviewer can often sustain a conversation about the risk for 10 to 20 minutes, introducing no new ideas of their own. Only in a later stage of the interview will the interviewer go on to ask questions about other key ideas which the subject did not bring up on their own. By conducting a number of interviews of this sort, we can begin to build up some sense of what people know and believe about a risk. Then in step three of the process, using a closed-form questionnaire, we can determine the relative frequency with which various beliefs actually occur in the general public.
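Step 3 of the method, turning questionnaire responses into belief-prevalence estimates, is mechanically simple. The sketch below is a hypothetical illustration: the coded belief labels and the responses are invented, not taken from any of the studies described here. It counts the fraction of respondents who expressed each coded belief.

```python
from collections import Counter

def belief_prevalence(responses):
    """Given one list of coded beliefs per respondent, return the
    fraction of respondents who expressed each belief."""
    n = len(responses)
    counts = Counter(b for r in responses for b in set(r))
    return {belief: c / n for belief, c in counts.items()}

# Hypothetical coded questionnaire data for four respondents.
responses = [
    ["screening_helps", "donation_risky"],
    ["screening_helps"],
    ["donation_risky", "tests_infallible"],
    ["screening_helps", "tests_infallible"],
]

print(sorted(belief_prevalence(responses).items()))
# [('donation_risky', 0.5), ('screening_helps', 0.75), ('tests_infallible', 0.5)]
```

A table of this kind is what lets the message designer prioritize which accurate beliefs to reinforce and which misconceptions to correct.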
Every time that we have conducted mental model interview studies to prepare a risk communication we have learned surprising and important things which have had a major effect on the message we developed. Some of the most important insights have come from common misconceptions. For example, in studies of radon, we learned that a significant number of Americans believe that once they get radon in their house, the house becomes permanently contaminated and there is nothing they can do about it. Of course, this is not true. Radon is a radioactive gas that decays into various particles which become nonradioactive in a matter of hours. Thus, if the source of radon gas can be closed off, all of the resulting radioactivity will soon be gone from the home. This is an important part of any risk communication message, because if you test your home for radon and find elevated concentrations, you can take various steps to prevent the radon from entering the home and thus reduce or eliminate the risk. How could people think that a house that has radon is permanently contaminated? Probably they

are extrapolating from other things they know. They have heard about radioactive waste from power plants and bomb factories that remains dangerous for 100,000 years. They also may have heard of houses that became chemically contaminated when they were sprayed with very long-lasting pesticides. They make a reasonable extrapolation (which in this case happens to be wrong) that radon is like these other cases. This is the sort of misconception that it is critical to know about if you are going to design an effective risk communication. When EPA designed their first Citizen's Guide to Radon, which was mailed to citizens all over the country, they didn't know about this common misconception. Their central message was "You should test your house for radon." However, many people who believe that radon permanently contaminates a house would probably be inclined to ignore this message, figuring they are better off not knowing whether their house has a high concentration of radon. For example, if they don't know, they could sell their house some time in the future with a clear conscience. In summary, in order to develop an effective risk communication one must recognize that risk is a multi-attribute concept. Building on the literature on risk perception and judgment under uncertainty, one should learn what people already know about the risk at hand, first through open-ended mental model interviews, and then through closed-form questionnaires. On the basis of this, and a careful assessment of the information people need to make the decisions they face, a first draft of the communication can be developed. However, there is no such thing as an expert in risk communication. The only way to be certain that a message works, that it is understood in the way that it is intended, is to try it out on real people. By using a variety of methods, including one-on-one read-aloud protocols and focus groups, the message can be iteratively refined.
Test it; refine it; test it again. Don't stop until it works.

DISCUSSION

Henrik Bendixen: How do you know when risk communication works effectively?

M. Granger Morgan: One strategy is to present people with a hypothetical situation and ask them what actions they would take. For example, we did a three-way study with the first version of the EPA Citizen's Guide to Radon and two brochures we developed from the results of our work. In terms of simple recall, regurgitating the facts, the three performed roughly comparably. Then we asked participants to respond to a question such as: "You have a neighbor who has just measured radon levels in their home of 10 picocuries per liter. What advice do you give them?" The people who read either of the brochures

we developed were able to give direct, cogent, and correct advice, but the people who had read only the EPA brochure had a lot of trouble. They basically were not able to provide an answer.

Risk Communication: Building Credibility

Caron Chess

The following true story illustrates the difficulty of relying only on numbers to explain risk. A high-ranking government official was speaking to a public meeting of hundreds of people about a proposal for a hazardous waste incinerator in their neighborhood. The audience was told that the incinerator would pose only a 1 in 1 million risk of an increased death from cancer. The crowd's response: "We hope you are the one."85 If this agency representative had been able to explain the 1 in a million risk more eloquently, would people have said, "Oh, now we understand. We will graciously accept your incinerator"? The answer, of course, is no. Nonetheless, senior scientists, administrators, and public health officials seek to improve their explanations of the risk numbers in hopes that the public will yield to experts' judgments. These experts are overemphasizing the impact of explanations of the mortality and morbidity data. I suggest that this and other environmental communication issues apply to communication about blood safety. In the following few pages I discuss three myths that scientists and policy makers believe about explaining risk and propose alternative approaches.

ROLE OF INFORMATION

One myth is that information changes behavior, while in reality information has little association with behavior. Although scientific experts are very careful to limit extrapolation beyond the data when dealing with their own disciplines, they tend to go beyond the social science data (and sometimes go where no social scientist has gone before).

85Hance, BJ, C Chess, and PM Sandman (1988). Improving Dialogue with Communities: A Risk Communication Manual for Government. Trenton: New Jersey Department of Environmental Protection.
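The arithmetic behind a "1 in 1 million" lifetime risk helps explain why the number alone rarely settles the argument. The sketch below uses invented population figures purely for illustration: even across a large exposed population, the expected number of additional cancer deaths is a fraction of a person, yet that expectation says nothing about the distributional, control, and trust attributes people actually weigh.

```python
def expected_excess_deaths(population, individual_risk):
    """Expected additional deaths = exposed population x per-person risk."""
    return population * individual_risk

# Hypothetical exposed populations near the facility, each facing a
# 1-in-1,000,000 (1e-6) individual lifetime risk.
for population in (10_000, 100_000, 1_000_000):
    print(population, expected_excess_deaths(population, 1e-6))
```

The crowd's retort in the story above is a reminder that a statistically negligible expectation still lands on some particular, involuntary "one."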

Experts who are not social scientists have in mind a model about the role of information that is flawed. Many non-social scientists erroneously believe that if you give people information about risk, they will change their attitudes and then they will change their behaviors. However, empirical research suggests that increased knowledge about a technology is not necessarily associated with support of that technology. For example, in a review of research on perception of the risks of nuclear energy, about one-half of the studies show that supporters of nuclear power have more factual knowledge than opponents. The other studies show that opponents of nuclear power know more than supporters, or that opponents and supporters have equal amounts of information.86 In the health education field there is an evolving consensus that knowledge has a weak link to behavior. For example, high school kids right now can pass tests of their AIDS knowledge, but they have not changed their behavior dramatically. Likewise, women may know that they are at risk for AIDS if they have unprotected sex with a partner who is HIV infected. That knowledge does not mean that women will insist that their partners use condoms. Advertisers who want to change behavior talk about messages and appeals rather than information: appeals to status, social approval, sex appeal, and so forth. Similarly, those who attempt to change behavior for the social good, an approach dubbed social marketing, develop communication strategies and messages based on what motivates people to change their behavior.87 For instance, years ago antismoking campaigns were replete with anatomically explicit photographs of damage to lungs caused by smoking. Somewhat later, there was a complete change in message, based on research about what motivates teenagers' behavior: television ads featured Brooke Shields telling kids that it was uncool to smoke.
When those of you in the blood industry are considering how to increase the pool of donors, you need to think in terms of social marketing. For example, the message that donating blood cannot lead to HIV infection, while a seemingly logical response to public fears, needs to be supported by empirical research on questions such as: To what extent does fear of HIV affect those people who have a record of donating blood? Does that fear largely affect those whose psychological portrait suggests that they would be donors? Or does it largely affect those who would be unlikely to give blood under any conditions? If the fear does affect people otherwise likely to give

86Johnson, BB (1993). Advancing understanding of knowledge's role in lay risk perception. Risk: Issues in Health, Safety, and the Environment, 4: 189-212.

87For example, Rice, RE, and CK Atkin (1989). Public Communication Campaigns. Newbury Park: Sage.
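The research questions posed above are, at bottom, a segmentation analysis: does fear of HIV concentrate among likely donors or among people who would not donate anyway? The sketch below is a hypothetical illustration with invented survey rows (the segment labels and field names are assumptions, not from any cited study); it cross-tabulates reported fear by donation history.

```python
def fear_rate_by_segment(respondents):
    """Cross-tabulate fear of HIV by donation history: return the
    share of each segment reporting fear."""
    rates = {}
    for segment in {r["segment"] for r in respondents}:
        group = [r for r in respondents if r["segment"] == segment]
        rates[segment] = sum(r["fears_hiv"] for r in group) / len(group)
    return rates

# Invented survey rows; segment encodes past donation behavior.
survey = [
    {"segment": "past_donor", "fears_hiv": True},
    {"segment": "past_donor", "fears_hiv": False},
    {"segment": "past_donor", "fears_hiv": False},
    {"segment": "never_donor", "fears_hiv": True},
    {"segment": "never_donor", "fears_hiv": True},
]

print(sorted(fear_rate_by_segment(survey).items()))
```

If fear turns out to be concentrated among people who would not donate under any conditions, a reassurance campaign aimed at the donor pool has little to gain, which is exactly the point of doing the research before crafting the message.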

blood, how well is their fear reduced by the message about the lack of connection between HIV and blood donation? Some existing empirical research begins to answer this question by distinguishing the attitudes of those who donate blood from those who do not. According to one study,88 people who give blood tend to associate blood donation with generosity, civic-mindedness, and usefulness, as well as feelings of assurance and relaxation. Those who do not donate blood connect donation with illness and discomfort. This study suggests that motivating donors might build on their positive associations, not merely factual information about the lack of connection between giving blood and AIDS. Attracting more donors requires more such research to determine feelings about blood donation, to develop potential messages based on that research, and to test those messages. If you want to change behavior, for example, encouraging people to donate blood or to become repeat donors, you need to understand their motivations, biases, and beliefs. This same need for understanding is also essential to discouraging people who continue to donate blood even though they know they are HIV-positive or that they may be at risk of HIV infection. If you are going to communicate with those individuals, you want to know what their attitudes and beliefs are and what motivates them to behave in this manner. When you do provide information, you should consider what your audience wants to know, not merely what you want to tell them. To provide this information, you need to know what your audience thinks is important, and knowing your audience requires further research.

THE ROLE OF RISK NUMBERS

Scientific experts tend to subscribe to a second myth: the public is influenced largely by the risk numbers. However, as the previous discussion suggests, individuals are influenced by factors other than the risk data.
A true story about a prominent Environmental Protection Agency risk assessor illustrates the influence of other variables.89 The risk assessor was in the hospital for tests, including one that had a slight risk of causing kidney failure. He found that more sophisticated diagnostic equipment, without the potential risk, existed at a hospital across town. He decided to take an ambulance to the other hospital to use the more sophisticated diagnostic equipment even

88Breckler, SJ (1989). Scales for the measurement of attitudes towards blood donation. Transfusion, 29: 401-404.

89Siegel, B (1987). Managing risks: Sense and science. Los Angeles Times, July 5, I, 28.

One of the classic cases that became notorious involved Chester Southam from Cornell Medical School, New York Hospital, conducting research at the Brooklyn Jewish Chronic Disease Hospital.104 He was interested in the body's rejection of cancer cells and thought that he could begin to solve some of the puzzles of cancer. He injected demented, senile old men with foreign cancer cells. When asked why he did this without telling the subjects that he was going to be injecting them with cancer cells, he said that if he had told them that it was cancer cells, they would have become much too frightened, would have exaggerated the risk, and would have refused to join the protocol. Because he didn't want to lose his subjects, he did not communicate to them. At one point he was asked why he didn't also inject himself, as was frequently the case with researchers. He responded that, "There aren't enough good cancer investigators." This is one of the problems of leaving it to the investigator to determine the risks. It was decided, with NIH as the driving force, that we would no longer leave the risk calculus to the investigator, on the grounds that the investigator was not a neutral figure in this calculation. The investigator's first charge was to accumulate knowledge. Whether the motive was knowledge, the next grant, or the prize, investigators as a rule underestimated the risks to which they exposed their subjects. The two major functions of the IRB are to calculate the risk-benefit ratio and ultimately to ensure that the consent process is implemented in a way that those risks and benefits are clearly communicated to the subject. What we have done in the area of human experimentation is to require a collective judgment on the risk by the investigator's peers, with a few other people also involved.
Equally important, the IRB insists that the finding on risk and benefit be told to the subject, so that the subject has the last word. The difficulties, the imperfections, and the impossibility of fully informing the subject are clear and apparent. The question is how far down the disclosure of risks do you go? When consent in this process was first started in the early 1970s, you could find spoofs in the major journals, deriding consent by listing every risk, with circumcision as the test case. In the end, we now require communication about risk to the subject, and we are going about it in a very decentralized way, leaving it to local IRBs, and to individual patients, to calculate how much risk should be taken. Were Creutzfeldt-Jakob disease part and parcel of an experimental protocol, knowing what we know now, would that risk be communicated to the subject? The answer would really depend ultimately on individual

104Ibid., pp. 77-93.

IRBs. They might bend over backwards in an experimental protocol and say, "Yes, that ought to be included." They probably would not rule out an experiment simply on the basis of that unknown risk being present, but they might. The key point is that they certainly would want the subject ultimately to make the decision about the risk. You are not going to find an existing national body that is going to make that decision on its own. You will find in elements of the risk calculus that we have accepted informed consent as the basis for approval. We anticipate through the IRB and consent that ultimately we do share risk analysis collectively and ultimately trust the choice made by the subject. We now tend on the whole in human experimentation to be risk minimizing. In the area of research, communication of risk, sharing information about risk, and calculating risk against benefit is not merely standard, it is mandated, and however imperfect in practice, it is altogether the requirement. If you look at the clinical setting, and allow me my choice of the case of radical mastectomy versus lumpectomy, you begin to find a second and very interesting model in terms of risk calculus. The decision in favor of radical mastectomy as almost the only procedure to be used in all cases of breast cancer lasted well into the late 1970s and even into the early 1980s. The ultimate reason was the surgeon's response of risk minimization. The shift from radical mastectomy to lumpectomy is complicated. Parts of the story involve radiology and oncology, as well as psychiatry. No roster of participants who moved us from mastectomy to lumpectomy would be complete without talking about women and patients' groups themselves. It would not be an exaggeration to say that although the change certainly was abetted, encouraged, and finally conclusively established in the territory of medicine, the role of women activists was absolutely critical.
There the risk calculus shifted the other way, weighed against the psychological and physical pains of a radical mastectomy. Where the choice turned out to be a realistic one, many women were prepared to carry the additional burden of risk in return for the benefit of not undergoing radical mastectomy. That conversation between a surgeon or oncologist and a woman patient is as complicated a conversation, medically and emotionally speaking, as one can imagine. The variables that go into the risks of each procedure are very exquisite. It is very difficult, intellectually and emotionally, to parse out the risks, and yet it is being done all the time. In this day and age, for a breast surgeon in this country to say, as I recently heard from a physician in Israel, "This is much too complicated, I am simply going to make the decision and not involve the woman," would be absurd. It may well be that an older patient, or a patient overwhelmed by the diagnosis, may turn to her surgeon and say, "I don't want to hear about it. Do what you want to do." That certainly happens. The right to say no to information certainly is to be respected, but

the standard is surely to share the information about risk, although it is complicated. In the area of clinical practice, we have clearly reached the point where sharing the decision-making about risk after intensive conversation is standard. It is fairly apparent that clinically many physicians have great difficulty communicating risks to their patients. Some are paternalistic. Physicians on the whole do not do a very good job communicating about risk. There is a fear that if they tell a patient all the risks, the patient won't be compliant. Physicians may also be trying to be protective of patients by trying not to worry them. It seems if you tell the patient there is a side effect, that patient will call you tomorrow and report the side effect. In summary, it is easy to spoof risk disclosure and point to the flaws, but I would suggest on the basis of my two models that the benefits of doing it far outweigh whatever the problems are. Furthermore, physicians probably don't have a lot of choice any longer. In experimentation, they certainly don't. Dealing with women and breast cancer, they don't. The list is going to get longer, not shorter. You may agree or disagree with one or another of my arguments, but in the end it is moving in exactly the direction I have described.

DISCUSSION

Paul Russell: There is no question that we must strive toward informed consent. There is also no doubt that it is a very flawed process and that we cannot get to total informed consent. Our responsibility lies with the issue of diminishing risks as you get out toward the end of the risk curve. It is difficult, but it is an important thing to do. I am concerned about it. How far should I go in telling people about risks?

David Rothman: In an experimental setting, I would think you would want to go further. In a therapeutic setting, the patient gives you cues. I am sure older patients, or maybe those less educated, might press you less.
Paul Russell: I go as far as the patient wants to go.

David Rothman: You are not going to bludgeon the patient with information, but if the patient is a long-distance runner and there is a 5 percent risk that this surgery is going to keep him off the track for the rest of his life, you might well want to tell him that. If the patient who comes in is rather corpulent, beer drinking, and poker playing, the 5 percent risk to a running career would be irrelevant. You will tailor it. You are going to go down the major items and

then, depending on that patient's tolerance for information, you go down below the normal list to include those that you think would be particularly relevant for the patient.

Paul Russell: I think that is a helpful guide, but it doesn't get me all the way.

Edward A. Dauer: In those instances in which a patient's perception or evaluation of a risk is not consonant with its actual quantity, is it the physician's responsibility to respect the patient's evaluation as it is, or to try to correct that perception and bring it back into line with the actual quantification?

David Rothman: The answer to that is absolutely yes, you have to bring it back into line. No patient is ever going to get to exercise informed consent on his own unless the physician helps him realize it. You have to correct. A patient may have a fear that is completely out of perspective. You will give the information. What do you then do when a patient still says, "No thank you"? You try a second time, but it may not be appropriate to try the fifth time. That you must make an effort is absolutely clear, but in the end, when you are content that the patient has understood you, then it remains the patient's choice.

Llewellys Barker: In telling patients what the risks are, the practical approach is very appropriate. I have been on both sides of this process. In your mind, does what we tell patients the risks are equate with "these are the risks we tolerate"?

David Rothman: I have a lot of trouble with "we." Is that the five people who are sitting in the lab or the five people at an NIH consensus conference? I might prefer "we" to become "I." The point of my remarks is to personalize it, individualize it, and communicate it. The "we tolerate" formulation may make it too easy not to communicate.


Communication of Risk and Uncertainty to Patients

Donald Colburn

I wear a lot of hats in the world of hemophilia. I have hemophilia, severe Factor VIII deficiency. I also have HIV disease, and I happen to be president and chief executive officer of American Homecare Federation, which is a company that provides service to people with hemophilia. In addition, I am currently serving as a member of the National Hemophilia Foundation's (NHF) Blood Products Monitoring Committee, and I am chairing the legislative campaign for passage of the Ricky Ray Relief Act.

I am going to discuss the general concept of blood safety and take a look at how we currently educate folks. Having been on the receiving end of that education, I would say that I have not experienced the inclusion of the patient in decision making that you have discussed. The attitude of the physicians is still, "Do what I say. This is good for you. If you don't do it, you will be more ill."

With respect to blood and blood products, I believe we should be moving toward a signed informed consent. We can certainly talk about how far we go in levels of exploring uncertain risks. Unfortunately, the process of sharing the information necessary to institute informed consent is time-consuming, and, as we know, the health care system is already operating with a lack of quality time spent with patients. The challenge we have is to devise a mechanism that will allow blood bankers, for instance, who are very interested in blood safety issues, to serve a broader role in the process of educating patients. There has been a great deal of discussion about the pamphlet that is put out by the American Association of Blood Banks (AABB) in conjunction with the Council of Community Blood Centers (CCBC) and the Red Cross. Designed for recipients of cellular blood products, it is supposed to mimic a package insert as far as information goes.
To date I have not found one person with hemophilia or another condition who received this pamphlet at the time of receiving a cellular product. Even that mechanism isn't being utilized. What can we do that will allow some "right of information" for that patient? For

starters, we can provide signed and informed consent with chronic users of blood products on an annualized basis.

In addition, we first must make information understandable for people with various levels of education. The second thing that we have to do is to present the information in an unbiased manner. Then we have to be certain that the patient is up to date with the information when he makes that decision. Once there has been uncertainty about a particular product, that uncertainty hinders communication even more, as we experienced with HIV. The patient, as the blood products' consumer, should have the ultimate role in the decision. Without that type of participation, without informed consent that somebody has on file somewhere, the whole thing becomes bogus if a disease is transmitted.

To sum up, it is important, tremendously important, that we present materials to people in an unbiased fashion that they can understand. One would like to think that the patient would have many more questions, especially in a situation involving chronic use. As far as communicating the risk of uncertainty, that is a tough one. Blood goes through various processes which eradicate most diseases most of the time. However, there are many conditions that are not tested for and that we don't know about. Those are unknown risks. It is important that we don't panic the person, but that he or she understands that there will always be risks in any medication. Even recombinant products have a risk factor. That is part of life.

We have to make a sincere effort to bring the patient into the loop, and in some areas it works very well. It works really well in discussion groups like this, but when you get back out into the field you often hear remarks from folks like, "Doc just said that this is what I should do," remarks which reinforce the continuing absence of shared information and informed consent.
At NHF we are working very hard to empower our consumers to come back to the medical community with a lot of questions. We think it is important. We have had a very severe lesson as a result of our own complacency.

DISCUSSION

Paul Russell: Having the patient specifically sign makes a lot of sense, and that is really pretty much what we are trying to do throughout. In the case of transfusion it could be applied a little more broadly and repeatedly in the case of patients like those you are familiar with.

Harvey Klein: Informed consent for blood transfusion has been national policy since 1986. I don't know whether that is the standard of care in the outside world and whether every institution practices that, but the AABB, for example,

has had that written form widely disseminated to every one of its members since 1986. The circular of information, the so-called package insert, was not really developed for patient use any more than the package inserts in the drugs that you get. The package inserts were developed to inform the physicians, whose job it is to inform the patient who receives a blood transfusion.

Paul Schmidt: There was a study about 3 years ago in which a group of investigators doing informed consent talked to patients after surgery and then some months later. The patients didn't remember what had been discussed, and their understanding was different.

Donald Colburn: You are always going to have that obstacle. My emphasis is on the chronic user of medical services. There the time is importantly spent, because that patient is going to take up a lot of the medical services' time. The better informed that person is, the better the relationship will be. The difficulty that we have observed in the hemophilia community is that when there is a perception that there has not been the opportunity to participate in the therapeutic regimen decision, then all the other systems become distrusted. That breach perceived by the individual causes a great deal of harm, even if the patient would not have remembered everything he or she was told.

Henrik Bendixen: The point is that informed consent is not a freestanding event but should be part of the continuum of interaction between physician and patient.

David Rothman: The World Wide Web on the Internet is becoming a real player. A colleague of ours talked about the fact that his patients ask where the newest protocol is. The AIDS community brought us into that consumer-based knowledge concerning protocol availability. If you don't have a protocol going, they are not coming to your institution.
Although I know those stories about how if you poll the patient 3 months later he doesn't remember the consent form, there are lots of newer stories about patient groups that are all on the Internet communicating with each other in terms of where the best protocol is.

Harold Sox: At Dartmouth there has been a reverse epidemic of operations for benign prostatic hypertrophy as a result of Wennberg and Mulley's work.105 They developed video disks that convey in an individualized way

105Kasper, JF, AG Mulley Jr, and JE Wennberg (1992). Developing shared decision-making programs to improve the quality of health care (comments). Quality Review Bulletin, 18(6): 183-190.

the risks and benefits of a procedure with respect to the various therapeutic options. As a result, patients bring a lot more information to the discussion with the physician. The process of the discussion is at least enlightened and may even be truncated somewhat. This might be another instance in which annual provision of informed consent might be in order.

James Reilly: Whether we are a blood center providing blood to the transfusion center or a product manufacturer providing products to a treatment facility, where in your view is the communication breakdown? Is it that we are not providing enough to the physician, or that the physician is not communicating to the patient?

Donald Colburn: I did ask a number of blood bankers why that circular isn't used, for instance, because it is supposed to go up with every unit. I started to probe in that direction, and the responses I got were amazing. "When we first got them we started sending them up, and all of a sudden I got calls from about five or six different hematologists. 'What the hell are you sending up to my patients? If I want them to know anything, I will tell them.'" The overall problem is in physician communication to the patient. I see it as time management, because it takes a long time to be certain someone is educated.

Frederick Manning: One of the projects I have been involved in here at the Institute of Medicine involved an experimental drug and the whole question of informed consent: how much is enough; how much should you tell the patients. It turned out the drug did not work out all that well. There was a lot of talk from patients about what they should have been told. In the process we did a little poll of the committee, who were all relatively eminent clinical researchers.
The question was whether they could recall in their careers ever having a potential subject in one of their experiments who listened to their pitch, read the informed consent form, and said, "No, I am not interested." In fact, none of them could think of a single instance in which that happened. For the audience today, I would ask you to think back and tell us how many times you have had a patient leave the office after you described what the risks were. Maybe there are some inherent limitations on how scared you can make a patient who has already come to you and decided that he or she is going to trust you.

Paul Russell: I have certainly had patients decide against surgery after I explained the risks. We have immensely complicated clinical protocols that have to be applied to almost every patient we see. When they come in for an organ transplant they may have to sign about half a dozen consent forms. It is not uncommon to have one or two of them be inappropriate. Their refusal

may not be because they are scared, but because it is an immensely complicated setting. I think you are thinking of a setting with a simple, straightforward, single question. In the situations that I deal with, there is an awful lot of experimentation of all kinds, one superimposed on top of another. There are patients who are in several protocols at once.

David Rothman: There are protocols out there that don't get to enroll patients. There will be a research team to which the treating physician may or may not dispatch his or her patient. For example, at our institution the investigator who really wants to figure out if bone marrow transplantation has any efficacy in breast cancer cannot get patients to enroll, because if you want it, it is out there to be had, and if you don't want it, you are not going to leave yourself to a random draw. No one goes into the protocol.

Elaine Eyster: There are two very different types of protocols. One is the type that has been referred to, in which we have a variety of experimental studies for persons who are infected with HIV. We present those, and patients may or may not choose to go on that protocol. It doesn't interfere with our relationship with them, but they have that choice. On the other hand, we have a routine informed consent that everybody signs and re-signs on some sort of interim basis, saying that they agree to participate in a study in which their blood can be used for certain purposes. Most patients will accept that as part of what we do, but others will say, "No, I don't care to do that." That wish is honored, and they are not included in whatever study you are doing.

William Sherwood: There is also a great deal of informed consent that is really for comfort. Virtually all hospitals have instituted an informed consent for transfusion, particularly prior to surgery.
There are a couple of paragraphs given to the patient at a time when that patient has a lot of other things to sign, and the patient is under a good deal of stress, having surgery the next day or the next morning. What we are searching for is just how far to go with informed consent. An interplay with the patient to try to find out how far to go was suggested, but I had trouble with that. I would like to be more uniform. I would like to be able to tell all patients what the potentials are and what I should be doing. Is there a way to stratify these risks as proven, potential, and theoretical, and where should we draw those lines? It is difficult to handpick something for each patient when you know in the long run that the patient will be disaffected in some way and later come back and say, "Why didn't you tell me about that risk?" I cannot come back and say to them, "I didn't think you needed to know it." We need more guidance on how deep to go in the risk spectrum in helping patients understand.

David Rothman: When an institutional review board (IRB) does a risk assessment for informed consent, it lists the major potential side effects by percentage; it keeps going down to maybe include some in the 10 percent range, but it probably doesn't go down to the 1 percent range. I don't know that all the patients read it, but they are certainly going to read the major untoward effects. The kinds of complaints that often end up being told to me are ones in which the patient has been given a prescription, but an ordinary side effect was not mentioned to the patient. The patient experiences it, calls the physician the next day, and the physician says, "Relax, it is one of the side effects," to which the response is, "So, why didn't you tell me?" We need to be between the extremes of parsing it out to 1 percent and telling a patient, "Look, you are going to be taking this. Here are the major side effects."

Donald Colburn: I have signed off on admission to a hospital on a blood form. What you sign is incredible. If you sit there and challenge the document, you become a troublesome patient at that point. If you say, "I don't understand this," the person who is doing the intake to put you into the hospital is a clerk for the most part and responds, "Sign these forms." You ask, "What happens if I don't sign them?" and the clerk says, "We cannot put you in the hospital." You sign the forms. There is some degree of pressure put on the patient at that stage, in addition to whatever else may have been tried to be communicated.

Harvey Klein: Many good hospitals don't consider informed consent a form given at the time of intake with a clerk. That really isn't informed consent.

David Rothman: Exactly. What could be a worse time? We have some data on hospitals trying to do advance directives at the time of admission. Could there be a worse time to do anything, especially because somebody knows you are coming?
James Allen: When you talk about listing potential adverse side effects in the 10 percent range but not the 1 percent range, is that a 10 percent risk of occurrence versus a 1 percent risk of occurrence? Surely, for serious effects such as risk of HIV infection post-transfusion, a risk of one in several tens of thousands is considered highly significant and certainly a "must notify."

David Rothman: You are absolutely right.