8 Decision Making

The extent to which teams successfully deal with transitions to crisis situations depends heavily on the extent to which the right actions are chosen: the process of decision making. Because of the great publicity that decision-making failures often receive, we typically hear more about faulty decisions than successes in crisis situations. For example, the disaster at Three Mile Island resulted in part because operators in the control room made a decision to shut off an emergency feedwater pump, having misdiagnosed the state of the reactor (Rubinstein and Mason, 1979). Less publicized because of their less disastrous outcomes were the more appropriate decisions taken in response to nuclear power emergencies at the Davis-Besse and Brownsville plants. The flight crew on board the Air Florida Boeing 737 at Washington's National Airport in 1982 exhibited faulty judgment in their decision to take off in icy conditions without requesting an immediate deicing (O'Hare and Roscoe, 1991; Van Dyne, 1982). The sluggishly handling plane never gained necessary air speed and crashed into the 14th Street Bridge in Washington, DC. In contrast, the problem-solving and decision-making sequence followed by the flight crew of United flight 232, which suffered a total hydraulics failure over Iowa, was testimony to the good judgment of the team in crisis, bringing a totally crippled and nearly uncontrollable jet to earth (see Chapter 10). For tank crews, also, the decisions made under combat stress have a major bearing on the success of the mission and the survivability of the crew. Does one choose to engage the enemy or not? Is it worth risking
exposure by following the shortest path to an urgent destination? Or should one follow a longer, slower route, maintaining a low-profile, hidden position?

In most teams, decision-making responsibilities fall most heavily on the team leader: the tank commander, the airplane pilot in command, the hospital physician, or the fire chief. These individuals possess the ultimate responsibility for choosing the appropriate course of action, but other members of the team also play a critical role in communicating the information on which the optimal decision can and should be based.

Analysis of the decision-making case studies mentioned above and of numerous others reveals that shortcomings in decision making may result from limitations in a number of the processes necessary to execute a decision, from initial information gathering to final choice. In particular, it is possible to partition the process into two overall phases: the acquisition and maintenance of the situation awareness necessary to diagnose or estimate the most likely state of affairs, and the selection of a course of action. While each of these phases can itself be analyzed further, as indicated in Figure 8.1, it is instructive to consider examples of how faulty decisions can result from a breakdown of each. The operators at Three Mile Island, for example, made the right choice of action given their diagnosis of the state of the plant, which, unfortunately, had been faulty. A correct choice of action was also made by the commanding officer on the U.S.S. Vincennes (Klein, 1989; Slovic, 1987; U.S. Navy, 1988). Given the information provided to him, that the aircraft approaching the ship was probably hostile, his decision to launch a missile was probably the best one. The tragedy that followed, when the missile struck a civilian airliner, resulted because the identification of that aircraft was in error.
In contrast, the Air Florida crash did not result because the crew failed to diagnose the icing conditions, but because their choice of action was inappropriate given those conditions. In the first half of this chapter, we treat these two phases of decision making separately before discussing how both are related specifically to the transition period. The section concludes by recommending some possible remediations to guard against decision failures.

HEURISTICS AND BIASES IN HYPOTHESIS FORMATION

A necessary, but not sufficient, precondition for good decision making is the existence of a correct hypothesis of the most likely state of the world within which the chosen action will be carried out. Wickens and Flach (1988) have presented an information processing model, or framework, of the hypothesis formation process, shown in Figure 8.1. Within the model, various biases and heuristics (Kahneman et al., 1982) are identified that may create distortions in hypothesis formation and situation awareness. In the
following, each of these is illustrated as it might be or has been manifest in an applied setting.

[Figure 8.1 diagram: cues feed perception and attention; working memory, supported by long-term memory, supports hypothesis generation and situation assessment (diagnosis); diagnosis feeds action generation, choice, and action, with risk assessment and a decision criterion influencing choice. The biases marked in the figure are the salience bias, representativeness heuristic, as-if heuristic, availability heuristic, confirmation bias, and framing bias.]

FIGURE 8.1 A model of decision making. Biases and heuristics, denoted by letters surrounded by a square, are discussed in the text. Source: Wickens and Flach (1988). Copyright held by Academic Press. Reprinted by permission.

When integrating multiple sources of information to formulate a hypothesis, the salience bias describes the decision maker's tendency to focus on the most salient (loudest, brightest, most prominent) cue, rather than on that which may be most informative and diagnostic, when these are not the same. As one example, in the analysis of the incident aboard the U.S.S. Vincennes in which the Iranian airliner was shot down, it was apparent that the salient writing of the word F14 on a message board in the combat information center, following the uncertain hypothesis of the aircraft's identity, contributed to the ultimate misidentification of the aircraft.

Which hypothesis a decision maker chooses to base his or her actions on depends very much on which hypothesis is most available in memory, rather than on which may in fact be the most likely in the circumstances. This
is known as the availability heuristic (Tversky and Kahneman, 1974). Thus, the tank commander will find most available the enemy's plan of attack for which he had just been briefed, or which he had encountered in a recent drill, because these would be easily brought to mind. Analysis of the Vincennes incident revealed that the misdiagnosis was the result of interpreting the actions of the radar contact in terms of a predefined script of a hostile attack (U.S. Navy, 1988): that is, an easily recallable sequence of events that would be likely to occur in combat.

Once a tentative hypothesis is formulated, based perhaps on available memories and salient information sources, two closely related forces join to increase the likelihood that available hypotheses will prevail and alternatives will not be considered. The anchoring heuristic describes the tendency to stay with a current hypothesis and to consider new information that might shift one's beliefs in favor of a different hypothesis less than adequately (Tversky and Kahneman, 1974; Van Dyne, 1982). The confirmation bias (Klayman and Ha, 1987; Tolcott et al., 1989) describes the decision maker's tendency to seek new information that supports the currently held hypothesis and to ignore (or at least downplay) information that may support an alternative hypothesis. Both anchoring and the confirmation bias seem to have been partially responsible for the disaster at the Three Mile Island nuclear power plant, in which operators focused on the inappropriate hypothesis that the water level was high and ignored critical display information suggesting just the opposite (i.e., that the water level was too low, which turned out in fact to be the correct hypothesis). It is easy to envision how the tank commander, with a preconceived hypothesis regarding the nature of enemy intentions, will interpret ambiguous evidence as consistent with these intentions. Tolcott et al.
(1989) have found that such a bias is present in the performance of Army intelligence analysts.

EXPERTISE IN DIAGNOSIS

The information flow in Figure 8.1 suggests that human decision makers go through a time-consuming computational process of evaluating and interpreting evidence, relying heavily on the limited capacity of working memory. Yet under time pressure and in potential crisis situations, there is good evidence that expert decision makers (the skilled tank commander, pilot, or nuclear power plant control room operator) may adopt a very different strategy of hypothesis diagnosis in which they simply match the available evidence with the most similar experience already stored in long-term memory (Ebbesen and Konecni, 1981). Klein (1989), for example, has documented that expert fire crew chiefs, when diagnosing the nature of a fire upon first arriving at the scene, go through such a pattern-match process, as do expert (but not novice) tank commanders in simulated battle
games. The viewed scene is simply compared with a series of mental representations of typical scenes from past experience, to determine which one is an adequate match. In a study of pilot decision making, Wickens et al. (1987) and Barnett (1989) inferred that highly experienced pilots (those with more than 500 flight hours) used a qualitatively different style of pilot judgment than novices, relying less on working memory and more on direct retrieval of the appropriate solutions from long-term memory.

In spite of the apparent advantages of this pattern-matching decision strategy in some contexts, particularly those involving time pressure, the limits of this approach should be clearly noted as well. On one hand, such a technique may be applied effectively only in the domain for which expertise has been developed. Thus, for example, Klein observed that fire chiefs used pattern-matching diagnosis behavior when they diagnosed the nature of a fire, but not when they needed to deal with administrative commands and personnel decisions. On the other hand, although the pattern-matching technique is more rapid, less resource demanding, and qualitatively different, it is not necessarily more accurate than the time-consuming computational technique used by novices. Indeed, Wickens et al. (1987) found little difference in the accuracy of judgments made by high-time versus low-time pilots, only that high-time pilots were more confident in their decisions. As noted in Chapter 4, Koehler and McKinney (1991) found that in-flight diagnoses of expert pilots were no better than those of less experienced pilots, and actually suffered more when the problems were nonroutine.
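The pattern-matching style of diagnosis described above can be caricatured in a few lines of code. This is a minimal sketch, not anything proposed in the chapter: the fire-ground cues, the stored "typical scenes," and the similarity threshold are all invented for illustration.

```python
# Illustrative sketch of recognition-primed diagnosis as nearest-prototype
# matching. All cue names, stored cases, and the threshold are hypothetical.

def similarity(cues, prototype):
    """Fraction of shared cue slots on which the scene matches a stored case."""
    shared = set(cues) & set(prototype)
    return sum(cues[k] == prototype[k] for k in shared) / max(len(prototype), 1)

def diagnose(cues, memory, threshold=0.75):
    """Return the best-matching stored situation, or None to signal that the
    slower, computational diagnosis of the novice is needed instead."""
    label, proto = max(memory.items(), key=lambda kv: similarity(cues, kv[1]))
    return label if similarity(cues, proto) >= threshold else None

# Hypothetical fire-ground scenes stored from past experience.
memory = {
    "basement fire": {"smoke": "dark", "heat": "floor", "visible flames": False},
    "attic fire": {"smoke": "light", "heat": "ceiling", "visible flames": True},
}
scene = {"smoke": "dark", "heat": "floor", "visible flames": False}
print(diagnose(scene, memory))  # matches the stored "basement fire" case
```

A near-threshold match is where the brittleness lies: a scene that resembles a stored case but differs in some key respect can still be classified as identical to it, which is one route to the expert failures noted above.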
In accounting for these failures, it is reasonable to assume that, because the pattern-matching approach forces a set of environmental cues to match a stored template in memory, it may be relatively more susceptible to the biases in hypothesis formulation discussed above (confirmation, anchoring, and availability). This is because a situation that is generally similar to the mental representation of past experience, but different in some key respects, could be classified as identical, with those key differences simply ignored (i.e., anchoring on what is available from past experience). Unfortunately, with the exception of the Wickens et al. (1987) study, few data are available regarding the relative quality of pattern matching versus computational decision making in applied environments. Furthermore, the studies discussed above have defined expertise in terms of years of experience, rather than (necessarily) a high quality of performance. It will be important to continue research that examines decision style and quality as a function of expertise, unconfounded with experience.

CHOICE

Classic decision theory has focused its efforts on ways to integrate the uncertainty and the value associated with prospective outcomes of decision
alternatives. That is, a given choice is assumed to have a number of possible outcomes (depending on the uncertain state of the world in which a choice is made) (see Figure 8.2). The option that is favored is assumed to maximize some subjective quantity or preference for the decision maker. It is often assumed that this quantity is the subjective expected utility, which is computed as the utility (subjective value) of each possible outcome for the choice, multiplied by the subjective probability that that outcome will be observed. There are numerous alternative models that have been applied to decision making. For example, it might be assumed that decision makers will minimize losses or will pick the least effortful decision that has some minimum level of expected gain (Slovic, 1987).

FIGURE 8.2 A hypothetical risky choice. Decision Option A will yield one of two possible outcomes, depending on whether the state of the world is 1 or 2. These states are not known for sure, but are estimated (diagnosed) with probabilities P1 and P2, respectively. If state 1 is in effect, outcome A1, which has utility UA1, will be obtained. State 2 will yield outcome A2, with utility UA2. In contrast, decision Option B will yield a certain (nonrisky) outcome with utility UB, no matter which state is true.

However, if one option
promises more attractive outcomes given some circumstances, but the chances of those outcomes are lower than those of less appealing outcomes promised by competing alternatives, how should the conflict be resolved?

A critical concept in such analysis is that of risk, typically characterizing an option for which the two or more possible outcomes associated with the choice may differ in their probability of occurrence and do differ in their utility (i.e., cost or benefit). This situation characterizes Option A in the figure. For example, a low-probability outcome may be associated with dire consequences and a high-probability outcome may be associated with consequences that are far more benign. In the nuclear power industry, a decision to keep a reactor on line when it has shown a faintly suspicious symptom might be such an example. There is a small probability of a major disaster if the symptom really does herald a failure, but a high probability that nothing is wrong.

How bad such a risky option is perceived to be (its expected negative utility) will, of course, depend on how large the perceived probability of the negative outcome is. There are in fact a number of sometimes conflicting influences on perceived probability that may bias the estimate in different directions. For example, very rare events are typically overestimated (explaining why people believe they will win by entering lotteries) (Sheridan and Farrell, 1974; Wickens, 1984). This overestimation is particularly likely to occur when low-probability events are well publicized so that they become available to memory (Slovic, 1987). Humans have a tendency to be overconfident of their own likelihood of success: to underestimate the probability that infrequent, bad things will happen to them (as opposed to someone else) (Bettman et al., 1986). This is one example of the general bias toward overconfidence in human diagnosis and choice.
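The subjective expected utility comparison in Figure 8.2, and the way a distorted probability estimate can flip it, can be sketched numerically. The probabilities and utilities below are invented for illustration; they are not taken from the chapter or from any real plant analysis.

```python
# A minimal sketch of the subjective expected utility (SEU) rule:
# sum over outcomes of (subjective probability x utility).
# All numbers below are hypothetical illustrations.

def seu(outcomes):
    """Subjective expected utility of a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Option A (risky): keep the reactor on line despite a suspicious symptom.
p_disaster = 0.01
option_a = [(p_disaster, -10_000), (1 - p_disaster, 0)]  # rare disaster vs. nothing wrong
# Option B (sure thing): shut down and accept a certain disruption.
option_b = [(1.0, -50)]

print(seu(option_a))  # -100.0: on these numbers the risky option is worse
print(seu(option_b))  # -50.0

# If the decision maker underestimates the rare bad outcome ("it won't
# happen to me"), the same risky option comes to look attractive:
perceived = [(0.001, -10_000), (0.999, 0)]
print(seu(perceived))  # -10.0: keeping the reactor on line now "wins"
```

The point of the sketch is that the choice rule itself is trivial arithmetic; the biases discussed in the text act on the probability estimates fed into it.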
As shown in the figure, risky options are often paired against "sure thing" options, for which the consequences are reasonably well known: shut the nuclear reactor down and the power plant will surely suffer some disruption, but it also will surely avoid disaster. Research by Kahneman and Tversky (1981, 1984) into the heuristic known as framing suggests that, when choosing between a risky and a sure thing option, people respond differently when the outcomes of both options have positive expected utilities than when both options have outcomes with negative expected utilities. Teams in crisis usually are confronted with a pair of negative outcomes. The tank commander might, for example, choose between a safe retreat, with a sure loss of position but sure preservation of safety, and a risky advance, with a low probability of encountering fatality-inducing battle conditions. In analogous circumstances, when the choice is between negatives, Kahneman and Tversky found that people usually are biased to choose the risky option. When, in contrast, the choice is viewed as one between positive outcomes, the choice is more likely to be biased toward the sure thing safe option.
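The striking feature of framing is that the two framings are arithmetically identical; only the description changes. A small sketch makes this explicit. The crew size and probability below are hypothetical numbers chosen for illustration, not figures from the chapter.

```python
# The tank commander's choice framed as losses and as gains.
# The numbers are invented for illustration.

p_fatal = 0.2   # hypothetical chance the risky advance meets fatal conditions
crew = 4

# Loss frame: expected crew members lost.
risky_losses = p_fatal * crew        # 0.8 expected lost on the risky advance
sure_losses = 0.0                    # retreat preserves the crew (but cedes position)

# Gain frame: expected crew members saved.
risky_saved = (1 - p_fatal) * crew   # 3.2 expected saved on the risky advance
sure_saved = 4.0

# Arithmetically the frames agree: saved + lost equals the crew in each option.
assert risky_losses + risky_saved == crew
assert sure_losses + sure_saved == crew
# Yet the findings described in the text predict risk seeking under the loss
# frame and risk aversion under the gain frame, for the very same decision.
```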
The important feature of this framing bias is that the very same decisions may be framed positively by emphasizing the good characteristics (e.g., probability of winning the battle, preserving crew lives) or negatively (e.g., probability of losing the battle, encountering fatalities), and the difference in framing will influence the choice that is made.

As with our discussion of diagnosis, so also with choice: it appears that expert decision makers do not generally employ a fully analytic strategy. They do not carefully weigh all the alternatives, outcome utilities, and probabilities before arriving at a choice. Rather, evidence suggests that these processes, too, are often circumvented by a decision based on direct memory retrieval. If, in a particular circumstance (evaluated by diagnosis), a choice has proven successful in the past (yielded a favorable outcome), it will be chosen again. Any action that has yielded this outcome can be chosen, not necessarily that with the highest expected utility (Klein, 1989). Such a strategy, in which the first alternative that satisfies all relevant attributes is selected, has been labeled satisficing (Simon, 1955). This characterizes the verbal protocols that fire chiefs give when describing the tactics they choose to fight a fire (Klein, 1989).

Analogous to our treatment of diagnosis, the alternative direct memory retrieval method of decision making shown by experts is not necessarily better than the strategy shown by novices, although it is more rapid and made with less effort. The strategy will lead to the choice of actions that are familiar and easy to recall. Hence, certain biases toward availability may be shown. This appears to be an adaptive strategy in times of stress, when time pressure is intense, as will be the case in most transition situations. We consider now the potential effects of the transition on decision quality.
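Before turning to transition effects, the contrast between satisficing and exhaustive utility maximization described above can be sketched in code. This is an illustrative toy, not the chapter's method: the tactics, attribute values, thresholds, and weights are all invented, and real fire-ground options are of course not numeric vectors.

```python
# Sketch contrasting satisficing (take the first adequate option, in recall
# order) with maximizing (score every option first). All values hypothetical.

def satisfice(options, thresholds):
    """Return the FIRST option (in recall order) meeting every threshold."""
    for name, attrs in options:
        if all(attrs[k] >= v for k, v in thresholds.items()):
            return name
    return None

def maximize(options, weights):
    """Score every option before choosing: slower, but order-independent."""
    return max(options, key=lambda o: sum(w * o[1][k] for k, w in weights.items()))[0]

# Tactics in the order a fire chief might recall them (familiar ones first).
options = [
    ("interior attack", {"speed": 0.9, "safety": 0.6}),
    ("defensive perimeter", {"speed": 0.5, "safety": 0.9}),
]
print(satisfice(options, {"speed": 0.5, "safety": 0.5}))  # interior attack
print(maximize(options, {"speed": 1.0, "safety": 2.0}))   # defensive perimeter
```

The two rules disagree here precisely because satisficing stops at the first adequate, most easily recalled option: the availability bias noted in the text, traded for speed.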
TRANSITION EFFECTS

The influence of the transition process on decision making may be roughly described from two perspectives: the exchange between pretransition effort and post-transition performance, and the overall influence of stress on transition decision performance.

Pre-Post Exchange

The first section of this chapter has described the extent to which effective decision making is based on accurate situation awareness. One can easily imagine that efficient and accurate judgments in a post-transition period will depend on the fidelity of knowledge gained in the pretransition period. Three kinds of knowledge and preparation would seem to be of use here. The first concerns static knowledge of features that are unlikely to
change. For the tank commander, this characterizes knowledge of the geography of the area, the acquisition of which was described in the previous chapter. For the nuclear power crew, this static information is characterized by long-term knowledge of plant operation as well as the more transient knowledge of the repair status of the plant: which lines are open and which are closed. For a disaster relief coordinator, it may involve knowledge of which units and equipment are located where. There is a high payoff in investing resources into acquiring this static knowledge or situation awareness, since it will be unlikely to change and can then be used as a basis for fast and effective decision making after the transition.

Second, there is a class of knowledge and information that can be gathered prior to transition that has uncertainty associated with it. Any decision that will be taken in response to future meteorological conditions, for example, must certainly be of this form. So also will a tank commander's decision based on intelligence about what an enemy might do (Scott and Wickens, 1983). In this instance, it would seem important to prepare for and consider not only the most likely hypothesis (or state) based on the available cues, but also those conditions of less, but nonzero, likelihood. In short, it is valuable to develop a weighted contingency diagnosis in which crews are prepared for alternate states in proportion to their degree of likelihood. The prepared decision maker should guard against a sharpening of preparedness only for the most likely state. In this way the team will be less likely to misidentify and falsely classify a particular situation, succumbing to the heuristics of anchoring and availability. An important corollary of this strategy is that estimates of uncertainty should be carefully preserved as situation information or intelligence is relayed from person to person or unit to unit.
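The weighted contingency diagnosis just described has a simple arithmetic core: spread preparation across hypotheses in proportion to their likelihood instead of concentrating it on the single favorite. The enemy courses of action, probabilities, and hours below are invented for illustration.

```python
# Sketch of weighted contingency preparation. Hypothetical numbers only.

def allocate_preparation(hypotheses, total_hours):
    """Split preparation time across states in proportion to likelihood."""
    total_p = sum(hypotheses.values())
    return {state: total_hours * p / total_p for state, p in hypotheses.items()}

enemy_courses = {"frontal assault": 0.6, "flanking attack": 0.3, "feint and withdraw": 0.1}
plan = allocate_preparation(enemy_courses, total_hours=10)
print(plan)  # roughly 6, 3, and 1 hours respectively
# Sharpening on the 0.6 hypothesis alone leaves the crew anchored, and
# unprepared, should a lower-likelihood state turn out to be true.
```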
It was in part the failure to relay uncertainty information up the chain of command that led to the disaster in the U.S.S. Vincennes incident (U.S. Navy, 1988).

Third, and most obviously, post-transition decision making can be facilitated by the rehearsal of and preparation for contingency response plans. As noted in Chapter 3, greater pretransition preparation will lead to more efficient post-transition response. But here again, care must be taken to guard against blind following of a preprogrammed procedure, without monitoring its ongoing appropriateness and without entertaining a willingness to modify or abandon the procedure as needed (i.e., to guard against the confirmation bias) (Woods and Roth, 1987).

Stress Effects

The second relevant transition effect on decision making concerns the specific effects of post-transition stress on decision-making performance. Given that many of the stress effects were covered previously in Chapter 4, we
present here only a brief review of how the combined effects of noise, danger, and time pressure might be expected to amplify decision-making biases.

Communications

Noise will have a clear and direct degrading effect on communications: the exchange of auditory information necessary for effective decision making. We might also anticipate a specific form of stress bias for the listener to expect, and therefore hear, the subjectively most likely message. Such a bias was a clear factor responsible for the KLM-Pan Am collision on the runway in the Canary Islands (Bailey, 1989).

Perceptual Tunneling

Stress is known to induce an attentional focusing on the most subjectively important source of information (Broadbent, 1971). Where the subjective importance of an information source does not directly correlate with its true reliability and diagnosticity, major problems could be encountered.

Confirmation Bias

There is at least anecdotal evidence that stress can enhance the confirmation bias, reinforcing still further the belief that the hypothesis or action one has already chosen is correct. This tendency seems to have characterized the operators' behaviors at Three Mile Island (Rubinstein and Mason, 1979) and was diagnosed as a contributing cause of a substantial number of recent accidents in British military aircraft (Chappelow, 1988).

Phonetic Working Memory

It is not difficult to envision how stress, particularly that characterized by noise, can reduce working memory capacity (Hockey, 1986) and therefore the effectiveness of that memory system in storing verbal information necessary for hypothesis testing and evaluation (Mehle, 1982). This is particularly true when the situation is unfamiliar and the operator is less able to rely on pattern-matching techniques.

Spatial Working Memory

Wickens et al.
(1988a, 1988b) found that pilot decisions made under the combined stress of noise and time pressure were degraded to the extent that they depended on visualization of the airspace. One might anticipate, therefore, similar effects on decisions that require rapid updating or revision of a
mental model of terrain (in the case of the tank commander) or some other visual-spatial environment (e.g., the structure of a ship, building, or plant on fire).

Speed-Accuracy Tradeoff

It has been shown that stress induces a shift to fast, but less accurate, performance on a speed-accuracy tradeoff (Hockey, 1986). Furthermore, given that accurate performance on certain kinds of decisions depends on a time-consuming weighing of various alternatives, it is reasonable to conclude that decision performance following transition will be more error prone, to the extent that it depends on an analytic computational strategy. Alternatively, it can be predicted that the preferred strategy of decision making will be likely to shift to one involving direct memory retrieval, given that the operator has stored the necessary domain-related knowledge base (Klein, 1989).

Remediation

Decision researchers have long been aware of the failures and limitations of human decision making (Slovic et al., 1977). More recently, they have acknowledged the human's very real strengths in this area compared with the capabilities of many artificial intelligence systems (Klein, 1989). To counteract these limitations, four general remediation solutions have been proposed, any of which might be appropriate for the transition environment. Each of these solutions is described below.

Decision Aids

The increasing power and sophistication of computer technology has made more feasible the development of programs that can assist decision making. In each phase, there are two alternative approaches. One involves the development of artificial intelligence/expert system technology in which potential solutions (diagnoses or recommended choices) are computer-generated, to be accepted or rejected by the human operator (Madni, 1988; Rouse et al., 1990).
Such techniques would still appear to be somewhat limited unless the optimum decision rules can be clearly and unambiguously articulated and the decision problem is quite self-contained (i.e., does not involve extracting information from unforeseeable sources). For example, a decision aid recommending diagnoses of a failed gunnery computer might be appropriate. One that recommends battlefield tactics would be far more tenuous, because of the diversity of factors that should go into such a consideration.
The alternative approach is a decision aid that provides assistance to the human operator but does not recommend diagnoses or actions. Such an aid might, for example, list alternative hypotheses for a diagnosis, to safeguard against the tunneling produced by availability and anchoring. Correspondingly, it might list alternative courses of action (without recommending particular ones). A major potential source of benefit here could be realized in aids that provide an effective and organized display of information cues that could assist in hypothesis evaluation and situation awareness (MacGregor and Slovic, 1986; Scott and Wickens, 1983). Such a display aid could present cues in terms of their information value, in such a way that salience would not distort the overall representation of information. In general, displays that minimize the need for cognitive transformation between what is displayed and what is meant, and that organize information in logical groupings, will aid the decision maker.

Debias Training

An alternative approach to designing aids that will minimize the impact of bias is to train or teach the operator about these biases and heuristics directly. Such debias training has been introduced with the belief that, once a decision maker is aware of the existence of these biases, he or she will be less likely to fall prey to their influence; however, the consensus in the literature is that simple awareness of these biases rarely alleviates them (Fischhoff, 1982). Some examples of modest success in debiasing have been observed, rendering decision making less susceptible to anchoring (Lopes, 1982) and to overconfidence in meteorological forecasting (Murphy et al., 1985). While not yet fully validated as an effective technique, it would seem that providing decision makers with some level of training in the nature of heuristics and the understanding of probability would be of considerable value in many applied contexts.
Domain Training

An alternative form of training to that used in debiasing is direct training in the domain of the decision itself (rather than in the mechanics of the decision process). Certainly included here would be training in planning and diagnosis or situation assessment. One issue that is not well resolved is the extent to which training to deal with events following a crisis should focus on highly specific (but brittle) procedure following. In the context of the nuclear power industry, Woods (1988) has voiced some concern about the dangers of overtraining operators to follow very specific procedures given a particular failure diagnosis. The concern arises when the diagnosis on which the procedure is based is itself uncertain (i.e., a diagnosis of the most likely candidate, but not a certain candidate, so that others are plausible). In this case, overtraining on a particular routine or decided course of action to follow given that hypothesis may lead the operator to follow it blindly, without carefully checking, as the decision actions are carried out, to determine whether the routine remains appropriate. As we have seen, this tendency may be exaggerated in times of stress. In this regard, some consideration should be given to training the decision maker to closely monitor the outcomes of the actions following a decision, in order to ensure that the choice was in fact the correct one, and to be prepared to alter those actions as necessary. Such training could make use of the realistic, dynamic simulation facilities offered by SIMNET, a team-oriented tank training facility to be discussed in further detail in Chapter 10. The issue of the specificity with which emergency procedure following should be trained is one for which more research is clearly needed. In at least one domain, programs to train decision makers have demonstrated success. Diehl (1991) has reviewed the effectiveness of air crew decision training in a variety of aviation programs and has concluded that such programs substantially reduce the likelihood of erroneous pilot judgments.

Team Cohesion

The final remediation addresses the need to create efficient decision-making teams within which the communication of information necessary for optimal diagnosis and choice proceeds in a smooth and unambiguous fashion. Clearly some degree of standardization and redundancy in vocal communications is necessary. But other critical factors involve the spirit, coherence, training, and personality of the team members, and in particular of the team leader.
These are issues that are discussed at the end of Chapter 4 and are dealt with again in some depth in Chapter 10; some of them are also addressed in Druckman and Bjork (1991).

SUMMARY

The decision making of teams in transition depends jointly on the decision-making capabilities of the team leader and on the flow of information via voice communication from other team members and from well-configured visual displays. When this information transmission is effective, it provides the basis for good situation awareness or diagnosis of the state of the world that requires a decision. Yet this diagnosis may be hindered or distorted by a number of biases or heuristics, some of which are amplified
under times of stress. High degrees of expertise may eliminate some of these problems and produce more rapid decisions. Diagnosis is often followed by choice, which depends on the accurate assessment of outcomes, their utility, and their risk. Here again, certain biases in risk perception have been identified, and here also the decision process may be facilitated by expertise.

Two categories of transition effects on the decision process may be identified. On one hand, there are certain actions the operator can take before the transition that can improve decision quality (or accuracy) after the transition. On the other hand, the transition itself will induce a level of stress that is likely to systematically degrade certain aspects of the decision process. The limitations of decision making can be remediated by one of four techniques: computer-based decision aiding, particularly that which emphasizes the display and organization of relevant cues; training of self-awareness of the decision maker's biases; training in the decision domain; and development of team cohesion.

REFERENCES

Bailey, R.W.
1989 Human Performance Engineering: Using Human Factors/Ergonomics to Achieve Computer System Usability (2nd edition). Englewood Cliffs, New Jersey: Prentice Hall.
Barnett, B.
1989 Information processing components and knowledge representation: An individual difference approach to modeling pilot judgment. Proceedings of the Human Factors Society 33rd Annual Meeting. Santa Monica, California: Human Factors Society.
Bettman, J.R., J.W. Payne, and R. Staelin
1986 Cognitive considerations in designing effective labels for presenting risk information. Journal of Marketing and Public Policy 5:1-28.
Broadbent, D.E.
1971 Decision and Stress. New York: Academic Press.
Chappelow, J.W.
1988 Causes of aircrew error in the Royal Air Force. In Human Behavior in High Stress Situations in Aerospace Operations. NATO AGARD Conference Proceedings 458.
Diehl, A.
1991 The effectiveness of training programs for preventing aircrew error. In R. Jensen, ed., Proceedings of the Sixth Symposium on Aviation Psychology. Columbus, Ohio: Ohio State University.
Druckman, D., and R.A. Bjork, eds.
1991 In the Mind's Eye: Enhancing Human Performance. Committee on Techniques for the Advancement of Human Performance. Washington, DC: National Academy Press.
Ebbesen, E.B., and V. Konecni
1981 On external validity in decisionmaking research. In T. Wallsten, ed., Cognitive Processes in Choice and Decisionmaking. Hillsdale, New Jersey: Erlbaum.
Fischhoff, B.
1982 Debiasing. In D. Kahneman, P. Slovic, and A. Tversky, eds., Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.
Hockey, G.R.J.
1986 Changes in operator efficiency as a function of environmental stress, fatigue, and circadian rhythms. In K.R. Boff, L. Kaufman, and J.P. Thomas, eds., Handbook of Perception and Human Performance, Vol. II. New York: Wiley.
Kahneman, D., P. Slovic, and A. Tversky, eds.
1982 Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.
Kahneman, D., and A. Tversky
1981 The framing of decisions and the psychology of choice. Science 211(4481):453-458.
1984 Choices, values and frames. American Psychologist 39:341-350.
Klayman, J., and Y.W. Ha
1987 Confirmation, disconfirmation and information in hypothesis testing. Journal of Experimental Psychology: Human Learning and Memory 94(2):211-228.
Klein, G.A.
1989 Recognition-primed decisions. Pp. 47-92 in W. Rouse, ed., Advances in Man-Machine Systems Research, Volume 5. Greenwich, Connecticut: JAI Press.
Koehler, J.J., and E.H. McKinney
1991 Uniqueness of task, experience, and decision making performance: A study of 176 U.S. Air Force mishaps.
Lopes, L.
1982 Procedural Debiasing. Technical Report WHIPP 15. Madison, Wisconsin: Human Information Processing Program.
MacGregor, D., and P. Slovic
1986 Graphic representation of judgmental information. Human-Computer Interaction 2:179-200.
Madni, A.M.
1988 The role of human factors in expert systems design. Human Factors 30:395-414.
Mehle, T.
1982 Hypotheses generation in an automobile malfunction inference task. Acta Psychologica 52:87-116.
Murphy, A., W. Hsu, R. Winkler, and D. Wilks
1985 The use of probabilities in subjective quantitative precipitation forecasts. Monthly Weather Review 113:2075-2089.
O'Hare, D., and S. Roscoe
1991 Flightdeck Performance: The Human Factor. Ames: Iowa State University Press.
Rouse, W.B., N.D. Geddes, and J.M. Hammer
1990 Computer aided fighter pilots. IEEE Spectrum 27(3):38-40.
Rubinstein, T., and A.F. Mason
1979 The accident that shouldn't have happened: An analysis of Three Mile Island. IEEE Spectrum 16:33-57.
Scott, B., and C.D. Wickens
1983 The effects of a spatial information format on decision making performance in a C3 (Command, Control, and Communications) probabilistic information integration task. Pp. 96-99 in Proceedings of the Sixth MIT/ONR Workshop on C3 Systems. Report No. AD-P002887. Urbana, Illinois: University of Illinois.
Sheridan, T.B., and W.R. Ferrell
1974 Man-Machine Systems. Cambridge, Massachusetts: MIT Press.
Simon, H.A.
1955 A behavioral model of rational choice. Quarterly Journal of Economics 69:99-118.
Slovic, P.
1987 Facts versus fears: Understanding perceived risk. In F. Farley and C. Null, eds., Using Psychological Science. Washington, DC: Federation of Behavioral, Psychological, and Cognitive Sciences.
Slovic, P., B. Fischhoff, and S. Lichtenstein
1977 Behavioral decision theory. Annual Review of Psychology 28:1-39.
Tolcott, M.A., F.F. Marvin, and T.A. Bresdick
1989 The Confirmation Bias in Military Situation Assessment. Reston, Virginia: Decision Science Consortium.
Tversky, A., and D. Kahneman
1974 Judgment under uncertainty: Heuristics and biases. Science 185:1124-1131.
U.S. Navy
1988 Investigation Report: Formal Investigation into the Circumstances Surrounding the Downing of Iran Air Flight 655 on 3 July 1988. Washington, DC: Department of Defense.
Van Dyne, L.
1982 A false feeling of security. The Washingtonian October:112-144.
Wickens, C.D.
1984 Processing resources in attention. Pp. 63-102 in R. Parasuraman and D.R. Davies, eds., Varieties of Attention. San Diego, California: Academic Press.
1992 Engineering Psychology and Human Performance. New York: HarperCollins.
Wickens, C.D., and J. Flach
1988 Human information processing. Pp. 111-155 in E. Wiener and D. Nagel, eds., Human Factors in Aviation. New York: Academic Press.
Wickens, C.D., A.F. Stokes, B. Barnett, and F. Hyman
1988a The Effects of Stress on Pilot Judgment in a MIDIS Simulator. Final Technical Report, Subcontract C87-101376. Savoy, Illinois: Institute of Aviation.
1988b Stress and pilot judgment: An empirical study using MIDIS, a microcomputer-based simulation. In Proceedings of the Human Factors Society 32nd Annual Meeting. Santa Monica, California: Human Factors Society.
Wickens, C.D., A. Stokes, B. Barnett, and T. Davis, Jr.
1987 Componential Analysis of Pilot Decision Making. Final Technical Report No. ARL-87-4/SCEEE-87-1. Savoy, Illinois: University of Illinois Aviation Research Laboratory.
Woods, D.D.
1988 Commentary: Cognitive engineering in complex and dynamic worlds. In E. Hollnagel, G. Mancini, and D. Woods, eds., Cognitive Engineering in Complex, Dynamic Worlds. London, England: Academic Press.
Woods, D.D., and E.M. Roth
1987 Cognitive systems engineering. In M. Helander, ed., Handbook of Human-Computer Interaction. New York: North-Holland.