4
Optimizing Decision Making

Extensive work in the behavioral sciences and neuroscience over the past decade has highlighted two realities that confront any organization that relies on decisions made by its members. First, decision making by humans is often suboptimal in ways that can be reliably predicted—and sometimes remediated with training (e.g., Kahneman and Tversky, 1979; Ariely, 2008). If a decision maker is explicitly told to make decisions that “maximize financial gain on average” or to “minimize human casualties on average” and fails to do so in a systematic way, social scientists consider the outcome to be a result of suboptimal or inefficient decisions. For example, humans make errors in the ways that they estimate the probabilities of events, react to heavy losses, and deal with ambiguous situations (Kahneman and Tversky, 2000). Suboptimal decision making has been identified in financial circles where billions of dollars are at stake, and it undoubtedly occurs in the Army as well.

Second, individuals differ predictably in their decision making (see, for example, Wu and Gonzales, 1998). These differences can be highly idiosyncratic and are likely tied to stable, long-term traits of personality, physiology, and neurology (see, for example, Weber et al., 2002). This variability in decision making can make an individual better suited for one particular task and less well suited for another.

At present, the Army neither seeks to measure, in a formal way, deviations from optimal or efficient decision making nor attempts to characterize the stable individual differences that shape a particular soldier’s decision making. This chapter explores ways in which current and emerging insights from neuroscience, together with neural monitoring tools such as neuroimaging, can help the Army to optimize the decision making of its soldiers and officers in both these senses: first, by identifying and providing countermeasures to suboptimal decision making and, second, by identifying and making optimal use of individual variability in decision-making traits.

THE SOURCES OF SUBOPTIMAL DECISION MAKING

Consider a basketball coach who must transport his team to a distant city and must choose between flying and driving. If he chooses to drive because he believes he is minimizing the risk to his players, he has made a suboptimal decision. Because flying is statistically safer than driving, he is exposing his players to more risk by driving. Indeed, purely from the perspective of rational risk management, even if his team were involved in an aviation accident, he would have made the right decision if he had decided to fly.

The Army could do much more than it currently does to study and correct suboptimal decision making using the full range of neuroscience, including the behavioral and systemic levels. Leader training and assessment have not kept pace with advances in the science. Although decision making is critical to the conduct of warfare, no research unit within the Army focuses specifically on assessing or improving decision making by officers or enlisted soldiers at a behavioral or neurophysiological level. Even high-profile projects such as the Augmented Cognition project (a previous Defense Advanced Research Projects Agency (DARPA) project discussed in Chapter 6) have focused on how to present information rather than on ways to improve decision making per se. Because we now know that human decision making is predictably suboptimal in many situations, the Army must decide whether to optimize the force along this human dimension or ignore an emerging opportunity. Two classes of suboptimal decision making for which there is abundant recent behavioral literature will illustrate how much is at stake in choosing whether to seize on this new understanding or ignore it.

Errors in Assessing Relative Risk

All decision making involves risk, and individuals have different tolerances for risk. As the eighteenth-century mathematician Daniel Bernoulli famously observed, a poor man



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




with a lottery ticket that has a 50 percent chance of winning $20,000 might very rationally trade that ticket for $9,000. On average, the lottery ticket he holds is worth $10,000, but he might choose not to take the risk. Of course, how little he is willing to accept for the ticket reflects just how risk averse he is. Someone who is very risk averse might settle for a sure gain of $1,000 in return for the ticket; someone who is tolerant of risk might refuse even $9,900. The important point is that there is nothing suboptimal about any of these decisions for Bernoulli's poor man. These different decisions reflect different tolerances for risk that might affect the suitability of an individual for a specific job, but scientists (unlike mission planners) must be silent on notions of optimality with regard to risk tolerance. Of course, it may be that individuals who are more risk averse are better suited for peacekeeping missions and individuals who are more risk tolerant are better suited for frontline combat missions, but different tolerances for risk, from the point of view of optimality theory, simply represent different approaches.

The assessment of risk by individuals, however, is an area where the tools of optimal decision making can be brought to bear. Consider a commander ordered specifically to "minimize vehicle losses." Real-life situations seldom provide clear-cut alternatives for which risks and benefits can be estimated with a high degree of certainty. But suppose that a commander has a choice between two plans for committing 500 vehicles to an operation and that his staff has prepared the following best estimates of probable losses:

• For plan 1, there is a 2 percent chance that 50 vehicles will be lost.
• For plan 2, there is a 50 percent chance that 4 vehicles will be lost.

From a decision theoretic point of view, given the goal of minimizing vehicle losses, plan 1 is preferable. If 100 commanders face this decision, all choose plan 1, and the probability estimates are accurate, the losses would be half what they would have been if all had chosen plan 2.[1] Despite the logic behind the probability estimates, behavioral research indicates unambiguously that most people would choose plan 2. For behavioral scientists, this preference reflects a standard feature of decision making, subjective probability distortion (Kahneman and Tversky, 1979). It is now abundantly clear that human decision makers see very small probabilities as much larger than they actually are (for an accessible review, see Plous, 1993). The financial industry is now beginning to take this feature of human behavior into account. Some financial firms now provide their decision makers with training and tools that help them overcome this widespread inefficiency in decision making.

[1] For plan 1, two commanders are likely to lose 100 (2 × 50) vehicles; for plan 2, fifty commanders are likely to lose 200 (50 × 4) vehicles.

Over the past decade, the neurophysiological roots of this kind of behavior have begun to be understood. It seems likely that this enhanced understanding of the mechanisms of decision making will shed new light on the sources of suboptimal decisions. We now know, for example, that structures in the basal ganglia and the prefrontal cortex (PFC) provide valuations for actions, and these valuations appear to be passed to the posterior parietal cortex, among other areas, for decision (see, for example, Glimcher et al., 2005). In fact, recent research has even begun to identify the neural algorithms that lead to some of these inconsistencies (Glimcher et al., 2007).

Loss Aversion in Decision Making

Another factor that leads to seriously suboptimal decision making is the asymmetry with which human decision makers typically weigh risky losses versus risky gains (see Ariely, 2008, and Plous, 1993). Consider a gambler who has $1,000 in his pocket and is offered a $100 bet that has a 1 in 20 chance of paying off $500. With the money in his pocket and the night's action still to come, he reasonably refuses this highly improbable bet. Later on, having lost $10,000 on credit, he might well accept that same bet. Contemporary studies of decision making suggest that this reflects a change in his risk tolerance. As losses accumulate, decision makers typically become more and more risk tolerant, even to the point of becoming risk seekers. In the financial world, this behavior pattern has led to a number of catastrophic financial events and even to bank failures. Whether the pattern occurs in combat has not been studied formally, but the constancy of human nature and the annals of military history (e.g., Napoleon at Waterloo) suggest that it does occur, with ominous consequences.

Over the past year, the neural basis of this phenomenon has been identified. This finding has broad implications for understanding loss aversion as a source of suboptimal decision making. Before the neurological studies were conducted, the widespread conviction was that loss aversion was a discrete fear response that biased other decision-making mechanisms. The neurophysiological evidence weighs against this view, instead suggesting that a unitary valuation system unrelated to fear drives loss-averse behavior (Tom et al., 2007).
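The expected-loss arithmetic behind the two plans, and one standard behavioral account of why most people nonetheless prefer plan 2, can be sketched in a few lines. The probability-weighting function below uses the one-parameter form from Tversky and Kahneman's cumulative prospect theory with their median parameter estimate (gamma = 0.61); that functional form and parameter are illustrative assumptions, not values given in this chapter.

```python
def expected_loss(p, n):
    """Expected number of vehicles lost under a plan with probability p of losing n."""
    return p * n

plan1 = expected_loss(0.02, 50)  # 1.0 vehicle on average
plan2 = expected_loss(0.50, 4)   # 2.0 vehicles on average

# Across 100 commanders, as in the chapter's footnote: ~100 vs. ~200 vehicles.
assert 100 * plan1 == 100.0
assert 100 * plan2 == 200.0

def weight(p, gamma=0.61):
    """Subjective probability weight (Tversky-Kahneman 1992 form, illustrative).
    Small probabilities come out overweighted: weight(0.02) is roughly 0.08."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Under distorted probabilities, plan 1's rare large loss looms larger than
# plan 2's likely small loss, reversing the objective ranking.
subjective1 = weight(0.02) * 50
subjective2 = weight(0.50) * 4
assert subjective1 > subjective2
```

A decision aid that simply surfaces the objective expected losses side by side is one example of the kind of training-and-tools countermeasure described above.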

MAKING OPTIMAL USE OF INDIVIDUAL VARIABILITY

The preceding section discussed the evidence that nearly everyone, under some conditions, makes decisions that are less than optimal for achieving the organizational goals set for them. Irrespective of the type of suboptimal decision making, individuals differ in how they make decisions. For example, some individuals are more impulsive than others and some are more tolerant of risk. Such differences do not necessarily mean that risk-tolerant individuals are better decision makers than risk-averse individuals. From an institutional point of view—that is, what is best for accomplishing the Army's mission—a certain decision-making style may be better or less well suited to a given task.

Economic, psychological, and neurophysiological studies conducted over the past several decades have shown that individuals differ in their decision making in predictable ways (see, for example, Wu and Gonzales, 1998) and that these predictable differences can remain stable for long periods of time (see, for example, Weber et al., 2002). In an economic context, the attitudes of an individual to risk are idiosyncratic, but they appear to be relatively stable personality features. Personality features such as "conscientiousness" have been shown to be both measurable and stable across an individual's life span (see, for example, Costa and McCrea, 1985). At a neurophysiological level, the distributions of neurochemicals, receptors, and brain activations differ among individuals, and there is growing evidence that these differences are related to stable features of personality (see, for example, Kable and Glimcher, 2007).

A common theme that emerges from all this work is not that one type of decision maker is inherently better than another in all circumstances but rather that, given the natural range of differences in a population, one set of decision-making characteristics may be better suited to a particular task than another. Consider a commanding officer who must select a lieutenant for command in an area filled with civilians. An individual who is more risk-averse may be better suited for this task than one who is more risk-tolerant. But an action conducted with a large force in a highly uncertain environment may call for a more risk-tolerant individual. Experienced commanding officers certainly know this, and they select officers for missions according to their individual strengths.

Informal assessment occurs routinely throughout the Army, as it does in every organization. The issue is whether adopting more formal techniques based on the results of research in neuroeconomics, neuropsychology, and other neuroscience disciplines can give the Army an advantage in decision making. The Army does not now assess officers for traits such as risk tolerance; nor does it train commanding officers to make the best use of the differential decision-making traits among their subordinates. Battalion-level commanders are not given objective information about the decision-making traits of newly assigned subordinates, nor have those subordinates been trained explicitly in how to adjust their decision making to different tasks. More generally, the Army has not sought to characterize the decision-making characteristics of its individual soldiers and officers through any set of validated metrics, even though such information could be useful for (1) determining which tasks an individual appears particularly well suited to perform or (2) identifying training strategies that might better prepare an individual for the decision-making responsibilities of his/her assignment, given that individual's inherent behavioral and neurophysiological characteristics.

Is there a more efficient and reliable way to find the best officer for a particular job? Can the Army routinely provide its commanding officers with a more complete, reliable, and useful assessment of their junior officers at the beginning of a tour of duty? Can neurophysiological monitoring tools identify soldiers who face specific idiosyncratic risks in particular decision environments? Issues such as these do not appear to have been explored by the Army. Based on what the private sector is already doing to remain competitive in a global environment, these and similar questions should no longer be ignored. As the ability to characterize and predict behavior and psychology using the tools of neuroscience grows, these tools may well become critical for the Army.

TOOLS FOR CHARACTERIZING INDIVIDUAL DECISION MAKERS

Both officers and soldiers make decisions that contribute to (or detract from) achieving mission objectives. We know three important things about their decision-making processes: First, human decision making is likely to be suboptimal in the sense discussed above. Second, humans vary in the stable long-term personality traits that suit them for different kinds of decision-making tasks and different kinds of decision-making training. Third, people can be trained to be more efficient decision makers. If the goal of the Army is to optimize decision making in both ways—as a countermeasure to suboptimal decisions and to maximize the value of trait variability—then it must characterize how individuals make decisions. Once individual decision making can be characterized objectively, then training regimes and selection procedures for both kinds of optimization can be applied.

Several tools already exist for characterizing relevant traits of individuals as decision makers. These tools fall under the broad categories of behavioral testing, psychophysiological assessment, neurophysiological monitoring, and genetic analysis. Each tool can provide information about how an individual compares to a larger population of decision makers, individual decision-making strategies, the effects of training on decision making, or combinations of these.

Personality as a Factor in Decision Making

The most well-developed and robust form of assessment available today is personality assessment based on factor analysis. During the 1980s and 1990s a number of tools were developed for classifying personality. Investigators subsequently asked whether the features of personality measured with these tests remained stable across an entire life span.

In general, the answer has been yes. Among the well-known and well-validated tools for classifying personality traits quantitatively are the NEO personality inventory[2] and the Minnesota Multiphasic Personality Inventory.

[2] The acronym NEO derives from the first three of the five axes, or major traits, of personality measured by this tool: neuroticism, extraversion, openness, agreeableness, and conscientiousness. Each axis is further divided into six subtraits.

Personality traits as measured by these tools have already been used successfully to predict performance in Army assignments. For example, Lawrence Katz has shown that the results from this kind of assessment can lead to a twofold improvement in selecting helicopter pilot candidates, when success is defined as selecting individuals who will later be assessed by their experienced peers as being good helicopter pilots.[3]

[3] Lawrence Katz, research psychologist, Army Research Institute, briefing to the committee on April 28, 2008.

The extension of these quantified personality inventories to characterizing individual decision making is less well established. How individuals differ in stable personality traits that influence their decision making has been the subject of intense study in behavioral economics over the last decade. Individuals differ in their tolerance of risk, and they also differ in how they react to risky situations that involve losses and risky situations that involve gains. Individuals can differ as well in the ways that they deal with ambiguous situations and in how they interpret probabilities. People also differ in impulsivity. Some place great value on immediate gains while others are more patient. We do not yet know how stable these traits are over an individual's life span, but we do know that they have a huge effect on the decisions he or she makes.

Emotional Reactivity in Decision Making

The personality inventories discussed above use subjects' responses to a psychological instrument such as a questionnaire to characterize and quantify personality traits. This is not, however, the only way to characterize aspects of personality relevant to how an individual makes decisions. Over the past decade, galvanic skin response (GSR) has become a valuable tool for quantifying an individual's emotional reactivity. GSR is an inexpensive biometric measure of changes in skin conductivity (essentially, the rate of sweating). Neuropsychological studies have shown that GSR is a reliable surrogate for a subject's arousal and emotional engagement during a number of behavioral tests (for a review, see Phelps, 2002). Recently, the Army Institute for Creative Technologies implemented a GSR-based system for assessing post-traumatic stress disorder (PTSD). GSR could also be used to monitor an individual's emotional reactivity when making decisions in a range of contexts. For example, a GSR-based monitor could be used during simulation training to select individuals for future high-stress missions such as explosive ordnance disposal.
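As an illustration of the kind of screening such a monitor might support, the sketch below counts large upward deflections (candidate skin-conductance responses) in a conductance trace. The detection rule, thresholds, and synthetic signal are invented for illustration; operational systems would use validated psychophysiological signal-processing pipelines.

```python
def count_scrs(trace, min_rise=0.05):
    """Count candidate skin-conductance responses in a conductance trace
    (arbitrary units): a response is scored when the signal climbs at least
    `min_rise` above the lowest point seen since the previous response, and
    the detector re-arms once the signal falls back by the same amount."""
    count = 0
    trough = peak = trace[0]
    in_response = False
    for x in trace[1:]:
        if not in_response:
            trough = min(trough, x)
            if x - trough >= min_rise:
                count += 1
                in_response = True
                peak = x
        else:
            peak = max(peak, x)
            if peak - x >= min_rise:  # signal has recovered; look for the next rise
                in_response = False
                trough = x
    return count

# Synthetic trace: a 2.0-unit baseline with three 0.3-unit arousal bumps.
trace = []
for _ in range(3):
    trace += [2.0] * 5 + [2.1, 2.2, 2.3, 2.2, 2.1]
trace += [2.0] * 5

assert count_scrs(trace) == 3          # three bumps detected
assert count_scrs([2.0] * 20) == 0     # flat trace: no responses
```

A higher response count during a stressful simulation would flag higher emotional reactivity, which is the comparison such a training monitor would need to make.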

Emerging Tools: Genetics, Neurohormones, and Brain Imaging

Genetic markers, neurohormones, and brain imaging are emerging as sources for biomarkers that may prove to be reliable indicators of neural state when individuals make choices—that is, they can signify behavior underlying the emotional or subjective elements during decision making. (See Chapter 2 for the definition of "biomarker.") Genetic markers are particularly relevant for identifying stable traits. Research data suggest that some genetic markers can identify individuals at greater risk of reacting to chemical agents or suffering from PTSD. It is also known that hormonal state—specifically, an individual's hypothalamo-pituitary axis responsivity—influences decision making as well as fitness for duty (Taylor et al., 2006, 2007). The cost of genetic testing for such traits will decrease over the next decade, and the selectivity and specificity of the tests will improve. As this happens, the Army should position itself to take advantage of the new tools.

At present, brain scanning seems too costly relative to the utility of the information that could be gained to be of use for assessing how people make choices. A single brain scan costs more than $1,000, so this technology is unlikely to be useful for this purpose in the near term. For the far term, its potential value for Army applications related to decision making will depend on the increase in understanding it provides relative to other techniques (e.g., the conventional personality inventories or the improved tools described in Chapter 3) and on its cost. For the time being, the Army should follow the direction of research in this area and should be able to assess the practical value of the science and applications emerging from external parties (academia and the commercial sector).

NEUROSCIENCE-RELATED THEORIES OF DECISION MAKING

No single theory of decision making is accepted throughout the multiple disciplines, subdisciplines, and research paradigms that constitute neuroscience as defined for this report. Even within a single discipline, researchers are working on approaches that partly overlap (agree) and partly compete or conflict with one another. The committee has selected two such approaches from among the many that could be discussed to illustrate how the general theories of decision making are relevant to the Army and which aspects of the ongoing research are worth monitoring to gain insights and practical approaches to improving decision making in military-relevant contexts. The first approach is belief-based decision making and the neurophysiological activation patterns that have been linked with its more theoretical constructs. The second approach is intuitive, or naturalistic, decision making, which thus far has been primarily descriptive (taxonomic) of actual decision behavior and has served as a framework for understanding recognition-primed decisions.

Belief-Based Decision Making

This section reviews one approach to decision making that occurs frequently in ordinary situations and that has profound implications for performance in complex environments. Specifically, belief-based decision making is concerned with selecting one action over another (or others) when knowledge about the possible consequences and outcomes of the alternatives is either uncertain, unclear, or subject to very strong a priori assumptions about how actions "should" result in certain outcomes. The theory of belief-based decision making deserves close scrutiny because many choices that an outsider might call illogical might be better understood when looked at from the perspective of the decider's preexisting belief system. Finally, and most important, belief-based decision making has been associated with brain structures that may be accessible to real-time monitoring—and perhaps even to modification—using the tools of neuroscience.

The Cognitive Psychology of Belief Sets

A belief can be defined as a propositional mental construct that affirms or denies the truth of a state of affairs and is closely linked to basic judgment processes. A large and stable set of beliefs is essential for intelligent behavior, since such a belief set forms the basis for any actions that a person may take to achieve his or her goals. Beliefs are the building blocks we use to build mental models of the state of the world. They are therefore important constructs used to guide decision making.

The neuropsychological approach to characterizing beliefs has focused recently on developing testable process theories, which through neural state monitoring—for example, neuroimaging or event-related potentials—can be related to functions of different systems in the brain. To understand recent neuroscience findings related to belief-based decision making, a few essential constructs need to be explained.

A "theory of mind" (ToM) refers to an individual's everyday ability to attribute independent mental states to him- or herself and to others for the purpose of predicting and explaining behavior (Happé, 2003) (see also Box 3-2). The cognitive mechanisms underlying this ability to attribute a "mind" to oneself or others have been extensively investigated via reasoning tasks that require participants to predict the action of agents based on information about those agents' beliefs and desires. Researchers who have investigated this ability have found systematic reasoning errors (Wertz and German, 2007). For example, we typically infer the beliefs of others from what they tell us. However, behavioral experiments have the advantage that the experimenter does not have to rely on the degree to which an individual is reporting his or her beliefs truthfully. Behavioral experiments show that a false belief engages greater cognitive resources, resulting in longer response times. (In this context, a "false belief" is a presumed state of reality that differs from the experienced state of reality [Apperly et al., 2008] or an expected outcome of an action that differs from the actual outcome [Grèzes et al., 2004].) A false belief may increase the cognitive load because subsequent judgments and decision making need to take it into account as a conflict with the individual's perception of reality (Apperly et al., 2008).

When individuals are asked to make belief-based decisions, they tend to (1) examine less closely those arguments that support their existing beliefs and (2) seek out evidence that confirms rather than contradicts current beliefs. These tendencies often lead to belief-maintaining biases and high irrationality (Evans et al., 1993). Interestingly, repetition of the presentation of a belief has been shown to increase the intensity of that belief (Hertwig et al., 1997), which parallels the observation that individuals prefer stimuli that are frequently presented over those that are rarely presented (Zajonc et al., 1974). Dysfunctional beliefs and associated choices of actions are thought to play a crucial role in various forms of dysfunction, including dysfunctional decision making. Experimentally, researchers have used the Implicit Association Test (Greenwald et al., 1998) to assess belief-based associations, which is thought to be superior to traditional self-reporting measures (De Houwer, 2002). Moreover, the Implicit Association Test can be used to assess the strength of evaluative associations in the domain of beliefs (Gawronski, 2002).

More specifically for decision making, people hold erroneous beliefs about the laws of chance (Tversky and Kahneman, 1971). Beliefs can have direct effects on the perceived likelihood of certain outcomes. For example, familiarity tends to increase the perceived probability of the outcome associated with an event (Fox and Levav, 2000). Moreover, when judging cause and effect, beliefs can interfere with detecting cause and effect in decision-making situations by favoring more believable causes over those perceived as less believable (Fugelsang and Thompson, 2000). Some investigators have integrated these findings into the context of the ToM literature (Fox and Irwin, 1998), suggesting that probabilities of available options are immediately embedded into a mentalizing context. A person's beliefs about the relative importance of probabilities and values also compete with limitations on his or her ability to make decisions on the basis of these beliefs when processing information in a specific decision-making situation (Slovic and Lichtenstein, 1968). For example, greater deviation from optimal decision making in favor of belief-based decision making can be observed when individuals are making decisions in complex and extreme environments.

OCR for page 36
necessarily the best (Mellers et al., 1998). Thus, it should not be surprising that adding measures of personal belief salience to considerations of the decision-situation characteristics improves the accuracy of predictions concerning decision-making behavior (Elliott et al., 1995).

Neural Correlates of Belief Processing

Based on the preceding brief review of the cognitive processes that contribute to belief formation and of how belief affects decision making, it should be clear that no single area of the brain is responsible for belief-based decision making, and surprisingly few brain regions have been reported to play the same role repeatedly. Neuroimaging studies reveal a neural system with three components: the medial PFC, the temporal poles, and the posterior superior temporal sulcus (Frith and Frith, 2003; Gallagher and Frith, 2003).

The right temporoparietal junction (TPJ) has been implicated not only in processing the attribution of beliefs to other people (ToM) but also in redirecting attention to task-relevant stimuli. Recent studies support the hypothesis that the overlap between ToM and attentional reorienting calls for new accounts of right TPJ function that integrate across these disparate task comparisons (Mitchell, 2008). Others have found that the right TPJ activates differentially as a function of belief and outcome and that activation is greatest when a negative outcome could follow from the belief that the action of a protagonist would cause harm to others, even though the harm did not occur (Young et al., 2007). A study of brain-damaged individuals reported that, in addition to the frontal cortex, the left TPJ is necessary for reasoning about the beliefs of others (Samson et al., 2004). Thus, the TPJ appears to be crucial for the representation of a mental state associated with belief formation, possibly focused on disambiguating true from false beliefs.

Activation of the medial PFC is often observed with the subject at rest, a time when individuals presumably engage in internally directed attention: the processing of self-relevant thoughts (Kelley et al., 2002) and beliefs (Wicker et al., 2003). Some investigators have found that, whereas the medial PFC is recruited for processing belief valence, the TPJ and precuneus are recruited for processing beliefs in moral judgment and mediate both the encoding of beliefs and their integration with outcomes for moral judgment (Young and Saxe, 2008). These findings are consistent with those of others showing greater activation of the anteromedial PFC and rostral anterior cingulate when participants were implicitly making associations consistent with gender and racial biases. Using event-related potentials to investigate the neural substrates of reasoning with false beliefs revealed a specific late negative component that appeared to be located in the middle cingulate cortex and that might be related to conflict or error detection (Wang et al., 2008). Therefore, the medial PFC appears to be important for processing the degree to which beliefs are self-relevant, possibly with emphasis on the correct or incorrect, acceptable or unacceptable, valence of the belief.

The strength with which one holds to a belief is an important regulator of human behavior and emotion. A recent neuroimaging study showed that states of belief, disbelief, and uncertainty differentially activated distinct regions of the prefrontal and parietal cortices, as well as the basal ganglia. The investigators proposed that the final acceptance of a statement as true, or its rejection as false, relies on more primitive, hedonic processing in the medial PFC and the anterior insula (Harris et al., 2007). Comparing belief-based and belief-neutral reasoning, some researchers have reported activation of the left temporal lobe and the parietal lobe, respectively, with modulation by the lateral PFC in cases of overcoming belief bias and by the ventral medial PFC in cases of succumbing to belief bias (Goel and Dolan, 2003). Examining belief-based decision making in a navigation task under uncertain conditions showed distinct regions of PFC activation, consistent with an underlying Bayesian model of decision making that permits efficient, goal-oriented navigation (Yoshida and Ishii, 2006). Thus, the dorsolateral and ventromedial PFC may have important regulatory functions in moderating belief intensity and the degree to which beliefs influence decision making. Additional brain regions may modulate belief-based processing by contributing resources to (1) override beliefs, (2) enforce beliefs, or (3) engage in reward-related processing associated with beliefs (that is, how good it feels to be right or how one's own belief-based decision affects others).

This brief review of brain areas that are important for belief-based decision making shows that beliefs are most likely decomposed in the brain into various component processes—for example, self-relevance, valence, simulation, social relatedness, and other yet-to-be-examined components. These components are subsequently processed by various regions of the brain to arrive at preferences that allow selecting an option and engaging in decision making. Understanding these component processes may help to better monitor and potentially modulate belief-based decision making, particularly considering the significant distortions and suboptimal decisions that beliefs can generate.

Potential Neural Targets for Modifying Belief-Based Decision Making

In Army operations, decision making frequently occurs in extreme environments, which place a high demand on an individual's physiological, affective, cognitive, or social processing resources. In other words, extreme environments strongly perturb the body and mind, which in turn initiate complex cognitive and affective response strategies. Belief-based decision making is often a way for the individual to save resources and to select an action that seems to be the best under the circumstances. However, as pointed out above, belief-based decision making frequently results in suboptimal outcomes because the individual fails to adequately take into account chance, value, and other information. Therefore, modulating belief-based decision making provides an opportunity to optimize outcomes.

The main advance that neuroscience has brought to understanding belief-based decision making is the ability to assign component processes to specific brain regions and to potentially target these regions for monitoring and modulation purposes. Although some brain regions are likely to be more difficult to modulate, others, located on the convexity of the brain, are easily accessible to modulations that could in theory be used to alter the degree to which belief processing occurring in that area influences decisions. For example, the TPJ is easy to find and can be accessed using brain stimulation techniques such as transcranial magnetic stimulation (Fitzgerald et al., 2002). Moreover, both fMRI and event-related potentials may be useful in monitoring the TPJ for reliable biomarkers of belief strength and truth of belief. Finally, using real-time fMRI (Cox et al., 1995), one could monitor this and other brain regions in response to inquiries about various types of beliefs and measure whether interventions were successful in modulating beliefs, similar to very recent experiments aimed at attenuating emotion (Caria et al., 2007) or pain (deCharms, 2007). Although these approaches are in their infancy, they point to unprecedented opportunities to directly and specifically manipulate the brain to alter decision-making processes.

Summary Points for Army Applications

The preceding discussion of belief-based decision making suggests three key opportunities for the Army:

• Belief-based decision making is an important process that occurs in extreme environments and can have profound implications for optimizing behavior and outcomes. Accordingly, understanding such decision making and modulating it is an important neuroscience opportunity for the Army.
• Research on belief-based decision making supports the idea that a few distributed brain systems are responsible for processing beliefs that contribute to selecting appropriate or inappropriate actions. These brain systems can be (1) monitored and (2) potentially modulated or modified in real time during training or preparation exercises to optimize decision making.
• Research directed at understanding and applying belief-based decision making can be categorized as near term (applicable results expected within 5 years), medium term (applicable results expected within 10 years), or far term (applicable results expected in 10-20 years).
  — In the near term, identify instances when individuals are making inappropriate, belief-based decisions that result in nonoptimal performance.
  — In the medium term, develop training tools to let individuals know when they are making nonoptimal, belief-based decisions.
  — In the long term, monitor (and potentially also modulate) in real time the brain systems that contribute to nonoptimal, belief-based decision making. Investigative tools such as transcranial magnetic stimulation and real-time fMRI are not envisioned to be deployed in actual operations but could be used as tools for training those who make high-impact, high-risk decisions (e.g., field commanders).

Intuitive Decision Making and Recognition-Primed Decisions

While advances in military technology do change the tactics, techniques, and procedures of armed conflict, the nineteenth-century military strategist Carl von Clausewitz stated that war is influenced more by human beings than by technology or bureaucracy (von Clausewitz, 1976). Relatively young leaders making decisions at the point of the strategic spear can become the critical factor in whether an engagement is won, lost, or drawn. For the U.S. military, including the Army, the challenge is less to empower these young leaders to make decisions (the Army has already restructured to encourage such behavior) and more to improve the quality of the decisions they make. Military strategists have emphasized that the strength needed to win the next war will depend less on kinetic energy than on cognitive power and performance (Scales, 2006).

For many years, the military community believed that the classical decision-making literature (e.g., normative decision theory, statistical decision theory, and decision analysis) appropriately defined rational decision making for military commanders. However, the last two decades have seen a move toward studying naturalistic decision making (positive, or descriptive, decision theory). Military leaders most often make decisions based on their previous experiences. These experiences can be based on actual combat, combat training (for example, at the Joint Readiness Training Center, the Combined Maneuver Training Center, or the National Training Center), or home-station training. Or leaders may acquire vicarious experience by studying past leaders in military history. Taken together, these experiences arguably build a military commander's base for so-called intuitive decision making.

Unpacking the terms "intuition" and "intuitive" is an arduous task, as different researchers have defined the terms in different ways, contingent on their own specialty. In a recent popular treatment of the topic, Gladwell (2005) referred to the idea of intuition as "blink" for a decision made in an instant. Dijksterhuis and Nordgren (2006) define intuition as a gut feeling based on past experience and believe that intuition plays a critical role in the theory of unconscious
thought. Klein (1989) posits that intuition precedes analysis and that battlefield commanders are told they need to trust their own judgment and intuition, but they are not told how to develop their intuition so that it is trustworthy. Shattuck et al. (2001) found that more-experienced (longer time in service and higher rank) military leaders tended to ask for less information when making decisions than did officers with less experience. Their study of decisions made by military leaders (a process they called "cognitive integration") suggested that experienced leaders show an intuition that leads them to sample fewer sources and to ignore sources they deem unworthy of their attention. Less-experienced officers sampled all of the information sources available and usually as much of each source as was allowed.

A particular focus of work on naturalistic decision making has been a theoretical model called recognition-primed decision making (RPD) (Klein and MacGregor, 1987; Klein, 1989; Hancock, 2007). RPD aims to provide a descriptive model of how people actually make decisions by predicting the kind of strategy they will use (Klein, 1998, 2008). It has been validated in a number of different domains by different research teams, and it has been empirically tested (Klein et al., 1995). In most studies, decision makers have been found to use an RPD strategy for about 90 percent of their difficult decisions. For simpler decisions, the proportion is likely to be higher. Of course, some decisions force a comparison between alternatives (e.g., candidates for a position) that more closely resembles the decision processes advocated in normative decision theory.

Although the RPD model predicts the strategy that will be used, it does not predict the choice the decision maker will make using that strategy. Nevertheless, attempts to predict or model actual choices are likely to benefit from referring to the decision strategy as a starting point. A pure content model that ignores process (the strategy employed) may not prove useful in describing the actual decision process of experienced leaders or in helping less-experienced leaders to make better decisions. One approach to predicting decision choices is to build on the RPD structure and add, for example, pattern repertoire content. Several ongoing attempts to formulate computational versions of the RPD model are employing this approach to predicting the decision as well as the strategy.

References

Apperly, I.A., E. Back, D. Samson, and L. France. 2008. The cost of thinking about false beliefs: Evidence from adults' performance on a non-inferential theory of mind task. Cognition 106(3): 1093-1108.
Ariely, D. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, N.Y.: Harper.
Caria, A., R. Veit, R. Sitaram, M. Lotze, N. Weiskopf, W. Grodd, and N. Birbaumer. 2007. Regulation of anterior insular cortex activity using real-time fMRI. NeuroImage 35(3): 1238-1246.
Costa, P.T., and R.R. McCrae. 1985. NEO-PI Professional Manual. Lutz, Fla.: Psychological Assessment Resources, Inc.
Cox, R.W., A. Jesmanowicz, and J.S. Hyde. 1995. Real-time functional magnetic resonance imaging. Magnetic Resonance in Medicine 33(2): 230-236.
De Houwer, J. 2002. The implicit association test as a tool for studying dysfunctional associations in psychopathology: Strengths and limitations. Journal of Behavior Therapy and Experimental Psychiatry 33(2): 115-133.
deCharms, R.C. 2007. Reading and controlling human brain activation using real-time functional magnetic resonance imaging. Trends in Cognitive Sciences 11(11): 473-481.
Dijksterhuis, A., and L.F. Nordgren. 2006. A theory of unconscious thought. Perspectives on Psychological Science 1(2): 95-109.
Elliott, R., D. Jobber, and J. Sharp. 1995. Using the theory of reasoned action to understand organisational behaviour: The role of belief salience. British Journal of Social Psychology 34(2): 161-172.
Evans, J.St.B.T., D.E. Over, and K.I. Manktelow. 1993. Reasoning, decision making and rationality. Cognition 49(1-2): 165-187.
Fitzgerald, P.B., T.L. Brown, and Z.J. Daskalakis. 2002. The application of transcranial magnetic stimulation in psychiatry and neurosciences research. Acta Psychiatrica Scandinavica 105(5): 324-340.
Fox, C.R., and J.R. Irwin. 1998. The role of context in the communication of uncertain beliefs. Basic and Applied Social Psychology 20(1): 57-70.
Fox, C.R., and J. Levav. 2000. Familiarity bias and belief reversal in relative likelihood judgment. Organizational Behavior and Human Decision Processes 82(2): 268-292.
Frith, U., and C.D. Frith. 2003. Development and neurophysiology of mentalizing. Philosophical Transactions of the Royal Society B: Biological Sciences 358(1431): 459-473.
Fugelsang, J.A., and V.A. Thompson. 2000. Strategy selection in causal reasoning: When beliefs and covariation collide. Canadian Journal of Experimental Psychology 54(1): 15-32.
Gallagher, H.L., and C.D. Frith. 2003. Functional imaging of "theory of mind." Trends in Cognitive Sciences 7(2): 77-83.
Gawronski, B. 2002. What does the implicit association test measure? A test of the convergent and discriminant validity of prejudice-related IATs. Experimental Psychology 49(3): 171-180.
Gladwell, M. 2005. Blink: The Power of Thinking Without Thinking. New York, N.Y.: Little, Brown and Co.
Glimcher, P.W., M.C. Dorris, and H.M. Bayer. 2005. Physiological utility theory and the neuroeconomics of choice. Games and Economic Behavior 52(2): 213-256.
Glimcher, P.W., J. Kable, and K. Louie. 2007. Neuroeconomic studies of impulsivity: Now or just as soon as possible? American Economic Review 97(2): 142-147.
Goel, V., and R.J. Dolan. 2003. Explaining modulation of reasoning by belief. Cognition 87(1): B11-B22.
Greenwald, A.G., D.E. McGhee, and J.L.K. Schwartz. 1998. Measuring individual differences in implicit cognition: The implicit association test. Journal of Personality and Social Psychology 74(6): 1464-1480.
Grèzes, J., C.D. Frith, and R.E. Passingham. 2004. Inferring false beliefs from the actions of oneself and others: An fMRI study. Neuroimage 21(2): 744-750.
Hancock, P.A. 2007. Cold calculation or intuitive instinct: Solving the secrets of sustained survival. Pp. 197-206 in The Psychology of Survivor: Overanalyze, Overemote, Overcompensate: Leading Psychologists Take an Unauthorized Look at the Most Elaborate Psychological Experiment Ever Conducted—Survivor! R.J. Gerrig, ed. Dallas, Tex.: BenBella Books.
Happé, F. 2003. Theory of mind and the self. Annals of the New York Academy of Sciences 1001: 134-144.
Harris, S., S.A. Sheth, and M.S. Cohen. 2007. Functional neuroimaging of belief, disbelief, and uncertainty. Annals of Neurology 63(2): 141-147.
Hertwig, R., G. Gigerenzer, and U. Hoffrage. 1997. The reiteration effect in hindsight bias. Psychological Review 104(1): 194-202.
Kable, J.W., and P.W. Glimcher. 2007. The neural correlates of subjective value during intertemporal choice. Nature Neuroscience 10(12): 1625-1633.
Kahneman, D., and A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2): 263-291.
Kahneman, D., and A. Tversky, eds. 2000. Choices, Values and Frames. New York, N.Y.: Russell Sage Foundation.
Kelley, W.M., C.N. Macrae, C.L. Wyland, S. Caglar, S. Inati, and T.F. Heatherton. 2002. Finding the self? An event-related fMRI study. Journal of Cognitive Neuroscience 14(5): 785-794.
Klein, G.A. 1989. Recognition-primed decision. Advances in Man-Machine Systems Research 5: 47-92.
Klein, G.A. 1998. Sources of Power: How People Make Decisions. Cambridge, Mass.: MIT Press.
Klein, G. 2008. Naturalistic decision making. Human Factors 50(3): 456-460.
Klein, G.A., and D. MacGregor. 1987. Knowledge Elicitation of Recognition-Primed Decision Making. Yellow Springs, Ohio: Klein Associates, Inc.
Klein, G., S. Wolf, L. Militello, and C. Zsambok. 1995. Characteristics of skilled option generation in chess. Organizational Behavior and Human Decision Processes 62(1): 63-69.
Mellers, B.A., A. Schwartz, and A.D.J. Cooke. 1998. Judgment and decision making. Annual Review of Psychology 49: 447-477.
Mitchell, J.P. 2008. Activity in right temporo-parietal junction is not selective for theory-of-mind. Cerebral Cortex 18(2): 262-271.
Phelps, E.A. 2002. Issues in the cognitive neuroscience of emotion. P. 539 in Cognitive Neuroscience: The Biology of Mind, 2nd ed. M.S. Gazzaniga, R.B. Ivry, and G.R. Mangun, eds. New York, N.Y.: Norton.
Plous, S. 1993. The Psychology of Judgment and Decision Making. New York, N.Y.: McGraw-Hill.
Samson, D., I.A. Apperly, C. Chiavarino, and G.W. Humphreys. 2004. Left temporoparietal junction is necessary for representing someone else's belief. Nature Neuroscience 7(5): 499-500.
Scales, R.H. 2006. Clausewitz and World War IV. Available online at http://www.afji.com/2006/07/1866019/. Last accessed July 23, 2008.
Shattuck, L.G., J.L. Merlo, and J. Graham. 2001. Cognitive integration: Exploring performance differences across varying types of military operations. In Proceedings from the Fifth Annual Federated Laboratory Symposium on Advanced Displays and Interactive Displays. College Park, Md., March 20-22.
Slovic, P., and S. Lichtenstein. 1968. The relative importance of probabilities and payoffs in risk taking. Journal of Experimental Psychology Monographs 78(3 Pt. 2): 1-18.
Taylor, M., A. Miller, L. Mills, E. Potterat, G. Padilla, and R. Hoffman. 2006. Predictors of Success in Basic Underwater Demolition/SEAL Training. Part I: What Do We Know and Where Do We Go from Here? Naval Health Research Center Technical Report No. 06-37. Springfield, Va.: National Technical Information Service.
Taylor, M., G. Larson, A. Miller, L. Mills, E. Potterat, J. Reis, G. Padilla, and R. Hoffman. 2007. Predictors of Success in Basic Underwater Demolition/SEAL Training. Part II: A Mixed Quantitative and Qualitative Study. Naval Health Research Center Technical Report No. 07-10. Springfield, Va.: National Technical Information Service.
Tom, S.M., C.R. Fox, C. Trepel, and R.A. Poldrack. 2007. The neural basis of loss aversion in decision making under risk. Science 315(5811): 515-518.
Tversky, A., and D. Kahneman. 1971. Belief in the law of small numbers. Psychological Bulletin 76(2): 105-110.
von Clausewitz, C. 1976. On War. M.E. Howard and P. Paret, eds. Princeton, N.J.: Princeton University Press.
Wang, Y.W., Y. Liu, Y.X. Gao, J. Chen, W. Zhang, and C.D. Lin. 2008. False belief reasoning in the brain: An ERP study. Science in China Series C: Life Sciences 51(1): 72-79.
Weber, E.U., A.-R. Blais, and N.E. Betz. 2002. A domain-specific risk-attitude scale: Measuring risk perceptions and risk behaviors. Journal of Behavioral Decision Making 15(4): 263-290.
Wertz, A.E., and T.C. German. 2007. Belief–desire reasoning in the explanation of behavior: Do actions speak louder than words? Cognition 105(1): 184-194.
Wicker, B., P. Ruby, J.-P. Royet, and P. Fonlupt. 2003. A relation between rest and the self in the brain? Brain Research Reviews 43(2): 224-230.
Wu, G., and R. Gonzalez. 1998. Common consequence effects in decision making under risk. Journal of Risk and Uncertainty 16(1): 115-139.
Yoshida, W., and S. Ishii. 2006. Resolution of uncertainty in prefrontal cortex. Neuron 50(5): 781-789.
Young, L., F. Cushman, M. Hauser, and R. Saxe. 2007. The neural basis of the interaction between theory of mind and moral judgment. Proceedings of the National Academy of Sciences of the United States of America 104(20): 8235-8240.
Young, L., and R. Saxe. 2008. The neural basis of belief encoding and integration in moral judgment. Neuroimage 40(4): 1912-1920.
Zajonc, R.B., R. Crandall, and R.V. Kail. 1974. Effect of extreme exposure frequencies on different affective ratings of stimuli. Perceptual and Motor Skills 38(2): 667-678.
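The chapter closes by noting ongoing attempts to formulate computational versions of the RPD model. As a rough illustration of what the core loop of such a model might look like, the following Python sketch implements a satisficing recognition step: the observed situation is matched against a repertoire of experienced patterns, the first sufficiently recognized pattern whose action survives a (here trivialized) mental-simulation check is acted on, and deliberate analysis is the fallback only when nothing is recognized. The Pattern structure, cue sets, and matching threshold are invented for this example; they are not drawn from Klein's model or from any of the computational RPD efforts cited above.

```python
# Illustrative sketch only: a satisficing recognition-primed decision loop.
# Pattern names, cues, and the 0.5 threshold are invented for this example.

from dataclasses import dataclass

@dataclass
class Pattern:
    name: str
    cues: frozenset        # situational cues that characterize this pattern
    action: str            # typical action associated with the pattern
    workable: bool = True  # stand-in for the outcome of mental simulation

def match_score(pattern: Pattern, observed: frozenset) -> float:
    """Fraction of the pattern's characteristic cues present in the situation."""
    return len(pattern.cues & observed) / len(pattern.cues) if pattern.cues else 0.0

def rpd_choose(repertoire, observed: frozenset, threshold: float = 0.5) -> str:
    """Satisficing choice: take the first sufficiently recognized, workable
    pattern rather than exhaustively comparing all alternatives."""
    for pattern in sorted(repertoire, key=lambda p: match_score(p, observed), reverse=True):
        if match_score(pattern, observed) >= threshold and pattern.workable:
            return pattern.action  # first workable match wins
    return "deliberate"            # no recognized pattern: fall back to analysis

repertoire = [
    Pattern("ambush", frozenset({"contact", "channelized_terrain"}), "break_contact"),
    Pattern("meeting_engagement", frozenset({"contact", "open_terrain"}), "hasty_attack"),
]
print(rpd_choose(repertoire, frozenset({"contact", "channelized_terrain"})))  # break_contact
```

Because recognition short-circuits comparison, a repertoire that covers a situation well yields an immediate action, consistent with the finding that decision makers use an RPD strategy for most of their difficult decisions; extending the repertoire with "pattern repertoire content," as the chapter suggests, would be a matter of enriching the stored patterns rather than changing the loop.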