Opportunities in Neuroscience for Future Army Applications

4  Optimizing Decision Making

Extensive work in the behavioral sciences and neuroscience over the past decade has highlighted two realities that confront any organization that relies on decisions made by its members. First, decision making by humans is often suboptimal in ways that can be reliably predicted—and sometimes remediated with training (e.g., Kahneman and Tversky, 1979; Ariely, 2008). If a decision maker is explicitly told to make decisions that “maximize financial gain on average” or to “minimize human casualties on average” and fails to do so in a systematic way, social scientists consider the outcome to be a result of suboptimal or inefficient decisions. For example, humans make errors in the ways that they estimate the probabilities of events, react to heavy losses, and deal with ambiguous situations (Kahneman and Tversky, 2000). Suboptimal decision making has been identified in financial circles where billions of dollars are at stake, and it undoubtedly occurs in the Army as well. Second, individuals differ predictably in their decision making (see, for example, Wu and Gonzalez, 1998). These differences can be highly idiosyncratic and are likely tied to stable, long-term traits of personality, physiology, and neurology (see, for example, Weber et al., 2002). This variability in decision making can make an individual better suited for one particular task and less well suited for another. At present, the Army neither seeks to measure, in a formal way, deviations from optimal or efficient decision making nor attempts to characterize the stable individual differences that shape a particular soldier’s decision making.
This chapter explores ways in which current and emerging insights from neuroscience, together with neural monitoring tools such as neuroimaging, can help the Army to optimize the decision making of its soldiers and officers in both these senses: first, by identifying and providing countermeasures to suboptimal decision making and, second, by identifying and making optimal use of individual variability in decision-making traits.

THE SOURCES OF SUBOPTIMAL DECISION MAKING

Consider a basketball coach who must transport his team to a distant city and must choose between flying and driving. If he chooses to drive because he believes he is minimizing the risk to his players, he has made a suboptimal decision. Because flying is statistically safer than driving, he is exposing his players to more risk by driving. Indeed, purely from the perspective of rational risk management, he would have made the right decision by choosing to fly even if his team were then involved in an aviation accident. The Army could do much more than it currently does to study and correct suboptimal decision making using the full range of neuroscience, from the behavioral to the systems level. Leader training and assessment have not kept pace with advances in the science. Although decision making is critical to the conduct of warfare, no research unit within the Army focuses specifically on assessing or improving decision making by officers or enlisted soldiers at a behavioral or neurophysiological level. Even high-profile projects such as the Augmented Cognition project (a previous Defense Advanced Research Projects Agency (DARPA) project discussed in Chapter 6) have focused on how to present information rather than on ways to improve decision making per se. Because we now know that human decision making is predictably suboptimal in many situations, the Army must decide whether to optimize the force along this human dimension or ignore an emerging opportunity.
Two classes of suboptimal decision making for which there is abundant recent behavioral literature will illustrate how much is at stake in choosing whether to seize on this new understanding or ignore it.

Errors in Assessing Relative Risk

All decision making involves risk, and individuals have different tolerances for risk. As the eighteenth-century mathematician Daniel Bernoulli famously observed, a poor man with a lottery ticket that has a 50 percent chance of winning $20,000 might very rationally trade that ticket for $9,000. On average, the lottery ticket he holds is worth $10,000, but he might choose not to take the risk. Of course, how little he is willing to accept for the ticket reflects just how risk averse he is. Someone who is very risk averse might settle for a sure gain of $1,000 in return for the ticket; someone who is tolerant of risk might refuse even $9,900. The important point is that there is nothing suboptimal about any of these decisions for Bernoulli’s poor man. These different decisions reflect different tolerances for risk that might affect the suitability of an individual for a specific job, but scientists (unlike mission planners) must be silent on notions of optimality with regard to risk tolerance. Of course it may be that individuals who are more risk averse are better suited for peacekeeping missions and individuals who are more risk tolerant are better suited for frontline combat missions, but different tolerances for risk, from the point of view of optimality theory, simply represent different approaches. The assessment of risk by individuals, however, is an area where the tools of optimal decision making can be brought to bear. Consider a commander ordered specifically to “minimize vehicle losses.” Real-life situations seldom provide clear-cut alternatives for which risks and benefits can be estimated with a high degree of certainty. But suppose that a commander has a choice between two plans for committing 500 vehicles to an operation and that his staff has prepared the following best estimates of probable losses:

- For plan 1, there is a 2 percent chance that 50 vehicles will be lost.
- For plan 2, there is a 50 percent chance that 4 vehicles will be lost.

From a decision-theoretic point of view, given the goal of minimizing vehicle losses, plan 1 is preferable.
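The arithmetic behind this comparison can be checked with a short sketch (our illustration, not from the report). The `tk_weight` function is the Tversky-Kahneman (1992) probability weighting form, with gamma = 0.61 as a commonly cited parameter estimate used here purely for illustration; it previews how overweighting small probabilities can reverse the objective ranking of the two plans.

```python
# Minimal sketch (illustrative, not from the report): expected vehicle losses
# for the two hypothetical plans, plus a Tversky-Kahneman (1992) style
# probability weighting function that overweights small probabilities.
# gamma = 0.61 is a commonly cited estimate, used here only for illustration.

def expected_loss(p, vehicles):
    """Objective expected number of vehicles lost."""
    return p * vehicles

def tk_weight(p, gamma=0.61):
    """Inverse-S probability weighting: small p is overweighted."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

plans = {"plan 1": (0.02, 50),   # 2 percent chance of losing 50 vehicles
         "plan 2": (0.50, 4)}    # 50 percent chance of losing 4 vehicles

for name, (p, n) in plans.items():
    print(f"{name}: expected loss {expected_loss(p, n):.1f}, "
          f"subjectively weighted loss {tk_weight(p) * n:.2f}")

# Objectively, plan 1 loses half as many vehicles on average (1 vs. 2), but
# the distorted weight w(0.02) is roughly four times 0.02, which is enough
# to reverse the ranking for a decision maker who weights probabilities
# this way.
```

The point of the sketch is only that a modest, empirically typical distortion of small probabilities is sufficient to make plan 2 look safer than plan 1.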
If 100 commanders face this decision, all choose plan 1, and the probability estimates are accurate, the losses would be half what they would have been if all had chosen plan 2.[1] Despite the logic behind the probability estimates, behavioral research indicates unambiguously that most people would choose plan 2. For behavioral scientists, this preference reflects a standard feature of decision making, subjective probability distortion (Kahneman and Tversky, 1979). It is now abundantly clear that human decision makers see very small probabilities as much larger than they actually are (for an accessible review, see Plous, 1993). The financial industry is now beginning to take this feature of human behavior into account. Some financial firms now provide their decision makers with training and tools that help them overcome this widespread inefficiency in decision making. Over the past decade, the neurophysiological roots of this kind of behavior have begun to be understood. It seems likely that this enhanced understanding of the mechanisms of decision making will shed new light on the sources of suboptimal decisions. We now know, for example, that structures in the basal ganglia and the prefrontal cortex (PFC) provide valuations for actions, and these valuations appear to be passed to the posterior parietal cortex, among other areas, where decisions are made (see, for example, Glimcher et al., 2005). In fact, recent research has even begun to identify the neural algorithms that lead to some of these inconsistencies (Glimcher et al., 2007).

Loss Aversion in Decision Making

Another factor that leads to seriously suboptimal decision making is the asymmetry with which human decision makers typically weigh risky losses against risky gains (see Ariely, 2008, and Plous, 1993). Consider a gambler who has $1,000 in his pocket and is offered a $100 bet that has a 1 in 20 chance of paying off $500.
With the money in his pocket and the night’s action still to come, he reasonably refuses this highly improbable bet. Later on, having lost $10,000 on credit, he might well accept that same bet. Contemporary studies of decision making suggest that this reflects a change in his risk tolerance. As losses accumulate, decision makers typically become more and more risk tolerant, even to the point of becoming risk seekers. In the financial world, this behavior pattern has led to a number of catastrophic financial events and even to bank failures. Whether the pattern occurs in combat has not been studied formally, but the constancy of human nature and the annals of military history (e.g., Napoleon at Waterloo) suggest that it does occur, with ominous consequences. Over the past year, the neural basis of this phenomenon has been identified. This finding has broad implications for understanding loss aversion as a source of suboptimal decision making. Before the neurological studies were conducted, the widespread conviction was that loss aversion was a discrete fear response that biased other decision-making mechanisms. The neurophysiological evidence weighs against this view, instead suggesting that a unitary valuation system unrelated to fear drives loss-averse behavior (Tom et al., 2007).

[1] Of 100 commanders choosing plan 1, about two would suffer losses, totaling 100 (2 × 50) vehicles; of 100 choosing plan 2, about fifty would suffer losses, totaling 200 (50 × 4) vehicles.

MAKING OPTIMAL USE OF INDIVIDUAL VARIABILITY

The preceding section discussed the evidence that nearly everyone, under some conditions, makes decisions that are less than optimal for achieving the organizational goals set for them. Irrespective of the type of suboptimal decision making, individuals differ in how they make decisions. For example, some individuals are more impulsive than others and some are more tolerant of risk. Such differences do not necessarily mean that risk-tolerant individuals are better decision makers than risk-averse individuals. From an institutional point of view—that is, what is best for accomplishing the Army’s mission—a certain decision-making style may be better or less well suited to a given task. Economic, psychological, and neurophysiological studies conducted over the past several decades have shown that individuals differ in their decision making in predictable ways (see, for example, Wu and Gonzalez, 1998) and that these predictable differences can remain stable for long periods of time (see, for example, Weber et al., 2002). In an economic context, the attitudes of an individual to risk are idiosyncratic, but they appear to be relatively stable personality features. Personality features such as “conscientiousness” have been shown to be both measurable and stable across an individual’s life span (see, for example, Costa and McCrae, 1985). At a neurophysiological level, the distributions of neurochemicals, receptors, and brain activations differ among individuals, and there is growing evidence that these differences are related to stable features of personality (see, for example, Kable and Glimcher, 2007). A common theme that emerges from all this work is not that one type of decision maker is inherently better than another in all circumstances but rather that, given the natural range of differences in a population, one set of decision-making characteristics may be better suited to a particular task than another. Consider a commanding officer who must select a lieutenant for command in an area filled with civilians. An individual who is more risk-averse may be better suited for this task than one who is more risk-tolerant. But an action conducted with a large force in a highly uncertain environment may call for a more risk-tolerant individual. Experienced commanding officers certainly know this, and they select officers for missions according to their individual strengths.
Informal assessment occurs routinely throughout the Army, as it does in every organization. The issue is whether adopting more formal techniques based on the results of research in neuroeconomics, neuropsychology, and other neuroscience disciplines can give the Army an advantage in decision making. The Army does not now assess officers for traits such as risk tolerance; nor does it train commanding officers to make the best use of the differential decision-making traits among their subordinates. Battalion-level commanders are not given objective information about the decision-making traits of newly assigned subordinates, nor have those subordinates been trained explicitly in how to adjust their decision making to different tasks. More generally, the Army has not sought to characterize the decision-making characteristics of its individual soldiers and officers through any set of validated metrics even though such information could be useful for (1) determining which tasks an individual appears particularly well suited to perform or (2) identifying training strategies that might better prepare an individual for the decision-making responsibilities of his/her assignment, given that individual’s inherent behavioral and neurophysiological characteristics. Is there a more efficient and reliable way to find the best officer for a particular job? Can the Army routinely provide its commanding officers with a more complete, reliable, and useful assessment of their junior officers at the beginning of a tour of duty? Can neurophysiological monitoring tools identify soldiers who face specific idiosyncratic risks in particular decision environments? Issues such as these do not appear to have been explored by the Army. Based on what the private sector is already doing to remain competitive in a global environment, these and similar questions should no longer be ignored. 
As the ability to characterize and predict behavior and psychology using the tools of neuroscience grows, these tools may well become critical for the Army.

TOOLS FOR CHARACTERIZING INDIVIDUAL DECISION MAKERS

Both officers and soldiers make decisions that contribute to (or detract from) achieving mission objectives. We know three important things about their decision-making processes: First, human decision making is likely to be suboptimal in the sense discussed above. Second, humans vary in the stable long-term personality traits that suit them for different kinds of decision-making tasks and different kinds of decision-making training. Third, people can be trained to be more efficient decision makers. If the goal of the Army is to optimize decision making in both ways—as a countermeasure to suboptimal decisions and to maximize the value of trait variability—then it must characterize how individuals make decisions. Once individual decision making can be characterized objectively, then training regimes and selection procedures for both kinds of optimization can be applied. Several tools already exist for characterizing relevant traits of individuals as decision makers. These tools fall under the broad categories of behavioral testing, psychophysiological assessment, neurophysiological monitoring, and genetic analysis. Each tool can provide information about how an individual compares to a larger population of decision makers, individual decision-making strategies, the effects of training on decision making, or combinations of these.

Personality as a Factor in Decision Making

The most well-developed and robust form of assessment available today is personality assessment based on factor analysis. During the 1980s and 1990s a number of tools were developed for classifying personality. Investigators subsequently asked whether the features of personality measured with these tests remained stable across an entire life span. In general, the answer has been yes.
Among the well-known and well-validated tools for classifying personality traits quantitatively are the NEO personality inventory[2] and the Minnesota Multiphasic Personality Inventory. Personality traits as measured by these tools have already been used successfully to predict performance in Army assignments. For example, Lawrence Katz has shown that the results from this kind of assessment can lead to a twofold improvement in selecting helicopter pilot candidates, when success is defined as selecting individuals who will later be assessed by their experienced peers as being good helicopter pilots.[3] The extension of these quantified personality inventories to characterizing individual decision making is less well established. How individuals differ in stable personality traits that influence their decision making has been the subject of intense study in behavioral economics over the last decade. Individuals differ in their tolerance of risk, and they also differ in how they react to risky situations that involve losses and risky situations that involve gains. Individuals can differ as well in the ways that they deal with ambiguous situations and in how they interpret probabilities. People also differ in impulsivity: some place great value on immediate gains while others are more patient. We do not yet know how stable these traits are over an individual’s life span, but we do know that they have a huge effect on the decisions an individual makes.

Emotional Reactivity in Decision Making

The personality inventories discussed above use subjects’ responses to a psychological instrument such as a questionnaire to characterize and quantify personality traits. This is not, however, the only way to characterize aspects of personality relevant to how an individual makes decisions. Over the past decade, galvanic skin response (GSR) has become a valuable tool for quantifying an individual’s emotional reactivity.
GSR is an inexpensive biometric measure of changes in skin conductivity (essentially, the rate of sweating). Neuropsychological studies have shown that GSR is a reliable surrogate for a subject’s arousal and emotional engagement during a number of behavioral tests (for a review, see Phelps, 2002). Recently, the Army Institute for Creative Technologies implemented a GSR-based system for assessing post-traumatic stress disorder (PTSD). GSR could also be used to monitor an individual’s emotional reactivity when making decisions in a range of contexts. For example, a GSR-based monitor could be used during simulation training to select individuals for future high-stress missions such as explosive ordnance disposal.

Emerging Tools: Genetics, Neurohormones, and Brain Imaging

Genetic markers, neurohormones, and brain imaging are emerging as sources for biomarkers that may prove to be reliable indicators of neural state when individuals make choices—that is, they can signal the neural processes underlying the emotional or subjective elements of decision making. (See Chapter 2 for the definition of “biomarker.”) Genetic markers are particularly relevant for identifying stable traits. Research data suggest that some genetic markers can identify individuals at greater risk of reacting to chemical agents or suffering from PTSD. It is also known that hormonal state—specifically, an individual’s hypothalamo-pituitary axis responsivity—influences decision making as well as fitness for duty (Taylor et al., 2006, 2007). The cost of genetic testing for such traits will decrease over the next decade, and the selectivity and specificity of the tests will improve. As this happens, the Army should position itself to take advantage of the new tools. At present, brain scanning seems too costly relative to the utility of the information that could be gained to be of much use for assessing how people make choices.
A single brain scan costs more than $1,000, so this technology is unlikely to be useful for this purpose in the near term. For the far term, its potential value for Army applications related to decision making will depend on the increase in understanding it provides relative to other techniques (e.g., the conventional personality inventories or the improved tools described in Chapter 3) and on its cost. For the time being, the Army should follow the direction of research in this area and should be able to assess the practical value of the science and applications emerging from external parties (academia and the commercial sector).

NEUROSCIENCE-RELATED THEORIES OF DECISION MAKING

No single theory of decision making is accepted throughout the multiple disciplines, subdisciplines, and research paradigms that constitute neuroscience as defined for this report. Even within a single discipline, researchers are working on approaches that partly overlap (agree) and partly compete or conflict with one another. The committee has selected two such approaches from among the many that could be discussed to illustrate how the general theories of decision making are relevant to the Army and which aspects of the ongoing research are worth monitoring to gain insights and practical approaches to improving decision making in military-relevant contexts. The first approach is belief-based decision making and the neurophysiological activation patterns that have been linked with its more theoretical constructs. The second approach is intuitive, or naturalistic, decision making, which thus far has been primarily descriptive (taxonomic) of actual decision behavior and has served as a framework for understanding recognition-primed decisions.

[2] The acronym NEO derives from the first three of the five axes, or major traits, of personality measured by this tool: neuroticism, extraversion, openness, agreeableness, and conscientiousness. Each axis is further divided into six subtraits.
[3] Lawrence Katz, research psychologist, Army Research Institute, briefing to the committee on April 28, 2008.

Belief-Based Decision Making

This section reviews one approach to decision making that occurs frequently in ordinary situations and that has profound implications for performance in complex environments. Specifically, belief-based decision making is concerned with selecting one action over another (or others) when knowledge about the possible consequences and outcomes of the alternatives is uncertain, unclear, or subject to very strong a priori assumptions about how actions “should” result in certain outcomes. The theory of belief-based decision making deserves close scrutiny because many choices that an outsider might call illogical might be better understood when looked at from the perspective of the decider’s preexisting belief system. Finally, and most important, belief-based decision making has been associated with brain structures that may be accessible to real-time monitoring—and perhaps even to modification—using the tools of neuroscience.

The Cognitive Psychology of Belief Sets

A belief can be defined as a propositional mental construct that affirms or denies the truth of a state of affairs and is closely linked to basic judgment processes. A large and stable set of beliefs is essential for intelligent behavior, since such a belief set forms the basis for any actions that a person may take to achieve his or her goals. Beliefs are the building blocks we use to build mental models of the state of the world. They are therefore important constructs used to guide decision making. The neuropsychological approach to characterizing beliefs has focused recently on developing testable process theories, which through neural state monitoring—for example, neuroimaging or event-related potentials—can be related to functions of different systems in the brain.
To understand recent neuroscience findings related to belief-based decision making, a few essential constructs need to be explained. A “theory of mind” (ToM) refers to an individual’s everyday ability to attribute independent mental states to him- or herself and to others for the purpose of predicting and explaining behavior (Happé, 2003) (see also Box 3-2). The cognitive mechanisms underlying this ability to attribute a “mind” to oneself or others have been extensively investigated via reasoning tasks that require participants to predict the action of agents based on information about those agents’ beliefs and desires. Researchers who have investigated this ability have found systematic reasoning errors (Wertz and German, 2007). In everyday life we typically infer the beliefs of others from what they tell us; behavioral experiments have the advantage that the experimenter does not have to rely on the degree to which an individual reports his or her beliefs truthfully. Behavioral experiments show that a false belief engages greater cognitive resources, resulting in longer response times. (In this context, a “false belief” is a presumed state of reality that differs from the experienced state of reality [Apperly et al., 2008] or an expected outcome of an action that differs from the actual outcome [Grèzes et al., 2004].) A false belief may increase the cognitive load because subsequent judgments and decision making need to take it into account as a conflict with the individual’s perception of reality (Apperly et al., 2008). When individuals are asked to make belief-based decisions, they tend to (1) examine less closely those arguments that support their existing beliefs and (2) seek out evidence that confirms rather than contradicts current beliefs. These tendencies often lead to belief-maintaining biases and highly irrational choices (Evans et al., 1993).
Interestingly, repetition of the presentation of a belief has been shown to increase the intensity of that belief (Hertwig et al., 1997), which parallels the observation that individuals prefer stimuli that are frequently presented over those that are rarely presented (Zajonc et al., 1974). Dysfunctional beliefs and the choices of action associated with them are thought to play a crucial role in various forms of dysfunction, including dysfunctional decision making. Experimentally, researchers have used the Implicit Association Test (Greenwald et al., 1998) to assess belief-based associations; it is thought to be superior to traditional self-report measures (De Houwer, 2002). Moreover, the Implicit Association Test can be used to assess the strength of evaluative associations in the domain of beliefs (Gawronski, 2002). More specifically for decision making, people have erroneous beliefs about the laws of chance (Tversky and Kahneman, 1971). Beliefs can have direct effects on the perceived likelihood of certain outcomes. For example, familiarity tends to increase the perceived probability of the outcome associated with an event (Fox and Levav, 2000). Moreover, when judging cause and effect, beliefs can interfere with detecting cause and effect in decision-making situations by favoring more believable causes over those perceived as less believable (Fugelsang and Thompson, 2000). Some investigators have integrated these findings into the context of the ToM literature (Fox and Irwin, 1998), suggesting that probabilities of available options are immediately embedded into a mentalizing context. A person’s beliefs about the relative importance of probabilities and values also compete with limitations on his or her ability to make decisions on the basis of these beliefs when processing information in a specific decision-making situation (Slovic and Lichtenstein, 1968).
For example, greater deviation from optimal decision making in favor of belief-based decision making can be observed when individuals are making decisions in complex and extreme environments. Finally, beliefs can bias a decision maker to adopt what has been referred to as rule following, which occurs when a rule is applied to a situation to minimize cognitive effort and provide emotionally satisfying solutions that are good enough but not necessarily the best (Mellers et al., 1998). Thus, it should not be surprising that adding measures of personal belief salience to considerations of the decision-situation characteristics improves the accuracy of predictions concerning decision-making behavior (Elliott et al., 1995).

Neural Correlates of Belief Processing

Based on the preceding brief review of cognitive processes that contribute to belief formation and how belief affects decision making, it should be clear that no single area of the brain is responsible for belief-based decision making, and surprisingly few brain regions have been reported to play the same role repeatedly. Neuroimaging studies reveal a neural system with three components: the medial PFC, the temporal poles, and the posterior superior temporal sulcus (Frith and Frith, 2003; Gallagher and Frith, 2003). The right temporoparietal junction (TPJ) has been implicated not only in processing the attribution of beliefs to other people (ToM) but also in redirecting attention to task-relevant stimuli. Recent studies of this overlap between ToM and attentional reorienting suggest the need for new accounts of right TPJ function that integrate across these disparate task comparisons (Mitchell, 2008). Others have found that the right TPJ activates differentially as a function of belief and outcome and that the activation is greatest when there could be a negative outcome based on the belief that the action of a protagonist would cause harm to others, even though the harm did not occur (Young et al., 2007). A study of brain-damaged individuals reported that, in addition to the frontal cortex, the left TPJ is necessary for reasoning about the beliefs of others (Samson et al., 2004). Thus, the TPJ appears to be crucial for the representation of a mental state associated with belief formation, possibly focused on disambiguating true from false beliefs.
Activation of the medial PFC is often observed with the subject at rest, a time when individuals presumably engage in internally directed processing of self-relevant thoughts (Kelley et al., 2002) and beliefs (Wicker et al., 2003). Some investigators have found that, whereas the medial PFC is recruited for processing belief valence, the TPJ and precuneus are recruited for processing beliefs in moral judgment and mediate both the encoding of beliefs and their integration with outcomes for moral judgment (Young and Saxe, 2008). These findings are consistent with reports of greater activation of the anteromedial PFC and rostral anterior cingulate when participants were implicitly making associations consistent with gender and racial biases. Event-related potential studies of reasoning with false beliefs have revealed a specific late negative component that appeared to be located in the middle cingulate cortex and that might be related to conflict or error detection (Wang et al., 2008). Therefore, the medial PFC appears to be important for processing the degree to which beliefs are self-relevant, possibly with emphasis on the correct or incorrect, acceptable or unacceptable, valence of the belief. The strength with which one holds to a belief is an important regulator of human behavior and emotion. A recent neuroimaging study showed that states of belief, disbelief, and uncertainty differentially activate distinct regions of the prefrontal and parietal cortices, as well as the basal ganglia. The investigators proposed that the final acceptance of a statement as true or its rejection as false relies on more primitive, hedonic processing in the medial PFC and the anterior insula (Harris et al., 2007).
Comparing belief-based and belief-neutral reasoning, some researchers have reported activation of the left temporal lobe and the parietal lobe, respectively, with modulation by the lateral PFC when subjects overcame belief bias and by the ventral medial PFC when they succumbed to it (Goel and Dolan, 2003). A study of belief-based decision making in a navigation task under uncertain conditions showed distinct regions of PFC activation, consistent with an underlying Bayesian model of decision making that permits efficient, goal-oriented navigation (Yoshida and Ishii, 2006). Thus, the dorsolateral and ventromedial PFC may have important regulatory functions in moderating belief intensity and the degree to which beliefs influence decision making. Additional brain regions may modulate belief-based processing by contributing resources to (1) override beliefs, (2) enforce beliefs, or (3) engage in reward-related processing associated with beliefs (that is, how good it feels to be right, or how one's own belief-based decision affects others). This brief review of brain areas important for belief-based decision making shows that beliefs are most likely decomposed in the brain into component processes, for example, self-relevance, valence, simulation, social relatedness, and other yet-to-be-examined components. These components are then processed by various regions of the brain to arrive at preferences that allow the individual to select an option and engage in decision making. Understanding these component processes may make it possible to better monitor, and potentially modulate, belief-based decision making, particularly given the significant distortions and suboptimal decisions that beliefs can generate.
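The Bayesian account of decision making under uncertainty invoked above (Yoshida and Ishii, 2006) can be illustrated with a minimal belief-update sketch. This is not the published model; the candidate goal locations, cue likelihoods, and all numbers below are invented purely for illustration of how a belief distribution sharpens with evidence.

```python
# Illustrative Bayesian belief updating over three candidate goal
# locations (A, B, C) in a navigation task. All values are invented.

def bayes_update(prior, likelihoods):
    """Multiply prior by likelihood and renormalize (Bayes' rule)."""
    unnormalized = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

# Uniform prior belief over the three hypotheses.
belief = [1 / 3, 1 / 3, 1 / 3]

# Each noisy observation gives the likelihood of the cue under each hypothesis.
observations = [
    [0.7, 0.2, 0.1],  # first cue favors location A
    [0.6, 0.3, 0.1],  # second cue also favors A
]

for obs in observations:
    belief = bayes_update(belief, obs)

best = max(range(len(belief)), key=lambda i: belief[i])
print([round(b, 3) for b in belief])  # posterior concentrates on A
print("move toward goal", "ABC"[best])
```

The point of the sketch is that goal-oriented navigation can proceed efficiently by acting on the current maximum-a-posteriori belief rather than re-deliberating from scratch after every observation.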
Potential Neural Targets for Modifying Belief-Based Decision Making

In Army operations, decision making frequently occurs in extreme environments, which place high demands on an individual's physiological, affective, cognitive, and social processing resources. In other words, extreme environments strongly perturb the body and mind, which in turn initiate complex cognitive and affective response strategies. Belief-based decision making is often a way for the individual to conserve resources and to select an action that seems best under the circumstances. However, as pointed out above, belief-based decision making frequently results in suboptimal outcomes because the individual fails to adequately take into account chance, value, and other information. Modulating belief-based decision making therefore offers an opportunity to optimize outcomes. The main advance that neuroscience has brought to understanding belief-based decision making is the ability to assign component processes to specific brain regions and to potentially target those regions for monitoring and modulation. Although some brain regions are likely to be more difficult to modulate, others, located on the convexity of the brain, are readily accessible to modulations that could in theory alter the degree to which belief processing in that area influences decisions. For example, the TPJ is easy to localize and can be accessed using brain stimulation techniques such as transcranial magnetic stimulation (Fitzgerald et al., 2002). Moreover, both fMRI and event-related potentials may be useful in monitoring the TPJ for reliable biomarkers of belief strength and of the truth of a belief. Finally, using real-time fMRI (Cox et al., 1995), one could monitor this and other brain regions in response to inquiries about various types of beliefs and measure whether interventions succeeded in modulating those beliefs, much as very recent experiments have aimed at attenuating emotion (Caria et al., 2007) or pain (deCharms, 2007). Although these approaches are in their infancy, they point to unprecedented opportunities to directly and specifically manipulate the brain to alter decision-making processes.

Summary Points for Army Applications

The preceding discussion of belief-based decision making suggests three key opportunities for the Army. Belief-based decision making is an important process that occurs in extreme environments and can have profound implications for optimizing behavior and outcomes.
Accordingly, understanding such decision making and modulating it is an important neuroscience opportunity for the Army. Research on belief-based decision making supports the idea that a few distributed brain systems are responsible for processing the beliefs that contribute to selecting appropriate or inappropriate actions. These brain systems can be (1) monitored and (2) potentially modulated or modified in real time during training or preparation exercises to optimize decision making. Research directed at understanding and applying belief-based decision making can be categorized as near term (applicable results expected within 5 years), medium term (applicable results expected within 10 years), or far term (applicable results expected in 10-20 years). In the near term, identify instances when individuals are making inappropriate, belief-based decisions that result in nonoptimal performance. In the medium term, develop training tools to let individuals know when they are making nonoptimal, belief-based decisions. In the far term, monitor (and potentially also modulate) in real time the brain systems that contribute to nonoptimal, belief-based decision making. Investigative tools such as transcranial magnetic stimulation and real-time fMRI are not envisioned to be deployed in actual operations but could be used as tools for training those who make high-impact, high-risk decisions (e.g., field commanders).

Intuitive Decision Making and Recognition-Primed Decisions

While advances in military technology do change the tactics, techniques, and procedures of armed conflict, the nineteenth-century military strategist Carl von Clausewitz held that war is influenced more by human beings than by technology or bureaucracy (von Clausewitz, 1976). Relatively young leaders making decisions at the point of the strategic spear can become the critical factor in whether an engagement is won, lost, or drawn. For the U.S.
military, including the Army, the challenge is less to empower these young leaders to make decisions (the Army has already restructured to encourage such behavior) and more to improve the quality of the decisions they make. Military strategists have emphasized that the strength needed to win the next war will depend less on kinetic energy than on cognitive power and performance (Scales, 2006). For many years, the military community believed that the classical decision-making literature (e.g., normative decision theory, statistical decision theory, and decision analysis) appropriately defined rational decision making for military commanders. Over the last two decades, however, attention has shifted toward naturalistic decision making (positive, or descriptive, decision theory). Military leaders most often make decisions based on their previous experiences. These experiences can come from actual combat, from combat training (for example, at the Joint Readiness Training Center, the Combined Maneuver Training Center, or the National Training Center), or from home station training. Alternatively, leaders may acquire vicarious experience by studying past leaders in military history. Taken together, these experiences arguably build a military commander's base for so-called intuitive decision making. Unpacking the terms "intuition" and "intuitive" is an arduous task, as different researchers have defined them in different ways, contingent on their own specialties. In a recent popular treatment of the topic, Gladwell (2005) referred to intuition as "blink," a decision made in an instant. Dijksterhuis and Nordgren (2006) define intuition as a gut feeling based on past experience and hold that intuition plays a critical role in the theory of unconscious
thought. Klein (1989) posits that intuition precedes analysis: battlefield commanders are told they need to trust their own judgment and intuition, but they are not told how to develop that intuition so that it is trustworthy. Shattuck et al. (2001) found that more experienced military leaders (longer time in service and higher rank) tended to ask for less information when making decisions than did officers with less experience. Their study of decisions made by military leaders (a process they called "cognitive integration") suggested that experienced leaders show an intuition that leads them to sample fewer sources and to ignore sources they deem unworthy of their attention. Less experienced officers sampled all of the information sources available, and usually as much of each source as was allowed. A particular focus of work on naturalistic decision making has been a theoretical model called recognition-primed decision making (RPD) (Klein and MacGregor, 1987; Klein, 1989; Hancock, 2007). RPD aims to provide a descriptive model of how people actually make decisions by predicting the kind of strategy they will use (Klein, 1998, 2008). It has been validated in a number of different domains by different research teams and has been tested empirically (Klein et al., 1995). In most studies, decision makers have been found to use an RPD strategy for about 90 percent of their difficult decisions; for simpler decisions, the proportion is likely to be higher. Of course, some decisions force a comparison between alternatives (e.g., candidates for a position) that more closely resembles the decision processes advocated in normative decision theory. Although the RPD model predicts the strategy that will be used, it does not predict the choice the decision maker will make using that strategy.
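The RPD cycle described above can be outlined in code: recognize the situation by matching it against a pattern repertoire, retrieve the action typically associated with the best match, mentally simulate it, and accept the first workable option rather than comparing all alternatives. This is a hypothetical sketch, not Klein's formal model; the patterns, cues, and success scores below are invented.

```python
# Minimal, illustrative recognition-primed decision (RPD) loop.
# Patterns, cues, and expected_success values are invented examples.

from dataclasses import dataclass

@dataclass
class Pattern:
    cues: frozenset          # cues that identify this situation type
    typical_action: str      # action an experienced decision maker retrieves
    expected_success: float  # crude stand-in for mental simulation

def rpd_decide(observed_cues, repertoire, threshold=0.5):
    # Recognition: rank patterns by cue overlap, most familiar first.
    ranked = sorted(repertoire,
                    key=lambda p: len(p.cues & observed_cues),
                    reverse=True)
    for pattern in ranked:
        if len(pattern.cues & observed_cues) == 0:
            break  # nothing recognized; would fall back to deliberate analysis
        # "Mental simulation": accept the first workable option
        # (satisficing), rather than optimizing over all alternatives.
        if pattern.expected_success >= threshold:
            return pattern.typical_action
    return "deliberate"  # no recognized, workable option

repertoire = [
    Pattern(frozenset({"ambush", "narrow road"}), "dismount and flank", 0.7),
    Pattern(frozenset({"ambush", "open terrain"}), "break contact", 0.8),
]

action = rpd_decide(frozenset({"ambush", "narrow road"}), repertoire)
print(action)  # -> dismount and flank
```

Note that, like the RPD model itself, the sketch predicts which strategy fires (recognition versus deliberation) and returns the retrieved action, but says nothing about where the pattern repertoire comes from; that content would have to be added, as the computational efforts discussed below attempt to do.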
Nevertheless, attempts to predict or model actual choices are likely to benefit from taking the decision strategy as a starting point. A pure content model that ignores process (the strategy employed) may not prove useful in describing the actual decision process of experienced leaders or in helping less experienced leaders to make better decisions. One approach to predicting decision choices is to build on the RPD structure and add, for example, pattern repertoire content. Several ongoing attempts to formulate computational versions of the RPD model are employing this approach to predicting the decision as well as the strategy.

REFERENCES

Apperly, I.A., E. Back, D. Samson, and L. France. 2008. The cost of thinking about false beliefs: Evidence from adults' performance on a non-inferential theory of mind task. Cognition 106(3): 1093-1108.
Ariely, D. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, N.Y.: Harper.
Caria, A., R. Veit, R. Sitaram, M. Lotze, N. Weiskopf, W. Grodd, and N. Birbaumer. 2007. Regulation of anterior insular cortex activity using real-time fMRI. NeuroImage 35(3): 1238-1246.
Costa, P.T., and R.R. McCrae. 1985. NEO-PI Professional Manual. Lutz, Fla.: Psychological Assessment Resources, Inc.
Cox, R.W., A. Jesmanowicz, and J.S. Hyde. 1995. Real-time functional magnetic resonance imaging. Magnetic Resonance in Medicine 33(2): 230-236.
De Houwer, J. 2002. The implicit association test as a tool for studying dysfunctional associations in psychopathology: Strengths and limitations. Journal of Behavior Therapy and Experimental Psychiatry 33(2): 115-133.
deCharms, R.C. 2007. Reading and controlling human brain activation using real-time functional magnetic resonance imaging. Trends in Cognitive Sciences 11(11): 473-481.
Dijksterhuis, A., and L.F. Nordgren. 2006. A theory of unconscious thought. Perspectives on Psychological Science 1(2): 95-109.
Elliott, R., D. Jobber, and J. Sharp. 1995. Using the theory of reasoned action to understand organisational behaviour: The role of belief salience. British Journal of Social Psychology 34(2): 161-172.
Evans, J.St.B.T., D.E. Over, and K.I. Manktelow. 1993. Reasoning, decision making and rationality. Cognition 49(1-2): 165-187.
Fitzgerald, P.B., T.L. Brown, and Z.J. Daskalakis. 2002. The application of transcranial magnetic stimulation in psychiatry and neurosciences research. Acta Psychiatrica Scandinavica 105(5): 324-340.
Fox, C.R., and J.R. Irwin. 1998. The role of context in the communication of uncertain beliefs. Basic and Applied Social Psychology 20(1): 57-70.
Fox, C.R., and J. Levav. 2000. Familiarity bias and belief reversal in relative likelihood judgment. Organizational Behavior and Human Decision Processes 82(2): 268-292.
Frith, U., and C.D. Frith. 2003. Development and neurophysiology of mentalizing. Philosophical Transactions of the Royal Society B: Biological Sciences 358(1431): 459-473.
Fugelsang, J.A., and V.A. Thompson. 2000. Strategy selection in causal reasoning: When beliefs and covariation collide. Canadian Journal of Experimental Psychology 54(1): 15-32.
Gallagher, H.L., and C.D. Frith. 2003. Functional imaging of "theory of mind." Trends in Cognitive Sciences 7(2): 77-83.
Gawronski, B. 2002. What does the implicit association test measure? A test of the convergent and discriminant validity of prejudice-related IATs. Experimental Psychology 49(3): 171-180.
Gladwell, M. 2005. Blink: The Power of Thinking Without Thinking. New York, N.Y.: Little, Brown and Co.
Glimcher, P.W., M.C. Dorris, and H.M. Bayer. 2005. Physiological utility theory and the neuroeconomics of choice. Games and Economic Behavior 52(2): 213-256.
Glimcher, P.W., J. Kable, and K. Louie. 2007. Neuroeconomic studies of impulsivity: Now or just as soon as possible? American Economic Review 97(2): 142-147.
Goel, V., and R.J. Dolan. 2003. Explaining modulation of reasoning by belief. Cognition 87(1): B11-B22.
Greenwald, A.G., D.E. McGhee, and J.L.K. Schwartz. 1998. Measuring individual differences in implicit cognition: The implicit association test. Journal of Personality and Social Psychology 74(6): 1464-1480.
Grèzes, J., C.D. Frith, and R.E. Passingham. 2004. Inferring false beliefs from the actions of oneself and others: An fMRI study. NeuroImage 21(2): 744-750.
Hancock, P.A. 2007. Cold calculation or intuitive instinct: Solving the secrets of sustained survival. Pp. 197-206 in The Psychology of Survivor: Overanalyze, Overemote, Overcompensate: Leading Psychologists Take an Unauthorized Look at the Most Elaborate Psychological Experiment Ever Conducted—Survivor! R.J. Gerrig, ed. Dallas, Tex.: BenBella Books.
Happé, F. 2003. Theory of mind and the self. Annals of the New York Academy of Sciences 1001: 134-144.
Harris, S., S.A. Sheth, and M.S. Cohen. 2007. Functional neuroimaging of belief, disbelief, and uncertainty. Annals of Neurology 63(2): 141-147.
Hertwig, R., G. Gigerenzer, and U. Hoffrage. 1997. The reiteration effect in hindsight bias. Psychological Review 104(1): 194-202.
Kable, J.W., and P.W. Glimcher. 2007. The neural correlates of subjective value during intertemporal choice. Nature Neuroscience 10(12): 1625-1633.
Kahneman, D., and A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2): 263-291.
Kahneman, D., and A. Tversky, eds. 2000. Choices, Values and Frames. New York, N.Y.: Russell Sage Foundation.
Kelley, W.M., C.N. Macrae, C.L. Wyland, S. Caglar, S. Inati, and T.F. Heatherton. 2002. Finding the self? An event-related fMRI study. Journal of Cognitive Neuroscience 14(5): 785-794.
Klein, G.A. 1989. Recognition-primed decisions. Advances in Man-Machine Systems Research 5: 47-92.
Klein, G.A. 1998. Sources of Power: How People Make Decisions. Cambridge, Mass.: MIT Press.
Klein, G. 2008. Naturalistic decision making. Human Factors 50(3): 456-460.
Klein, G.A., and D. MacGregor. 1987. Knowledge Elicitation of Recognition-Primed Decision Making. Yellow Springs, Ohio: Klein Associates, Inc.
Klein, G., S. Wolf, L. Militello, and C. Zsambok. 1995. Characteristics of skilled option generation in chess. Organizational Behavior and Human Decision Processes 62(1): 63-69.
Mellers, B.A., A. Schwartz, and A.D.J. Cooke. 1998. Judgment and decision making. Annual Review of Psychology 49: 447-477.
Mitchell, J.P. 2008. Activity in right temporo-parietal junction is not selective for theory-of-mind. Cerebral Cortex 18(2): 262-271.
Phelps, E.A. 2002. Issues in the cognitive neuroscience of emotion. P. 539 in Cognitive Neuroscience: The Biology of Mind, 2nd ed. M.S. Gazzaniga, R.B. Ivry, and G.R. Mangun, eds. New York, N.Y.: Norton.
Plous, S. 1993. The Psychology of Judgment and Decision Making. New York, N.Y.: McGraw-Hill.
Samson, D., I.A. Apperly, C. Chiavarino, and G.W. Humphreys. 2004. Left temporoparietal junction is necessary for representing someone else's belief. Nature Neuroscience 7(5): 499-500.
Scales, R.H. 2006. Clausewitz and World War IV. Available online at http://www.afji.com/2006/07/1866019/. Last accessed July 23, 2008.
Shattuck, L.G., J.L. Merlo, and J. Graham. 2001. Cognitive integration: Exploring performance differences across varying types of military operations. In Proceedings from the Fifth Annual Federated Laboratory Symposium on Advanced Displays and Interactive Displays. College Park, Md., March 20-22.
Slovic, P., and S. Lichtenstein. 1968. The relative importance of probabilities and payoffs in risk taking. Journal of Experimental Psychology Monographs 78(3 Pt. 2): 1-18.
Taylor, M., A. Miller, L. Mills, E. Potterat, G. Padilla, and R. Hoffman. 2006. Predictors of Success in Basic Underwater Demolition/SEAL Training. Part I: What Do We Know and Where Do We Go from Here? Naval Health Research Center Technical Report No. 06-37. Springfield, Va.: National Technical Information Service.
Taylor, M., G. Larson, A. Miller, L. Mills, E. Potterat, J. Reis, G. Padilla, and R. Hoffman. 2007. Predictors of Success in Basic Underwater Demolition/SEAL Training. Part II: A Mixed Quantitative and Qualitative Study. Naval Health Research Center Technical Report No. 07-10. Springfield, Va.: National Technical Information Service.
Tom, S.M., C.R. Fox, C. Trepel, and R.A. Poldrack. 2007. The neural basis of loss aversion in decision making under risk. Science 315(5811): 515-518.
Tversky, A., and D. Kahneman. 1971. Belief in the law of small numbers. Psychological Bulletin 76(2): 105-110.
von Clausewitz, C. 1976. On War. M.E. Howard and P. Paret, eds. Princeton, N.J.: Princeton University Press.
Wang, Y.W., Y. Liu, Y.X. Gao, J. Chen, W. Zhang, and C.D. Lin. 2008. False belief reasoning in the brain: An ERP study. Science in China Series C: Life Sciences 51(1): 72-79.
Weber, E.U., A.-R. Blais, and N.E. Betz. 2002. A domain-specific risk-attitude scale: Measuring risk perceptions and risk behaviors. Journal of Behavioral Decision Making 15(4): 263-290.
Wertz, A.E., and T.C. German. 2007. Belief–desire reasoning in the explanation of behavior: Do actions speak louder than words? Cognition 105(1): 184-194.
Wicker, B., P. Ruby, J.-P. Royet, and P. Fonlupt. 2003. A relation between rest and the self in the brain? Brain Research Reviews 43(2): 224-230.
Wu, G., and R. Gonzalez. 1998. Common consequence effects in decision making under risk. Journal of Risk and Uncertainty 16(1): 115-139.
Yoshida, W., and S. Ishii. 2006. Resolution of uncertainty in prefrontal cortex. Neuron 50(5): 781-789.
Young, L., F. Cushman, M. Hauser, and R. Saxe. 2007. The neural basis of the interaction between theory of mind and moral judgment. Proceedings of the National Academy of Sciences of the United States of America 104(20): 8235-8240.
Young, L., and R. Saxe. 2008. The neural basis of belief encoding and integration in moral judgment. NeuroImage 40(4): 1912-1920.
Zajonc, R.B., R. Crandall, and R.V. Kail. 1974. Effect of extreme exposure frequencies on different affective ratings of stimuli. Perceptual and Motor Skills 38(2): 667-678.