
rationality and claimed that people were “intuitive statisticians” and “intuitive scientists.” The heuristics and biases research program changed that perspective; current views that incorporate research on emotion, culture, and the unconscious have changed it yet again.

Heuristics and Biases Approach

Since at least the 1970s, psychologists and decision theorists have been documenting the many fallibilities and “irrationalities” in individual judgment. Countless examples show that people do not reason according to the rules of logic and probability, that we fail to recognize missing information, and that we are overconfident in our judgments. That list is just a small sample of what was discovered by the “Heuristics and Biases Program” (for an anthology of the classic works, see Kahneman et al., 1982; for a more recent update, see Gilovich et al., 2002). Lists of reasoning fallacies can be found in many places and, indeed, Heuer’s (1999) classic work, Psychology of Intelligence Analysis, was an attempt to interpret those findings with respect to the intelligence analyst. Among the better known irrationalities are the availability and representativeness heuristics and the hindsight and overconfidence biases (all discussed below). However, cataloging fallacies is of limited use on its own: more are likely to be found, and attempts to “repair” one such leak may cause others to emerge. To better predict, explain, and perhaps remedy such irrationalities, it is useful to know when, why, and how they arise.

Perhaps the most important thing to know about reasoning errors is that the errors are not random. That observation (Tversky and Kahneman, 1974)—that the errors are systematic (or, as in Ariely’s 2008 clever book title, Predictably Irrational)—is what makes such errors interesting, informative, and sometimes treatable. If such irrationalities are built into our reasoning, what in our cognitive system causes them?

Some theorists argued that many of the “irrationalities” were merely laboratory artifacts—specific to how the problems were presented and to the populations routinely used in such studies. Indeed, some errors can be reduced by minor changes in presentation (e.g., when information is given in frequency rather than probability format, or when people see that a random generator was at work; e.g., Gigerenzer and Hoffrage, 1995). However, most errors cannot be made to disappear, and most are (1) present in experts as well as novices and (2) resistant to debiasing attempts.

The National Academies of Sciences, Engineering, and Medicine

Copyright © National Academy of Sciences. All rights reserved.