Closed Ignorance

Closed ignorance can affect individuals and organizations alike. One form of closed ignorance stems from individuals and groups that are averse to recognizing possible disruptions as they pursue their goals or objectives. This ignorance can be partially countered by opening the system to outside opinion. Gerard Tellis of the University of Southern California has conducted several studies of disruption and market incumbents. Tellis (2006) argues that

the disruption of incumbents—if and when it occurs—is due not to technological innovation per se but rather to incumbents’ lack of vision of the mass market and an unwillingness to [redirect] assets to serve that market.

Faber and colleagues suggested that closed ignorance arises from “false knowledge or false judgments” (Faber et al., 1992b). Such false knowledge may result from overreliance on too few perspectives or observations. Key decision makers of the persistent forecasting system should be made aware of closed ignorance at the outset and should put in place a set of processes to mitigate this form of bias. Further, these bias-mitigation processes should be evaluated periodically by both a self-audit and a third-party assessment. The composition of the decision-making group should be included in this review process.

One method of overcoming forecasting bias due to closed ignorance is to increase the diversity of the participants in the forecast. This can be accomplished by creating and implementing a Web-based forecasting system designed for global participation. Another approach is to incorporate forecasting activities such as workshops, surveys, and studies from other countries. The more diverse the participants, the more likely it is that all perspectives, including many of the outliers, will be captured.
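If a quantitative check on participant diversity is wanted, one simple option (a sketch, not a method specified in this report) is to measure the entropy of the participant pool across self-reported attributes; low entropy on an attribute signals that a single group dominates. The Python sketch below uses hypothetical attribute names and invented data.

    import math
    from collections import Counter

    def shannon_entropy(values):
        """Entropy (in bits) of the distribution of one attribute."""
        counts = Counter(values)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Hypothetical participant records; the attributes tracked would be
    # chosen by the system operators.
    participants = [
        {"region": "North America", "discipline": "engineering"},
        {"region": "East Asia",     "discipline": "economics"},
        {"region": "Europe",        "discipline": "biology"},
        {"region": "North America", "discipline": "engineering"},
    ]

    for attribute in ("region", "discipline"):
        h = shannon_entropy(p[attribute] for p in participants)
        print(f"{attribute}: {h:.2f} bits")

Higher entropy indicates a more even spread of participants across an attribute; an operator might use such a measure to flag attributes on which recruiting has been too narrow.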

Open Ignorance

Open ignorance assumes that key stakeholders of the persistent forecasting system are willing to admit to what they don’t know. Costanza and colleagues build on Faber’s work to suggest that there are four main sources of surprise that result from open ignorance (Costanza et al., 1992):

  • Personal ignorance,

  • Communal ignorance,

  • Novelty ignorance, and

  • Complexity ignorance.

Personal Ignorance

Personal ignorance results from lack of knowledge or awareness on the part of an individual. The impact of personal bias on a forecast can be mitigated by incorporating multiple perspectives during both data gathering and data analysis—that is, at every stage of the persistent forecasting system process, including during idea generation, monitoring and assessment, escalation, and review. Converging these multiple perspectives could be dangerous, however, owing to the tendency to develop a consensus view instead of a diversity of views. Gaining an understanding of a more diverse set of viewpoints helps reduce personal ignorance.
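To make the convergence risk concrete, the sketch below (illustrative only, with invented numbers) shows how a simple consensus average can erase an outlier judgment, whereas reporting the spread of estimates preserves it.

    import statistics

    # Hypothetical expert estimates of years until a technology becomes
    # disruptive; one outlier expects near-term disruption.
    estimates = [12, 11, 13, 12, 2]

    consensus = statistics.mean(estimates)    # 10.0 -- the outlier disappears
    spread = (min(estimates), statistics.median(estimates), max(estimates))

    print(f"consensus mean: {consensus:.1f} years")
    print(f"min / median / max: {spread}")

Retaining the full range of estimates, rather than a single consensus number, keeps the outlier view visible for further monitoring.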

According to Karan Sharma of the Artificial Intelligence Center at the University of Georgia, “each concept must be represented from the perspective of other concepts in the knowledge base. A concept should have representation from the perspective of multiple other concepts” (Sharma, 2008, p. 426). Sharma also cites a number of studies that discuss the role of multiple perspectives in human and machine processes (Sharma, 2008).
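A minimal data-structure sketch of this idea (not drawn from Sharma's implementation; the concepts and wording are invented for illustration) is to store, for each concept, separate representations keyed by the perspective concept.

    # Each concept is described from the viewpoint of several other concepts
    # rather than by a single canonical definition.
    knowledge_base = {
        "battery": {
            "vehicle": "energy store that limits range and charging time",
            "grid":    "buffer that absorbs intermittent renewable supply",
            "chemist": "electrochemical cell defined by its electrode pair",
        },
    }

    def describe(concept, perspective):
        """Return the representation of a concept as seen from a perspective."""
        return knowledge_base[concept].get(perspective, "no representation yet")

    print(describe("battery", "grid"))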

The concept of multiple perspectives is also embedded in, or serves as a guiding principle for, many of the forecasting and analytical processes discussed elsewhere in this report, including scenario planning, stakeholder analysis, and morphological analysis. When groups, such as workshop participants, are assembled to gather or interpret data, system operators should strive to create a set of participants that is diverse in the following characteristics:


