
4
Reducing Forecasting Ignorance and Bias

INTRODUCTION

In imagining an ideal disruptive technology forecasting system, the committee considered the potential negative impacts of individual bias and forecasting bias on forecasts to be a key concern.1 While no data source can be assumed to be free of biases, balancing sources of information can help to reduce overall forecasting bias. For example, forecasters who rely primarily or exclusively on Western experts and data sources run a higher risk of producing an unbalanced and biased forecast. Such forecasts can create planning blind spots resulting from cultural mirroring2 and false assumptions. A biased forecast gives an incomplete view of potential futures and increases the probability that the user will be unprepared for a future disruptive act or event.

A 1992 article by Faber and colleagues introduced “ignorance,” or a lack of information, as another source of surprise in addition to the traditional economic concepts of risk and uncertainty (Faber et al., 1992a). Based on Faber’s article, the committee chose to distinguish these three terms in the following way:

  • Risk. Occurs when the probabilities of outcomes are thought to be known;

  • Uncertainty. Occurs when the outcomes are known (or predicted) but the probabilities are not; and

  • Ignorance. Occurs when the outcomes are not known (or predicted).
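To make these distinctions concrete, the following minimal Python sketch (an illustration only, not part of the committee's system; the names Outcome and classify are hypothetical) tags a forecasting situation as risk, uncertainty, or ignorance depending on whether outcomes have been enumerated and whether probabilities are attached:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Outcome:
        description: str
        probability: Optional[float] = None  # None means the probability is not known

    def classify(outcomes: List[Outcome], all_outcomes_enumerated: bool) -> str:
        """Label a forecasting situation per the risk/uncertainty/ignorance distinction."""
        if not all_outcomes_enumerated:
            return "ignorance"    # some outcomes are not known or predicted
        if all(o.probability is not None for o in outcomes):
            return "risk"         # outcomes and their probabilities thought to be known
        return "uncertainty"      # outcomes known (or predicted), probabilities not

    # Example: two outcomes predicted, probabilities unknown, list believed complete.
    print(classify([Outcome("rapid adoption"), Outcome("rejection")], True))  # -> uncertainty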

IGNORANCE

Ignorance is a significant source of forecasting bias, which in turn causes forecasting failures and disruptive surprises. According to Faber et al., ignorance can be categorized as either closed or open, and both types can be a key source of surprise (Faber et al., 1992a). Closed ignorance occurs when key stakeholders are either unwilling or unable to consider or recognize that some outcomes are unknown. In this case, the stakeholders have no knowledge of their own ignorance. Conversely, open ignorance occurs when the key stakeholders know what it is that they don’t know.

1. For purposes of this report, the committee defines “individual bias” as a prejudice held by a person and “forecasting bias” as incompleteness in the data sets or methodologies used in a forecasting system.

2. Cultural mirroring, also known as mirror imaging, is the assumption that one’s beliefs and values are held by everyone else.


Closed Ignorance

Closed ignorance can affect individuals and organizations alike. One form of closed ignorance stems from individuals and groups that are averse to recognizing possible disruptions as they pursue their goals or objectives. This ignorance can be partially countered by opening the system to outside opinion. Gerard Tellis of the University of Southern California has conducted several studies involving disruption and market incumbents. Tellis (2006) argues that

the disruption of incumbents—if and when it occurs—is due not to technological innovation per se but rather to incumbents’ lack of vision of the mass market and an unwillingness to [redirect] assets to serve that market.

Faber and colleagues suggested that closed ignorance occurs due to “false knowledge or false judgments” (Faber et al., 1992b). Such false knowledge may result from overreliance on too few perspectives or observations. Key decision makers of the persistent forecasting system should be made aware of closed ignorance at the outset and put in place a set of processes to mitigate this form of bias. Further, these bias mitigation processes should be evaluated on a periodic basis by both a self-audit and a third-party assessment. The composition of the decision-making group should be included in this review process.

One method of overcoming forecasting bias due to closed ignorance is to increase the diversity of the participants in the forecast. This can be accomplished by creating and implementing a Web-based forecasting system designed for global participation. Another approach is to incorporate forecasting activities such as workshops, surveys, and studies from other countries. The more diverse the participants, the more likely it is that all perspectives, including many of the outliers, will be captured.

Open Ignorance

Open ignorance assumes that key stakeholders of the persistent forecasting system are willing to admit to what they don’t know. Costanza and colleagues build on Faber’s work to suggest that there are four main sources of surprise that result from open ignorance (Costanza et al., 1992):

  • Personal ignorance,

  • Communal ignorance,

  • Novelty ignorance, and

  • Complexity ignorance.

Personal Ignorance

Personal ignorance results from lack of knowledge or awareness on the part of an individual. The impact of personal bias on a forecast can be mitigated by incorporating multiple perspectives during both data gathering and data analysis—that is, at every stage of the persistent forecasting system process, including during idea generation, monitoring and assessment, escalation, and review. Converging these multiple perspectives could be dangerous, however, owing to the tendency to develop a consensus view instead of a diversity of views. Gaining an understanding of a more diverse set of viewpoints helps reduce personal ignorance.

According to Karan Sharma of the Artificial Intelligence Center at the University of Georgia, “each concept must be represented from the perspective of other concepts in the knowledge base. A concept should have representation from the perspective of multiple other concepts” (Sharma, 2008, p. 426). Sharma also cites a number of studies that discuss the role of multiple perspectives in human and machine processes (Sharma, 2008).

The concept of multiple perspectives is also embedded in many of the forecasting and analytical processes discussed elsewhere in this report or is a guiding principle for them, including scenario planning, stakeholder analysis, and morphological analysis. When groups are assembled to gather or interpret data, such as for a workshop, system operators should strive to create a set of participants that is diverse in the following characteristics:


age, wealth, education, career path, scientific specialization, culture, religion, country, language, economic philosophy, and political perspective.
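As one way to operationalize this goal, the sketch below (a hypothetical illustration, not a committee recommendation) scores the diversity of a participant pool along a single attribute using normalized Shannon entropy, where 0.0 means all participants are identical on that attribute and 1.0 means they are spread evenly across categories:

    import math
    from collections import Counter

    def diversity_score(values):
        """Normalized Shannon entropy of one participant attribute (0.0 to 1.0)."""
        counts = Counter(values)
        n = len(values)
        if n == 0 or len(counts) < 2:
            return 0.0
        entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
        return entropy / math.log(len(counts))  # divide by the maximum possible entropy

    # Illustrative pool: country of each workshop participant.
    countries = ["US", "US", "US", "CN", "IN", "BR", "DE"]
    print(f"country diversity: {diversity_score(countries):.2f}")

The same scoring could be repeated for each attribute listed above (age bracket, specialization, language, and so on) to flag dimensions along which the pool is unbalanced.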

Communal Ignorance

A technology may not be immediately recognized by a group or community as disruptive for a number of reasons, including an early judgment that it is not likely to be successful, an initially slow rate of adoption, or a lack of imagination. Numerous academic papers propose lack of imagination as the primary reason that disruptions catch communities by surprise. To various degrees, lack of imagination contributes to most forms of bias or ignorance but appears to be particularly acute in groups or communities that by definition may assemble because of similar viewpoints and accordingly may be less willing to consider others’ views. Communal ignorance may also stem from a simple lack of knowledge. According to Faber and colleagues, another cause of communal ignorance is that “there is no information available to society concerning this event. By research, however, it would be possible to obtain this information” (Faber et al., 1992b, p. 85). According to the Aspen Global Change Institute’s Elements of Change report, communal ignorance can be overcome through the acquisition of new knowledge achieved “through research, broadly within existing scientific concepts, ideas, and disciplines” (Schneider and Turner, 1995, p. 8).

Many forecasts are generated by a relatively small group of similar individuals (e.g., of the same age group, educational background, culture, or native language). A persistent forecasting system should reduce communal ignorance by including a broader set of communities and viewpoints, for example, through an open system that encourages global participation. With the advent of the Internet, it is now easy to create Web-based systems that allow individuals anywhere to collaborate on virtually any topic at any time. By leveraging communities of interest and public domain sources of information, open collaboration systems may be used to envision a broader range of possible disruptions.

The persistent forecasting system should utilize processes such as scenario methods and gaming to “imagine the unimaginable” and develop multiple views of potential futures in areas identified as key priorities. Importantly, these techniques must encourage and capture fringe or extreme thoughts from individuals who might be expected to come up with early signals of potential disruptions.

When participation from individuals or groups representing certain viewpoints is insufficient, system designers will need to find ways to encourage greater participation. If the sources of such viewpoints are not available or accessible, proxies may need to be created to replicate the viewpoints. Red teaming and adversary simulations are time-tested methods of creating proxies.

Novelty Ignorance

Jesus Ramos-Martin suggests that novelty ignorance can stem from the inability to anticipate and prepare for external factors (shocks) or internal factors such as “changes in preferences, technologies, or institutions” (Ramos-Martin, 2003, p. 7). Natural disasters and resource crises such as limited water, energy, or food are examples of external shocks that might cause novelty ignorance.

While it is difficult, if not impossible, to forecast the exact timing of external shocks, decision makers can benefit from the simulation and gaming of alternative futures to gain better insight into the impact of various shocks under different scenarios. These insights can be used to mitigate the impact of surprise by encouraging the allocation of resources before the surprise occurs.
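A minimal sketch of such gaming, under assumed (hypothetical) shock probabilities and impact values, is shown below; it uses Monte Carlo sampling over a planning horizon to compare the expected impact of a shock with and without resources allocated in advance:

    import random

    # scenario name: (annual shock probability, impact if unprepared, impact if prepared)
    SCENARIOS = {
        "energy crisis":  (0.04, 100.0, 55.0),   # illustrative numbers only
        "water shortage": (0.02,  80.0, 40.0),
    }

    def expected_impact(p_annual, impact, horizon_years=20, runs=10_000):
        """Average impact over many simulated futures; the shock may or may not occur."""
        hits = sum(
            any(random.random() < p_annual for _ in range(horizon_years))
            for _ in range(runs)
        )
        return impact * hits / runs

    for name, (p, unprepared, prepared) in SCENARIOS.items():
        print(name,
              f"unprepared: {expected_impact(p, unprepared):.1f}",
              f"prepared: {expected_impact(p, prepared):.1f}")

Runs like this do not predict when a shock will occur; they only make the relative payoff of advance resource allocation visible to decision makers.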

Complexity Ignorance

Surprise may also occur when information is available but the tools for analyzing it are insufficient. As a result, interrelationships, hidden dependencies, feedback loops, and other factors that affect system stability may remain hidden. This special type of challenge is called complexity ignorance.

Our world comprises many complex adaptive systems (CASs), such as those found in nature, financial markets, and society at large. While personal and communal ignorance can be mitigated, ignorance coming from


a failure to understand or model complex systems is more difficult to deal with. It is therefore worthwhile to examine complexity ignorance in more detail. According to John Holland, a member of the Center for the Study of Complex Systems at the University of Michigan, a CAS is

a dynamic network of many agents (which may represent cells, species, individuals, firms, nations) acting in parallel, constantly acting and reacting to what the other agents are doing. The control of a CAS tends to be highly dispersed and decentralized. If there is to be any coherent behavior in the system, it has to arise from competition and cooperation among the agents themselves. The overall behavior of the system is the result of a huge number of decisions made every moment by many individual agents. (Waldrop, 1992)3

Kevin Dooley (1996) elaborates further:

A CAS behaves/evolves according to three key principles: (i) order is emergent as opposed to pre-determined; (ii) the system’s history is irreversible;4 and (iii) the system’s future is often unpredictable.5

The University of Michigan’s Center for the Study of Complex Systems describes CASs as follows:

In a complex system the agents are usually numerous, diverse and dynamic. They are intelligent but not perfect decision makers. They learn and adapt in response to feedback from their activities. They interact in structured ways, often forming organizations to carry out their tasks. They operate in a dynamic world that is rarely in equilibrium and often in chaos.6

Many of the systems we live in—natural, financial, and social—can be described as CASs. Complexity ignorance arises because the world currently lacks, and may never have, the computational tools necessary to precisely model the behavior of all of the individual agents and forces or the behavior of the system as a whole. Further, complexity theory suggests that the agents do not always act rationally. Even if the agents were to act rationally, the system itself might show irrational behavior. Finally, it is extraordinarily difficult to determine cause and effect, or secondary- and tertiary-level interrelationships and dependencies. While the committee recognizes the inability to precisely model complexity, it does believe that, at a minimum, several major complex systems should be tracked in order to discover changes in macro effects. There are an increasing number of tools available that can be used to deepen our understanding of complex systems and garner early warnings of potential forces of disruption.
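As a minimal illustration of why such tracking is feasible even when precise modeling is not, the following hypothetical agent-based sketch simulates a simple technology-adoption CAS: each agent follows a trivial local rule, yet the macro adoption curve (the kind of aggregate indicator the committee suggests monitoring) emerges from the interactions and shifts sharply with small parameter changes:

    import random

    def run(n_agents=500, threshold=0.3, contacts=5, steps=40, seed_frac=0.02):
        """Agents adopt when at least `threshold` of randomly met peers have adopted."""
        adopted = [random.random() < seed_frac for _ in range(n_agents)]
        curve = []
        for _ in range(steps):
            nxt = adopted[:]
            for i in range(n_agents):
                if not adopted[i]:
                    peers = random.sample(range(n_agents), contacts)
                    if sum(adopted[j] for j in peers) / contacts >= threshold:
                        nxt[i] = True
            adopted = nxt
            curve.append(sum(adopted) / n_agents)  # macro effect worth tracking
        return curve

    print([round(x, 2) for x in run()[::10]])  # adoption fraction every 10 steps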

Summary of Ignorance Mitigation Methods

Table 4-1 summarizes the forms of ignorance and the committee’s suggested methods for mitigation.

BIAS

Whether one is assembling groups of people for data gathering (through workshops, scenarios, games, etc.) or collecting data from other sources, it is critical that system operators understand the main sources of individual and forecasting bias. Because ignorance is a key cause of both forecasting and individual bias, the recommended methods for reducing ignorance are essential tools for bias mitigation. The preceding section dealt with methods to mitigate human ignorance and surprise. This section identifies several potential sources of bias and discusses their impact on a forecast.

A key question to ask is which individual, group, or region would benefit most or be hurt most by the disruptive technologies being forecasted.

3. Available at http://en.wikipedia.org/wiki/Complex_adaptive_system. Last accessed October 23, 2008.

4. In Wolfram’s A New Kind of Science, this concept is also known as computational irreducibility (2002).

5. Available at http://en.wikipedia.org/wiki/Complex_adaptive_system. Last accessed November 19, 2009.

6. Available at http://www.cscs.umich.edu/about/complexity.html. Last accessed October 23, 2008.


TABLE 4-1 Forms of Ignorance and Methods of Mitigation

Closed ignorance
  Description: Information is available but forecasters are unwilling or unable to consider that some outcomes are unknown to the forecaster.^a
  Methods of mitigation: Self-audit process, regular third-party audits, and open and transparent system with global participation.

Open ignorance
  Description: Information is available and forecasters are willing to recognize and consider that some outcomes are unknown.

  Personal
    Description: Surprise occurs because an individual forecaster lacks knowledge or awareness of the available information.
    Methods of mitigation: Explore multiple perspectives from a diverse set of individuals and data sources for data gathering and analysis.

  Communal
    Description: Surprise occurs because a group of forecasters has only similar viewpoints represented or may be less willing to consider the views of forecasters outside the community.
    Methods of mitigation: An open and transparent platform that includes viewpoints, data, and assets from a broader set of communities; “vision-widening” exercises such as gaming, scenarios, and workshops; creation of proxies representing extreme perspectives.

  Novelty
    Description: Surprise occurs because the forecasters are unable to anticipate and prepare for external shocks or internal changes in preferences, technologies, or institutions.
    Methods of mitigation: Simulating impacts and gaming alternative future outcomes of various potential shocks under different conditions.

  Complexity
    Description: Surprise occurs when inadequate forecasting tools are used to analyze the available data, resulting in interrelationships, hidden dependencies, feedback loops, and other negative factors that lead to inadequate or incomplete understanding of the data.
    Methods of mitigation: Track changes and interrelationships of various systems (i.e., nature, financial markets, social trends) to discover potential macro-effect force changes.

^a Outcomes that are unknown are sometimes described as outcomes that are unpredictable in principle. One never could envisage them a priori because one cannot make even tentative predictions about the likely range of all possible outcomes. Philip Lawn, Toward Sustainable Development (Boca Raton, Fla.: CRC Press, 2000), p. 169.

To have a comprehensive system that is able to identify potential disruptions from all parts of the globe, it is essential that a wide range of cultures, industries, organizations, and individuals contribute to the data collection efforts. Forecasting bias occurs when a forecast relies too heavily on one perspective during data gathering, data analysis, or forecast generation.

A broad assessment of the demographics of potential participants and data sources must be a critical design point of the persistent forecasting system. This assessment can be accomplished through appropriately designed questionnaires, interviews, and surveys of the participants as well as analysis of the sources of the data being used in the forecast. The goal of such an exercise is to achieve balance in a forecast.

To reduce bias, the system should encourage participants to be open about personal characteristics that could help identify potential biases. Special attention needs to be given to age and culture diversity.

Age Bias

One common individual bias is the assumption that future generations’ acceptance of new technologies will mirror that of today’s users. Examples include the rejection of virtual presence in favor of physical presence for social interaction, the preference for paper books over electronic books, and trust of expert-sourced rather than crowd-sourced information. Technologies that are not accepted by today’s users may be easily accepted by users 10 to 20 years from now.

When applied to forecasting, age bias can be counteracted by gathering sufficient input from the generations of scientists, entrepreneurs, and technologists who will most likely create the future disruptive technologies and


applications. According to Dean Simonton, the age of outstanding achievements for an individual appears to be highly contingent on the discipline, with peaks in the early 30s for fields such as lyric poetry, pure mathematics, and theoretical physics and in the later 40s and 50s for domains such as the writing of novels, history, philosophy, medicine, and general scholarship (Simonton, 1988).

Another individual age-related bias is the assumption that one generation’s view of the future will be the same as another generation’s. Research suggests that younger people are more future-oriented than older people (Carstensen et al., 1999; Fingerman and Perlmutter, 1995), perhaps because they perceive time as expansive or unlimited. They are motivated to acquire new knowledge about the social and physical world and to seek out novelty (for example, meeting new friends or expanding their social networks). They also tend to be more concerned about future possibilities. Conversely, older people may perceive time as limited and tend to be more focused on the present. It has been suggested that this leads them to have a smaller number of meaningful relationships, to work to sustain positive feelings, and to be less motivated to acquire new knowledge. In other words, they may be more concerned about savoring the present than about changing the future. Forecasting bias can occur in systems that are not sufficiently future- or youth-oriented.

Mitigating Age Bias

One approach to mitigating age bias is to consider the time horizon of the forecast and then seek out participation from projected users and creators of disruptive technologies. For example, if pioneers in medicine are typically in their 40s and 50s, it would be appropriate to seek input from postdoctoral researchers in their 30s when developing a 20-year disruptive technology forecast. A common mistake is to survey opinions of the future only from older and well-established experts in the field.

Another approach is to consider the technological environment that surrounds youth today to gain a better understanding of the acceptability of future technologies. For example, how will future generations of warfighters feel about the use of robots and drones as the principal form of warfare? While many of today’s warfighters might reject the notion of automated warfare, future warfighters (today’s youth) who are growing up with video games, the Internet, robotic toys, mobile smart devices, virtual presence, and social networks may have a completely different attitude.

Cultural Bias

The committee believes that cultural factors should be considered when assessing the quality of a data source or when analyzing the data. There is ample quantitative data showing that societies around the globe vary considerably in their values, beliefs, norms, and worldviews (Bond et al., 2004; Gelfand et al., 2007; Hofstede et al., 1990; House et al., 2004; Schwartz, 1994). Research has yielded metrics for the dimensions in which cultures vary. At the cultural level, these dimensions often reflect basic issues surrounding the regulation of human activity that all societies must confront—issues that are solved in different ways (Schwartz, 1994). Such variability must be taken into account when identifying potential disruptive technologies for a number of reasons:

  • Cultural differences can affect what is seen as disruptive.

  • Special incentives may be required to motivate individuals to discuss potential disruptive technologies.

  • Individuals from diverse cultures may feel more or less comfortable about communicating potential disruptions depending on the means of data gathering.

Because cultures vary widely with respect to their values, beliefs, worldviews, resources, motivations, and capabilities, the sampling must be as wide as possible. Within and across societies, it is essential to capture variation in age, socioeconomic status, gender, religion, population density, experience, and industry. Disruptive technologies can occur anywhere, at any time, and for any demographic. Accordingly, the system must collect wide and diverse data, be capable of supporting multiple languages, provide adequate and appropriate incentives, and employ multiple methodologies.


It is worth noting the counterargument to this—namely, that globalization is making the world more homogeneous, obviating the need for concern about cultural differences. Already, skeptics argue that youth in many countries—from the United States to Japan to Zimbabwe—are all eating Big Macs, drinking Coca-Cola, and wearing Levi’s, causing a homogenization of world culture. As noted by Huntington, this argument is missing the essence of culture, which includes at the most basic level deeply rooted assumptions, beliefs, and values (Huntington, 1996; Triandis, 1972). Huntington also notes that “non-Western societies can modernize and have modernized without abandoning their own cultures and adopting wholesale Western values, institutions, and practices” (Huntington, 1996, p. 78). Some even argue that cultural identity is on the rise with the end of the superpower divide and the consequent emergence of age-old animosities (Huntington, 1996). Moreover, cross-cultural conflicts are pervasive throughout the world, and the anger and shame that result from these conflicts can even instigate development of disruptive technologies. Culturally distinct contexts therefore are important to recognize and assess. In all, the argument that cultural differences are no longer important (or will cease to be important) in the study of disruptive technologies is not tenable.

Mitigating Cultural Bias

Because cultural differences have been demonstrated to have a pervasive effect on human cognition, motivation, emotion, and behavior (Gelfand et al., 2007), their implications for an open, persistent forecasting system must be assessed and minimized.

First, as previously noted, the concept of a disruptive technology is complex, and adding a cultural facet to its definition makes it even more so. It must be remembered that cultural differences can affect not only the approach to developing a broad and inclusive system but can also change what is perceived as disruptive.

Second, different incentives may be needed to motivate participants from different cultures during the information-gathering process. For example, monetary incentives offered by strangers might be suitable in highly individualistic cultures (such as the United States, Australia, and many countries throughout Western Europe). However, even in these countries, where out-groups may be distrusted, it may be necessary to go through trusted social networks. For this reason, it is critical to develop networks of local collaborators around the globe to facilitate the information-gathering process.

Third, cultural differences in familiarity and comfort with the methodologies used to extract information may also bias the results. Cross-cultural psychology has documented numerous problems with gathering data that can affect the reliability and validity of the data collected (Gelfand et al., 2002; Triandis, 1983). Not all methodologies yield equivalent results across cultures; they will vary in the extent to which they are familiar, ethnically appropriate, reliable, and valid. Without taking these issues into account, the data and conclusions will be culturally biased.

Reducing Linguistic Bias

The language in which information is gathered can also bias the responses. For example, responses from Chinese study participants in Hong Kong differed widely depending on whether instructions were given in Mandarin, Cantonese, or English (Bond and Cheung, 1984). The authors of that study proposed that the respondents varied their answers according to who they thought was interested in the results—the Beijing authorities, the Hong Kong authorities, or the British authorities. Bennett (1977), too, found that bilingual persons gave more extreme answers in English than in their native language, and Marin and colleagues (1983) showed that bilingual individuals provided more socially desirable answers in English (presumably because they were communicating to outsiders). These studies demonstrate the role that language plays in communicating the purpose of the study to people in different cultures. When surveying individuals across cultures, it is critical to consider the implications of language choice and make decisions based on input from local people. The committee believes that a disruptive technology forecasting system should not be limited to English, and participants should be able to express themselves and respond in their native language.
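One simple, hypothetical check that a forecasting system could run before pooling multilingual responses is sketched below: it compares response extremity (distance from the scale midpoint) by language of administration, since findings such as Bennett’s suggest that answers given in English may be systematically more extreme; a large gap would flag a possible linguistic bias to investigate before analysis:

    from statistics import mean

    MIDPOINT = 3.0  # midpoint of a 1-to-5 rating scale

    responses_by_language = {     # illustrative data, not from the report
        "English":  [5, 5, 1, 4, 5, 2],
        "Mandarin": [4, 3, 3, 4, 2, 3],
    }

    for language, scores in responses_by_language.items():
        extremity = mean(abs(s - MIDPOINT) for s in scores)
        print(f"{language}: mean response extremity = {extremity:.2f}")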


CONCLUSION

This chapter introduced the concepts of individual bias and forecasting bias, the ignorance that leads to these two forms of bias, and their effects on the validity of a forecast. Technology forecasts often suffer from bias due to inadequacies in the method of forecasting, the source of the data, or the makeup of those who develop the method. While some bias may be unavoidable, much of it can be identified and mitigated by developing a broad and inclusive forecasting system. The committee believes that the mitigation of forecasting bias requires periodic audits by internal and external evaluators to ensure the diversity of participants and data sources as well as the robustness of the forecasting process.

REFERENCES

Bennett, M. 1977. Response characteristics of bilingual managers to organizational questionnaires. Personnel Psychology 30: 29-36.

Bond, M.H., and M.K. Cheung. 1984. Experimenter language choice and ethnic affirmation by Chinese trilinguals in Hong Kong. International Journal of Intercultural Relations 8: 347-356.

Bond, Michael H., Kwok Leung, Al Au, Kwok-Kit Tong, and Zoë Chemonges-Nielson. 2004. Combining social axioms with values in predicting social behaviours. European Journal of Personality 18: 177-191.

Carstensen, Laura L., Derek M. Isaacowitz, and Susan T. Charles. 1999. Taking time seriously: A theory of socioemotional selectivity. American Psychologist 54(3): 165-181.

Costanza, Robert, Bryan G. Norton, and Benjamin D. Haskell. 1992. Ecosystem Health: New Goals for Environmental Management. Washington, D.C.: Island Press.

Dooley, Kevin. 1996. A Nominal Definition of Complex Adaptive Systems. The Chaos Network 8(1): 2-3. Available at http://www.public.asu.edu/~kdooley/papers/casdef.PDF. Last accessed November 19, 2009.

Faber, Malte, Reiner Manstetten, and John Proops. 1992a. Humankind and the environment: An anatomy of surprise and ignorance. Environmental Values 1(3): 217-241.

Faber, Malte, Reiner Manstetten, and John Proops. 1992b. Toward an open future: Ignorance, novelty, and evolution. Pp. 72-96 in Ecosystem Health: New Goals for Environmental Management, Robert Costanza, Bryan G. Norton, and Benjamin D. Haskell, eds. Washington, D.C.: Island Press.

Fingerman, Karen L., and Marion Perlmutter. 1995. Future time perspective and life events across adulthood. Journal of General Psychology 122(1): 95-111.

Gelfand, M.J., J.L. Raver, and K. Holcombe Ehrhart. 2002. Methodological issues in cross-cultural organizational research. Pp. 216-241 in Handbook of Industrial and Organizational Psychology Research Methods, S. Rogelberg, ed. New York: Blackwell.

Gelfand, M.J., M. Erez, and Z. Aycan. 2007. Cross-cultural organizational behavior. Annual Review of Psychology 58: 479-514.

Hofstede, G., B. Neuijen, D.D. Ohayv, and G. Sanders. 1990. Measuring organizational cultures: A qualitative and quantitative study across twenty cases. Administrative Science Quarterly 35(2): 286-316.

House, Robert J., Paul J. Hanges, Mansour Javidan, Peter Dorfman, and Vipin Gupta, eds. 2004. Culture, Leadership, and Organizations: The GLOBE Study of 62 Societies. Thousand Oaks, Calif.: Sage Publications.

Huntington, S. 1996. The Clash of Civilizations and the Remaking of World Order. New York: Simon & Schuster.

Marin, G., H.C. Triandis, H. Betancourt, and Y. Kashima. 1983. Ethnic affirmation versus social desirability: Explaining discrepancies in bilinguals’ responses to a questionnaire. Journal of Cross-Cultural Psychology 14(2): 173-186.

Ramos-Martin, Jesus. 2003. Empiricism in ecological economics: A perspective from complex systems theory. Ecological Economics 46(3): 387-398.

Schneider, Stephen, and B.L. Turner. 1995. Summary: Anticipating global change surprise. Pp. 6-18 in Anticipating Global Change Surprises. A Report of the Aspen Global Change Institute Elements of Change Series, John Katzenberger and Susan Joy Hassol, eds. Aspen, Colo.: Aspen Global Change Institute.

Schwartz, S.H. 1994. Beyond individualism/collectivism: New dimensions of values. In Individualism and Collectivism: Theory, Application, and Methods, U. Kim, H.C. Triandis, C. Kagitcibasi, S.C. Choi, and G. Yoon, eds. Newbury Park, Calif.: Sage.

Sharma, Karan. 2008. Designing knowledge based systems as complex adaptive systems. Pp. 424-428 in Frontiers in Artificial Intelligence and Applications, Vol. 171, Wang Pei, Ben Goertzel, and Stan Franklin, eds. Amsterdam, Netherlands: IOS Press.


Simonton, Dean K. 1988. Age and outstanding achievement: What do we know after a century of research? Psychological Bulletin 104(2): 251-267.

Tellis, Gerard J. 2006. Disruptive technology or visionary leadership? Journal of Product Innovation Management 23(1): 34-38.

Triandis, H.C. 1972. The Analysis of Subjective Culture. New York: Wiley.

Triandis, H.C. 1983. Essentials of Studying Cultures. New York: Pergamon Press.

Waldrop, M. Mitchell. 1992. Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Simon & Schuster.
