Statistical Issues in Defense Analysis and Testing: Summary of a Workshop (1994)

Communicating Uncertainty to Decision Makers

By their nature, decisions about system development and acquisition involve uncertainties. But current methods for reporting the results of analysis and testing tend to hide the uncertainties. Reports that contain only point estimates of outcome variables, such as cost, time to completion, and system performance parameters (a practice especially common in cost and operational effectiveness analyses, or COEAs), or that tell only whether a system passed or failed a test, are inadequate for informed decision making. For example, consider the decision of whether to begin development on a new system whose forecasted performance is better than that of an existing system. Suppose that the benefit/cost trade-off based only on a point estimate is quite favorable to the new system. Suppose, though, that the new system carries significant technical risk, so that there is a substantial likelihood it will fail to meet minimal standards of acceptable performance. This uncertainty about the performance of the new system could well change the decision about whether to begin development.
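To make the example concrete, the short sketch below works through a hypothetical benefit/cost comparison in Python. The numbers are purely illustrative assumptions, not figures from the workshop; they simply show how a comparison that favors the new system on point estimates can reverse once the likelihood of failing to meet minimal performance standards is taken into account.

```python
# Hypothetical numbers for illustration only.

# Point-estimate comparison: the new system looks clearly better.
benefit_new_if_successful = 150.0   # forecast benefit of the new system
benefit_existing = 100.0            # benefit of the existing system
cost_new = 120.0
cost_existing = 90.0

print("Point-estimate benefit/cost:",
      benefit_new_if_successful / cost_new,              # 1.25
      "vs", round(benefit_existing / cost_existing, 2))  # 1.11

# Fold in technical risk: assume a 30% chance the new system fails to meet
# minimal acceptable performance, in which case its benefit is only 60.
p_fail = 0.30
benefit_new_if_failed = 60.0
expected_benefit_new = ((1 - p_fail) * benefit_new_if_successful
                        + p_fail * benefit_new_if_failed)   # 123.0

print("Risk-adjusted benefit/cost:",
      round(expected_benefit_new / cost_new, 2),          # ~1.02
      "vs", round(benefit_existing / cost_existing, 2))   # 1.11
# The ordering reverses: the existing system now looks preferable.
```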

Information from analysis or testing needs to be presented in ways that are constructive for decision making. Which uncertainties to present and how best to communicate them depend on the decision being supported by the analysis or test. Elsewhere in this report, we discuss explicit modeling of the decision problem faced by program evaluators. Regardless of whether such a formal modeling approach is taken, the decision maker needs to address the following questions (a minimal sketch of how the answers might be recorded follows the list):

Suggested Citation:"COMMUNICATING UNCERTAINTY TO DECISION MAKERS." National Research Council. 1994. Statistical Issues in Defense Analysis and Testing: Summary of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/9686.
×
  • What choices are being considered regarding the systems being evaluated?

  • What are the objectives against which the desirability of these choices is to be measured?

  • What uncertainties remain after the analysis about the extent to which each of the choices meets these objectives?

  • What are the contingencies that affect the resolution of these uncertainties?
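Whether or not a formal decision model is built, these four questions can at least be recorded explicitly alongside the analysis. The sketch below, in Python, is one minimal way to do so; the structure, field names, and example entries are assumptions for illustration, not a format prescribed by the workshop.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DecisionFrame:
    """Minimal record of the four questions a decision maker must address."""
    choices: List[str]        # alternatives being considered for the system
    objectives: List[str]     # criteria against which the choices are measured
    uncertainties: List[str]  # what remains unknown after analysis or testing
    contingencies: List[str]  # events that would resolve or alter the uncertainties

# Hypothetical example: deciding whether to begin development of a new system.
frame = DecisionFrame(
    choices=["begin development of the new system", "retain the existing system"],
    objectives=["performance against requirements", "life-cycle cost", "schedule"],
    uncertainties=["probability the new system meets minimal performance standards",
                   "cost growth due to immature technology"],
    contingencies=["results of the next developmental test",
                   "changes in the expected threat scenario"],
)
```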

Analysts need to inform decision makers about the robustness of conclusions. It is important to consider all the uncertainties in the analysis, and it is equally important to report to users when these uncertainties affect the conclusions of an analysis. Sources of information used in analysis, as well as potential biases in the information, should also be reported. Some guiding principles are listed below. Further discussion may be found in Fischhoff et al. (1981), Keeney (1992), and Clemen (1992).

  1. Present uncertainties in a way that avoids technical jargon and makes their implications clear. This is a guiding theme through the rest of the discussion. Practitioners of statistical analysis must remember that most decision makers are not familiar with technical statistical terms. Use terms that can be understood by nonstatisticians. Standard, simple, and understandable presentation formats for communicating information about policy-relevant uncertainties would be helpful.

  2. List and discuss assumptions and omissions. Because the conclusions of an analysis depend on the validity of its assumptions and the sensitivity of the conclusions to violations of the assumptions, it is incumbent on the analyst to present explicitly the assumptions underlying the analysis. Any analysis leaves out some factors, either because they are too difficult to include given the available models and methodologies, or because the information required to include them is not available. When factors are left out of an analysis, they should be identified and the potential effect of the omitted factors on the results should be discussed.

  3. Identify all important sources of uncertainty. It is tempting for analysts to consider only uncertainty that can be easily quantified. Confidence intervals around the value of an estimated parameter as produced by a computer package account for only the within-model uncertainty associated with an estimate. Other important sources of uncertainty include:

    • Uncertainty due to modeling assumptions and omitted factors. For example, a piece of equipment may not have been tested under all relevant conditions, and the way it is employed on the test range may be very different from how it will be employed on the battlefield.

    • Uncertainty about the future military climate. The combat utility of a weapon system depends on the combat scenario in which it will be used. There is considerable uncertainty about the types of military situations in which weapon systems of the future will be used.

    • Uncertainty about scientific and technological knowledge. Being at the cutting edge of technology carries with it technical risk. As the Department of Defense tries to bring results from research laboratories more rapidly into the field, less will be known about the technologies employed in a new system prior to testing of that system. Awareness and assessment of these uncertainties are essential to good decision making.

    • Uncertainty about the appropriateness of analysis methods. It is important to acknowledge potential fallibility or limitations in the analytical methods employed and to assess the effect of plausible alternative methods on the results.

  4. Whenever an estimate is presented, uncertainty bounds should also be reported. An analysis produces estimates of unknown quantities, and these estimates vary in precision. Error bounds should become a standard part of reporting procedures. A related and sometimes useful principle is to present multiple analyses. When different models or modeling assumptions result in qualitatively different conclusions, it may be appropriate to present the results of multiple analyses, together with discussion about the reasons for the discrepancies among the analyses. (The sketch following this list illustrates these practices.)

  5. Frequently the most important output of analysis is insight, not an answer. Good analysis is not simply cranking a problem through a model to get “the answer.” A competent analyst will try different models and different assumptions to determine the sensitivity of the result to the modeling assumptions. (This comment applies to statistical analysis of operational test results as well as to analysis of the results of simulations.) In the process of doing the analysis, the analyst gains considerable understanding of the problem, the model, and the benefits and limits of the model as it applies to the problem. Rather than simply presenting the answer from an analysis, it is important for the analyst to convey the important insights he or she has gained during the analysis process. Such insight can help decision makers weigh choices in a more informed and effective way.

  6. Use graphical presentation methods. Not only is a picture worth a thousand words, but it may also be worth 100 tables. Recent advances in computer graphics for data analysis and visualization can be useful in understanding and presenting the results of complex experiments. Such graphics have been integrated into widely available statistical packages, which should facilitate their use. Tufte (1983) provides an entertaining and informative discussion of graphical methods for displaying complex quantitative data.
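Principles 4 through 6 can be illustrated together in a single small workflow: report a point estimate with uncertainty bounds, repeat the analysis under an alternative assumption, and present the results graphically. The sketch below is a minimal illustration in Python; the simulated hit/miss data, the choice of a 90 percent bootstrap interval, and the "exclude the early trials" sensitivity check are all assumptions made for this example, not procedures specified in the workshop.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Hypothetical operational test data: 1 = hit, 0 = miss, for 40 trials.
trials = rng.binomial(1, 0.72, size=40)

def bootstrap_bounds(data, n_boot=5000, level=0.90):
    """Point estimate and percentile-bootstrap bounds for the mean.

    Note: these bounds reflect only within-model (sampling) uncertainty."""
    boot = [np.mean(rng.choice(data, size=len(data), replace=True))
            for _ in range(n_boot)]
    lo, hi = np.percentile(boot, [100 * (1 - level) / 2, 100 * (1 + level) / 2])
    return np.mean(data), lo, hi

# Baseline analysis: all trials counted.
est_all, lo_all, hi_all = bootstrap_bounds(trials)

# Alternative assumption (sensitivity check): suppose the first 10 trials were
# run under unrepresentative range conditions and are excluded.
est_sub, lo_sub, hi_sub = bootstrap_bounds(trials[10:])

print(f"All trials:       {est_all:.2f}  90% bounds [{lo_all:.2f}, {hi_all:.2f}]")
print(f"Excluding early:  {est_sub:.2f}  90% bounds [{lo_sub:.2f}, {hi_sub:.2f}]")

# Graphical presentation: point estimates with error bars, one per analysis.
labels = ["All trials", "Excluding early trials"]
estimates = [est_all, est_sub]
yerr = [[est_all - lo_all, est_sub - lo_sub],   # distances down to lower bounds
        [hi_all - est_all, hi_sub - est_sub]]   # distances up to upper bounds
plt.errorbar([0, 1], estimates, yerr=yerr, fmt="o", capsize=5)
plt.xticks([0, 1], labels)
plt.ylabel("Estimated hit probability")
plt.title("Point estimates with 90% bootstrap bounds under two assumptions")
plt.show()
```

As principle 5 emphasizes, the value of the second analysis lies less in producing another number than in showing whether the conclusion is sensitive to the assumption about the early trials.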

Decision makers must realize, however, that they, too, must pay a substantial price to successfully incorporate uncertainty assessments into their thinking. As recent experience in the industrial quality movement has taught, a commitment from management is essential. The strength of the statistical method is that it explicitly quantifies our ignorance, not our knowledge. Decision makers may sometimes be reluctant to acknowledge uncertainty, because substantial uncertainty usually complicates the process of building consensus around a decision. But just as technicians must look beyond the question “How big does n have to be?” so too must decision makers pose other questions in addition to “What's the number?”
