• What choices are being considered regarding the systems being evaluated?

  • What are the objectives against which the desirability of these choices is to be measured?

  • What uncertainties remain after the analysis about the extent to which each of the choices meets these objectives?

  • What are the contingencies that affect the resolution of these uncertainties?

Analysts need to inform decision makers about the robustness of their conclusions. It is important to consider all the uncertainties in an analysis, and it is equally important to report to users when those uncertainties affect its conclusions. The sources of information used in the analysis, as well as potential biases in that information, should also be reported. Some guiding principles are listed below; further discussion may be found in Fischhoff et al. (1981), Keeney (1992), and Clemen (1992).

  1. Present uncertainties in a way that avoids technical jargon and makes their implications clear. This is a guiding theme throughout the rest of the discussion. Practitioners of statistical analysis must remember that most decision makers are not familiar with technical statistical terms, so they should use terms that can be understood by nonstatisticians. Standard, simple, and understandable presentation formats for communicating information about policy-relevant uncertainties would be helpful; one possible format is sketched below.
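
  To make the point concrete, here is a minimal Python sketch, entirely illustrative and not drawn from the report, of one such plain-language format; the function name, wording, and numbers are assumptions, not a standard:

      # Hypothetical helper: render an estimate and its interval without
      # statistical jargon, for readers who are not statisticians.
      def plain_language_summary(quantity, estimate, low, high, units,
                                 confidence=0.90):
          return (
              f"Our best estimate of {quantity} is {estimate:g} {units}. "
              f"Given the data and the assumptions of the analysis, the true "
              f"value is likely (roughly {confidence:.0%} of the time under "
              f"repeated testing) to fall between {low:g} and {high:g} {units}."
          )

      # Example: a made-up miss-distance estimate from a test program.
      print(plain_language_summary("mean miss distance", 4.2, 3.1, 5.6, "meters"))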

  2. List and discuss assumptions and omissions. Because the conclusions of an analysis depend on the validity of its assumptions and on the sensitivity of the conclusions to violations of those assumptions, it is incumbent on the analyst to present explicitly the assumptions underlying the analysis. Any analysis leaves out some factors, either because they are too difficult to include given the available models and methodologies or because the information required to include them is not available. When factors are left out of an analysis, they should be identified and their potential effect on the results should be discussed. One simple way to make such sensitivity explicit is sketched below.
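
  The following Python sketch uses a deliberately toy model with invented numbers to show one way of reporting how a conclusion changes as a single assumption is varied; the reliability figures, score model, and decision threshold are all hypothetical:

      # Toy model (hypothetical): expected effectiveness equals reliability
      # times the effectiveness achieved when the system works.
      def system_score(reliability, effectiveness_if_working=100.0):
          return reliability * effectiveness_if_working

      THRESHOLD = 80.0  # assumed decision rule: recommend adoption above this

      # Vary the assumed reliability over a plausible range and report how
      # the conclusion changes, rather than presenting a single number.
      for assumed_reliability in (0.75, 0.80, 0.85, 0.90, 0.95):
          score = system_score(assumed_reliability)
          verdict = "adopt" if score > THRESHOLD else "do not adopt"
          print(f"assumed reliability {assumed_reliability:.2f} -> "
                f"score {score:5.1f} -> {verdict}")

  A table like this shows decision makers exactly which assumption the recommendation hinges on, and at what point it would flip.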

  3. Identify all important sources of uncertainty. It is tempting for analysts to consider only uncertainty that can be easily quantified. Confidence intervals around an estimated parameter, as produced by a statistical computer package, account only for the within-model uncertainty associated with the estimate. Other important sources of uncertainty include the following (a brief numerical sketch of this within-model point follows the list):

    • Uncertainty due to modeling assumptions and omitted factors. For example, a piece of equipment may not have been tested under all relevant conditions, and the way it is employed on the test range may be very different from how it will be employed on the battlefield.

    • Uncertainty about the future military climate. The combat utility
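
  As a minimal illustration of the within-model point above, the following Python sketch uses synthetic data; the data-generating curve, candidate models, and extrapolation point are all invented for illustration:

      import numpy as np

      # Synthetic data whose true trend is quadratic.
      rng = np.random.default_rng(0)
      x = np.linspace(0, 10, 40)
      y = 2.0 + 1.5 * x + 0.3 * x**2 + rng.normal(0.0, 4.0, x.size)

      # Fit three candidate model forms and extrapolate each to x = 12.
      preds = {}
      for degree in (1, 2, 3):
          coeffs = np.polyfit(x, y, degree)
          preds[degree] = np.polyval(coeffs, 12.0)

      spread = max(preds.values()) - min(preds.values())
      print({d: round(p, 1) for d, p in preds.items()})
      print(f"across-model spread at x = 12: {spread:.1f}")
      # Any one model's confidence interval reflects only sampling noise
      # within that model; the spread across plausible model forms is an
      # additional, often larger, source of uncertainty.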


