—Best practice: Clearly define the quantities of interest (QOIs) for a given VVUQ analysis, including the solution-verification task. Different QOIs will be affected differently by numerical errors.

—Best practice: Ensure that solution verification encompasses the full range of inputs that will be employed during UQ assessments.

•  Principle: The efficiency and effectiveness of code and solution verification can often be enhanced by exploiting the hierarchical composition of codes and mathematical models, with verification performed first on the lowest-level building blocks and then on successively more complex levels.

—Best practice: Identify hierarchies in computational and mathematical models and exploit them for code and solution verification. It is often worthwhile to design the code with this approach in mind.

—Best practice: Include in the test suite problems that test all levels in the hierarchy.

•  Principle: The goal of solution verification is to estimate, and control if possible, the error in each QOI for the problem at hand.

—Best practice: When possible in solution verification, use goal-oriented a posteriori error estimates, which give numerical error estimates for specified QOIs. In the ideal case the fidelity of the simulation is chosen so that the estimated errors are small compared to the uncertainties arising from other sources.
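A minimal sketch of the idea behind goal-oriented (adjoint-based, or "dual-weighted residual") error estimation, using an invented linear model rather than anything from the report: for a linear system A u = f with a linear QOI q(u) = gᵀu, solving the adjoint problem Aᵀz = g turns the residual of any approximate solution into an estimate of the error in the QOI itself.

```python
import numpy as np

# Hypothetical illustration (not from the report): for A u = f with QOI
# q(u) = g^T u, the adjoint solution z of A^T z = g gives
#   q(u) - q(u_h) = z^T (f - A u_h),
# i.e., the QOI error is the adjoint-weighted residual (exact here because
# both the model and the QOI are linear).

rng = np.random.default_rng(0)
n = 50
A = np.eye(n) * 4.0 + rng.normal(scale=0.1, size=(n, n))  # well-conditioned system
f = rng.normal(size=n)
g = rng.normal(size=n)                                     # defines the QOI q = g^T u

u_exact = np.linalg.solve(A, f)

# Cheap approximate solution: a few Jacobi sweeps (stands in for a coarse solve).
D = np.diag(np.diag(A))
u_h = np.zeros(n)
for _ in range(5):
    u_h = u_h + np.linalg.solve(D, f - A @ u_h)

z = np.linalg.solve(A.T, g)          # adjoint solve, weighted by the QOI
residual = f - A @ u_h
est_error = z @ residual             # goal-oriented error estimate
true_error = g @ u_exact - g @ u_h

print(f"estimated QOI error: {est_error:.6e}")
print(f"true QOI error:      {true_error:.6e}")
```

For nonlinear models and QOIs the same adjoint-weighted-residual structure yields an estimate rather than an identity, but the point survives: the error estimate is targeted at the specified QOI, not at a generic solution norm.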

—Best practice: If goal-oriented a posteriori error estimates are not available, try to perform self-convergence studies (in which QOIs are computed at different levels of refinement) on the problem at hand, which can provide helpful estimates of numerical error.
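A self-convergence study can be sketched in a few lines. The "simulation" below is just composite trapezoidal integration of sin(x) on [0, π] (an invented stand-in, not from the report); the QOI is computed at three refinement levels, the observed order of convergence is extracted, and a Richardson-style estimate of the error at the finest level is compared with the known truth.

```python
import numpy as np

# Hypothetical sketch: a self-convergence study. Assuming q_h = q + C h^p,
# three refinements give the observed order p and an error estimate for the
# finest result without knowing the exact answer.

def qoi(n_cells):
    """Composite trapezoid rule for the integral of sin on [0, pi]."""
    x = np.linspace(0.0, np.pi, n_cells + 1)
    y = np.sin(x)
    h = np.pi / n_cells
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

q1, q2, q3 = qoi(16), qoi(32), qoi(64)          # grids h, h/2, h/4

# Observed order of convergence from the three successive refinements.
p = np.log2((q1 - q2) / (q2 - q3))

# Richardson-style estimate of the remaining error in the finest result q3.
err_est = (q3 - q2) / (2.0 ** p - 1.0)

print(f"observed order p ≈ {p:.3f}")             # near 2 for the trapezoid rule
print(f"estimated error in q3 ≈ {err_est:.2e}")
print(f"true error in q3      = {2.0 - q3:.2e}")  # exact integral is 2
```

In practice the exact answer is unavailable; the value of the study is that the observed order and the extrapolated error estimate can be computed from the refinement sequence alone.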

Many VVUQ tasks introduce questions that can be posed, and in principle answered, within the realm of mathematics. Validation and prediction introduce additional questions whose answers require judgments from the realm of subject-matter expertise. For validation and prediction, the committee identified several principles and associated best practices, which are detailed in the main text. Some of the more important of these are summarized here:

•  Principle: A validation assessment is well defined only in terms of specified quantities of interest (QOIs) and the accuracy needed for the intended use of the model.

—Best practice: Early in the validation process, specify the QOIs that will be addressed and the required accuracy.

—Best practice: Tailor the level of effort in assessment and estimation of prediction uncertainties to the needs of the application.

•  Principle: A validation assessment provides direct information about model accuracy only in the domain of applicability that is “covered” by the physical observations employed in the assessment.

—Best practice: When quantifying or bounding model error for a QOI in the problem at hand, systematically assess the relevance of supporting data and validation assessments (which were based on data from different problems, often with different QOIs). Subject-matter expertise should inform this assessment of relevance (as discussed above and in Chapter 7).

—Best practice: If possible, use a broad range of sources of physical observations so that the accuracy of a model can be checked under different conditions and at multiple levels of integration.

—Best practice: Use “holdout tests” to test validation and prediction methodologies. In such a test, some validation data are withheld from the validation process; the prediction machinery is then employed to “predict” the withheld QOIs, with quantified uncertainties; and finally the predictions are compared to the withheld data.
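The mechanics of a holdout test can be sketched with a leave-one-out loop (a hypothetical toy, not the report's methodology): each datum is withheld in turn, a simple linear surrogate is refit on the rest, the withheld QOI is "predicted" with a quantified uncertainty, and the coverage of the resulting uncertainty bands is tallied.

```python
import numpy as np

# Hypothetical sketch: a leave-one-out "holdout test". Synthetic "physical
# observations" come from a line plus noise with an assumed-known sigma; we
# check how often the 2-sigma predictive band covers the withheld datum.

rng = np.random.default_rng(42)
sigma = 0.1                               # assumed known observation noise
x = np.linspace(0.0, 1.0, 20)
y = 1.5 * x + 0.3 + rng.normal(scale=sigma, size=x.size)

covered = 0
for i in range(x.size):
    xt, yt = np.delete(x, i), np.delete(y, i)         # withhold point i
    X = np.column_stack([xt, np.ones_like(xt)])
    beta, *_ = np.linalg.lstsq(X, yt, rcond=None)     # refit the surrogate
    x0 = np.array([x[i], 1.0])
    pred = x0 @ beta
    # Predictive std: observation noise plus parameter uncertainty.
    cov_beta = sigma**2 * np.linalg.inv(X.T @ X)
    pred_std = np.sqrt(sigma**2 + x0 @ cov_beta @ x0)
    covered += abs(y[i] - pred) <= 2.0 * pred_std

print(f"2-sigma coverage: {covered}/{x.size}")
```

If the stated uncertainties are honest, roughly 95% of withheld points should land inside their 2-sigma bands; systematic under-coverage is a warning that the prediction machinery understates its uncertainty.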

—Best practice: If the desired QOI was not observed for the physical systems used in the validation process, compare sensitivities of the available physical observations with those of the QOI.
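One concrete way to make that comparison (an invented illustration; the model and quantities below are hypothetical) is to compute finite-difference sensitivities of both the observed quantity and the QOI with respect to the model parameters and compare their directions: if the observation is insensitive to the parameters that drive the QOI, the validation data say little about the prediction.

```python
import numpy as np

# Hypothetical sketch: compare parameter sensitivities of an observed
# quantity with those of an unobserved QOI. Model is invented for
# illustration; theta = (k, c) are nominal model parameters.

def model(theta):
    k, c = theta
    observed = 2.0 * k + c            # quantity measured in experiments
    qoi = k**2 - 0.5 * c              # unobserved prediction target
    return np.array([observed, qoi])

theta0 = np.array([1.0, 0.5])
eps = 1e-6

# Central finite-difference Jacobian: J[i, j] = d(output_i)/d(theta_j).
J = np.empty((2, 2))
for j in range(2):
    dt = np.zeros(2)
    dt[j] = eps
    J[:, j] = (model(theta0 + dt) - model(theta0 - dt)) / (2.0 * eps)

# Compare sensitivity directions: a cosine near 1 suggests the observation
# exercises the same parameter dependence as the QOI.
s_obs = J[0] / np.linalg.norm(J[0])
s_qoi = J[1] / np.linalg.norm(J[1])
print("sensitivity of observation:", J[0])
print("sensitivity of QOI:        ", J[1])
print("cosine similarity:", float(s_obs @ s_qoi))
```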

—Best practice: Consider multiple metrics for comparing model outputs against physical observations.
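No single metric tells the whole story: a model can correlate well with the data while carrying a large bias, or have a small average error but one unacceptable outlier. A minimal sketch with invented numbers (not data from the report):

```python
import numpy as np

# Hypothetical sketch: compare model output against physical observations
# using several metrics at once. The values below are made up.

obs = np.array([1.02, 1.98, 3.05, 3.90, 5.10])   # physical observations
mod = np.array([1.10, 2.10, 3.00, 4.20, 4.80])   # model predictions

err = mod - obs
metrics = {
    "bias (mean error)": err.mean(),
    "RMSE": np.sqrt((err**2).mean()),
    "max abs error": np.abs(err).max(),
    "correlation": np.corrcoef(obs, mod)[0, 1],
}
for name, value in metrics.items():
    print(f"{name:18s} {value:+.3f}")
```

Reporting several metrics side by side makes the trade-offs visible and guards against a validation conclusion that depends on the choice of a single, favorable metric.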

•  Principle: The efficiency and effectiveness of validation and prediction assessments are often improved by exploiting the hierarchical composition of computational and mathematical models, with assessments beginning on the lowest-level building blocks and proceeding to successively more complex levels.

—Best practice: Identify hierarchies in computational and mathematical models, seek measured data that facilitate hierarchical validation assessments, and exploit the hierarchical composition to the extent possible.

The National Academies of Sciences, Engineering, and Medicine

Copyright © National Academy of Sciences. All rights reserved.