
Appendix A


Glossary

TABLE A.1 Glossary of Terms Related to Verification, Validation, and Uncertainty Quantification


Columns: Term (with synonyms and cross-references); Definition; Notes and comments

 
accuracy
See also precision.
A measure of agreement between the estimated value of some quantity and its true value. (Adapted from Society for Risk Analysis [SRA] Glossary.a) See note under precision.
 
adjoint map Given a map (i.e., forward model) from an input vector space to an output vector space, the adjoint is an associated map from the vector space of linear real-valued functions on the output space to the vector space of linear real-valued functions on the input space. Given a linear real-valued function on the output space, the corresponding linear real-valued function on the input space is obtained by first applying the original map to a vector in the input space and then applying the given function to the result. The adjoint map is important for determining properties of the original map when the input and output vectors cannot be observed directly. It plays a fundamental role in the theory of maps, e.g., for determining solvability of inverse problems, stability and sensitivity, Green’s functions, and derivatives of the output of a map with respect to the input. The concrete formulation and evaluation of an adjoint depend heavily on the properties of the original map (i.e., forward model) and the input and output spaces, with extra care needed for nonlinear maps.
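A minimal formal sketch (the linear setting and the symbols A, U, V are assumptions of this illustration, not notation from the report):

```latex
% Adjoint of a linear forward map A : U -> V (sketch under assumed notation).
% A^* acts on a linear functional \ell : V -> R by composing it with A:
(A^{*}\ell)(u) \;=\; \ell(Au) \qquad \text{for all } u \in U .
% In an inner-product (Hilbert-space) setting this corresponds to
\langle Au, v \rangle_{V} \;=\; \langle u, A^{*}v \rangle_{U} .
```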
 
aleatoric uncertainty
Synonyms: aleatory uncertainty, aleatoric probability See also probability, epistemic uncertainty.
A measure of the uncertainty of an unknown event whose occurrence is governed by some random physical phenomena that are either (1) predictable, in principle, with sufficient information (e.g., tossing a die), or (2) essentially unpredictable (radioactive decay).b See epistemic uncertainty.

algorithm A finite list of well-defined instructions that, when executed, proceed through a finite number of well-defined successive states, eventually terminating and producing an output. The instructions and executions are not necessarily deterministic; some algorithms incorporate random input (see Monte Carlo simulation).
 
approximation
See also estimation (of parameters in probability models).
The result of a computation or assessment that may not be exactly correct but that is adequate for a particular purpose.c  
 
average
Synonyms: arithmetic mean, sample mean See also mean.
The sum of n numbers divided by n.d,e,f The average is a simple arithmetic operation requiring a set of n numbers. It is often confused with the mean (or expected value), which is a property of a probability distribution. One reason for this confusion is that the average of a set of realizations of a random variable is often a good estimator of the mean of the random variable’s distribution.
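As a worked contrast between the two notions (the notation is assumed here for illustration):

```latex
% Sample average of n observed numbers x_1, ..., x_n:
\bar{x} \;=\; \frac{1}{n}\sum_{i=1}^{n} x_i
% Mean (expected value) of the random variable X that generated them:
\mu \;=\; E(X)
% \bar{x} is a summary of data; \mu is a property of the distribution.
% By the law of large numbers \bar{x} \to \mu as n grows, which is why the
% average is commonly used as an estimator of the mean.
```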
 
Bayesian approach
See also prior probability.
An approach that uses observations (data) to constrain uncertain parameters in a probabilistic model. The constrained uncertainty is described by a posterior probability distribution, produced using Bayes’s theorem to combine the prior probability distribution with the probabilistic model of the observations. In most problems the Bayesian approach produces a high-dimensional probability distribution describing the joint uncertainty in all of the model parameters. Functionals or integrals of this posterior distribution are typically used to summarize the posterior uncertainty. These summaries are typically produced by means of numerical approximation or sampling methods such as Markov chain Monte Carlo.
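The core update can be written compactly as follows (a standard statement of Bayes’s theorem; the symbols are choices of this sketch, not the report’s notation):

```latex
% Posterior proportional to likelihood times prior, for parameters \theta and data d:
\pi(\theta \mid d) \;=\;
\frac{L(d \mid \theta)\,\pi(\theta)}{\int L(d \mid \theta')\,\pi(\theta')\,d\theta'}
\;\;\propto\;\; L(d \mid \theta)\,\pi(\theta) .
% The normalizing integral is rarely available in closed form for complex models,
% which is why sampling methods such as Markov chain Monte Carlo are used.
```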
 
code verification
See also verification, solution verification.
The process of determining and documenting the extent to which a computer program (“code”) correctly solves the equations of the mathematical model.  
 
computational model
Synonym: computer model See also model (simulation).
Computer code that (approximately) solves the equations of the mathematical model. In physically based applications the computational model might encode physical rules such as conservation of mass or momentum. In other applications the computational model might also produce a Monte Carlo or a discrete-event realization.
 
conditional probability See also probability. The probability of an event supposing (i.e., “conditioned on”) the occurrence of other specified events. In the Bayesian approach the posterior distribution is a conditional probability distribution, conditioned on the physical observations. It is important to note that subjectively assessed probabilities are based on the state of knowledge that holds at the time of the probability assessment.

confidence interval
Synonym: interval
A range of values [a, b] determined from a sample, using a predetermined rule chosen such that, in repeated random samples from the same population, the fraction α of computed ranges will include the true value of an unknown parameter. The values a and b are called confidence limits, and α is called the confidence coefficient or confidence level (commonly chosen to be 0.95 or 0.99). (Adapted from SRA Glossary.)a Confidence intervals should not be interpreted as implying that the parameter itself has a range of values; it has only one value. For any given sampling rule, the confidence limits a and b are random quantities that, before the sample is drawn, will bracket the parameter of interest with probability α (provided that the assumed sampling model for the population holds).
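A minimal sketch of the repeated-sampling interpretation, using a normal-approximation interval for a mean (the distribution, sample size, and 95 percent level are assumptions of this illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true = 5.0          # unknown in a real study; known here so coverage can be checked
n, z = 30, 1.96        # sample size and z-value for a ~95% normal-approximation interval

covered, trials = 0, 10_000
for _ in range(trials):
    sample = rng.normal(loc=mu_true, scale=2.0, size=n)
    xbar, se = sample.mean(), sample.std(ddof=1) / np.sqrt(n)
    a, b = xbar - z * se, xbar + z * se   # confidence limits for this sample
    covered += (a <= mu_true <= b)

print(f"fraction of intervals covering the true mean: {covered / trials:.3f}")  # close to 0.95
```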
 
constrained uncertainty See also Bayesian approach. Uncertainty about a parameter, prediction, or other entity that has been reduced by incorporating additional information, such as new physical observations. For most of the examples in this report, uncertainty is constrained using the Bayesian approach, conditioning on physical observations, producing a posterior distribution for parameters and predictions.
 
continuous random variable
See also cumulative distribution function, probability density function.
A random variable, X, is continuous if it has an absolutely continuous cumulative distribution function.d  
 
cumulative distribution function
Synonyms: cumulative distribution, cdf, distribution function
See also probability density function, probability distribution.
The probability that a random variable X will be less than or equal to a value x; written as P{X ≤ x}.f,g The cdf always exists for any random variable; it is monotonically nondecreasing in x, and (being a probability) satisfies 0 ≤ P{X ≤ x} ≤ 1. If P{X ≤ x} is absolutely continuous in x, then X is called a continuous random variable; if it is discontinuous at a finite or countably infinite number of values of x, and constant otherwise, X is called a discrete random variable.
 
data assimilation A recursive process for producing predictions with uncertainty regarding some process, commonly used in weather forecasting and other fields of geoscience. At a given iteration, new physical observations are combined with model-based predictions to produce updated predictions and updated estimates of the current state of the system. The combination method is usually based on Bayesian inference. The Kalman filter, the ensemble Kalman filter, and particle filters are examples of approaches with which data assimilation is carried out.
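For one widely used special case, the Kalman filter, the combination step at each iteration takes the following form (standard Kalman-filter notation, assumed here rather than taken from the report):

```latex
% Forecast (model-based) state x_f with covariance P_f; new observation y with
% observation operator H and observation-error covariance R.
K   \;=\; P_f H^{\mathsf T}\,(H P_f H^{\mathsf T} + R)^{-1}   % Kalman gain
x_a \;=\; x_f + K\,(y - H x_f)                                % updated (analysis) state
P_a \;=\; (I - K H)\,P_f                                      % updated covariance
```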
 
data verification and validation The process of verifying the internal consistency and correctness of data and validating that they represent real-world entities appropriate for their intended purpose or an expected range of purposes.h  
 
discrete random variable
See also cumulative distribution function.
A random variable that has a nonzero probability for only a finite, or countably infinite, set of values.b  

epistemic uncertainty
Synonym: epistemic probability See also aleatoric uncertainty.
A representation of uncertainty about propositions due to incomplete knowledge. Such propositions may be about either past or future events.b Some examples of epistemic uncertainty are (1) a probability density function describing uncertainty regarding the acceleration due to gravity at Earth’s surface; (2) determination of the probability that a required maintenance procedure will, in fact, be carried out.
 
estimation (of parameters in probability models)
See also approximation.
A procedure by which sample data are used to assess the value of an unknown quantity.e Estimation procedures are usually based on statistical analyses that address their efficiency, effectiveness, limiting behaviors, degrees of bias, etc. The most common methods of parameter estimation are “maximum likelihood” and the method of moments. Under the Bayesian approach estimates can be produced by taking the mean, median, or most likely value determined by the posterior distribution.
 
expected value
Synonym: expectation See also mean.
The first moment of the probability distribution of a random variable X; often denoted E(X) and defined as ∑ xᵢ p(xᵢ) if X is a discrete random variable with probability mass function p, and as ∫ x f(x) dx if X is a continuous random variable with probability density function f.d,f  
 
extrapolative prediction
See also interpolative prediction.
The use of a model to make statements about quantities of interest (QOIs) in settings (initial conditions, physical regimes, parameter values, etc.) that are outside the conditions for which the model validation effort occurred.  
 
face validation See also validation. A nonquantitative “sanity check” on a model that requires both its structural content and outputs to be consistent with well-understood and agreed-on forms, ranges, etc. Face validation should not be used by itself as a formal validation process. Instead, it should be used to guide model development, design of sensitivity analyses, etc.
 
forward problem
See also inverse problem.
The use of a model, given the values of all necessary inputs (initial conditions, parameters, etc.), to produce potentially observable QOIs.  
 
forward propagation
Synonym: uncertainty propagation (UP) See also forward problem.
Quantifying the uncertainty of a model’s responses that results from uncertainty in the model’s inputs being propagated through the model.  
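A minimal Monte Carlo sketch of forward propagation, treating the computational model as a black-box function (the model f and the input distributions below are placeholders invented for this illustration):

```python
import numpy as np

def model(x1, x2):
    """Stand-in for a computational model's quantity of interest (hypothetical)."""
    return x1 * np.exp(-0.5 * x2) + x2 ** 2

rng = np.random.default_rng(1)
n = 100_000
x1 = rng.normal(1.0, 0.1, n)      # uncertain input 1 (assumed distribution)
x2 = rng.uniform(0.5, 1.5, n)     # uncertain input 2 (assumed distribution)

y = model(x1, x2)                 # push the input samples through the model
print("QOI mean:", y.mean())
print("QOI std :", y.std(ddof=1))
print("95% interval:", np.percentile(y, [2.5, 97.5]))
```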

global statistical sensitivity analysis See also sensitivity analysis. The study of how the uncertainty in the output or QOI of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input. The term “global” distinguishes this analysis from local, or one-factor-at-a-time, sensitivity analyses: a global analysis accounts for interactions among inputs and for nonlinearities in the model response.
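One widely used family of global measures is variance based; the first-order and total-effect Sobol' indices are shown below as an illustration (standard definitions, not a method prescribed by the report):

```latex
% Fraction of the output variance attributable to input X_i acting alone:
S_i \;=\; \frac{\operatorname{Var}\!\big(E[\,Y \mid X_i\,]\big)}{\operatorname{Var}(Y)} ,
\qquad
% Total-effect index, including all interactions involving X_i
% (X_{\sim i} denotes all inputs except X_i):
S_{T_i} \;=\; 1 - \frac{\operatorname{Var}\!\big(E[\,Y \mid X_{\sim i}\,]\big)}{\operatorname{Var}(Y)} .
```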
 
input verification See also verification. The process of determining that the data entered into a model or simulation accurately represent what the developer intends. (Adapted from DOD, 2009.h)  
 
interpolative prediction
See also extrapolative prediction.
The use of a model to make statements about QOIs in regimes within which the model has been validated. In practice, it may be difficult to determine if a particular prediction is an interpolation or not.
 
intrusive methods
See also nonintrusive methods (black box methods).
Approaches to exploring a computational model that require a recoding of the model. Such a recoding might be done in order to efficiently produce derivative information using the adjoint equation to facilitate a sensitivity analysis.  
 
inverse problem
See also forward problem.
An estimation of a model’s uncertain parameters by using data, measurements, or observations. An inverse problem is often formulated as an optimization problem that minimizes an appropriate measure of the “differences” between observed and model-predicted outputs (with constraints—or penalty costs—on the values of some of the parameters).
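A minimal sketch of the optimization formulation, fitting one uncertain parameter of a hypothetical forward model to noisy observations (the model, the synthetic data, and the penalty weight are all invented for this illustration):

```python
import numpy as np
from scipy.optimize import minimize

def forward(theta, t):
    """Hypothetical forward model: exponential decay with uncertain rate theta."""
    return np.exp(-theta * t)

# Synthetic "observations": truth theta = 0.7 plus measurement noise.
rng = np.random.default_rng(2)
t_obs = np.linspace(0.0, 5.0, 20)
d_obs = forward(0.7, t_obs) + rng.normal(0.0, 0.02, t_obs.size)

def misfit(theta):
    resid = d_obs - forward(theta[0], t_obs)
    return np.sum(resid ** 2) + 1e-3 * (theta[0] - 1.0) ** 2   # data misfit + mild penalty

result = minimize(misfit, x0=[1.0], method="Nelder-Mead")
print("estimated rate:", result.x[0])   # close to 0.7
```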
 
level of fidelity See also validation. The amount of detail with which a model describes an actual process. Relevant features might include the descriptions of geometry, model symmetries, dimensionality, or physical processes in the model. High-fidelity models attempt to capture more of these features than do low-fidelity models. A high level of fidelity does not necessarily imply that the model will give highly accurate predictions for the system.
 
likelihood
See also probability, uncertainty.
The likelihood, L(A | D), of an event, A, given the data, D, and a specific model, is often taken to be proportional to P(D | A), the constant of proportionality being arbitrary.i In informal usage, “likelihood” is often a qualitative description of probability or frequency. However, equally often these descriptions do not satisfy the axioms of probability.
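As a concrete instance (a Gaussian measurement model assumed purely for illustration): if an observation d is modeled as a model prediction m(θ) plus Gaussian noise with standard deviation σ, then

```latex
L(\theta \mid d) \;\propto\; P(d \mid \theta)
\;=\; \frac{1}{\sqrt{2\pi}\,\sigma}
\exp\!\left(-\frac{\big(d - m(\theta)\big)^{2}}{2\sigma^{2}}\right),
% viewed as a function of \theta with the data d held fixed.
```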
 
linear regression
Synonym: regression
See also nonlinear regression.
Regression in which the function to be fit is a linear function of the unknown parameters (coefficients); the fitted function may still be nonlinear in the independent variables (e.g., a polynomial).  
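A short sketch showing why “linear” refers to the coefficients: a quadratic curve in x is still a linear regression problem, solvable by ordinary least squares (the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0, 50)
y = 1.0 + 2.0 * x - 0.5 * x ** 2 + rng.normal(0.0, 0.1, x.size)  # synthetic observations

# Design matrix with columns 1, x, x^2: nonlinear in x, but linear in the coefficients.
X = np.column_stack([np.ones_like(x), x, x ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", coef)   # close to [1.0, 2.0, -0.5]
```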

Markov chain Monte Carlo (MCMC) A sampling technique that constructs a Markov chain to produce Monte Carlo samples from a typically complicated, multivariate distribution. The resulting sample is then used to estimate functionals of the distribution. MCMC typically requires many fewer points than grid-based sampling methods. MCMC approaches become intractable as the complexity of the forward problem and the dimensions of the parameter spaces increase.
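A minimal random-walk Metropolis sketch for a one-dimensional target density (the target, proposal width, and chain length are arbitrary choices for illustration; in real applications the expensive forward model sits inside log_post):

```python
import numpy as np

def log_post(theta):
    """Unnormalized log posterior density (simple stand-in: standard normal)."""
    return -0.5 * theta ** 2

rng = np.random.default_rng(4)
theta, chain = 0.0, []
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 1.0)                 # random-walk proposal
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal                                    # accept; otherwise keep current state
    chain.append(theta)

samples = np.array(chain[5_000:])                           # discard burn-in
print("posterior mean ~", samples.mean(), " std ~", samples.std())
```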
 
mathematical model
Synonym: conceptual model See also model (simulation).
A model that uses mathematical language (sets of equations, inequalities, etc.) to describe the behavior of a system.  
 
mean
See also expected value, average.
The first moment of a probability distribution, with the same mathematical definition as that of expected value. The mean is a parameter that represents the central tendency of a distribution.d,e,g,j  
 
measurement error The discrepancy between a measurement and the quantity that the measurement instrument is intended to measure.k Measurement error is often decomposed into two components: replicate variation and bias.
 
model (simulation) See also simulation. A representation of some portion of the world in a readily manipulated form. A mathematical model is an abstraction that uses mathematical language to describe the behavior of a system.l Mathematical models are used to aid our understanding of some aspects of the real world and to aid in decision making. They are also valuable rhetorical tools for presenting the rationale supporting various decisions, since they arguably allow for transparency and the reproduction of results by others. However, models are only as good as their (validated) relationship to the real world and within the context for which they are designed.
 
model discrepancy
Synonyms: model inadequacy, structural error
A term accounting for or describing the difference between a model of the system and the true physical system. In some cases, model discrepancy is the dominant source of uncertainty in model-based predictions. When relevant physical data are available, model discrepancy can be estimated. Estimating this term when relevant physical observations are not available is difficult.
 
Monte Carlo simulation See also model (simulation). A model constructed so that the input of a large number of random draws from defined probability distributions will generate outputs that are representative of the random behavior of a particular system, phenomenon, consequences, etc., of a series of events.m Each set of “runs” of a simulation inherently represents the outcomes of a series of experiments. The analysis of simulation output data therefore requires a proper experimental design, followed by the use of statistical techniques to estimate parameters, test hypotheses, etc.
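A small sketch of the idea, estimating the probability that a hypothetical system response exceeds its capacity (the distributions and the load/capacity framing are assumptions of this illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

load = rng.lognormal(mean=0.0, sigma=0.25, size=n)     # random demand on the system (assumed)
capacity = rng.normal(loc=1.6, scale=0.2, size=n)      # random capacity (assumed)

failures = load > capacity                             # each draw is one simulated "experiment"
p_fail = failures.mean()
se = np.sqrt(p_fail * (1.0 - p_fail) / n)              # Monte Carlo standard error of the estimate
print(f"estimated failure probability: {p_fail:.4f} +/- {2 * se:.4f}")
```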
 
multiscale phenomena Equations representing the dynamics of a nonlinear system that combine behavior at many scales of physical dimension and/or time. The analysis of multiscale phenomena presents many challenges for numerical analysis and the associated software; in particular, coupling results from one scale to another can introduce instabilities in the model output that may not represent physical reality.

multivariate adaptive regression splines (MARS)
See also regression.
A form of nonparametric regression analysis (usually presented as an extension of linear regression) that automatically represents nonlinearities and interactions in terms of splines (e.g., functions having smooth first and second derivatives).n  
 
nonintrusive methods (black box methods) Methods to carry out sensitivity analysis or forward propagation or to solve the inverse problem that only require forward runs of the computational model, effectively treating the model as a black box.  
 
nonlinear regression
See also regression, linear regression.
Regression in which the function to be fit is a nonlinear function of at least one of the unknown parameters.  
 
parameter Terms in a mathematical function that remain fixed during any computational procedure. These may include initial conditions, physical constants, boundary values, etc. Often parameters are fixed at assumed values, or they can be estimated using physical observations. Alternatively, uncertainty regarding parameters may be constrained with physical data.
 
polynomial chaos
Synonym: PC, Wiener chaos expansion See also Monte Carlo simulation.
A parameterization of random variables and processes that lends itself to the characterization of transformations between input and output quantities. The resulting representations are akin to a response surface with respect to normalized random variables and can be readily evaluated, yielding very efficient procedures for sampling the output variables. The coefficients in these representations can be estimated in a number of ways, including Galerkin projections, least squares, perturbation expansions, statistical sampling, and numerical quadrature.
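In its most common form the representation is a truncated expansion in polynomials that are orthogonal with respect to the distribution of standardized random inputs (a generic statement, with notation assumed for this sketch):

```latex
% Output Y expanded in orthogonal polynomials \Psi_k of the standardized inputs \xi
% (e.g., Hermite polynomials when \xi is Gaussian):
Y \;\approx\; \sum_{k=0}^{P} c_k\,\Psi_k(\xi),
\qquad
% projection (Galerkin) formula for the coefficients:
c_k \;=\; \frac{E\big[\,Y\,\Psi_k(\xi)\,\big]}{E\big[\,\Psi_k(\xi)^{2}\,\big]} .
```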
 
posterior probability
See also Bayesian approach, prior probability.
Probability distribution describing uncertainty in parameters (and possibly other random quantities) of interest in a statistical model after data are observed and conditioned on. The Bayesian approach updates the prior probability distribution by conditioning on the data (often physical observations), producing a posterior distribution for the same parameters. Often of interest is the posterior predictive distribution for a QOI, describing uncertainty about the QOI for the physical system.
 
precision
See also accuracy.
The implied degree of certainty with which a value is stated, as reflected in the number of significant digits used to express the value—the more digits, the more precision. (Adapted from SRA Glossary.a) Consider two statements assessing Bill Gates’s net worth, W. A precise but inaccurate assessment is “W = $123,472.89.” An imprecise but accurate assessment is “W > $6 billion.”

prediction uncertainty The uncertainty associated with a prediction about a QOI for the real-world process. The prediction uncertainty could be described by a posterior distribution for the QOI, a predictive distribution, a confidence interval, or possibly some other representation. This is a statement about reality, given information from an analysis typically involving a computational model, physical observations, and possibly other information sources.
 
prior probability
Synonym: a priori probability
See also Bayesian approach, posterior probability.
Probability distribution assigned to parameters (and possibly other random quantities) of interest in a statistical model before physical observations are available. The Bayesian approach updates this prior probability distribution by conditioning on the physical observations, producing a posterior distribution for the same parameters. The prior distribution may be obtained from expert judgment or previous data, or it may be specified to be “neutral” to the analysis.
 
probability
See also likelihood, conditional probability, aleatoric uncertainty, subjective probability.
One of a set of numerical values between 0 and 1 assigned to a collection of random events (which are subsets of a sample space) in such a way that the assigned numbers obey two axioms:
(1) 0 ≤ P{A} ≤ 1 for any A and
(2) P{A} + P{B} = P{A ∪ B} for two mutually exclusive events A and B.j
This definition holds for all quantification of uncertainty: subjective or frequentist.
 
probability density function (pdf) The derivative of an absolutely continuous cumulative distribution function.j

For a scalar random variable X, a function f such that, for any two numbers a and b with a ≤ b, P{a ≤ X ≤ b} = ∫ f(x) dx, where the integral is taken from a to b.
The pdf is the common way to represent the probability distribution of a continuous random variable, because its shape often displays the central tendency (mean) and variability (standard deviation). From its definition, P{a < X ≤ b} is the integral of the pdf between a and b.
 
probability distribution See cumulative distribution function.  
 
probability elicitation
Synonyms: probability assessment, subjective probability
A process of gathering, structuring, and encoding expert judgment (about uncertain events or quantities) in the form of probability statements about future events.o There are many approaches for probability elicitation, the most common of which are those used for obtaining a priori subjective probabilities. Note that the results of probability elicitations are sometimes called probability assessments or assignments.
 
quantity of interest (QOI) A numerical characteristic of the system being modeled, the value of which is of interest to stakeholders, typically because it informs a decision. To be useful the model must be able to provide, as output, values of or probability statements about QOIs.  

reduced model
Synonym: emulator
A low-fidelity model developed to replace (or augment) a computationally demanding, high-fidelity model. A reduced model is particularly useful for carrying out computationally demanding analyses (e.g., sensitivity analysis, forward propagation of uncertainty, solving the inverse problem) that would be infeasible with the original model. A reduced model that “collapses” aspects of a physics-based model in this way is sometimes referred to as a “physics-blind” model.
 
regression
See also: linear regression, nonlinear regression.
A form of statistical analysis in which observational data are used to fit a mathematical function that expresses the data (i.e., the dependent variables) as a function of a set of parameters and one or more independent variables.  
 
response surface
See also sensitivity analysis.
A function that predicts outputs from a model as a function of the model inputs. A response surface is typically estimated from an ensemble of model runs using a regression, Gaussian process modeling, or some other estimation or interpolation procedure. A response surface can be used like a reduced model to carry out computationally demanding analyses (e.g., sensitivity analysis, forward propagation, solving the inverse problem). Since the response surface does not exactly reproduce the computational model, there is typically additional error in results produced by response surface approaches.
 
robustness analysis
See also sensitivity analysis.
For a prescriptive model, a procedure that analyzes the degree to which deviations from a “best” decision provide suboptimal values of the desired criterion. These deviations can be due to uncertainty in model formulation, assumed parameter values, etc.  
 
sensitivity analysis
See also robustness analysis.
An exploration, often by numerical (rather than analytical) means, of how model outputs (particularly QOIs) are affected by changes in the inputs (parameter values, assumptions, etc.).  
 
simulation
Synonym: model
See also Monte Carlo simulation.
The execution of a computer code to mimic an actual system. Many uncertainty quantification (UQ) methods use an ensemble of simulations, or model runs, to construct emulators, carry out sensitivity analysis, etc.
 
solution verification
See also verification, code verification.
The process of determining as completely as possible the accuracy with which the algorithms solve the mathematical-model equations for a specified QOI.  
 
standard deviation See also variance. The square root of the variance of a distribution.j  

stochastic
See also probability.
Pertaining to a sequence of observations, each of which can be considered to be a sample from a probability distribution. Often informally used as a synonym of “probabilistic.”
 
subjective probability
See also probability elicitation.
Expert judgment about uncertain events or quantities, in the form of probability statements about future events. It is not based on any precise computation but is often a reasonable assessment by a knowledgeable person.  
 
uncertainty
See also probability, aleatoric probability, epistemic uncertainty.
The condition of being unsure about something; a lack of assurance or conviction.c For the purpose of this report, uncertainty is often described regarding a QOI of the true, physical system. This uncertainty depends on a model-based prediction, as well as on other information included in the VVUQ assessment. This uncertainty can be described using probability.
 
uncertainty quantification (UQ) The process of quantifying uncertainties in a computed QOI, with the goals of accounting for all sources of uncertainty and quantifying the contributions of specific sources to the overall uncertainty. More broadly, UQ can be thought of as the field of research that uses and develops theory, methodology, and approaches for carrying out inference, with the aid of computational models, on complex systems.
 
validation The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.p  
 
variance
See also standard deviation.
The second central moment of a probability distribution, defined as E[(X - µ)²], where µ is the first moment (mean) of the random variable X. The variance is a common measure of variability around the mean of a distribution. Its square root, the standard deviation, which has the same units as the random variable, is a more intuitively meaningful measure of dispersion about the mean.
 
verification
See also code verification, solution verification.
The process of determining whether a computer program (“code”) correctly solves the mathematical-model equations. This includes code verification (determining whether the code correctly implements the intended algorithms) and solution verification (determining the accuracy with which the algorithms solve the mathematical-model equations for specified QOIs).  


a Society for Risk Analysis (SRA), Glossary of Risk Analysis Terms. Available at sra.org/resources_glossary.php.

b Cornell LCS Statistics Laboratory. See http://instruct1.cit.cornell.edu:8000/courses/statslab/Stuff/indes.php.

c American Heritage Dictionary. 2000. Boston: Houghton Mifflin.

d Glossary of Statistics Terms. Available at http://www.stat.berkeley.edu/users/stark/SticiGui/Text/gloss.htm.

e Statistical Education Through Problem Solving [STEP] Consortium. Available at http://www.stats.gla.ac.uk/steps/index.html.

f W. Feller. 1968. An Introduction to Probability Theory and Its Applications. New York, N.Y.: Wiley.

g J.L. Devore. 2000. Probability and Statistics for Engineering and the Sciences. Pacific Grove, Calif.: Duxbury Press.


h DOD (Department of Defense). 2009. Instruction 5000.61. December 9. Washington, D.C.

i A.W.F. Edwards. 1992. Likelihood. Baltimore, Md.: Johns Hopkins University Press.

j S.M. Ross. 2000. Introduction to Probability Models. New York: Academic Press.

k Duke University. 1998. Statistical and Data Analysis for Biological Sciences. Available at http://www.isds.duke.edu/courses/Fall98/sta210b/terms.html.

l R. Aris. 1995. Mathematical Modelling Techniques. New York: Dover.

m E.J. Henley and H. Kumamoto. 1981. Reliability Engineering and Risk Assessment. Upper Saddle River, N.J.: Prentice-Hall.

n J.H. Friedman. 1991. Multivariate Adaptive Regression Splines. The Annals of Statistics 19(1):1-67.

o M.S. Meyer and J.M. Booker. 1998. Eliciting and Analyzing Expert Judgment. LA-UR-99-1659. Los Alamos, N.Mex.: Los Alamos National Laboratory.

p American Institute for Aeronautics and Astronautics. 1998. Guide for the Verification and Validation of Computational Fluid Dynamics Simulations. Reston, Va.: American Institute for Aeronautics and Astronautics.


