For example, the DOE provides a suggested set of goals, principles, and guidelines for software quality practices (DOE, 2000). The use of these practices can be tailored to the development environment and application area; for instance, pervasive use of software configuration management and regression testing is the state of practice in many scientific communities.


Code verification is the process of determining whether a computer program (“code”) correctly implements the intended algorithms. Various tools for code verification and techniques that employ them have been proposed (Roache, 1998, 2002; Knupp and Salari, 2003; Babuska, 2004). The application of these processes is becoming more prevalent in many scientific communities. For example, the computer model employed in the electromagnetics case study described in Section 4.5 uses carefully verified simulation techniques. Tools for code verification include, but are not limited to, comparisons against analytic and semianalytic (i.e., independent, error-controlled) solutions and the “method of manufactured solutions.” The latter refers to postulating a solution in the form of a function and substituting it into the operator characterizing the mathematical model in order to obtain (domain and boundary) source terms for the model equations. The postulated function is then an exact solution for a model that is driven by these source terms (Knupp and Salari, 2003; Oden, 2003).
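As a concrete illustration, the source-term construction at the heart of the method of manufactured solutions can be sketched symbolically. The operator and postulated solution below are illustrative choices, not examples taken from this report.

```python
import sympy as sp

# A minimal sketch of the method of manufactured solutions for the
# one-dimensional Poisson operator L[u] = -u''(x); both the operator
# and the manufactured solution are illustrative assumptions.
x = sp.symbols("x")

# Step 1: postulate ("manufacture") a smooth solution.
u = sp.sin(sp.pi * x)

# Step 2: apply the model operator to obtain the source term f such
# that u is an exact solution of -u'' = f.
f = sp.simplify(-sp.diff(u, x, 2))

print(f)  # pi**2*sin(pi*x)
```

The code driven by this source term should then reproduce the manufactured solution up to discretization error, which is the quantity a verification study examines.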

The comparison of code results against independent, error-controlled (“reference”) solutions allows researchers to assess, for example, the degree to which the code implementation achieves the expected solution to the mathematical system of equations. Because the reference solution is exact and the code implements a numerical approximation to the exact solution, one can test convergence rates against those predicted by theory. As a separate, complementary activity one can often construct a reference solution to the discretized problem, providing an independent solution of the computational model (not the mathematical model). This verification activity allows assessment of correctness in the pre-asymptotic regime. To make such reference solutions mathematically tractable, researchers typically choose simplified model problems (e.g., ones with lower dimensionality or with simplified physics and geometry). However, there are few analytical and semianalytical solutions for more complex problems. There is a need to develop independent, error-controlled solutions for increasingly complex systems of equations—for example, those that represent coupled-physics, multiscale, nonlinear systems. Developing such solutions that are relevant to a given application area is particularly challenging. Similarly, developing manufactured solutions becomes more challenging as the mathematical models become more complex, since the number of terms in the source expressions grows in size and complexity, requiring great care in managing the source terms and incorporating them into the model.
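Testing a convergence rate against theory can be sketched as follows. The model problem (-u'' = f on the unit interval with homogeneous Dirichlet boundary conditions and exact solution sin(pi*x)) and the second-order finite-difference scheme are illustrative assumptions; the point is the comparison of the observed order of accuracy with the theoretical one.

```python
import numpy as np

def solve(n):
    """Solve -u'' = f on n interior points with the standard 3-point stencil."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)  # source for exact solution sin(pi*x)
    # Tridiagonal matrix discretizing -d2/dx2 with Dirichlet boundaries.
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f)

def max_error(n):
    x, u = solve(n)
    return np.max(np.abs(u - np.sin(np.pi * x)))

# Roughly halve h and estimate the observed order p from e(h) ~ C*h**p.
e1, e2 = max_error(32), max_error(64)
p = np.log(e1 / e2) / np.log((1.0 / 33) / (1.0 / 65))
print(round(p, 2))  # close to the theoretical second-order rate
```

An observed order that falls short of the theoretical one under grid refinement is a classic symptom of an implementation error, which is exactly what this style of test is designed to expose.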

Another challenge is the need to construct manufactured solutions that expose different features of the model relevant to the simulation of physical systems—for example, different boundary conditions, geometries, phenomena, nonlinearities, or couplings. The method of manufactured solutions is employed in the verification methodology used in the Center for Predictive Engineering and Computational Sciences (PECOS) study described in Section 5.9. Finally, as indicated in Section 5.9, manufactured or analytical solutions should reproduce known challenging features of the solution, such as boundary layers, effects of interfaces, anisotropy, singularities, and loss of regularity.
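A manufactured solution can be designed to contain such a challenging feature deliberately. In the illustrative sketch below (the advection-diffusion operator and the postulated solution are assumptions for demonstration, not drawn from this report), the solution has a sharp boundary layer of width proportional to eps near x = 1, and the source term follows by applying the operator.

```python
import sympy as sp

# Sketch: manufacture a solution with a known challenging feature
# (a boundary layer) for the advection-diffusion operator
# L[u] = -eps*u'' + u'. Operator and solution are illustrative.
x, eps = sp.symbols("x eps", positive=True)

# Postulated solution with a layer of width ~eps near x = 1,
# satisfying u(0) = u(1) = 0.
u = x - (sp.exp((x - 1) / eps) - sp.exp(-1 / eps)) / (1 - sp.exp(-1 / eps))

# Apply the operator to obtain the manufactured source term.
f = sp.simplify(-eps * sp.diff(u, x, 2) + sp.diff(u, x))
print(f)  # the layer terms cancel: f simplifies to 1
```

Running a code on this problem at moderate eps probes whether its discretization resolves, or spuriously oscillates across, the boundary layer.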

Some communities employ cross-code comparisons (in which different codes solve the same discretized system of partial differential equations [PDEs]) and refer to this practice as verification. Although cross-code comparison provides valuable information under certain conditions and can be useful for ensuring accuracy and correctness, it is not “verification” as the term is used in this report; often the reference codes being compared are not themselves verifiable. A significant challenge in cross-code comparisons is ensuring that the codes are modeling identical problems: codes tend to vary in the effects that they include and in the way that those effects are included, so it may be difficult to simulate identical physics processes and problems. The same problem must be modeled in each code, and ideally the reference solution is obtained using a distinct, error-controlled numerical technique.
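The value of a comparison against a distinct, error-controlled technique (rather than a second copy of the same discretization) can be sketched on a toy problem. Both solvers below are illustrative assumptions: a finite-difference solve and an independent sine-spectral solve of the same Poisson problem. Agreement supports, but does not by itself establish, the correctness of either implementation.

```python
import numpy as np

# Toy problem: -u'' = pi**2*sin(pi*x) on (0, 1), u(0) = u(1) = 0.
n = 63
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.pi**2 * np.sin(np.pi * x)

# Technique 1: second-order finite differences.
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
u_fd = np.linalg.solve(A, f)

# Technique 2 (independent): expand f in discrete sine modes; each
# mode k of -d2/dx2 has eigenvalue (k*pi)**2, so divide the
# coefficients by the eigenvalues and transform back.
k = np.arange(1, n + 1)
S = np.sin(np.pi * np.outer(x, k))      # sine basis sampled on the grid
coeffs = (2.0 * h) * S.T @ f            # discrete sine coefficients of f
u_spec = S @ (coeffs / (k * np.pi)**2)  # invert the operator mode by mode

print(np.max(np.abs(u_fd - u_spec)))    # small; limited by each method's error
```

Because the two techniques share neither a stencil nor a code path, a common bug is unlikely to cancel in the comparison—this is the property that cross-comparison with an unverified sibling code lacks.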

Upon completion of a code-verification study, a statement can be made about the correctness of the implementation of the intended algorithms in the computer program under the conditions imposed by the study (e.g., selected QOIs, initial and boundary conditions, geometry, and other inputs). To ensure that the code continues

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.