ship—in which sometimes even minor changes to up-front commitments may necessitate amendments to contracts and adjustments in costing—can create barriers to the effective and timely sharing of information that would help the customer reach accurate assurance judgments efficiently. Additionally, it can be difficult to create incentives for the appropriate use of preventive measures such as those referenced in this chapter.

In this chapter the committee first considers the trends related to the challenges of software assurance. It then offers a concise conceptual framework for certain software assurance issues. Finally, it identifies significant technical opportunities and potential future challenges to improving our ability to provide assurance. (Some of these are elaborated in Chapter 5.)

Failures in software assurance can be of particularly high consequence for defense systems because of their roles in protecting human lives, in warfighting, in safeguarding national assets, and in other pivotal roles. The probability of failure can also be high, due to the frequent combination of scale, innovative character, and diversity of sourcing in defense systems. Unless exceptional attention is devoted to assurance, a high level of risk derives from this combination of high consequence and high likelihood.

Assurance considerations also relate to progress tracking, as discussed in Chapter 2—assessment of readiness for operational evaluation and release is based not just on progress in building a system, but also on progress in achieving developmental assurance. Additionally, the technologies and practices used to achieve assurance may also contribute useful metrics to guide process decision making.

Assurance Is a Judgment

Software assurance is a human judgment of fitness for use. In practice, assurance judgments are based on application of a broad range of techniques that include both preventive and evaluative methods and that are applied throughout a software engineering process. Indeed, for modern systems, and not just critical systems, the design of a software process is driven not only by issues related to engineering risk and uncertainty, but also, in a fundamental way, by quality considerations.3 These, in turn, are driven by systems risks—hazards—as described in Chapter 2 and also in Box 4.1 (cybersecurity).

An important reality of defense software assurance is the need to achieve safety: in war, there are individual engagements where lives are at stake and where software is the deciding factor in the outcome. In many life-and-death situations, the proper overriding assurance criterion may not be optimum performance but rather the "minimization of maximum regret." The challenge is exacerbated by the fact that, while a full-scale operational test of many capabilities may not be feasible, assurance must nonetheless be achieved. This applies, for example, to certain systems that support strategic defense and disaster mitigation. The committee notes, however, that there are great benefits in architecting systems and structuring requirements such that capabilities that would otherwise be reserved for rare "emergencies" are also used in an ongoing mode for more routine operations. This yields operational feedback and user familiarity. It also permits iterative development and deployment of the kind familiar to users of many evolving commercial online services.
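The "minimization of maximum regret" criterion mentioned above can be sketched numerically. In the minimax-regret rule from decision theory, one compares each candidate action against the best achievable outcome in every scenario and chooses the action whose worst-case shortfall is smallest. The action names and loss values below are invented purely for illustration:

```python
# Illustrative sketch of the minimax-regret criterion. Rows are
# candidate actions, columns are scenarios; entries are hypothetical
# loss values (not drawn from any real defense system).

losses = {
    "engage":   [1, 9],   # loss under scenario A, scenario B
    "hold":     [4, 5],
    "withdraw": [8, 2],
}

def minimax_regret(losses):
    """Return the action whose worst-case regret is smallest."""
    n_scenarios = len(next(iter(losses.values())))
    # Best achievable loss in each scenario, over all actions.
    best = [min(row[s] for row in losses.values()) for s in range(n_scenarios)]
    # Regret = how much worse an action does than the best possible
    # choice for that scenario; pick the action that minimizes its
    # maximum regret across scenarios.
    return min(
        losses,
        key=lambda a: max(losses[a][s] - best[s] for s in range(n_scenarios)),
    )

print(minimax_regret(losses))  # -> "hold"
```

Note the contrast with performance optimization: "engage" is best if scenario A occurs, but "hold" never falls more than 3 units short of the best achievable outcome in either scenario, so the regret-minimizing decision maker prefers it.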

Another reality of defense software that affects assurance is that it is developed by contractors working at arm's length from the DoD. This means, for example, that the information sharing necessary for assessing and achieving assurance must be negotiated explicitly.

There are many well-publicized examples of major defense systems exhibiting operational failures of various kinds that are, evidently, consequences of inadequate assurance practices. A recent example of this type of top-level systems engineering issue was the failure of an F-22 flight management system when the aircraft was flown across the international date line for the first time en route from Hawaii to Japan. In a CNN interview, Maj. Gen. Don Sheppard (ret.) said, "At the international date line, whoops, all systems dumped and when I say all systems, I mean all systems, their navigation, part of their communications,


3 Michael Howard and Steve Lipner, 2006, The Security Development Lifecycle, Redmond, WA: Microsoft Press. See also Box 2.3.

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001