
The National Academies of Sciences, Engineering, and Medicine
500 Fifth St. N.W. | Washington, D.C. 20001

Copyright © National Academy of Sciences. All rights reserved.


Executive Summary

Recent rough estimates are that the U.S. Department of Defense (DoD) spends at least $38 billion a year on the research, development, testing, and evaluation of new defense systems; approximately 40 percent of that cost, at least $16 billion, is spent on software development and testing. There is widespread understanding within DoD that the effectiveness of software-intensive defense systems is often hampered by low-quality software as well as increased costs and late delivery of software components. Given the costs involved, even relatively incremental improvements to the software development process for defense systems could represent a large savings in funds. And given the importance of producing defense software that will carry out its intended function, relatively small improvements to the quality of defense software systems would be extremely important to identify.

DoD software engineers and test and evaluation officials may not be fully aware of a range of available techniques, because of both the recent development of these techniques and their origination from an orientation somewhat removed from software engineering, i.e., from a statistical perspective. The panel's charge therefore was to convene a workshop to identify statistical software engineering techniques that could have applicability to DoD systems in development. It was in response both to previous reports that identified a need for more attention to software and to the emergence of techniques that have been widely applied in industry and have demonstrated impressive benefits in the reduction of errors as well as testing costs and time.

The panel structured its task consistent with a statistical approach to collecting information on system development. This approach has three stages: (1) determination of complete, consistent, and correct requirements; (2) selection of an effective experimental design to guide data collection on system performance; and (3) evaluation of system performance, consistent with requirements, using the data collected. The intent was to focus on methods that had been found useful in industrial applications. Some of the issues selected for study were suggested by DoD software experts as specific areas to address that would make the workshop relevant to their concerns and needs.

The panel's work was not intended to be comprehensive in identifying techniques that could be of use to DoD in software system development and for software engineering, or even for statistically oriented software engineering. Without question, a number of very effective techniques in software engineering were not represented at the workshop nor considered by the panel.

The workshop sessions on specification of requirements focused on software cost reduction and sequence-based software specification. The sessions on test design addressed Markov chain usage testing and combinatorial design testing. Presentations on software evaluation discussed risk assessment, cost modeling, models of software process improvement, and fault-tolerant software. The presentations on software evaluation suggested the great variety of issues that can be addressed through data analysis and, therefore, the value of collecting and analyzing data on system performance, both during system development and after fielding a system.

After the workshop, the oversight committee interacted mainly by e-mail and, to a lesser extent, by telephone and in person in small groups of two or three.
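As an illustration of the combinatorial design testing addressed in those sessions, a pairwise design exercises every pair of factor levels while requiring far fewer runs than exhaustive testing. The sketch below is illustrative only: the three-factor configuration space and the six-run suite are hypothetical, not drawn from any DoD system.

```python
from itertools import combinations, product

# Hypothetical configuration space: three factors with 3, 2, and 2
# levels. Exhaustive testing needs 3 * 2 * 2 = 12 runs; a pairwise
# design covers every pair of factor levels in only 6 runs.
FACTORS = {
    "os":       ["linux", "windows", "vxworks"],
    "network":  ["wired", "radio"],
    "database": ["local", "remote"],
}

def uncovered_pairs(suite, factors):
    """Return the factor-level pairs not exercised by any test in suite."""
    names = list(factors)
    required = set()
    for a, b in combinations(names, 2):
        for va, vb in product(factors[a], factors[b]):
            required.add(((a, va), (b, vb)))
    for test in suite:
        for a, b in combinations(names, 2):
            required.discard(((a, test[a]), (b, test[b])))
    return required

# A 6-run pairwise suite (versus 12 exhaustive runs).
PAIRWISE_SUITE = [
    {"os": "linux",   "network": "wired", "database": "local"},
    {"os": "linux",   "network": "radio", "database": "remote"},
    {"os": "windows", "network": "wired", "database": "remote"},
    {"os": "windows", "network": "radio", "database": "local"},
    {"os": "vxworks", "network": "wired", "database": "local"},
    {"os": "vxworks", "network": "radio", "database": "remote"},
]

# prints "uncovered pairs: 0" -- every pair of levels is covered
print("uncovered pairs:", len(uncovered_pairs(PAIRWISE_SUITE, FACTORS)))
```

The benefit grows quickly with the number of factors: published pairwise designs cover dozens of factors in a handful of runs, which is the source of the testing-cost reductions reported at the workshop.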
Informed and primarily motivated by the workshop's presentations, but augmented by these subsequent interactions, the oversight committee decided to issue recommendations for further efforts in this area.

Recommendation 1: Given the current lack of implementation of state-of-the-art methods in software engineering in the service test agencies, initial steps should be taken to develop access, either in-house or through a closely affiliated relationship, to state-of-the-art software engineering expertise in the operational or developmental service test agencies.

Such expertise could be acquired in part in several ways, especially including internships for doctoral students and postdoctorates at the test and evaluation agencies, and sabbaticals for test and evaluation agency staff at industry sites at which state-of-the-art techniques are developed and used.

Recommendation 2: Each service's operational or developmental test agency should routinely collect and archive data on software performance, including test performance data and data on field performance. The data should include fault types, fault times and frequencies, turnaround rate, use scenarios, and root cause analysis. Also, software acquisition contracts should include requirements to collect such data.

Recommendation 3: Each service's operational or developmental test agency should evaluate the advantages of the use of state-of-the-art procedures to check the specification of requirements for a relatively complex defense software-intensive system. One effective way of carrying out this evaluation would be to develop specifications in parallel, using the service's current procedures, so that quality metric comparisons can be made. Each service would select a software system that: (1) requires field configuration, (2) has the capabilities to adapt and evolve to serve future needs, and (3) has both hardware and software that come from multiple sources.

Recommendation 4: Each service's operational or developmental test agency should undertake a pilot study in which two or more testing methods, including one model-based technique, are used in parallel for several software-intensive systems throughout development to determine the amount of training required, the time needed for testing, and the method's effectiveness in identifying software defects. This study should be initiated early in system development.
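Recommendation 4 calls for at least one model-based testing technique; Markov chain usage testing, one of the test-design methods presented at the workshop, is a natural candidate. In that approach, an operational usage model assigns probabilities to user actions, and test sequences are drawn from the model so that testing effort mirrors expected field use. The sketch below is a minimal illustration: the states and transition probabilities describe a hypothetical login-and-query application, not any actual defense system.

```python
import random

# Hypothetical usage model for a login-and-query application.
# Each state maps to (next_state, probability) pairs, forming a
# Markov chain over user-visible operations.
USAGE_MODEL = {
    "start":  [("login", 1.0)],
    "login":  [("query", 0.7), ("logout", 0.3)],
    "query":  [("query", 0.5), ("logout", 0.5)],
    "logout": [("end", 1.0)],
}

def generate_test_sequence(model, rng, start="start", end="end"):
    """Random walk through the usage model, yielding one test sequence."""
    sequence, state = [], start
    while state != end:
        next_states, weights = zip(*model[state])
        state = rng.choices(next_states, weights=weights)[0]
        if state != end:
            sequence.append(state)
    return sequence

# Draw a small suite of usage-weighted test sequences.
rng = random.Random(42)
for _ in range(5):
    print(" -> ".join(generate_test_sequence(USAGE_MODEL, rng)))
```

Because sequences are sampled in proportion to expected usage, reliability estimates computed from the test results carry over to field operation, which is what distinguishes this technique from coverage-driven testing.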
Recommendations 3 and 4 propose that each of the services select software-intensive systems as case studies to evaluate new techniques. By doing so, test and evaluation officials can judge both the costs of training engineers to use the new techniques (the "fixed costs" of widespread implementation) and the associated benefits from their use in the DoD acquisition environment. Knowledge of these costs and benefits will aid DoD in deciding whether to implement these techniques more broadly. There are insufficient funds in individual programs or systems under development to support the actions we recommend. Thus, support will be needed from the services themselves or from the Department of Defense.

Recommendation 5: DoD should allocate general research and development funds to support pilot and demonstration projects of the sort recommended in this report in order to identify methods in software engineering that are effective for defense software systems in development.

The panel notes that constraints hinder DoD from imposing on its contractors specific state-of-the-art techniques in software engineering and development that are external to the technical considerations of the costs and benefits of the implementation of the techniques themselves. However, the restrictions do not preclude the imposition of a framework, such as the capability maturity model.

Recommendation 6: DoD needs to examine the advantages and disadvantages of the use of methods for obligating software developers under contract to DoD to use state-of-the-art methods for requirements analysis and software testing, in particular, and software engineering and development more generally.

The techniques discussed in this report are consistent with these conditions and constraints. We also note that many of the techniques described in this report are both system oriented and based on behavior and are therefore applicable to both hardware and software components, which is an important advantage for DoD systems.