5 Next Steps

The Workshop on Statistical Methods in Software Engineering for Defense Systems, as is typical of workshops, had necessarily limited goals. Presentations reviewed a group of mainly statistically oriented techniques in software engineering that have been shown to be useful in industrial settings but that to date have not been widely adopted by the defense test and evaluation community (though there have been many initial efforts, as suggested above). The goal therefore was to make the defense test and evaluation community more aware of these techniques and their potential benefits.

As was pointed out in the introduction, it is well recognized that the software produced for software-intensive defense systems often is deficient in quality, costs too much, and is delivered late. Addressing these problems will likely require a broad effort, including, but not limited to, greater utilization of the many well-established techniques in software engineering that are widely implemented in industry. Some of the more statistically oriented techniques were described at the workshop, though the workshop should not be considered to have provided a comprehensive review of techniques in software engineering. Methods were described in three areas: (1) development of higher-quality requirements, (2) software testing, and (3) evaluation of software development and performance.

While it seems clear that the methods described at the workshop, if utilized, would generally improve defense software development, the costs of adopting many of the techniques need to be better understood in
defense applications, as well as the magnitude of the benefits from their use, prior to widespread implementation. The panel also recognizes that Department of Defense (DoD) systems and DoD acquisition are somewhat different from typical industrial systems and development procedures: DoD systems tend to be larger and more complex, and the acquisition process involves specific milestones and stages of testing. Acceptance of any new techniques will therefore depend on their being "proven" in a slightly different environment, and on surmounting comfort with the status quo.

It also seems safe to assume, given the widespread problems with software development in the defense test and evaluation community, that this community has limited access to software engineers familiar with state-of-the-art techniques. If some of the techniques described in this report turn out to be useful in the arena of defense test and evaluation, their widespread adoption will be greatly hampered if current staff have little familiarity with these newer methods. Therefore, the adoption of the techniques described in this report (as well as other related techniques) needs to be supported by greater in-house expertise at the service test agencies and at other similar agencies in the defense test and acquisition community. The hope is to develop, either within or closely affiliated with each service test agency, access to state-of-the-art software engineering expertise.

The following two recommendations address the foregoing concerns by advocating that case studies be undertaken to clearly demonstrate the costs and benefits associated with greater use of these techniques with defense systems.
Recommendation 1: Given the current lack of implementation of state-of-the-art methods in software engineering in the service test agencies, initial steps should be taken to develop access to state-of-the-art software engineering expertise, either in-house or through a closely affiliated relationship, in the operational or developmental service test agencies.

Such expertise could be acquired in several ways, including internships for doctoral students and postdoctorates at the test and evaluation agencies, and sabbaticals for test and evaluation agency staff at industry sites where state-of-the-art techniques are developed and used.

In addition, the sessions on data analysis to assess the performance of a
software system and the software development process demonstrated the broad value of analyzing data collected on both the system development process and system performance, drawn from both tests and operational use. Clearly, a prerequisite to these and other valuable analyses is the collection of, and facilitated access to, this information in a software data archive. The service test agencies can collect information on system performance. However, collection of data on software system development (e.g., data that would support either the experience factory or orthogonal defect classification, as well as data on costs, defects, and various kinds of metadata) will require specification of this additional data collection in acquisition contracts. This would go well beyond collection of problem reports, in that substantial environmental information would also be collected to support an understanding of the source of any defects.

Recommendation 2: Each service's operational or developmental test agency should routinely collect and archive data on software performance, including test performance data and data on field performance. The data should include fault types, fault times and frequencies, turnaround rate, use scenarios, and root cause analysis. Also, software acquisition contracts should include requirements to collect such data.

A session at the workshop was devoted to the development of consistent, complete, and correct requirements. Two specific methods were described, which have been shown in industry and in selected defense applications to be extremely helpful in identifying errors in requirement specifications. Though only two methods were presented, there are competing methods related to those described. There is every reason to expect that widespread adoption of the methods presented, or of their competitors, would have important benefits for developing higher-quality defense software systems.
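The kind of error such requirements-checking methods catch can be illustrated with a small sketch. The toy checker below is not one of the tools presented at the workshop; it only illustrates, under an invented condition-table representation of requirements, the two analyses such tools automate: consistency (no two requirements demand behavior in the same situation) and completeness (some requirement covers every situation).

```python
from itertools import product

# Hypothetical boolean inputs of the system, each with its domain.
inputs = {"altitude_low": (False, True), "engine_on": (False, True)}

# Hypothetical requirement rows: (name, guard over the inputs, required action).
rows = [
    ("R1", lambda s: s["altitude_low"] and s["engine_on"], "level_off"),
    ("R2", lambda s: s["altitude_low"] and not s["engine_on"], "alarm"),
    ("R3", lambda s: not s["altitude_low"], "no_op"),
]

def check(rows, inputs):
    """Enumerate the whole input space; report overlapping and uncovered cases."""
    names, domains = zip(*inputs.items())
    inconsistent, gaps = [], []
    for values in product(*domains):
        state = dict(zip(names, values))
        matching = [name for name, guard, _ in rows if guard(state)]
        if len(matching) > 1:   # two requirements fire at once: inconsistency
            inconsistent.append((state, matching))
        elif not matching:      # no requirement specifies behavior: gap
            gaps.append(state)
    return inconsistent, gaps

inconsistent, gaps = check(rows, inputs)
print("inconsistent cases:", inconsistent)  # empty: rows are disjoint
print("uncovered cases:", gaps)             # empty: rows cover every case
```

Exhaustive enumeration works only for toy input spaces; the tools described at the workshop perform this analysis symbolically, which is what makes them usable on a relatively complex defense software-intensive system.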
However, some training costs would be incurred with a change to these procedures. Therefore, pilot tests should be undertaken to better understand the benefits and costs of implementation.

Recommendation 3: Each service's operational or developmental test agency should evaluate the advantages of the use of state-of-the-art procedures to check the specification of requirements for a relatively complex defense software-intensive system.
One effective way of carrying out this evaluation would be to develop specifications in parallel, using both a state-of-the-art procedure and the service's current procedures, so that quality-metric comparisons can be made. Each service would select a software system that (1) requires field configuration, (2) has the capabilities to adapt and evolve to serve future needs, and (3) has both hardware and software that come from multiple sources.

Another session at the workshop was devoted to model-based testing techniques. Clearly, both Markov chain usage testing and combinatorial design-based testing are promising for DoD applications and are likely to result in reduced software development time and higher-quality software systems. More generally, model-based testing is an approach that could provide great advantages in testing defense software systems. There are other innovative testing strategies that are not model-based but are instead code-based, such as data flow testing (Rapps and Weyuker, 1985). To better understand the extent to which these methods can be effective, using current DoD personnel, on DoD software-intensive systems, pilot projects should be carried out, in each service, to evaluate their benefits and costs.

Recommendation 4: Each service's operational or developmental test agency should undertake a pilot study in which two or more testing methods, including one model-based technique, are used in parallel for several software-intensive systems throughout development to determine the amount of training required, the time needed for testing, and each method's effectiveness in identifying software defects. This study should be initiated early in system development.

Recommendations 3 and 4 propose that each of the services select software-intensive systems as case studies to evaluate new techniques.
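The first of the model-based techniques named above, Markov chain usage testing, can be sketched briefly. The usage model below is entirely hypothetical (states and transition probabilities invented for illustration): a test case is a random walk through the model from invocation to termination, so a generated test suite exercises the system in proportion to expected operational use.

```python
import random

# Hypothetical usage model: each state maps to its possible successor
# states with their estimated operational-use probabilities.
usage_model = {
    "invoke": [("browse", 0.7), ("search", 0.3)],
    "browse": [("browse", 0.4), ("search", 0.2), ("exit", 0.4)],
    "search": [("browse", 0.5), ("exit", 0.5)],
}

def generate_test(model, rng, start="invoke", stop="exit"):
    """One test case = one random walk from invocation to termination."""
    state, path = start, [start]
    while state != stop:
        successors, weights = zip(*model[state])
        state = rng.choices(successors, weights=weights)[0]
        path.append(state)
    return path

rng = random.Random(42)  # fixed seed makes the generated suite reproducible
suite = [generate_test(usage_model, rng) for _ in range(5)]
for case in suite:
    print(" -> ".join(case))
```

In a pilot study of the kind Recommendation 4 describes, suites generated this way would be run alongside the service's current test procedures, with defects found and effort expended recorded for comparison.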
By conducting such case studies, test and evaluation officials can judge both the costs of training engineers to use the new techniques (the "fixed costs" of widespread implementation) and the associated benefits from their use in the DoD acquisition environment. Knowledge of these costs and benefits will aid DoD in deciding whether to implement these techniques more broadly.

There are generally insufficient funds in the budgets of the service test agencies to support the actions we recommend. Also, given their priorities, it is very unlikely that program managers will be interested in funding these pilot projects. Thus, support will be needed from the services themselves or from DoD in order to go forward. The committee suggests that DoD support these projects out of general research and development funds.

Recommendation 5: DoD should allocate general research and development funds to support pilot and demonstration projects of the sort recommended in this report in order to identify methods in software engineering that are effective for defense software systems in development.

The panel notes that constraints external to the technical considerations of the costs and benefits of the techniques themselves hinder DoD from imposing specific state-of-the-art techniques in software engineering and development on its contractors. However, these restrictions do not preclude the imposition of a framework, such as the capability maturity model.

Recommendation 6: DoD needs to examine the advantages and disadvantages of methods for obligating software developers under contract to DoD to use state-of-the-art methods for requirements analysis and software testing in particular, and software engineering and development more generally.

The techniques discussed in this report are consistent with these conditions and constraints. We also note that many of the techniques described in this report are both system oriented and based on behavior, and are therefore applicable to both hardware and software components, which is an important advantage for DoD systems.