Suggested Citation:"5 Next Steps." National Research Council. 2003. Innovations in Software Engineering for Defense Systems. Washington, DC: The National Academies Press. doi: 10.17226/10809.

5 Next Steps

The Workshop on Statistical Methods in Software Engineering for Defense Systems, as is typical of workshops, had necessarily limited goals. Presentations reviewed a group of mainly statistically oriented techniques in software engineering that have been shown to be useful in industrial settings but that to date have not been widely adopted by the defense test and evaluation community (though there have been many initial efforts, as suggested above). The goal therefore was to make the defense test and evaluation community more aware of these techniques and their potential benefits.

As was pointed out in the introduction, it is well recognized that the software produced for software-intensive defense systems often is deficient in quality, costs too much, and is delivered late. Addressing these problems will likely require a broad effort, including, but not limited to, greater utilization of many well-established techniques in software engineering that are widely implemented in industry. Some of the more statistically oriented techniques were described at the workshop, though the workshop should not be considered to have provided a comprehensive review of techniques in software engineering. Methods were described in three areas: (1) development of higher-quality requirements, (2) software testing, and (3) evaluation of software development and performance.

While it seems clear that the methods described at the workshop, if utilized, would generally improve defense software development, the costs of adopting many of the techniques in defense applications, as well as the magnitude of the benefits from their use, need to be better understood prior to widespread implementation. The panel also recognizes that Department of Defense (DoD) systems and DoD acquisition are somewhat different from typical industrial systems and development procedures: DoD systems tend to be larger and more complex, and the acquisition process involves specific milestones and stages of testing. Acceptance of any new techniques will therefore depend on their being "proven" in a slightly different environment, and on surmounting comfort with the status quo.

It also seems safe to assume, given the widespread problems with software development in the defense test and evaluation community, that this community has limited access to software engineers familiar with state-of-the-art techniques. If some of the techniques described in this report turn out to be useful in the arena of defense test and evaluation, their widespread adoption will be greatly hampered if the current staff has little familiarity with these newer methods. Therefore, the adoption of the techniques described in this report (as well as other related techniques) needs to be supported by greater in-house expertise at the service test agencies and at other similar agencies in the defense test and acquisition community. The hope is to develop, either within or closely affiliated with each service test agency, access to state-of-the-art software engineering expertise.

The following two recommendations address the foregoing concerns by advocating that case studies be undertaken to clearly demonstrate the costs and benefits associated with greater use of these techniques with defense systems.

Recommendation 1: Given the current lack of implementation of state-of-the-art methods in software engineering in the service test agencies, initial steps should be taken to develop state-of-the-art software engineering expertise, either in-house or through a closely affiliated relationship, in the operational or developmental service test agencies.

Such expertise could be acquired, at least in part, in several ways, especially including internships for doctoral students and postdoctoral researchers at the test and evaluation agencies, and sabbaticals for test and evaluation agency staff at industry sites where state-of-the-art techniques are developed and used.

In addition, the sessions on data analysis to assess the performance of a software system and the software development process demonstrated the broad value of analyzing data collected on both the system development process and system performance, both from tests and from operational use. Clearly, a prerequisite to these and other valuable analyses is the collection of, and facilitated access to, this information in a software data archive. The service test agencies can collect information on system performance. However, collection of data on software system development (e.g., data that would support either the experience factory approach or orthogonal defect classification, as well as data on costs, defects, and various kinds of metadata) will require specification of this additional data collection in acquisition contracts. This would go well beyond collection of problem reports, in that substantial environmental information would also be collected to support an understanding of the source of any defects.

Recommendation 2: Each service's operational or developmental test agency should routinely collect and archive data on software performance, including test performance data and data on field performance. The data should include fault types, fault times and frequencies, turnaround rate, use scenarios, and root cause analysis. Also, software acquisition contracts should include requirements to collect such data.
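By way of illustration, the minimal sketch below (in Python) shows the kind of record such an archive might hold, with one field for each item named in Recommendation 2. The field names and structure are invented for illustration; they are not a schema prescribed by this report or by DoD.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FaultRecord:
    """One archived software fault; all field names are illustrative only."""
    system: str                # system or software component under test
    phase: str                 # e.g., developmental test, operational test, fielded use
    fault_type: str            # a defect classification, e.g., an ODC-style type
    detected_at: datetime      # fault time; supports fault frequency and reliability analysis
    use_scenario: str          # the usage scenario or operational profile in effect
    turnaround_days: float     # time from problem report to verified fix
    root_cause: str = ""       # completed after root cause analysis
    environment: dict = field(default_factory=dict)  # build, configuration, other metadata
```

Archiving records of this kind in a consistent form across programs is what would permit the experience factory and orthogonal defect classification analyses described earlier.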

A session at the workshop was devoted to the development of consistent, complete, and correct requirements. Two specific methods were described, which have been shown in industry and in selected defense applications to be extremely helpful in identifying errors in requirements specifications. Though only two methods were presented, there are competing methods related to those described. There is every reason to expect that widespread adoption of the methods presented, or their competitors, would have important benefits for developing higher-quality defense software systems. However, some training costs would be incurred with a change to these procedures. Therefore, pilot tests should be undertaken to better understand the benefits and costs of implementation.

Recommendation 3: Each service's operational or developmental test agency should evaluate the advantages of the use of state-of-the-art procedures to check the specification of requirements for a relatively complex defense software-intensive system.

One effective way of carrying out this evaluation would be to develop specifications in parallel, one set using the service's current procedures and one using state-of-the-art procedures, so that quality metric comparisons can be made. Each service would select a software system that (1) requires field configuration, (2) has the capabilities to adapt and evolve to serve future needs, and (3) has both hardware and software that come from multiple sources.
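To illustrate the kind of error such procedures are designed to catch, the sketch below exhaustively checks a small, deliberately flawed requirements table for completeness (some requirement covers every input combination) and disjointness (no two requirements conflict). This is a minimal sketch with invented variable names, not one of the methods presented at the workshop, which scale these checks to realistic specifications.

```python
from itertools import product

# A deliberately flawed, hypothetical condition table for one controller output:
# each row pairs a guard over the monitored variables with the required response.
def rows(overspeed, manual_override):
    return [
        ("apply_brake", overspeed),  # flaw: should also require "not manual_override"
        ("release",     not overspeed and not manual_override),
        ("hold",        manual_override),
    ]

def check_table():
    """Verify that exactly one row fires for every combination of inputs."""
    for combo in product([False, True], repeat=2):
        firing = [name for name, guard in rows(*combo) if guard]
        if not firing:
            print("incomplete: no requirement covers inputs", combo)
        elif len(firing) > 1:
            print("conflict:", firing, "overlap on inputs", combo)

check_table()  # reports the apply_brake/hold conflict when both inputs are True
```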

Another session at the workshop was devoted to model-based testing techniques. Clearly, both Markov chain usage testing and combinatorial design-based testing are promising for DoD applications and are likely to result in reduced software development time and higher-quality software systems. More generally, model-based testing is a general approach that could provide great advantages in testing defense software systems. There are other innovative testing strategies that are not model-based but are instead code-based, such as data flow testing (Rapps and Weyuker, 1985). To better understand the extent to which these methods can be effective, using current DoD personnel, on DoD software-intensive systems, pilot projects should be carried out, in each service, to evaluate their benefits and costs.

Recommendation 4: Each service's operational or developmental test agency should undertake a pilot study in which two or more testing methods, including one model-based technique, are used in parallel for several software-intensive systems throughout development to determine the amount of training required, the time needed for testing, and each method's effectiveness in identifying software defects. This study should be initiated early in system development.
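As an illustration of the first of these techniques, Markov chain usage testing derives test cases by random walks over a usage model whose transitions are weighted by estimated operational frequencies, so that testing effort concentrates on the most likely use scenarios. The sketch below is minimal, and its states and probabilities are invented; actual usage models are built from operational profiles of the system in the field.

```python
import random

# Invented usage model: states are user-visible modes; each transition
# carries the estimated probability that it is the user's next action.
USAGE_MODEL = {
    "start":  [("login", 1.0)],
    "login":  [("query", 0.7), ("update", 0.2), ("exit", 0.1)],
    "query":  [("query", 0.4), ("update", 0.3), ("exit", 0.3)],
    "update": [("query", 0.5), ("exit", 0.5)],
}

def generate_test(model, start="start", stop="exit"):
    """Random-walk the chain to produce one statistically typical test path."""
    path, state = [start], start
    while state != stop:
        next_states, weights = zip(*model[state])
        state = random.choices(next_states, weights=weights)[0]
        path.append(state)
    return path

if __name__ == "__main__":
    for _ in range(3):
        print(" -> ".join(generate_test(USAGE_MODEL)))
```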

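The second technique, combinatorial design-based testing, rests on the observation that many software faults are triggered by the interaction of only two or three configuration parameters, so a suite covering every pair of parameter values can detect a large share of interaction faults at a small fraction of the cost of exhaustive testing. The sketch below builds a pairwise-covering suite with a naive greedy search; the factors are invented, and production tools construct far smaller suites for realistic numbers of parameters.

```python
from itertools import combinations, product

def pairwise_suite(factors):
    """Greedily build a test suite in which every pair of parameter values
    appears in at least one test. Naive: it enumerates all configurations,
    so it is suitable only for small, illustrative models."""
    names = list(factors)
    uncovered = {
        ((ni, vi), (nj, vj))
        for ni, nj in combinations(names, 2)
        for vi in factors[ni]
        for vj in factors[nj]
    }
    suite = []
    while uncovered:
        # Choose the full configuration covering the most still-uncovered pairs.
        best = max(
            product(*(factors[n] for n in names)),
            key=lambda cfg: len(uncovered & set(combinations(zip(names, cfg), 2))),
        )
        suite.append(dict(zip(names, best)))
        uncovered -= set(combinations(zip(names, best), 2))
    return suite

if __name__ == "__main__":
    factors = {
        "radio":   ["lan", "satcom", "hf"],
        "os":      ["vxworks", "linux"],
        "payload": ["text", "imagery"],
    }
    for test in pairwise_suite(factors):
        print(test)
    # Covers all pairs in roughly half the 12 tests exhaustive testing would
    # need; the savings grow rapidly with the number of parameters.
```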
Recommendations 3 and 4 propose that each of the services select software-intensive systems as case studies to evaluate new techniques. By doing so, test and evaluation officials can judge both the costs of training engineers to use the new techniques (the "fixed costs" of widespread implementation) and the associated benefits from their use in the DoD acquisition environment. Knowledge of these costs and benefits will aid DoD in deciding whether to implement these techniques more broadly.

There are generally insufficient funds in the budgets of the service test agencies to support the actions we recommend. Also, given their priorities, it is very unlikely that program managers will be interested in funding these pilot projects. Thus, support will be needed from the services themselves or from DoD in order to go forward. The committee suggests that DoD support these projects out of general research and development funds.

Recommendation 5: DoD should allocate general research and development funds to support pilot and demonstration projects of the sort recommended in this report in order to identify methods in software engineering that are effective for defense software systems in development.

The panel notes that constraints external to the technical considerations of the costs and benefits of the techniques themselves hinder DoD from imposing specific state-of-the-art software engineering and development techniques on its contractors. However, these restrictions do not preclude the imposition of a framework, such as the capability maturity model.

Recommendation 6: DoD needs to examine the advantages and disadvantages of methods for obligating software developers under contract to DoD to use state-of-the-art methods for requirements analysis and software testing in particular, and software engineering and development more generally.

The techniques discussed in this report are consistent with these conditions and constraints. We also note that many of the techniques described in this report are both system oriented and based on behavior, and are therefore applicable to both hardware and software components, which is an important advantage for DoD systems.

