
2

Workshop Summary

The goal of the workshop was to hear presentations on leading-edge industrial practices from speakers who are (or recently have been) involved in systems development in the commercial sector. There were four primary speakers: two on software and two on hardware. The speakers were selected on the basis of their direct involvement with requirements setting, systems design and development, and system testing.

The speakers had been asked to present an overview of approaches to system development, with an emphasis on addressing the three motivating questions for the panel’s work (see Chapter 1). Each set of presentations (on software and on hardware) was followed by two discussants, one with a defense perspective and one from the panel, and then general discussion.

SOFTWARE

HP-UX Continuous Development, Integration, and Test: An Agile Process1

The first presentation on software was by Donald Bollinger, a distinguished technologist in the Mission-Critical Business Systems Division of Hewlett-Packard (HP). He has designed and overseen the development, integration, and testing of HP-UX, the operating system environment for HP’s critical computer systems.

1The presentation slides are available at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].

Bollinger focused on the HP-UX system, using it as a case study to describe leading-edge software development practices at HP. He noted that it is an example of an “agile” development process.2 HP-UX is a large, complex software system with tens of millions of lines of code. It is used in mission/business-critical environments, and it is essential that very high quality be maintained release after release. It has been upgraded repeatedly, piece by piece, over the past 25 years, and it has spanned four hardware architectures and dozens of platforms.

2For the principles of the “agile manifesto,” see http://agilemanifesto.org/principles.html [August 2011].

Over the past 10 years, Bollinger noted, HP moved from a “waterfall” software development process to an agile development process. (Briefly, a waterfall program proceeds linearly from concept to requirements, design, prototype, construction, acceptance testing, and final delivery. Complete and detailed requirements are emphasized, and deviations from the initial requirements are expensive and disruptive. In contrast, an agile program begins with the same concept but executes multiple iterations, each a full pass from requirements to acceptance testing. The first iteration quickly [e.g., in one-tenth the time] produces an extremely minimal version of the concept. Subsequent iterations add or improve the capabilities of the product until a useful version is created, and they keep going after that to create ever more useful versions. An agile program embraces changing requirements, exploits knowledge gained in early iterations to improve designs and implementation, and encourages user feedback to guide later iterations.) One key difference between HP-UX and many DOD software systems is that HP-UX remains the same basic system: only new functionalities and capabilities are added over time, the capabilities never degrade, and the customers do not change much over time.
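As a rough illustration of that difference, the following sketch (in Python; the feature names and mechanics are purely illustrative, not anything HP described) treats a waterfall program as one long pass over a fixed requirements set and an agile program as repeated short passes over a re-prioritized backlog:

```python
# A minimal, hypothetical sketch contrasting the two styles described above.
# The "features" and iteration sizes are invented for illustration.

def build_and_test(features):
    """Stand-in for one requirements-to-acceptance-test pass."""
    return {f: "tested" for f in features}

def waterfall(all_features):
    # One long pass: every requirement is fixed up front, and feedback
    # arrives only at the end, when changes are most expensive.
    return build_and_test(all_features)

def agile(all_features, per_iteration=1):
    # Many short passes: ship a minimal working version fast, then grow it.
    product = {}
    backlog = list(all_features)           # re-prioritized between iterations
    while backlog:
        increment = backlog[:per_iteration]
        backlog = backlog[per_iteration:]  # user feedback could reorder this
        product.update(build_and_test(increment))  # always a working system
    return product

features = ["login", "search", "reports", "export"]
assert waterfall(features) == agile(features)  # same end state, different risk
```

Both paths end with the same feature set; the difference is how early a working, testable product exists along the way.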

Bollinger touched on a number of the system’s key features, with an emphasis on incremental delivery of working software. There has been a substantial improvement in quality (in terms of customer defect rates as well as productivity and release time) since HP switched to the continuous development, integration, and test process. Bollinger noted that HP continuously develops, integrates, and tests all elements of HP-UX to ship-release criteria. Furthermore, the company starts the next release, at full throttle, the day after the last one is finished. He also emphasized the importance of not breaking legacy code and of fixing defects before adding new code.
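A minimal sketch of what such an integration gate might look like, assuming a simple list of open defects and a set of pass/fail release-criteria checks (the names and policy here are hypothetical, not HP’s actual tooling):

```python
# Hypothetical merge gate in the spirit of "fix defects before adding new
# code": changes are rejected while known defects are open or while any
# ship-release criterion fails.

def can_integrate(change, open_defects, release_criteria_tests):
    if open_defects:                # the defect backlog must be empty first
        return False, f"fix {len(open_defects)} open defect(s) before adding code"
    failures = [t for t in release_criteria_tests if not t(change)]
    if failures:                    # every change is held to ship criteria
        return False, f"{len(failures)} release-criteria test(s) failed"
    return True, "ok to integrate"

# Example: two illustrative criteria, both passing.
tests = [lambda c: c["compiles"], lambda c: c["legacy_suite_passes"]]
print(can_integrate({"compiles": True, "legacy_suite_passes": True}, [], tests))
```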

Bollinger strongly emphasized the importance of communication and collaboration with the customer and all other members of the development and testing team. Those discussions cover a variety of issues, including which requirements are “must-haves” and which are flexible, which requirements are unattainable, how specifications in the written documents should be interpreted, and information feedback from the field. Bollinger also mentioned the concept of “open development,” in which the development teams share the results they have (subject to some appropriate protection). This last point may be more relevant for the contractor than for DOD personnel during developmental testing and operational testing. Bollinger also repeatedly emphasized the importance of accountability, efficiency, and cost performance in the commercial environment.

Testing in an Evolutionary Acquisition Environment: Agile with Discipline

The second presentation on software was by Sham Vaidya, an IBM distinguished engineer, the service area leader for emerging technology and architecture for IBM Global Business Services, and a member of the IBM Academy of Technology. His experience is in information technology, with a focus on enterprise architecture, component business modeling, business architecture, application integration, and business-oriented architectures.3

3The presentation slides are available at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].

Vaidya discussed three case studies: a large global warranty management system for an automobile manufacturer, verification and validation of the PowerPC microprocessor chips in the pSeries boxes, and setting up a testing center of excellence for the wireless operations of a large telecommunications client. He noted that he is a proponent of the agile software development process, and a number of his points were in common with Bollinger’s.

The warranty management system was a large and complex program, with about 300 million existing claims, an additional 16 million new claims a year, 2,000 users globally, and about 200 users interacting with the system at any given time. There were multiple data sources: faxes, batch inputs, the Web, and a defined user interface. This is somewhat similar to the types of data sources from the field in DOD applications. The major lesson IBM learned from this project was that the system could not be developed and released in “one shot”; rather, a multirelease approach was necessary. (This approach had also been mentioned as a core element of agile development.) In this application, it was not possible to anticipate all of the requirements up front. New design components were added in subsequent releases, based on lessons learned and feedback from the field. The effort to acquire and prepare the correct test data was, in itself, a huge project.

The PowerPC chip verification project dealt with the generation of test suites, and the issues here are closely related to those in software testing. Vaidya stressed the importance of a hierarchical verification approach, starting with small components and integrating more and more of them until the full system level is reached. The focus of the last project (the testing center for wireless operations) was the role of various testing functions in maximizing test efficiencies and ensuring the timely production of high-quality software.
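The following toy sketch illustrates the hierarchical idea under stated assumptions: a trivial adder “component” is verified first, then an ALU “subsystem” built from it, then a full “system,” with each level tested only after the levels below it pass. The example is invented for illustration; it is not from the PowerPC project.

```python
# Hierarchical verification: component -> subsystem -> full system.

def adder(a, b):
    return a + b

def alu(op, a, b):
    return {"add": adder}[op](a, b)       # subsystem built from components

def system(program):
    return [alu(op, a, b) for (op, a, b) in program]

# Level 1: component tests.
assert adder(2, 3) == 5

# Level 2: subsystem tests, reusing already-verified components.
assert alu("add", 2, 3) == 5

# Level 3: full-system tests, run only after the lower levels pass.
assert system([("add", 1, 1), ("add", 2, 2)]) == [2, 4]
print("all levels verified")
```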

Discussion

One aspect of agile development, recommended by both speakers, created some controversy: “progressive or changing requirements,” which could come even late in development. The agile manifesto notes that agile processes harness change for the customer’s competitive advantage. This point met with some resistance at the workshop. Several participants raised serious reservations about using such a process in DOD’s environment, in which there are already many opportunities and incentives for gaming the system. In addition, fluid requirements may lead to costly changes in system architecture (especially with hardware), introduction of new defects, and delays in system delivery. Clearly, though, there are systems that are suitable for staged development and multiple releases (such as those acquired in an evolutionary manner in DOD; see National Research Council, 2006), in which changes in requirements will happen over time and are guided by feedback from customers.

One panel member noted that many of the concepts included in the agile manifesto are, by themselves, not new. It appears that, as with many quality management paradigms, what is new is the disciplined environment that is promoted in the agile development process. By “disciplined environment,” we refer to a systematic approach to process development that is based on accepted quality management and systems engineering principles. For example, the agile software development process is based on the 12 principles outlined in the agile manifesto (Beck et al., 2001). It emphasizes, among other things, customer satisfaction through rapid delivery of useful software; working software as the principal measure of progress; close, daily cooperation between the business people and the software developers; and sustainable development (the ability to maintain a constant pace).


HARDWARE

Effective Development and Validation Processes

The first presentation on hardware was by Jeffrey Zyburt. Now a private consultant, he spent 30 years in various positions at Chrysler, including as director of vehicle development and director of proving grounds and durability testing labs, with extensive experience in hardware development and manufacturing processes.4

4The presentation slides are available at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].

Zyburt’s presentation focused on what makes development processes effective or ineffective, based on his experience with vehicle development in the automotive industry. He first listed some of the reasons for ineffective product development:

•   lack of a “dedicated” program lead and cross-functional core development team from concept to postproduction,

•   ever-changing program targets and functional objectives,

•   late design changes,

•   no prioritization of customer requirements and no distinction between “must-haves” and “wish list,”

•   late component/system supplier sourcing,

•   inadequate supplier capabilities (design/development and analysis/testing),

•   no agreed-on pass/fail test criteria, and

•   advanced engineering and concept design and redesign that occur along the critical path of the program timeline.

In contrast, Zyburt then listed the characteristics that are an integral part of an effective vehicle development program:

•  “dedicated” upfront resources, including the program lead and a core development team, both of which are responsible and accountable until postlaunch;

•   a team that is multidisciplinary (different aspects of the vehicle development) and that ensures all of the functional attributes of the vehicle can meet the program targets;

•   a prioritized list of customer requirements, with identification of the sacred few “must-haves” (based on compelling questions early in the program) and assurance that the “wish list” is well aligned with the “must-haves”;

•   defined/nonfluid functional objectives for the program;

•   offline (outside the program timeline) advanced technology development;

•   coordinated releases of subsystems from all disciplines so that the vehicle can be evaluated as a system for risk assessments at each milestone;

•   reassessment if any program change is proposed;

•   independent third-party (internal or external) assessments at each milestone with objective “go/don’t go” metrics; and

•   closed-loop feedback from field/warranty data on issues found, with gap analysis (analysis of the reasons for the differences between the performance of the current system and the stated requirements) used to identify scope for improvement; see the sketch after this list.
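As a toy illustration of that last item, the following sketch (with invented metrics and values, not Chrysler data) compares field/warranty measurements against stated requirements and flags the shortfalls that a gap analysis would then investigate:

```python
# Hypothetical gap analysis: requirements vs. observed field performance.
requirements = {"miles_between_failures": 100_000, "repair_hours": 2.0}
field_data   = {"miles_between_failures":  82_000, "repair_hours": 2.6}
HIGHER_IS_BETTER = {"miles_between_failures": True, "repair_hours": False}

for metric, required in requirements.items():
    observed = field_data[metric]
    # Direction of "better" depends on the metric.
    shortfall = (required - observed) if HIGHER_IS_BETTER[metric] \
                else (observed - required)
    if shortfall > 0:
        print(f"gap in {metric}: required {required}, observed {observed}")
```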

Trends in Automotive Electronics Design: Current and Future Methodologies

The second presentation on hardware was by Salim Momin. Currently with SRS Enterprises, he was previously with Freescale Semiconductor, where he managed the “virtual garage” (among other activities). The objective of that organization was to understand how Freescale’s customers (tier-one suppliers to automotive companies) and their customers were developing their designs.5

5The presentation slides are available at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].

Momin noted that automotive manufacturers are moving from being component focused to being architecture focused, because the latter is the key to system integration. To enable this change, companies are increasingly adopting model-based approaches to control systems engineering and requirements setting. Model-based design is an approach to codifying (formalizing) the process of taking customer requirements and translating them into systems requirements and specifications. In some cases executable specifications can be generated, which leads directly to implementation: in software, for example, C code can be generated from the models using auto-code generation tools. In other words, text-based requirements are converted into mathematical equations, and mathematical analysis, simulation, visualization, and animation techniques are then used to verify and clarify the requirements. A model-based approach has many advantages, including validation of requirements, consistency checks, and resolution of ambiguities in the statement of requirements and specifications.
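As a toy illustration of requirements as executable specifications, the sketch below (hand-written Python, not the output of MathWorks or any other modeling tool) recasts an invented text requirement, “the fan must be on in the step after any overheat event,” as a small state machine that can be simulated and checked:

```python
# A toy executable specification: a text requirement recast as a state
# machine whose behavior can be simulated and checked mechanically.

TRANSITIONS = {               # (state, input) -> next state
    ("off", "overheat"): "on",
    ("off", "normal"):   "off",
    ("on",  "normal"):   "off",
    ("on",  "overheat"): "on",
}

def simulate(inputs, state="off"):
    trace = [state]
    for event in inputs:
        state = TRANSITIONS[(state, event)]
        trace.append(state)
    return trace

# Validate the requirement against the model by simulation: after every
# "overheat" input, the very next state must be "on".
inputs = ["normal", "overheat", "overheat", "normal"]
trace = simulate(inputs)
for i, event in enumerate(inputs):
    if event == "overheat":
        assert trace[i + 1] == "on", "requirement violated"
print("requirement holds on this scenario:", trace)
```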

Momin pointed to several advantages of the use of modeling in DOD’s context: (1) it specifies the actual intent of the functionality, so that it is very clear and precise; (2) it is reusable if it is well documented; (3) it is executable, so it gives an unambiguous functional execution; (4) it can be used to automatically generate test suites (i.e., schemes for selecting scenarios for testing the system); and (5) perhaps most importantly, the model captures knowledge that is preserved and institutionalized. In other words, it provides a formal and rigorous framework for the requirements generation process. In some cases—such as software or logic design for integrated circuits—the models can be used for implementation of the design.
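As a toy illustration of advantage (4), the sketch below derives a test suite mechanically from the small state-machine model used earlier: each transition becomes one test scenario (simple transition coverage; real tools use much richer selection criteria):

```python
# Deriving a test suite from a model: one scenario per transition.

TRANSITIONS = {
    ("off", "overheat"): "on",
    ("off", "normal"):   "off",
    ("on",  "normal"):   "off",
    ("on",  "overheat"): "on",
}

def generate_test_suite(transitions):
    # Each scenario names a start state, a stimulus, and the expected result.
    return [
        {"start": s, "input": e, "expect": nxt}
        for (s, e), nxt in transitions.items()
    ]

for case in generate_test_suite(TRANSITIONS):
    print(f"from {case['start']!r}, apply {case['input']!r},"
          f" expect {case['expect']!r}")
```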

Momin also mentioned that the de facto standard for functional modeling in the automotive industry is based on tools from MathWorks® (Stateflow®, Simulink®, MATLAB®). Other modeling languages, such as UML and SysML, are being used in other application areas, such as aerospace by companies like Boeing, enabled by tools from IBM. He noted other examples: General Motors uses UML for modeling and code generation of software for the electronic control units that control comfort and convenience functions of cars, and Ford uses Stateflow® and Embedded Coder® for its body electronic control units. Most engine control software is modeled using Simulink®/Stateflow®, with C code auto-generated; companies doing this include General Motors, Ford, Chrysler, Toyota, and Nissan.

Adequate documentation is critical for a model-based design approach to work. Momin acknowledged the difficulty of getting engineers to spend time on documentation. He noted the availability of software tools, such as those developed by MathWorks®, which facilitate the process of documentation.

Discussion

A participant from DOD noted that the model-based design tools described by Momin are beginning to be used by defense contractors for complex systems. However, DOD itself may have limited capability to exercise these models during the review process, which is a serious limitation for collaborating with contractors.

Both Bollinger (software) and Zyburt (hardware) emphasized the importance of asking customers to prioritize their requirements into two groups: a list of “must-haves” and a “wish list.” This approach has obvious advantages, as it forces the customer to think carefully through the requirements at the beginning of the development process and to make tough decisions. Also, having a prioritized wish list provides considerable flexibility for trading off these requirements during the design and development stages.

Zyburt repeated his point that late design changes are one of the features of an ineffective vehicle development process. Changes to system design and architecture often result in substantial cost increases, delays in development, possible introduction of additional defects, and degraded quality. This perspective conflicts with the emphasis on changing requirements in the agile manifesto, which was referred to approvingly by both speakers on software systems, Bollinger and Vaidya. It is possible that large and complex software systems are well suited for staged development and multiple releases, in which the requirements can change from stage to stage depending on feedback from customers and the field. The importance of appropriate oversight and accountability in approving design changes was also discussed by workshop participants. In feedback received after the workshop, Momin and Zyburt noted the advantages if DOD were to establish and enforce processes for evaluating the impact of changes in requirements on system design and to establish clear guidelines and criteria for accepting changes in requirements after the freeze. However, there are already guidelines and criteria in place within DOD for approving changes in requirements and design. Nevertheless, programs continue to be plagued by the occurrence of “requirements creep,” suggesting that the procedures are not being followed or enforced.

Extensive communication and collaboration among the design, development, and testing teams were stressed as integral parts of leading-edge practices in the commercial sector. Another common discussion issue was ensuring the maturity of new technologies, since innovating on a schedule is often not possible. (This topic has been discussed in previous National Research Council reports [e.g., 2006] and in DOD studies; see also the discussion earlier in this chapter and in Chapter 3.) Some of the participants from industry suggested that the real problem might be a lack of adherence to criteria in the assessment of new technology readiness, together with poor risk assessment of the impact of technology insertion and integration on systems. They speculated that this might be part of a general lack of adequate enforcement and oversight by domain experts at key milestone deliverables.

Several other issues were emphasized by more than one speaker at the workshop:

•   the importance of accountability and continuity of the project management team;

•   better managing the hand-off process during system development and testing so that useful information available to the developer is also available to testers;

•   making clear-cut decisions during milestones—that is, “red” and “green” decisions based on objective metrics and not “yellow” ones;

•   the importance of not breaking legacy functionality and of fixing defects before new components or subsystems are added; and


•   the substantial benefits of using feedback loops for system improvement and for test and model improvement.

Some of the industry speakers noted at the end of the workshop that although there seem to be reasonable rules and procedures in place within DOD, they do not appear to be properly enforced through appropriate checks and balances. They speculated that this is probably the major hindrance to the improvement of defense acquisition. In fact, one of the speakers from industry noted: “The good news is, all the studies you [have done]… you know 80 percent [of the best practices and guidelines needed] is already there. All you’ve got to do is go out and do what you wrote down, and you’ll be in great shape.”
