
6

Communication, Resources, and Infrastructure

Several aspects of infrastructure, expertise, and acquisition processes at the U.S. Department of Defense (DOD) hamper the application of best engineering practices. In this chapter we consider four topics: communication among the different teams involved in testing and development, data archiving, the use of feedback loops, and available systems engineering capabilities.

COMMUNICATION AND COLLABORATION AMONG REQUIREMENTS SETTING, DESIGN, AND TESTING

Conclusion 1 highlights the need for early and clear communication about requirements. In addition, industry representatives at the workshop stressed the importance of collaboration and communication among customers, program developers, and participants across all aspects of system development and testing. Such collaboration is essential to avoiding long, costly, and unsuccessful product development programs. The drivers of unsuccessful commercial programs include the following features, which panel members noted are also common to many troubled DOD acquisition programs.1


1 See Zyburt’s presentation at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].


•   ever-changing program targets and functional objectives;

•   inadequate or improper requirements provided to suppliers or internal design groups;

•   lack of agreement on pass/fail criteria;

•   late or no bench testing, which leads to the complete system becoming a “discovery property” rather than a “validation property”; and

•   robust, developed technologies that are not “plugged in” to a program, so that advanced engineering and concept design occur along the critical path of the program timeline rather than offline.

Leading industrial companies have established programs to promote higher levels of collaboration among suppliers, manufacturers, customers, service organizations, and the ultimate users of the product. For example, Toyota Motor Company had two awards (excellence and superiority) to promote collaboration with and friendly competition among suppliers; see Box et al. (1988).

The Center for Naval Analyses (2009) study of the successful F/A-18E/F Super Hornet Development Program (mentioned above) described one of the few recent DOD development programs that was completed on time, stayed within initial cost and required funding estimates, and met or exceeded all performance parameters; that outcome resulted from close collaboration.2 Close collaboration also characterized earlier successful DOD programs, such as the nuclear attack submarine and ballistic missile submarine programs, but it appears to be rare in recent DOD acquisition programs. Senior DOD acquisition executives in the Office of the Secretary and the military departments have the authority to require such close collaboration in the programs they oversee, but of late they have rarely required or enforced it among the various groups critical to program success. The lack of coordinated effort, particularly in the early stages of requirements and systems development, has contributed to the long-term detriment, and sometimes the cancellation, of a number of acquisition programs.

image

2 The program participants emphasized the importance of collaboration in achieving success. In this case, there was excellent partnership among the government program management office, the contractor program management office personnel, customer representatives from the Navy and the Office of the Secretary of Defense (OSD), the requirements community, the developmental test authority, the operational test agency, and a cost analysis improvement group. Several attributes were key to the process, including the following:

•   The program team took the time to get the requirements vetted and understood by all and revalidated those requirements every year.

•   All team members were willing to work in an open and sharing manner–one team, one set of tools.

•   The program team took the time to get the program planning right from the start–and executed the plan in an open and sharing way.

•   There was disciplined change control throughout.



Collaboration can also be considered from the perspective of sharing information. Previous National Research Council (NRC) studies have emphasized the importance of using all available test information to improve operational evaluations, particularly with the use of evolutionary acquisition techniques. For example, National Research Council (2006:22) recommended that “the USD (AT&L) [Under Secretary of Defense for Acquisition, Technology, and Logistics] should develop and implement policies, procedures, and rules to… to share all relevant data on systems performance and the results of modeling and simulation… to assist in system evaluation and development.” It is not possible to use and integrate information from the various sources without good collaboration and sharing of models and data across all of the important testing events.

DATA ARCHIVING

Conclusion 7: A data archive containing developmental test, operational test, and field data will provide a common framework for discussions on requirements and priorities for development. In addition, it can be used to expedite the identification and correction of design flaws. Given the expense and complexity of developing such an archive, it is important that the benefits of a data archive be adequately demonstrated to support its development.

Several previous NRC reports have also discussed this important topic,3 but it has not received any noticeable attention in DOD. The collection and analysis of data on test and field performance, including warranty data, are standard practice in commercial industries. In the DOD context, it is also important to retain information about test suites (used by both contractors and DOD). In fact, it would be useful to require, through contractual obligation, that detailed information on tests carried out by contractors be provided to DOD. The panel does not make this suggestion lightly, as providing access to such test data is a large undertaking.

An archival test and field performance database could inform system developers about the capabilities of components that have been used in fielded systems. Such a database could be extremely useful for setting requirements for future related systems. Furthermore, by capturing field performance data, test scenarios can be selected to determine whether problems in a previously released system have been addressed in the most recent stage of development.


3 Key among them are (1) Recommendation 3.3 in National Research Council (1998:49); (2) Recommendation 2 in National Research Council (2003:3); (3) National Research Council (2004:61); and (4) Recommendation 6 in National Research Council (2006:37).


Such a database could also help answer very broad questions about which development practices are most effective, about cost impacts and trajectories, and about what can be done to reduce development and acquisition costs. A key example is whether additional testing during development reduces life-cycle costs.

Such a database, if it included data on the performance of fielded systems, could support analyses similar to those of warranty systems in the commercial world. As Gilmore (2010) notes, DOD spends a substantial amount of its acquisition budget on operations and support; for ground combat systems, that share is 73 percent. A key driver of this cost is the poor reliability of fielded systems and the resulting costs for replacement parts. A data archive could help control and manage a considerable fraction of operations and support costs by revealing system deficiencies so that they can be fixed quickly, through a failure mode, effects, and criticality analysis (FMECA) and a failure reporting, analysis, and corrective action system (FRACAS) supported by such data collection.
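As a rough illustration of the kind of warranty-style analysis such an archive could support, the following sketch ranks failure modes from field reports by total repair cost. The record fields, system names, and dollar values are hypothetical assumptions chosen for illustration, not data from the report or from any DOD database.

    from collections import Counter, defaultdict

    # Hypothetical field-failure records; field names and values are illustrative only.
    failure_reports = [
        {"system": "vehicle-A", "failure_mode": "transmission overheat", "repair_cost": 12000},
        {"system": "vehicle-B", "failure_mode": "transmission overheat", "repair_cost": 11000},
        {"system": "vehicle-A", "failure_mode": "track shedding", "repair_cost": 4500},
        {"system": "vehicle-A", "failure_mode": "transmission overheat", "repair_cost": 13500},
    ]

    def pareto_by_failure_mode(reports):
        """Rank failure modes by total repair cost (a FRACAS-style Pareto view)."""
        counts = Counter(r["failure_mode"] for r in reports)
        costs = defaultdict(float)
        for r in reports:
            costs[r["failure_mode"]] += r["repair_cost"]
        return sorted(((mode, counts[mode], costs[mode]) for mode in costs),
                      key=lambda item: item[2], reverse=True)

    for mode, n, cost in pareto_by_failure_mode(failure_reports):
        print(f"{mode}: {n} reports, ${cost:,.0f} in repair costs")

In practice, a tabulation of this kind, kept current as field reports arrive, is the starting point for the FMECA and FRACAS processes mentioned above.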

This database would need to be easily accessible to program managers and testers. It is important for everyone to work from the same database so that requirements, specifications, and later assertions of reliability and effectiveness based on archived test results and on results from modeling and simulation can be compared and contrasted. The speakers at the workshop stressed that developers in industry know the historical performance of the components and subsystems included in a system, so that they can anticipate problems in development and work to prevent them. It is therefore extremely important to also include contractor test results in such an archive, since that is the only way the full history of performance can be represented.

Recommendation 3: The U.S. Department of Defense (DOD) should create a defense system data archive, containing developmental test, operational test, and field performance data from both contractors and the government. Such an archive would achieve several important objectives in the development of defense systems:

•   substantially increase DOD’s ability to produce more feasible requirements,

•   support early detection of system defects,

•   improve developmental and operational test design, and

•   improve modeling and simulation through better model validation.


Given these important benefits, DOD should initiate plans to begin creation of a defense system data archive. Some issues that need immediate resolution include (a) whether the archive should be DOD-wide or should be stratified by type of system to limit its size, (b) what data are to be included and how the data elements should be represented to facilitate linkages across related systems, and (c) what database management structure should be used. In designing this archive, a flexible architecture should be used so that, if the archive is initially limited to a subset of the data sources listed here because of budgetary considerations, it can be readily expanded over time to include the remaining sources.

Specifying how such a database should be constructed and what it should contain is beyond the scope of this study. DOD currently has multiple databases, developed in the different services for different types of systems to satisfy various needs, that represent some aspects of the database we are describing: databases with developmental test data, databases that collect operational test data, databases with modeling and simulation results, and databases with field performance data. Unfortunately, in most cases these databases are not compatible with one another. Perhaps an initial approach to the development of a test and field data archive would be to institute linkages that allow system-specific information to be combined across the existing databases.
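As a minimal sketch of what such linkages might look like, the example below joins records from three separately maintained tables on a shared system identifier. The table names, field names, and the use of SQLite are illustrative assumptions; existing DOD databases would first have to be mapped onto a common identifier scheme before any such query could be run.

    import sqlite3

    # Stand-ins for three existing, separately maintained databases.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dev_test   (system_id TEXT, test_event  TEXT, result       TEXT);
    CREATE TABLE op_test    (system_id TEXT, scenario    TEXT, outcome      TEXT);
    CREATE TABLE field_data (system_id TEXT, report_date TEXT, failure_mode TEXT);
    """)
    # ... in practice, each table would be loaded from its source database ...

    # One system's combined developmental test, operational test, and field
    # history, keyed on the shared identifier.
    query = """
    SELECT d.test_event, d.result, o.scenario, o.outcome, f.failure_mode
    FROM dev_test AS d
    LEFT JOIN op_test    AS o ON o.system_id = d.system_id
    LEFT JOIN field_data AS f ON f.system_id = d.system_id
    WHERE d.system_id = ?;
    """
    rows = conn.execute(query, ("example-system",)).fetchall()

The substantive work lies in agreeing on the shared identifier and common data elements across the services, not in the query itself.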

A key reason for the lack of progress in this area is the incentive structure in the DOD acquisition environment: individual programs obtain no immediate benefit from committing resources, for the common good, to develop and maintain data archives that extend beyond their own program. So the first step would be for DOD to recognize the advantages of building and maintaining such a database and to explore how a data archive would be funded. With this recognition, the panel suggests that DOD could commission a committee of people with expertise in database management and people with experience in program development to propose concrete recommendations.

FEEDBACK LOOPS

Conclusion 8: Feedback loops can significantly improve system development by improving developmental and operational test design and the use of modeling and simulation. Feedback systems can function similarly to the warranty management systems that have proved essential to the automotive industry. In developing feasible requirements, it is extremely useful to understand how components installed in related systems have performed when fielded, and hence to understand their limitations for possibly more stressful use in a proposed system. To support such feedback loops, data on field performance, test data, and results from modeling and simulation must be easily accessible, which highlights the necessity for a test and field data archive.



Field performance data are the ultimate indicators of how well a system is functioning under actual operational conditions. By field performance data, we mean data on all the circumstances that can have an impact on the quality of the components, the subsystems, and the system itself. These circumstances include all relevant pre- and postdeployment activities, such as transportation, maintenance, implementation, and storage. They could also include training data, if such data are collected objectively.

Such data should be used to better understand the strengths and weaknesses of newly fielded systems, and they can be used in feedback loops. They can indicate when components or subsystems should be modified because of inferior effectiveness or suitability, and they can be used to identify the missions for which the current system will work. For instance, if a system exhibits poor reliability in certain stressful scenarios of use, say, while carrying loads of more than a given weight, and if the reliability of the system under such conditions cannot be easily or quickly improved, the information can support a decision not to use the system for such missions (if alternatives are available). And, if the reliability of the relevant component can be improved with a redesign, the information can be used to support arguments for such a redesign.
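A minimal sketch of this kind of scenario-level screening follows; the scenarios, mission counts, and reliability requirement are hypothetical numbers chosen for illustration, not figures from the report.

    # Compare observed field reliability in each usage scenario against a
    # requirement and flag shortfalls that could warrant a redesign or a
    # decision to restrict the system's missions. All values are illustrative.
    REQUIRED_RELIABILITY = 0.90

    # (scenario, missions attempted, missions completed without critical failure)
    field_records = [
        ("light load, paved road", 420, 401),
        ("heavy load, paved road", 180, 148),
        ("heavy load, off-road", 95, 61),
    ]

    for scenario, attempted, completed in field_records:
        observed = completed / attempted
        if observed < REQUIRED_RELIABILITY:
            print(f"FLAG: {scenario}: observed {observed:.2f} "
                  f"< required {REQUIRED_RELIABILITY:.2f}")

Running the same comparison against the conditions actually covered in developmental and operational testing would show whether the shortfall scenarios were ever tested, which is the testing-process feedback discussed next.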

Design flaws that are identified in fielded systems can also be evidence of failures in the testing process. For instance, inferior reliability of a system under heavy loads is likely an indication that such loads were not used during developmental and operational testing. The reason for such an omission can then be examined, and the process for selecting experimental designs can be improved. Field performance data can also provide information on the validity of any modeling and simulation that were used to assess operational performance. For example, if modeling and simulation were used to extrapolate from light loads instead of actual physical testing, the validity of that use of modeling and simulation can be examined, and the process for validating modeling and simulation can then be improved.

The National Research Council (2003) noted two significant benefits of feedback loops: field performance data can be used to help estimate the total life-cycle costs of a newly fielded system, and, in spiral development, effective feedback processes can identify enhancements that will improve the effectiveness and suitability of later stages of development. Improving the quality and timeliness of this feedback is important for responding to rapid changes in threat environments.

It is the panel’s understanding that such a feedback loop currently operates only when a system is underperforming in a dramatic way.


Instead, such analysis and feedback should be routine. Although DOD does collect field performance data, they are of highly varying quality and are not archived in a way that facilitates analysis (see discussion in Chapter 5). To varying degrees, the services do use a deficiency reporting process as a feedback mechanism during developmental programs, starting with the design review and continuing through testing. Deficiencies are categorized to identify the relative importance and urgency of a response. For example, in the Air Force, the stated purpose of the deficiency reporting, investigation, and resolution process is to provide “a means of identifying deficiencies, resolving those deficiencies within the bounds of program resources and the appropriate acceptance of risk for those deficiencies that cannot be resolved in a timely manner” (U.S. Air Force, 2007:1-1). However, the process has mostly been allowed to atrophy in the past 15 years, for several reasons, most notably the services’ lack of participation in developmental testing and, because of a lack of funds for corrective actions, the need to ignore all but the most critical deficiencies that are identified.

Recommendation 4: After a test and field data archive has been established, the Under Secretary of Defense for Acquisition, Technology, and Logistics and the acquisition executives in the military services should lead a U.S. Department of Defense (DOD) effort to develop feedback loops on improving fielded systems and on better understanding tactics of use of fielded systems. The DOD acquisition and testing communities should also learn to use feedback loops to improve the process of system development, to improve developmental and operational test schemes, and to improve any modeling and simulation used to assess operational performance.

SYSTEMS ENGINEERING CAPABILITIES IN DOD

Conclusion 9: The U.S. Department of Defense has lost much of its expertise in all aspects of systems engineering in recent years. It is important to have in-house capability in the critical areas relating to the design, development, and operation of major types of systems and subsystems. One such area is expertise in model-based design tools.

Some of the speakers at the workshop noted that commercial companies stress the importance of systems engineering expertise. This expertise is key not only for system development but also for requirements setting, model development, and testing. In contrast, Adolph (2010:51) notes that in DOD:


“The manpower reductions mandated by Congress in the late 1990s, followed by excessive additional services-directed reductions, have decimated the program office engineering and test support workforce as well as DOD government test organization personnel.” In addition, Adolph et al. (2008:220), summarizing a 2008 report of the Defense Science Board, state: “A second and related priority is to ensure that government organizations reconstitute a cadre of experienced test and evaluation, engineering, and RAM personnel to support the acquisition process.”

To improve its test and development process, DOD will have to reverse this trend. It appears that DOD has recognized this problem and is taking steps to rectify it. The panel applauds this effort, but we emphasize that, even with a dedicated and sustained effort, it will take a decade or more to regain the capabilities that DOD had in the early 1990s. Therefore, DOD should examine the short-term use of contractors, academics, employees of national laboratories, and others so that many of the recommendations in this and other studies can be implemented in a timely manner. The problem of systems engineering capability is also complicated by the reduced number of U.S. citizens who are acquiring engineering degrees. DOD should examine creative alternatives, including ways to engage noncitizen engineers on DOD acquisition programs, temporary employment opportunities, fellowships, internships, and sabbaticals of various kinds.


