6

Communication, Resources, and Infrastructure

Several aspects of infrastructure, expertise, and acquisition processes at the U.S. Department of Defense (DOD) hamper the application of best engineering practices. In this chapter we consider the importance of communication among the different teams involved in testing and development, data archiving, the use of feedback loops, and available systems engineering capabilities.

COMMUNICATION AND COLLABORATION AMONG REQUIREMENTS SETTING, DESIGN, AND TESTING

Conclusion 1 highlights the need for early and clear communications about requirements. In addition, industry representatives at the workshop stressed the importance of collaboration and communication among customers and program developers as well as participants across all aspects of system development and testing. Such collaboration is essential to avoiding long, costly, and unsuccessful product development programs. The drivers of unsuccessful commercial programs included the following features that panel members noted to be common to many troubled acquisition programs in DOD.1

• ever-changing program targets and functional objectives;
• inadequate or improper requirements provided to suppliers or internal design groups;
• lack of agreement on pass/fail criteria;
• late or no bench testing, which leads to the complete system becoming a “discovery property” rather than a “validation property”; and
• robust, developed technologies not being “plugged in” to a program, so that advanced engineering and concept design occur along the critical path of the program timeline rather than offline.

1 See Zyburt’s presentation at http://www7.nationalacademies.org/cnstat/Presentations%20Main%20Page.html [November 2011].



Leading industrial companies have established programs to promote higher levels of collaboration among suppliers, manufacturers, customers, service organizations, and the ultimate users of a product. For example, Toyota Motor Company had two awards (excellence and superiority) to promote collaboration with and friendly competition among suppliers; see Box et al. (1988).

The recent Center for Naval Analysis (2009) study on the successful F/A-18E/F Super Hornet Development Program (mentioned above) reported on one of the few recent DOD development programs completed on time, within initial cost/required funding estimates, and meeting or exceeding all performance parameters—an outcome that resulted from close collaboration.2 Close collaboration has also existed in earlier successful DOD programs, such as the nuclear attack submarine and ballistic missile submarine programs; however, it appears to be rare in recent DOD acquisition programs. Senior DOD acquisition executives in the Office of the Secretary and the military departments have the authority to require such close collaboration in the programs they oversee, but of late they have rarely required or enforced it among the various groups critical to program success. The lack of coordinated efforts, particularly in the early stages of requirements and systems development, has contributed to the long-term detriment, and sometimes the cancellation, of a number of acquisition programs.

2 The program participants emphasized the importance of collaboration to achieve success. In this case, there was excellent partnership among the government program management office, the contractor program management office personnel, customer representatives from the Navy and the Office of the Secretary of Defense (OSD), the requirements community, the developmental test authority, the operational test agency, and a cost analysis improvement group. Several attributes were key in the process, including the following:
• The program team took the time to get the requirements vetted and understood by all, and it revalidated those requirements every year.
• All team members were willing to work in an open and sharing manner—one team, one set of tools.
• The program team took the time to get the program planning right from the start—and executed the plan in an open and sharing way.
• There was disciplined change control throughout.

Collaboration can also be considered from the perspective of sharing information. Previous National Research Council (NRC) studies have emphasized the importance of using all available test information to improve operational evaluations, particularly with the use of evolutionary acquisition techniques. For example, National Research Council (2006:22) recommended that “the USD (AT&L) [Under Secretary of Defense for Acquisition, Technology, and Logistics] should develop and implement policies, procedures, and rules to . . . to share all relevant data on systems performance and the results of modeling and simulation . . . to assist in system evaluation and development.” It is not possible to use and integrate information from the various sources without good collaboration and sharing of models and data across all of the important testing events.

DATA ARCHIVING

Conclusion 7: A data archive with information on developmental and operational test and field data will provide a common framework for discussions on requirements and priorities for development. In addition, it can be used to expedite the identification and correction of design flaws. Given the expense and complexity of developing such an archive, it is important that the benefits of a data archive be adequately demonstrated to support its development.

Several previous NRC reports have also discussed this important topic,3 but it has not received any noticeable attention in DOD. The collection and analysis of data on test and field performance, including warranty data, is a standard feature in commercial industries. In the DOD context, it is also important to retain information about test suites (run by both contractors and DOD). In fact, it would be useful to require, through contractual obligation, that detailed information on tests carried out by contractors be provided to DOD. The panel does not make this suggestion lightly, as providing access to such test data is a large undertaking.

3 Key among them are (1) Recommendation 3.3 in National Research Council (1998:49), (2) Recommendation 2 in National Research Council (2003:3), (3) National Research Council (2004:61), and (4) Recommendation 6 in National Research Council (2006:37).

An archival test and field performance database could inform system developers as to the capabilities of components that had been used in fielded systems. Such a database could be extremely useful for requirements setting for future related systems. Furthermore, by capturing field performance data, test scenarios can be selected to determine whether problems in a previously released system have been addressed in the most recent stage of development.
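To make the idea concrete, the following is a minimal sketch, in Python, of the kinds of linked records such an archive might hold and how evidence about a single component could be pulled across data sources. All type and field names are hypothetical illustrations, not a proposed DOD data standard; an actual archive would require a data dictionary negotiated with the services and contractors.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List, Optional


@dataclass
class TestRecord:
    """One developmental or operational test event for a component."""
    system_id: str        # identifier shared by related systems
    component_id: str     # identifier shared by reused components
    test_phase: str       # e.g., "developmental" or "operational"
    source: str           # "contractor" or "government"
    scenario: str         # test conditions (load, environment, mission profile)
    date_run: date
    passed: bool
    hours_on_test: float


@dataclass
class FieldRecord:
    """One post-deployment usage or failure report for a component."""
    system_id: str
    component_id: str
    mission_type: str
    operating_hours: float
    failure_mode: Optional[str] = None       # None if no failure observed
    corrective_action: Optional[str] = None


@dataclass
class Archive:
    """In-memory stand-in; a production archive would be a managed database."""
    tests: List[TestRecord] = field(default_factory=list)
    field_reports: List[FieldRecord] = field(default_factory=list)

    def history_for_component(self, component_id: str) -> Dict[str, list]:
        """Return all archived evidence about one component, across sources."""
        return {
            "tests": [t for t in self.tests if t.component_id == component_id],
            "field": [r for r in self.field_reports if r.component_id == component_id],
        }
```

Keeping contractor and government records in the same structure, keyed by shared system and component identifiers, is what would allow the full performance history of a reused component to be retrieved when requirements for a related system are being set.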

Such a database could also help answer very broad questions about which development practices are most effective, what the cost impacts and trajectories are, and what can be done to reduce development and acquisition costs. A key example is whether additional testing during development reduces life-cycle costs.

Such a database, if it included data on the performance of fielded systems, could support analyses similar to those of warranty systems in the commercial world. As Gilmore (2010) notes, DOD spends a substantial amount of its acquisition budget on operations and support; for ground combat systems, for example, the cost is 73 percent. A key driver of this cost is the poor reliability performance of systems and the resulting costs for replacement parts. A data archive could support analysis to control and manage a considerable fraction of operations and support costs by revealing and quickly fixing system deficiencies through a failure mode, effects, and criticality analysis and a failure reporting, analysis, and corrective action system supported by such data collection.

This database would need to be easily accessible by program managers and testers. It is important for everyone to work from the same database so that requirements, specifications, and later assertions of reliability and effectiveness based on archived test results and results from modeling and simulation can be compared and contrasted. The speakers at the workshop insisted that developers in industry know the historical performance of the components or subsystems included in the system in question, so they can anticipate problems in development and work to prevent them. Therefore, it is extremely important to also include contractor test results in such an archive, since that is the only way the full history of performance can be represented.

Recommendation 3: The U.S. Department of Defense (DOD) should create a defense system data archive containing developmental test, operational test, and field performance data from both contractors and the government. Such an archive would achieve several important objectives in the development of defense systems:

• substantially increase DOD’s ability to produce more feasible requirements,
• support early detection of system defects,
• improve developmental and operational test design, and
• improve modeling and simulation through better model validation.
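As one illustration of the warranty-style, failure reporting and corrective action analyses described above, the short Python sketch below ranks the failure modes recorded in field reports by frequency. This Pareto-style view points maintenance and redesign effort at the few modes driving most failures and, by extension, much of the replacement-part cost. The function name and the sample failure modes are invented for illustration; they are not drawn from any DOD data set.

```python
from collections import Counter
from typing import Iterable, List, Tuple


def failure_mode_pareto(failure_modes: Iterable[str]) -> List[Tuple[str, int, float]]:
    """Rank failure modes by frequency, FRACAS-style.

    Returns (mode, count, cumulative share) tuples, most frequent first, so
    analysts can see which few failure modes account for most field failures.
    """
    counts = Counter(failure_modes)
    total = sum(counts.values())
    ranked: List[Tuple[str, int, float]] = []
    cumulative = 0
    for mode, count in counts.most_common():
        cumulative += count
        ranked.append((mode, count, cumulative / total))
    return ranked


# Illustrative data only: failure modes drawn from archived field reports.
modes = ["seal leak", "seal leak", "wiring chafe", "seal leak", "sensor drift"]
for mode, count, share in failure_mode_pareto(modes):
    print(f"{mode:12s} {count:2d} failures, cumulative share {share:.0%}")
```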

Given these important benefits, DOD should initiate plans to begin creation of a defense system data archive. Some issues that need immediate resolution include (a) whether the archive should be DOD-wide or should be stratified by type of system to limit its size, (b) what data are to be included and how the data elements should be represented to facilitate linkages of related systems, and (c) what database management structure is used. In designing this archive, a flexible architecture should be used so that if the archive is initially limited to a subset of the data sources listed here because of budgetary considerations, it can be readily expanded over time to include the remaining sources.

Specifying how such a database should be constructed and what it should contain is beyond the scope of this study. DOD currently has multiple databases that have been developed in the different services for different types of systems to satisfy various needs, and they represent some aspects of the database we are describing: there are databases with developmental test data, databases that collect operational test data, databases with modeling and simulation results, and databases with field performance data. Unfortunately, in most cases, these databases are not compatible with each other. Perhaps an initial approach to the development of a test and field data archive would be to institute linkages that allow the combination of system-specific information across the existing databases.

A key reason for the lack of progress in this area is the incentive structure in the DOD acquisition environment. Individual programs do not obtain any immediate benefit from committing resources, beyond their own program, to the development and maintenance of data archives for the common good. So the first step would be for DOD to recognize the advantages of building and maintaining such a database and to explore how a data archive would be funded. With this recognition, the panel suggests that DOD could commission a committee of people with expertise in database management and people with experience in program development to propose concrete recommendations.
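The linkage approach described above could begin with nothing more than a crosswalk of identifiers. The Python sketch below is a minimal illustration, with invented database contents and key names, of how system-specific records held in otherwise incompatible databases could be combined through a shared identifier; it is not a description of any existing DOD system.

```python
from typing import Dict, List

# Per-source records, each database keeping its own local keys and layout.
# Record contents and key names here are invented for illustration.
dev_test_db = [{"dt_item": "ENG-104", "result": "fail", "hours": 40.0}]
field_db = [{"item_no": "014556789", "mission": "convoy", "failures": 3}]

# Crosswalk table: (source, local key) -> common system/component identifier.
crosswalk = {
    ("dev_test", "ENG-104"): "SYS-ALPHA/engine",
    ("field", "014556789"): "SYS-ALPHA/engine",
}


def combined_view(common_id: str) -> Dict[str, List[dict]]:
    """Gather every record, from every source, mapped to one common identifier."""
    view: Dict[str, List[dict]] = {"dev_test": [], "field": []}
    for rec in dev_test_db:
        if crosswalk.get(("dev_test", rec["dt_item"])) == common_id:
            view["dev_test"].append(rec)
    for rec in field_db:
        if crosswalk.get(("field", rec["item_no"])) == common_id:
            view["field"].append(rec)
    return view


print(combined_view("SYS-ALPHA/engine"))
```

Because each source database is left unchanged, this kind of crosswalk could be built incrementally, one system or component at a time, before any decision is made about a single consolidated archive.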

FEEDBACK LOOPS

Conclusion 8: Feedback loops can significantly improve system development by improving developmental and operational test design and improving the use of modeling and simulation.

Feedback systems can function similarly to the warranty management systems that have proved essential to the automotive industry. To develop feasible requirements, understanding how components installed in related systems have performed when fielded is extremely useful in understanding their limitations for possibly more stressful use in a proposed system. To support such feedback loops, data on field performance, test data, and results from modeling and simulation must be easily accessible, which highlights the necessity for a test and field data archive.

Field performance data are the ultimate indicators of how well a system is functioning in actual operational conditions. By field performance data, we include all the circumstances that can have an impact on the quality of the components, subsystems, and the system itself. These circumstances include all relevant pre- and postdeployment activities, including transportation, maintenance, implementation, and storage. They could also include training data, if such data are collected objectively. Such data should be used to better understand the strengths and weaknesses of newly fielded systems and can be used in feedback loops. They can indicate when components or subsystems should be modified because of inferior effectiveness or suitability, and they can be used to identify the missions for which the current system will work. For instance, if a system exhibits poor reliability in certain stressful scenarios of use, say, while carrying loads of more than a given weight, and if the reliability of the system under such conditions cannot be easily or quickly improved, the information can support a decision not to use the system for such missions (if alternatives are available). And if the reliability of the relevant component can be improved with a redesign, the information can be used to support arguments for such a redesign.

Design flaws that are identified in fielded systems can also be evidence of failure in the testing process. For instance, inferior reliability of a system under heavy loads is likely to be an indication that those weights were not used during developmental and operational testing. The reason for such an omission can then be examined, and the process for selecting experimental designs can be improved. Also, field performance data can provide information on the validity of any modeling and simulation that were used to assess operational performance. For example, if modeling and simulation were used to extrapolate from light loads instead of actual physical testing, the validity of that use of modeling and simulation can be examined and the process for validating modeling and simulation can then be improved.

The National Research Council (2003) noted two significant benefits of feedback loops: field performance data can be used to help estimate total life-cycle costs of a newly fielded system, and, in spiral development, effective feedback processes can identify enhancements that will improve the effectiveness and suitability of later stages of development. Improving the quality and timeliness of this feedback is important for being able to respond to rapid changes in threat environments.
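As a hedged sketch of the kind of routine feedback check this implies, the Python fragment below compares a modeling-and-simulation reliability prediction for a given usage condition against what field reports actually show, and flags a gap large enough to warrant revisiting both the model validation and the test design. The function, threshold, and numbers are illustrative assumptions, not a DOD procedure.

```python
def flag_extrapolation_gap(predicted_mtbf: float,
                           observed_failures: int,
                           observed_hours: float,
                           tolerance: float = 0.5) -> bool:
    """Flag usage conditions where fielded reliability falls well short of
    the modeling-and-simulation prediction.

    A large gap under conditions that were extrapolated rather than
    physically tested (e.g., heavy loads) is a cue to revisit the model
    validation and the experimental design. The 50 percent tolerance is
    purely illustrative.
    """
    if observed_failures == 0:
        return False  # no observed failures, so no evidence of a shortfall
    observed_mtbf = observed_hours / observed_failures
    return observed_mtbf < tolerance * predicted_mtbf


# Illustrative numbers: M&S predicted 500 hours mean time between failures
# under heavy loads, but field reports show 6 failures in 1,200 hours.
print(flag_extrapolation_gap(predicted_mtbf=500.0,
                             observed_failures=6,
                             observed_hours=1200.0))  # True: investigate
```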

It is the panel’s understanding that such a feedback loop currently operates only when a system is underperforming in a dramatic way. Instead, such analysis and feedback should be routine. Although DOD does collect field performance data, they are of highly varying quality and are not archived in a way that facilitates analysis (see discussion in Chapter 5). To varying degrees, the services do use a deficiency reporting process as a feedback mechanism during developmental programs, starting with the design review and continuing through testing. Deficiencies are categorized to identify the relative importance and urgency of a response. For example, in the Air Force, the stated purpose of the deficiency reporting investigation and resolution process is to provide “a means of identifying deficiencies, resolving those deficiencies within the bounds of program resources and the appropriate acceptance of risk for those deficiencies that cannot be resolved in a timely manner” (U.S. Air Force, 2007:1-1). However, the process has mostly been allowed to atrophy in the past 15 years, for several reasons, most notably the services’ lack of participation in developmental testing and the need to ignore all but the most critical deficiencies that are identified because of a lack of funds to take corrective actions.

Recommendation 4: After a test and field data archive has been established, the Under Secretary of Defense for Acquisition, Technology, and Logistics and the acquisition executives in the military services should lead a U.S. Department of Defense (DOD) effort to develop feedback loops on improving fielded systems and on better understanding tactics of use of fielded systems. The DOD acquisition and testing communities should also learn to use feedback loops to improve the process of system development, to improve developmental and operational test schemes, and to improve any modeling and simulation used to assess operational performance.

SYSTEMS ENGINEERING CAPABILITIES IN DOD

Conclusion 9: The U.S. Department of Defense has lost much of its expertise in all aspects of systems engineering in recent years. It is important to have in-house capability in the critical areas relating to the design, development, and operation of major types of systems and subsystems. One such area is expertise in model-based design tools.

Some of the speakers at the workshop noted that commercial companies stress the importance of systems engineering expertise. This expertise is key not only for system development but also for requirements setting, model development, and testing.

In contrast, Adolph (2010:51) notes that in DOD: “The manpower reductions mandated by Congress in the late 1990s, followed by excessive additional services-directed reductions, have decimated the program office engineering and test support workforce as well as DOD government test organization personnel.” In addition, Adolph et al. (2008:220), summarizing a 2008 report of the Defense Science Board, state: “A second and related priority is to ensure that government organizations reconstitute a cadre of experienced test and evaluation, engineering, and RAM personnel to support the acquisition process.”

In order to improve its test and development process, DOD will have to reverse this trend. It appears that DOD has recognized this problem and is taking steps to rectify it. The panel applauds this effort, but we emphasize that, even with a dedicated and sustained effort, it will take a decade or more to regain the capabilities that DOD had in the early 1990s. Therefore, DOD should examine short-term use of contractors, academics, employees of national laboratories, and others so that many of the recommendations in this and other studies can be implemented in a timely manner. The problem of systems engineering capability is also complicated by the reduced numbers of U.S. citizens who are acquiring engineering degrees. DOD should examine creative alternatives, including ways to engage noncitizen engineers on DOD acquisition programs, temporary employment opportunities, fellowships, internships, and sabbaticals of various kinds.