Data Storage and Use

Better use of data within DoD could improve the quality of decisions regarding weapon acquisition; since the workshop, DoD has drafted a directive on collection of data for weapon system development. Presentations and subsequent discussions at the workshop suggested that information from field and simulated testing (including associated costs) could be stored in more accessible records and shared more frequently across organizational lines. The creation of a new DoD statistical agency or less ambitious institutional reforms might facilitate efforts at quality improvement.

Hodges recommended that DoD establish and maintain strong links between data archiving and management operations and the groups engaged in statistical analysis of these data. With such organizational ties, statistical approaches might provide, as David Chu has suggested, leverage in determining what data to collect and how much testing is required. Hodges and others urged that DoD be open to possible new institutional arrangements to achieve its objectives.

Several participants pointed out that DoD testing data are expensive and hard to gather and should therefore be valued highly. Even more scarce, as Chu noted in his remarks, are combat action data. Yet DoD agencies do not systematically build or use bodies of data on DoD systems. Hodges expressed the belief that, although some individuals value data and maintain archives, the importance of systematic accumulation and active use of data is not widely appreciated in DoD. He cited experiences involving missing documentation of helicopter operational tests and unused data about the accuracy of artillery forward observers. Also, there appear to be only limited attempts to gain information in a systematic way from training exercises for the purpose of improving combat analyses. Hodges, for example, has concluded that the Army has been unwilling to commit analytically trained personnel on the ground at the National Training Center and other facilities in numbers commensurate with those facilities' importance as sources of data. Field test data, it appears, are often put away and forgotten once the regulatory requirement is satisfied. Seglie characterized the Pentagon attitude toward data as a belief that testing handicaps the forces by consuming money and equipment that could be directed elsewhere.

Numerous workshop participants called for linking data from operational and developmental testing and for drawing on experience with similar systems; both activities implicitly require good data storage and accessibility. Gaver noted the need to work with historical data on various categories of equipment. Both Gaver and Fries noted recent movement in this direction: Fries pointed to the Army's current effort to establish a master test and evaluation data base and also called for comprehensive data collection in operational testing, with integrity checks to ensure data quality. Larry Crow also cited the need for a sound data collection system to assess the reliability growth of defense systems.
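
To illustrate the kind of analysis such a data collection system would feed, the sketch below fits a power-law reliability growth model of the kind associated with Crow's work to a record of failure times from a time-terminated test. The failure times are invented for illustration, and the model choice is an assumption; the workshop summary does not specify Crow's method.

```python
import math

# Hypothetical cumulative failure times (hours) from a time-terminated
# reliability growth test; the data are invented for illustration only.
failure_times = [12.0, 35.0, 61.0, 110.0, 180.0, 290.0, 455.0]
T = 500.0  # total test time (hours)

n = len(failure_times)

# Maximum-likelihood estimates for the power-law model, in which the
# expected number of failures by time t is lambda * t**beta.
beta_hat = n / sum(math.log(T / t) for t in failure_times)
lambda_hat = n / T**beta_hat

# Instantaneous failure intensity at the end of test; beta < 1 indicates
# reliability growth (failures are arriving more slowly over time).
intensity = lambda_hat * beta_hat * T**(beta_hat - 1)
print(f"beta = {beta_hat:.3f}  (growth if < 1)")
print(f"current MTBF estimate = {1.0 / intensity:.1f} hours")
```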

Regarding the cost and operational effectiveness analyses (COEAs) performed prior to concept demonstration approval (milestone 1) and development approval (milestone 2), Lese stressed the importance of developing validated data bases. He also called for technical and operational corroboration of data by engineering assessments and/or performance tests, and he argued that, because of the attention given to cost estimates in COEAs, these estimates should be validated and should include uncertainty and sensitivity analysis.
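
As a minimal sketch of what uncertainty and sensitivity analysis of a cost estimate can involve, the example below propagates uncertainty in a few cost drivers through a simple total-cost estimate by Monte Carlo simulation. The cost elements, distributions, and dollar figures are all hypothetical, and this is one common approach rather than the specific procedure Lese had in mind.

```python
import random
import statistics

random.seed(1)

# Hypothetical cost drivers for a weapon system estimate, each modeled
# with a triangular distribution (low, mode, high) in millions of
# dollars; all figures are invented for illustration.
drivers = {
    "development": (800.0, 1000.0, 1400.0),
    "procurement": (2500.0, 3000.0, 4000.0),
    "testing":     (150.0, 200.0, 350.0),
}

def draw() -> dict:
    # Note: random.triangular takes arguments in the order (low, high, mode).
    return {name: random.triangular(lo, hi, mode)
            for name, (lo, mode, hi) in drivers.items()}

samples = [draw() for _ in range(10_000)]
totals = [sum(s.values()) for s in samples]

# Uncertainty analysis: report the spread of the total, not just a point.
ordered = sorted(totals)
print(f"median total cost: {statistics.median(totals):,.0f} M$")
print(f"80% interval: {ordered[1000]:,.0f} to {ordered[9000]:,.0f} M$")

# Crude sensitivity analysis: the correlation of each driver with the
# total shows which input uncertainties drive the overall uncertainty.
for name in drivers:
    xs = [s[name] for s in samples]
    print(f"{name}: r = {statistics.correlation(xs, totals):+.2f}")
```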

Other participants touched on the need for more careful collection and storage of cost data. Gaver called for a more explicit accounting of costs in reliability testing. In their background paper (Appendix B), Peggy Mion and John Gehrig identified seven problems in understanding and determining the cost of testing for major weapons programs: (1) resurrecting financial records over the long acquisition period (15 to 20 years); (2) determining boundaries between related programs; (3) accounting for development testing, given contractor sensitivities about providing this information; (4) differentiating testing costs from the costs of tactics and doctrine development and training; (5) accounting for the institutional costs of test range use; (6) accounting for the use of resources in the program manager's offices; and (7) accounting for the cost of production testing.

Mion and Gehrig concluded that, although they had achieved some degree of success in determining test costs for the Army tactical missile system, there are major difficulties in obtaining definitive test cost data for any major acquisition program. These difficulties arise because relevant cost information does not reside in a single repository but is divided between project management offices and test and evaluation agencies. The authors also cited variations in cost accounting across test and evaluation agencies and developmental testing organizations.

Hodges identified two institutional problems contributing to poor data utilization within DoD. First, no one has responsibility for accumulating test data in one place and ensuring their proper use. Second, the chief players (defense contractors, the services, and the Office of the Secretary of Defense) have competing objectives that complicate the collection and interchange of data. A technical problem is that some of the needed statistical methods, such as Bayesian methods for incorporating prior information, are not trivial, which complicates their routine implementation. Hodges suggested, therefore, that DoD needs to develop a collection of methods for sensibly using an accumulated body of data.
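
To make the technical point concrete, here is a minimal sketch of one such method: a conjugate beta-binomial update in which pass/fail results from earlier developmental testing form a prior that is then combined with operational test results. The test counts are hypothetical, and this is only the simplest instance of the Bayesian machinery the passage refers to.

```python
# Hypothetical pass/fail counts, invented for illustration.
dt_successes, dt_trials = 18, 20   # prior information from developmental tests
ot_successes, ot_trials = 7, 10    # new operational test results

# Encode the developmental results as a Beta(a, b) prior on the
# probability of success, starting from a uniform Beta(1, 1).
a = 1 + dt_successes
b = 1 + (dt_trials - dt_successes)

# Conjugate update: the operational data simply add to the counts.
a_post = a + ot_successes
b_post = b + (ot_trials - ot_successes)

print(f"prior mean:     {a / (a + b):.3f}")
print(f"posterior mean: {a_post / (a_post + b_post):.3f}")
```

In practice the developmental-test prior would often be down-weighted, for instance by raising its likelihood contribution to a fractional power, to reflect differences between developmental and operational conditions; that refinement is part of what makes routine implementation nontrivial.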

Hodges and others believe that DoD should identify an agency to serve as the designated repository of data and the advocate of data collection and use. Such an agency could be a new office, an existing office, or a consortium of existing offices. To function effectively as a data advocate, the agency would need broad access to the data being collected, some role in deciding what data will be collected, and some role in deciding how data are used. In closing remarks at the workshop, Donald Rubin echoed Hodges's call for a data advocate within DoD, and Samaniego emphasized the need for long-term data management.
