Appendix A

Agenda

WORKSHOP ON STATISTICAL ISSUES IN DEFENSE ANALYSIS AND TESTING

  1. Welcome and Introduction to the Workshop

John Rolph

Suzanne Woolsey

Miron Straf

  2. Overview of Statistical Issues in Defense Analysis and Testing

David Chu

Robert Duncan

  3. Cost and Operational Effectiveness Analyses and Operational Tests

    Rapporteur: Michael Cohen

  4. The purpose of cost and operational effectiveness analyses (COEAs) for weapon systems: What do they tell us? What do we do now?

William Lese

  5. Why operational tests? How much is enough? What do we do now?

Ernest Seglie

  6. Discussion

Stephen Vardeman

Michael Woodroofe

  7. Rejoinders and general discussion



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Design of Experiments

    Rapporteur: Louis Gordon

  1. Design of experiments in COEAs: Where should we go next?

Cyrus Staniec

  2. Design of experiments in operational test and evaluation (OT&E): Where should we go next?

Arthur Fries

  3. Discussion

Vijayan Nair

Jerome Sacks

  4. Rejoinders and general discussion

Panel 1: Risk Management and the Connection Between COEAs, Developmental Testing, and Operational Testing

    Rapporteur: Michael Cohen

  1. Introduction: COEAs, developmental testing, and operational testing

William Lese

Richard Ledesma

Ernest Seglie

  2. Panel discussion

Henry Dubin

James Duff

Martin Meth

Marion Williams

Larry Crow

Donald Gaver

  3. General discussion

Risk Assessment and Costs

    Rapporteur: Michael Cohen

  1. Risk assessment in modeling and testing

Charles Horton

Patricia Sanders

  2. Cost of tests. Cost of programs.

John Gehrig

  3. Discussion

Arthur Dempster

James Hodges

  4. Rejoinders and general discussion

Modeling for Weapon Systems

    Rapporteur: Nancy Spruill

  1. Modeling for weapon systems in an operational context: How to evaluate the nail's contribution to the war

Jim Metzger

  2. Discussion

Kathryn Laskey

Stephen Pollock

  3. Rejoinders and general discussion

Panel 2: Where should we go in the future?

  1. Presentations

Richard Ledesma

William Lese

Ernest Seglie

  2. Discussion

Donald Rubin

Francisco Samaniego

  3. General discussion

Report from Rapporteurs

Michael Cohen

Louis Gordon

Nancy Spruill

Concluding Remarks

John Rolph

Appendix B

Background Materials

The Workshop on Statistical Issues in Defense Analysis and Testing was held jointly by the Committee on National Statistics and the Committee on Applied and Theoretical Statistics on September 24-25, 1992. As part of the workshop, defense analysts were invited to write and present background papers on substantive areas in which they sought improvements through the application of statistical methods. Below is a list of these papers, which are available from their respective authors. Commentaries and discussions of the background papers were also prepared by various workshop participants; please refer to Appendix A for a list of the discussion sessions.

Techniques for Determining Test Size
James B. Duff, Navy Operational Test and Evaluation Force, Norfolk, Virginia

Design of Experiments in Operational Test and Evaluation (OT&E): Where Should We Go Next?
Arthur Fries, Institute for Defense Analyses, Alexandria, Virginia

Risk Assessment in Modeling and Testing
Charles Horton, Office of the Director, Operational Test and Evaluation, The Pentagon

Cost and Operational Effectiveness Analyses (COEAs) and the Acquisition Process
William Lese, Office of the Assistant Secretary of Defense for Program Analysis and Evaluation, The Pentagon

Reliability Testing for Complex Single Shot Equipment and Highly Reliable Continuous Equipment
Martin Meth and Robert Read, Office of the Under Secretary of Defense for Acquisition, Production, and Logistics, The Pentagon

Modeling for Weapon Systems in an Operational Context
Jim Metzger, Office of the Assistant Secretary of Defense for Program Analysis and Evaluation, The Pentagon

Cost of Testing/Program Costs
Peggy Mion and John Gehrig, Army Test and Evaluation Management Agency, The Pentagon

Risk Assessment in Modeling and Simulation
Patricia Sanders, Office of the Assistant Secretary of Defense for Program Analysis and Evaluation, The Pentagon

How Much Testing Is Enough?
Ernest Seglie, Office of the Director, Operational Test and Evaluation, The Pentagon

Design of Experiments in Operational Effectiveness Analysis
Cyrus Staniec, Office of the Assistant Secretary of Defense for Program Analysis and Evaluation, The Pentagon

Appendix C

Implementation Guidelines

OFFICE OF THE SECRETARY OF DEFENSE
WASHINGTON, D.C. 20301

MEMORANDUM FOR THE ASSISTANT SECRETARY OF THE ARMY (RESEARCH, DEVELOPMENT AND ACQUISITION)
ASSISTANT SECRETARY OF THE NAVY (RESEARCH, DEVELOPMENT AND ACQUISITION)
ASSISTANT SECRETARY OF THE AIR FORCE (ACQUISITION)

SUBJECT: Implementation Guidelines for Relating Cost and Operational Effectiveness Analysis (COEA) Measures of Effectiveness (MOEs) to Test and Evaluation

Current acquisition policy emphasizes that cost and operational effectiveness analyses (COEAs) and test and evaluation are aids to decision making. COEAs are analytic tools used to determine whether the military benefit of alternative means of meeting operational requirements is worth the cost. Test and evaluation aids decisionmakers by verifying that systems have attained their technical performance specifications and objectives and are operationally effective and suitable for their intended use. The acquisition policies also state that a linkage should exist between COEAs and test and evaluation, particularly in the measures of effectiveness (MOEs) and performance parameters that define the military utility of a system.

Attached are guidelines that have been developed to implement this policy consistently throughout the acquisition process.

Attachment

Relating Cost and Operational Effectiveness Analysis (COEA) Measures of Effectiveness (MOEs) to Test and Evaluation

Implementation Guidelines

ACQUISITION POLICY

Current acquisition policy states that cost and operational effectiveness analyses (COEAs) and test and evaluation are aids to decisionmaking. The COEA aids decisionmakers in judging whether or not any of the proposed alternatives to the current program offer sufficient military benefit to be worth the cost. Test and evaluation aids decisionmakers by verifying that systems have attained their technical performance specifications and objectives and are operationally effective and operationally suitable for their intended use.

Current acquisition policies also state that a linkage should exist between COEAs and test and evaluation. Specifically, DoD Instruction 5000.2, "Defense Acquisition Management Policies and Procedures," February 23, 1991 (Part 4, Section E, paragraph 3.a(5)), states:

To judge whether an alternative is worthwhile, one must first determine what it takes to make a difference. Measures of effectiveness should be defined to measure operational capabilities in terms of engagement or battle outcomes. Measures of performance, such as weight and speed, should relate to the measures of effectiveness such that the effect of a change in the measure of performance can be related to a change in the measure of effectiveness. . . . (c) Measures of effectiveness should be developed to a level of specificity such that a system's effectiveness during developmental and operational testing can be assessed with the same effectiveness criteria as used in the cost and operational effectiveness analysis. This will permit further refinement of the analysis to reassess cost effectiveness compared to alternatives in the event that performance, as determined during testing, indicates a significant drop in effectiveness (i.e., to or below a threshold) compared to the levels assumed in the initial analysis.

And DoD 5000.2-M, "Defense Acquisition Management Documentation and Reports," February 1991 (Part 8, paragraph 2.a(5)), states:

A comprehensive test and evaluation program is an integral factor in analyzing operational effectiveness, since it will provide test results at each milestone decision point that give credence to the key assumptions and estimates that may have been made in the current or earlier cost and operational effectiveness analysis.

In order to implement this guidance consistently throughout the acquisition process, the following guidelines have been developed.

GUIDELINES

The DoD component, in the process of performing a Milestone I COEA, should identify the MOEs to be used in the COEA and show how these MOEs are derived from the Mission Need Statement (MNS). Each COEA should include MOEs reflecting operational utility that can be tested. For those MOEs that cannot be directly tested, the COEA should show how changes in testable parameters or measures of performance (MOPs) can be related to changes in COEA MOEs.

The MOEs and the related MOPs should be included in the Operational Requirements Document (ORD), and the key MOEs/MOPs should also be included in the Acquisition Program Baseline (APB), subject to review by the Requirements Validation Authority and approval by the Milestone Decision Authority. The Test and Evaluation Master Plan (TEMP) should document how the COEA MOEs and related MOPs will be addressed in test and evaluation.

Consistency should be maintained across all of the acquisition management documentation. In particular, the MOEs, MOPs, and criteria in the ORD, the COEA, the TEMP, and the APB should be consistent.

For a variety of reasons (e.g., affordability, availability of test resources, safety constraints), the proposed operational test environment frequently differs significantly from that assumed in the COEA. Principal differences include:

- the proposed operational test scenario and threat representation;
- safety and operating restrictions not represented in the COEA; and
- differences between the test articles and the system as represented in the COEA (e.g., level of maturity of the system, instrumentation) that could cause a discrepancy in measured effectiveness.

In assessing the possible impact of test limitations, the DoD component responsible for the COEA should explain in a quantitative evaluation how and to what extent COEA results would be expected to vary as a result of test limitations. This should aid decisionmakers in their later evaluation of the test.

The Milestone Decision Authority and the Requirements Validation Authority should be able to review the COEA using test results (developmental and operational, as appropriate to the milestone decision) to reaffirm the decision that the selected alternative is a cost-effective approach to satisfying an operational requirement. If the system effectiveness thresholds stipulated in the APB and ORD and used in the COEA are not supported by test results, the COEA sensitivity analyses should be available to assist in determining whether the system, as tested, still offers sufficient military benefit to be worth its cost and whether the system can be confirmed to be operationally effective and operationally suitable.
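The guidelines above ask that a change in a testable MOP be relatable to a change in a COEA MOE, and that sensitivity analyses be available when tested performance falls below a threshold. A minimal sketch of that logic, with the caveat that the response curve, the specific MOP and MOE, and every number below are invented for illustration and do not appear in the memorandum:

```python
import math

def moe_from_mop(detection_range_km: float) -> float:
    """Assumed (illustrative) response curve relating one MOP (sensor
    detection range, km) to one MOE (probability of mission success),
    saturating near 0.95 at long ranges."""
    return 0.95 * (1.0 - math.exp(-detection_range_km / 40.0))

assumed_mop = 80.0    # detection range assumed in the COEA (hypothetical)
tested_mop = 55.0     # detection range observed in operational test (hypothetical)
moe_threshold = 0.75  # minimum acceptable MOE value (hypothetical)

assumed_moe = moe_from_mop(assumed_mop)
tested_moe = moe_from_mop(tested_mop)

print(f"MOE assumed in the COEA: {assumed_moe:.3f}")
print(f"MOE implied by the tested MOP: {tested_moe:.3f}")

# The case DoDI 5000.2 anticipates: tested performance drops the MOE to or
# below the threshold, triggering a reassessment of cost effectiveness.
if tested_moe <= moe_threshold:
    print("Tested MOE at or below threshold: revisit the COEA sensitivity analyses.")
else:
    print("Tested MOE still above threshold.")
```

The point of the sketch is only the shape of the check: once the COEA documents an explicit MOP-to-MOE relationship, a tested MOP value can be pushed through it and compared against the same effectiveness criteria used in the original analysis.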