
Appendix C

Implementation Guidelines


OFFICE OF THE SECRETARY OF DEFENSE

WASHINGTON, D.C. 20301

MEMORANDUM FOR THE ASSISTANT SECRETARY OF THE ARMY (RESEARCH, DEVELOPMENT AND ACQUISITION)

ASSISTANT SECRETARY OF THE NAVY (RESEARCH, DEVELOPMENT AND ACQUISITION)

ASSISTANT SECRETARY OF THE AIR FORCE (ACQUISITION)

SUBJECT: Implementation Guidelines for Relating Cost and Operational Effectiveness Analysis (COEA) Measures of Effectiveness (MOEs) to Test and Evaluation

Current acquisition policy emphasizes that cost and operational effectiveness analyses (COEAs) and test and evaluation are aids to decision making. COEAs are analytic tools used to determine whether the military benefit of alternative means of meeting operational requirements is worth the cost. Test and evaluation aids decisionmakers by verifying that systems have attained their technical performance specifications and objectives and are operationally effective and suitable for their intended use.

The acquisition policies also state that a linkage should exist between COEAs and test and evaluation, particularly in the measures of effectiveness (MOEs) and performance parameters that define the military utility of a system. Attached are guidelines that have been developed to implement this policy consistently throughout the acquisition process.

Attachment


Relating Cost and Operational Effectiveness Analysis (COEA)

Measures of Effectiveness (MOEs) to Test and Evaluation

Implementation Guidelines

ACQUISITION POLICY

Current acquisition policy states that cost and operational effectiveness analyses (COEAs) and test and evaluation are aids to decisionmaking. The COEA aids decisionmakers in judging whether any of the proposed alternatives to the current program offers sufficient military benefit to be worth the cost. Test and evaluation aids decisionmakers by verifying that systems have attained their technical performance specifications and objectives and are operationally effective and operationally suitable for their intended use.

Current acquisition policies also state that a linkage should exist between COEAs and test and evaluation. Specifically, DoD Instruction 5000.2, “Defense Acquisition Management Policies and Procedures,” February 23, 1991 (Part 4, Section E, paragraph 3.a(5)) states:

To judge whether an alternative is worthwhile, one must first determine what it takes to make a difference. Measures of effectiveness should be defined to measure operational capabilities in terms of engagement or battle outcomes. Measures of performance, such as weight and speed, should relate to the measures of effectiveness such that the effect of a change in the measure of performance can be related to a change in the measure of effectiveness. . . .

(c) Measures of effectiveness should be developed to a level of specificity such that a system's effectiveness during developmental and operational testing can be assessed with the same effectiveness criteria as used in the cost and operational effectiveness analysis. This will permit further refinement of the analysis to reassess cost effectiveness compared to alternatives in the event that performance, as determined during testing, indicates a significant drop in effectiveness (i.e., to or below a threshold) compared to the levels assumed in the initial analysis.
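The paragraph quoted above asks that each measure of performance be tied to a measure of effectiveness, so that a change in one can be translated into a change in the other. As a minimal sketch of what such a linkage can look like in practice (an illustration added here, not part of the policy text), the Python fragment below assumes a hypothetical engagement model in which one MOE, probability of mission success, is a simple function of two testable MOPs, sensor detection range and platform speed; the functional form and all numbers are invented.

```python
# Hypothetical illustration only: the model form, MOP names, and numbers are
# invented. A real COEA would use its own validated engagement or campaign model.

def moe_mission_success(detection_range_km: float, speed_kts: float) -> float:
    """Toy MOE: probability of mission success as a function of two MOPs."""
    base_range, base_speed = 50.0, 400.0   # baselines assumed in the COEA
    raw = (0.6 * (detection_range_km / base_range) ** 0.5
           + 0.3 * (speed_kts / base_speed) ** 0.25)
    return max(0.0, min(1.0, raw))         # a probability, clamped to [0, 1]

def moe_partials(range_km: float, speed_kts: float, step: float = 1e-3):
    """Central-difference estimates of dMOE/dMOP at an operating point."""
    d_range = (moe_mission_success(range_km + step, speed_kts)
               - moe_mission_success(range_km - step, speed_kts)) / (2 * step)
    d_speed = (moe_mission_success(range_km, speed_kts + step)
               - moe_mission_success(range_km, speed_kts - step)) / (2 * step)
    return d_range, d_speed

# How much does a shortfall in either tested MOP move the MOE at baseline?
dr, ds = moe_partials(50.0, 400.0)
print(f"dMOE/d(range) = {dr:.4f} per km; dMOE/d(speed) = {ds:.5f} per knot")
```

With a relation of this kind documented in the COEA, a measured shortfall in a testable MOP can be converted directly into an expected change in the MOE, which is the traceability that paragraph (c) asks for.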

Similarly, DoD 5000.2-M, “Defense Acquisition Management Documentation and Reports,” February 1991 (Part 8, paragraph 2.a(5)) states:

A comprehensive test and evaluation program is an integral factor in analyzing operational effectiveness, since it will provide test results at each milestone decision point that give credence to the key assumptions and estimates that may have been made in the current or earlier cost and operational effectiveness analysis.

In order to implement this guidance consistently throughout the acquisition process, the following guidelines have been developed.

GUIDELINES

  1. The DoD Component, in the process of performing a Milestone I COEA, should identify the MOEs to be used in the COEA and show how these MOEs are derived from the Mission Need Statement (MNS). Each COEA should include MOEs reflecting operational utility that can be tested. For those MOEs that cannot be directly tested, the COEA should show how changes in testable parameters or measures of performance (MOPs) can be related to changes in COEA MOEs. The MOEs and the related MOPs should be included in the Operational Requirements Document (ORD), and the key MOEs/MOPs should also be included in the Acquisition Program Baseline (APB), subject to review by the Requirements Validation Authority and approval by the Milestone Decision Authority.

  2. The Test and Evaluation Master Plan (TEMP) should document how the COEA MOEs and related MOPs will be addressed in test and evaluation. Consistency should be maintained across all acquisition management documentation; in particular, the MOEs, MOPs, and criteria in the ORD, the COEA, the TEMP, and the APB should be consistent.

  3. For a variety of reasons (e.g., affordability, availability of test resources, safety constraints), the proposed operational test environment frequently differs significantly from that assumed in the COEA. Principal differences include:

    • the proposed operational test scenario and threat representation

    • safety and operating restrictions not represented in the COEA

    • differences between the test articles and the system as represented in the COEA (e.g., level of system maturity, instrumentation) that could cause a discrepancy in measured effectiveness.

    In assessing the possible impact of test limitations, the DoD Component responsible for the COEA should explain in a quantitative evaluation how and to what extent COEA results would be expected to vary as a result of test limitations. This should aid decisionmakers in their later evaluation of the test. (A sketch of such a quantitative evaluation appears after this list.)

  4. The Milestone Decision Authority and the Requirements Validation Authority should be able to review the COEA using test results (developmental and operational test, as appropriate to the milestone decision) to reaffirm the decision that the selected alternative is a cost-effective approach to satisfying an operational requirement. If the system effectiveness thresholds stipulated in the APB and ORD and used in the COEA are not supported by test results, the COEA sensitivity analyses should be available to assist in determining whether the system, as tested, still offers sufficient military benefit to be worth its cost and whether the system can be confirmed to be operationally effective and operationally suitable.
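To make guidelines 3 and 4 concrete, the sketch below (again an added illustration, not part of the memorandum) re-ranks a toy set of COEA alternatives when operational testing measures an MOE below the level the COEA assumed; every system name, cost, MOE value, and threshold is hypothetical, and a single cost-per-effectiveness ratio is a drastic simplification of a real COEA, which weighs many MOEs across scenarios.

```python
# Hypothetical illustration of guidelines 3 and 4: all systems, costs, MOE
# values, and thresholds are invented for the sake of the example.

from dataclasses import dataclass

@dataclass
class Alternative:
    name: str
    cost_musd: float    # life-cycle cost, millions of dollars (invented)
    assumed_moe: float  # MOE level assumed in the original COEA

ALTERNATIVES = [
    Alternative("Current program", 980.0, 0.78),
    Alternative("Selected system", 1100.0, 0.90),
    Alternative("Upgrade option", 1050.0, 0.84),
]

APB_THRESHOLD = 0.85  # effectiveness threshold stipulated in the APB/ORD

def cost_per_effectiveness(alt: Alternative, moe: float) -> float:
    """Toy figure of merit: cost per unit of effectiveness (lower is better)."""
    return alt.cost_musd / moe

def reassess(selected: Alternative, tested_moe: float) -> None:
    """Re-rank the alternatives with the tested MOE substituted for the
    selected system's assumed MOE, as guideline 4 envisions."""
    if tested_moe < APB_THRESHOLD:
        print(f"Tested MOE {tested_moe:.2f} is below the APB threshold "
              f"{APB_THRESHOLD:.2f}: revisit the COEA sensitivity analyses.")
    for alt in sorted(ALTERNATIVES,
                      key=lambda a: cost_per_effectiveness(
                          a, tested_moe if a is selected else a.assumed_moe)):
        moe = tested_moe if alt is selected else alt.assumed_moe
        print(f"{alt.name:16s} MOE={moe:.2f} "
              f"cost/MOE={cost_per_effectiveness(alt, moe):7.1f}")

# Operational testing measured 0.82 where the COEA had assumed 0.90:
reassess(ALTERNATIVES[1], tested_moe=0.82)
```

In this toy run the selected system falls from the best cost-per-effectiveness ratio (under its assumed MOE of 0.90) to the worst once the tested value of 0.82 is substituted, which is exactly the situation in which guideline 4 directs decisionmakers back to the COEA sensitivity analyses before confirming that the system is operationally effective and suitable.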
