Appendix A
Letter Report of the Panel to the Army Test and Evaluation Command
THE NATIONAL ACADEMIES
Advisers to the Nation on Science, Engineering, and Medicine

Division of Behavioral and Social Sciences and Education
Committee on National Statistics
Panel on Operational Test Design and Evaluation of the Interim Armored Vehicle (IAV)

500 Fifth Street, NW
Washington, DC 20001
Phone: 202 334 3408
Fax: 202 334 3584
Email: jmcgee@nas.edu

October 10, 2002

Frank John Apicella
Technical Director
Army Evaluation Center
U.S. Army Test and Evaluation Command
4501 Ford Avenue
Alexandria, VA 22302-1458

Dear Mr. Apicella:

As you know, at the request of the Army Test and Evaluation Command (ATEC) the Committee on National Statistics has convened a panel to examine ATEC's plans for the operational test design and evaluation of the Interim Armored Vehicle, now referred to as the Stryker. The panel is currently engaged in its tasks of focusing on three aspects of the operational test design and evaluation of the Stryker: (1) the measures of performance and effectiveness used to compare the Interim Brigade Combat Team (IBCT), equipped with the Stryker, against a baseline force; (2) whether the current operational test design is consistent with state-of-the-art methods in statistical experimental design; and (3) the applicability of models for combining information from testing and field use of related systems and from developmental test results for the Stryker with operational test results for the Stryker. ATEC has asked the panel to comment on ATEC's current plans and to suggest alternatives. The work performance plan includes the preparation of three reports:

· The first interim report (due in November 2002) will address two
IMPROVED OPERATIONAL TESTING AND EVALUATION

topics: (1) the measures of performance and effectiveness used to compare the Stryker-equipped IBCT against the baseline force, and (2) whether the current operational test design is consistent with state-of-the-art methods in statistical experimental design.

· The second interim report (due in March 2003) will address the topic of the applicability of models for combining information from testing and field use of related systems and from developmental test results for the Stryker with operational test results for the Stryker.

· The final report (due in July 2003) will integrate the two interim reports and add any additional findings of the panel.

The reports have been sequenced and timed for delivery to support ATEC's time-critical schedule for developing plans for designing and implementing operational tests and for performing analyses and evaluations of the test results.

Two specific purposes of the initial operational test of the Stryker are to determine whether the Interim Brigade Combat Team (IBCT) equipped with the Stryker performs more effectively than a baseline force (Light Infantry Brigade), and whether the Stryker meets its performance requirements. The results of the initial operational test contribute to the Army's decisions of whether and how to employ the Stryker and the IBCT.
The panel's first interim report will address in detail factors relating to the effectiveness and performance of the Stryker-equipped IBCT and of the Stryker; effective experimental designs and procedures for testing these forces and their systems under relevant operational conditions, missions, and scenarios; subjective and objective measures of performance and effectiveness for criteria of suitability, force effectiveness, and survivability; and analytical procedures and methods appropriate to assessing whether and why the Stryker-equipped IBCT compares well (or not well) against the baseline force, and whether and why the Stryker meets (or does not meet) its performance requirements.

In the process of deliberations toward producing the first interim report that will address this broad sweep of issues relevant to operational test design and to measures of performance and effectiveness, the panel has discerned two issues with long lead times to which, in the opinion of the panel, ATEC should begin attending immediately, so that resources can be identified, mustered, and applied in time to address them: early development of a "straw man" (hypothetical draft) Test and Evaluation Report (which will support the development of measures and the test design as
well as the subsequent analytical efforts) and the scheduling of test participation by the Stryker-equipped force and the baseline force so as to remove an obvious test confounder of different seasonal conditions.

The general purpose of the initial operational test (IOT) is to provide information to decision makers about the utility of and the remaining challenges to the IBCT and the Stryker system. This information is to be generated through the analysis of IOT results. In order to highlight areas for which data are lacking, the panel strongly recommends that immediate effort be focused on specifying how the test data will be analyzed to address relevant decision issues and questions. Specifically, a straw man Test Evaluation Report (TER) should be prepared as if the IOT had been completed. It should include examples of how the representative data will be analyzed, specific presentation formats (including graphs) with expected results, insights one might develop from the data, draft recommendations, etc. The content of this straw man report should be based on the experience and intuition of the analysts and what they think the results of the IOT might look like. Overall, this could serve to provide a set of hypotheses that would be tested with the actual results. Preparation of this straw man TER will help ATEC assess those issues that cannot be informed by the operational tests as currently planned, will expose areas for which needed data are lacking, and will allow appropriate revision of the current operational test plan.

The current test design calls for the execution of the IBCT/Stryker vs. the opposing force (OPFOR) trials and the baseline vs. the OPFOR trials to be scheduled for different seasons. This design totally confounds time of year with the primary measure of interest: the difference in effectiveness between the baseline force and the IBCT/Stryker force.
The panel believes that the factors that are present in seasonal variations (weather, foliage density, light level, temperature, etc.) may have a greater effect on the differences between the measures of the two forces than the abilities of the two forces themselves. We therefore recommend that serious consideration be given to scheduling these events as closely in time as possible. One way to address the potential confounding of seasonal effects, as well as possible effects of learning by blue forces and by the OPFOR, would be to intersperse activities of the baseline force and the IBCT/Stryker force over time.

The panel remains eager to assist ATEC in improving its plans and processes for operational test and evaluation of the IBCT/Stryker. We are grateful for the support and information you and your staff have consistently provided during our efforts to date. It is the panel's hope that delivering to you the recommendations in this brief letter in a timely fashion will encourage ATEC to begin drafting a straw man Test Evaluation Report in time to influence operational test activities and to implement the change in test plan to allow the compared forces to undergo testing in the same season.

Sincerely yours,

Stephen Pollock, Chair
Panel on Operational Test Design and Evaluation of the Interim Armored Vehicle