Suggested Citation:"Front Matter." National Research Council. 1998. Statistics, Testing, and Defense Acquisition: New Approaches and Methodological Improvements. Washington, DC: The National Academies Press. doi: 10.17226/6037.

Statistics, Testing, and Defense Acquisition

New Approaches and Methodological Improvements

Michael L. Cohen, John E. Rolph, and Duane L. Steffey, Editors

Panel on Statistical Methods for Testing and Evaluating Defense Systems

Committee on National Statistics

Commission on Behavioral and Social Sciences and Education

National Research Council

NATIONAL ACADEMY PRESS
Washington, D.C.
1998


NATIONAL ACADEMY PRESS
2101 Constitution Avenue, N.W. Washington, D.C. 20418

NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance.

The project that is the subject of this report is supported by Contract DASW01-94-C-0119 between the National Academy of Sciences and the Director, Operational Test and Evaluation at the U.S. Department of Defense. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the organizations or agencies that provided support for this project.

Additional copies of this report are available from the
National Academy Press,
2101 Constitution Avenue, NW, Lock Box 285, Washington, DC 20055. (800) 624-6242 or (202) 334-3313 (in the Washington Metropolitan Area).

Library of Congress Cataloging-in-Publication Data

Statistics, testing, and defense acquisition: new approaches and methodological improvements / Michael L. Cohen, John E. Rolph, and Duane L. Steffey, editors; Panel on Statistical Methods for Testing and Evaluating Defense Systems, Committee on National Statistics, Commission on Behavioral and Social Sciences and Education, National Research Council.

p. cm.

Includes bibliographical references (p.) and index.

ISBN 0-309-06551-8 (pbk.)

1. United States—Armed Forces—Weapons systems—Testing—Statistical methods. I. Cohen, Michael L. II. Rolph, John E. III. Steffey, Duane L. IV. National Research Council (U.S.). Commission on Behavioral and Social Sciences and Education. Panel on Statistical Methods for Testing and Evaluating Defense Systems.

UF503 .S727 1998

358.4'1807'0973—ddc21

98-9065

Copyright 1998 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America


PANEL ON STATISTICAL METHODS FOR TESTING AND EVALUATING DEFENSE SYSTEMS

JOHN E. ROLPH (Chair),

Marshall School of Business, University of Southern California

MARION R. BRYSON,

North Tree Management, Monterey, California

HERMAN CHERNOFF,

Department of Statistics, Harvard University

JOHN D. CHRISTIE,

Logistics Management Institute, McLean, Virginia

LOUIS GORDON,

Private Consultant, Palo Alto, California

KATHRYN BLACKMOND LASKEY,

Department of Systems Engineering and Center of Excellence in C3I, George Mason University

ROBERT C. MARSHALL,

Department of Economics, Pennsylvania State University

VIJAYAN N. NAIR,

Department of Statistics, University of Michigan

ROBERT T. O'NEILL,

Division of Biometrics, Food and Drug Administration, U.S. Department of Health and Human Services

STEPHEN M. POLLOCK,

Department of Industrial and Operations Engineering, University of Michigan

JESSE H. POORE,

Department of Computer Science, University of Tennessee

FRANCISCO J. SAMANIEGO,

Division of Statistics, University of California, Davis

DENNIS E. SMALLWOOD,

Department of Social Sciences, U.S. Military Academy

MICHAEL L. COHEN, Study Director

DUANE L. STEFFEY, Study Director (to July 1995); Consultant

ANURADHA P. DAS, Research Assistant

ERIC M. GAIER, Consultant

CANDICE S. EVANS, Senior Project Assistant


COMMITTEE ON NATIONAL STATISTICS 1997-1998

NORMAN M. BRADBURN (Chair),

National Opinion Research Center, University of Chicago

JULIE DAVANZO,

RAND, Santa Monica, California

WILLIAM F. EDDY,

Department of Statistics, Carnegie Mellon University

JOHN F. GEWEKE,

Department of Economics, University of Minnesota, Minneapolis

ERIC A. HANUSHEK,

W. Allen Wallis Institute of Political Economy, Department of Economics, University of Rochester

RODERICK J.A. LITTLE,

Department of Biostatistics, University of Michigan

THOMAS A. LOUIS,

School of Public Health, University of Minnesota

CHARLES F. MANSKI,

Department of Economics, University of Wisconsin

WILLIAM NORDHAUS,

Department of Economics, Yale University

JANET L. NORWOOD,

The Urban Institute, Washington, D.C.

EDWARD B. PERRIN,

School of Public Health and Community Medicine, University of Washington

PAUL ROSENBAUM,

Department of Statistics, Wharton School, University of Pennsylvania

KEITH F. RUST,

Westat, Inc., Rockville, Maryland

FRANCISCO J. SAMANIEGO,

Division of Statistics, University of California, Davis

MIRON L. STRAF, Director

ANDREW WHITE, Deputy Director


Acknowledgments

The very nature of this study required the panel and staff to attend meetings (nine plenary and ten working group meetings) all over the country to collect information on test designs and evaluations for a wide variety of systems; a list of the systems we studied, including several we used as case studies, is in Appendix A. Locations we visited include the Army Test and Experimentation Command Headquarters at Fort Hunter Liggett in California; Eglin Air Force Base at Fort Walton Beach in Florida; the Air Force Operational Test and Evaluation Center in Albuquerque, New Mexico; the Navy Operational Test and Evaluation Force in Norfolk, Virginia; the Army Operational Test and Evaluation Command in Alexandria, Virginia; RAND in Santa Monica, California; and the Institute for Defense Analyses in Alexandria, Virginia. We were extremely fortunate to meet with so many people—from the military services, the Office of the Secretary of the U.S. Department of Defense (DoD), and private organizations in the testing community—willing to share their expertise with, and extend their hospitality to, the panel: to all these individuals, we are grateful.

We particularly wish to acknowledge the support of Philip Coyle, director, and Ernest Seglie, science adviser, DoD Office of the Director, Operational Test and Evaluation (the study sponsors); Henry Dubin, technical director, U.S. Army Operational Test and Evaluation Command; Steven Whitehead, technical director, U.S. Navy Operational Test and Evaluation Force; Marion Williams, technical director, U.S. Air Force Operational Test and Evaluation Center; and Robert Bell, technical director, U.S. Marine Corps Operational Test and Evaluation Activity.

In addition, many people went beyond the call of duty to assist the panel in its work. We thank, first: Kwai Chan, Christine Fossett, Jackie Guin, Louis Rodrigues, and Robert Stolba, General Accounting Office; Ric Sylvester, Office of the Deputy Undersecretary of Defense for Acquisition Reform; Lee Frame and Austin Huangfu, Office of the Director, Operational Test and Evaluation; Margaret Myers and Ray Paul, Office of the Secretary of Defense; Patricia Sanders, Office of the Undersecretary of Defense for Acquisition and Technology; Dean Zerbe, Senator Grassley's Office; and Donald Yockey, former Under Secretary of Defense for Acquisition.

From the Air Force, we thank: Howard Leaf, Director of Test and Evaluation, U.S. Air Force; Suzanne Beers, David Blanks, Lyn Canham, Michael Carpenter, Charles Carter, Angie Crawford, David Crean, William Dyess, John Faris, Tim Gooley, Anthony "Shady" Groves, Ken Hebert, Brian Ishihara, Jeff Jacobs, Eric Keck, Roderick Leitch, Scott Long, Mike Malone, Michael McHugh, Donald Merkison, Terence Mitchell, Herbert Morgan, Ken Murphy, Sharon Nichols, Steve Ordonia, Ronald Reano, Mark Reid, James Sheedy, Brian Simes, Chuck Stansberry, Cecil Stevens, Robert Stovall, Frank Swehoskey, Scott Weisgerber, Larry Wolfe, and Dave Young, U.S. Air Force Operational Test and Evaluation Center.

From the Army, we thank: Susan Wright, Army Digitization Office; Cy Lorber, Army Materiel Command; Will Brooks, Sam Frost, Dwayne Nuzman, Jim Streilein, and Bill Yeakel, Army Materiel Systems Analysis Activity; Charles Pate, Training and Doctrine Command; Larry Leiby, Scott Lucero, John McVey, and Hank Romberg, U.S. Army Operational Test and Evaluation Command; Michael Hall, Greg Kokoskie, Ed Miller, Harold Pasini, Patrick Sul, and Tom Zeberlein, U.S. Army Operational Evaluation Command; Michael Jackson and Carl Russell, U.S. Army Test and Experimentation Center.

From the Navy, we thank: James Duff, former technical director of the Navy Operational Test and Evaluation Force; Donald Gaver, Naval Postgraduate School; Karen Ahlquist, Mike Alesi, Jeff Gerlitz, Kevin Smith, and Cynthia Womble, Navy Operational Test and Evaluation Force.

And from other institutions and agencies, we thank: Nozer Singpurwalla, George Washington University; Robert Boling, Peter Brooks, William Buchanan, James Carpenter, Thomas Christie, Gary Comfort, Robert Daly, Robert Dighton, Richard Fejfar, Arthur Fries, David Hart, Kent Haspert, Anil Joglekar, Irwin Kaufman, Richard "Hap" Miller, Michael Shaw, David Spalding, Bradley Thayer, Alfred Victor, Charles Waespy, and Steve Warner, Institute for Defense Analyses; Dale Pace, Johns Hopkins University; and Patrick Vye, RAND.

This report has been reviewed by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the NRC's Report Review Committee. The purpose of this independent review is to provide candid and critical comments that will assist the authors and the NRC in making the published report as sound as possible and to ensure that the report meets institutional standards for objectivity, evidence, and responsiveness to the study charge. The content of the review comments and draft manuscript remains confidential to protect the integrity of the deliberative process.

We thank the following individuals for their participation in the review of this report: David S.C. Chu, RAND, Washington, D.C.; Phil E. DePoy, National Opinion Research Center, University of Chicago; Gerald P. Dinneen, consultant, Edina, Minnesota; William Eddy, Department of Statistics, Carnegie Mellon University; Alexander H. Flax, consultant, Potomac, Maryland; David R. Heebner, Heebner Associates, McLean, Virginia; Robert J. Hermann, United Technologies Corporation, Hartford, Connecticut; James Hodges, Division of Biostatistics, University of Minnesota, Twin Cities; William Howard, consultant, Scottsdale, Arizona; Joseph B. Kadane, Department of Statistics, Carnegie Mellon University; Patrick D. Larkey, Heinz School of Public Policy, Carnegie Mellon University; John L. McLucas, consultant, Alexandria, Virginia; General Glenn K. Otis (ret.), Newport News, Virginia; and Warren F. Rogers, Warren Rogers Associates, Inc., Middletown, Rhode Island.

While the individuals listed above provided many constructive comments and suggestions, responsibility for the final content of this report rests solely with the authoring committee and the NRC.

The panel was fortunate to have an extremely able staff who both supported and led the panel through the past four years. I would like to particularly acknowledge the contributions of Duane Steffey, study director for the first phase of the study, and Michael Cohen, study director for the final phase. Their research and organizational skills, combined with their ability to develop contacts and foster relationships in the testing community, made them an important asset to the success of this study.

The panel is grateful to Eugenia Grohman, Associate Director for Reports of the Commission on Behavioral and Social Sciences and Education (CBASSE), for her fine technical editorial work, which contributed greatly to the readability of this report.

Anu Das, research assistant for the study, provided invaluable support to the panel, particularly to the work of the software testing working group, assisting greatly in the preparation of Chapter 8. In addition to the difficult job of coordinating many offsite meetings, a task complicated by the necessity for security clearances (and a hurricane!), the panel's senior project assistant, Candice Evans, handled all aspects of report production, most notably helping prepare drafts of Chapters 1 and 2 and Appendix D.

Finally, no acknowledgment would be complete without thanking the panel members themselves: they traveled extensively to military bases and test facilities, contributed their time and expert knowledge, and drafted many of the sections of the report.

John E. Rolph, Chair

Panel on Statistical Methods for Testing and Evaluating Defense Systems


The National Academy of Sciences is a private, non-profit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Bruce Alberts is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. William A. Wulf is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Kenneth I. Shine is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy's purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Bruce Alberts and Dr. William A. Wulf are chairman and vice chairman, respectively, of the National Research Council.


Preface

The Committee on National Statistics of the National Research Council (NRC) has had a long-standing goal of encouraging the development and use of state-of-the-art statistical methods across the federal government. In this context, discussions began several years ago during meetings of the Committee on National Statistics about the possibility of providing assistance to the U.S. Department of Defense (DoD). Mutual interest between the committee and the DoD Office of Program Analysis and Evaluation in the greater application of statistics within DoD led to a meeting of key DoD personnel and several NRC staff. As a result of this meeting, system testing and evaluation emerged as an area for which improvement in the application of statistics could prove useful.

Consequently, at the request of DoD, the Committee on National Statistics, in conjunction with the NRC Committee on Applied and Theoretical Statistics, held a 2-day workshop in September 1992 on experimental design, statistical modeling, simulation, sources of variability, data storage and use, and operational testing of weapon systems. The workshop was sponsored by the Office of the Director, Operational Test and Evaluation (DOT&E) and the Office of the Assistant Secretary of Defense for Program Analysis and Evaluation. Defense analysts were invited to write and present background papers and discuss substantive areas in which they sought improvements through application of statistical methods. Statisticians and other participants responded by suggesting alternative approaches to specific problems and identifying program areas that might especially benefit from the application of improved statistical methods. The overarching theme of the workshop was that using more appropriate statistical approaches could improve the evaluation of weapon systems in the DoD acquisition process. The workshop findings were published in Statistical Issues in Defense Analysis and Testing: Summary of a Workshop (Rolph and Steffey, 1994).

Workshop participants expressed the need for a study to address in greater depth the issues that surfaced at the workshop. A multiyear panel study, sponsored by DOT&E, was undertaken by the Committee on National Statistics in early 1994. The Panel on Statistical Methods for Testing and Evaluating Defense Systems was established to recommend statistical methods for improving the effectiveness and efficiency of testing and evaluation of defense systems, with emphasis on operational testing. The 13-member panel comprised experts in the fields of statistics (including quality management, decision theory, sequential testing, reliability theory, and experimental design), operations research, software engineering, defense acquisition, and military systems.

The panel's interim report, Statistical Methods for Testing and Evaluating Defense Systems (National Research Council, 1995), presented some preliminary findings, but it did not offer any recommendations. Key chapters were devoted to experimental design of operational tests; operational testing of software-intensive systems; operational test and evaluation for reliability, availability, and maintainability; and use of modeling and simulation to assist in operational test design and evaluation.

This report presents the conclusions and recommendations resulting from the panel's 4-year study. The report is structured to accommodate various types of readers. Chapters 1-4 are for a non-technical audience. Chapter 1 discusses the panel's scope of work and how this was adjusted to deal with constraints on the application of statistics in the test and acquisition of military systems. Chapters 2 and 3 assess the current use of testing in system development and identify key elements of a new paradigm for the use of testing as part of the development of defense systems. Chapter 4 summarizes the substantial benefits that defense operational test design and evaluation would obtain from the use of statistical methods that reflect current practices. The changes recommended in Chapter 4 do not assume that the new paradigm recommended in Chapter 3 will be adopted and therefore can be implemented immediately.

Chapters 5-9 explore in more detail and more technically the topics covered in Chapter 4 as well as additional issues concerning the application of state-of-the-art statistical methods to defense operational test design and evaluation: experimental design for operational testing (Chapter 5), operational test evaluation (Chapter 6), test design and evaluation for reliability, availability, and maintainability (Chapter 7), software testing (Chapter 8), and modeling and simulation for use in operational test design and evaluation (Chapter 9).

Chapter 10 considers the need for the defense test and acquisition community to develop greater access to statistical expertise and how to do so.

Though the panel was not charged with the development or execution of technical work related to operational testing and evaluation, the panel decided that further exploration of certain technical issues would be useful for its deliberations. Thus, three technical papers were prepared: "Strategic Information Generation and Transmission: The Evolution of Institutions in DoD Operational Testing," "On the Performance of Weibull Life Tests Based on Exponential Life Testing Designs," and "Application of Statistical Science to Testing and Evaluating Software Intensive Systems." The panel has drawn from the papers, which will be published separately; abstracts of them are presented in Appendix B.

John E. Rolph, Chair

Panel on Statistical Methods for Testing and Evaluating Defense Systems


For every weapons system being developed, the U.S. Department of Defense (DOD) must make a critical decision: Should the system go forward to full-scale production? The answer to that question may involve not only tens of billions of dollars but also the nation's security and military capabilities. In the milestone process used by DOD to answer the basic acquisition question, one component near the end of the process is operational testing, to determine if a system meets the requirements for effectiveness and suitability in realistic battlefield settings. Problems discovered at this stage can cause significant production delays and can necessitate costly system redesign.

This book examines the milestone process, as well as the DOD's entire approach to testing and evaluating defense systems. It brings to the topic of defense acquisition the application of scientific statistical principles and practices.
