Industrial Methods for the
Effective Development and Testing
of Defense Systems

Panel on Industrial Methods for the Effective Test
and Development of Defense Systems

Committee on National Statistics
Division of Behavioral and Social Sciences and Education

Board on Army Science and Technology
Division on Engineering and Physical Sciences

NATIONAL RESEARCH COUNCIL
OF THE NATIONAL ACADEMIES

THE NATIONAL ACADEMIES PRESS
Washington, D.C.
www.nap.edu








THE NATIONAL ACADEMIES PRESS
500 Fifth Street, N.W.
Washington, DC 20001

NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance.

Support for the work of the Committee on National Statistics is provided by a consortium of federal agencies through a grant from the National Science Foundation (grant number SES-0453930). The project that is the subject of this report was supported by an allocation to the National Science Foundation by the U.S. Department of Defense under this grant. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the organizations or agencies that provided support for this project.

International Standard Book Number-13: 978-0-309-22270-9
International Standard Book Number-10: 0-309-22270-2

Additional copies of this report are available from the National Academies Press, 500 Fifth Street, N.W., Lockbox 285, Washington, DC 20055; (800) 624-6242 or (202) 334-3313 (in the Washington metropolitan area); Internet, http://www.nap.edu.

Copyright 2012 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America

Suggested citation: National Research Council. (2012). Industrial Methods for the Effective Development and Testing of Defense Systems. Panel on Industrial Methods for the Effective Test and Development of Defense Systems. Committee on National Statistics, Division of Behavioral and Social Sciences and Education, and Board on Army Science and Technology, Division on Engineering and Physical Sciences. Washington, DC: The National Academies Press.

The National Academy of Sciences is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Ralph J. Cicerone is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. Charles M. Vest is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Harvey V. Fineberg is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy's purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Ralph J. Cicerone and Dr. Charles M. Vest are chair and vice chair, respectively, of the National Research Council.

www.national-academies.org

PANEL ON INDUSTRIAL METHODS FOR THE EFFECTIVE TEST AND DEVELOPMENT OF DEFENSE SYSTEMS

VIJAY NAIR (Chair), Department of Statistics and Department of Industrial and Operations Engineering, University of Michigan
CHARLES E. (PETE) ADOLPH, Independent consultant, Albuquerque, NM
W. PETER CHERRY, Science Applications International Corporation, Ann Arbor, MI (Retired)
JOHN D. CHRISTIE, Logistics Management Institute, Alexandria, VA
THOMAS P. CHRISTIE, Independent consultant, Arlington, VA
A. BLANTON GODFREY, College of Textiles, North Carolina State University
RAJ KAWLRA, Manufacturing Quality, Chrysler LLC, Auburn Hills, MI
JOHN E. ROLPH, Department of Information and Operations Management, Marshall School of Business, University of Southern California
ELAINE WEYUKER, AT&T Laboratories, Florham Park, NJ
MARION L. WILLIAMS, Institute for Defense Analyses, Alexandria, VA
ALYSON G. WILSON, Science and Technology Policy Institute, Institute for Defense Analyses, Washington, DC

MICHAEL L. COHEN, Study Director
MICHAEL J. SIRI, Program Associate

COMMITTEE ON NATIONAL STATISTICS
2011-2012

LAWRENCE D. BROWN (Chair), Department of Statistics, The Wharton School, University of Pennsylvania
JOHN M. ABOWD, School of Industrial and Labor Relations, Cornell University
ALICIA CARRIQUIRY, Department of Statistics, Iowa State University
WILLIAM DuMOUCHEL, Oracle Health Sciences, Waltham, MA
V. JOSEPH HOTZ, Department of Economics, Duke University
MICHAEL HOUT, Survey Research Center, University of California, Berkeley
KAREN KAFADAR, Department of Statistics, Indiana University
SALLIE KELLER, Science and Technology Policy Institute, Institute for Defense Analyses, Washington, DC
LISA LYNCH, Heller School for Social Policy and Management, Brandeis University
SALLY C. MORTON, Department of Biostatistics, University of Pittsburgh
JOSEPH NEWHOUSE, Division of Health Policy Research and Education, Harvard University
RUTH D. PETERSON, Department of Sociology (emeritus), Ohio State University
HAL S. STERN, Donald Bren School of Information and Computer Sciences, University of California, Irvine
JOHN H. THOMPSON, National Opinion Research Center at the University of Chicago
ROGER TOURANGEAU, Joint Program in Survey Methodology, University of Maryland, and Survey Research Center, University of Michigan
ALAN ZASLAVSKY, Department of Health Care Policy, Harvard University Medical School

CONSTANCE F. CITRO, Director

BOARD ON ARMY SCIENCE AND TECHNOLOGY

ALAN H. EPSTEIN (Chair), Pratt & Whitney, East Hartford, CT
DAVID M. MADDOX (Vice Chair), Independent consultant, Arlington, VA
DUANE ADAMS, Carnegie Mellon University (Retired)
ILESANMI ADESIDA, College of Engineering, University of Illinois at Urbana-Champaign
RAJ AGGARWAL, College of Engineering, University of Iowa
EDWARD C. BRADY, Strategic Perspectives, Inc., Fort Lauderdale, FL
L. REGINALD BROTHERS, BAE Systems, Arlington, VA
JAMES CARAFANO, The Heritage Foundation, Washington, DC
W. PETER CHERRY, Science Applications International Corporation, Ann Arbor, MI
EARL H. DOWELL, School of Engineering, Duke University
RONALD P. FUCHS, Independent consultant, Bellevue, WA
W. HARVEY GRAY, Oak Ridge National Laboratory, Oak Ridge, TN
CARL GUERRERI, Electronic Warfare Associates, Inc., Herndon, VA
JOHN H. HAMMOND, Lockheed Martin Corporation, Fairfax, VA (Retired)
RANDALL W. HILL, JR., Institute for Creative Technologies, University of Southern California
MARY JANE IRWIN, Department of Computer Science and Engineering, Pennsylvania State University
ROBIN L. KEESEE, Independent consultant, Fairfax, VA
ELLIOTT D. KIEFF, Departments of Medicine and Microbiology, Harvard University
LARRY LEHOWICZ, Quantum Research International, Arlington, VA
WILLIAM L. MELVIN, Sensors and Electromagnetic Applications Laboratory, Georgia Institute of Technology Research Institute
ROBIN MURPHY, Department of Computer Science and Engineering and Cognitive Neurosciences and Psychology, University of South Florida
SCOTT PARAZYNSKI, The Methodist Hospital Research Institute, Houston, TX
RICHARD R. PAUL, Independent consultant, Bellevue, WA
JEAN D. REED, Independent consultant, Arlington, VA
LEON E. SALOMON, Independent consultant, Gulfport, FL
JONATHAN M. SMITH, School of Engineering and Applied Sciences, University of Pennsylvania
MARK J.T. SMITH, School of Electrical and Computer Engineering, Purdue University
MICHAEL A. STROSCIO, Nanoengineering Research Laboratory, University of Illinois, Chicago
JOSEPH YAKOVAC, JVM, LLC, Hampton, VA

BRUCE A. BRAUN, Director

Acknowledgments

The Panel on Industrial Methods for the Effective Test and Development of Defense Systems expresses its appreciation to the many individuals who provided valuable assistance in producing this report. We appreciate the support of Michael Gilmore, Director of Operational Test and Evaluation (DOT&E), and Frank Kendall, Principal Deputy Under Secretary of Defense (Acquisition, Technology, and Logistics) at the U.S. Department of Defense (DOD). We are also greatly indebted to Nancy Spruill, director, Acquisition Resources and Analysis, and Ernest Seglie, recently retired science advisor to the director of operational test and evaluation at DOD.

The success of the study depended greatly on the presentations at the workshop, which was the panel's main fact-finding activity. The panel is extremely grateful to the major speakers who represented industry perspectives: Donald Bollinger, Hewlett-Packard; Salim Momin, SRS Enterprises; Sham Vaidya, IBM; and Jeffrey Zyburt, DCYI Engineering Consulting & Development Process. The other presentations, by Michael Cushing, Army Evaluation Center, and Robin Pope, SAIC (Science Applications International Corporation), were also very helpful in preparing this report. We also thank Steve Hutchinson, Defense Information Systems Agency, DOD, and Dmitry Tananko, General Dynamics, who served as discussants. The workshop also included a very productive panel session in which members of the defense test community reacted to the presentations by the experts from industry, and we thank them: William McCarthy, DOT&E; Steve Welby, director, Systems Engineering; and Chris DiPetto, acting director, Development Test.

We also thank the staff of the Committee on National Statistics, especially Michael Siri and Anthony Mann, for their smooth organization of our meetings, and Julie Schuck for her work on the report draft and for helping plan and support the panel's meetings. We would also like to express our gratitude to Eugenia Grohman for the technical editing of the report.

Most importantly, we were fortunate to have an outstanding group of colleagues on the panel. They provided critical insights and expertise on industrial processes and systems engineering as well as defense acquisition and testing. They volunteered their time and service generously before, during, and after the panel meetings and were involved extensively in the writing of the report.

This report has been reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the Report Review Committee of the National Research Council (NRC). The purpose of this independent review is to provide candid and critical comments that will assist the institution in making its published report as sound as possible and to ensure that the report meets institutional standards for objectivity, evidence, and responsiveness to the study charge. The review comments and draft manuscript remain confidential to protect the integrity of the deliberative process. We thank the following individuals for their review of this report: Donald Bollinger, distinguished technologist, Hewlett-Packard; Gilbert F. Decker, consultant, Los Gatos, CA; Arthur Fries, staff member and project leader, Institute for Defense Analyses; Charles E. McQueary, consultant, former under secretary for science and technology, U.S. Department of Homeland Security, and former director of operational test and evaluation, U.S. Department of Defense, Greensboro, NC; William Meeker, Department of Statistics, Iowa State University; Duane Steffey, director, Statistical and Data Sciences, Exponent®, Menlo Park, CA; Dmitry Tananko, manager, Reliability, General Dynamics Land Systems; and Jeff Zyburt, president, DCYI Consulting, New Hudson, MI.

Although the reviewers listed above have provided many constructive comments and suggestions, they were not asked to endorse the conclusions or recommendations, nor did they see the final draft of the report before its release. The review of this report was overseen by Thom J. Hodgson, distinguished university professor, Fitts Industrial and Systems Engineering Department, North Carolina State University. Appointed by the NRC's Report Review Committee, he was responsible for making certain that an independent examination of this report was carried out in accordance with institutional procedures and that all review comments were carefully considered. Responsibility for the final content of this report rests entirely with the authoring panel and the institution.

The panel recognizes the many federal agencies that support the Committee on National Statistics directly and through a grant from the National Science Foundation. Without their support and their commitment to improving the national statistical system, the work that is the basis of this report would not have been possible.

Vijay Nair, Chair
Michael L. Cohen, Study Director
Panel on Industrial Methods for the Effective Test and Development of Defense Systems

Contents

Glossary and Acronyms

Summary

1 Introduction
   Scope of the Study
   The Panel's Approach
   Structure of the Report

2 Workshop Summary
   Software
   Hardware

3 Requirements Setting
   Communication with Users
   Feasibility and Costs
   Changes in Requirements
   Use of Model-Based Design Tools

4 Design and Development
   The Importance of Technological Maturity
   Use of Objective Metrics for Assessment
   Staged Development with an Emphasis on Software

5 Testing Methods
   Testing as a Continuous Process for Learning
   Combining Information
   Accelerated Testing
   Software Systems

6 Communication, Resources, and Infrastructure
   Communication and Collaboration Among Requirements Setting, Design, and Testing
   Data Archiving
   Feedback Loops
   Systems Engineering Capabilities in DOD

7 Organizational Structures and Related Issues
   Enforcement of DOD Directives and Procedures
   The Role of a Program Manager

References

Appendixes
A Workshop Agenda
B Overview of the Defense Milestone System
C Biographical Sketches of Panel Members and Staff

Glossary and Acronyms

ACAT: Acquisition category, a designation for each defense program based on program costs that determines both the level of review that is required by law and the level at which Milestone decision authority rests in DOD.

ACAT I: Of four acquisition categories (ACAT I to ACAT IV), the most expensive systems, which are estimated to require either more than $365 million (fiscal 2000) for research and development or more than $2.19 billion (fiscal 2000) for purchase of the specified number of delivered systems.

Defense Acquisition Board (DAB): A senior advisory board for defense acquisitions in DOD that includes the vice chairman of the Joint Chiefs of Staff and the service secretaries, among others, and that plays a key role since it is responsible for approving major defense acquisition programs.

Developmental test (and evaluation): Typical testing of a defense system early in development, analogous to laboratory or bench testing, sometimes involving only components or subsystems, that often does not represent full operational realism, in contrast with Operational test (and evaluation).

Director, Cost Assessment and Program Evaluation (CAPE): The principal staff assistant to the secretary of defense for cost assessment and program evaluation, whose responsibilities include analysis and evaluation of plans, programs, and budgets in relation to U.S. defense objectives, projected threats, allied contributions, estimated costs, and resource constraints, and ensuring that the costs of DOD programs, including classified programs, are presented accurately and completely.

Director, Defense Research and Engineering (DDR&E): The principal staff adviser to USD-AT&L for matters of research and engineering.

Director, Operational Test and Evaluation: The office or the person who heads DOT&E.

DOT&E: Office of the Director of Operational Test and Evaluation (or, sometimes, the person who holds the office), a unit in the Office of the Secretary of Defense, which also reports directly to Congress, responsible for DOD policies and procedures for analyzing and interpreting the results of operational testing and evaluation for each major DOD acquisition program, approving test plans, and providing independent evaluations of ACAT I systems.

Effectiveness and suitability[1]: A measure of the overall ability of a system to accomplish a mission when used by representative personnel in the environment planned or expected for operational employment of the system, considering organization, doctrine, tactics, supportability, vulnerability, and threat. Effectiveness is the degree to which a system can carry out its mission when fully operational. (Operational) suitability is the degree to which a system can be placed and sustained satisfactorily in field use.

Evolutionary acquisition: The development of a defense system in stages, with the systems that result from each stage of development potentially released to the field.

5000.01: DOD directive that provides management principles and mandatory policies and procedures for managing all acquisition programs.

5000.02: DOD instruction that establishes a simplified and flexible management framework for translating capability needs and technology opportunities.

Full-rate production: The final step of procurement, in contrast to release to the field of a small number of units as part of low-rate initial production, which requires either the judgment by DOT&E that the system is effective and suitable or a full-rate production decision review.

HP-UX: Hewlett-Packard's implementation of the UNIX operating system.

Initial Operational Test and Evaluation (IOT&E): The first large operational test of a system or system element [see Operational test (and evaluation)].

Joint Capabilities Integration and Development System (JCIDS): A formal DOD procedure that defines requirements and evaluation criteria for defense systems in development.

Materiel developer: The organization or command responsible for providing materiel to DOD or specific service forces, with responsibilities that include research and development of weapon systems.

Milestone A: The step in the Milestone system that promotes a system to the technology development phase of development.

Milestone B: The step in the Milestone system that promotes a system to the engineering and manufacturing development phase of development.

Milestone decision authority: The person or office responsible for the decision to promote a system to the next step of development in the Milestone system.

Milestone system: A set of three milestones that bridge the four steps of defense acquisition: (1) materiel solution analysis, (2) technology development, (3) engineering and manufacturing development, and (4) production and deployment.

Model-based engineering: Systems engineering, starting from development of requirements, through development of components and subsystems, then integration into full systems, that is guided throughout by the use of models that simulate overall system performance of systems comprised of various kinds of subsystems and components, and that enforces collaboration across multiple engineering departments.

Modeling and simulation: Various methods for simulating, sometimes with system components in the loop and sometimes entirely computer based, the functioning of a (proposed) defense system.

Operational test (and evaluation): Testing of a defense system relatively late in development, involving the full system in whatever numbers will be used cooperatively in the field, in scenarios that attempt to represent full operational realism, including representation of enemy systems and countermeasures, and operated by users with training typical of fielded systems.

Program Management Office (PMO): The office tasked with development, production, and sustainment of a defense system on a timely basis that satisfies a set of requirements at a given price.

Program manager[2]: The person with responsibility for and authority to accomplish program objectives for development, production, and sustainment to meet the user's operational needs, accountable for credible cost, schedule, and performance reporting to the Milestone decision authority.

Reliability, Availability, and Maintainability (RAM)[3]: The probability of an item to perform a required function under stated conditions for a specified period of time (reliability); the degree to which it is in an operable state and can be committed at the start of a mission when the mission is called for at an unknown (random) point in time (availability); and its ability to be retained in, or restored to, a specified condition when maintenance is performed by personnel having specified skill levels, using prescribed procedures and resources, at each prescribed level of maintenance and repair (maintainability).

Technology readiness level: The degree to which the behavior of a newly developed technology is understood well enough for incorporation into a system in Full-rate production.

Test and Evaluation Master Plan (TEMP): A formal document that provides a scheme to be used to create detailed test and evaluation plans, especially schedule and resource commitments.

U.S. Army Training and Doctrine Command (TRADOC): An Army element that provides training to soldiers and, as part of that training, helps design, develop, and integrate new capabilities and doctrine.

USD-AT&L: The Under Secretary of Defense for Acquisition, Technology, and Logistics, the primary office in the Office of the Secretary of Defense responsible for the development and acquisition of defense systems.

[1] Definition adapted from Joint Capabilities Integration and Development System, CJCSI 3170.01G. See http://jitc.fhu.disa.mil/jitc_dri/pdfs/3170_01g.pdf [December 2011].
[2] Definition adapted from U.S. Department of Defense Directive 5000.01. See http://www.dtic.mil/whs/directives/corres/pdf/500001p.pdf [December 2011].
[3] Definition adapted from the U.S. Department of Defense Guide for Achieving Reliability, Availability, and Maintainability. See http://www.acq.osd.mil/dte/docs/RAM_Guide_080305.pdf [December 2011].
