
References

Aldridge, E.C., Jr. 2002 Evolutionary acquisition and spiral development. Crosstalk: Journal of Defense Software Engineering (August).


Berry, D.A. 2005 Bayesian clinical trials. Nature Reviews (December).

Bonder, S. 2000 Versatility Planning: An Idea Whose Time Has Come—Again! Steinhardt lecture presented at the Institute for Operations Research and the Management Sciences Conference, Salt Lake City, May 9.

Box, G.E.P., and N.R. Draper 1987 Empirical Model-Building and Response Surfaces. New York: John Wiley and Sons.

Box, G.E.P., W.G. Hunter, and J.S. Hunter 1978 Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley and Sons.

Box, G.E.P., R. Kackar, V. Nair, M. Phadke, A. Shoemaker, and C.F.J. Wu 1988 Quality practices in Japan. Quality Progress 37-41.


Chaloner, K., and I. Verdinelli 1995 Bayesian experimental design: A review. Statistical Science 10:273-304.

Christie, T. 2002 Testers shouldn’t be blamed for defense program setbacks. National Defense Magazine (May). Available: http://www.nationaldefensemagazine.org/issues/2002/May/Testers_Shouldnt.htm [accessed March 2006].

Christle, G.E., A.R. DiTrapani, K.D. Marmaud, M.P. Sullivan, J.A. Thomas, M.A. Miller, J.A. Ferara, and R.V. Johnson 2001 The Army Acquisition Management Study: Congressional Mandate for Change. Alexandria, VA: Center for Naval Analyses.


Cohen, D.M., S.R. Dalal, A. Kajla, and G.C. Patton 1994 The automatic efficient test generator. Proceedings of the Fifth International Symposium on Software Reliability Engineering, pp. 303-309. Institute of Electrical and Electronics Engineers.


Davis, P.K., J.H. Bigelow, and J. McEver 1999 Analytical Methods for Studies and Experiments on “Transforming the Force.” Santa Monica, CA: RAND.


Mills, H.D. 1971 Top-down programming in large systems. In R. Rustin, ed., Debugging Techniques in Large Systems. Upper Saddle River, NJ: Prentice-Hall.

Myers, R.H., and D.C. Montgomery 2001 Response Surface Methodology: Process and Product Optimization Using Designed Experiments. New York: John Wiley and Sons.


National Research Council 1992 Combining Information: Statistical Issues and Opportunities for Research. Washington, DC: National Academy Press.

1998 Statistics, Testing, and Defense Acquisition: New Approaches and Methodological Improvements. Panel on Statistical Methods for Testing and Evaluating Defense Systems, Committee on National Statistics, Commission on Behavioral and Social Sciences and Education, Michael L. Cohen, John E. Rolph, and Duane L. Steffey, editors. Washington, DC: National Academy Press.

2003 Innovations in Software Engineering for Defense Systems. Oversight Committee for the Workshop on Statistical Methods in Software Engineering for Defense Systems. S.R. Dalal, J.H. Poore, and M.L. Cohen, eds. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

2004 Improved Operational Testing and Evaluation and Methods of Combining Test Information for the Stryker Family of Vehicles and Related Army Systems. Phase II Report, Panel on Operational Test Design and Evaluation of the Interim Armored Vehicle. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.


Prowell, S., C. Trammell, R. Linger, and J. Poore 1999 Cleanroom Software Engineering: Technology and Process. Boston: Addison-Wesley.


Trammell, C.J., M.G. Pleszkoch, R.C. Linger, and A.R. Hevner 1996 The incremental development process in cleanroom software engineering. Decision Support Systems 17(1):55-71.


U.S. Department of Defense 2003 Department of Defense Directive 5000.1, May 12. Available: http://www.dtic.mil/whs/directives/corres/pdf2/d50001p.pdf [accessed March 2006].

2003 Department of Defense Instruction 5000.2, May 12. Available: http://www.dtic.mil/whs/directives/corres/pdf/i50002_051203/i50002p.pdf [accessed March 2006].

U.S. General Accounting Office 1992 Weapons Acquisition: A Rare Opportunity for Lasting Change. (GAO/NSIAD-93-15, December.) Washington, DC: U.S. General Accounting Office.


2002 Best Practices: Capturing Design and Manufacturing Knowledge Early Improves Acquisition Outcomes. (GAO-02-701, July 15.) Washington, DC: U.S. General Accounting Office.

2004 Using a Knowledge-Based Approach to Improve Weapon Acquisition. (GAO-04-386SP, January.) Washington, DC: U.S. General Accounting Office.


Wu, C.F., and M. Hamada 2000 Experiments: Planning, Analysis, and Parameter Design Optimization. New York: John Wiley and Sons.

The Department of Defense (DoD) recently adopted evolutionary acquisition, a dynamic strategy for the development and acquisition of its defense systems. Under this strategy, a system is planned in advance to be developed through several stages within a single procurement program, and each stage is intended to produce a viable system that could be fielded. The requirements for a given stage may be specified well in advance or may be decided only at the outset of that stage's development. Because development proceeds in stages, current testing and evaluation policies and processes, which were designed for single-stage development, need careful reexamination.

The Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (USD-AT&L) and the Director of Operational Test and Evaluation (DOT&E) asked the Committee on National Statistics (CNSTAT) of the National Academies to examine the key issues and implications that the introduction of evolutionary acquisition raises for defense testing. CNSTAT was charged with planning and conducting a workshop on test strategies for evolutionary acquisition. The committee reviewed defense materials defining evolutionary acquisition and interviewed officials from the three major service test agencies to understand the approaches currently used to test systems procured through evolutionary acquisition. The committee also examined possible alternatives in order to identify problems in implementation.

At the workshop, held December 13-14, 2004, the committee addressed such questions as: What are the appropriate roles and objectives for testing in an evolutionary environment? Can a systematic, disciplined process be developed for testing and evaluation in such a fluid and flexible environment? Is there adequate technical expertise within the acquisition community to fully exploit data gathered from previous stages and to effectively combine information from various sources for test design and analysis? Testing of Defense Systems in an Evolutionary Acquisition Environment presents the conclusions and recommendations that CNSTAT reached following the workshop and its other investigations.
