Appendix E
Types of Reviews

Acquisition Strategy Panel (ASP)

Air Force Instruction 63-101 describes the ASP as follows:

3.7.2.1. The Acquisition Strategy Panel supports the SAE and other MDAs. ASPs are forums that evaluate proposed acquisition strategies to ensure all key viable alternatives have been considered and that the best recommendation is provided to the SAE and/or the program's MDA for approval.

3.7.2.2. The SAF/AQX-ACE is the SAE-chaired ASP process owner and secretariat for all ACAT I/IA and non-delegated ACAT II programs.

3.7.2.3. The field ACE offices are the ASP process owner and secretariat for all non-SAE chaired ACAT II and III PEO/DAO programs.

3.7.2.4. Information concerning SAE-chaired ASPs, such as the current draft template for briefings, can be found at the SAF/AQX-ACE ASP secretariat website. Additionally, similar information pertaining to non-SAE chaired ASPs can be found at each of the respective Field ACE websites which are accessible on the SAF ACE website.

3.7.2.5. Additional information regarding general ASP requirements can be found in AFFARS 5307.104-90, Acquisition Strategy Panels (ASPs).

(Source: Air Force Instruction 63-101, Acquisition and Sustainment Life Cycle Management, April 17, 2009.)

Ad Hoc Reviews

Ad hoc reviews may come from many sources, some with very short time horizons, some with longer. They can come from leadership at the Office of the Secretary of Defense (OSD) seeking up-to-date information before a Defense Acquisition Board (DAB), Defense Acquisition Executive Summary (DAES), or other event that triggers interest or concern. They can come from the need to provide up-to-date information to support an unscheduled event or circumstance. Ad hoc reviews can also be initiated by military service or agency leadership in much the same way and for the same reasons. Advocacy-focused reviews (miscellaneous technical topics such as SW, T&E, production readiness, etc.) sponsored by subject matter experts (SMEs) are generally focused to support some higher-level reviews such as an overarching integrated product team (OIPT), DAB, or service/agency management review.

Air Force Audit Agency (AFAA)

Air Force Mission Directive 17 describes the AFAA as follows:

The AFAA accomplishes the internal audit mission of the United States Air Force. The AFAA provides timely, value-added audit services to all management levels. These services focus on independent, objective, and quality audits that include reviewing and promoting the economy, effectiveness, and efficiency of operations; assessing and improving Air Force fiduciary stewardship and the accuracy of financial reporting; and evaluating programs and activities and assisting management in achieving intended results.

(Source: Air Force Mission Directive 17, Air Force Audit Agency (AFAA), November 13, 2002.)

Air Force Requirements for Operational Capabilities Council (AFROCC)

Air Force Instruction 10-601 describes the AFROCC as follows:

The AFROCC, an instrument of the CSAF and Secretary of the Air Force (SECAF), reviews, validates, and recommends approval of all Air Force capabilities-based requirements. The AFROCC ensures Air Force capabilities-based requirements documentation is prepared in accordance with Air Force and Joint Staff guidance, complies with established standards, and accurately articulates valid Air Force capabilities-based requirements. The AFROCC reviews Air Force FSA study plans directed by JCDs, AFCDs and for initiatives forecast to become ACAT I programs. For follow-on capabilities-based requirements documents, the AFROCC validates all Air Force-developed AoA Study Plans, interim status (when appropriate), and final results. It is chaired by AF/A5R and is composed of MAJCOM requirements principals, Secretariat, and HQ Air Force representatives.

(Source: Air Force Instruction 10-601, Capabilities-Based Requirements Development, July 31, 2006.)

Air Force Review Board (AFRB)

Air Force Instruction 63-101 describes the AFRB as follows:

3.7.1.1. AF Review Boards are forums chaired by the SAE for conducting major decision reviews (in- or out-of-cycle), as well as making and documenting major milestone decisions. AFRBs are not conducted for services or space programs.

3.7.1.2. SAF/AQX-ACE is the AFRB process owner and secretariat.

3.7.1.3. The AFRB process is required for all ACAT IC, ACAT IAC, non-delegated ACAT II programs and special interest programs. The PEO may recommend what type of AFRB is necessary: full, mini (tailored attendance), or paper. A template and more information can be found at the SAF/AQX-ACE website.

3.7.1.4. For ACAT ID and ACAT IAMs, AFRBs are used to develop the AF corporate consensus prior to an OSD Defense Acquisition Board (DAB) (pre-DAB within AF) or Information Technology Acquisition Board (ITAB). The AFRB should be conducted no later than two weeks prior to the last OSD Overarching Integrated Product Team (OIPT). The SAE determines if an ACAT ID or ACAT IAM program requires an AFRB.

3.7.1.5. PEOs and DAOs execute a tailored AFRB process for delegated ACAT II and ACAT III programs.

(Source: Air Force Instruction 63-101, Acquisition and Sustainment Life Cycle Management, April 17, 2009.)

Air Force Systems Engineering Assessment Model (SEAM)

The Air Force Center for Systems Engineering describes the AF SEAM as follows:

AF SEAM defines ten AF standard SE process areas, lists associated goals under each process area and provides associated specific and generic practices. Many of the best practices contained in AF SEAM were derived from various Software Engineering Institute (SEI)/Carnegie Mellon, Capability Maturity Model Integration® (CMMI®) products. Additionally, various international and industry standards, Department of Defense publications and development team members' expert knowledge significantly contributed to the material contained in this model.

It is essential to note that AF SEAM is a process assessment tool which is designed to assess the presence of needed SE processes as a "leading indicator" to subsequent delivery success. While the tool assesses the existence of SE process work products (i.e. CONOPS, plans, technical documents, etc) it does not assess the outcomes delivered to the customer. The model concentrates on "what" SE processes must be in place which, when properly executed, increase the likelihood customer needs will be satisfied. This is due to the fact that the quality of a System or Product is highly influenced by the quality of the process used to develop and maintain it.

(Source: Air Force Systems Engineering Assessment Model (AF SEAM) Management Guide, Version 1, August 1, 2008. Available at http://www.afit.edu/cse/docs/AF%20SEAM%20Management%20Guide%20(Aug%202008).pdf. Last accessed May 4, 2009.)
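
To make the kind of presence check AF SEAM performs concrete, the sketch below (Python) models a process area with goals, practices, and expected work products, and flags practices whose evidence is missing. It is only an illustration of the idea of assessing process presence as a leading indicator; the process-area and practice names, work products, and scoring here are invented for the example and are not taken from the AF SEAM Management Guide.

```python
from dataclasses import dataclass, field

@dataclass
class Practice:
    name: str
    work_products: list  # evidence expected if the practice is in place

@dataclass
class ProcessArea:
    name: str  # one of the ten AF standard SE process areas (name illustrative here)
    goals: list = field(default_factory=list)
    practices: list = field(default_factory=list)

def assess_presence(area, available_products):
    """Flag practices whose expected work products are missing.

    Mirrors the AF SEAM idea of checking the presence of SE process work
    products (a leading indicator), not the quality of delivered outcomes.
    """
    results = {}
    for practice in area.practices:
        missing = [wp for wp in practice.work_products if wp not in available_products]
        results[practice.name] = "in place" if not missing else f"missing evidence: {missing}"
    return results

# Hypothetical usage with invented names
requirements = ProcessArea(
    name="Requirements",
    goals=["Requirements are established and maintained"],
    practices=[Practice("Develop operational concept", ["CONOPS"]),
               Practice("Maintain traceability", ["traceability matrix"])],
)
print(assess_presence(requirements, available_products={"CONOPS"}))
```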

Alternative System Review (ASR)

The Defense Acquisition Guidebook describes the ASR as follows:

The ASR is a multi-disciplined technical review to ensure that the resulting set of requirements agrees with the customers' needs and expectations and that the system under review can proceed into the Technology Development phase. The ASR should be complete prior to Milestone A. Generally this review assesses the alternative systems that have been evaluated during the Concept Refinement phase, and ensures that the preferred system alternative is cost effective, affordable, operationally effective and suitable, and can be developed to provide a timely solution to a need at an acceptable level of risk. Of critical importance to this review is the understanding of available system concepts to meet the capabilities described in the Initial Capabilities Document and the affordability, operational effectiveness, and technology risks inherent in each alternative concept. Depending on the overall acquisition strategy, one or more preferred solutions may carry forward into the Technology Development phase.

By reviewing alternative system concepts, the ASR helps ensure that sufficient effort has been given to conducting trade studies that consider and incorporate alternative system designs that may more effectively and efficiently meet the defined capabilities. A successful review is predicated on the IPT's determination that the operational capabilities, preferred solution(s), available technologies, and program resources (funding, schedule, staffing, and processes) form a satisfactory basis for proceeding into the Technology Development phase. The program manager should tailor the review to the technical scope and risk of the system, and address the ASR in the Systems Engineering Plan.

(Source: Defense Acquisition Guidebook, Section 4.3.1.4.2, Alternative System Review (ASR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.1.4.2.asp. Last accessed May 4, 2009.)

Configuration Steering Board (CSB)

Department of Defense Instruction 5000.02 describes the CSB as follows:

The Acquisition Executive of each DoD Component shall establish and chair a CSB with broad executive membership including senior representatives from the Office of the USD(AT&L) and the Joint Staff. Additional executive members shall include representatives from the office of the chief of staff of the Armed Force concerned, other Armed Forces representatives where appropriate, the military deputy to the CAE and the Program Executive Officer (PEO) (section 814 of P.L. 110-417, Reference (w)).

(1) The CSB shall meet at least annually to review all requirements changes and any significant technical configuration changes for ACAT I and IA programs in development that have the potential to result in cost and schedule impacts to the program. Such changes will generally be rejected, deferring them to future blocks or increments. Changes shall not be approved unless funds are identified and schedule impacts mitigated.

(2) The PM, in consultation with the PEO, shall, on a roughly annual basis, identify and propose a set of descoping options, with supporting rationale addressing operational implications, to the CSB that reduce program cost or moderate requirements. The CSB shall recommend to the MDA (if an ACAT ID or IAM program) which of these options should be implemented. Final decisions on descoping option implementation shall be coordinated with the Joint Staff and military department requirements officials.

(Source: Department of Defense Instruction 5000.02, Operation of the Defense Acquisition System, December 8, 2008.)
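
The decision rule in paragraph (1) above lends itself to a compact statement. The sketch below (Python) is a minimal illustration of that rule only; the field names and the example change are assumptions made for the illustration, not part of DoDI 5000.02.

```python
from dataclasses import dataclass

@dataclass
class ProposedChange:
    description: str
    cost_or_schedule_impact: bool
    funds_identified: bool
    schedule_impact_mitigated: bool

def csb_disposition(change):
    """Approve only when impacts are covered; otherwise reject or defer."""
    if not change.cost_or_schedule_impact:
        return "approve"
    if change.funds_identified and change.schedule_impact_mitigated:
        return "approve"
    return "reject or defer to a future block/increment"

# Hypothetical example
print(csb_disposition(ProposedChange("add a new sensor mode", True, False, False)))
```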

Critical Design Review (CDR)

The Defense Acquisition Guidebook describes the CDR as follows:

The CDR is a multi-disciplined technical review to ensure that the system under review can proceed into system fabrication, demonstration, and test; and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints. Generally this review assesses the system final design as captured in product specifications for each configuration item in the system (product baseline), and ensures that each product in the product baseline has been captured in the detailed design documentation. Product specifications for hardware enable the fabrication of configuration items, and may include production drawings. Product specifications for software (e.g., Software Design Documents) enable coding of a Computer Software Configuration Item. Configuration items may consist of hardware and software elements, and include items such as airframe, avionics, weapons, crew systems, engines, trainers/training, etc.

(Source: Defense Acquisition Guidebook, Section 4.3.3.4.5, Critical Design Review (CDR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.4.5.asp. Last accessed May 4, 2009.)

Defense Acquisition Board (DAB)

The Defense Acquisition Guidebook describes the DAB as follows:

The Defense Acquisition Board advises the USD(AT&L)/DAE on critical acquisition decisions. The USD(AT&L) chairs the Defense Acquisition Board, and the Vice Chairman of the Joint Chiefs of Staff serves as co-chair. Defense Acquisition Board members are the following executives: Under Secretary of Defense (Comptroller); Under Secretary of Defense (Policy); Under Secretary of Defense (Personnel & Readiness); Under Secretary of Defense (Intelligence); Assistant Secretary of Defense for Networks and Information Integration/DoD Chief Information Officer; Director, Operational Test & Evaluation; Chairman, Program Analysis and Evaluation; the Secretaries of the Army, the Navy, and the Air Force; and the Director, Acquisition Resources & Analysis (as the DAB Executive Secretary). Defense Acquisition Board advisors include the Principal Deputy USD(AT&L); the Deputy Under Secretary of Defense (Logistics & Material Readiness); the Director, Defense Research & Engineering; the relevant OIPT Leader(s); the Program Executive Officer; the Program Manager; the Chairmen, Cost Analysis Improvement Group; the Director, Defense Procurement and Acquisition Policy; DoD General Counsel; the Deputy Under Secretary of Defense (Industrial Policy); the DoD Component Acquisition Executives; Commander, United States Joint Forces Command; and the Chair, Functional Capabilities Board(s). The USD(AT&L)/DAE may ask other department officials to participate in reviews, as required.

(Source: Defense Acquisition Guidebook, Section 10.2.1, Defense Acquisition Board Review. Available online at https://akss.dau.mil/DAG/Guidebook/IG_c10.2.asp#1021. Last accessed May 4, 2009.)

Defense Acquisition Executive Summary (DAES)

The Defense Acquisition Guidebook describes the DAES as follows:

The DAES is a multi-part document, reporting program information and assessments; program manager, Program Executive Officer, CAE comments; and cost and funding data. The DAES provides an early-warning report to USD(AT&L) and ASD(NII). The DAES describes actual program problems, warns of potential program problems, and describes mitigating actions taken or planned. The program manager may obtain permission from USD(AT&L) or ASD(NII) to tailor DAES content. At minimum, the DAES should report program assessments (including interoperability), unit costs (10 U.S.C. 2433), and current estimates. It should also report the status of exit criteria and vulnerability assessments (31 U.S.C. 9106).

The DAES should present total costs and quantities for all years, as projected, through the end of the current acquisition phase. In keeping with the concept of total program reporting, the DAES should present best estimates for costs beyond the FYDP, if the FYDP does not otherwise identify those costs. (The total program concept refers to system acquisition activities from Program Initiation through Production and Deployment.) The DAES should also report approved program funding for programs that are subsystems to platforms and whose procurement is reported in the platform budget line.

(Source: Defense Acquisition Guidebook, Section 10.9.4, Defense Acquisition Executive Summary (DAES). Available online at https://akss.dau.mil/DAG/GuideBook/IG_c10.9.4.asp. Last accessed May 4, 2009.)

Design Readiness Review (DRR)

Department of Defense Instruction 5000.2 described the DRR as follows:

The Design Readiness Review during SDD provides an opportunity for mid-phase assessment of design maturity as evidenced by measures such as the number of subsystem and system design reviews successfully completed; the percentage of drawings completed; planned corrective actions to hardware/software deficiencies; adequate development testing; an assessment of environment, safety and occupational health risks; a completed failure modes and effects analysis; the identification of key system characteristics and critical manufacturing processes; an estimate of system reliability based on demonstrated reliability rates; etc. Successful completion of the Design Readiness Review ends System Integration and continues the SDD phase into the System Demonstration effort. MDAs may, consistent with the intent of this paragraph, determine the form and content of the review.

(Source: Department of Defense Instruction 5000.2, Operation of the Defense Acquisition System, May 12, 2003. This instruction has since been superseded by DODI 5000.02, Operation of the Defense Acquisition System, December 8, 2008.)

DOD Inspector General (DOD IG)

Department of Defense Directive 5106.01 describes the DOD IG as follows:

The Office of the Inspector General of the Department of Defense was established by Congress in the Defense Authorization Act for Fiscal Year 1983, Public Law (Pub. L.) 97-252, which is codified at Reference (c), as an independent and objective unit within the Department of Defense to conduct and supervise audits and investigations relating to the programs and operations of the Department of Defense. In support of the mission of the Department of Defense, the Inspector General performs the duties, has the responsibilities, and exercises the powers specified in Reference (c). [Appendix 3 of title 5, United States Code, "Inspector General Act of 1978," as amended]

(Source: Department of Defense Directive 5106.01, Inspector General of the Department of Defense, April 13, 2006.)

Functional Configuration Audit (FCA)

Military Handbook 61A describes the FCA as follows:

The Functional Configuration Audit (FCA) is used to verify that the actual performance of the CI meets the requirements stated in its performance specification and to certify that the CI has met those requirements. For systems, the FCA is used to verify that the actual performance of the system meets the requirements stated in the system performance specification. In some cases, especially for very large, complex CIs and systems, the audits may be accomplished in increments. Each increment may address a specific functional area of the system/CI and will document any discrepancies that are found in the performance capabilities of that increment. After all of the increments have been completed, a final (summary) FCA may be held to address the status of all of the action items that have been identified by the incremental meetings and to document the status of the FCA for the system or CI in the minutes and certifications. In this way, the audit is effectively accomplished with a minimum of complications.

(Source: Military Handbook 61A(SE), Configuration Management Guidance, February 7, 2001.)

Government Accountability Office (GAO)

In its performance plan for fiscal year 2009, the GAO describes itself as follows:

GAO is an independent, nonpartisan, professional services agency in the legislative branch of the federal government. Commonly known as the "audit and investigative arm of the Congress" or the "congressional watchdog," we examine how taxpayer dollars are spent and advise lawmakers and agency heads on ways to make government work better.

Our mission is to support the Congress in meeting its constitutional responsibilities and to help improve the performance and ensure the accountability of the federal government for the benefit of the American people. We accomplish our mission by providing reliable information and informed analysis to the Congress, to federal agencies, and to the public, and we recommend improvements, when appropriate, on a wide variety of issues.

(Source: Performance Plan for Fiscal Year 2009: Mission, Performance Plans, Resources and Strategies, GAO-08-507SP, February 19, 2008. Available online at http://www.gao.gov/htext/d08507sp.html. Last accessed May 5, 2009.)

Independent Program Assessment (IPA)

National Security Space Acquisition Policy 03-01 describes the IPA as follows:

An IPA is a focused, short duration "peer review" activity that typically runs from two to five weeks in duration depending on the program's complexity. The core members of an IPAT are assigned to work the assessment full-time for the IPAT leader, who is responsible for the final recommendation to the MDA. The IPA activity is usually conducted at the program office locale and/or the contractor facility to facilitate easy, ready access to the system experts, the data, and the equipment under review. While the IPAT may discuss issues with various elements in conducting the assessment, the assessment is not a consensus process. Rather, it produces an unbiased, structured, independent evaluation of the proposed space acquisition activity in order to provide the DoD Space MDA an overview of how well the SPD/PM has addressed problematic issues and to identify areas of concern or potential risk. The IPA will also report on vulnerability, mitigation and protection measures addressed by the program. The IPA also compares program accomplishment with program objectives and with previous DoD Space MDA direction, guidance, decisions, and/or Presidential or Congressionally directed actions.

(Source: National Security Space Acquisition Policy 03-01, Guidance for DoD Space System Acquisition Process, December 27, 2004.)

Integrated Baseline Review (IBR)

The Defense Acquisition Guidebook describes the IBR as follows:

An IBR is a joint assessment of the Performance Measurement Baseline (PMB) conducted by the government program manager and the contractor. The IBR is not a one-time event. It is a process, and the plan should be continually evaluated as changes to the baseline are made (modifications, restructuring, etc.). IBRs should be used as necessary throughout the life of a project to facilitate and maintain mutual understanding of:

• The scope of the PMB consistent with authorizing documents;
• Management control processes;
• Risks in the PMB associated with cost, schedules, and resources; and
• Corrective actions where necessary.

(Source: Defense Acquisition Guidebook, Section 11.3.3.3, Integrated Baseline Reviews (IBRs). Available online at https://akss.dau.mil/dag/DoD5000.asp?view=document&rf=GuideBook\IG_c11.3.1.3.asp. Last accessed May 4, 2009.)
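
Because the IBR is a recurring process rather than a one-time event, it can be thought of as a standing checklist revisited whenever the baseline changes. The sketch below (Python) records the four focus areas from the passage above and collects open items per session; the data structure and the sample finding are illustrative assumptions, not part of the Guidebook.

```python
from dataclasses import dataclass, field
from datetime import date

# The four focus areas quoted from the Guidebook passage above.
FOCUS_AREAS = (
    "Scope of the PMB consistent with authorizing documents",
    "Management control processes",
    "Risks in the PMB associated with cost, schedules, and resources",
    "Corrective actions where necessary",
)

@dataclass
class IBRSession:
    held_on: date
    findings: dict = field(default_factory=dict)  # focus area -> open item, if any

    def open_items(self):
        return {area: note for area, note in self.findings.items() if note}

# Hypothetical usage: a session triggered by a baseline restructuring
session = IBRSession(date(2009, 3, 1), {FOCUS_AREAS[2]: "schedule reserve is thin"})
print(session.open_items())
```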

Integrating Integrated Product Team (IIPT)

The Defense Acquisition Guidebook mentions the IIPT as follows:

IPTs are an integral part of the Defense acquisition oversight and review process. For Acquisition Category ID and IAM programs, there are generally two levels of IPT: the Overarching Integrated Product Team and the Working-level Integrated Product Team(s). Each program should have an OIPT and at least one WIPT. WIPTs should focus on a particular topic such as cost/performance, test, or contracting. An Integrating Integrated Product Team (IIPT), which is itself a WIPT, should coordinate WIPT efforts and cover all topics not otherwise assigned to another IPT. IPT participation is the primary way for any organization to participate in the acquisition program.

(Source: Defense Acquisition Guidebook, Section 10.3, Role of Integrated Product Teams (IPTs). Available online at https://akss.dau.mil/DAG/Guidebook/IG_c10.3.asp. Last accessed May 5, 2009.)

Joint Requirements Oversight Council (JROC)

Chairman of the Joint Chiefs of Staff Instruction 5123.01D describes the JROC as follows:

a. JROC Mission. Title 10, United States Code (USC), section 181, directed the Secretary of Defense to establish the JROC. In addition to other matters assigned to it by the President or Secretary of Defense, the JROC shall:

(1) Assist the Chairman in identifying and assessing the priority of joint military capabilities (including existing systems and equipment) to meet the national military and defense strategies.

(2) Assist the Chairman in considering alternatives to any acquisition program that has been identified to meet military capabilities by evaluating the cost, schedule, and performance criteria of the program and of the identified alternatives.

(3) As part of its mission to assist the Chairman in assigning joint priority among existing and future programs meeting valid capabilities, ensure that the assignment of such priorities conforms to and reflects resource levels projected by the Secretary of Defense through the JPG.

b. JROC Membership. The Chairman is the chairman of the JROC. The functions of the JROC Chairman are delegated to the Vice Chairman of the Joint Chiefs of Staff. Other members of the JROC are officers in the grade of general or admiral from the Army, Navy, Air Force, and Marine Corps. Service representatives are recommended by their military department secretary and approved by the Chairman after consultation with the Secretary of Defense.

(Source: Chairman of the Joint Chiefs of Staff Instruction 5123.01D, Charter of the Joint Requirements Oversight Council, August 1, 2007.)

Manufacturing Readiness Assessment (MRA)

The draft Department of Defense Manufacturing Readiness Assessment (MRA) Deskbook describes the MRA as follows:

A Manufacturing Readiness Assessment (MRA), for the purposes of this document, is a structured evaluation of a technology, component, manufacturing process, weapon system or subsystem using the MRL definitions as a standard. It is performed to:

• Define current level of manufacturing maturity
• Identify maturity shortfalls and associated risks
• Provide the basis for manufacturing maturation and risk management (planning, identification, analysis, mitigation, implementation, and tracking)

(Source: Manufacturing Readiness Assessment (MRA) Deskbook [draft], May 29, 2008. Available online at https://acc.dau.mil/GetAttachment.aspx?id=182129&pname=file&aid=34013&lang=en-US. Last accessed May 5, 2009.)
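
A minimal sketch of the comparison an MRA performs is shown below (Python): each assessed item carries a current Manufacturing Readiness Level and a target level, and items that fall short are reported as maturation shortfalls. The numeric levels, targets, and item names are assumptions for illustration; actual MRL definitions and expectations come from the MRA Deskbook.

```python
from dataclasses import dataclass

@dataclass
class MRAItem:
    name: str          # technology, component, process, weapon system, or subsystem
    assessed_mrl: int  # current manufacturing readiness level
    target_mrl: int    # level expected at the upcoming decision point (assumed here)

def manufacturing_shortfalls(items):
    """Return (item, gap) pairs where assessed maturity falls short of the target."""
    return [(i.name, i.target_mrl - i.assessed_mrl)
            for i in items if i.assessed_mrl < i.target_mrl]

# Hypothetical example
items = [MRAItem("composite wing layup", assessed_mrl=6, target_mrl=8),
         MRAItem("radome coating process", assessed_mrl=8, target_mrl=8)]
print(manufacturing_shortfalls(items))  # [('composite wing layup', 2)]
```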

Operational Test Readiness Review (OTRR)

The Defense Acquisition Guidebook describes the OTRR as follows:

The OTRR is a multi-disciplined product and process assessment to ensure that the "production configuration" system can proceed into Initial Operational Test and Evaluation with a high probability of successfully completing the operational testing. Successful performance during operational test generally indicates that the system is suitable and effective for service introduction. The Full Rate Production Decision may hinge on this successful determination. The understanding of available system performance to meet the Capability Production Document is important to the OTRR. The OTRR is complete when the Service Acquisition Executive evaluates and determines materiel system readiness for Initial Operational Test and Evaluation.

(Source: Defense Acquisition Guidebook, Section 4.3.4.4.2, Operational Test Readiness Review (OTRR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.4.4.2.asp. Last accessed May 4, 2009.)

Overarching Integrated Product Team (OIPT)

The Defense Acquisition Guidebook describes the OIPT as follows:

All Acquisition Category ID and IAM programs will have an OIPT to provide assistance, oversight, and review as the program proceeds through its acquisition life cycle. An appropriate official within OSD, typically the Director, Defense Systems or the Deputy to the Assistant Secretary of Defense for Networks and Information Integration (ASD(NII)) for Command, Control, Communications, Intelligence, Surveillance and Reconnaisance (C3ISR) [sic] and Information Technology (IT) Acquisition, will lead the OIPT for Acquisition Category ID programs. The Deputy to the ASD(NII) for C3ISR and IT Acquisition also leads the OIPT for Acquisition Category IAM programs. The OIPT for Acquisition Category IAM programs is called the NII OIPT. OIPTs should include the Program Manager, Program Executive Officer, DoD Component Staff, Joint Staff, and OSD staff involved in oversight and review of the particular Acquisition Category ID or IAM program. Other OIPTs, specifically those for Chem Bio and Space, will be lead [sic] and directed by similar executives.

The OIPT should form upon departmental intention to start an acquisition program. The OIPT charters the Integrating Integrated Product Team and Working-level Integrated Product Teams. The OIPT should consider the recommendations of the Integrating Integrated Product Team regarding the appropriate milestone for program initiation and the minimum information needed for the program initiation milestone review. OIPTs should meet, thereafter, as necessary over the life of the program. The OIPT leader should act to resolve issues when requested by any member of the OIPT, or when so directed by the Milestone Decision Authority. The goal is to resolve as many issues and concerns at the lowest level possible, and to expeditiously escalate issues that need resolution at a higher level. The OIPT should bring only the highest-level issues to the Milestone Decision Authority for decision.

The OIPT should normally convene 2 weeks before a planned decision point. It should assess the information and recommendations that the Milestone Decision Authority will receive. It should also assess family-of-system or system-of-system capabilities within and between functional portfolios (or areas) in support of integrated architectures developed by the Joint Staff in collaboration with the OSD, USAF (as DoD Space Milestone Decision Authority), and the DoD Components. If the program includes a pilot project, such as Total Ownership Cost Reduction, the Program Manager should report the status of the project to the OIPT. The OIPT should then assess progress against stated goals. The Program Manager's briefing to the OIPT should address interoperability and supportability (including spectrum supportability) with other systems, anti-tamper provisions, and indicate whether those requirements will be satisfied by the acquisition strategy under review. If the program is part of a family-of-systems architecture, the Program Manager should brief the OIPT in that context. If the architecture includes less than Acquisition Category I programs that are key to achieving the expected operational capability, the Program Manager should also discuss the status of and dependence on those programs. The OIPT should review the programmatic risk issues of cost, schedule, and performance. The OIPT leader should recommend to the Milestone Decision Authority whether the anticipated review should go forward as planned.

For Acquisition Category ID decision points, the OIPT leader will provide the Defense Acquisition Board chair, co-chair, principals, and advisors with an integrated assessment using information gathered through the IPPD process. The OIPT assessment should focus on core acquisition management issues and should consider independent assessments, including technology readiness assessments, which the OIPT members normally prepare. These assessments typically occur in context of the OIPT review, and should be reflected in the OIPT leader's report. There should be no surprises at this point; all team members should work issues in real time and should be knowledgeable of their OIPT leader's assessment. OIPT and other staff members should minimize requirements for the program manager to provide pre-briefs independent of the OIPT process.

(Source: Defense Acquisition Guidebook, Section 10.3.1, Overarching IPT (OIPT) Procedures and Assessment. Available online at https://akss.dau.mil/dag/Guidebook/IG_c10.3.1.asp. Last accessed May 4, 2009.)

Physical Configuration Audit (PCA)

The Defense Acquisition Guidebook describes the PCA as follows:

The PCA is conducted around the time of the full rate production decision. The PCA examines the actual configuration of an item being produced. It verifies that the related design documentation matches the item as specified in the contract. In addition to the standard practice of assuring product verification, the PCA confirms that the manufacturing processes, quality control system, measurement and test equipment, and training are adequately planned, tracked, and controlled. The PCA validates many of the supporting processes used by the contractor in the production of the item and verifies other elements of the item that may have been impacted/redesigned after completion of the System Verification Review (SVR). A PCA is normally conducted when the government plans to control the detail design of the item it is acquiring via the Technical Data Package. When the government does not plan to exercise such control or purchase the item's Technical Data Package (e.g., performance based procurement) the contractor should conduct an internal PCA to define the starting point for controlling the detail design of the item and establishing a product baseline. The PCA is complete when the design and manufacturing documentation match the item as specified in the contract. If the PCA was not conducted prior to the full rate production decision, it should be performed as soon as production systems are available.

(Source: Defense Acquisition Guidebook, Section 4.3.4.4.3, Physical Configuration Audit (PCA). Available online at http://akss.dau.mil/dag/DoD5000.asp?view=document&rf=GuideBook\IG_c4.3.4.4.3.asp. Last accessed May 4, 2009.)

Post Critical Design Review Assessment (PCDRA)

Department of Defense Instruction 5000.02 describes the PCDRA as follows:

The MDA shall conduct a formal program assessment following system-level CDR. The system-level CDR provides an opportunity to assess design maturity as evidenced by measures such as: successful completion of subsystem CDRs; the percentage of hardware and software product build-to specifications and drawings completed and under configuration management; planned corrective actions to hardware/software deficiencies; adequate developmental testing; an assessment of environment, safety and occupational health risks; a completed failure modes and effects analysis; the identification of key system characteristics; the maturity of critical manufacturing processes; and an estimate of system reliability based on demonstrated reliability rates.

1. The PM shall provide a Post-CDR Report to the MDA that provides an overall assessment of design maturity and a summary of the system-level CDR results which shall include, but not be limited to:

a. The names, organizations, and areas of expertise of independent subject matter expert participants and CDR chair;
b. A description of the product baseline for the system and the percentage of build-to packages completed for this baseline;
c. A summary of the issues and actions identified at the review together with their closure plans;
d. An assessment of risk by the participants against the exit criteria for the EMD Phase; and
e. Identification of those issues/risks that could result in a breach to the program baseline or substantively impact cost, schedule, or performance.

2. The MDA shall review the Post-CDR Report and the PM's resolution/mitigation plans and determine whether additional action is necessary to satisfy EMD Phase exit criteria and to achieve the program outcomes specified in the APB. The results of the MDA's Post-CDR Assessment shall be documented in an ADM.

3. Successful completion of the Post-CDR Assessment ends Integrated System Design and continues the EMD Phase into System Capability and Manufacturing Process Demonstration.

(Source: Department of Defense Instruction 5000.02, Operation of the Defense Acquisition System, December 8, 2008.)
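
The Post-CDR Report content list (items a through e) maps naturally onto a simple record. The sketch below (Python) is an illustrative way to organize that information; the class, field names, and the 90 percent build-to-package trigger are assumptions, not requirements of DoDI 5000.02.

```python
from dataclasses import dataclass, field

@dataclass
class PostCDRReport:
    reviewers: list = field(default_factory=list)        # (a) independent SMEs and the CDR chair
    product_baseline: str = ""                            # (b) description of the product baseline
    build_to_packages_complete: float = 0.0               # (b) fraction of build-to packages completed
    issues_and_closure_plans: list = field(default_factory=list)  # (c)
    emd_exit_criteria_risk: str = ""                      # (d) participants' risk assessment
    baseline_breach_risks: list = field(default_factory=list)     # (e)

    def flags_for_mda(self):
        """Items the MDA may want to act on; the 0.9 threshold is an assumption."""
        flags = list(self.baseline_breach_risks)
        if self.build_to_packages_complete < 0.9:
            flags.append("build-to packages less than 90% complete")
        return flags

# Hypothetical usage
report = PostCDRReport(build_to_packages_complete=0.82,
                       baseline_breach_risks=["actuator qualification slip"])
print(report.flags_for_mda())
```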

Preliminary Design Review (PDR)

The Defense Acquisition Guidebook describes the PDR as follows:

The PDR is a multi-disciplined technical review to ensure that the system under review can proceed into detailed design, and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints. Generally, this review assesses the system preliminary design as captured in performance specifications for each configuration item in the system (allocated baseline), and ensures that each function in the functional baseline has been allocated to one or more system configuration items. Configuration items may consist of hardware and software elements and include such items as airframes, avionics, weapons, crew systems, engines, trainers/training, etc.

(Source: Defense Acquisition Guidebook, Section 4.3.3.4.4, Preliminary Design Review (PDR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.4.4.asp. Last accessed May 4, 2009.)

Production Readiness Review (PRR)

The Defense Acquisition Guidebook describes the PRR as follows:

The PRR examines a program to determine if the design is ready for production and if the producer has accomplished adequate production planning. The review examines risk; it determines if production or production preparations incur unacceptable risks that might breach thresholds of schedule, performance, cost, or other established criteria. The review evaluates the full, production-configured system to determine if it correctly and completely implements all system requirements. The review determines whether the traceability of final system requirements to the final production system is maintained.

(Source: Defense Acquisition Guidebook, Section 4.3.3.9.3, Production Readiness Review (PRR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.9.3.asp. Last accessed May 4, 2009.)

Program Executive Officer Sufficiency Review (PEO/SR)

The PEO/SR was described to the committee as follows:

The purpose of the AAC PEO Program Sufficiency Review Process is three-fold. First, to provide acquisition teams a robust support structure to develop/refine acquisition strategies. The second purpose is to effectively employ senior acquisition leaders' experience with independent/peer reviews of high-interest acquisition plans, program technical status along with associated cost and schedule. Finally, the review assists the program team to design realistic/high confidence program plans and to be able to understand, agree with and fully explain the risks of the program. Sufficiency Reviews are the final step in an integrated assessment of technical and programmatic elements of the program construct.

(Source: Judy A. Stokley, SES, USAF AFMC AAC/CA, e-mail to Jim Garcia, January 9, 2009.)

Program Support Review (PSR)

Department of Defense Instruction 5000.02 describes the PSR as follows:

PSRs are a means to inform an MDA and Program Office of the status of technical planning and management processes by identifying cost, schedule, and performance risk and recommendations to mitigate those risks. PSRs shall be conducted by cross-functional and cross-organizational teams appropriate to the program and situation. PSRs for ACAT ID and IAM programs shall be planned by the Director, Systems and Software Engineering (SSE) to support OIPT program reviews, at other times as directed by the USD(AT&L), and in response to requests from PMs.

(Source: Department of Defense Instruction 5000.02, Operation of the Defense Acquisition System, December 8, 2008.)

System Design Review (SDR)

On the Defense Acquisition University's Acquisition Community Connection Web site, the SDR is described as follows:

This review is conducted to evaluate the optimization, correlation, completeness, and risks associated with the allocated technical requirements. A review of the system engineering process that produced the allocated technical requirements and of the engineering planning for the next phase of effort should also be reviewed. Basic manufacturing considerations should be reviewed and planning for production engineering in subsequent phases should be addressed. This review should be conducted when the system definition effort has proceeded to the point where system characteristics are defined and the configuration items are identified.

(Available online at https://acc.dau.mil/CommunityBrowser.aspx?id=50742&lang=en-US. Last accessed May 5, 2009.)

System Functional Review (SFR)

The Defense Acquisition Guidebook describes the SFR as follows:

The SFR is a multi-disciplined technical review to ensure that the system under review can proceed into preliminary design, and that all system requirements and functional performance requirements derived from the Capability Development Document are defined and are consistent with cost (program budget), schedule (program schedule), risk, and other system constraints. Generally this review assesses the system functional requirements as captured in system specifications (functional baseline), and ensures that all required system performance is fully decomposed and defined in the functional baseline. System performance may be decomposed and traced to lower-level subsystem functionality that may define hardware and software requirements. The SFR determines whether the systems functional definition is fully decomposed to a low level, and whether the IPT is prepared to start preliminary design.

(Source: Defense Acquisition Guidebook, Section 4.3.3.4.3, System Functional Review (SFR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.4.3.asp. Last accessed May 4, 2009.)

System Requirements Review (SRR)

The Defense Acquisition Guidebook describes the SRR as follows:

The SRR is conducted to ascertain progress in defining system technical requirements. This review determines the direction and progress of the systems engineering effort and the degree of convergence upon a balanced and complete configuration. It is normally held during Technology Development, but may be repeated after the start of System Development and Demonstration to clarify the contractor's understanding of redefined or new user requirements. The SRR is a multi-disciplined technical review to ensure that the system under review can proceed into the System Development and Demonstration phase, and that all system requirements and performance requirements derived from the Initial Capabilities Document or draft Capability Development Document are defined and are consistent with cost (program budget), schedule (program schedule), risk, and other system constraints. Generally this review assesses the system requirements as captured in the system specification, and ensures that the system requirements are consistent with the preferred system solution as well as available technologies resulting from the Technology Development phase. Of critical importance to this review is an understanding of the program technical risk inherent in the system specification and in the System Development and Demonstration Phase Systems Engineering Plan. Determining an acceptable level of risk is key to a successful review.

(Source: Defense Acquisition Guidebook, Section 4.3.2.4.1, System Requirements Review (SRR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.2.4.asp#43241. Last accessed May 4, 2009.)

System Verification Review (SVR)

The Defense Acquisition Guidebook describes the SVR as follows:

The SVR is a multi-disciplined product and process assessment to ensure that the system under review can proceed into Low-Rate Initial Production and Full-Rate Production within cost (program budget), schedule (program schedule), risk, and other system constraints. Generally this review is an audit trail from the Critical Design Review. It assesses the system final product, as evidenced in its production configuration, and determines if it meets the functional requirements (derived from the Capability Development Document and draft Capability Production Document) documented in the Functional, Allocated, and Product Baselines. The SVR establishes and verifies final product performance. It provides inputs to the Capability Production Document. The SVR is often conducted concurrently with the Production Readiness Review. A Functional Configuration Audit may also be conducted concurrently with the SVR, if desired.

(Source: Defense Acquisition Guidebook, Section 4.3.3.9.2, System Verification Review (SVR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.9.2.asp. Last accessed May 4, 2009.)

Technology Readiness Assessment (TRA)

The Defense Acquisition Guidebook describes the TRA as follows:

The TRA is a systematic, metrics-based process that assesses the maturity of Critical Technology Elements. The TRA should be conducted concurrently with other Technical Reviews, specifically the Alternative Systems Review, System Requirements Review, or the Production Readiness Review. If a platform or system depends on specific technologies to meet system operational threshold requirements in development, production, and operation, and if the technology or its application is either new or novel, then that technology is considered a Critical Technology Element. The TRA should not be considered a risk assessment, but it should be viewed as a tool for assessing program risk and the adequacy of technology maturation planning. The TRA scores the current readiness level of selected system elements, using defined Technology Readiness Levels. The TRA highlights critical technologies and other potential technology risk areas that require program manager attention. The TRA essentially "draws a line in the sand" on the day of the event for making an assessment of technology readiness for critical technologies integrated at some elemental level. If the system does not meet pre-defined Technology Readiness Level scores, then a Critical Technology Element maturation plan is identified. This plan explains in detail how the Technology Readiness Level will be reached prior to the next milestone decision date or relevant decision point.

(Source: Defense Acquisition Guidebook, Section 4.3.2.4.3, Technology Readiness Assessment (TRA). Available online at https://akss.dau.mil/dag/DoD5000.asp?view=document&rf=GuideBook\IG_c4.3.3.9.4.asp. Last accessed May 4, 2009.)
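
The core TRA mechanic described above (score each Critical Technology Element against a defined Technology Readiness Level and require a maturation plan for any element that falls short) can be sketched as below (Python). The element names and the required TRL value are assumptions for illustration; actual TRL criteria and thresholds come from the TRA guidance for the milestone in question.

```python
from dataclasses import dataclass

@dataclass
class CriticalTechnologyElement:
    name: str
    assessed_trl: int  # current Technology Readiness Level (1-9)

def elements_needing_maturation_plans(elements, required_trl):
    """Return the CTEs scored below the pre-defined TRL for the next milestone."""
    return [e.name for e in elements if e.assessed_trl < required_trl]

# Hypothetical example
ctes = [CriticalTechnologyElement("conformal antenna array", assessed_trl=5),
        CriticalTechnologyElement("high-density fuel cell", assessed_trl=7)]
print(elements_needing_maturation_plans(ctes, required_trl=6))
# ['conformal antenna array'] -> a CTE maturation plan would be identified
```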

Test Readiness Review (TRR)

The Defense Acquisition Guidebook describes the TRR as follows:

The TRR is a multi-disciplined technical review to ensure that the subsystem or system under review is ready to proceed into formal test. The TRR assesses test objectives, test methods and procedures, scope of tests, and safety and confirms that required test resources have been properly identified and coordinated to support planned tests. The TRR verifies the traceability of planned tests to program requirements and user needs. The TRR determines the completeness of test procedures and their compliance with test plans and descriptions. The TRR assesses the system under review for development maturity, cost/schedule effectiveness, and risk to determine readiness to proceed to formal testing. In addition to adequate planning and management, to be effective the program manager should follow-up [sic] with the outcomes of the TRR.

(Source: Defense Acquisition Guidebook, Section 4.3.3.9.1, Test Readiness Review (TRR). Available online at https://akss.dau.mil/dag/GuideBook/IG_c4.3.3.9.asp#43391. Last accessed May 4, 2009.)

