1
Introduction

STATEMENT OF TASK

The committee was tasked by the Air Force to review the program management and technical reviews and assessments that U.S. Air Force space and nonspace system acquisition programs are required to undergo; assess each review in terms of the resources required and its role and contribution, identifying cases where different reviews have common or overlapping goals, content, or requirements; identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase cost-effectiveness and to lessen the workforce impact of the reviews as a whole; and recommend changes that the Air Force and the Department of Defense should make. The committee’s tasking is shown in Box 1-1.

BACKGROUND

DOD spends over $300 billion per year to develop, produce, field, and sustain weapons systems.1 Too often, DOD weapons systems programs experience large cost overruns and schedule delays, contributing to a growing loss of confidence in the DOD acquisition system.2,3

1 See DOD (U.S. Department of Defense), National Defense Budget Estimates for FY 2009, Updated September 2008. This amount is the sum of the amounts shown for “Operation & Maintenance,” “Procurement,” and “RDT&E.” Available online at http://www.defenselink.mil/comptroller/defbudget/fy2009/FY09Greenbook/greenbook_2009_updated.pdf. Last accessed May 19, 2009.




BOX 1-1
Statement of Task

The National Research Council (NRC) will

1. Review the prescribed program reviews and assessments that U.S. Air Force space and non-space system acquisition programs in all Department of Defense (DOD) acquisition categories (ACATs) are required to undergo, consistent with the various phases of the acquisition life cycle, that verify appropriate planning has occurred prior to concept decision, Milestone/Key Decision Point (KDP) A, Milestone/KDP B, and Milestone/KDP C.
2. Assess each review and the resources required to accomplish it, including funding, manpower (people and know-how), work effort, and time.
3. Assess the role and contribution that each review and the combined reviews make to successful acquisition.
4. Identify cases where different reviews have shared, common, or overlapping goals, objectives, content, or requirements.
5. Identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase the cost-effectiveness and to lessen workforce impact of the reviews as a whole, including examination and discussion of review processes used by other agencies (such as the National Aeronautics and Space Administration and the Department of Energy), the other military departments (the U.S. Army and the U.S. Navy), and industry.
6. Recommend changes that the Air Force and DOD should make to the reviews of Air Force programs, including review goals, objectives, content, and requirements.

As reflected in the statement of task in Box 1-1, this study addresses improvements to one of the essential elements of program success: program reviews.

The DOD acquisition decision process is based on phased milestone decisions that are supported by a series of technical and programmatic reviews. These reviews are designed to help program managers (PMs) effectively and efficiently manage their programs and to give executive leadership the information it needs to inform decisions. The formal acquisition decision process in place at the time of the study, and used by the committee as the basis for its review, is depicted in Figure 1-1. As noted in the Summary, the May 2003 version of DODI 5000.2 was replaced in December 2008 by DODI 5000.02, shown in Figure 1-2. The main differences are these: the materiel development decision (MDD) replaces the concept decision (CD); the materiel solution analysis (MSA) phase replaces the concept refinement (CR) phase; the engineering and manufacturing development (EMD) phase replaces the system development and demonstration (SDD) phase, and its two main efforts have been renamed (system integration and system demonstration became integrated system design and system capability and manufacturing process demonstration); and the post-CDR assessment replaces the design readiness review.

This formal DOD review process has evolved over the past 60 years, with many of the changes intended to address acquisition program cost overruns, schedule delays, and performance shortfalls in the delivered product, service, or system. Since implementation of the Goldwater-Nichols Act in the late 1980s,4 the main defense acquisition organizations (e.g., the program management offices) have operated under a tiered decision structure. For large acquisitions, the current policy described in DOD Directive 5000.1 states that the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD (AT&L)) is the Milestone Decision Authority (MDA) responsible for the overall program:

The Milestone Decision Authority (MDA) is the designated individual with overall responsibility for a program. The MDA shall have the authority to approve entry of an acquisition program into the next phase of the acquisition process and shall be accountable for cost, schedule, and performance reporting to higher authority, including Congressional reporting.5

2 Elizabeth Newell, “GAO: Weapons systems over budget, overdue, underperforming” (April 1, 2008). Available online at http://www.govexec.com/dailyfed/0408/040108e1.htm. Last accessed May 19, 2009.
3 GAO (Government Accountability Office), Defense Acquisition: Assessment of Selected Weapons Programs, GAO-08-467SP, Washington, D.C.: GAO (2008).
4 U.S. Congress, Goldwater-Nichols Department of Defense Reorganization Act of 1986, Public Law 99-433.
5 USD (AT&L), The Defense Acquisition System, Department of Defense Directive 5000.1, Washington, D.C.

FIGURE 1-1 Major defense acquisition decision reviews and phases (Concept Refinement, Technology Development, System Development and Demonstration, Production and Deployment, and Operations and Support, with decision points CD, A, B, DRR, C, and FRP). SOURCE: Adapted from John T. Dillard, Centralized Control of Defense Acquisition Programs: A Comparative Review of the Framework from 1987 to 2003, NPS-AM-03-003, Acquisition Research Sponsored Report Series, September 2003, Monterey, Calif.: Naval Postgraduate School.

FIGURE 1-2 Revised major defense acquisition decision reviews and phases (Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development, Production and Deployment, and Operations and Support, with decision points MDD, A, B, PCDRA, C, and FRP). SOURCE: Adapted from Figure 1-1 and the new DODI 5000.02.

Three levels down the hierarchy, a PM is described as follows:

The designated individual with responsibility for and authority to accomplish program objectives for development, production, and sustainment to meet the user’s operational needs. The PM shall be accountable for credible cost, schedule, and performance reporting to the MDA.

Thus, the PM and MDA share responsibility for development and oversight of a program. Further guidance under DOD Directive 5000.1 provides as follows:

There is no one best way to structure an acquisition program to accomplish the objective of the Defense Acquisition System. MDAs and PMs shall tailor program strategies and oversight, including documentation of program information, acquisition phases, the timing and scope of decision reviews, and decision levels to fit the particular conditions of that program, consistent with applicable laws and regulations and the time-sensitivity of the capability need.

While the wording above might indicate that the MDA and PM plan jointly or collaborate on program strategy, there are, in fact, both a Service (or Component) Acquisition Executive (SAE) and a Program Executive Officer (PEO) in the hierarchy between them, and direct communication between an MDA and a PM is typically infrequent. The four tiers of major program reporting are shown in Figure 1-3. Additionally, the Air Force has recently embedded the PEOs and the PMs in a wing/group/squadron framework aimed at aligning acquisition and operational structures.

Figure 1-4 depicts the DOD and Air Force milestone and program review processes. Although changes to both policy and implementation have occurred periodically, the process has its roots in dealing with single programs and/or single systems (platform, weapon, sensor) typically acquired by a single military service. Over the past decade, the emergence of network-enabled programs that require significant interoperability across multiple platforms, weapons, sensor systems, and military services has substantially contributed to the complexity and cost of many acquisition programs, complicating program management and the oversight processes.

Beyond decision reviews for major defense acquisition programs at each milestone (A, B, and C), regulations prescribe additional reviews at the Office of the Secretary of Defense (OSD) level for the concept (materiel development) decision, design readiness (Post-Critical Design Review Assessment (PCDRA)), and full rate production. Before each of these, an overarching integrated product team (OIPT) review is conducted in preparation for the Defense Acquisition Board (DAB) meeting. In preparation for these, a service/component-level review, such as an Air Force Review Board (AFRB) or acquisition strategy panel (ASP), is typically conducted as well.

FIGURE 1-3 Four tiers of major program reporting. SOURCE: Committee-generated. DAE, Defense Acquisition Executive; USD (AT&L), Under Secretary of Defense for Acquisition, Technology, and Logistics; SECAF, Secretary of the Air Force; CSAF, Chief of Staff of the Air Force; MAJCOM HQ, Major Command Headquarters; SAE, Service Acquisition Executive; SAF/AQ, Assistant Secretary of the Air Force for Acquisition; PEO, Program Executive Officer; PM, Program Manager.

In addition, OSD has implemented program support reviews (PSRs), similar to independent program assessments (IPAs) for space systems, and has directed annual configuration steering boards (CSBs) for programs in the SDD phase. The CSBs are to be chaired by the CAE. At the PEO level, sufficiency reviews are being conducted annually for ACAT I-III programs.

FIGURE 1-4 DOD-Air Force milestone and program review process. For acronyms, see the list following the Table of Contents. SOURCE: Adapted from Janet Hassan, Acquisition Chief, Process Office, “Oversight, command and control (OC2),” presentation to the committee on May 7, 2008.

More recent innovations in oversight reviews are specialty reviews, which are assessments (conducted at varying levels) of various aspects of a program, such as logistics health, manufacturing readiness, and technical readiness (maturity).

In the various iterations of the DOD 5000 series regulations governing acquisition programs, both the number and level of reviews have increased substantially, particularly when taking into account the array of prebriefs and informational meetings held in support of the formal reviews.6 Reviews at multiple levels of the acquisition management hierarchy have increased with each revision of the DOD 5000 series instructions in 2000, 2003, and 2008. The DOD 5000.1 and 5000.2 series of 2000 prescribed six OSD-level decision reviews in the Acquisition Framework for major programs, up from only four previously (under the 1996-era instructions). Its new evolutionary acquisition policy also called for partitioning programs into increments, each requiring its own Milestone B and C reviews. The result was 10 or so reviews in the course of a notional, fully scoped program. More nondiscretionary reviews have since been added in later regulations and in memoranda from the USD (AT&L), such as the one signed on July 30, 2007, dictating that CSBs chaired by SAEs be conducted annually for major acquisition programs. Similarly, periodic OSD-level program support reviews (PSRs) and assessments of operational test readiness (AOTRs) have arisen to add oversight across functional areas and “improve the probability of program success.”7 Discretionary program-level reviews, such as technical reviews prescribed for systems engineering, were made mandatory in the latest DOD 5000.02 instruction (2008). The net result is a substantial increase in the number and frequency of management reviews at program, service, and OSD levels.

Numerous recent studies8,9,10 have addressed the cost overruns and delays experienced by DOD acquisition programs over the past few decades. In brief, despite continued attempts to improve the acquisition process, in part through the addition of reviews, acquisition programs continue to experience cost overruns, schedule delays, and/or as-delivered performance shortfalls.

From the perspective of the PM, all of the reviews, both formal and informal, must be supported by the program office, and in many cases the industry partners also participate. Although each individual review is intended to serve a specific purpose, the overall magnitude of the review efforts not only significantly increases the workload of the program office in terms of direct support, but also diverts attention from day-to-day management of the program.

6 J.T. Dillard, “Toward centralized control of defense acquisition programs,” Acquisition Review Journal, Defense Acquisition University (DAU), August-December 2005.
7 Available at http://www.acq.osd.mil/at/initiatives/factsheets/program_support_reviews/index.html.
8 Assessment Panel of the Defense Acquisition Performance Assessment Project, Defense Acquisition Performance Assessment Report (January 2006).
9 Obaid Younossi, Mark V. Arena, Robert S. Leonard, Charles Robert Roll, Jr., Arvind Jain, and Jerry M. Sollinger, Is Weapon System Cost Growth Increasing? A Quantitative Assessment of Completed and Ongoing Programs, Santa Monica, Calif.: RAND Corporation (2007).
10 GAO, Defense Acquisition: Assessment of Selected Weapons Programs, GAO-08-467SP, Washington, D.C.: GAO (2008).

The committee, in reviewing studies conducted over the past decade, could find no evidence of earlier work that focused on the impact of the overall formal and informal review process on the acquisition system in terms of the resources spent by the program office or the effect of diverting a PM’s attention from the day-to-day management of his or her programs. Additionally, the unique role that PMs play in the acquisition process requires them to participate in all reviews (and prereviews) with multiple program stakeholders. In brief, only the PM sees and feels the breadth and depth of the review process. For this reason, the committee decided to approach the study from the perspective of the PM, who is a key element in successful program execution.

The committee recognizes the challenges inherent in achieving successful DOD acquisition programs in an increasingly complex and dynamic arena that spans multiple organizations (including industry) and functions that do not easily align. That said, the committee recognizes the opportunity to contribute in a substantive way by examining the expenses a PM incurs from the growing array of program and technical reviews, in terms of time spent supporting reviews and time lost from focusing on program execution. A key question then is this: Can changes in the number, content, or sequence of program reviews help the program manager execute the program more successfully?

METHODOLOGY

To fulfill the assignments set out in the statement of task, the committee employed a blended research methodology, using four complementary approaches.

Presentations

Data were gathered in the course of four separate multiday conferences. The committee received presentations from PMs and PEOs from the three military departments and industry; from DOD overseers, practitioners, process owners, and policy writers; and from GAO researchers and others who had studied DOD acquisition in a larger context. In addition, some committee members interviewed contributors who were unable to meet with the full committee. The presenters are listed in Appendix B.

Literature Review

In parallel, a committee subgroup accumulated and examined an extensive body of pertinent studies and acquisition reform initiatives within the Air Force, DOD, and other agencies over the last 20 years. The previous studies are listed in Appendix C.

Survey and Other Data

During the compilation and analysis of data from the presentations, interviews, and previous studies, it became apparent that there were few data on external program reviews to support this study, particularly items 2, 3, and 4 in the Statement of Task. Consequently, a survey tailored to support this study was developed and beta-tested. The survey was designed to collect information from Air Force PMs and PEOs on external reviews they had experienced. Survey information included quantitative and qualitative data on the impact of external reviews on program execution, including the time and effort spent preparing for, participating in, and following up on actions resulting from such reviews. The survey also asked PMs and PEOs to assess the value of the reviews to them in managing their programs.

The intent of the survey was to expand the number of persons contacted beyond the limited number of interviews and to generate some quantitative data. The committee used the survey data as another form of information to augment its research, interviews, and personal experience. No finding, conclusion, or recommendation in the report is based solely on survey data; rather, the findings reflect what the committee heard from all sources.

Pertinent survey results are discussed in Chapter 2, Findings and Conclusions. The survey can be found in Appendix D, along with a detailed description of how it was developed and conducted and of its results.

Other data were collected, on a case-by-case basis, from individual programs to fill in missing information about the number and levels of the reviews being conducted as part of the current acquisition process described earlier in this chapter.

Comparative Matrix

Lastly, a comparative matrix was constructed as a tool to catalog the number and types of known reviews, their purposes, and their target audiences, and thereby to identify opportunities for streamlining, integrating, and/or consolidating reviews. The number and types of programmatic and technical reviews are summarized in Chapter 2; a brief description of each review is contained in Appendix E.

Integration and Synthesis of Data

As stated earlier, the committee recognized the substantial body of historical literature and thought addressing the challenges of DOD systems acquisition. Following its review of earlier studies and the series of presentations, the committee spent a significant amount of time discussing how best to respond to the Statement of Task and how best to develop actionable recommendations clearly traceable to study findings. Three early observations substantially influenced those discussions and led to the organization of this report:

1. None of the impressive array of past studies reviewed by the committee approached the acquisition challenge from the perspective of the PM, who is a critical element in the success or failure of a program.
2. The committee’s literature review and early interviews indicated that little information existed that would allow it to quantify the resources necessary to accomplish any particular program review, as required by item 2 in the Statement of Task.
3. The early round of interviews, as well as the collective experience of the committee members, led to a sense that there is substantial variance in the conduct and impact of any given review carried out over any set of programs: reviews are easily influenced by the “personality” or “interest” of the reviewing authority. This observation cast doubt on how well Statement of Task items 3, 4, and 5 could be addressed.

As a result, the committee decided to create and implement a survey to obtain additional information, both qualitative and quantitative, if possible. Additionally, the committee decided to construct a program review matrix (Table 2-1) to present a holistic view of the array of typical reviews (and accompanying prereviews) faced by a program manager. Finally, the committee decided to focus on developing a comprehensive set of recommendations responsive to the statement of task. The intent of the committee is to reflect the perspective of the PM within the larger context of the acquisition environment. The full committee deliberated on the results from all of its information sources to arrive at consensus findings, conclusions, and recommendations.