Suggested Citation: "1 Introduction." National Research Council. 2009. Optimizing U.S. Air Force and Department of Defense Review of Air Force Acquisition Programs. Washington, DC: The National Academies Press. doi: 10.17226/12673.

1
Introduction

STATEMENT OF TASK

The committee was tasked by the Air Force to review the program management and the technical reviews and assessments that U.S. Air Force space and nonspace system acquisition programs are required to undergo; to assess each review in terms of the resources it requires and its role and contribution, identifying cases where different reviews have common or overlapping goals, content, or requirements; to identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase cost-effectiveness and to lessen the workforce impact of the reviews as a whole; and to recommend changes that the Air Force and the Department of Defense should make. The committee's tasking is shown in Box 1-1.

BACKGROUND

DOD spends over $300 billion per year to develop, produce, field, and sustain weapons systems.[1] Too often, DOD weapons systems programs experience large cost overruns and schedule delays, contributing to a growing loss of confidence in the DOD acquisition system.[2],[3]

[1] See DOD (U.S. Department of Defense), National Defense Budget Estimates for FY 2009, updated September 2008. This amount is the sum of the amounts shown for "Operation & Maintenance," "Procurement," and "RDT&E." Available online at http://www.defenselink.mil/comptroller/defbudget/fy2009/FY09Greenbook/greenbook_2009_updated.pdf. Last accessed May 19, 2009.
[2] Elizabeth Newell, "GAO: Weapons systems over budget, overdue, underperforming" (April 1, 2008). Available online at http://www.govexec.com/dailyfed/0408/040108e1.htm. Last accessed May 19, 2009.
[3] GAO (Government Accountability Office), Defense Acquisitions: Assessments of Selected Weapon Programs, GAO-08-467SP, Washington, D.C.: GAO (2008).

BOX 1-1
Statement of Task

The National Research Council (NRC) will

1. Review the prescribed program reviews and assessments that U.S. Air Force space and non-space system acquisition programs in all Department of Defense (DOD) acquisition categories (ACATs) are required to undergo, consistent with the various phases of the acquisition life cycle, that verify appropriate planning has occurred prior to concept decision, Milestone/Key Decision Point (KDP) A, Milestone/KDP B, and Milestone/KDP C.

2. Assess each review and the resources required to accomplish it, including funding, manpower (people and know-how), work effort, and time.

3. Assess the role and contribution that each review and the combined reviews make to successful acquisition.

4. Identify cases where different reviews have shared, common, or overlapping goals, objectives, content, or requirements.

5. Identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase the cost-effectiveness and to lessen the workforce impact of the reviews as a whole, including examination and discussion of review processes used by other agencies (such as the National Aeronautics and Space Administration and the Department of Energy), the other military departments (the U.S. Army and the U.S. Navy), and industry.

6. Recommend changes that the Air Force and DOD should make to the reviews of Air Force programs, including review goals, objectives, content, and requirements.

As reflected in the statement of task in Box 1-1, this study addresses improvements to one of the essential elements of program success—program reviews. The DOD acquisition decision process is based on phased milestone decisions that are supported by a series of technical and programmatic reviews. These reviews are designed to help program managers (PMs) effectively and efficiently manage their programs and to give executive leadership the information it needs to inform decisions. The formal acquisition decision process in place at the time of the study, and used by the committee as the basis for its review, is depicted in Figure 1-1. As noted in the Summary, the May 2003 version of DODI 5000.2 was replaced in December 2008 by DODI 5000.02, shown in Figure 1-2. The main differences are these: the materiel development decision (MDD) replaces the concept decision (CD); the materiel solution analysis (MSA) phase replaces the concept refinement (CR) phase; the engineering and manufacturing development (EMD) phase replaces the system development and demonstration (SDD) phase, and its two main efforts have been renamed (system integration and system demonstration became integrated system design and system capability and manufacturing process demonstration); and the post-CDR assessment replaces the design readiness review.

This formal DOD review process has evolved over the past 60 years, with many of the changes intended to address acquisition program cost overruns, schedule delays, and performance shortfalls in the delivered product, service, or system. Since implementation of the Goldwater-Nichols Act in the late 1980s,[4] the main defense acquisition organizations (e.g., the program management offices) have operated under a tiered decision structure. For large acquisitions, the current policy, described in DOD Directive 5000.1,[5] states that the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD (AT&L)) is the Milestone Decision Authority (MDA) responsible for the overall program:

    The Milestone Decision Authority (MDA) is the designated individual with overall responsibility for a program. The MDA shall have the authority to approve entry of an acquisition program into the next phase of the acquisition process and shall be accountable for cost, schedule, and performance reporting to higher authority, including Congressional reporting.

[4] U.S. Congress, Goldwater-Nichols Department of Defense Reorganization Act of 1986, Public Law 99-433.
[5] USD (AT&L), The Defense Acquisition System, Department of Defense Directive 5000.1, Washington, D.C.

FIGURE 1-1  Major defense acquisition decision reviews and phases under the May 2003 DODI 5000.2. User needs and technology opportunities feed the phases Concept Refinement, Technology Development, System Development and Demonstration (system integration; system demonstration), Production and Deployment (production readiness, LRIP, and IOT&E; full-rate production and deployment), and Operations and Support, punctuated by the decision points CD, A, B, DRR, C, and FRP. SOURCE: Adapted from John T. Dillard, Centralized Control of Defense Acquisition Programs: A Comparative Review of the Framework from 1987 to 2003, NPS-AM-03-003, Acquisition Research Sponsored Report Series, September 2003, Monterey, Calif.: Naval Postgraduate School.

FIGURE 1-2  Revised major defense acquisition decision reviews and phases under DODI 5000.02. User needs and technology opportunities feed the phases Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development (integrated system design; system capability and manufacturing process demonstration), Production and Deployment (production readiness, LRIP, and IOT&E; full-rate production and deployment), and Operations and Support, punctuated by the decision points MDD, A, B, PCDRA, C, and FRP. SOURCE: Adapted from Figure 1-1 and the new DODI 5000.02.

Three levels down the hierarchy, a PM is described as follows:

    The designated individual with responsibility for and authority to accomplish program objectives for development, production, and sustainment to meet the user's operational needs. The PM shall be accountable for credible cost, schedule, and performance reporting to the MDA.

Thus, the PM and MDA share responsibility for the development and oversight of a program. Further guidance under DOD Directive 5000.1 provides as follows:

    There is no one best way to structure an acquisition program to accomplish the objective of the Defense Acquisition System. MDAs and PMs shall tailor program strategies and oversight, including documentation of program information, acquisition phases, the timing and scope of decision reviews, and decision levels, to fit the particular conditions of that program, consistent with applicable laws and regulations and the time-sensitivity of the capability need.

While the wording above might indicate that the MDA and PM plan jointly or collaborate on program strategy, there are, in fact, both a Service (or Component) Acquisition Executive (SAE) and a Program Executive Officer (PEO) in the hierarchy between them, and direct communication between an MDA and a PM is typically infrequent. The four tiers of major program reporting are shown in Figure 1-3. Additionally, the Air Force has recently embedded the PEOs and the PMs in a wing/group/squadron framework aimed at aligning acquisition and operational structures.

Figure 1-4 depicts the DOD and Air Force milestone and program review processes. Although changes to both policy and implementation have occurred periodically, the process has its roots in dealing with single programs and/or single systems (platform, weapon, sensor) typically acquired by a single military service. Over the past decade, the emergence of network-enabled programs that require significant interoperability across multiple platforms, weapons, sensor systems, and military services has added substantially to the complexity and cost of many acquisition programs, complicating program management and the oversight processes.

Beyond decision reviews for major defense acquisition programs at each milestone (A, B, and C), regulations prescribe additional reviews at the Office of the Secretary of Defense (OSD) level for the concept (materiel development) decision, design readiness (the Post-Critical Design Review Assessment (PCDRA)), and full-rate production. Before each of these, an overarching integrated product team (OIPT) review is conducted in preparation for the Defense Acquisition Board (DAB) meeting. In preparation for these, a service/component-level review—such as an Air Force Review Board (AFRB) or acquisition strategy panel (ASP)—is typically conducted as well.

FIGURE 1-3  Four tiers of major program reporting. The DAE (USD (AT&L)) is the Milestone Decision Authority for major programs; the SAE (SAF/AQ) is the Milestone Decision Authority for lesser programs; the PEO, dual-hatted as a product center commander, oversees a portfolio of programs; and the PM, dual-hatted within the wing/group/squadron structure, manages one or more programs. SOURCE: Committee-generated. DAE, Defense Acquisition Executive; USD (AT&L), Under Secretary of Defense for Acquisition, Technology, and Logistics; SECAF, Secretary of the Air Force; CSAF, Chief of Staff of the Air Force; MAJCOM HQ, Major Command Headquarters; SAE, Service Acquisition Executive; SAF/AQ, Assistant Secretary of the Air Force for Acquisition; PEO, Program Executive Officer; PM, Program Manager.

In addition, OSD has implemented program support reviews (PSRs), similar to the independent program assessments (IPAs) for space systems, and has directed annual configuration steering boards (CSBs) for programs in the SDD phase. The CSBs are to be chaired by the CAE. At the PEO level, sufficiency reviews are being conducted annually for ACAT I-III programs.

FIGURE 1-4  DOD-Air Force milestone and program review process. The figure maps, against the acquisition phases (Concept Refinement (now MSA), Technology Development, System Development and Demonstration (now EMD), and Production and Deployment) and the DAB milestones (CD (now MDD), A, B, DRR (now PCDRA), C, and FRP), the OSD and Joint Staff reviews (OIPT, PSR, JROC), the Air Force reviews (ASP, AFRB, AFROCC, sufficiency reviews, configuration steering boards, and the ADM, LCMP, SEP, ISP, and PMD documents), and the program execution reviews (SRR, ASR, SFR, PDR, CDR, TRA, IBR, TRR, OTRR, LHA, SVR/FCA, MRA, PRR, PCA). For acronyms, see the list following the Table of Contents. SOURCE: Adapted from Janet Hassan, Acquisition Chief, Process Office, "Oversight, command and control (OC2)," presentation to the committee on May 7, 2008.

More recent innovations in oversight reviews are specialty reviews, which are assessments (conducted at varying levels) of various aspects of a program, such as logistics health, manufacturing readiness, and technical readiness (maturity).

In the various iterations of the DOD 5000 series regulations governing acquisition programs, both the number and the level of reviews have increased substantially, particularly when taking into account the array of prebriefs and informational meetings held in support of the formal reviews. Reviews at multiple levels of the acquisition management hierarchy have increased with each revision of the DOD 5000 series instructions in 2000, 2003, and 2008. The DOD 5000.1 and 5000.2 series of 2000 prescribed six OSD-level decision reviews in the acquisition framework for major programs, up from only four in the 1996-era instructions.[6] Its new evolutionary acquisition policy also called for partitioning programs into increments, each requiring its own Milestone B and C reviews. The result was 10 or so reviews in the course of a notional, fully scoped program. More nondiscretionary reviews have since been added in later regulations and in memoranda from the USD (AT&L), such as the one signed on July 30, 2007, dictating that CSBs chaired by SAEs be conducted annually for major acquisition programs. Similarly, periodic OSD-level program support reviews (PSRs) and assessments of operational test readiness (AOTRs) have arisen to add oversight across functional areas and "improve the probability of program success."[7] Discretionary program-level reviews, such as the technical reviews prescribed for systems engineering, were made mandatory in the latest DOD 5000.02 instruction (2008). The net result is a substantial increase in the number and frequency of management reviews at the program, service, and OSD levels.

Numerous recent studies[8],[9],[10] have addressed the cost overruns and delays experienced by DOD acquisition programs over the past few decades. In brief, despite continued attempts to improve the acquisition process, in part through the addition of reviews, acquisition programs continue to experience cost overruns, schedule delays, and/or as-delivered performance shortfalls.

From the perspective of the PM, all of the reviews, both formal and informal, must be supported by the program office, and in many cases the industry partners also participate. Although each individual review is intended to serve a specific purpose, the overall magnitude of the review effort not only significantly increases the workload of the program office in terms of direct support but also diverts attention from day-to-day management of the program.

[6] J.T. Dillard, "Toward centralized control of defense acquisition programs," Acquisition Review Journal, Defense Acquisition University (DAU), August-December 2005.
[7] Available at http://www.acq.osd.mil/at/initiatives/factsheets/program_support_reviews/index.html.
[8] Assessment Panel of the Defense Acquisition Performance Assessment Project, Defense Acquisition Performance Assessment Report (January 2006).
[9] Obaid Younossi, Mark V. Arena, Robert S. Leonard, Charles Robert Roll, Jr., Arvind Jain, and Jerry M. Sollinger, Is Weapon System Cost Growth Increasing? A Quantitative Assessment of Completed and Ongoing Programs, Santa Monica, Calif.: RAND Corporation (2007).
[10] GAO, Defense Acquisitions: Assessments of Selected Weapon Programs, GAO-08-467SP, Washington, D.C.: GAO (2008).

The committee, in reviewing studies conducted over the past decade, could find no evidence of earlier work that focused on the impact of the overall formal and informal review process on the acquisition system, in terms of either the resources spent by the program office or the effect of diverting a PM's attention from the day-to-day management of his or her programs. Additionally, the unique role that PMs play in the acquisition process requires them to participate in all reviews (and prereviews) with multiple program stakeholders. In brief, only the PM sees and feels the full breadth and depth of the review process. For this reason, the committee decided to approach the study from the perspective of the PM, who is a key element in successful program execution.

The committee recognizes the challenges inherent in achieving successful DOD acquisition programs in an increasingly complex and dynamic arena that spans multiple organizations (including industry) and functions that do not easily align. That said, the committee saw an opportunity to contribute in a substantive way by examining the costs a PM incurs from the growing array of program and technical reviews, both in time spent supporting reviews and in time lost for focusing on program execution. A key question, then, is this: Can changes in the number, content, or sequence of program reviews help the program manager execute the program more successfully?

METHODOLOGY

To fulfill the assignments set out in the Statement of Task, the committee employed a blended research methodology, using four complementary approaches.

Presentations

Data were gathered in the course of four separate multiday conferences. The committee received presentations by PMs and PEOs from the three military departments; industry; DOD overseers, practitioners, process owners, and policy writers; as well as GAO researchers and others who had studied DOD acquisition in a larger context. In addition, some committee members interviewed contributors who were unable to meet with the full committee. The presenters are listed in Appendix B.

Literature Review

In parallel, a committee subgroup accumulated and examined an extensive body of pertinent studies and acquisition reform initiatives within the Air Force, DOD, and other agencies over the last 20 years. The previous studies are listed in Appendix C.

Survey and Other Data

During the compilation and analysis of data from presentations, interviews, and previous studies, it became apparent that there were few data on external program reviews to support this study, particularly items 2, 3, and 4 in the Statement of Task. Consequently, a survey tailored to support this study was developed and beta-tested. The survey was designed to collect information from Air Force PMs and PEOs on external reviews they had experienced. Survey information included quantitative and qualitative data on the impact of external reviews on program execution, including the time and effort spent preparing for, participating in, and following up on actions resulting from such reviews. The survey also asked for PM and PEO assessments of the value of the reviews to them in managing their programs.

The intent of the survey was to expand the number of persons contacted beyond a limited number of interviews and to generate some quantitative data. The committee used the survey data as another form of information to augment its research, interviews, and personal experience. No finding, conclusion, or recommendation in the report is based solely on survey data; rather, the findings reflect what the committee heard from all sources.

Pertinent survey results are discussed in Chapter 2, Findings and Conclusions. The survey can be found in Appendix D, along with a detailed description of how it was developed and conducted and its results. Other data were collected (on a case-by-case basis) from individual programs to fill in missing information about the number and levels of the reviews being conducted as part of the current acquisition process described earlier in this chapter.

Comparative Matrix

Lastly, a comparative matrix was constructed as a tool to help identify the number and types of known reviews, their purpose, and their target audiences, in order to identify opportunities for streamlining, integrating, and/or consolidating reviews. The number and types of programmatic and technical reviews are summarized in Chapter 2; a brief description of each review is contained in Appendix E.

Integration and Synthesis of Data

As stated earlier, the committee recognized the substantial body of historical literature and thought addressing the challenges of DOD systems acquisition. Following its review of earlier studies and the series of presentations, the committee spent a significant amount of time discussing how best to respond to the Statement of Task and how best to develop actionable recommendations clearly traceable to study findings. Three early observations substantially influenced those discussions and led to the organization of this report:

1. None of the impressive array of past studies reviewed by the committee approached the acquisition challenge from the perspective of the PM, who is a critical element in the success or failure of a program.

2. The committee's literature review and early interviews indicated that little information existed that would allow it to quantify the resources necessary to accomplish any particular program review, as required by item 2 in the Statement of Task.

3. The early round of interviews, as well as the collective experience of the committee members, led to their sense that there is substantial variance in the conduct and impact of any given review carried out over any set of programs—that reviews are easily influenced by the "personality" or "interest" of the reviewing authority. This observation cast doubt on how well Statement of Task items 3, 4, and 5 could be addressed.

As a result, the committee decided to create and implement a survey to obtain additional information—both qualitative and quantitative, if possible. Additionally, the committee decided to construct a program review matrix (Table 2-1) to present a holistic view of the array of typical reviews (and accompanying prereviews) faced by a program manager. Finally, the committee decided to focus on developing a comprehensive set of recommendations responsive to the Statement of Task. The intent of the committee is to reflect the perspective of the PM within the larger context of the acquisition environment. The full committee deliberated on the results from all of its information sources to arrive at consensus findings, conclusions, and recommendations.
