Optimizing U.S. Air Force and Department of Defense Review of Air Force Acquisition Programs

SUMMARY

The Department of Defense (DOD) spends over $300 billion per year to develop, produce, field, and sustain weapons systems.1 Achieving success for DOD acquisition programs is a challenge in an increasingly complex and dynamic arena that spans multiple organizations (including industry) and functions that do not easily align. Too often, DOD weapons systems programs experience large cost overruns and schedule delays, contributing to a growing loss of confidence in the DOD acquisition system.2,3 In response, there has been a growing array of program and technical reviews that a program manager (PM) must face. While these reviews are an essential element of program success, they impose costs on the program: time spent supporting the reviews comes at the expense of time spent on program execution. This study addresses a key question: Can changes in the number, content, sequence, or conduct of program reviews help the PM execute the program more successfully?

1 See DOD (U.S. Department of Defense), National Defense Budget Estimates for FY 2009, updated September 2008. This amount is the sum of the amounts shown for “Operation & Maintenance,” “Procurement,” and “RDT&E.” Available online at http://www.defenselink.mil/comptroller/defbudget/fy2009/FY09Greenbook/greenbook_2009_updated.pdf. Last accessed May 19, 2009.

2 Elizabeth Newell, “GAO: Weapons systems over budget, overdue, underperforming” (April 1, 2008). Available online at http://www.govexec.com/dailyfed/0408/040108e1.htm. Last accessed May 19, 2009.

3 Government Accountability Office (GAO), Defense Acquisitions: Assessments of Selected Weapon Programs, GAO-08-467SP, Washington, D.C.: GAO (2008).
METHODOLOGY

The committee was tasked by the Air Force to review the prescribed program management and technical reviews and assessments that U.S. Air Force space and nonspace system acquisition programs are required to undergo; to identify and evaluate options for streamlining, tailoring, integrating, or consolidating program reviews in order to increase their cost-effectiveness and lessen their impact on the workforce; and to recommend changes that the Air Force and DOD should make.

To accomplish its assignment, the committee received presentations from PMs and program executive officers (PEOs) of the three military departments; industry representatives; overseers, practitioners, process owners, and policy writers in DOD; and Government Accountability Office (GAO) researchers and others who have studied DOD acquisition in a broader context. The committee studied the pertinent literature on acquisition reform initiatives in the Air Force, DOD, and other agencies over the last 20 years (see Appendix C) but found very little quantitative information addressing the elements of the Statement of Task. As a result, the committee surveyed Air Force PMs and PEOs to collect quantitative and qualitative information on the impact of external reviews on program execution and on how the reviews help them manage their programs. The committee also gathered information from individual programs on the number and levels of reviews being conducted under the current acquisition process. It constructed a comparative matrix to identify the number and types of known reviews, along with their purposes and target audiences, all of which could suggest opportunities for streamlining, integrating, and/or consolidating the reviews. The committee deliberated on the results of these efforts and reached consensus on its findings, conclusions, and recommendations.
As this report was being finished, the Office of the Secretary of Defense (OSD) published a revised DOD Instruction 5000.02. The committee’s findings, conclusions, and recommendations are not affected by this revision, which increased the number of program reviews.

RECOMMENDATIONS

The committee presents the following five recommendations, aimed at achieving more effective program acquisition and reducing the burden on PMs. It believes that if the Air Force were to adopt and implement all of the recommendations, it would achieve a “gold standard” that could serve as a benchmark for other DOD acquisition program review efforts.

Recommendation 1. To ensure that they possess a common understanding of the intent, scope, and output of reviews, the Air Force acquisition and requirements communities at all levels should engage in timely planning for program reviews that results in clear, comprehensive, measurable objectives.
To carry out this recommendation, the Service Acquisition Executive (SAE) should direct a governance process that plans, coordinates, and executes reviews at each level of organization. Each program review’s objectives, metrics, and success criteria should be communicated effectively to the PM and the office of primary responsibility (OPR) for the review well in advance of the review. To complete the process, the chair of the review team should issue a report, followed by a closeout report from the PM.

Recommendation 2. The SAE should develop a plan for the timely, synchronized execution of all program reviews, aligned with program decision points and milestones.

Aligning program reviews with decision points and milestones ensures that the number of reviews preceding each decision point or milestone review is minimized and that the reviews bring value to the program. Properly timed reviews should result in fewer long-term schedule delays and lower costs, because early identification of issues and risks allows the PM to institute strategies for managing them. Eliminating some reviews and combining others will reduce costs and lighten the burden on the PM and his or her staff.

Recommendation 3. Before creating or approving a new review, the SAE should compare its objectives with those of existing reviews to determine whether an existing review could accomplish or incorporate those objectives. This comparison should also determine whether broadening the stakeholders for a given review, rather than conducting additional program reviews, would accomplish the objective.

Recommendation 4. The OPR should staff the review team with recognized subject matter experts.

Subject matter experts need to be identified, and they must participate in the program review for the review to be of full value to the program.
To facilitate this process, the OPR should maintain a roster of subject matter experts in standard technical areas.

Recommendation 5. The OPR conducting the review should ensure that all review outputs are documented, including any root causes that have been identified, and should provide recommendations that can be acted upon by the PM, the program management office, or other program stakeholders.

The committee notes that it is a best practice to capture lessons learned, identifying the root causes of problems and risks encountered in program management, as well as to document findings, observations, and recommendations made during a program review. Documenting the output of reviews stimulates open communication and builds an atmosphere of trust that encourages participation in future program reviews. A further benefit of documenting review outputs is that the management and execution of programs will continue to improve, as program personnel are, in effect, mentored by the expertise of the review team.

There may not be sufficient data to permit a quantitative answer to the question of whether changes in the number, content, sequence, or conduct of program reviews can help the PM execute the program more successfully. Nevertheless, the committee is confident that if the above recommendations are implemented and rigorously managed by the SAE and his or her staff, the review process will come under greater control, directly benefiting PMs and enabling the successful execution of their programs. These recommendations exemplify a continual program management learning process that builds from one review to the next.