Suggested Citation:"3 Recommendations." National Research Council. 2009. Optimizing U.S. Air Force and Department of Defense Review of Air Force Acquisition Programs. Washington, DC: The National Academies Press. doi: 10.17226/12673.

3 Recommendations

As discussed briefly in the section on methodology in Chapter 1, the findings and conclusions set forth in Chapter 2 serve as the basis for the recommendations contained herein, which would allow the Air Force and the DOD to achieve meaningful and constructive change. The data collected by the committee—the sum of the literature review, survey results, interview comments, and committee experience—suggest that substantial variance exists in the planning and execution of program reviews across programs. In brief, each program is different, and each review is conducted in a different manner, with different participants and different results; the same type of program review applied to two different programs may have differing results. Additionally, because neither the data nor the time available permitted a detailed, direct, one-on-one response to items 2-4 in the Statement of Task, it is difficult to make recommendations that might apply to specific program reviews. As a result, the committee decided to frame its response to the Statement of Task as a recommended approach—a set of principles that might form a core set of best practices applying to each specific program as well as to the coordination and synchronization of all reviews—with the goal of increasing the effectiveness and efficiency of program reviews and decreasing the burden on the program manager. The five recommendations apply to the execution of program reviews as addressed in the Statement of Task. Further, although there are not enough specific data to permit a quantified response to the key question raised in the Summary—namely, can changes in the number, content, sequence, or conduct of program reviews help the program manager more successfully execute the program?—the findings, interviews, and survey results gathered by the committee indicate that addressing the administrative issues surrounding reviews

can have a very positive impact on the ability of PMs to execute their programs more successfully.

Following the recommendations, the committee provides comments regarding an approach for their implementation. Table 3-1 lists the conclusions from Chapter 2.

TABLE 3-1  List of Conclusions

Conclusion 1. Many reviews add little value, and others do not add value in proportion to the effort required. Reducing the number of such reviews or combining them can increase the time available to PMs to manage their programs more effectively.

Conclusion 2. Reviews could be more effective if they were sequenced and timed to provide the information needed for program execution.

Conclusion 3. Required attendance at program review meetings is neither clearly communicated nor effectively controlled.

Conclusion 4. Streamlining or combining reviews and their associated prebriefs in both the vertical and horizontal directions could increase efficiency.

Conclusion 5. Program review planning should be accomplished in a thoughtful, purposeful manner with a standard approach in order to firmly address the need to communicate expectations and outcomes.

Conclusion 6. Program review format and design need to reflect the greater complexity and interrelationships inherent in many current Air Force programs, to ensure that a system of systems works across organizational constructs.

Recommendation 1. To ensure that they possess a common understanding of the intent, scope, and output of reviews, the Air Force acquisition and requirements communities at all levels should engage in timely planning for program reviews that results in clear, comprehensive, measurable objectives.

This recommendation, based principally on Conclusions 1, 5, and 6, reflects the committee's discussion and its desire to ensure that each program review is planned and conducted with the thoroughness and precision needed to achieve success. The committee acknowledges the significant challenges posed by Conclusion 6 in terms of the comprehensiveness of the planning and processes needed to ensure proper system-of-systems integration; this may well be an area for future study.

To execute this recommendation, a governance process directed by the Service Acquisition Executive (SAE) should be implemented to synchronize and execute reviews at each level of organization. The governance structure must have an owner of the review process who is responsible for all reviews captured within that structure, for all policies issued in connection with those reviews, and for controlling the proliferation of reviews, including pre- and postreview mechanisms.

Engagement in program review planning is not consistent throughout the Air Force. Such planning must be deliberate and should be communicated to the PM

and the office of primary responsibility (OPR) for program reviews well in advance of the reviews. Direction for a review should include, at a minimum, the stated objective(s) of the review; the metrics for those objectives; the materials the PM is expected to supply to the review team, including supporting material; and the criteria for success of the program review. To this end, the committee suggests that the SAE review the application of leading industry standards developed by internationally recognized project and program management associations. Such standards have met the test of benchmarking in many industries, are globally applicable, and can be easily adapted.

Following the program review, the chair of the review team should issue a report giving the PM the findings that require corrective action and recommendations for further action. The report should also compile lessons learned from carrying out the review. To complete the review process, the PM should file a closeout report with the chair of the review detailing implementation of the corrective action plan and recommendations. The closeout report should note open items, closed items, items still in process, and project issues or risks that have been encountered or are predicted as of the time the report is filed. The report should be taken into account in any follow-on program review.

The committee further recommends that the SAE track the various metrics outlined by the review committees to determine whether the reviews are having a significant impact on program performance. Such data could be used to improve the program review process as well as to focus reviews on areas of concern.

Recommendation 2. The SAE should develop a plan for the timely, synchronized execution of all program reviews. The plan should align with program decision milestones and decision points.
This recommendation is based principally on Conclusions 2, 4, and 6. Its goal is to coordinate and synchronize the array of program reviews both horizontally and vertically across the department.

The number of reviews preceding decision points and milestones should be minimized, and those that are held should be overseen to ensure that their content is pertinent. This will reduce the burden on the program and ensure that the reviews bring value to it. In some cases, aligning these reviews may allow them to be consolidated and moved to a more appropriate time in the life cycle of the program. Program reviews should be aligned with decision points and milestones to ensure better program execution. Properly synchronized reviews should bring fewer schedule delays and reduced costs, since early identification of issues and risks should allow the PM to institute better planning and handling strategies. The elimination of some reviews or the combination of others will reduce costs as well as the burden on the PM and program management staff,

allowing greater focus on the program and its execution. This idea is elaborated in the next section, "Implementing the Recommendations."

Recommendation 3. Before creating or approving a new review, the SAE should compare its objectives with those of existing reviews to determine whether one of the latter could accomplish or incorporate those objectives.

This recommendation reflects discussions based on Conclusions 1, 2, 4, and 5 and the committee's sense that the burden on the PM is exacerbated by additional reviews that arise during the course of program execution and are not necessarily coordinated or synchronized with previous or future reviews. Evidence from the survey and the interviews indicates that many PMs believe that the number of program reviews is a burden to their programs and that adding more would only increase that burden. Recommendation 1 will keep this from happening, as will the assessment process called for in Recommendation 3, part of which should determine whether the objective of the proposed review could be achieved within an existing review. The assessment process should also determine whether broadening the stakeholders for a given review would do more to accomplish an objective than instituting additional program reviews. All review process determinations should keep Recommendation 2 in mind to ensure that reviews are timely, synchronized, and add value to the program.

The same criteria established for a review should be applied to any and all prereviews, including those requested before reviews required by the Program Executive Office, the Air Force, or the Office of the Secretary of Defense (OSD). If possible, all stakeholders should work together to consolidate the prereview process. The committee understands that many stakeholders want the program review to be successful and that having some prereviews might be worthwhile.
To that end, the committee encourages the Air Force to establish guidance for managing a prereview so that it has minimal impact on the schedule and cost of the program and on the program management staff. The committee also recommends that the Air Force encourage OSD to do the same, particularly regarding prebriefs.

Recommendation 4. The OPR should staff the review team with recognized subject matter experts.

This recommendation reflects committee discussions related to Conclusions 3 and 6. The committee was somewhat surprised to learn that many reviews are conducted without the "right" people present. This raises two issues: first, recognized subject matter experts need to be identified; second, the experts must participate in the program review for their expertise to be of full value to the program.

To act on this recommendation, the OPR should maintain a roster of experts

in standard technical areas, taking into account that backups will be needed for any given program review to guarantee that an expert is available to attend. Further, a process guidance document should be prepared to provide direction for the selection, formation, and use of subject matter expert teams. The objective(s) of the program review should be considered when staffing the review team. Moreover, it is essential to ensure continuity of effort—that is, the availability of the subject matter experts not only during the review but also for the periods before and after it.

Recommendation 5. The OPR conducting the review should ensure that all review outputs are documented, including root causes if any have been identified, and provide recommendations that can be acted upon by the PM, the program management office, or other program stakeholders.

This recommendation is based on discussion related to Conclusion 5 and reflects the committee's view on the importance of proper documentation for follow-through and for sharing lessons learned across programs.

The output of program reviews is sometimes not fully captured. The committee notes that it is a best practice to capture lessons learned, identifying the root causes of problems and the risks encountered in program management, as well as to document findings, observations, and recommendations made during a program review. Best practices give the PM and his or her management staff a roadmap to improvement or recovery. They also provide access to technical experts who can lend their knowledge at critical junctures in program development and execution, carry out monitoring and reporting functions, and serve as a vehicle for transferring and disseminating the body of lessons learned and the knowledge of senior Air Force technical and managerial officials.
Documenting the output of reviews stimulates open communication and builds an atmosphere of trust that will lead to participation in future program reviews. Documented feedback will mean that decisions can be tracked and implemented as well as communicated to the decision makers. Further, the management and execution of programs will continue to improve as program personnel learn from the experience of the review team.

The committee recommends two ways to capture the information conveyed during the review. The first is for the review team to write a report about the review. The second is to create a database for storing lessons learned and sharing them with the rest of the Air Force and others. The database would allow the lessons learned to benefit an audience beyond the immediate PM, and it would begin the virtuous cycle of review and improvement that the Air Force is seeking. The committee suggests that this database of lessons learned be owned and administered by the SAE. To be useful, it needs to be searchable and updated regularly.
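The report does not prescribe a design for the lessons-learned database, so the following is only an illustrative sketch: the record fields, the keyword-search approach, and the example entry are all assumptions made here, not details drawn from the committee's text. It shows the minimal shape such a searchable store might take.

```python
from dataclasses import dataclass, field

@dataclass
class LessonLearned:
    """One entry in a hypothetical lessons-learned database (field names are illustrative)."""
    program: str             # program examined by the review
    review_type: str         # e.g., "CDR" or "PEO/SR" (example values only)
    root_cause: str          # root cause identified during the review
    recommendation: str      # corrective action recommended to the PM
    tags: list[str] = field(default_factory=list)

class LessonsStore:
    """Minimal in-memory store supporting the keyword search the committee calls for."""
    def __init__(self) -> None:
        self._entries: list[LessonLearned] = []

    def add(self, entry: LessonLearned) -> None:
        self._entries.append(entry)

    def search(self, keyword: str) -> list[LessonLearned]:
        # Case-insensitive match against root cause, recommendation, or tags.
        kw = keyword.lower()
        return [e for e in self._entries
                if kw in e.root_cause.lower()
                or kw in e.recommendation.lower()
                or any(kw in t.lower() for t in e.tags)]

store = LessonsStore()
store.add(LessonLearned("Program A", "CDR",
                        root_cause="Immature interface specification",
                        recommendation="Baseline interface documents before PDR",
                        tags=["interfaces", "requirements"]))
matches = store.search("interface")
```

A real implementation would sit on a proper database with full-text indexing, access control, and regular update procedures; the sketch only illustrates the record shape and the search capability the recommendation asks for.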

In summary, the committee believes that implementing these five recommendations will improve the efficiency and effectiveness of program reviews. Together, the recommendations form a "gold standard" for the conduct of reviews.

Implementing the Recommendations

As illustrated in Chapter 1, a typical ACAT I program review structure might look like Figure 3-1. Note that Figure 3-1 reflects a milestone-driven process and does not show all of the prereview briefings associated with each review. The recently revised DOD Instruction 5000.02 does not appreciably change the focus on milestones, but it introduces more oversight reviews with the aim of achieving better acquisition results.

As stated earlier, the committee found that information on costs, manpower, effort, content, and objectives—although critical for the assessment of each individual review—was not available at every level. Despite the sparseness of information, the committee's interviews, findings, and survey results strongly suggest that better administration of the review process—including synchronizing, combining, and aligning reviews—would alleviate the burden on the PM and help him or her achieve program success. Given the existing DOD organizational hierarchy and culture, the committee believes that the areas with the most potential for consolidation and streamlining are the various external reviews and assessments—such as the configuration steering boards (CSBs), PEO sufficiency reviews (PEO/SRs), and prebriefs to the OSD staff—that are carried out for, say, a year before a milestone or other major review. The content of CSBs and PEO/SRs might be combined with phase milestone reviews or intermediate phase reviews.
As for prebriefs, reinstating the IIPT and strictly enforcing (by the Air Force and OSD) limits on OSD staff prebriefs to that forum would do much to decrease the number of prebriefs before the DAB milestone review and other DAB-level reviews.

Figure 3-2 represents a nominal approach to synchronizing and integrating a series of program reviews aligned to major program milestones. Given the sheer number and frequency of program and technical reviews, combining at least some of them is seen by the committee as a way to improve effectiveness and efficiency while still satisfying the decision support needs of multiple stakeholders. The committee believes that combining and synchronizing reviews in this way should significantly improve the efficiency and effectiveness of program management (and governance).
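The consolidation idea behind Figure 3-2 can be caricatured in code. In this sketch, the review names, the 90-day window, and the merge rule are all assumptions made for illustration, not committee policy; it only shows how reviews clustered shortly before a milestone might be folded into a single combined session.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Review:
    name: str
    day: int  # program-schedule day on which the review is held

def consolidate(reviews: list[Review], milestone_day: int, window: int = 90) -> list[Review]:
    """Merge reviews falling within `window` days before a milestone into one
    combined session held on the latest of their original dates (illustrative rule)."""
    inside = [r for r in reviews if milestone_day - window <= r.day < milestone_day]
    outside = [r for r in reviews if r not in inside]
    if len(inside) <= 1:
        return sorted(reviews, key=lambda r: r.day)
    combined = Review("/".join(r.name for r in sorted(inside, key=lambda r: r.day)),
                      max(r.day for r in inside))
    return sorted(outside + [combined], key=lambda r: r.day)

# Hypothetical schedule: three external reviews crowd the months before a milestone.
schedule = [Review("CSB", 300), Review("PEO/SR", 340), Review("OIPT prebrief", 350)]
merged = consolidate(schedule, milestone_day=365)
```

The actual decision about which reviews can share a session rests on their objectives and stakeholders, as Recommendations 2 and 3 discuss; the window-based rule here is merely the simplest stand-in for that judgment.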

[Figure 3-1  DOD-Air Force milestone and program review process: OSD/Joint Staff reviews (OIPT, PSR, JROC), Air Force reviews (ASP, AFRB, sufficiency reviews, configuration steering boards, AFROCC), and program execution reviews (SRR, ASR, SFR, PDR, CDR, TRR, SVR/FCA, PCA, and others) aligned to the DAB milestones across the acquisition phases. For acronyms, see the list following the Table of Contents. SOURCE: Adapted from Janet Hassan, Chief, Acquisition Process Office, "Oversight, command and control (OC2)," presentation to the committee on May 7, 2008.]

[Figure 3-2  Areas for potential consolidation and streamlining: a modification of Figure 3-1 marking the reviews that could be combined or eliminated. SOURCE: Committee generated; modification of Figure 3-1.]

