3
Recommendations

As discussed briefly in the section on methodology in Chapter 1, the findings and conclusions set forth in Chapter 2 serve as the basis for the recommendations contained herein that would allow the Air Force and the DOD to achieve meaningful and constructive change. The data collected by the committee—the sum of the literature review, survey results, interview comments, and committee experience—suggest that substantial variances exist in the planning and execution of program reviews as they apply to multiple programs. In brief, each program is different and each review is conducted in a different manner, with different participants and different results. The same type of program review applied to two different programs may have differing results. Additionally, since neither the data nor the time available permitted a detailed, direct, one-on-one response to items 2-4 in the Statement of Task, it is difficult to make recommendations that might apply to specific program reviews.

As a result, the committee decided to focus its response to the Statement of Task in the form of a recommended approach—a set of principles that might form a core set of best practices that applies to each specific program as well as to the coordination and synchronization of all reviews—with the goal of increasing the effectiveness and efficiency of program reviews and decreasing the burden on the program manager. The five recommendations apply to the execution of program reviews as addressed in the Statement of Task. Further, although there are not enough specific data to permit a quantified response to the key question raised in the summary—namely, Can changes in the number, content, sequence, or conduct of program reviews help the program manager more successfully execute the program?—the findings, interviews, and survey results gathered by the committee indicate that addressing the administrative issues surrounding reviews



Copyright © National Academy of Sciences. All rights reserved.





can have a very positive impact on the ability of PMs to execute their programs more successfully.

Following the recommendations, the committee provides comments regarding an approach for their implementation. Table 3-1 lists the conclusions from Chapter 2.

TABLE 3-1 List of Conclusions

1. Many reviews add little value, and others do not add value in proportion to the effort required. Reducing the number of such reviews or combining them can increase the time available to PMs to manage their programs more effectively.
2. Reviews could be more effective if they were sequenced and timed to provide the information needed for program execution.
3. Required attendance at program review meetings is neither clearly communicated nor effectively controlled.
4. Streamlining or combining reviews and their associated prebriefs in both the vertical and horizontal directions could increase efficiency.
5. Program review planning should be accomplished in a thoughtful, purposeful manner, using a standard approach, so that expectations and outcomes are clearly communicated.
6. Program review format and design need to reflect the greater complexity and interrelationships inherent in many current Air Force programs to ensure that a system of systems works across organizational constructs.

Recommendation 1. To ensure that they possess a common understanding of the intent, scope, and output of reviews, the Air Force acquisition and requirements communities at all levels should engage in timely planning for program reviews that results in clear, comprehensive, measurable objectives.

This recommendation, based principally on Conclusions 1, 5, and 6, reflects the committee's desire to ensure that each program review is planned and conducted with the thoroughness and precision needed for success. The committee acknowledges the significant challenges posed by Conclusion 6 in terms of the comprehensiveness of planning and processes needed to ensure proper system-of-systems integration; this may well be an area for future study.

To execute this recommendation, a governance process directed by the Service Acquisition Executive (SAE) should be implemented to synchronize and execute reviews at each level of organization. The governance structure must have an owner of the review process who is responsible for all reviews captured within that structure, for all policies issued in connection with those reviews, and for controlling the proliferation of reviews, including pre- and postreview mechanisms.

Engagement in program review planning is not consistent throughout the Air Force. Such planning must be deliberate and should be communicated to the PM

and the office of primary responsibility (OPR) for program reviews well in advance of the reviews. Direction for a review should include, at a minimum, the stated objective(s) of the review; the metrics for those objectives; the materials the PM is expected to supply to the review team, including supporting material; and the criteria for success for the program review.

To this end, the committee suggests that the SAE review the application of leading industry standards developed by internationally recognized project and program management associations. Such standards have met the test of benchmarking in many industries, are globally applicable, and can be easily adapted.

Following the program review, the chair of the review team should issue a report giving the PM the findings that require corrective action and recommendations for further actions. The report should also compile lessons learned from carrying out the review. To complete the review process, the PM should file a closeout report with the chair of the review detailing implementation of the corrective action plan and recommendations. The closeout report should note open items, closed items, items still in process, and project issues or risks that have been encountered or are predicted when the report is filed. The report should be taken into account in any follow-on program review.

The committee further recommends that the SAE track the various metrics outlined by the review committees to determine whether the reviews are having a significant impact on program performance. Such data could be used to improve the program review process as well as to focus reviews on areas of concern.

Recommendation 2. The SAE should develop a plan for the timely, synchronized execution of all program reviews. The plan should align with program decision milestones and decision points.

This recommendation is based principally on Conclusions 2, 4, and 6. Its goal is to coordinate and synchronize the array of program reviews both horizontally and vertically across the department.

The number of reviews preceding the decision points and milestones should be minimized, and those that are held should be overseen to ensure that their content is pertinent. This will reduce the burden on the program and ensure that the reviews bring value to it. In some cases, aligning those reviews may allow them to be consolidated and moved to a more appropriate time in the life cycle of the program. Aligning program reviews with decision points and milestones supports better program execution. Properly synchronized reviews should bring fewer schedule delays and reduce costs, since early identification of issues and risks should allow the PM to institute better planning and handling strategies.

The elimination of some reviews or the combination of others will reduce costs as well as the burden on the PM and the program management staff,

allowing greater focus on the program and its execution. This idea is elaborated in the next section, "Implementing the Recommendations."

Recommendation 3. Before creating or approving a new review, the SAE should compare its objectives with those of existing reviews to determine whether one of the latter could accomplish or incorporate those objectives.

This recommendation reflects discussions based on Conclusions 1, 2, 4, and 5 and the committee's sense that the burden on the PM is exacerbated by additional reviews that arise during the course of program execution and are not necessarily coordinated or synchronized with previous or future reviews. Evidence from the survey and the interviews indicates that many PMs believe that the number of program reviews is already a burden to their programs and that adding more reviews would only increase that burden. Recommendation 1 will help keep this from happening, as will the assessment process called for in Recommendation 3, part of which should determine whether the objective of a proposed review could be achieved within an existing review. The assessment process should also determine whether broadening the stakeholders for a given review would do more to accomplish an objective than instituting additional program reviews. All review process determinations should keep Recommendation 2 in mind to ensure that reviews are timely, synchronized, and add value to the program.

The same criteria established for a review should be applied to any and all prereviews, including those requested before reviews required by the Program Executive Office, the Air Force, or the Office of the Secretary of Defense (OSD). If possible, all stakeholders should work together to consolidate the prereview process. The committee understands that many stakeholders want the program review to be successful and that having some prereviews might be worthwhile. To that end, the committee encourages the Air Force to establish guidance for managing prereviews so that they have minimal impact on the schedule and cost of the program and on the program management staff. The committee also recommends that the Air Force encourage OSD to do the same, particularly regarding prebriefs.

Recommendation 4. The OPR should staff the review team with recognized subject matter experts.

This recommendation reflects committee discussions related to Conclusions 3 and 6. The committee was somewhat surprised to learn that many reviews are conducted without the "right" people present. This raises two issues: first, recognized subject matter experts need to be identified; second, the experts must participate in the program review for their expertise to be of full value to the program. To act on this recommendation, the OPR should maintain a roster of experts

in standard technical areas, taking into account that backups will be needed for any given program review to guarantee that an expert is available to attend. Further, a process guidance document should be prepared to provide direction for the selection, formation, and use of subject matter expert teams. The objective(s) of the program review should be considered when staffing the review team. Moreover, it is important to ensure continuity of effort; that is, the subject matter experts should be available not only during the review but also for the periods before and after it.

Recommendation 5. The OPR conducting the review should ensure that all review outputs are documented, including root causes if any have been identified, and provide recommendations that can be acted upon by the PM, the program management office, or other program stakeholders.

This recommendation is based on discussion related to Conclusion 5 and reflects the committee's view on the importance of proper documentation for follow-through and for sharing lessons learned across programs. The output of program reviews is sometimes not fully captured. The committee notes that it is a best practice to capture lessons learned, identifying the root causes of problems and risks encountered in program management, as well as to document findings, observations, and recommendations made during a program review. Such practices give the PM and his or her management staff a roadmap to improvement or recovery. They also provide access to technical experts who can lend their knowledge at critical junctures in program development and execution, carry out monitoring and reporting functions, and serve as a vehicle for transferring and disseminating the body of lessons learned and the knowledge of senior Air Force technical and managerial officials.

Documenting the output of reviews stimulates open communication and builds an atmosphere of trust that will lead to participation in future program reviews. Documented feedback will mean that decisions can be tracked and implemented as well as communicated to the decision makers. Further, the management and execution of programs will continue to improve as program personnel learn from the experience of the review team.

The committee recommends two ways to capture the information conveyed during the review. The first is for the review team to write a report about the review. The second is to create a database for storing lessons learned and sharing them with the rest of the Air Force and others. The database would allow the lessons learned to benefit an audience beyond the immediate PM, and it would begin the virtuous cycle of review and improvement that the Air Force is seeking. The committee suggests that this database of lessons learned be owned and administered by the SAE. To be useful, it needs to be searchable and updated regularly.
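To make the idea of a searchable lessons-learned database concrete, the sketch below shows one minimal way such records and keyword search could be structured. This is purely illustrative: the report does not prescribe any implementation, and every name, field, and the sample record here are hypothetical, not drawn from the committee's work.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class LessonLearned:
    """One record captured at review closeout (all fields are hypothetical)."""
    program: str
    review_type: str
    review_date: date
    root_cause: str
    recommendation: str
    tags: list[str] = field(default_factory=list)


class LessonsDatabase:
    """Minimal in-memory store; a real system would add persistence,
    access control, and regular update procedures as the report suggests."""

    def __init__(self) -> None:
        self._records: list[LessonLearned] = []

    def add(self, record: LessonLearned) -> None:
        self._records.append(record)

    def search(self, keyword: str) -> list[LessonLearned]:
        """Case-insensitive match against root cause, recommendation, and tags."""
        kw = keyword.lower()
        return [
            r for r in self._records
            if kw in r.root_cause.lower()
            or kw in r.recommendation.lower()
            or any(kw in t.lower() for t in r.tags)
        ]


db = LessonsDatabase()
db.add(LessonLearned(
    program="Program X",            # hypothetical sample record
    review_type="CDR",
    review_date=date(2008, 5, 7),
    root_cause="Late interface definition between subsystems",
    recommendation="Baseline interface control documents before PDR",
    tags=["interfaces", "schedule"],
))
matches = db.search("interface")    # finds the sample record above
```

A design along these lines would let lessons benefit programs beyond the one reviewed, since any PM could query past root causes by keyword before a milestone review.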

In summary, the committee believes that implementing these five recommendations will improve the efficiency and effectiveness of program reviews. Together, the recommendations form a "gold standard" for the conduct of reviews.

IMPLEMENTING THE RECOMMENDATIONS

As illustrated in Chapter 1, a typical ACAT I program review structure might look like Figure 3-1. Note that Figure 3-1 reflects a milestone-driven process and does not show all of the prereview briefings associated with each review. The recently revised DOD Instruction 5000.02 does not appreciably change the focus on milestones, but it introduces more oversight reviews with the aim of achieving better acquisition results.

As stated earlier, the committee found that information on costs, manpower, effort, content, objectives, and so on, although critical for the assessment of each individual review, was not available at every level. Despite the sparseness of information, the committee's interviews, findings, and survey results strongly suggest that better administration of the review process, including synchronizing, combining, and aligning reviews, would alleviate the burden on the PM and help him or her achieve program success. Given the existing DOD organizational hierarchy and culture, the committee believes that the areas with the most potential for consolidation and streamlining are the various external reviews and assessments, such as the configuration steering boards (CSBs), PEO sufficiency reviews (PEO/SRs), and prebriefs to the OSD staff, that are carried out for, say, a year before a milestone or other major review. The content of CSBs and PEO/SRs might be combined with phase milestone reviews or intermediate phase reviews. As for prebriefs, the reinstatement of the IIPT and strict enforcement (by the Air Force and OSD) of limits on OSD staff prebriefs to that forum would do much to decrease the number of prebriefs before the DAB milestone review and other DAB-level reviews.

Figure 3-2 represents a nominal approach to synchronizing and integrating a series of program reviews aligned to major program milestones. Given the sheer number and frequency of program and technical reviews, combining at least some of them is seen by the committee as a way to improve effectiveness and efficiency while still satisfying the decision support needs of multiple stakeholders. The committee believes that combining and synchronizing reviews in this way should significantly improve the efficiency and effectiveness of program management (and governance).

[Figure 3-1 appears here: a chart mapping OSD and Joint Staff reviews (OIPT, PSR, JROC), Air Force reviews (ASP, AFRB, AFROCC, sufficiency reviews, configuration steering boards), and program execution reviews (SRR, ASR, SFR, PDR, CDR, TRR, OTRR, TRA, IBR, LHA, MRA, SVR/FCA, PCA, PRR) against the acquisition phases (Concept Refinement, now MSA; Technology Development; System Development and Demonstration, now EMD; Production and Deployment) and the DAB reviews and milestones (CD, now MDD; A; B; DRR, now PCDRA; C; FRP).]

FIGURE 3-1 DOD-Air Force milestone and program review process. For acronyms, see the list following the Table of Contents. SOURCE: Adapted from Janet Hassan, Chief, Acquisition Process Office, "Oversight, command and control (OC2)," presentation to the committee on May 7, 2008.

[Figure 3-2 appears here: the same milestone and review structure as Figure 3-1, with marks indicating the reviews that are candidates for consolidation and streamlining.]

FIGURE 3-2 Areas for potential consolidation and streamlining. SOURCE: Committee generated; modification of Figure 3-1.
