
2 Findings and Conclusions

As summarized in Chapter 1, the committee deliberated on the results from the four data sources to arrive at the findings and conclusions contained in this chapter. The findings and conclusions are based on information derived from the survey responses, interviews with Air Force and DOD personnel and other stakeholders, earlier studies, and the committee members' expertise. The committee spent considerable time deliberating to ensure that the findings indeed represent what it had heard. The conclusions then represent a committee consensus of a generalization of the findings. The recommendations in turn (see Chapter 3) resulted from integrating and synthesizing the findings and conclusions in a way that the committee believed would be most responsive to the Statement of Task.

The matrix of reviews that was constructed to help classify the number and types of known reviews, their purpose, and their target audiences in order to identify opportunities for streamlining, integrating, and/or consolidating reviews is shown in Table 2-1, with further information included in Appendix E. The reviews are of four types (see third column of the matrix):

• Milestone/programmatic reviews. These reviews are conducted by the milestone decision authority (MDA) (the Defense Acquisition Executive (DAE) for acquisition category (ACAT) ID programs), with support from the Office of the Secretary of Defense (OSD) staff. They provide the basis for the MDA to decide whether to allow a new program to be initiated or an ongoing program to proceed to the next phase. Of the four types, reviews of this type require by far the greatest number of prereviews (reviews conducted by functional offices and intermediate management levels on the way to the MDA review). These reviews are listed in approximately the order they occur leading up to the ultimate Defense Acquisition Board (DAB) review.
• Periodic oversight reviews. These reviews are conducted at a regular frequency—annually or quarterly—to allow the DAE, the Service Acquisition Executive (SAE), or the program executive officer (PEO) to monitor progress. They are listed alphabetically.
• Ad hoc reviews. These reviews are initiated by an organization outside the program office either in response to a problem or, in the case of some major programs, periodically (e.g., by the GAO). They are listed alphabetically.
• Technical or engineering reviews. In contrast to the external reviews mentioned above, these reviews are generally conducted internally by the program itself, with some support from Service functional assets. They are a principal means by which the program office manages the technical and fiscal execution of the program. They are listed in approximately the order they occur over the life of the program.

The matrix (Table 2-1) summarizes the various types of reviews typically conducted over the lifetime of a program. It names 31 formal reviews of four types, although not all of them or even all types may be conducted for every program. At least four of the reviews require unique documents, and 10 have been identified by the committee as duplicating or partially duplicating other reviews.

The matrix does not list every possible ad hoc review. Nor does it list the prereviews or prebriefs generated by these formal reviews, since it was not possible to determine the number of such reviews or prebriefs. Because no data or metrics are required or collected on reviews by DOD in general, or by the Air Force in particular, it was not possible to determine the overlap or duplication of different reviews. (Of note, although no historic information on the number of prereviews or prebriefs is available, interview comments and the committee members' experience lead one to believe that major reviews at the level of the Joint Requirements Oversight Council (JROC), DAB, and PEO are generally accompanied by multiple (3 to 5) prebriefs or prereviews.)

In the program review process depicted in Figure 1-4, it can be seen that there are 6 MS or MDA reviews in a typical program, plus 6 overarching integrated product team (OIPT) reviews before those, plus 6 Air Force Review Board (AFRB)/acquisition strategy panel (ASP) reviews before each of those, plus 10 or 11 mandatory technical reviews at the program level, plus 4 periodic program support reviews (PSRs) for each phase of the life cycle, plus one operational test readiness assessment (OTRA) before IOT&E, plus annual configuration steering board (CSB) reviews (notionally 7), plus annual PEO-level sufficiency reviews (notionally 7), plus at least 2 technology readiness assessments/manufacturing readiness assessments (TRAs/MRAs). This suggests in excess of 50 reviews just at the program level and above, without counting the prebriefs to staff functional principals and others as noted in the preceding paragraph.
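As a rough cross-check, the notional counts quoted above can simply be tallied. The short sketch below is illustrative only and is not part of the committee's analysis; the category labels are shorthand, and the choice of 10 technical reviews is one end of the "10 or 11" range given in the text. With the "at least" and "notionally" qualifiers allowed for, the tally lands at roughly 49 to 50, consistent with the report's figure of "in excess of 50."

```python
# Illustrative tally of the notional review counts quoted in the text
# (program level and above, for a typical program).
review_counts = {
    "MS/MDA reviews": 6,
    "OIPT reviews": 6,
    "AFRB/ASP reviews": 6,
    "Mandatory technical reviews": 10,                # text says 10 or 11
    "Program support reviews (PSRs)": 4,
    "Operational test readiness assessment (OTRA)": 1,
    "Configuration steering board (CSB) reviews": 7,  # notionally annual, 7 total
    "PEO-level sufficiency reviews": 7,               # notionally annual, 7 total
    "TRAs/MRAs": 2,                                   # at least 2
}

total = sum(review_counts.values())
print(f"Base total: {total} reviews")  # 49 here; roughly 50 with 11 technical reviews

# The text also notes 3 to 5 prebriefs or prereviews for each major
# (JROC-, DAB-, or PEO-level) review, which would add substantially to this base figure.
```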

Table 2-1  Program Review Matrix

Review | Name | Type | Purpose | Frequency | OPR/Customer
ASP | Acquisition Strategy Panel | Programmatic | Formalize acquisition strategy | MS | SAF/AQ
AFROCC | Air Force Requirement for Operational Capabilities Council | Programmatic | Validate requirements | MS | A5R/CSAF, SECAF
JROC | Joint Requirements Oversight Council | Programmatic | Prioritize Joint requirements | MS | VCJCS/DAB
AFRB | Air Force Review Board | Programmatic | Ready for MS | MS | SAE or OSD
IPA | Independent Program Assessment | Programmatic | Risk assessment | MS | SMC/MDA
PSR | Program Support Review | Programmatic | Risk assessment | MS | Director SSE/USD(AT&L)
IIPT | Integrating IPT | Programmatic | Support OIPT | MS | PM/OIPT
OIPT | Overarching IPT | Programmatic | Ready for DAB | MS | Dir PSA, DASD(NII)/DAB
DAB | Defense Acquisition Board | Programmatic | Advise USD(AT&L) | MS | USD(AT&L)
CSB | Configuration Steering Board | Periodic oversight | Review requirements changes | Annual | SAE/USD(AT&L)
DAES | Defense Acquisition Executive Summary | Periodic oversight | Ongoing performance | Quarterly | PM/DUSD(A&T)
PEO/SR | PEO Sufficiency Review | Periodic oversight | Executability confidence | Annual | PEO

Table 2-1  Program Review Matrix (continued)

Review | Presenter | Stakeholders | Prereview Briefings | Product/Output | Unique Documentation? | Duplication
ASP | PM | SAF/AQ | NDA | Approve acquisition strategy plan | Yes | Partially
AFROCC | A5 staff, MAJCOM representative | Air staff, Secretariat | AF FCBs | Validate requirements | No | NDA
JROC | J8 director, Service representative | Services, warfighters | Joint Requirements Board, Panel | JROCM | Yes | No
AFRB | Requirements representative | PM, user, OSD, service, KTR | Unknown | Readiness to proceed | No | No
IPA | PM, staff | DSAB, staff | Unknown | Briefing and report support | No | Partially
PSR | PM, PO & KTR SMEs | Dir SSE, USD(AT&L), PM | Coordination meetings | Brief findings, recommendations | No | Partially
IIPT | Varies | PM, user, OSD, service, KTR | None | Report to OIPT | NDA | Partially
OIPT | PM; OSD staff | PM, PEO, OSD, Joint Staff | IIPT or equivalent, staff | Report w/ recommendations | No | No
DAB | OIPT chair, PM, others | USD(AT&L), DAB members | OIPT | Acquisition Decision Memorandum | Yes | No
CSB | PM | SAE, USD(AT&L), JCS | Unknown | Approve requirements change | Yes | NDA
DAES | PM (with SAE) | PM, SAE, OSD staff | Unknown | Action items | Yes | No
PEO/SR | PM, SMEs | PEO, PM, ACE, SMEs | Unknown | Briefing, PoPs update | No | NDA

Table 2-1  Program Review Matrix (continued)

Review | Name | Type | Purpose | Frequency | OPR/Customer
AFAA | AF Audit Agency | Ad hoc oversight | Varies | Varies | AFAA/SECAF
DOD IG | DOD Inspector General | Ad hoc oversight | Varies | Varies | DoD IG/SecDef
GAO | Government Accountability Office | Ad hoc oversight | Cost, schedule, performance | Varies | GAO/Congress
Other ad hoc | | Varies | Varies | Varies | Varies
TRA | Technology Readiness Assessment | Technical | Executability confidence | MS | DDR&E/MDA
ASR | Alternative System Review | Technical | Ready for technology development | MS A | Engineering/PM
SEAM | Systems Engineering Assessment Model | Technical | Validates SE process | Varies | AFCSE/PM
SRR | System Requirements Review | Technical | Executability confidence | One time | Engineering/PM
SDR | System Design Review | Technical | Replaced by the SFR | One time | Engineering/PM
SFR | System Functional Review | Technical | Ready for prelim design | One time | Engineering/PM
PDR | Preliminary Design Review | Technical | Ready for detailed design | One time | Engineering/PM
IBR | Integrated Baseline Review | Technical | Align program expectations | Varies | PM
LHA | Logistics Health Assessment | Technical | Logistics health | Unknown | Unknown; currently AAC
MRA | Manufacturing Readiness Assessment | Technical | Executability confidence | One time | DDR&E/MDA
CDR | Critical Design Review | Technical | Ready for fabrication | One time | Engineering/PM

Table 2-1  Program Review Matrix (continued)

Review | Presenter | Stakeholders | Prereview Briefings | Product/Output | Unique Documentation? | Duplication
AFAA | PM, staff | SECAF, staff, AFAA office | Coordination meetings | Report | No | Partially
DOD IG | PM, staff | Varies - wide-ranging | Coordination meetings | Written report w/ recommendations | No | Partially
GAO | Varies - often PM or rep | Varies - wide-ranging | Coordination meetings | Written report w/ recommendations | No | Partially
Other ad hoc | Varies | Varies | Unknown | Varies | Varies | Yes
TRA | AF ST&E | PM, S&T, DDR&E, DAE | Unknown | Tech Readiness Levels/Plan | NDA | Yes
ASR | PM, PO & KTR SMEs | PM, User, KTR | Unknown | Rationale for preferred alt | No | No
SEAM | PM/Engineering | AFCSE, PM | None | Assessment reports | No | Partially
SRR | PM/Engineering | Engineering, user, KTR | Unknown | Satisfy exit criteria | No | No
SDR | PM/Engineering | PM, contractor | Unknown | Satisfy exit criteria | No | No
SFR | PM/Engineering | Engineering, user, KTR | Unknown | Satisfy exit criteria | No | No
PDR | PM/Engineering | Engineering, user, KTR, DAE | None | Satisfy exit criteria | No | No
IBR | PM | PM, contractor | NDA | Mutual understanding of program baseline | No | Partially
LHA | Unknown | PM, user, AF logisticians | Unknown | Log/sustainment assessment | Unknown | Partially
MRA | PM/Engineering | PM, S&T | Unknown | Mfg Readiness Levels/Plan | NDA | NDA
CDR | PM/Engineering | Engineering, User, KTR, DAE | None | Satisfy exit criteria | No | No

Table 2-1  Program Review Matrix (continued)

Review | Name | Type | Purpose | Frequency | OPR/Customer
TRR | Test Readiness Review | Technical | Ready for testing | One time | Engineering/PM
PRR | Production Readiness Review | Technical | Ready for production | MS C, FRP | Engineering/PM
OTRR | Operational Test Readiness Review | Technical | Approve program OT readiness | One time | SAF/AQ
SVR | System Verification Review | Technical | Verifies performance | MS C | Engineering/PM
FCA | Functional Configuration Audit | Technical | Satisfies contract | One time | Engineering/PM
PCA | Physical Configuration Audit | Technical | Satisfies contract | One time | Engineering/PM

Review | Presenter | Stakeholders | Prereview Briefings | Product/Output | Unique Documentation? | Duplication
TRR | PM/Engineering/Test | Test IPT, User, KTR, DOT&E | None | Prepared for testing | No | No
PRR | PM/Engineering | KTR, Subs | None | Requirement met, production ready | No | No
OTRR | PM | SAF staff, DOT&E | NDA | ASAF(A) approval of program's OT readiness | Yes | Partially
SVR | PM/Engineering | User, KTR | None | Requirements met, funding adequate | No | No
FCA | PM/Engineering | PM, Engineering, KTR | None | Audit report | NDA | No
PCA | PM/Engineering | PM, Engineering, KTR | None | Audit report or completion | NDA | No

Frequency: MS, this review takes place in support of or in conjunction with each formal program milestone (A, B, C, etc.); MS A, this review takes place in support of or in conjunction with formal program milestone A; MS C, this review takes place in support of or in conjunction with milestone C; MS C, FRP, this review takes place in support of or in conjunction with milestone C and the full rate production decision.

Duplication: A review/meeting exactly the same as another review/meeting. By this definition, the use of the same material at more than one meeting does not constitute a duplication of the meeting. The degree of duplication shown in the Duplication column is subjective and is based on review documentation, interviewee comments, and survey results. Given that no review, let alone its process owner, requires any data collection on duplication, coordination, value added, and so on, the conclusions are not rigorous. Review duplication attributes: Yes, the review was judged to be a duplication; No, the review was judged not to be a duplication; Partially, the meeting was judged to be a partial duplication; and NDA, the review is too new or had no data available to allow a judgment on duplication.

Although the data and information received did not allow the committee to determine specific resources required to accomplish each review, answers to Question 1.5 in the survey and information gathered from the interviews indicated that 10-30 percent of a PM's time is spent supporting reviews. The committee found that there is little consistency in the way reviews are conducted and concluded that opinions on the contribution of specific reviews to successful acquisition varied widely.

The committee believes that the Air Force could improve the effectiveness of its program review effort and reduce the burden on PMs by thoughtfully combining and scheduling reviews. The committee looked at the policies and processes of the National Aeronautics and Space Administration (NASA), the Department of Energy (DOE), industry, and the other military services. Although review structures and practices exist, the differences in implementation make cross-community consolidation or streamlining a significant challenge. The information collected from experienced PMs in the U.S. Army and the U.S. Navy, as well as the U.S. Air Force, and from industry contributed to the set of best practices reported in the recommendations.

Reviews Have Both Benefits and Costs

The committee found that program and technical reviews have both benefits and costs. In general, reviews provide technical and programmatic support to successfully execute acquisition programs, to inform decisions, to share awareness, and to engender program advocacy. In their answers to Survey Question 2.3, PMs said that reviews facilitated program execution as well as problem discovery and resolution at all levels of the acquisition enterprise, including industry.

That said, support for the increasing number of program reviews and associated prebriefs is costly, as is the lost productivity and attention of both government and industry. For example, DODI 5000.02 of December 2008 calls for systems engineering across all phases and mandates technical reviews. The new regulation mandates competitive prototyping of the system or key system elements during the technology development (TD) phase, and a preliminary design review (PDR) must be conducted for each of the candidate designs, followed by a PDR report to be provided to the MDA either in the TD phase supporting milestone B or, if afterward, a separate MDA review during the engineering and manufacturing development (EMD) phase. Although some such overhead is undoubtedly necessary, excesses can distract PMs from their primary focus, which is managing their program's technical and business progress.

The findings and conclusions of the committee focus on two key areas: The first is the execution of reviews (sequencing, timing, participation), and the second is their planning.

Execution of Reviews (Sequencing, Timing, Participation)

The committee's investigation confirmed that the number of reviews is growing. The committee recognizes that such reviews are generally valuable for sharing knowledge and serve as "stage gates" for the governance and control of programs, including managing risk. While doing this, however, they most certainly add to program costs, and they also draw management's attention from the main program effort. Of most concern is that the proliferation of reviews does not appear to have had a positive effect on program cost and schedule outcomes.

The committee also realized that there is a significant amount of preparation and coordination for reviews, both vertically in their conduct at multiple levels of responsibility and horizontally across adjacent staff offices. As such, merely depicting the individual reviews does not sufficiently capture the amount of time and effort spent by staff in preparing for and coordinating the reviews.

The committee found that the many disparate concerns of higher-level staffs had an impact on the program manager. For ACAT I programs, many of the written-out responses to Survey Questions 4.1-4.4 described DOD staff as a stove-piped bureaucracy, where domain "czars" have purview over a breadth of programs (by virtue of the OIPT structure or their membership on the DAB) but are not horizontally integrated from the standpoint of knowledge sharing or synergy. This means not only that the PMs have to prepare separate information briefs for these higher-ups, but also that the information provided to them is not fully integrated across these domains by the OSD staff for optimal decision making by the MDA.

Note: The qualitative write-in comments from the PMs have not been included in this report because several of the comments were written in such specific detail that the authors of the comments could be identified by people familiar with their programs. This would violate the privacy protection methodology for the survey, so the data are not included.

Program and technical reviews are often not being optimally synchronized with program events. Speakers gave examples of requirements reviews being conducted after contracts had been awarded and other similar occurrences of inappropriate sequencing. The committee was briefed on emerging acquisition policy with its emphasis on pre-Milestone A activities to ensure a better understanding of alternative technical requirements. The committee believes there is ample opportunity for the Air Force and DOD to improve the timing of these reviews and even to consolidate reviews at the service staff level and below to eliminate redundancy and mitigate the associated burdens and costs.

Finding 1a. A significant share of a PM's time is spent on preparing for and participating in milestone and other reviews.

Finding 1b. The large number of reviews diverts a PM's attention from the execution of his or her program.

Finding 1c. Reviews impose significant costs on program leadership teams.

The committee interviewed a number of PMs to obtain their views on the time they spent and the costs associated with carrying out these various reviews. It also sought their assessment of how the conduct of these reviews affects their ability to carry out the day-to-day tasks associated with their programs. The PM survey also asked respondents to address these issues. In every case, those interviewed or surveyed cited significant costs in terms of money and time to carry out the reviews, and most of them also noted an adverse impact on their ability to carry out other PM responsibilities. Sixty-nine percent of survey respondents said they were working 51 or more hours per week (Survey Question 1.4) and on average were able to spend only 46 percent of that time managing their program at wing level or below (Survey Question 1.5). About 20 percent of their time was spent reporting up the chain of command above the wing level (Survey Question 1.5). In addition, many of the respondents cited the need to dedicate hundreds or thousands of program staff and contractor hours to carry out the reviews, with concomitant money and time lost from their everyday duties (based on write-in responses to Survey Questions 3.11-3.14 and 3.26-3.29).
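To make these survey percentages concrete, the short sketch below converts them into hours for a hypothetical work week. The 55-hour figure is an assumption for illustration only; the survey reports just "51 or more hours."

```python
# Hypothetical illustration of the survey figures quoted above.
hours_per_week = 55          # assumed; survey reports only "51 or more hours"
managing_share = 0.46        # time managing the program at wing level or below
reporting_share = 0.20       # time reporting up the chain above wing level

managing_hours = hours_per_week * managing_share
reporting_hours = hours_per_week * reporting_share
other_hours = hours_per_week - managing_hours - reporting_hours

print(f"Managing the program:       {managing_hours:.0f} hours")   # ~25
print(f"Reporting above wing level: {reporting_hours:.0f} hours")  # ~11
print(f"All other duties:           {other_hours:.0f} hours")      # ~19
```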

Finding 1d. Many higher-level reviews do not contribute to program execution in proportion to their expenditure of time and effort.

While those interviewed identified benefits they received from reviews that were effectively carried out, a number of survey respondents and interviewees said they received no significant benefit or improvement to program cost, schedule, or performance (CSP) from the effort. For example, 29 PMs identified what they considered the least beneficial (from the standpoint of program CSP) higher-level review they had been involved in. Eighty percent of them cited no positive impact on CSP from this "least beneficial review" (based on written-in responses to Survey Question 3.24). Some even went so far as to say that the review had actually had a negative impact on CSP by delaying program schedule or increasing program costs in the case of reviews that were ineffectively carried out (based on written-in responses to Survey Question 3.25). It should be noted that the PMs were not questioned on ancillary benefits such as sponsorship, which they may not have viewed as a direct program benefit.

Finding 1e. No one in the Air Force or OSD is responsible for monitoring the number of program and technical reviews or the workload they give rise to or their cost, effectiveness, and cumulative impact on PMs.

Based on the committee's review of the OSD and Air Force organizational structures and its discussions with senior DOD leadership, no one is responsible for monitoring the direct costs of program and technical reviews, in terms of the time, personnel usage, extra costs, and effects on contractors and the PM.

Conclusion 1. Many reviews add little value and others do not add value in proportion to the effort required. Reducing the number of such reviews or combining them could increase the time available to the PMs to more effectively manage their programs.

Finding 2. The sequencing, timing, and frequency of reviews are often not tied to the program schedule in a manner that most effectively supports the program and its execution.

Forty-three percent of survey respondents commenting on the least beneficial reviews suggested that those reviews could be more effective if they were conducted less frequently and at a more appropriate time in the program's life cycle (generally earlier) (Survey Question 2.6).

Conclusion 2. Reviews could be more effective if they were sequenced and timed to provide the information needed for program execution. Chapter 3 describes an approach for beginning to achieve this result.

Finding 3a. The program review principals, key stakeholders, and subject matter experts do not always attend program reviews.

Finding 3b. Program and technical reviews are often not attended by the right personnel or, in some cases, are attended by too many personnel.

Many survey respondents noted that more effort should be given to ensuring that the right subject matter experts and appropriate senior officials attend program reviews and that the number of attendees be limited to those who can add value to the meeting (based on responses to Survey Questions 2.6 and 3.15 and written-in responses to Survey Questions 3.30-4.4).

Conclusion 3. Required attendance at program review meetings is not clearly communicated nor is it effectively controlled.

Finding 4a. For some reviews the number of actual reviews and preparatory reviews is excessive and the reviews do not contribute value to the program's management.

Many of the PMs, both in the survey and in interviews, stated that a proliferation of meetings and premeetings was taking time away from the management of their programs (based on responses to Survey Question 2.6 and written-in responses to Survey Questions 3.10 and 3.25). Commenting on the reasons for this increase, a few PMs mentioned the Integrating Integrated Product Team (IIPT) and the NII/AT&L structure. The elimination of IIPT reviews was cited as one factor that led to the need for more individual premeetings with the Joint Staff, program management offices, and OSD, where previously only one meeting was needed. Another reason cited was the recent NII/AT&L reconfiguration that resulted in the sharing of responsibilities between these two offices. Survey respondents cited the amount of time now taken up by such premeetings. One respondent, for example, said "The problem isn't the review . . . it's the numerous premeetings needed to get to the review."

Finding 4b. Program managers are spending time on multiple reviews with similar objectives.

Many survey respondents believed that selected reviews could be combined (Survey Question 2.7).

Conclusion 4. Streamlining or combining reviews and their associated prebriefs in both the vertical and horizontal directions could increase efficiency.

PROGRAM REVIEW PLANNING

The committee found that in many cases, despite published guidance advocating proper planning, reviews were incompletely planned or conducted.

The potential value of a review cannot be fully realized if objectives are not clearly specified or if the right persons are not in attendance. What information a particular review is supposed to contain or what areas a particular review is supposed to cover are often ill defined or based on presumptions of agendas or issues of the day. Often there are no metrics for assessing the effectiveness of a review, and this can lead to perceptions of disruption and uncertain value. On the other hand, many PMs found that the program support reviews (PSRs) and the independent program assessments (IPAs) were well planned, that they added value, and that they contributed positively to program management.

PSRs and IPAs are comprehensive in nature; have well-defined processes, outcomes, and metrics; are socialized with PMs and staff; are conducted by subject matter experts; and are well documented. A review process that makes timely use of PSRs and IPAs might help to limit the number of reviews needed across the bureaucracy to gather information and make decisions.

Finding 5. The purpose, scope, information needs, key issues, and expected outcomes of many reviews have not been specified.

Survey respondents mentioned a number of ways to improve reviews. Suggestions for improvement to address the problems cited above included narrowing the review's focus or changing its charter (based on responses to Survey Questions 2.6 and 3.15 and on written-in responses to Survey Question 3.30). PMs noted that they would be asked about issues that had not been previously identified for discussion and would then be required to spend countless hours after the review trying to respond to them. They also said there was no standard approach to how the reviews should be conducted.

Conclusion 5. It is important that program review planning be accomplished in a thoughtful, purposeful manner, using a standard approach, so that expectations and outcomes are clearly communicated.

Finding 6. Reviews focus on a single system instead of on a complex system of systems of which the single system is a part. Further, reviews that attempt to address programs from the larger system-of-systems perspective are often unable to cope with the complex interfaces among programs.

Seventy percent of the ACAT I PMs who responded to the survey characterized the amount of external interface of their programs with other efforts as extensive (Survey Question 1.13, ACAT I PM responses only). This answer—combined with survey written-in responses and PM discussions with the full committee noting that some reviews did not take into account connections with and dependencies on other programs for mission accomplishment—led the committee to conclude that the current acquisition and program review process has not adapted to the evolution of simple systems into systems of systems and fails to take into consideration the additional complexity and interrelationships necessary for effective program management in this new environment (based on written-in responses to Survey Questions 3.10, 3.25, and 4.1-4.4).

Conclusion 6. Program review format and design need to reflect the greater complexity and interrelationships inherent in many current Air Force programs to ensure that a system of systems works across organizational constructs.
