Appendix D

Survey

To obtain data to help it accomplish its tasks, the committee developed and conducted a survey of Air Force PMs. A copy of the survey in its entirety follows. The main purpose of the survey was to expand the committee's reach beyond the limited number of people it had contacted directly and to obtain additional quantitative data. The survey data were yet another form of information to augment what the committee members had learned from their research, interviews, and personal experience. The committee employed a multistep process to produce the final survey:

1. The initial series of survey questions was developed from inputs from former government program managers and senior consultants with relevant DOD experience.
2. The draft set of survey questions was discussed with current Air Force PMs and senior functional support staff at one Air Force product center, and the questions were refined.
3. A survey expert from the NRC provided general guidelines on the conduct of the survey and data protection statements, on ensuring that the survey questions were objectively stated and structured to encourage survey takers to complete the survey, and on incorporating human factor considerations. The draft survey was improved using these guidelines. The NRC expert also provided advice on the approval process for surveys that are part of an NRC-administered study. This advice was followed to obtain NRC Institutional Review Board approval of the survey.
4. Survey format feedback and Air Force survey approval process information were also provided by the Air Force Manpower Agency Air Force Survey Office, which is responsible for approval of surveys of Air Force personnel. This feedback was also used to refine the survey and to plan the schedule for survey approval.
5. A refined draft survey was provided to several current Air Force PMs of various grades, who were asked to check the clarity and pertinence of the questions and to estimate how long the survey would take to complete. This feedback was used to further revise the survey questions.
6. The draft survey was provided to the full committee membership individually for review, and the comments were used to further refine and streamline the survey.
7. The near-final survey was reviewed by an NRC survey expert, who helped to clarify some questions and eliminate others in order to reduce the time needed to complete the survey while preserving the potential to collect as much useful data as possible.

After survey development had been completed, the final version of the survey was submitted in parallel to the NRC Institutional Review Board and to the Air Force formal survey approval process. Both the NRC and the Air Force approved the survey as it was submitted, and it received an official Air Force Survey Number (USAF SCN 08-045) and an expiration date (July 18, 2009). The next step was distribution to the intended survey population—that is, to Air Force PMs. Several steps remained:

1. Each of the four Air Force product centers (the Aeronautical Systems Center, the Air Armament Center, the Electronic Systems Center, and the Space and Missile Systems Center) was tasked to provide a list of the PMs employed there. The list was to include all the Acquisition Category (ACAT) I PMs at the center and a sampling of ACAT II and ACAT III PMs. Each center provided the PM names and e-mail addresses, which were entered into the Web-based survey tool.
2. Each of the Program Executive Officers (PEOs) at the four product centers was notified of the survey and invited to participate, both because of their program management experience and to ensure that they were aware of the questions being asked of their PMs. Next, the PMs identified at the centers were invited by e-mail to take the survey and given a direct link to the Web-based survey tool.
3. To maximize participation in the survey, NRC staff used the Web-based tool to send reminders to the PMs. Also, on the advice of the NRC survey expert, the committee's survey subgroup extended the survey window in conjunction with a final word of encouragement to the PMs who had not yet taken the survey. The survey data collection period was closed out on August 7, 2008.
4. When the survey was developed, criteria had been established to gauge its success from the standpoint of how the data would be used. Four criteria were developed in consultation with the NRC expert: an acceptable number of responses from each product center; an acceptable percentage of responses from the most senior (ACAT I) PMs; an acceptable percentage of overall PM responses; and an acceptable number of responses for each review that is specifically evaluated in the report. Once the time for responding to the survey had run out, the results were reviewed against these success criteria to judge whether each had been met.

The results were then reviewed by the data collection subgroup of the committee, and specific reports were broken out, depending on the particular issue or topic of interest, and provided to the full committee for its use. The specific results of the survey that gave rise to a particular committee finding are discussed in the analysis section under the pertinent finding. The quantitative survey results (multiple choice) are shown next, for all the PMs who responded to the survey. The qualitative written-in (essay-type) comments from the PMs have not been included in this appendix because several of them gave so much detail that their authors could be identified by people familiar with their programs, violating their privacy.

Because the committee recognized the challenges of constructing a survey and reporting its results (such as bias, demographics, and numbers of responses), it sought and received professional support from the NRC in, among other things, devising criteria to judge whether there was sufficient information to permit meaningful analysis. The committee also made promises to the Air Force regarding the use and anonymity of the data. Although there were more responses from ESC than from the other centers, when the data were partitioned in various ways the overall results showed significant consistency across the four centers. With respect to potential survey bias, the committee considered the possibility that disgruntled (or "dissatisfied") PMs might be more likely to respond. However, the responses from the PMs were thoughtful and balanced, and the balance of positive and negative comments on the survey was very much in line with interview comments and discussions.

A final note about the use of the survey data: the main use was to weigh and compare the positive and negative perceptions of program reviews and to suggest how the overall review process, as well as individual reviews, could be made more effective from the perspective of the PMs. The specific response breakout percentages for any individual question were seldom the focus; the relative balance was of more interest to the committee in most cases. No finding, conclusion, or recommendation of the committee is based solely on survey data; rather, the findings represent what the committee heard from all its sources.

Survey and Response Data

Intro Page

Purpose of Survey: The primary purpose of this survey is to collect information from AF program managers on how much time/effort is uniquely spent preparing for, participating in, and following up on tasks from higher level AF and OSD reviews that would not otherwise have had to be spent for the purpose of good program management. The study committee is also interested in your assessments, both positive and negative, of the higher-level reviews you have participated in and any changes you would recommend. Collecting this information will help the committee to determine how to respond to the SAF/AQR-sponsored study objective to "Identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase the cost-effectiveness and to lessen workforce impact of the reviews as a whole." The entire Statement of Task for this study can be found at http://www8.nationalacademies.org/cp/projectview.aspx?key=48922 for reference.

Data Protection Statement: The detailed data collected in this survey will be viewed only by the three committee members assigned to collect the data (Randy Weidenheimer, Richard Szafranski, and Allan Burman), the chairman and vice-chairman of the study committee (Rand Fisher and Dan Stewart, respectively), and National Academy of Sciences professional staff members (Jim Garcia, Enita Williams, and Kamara Brown). Any reporting of the survey results will be at the summary level, with information related to specific people or programs removed. If the study group decides that any direct quotations from write-in sections of the survey would be useful to illustrate specific points, then the committee will attribute the quote to "an AF program manager" and remove all identifying information (and will confirm with the author that this has been done satisfactorily).

Survey Instructions, Structure, and Statement of Task

Instructions for Survey: Please complete all questions in the survey, marking questions "N/A" as appropriate. If necessary, you can save a partially completed survey and return to it later to answer the remaining questions.

Structure of Survey:
Section 1—Demographic Data Section—information on program manager and program
Section 2—Program Activity Overview Section—information on pertinent external reviews/reporting accomplished by the program
Section 3—Questions on Specific Reviews—information on time/effort spent on specific reviews/reporting accomplished by each program manager taking the survey
Section 4—Optional Section to Comment on Streamlining/Tailoring/Integrating/Consolidating Opportunities

Section 1—Demographic Data

Instructions: Please complete all questions in Section 1.

1.0 At which product center do you currently work?

    AAC: 9
    ASC: 17
    ESC: 41
    SMC: 12
    Other: 2
    Answered question: 81
    Skipped question: 2

1.1 How long have you been a program manager in your current position?

    Less than 6 months: 23
    6 months but less than 1 year: 8
    1 year but less than 2 years: 21
    2 years but less than 3 years: 19
    3 years or more: 12
    Answered question: 83
    Skipped question: 0

1.2 Including the time in your current job, how many total years of experience do you have performing the function of a program manager (PM), whether or not this was your official title?

    Less than 1 year: 0
    1 year but less than 3 years: 7
    3 years but less than 5 years: 8
    5 years but less than 7 years: 14
    7 years but less than 15 years: 26
    15 or more years: 28
    Answered question: 83
    Skipped question: 0

1.3 How much acquisition experience do you have? For the purposes of this study, consider time spent in program offices as well as staff assignments that worked with the requirements definition process, the planning/programming/budgeting process, or the acquisition policy/governance process. Also include any time spent working in industry in a job equivalent to the government jobs identified above.

    Less than 1 year: 0
    1 year but less than 3 years: 5
    3 years but less than 5 years: 3
    5 years but less than 7 years: 7
    7 years but less than 15 years: 23
    15 or more years: 45
    Answered question: 83
    Skipped question: 0

1.3.a Of the total time stated in 1.3, how much was spent in a System Program Office (SPO)?

    Less than 1 year: 0
    1 year but less than 3 years: 8
    3 years but less than 5 years: 13
    5 years but less than 7 years: 12
    7 years but less than 15 years: 36
    15 or more years: 13
    Answered question: 82
    Skipped question: 1

1.4 How many hours do you work, on average, each week?

    40-45 hours: 7
    46-50 hours: 18
    51-55 hours: 20
    56-60 hours: 17
    61-65 hours: 8
    66-70 hours: 7
    71 or more hours: 3
    Other: 1
    Answered question: 81
    Skipped question: 2

1.5 Estimate the percentage of your time you spend on the following activities each week. The sum of all fields should total "100," including the write-in "Other" field at the bottom of the list. You must enter the number "0" in any activity field—including "Other"—that is not applicable. For other, please write in applicable examples in the field. (Values are the average response.)

    Personnel Activities (e.g., performance reports, hiring actions, recognition and promotion ceremonies, career counseling/mentoring, etc.): 12.60
    Administrative Activities (e.g., Center and Wing staff meetings, facility issues, security and computer training, etc.): 13.43
    Military Training (e.g., physical fitness, self-aid buddy care, LOAC training, exercise support, etc.): 6.34
    Program Management—above-the-Wing level activities (e.g., verbal and written reporting to chain-of-command beyond the Wing, including PEO, HQ AF, and OSD reviews/reports): 19.56
    Program Management—Wing-level and below activities (e.g., including gov't-only meetings as well as interactions with the contractors): 46.02
    Other Activities: 5.85
    Answered question: 82
    Skipped question: 1

1.6 On average each week, what percentage of your time is spent in direct contact with your contractors? This includes talking on the phone and on VTCs as well as in-person meetings.

    Less than 1%: 2
    1% to 5%: 6
    6% to 10%: 19
    11% to 15%: 16
    16% to 20%: 15
    21% to 25%: 4
    26% to 30%: 5
    31% to 35%: 2
    36% to 40%: 6
    More than 40%: 3
    N/A: 1
    Other: 2
    Answered question: 81
    Skipped question: 2

1.7 Absent any other demands on your time, ideally how much time would you want to spend in contact with your contractors, on average, each week?

    Less than 1%: 0
    1% to 5%: 0
    6% to 10%: 9
    11% to 15%: 10
    16% to 20%: 13
    21% to 25%: 14
    26% to 30%: 5
    31% to 35%: 10
    36% to 40%: 9
    More than 40%: 11
    N/A: 0
    Other: 0
    Answered question: 81
    Skipped question: 2

1.8 How many total people are in your program office? Include Mil/Civ/FFRDC/SETA.

    Less than 5: 1
    5-10: 2
    11-20: 11
    21-40: 13
    41-60: 11
    61-80: 7
    81-100: 2
    101 or more: 34
    N/A: 1
    Answered question: 82
    Skipped question: 1

1.9 In your opinion, during your tenure has the acquisition experience level of personnel in your program office increased, remained about the same, or decreased?

    Increased: 17
    Remained about the same: 29
    Decreased: 36
    N/A: 0
    Answered question: 82
    Skipped question: 1

1.10 What is the approximate annual budget of your program/portfolio?

    Less than $25M: 15
    $25M to $50M: 9
    $51M to $75M: 5
    $76M to $100M: 2
    $101M to $150M: 8
    $151M to $300M: 16
    $301M to $500M: 10
    $501M to $700M: 3
    $700M or more: 11
    N/A: 2
    Answered question: 81
    Skipped question: 2

1.11 What is the Acquisition Category (ACAT) of your current program? If you have more than one program, indicate the highest ACAT rating within your portfolio (with ACAT ID being the highest possible).

    ACAT ID: 25
    ACAT IC: 5
    ACAT IAM: 1
    ACAT IAC: 3
    ACAT II: 15
    ACAT III: 30
    N/A: 2
    Answered question: 81
    Skipped question: 2

1.12 During the period 1 Jan 06 to 30 May 08, what acquisition phase has your program been in? If your program transitioned between two (or more) phases during this period, mark all that apply. If you have more than one major program, please complete for each program, with #1 being the largest dollar value program, then #2, then #3. (Each row lists counts for Program #1, Program #2, Program #3, and the total response count.)

    Concept Refinement: 5, 6, 4, 14
    Technology Development: 13, 12, 6, 29
    System Development and Demonstration: 36, 25, 11, 54
    Production and Deployment: 42, 16, 8, 49
    Operations and Support: 24, 6, 12, 34
    N/A: 4, 2, 2, 4
    Answered question: 81
    Skipped question: 2

1.13 One possible driver of the need for higher level reviews is to ensure the coordination of programs that have a significant amount of external interfaces. Please characterize the amount of external interfaces of the program with other efforts.

    Stand-alone system with very little/minimal amount of external interfaces: 10
    Modest amount of external interfaces: 28
    Extensive amount of external interfaces: 43
    N/A: 1
    Answered question: 82
    Skipped question: 1

Section 2—Program Activity Overview Section

If you have more than one program, then comment on the one with the highest Acquisition Category and, if more than one is in that ACAT, comment on the program with the highest total program cost. NOTE: Reviews can have more than one purpose (e.g., approve a milestone, improve cost/schedule/technical performance, provide information, etc.). The questions below focus on specific purposes of reviews.

2.1 Which of the major program reviews/assessments has your program participated in during the period 1 Jan 06 to 30 May 08? Check all that apply.

    Defense Acquisition Board (DAB) Milestone Review: 10
    Defense Space Acquisition Board (DSAB) Milestone Review: 5
    Defense Acquisition Board (DAB) Status Review: 6
    Defense Space Acquisition Board (DSAB) Status Review: 2
    Overarching Integrated Product Team (OIPT) Review: 15
    Technology Readiness Assessment (TRA): 21
    Technology Maturity Assessment (TMA): 8
    Independent Program Assessment (IPA): 23
    Program Support Review (PSR): 11
    Manufacturing Readiness Review (MRR): 7
    Logistics Health Assessment (LHA): 5
    System Engineering Assessment Model (SEAM): 9
    Air Force Review Board (AFRB): 13
    Other: 29
    Answered question: 57
    Skipped question: 28

2.2 For each of these major program reviews/assessments that your program experienced, indicate your assessment of its impact on program performance (i.e., cost/schedule/technical performance accomplishment). Please select a response for each review; indicate "N/A" for each review that is not applicable. (Each row lists counts for Positive Impact, No Impact, Negative Impact, N/A, and the total response count.)

    Defense Acquisition Board (DAB) Milestone Review: 7, 5, 1, 35, 48
    Defense Space Acquisition Board (DSAB) Milestone Review: 2, 1, 1, 38, 42
    Defense Acquisition Board (DAB) Status Review: 2, 4, 1, 39, 46
    Defense Space Acquisition Board (DSAB) Status Review: 2, 0, 2, 39, 43
    Overarching Integrated Product Team (OIPT) Review: 7, 7, 4, 30, 48
    Technology Readiness Assessment (TRA): 8, 13, 2, 27, 50
    Technology Maturity Assessment (TMA): 4, 4, 1, 34, 43
    Independent Program Assessment (IPA): 12, 5, 4, 27, 48
    Program Support Review (PSR): 4, 5, 2, 33, 44
    Manufacturing Readiness Review (MRR): 5, 3, 0, 34, 42
    Logistics Health Assessment (LHA): 3, 2, 2, 35, 42
    System Engineering Assessment Model (SEAM): 3, 7, 2, 33, 45
    Air Force Review Board (AFRB): 7, 5, 1, 34, 47
    Other: 9, 8, 3, 23, 43
    Answered question: 66
    Skipped question: 17
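As noted in the introduction, the committee was interested less in exact percentages than in the relative balance of positive and negative perceptions. The Python snippet below is a minimal, illustrative sketch of that kind of tally, using a few of the question 2.2 counts above; it is not the committee's analysis method or tooling.

    # Illustrative sketch only. Counts are (positive, no impact, negative, N/A)
    # taken from question 2.2 above for a few of the reviews.
    impact_counts = {
        "DAB Milestone Review": (7, 5, 1, 35),
        "OIPT Review": (7, 7, 4, 30),
        "Technology Readiness Assessment (TRA)": (8, 13, 2, 27),
        "Independent Program Assessment (IPA)": (12, 5, 4, 27),
    }

    for review, (pos, neutral, neg, n_a) in impact_counts.items():
        rated = pos + neutral + neg          # respondents who actually rated the review
        net = pos - neg                      # simple positive-minus-negative balance
        print(f"{review}: net balance {net:+d} "
              f"({pos} positive, {neutral} no impact, {neg} negative, {n_a} N/A; {rated} raters)")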

2.2.a Which single review had the greatest positive impact on program performance?

    Defense Acquisition Board (DAB) Milestone Review: 5
    Defense Space Acquisition Board (DSAB) Milestone Review: 0
    Defense Acquisition Board (DAB) Status Review: 1
    Defense Space Acquisition Board (DSAB) Status Review: 1
    Overarching Integrated Product Team (OIPT) Review: 3
    Technology Readiness Assessment (TRA): 4
    Technology Maturity Assessment (TMA): 1
    Independent Program Assessment (IPA): 11
    Program Support Review (PSR): 2
    Manufacturing Readiness Review (MRR): 2
    Logistics Health Assessment (LHA): 0
    System Engineering Assessment Model (SEAM): 3
    Air Force Review Board (AFRB): 3
    Other: 19
    Answered question: 55
    Skipped question: 28

2.3 Referring to your highest-rated review from question 2.2.a above, why did this review have a positive impact? Check all that apply.

    Subject matter experts provided valuable inputs on problems/issues: 21
    Senior leaders engaged to help resolve problems/issues: 26
    Visibility of the review focused contractor leadership attention on fixing problems prior to having to brief senior government leaders: 16
    Program office uncovered problems/issues as part of preparation for review: 12
    N/A: 19
    Other: 7
    Answered question: 64
    Skipped question: 19

2.4 Higher level HQ AF/OSD reviews/assessments provide senior leadership information that is necessary for their understanding of program performance, to fulfill their oversight role. Please rate each of the reviews that your program experienced in terms of how effective you believe the structure/format of the review was at providing useful data to the senior AF and OSD leadership. (Each row lists counts for Lots of Useful Data, Some Useful Data, Little Useful Data, No Useful Data, N/A, and the total response count.)

    Defense Acquisition Board (DAB) Milestone Review: 4, 10, 2, 0, 26, 42
    Defense Space Acquisition Board (DSAB) Milestone Review: 2, 4, 1, 0, 27, 34
    Defense Acquisition Board (DAB) Status Review: 1, 4, 2, 0, 30, 37
    Defense Space Acquisition Board (DSAB) Status Review: 0, 1, 2, 0, 31, 34
    Overarching Integrated Product Team (OIPT) Review: 4, 11, 3, 2, 23, 43
    Technology Readiness Assessment (TRA): 8, 9, 6, 2, 19, 44
    Technology Maturity Assessment (TMA): 2, 3, 2, 2, 27, 36
    Independent Program Assessment (IPA): 9, 9, 0, 2, 21, 41
    Program Support Review (PSR): 1, 7, 3, 2, 26, 39
    Manufacturing Readiness Review (MRR): 2, 4, 3, 2, 27, 38
    Logistics Health Assessment (LHA): 0, 3, 3, 2, 28, 36
    System Engineering Assessment Model (SEAM): 2, 3, 3, 3, 26, 37
    Air Force Review Board (AFRB): 4, 9, 2, 1, 26, 42
    Other: 4, 9, 2, 1, 22, 38
    Answered question: 61
    Skipped question: 22

2.4.a Higher level HQ AF/OSD reviews/assessments provide senior leadership information that is necessary for their understanding of program performance, to fulfill their oversight role. Please rate each of the reviews that your program experienced in terms of how effective you believe the structure/format of the review was at providing useful data to the senior AF and OSD leadership. (Each row lists counts for Lots of Useful Data, Some Useful Data, Little Useful Data, No Useful Data, N/A, and the total response count.)

    Defense Acquisition Board (DAB) Milestone Review: 4, 8, 1, 0, 32, 45
    Defense Space Acquisition Board (DSAB) Milestone Review: 1, 4, 1, 1, 31, 38
    Defense Acquisition Board (DAB) Status Review: 1, 5, 1, 1, 33, 41
    Defense Space Acquisition Board (DSAB) Status Review: 0, 1, 2, 1, 35, 39
    Overarching Integrated Product Team (OIPT) Review: 5, 6, 6, 1, 27, 45
    Technology Readiness Assessment (TRA): 5, 9, 5, 3, 25, 47
    Technology Maturity Assessment (TMA): 1, 4, 2, 2, 31, 40
    Independent Program Assessment (IPA): 12, 6, 0, 1, 26, 45
    Program Support Review (PSR): 3, 4, 1, 4, 30, 42
    Manufacturing Readiness Review (MRR): 1, 3, 3, 3, 32, 42
    Logistics Health Assessment (LHA): 0, 3, 1, 3, 33, 40
    System Engineering Assessment Model (SEAM): 1, 3, 2, 4, 30, 40
    Air Force Review Board (AFRB): 5, 7, 1, 1, 31, 45
    Other: 7, 6, 1, 1, 21, 38
    Answered question: 61
    Skipped question: 22

2.5 From the list below, identify the three higher level HQ AF/OSD reviews/reporting activities that you believe have the LEAST beneficial impact on program performance. One would be the LEAST beneficial, followed by two, then three. (Each row lists counts for Least Beneficial, Second Least Beneficial, Third Least Beneficial, and the total response count.)

    Defense Acquisition Board (DAB) Milestone Review: 1, 1, 1, 3
    Defense Space Acquisition Board (DSAB) Milestone Review: 0, 0, 1, 1
    Defense Acquisition Board (DAB) Status Review: 1, 4, 3, 8
    Defense Space Acquisition Board (DSAB) Status Review: 0, 0, 1, 1
    Overarching Integrated Product Team (OIPT) Review: 8, 3, 2, 13
    Technology Readiness Assessment (TRA): 4, 4, 3, 11
    Technology Maturity Assessment (TMA): 4, 6, 2, 12
    Independent Program Assessment (IPA): 1, 3, 3, 7
    Program Support Review (PSR): 7, 4, 5, 16
    Manufacturing Readiness Review (MRR): 2, 4, 2, 8
    Logistics Health Assessment (LHA): 3, 1, 4, 8
    System Engineering Assessment Model (SEAM): 7, 4, 6, 17
    Air Force Review Board (AFRB): 2, 2, 4, 8
    Other: 4, 1, 0, 5
    Answered question: 46
    Skipped question: 37
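Question 2.5 collects ranked votes rather than a single count. One hypothetical way to collapse the three rank columns into a single score is a simple weighted tally (3 points for "least beneficial," 2 for second, 1 for third). The committee did not score the survey this way; the sketch below is purely illustrative and uses a few rows from the table above.

    # Illustrative sketch only. Counts are (least, second least, third least beneficial)
    # votes taken from question 2.5 above for a few of the reviews. The 3/2/1 weights
    # are an assumption for this example, not the committee's scoring method.
    rank_counts = {
        "System Engineering Assessment Model (SEAM)": (7, 4, 6),
        "Program Support Review (PSR)": (7, 4, 5),
        "Overarching Integrated Product Team (OIPT) Review": (8, 3, 2),
        "Technology Maturity Assessment (TMA)": (4, 6, 2),
    }
    weights = (3, 2, 1)

    scores = {
        review: sum(w * c for w, c in zip(weights, counts))
        for review, counts in rank_counts.items()
    }
    for review, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
        print(f"{review}: weighted 'least beneficial' score {score}")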

2.6 What could be done to improve the positive impact of these reviews? (Select as many as apply for the three reviews identified in question 2.5 above.) (Each row lists counts for the Least Beneficial, Second Least Beneficial, and Third Least Beneficial reviews, and the total response count.)

    Hold the review at a different time in the program lifecycle: 3, 1, 1, 5
    Reduce frequency of reviews: 9, 5, 5, 14
    Expand attendee list to include additional subject matter experts: 3, 2, 0, 4
    Restrict attendee list to smaller group: 13, 8, 6, 15
    Change charter: 8, 3, 3, 9
    Combine with another review: 15, 18, 15, 23
    Consolidate or reduce number of pre-reviews: 17, 9, 11, 21
    Narrow focus of review: 11, 9, 7, 16
    Shorten length of the meeting: 10, 7, 6, 11
    Other: 5, 3, 5, 7

2.7 From what you know from any source, identify the program reviews that have the highest potential to be combined into a single useful review. Please select from the list of reviews below and use the write-in section to show the pairings/groupings (examples: Review M & Z; Review S, T, & Y; Report O & R).

    Defense Acquisition Board (DAB) Milestone Review: 7
    Defense Space Acquisition Board (DSAB) Milestone Review: 2
    Defense Acquisition Board (DAB) Status Review: 7
    Defense Space Acquisition Board (DSAB) Status Review: 4
    Overarching Integrated Product Team (OIPT) Review: 12
    Technology Readiness Assessment (TRA): 24
    Technology Maturity Assessment (TMA): 21
    Independent Program Assessment (IPA): 11
    Program Support Review (PSR): 12
    Manufacturing Readiness Review (MRR): 9
    Logistics Health Assessment (LHA): 12
    System Engineering Assessment Model (SEAM): 12
    Air Force Review Board (AFRB): 5
    Other: 3
    Answered question: 35
    Skipped question: 48

2.8 Which of the written/digital reporting mechanisms has your program used during the period 1 Jan 06 to 30 May 08? Check all that apply.

    SMART: 70
    PoPS: 68
    SAR: 18
    DAES: 21
    Other: 13
    Answered question: 71
    Skipped question: 12

2.9 Internal to your program office, do you use these written/digital reporting mechanisms in the day-to-day management of your program? Please mark Yes, No, or N/A for all reporting mechanisms. (Each row lists counts for Yes, No, N/A, and the total response count.)

    SMART: 22, 49, 1, 72
    PoPS: 18, 50, 2, 70
    SAR: 2, 26, 31, 59
    DAES: 3, 25, 30, 58
    Other: 2, 12, 18, 32
    Answered question: 72
    Skipped question: 11

2.10 Have you received any feedback from the HQ AF or OSD level on the inputs you have provided for these written/digital reporting mechanisms? Please mark Yes, No, or N/A for all reporting mechanisms. (Each row lists counts for Yes, No, N/A, and the total response count.)

    SMART: 15, 54, 1, 70
    PoPS: 11, 54, 2, 67
    SAR: 7, 17, 34, 58
    DAES: 8, 16, 33, 57
    Other: 4, 6, 23, 33
    Answered question: 71
    Skipped question: 12

2.11 How well do you think these written/digital reporting mechanisms do at providing an accurate and informative view of your program? Please rate each tool that you use on a scale of 1 to 5, with 1 indicating the tool does a very good job at providing an accurate and informative picture of your program and 5 indicating that the tool does a very poor job at providing an accurate and informative picture. Mark all reporting mechanisms. (Each row lists counts for Very Good (1), Good (2), Acceptable (3), Poor (4), Very Poor (5), N/A, and the total response count.)

    SMART: 7, 22, 26, 12, 4, 1, 72
    PoPS: 1, 13, 29, 21, 6, 1, 71
    SAR: 3, 6, 11, 3, 1, 37, 61
    DAES: 3, 6, 10, 5, 0, 36, 60
    Other: 1, 4, 4, 2, 1, 20, 32
    Answered question: 72
    Skipped question: 11
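For the 1-to-5 scale in question 2.11, a count-weighted mean that excludes N/A responses is one compact way to compare the reporting mechanisms. The Python sketch below is illustrative only, using the counts from the table above; it is not a calculation that appears in the report.

    # Illustrative sketch only. Each tuple holds the counts for ratings 1 (very good)
    # through 5 (very poor), followed by the N/A count, from question 2.11 above.
    ratings = {
        "SMART": (7, 22, 26, 12, 4, 1),
        "PoPS": (1, 13, 29, 21, 6, 1),
        "SAR": (3, 6, 11, 3, 1, 37),
        "DAES": (3, 6, 10, 5, 0, 36),
    }

    for tool, counts in ratings.items():
        scored = counts[:5]                  # drop the trailing N/A count
        n = sum(scored)
        mean = sum(score * count for score, count in zip(range(1, 6), scored)) / n
        print(f"{tool}: mean rating {mean:.2f} across {n} respondents (lower is better)")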

2.12 For the following major reviews, please indicate your opinion about whether the documentation required by higher authorities to support each review is Insufficient (In), About Right (AR), Excessive but Decreasing (E-D), Excessive and Stable (E-S), or Excessive and Increasing (E-I). Select N/A if you do not have experience with a particular review. (Each row lists counts for In, AR, E-D, E-S, E-I, N/A, and the total response count.)

    Defense Acquisition Board (DAB) Milestone Review: 1, 9, 2, 5, 9, 31, 57
    Defense Space Acquisition Board (DSAB) Milestone Review: 1, 5, 1, 0, 4, 41, 52
    Defense Acquisition Board (DAB) Status Review: 1, 4, 1, 8, 4, 37, 55
    Defense Space Acquisition Board (DSAB) Status Review: 1, 5, 1, 2, 1, 41, 51
    Overarching Integrated Product Team (OIPT) Review: 1, 13, 1, 4, 10, 28, 57
    Technology Readiness Assessment (TRA): 1, 18, 1, 5, 8, 24, 57
    Technology Maturity Assessment (TMA): 2, 11, 1, 5, 1, 33, 53
    Independent Program Assessment (IPA): 0, 11, 3, 6, 3, 31, 54
    Program Support Review (PSR): 0, 13, 2, 3, 3, 32, 53
    Manufacturing Readiness Review (MRR): 2, 9, 2, 3, 0, 36, 52
    Logistics Health Assessment (LHA): 0, 6, 2, 3, 4, 36, 51
    System Engineering Assessment Model (SEAM): 1, 4, 3, 3, 8, 31, 50
    Air Force Review Board (AFRB): 0, 10, 2, 8, 2, 32, 54
    Other: 0, 4, 0, 2, 1, 20, 27
    Answered question: 62
    Skipped question: 21

Section 3A—Questions on Specific Reviews

This section asks you to rate the Most Helpful and Least Helpful higher level reviews/assessments that your program has experienced sometime during the period 1 Jan 06 to 30 May 08.

Most Helpful Review: For the higher level review/assessment that you thought was most helpful to execution of your program (reference question 2.2 above), please answer the following questions:

3.1 Did the review occur at the most useful time in the schedule for program activities?

    Yes: 38
    No: 21
    Answered question: 59
    Skipped question: 24

3.2 Did the review result in the decision(s) necessary to allow the program to continue on schedule?

    Yes: 46
    No: 13
    Answered question: 59
    Skipped question: 24

3.2.a Was this result appropriate, given the situation?

    Yes: 53
    No: 5
    Answered question: 58
    Skipped question: 25

3.3 Did the right subject matter experts appropriate for this review attend the meeting?

    Yes: 44
    No: 7
    Does not apply: 9
    Answered question: 60
    Skipped question: 23

3.4 Did you receive timely guidance from this meeting?

    Yes: 35
    No: 16
    Does not apply: 9
    Answered question: 60
    Skipped question: 23

3.5 Did the review, or directed follow-up action, cause any change in the current execution of the program?

    Yes: 24
    No: 29
    Does not apply: 8
    Answered question: 61
    Skipped question: 22

3.6 Did the review, or directed follow-up action, cause any change in the future plans for the program?

    Yes: 31
    No: 22
    Does not apply: 7
    Answered question: 60
    Skipped question: 23

3.7 What percentage of program office senior leadership personnel (i.e., X of the Y senior SPO people; example: 2 of 5 = 40%) were involved with the preparation for this review?

    Less than 10%: 3
    10% to less than 20%: 5
    20% to less than 30%: 8
    30% to less than 40%: 11
    40% to less than 50%: 5
    50% or more: 26
    Answered question: 58
    Skipped question: 25

3.8 Was the prime contractor asked to provide support for this meeting?

    Yes: 26
    No: 21
    Does not apply: 13
    Answered question: 60
    Skipped question: 23

3.8.a If yes, then did this support involve more than 20% of the contractor leadership personnel (i.e., X of the Y senior program people; example: 2 of 5 = 40%)?

    Yes: 13
    No: 22
    Answered question: 35
    Skipped question: 48

Section 3A (cont.)—Identifying Positive and Negative Impacts

This study is trying to determine if there are more efficient ways to perform the higher HQ review/reporting process, so the most useful data that can be collected involve the impact of higher HQ reviews/reports on program performance. One way to measure this is to identify positive and negative impacts of these reviews/reports on program performance. The questions below seek to identify any specific examples of both positive impacts and negative impacts.

3.9 Can you identify any ways in which the program performance was improved (e.g., problems in the program that were resolved faster or discovered earlier) because of the attention provided by this higher level review/assessment?

    Write in: Write-in responses withheld.

3.10 Can you identify any negative impacts on program performance (e.g., problems in the program that took longer to discover or longer to resolve) because of the time spent supporting this higher level review/assessment?

    Write in: Write-in responses withheld.

Section 3A (cont.)—Estimates of High Level Review Costs

Part of the purpose of this survey is to understand the costs associated with higher level reviews. These costs can take many forms, including "opportunity" costs (as addressed previously), but another aspect of these costs is the monetary value of the time the program office spends doing preparation, prebriefs, the review itself, coordination of meeting minutes/decision memoranda, and postreview follow-up. The questions below ask for estimates of these costs in several variations.

3.11 How many hours of government personnel (mil & civ) were needed for the total support of this review?

    Write-in responses withheld.

3.12 How many hours of FFRDC/SETA personnel were needed for the total support of this review?

    Write-in responses withheld.

3.13 How many hours of prime/subcontractor personnel were needed for the total support of this review?

    Write-in responses withheld.

3.14 It is possible that some/most of the hours spent to prepare for the review, as documented in questions 3.11, 3.12, and 3.13, would have been spent as part of good program management, even if there had not been a higher level review. Therefore, this question asks you to estimate how many hours were uniquely spent to prepare for this higher level review that would not have been spent for any other reason besides preparing for this review. Estimate the unique hours spent on the higher-level review:

    Write-in responses withheld.

3.15 What could be done to improve the utility of this review? (Check all that apply.)

    Reduce frequency: 7
    Increase attendees: 2
    Decrease attendees: 12
    Change charter: 6
    Combine with another review: 9
    Consolidate or reduce number of pre-reviews: 22
    Narrow focus of review: 11
    Shorten length of the meeting: 5
    Change sequence in relation to other reviews: 3
    Better synchronize with other reviews: 14
    Nothing—review is fine as it is: 10
    Other: 10
    Answered question: 53
    Skipped question: 30

Section 3B—Evaluating Least Helpful Higher Level Reviews/Assessments

Least Helpful Review: For the higher level review/assessment that you thought was least helpful (reference question 2.5 above), please answer the following questions:

3.16 Did the review occur at the most useful time in the schedule for program activities?

    Yes: 19
    No: 23
    Answered question: 42
    Skipped question: 41

3.17 Did the review result in the decision(s) necessary to allow the program to continue on schedule?

    Yes: 22
    No: 20
    Answered question: 42
    Skipped question: 41

3.17.a Given the information that was presented at the review, was this result appropriate?

    Yes: 31
    No: 10
    Answered question: 41
    Skipped question: 42

3.18 Did the subject matter experts appropriate for this review attend the meeting?

    Yes: 32
    No: 10
    Answered question: 42
    Skipped question: 41

3.19 Did you get timely guidance from this meeting?

    Yes: 16
    No: 26
    Answered question: 42
    Skipped question: 41

3.20 Did the review, or directed follow-up action, cause any change in the current execution of the program?

    Yes: 12
    No: 30
    Answered question: 42
    Skipped question: 41

3.21 Did the review, or directed follow-up action, cause any change in the future plans for the program?

    Yes: 14
    No: 28
    Answered question: 42
    Skipped question: 41

3.22 What percentage of senior program leadership personnel (i.e., X of the Y senior SPO people; example: 2 of 5 = 40%) were involved with the preparation for this review?

    Less than 10%: 2
    10% to less than 20%: 9
    20% to less than 30%: 4
    30% to less than 40%: 5
    40% to less than 50%: 5
    50% or more: 15
    Answered question: 40
    Skipped question: 43

3.23 Was the prime contractor asked to provide support for this meeting?

    Yes: 23
    No: 11
    N/A: 8
    Answered question: 42
    Skipped question: 41

3.23.a If yes, then did this support involve more than 20% of the contractor leadership personnel (i.e., X of the Y senior program people; example: 2 of 5 = 40%)?

    Yes: 12
    No: 16
    Answered question: 28
    Skipped question: 55

Section 3B (cont.)—Identifying Positive and Negative Impacts

This study is trying to determine if there are more efficient ways to perform the higher HQ reviews/assessments, so the most useful data that can be collected involve the impact of higher HQ reviews/reports on program performance. One way to measure this is to identify positive and negative impacts of these reviews/reports on program performance. The questions below seek to identify any specific examples of both positive impacts and negative impacts.

3.24 Can you identify any ways in which the program performance was improved (e.g., problems in the program that were resolved faster or discovered earlier) because of the attention provided by this higher level review/assessment?

    Write in: Write-in responses withheld.

3.25 Can you identify any negative impacts on program performance (e.g., problems in the program that took longer to discover or longer to resolve) because of the time spent supporting this higher level review/assessment?

    Write-in responses withheld.

Section 3B (cont.)—Estimates of High Level Review Costs

Part of the purpose of this survey is to understand the costs associated with higher level reviews. These costs can take many forms, including "opportunity" costs (as addressed above), but another aspect of these costs is the monetary value of the time the program office spends doing preparation, prebriefs, the review itself, coordination of meeting minutes/decision memoranda, and postreview follow-up. The questions below ask for estimates of these costs in several variations.

3.26 How many hours of government personnel (mil & civ) were needed for the total support of this review?

    Write-in responses withheld.

3.27 How many hours of FFRDC/SETA personnel were needed for the total support of this review?

    Write-in responses withheld.

3.28 How many hours of prime/subcontractor personnel were needed for the total support of this review?

    Write-in responses withheld.

3.29 It is possible that some/most of the hours spent to prepare for the review, as documented in questions 3.26, 3.27, and 3.28, would have been spent as part of good program management, even if there had not been a higher level review. Therefore, this question asks you to estimate how many hours were uniquely spent to prepare for this higher level review that would not have been spent for any other reason besides preparing for this review. Estimate the unique hours spent on the higher-level review:

    Write-in responses withheld.

3.30 What could be done to improve the utility of this review? (Select all that apply.)

    Reduce frequency: 9
    Increase attendees: 1
    Decrease attendees: 12
    Change charter: 7
    Combine with another review: 13
    Consolidate or reduce number of pre-reviews: 12
    Narrow focus of review: 10
    Shorten length of the meeting: 5
    Change sequence in relation to other reviews: 3
    Better synchronize with other reviews: 9
    Nothing—review is fine as it is: 4
    Other: 5
    Answered question: 39
    Skipped question: 44

Section 4—Comments on Streamlining/Tailoring/Integrating/Consolidating Opportunities

Please provide general comments as well as recommendations on streamlining, tailoring, integrating, and consolidating opportunities.

4.1 What opportunities for streamlining of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.2 What opportunities for tailoring of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.3 What opportunities for integrating of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.4 What opportunities for consolidating of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.5 Please provide any other comments you think would improve the ability of higher HQ review of AF acquisition programs to enable senior leaders to perform their oversight role more effectively and/or to help the program being reviewed execute more effectively.

    Write-in responses withheld.
