Appendix D
Survey

To obtain data to help it accomplish its assigned tasks, the committee developed and conducted a survey of Air Force program managers (PMs). A copy of the survey in its entirety follows. The main purpose of the survey was to increase the number of people the committee heard from beyond the limited number it had contacted directly and to obtain additional quantitative data. The survey data were yet another form of information, augmenting what the committee members had learned from their research, interviews, and personal experience.

The committee employed a multistep process to produce the final survey:

  1. The initial series of survey questions was developed from inputs from former government program managers and senior consultants with relevant DOD experience.

  2. The draft set of survey questions was discussed with current Air Force PMs and senior functional support staff at one Air Force product center and the questions were refined.

  3. A survey expert from NRC provided general guidelines on the conduct of the survey and data protection statements, on ensuring that the survey questions were objectively stated and structured to encourage survey takers to complete the survey, and on incorporating human factor considerations. The draft survey was improved using these guidelines. The NRC expert also provided advice on the approval process for surveys that are part of an NRC-administered study. This advice was followed to obtain NRC Institutional Review Board approval of the survey.

  4. Survey format feedback and Air Force survey approval process information were also provided by the Air Force Manpower Agency Air Force Survey Office, which is responsible for approving surveys of Air Force personnel. This feedback was also used to refine the survey and to plan the schedule for survey approval.

  5. A refined draft survey was provided to several current Air Force PMs of various grades, who were asked to check the clarity and pertinence of the questions and to estimate how long the survey would take to complete. This feedback was used to further revise the survey questions.

  6. The draft survey was provided to the full committee membership individually for review, and the comments were used to further refine and streamline the survey.

  7. The near-final survey was reviewed by an NRC survey expert, who helped to clarify some questions and eliminate others in order to reduce the time needed to complete the survey while preserving the potential to collect data as useful as possible.

After survey development had been completed, the final version of the survey was submitted in parallel to the NRC Institutional Review Board and to the Air Force formal survey approval process. Both the NRC and the Air Force approved the survey as submitted, and it received an official Air Force Survey Number (USAF SCN 08-045) and an expiration date (July 18, 2009).

The next step was distribution to the intended survey population, that is, to Air Force PMs. Several steps remained:

  1. Each of the four Air Force product centers (the Aeronautical Systems Center, the Air Armament Center, the Electronic Systems Center, and the Space and Missile Systems Center) was tasked to provide a list of PMs employed there. The list was to include all the Acquisition Category (ACAT) I PMs at the center and a sampling of ACAT II and ACAT III PMs. Each center provided the PM names and e-mail addresses, which were entered into the Web-based survey tool.
  2. Each of the Program Executive Officers (PEOs) at the four product centers was notified of the survey and invited to participate, both because of their program management experience and to ensure they were aware of the questions being asked of their PMs. Next, the PMs identified at the centers were invited by e-mail to take the survey and given the direct link to the Web-based survey tool.

  3. To maximize participation in the survey, NRC staff used the Web-based tool to send reminders to the PMs. Also, on the advice of the NRC survey expert, the committee's survey subgroup extended the survey window in conjunction with a final word of encouragement to the PMs who had not yet taken the survey. The survey data collection period was closed out on August 7, 2008.

  4. When the survey was developed, criteria were also established to gauge its success from the standpoint of how the data would be used. Four criteria were developed in consultation with the NRC expert: an acceptable number of responses from each product center; an acceptable percentage of responses from the most senior (ACAT I) PMs; an acceptable percentage of overall PM responses; and an acceptable number of responses for each review that is specifically evaluated in the report.

Once the time for responding to the survey had run out, the results were reviewed against the success criteria and were judged to have met them. The results were then reviewed by the data collection subgroup of the committee, and specific reports were broken out, depending on the particular issue or topic of interest, and provided to the full committee for its use. The specific results of the survey that gave rise to a particular committee finding are discussed in the analysis section under the pertinent finding. The quantitative survey results (multiple choice) are shown next, for all the PMs who responded to the survey. The qualitative written-in (essay-type) comments from the PMs have not been included in this appendix because several of them gave so much detail that their authors could be identified by people familiar with their programs, violating their privacy.

Because the committee recognized the challenges of constructing a survey and reporting its results (such as bias, demographics, and numbers of responses), it sought and received professional support from the NRC in, among other things, devising criteria to judge whether there was sufficient information to permit meaningful analysis. The committee also made promises to the Air Force regarding the use and anonymity of the data.

Although there were more responses from ESC than from the other centers, when the data were partitioned in various ways the overall results showed significant consistency across the four centers. With respect to potential survey bias, the committee considered the possibility that disgruntled (or "dissatisfied") PMs might be more likely to respond. However, the responses from the PMs were thoughtful and balanced, and the mix of positive and negative comments on the survey was very much in alignment with interview comments and discussions.

A final note about the use of the survey data: the main use was to weigh and compare the positive and negative perceptions of program reviews and to suggest how the overall review process, as well as individual reviews, could be made more effective from the perspective of the PMs. The specific response breakout percentages for any individual question were seldom the focus; the relative balance was of more interest to the committee in most cases. No finding, conclusion, or recommendation of the committee is based solely on survey data; rather, findings represent what the committee heard from all its sources.
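The success-criteria judgment described above is essentially a set of threshold checks on response counts and rates. A minimal sketch in Python, assuming hypothetical threshold values (the appendix names the four criteria but does not state the committee's actual thresholds, so every numeric default below is a placeholder):

```python
# Sketch of the committee's four-part success check on the survey data.
# All threshold defaults are hypothetical placeholders, not the committee's
# actual values, which are not given in this appendix.
def survey_meets_criteria(responses_by_center, acat1_response_rate,
                          overall_response_rate, responses_by_review,
                          min_per_center=5, min_acat1_rate=0.5,
                          min_overall_rate=0.4, min_per_review=5):
    checks = {
        "responses per product center": all(
            n >= min_per_center for n in responses_by_center.values()),
        "ACAT I PM response rate": acat1_response_rate >= min_acat1_rate,
        "overall PM response rate": overall_response_rate >= min_overall_rate,
        "responses per evaluated review": all(
            n >= min_per_review for n in responses_by_review.values()),
    }
    return all(checks.values()), checks

# Example using the per-center counts reported in question 1.0 (AAC 9,
# ASC 17, ESC 41, SMC 12); the rates and per-review counts are invented.
ok, detail = survey_meets_criteria(
    {"AAC": 9, "ASC": 17, "ESC": 41, "SMC": 12},
    acat1_response_rate=0.6, overall_response_rate=0.5,
    responses_by_review={"DAB": 10, "OIPT": 15})
```

With these placeholder thresholds the example data pass all four checks; the returned dictionary makes it easy to report which criterion failed when one does.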

Survey and Response Data

Intro Page

Purpose of Survey: The primary purpose of this survey is to collect information from AF program managers on how much time/effort is uniquely spent preparing for, participating in, and following up on tasks from higher-level AF and OSD reviews that would not otherwise have had to be spent for the purpose of good program management. The study committee is also interested in your assessments, both positive and negative, of the higher-level reviews you have participated in and any changes you would recommend. Collecting this information will help the committee to determine how to respond to the SAF/AQR-sponsored study objective to "Identify and evaluate options for streamlining, tailoring, integrating, or consolidating reviews of programs to increase the cost-effectiveness and to lessen workforce impact of the reviews as a whole." The entire Statement of Task for this study can be found at http://www8.nationalacademies.org/cp/projectview.aspx?key=48922 for reference.

Data Protection Statement: The detailed data collected in this survey will only be viewed by the three committee people assigned to collect the data (Randy Weidenheimer, Richard Szafranski, and Allan Burman), the chairman and vice-chairman of the study committee (Rand Fisher and Dan Stewart, respectively), and National Academy of Sciences professional staff members (Jim Garcia, Enita Williams, and Kamara Brown). Any reporting of the survey results will be at the summary level, with information related to specific people or programs removed. If the study group decides that any direct quotations from write-in sections of the survey would be useful to illustrate specific points, then the committee will attribute the quote to "an AF program manager" and remove all identifying information (and will confirm with the author that this has been done satisfactorily).
Survey Instructions, Structure, and Statement of Task

Instructions for Survey: Please complete all questions in the survey, marking questions "N/A" as appropriate. If necessary, you can save a partially completed survey and return to it later to answer the remaining questions.

Structure of Survey:

Section 1—Demographic Data Section—information on the program manager and program
Section 2—Program Activity Overview Section—information on pertinent external reviews/reporting accomplished by the program

Section 3—Questions on Specific Reviews—information on time/effort spent on specific reviews/reporting accomplished by each program manager taking the survey
Section 4—Optional Section to Comment on Streamlining/Tailoring/Integrating/Consolidating Opportunities

Section 1—Demographic Data

Instructions: Please complete all questions in Section 1.

1.0 At which product center do you currently work?

    Response                           Count
    AAC                                    9
    ASC                                   17
    ESC                                   41
    SMC                                   12
    Other                                  2
    answered question                     81
    skipped question                       2

1.1 How long have you been a program manager in your current position?

    Response                           Count
    Less than 6 months                    23
    6 months but less than 1 year          8
    1 year but less than 2 years          21
    2 years but less than 3 years         19
    3 years or more                       12
    answered question                     83
    skipped question                       0

1.2 Including the time in your current job, how many total years of experience do you have performing the function of a program manager (PM), whether or not this was your official title?

    Response                           Count
    Less than 1 year                       0
    1 year but less than 3 years           7
    3 years but less than 5 years          8
    5 years but less than 7 years         14
    7 years but less than 15 years        26
    15 or more years                      28
    answered question                     83
    skipped question                       0

1.3 How much acquisition experience do you have? For the purposes of this study, consider time spent in program offices as well as staff assignments that worked with the requirements definition process, the planning/programming/budgeting process, or the acquisition policy/governance process. Also include any time spent working in industry in a job equivalent to the government jobs identified above.

    Response                           Count
    Less than 1 year                       0
    1 year but less than 3 years           5
    3 years but less than 5 years          3
    5 years but less than 7 years          7
    7 years but less than 15 years        23
    15 or more years                      45
    answered question                     83
    skipped question                       0

1.3.a Of the total time stated in 1.3, how much was spent in a System Program Office (SPO)?

    Response                           Count
    Less than 1 year                       0
    1 year but less than 3 years           8
    3 years but less than 5 years         13
    5 years but less than 7 years         12
    7 years but less than 15 years        36
    15 or more years                      13
    answered question                     82
    skipped question                       1

1.4 How many hours do you work, on average, each week?

    Response                           Count
    40-45 hours                            7
    46-50 hours                           18
    51-55 hours                           20
    56-60 hours                           17
    61-65 hours                            8
    66-70 hours                            7
    71 or more hours                       3
    Other                                  1
    answered question                     81
    skipped question                       2

1.5 Estimate the percentage of your time you spend on the following activities each week. The sum of all fields should total "100," including the write-in "Other" field at the bottom of the list. You must enter the number "0" in any activity field—including "Other"—that is not applicable. For "Other," please write in applicable examples in the field.

    Activity                                                         Response Average
    Personnel Activities (e.g., performance reports, hiring
      actions, recognition and promotion ceremonies, career
      counseling/mentoring, etc.)                                         12.60
    Administrative Activities (e.g., Center and Wing staff
      meetings, facility issues, security and computer
      training, etc.)                                                     13.43
    Military Training (e.g., physical fitness, self-aid buddy
      care, LOAC training, exercise support, etc.)                         6.34
    Program Management—above-the-Wing-level activities (e.g.,
      verbal and written reporting to chain of command beyond
      the Wing, including PEO, HQ AF, and OSD reviews/reports)             19.56
    Program Management—Wing-level-and-below activities (e.g.,
      including gov't-only meetings as well as interactions
      with the contractors)                                                46.02
    Other Activities                                                       5.85
    answered question                     82
    skipped question                       1
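Question 1.5 enforces a simple arithmetic constraint: all six percentage fields, including "Other," must be filled in and must total exactly 100. A minimal sketch of that check in Python (a hypothetical helper; the actual Web-based survey tool's validation logic is not described in this appendix):

```python
# Hypothetical sum-to-100 check for question 1.5; field names abbreviated.
FIELDS = ["Personnel", "Administrative", "Military Training",
          "PM above Wing", "PM Wing and below", "Other"]

def validate_time_allocation(entries):
    """Return (ok, message) for one respondent's weekly-percentage entries."""
    missing = [f for f in FIELDS if f not in entries]
    if missing:
        return False, "missing fields: " + ", ".join(missing)
    total = sum(entries[f] for f in FIELDS)
    return total == 100, "fields sum to %d" % total

# A respondent's entries that satisfy the rule: 10+15+5+20+45+5 = 100.
ok, msg = validate_time_allocation({
    "Personnel": 10, "Administrative": 15, "Military Training": 5,
    "PM above Wing": 20, "PM Wing and below": 45, "Other": 5})
```

Interestingly, the six reported averages above (12.60 + 13.43 + 6.34 + 19.56 + 46.02 + 5.85) total 103.80 rather than 100, which suggests that some responses did not follow the sum-to-100 rule as printed or that the tabulated figures are approximate.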

1.6 On average each week, what percentage of your time is spent in direct contact with your contractors? This includes talking on the phone and on VTCs as well as in-person meetings.

    Response                           Count
    Less than 1%                           2
    1% to 5%                               6
    6% to 10%                             19
    11% to 15%                            16
    16% to 20%                            15
    21% to 25%                             4
    26% to 30%                             5
    31% to 35%                             2
    36% to 40%                             6
    More than 40%                          3
    N/A                                    1
    Other                                  2
    answered question                     81
    skipped question                       2

1.7 Absent any other demands on your time, ideally how much time would you want to spend in contact with your contractors, on average, each week?

    Response                           Count
    Less than 1%                           0
    1% to 5%                               0
    6% to 10%                              9
    11% to 15%                            10
    16% to 20%                            13
    21% to 25%                            14
    26% to 30%                             5
    31% to 35%                            10
    36% to 40%                             9
    More than 40%                         11
    N/A                                    0
    Other                                  0
    answered question                     81
    skipped question                       2

1.8 How many total people are in your program office? Include Mil/Civ/FFRDC/SETA.

    Response                           Count
    Less than 5                            1
    5-10                                   2
    11-20                                 11
    21-40                                 13
    41-60                                 11
    61-80                                  7
    81-100                                 2
    101 or more                           34
    N/A                                    1
    answered question                     82
    skipped question                       1

1.9 In your opinion, during your tenure has the acquisition experience level of personnel in your program office increased, remained about the same, or decreased?

    Response                           Count
    Increased                             17
    Remained about the same               29
    Decreased                             36
    N/A                                    0
    answered question                     82
    skipped question                       1

1.10 What is the approximate annual budget of your program/portfolio?

    Response                           Count
    Less than $25M                        15
    $25M to $50M                           9
    $51M to $75M                           5
    $76M to $100M                          2
    $101M to $150M                         8
    $151M to $300M                        16
    $301M to $500M                        10
    $501M to $700M                         3
    $700M or more                         11
    N/A                                    2
    answered question                     81
    skipped question                       2

1.11 What is the Acquisition Category (ACAT) of your current program? If you have more than one program, indicate the highest ACAT rating within your portfolio (with ACAT ID being the highest possible).

    Response                           Count
    ACAT ID                               25
    ACAT IC                                5
    ACAT IAM                               1
    ACAT IAC                               3
    ACAT II                               15
    ACAT III                              30
    N/A                                    2
    answered question                     81
    skipped question                       2

1.12 During the period 1 Jan 06 to 30 May 08, what acquisition phase has your program been in? If your program transitioned between two (or more) phases during this period, mark all that apply. If you have more than one major program, please complete for each program, with #1 being the largest-dollar-value program, then #2, then #3.

    Response                              Program #1   Program #2   Program #3   Count
    Concept Refinement                         5            6            4         14
    Technology Development                    13           12            6         29
    System Development and Demonstration      36           25           11         54
    Production and Deployment                 42           16            8         49
    Operations and Support                    24            6           12         34
    N/A                                        4            2            2          4
    answered question                     81
    skipped question                       2

1.13 One possible driver of the need for higher-level reviews is to ensure the coordination of programs that have a significant number of external interfaces. Please characterize the extent of the program's external interfaces with other efforts.

    Response                                                            Count
    Stand-alone system with very few/minimal external interfaces          10
    Modest number of external interfaces                                  28
    Extensive number of external interfaces                               43
    N/A                                                                    1
    answered question                     82
    skipped question                       1

Section 2—Program Activity Overview Section

If you have more than one program, then comment on the one with the highest Acquisition Category and, if there is more than one in that ACAT, comment on the program with the highest total program cost.

NOTE: Reviews can have more than one purpose (e.g., approve a milestone, improve cost/schedule/technical performance, provide information, etc.). The questions below focus on specific purposes of reviews.

2.1 Which of the major program reviews/assessments has your program participated in during the period 1 Jan 06 to 30 May 08? Check all that apply.

    Response                                                  Count
    Defense Acquisition Board (DAB) Milestone Review             10
    Defense Space Acquisition Board (DSAB) Milestone Review       5
    Defense Acquisition Board (DAB) Status Review                 6
    Defense Space Acquisition Board (DSAB) Status Review          2
    Overarching Integrated Product Team (OIPT) Review            15
    Technology Readiness Assessment (TRA)                        21
    Technology Maturity Assessment (TMA)                          8
    Independent Program Assessment (IPA)                         23
    Program Support Review (PSR)                                 11
    Manufacturing Readiness Review (MRR)                          7
    Logistics Health Assessment (LHA)                             5
    System Engineering Assessment Model (SEAM)                    9
    Air Force Review Board (AFRB)                                13
    Other                                                        29
    answered question                     57
    skipped question                      28

2.12 For the following major reviews, please indicate your opinion about whether the documentation required by higher authorities to support each review is Insufficient (In), About Right (AR), Excessive but Decreasing (E-D), Excessive and Stable (E-S), or Excessive and Increasing (E-I). Select N/A if you do not have experience with a particular review.

    Response                                  In   AR   E-D   E-S   E-I   N/A   Count
    Defense Acquisition Board (DAB)
      Milestone Review                         1    9    2     5     9    31     57
    Defense Space Acquisition Board (DSAB)
      Milestone Review                         1    5    1     0     4    41     52
    Defense Acquisition Board (DAB)
      Status Review                            1    4    1     8     4    37     55
    Defense Space Acquisition Board (DSAB)
      Status Review                            1    5    1     2     1    41     51
    Overarching Integrated Product Team
      (OIPT) Review                            1   13    1     4    10    28     57
    Technology Readiness Assessment (TRA)      1   18    1     5     8    24     57
    Technology Maturity Assessment (TMA)       2   11    1     5     1    33     53
    Independent Program Assessment (IPA)       0   11    3     6     3    31     54
    Program Support Review (PSR)               0   13    2     3     3    32     53
    Manufacturing Readiness Review (MRR)       2    9    2     3     0    36     52
    Logistics Health Assessment (LHA)          0    6    2     3     4    36     51
    System Engineering Assessment Model
      (SEAM)                                   1    4    3     3     8    31     50
    Air Force Review Board (AFRB)              0   10    2     8     2    32     54
    Other                                      0    4    0     2     1    20     27
    answered question                     62
    skipped question                      21

Section 3A—Questions on Specific Reviews

This section asks you to rate the Most Helpful and Least Helpful higher-level reviews/assessments that your program has experienced sometime during the period 1 Jan 06 to 30 May 08.

Most Helpful Review: For the higher-level review/assessment that you thought was most helpful to the execution of your program (reference question 2.2 above), please answer the following questions:

3.1 Did the review occur at the most useful time in the schedule for program activities?

    Response                           Count
    Yes                                   38
    No                                    21
    answered question                     59
    skipped question                      24

3.2 Did the review result in the decision(s) necessary to allow the program to continue on schedule?

    Response                           Count
    Yes                                   46
    No                                    13
    answered question                     59
    skipped question                      24

3.2.a Was this result appropriate, given the situation?

    Response                           Count
    Yes                                   53
    No                                     5
    answered question                     58
    skipped question                      25

3.3 Did the subject matter experts appropriate for this review attend the meeting?

    Response                           Count
    Yes                                   44
    No                                     7
    Does not apply                         9
    answered question                     60
    skipped question                      23

3.4 Did you receive timely guidance from this meeting?

    Response                           Count
    Yes                                   35
    No                                    16
    Does not apply                         9
    answered question                     60
    skipped question                      23

3.5 Did the review, or directed follow-up action, cause any change in the current execution of the program?

    Response                           Count
    Yes                                   24
    No                                    29
    Does not apply                         8
    answered question                     61
    skipped question                      22

3.6 Did the review, or directed follow-up action, cause any change in the future plans for the program?

    Response                           Count
    Yes                                   31
    No                                    22
    Does not apply                         7
    answered question                     60
    skipped question                      23

3.7 What percentage of program office senior leadership personnel (i.e., x of the y senior SPO people; example: 2 of 5 = 40%) were involved in the preparation for this review?

    Response                           Count
    Less than 10%                          3
    10% to less than 20%                   5
    20% to less than 30%                   8
    30% to less than 40%                  11
    40% to less than 50%                   5
    50% or more                           26
    answered question                     58
    skipped question                      25
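Questions 3.7, 3.8.a, 3.22, and 3.23.a all express involvement as "x of the y senior people" converted to a percentage; the question text itself gives the worked example 2 of 5 = 40%. A small sketch of that conversion and of mapping the result onto the answer brackets of question 3.7 (the helper names are mine, not part of the survey):

```python
# Convert "x of y senior people" to a percentage, per the survey's example.
def involvement_pct(involved, total_senior):
    return 100.0 * involved / total_senior

# Map a percentage onto the answer brackets used in question 3.7.
def bracket_3_7(pct):
    bounds = [(50, "50% or more"),
              (40, "40% to less than 50%"),
              (30, "30% to less than 40%"),
              (20, "20% to less than 30%"),
              (10, "10% to less than 20%")]
    for lo, label in bounds:
        if pct >= lo:
            return label
    return "Less than 10%"

pct = involvement_pct(2, 5)  # the worked example from the question text
```

The worked example 2 of 5 yields 40.0, which falls in the "40% to less than 50%" bracket, matching the convention stated in the question.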

3.8 Was the prime contractor asked to provide support for this meeting?

    Response                           Count
    Yes                                   26
    No                                    21
    Does not apply                        13
    answered question                     60
    skipped question                      23

3.8.a If yes, then did this support involve more than 20% of the contractor leadership personnel (i.e., x of the y senior program people; example: 2 of 5 = 40%)?

    Response                           Count
    Yes                                   13
    No                                    22
    answered question                     35
    skipped question                      48

Section 3A (cont.)—Identifying Positive and Negative Impacts

This study is trying to determine whether there are more efficient ways to perform the higher-HQ review/reporting process, so the most useful data that can be collected involve the impact of higher-HQ reviews/reports on program performance. One way to measure this is to identify positive and negative impacts of these reviews/reports on program performance. The questions below seek to identify specific examples of both positive and negative impacts.

3.9 Can you identify any ways in which program performance was improved (e.g., problems in the program that were resolved faster or discovered earlier) because of the attention provided by this higher-level review/assessment?

    Write-in responses withheld.

3.10 Can you identify any negative impacts on program performance (e.g., problems in the program that took longer to discover or longer to resolve) because of the time spent supporting this higher-level review/assessment?

    Write-in responses withheld.

Section 3A (cont.)—Estimates of High-Level Review Costs

Part of the purpose of this survey is to understand the costs associated with higher-level reviews. These costs can take many forms, including "opportunity" costs (as addressed previously), but another aspect of these costs is the monetary value of the time the program office spends on preparation, prebriefs, the review itself, coordination of meeting minutes/decision memoranda, and postreview follow-up. The questions below ask for estimates of these costs in several variations.

3.11 How many hours of government personnel (mil & civ) were needed for the total support of this review?

    Write-in responses withheld.

3.12 How many hours of FFRDC/SETA personnel were needed for the total support of this review?

    Write-in responses withheld.

3.13 How many hours of prime/subcontractor personnel were needed for the total support of this review?

    Write-in responses withheld.

3.14 It is possible that some or most of the hours spent to prepare for the review, as documented in questions 3.11, 3.12, and 3.13, would have been spent as part of good program management even if there had not been a higher-level review. Therefore, this question asks you to estimate how many hours were uniquely spent to prepare for this higher-level review that would not have been spent for any other reason. Estimate the unique hours spent on the higher-level review:

    Write-in responses withheld.

3.15 What could be done to improve the utility of this review? (Check all that apply.)

    Response                                       Count
    Reduce frequency                                   7
    Increase attendees                                 2
    Decrease attendees                                12
    Change charter                                     6
    Combine with another review                        9
    Consolidate or reduce number of pre-reviews       22
    Narrow focus of review                            11
    Shorten length of the meeting                      5
    Change sequence in relation to other reviews       3
    Better synchronize with other reviews             14
    Nothing—review is fine as it is                   10
    Other                                             10
    answered question                     53
    skipped question                      30

Section 3B—Evaluating Least Helpful Higher-Level Reviews/Assessments

Least Helpful Review: For the higher-level review/assessment that you thought was least helpful (reference question 2.5 above), please answer the following questions:

3.16 Did the review occur at the most useful time in the schedule for program activities?

    Response                           Count
    Yes                                   19
    No                                    23
    answered question                     42
    skipped question                      41

3.17 Did the review result in the decision(s) necessary to allow the program to continue on schedule?

    Response                           Count
    Yes                                   22
    No                                    20
    answered question                     42
    skipped question                      41

3.17.a Given the information that was presented at the review, was this result appropriate?

    Response                           Count
    Yes                                   31
    No                                    10
    answered question                     41
    skipped question                      42

3.18 Did the subject matter experts appropriate for this review attend the meeting?

    Response                           Count
    Yes                                   32
    No                                    10
    answered question                     42
    skipped question                      41

3.19 Did you get timely guidance from this meeting?

    Response                           Count
    Yes                                   16
    No                                    26
    answered question                     42
    skipped question                      41

3.20 Did the review, or directed follow-up action, cause any change in the current execution of the program?

    Response                           Count
    Yes                                   12
    No                                    30
    answered question                     42
    skipped question                      41

3.21 Did the review, or directed follow-up action, cause any change in the future plans for the program?

    Response                           Count
    Yes                                   14
    No                                    28
    answered question                     42
    skipped question                      41

3.22 What percentage of senior program leadership personnel (i.e., x of the y senior SPO people; example: 2 of 5 = 40%) were involved in the preparation for this review?

    Response                           Count
    Less than 10%                          2
    10% to less than 20%                   9
    20% to less than 30%                   4
    30% to less than 40%                   5
    40% to less than 50%                   5
    50% or more                           15
    answered question                     40
    skipped question                      43

3.23 Was the prime contractor asked to provide support for this meeting?

    Response                           Count
    Yes                                   23
    No                                    11
    N/A                                    8
    answered question                     42
    skipped question                      41

3.23.a If yes, then did this support involve more than 20% of the contractor leadership personnel (i.e., x of the y senior program people; example: 2 of 5 = 40%)?

    Response                           Count
    Yes                                   12
    No                                    16
    answered question                     28
    skipped question                      55

Section 3B (cont.)—Identifying Positive and Negative Impacts

This study is trying to determine whether there are more efficient ways to perform the higher-HQ reviews/assessments, so the most useful data that can be collected involve the impact of higher-HQ reviews/reports on program performance. One way to measure this is to identify positive and negative impacts of these reviews/reports on program performance. The questions below seek to identify specific examples of both positive and negative impacts.

3.24 Can you identify any ways in which program performance was improved (e.g., problems in the program that were resolved faster or discovered earlier) because of the attention provided by this higher-level review/assessment?

    Write-in responses withheld.

3.25 Can you identify any negative impacts on program performance (e.g., problems in the program that took longer to discover or longer to resolve) because of the time spent supporting this higher-level review/assessment?

    Write-in responses withheld.

Section 3B (cont.)—Estimates of High-Level Review Costs

Part of the purpose of this survey is to understand the costs associated with higher-level reviews. These costs can take many forms, including "opportunity" costs (as addressed above), but another aspect of these costs is the monetary value of the time the program office spends on preparation, prebriefs, the review itself, coordination of meeting minutes/decision memoranda, and postreview follow-up. The questions below ask for estimates of these costs in several variations.

3.26 How many hours of government personnel (mil & civ) were needed for the total support of this review?

    Write-in responses withheld.

3.27 How many hours of FFRDC/SETA personnel were needed for the total support of this review?

    Write-in responses withheld.

3.28 How many hours of prime/subcontractor personnel were needed for the total support of this review?

    Write-in responses withheld.

3.29 It is possible that some or most of the hours spent to prepare for the review, as documented in questions 3.26, 3.27, and 3.28, would have been spent as part of good program management even if there had not been a higher-level review. Therefore, this question asks you to estimate how many hours were uniquely spent to prepare for this higher-level review that would not have been spent for any other reason. Estimate the unique hours spent on the higher-level review:

    Write-in responses withheld.

3.30 What could be done to improve the utility of this review? (Select all that apply.)

    Response                                       Count
    Reduce frequency                                   9
    Increase attendees                                 1
    Decrease attendees                                12
    Change charter                                     7
    Combine with another review                       13
    Consolidate or reduce number of pre-reviews       12
    Narrow focus of review                            10
    Shorten length of the meeting                      5
    Change sequence in relation to other reviews       3
    Better synchronize with other reviews              9
    Nothing—review is fine as it is                    4
    Other                                              5
    answered question                     39
    skipped question                      44

Section 4—Comments on Streamlining/Tailoring/Integrating/Consolidating Opportunities

Please provide general comments as well as recommendations on streamlining, tailoring, integrating, and consolidating opportunities.

4.1 What opportunities for streamlining of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.2 What opportunities for tailoring of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.3 What opportunities for integrating of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.4 What opportunities for consolidating of higher-level reviews, not previously mentioned, would you recommend?

    Write-in responses withheld.

4.5 Please provide any other comments you think would improve the ability of higher-HQ review of AF acquisition programs to enable senior leaders to perform their oversight role more effectively and/or to help the program being reviewed execute more effectively.

    Write-in responses withheld.