1
Preacquisition Technology Development for Air Force Weapon Systems

The National Research Council (NRC) issued a report in 2008 entitled Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition (hereinafter referred to as the Kaminski report, after Paul G. Kaminski, the chair of that report’s study committee).1 The Kaminski report emphasized the importance of systems engineering early in the Department of Defense (DoD) acquisition life cycle and urged the revitalization of the systems engineering and Development Planning (DP) disciplines throughout the DoD.2

No less important to the future combat capability of the armed services is the development of new, cutting-edge technology. This is particularly true for the United States Air Force (USAF), which from its very inception has sought to capitalize on technological and scientific advances. Even before there was a United States Air Force, the Chief of Staff of the U.S. Army Air Corps, Henry H. “Hap” Arnold, was committed to giving his forces a decisive technological edge:

Arnold … intended to leave to his beloved air arm a heritage of science and technology so deeply imbued in the institution that the weapons it would fight with would always be the best the state of the art could provide and those on its drawing boards would be prodigies of futuristic thought.3

1 NRC. 2008. Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition. Washington, D.C.: The National Academies Press.
2 Ibid.
3 Neil Sheehan. 2009. A Fiery Peace in a Cold War: Bernard Schriever and the Ultimate Weapon. New York, N.Y.: Random House, p. xvi.



Copyright © National Academy of Sciences. All rights reserved.





General Arnold succeeded in his dream of building the foundations of an Air Force that was second to none technologically. Dramatic innovations in aeronautics and later in space were fielded, with schedules that today seem impossible to achieve. The first U-2 flew just 18 months after it was ordered in 1953, and it was operational just 9 months after that first flight.4 The SR-71, even more radical, was developed with similar speed, going from contract award to operational status in less than 3 years.5 In the space domain, innovation was pursued with similar speed: for example, the Atlas A, America’s first intercontinental ballistic missile, required only 30 months from contract award in January 1955 to first launch in June 1957.6

At that time, the American military and defense industry set the standard in the effective management of new technology. In fact, the entire field known today as “project management” springs from the management of those missile development programs carried out by the Air Force, the United States Navy, and later the National Aeronautics and Space Administration (NASA). Tools used routinely throughout the project management world today—Program Evaluation Review Technique (PERT) and Critical Path Method (CPM) scheduling systems, Earned Value Management (EVM), and Cost/Schedule Control System Criteria (C/SCSC), for example—trace back directly to the work of the Air Force, the Navy, and NASA in those years.7

Clearly those days are gone. The Kaminski report cites compelling statistics that describe dramatic cost and schedule overruns in specific, individual programs.
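As an aside for readers unfamiliar with the scheduling tools named above: CPM reduces to a longest-path computation over the task dependency graph, and the critical path is the chain of tasks that determines the shortest possible project duration. The following is a minimal illustrative sketch only; the task names and durations are hypothetical and are not drawn from any program discussed in this report.

```python
from functools import lru_cache

# Hypothetical task data: name -> (duration in months, predecessor tasks).
TASKS = {
    "design":      (10, []),
    "tooling":     (6,  ["design"]),
    "prototype":   (8,  ["design"]),
    "flight_test": (9,  ["tooling", "prototype"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name):
    """Forward pass: earliest finish = own duration plus the latest
    earliest-finish among all predecessors (0 if none)."""
    duration, preds = TASKS[name]
    return duration + max((earliest_finish(p) for p in preds), default=0)

def critical_path():
    """Walk back from the latest-finishing task, always following the
    predecessor that binds (has the largest earliest finish)."""
    name = max(TASKS, key=earliest_finish)
    path = [name]
    while TASKS[name][1]:
        name = max(TASKS[name][1], key=earliest_finish)
        path.append(name)
    return list(reversed(path)), earliest_finish(path[0])

path, months = critical_path()
print(path, months)  # ['design', 'prototype', 'flight_test'] 27
```

Any slip in a critical-path task delays the entire project, which is why such scheduling analysis became central to managing the early missile programs.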
Taken all together, the picture for major system acquisition is no better: The time required to execute large, government-sponsored systems development programs has more than doubled over the past 30 years, and the cost growth has been at least as great.8

STATEMENT OF TASK AND COMMITTEE FORMATION

The Air Force requested that the National Research Council review current conditions and make recommendations on how to regain the technological expertise so characteristic of the Air Force’s earlier years. Such outside studies have long been part of the Air Force’s quest for improvement in technology.

4 Clarence L. “Kelly” Johnson. 1989. More Than My Share of It All. Washington, D.C.: Smithsonian Institution Scholarly Press.
5 Ibid.
6 For additional information, see http://www.fas.org/nuke/guide/usa/icbm/sm-65.htm. Accessed May 10, 2010.
7 For additional information, see http://www.mosaicprojects.com.au/PDF_Papers/P050_Origins_of_Modern_PM.pdf. Accessed May 10, 2010.
8 NRC. 2008. Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition. Washington, D.C.: The National Academies Press, p. 14.

For example, General Hap Arnold, in his autobiography, described the difficulties that he faced leading the pre-World War II United States Army Air Corps:

Still, in spite of our smallness and the perpetual discouragements, it was not all bad. Progress in engineering, development, and research was fine. At my old stamping grounds in Dayton, I found the Materiel Division doing an excellent job within the limits of its funds. [General Oscar] Westover was calling on the National Research Council for problems too tough for our Air Corps engineers to handle. …9

Early in the present study, the Air Force pointed to three elements of the existing acquisition process as examples of things that required improvement. First, evolutionary technology transition has suffered from less-than-adequate early (pre-Milestone A) planning activities that manifest themselves later in problems of cost, schedule, and technical performance. Second, revolutionary transition too often competes with evolutionary transition, as reflected in efforts to rush advanced technology to the field while failing to recognize and repair chronic underfunding of evolutionary Air Force acquisition efforts. Third, there appears to be no single Air Force research and development (R&D) champion designated to address these issues.

Although technology plays a part in all Air Force activities, from operations to sustainment and systems modification, the task for the present study was targeted at the development and acquisition of new major systems. Accordingly, this study focuses on how to improve the ability to specify, develop, test, and insert new technology into new Air Force systems, primarily pre-Milestone B. Box 1-1 contains the statement of task for this study.

BOX 1-1
Statement of Task

The NRC will:

1. Examine appropriate current or historical Department of Defense (DoD) policies and processes, including the Planning, Programming, Budgeting, and Execution System; DoD Instruction 5000.02; the Air Force Acquisition Improvement Plan; the Joint Capabilities Integration and Development System; and DoD and Air Force competitive prototyping policies, to comprehend their impact on the execution of pre-program of record technology development efforts.
2. Propose changes to the Air Force workforce, organization, policies, processes, and resources, if any, to better perform preacquisition technology development. Specific issues to consider include:
   a. Resourcing alternatives for pre-Milestone B activities
   b. The role of technology demonstrations
3. Study and report on industry/Government best practices to address both evolutionary (deliberate) and revolutionary (rapid) technology development.
4. Identify potential legislative initiatives, if any, to improve technology development and transition into operational use.

To address the statement of task, the Committee on Evaluation of U.S. Air Force Preacquisition Technology Development was formed. Biographical sketches of the committee members are included in Appendix A.

THE PARAMETERS OF THIS STUDY

The statement of task specifically requires assessment of relevant DoD processes and policies, both current and historical, and invites proposed changes to Air Force workforce, organization, policies, processes, and resources. Issues of particular concern include resourcing alternatives for pre-Milestone B activities and the role of technology demonstrations. Previous NRC studies, and studies by other groups, have addressed portions of the material covered in this report.10,11,12,13,14,15,16 Importantly, however, no previous report was expressly limited to addressing the topic of early-phase technology development that is the focus of this study.

Studies of major defense systems acquisition are certainly not in short supply. Over the previous half-century there have been literally scores of such assessments, and their findings are remarkably similar: Weapons systems are too expensive, they take too long to develop, and they often fail to live up to expectations. A central question for a reader of this report has to be: What makes this study any different?

The answer is, in a word, technology, and its development and integration into Air Force systems. From those early days of Hap Arnold, it was the capable development, planning, and use of technology that set the Air Force and its predecessors apart from the other services. That technological reputation needs to be preserved—some would say recaptured—if the Air Force is to continue to excel in the air, space, and cyberspace domains discussed later in this chapter.

One cause of this technological challenge is that, for a variety of reasons, the Air Force has lost focus on technology development over the past two decades. The Kaminski report makes clear that Air Force capabilities in the critical areas of systems engineering and Development Planning were allowed to atrophy. These declines had their origins in legislative actions, financial pressures, demographics, workforce development, and a host of other sources. But altogether, they led to a decline in the Air Force’s ability to successfully integrate technology into weapons systems in a timely and cost-effective manner.

This study was also commissioned at a time of increased interest in technology management outside the Air Force and beyond the DoD. Disappointed by acquisition programs that underperformed in terms of cost and schedule, Congress enacted the Weapon Systems Acquisition Reform Act of 2009 (WSARA; Public Law 111-23). One of WSARA’s major goals is to reduce the likelihood of future programmatic failure by reducing concurrency (i.e., the simultaneity of two or more phases of the DoD acquisition process), thus ensuring that systems and major subsystems are technologically mature before entering production. This congressional intent is apparent in WSARA’s emphasis on systems engineering and development planning, and in other mandates’ requirements for technology demonstrations, competitive prototypes, and preliminary design reviews earlier in the acquisition cycle, before costly system development and production decisions are made.

9 H.H. Arnold. 1949. Global Mission. New York, N.Y.: Harper, p. 165.
10 NRC. 2008. Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition. Washington, D.C.: The National Academies Press.
11 NRC. 2010. Achieving Effective Acquisition of Information Technology in the Department of Defense. Washington, D.C.: The National Academies Press.
12 Assessment Panel of the Defense Acquisition Performance Assessment Project. 2006. Defense Acquisition Performance Assessment Report. A Report by the Assessment Panel of the Defense Acquisition Performance Assessment Project for the Deputy Secretary of Defense. Available at https://acc.dau.mil/CommunityBrowser.aspx?id=18554. Accessed June 10, 2010.
13 Gary E. Christle, Danny M. Davis, and Gene H. Porter. 2009. CNA Independent Assessment. Air Force Acquisition: Return to Excellence. Alexandria, Va.: CNA Analysis & Solutions.
14 Business Executives for National Security. 2009. Getting to Best: Reforming the Defense Acquisition Enterprise. A Business Imperative for Change from the Task Force on Defense Acquisition Law and Oversight. Available at http://www.bens.org/mis_support/Reforming%20the%20Defense.pdf. Accessed June 10, 2010.
15 USAF. 2008. Analysis of Alternatives (AoA) Handbook: A Practical Guide to Analysis of Alternatives. Kirtland Air Force Base, N.Mex.: Air Force Materiel Command’s (AFMC’s) Office of Aerospace Studies. Available at http://www.oas.kirtland.af.mil/AoAHandbook/AoA%20Handbook%20Final.pdf. Accessed June 10, 2010.
16 DoD. 2009. Technology Readiness Assessment (TRA) Deskbook. Prepared by the Director, Research Directorate (DRD), Office of the Director, Defense Research and Engineering (DDR&E). Washington, D.C.: Department of Defense. Available at http://www.dod.mil/ddre/doc/DoD_TRA_July_2009_Read_Version.pdf. Accessed September 2, 2010.

COMMITTEE APPROACH TO THE STUDY

Throughout the study, the committee met with numerous Air Force stakeholders to gain a fuller understanding of the sponsor’s needs and expectations relating to the elements contained in the statement of task. The full committee met four times to receive briefings from academic, government, and industry experts in technology development, and it conducted a number of visits during which subgroups of the committee met with various stakeholders. The committee met two additional times to discuss the issues and to finalize its report. Appendix B lists specific meetings, individual participants, and participating organizations.

The almost absurd complexity of Figure 1-1 illustrates how daunting the DoD acquisition system is. A clearer image is not available. Given the incredible intricacy of the system, coupled with the relatively short time line of the study, the committee endeavored at the outset to distill this complexity into a basic touchstone mission statement: How to improve the Air Force’s ability to specify, develop, test, and insert new technology into Air Force systems.

THREE DOMAINS OF THE AIR FORCE

The mission of the United States Air Force is to fly, fight, and win … in air, space, and cyberspace.17

This mission statement, set forth in a joint September 2008 letter from the Secretary of the Air Force and the Air Force Chief of Staff, emphasizes the importance of all three domains in which the Air Force must operate. Each domain involves special considerations and challenges. Consequently, each of the three domains represents a unique environment in terms of science and technology (S&T) and major systems acquisition.

Air

The air domain is perhaps most frequently associated with Air Force major systems acquisition. It is characterized by relatively low (and declining) numbers of new major systems, with a relatively small number of industry contractors competing fiercely to win each new award. In this realm, relationships between government and industry tend to be at arm’s length and sometimes adversarial. The duration

17 “U.S. Air Force Mission Statement and Priorities.” September 15, 2008. Available at http://www.af.mil/information/viewpoints/jvp.asp?id=401. Accessed May 19, 2010.

[Figure 1-1: Integrated Defense Acquisition, Technology, and Logistics Life Cycle Management System chart (Defense Acquisition University, Version 5.4, 15 June 2010). The chart is a classroom aid providing a notional illustration of the interfaces among the three major decision support systems used to develop, produce, and field a weapon system for national defense: the Joint Capabilities Integration and Development System; the Planning, Programming, Budgeting, and Execution process; and the Defense Acquisition System. For more information, see the Defense Acquisition Portal (http://dap.dau.mil).]
Analyze/Assess Interpret User Needs, Interpret User Needs, Demo & Validate System SFR – System Functional Review Integrated DT&E/OT&E/LFT&E ISR – In-Service Review •Test Results •Service Use Data •Product Baseline •Data for In-Service Review Verification/ Validation Validation Acceptable SRR – System Requirements Review ISP – Information Support Plan Analyze Operational Analyze Operational Refine System Refine System Concepts Versus & Tech Maturity Versus Demonstrate System to •Exit Criteria •User Feedback •Failure Reports •Test Reports Validation •Input to CDD for next increment Linkage Linkage STA – System Threat Assessment ITR – Initial Technical Review Performance Specs & Performance Specs & Capabilities & Capabilities & Defined User Needs & Defined User Needs & Specified User Needs and Linkage •APB •CPD •SEP •TEMP •Discrepancy Reports •SEP •PESHE •TEMP •Modifications/upgrades to fielded SVR – System Verification Review Environmental Constraints Environmental Constraints Trades JITC – Joint Interoperability Test Command Environmental Constraints Environmental Constraints Environmental Constraints •Product Support Element •SEP TEMP – Test & Evaluation Master Plan Environmental Constraints Trades LFT&E – Live Fire Test & Evaluation •System Safety Analysis Environmental Constraints Trades systems TDS – Technology Development Strategy LORA – Level of Repair Analysis Requirements •System Safety Analysis •Inputs to: -IBR •SEP •Test Reports Technical TRA – Technology Readiness Assessment MTA – Maintenance Task Analysis •PESHE •Product Support Element - Cost/Manpower Est. 
•System Safety Analysis Trades Trades TRR – Test Readiness Review NEPA – National Environmental Policy Act Develop Concept Assess/Analyze Develop System Performance Trades System DT&E, OT&E, LFT&E Develop System Functional Develop System Functional Systems Engineering •System Safety Analysis Requirements - Product Support Package Demo/Model •Product Support Package Performance (& Constraints) Concept & Verify (& Constraints) Spec & & OAs Verify System Specs & Verification Plan to Verification Verification Specs & Verification Plan to Verification Integrated System Versus Test and Evaluation Enabling/Critical Tech & Functionality & Constraints Definition & Verification Linkage Linkage Evolve System Functional System Concept’s Linkage Evolve System Functional Performance Spec In-Service Prototypes Verification Plan Compliance to Specs Baseline Objectives Performance Baseline Supportability PCA Lighter blocks reflect technical Review Monitor & Collect Analyze Deficiencies efforts required if PDR is Verify and Validate Implement and Verification/ required after Milestone B. 
Service Use & Supply To Determine Corrective Production Validation Field Decompose Concept Assess/Analyze Develop Functional SFR SFR Evolve Functional Integrated DT&E, LFT&E & Demo System & Prototype Evolve Functional Actions Configuration Chain Performance Data Trades Linkage Verification Performance into System Concept Verification Definitions for Enabling/ Verification Performance Specs into EOAs Verify Performance Functionality Performance Specs into Linkage Linkage Functional Definition & Versus Functional Linkage Critical Tech/Prototypes & System Allocated Baseline Compliance to Specs Versus Plan System Allocated Baseline Verification Objectives Capabilities Associated Verification Plan Trades Analyze Data to: TRR PDR PDR Assess Risk of FMECA Validate Failures & Modify Configuration Improved System Decompose Functional Assess/Analyze FTA Evolve CI Functional Determine Root Causes Decompose Concept (Hardware/Software/Specs) Individual CI Demo Enabling/Critical Verification Verification Definitions into Critical Verification Enabling/Critical Specs into Product Functional Definition into To Correct Deficiencies RCM Verification Technology Components Linkage Linkage Linkage Component Definition & Components Versus (Build to) Documentation Component Concepts & DT&E Versus Plan Technologies Verification Plan Capabilities & Verification Plan Assessment Objectives LORA Determine CDR MTA Integrate and Test System Risk/ Corrective Action Hazard Severity Design/Develop System Concepts, Develop Component Concepts, Fabricate, Assemble, LFT&E (with alternate i.e., Enabling/Critical Technologies, i.e., Enabling/Critical Waiver Code to “Build-to” LFT&E Plan) Update Constraints, & Technologies, Constraints, (if appropriate) Documentation Cost Acronyms Cost/Risk Drivers & Cost/Risk Drivers • Process Change CARD – Cost Analysis Requirements Description Develop MSA Phase TD Phase Full Funding Full Funding Full Funding CCE – Component Cost Estimate - Operations/Maintenance 
Support Corrective CCP – Component Cost Position Fully Funded Fully Funded In FYDP In FYDP In FYDP • Materiel Change Economic Analysis (MAIS) Economic Analysis (MAIS) Action ICE – Independent Cost Estimate Economic Analysis (MAIS) Manpower Estimate (MDAP) Manpower Estimate (MDAP) Manpower Estimate (MDAP) - Hardware/Product Support Package MDAP – Major Defense Acquisition Program MAIS – Major Automated Information System Financial PMO – Program Management Office Affordability Affordability CCE CCP ICE CARD CARD CCE CCP ICE CARD CCE CCP ICE CARD CCE CCP ICE RDT&E – Research, Development, Test & Evaluation Assessment Assessment Management MDAP/MAIS MDAP MDAP MDAP/MAIS MDAP MDAP MDAP/MAIS MDAP MDAP MDAP/MAIS MDAP MDAP Cost Estimation Actual Costs Parametric Engineering Analogy Methods RDT&E – Advanced Technology Development RDT&E – Systems Development & Demonstration RDT&E – Advanced Component Development and Prototypes Types of Procurement Operations and PMO POM Input Funds Maintenance RDT&E – Management & Support PMO Budget Estimate Appropriated Funds To Support Contracts Military Planning, Programming, Budgeting, and Execution Acronyms FYDP – Future Years Defense Program PMO – Program Management Office Planning, Departments and POM/Budget Formulation POM/Budget Submit MBI – Major Budget Issues POM – Program Objectives Memorandum OMB – Office of Management and Budget DoD Testimony DoD Appeals Defense Agencies July Programming, Issue Nominations/Disposition Allocation Budgeting Defense Planning & SECDEF Option National Military Strategy Programming Guidance Integrated Program/Budget Office of the Decisions DoD Budget Apportionment MBI & Execution Review Secretary of Defense FYDP FYDP Fiscal Guidance August - November November December National Defense Strategy and Joint Staff updated updated Process April Authorization/ Authorization Budget Appropriation (annual- Appropriation President’s Budget to Committees Committees Committees OMB Acts Passed calendar-driven) 
White Congress National Security Strategy Fiscal Guidance Congressional Budget Process January House February (1st Monday) National Strategy Documents (updated as necessary) March February – September Authors Chuck Cochrane and Brad Brown For a single copy of this chart, send a request to daupubs@dau.mil Send recommendations to improve the content of this chart to wallchart@dau.mil FIGURE 1-1 Department of Defense acquisition process. SOURCE: Defense Acquisition University. Available at https://ilc.dau.mil/default_nf.aspx. Accessed June 17 11, 2010. 1-1.eps landscape 2/18/11 2:26 PM

Evaluation of U.S. Air Force Preacquisition Technology Development 18

… of acquisition programs tends to be long, often measured in decades, whereas “buy quantities” have declined dramatically over time.18

In the air domain, not all technology insertion takes place prior to the initial delivery of a system. Aircraft stay in the active inventory for far longer periods than in years past. For example, the newest B-52 is now 48 years old, and most KC-135 aerial refueling tankers are even older. The advancing age of such aircraft means that numerous carefully planned and executed technology insertions are required to upgrade and extend the lives of aging fleets.

These post-acquisition technology-based activities are in themselves both militarily necessary and economically significant, and they are increasingly characteristic of the air domain. The financial impacts of these activities are especially noteworthy. For example, the periodic overhauls of B-2 stealth bombers require 1 year and $60 million per aircraft. Yet the military imperatives leave the Air Force little choice:

“Although there is nothing else like the B-2, it’s still a plane from the 1980s built with 1980s technology,” said Peter W. Singer, a senior fellow at the Brookings Institution think tank. Other countries have developed new ways to expose the B-2 on radar screens, so the Air Force has to upgrade the bomber in order to stay ahead. “Technology doesn’t stand still; it’s always moving forward,” he said. “It may cost an arm and a leg, but you don’t want the B-2 to fall behind the times.”19

Major aeronautical systems are thus characterized by large expenditures for research and development, as well as for the initial procurement and periodic updates of the end items themselves. But the largest expenditures tend to be in operations and support (O&S) costs over the life of a system.
This is increasingly true as systems are kept in the inventory for longer periods: the initial purchase price of the Air Force’s B-52 bomber was about $6 million in 1962; that sum is dwarfed by the resources required to operate and support the bomber over the last half-century.20

Space

“Space is different.” This idea was raised repeatedly during the course of this study. The challenges of the space world are, in fact, significantly different from

18 Following, for example, are approximate buy quantities in the world of multi-engine bombers, from World War II to today: 18,000 B-24s; 12,000 B-17s; 4,000 B-29s; 1,200 B-47s; 800 B-52s; 100 B-1s; 21 B-2s. Similar purchasing patterns exist for fighter and cargo aircraft.

19 W.J. Hennigan. 2010. “B-2 Stealth Bombers Get Meticulous Makeovers.” Los Angeles Times, June 10. Available at http://articles.latimes.com/2010/jun/10/business/la-fi-stealth-bomber-20100610. Accessed June 22, 2010.

20 Information available at https://acc.dau.mil/CommunityBrowser.aspx?id=241468. Accessed May 18, 2010.

those in the air domain. Some of those differences are obvious. For example, space presents an extraordinarily unforgiving environment in which few “do-overs” are possible; by comparison, the air domain offers the luxury of maturing complex technology prototypes through a sequence of relatively rapid “fly-fix-fly” spirals during the development phase. This “spiral development process” for aircraft allows the refinement of complex technologies through responses to empirical observations. In contrast, the space domain offers few such opportunities.

Furthermore, in contrast with aircraft production lines, space systems tend to be produced in small quantities by skilled craftspeople. Additionally, in the space domain there are few if any “flight test vehicles,” in that every launch is an operational mission, and failures in the always-critical launch phase tend to be spectacular and irreversible. Therefore, space development programs rely on “proto-qualification” or engineering models and must “test like you fly” in order to maximize the opportunity for on-orbit mission success. Once on orbit, a space platform must work for its lifetime as designed, since the opportunities for in-space rework, repair, and refurbishment are limited.

In consideration of these factors, the space domain has led the way in developing measures of technological stability. With the stakes so high and with so little ability to rectify problems once a spacecraft is in operational use, space developers and operators have found it necessary to ensure that only tested and stable systems and components make their way into space.
Thus, NASA developed the concept of Technology Readiness Levels (TRLs) in the 1980s.21 Based on the idea that incorporating unproven technology into critical space systems was neither safe nor cost-effective, NASA used the seven-tiered TRL process (later expanded to nine tiers) to assess objectively the maturity and stability of components prior to placing them in space systems (illustrated in Figure 1-2). This TRL concept later spread to the military and commercial worlds and developed into an entire family of assessment tools: Manufacturing Readiness Levels, Integration Readiness Levels, and Systems Readiness Levels.

As with the air domain, the R&D phase in the space domain is expensive, as is the purchase of a space vehicle itself. Unlike with aircraft, however, the O&S costs tend to be relatively low throughout the life of space systems, with operational expenses generally limited to ground station management and communication with the space vehicle. According to Defense Acquisition University, O&S costs consume 41 percent of a fixed-wing aircraft’s life-cycle cost (LCC), but only 16 percent of the LCC for the average spacecraft.22

21 Additional information on TRL definitions is available through the NASA Web site at http://esto.nasa.gov/files/TRL_definitions.pdf. Accessed June 22, 2010.

22 Information available at https://acc.dau.mil/CommunityBrowser.aspx?id=241468. Accessed May 18, 2010.

System Test, Launch & Operations
  TRL 9: Actual system “flight proven” through successful mission operations
System/Subsystem Development
  TRL 8: Actual system completed and “flight qualified” through test and demonstration (Ground or Flight)
  TRL 7: System prototype demonstration in a space environment
Technology Demonstration
  TRL 6: System/subsystem model or prototype demonstration in a relevant environment (Ground or Space)
Technology Development
  TRL 5: Component and/or breadboard validation in relevant environment
  TRL 4: Component and/or breadboard validation in laboratory environment
Research to Prove Feasibility
  TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept
  TRL 2: Technology concept and/or application formulated
Basic Technology Research
  TRL 1: Basic principles observed and reported

FIGURE 1-2 Technology Readiness Level (TRL) descriptions. SOURCE: NASA, modified from http://www.hq.nasa.gov/office/codeq/trl/trlchrt.pdf. Accessed June 22, 2010.

Compared to an aircraft system that can be modified to extend its life for many years, a spacecraft has a finite life on orbit, limited by the operating environment of space and the amount of fuel onboard. As a result, space systems tend to be in a constant state of acquisition. As an example, the Global Positioning System (GPS) Program, managed by the Space and Missile Systems Center, is responsible for flying the current generation of satellites on orbit, for producing the next generation of satellites, and for developing the follow-on GPS system, all at the same time.

The space domain’s heavy reliance on Federally Funded Research and Development Centers (FFRDCs) is another characteristic that sets it apart from the air domain.
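The nine-tier scale in Figure 1-2 lends itself to a simple machine-readable representation. The sketch below is purely illustrative and is not part of any NASA or Air Force tool: it encodes the TRL descriptions from the figure and flags components that fall below an assumed maturity gate (the gate of TRL 6 and the component names are hypothetical).

```python
# Illustrative sketch only: encodes the TRL descriptions from Figure 1-2 and
# flags components below an assumed maturity gate (TRL 6 here is hypothetical).

TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental critical function and/or "
       "characteristic proof-of-concept",
    4: "Component and/or breadboard validation in laboratory environment",
    5: "Component and/or breadboard validation in relevant environment",
    6: "System/subsystem model or prototype demonstration in a relevant "
       "environment (Ground or Space)",
    7: "System prototype demonstration in a space environment",
    8: "Actual system completed and 'flight qualified' through test and "
       "demonstration (Ground or Flight)",
    9: "Actual system 'flight proven' through successful mission operations",
}

def immature_components(assessed, gate=6):
    """Return, sorted by name, the components whose assessed TRL is below the gate."""
    return sorted(name for name, trl in assessed.items() if trl < gate)

# Hypothetical assessment of three components of a notional spacecraft:
assessed = {"star tracker": 8, "novel thruster": 4, "flight software": 6}
print(immature_components(assessed))  # ['novel thruster']
```

A readiness review built on this idea would, of course, rest on formally defined evidence for each level rather than a bare integer; the dictionary merely shows how the tiers order technology maturity.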
The Aerospace Corporation has partnered with the Air Force since 1960 to provide five core technological competencies to the Air Force’s space efforts:

The Aerospace FFRDC provides scientific and engineering support for launch, space, and related ground systems. It also provides the specialized facilities and continuity of effort required for programs that often take decades to complete. This end-to-end involvement reduces development risks and costs, and allows for a high probability of mission success. The Department of Defense has identified five core competencies for the Aerospace FFRDC: launch certification, system-of-systems engineering, systems development and acquisition, process implementation, and technology application. The primary customers are the Space and Missile Systems Center of Air Force Space Command and the National Reconnaissance

Office, although work is performed for civil agencies as well as international organizations and governments in the national interest.23

Senior warfighters have raised concerns about the overall health of the space industrial base and its ability to meet the needs of U.S. national security. In particular, studies have pointed to inconsistent performance and reliability among third- and fourth-tier suppliers, many of which perform space contracts intermittently and cannot sustain design, engineering, and manufacturing capabilities in the absence of continual work.24,25,26,27,28

Cyberspace

In the early years of the Air Force’s space era, there was much uncertainty about roles and missions, about organizational structure, and about boundaries, policies, and processes. It took decades to resolve these matters, and in fact issues still arise from time to time; an example is the recently resolved issue of whether strategic missiles logically belong to the Air Combat Command, to the Global Strike Command, or to the Space Command.

That same level of uncertainty now characterizes the Air Force’s cyberspace efforts. For example, in August 2009, a joint letter from the Secretary of the Air Force (SAF) and the Chief of Staff of the Air Force (CSAF) countermanded previous guidance and set up a new command, the 24th Air Force, as “the Air Force service component to the USCYBERCOM [United States Cyber Command], aligning authorities and responsibilities to enable seamless cyberspace operations.”29

23 Information from the Aerospace Corporation. Available at http://www.aero.org/corporation/ffrdc.html. Accessed May 18, 2010.

24 Information from Booz Allen Hamilton. May 19, 2003. Available at www.boozallen.com/consulting/industries_article/659130. Accessed May 27, 2010.

25 Eric R. Sterner and William B. Adkins. 2010. “R&D Can Revitalize the Space Industrial Base.” Space News, February 22.
26 Aerospace Industries Association. 2010. Tipping Point: Maintaining the Health of the National Security Space Industrial Base. September. Available at http://www.aia-aerospace.org/assets/aia_report_tipping_point.pdf. Accessed January 29, 2011.

27 Jay DeFrank. 2006. The National Security Space Industrial Base: Understanding and Addressing Concerns at the Sub-Prime Contractor Level. The Space Foundation. April 4. Available at http://www.spacefoundation.org/docs/The_National_Security_Space_Industrial_Base.pdf. Accessed January 29, 2011.

28 Defense Science Board. 2003. Acquisition of National Security Space Programs. Washington, D.C.: Office of the Under Secretary of Defense (Acquisition, Technology, and Logistics). Available at http://www.globalsecurity.org/space/library/report/2003/space.pdf. Accessed January 29, 2011.

29 Available at http://www.24af.af.mil/shared/media/document/AFD-090821-046.pdf. Accessed May 18, 2010.

But as late as August 2010, much about Air Force efforts in cyberspace remained unresolved. As quoted from a 24th Air Force Web site:

At this time, we do not yet know the full complement of wings, centers and/or other units to be assigned to 24th Air Force. The organization of the required capabilities is still being determined. . . . The exact size of 24th Air Force is unknown at this time. . . . The final numbers for Headquarters 24th Air Force are yet to be determined.30

Similarly, there is much yet to be learned about systems acquisition in the cyberspace domain. However, some themes can be deduced.

The first theme is the need for speed and agility in a world where threats can arise in days, or even hours. Throughout the course of this study, considerable time was devoted to learning about rapid-reaction acquisition efforts in organizations such as Lockheed Martin’s Skunk Works, the Air Force’s Big Safari organization, and the Joint Improvised Explosive Device Defeat Organization (JIEDDO). Common to all of these efforts was a strong sense of urgency, with program durations often measured in weeks or months rather than in years and decades. But cyberspace reaction cycles are often even shorter, which for some raises the question: Is the term “major system acquisition” even relevant in the cyberspace domain?31

Program offices like Big Safari and JIEDDO highlight the need to keep pace with an agile and adaptive enemy; in such programs, rapid acquisition processes are vital to the safeguarding of military forces and thus to the national interest. The cyberspace domain has similarly short time horizons, and these can be expected to place special demands on the acquisition of cyberspace technology (see Figure 1-3).
A second likely theme is that success in cyberspace acquisition will depend on building and rewarding a culture of innovation, and in that sense it will require more risk tolerance and failure tolerance than are commonly found in bureaucratic organizations. In 2008, the Secretary of the Air Force, the Honorable Michael Wynne, said that a new cyberspace organization would need to encourage innovation from the bottom ranks to the top:

Calling innovation the top goal of the command, Wynne said the fledgling organization must be “more agile than any other in the Department of Defense” if it is to succeed at fighting in a climate where technology and tactics are often obsolete just months after being introduced. . . . to compete in the cyber battlefield, AFCYBER [Air Force Cyber] must be able to rapidly invent, develop and field new technologies, sometimes in a matter of weeks. To do this, the command will have to adopt a culture that encourages risk taking, especially among its young officers. . . . “Innovation will almost always come from the lower ranks,” he said, “from those who have not internalized any agreed upon ways of doing things.”

30 Available at http://www.24af.af.mil/questions/topic.asp?id=1666. Accessed August 25, 2010.

31 Jon Goding, Principal Engineering Fellow, Raytheon. 2010. “Improving Technology Development in Cyber: Challenges and Ideas.” Presentation to the committee, June 7, 2010.

[Figure 1-3 depicts a three-tier construct; the recoverable content of the machine-read chart is as follows.]

Ops & Innovation (Hrs – Weeks). Examples: modification of an existing Net Attack tool to optimize use against a specific target; purchase and/or modification of a Net Defense tool to provide defense against a newly detected virus.
Rapid (Weeks – Months). Examples: transition of mature AFRL technology to meet an urgent Net Defense tool need; contractor development of a new Net Attack tool to satisfy urgent tasking.
Foundational (Months – Years). Examples: Combat Info Transportation System (CITS); major modification to existing AF Network infrastructure.
Overarching processes are defined and controlled through the Foundational Tier as necessary to ensure configuration control.

FIGURE 1-3 Cyberspace agile acquisition construct recently adopted by the United States Air Force. NOTE: Program Manager (PM), Program Executive Officer (PEO), Service Acquisition Executive (SAE), Information Operations Wing (IOW). SOURCE: Air Force Materiel Command/Electronic Systems Center.

Wynne called on the command to foster officers who make mistakes and take risks that will lead to innovation. “Not only must you allow good ideas to percolate up, you must make your officers’ careers dependent upon demonstrating innovation. Being a flawless officer in Cyber Command should lead to early retirement.”32

A third probable characteristic of cyberspace acquisition is even closer collaboration among government, industry, and academic institutions, domestic and international. The FFRDC model discussed in the preceding subsection on the space domain is already a critical part of the cyberspace domain, as evidenced by the MITRE Corporation’s long-standing support of the Electronic Systems Center and the Software Engineering Institute’s support of the DoD.
The need for ready access to highly specialized cyberspace expertise is very similar to the type of needs found in the space domain. Additionally, commercial industry must be brought into the collaboration, as commercial communities (for example, financial services and computer security) have a long history of dealing with cyberspace security, integration, and testing, and thus can contribute usefully to the Air Force’s operational capability.

32 Inside the Air Force. Available at http://integrator.hanscom.af.mil/2008/June/06262008/06262008-08.htm. Accessed May 18, 2010.

Technology development for the air and space domains is driven by domestically based defense contractors, focusing largely on military applications. In the cyberspace world, however, the commercial market dwarfs that of the military, and both threats and resources are largely unaffected by political or geographic boundaries.33 Cyberspace acquisitions will therefore require much tighter cooperation between technology development communities, foreign and domestic, inside and outside the military-industrial complex. A recent paper from the Air Force illustrates some of the benefits of collaborative approaches to cyberspace acquisition:

A collaborative environment and integrated network that enables rapid reach-back into a broad and diverse array of cyber experts throughout the nation, giving the warfighter access to cutting edge technology and expertise that otherwise would be unavailable to the military. . . . A process to discover world-class cyber experts, who may be either unaware of the military cyberspace requirements or overlooked because they work for smaller, less-known firms.34

It may also be useful to consider that the state of the cyberspace domain has similarities to that of the nascent air domain circa 1910, or to the fledgling space world in 1960.

AIR FORCE SCIENCE AND TECHNOLOGY STRATEGIC PLANNING

During this study, technology transition was described as a “contact sport: every successful technology hand-off requires both a provider and a receiver.”35 The probability of a successful technology handoff increases when the provider and the receiver work together in a disciplined way to identify capability needs and match them to an S&T portfolio.
Approximately 30 percent of the Air Force Research Laboratory’s (AFRL’s) technology development efforts are “technology-push” efforts, driven by technologists who perceive how an emerging technology might enable a new operational capability in advance of a stated user need, as opposed to “technology-pull” efforts, or technology development done in response to a known capability need. The

33 Jon Goding, Principal Engineering Fellow, Raytheon. 2010. “Improving Technology Development in Cyber: Challenges and Ideas.” Presentation to the committee, June 7, 2010.

34 Available at http://www.docstoc.com/docs/30607347/The-Collaboration-Imperative-for-Cyberspace-Stakeholders. Accessed May 18, 2010.

35 Michael Kuliasha, Chief Technologist, Air Force Research Laboratory. 2010. “AFRL Perspective on Improving Technology Development and Transition.” Presentation to the committee, May 13, 2010.

technology-pull portion of AFRL technology development is motivated by user requirements, which originate from multiple sources, including Major Commands (MAJCOMs), Product Centers, and air logistics centers.36

In some cases, user needs are assessed and prioritized before they are provided to the AFRL. For example, Product Centers are again working with their warfighter partners to prioritize requirements, which are influenced, in part, by their understanding of technology enablers emerging from Air Force, industry, and university laboratories. These needs are prioritized within a particular MAJCOM and Product Center channel, but there is no mechanism today that can adequately filter and prioritize needs across the Air Force. To its credit, the Air Force recognizes this deficiency and is taking steps to develop a more robust corporate mechanism for technology needs assessment and prioritization.37

THE “THREE R” FRAMEWORK

At the beginning of the study, the committee found it useful to organize its thinking around simple axiomatic principles. This resulted in a framework incorporating the following: (1) Requirements, (2) Resources, and (3) the Right People, or the “Three Rs.” The framework is a concise and simple expression of unarguable criteria for successful program execution. If all three of these components are favorable, program success is possible. If any of the three is unfavorable, the program will most likely fail to deliver as expected. The framework is shown concisely in Box 1-2, and the principles are considered individually in the subsections below.

Requirements

The importance of clear, stable, feasible, and universally understood requirements has long been understood and has been validated by countless studies. Further, requirements need to be trade-off tolerant; that is, they need to be flexible enough to permit meaningful analysis of alternative solutions.
Inadequately defined requirements drive program instability, through late design changes that drive cost increases and schedule slips, which in turn lead to an erosion of political support for the program. These ripples do not end at the boundary of a problematic program: As costs rise and schedules slide, the impact is transferred to other programs, and they then bear the costs imposed to save the original troubled system. 36 Ibid. 37 Ibid. R01861 AF PTD--CS4 final.indd 25 2/18/11 2:26 PM

BOX 1-2
The "Three Rs"

Early in this study, the committee developed a framework, the "Three Rs," for organizing its findings and recommendations. The framework describes characteristics that, in the committee's judgment, need to be addressed fully in order for successful technology development to occur. That framework is composed of the following:

1. Requirements—clear, realistic, stable, trade-off tolerant, and universally understood;
2. Resources—adequate and stable, and including robust processes, policies, and budgets; and
3. The Right People—skilled, experienced, and in sufficient numbers, with stable leadership.

Our assessment is that the current requirements process does not meet the needs of the current security environment or the standards of a successful acquisition process. Requirements take too long to develop, are derived from Joint Staff and Service views of the Combatant Commands' needs, and often rest on immature technologies and overly optimistic estimates of future resource needs and availability. This fact introduces instability into the system when the lengthy and insufficiently advised requirement development process results in capabilities that do not meet warfighter needs or capabilities that are delivered "late-to-need."38

A second cause of difficulty in the area of requirements is that there can be a large disconnect between what the warfighter wants—"desirements," as expressed by one presenter to the committee—and what the laws of science permit. In those cases, overly optimistic estimates early in the project life can end up requiring miracles—or worse, sequential miracles—in order to become reality.
In the words of the Defense Acquisition Performance Assessment Report (called the DAPA report; commissioned by Acting Deputy Secretary of Defense Gordon England in June 2005):

Neither the Joint Capabilities Integration and Development System nor the Services requirement development processes are well informed about the maturity of technologies that underlie achievement of the requirement or the resources necessary to realize their development. No time-phased, fiscally and technically informed capabilities development

38 Assessment Panel of the Defense Acquisition Performance Assessment Project. 2006. Defense Acquisition Performance Assessment Report. A Report by the Assessment Panel of the Defense Acquisition Performance Assessment Project for the Deputy Secretary of Defense, p. 35. Available at http://www.frontline-canada.com/Defence/pdfs/DAPA-Report-web.pdf. Accessed January 29, 2011.

and divestment plan exists to guide and prioritize the development and understanding of weapon system requirements.39

In sum, then, a successful program requires vigilant management of the requirements process. The Government Accountability Office (GAO) summed it up well in its 2010 report Defense Acquisitions: Strong Leadership Is Key to Planning and Executing Stable Weapon Programs, which studied 13 successful acquisition programs and drew lessons from those successes:

The stable programs we studied exhibited the key elements of a sound knowledge-based business plan at program development start. These programs pursued capabilities through evolutionary or incremental acquisition strategies, had clear and well-defined requirements, leveraged mature technologies and production techniques, and established realistic cost and schedule estimates that accounted for risk. They then executed their business plans in a disciplined manner, resisting pressures for new requirements and maintaining stable funding. The programs we reviewed typically took an evolutionary acquisition approach, addressing capability needs in achievable increments that were based on well-defined requirements. To determine what was achievable, the programs invested in systems engineering resources early on and generally worked closely with industry to ensure that requirements were clearly defined. Performing this up-front requirements analysis provided the knowledge for making trade-offs and resolving performance and resource gaps by either reducing the proposed requirements or deferring them to the future. The programs were also grounded in well-understood concepts of how the weapon systems would be used.40

Resources

As is the case with requirements, stability of resources is essential to program success.
Turbulence in any of the following areas—technology, budgets, acquisition regulation, legislation, policy, or processes—contributes to program failure, as the resulting uncertainty deprives government and industry of the ability to execute programs as planned. One key area is technological maturity. The GAO has examined the importance of technological maturity in predicting program success. In 1999, it studied 23 successful technology efforts in both government and commercial projects, concluding that the use of formal approaches to assess technological maturity, such as the NASA-developed Technology Readiness Level (TRL) system discussed elsewhere in this report, was crucial to program success. As stated in the 1999 GAO report:

[D]emonstrating a high level of maturity before new technologies are incorporated into product development programs puts those programs in a better position to succeed. The

39 Ibid., p. 36.
40 GAO. 2010. Defense Acquisitions: Strong Leadership Is Key to Planning and Executing Stable Weapon Programs. Washington, D.C.: GAO, p. 16. Available at http://www.gao.gov/new.items/d10522.pdf. Accessed June 11, 2010.

TRLs, as applied to the 23 technologies, reconciled the different maturity levels with subsequent product development experiences. They also revealed when gaps occurred between a technology's maturity and the intended product's requirements. For technologies that were successfully incorporated into a product, the gap was recognized and closed before product development began, improving the chances for successful cost and schedule outcomes. The closing of the gap was a managed result. It is a rare program that can proceed with a gap between product requirements and the maturity of key technologies and still be delivered on time and within costs.41

Additional emphasis on the achievement of technological maturity was mandated in the Weapon Systems Acquisition Reform Act of 2009 (WSARA), which requires, among many other provisions, that Major Defense Acquisition Programs (MDAPs), prior to Milestone B, carry out competitive prototyping of the system or of critical subsystems and complete their Preliminary Design Review.42

Similar to the need for technological maturity, there must be stability and predictability in the financial resources available to a program manager. During this study, frequent reference was made to the Valley of Death, that graveyard for technology development efforts that survive early exploratory R&D phases but then fall victim to a lack of funding for bridging the gap to the system development and production phases.43 With its longer time horizons, the DoD's Planning, Programming, Budgeting, and Execution System is ill equipped to handle problems like the Valley of Death, or the sorts of rapid acquisitions that are often required today.
A 2006 study from the Center for Strategic and International Studies focused on joint programs, but its message applies to all acquisition efforts:

The current Planning, Programming, Budgeting, and Execution System (PPBES) resource allocation process is not integrated with the requirements process and does not provide sufficient resources for joint programs, especially in critical early stages. Disconnects between and perturbations in resource planning and requirements planning frequently result in program funding instability. Such instability increases program costs and triggers schedule slippages across DoD acquisition programs. Chronic under-funding of joint programs is endemic to the current resource allocation system.44

41 GAO. 1999. Best Practices: Better Management of Technology Development Can Improve Weapon System Outcomes. GAO/NSIAD-99-162. Washington, D.C.: General Accounting Office, p. 3. Available at http://www.gao.gov/archive/1999/ns99162.pdf. Accessed June 11, 2010.
42 Weapon Systems Acquisition Reform Act of 2009 (Public Law 111-23, May 22, 2009).
43 Dwyer Dennis, Brigadier General, Director, Intelligence and Requirements Directorate, Headquarters Air Force Materiel Command, Wright-Patterson Air Force Base, Ohio. 2010. "Development Planning." Presentation to the committee, March 31, 2010.
44 David Scruggs, Clark Murdock, and David Berteau. 2006. Beyond Goldwater Nichols: Department of Defense Acquisition and Planning, Programming, Budgeting, and Execution System Reform. Washington, D.C.: Center for Strategic and International Studies. Available at http://csis.org/files/media/csis/pubs/bgnannotatedbrief.pdf. Accessed June 11, 2010.

Among the most critical resources are robust processes, from the very conception of a program. For both government and industry, well-defined and well-understood work processes in all phases of program management are essential to successful technological development. Repeatedly during this study, evidence was presented that within the Air Force some of these processes have been diluted in significant ways in the past decade and are only now beginning to be reinvigorated.

In particular, there was general agreement on the decline of the systems engineering field.45 Systems engineering is now being revived and received additional attention in the 2009 WSARA legislation. But once a field has been allowed to atrophy, for whatever reason, the redevelopment of that capability is a long and arduous task.

A similar situation exists with the field of Development Planning. For decades, Product Centers had DP functions ("Product Center Development Planning Organizations," or, in headquarters shorthand, XRs) that worked with warfighter commands to address alternatives to meet future needs. These offices operated in the early conceptual environs, pre-program of record, to help a using command clarify its requirements, assess the feasibility of alternatives, and settle on the preferred way to meet those requirements. Often, the DP resources that worked on the early stages of a program were later used to form the initial cadre of a program office, if indeed one was ultimately established.

As with the rebirth of systems engineering, the disestablishment of Development Planning is being rectified. Product Center DP directors provided valuable input to this study, and although it is clear that their function is being reborn, it is equally obvious that a capability can be eliminated quickly by one decision but can only be revived with time and with great difficulty.
The Right People

The third critical element for a successful program is the right people: program managers and key staff with the right skills, the right experience, and in the right numbers to lead programs successfully. This category also includes the right personnel policies and "right" cultures, which can contribute to program success.

The acquisition workforce has been buffeted by change for decades. Every acquisition setback has generated a new round of "fixes," which by now have so constrained the system that it is to some a wonder that it functions at all. This workforce has been downsized, outsourced, and reorganized to the point of distraction, yet there is little or no evidence to suggest that discernible improvements have resulted.

This disruption in the acquisition workforce was well recognized by those close to it. Beginning in the early 1990s, many of the best and brightest Air Force acquisition professionals chose to retire—many of them early—to take jobs with advisory and assistance services (A&AS) contractors. As these highly competent and experienced performers left, they were often not replaced, and so an enormous "bathtub" developed: Air Force acquisition specialties became understaffed, and many of the people who did remain were either in very senior oversight positions or were very junior and, lacking mentors, very inexperienced. The middle of the force, the journeymen and junior managers, quite literally disappeared.

This was recognized by the authors of the DAPA report.46 Released in 2006, that report accurately described the state of the acquisition workforce:

Key Department of Defense acquisition personnel who are responsible for requirements, budget and acquisition do not have sufficient experience, tenure and training to meet current acquisition challenges. Personnel stability in these key positions is not sufficient to develop or maintain adequate understanding of programs and program issues. System engineering capability within the Department is not sufficient to develop joint architectures and interfaces, to clearly define the interdependencies of program activities, and to manage large scale integration efforts. Experience and expertise in all functional areas [have] been devalued and contribute to a "Conspiracy of Hope" in which we understate cost, risk and technical readiness and, as a result, embark on programs that are not executable within initial estimates. This lack of experience and expertise is especially true for our program management cadre.

45 For a thorough discussion of the importance of systems engineering, see the Kaminski report—NRC. 2008. Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Systems Acquisition. Washington, D.C.: The National Academies Press.
The Department of Defense exacerbates these problems by not having an acquisition career path that provides sufficient experience and adequate incentives for advancement. The aging science and engineering workforce and declining numbers of science and engineering graduates willing to enter either industry or government will further enforce the negative impact on the Department's ability to address these concerns. With the decrease in government employees, there has been a concomitant increase in contract support with resulting loss of core competencies among government personnel.47

In May 2009, 3 years after the DAPA release and after being rocked by two major failed source selections in the previous year, the Air Force released its Acquisition Improvement Plan.48 It cited five shortcomings of the acquisition process, all of

46 Assessment Panel of the Defense Acquisition Performance Assessment Project. 2006. Defense Acquisition Performance Assessment Report. A Report by the Assessment Panel of the Defense Acquisition Performance Assessment Project for the Deputy Secretary of Defense. Available at http://www.frontline-canada.com/Defence/pdfs/DAPA-Report-web.pdf. Accessed January 29, 2011.
47 Ibid., p. 29.
48 USAF. 2009. Acquisition Improvement Plan. Washington, D.C.: USAF. Available at http://www.dodbuzz.com/wp-content/uploads/2009/05/acquisition-improvement-plan-4-may-09.pdf. Accessed June 11, 2010.

which pointed, in whole or in part, to failures in the human side of the acquisition enterprise:

1. Degraded training, experience and quantity of the acquisition workforce;
2. Overstated and unstable requirements that are difficult to evaluate during source selection;
3. Under-budgeted programs, changing of budgets without acknowledging impacts on program execution, and inadequate contractor cost discipline;
4. Incomplete source selection training that has lacked "lessons learned" from the current acquisition environment, and delegation of decisions on leadership and team assignments for MDAP source selections too low; and
5. Unclear and cumbersome internal Air Force organization for acquisition and Program Executive Officer (PEO) oversight.49

Clearly the two failed source selections had been a major blow to the Air Force's reputation in acquisition. The Acquisition Improvement Plan closes with a call to recapture the successes of yesterday:

We will develop, shape, and size our workforce, and ensure adequate and continuous training of our acquisition, financial management, and requirements generation professionals. In so doing we will re-establish the acquisition excellence in the Department of the Air Force that effectively delivered the Intercontinental Ballistic Missile; the early reconnaissance, weather, and communication satellites; the long-range bombers like the venerable B-52; and fighters like the ground-breaking F-117A. . . .50

Those steps are under way.
Evidence was presented during this study indicating that expedited hiring was being used to fill empty positions, in accordance with the DAPA recommendations.51 However, the redevelopment of a skilled and experienced workforce is in some ways reminiscent of the challenges facing those seeking to reinvigorate systems engineering or Development Planning: Similar to what was seen with those critical processes, a skilled workforce can shrink quickly, yet will take decades or more to rebuild and mature.

REPORT ORGANIZATION

The remainder of this report is structured as follows, to correspond to the four main paragraphs of the statement of task. Chapter 2, "The Current State of

49 Ibid.
50 Ibid., p. 14.
51 Assessment Panel of the Defense Acquisition Performance Assessment Project. 2006. Defense Acquisition Performance Assessment Report. A Report by the Assessment Panel of the Defense Acquisition Performance Assessment Project for the Deputy Secretary of Defense. Available at https://acc.dau.mil/CommunityBrowser.aspx?id=18554. Accessed June 11, 2010.

the Air Force's Acquisition Policies, Processes, and Workforce," addresses the first paragraph of the statement of task. Chapter 3, "Government and Industry Best Practices," addresses the third paragraph of the statement of task. Chapter 4, "The Recommended Path Forward," responds to the second and fourth paragraphs of the statement of task. Importantly, the committee chose to present its findings in Chapters 2 and 3, and the associated recommendations (plus the reiterated relevant findings from the earlier chapters) are consolidated in Chapter 4. Finally, Appendixes C and D provide background information related to the subjects addressed in Chapter 2.