1
Introduction

DEFINITIONS OF THE TERM “IT SYSTEM”

The statement of task for this study calls for an examination of the acquisition and the test and evaluation (T&E) processes as specifically applied to information technology (IT) in the Department of Defense (DOD). At the outset of the study, the Committee on Improving Processes and Policies for the Acquisition and Test of Information Technologies in the Department of Defense discovered that the term “IT system” was used in different ways by briefers to the committee as well as among members of the committee itself. Further investigation showed that the DOD provides no specific definition of “IT system” per se (see Box 1.1 for relevant examples). For purposes of this study, the committee decided to consider IT systems to be just those systems that support the DOD “information enterprise” (see definition in Box 1.1), but excluding IT embedded in weapons systems and in DOD-unique hardware. In particular, the term as used by the committee signifies systems expected to run on or interface with existing infrastructure and systems that are user-facing; moreover, “IT system” as used by the committee means systems that are delivered through the acquisition process (and not systems “homegrown” in individual commands).

The committee subdivided IT systems as specified above into two categories that differ in terms of development requirements, technical characteristics, and risk:



Copyright © National Academy of Sciences. All rights reserved.



•	Software development and commercial off-the-shelf integration (SDCI) programs—those that focus on the development of new software to provide new functionality or on the development of software to integrate commercial off-the-shelf (COTS) components, and
•	COTS hardware, software, and services (CHSS) programs—those that are focused exclusively on COTS hardware, software, or services without modification for DOD purposes (that is, the capabilities being purchased are determined solely by the marketplace and not by the DOD).

BOX 1.1
Definitions Related to the Term “IT System” in Department of Defense Directives

Department of Defense Instruction (DODI) 5000.2, published most recently in December 2008, defines the authoritative DOD acquisition process. The terms “IT system,” “information technology system,” and “information system” are not explicitly defined in DODI 5000.2, although the term “IT system” is used in several places, as is the term “information system.” An “automated information system (AIS)” is defined as follows:

	A system of computer hardware, computer software, data or telecommunications that performs functions such as collecting, processing, storing, transmitting, and displaying information. Excluded are computer resources, both hardware and software, that are:
	a. an integral part of a weapon or weapon system;
	b. used for highly sensitive classified programs (as determined by the Secretary of Defense);
	c. used for other highly sensitive information technology programs (as determined by the ASD(NII)/DOD CIO) [Assistant Secretary of Defense for Networks and Information Integration/DOD Chief Information Officer]; or
	d. determined by the USD(AT&L) [Under Secretary of Defense for Acquisition, Technology and Logistics] or designee to be better overseen as a non-AIS program (e.g., a program with a low ratio of RDT&E [research, development, test, and evaluation] funding to total program acquisition costs or that requires significant hardware development).1

This definition focuses on characteristics relevant to the matter of who manages acquisition oversight for various types of programs based on the application, funding, or sensitivity of the program.

DOD Directive (DODD) 8000 specifies oversight responsibilities for DOD information-management activities and supporting information technology, implementing provisions of the Information Technology Management Reform Act of 1996 (part of the National Defense Authorization Act for Fiscal Year 1996, Public Law 104-106). “Information technology” is defined in the directive as follows:

	Any equipment or interconnected system or subsystem of equipment, used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information by the executive agency, if the equipment is used by the executive agency directly or is used by a contractor under a contract with the executive agency that requires the use of that equipment; or of that equipment to a significant extent in the performance of a service or the furnishing of a product. Information technology includes computers, ancillary equipment, software, firmware and similar procedures, services (including support services), and related resources; but does not include any equipment acquired by a Federal contractor incidental to a Federal contract.2

This definition of information technology has, unfortunately, too often been interpreted as focused on communications hardware, although its scope is clearly broader. As a result, the committee chose as its point of departure for this study the definition of the “DOD information enterprise” provided in the glossary of DODD 8000.1:

	Department of Defense Information Enterprise. The DOD information resources, assets, and processes required to achieve an information advantage and share information across the Department of Defense and with mission partners. It includes: (a) the information itself and the Department’s management over the information life cycle; (b) the processes, including risk management, associated with managing information to accomplish the DOD mission and functions; (c) activities related to designing, building, populating, acquiring, managing, operating, protecting, and defending the information enterprise; and (d) related information resources such as personnel, funds, equipment, and IT, including national security systems.3

1 DOD Instruction 5000.2, “Operation of the Defense Acquisition System,” 2008, p. 33.
2 DOD Directive 8000.1, “Management of the Department of Defense Information Enterprise,” 2009, p. 11.
3 DOD Directive 8000.1, 2009, p. 10.

EFFECTIVE APPROACHES TO INFORMATION TECHNOLOGY IN THE COMMERCIAL SECTOR

The information age has ushered in an era of personalized products and services built on standard, massively replicable platforms—a powerful combination of centrally supported IT and end-user-driven IT (which generally relies on centrally managed IT to provide at least some of the underlying computing, storage, and communications capabilities). The result has been an ever-increasing empowerment of individuals and

organizations, giving them the ability to innovate their technical capabilities, their business processes, and their own product and service offerings. Accompanying this empowerment has been a rising set of expectations for the performance of the information technology foundations through which these expectations are met. Hence the environment for delivering capability has become increasingly competitive, with emergent, tailored solutions for certain kinds of problems realized in days and months, sometimes by the customers themselves.

How are commercial IT market leaders managing these demands? They are doing so by instituting standardization and discipline at the heart of their respective IT enterprises while enabling agile, customer-led innovation at the edge of these enterprises. Most large IT providers have developed highly reliable, available, and scalable computing environments as the backbone of their product and service offerings. Consider search engines, commodity trading platforms, online auctions, and online marketplaces. All of these are based on commodity hardware and software that have been integrated to provide uninterrupted, extensible computing power, in many cases around the globe. These platforms are defined and their interfaces are exposed, at least internally, with an emphasis on interface stability and longevity.1,2 In some cases, the platform interfaces are exposed and accessed externally.3,4

Exposed, stable interfaces enable customers to apply computing power in new and unanticipated ways without compromising configuration control by the service provider or hindering the overall customer experience. Because robust interface points are exposed, customers can elect (or build) their own uniquely tailored experiences, thereby enjoying high satisfaction themselves and providing a reliable business base for the supplier.
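The stable-interface pattern described above can be sketched in a few lines of code. Everything here (the class names, the submit method, the version string) is a hypothetical illustration of the principle, not any real platform's API:

```python
# Minimal sketch of a stable platform interface: the provider exposes a
# small, versioned boundary; internal implementation details can change
# without breaking customers who build on it. All names are hypothetical.

class ComputePlatform:
    """Stable, exposed interface point offered to customers."""

    API_VERSION = "1.0"  # the provider commits to stability at this boundary

    def submit(self, job: str) -> str:
        # Scheduling, replication, and hardware details stay hidden behind
        # this signature; the provider retains configuration control.
        return f"result-of:{job}"


# A customer builds a uniquely tailored experience on top of the stable
# interface, in ways the provider did not have to anticipate.
class CustomerDashboard:
    def __init__(self, platform: ComputePlatform):
        self.platform = platform

    def run_report(self, name: str) -> str:
        return self.platform.submit(f"report:{name}").upper()


dashboard = CustomerDashboard(ComputePlatform())
print(dashboard.run_report("sales"))  # prints "RESULT-OF:REPORT:SALES"
```

The point of the sketch is the division of control: the provider is free to re-implement everything behind submit(), while edge customers keep building tailored experiences against the unchanged interface.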
The perception—and sometimes the reality—is that customer-led innovation is a “free-for-all” at the edge. Indeed, in many cases, consumer-facing providers cannot—or do not seek to—control the edge because their market is so diverse. However, this is not the general case for most enterprises. Many companies successfully pursue customer-led innovation as a principal means of driving the company’s evolution while doing so in a methodical, managed way. Integrating

1 Luiz Andre Barroso, J. Dean, and U. Holzle, “Web Search for a Planet: The Google Cluster Architecture,” IEEE Micro 23(1):22-28, April/May 2003.
2 Anand Gangadharan, “eBay Platform Roadmap,” eBay Devcon 2009, June 2009, San Jose, Calif.
3 Association for Computing Machinery, “A Conversation with Werner Vogels, Learning from the Amazon Technology Platform,” ACM Queue 4(4):14-17, May 2006.
4 Tom Killalea, “Building Scalable Web Services: Build Only What You Really Need,” ACM Queue 6(6):10-13, October 2008.

the customer overtly into the product or service evolution is viewed as essential to success.5,6,7 At the same time, customer-led innovation has not resulted in delivery that is completely customer-driven: commercial developers still have their own rhythms for delivering features, and customers must wait for releases.

For both centrally defined and edge-defined IT, many successful commercial IT suppliers have organized development around two key principles: (1) portfolio management8 and (2) development by small teams employing agile software-development methods.9,10 Portfolio management is a formal process whereby limited resources are strategically allocated to a subset of possible projects. Project risk, overall objectives, costs, benefits, and project interdependencies are all weighed, and a corporate-level decision is rendered on strategic investments. Implemented properly, portfolio management is an agile management tool that can accept current, real-world data and quickly evaluate and recommend changes to the portfolio.

The use of small teams employing agile methods has many advantages. Among them is the minimal enterprise expense that is incurred prior to the engagement of the first users and to all subsequent releases until a business base is established. If a product or service fails to meet business objectives at any point in its evolution, it can be canceled or redirected at relatively low cost.11 The agile approach is one specific approach to software development within a larger category known as iterative, incremental development (IID). A survey article12 on the history of IID chronicles a long succession of major technology programs that have successfully used IID, including the X-15 hypersonic aircraft program and the application of IID methods to software projects on NASA’s Project Mercury.

By the 1970s, IID was more widely applied to major software projects at selected major prime government contractors, including TRW and

5 Nanette Byrnes, “Xerox Refocuses on Its Customers,” Business Week, April 18, 2007.
6 “Lego Mindstorms Advanced User Tools.” Available at http://mindstorms.lego.com/Overview/NXTreme.aspx; accessed June 26, 2009.
7 “National Instruments’ LabView.” Available at http://zone.ni.com/dzhp/app/main; accessed June 26, 2009.
8 M.W. Dickinson, A.C. Thornton, and S. Graves, “Technology Portfolio Management: Optimizing Interdependent Projects over Multiple Time Periods,” IEEE Transactions on Engineering Management 48(4):518-527, November 2001.
9 Ade Miller and Eric Carter, “Agility and the Inconceivably Large,” pp. 304-308 in Proceedings of AGILE 2007, IEEE Computer Society, Washington, D.C., 2007.
10 Association for Computing Machinery, “A Conversation with Werner Vogels,” 2006.
11 Lan Cao and Balasubramaniam Ramesh, “Agile Requirements Engineering Practices: An Empirical Study,” IEEE Software 25(1):60-67, January/February 2008.
12 Craig Larman and V.R. Basili, “Iterative and Incremental Development: A Brief History,” IEEE Computer 36(6):47-56, June 2003.

IBM. The 1980s and 1990s saw significant evolution in IID approaches, and in 2001 the first text on the subject, Agile Software Development, by Alistair Cockburn, was published.13 A more in-depth discussion of IID is provided in Chapter 3 of this report. This chronology situates agile and related approaches within a broader context and also demonstrates that IID has a long history of being applied successfully to different types and scales of problems both in the DOD and in the commercial sector.

THE DEFENSE ACQUISITION SYSTEM

The complex Defense Acquisition System (DAS) has three major components, defined as follows:

•	The Joint Capabilities Integration and Development System (JCIDS) is aimed at identifying, assessing, and prioritizing joint military capability needs. The Joint Staff and the Joint Requirements Oversight Council champion it.14
•	The Planning, Programming, Budgeting and Execution System (PPBES) allocates resources to capabilities deemed necessary to accomplish the DOD’s missions. The Under Secretary of Defense, Comptroller champions it.15
•	The Defense Acquisition Management System (DAMS) establishes the “management framework for translating capability needs and technology opportunities, based on approved capability needs, into stable, affordable, and well-managed acquisition programs that include weapon systems, services, and automated information systems.” The Under Secretary of Defense for Acquisition, Technology and Logistics (USD AT&L) champions the DAMS.16

Each of these components is discussed in more detail in Appendix A. The inherent difficulties in synchronizing these three DAS components have implications for all types of acquisition programs, including those delivering IT systems. The January 2006 report of the Defense Acquisition Performance Assessment (DAPA) project concluded that “the budget, acquisition and requirements processes [of the Department of Defense] are not connected organizationally at any level below the Deputy Secretary of Defense.”17 The DAPA panel specifically considered the impact of this disconnect on DOD software-related programs and made a number of recommendations aimed at addressing the problems.

The present committee’s report is focused largely on the DAMS component of the DAS. The PPBES is a well-established process, and its demands are largely predictable. The JCIDS requirements are sufficiently general to provide the necessary flexibility, and they are integrated with the existing DAMS. The committee believes that the present Defense Acquisition Management System constitutes a significant challenge to the successful acquisition of IT programs and that changing it represents a promising opportunity to improve the performance of these programs. Moreover, the committee believes that these changes can be successfully integrated with the other existing components of the DAS. The DAMS thus constitutes the focus of this report, although the changes proposed by the committee may also have implications for the JCIDS and PPBES components.

13 A. Cockburn, Agile Software Development, Addison-Wesley, Boston, Mass., 2001.
14 Defense Acquisition University, JCIDS Definition. Available at http://www1.dau.mil/; accessed June 2009.
15 Defense Acquisition Guidebook, Section 1.2, December 2004. Available at https://akss.dau.mil/dag/guidebook/IG-c1.2.asp; accessed June 2009.
16 DOD Instruction 5000.2, “Operation of the Defense Acquisition System,” 2008, Paragraph 1.b.

RESULTS OF CURRENT ACQUISITION PROCESSES AND PRACTICES FOR INFORMATION TECHNOLOGY SYSTEMS

The committee received a briefing from the Office of the Assistant Secretary of Defense (Networks and Information Integration) (OASD [NII]) regarding the time that a set of major automated information system (MAIS) programs took to progress through the DOD acquisition system. The set was composed of 23 MAIS programs (3 of which were labeled as extensions of existing programs) that were initiated in fiscal year (FY) 1997 or later and that were completed or discontinued by early 2009.
The presentation provided summary charts, and the OASD (NII) later provided the committee with the underlying data.18 This data set gives the dates on which each program started and completed the following phases in the acquisition cycle: the analysis of alternatives (AoA), the economic analysis (EA), engineering and manufacturing development (which begins following Milestone B [MS B]), and the achievement of initial operating capability (IOC). Some programs started a phase without completing previous phases, and some programs completed a phase without continuing to the next phase. In the figures and table in this chapter, those programs that entered

17 Assessment Panel of the Defense Acquisition Performance Assessment Project, Defense Acquisition Performance Assessment Report, Department of Defense, Washington, D.C., January 2006.
18 Timothy J. Harp, Deputy Assistant Secretary of Defense (C3ISR & IT Acquisition), “Information Technology Acquisition,” presentation to the committee, Washington, D.C., February 25, 2009; and Timothy J. Harp, personal communication to the committee.

the acquisition process at AoA are labeled A to H. (These labels are used rather than the program names because the objective of the analysis was to establish time lines rather than to examine issues associated with individual programs.) The programs that entered EA without first completing an AoA are labeled AA to DD. The programs that started at MS B are labeled AAA to HHH.

Eight programs started the acquisition process at the AoA phase (pre-Milestone B). Figure 1.1 indicates the time in months for each program to complete its AoA. The average time for these programs to complete their AoA was 11 months; the median was 13 months. Five of these programs (those labeled “A,” “B,” “C,” “D,” and “H”) went beyond AoA completion.

[Figure 1.1: bar chart of time to complete (months) by MAIS program label]
FIGURE 1.1 Time taken to complete the analysis of alternatives (AoA) for the eight major automated information system (MAIS) programs that started the acquisition process at the AoA phase. NOTE: See the accompanying text for an explanation of the program labels. SOURCE: Compiled by the committee from data provided by the Department of Defense for 23 MAIS programs initiated in FY 1997 or later and completed or discontinued by early 2009.

Nine programs in this data set completed their economic analysis (Figure 1.2). Five of these nine were continuations of efforts shown in Figure 1.1. The five programs in common in Figures 1.1 and 1.2 took an average of 28 months and a median of 30 months to complete both phases—roughly 2½ years. The remaining four programs reflected in Figure 1.2 entered EA without first completing an AoA. Overall the average time for programs in this data set to complete the EA was 30 months; the median was 20 months.

[Figure 1.2: bar chart of time to complete (months) by MAIS program label]
FIGURE 1.2 Time taken to complete the economic analysis phase (AoA completion to Milestone B) for major automated information system (MAIS) programs during FY 1997 to early 2009. NOTE: See the accompanying text for an explanation of the program labels. SOURCE: Compiled by the committee from data provided by the Department of Defense.

Figure 1.3 shows the time that it took for 13 programs to go from Milestone B to a successful initial operating capability. Most of these programs entered the acquisition process at Milestone B. Two of these programs (labeled “A” and “D”) completed all three phases of the acquisition process and are represented in all three figures. These programs took a total of 58 months (for “A”) and 64 months (for “D”) to reach IOC. Overall the average time for programs in this data set to go from Milestone B to IOC was 53 months; the median was 43 months.

Table 1.1 shows the average and median times required across all three acquisition phases to reach IOC. Although it is not mathematically accurate simply to add the averages or medians shown here, these statistics suggest that 6 to 8 years could be required to complete the entire acquisition process and reach IOC. Note that oversight attention is generally believed to have increased over the period of time represented in this data set and analysis, suggesting that the time to IOC may be even longer for more recent programs (and for programs in the future) than these averages suggest.
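The caveat above (that per-phase averages and medians cannot simply be added) can be illustrated with a small sketch. The phase durations below are hypothetical, not the MAIS data; the point is that the sum of per-phase medians generally differs from the median of per-program totals, and in the actual data set the programs present in each phase differ as well, so even averages need not add:

```python
import statistics

# Hypothetical phase durations in months (illustrative only, not the MAIS data).
# Each entry: program label -> (AoA months, EA months, MS B-to-IOC months).
durations = {
    "P1": (6, 24, 44),
    "P2": (12, 10, 60),
    "P3": (30, 20, 40),
}

aoa = [d[0] for d in durations.values()]      # [6, 12, 30]  -> median 12
ea = [d[1] for d in durations.values()]       # [24, 10, 20] -> median 20
msb_ioc = [d[2] for d in durations.values()]  # [44, 60, 40] -> median 44
totals = [sum(d) for d in durations.values()] # [74, 82, 90] -> median 82

# Adding the per-phase medians...
sum_of_medians = (statistics.median(aoa)
                  + statistics.median(ea)
                  + statistics.median(msb_ioc))

# ...does not reproduce the median of per-program totals,
# because a different program can sit at the median of each phase.
median_of_totals = statistics.median(totals)

print(sum_of_medians, median_of_totals)  # prints "76 82"
```

With these invented numbers, the phase medians sum to 76 months while the median total is 82 months, which is why the report treats the 6-to-8-year figure as a suggestion rather than an exact sum.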

[Figure 1.3: bar chart of time to complete (months) by MAIS program label]
FIGURE 1.3 Time taken from Milestone B to initial operating capability for major automated information system (MAIS) programs during FY 1997 to early 2009. NOTE: See the accompanying text for an explanation of the program labels. SOURCE: Compiled by the committee from data provided by the Department of Defense.

TABLE 1.1 Average and Median Times Taken by Major Automated Information System Programs in Acquisition Process Phases Leading to Initial Operating Capability

Phase            Average (months)   Median (months)
AoA completion   11                 13
AoA to MS B      30                 20
MS B to IOC      53                 43

NOTE: See accompanying text for a description of the MAIS programs in the data set; see also Figures 1.1, 1.2, and 1.3. AoA, analysis of alternatives; MS B, Milestone B; IOC, initial operating capability.

One reason for these very long time lines is the burden imposed by the oversight process—the time associated with preparing documentation, scheduling review meetings, and so forth. To illustrate this point, the Business Transformation Agency (BTA) constructed a graph—referred to as “The Big Ugly,” and based on one originally constructed by the U.S. Air Force—that shows all of the reviews and documents required to field a program. The BTA also considered the specific case of adding a 200-line program to a business system and projected that it would take more than

$1 million and 2 years just for the DOD 5000 acquisition reviews and documentation.19

SCOPE AND CONTEXT OF THIS REPORT

Over the years, numerous reports have made recommendations aimed at reforming defense acquisition. Indeed, multiple recent reports have tackled the question of IT acquisition specifically and have come to conclusions similar to those reached in this report. The committee believes that this general consensus buttresses the points made here. It is not the committee’s purpose, however, to comment specifically on other reports. One distinctive contribution of this report is its discussion of different classes of IT and how such differences merit different acquisition approaches.

The rest of the report examines in more detail the implications of current DOD IT acquisition processes and the committee’s rationales and recommended changes. Chapter 2 explores the cultural backdrop of the defense IT acquisition community and its effects on how IT systems are procured. Chapter 3 examines software and systems engineering practices and proposes a revised acquisition-management approach for IT systems. Chapter 4 considers testing and how the testing and evaluation of IT systems within the acquisition process might be made more effective. Appendix A provides a brief overview of the defense acquisition system for IT; Appendixes B and C respectively provide details of the recommended acquisition process for SDCI and CHSS programs; Appendix D gives examples of programs that have succeeded with nontraditional oversight; Appendix E lists briefings provided to the committee; and Appendix F provides biosketches of the committee members and staff. The acronyms used in the report are defined in Appendix G.

19 Information provided to the committee by Keith Seaman, Acting Director, BTA Component Acquisition Executive, February 2009.