The Acquisition Process and Culture
Until 1996, there was a separate set of processes and policies for the acquisition of Department of Defense (DOD) information technology (IT) systems, as called for in the Brooks Act of 1965 (Public Law 89-306). In 1996, the IT and non-IT policies were merged, under the rationale that the requirements of the Brooks Act and the associated DOD 8000 processes had made the acquisition process too cumbersome and slow. IT programs thereafter fell under a single acquisition process specified in the DOD 5000 series regulations, which were intended to provide a more flexible and nimble framework for all types of programs. Shortly after the IT programs were consolidated under DOD 5000, there was an emphasis on tailoring the oversight and documentation requirements of DOD 5000 to better suit the needs of IT programs. There have also been repeated efforts, most recently in December 2008, to reform the process defined by the DOD 5000 series in order to address persistent challenges in the acquisition system.
The DOD has struggled to provide affordable and effective military capabilities through the defense acquisition system (and not just for IT systems). Repeated attempts have been made to reform the acquisition processes, largely based on the experience with weapons systems, with particular attention to highly visible problems in very large programs. Within the Department of Defense, the top 10 acquisition programs
account for about 80 percent of the acquisition budget.1 These programs are large weapons systems programs such as those for the Joint Strike Fighter (F-35), the Future Combat Systems, the Ballistic Missile Defense System, and the SSN 774-class fast-attack submarine. Perceived weaknesses in the DOD acquisition process have included the following: an inadequate assessment of technological maturity before beginning system development; insufficient government reviews and oversight during the multiyear design and development phases; and inadequate preparation for and execution of operational testing.2,3 In revising the DOD acquisition policies and procedures over the years, the DOD has attempted to address these perceived weaknesses by including more process steps and additional reviews.
Many argue that these reforms, especially the introduction of more oversight, have not improved—and may well have further burdened—the acquisition system. In particular, these changes, aimed primarily at challenges related to large, weapons system programs, have had noteworthy adverse implications for IT programs. IT program managers who provided briefings to the committee during the course of this study indicated that DOD 5000 processes dramatically increase the time to deliver solutions, especially those available as commercial off-the-shelf (COTS) solutions. In addition, the DOD 5000 processes result in the creation of larger formal acquisition programs that, by their very nature, increase documentation requirements and the associated sizes of the support teams.
A recent review conducted by the DOD Obama-Biden Presidential Transition Team noted unanimous agreement among the chief information officers (CIOs) of the DOD and the Military Services that the ability of the DOD acquisition process to deliver needed IT systems was “fundamentally broken.” The CIOs cited the inability of the DOD acquisition system to field systems based on commercial technology while it was still state of the art. With commercial IT technologies evolving on 18-month cycles, taking 6 to 8 years for a large IT program to field initial operating capabilities (IOCs; see Chapter 1) is a clear indicator that the defense acquisition processes are not matched to the fundamental characteristics of commercial technology. The CIOs suggested an urgent need to be
able to field rapidly to military personnel the same commercial technology available to private-sector users and, in some instances, adversaries. Moreover, the CIOs cited the enormous cost of acquisition oversight processes and the repeated failure of large DOD IT acquisition programs to deliver needed user capabilities. They also noted the need to move toward smaller IT acquisition projects to reduce the risk of failure and to accelerate the fielding of mission-critical capabilities.4
The rest of this chapter describes cultural and organizational issues that pose challenges to effective and efficient IT acquisition in the DOD. It closes with a discussion of the importance of measures of success for IT programs together with a brief set of potential metrics.
DIFFERENCES BETWEEN INFORMATION TECHNOLOGY SYSTEMS AND WEAPONS SYSTEMS ARE NOT REFLECTED IN CURRENT PROCESS
Information technology programs have a number of characteristics that distinguish them from defense weapons systems programs. During this study the committee frequently heard the phrase “IT is fundamentally different.” These differences are seen as the root cause of many of the problems noted above by the DOD CIO community. As an example, one of the characteristics of most IT programs is their dependence on COTS technologies; thus technology development and many other pre-Milestone B activities specified in the DOD 5000 series are unnecessary. Unfortunately, the tendency in the DOD has been to force-fit DOD 5000 processes, including pre-Milestone B activities, onto COTS-based IT programs and to require that the programs conduct technology demonstrations with mature (or legacy) commercial technology. The program manager (PM) is forced to demonstrate mature and proven technology and is then required to field exactly that same technology baseline, even though, because of the slow pace of the process, the fielding is likely to occur well after the demonstrated software is commercially retired.
As another example of how IT programs are different from weapons systems programs, consider the use and timing of operational test and evaluation (OT&E) activities. The OT&E of weapons systems programs is used as a major risk-reduction activity. For weapons systems, OT&E is conducted by specially trained test professionals charged with determining whether the system should go into full-scale production, which represents the majority of the total program cost. For many IT systems, however, the bulk of the program costs have already been spent by the time the operational testing phase is reached.5 Therefore, if testing is to inform spending decisions, it must be performed much earlier in the development cycle, and risks must be addressed at that stage.
Moreover, operational testing for an IT system is best done not by professional testers but by the actual users of the system in a limited operational environment. A key question for IT operational testing is whether the system represents an operational improvement over existing capabilities that users are willing to accept in the field—not whether the system should go into full-scale production.
Weapons systems and IT systems are also different in terms of the applicable technology cycles. The technology cycle for IT systems is much more rapid than that for most weapons systems technologies. Although new weapons systems technologies (for example, stealth, engines, and explosive technologies) can take years to develop, there is roughly an 18-month cycle for new IT technologies, which are driven largely by the commercial sector. With the average DOD acquisition program taking more than 2 years to become established and to reach an initial milestone, and as much as a decade to produce IOCs for testing, it is obvious that the pace associated with the DOD acquisition processes greatly lags that for IT advancement. It should also be noted that the nature of the current oversight process for DOD acquisition programs tends to encourage the aggregation of many requirements into larger programs, further exacerbating cycle time mismatches. This requirements aggregation is driven by such factors as a desire to avoid the cost of separate program documentation, a desire to minimize acquisition oversight reviews (see below), and a perceived need to consolidate budget requirements in order to raise the significance or importance of the program so that it will attract funding.
Another difference between weapons systems and IT systems is that of requirements specificity at various points in the development process. This difference manifests itself in two important ways: (1) user interaction and (2) boundary conditions.
With regard to user interaction, it does not make sense to develop detailed user-interaction requirements at the beginning of an IT program, particularly for a brand-new capability. IT systems provide capabilities that are very user-centric, and in some cases completely new user interaction must be observed and/or measured and program flow or other system elements updated to reflect user preferences and usage models. Moreover, IT systems need to evolve as military missions change owing to threat or organizational changes as well as to improve the effectiveness and efficiency of military operations. Conversely, the evolution of weapons systems platforms reflects longstanding and time-honored operational methods that relate to the inherently long timescales involved in the design and building of platforms, in training, and so forth.
However, some IT programs both develop software and provide the hardware to operate it across a large production inventory. For these classes of programs, a significant part of the total cost is in the procurement of the hardware and its installation on a ship. This is one motivation for the discussion in Chapter 3 about breaking such programs up into their off-the-shelf components (COTS hardware, software, and services [CHSS]) and the development and/or integration components (software development and commercial off-the-shelf integration [SCDI]) rather than having a single program.
An early emphasis on setting detailed requirements often results in IT systems failing to meet user needs or meeting, too late, a user requirement that has long since changed. When this happens, military members often develop “homegrown” solutions in the field to meet their needs. Such a solution may address the user’s immediate problem, but it creates new problems—such as even greater disconnects between centrally managed and centrally specified IT systems and user expectations, as well as a potential proliferation of poorly documented and supported solutions. One way to help avoid such a situation is to improve the coupling of the end user’s perspective to development and testing, allowing the process to more formally embrace and harness edge innovation.
With regard to boundary conditions, IT systems have, at best, amorphous boundaries. For example, few if any operationally relevant IT systems components can operate independently of the network for long periods of time. Eventually each component must join the network to receive and relay information. IT systems capabilities tend to be widely dependent on (and often distributed across) the network. Moreover, these days joint and coalition operations are the rule rather than the exception, increasing the necessity of interconnection. When forces are interdependent, they need their IT systems to interoperate seamlessly in dynamic situations that cannot be easily forecast. From a user perspective, an IT component should be able to “do it all.” As a result, requirements changes during and after development may be perceived by the user to be minor, but they may in fact compromise architectural decisions made very early in a program.
Conversely, weapons systems platforms tend to have physically discrete boundaries, defined very early on, that the user understands cannot be broached with impunity. In other words, the IT systems acquisition challenge is to fit large, fixed pieces together to construct, from a user perspective, a smoothly integrated whole, whereas many weapons systems platforms deal with much smaller chunks of commercial hardware and software and a great deal more customized hardware or developed software.
REQUIREMENTS PROCESS IMPEDES USE OF COMMERCIAL OFF-THE-SHELF SOLUTIONS
The use of COTS products has long been a staple in the commercial sector and is generally preferred over unique, one-of-a-kind systems development. COTS packages such as SAP Logistics, PeopleSoft for human resources, Siebel CRM for customer relationship management, and Oracle Applications for financial systems have successfully penetrated the commercial sector and provide day-to-day support to its operations. Industry tends to adapt its business processes to standard COTS configuration templates so as to minimize the time lines and costs associated with tailoring and long-term maintenance. Similarly, firms often find it advantageous to wait for features and functionality that exist on COTS roadmaps, or to engage with industry to get features added to those roadmaps, rather than to develop custom code or configurations. Conversely, the federal government has been slow to adopt COTS solutions, arguing that federal requirements are unique. Noncommercial requirements imposed by law and regulation exacerbate the problem. Resistance in the federal government to COTS solutions has softened over the past few years, but the use of these kinds of COTS products in the federal government is still in the adoption phase.
The DOD has also been slow to adopt COTS products on the basis that “the DOD is different.” When the DOD selects a COTS product for application, the acquisition process often devolves into a significant modification of the COTS product to meet DOD-unique requirements. A notable example of this contrast involves enterprise resource planning (ERP) systems. Although industry does not have a perfect record, the list of failed DOD ERP projects is very long. Vast sums of money have been spent on these failed attempts to modify or uniquely configure COTS IT products.6
Part of the impediment to the DOD’s adoption of COTS solutions lies in its process for developing requirements. COTS products have processes, inputs, and outputs that are specifically defined. Industry has learned to adapt to these processes and products in the interests of economics and rapid development and deployment time lines. The DOD
has, until recently, argued that the requirements of DOD systems cannot be easily adapted to COTS products, which in turn requires changes to the basic COTS software. This approach not only is counterproductive but also soon turns the ostensibly COTS solution into a one-of-a-kind system, weighed down with development, testing, deployment, and maintenance challenges. The result is that commodity IT technologies that underpin DOD IT systems emerge on 18-month cycles, but it takes several years for the average IT program to field an IOC (see Chapter 1).
For COTS acquisition, a governance model and associated processes focused on new development, such as the DOD 5000 series, are not a good fit. Specifically, many pre-Milestone B activities addressed in the DOD 5000 series do not apply to predominantly COTS-based DOD IT systems. For example, it does not make sense to require IT programs to conduct technology demonstrations of COTS components that are already in widespread use. The managers of both the DOD Network Centric Enterprise Services (NCES) and Network Enabled Command and Control (NECC) programs cited this as a problem in briefings to this committee.7
The alternative is to emphasize the effective purchasing of COTS products to meet enterprise standards (as opposed to program or project-unique needs) and the organizing of programs in an incremental, modular fashion. Governance then is focused on the horizontal integration of system services.
OVERLY LARGE INFORMATION TECHNOLOGY PROGRAMS INCREASE RISK
Current IT systems acquisition processes in the DOD encourage the bundling of many capabilities into a single program activity. A premise for such bundling is that once a team is in place, it will be able to deliver the desired capabilities without the overhead of assembling a new team. The aim is to reduce the learning curve and start-up times on both the DOD and the contractor sides. However, in the current acquisition process, aggregating what could have been several smaller development efforts into a larger major program means that the delivery time and costs are significantly increased and that the ability to leverage state-of-the-art technology is impeded. Previous studies (see Chapter 3) have shown that the evolving of a software solution in short-term, lightweight spirals, with user feedback from early fielded capabilities influencing successive spirals, is the most effective approach for IT systems. This approach is, of
course, not without its own risks,8 but it should be demanded and supported by the IT systems acquisition process. The size of projects has a significant impact on the ability to assess success and failure promptly. Program size can mask real operational problems. For example, IT standards compliance in large programs might have been accomplished only for limited portions of the overall supplied capability, yet the program in total is reported as compliant. Only during operational use will these shortcomings be discovered. It is better to field smaller increments early and to discover such shortcomings when there is still the opportunity to fix them than to turn over an IT system for maintenance, only to discover severe shortfalls across the entire system.
FUNDING PROCESS IMPEDES FLEXIBILITY
The DOD’s process for obtaining funding for new acquisition programs typically takes multiple years. To address a DOD capability shortfall, the shortfall must be linked to a request to Congress for funding that would be provided in a future year. For solutions that will rely on information technology, the time frame for seeking funding can be many times longer than the actual time needed to develop or procure the solution. If it is to achieve a more rapid delivery of information technology solutions, the DOD will need a more responsive process for justifying and allocating funding to address capability shortfalls.
Although government funding processes are controlled by Congress and have legally binding controls, there are a number of opportunities to use existing flexibility to achieve improved speed and agility. For example, there are authorities given to the DOD to allocate funds for urgent warfighter needs and to reallocate funding after congressional appropriation as a result of changing needs. These processes could be used to initiate IT solution development in weeks or months, leading to rapid fielding. In addition, in many cases acquisition funds are allocated by Congress to a larger mission or program area or in some cases to a portfolio of projects identified with an area of mission need—for example, to fund software upgrades in a particular mission area or system. The potential exists to leverage such flexibilities to establish an empowered and accountable process that could rapidly allocate funding to IT projects. Providing transparency to Congress regarding the use of these flexibilities would be
important in order to ensure that proper oversight is maintained and to demonstrate achievement of the objective of rapid delivery.
In the longer term, the DOD could work with Congress to establish a new set of funding mechanisms for IT-supported requirements that would align congressional funding with mission or capability areas rather than with individual acquisition programs. Under this concept, Congress would allocate funding to a mission area that would be governed in the DOD through a process similar to portfolio management. In implementing this concept, DOD officials would be responsible for setting priorities and allocating the funding to individual IT projects after the congressional appropriation of funds to a portfolio of mission requirements. This approach can ensure appropriate justification of funding needs tied to mission requirements during budget submission as well as the rapid allocation of appropriated funding consistent with the pace of evolving mission requirements and technology advancements. Currently the DOD uses a process similar to this concept for funding maintenance upgrades to aircraft avionics software. Likewise, a somewhat similar process is used for managing IT projects funded through working-capital funding processes. These and other examples of flexible and rapid funding processes should be useful models as the DOD works with Congress to establish a new funding process for acquisition of information technology.
EXCESSIVE OVERSIGHT, YET INSUFFICIENT PROGRAM ACCOUNTABILITY
It has often been observed that although the predictable response of an organization to program failures is to institute additional oversight, the burdens resulting from that oversight may paradoxically increase the likelihood of future failures. This phenomenon appears to have been at work in the DOD, where problems in past IT programs have led to oversight that delays program completion without necessarily ensuring the delivery of timely and useful capabilities.
In the current program environment, there can be multiple oversight bodies, and there are numerous participants in the program oversight and review process. Specifically, there are layers of integrated product teams (IPTs) that perform reviews of programs on a periodic basis and as a precursor to milestone decisions. These layers of IPTs consist of working-level IPTs (WIPTs), integrated IPTs (IIPTs), and overarching IPTs (OIPTs). When the IPT construct was originally instituted, it was intended to help a program resolve program issues quickly and at a level as low as possible in the organization. However, over time, the IPT structure has become a burdensome process often consisting of representation by organizations with special interests and with no accountability for program success.
Each of the oversight groups can require program changes, and often each individual representative of a Service or group has the ability to force changes to a project or to require special accommodations or requirements at a very detailed level, often without any justification with respect to impacts on cost and/or schedule. In consequence, “process leadership” has replaced results-oriented IT acquisition leadership; success is defined as strict adherence to the acquisition process and to specific requirements defined at the outset of an acquisition program, even when the end-user capability is hopelessly compromised. Tracking process milestones is easily mistaken for tracking real project results, and adherence to process requirements is often confused with real project progress, assessment, and milestone achievement. Ultimately, too many pocket vetoes result in the substitution of process for product.
Although processes are important, true operational measures of success for a program, especially those related to end-user satisfaction, are more important. Related to this, shared personal responsibility and accountability on the part of all participants in the process are essential for program success. This is especially important in joint programs, where Service-specific preferences have to be reconciled with the common good.
There are notable counterexamples to the trends discussed above. Several large and complex DOD programs without high degrees of institutional oversight have been demonstrably successful in rapidly delivering useful capability to the field by following tailored, focused, proactive, accountable oversight of the kind advocated in this report; see the examples given in Box 2.1 and detailed in Appendix D. Such programs tend to have been managed by program managers (PMs) who figured out how to navigate the acquisition process effectively so that they could focus on meeting end-user needs and keep oversight to a reasonable level. The key is in striking the right balance—placing on the program manager appropriate responsibility and accountability for meeting end-user needs in a timely way while establishing appropriate levels of oversight.
One way to address this challenge is to explicitly clarify and strengthen the responsibility, authority, and accountability for program execution and to create a process that allows for clear, more timely, and more accurate assessment of a project’s progress and risk and for the early identification of failing projects. One approach would be to provide the PM and a portfolio management team (PMT) with decision authority, derived explicitly from higher authority, to determine trade-offs among schedule, cost, and functionality. The PMT would represent several equities: an acquisition equity at or above the next echelon up from the PM, a functional equity representing the voice of the end user, and an enterprise equity typically represented by the chief information officer or the CIO’s designee. The PMT, in consultation with the PM, would be empowered to make decisions about such things as development priorities, the contents of capability increments, and what constitutes “must have” versus “nice to have.”
BOX 2.1 Succeeding with Nontraditional Oversight
Selective examples of large and complex Department of Defense information technology (IT)-based programs that were demonstrably successful with nontraditional oversight are identified in this box and discussed in detail in Appendix D. Collectively, these success stories provide evidence that changes of the nature proposed in this report can have dramatically positive impacts. Common characteristics of these programs, along with notable examples, are presented in Appendix D.
The examples in Box 2.1 also suggest several other ways to achieve greater agility and responsiveness and to avoid the stifling oversight
inherent in the preponderance of acquisition category (ACAT) I-level programs (see the section below on “Legislative Impediments” for information on ACAT programs). One of the best channels for doing so is to leverage the opportunity to insert improvements on top of existing platforms. The conundrum is how to get a baseline initial operating capability
in place so that it can be used as a platform for rapid development and fielding. Large programs must be structured to move to IOC in a more timely manner so that they can become the “platforms” on which future capabilities can be acquired in an agile, iterative, responsive fashion to meet end users’ ever-changing requirements and to take advantage of emerging technologies.
The biggest challenges in effecting real change will continue to be with “new starts” and larger-value DOD IT programs for which there are demands for detailed justification, a ponderous programming and budgeting process to get multiyear funding justified in the Program Objectives Memorandum and appropriated annually, and “disciplined” ACAT I-level oversight processes that do not align well with agile acquisition.
Another way to foster greater agility is to structure the DOD IT portfolio so that it includes a small number of true enterprise-level programs that are so big, so important, and so costly (i.e., above the funding thresholds at which weapons systems are categorized as ACAT ID) that they warrant intensive management and oversight at the Office of the Secretary of Defense (OSD). A good example of a large enterprise-level IT program is the Global Information Grid-Bandwidth Expansion program. Such large programs could be scheduled so that they deliver next-generation platforms at a point in time when the legacy platforms are nearing the stage at which they can no longer be sustained through annual upgrades and should be replaced. If the remaining programs in the DOD IT portfolio were structured for decentralized management and oversight, the IT system acquisition community would be better positioned for agile acquisition. Decentralized programs might be categorized in the DOD IT portfolio as (1) modification-in-service programs and (2) new programs structured at dollar levels that permit designation as ACAT II and ACAT III with the same dollar thresholds used for weapons systems and with oversight by the Services, agencies, or program executive officers.
CULTURAL IMPEDIMENTS TAKE PRECEDENCE OVER RAPID DEVELOPMENT
The DOD’s perceived need for caution over speed is understandable. Given the criticality and danger of its mission, its worldwide operations and large workforce, and the frequent need for clear, decisive action, the Department of Defense, by its nature, is an organization with a classic command-and-control culture. If current trends continue, it is likely that processes and systems will become even more top-down and centralized in spite of the DOD’s desire to move to an integrated, cross-Service environment with empowered decision making at all levels of command. Although current doctrine is shifting from a “need to know” basis to a
“need to share” basis, it is being accomplished through clearly controlled and hierarchical processes and systems.
In a command-and-control environment, the steps in an IT system’s life-cycle development process are based on frequent reviews and concurrence by a large number of concerned, but often narrowly focused, stakeholders. Such an environment does not lend itself to rapid innovation or to rapid development processes.
Meaningful assessment becomes nearly impossible when large, complex programs have long time spans between significant milestones. Current DOD processes, for example, put great emphasis on detailing requirements before a program is approved to start, so that costs and risks can be evaluated. Although this makes sense for most weapons systems, for IT systems it often results in years of requirements development, leading to the delivery of IT systems that are trying to meet requirements that have long since changed or are continuing to shift. A great deal of project time can pass while costs accumulate before a meaningful assessment of project viability can be made.
To fundamentally recalibrate the culture, any proposed new DOD IT systems acquisition process should focus on the rapid fielding of successive increments of capability. The time of delivery of some useful and usable capability should become a key performance parameter. A rapid delivery capability based on commercially available technologies is needed, along with a fielding model that permits the evolution of these capabilities as the technology evolves during training, integration, maintenance, and support. The DOD must move to a culture that enables the rapid adoption of new technologies and a process that assesses where such rapid developments are essential. Delivering increasingly useful increments of capability should receive a higher evaluation than that given for delivering nothing to the field for several years.
In conjunction with this process, program success requires a culture of “tough love”: direct, pragmatic advice from oversight organizations to program managers, combined with an element of advocacy. Such a culture needs to replace excessive oversight, a “gotcha” mentality, and gatekeeping. It must support the whole team in helping the PM deliver capabilities to the user, rather than focusing on rigid controls, fault finding, and grading of the PM’s work.
More generally, incentivizing the use of iterative, incremental development (IID) and agile-inspired approaches to acquisition and development is key. The use of successful IID approaches and the delivery of a capability to the end user should merit rewards for the whole team. It will be important to foster the notion that when a program “wins,” the whole team wins. (See Chapter 3 for a discussion of how the handling of failure also needs to change.)
Leadership must come from senior levels within IT organizations, not only from the IT staff. Defense acquisition executives have an important role to play in establishing a culture that encourages IID. In addition, a process by which successes and failures are reviewed on a relatively frequent periodic basis would aid in helping teams understand what is working and why (or why not). In essence, an iterative approach can be applied to the process as well as to the technology.
INADEQUATE INFORMATION TECHNOLOGY ACQUISITION WORKFORCE
Under the leadership of the Under Secretary of Defense for Acquisition, Technology and Logistics, and as specified in DOD acquisition regulations, the DOD Acquisition Corps is charged with procuring systems and services to meet warfighters’ needs in a timely fashion as required to satisfy national security objectives. The Acquisition Corps includes many highly trained specialists in areas of engineering, science, testing, and business and program management who act as acquisition executives, program managers, and contracting officers. A number of studies have expressed concern about the technical proficiency of the acquisition workforce.9 Over the past two decades, numerous defense authorization and appropriation bills have included provisions aimed at improving the training of acquisition professionals.
Among the many challenges in this area is that relatively few in the acquisition workforce have specific expertise in IT or in how to manage IT programs. An important factor is that there are few “digital natives”— people who have grown up with and/or are highly proficient with IT—in the ranks of senior acquisition PMs.
Another factor is that very little currently available formal training is focused on the distinct issues that arise in DOD IT programs. Although the Defense Acquisition University (DAU), the premier source of acquisition training in the DOD, offers both resident and remote training programs that emphasize systems program management and policy compliance, the DAU does not have a comprehensive program to teach IT program management or IT test and evaluation.
As a result, the acquisition workforce is not well equipped to manage IT programs. Because the DOD’s acquisition regulations were designed primarily to meet the needs of large weapons programs, significant tailoring is required to accommodate IT programs, especially to follow the incremental, iterative development process recommended in this report. Without a nuanced understanding of IT and the needs of IT programs, an IT program manager is thus at a disadvantage in advising on or embarking on the tailoring of acquisition processes. Even those PMs willing and able to advocate for significant tailoring would incur additional risk by departing from the standard acquisition process.10
Moreover, personnel practices that are common in the acquisition community make it nearly impossible to align rewards and penalties with true program success. Contributing factors include these:
The DOD rotates personnel too often for any one PM to see an acquisition through more than a single milestone;
The acquisition process rewards compliance with process steps rather than the delivery of useful and usable capability to end users;
The military culture is a “can do” culture—no program manager wants to say that a given task cannot be done; and
Program size is used as a success metric and is associated, overtly, with rank. As a result, program managers are incentivized to make programs larger, which contrasts starkly with evidence from many studies that smaller programs reduce cost and risk.
Closely related to acquisition workforce capabilities is the DOD’s capacity to be a smart buyer—to possess the in-house technical expertise (that is, scientific, engineering, and mathematical skills) required to engage effectively with industry on technical design, research and development, and procurement matters. This smart-buyer capability is generally viewed as an inherently governmental responsibility that cannot be delegated to industry. As such, sustaining (and enhancing) it is a necessary complement to efforts to strengthen the acquisition workforce or to reform DOD acquisition processes.11
For example, a PM fielding a commercial off-the-shelf (COTS)-based capability might argue that the Technical Readiness Assessment pertinent to the program should assess the integrator’s ability to build and deploy components based on past performance rather than the COTS products’ ability to support common infrastructure requirements. In today’s environment, the likely outcome of such a discussion would be a requirement that the PM do both: demonstrate that the widely deployed COTS products work as advertised and provide data supporting the integrator’s ability to build and deploy components on the infrastructure.
See, for example, Kenneth Horn, Carolyn Wong, Elliot Axelband, Paul Steinberg, and Ike Chang, Maintaining the Army’s “Smart Buyer” Capability in a Period of Downsizing, RAND, Santa Monica, Calif., 1999. Available at http://www.rand.org/pubs/white_papers/2005/WP120.pdf; accessed December 12, 2009.
Today’s acquisition oversight process in the DOD is designed for the disciplined management of large, expensive, complex weapons systems—a process whose overall features are dictated by statute. Programs are assigned acquisition categories (ACATs) based on acquisition cost estimates and are designated for oversight levels based on associated cost thresholds—at the Office of the Secretary of Defense (OSD), at the Service or agency level, or at lower levels such as that of program executive officers. However, the total dollar thresholds for designating oversight levels for IT programs are significantly lower than those used for weapons systems (by a factor of five).12 This results in a dichotomy in which an IT system with a development and deployment cost of $126 million over its life cycle receives highly centralized oversight at OSD, while a weapons system counterpart at the same dollar level can be decentralized for oversight at the program executive officer level. Moreover, the current legislation has no provision for major automated information system (MAIS) programs to receive oversight at the Service or agency level. One approach to solving the problem of highly centralized oversight, with its attendant delays, would be to apply the dollar thresholds in effect for major defense acquisition programs (MDAPs) to the designation of ACAT levels for MAIS programs. This elevation of thresholds for IT programs would better align the authority for IT program oversight with the appropriate levels at OSD, the Services and agencies, and lower echelons.
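The dichotomy described above can be illustrated with a minimal sketch. The dollar thresholds and level names below are hypothetical placeholders chosen only to mirror the factor-of-five disparity discussed in the text; they are not the statutory values.

```python
# Hypothetical sketch of threshold-based oversight designation.
# Thresholds are illustrative only, not the statutory figures.

MDAP_OSD_THRESHOLD_M = 630  # $M, assumed weapons-system (MDAP) threshold
MAIS_OSD_THRESHOLD_M = 126  # $M, assumed IT (MAIS) threshold, ~5x lower

def oversight_level(cost_millions: float, is_it_program: bool) -> str:
    """Return the oversight echelon implied by a program's cost estimate."""
    threshold = MAIS_OSD_THRESHOLD_M if is_it_program else MDAP_OSD_THRESHOLD_M
    if cost_millions >= threshold:
        return "OSD"  # highly centralized oversight
    return "PEO"      # decentralized, program-executive-officer-level oversight

# A $126 million IT program draws centralized OSD oversight, while a
# weapons program of identical size remains at the PEO level:
print(oversight_level(126, is_it_program=True))   # -> OSD
print(oversight_level(126, is_it_program=False))  # -> PEO
```

Under the report’s suggested fix, the two thresholds would simply be set equal, eliminating the branch on program type.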
MEASURES OF SUCCESS
The committee fully anticipates that any new IT systems acquisition process that is defined and adopted will, by design, evolve over time. The committee has identified shortcomings in the present system and recommends a new direction for and approach to IT acquisition. Evaluating the progress and success of programs is a critical component of this approach. Indeed, for each program there should be a tailored, agreed-upon set of metrics.
This report does not recommend a particular set of metrics; instead it describes the generic categories from which the metrics should be drawn. For example, there will be instances in which an IID approach is aimed at
developing modules that will reside within a well-established framework that has already been developed to have security, reliability, and/or other nonfunctional characteristics, and elaborate metrics will not be required in those areas. But there will be other cases where the infrastructure is not available or is inadequate, and in those cases the metrics will need to be expanded to account for the infrastructure requirements as well as the intended functionality. Similarly, although the dominant focus should be on end-user needs, attention should also be paid to other stakeholders as appropriate. The following measures of success for IT systems acquisition are suggested as a useful set of guidelines to consider when developing tailored metrics for particular programs:
Measurable improvements in currently fielded end-user capability, including functionality, performance, the meeting of commercial benchmarks, and reliability experienced by the end user.
Measurable reduction in the costs of currently fielded IT operations.
Measurable improvement in end-user satisfaction.
Measurable increase in consumption (use by end users).
Significantly reduced COTS fielding times.
For software development and commercial off-the-shelf integration programs, fielding capability within the product cycle of major COTS components.
For COTS hardware, software, and service programs, fielding pace comparable to COTS cycles (for example, no more than 12 to 18 months for COTS hardware).
Significantly reduced increment cycle times—no more than 12 to 18 months between increments of fieldable end-user capability.
Measurable decreases in the number and severity of bugs discovered postfielding (including security vulnerabilities).
Measurable improvements in availability and reliability as experienced by the end user.
Measurable reduction in currently fielded IT operations costs.
Measurable improvements in administrator-to-server ratios.
Measurable improvements in server-to-client ratios along with a decrease in the unit costs of bytes served (output delivered).
Measurable improvements in progress against budget, schedule, and functional capability.
Demonstrably part of a long-term competitive strategy.
Corresponding, specific target metrics could be developed for each portfolio and project or program funded by the portfolio at the initiation of a program and defined incrementally for each iteration within a project or program. (For example, a performance metric might be tightened over successive increments of a program.) These metrics would form the basis for reporting progress to senior DOD and Office of Management and Budget officials as well as to Congress.
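The idea of tightening a target metric over successive increments can be sketched concretely. The metric names and numbers below are hypothetical illustrations, not values drawn from DOD guidance: each increment’s plan shortens the allowed fielding cycle toward the 12-month floor suggested above and raises a notional end-user satisfaction floor.

```python
# Hypothetical sketch: per-increment target metrics that tighten over time.
# Metric names and numeric targets are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class IncrementTarget:
    increment: int
    max_cycle_months: float       # fielding cadence ceiling for this increment
    min_user_satisfaction: float  # notional survey-score floor (0-100 scale)

def plan_targets(n_increments: int) -> list:
    """Tighten targets each increment: shorter cycles, higher satisfaction."""
    targets = []
    for i in range(1, n_increments + 1):
        targets.append(IncrementTarget(
            increment=i,
            # Start at an 18-month ceiling, tighten toward the 12-month floor.
            max_cycle_months=max(12.0, 18.0 - 2.0 * (i - 1)),
            # Start at a 70-point floor, raise toward a 90-point cap.
            min_user_satisfaction=min(90.0, 70.0 + 5.0 * (i - 1)),
        ))
    return targets

for t in plan_targets(4):
    print(t)
```

Reported against each delivered increment, such a table gives senior DOD and OMB officials a simple, auditable basis for tracking whether a program’s successive increments are actually improving.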