OCR for page 23
2 Project Management

Introduction

DOE's inability to complete projects on time and on budget has been widely reported by the U.S. Congress (1998a, 1998b), the General Accounting Office (GAO) (1996, 1997, 1999), the Defense Nuclear Facilities Safety Board (DNFSB) (1997), and DOE's Office of Inspector General (DOE, 1995a, 1996a, 1997a), among others. In 1996, Congress requested that the GAO review DOE's ability to complete major systems acquisitions (projects with a total project cost (TPC) of more than $100 million) within originally estimated costs and time schedules. GAO found that between 1980 and 1996, 31 of the 80 DOE major system acquisition projects had been terminated prior to completion, 34 were continuing although over budget, and 15 had been completed (GAO, 1996). GAO noted that management problems and ineffective oversight had led to cost overruns, schedule slippages, and project terminations.

The DNFSB also raised concerns about ineffective project management in its analysis of schedule slippage in DOE's spent nuclear fuel project at Hanford, Washington, citing the lack of sound project management as the principal reason (DNFSB, 1997). Furthermore, the DNFSB reported that poor project management practices can become a safety issue when project delays increase or prolong risk to the public, workers, and the environment (Conway, 1998).

DOE's problems in completing many projects on time and on budget can be partially attributed to the complexity, uniqueness, and frequent changes in these projects, but these difficulties are exacerbated by DOE's shortcomings in project management. Among the deficiencies are an organizational structure unsuited to managing projects, inadequate techniques for planning and executing projects,
the lack of review processes, poor change control mechanisms, and the lack of performance metrics and risk evaluations.

DOE faces major challenges in upgrading its project management system. Unless the system is improved in several areas, DOE will continue to have excessive cost overruns, inordinate time delays, improper project formulation, and a dissatisfied Congress and stakeholders. In this chapter the committee first reviews DOE's record of performance in carrying out projects. This is followed by a discussion of DOE's project organization and structure, procedures, policies, and accountability. The chapter ends with a review and assessment of DOE's project planning, budgeting and estimating, and the processes used to assess risk, track progress, and control change.

Record of Project Performance

Do DOE projects exceed their schedules and budgets more often and by larger amounts than projects of similar size, complexity, and risk executed by other government agencies or private industry? The committee sought an objective answer to this question in a variety of independent analyses by the GAO and by DOE's independent contractor, Independent Project Analysis (IPA). These studies over the past decade have documented DOE's difficulties in meeting project schedules and budgets. The failings are attributed to the inherent difficulty, complexity, and changeability of much of the work; deficiencies in DOE's policies and practices; constricting contractual arrangements; problems with project management; a complex, cumbersome organization; and complicated, inefficient administrative practices. (See Appendix A for information on problems with DOE's performance.)

Studies by the General Accounting Office

GAO has issued numerous reports related to DOE's construction, procurement, and contracting practices (for a recent review, see GAO, 1999). Many of these reports have centered on contractor performance and the appropriateness of contractor costs.
Others have focused on DOE's difficulties in delivering major projects within baseline costs, schedules, and scope. GAO reported that only 15 of the 80 major systems projects initiated between 1980 and 1996 had been completed, and many of those were behind schedule and over cost. Thirty-one projects were terminated before completion. GAO attributed the failures to four factors, not all of which are completely under DOE's control (GAO, 1996):

- flawed incentives for contractors
- lack of DOE personnel with the skills to oversee the contractors' operations
- DOE's unclear or changing missions
- incremental funding of projects
Statistical Analyses of Costs and Schedules

In the past, DOE has commissioned quantitative analyses of its management of environmental remediation and waste management projects by IPA (1993, 1995, 1996). In these studies, statistical comparisons were made of DOE environmental remediation and waste management projects with comparable projects performed by private industry and other government agencies (primarily the U.S. Army Corps of Engineers). Although the committee was unable to examine the models IPA developed and used in these analyses, and therefore cannot comment on their validity, these studies constitute the only available cross-sectional and longitudinal analyses of DOE's project performance.

DOE has not challenged the results of these studies. In fact, it made them the basis of a major reform effort that included a two-day "stand-down" (a cessation of EM project activity to discuss solutions to the problems identified in the studies) on January 26 and 27, 1994. The stand-down was ordered by the secretary of energy as recommended by the EM assistant secretary. DOE also arranged for an update of previous IPA analyses to benchmark and calibrate improvements (IPA, 1996).

Although the specific projects examined, the sizes of the samples (76 projects in December 1990, 65 projects in November 1993, 22 projects in December 1995, and 48 projects in April 1996), the statistical models, and IPA's database of industrial and governmental projects used for comparison all varied from study to study, some elements were common to all of them.
All of IPA's studies focused on four outcomes:

- cost performance, or the absolute cost of DOE projects compared to the absolute costs of industry and other government agencies, normalized for comparability
- cost overruns, or the relative increase in DOE project costs compared to the original budgets
- schedule performance, or the absolute duration of DOE projects compared to the duration of projects by industry and other government agencies, normalized for comparability
- schedule slippage, or the relative increase in DOE project durations compared to the original schedules

Project Costs and Cost Overruns

The combined results of two IPA studies (1993, 1995) showed that DOE waste management projects cost 48 percent more on average than comparable projects by industry and other government agencies, and DOE environmental remediation projects cost about 33 percent more. The average cost overruns were about 48 percent for environmental remediation projects and about 42 percent for waste management projects, compared to an average of about 3 percent for
industry and other government agencies. Moreover, the variability in cost growth from project to project was far higher than in industry. Thus, not only did DOE projects actually cost about 40 percent more than comparable industrial projects (the "DOE tax"), they overran their initial cost estimates by about 45 percent, indicating that DOE has a problem with controlling costs, with estimating costs, or both (IPA, 1993).

The 1996 update of project performance showed that DOE waste management projects cost 33 to 43 percent more than similar projects carried out by the private sector, and cost overruns were between 22 and 36 percent. DOE improved somewhat on the cost of environmental remediation projects during the period, lowering the average cost to 25 percent more than the private sector. However, DOE cost overruns for environmental remediation were substantially unchanged, averaging 50 percent (IPA, 1996).

Schedules and Schedule Slippage

DOE waste management projects took an average of three times longer to complete than comparable projects by industry and other government agencies. The original schedules slipped an average of 52 percent, compared to an industry average of 17 percent (IPA, 1996). Thus, even though DOE's initial schedules were much longer than those for similar projects by others, they nevertheless slipped more. DOE performance on environmental remediation projects was better, with durations only about 18 percent longer than those of comparable projects, but the average slippage was about 42 percent (IPA, 1996).

In the 1996 update, IPA forecasted some improvements in costs and durations, but these were extrapolations for projects that had not yet been completed (IPA, 1996). No follow-up study has been made since April 1996 to determine whether these expectations were realized, but even with the predicted improvements, DOE's project costs and schedules would still be much higher than those of comparable projects by industry.
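The overrun and slippage metrics in the IPA studies reduce to simple ratios against the original baselines. A minimal sketch (the project figures below are hypothetical illustrations, not data from the IPA studies):

```python
def cost_overrun_pct(original_budget, final_cost):
    """Relative increase in cost over the original budget, in percent
    (IPA's 'cost overrun' measure)."""
    return 100.0 * (final_cost - original_budget) / original_budget

def schedule_slippage_pct(planned_duration, actual_duration):
    """Relative increase in duration over the original schedule, in percent
    (IPA's 'schedule slippage' measure)."""
    return 100.0 * (actual_duration - planned_duration) / planned_duration

# Hypothetical project: budgeted at $80M but completed at $116M,
# planned for 30 months but completed in 45 months.
print(cost_overrun_pct(80.0, 116.0))   # 45.0 (percent)
print(schedule_slippage_pct(30, 45))   # 50.0 (percent)
```

Cost performance and schedule performance, by contrast, compare absolute costs and durations against normalized benchmark projects, which requires a comparison database such as IPA's.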
Changes in Project Scope to Meet Contingencies

Cost increases in DOE projects are often distorted by DOE's tendency to treat project scope as a contingency. In other words, if it appears that a project will overrun its budget, DOE reduces the original authorized scope to meet the budget, or increases the scope to use up any apparent underruns (IPA, 1990). These adjustments bias project costs upward.

Project Definition

Project definition is a continuing problem for DOE. IPA's statistical analyses showed that inadequate project definition (detailed planning of scope, objectives,
resources) accounts for 50 percent of the cost increases for environmental remediation projects (IPA, 1993). IPA also found that in waste management projects, definition is usually done after only 6 percent of the design is complete, while the industry average is 15 percent (IPA, 1995). Project definition is closely correlated with cost growth (see the Project Planning section below for details). IPA found that of the 48 percent DOE cost premium for waste management projects, 11 percent could be eliminated if DOE's project definition were improved to equal the average for the private sector (IPA, 1996).

Value Engineering

In one study, IPA estimated that the cost of conducting an effective value-engineering program is about 0.5 percent of the anticipated savings (IPA, 1996). LCAM (DOE's broad guidance for project management and acquisition) defines value engineering as "an organized effort, directed by a person trained in value engineering techniques, to analyze the functions of systems, equipment, facilities, services, and supplies to achieve the essential functions at the lowest lifecycle cost that is consistent with required performance, reliability, availability, quality, and safety" (DOE, 1995a). Yet the DOE inspector general concluded that DOE has not fully developed and implemented an effective value-engineering program (DOE, 1998a). The U.S. Army Corps of Engineers has reported a $20 return for every dollar spent on value engineering, and GAO has reported that value engineering usually results in a net saving of 3 to 5 percent of project costs. By comparison, DOE savings for FY 1996 were about 0.2 percent, according to a report by the DOE Office of Inspector General (DOE, 1998a).

Project Team Turnover

The turnover rate for DOE project managers is nearly twice that of industry, and it has worsened over time (IPA, 1995).
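The value-engineering figures cited above imply straightforward arithmetic. A rough sketch follows; the $100M project size and the 4 percent gross-savings rate are hypothetical, while the $20-per-dollar return and the 3 to 5 percent savings range come from the USACE and GAO figures cited in the text:

```python
def ve_net_savings(project_cost, gross_savings_rate, return_per_dollar=20.0):
    """Estimate net value-engineering (VE) savings for a project.

    gross_savings_rate: fraction of project cost saved before netting out
        the VE program's own cost (hypothetical; GAO reports 3-5 percent net).
    return_per_dollar: dollars saved per dollar spent on the VE program
        (USACE reports a $20 return per dollar).
    """
    gross_savings = project_cost * gross_savings_rate
    program_cost = gross_savings / return_per_dollar
    return gross_savings - program_cost

# Hypothetical $100M project at a 4 percent gross-savings rate:
# $4.0M gross savings less a $0.2M program cost leaves about $3.8M net.
print(ve_net_savings(100e6, 0.04))  # 3800000.0
```

Against returns of this size, the reported DOE result of about 0.2 percent of project costs in FY 1996 is an order of magnitude below the GAO range.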
Lack of DOE Involvement at the Project Level

DOE waste management projects are generally led by project managers from contractor organizations, and DOE often has little or no representation on the project team (IPA, 1996). Although DOE's involvement at the project level has increased significantly since 1993, DOE's representation on project teams is still far less than the average involvement of owners in private-sector projects. Because DOE has little involvement at the project level, control of the project is in the hands of contractors, and DOE is often not in a position to provide effective oversight. All project owners are frequently called upon to make decisions, and if the owner is not closely involved and its decisions are not timely, the project budget and schedule will suffer. DOE is at a further disadvantage when contractor
project managers are not adequately trained as project managers. Successful owners in similar situations compensate by developing effective methods for owner representation and oversight. It is a truism in the industry that good clients make good projects, and owners who are knowledgeable, informed, experienced, and involved make the best clients.

Management Structure, Procedures, and Accountability

The management of DOE projects takes place within a complex organizational structure that does not have established, consistent procedures for managing projects. Various policies and guidelines are available, but they are inconsistently applied by program and field offices. DOE has no central authority with the mission of ensuring that projects are managed properly, although FM provides administrative and oversight functions within the limits of its authority. This section begins with a description of DOE's project management and organization, followed by a review of guidance documents and procedures for project management; it concludes with an overview of existing DOE policies and procedures for project management.

Project Management and Organization

The three principal secretarial offices (programs) (the Office of Science, EM, and DP), which account for approximately 75 percent of DOE's obligations, as well as the other offices, have distinctive methods and traditions of project management. Oversight responsibility for project management department-wide is assigned to FM, but, according to DOE policy and LCAM, project management is the job of the field offices. When FM was established, it was intended to be the organization within DOE accountable for project performance. The FM office was headed by an associate deputy secretary (now classified as a director), who was expected to have management authority over most of the field offices. However, this role never materialized. In fact, FM does not have line authority over any office or program.
DOE projects are developed and funded under the direction of the program offices and managed by the field organization responsible for oversight of the contractor's implementation of the project. For major engineering or construction projects, a DOE project manager is assigned and, if necessary, a technical support organization is provided. Project managers may be responsible for more than one project at a time if the projects do not warrant full-time managers. Most projects are managed by operations and field offices that oversee design, construction, and operations that are actually carried out by contractors and subcontractors. Management approaches vary with the type of contract (see
Chapter 4). The M&O (management and operating) and M&I (management and integration) contractors, which run many of DOE's larger facilities, are responsible for the facilities (including construction) at their sites, although direction, oversight, and approval authority are retained by the DOE operations office.

The contractor's project organization generally mirrors that of the DOE organization that manages the contract, with each component of DOE's organization having a counterpart in the contractor's. The organization is dictated largely by the request for proposal (RFP), which defines the scope of work and other requirements, such as the work breakdown structure, project control systems, reporting responsibilities, and regulatory requirements. The contractor prepares a detailed work breakdown structure and project organization chart designating the roles and responsibilities for each function. Once the functional roles and responsibilities and the organizational structure have been solidified, position descriptions are developed for key personnel, indicating individual roles and responsibilities, levels of authority, and reporting relationships.

For major system acquisitions, the contractor's project manager is generally selected early to provide leadership and direction during preparation of the proposal. After the contract has been awarded, the project manager and key staff members begin building a team. Policies and procedures are developed and implemented, and technical, cost, and schedule baselines are established. Communication is established within the project and with the customer (DOE and/or the M&O or M&I contractor). Periodic performance reports are prepared for the management of the contractor and DOE, including cost and schedule performance reports, variance analyses, and manpower reports.
The contractor is supposed to conduct quarterly program review meetings attended by personnel from DOE headquarters, the field office, and the M&O or M&I contractor. At these meetings problems are identified, action plans are developed, and implementation tasks are assigned to a lead organization and a responsible individual.

Project Management Guidance Documents

Project management at DOE is governed by four documents, in hierarchical order. At the highest and most general level is DOE Order 430.1, LCAM (Life Cycle Asset Management), which was originally issued in 1995 and was recently revised as DOE Order 430.1A (DOE, 1995b). LCAM is often dismissed by DOE staff as pertaining mainly to program management, but its tenets are applicable down to the project level. The second document, the Joint Program Office Direction on Project Management (JPODPM), is a list of procedures for project management signed by the three principal secretarial offices (DP, ER, and EM), the director of RW, and the office managers responsible to them.
The third document is the Energy Systems Acquisition Advisory Board (ESAAB) Notice (N 430.1), which addresses decision making by headquarters. The fourth is a set of about 30 LCAM Good Practices Guides (GPG-FM-20-various), which expand on the basic principles of LCAM. A diligent project manager would use LCAM to clarify responsibilities, the JPODPM for procedural guidance, and the Good Practices Guides for specifics in carrying out project management tasks. Three of the four documents contain mandatory requirements.

Life-Cycle Asset Management

The purpose of LCAM is defined as follows:

[LCAM] establishes high-level Departmental requirements in planning, acquiring, operating, maintaining and disposing of DOE's physical assets. The Order phases out 13 prior Departmental Orders in these functional disciplines and incorporates industry standards, a graded approach, and performance objectives. The LCAM Order focuses on performance or outcomes over process and allows the Operations/Field Offices the flexibility to develop their own systems once a site-specific performance agreement has been implemented. The LCAM Order specifies the minimum project management system requirements that all projects must comply with (DOE, 1995b).

In short, LCAM outlines a strategy for facility management, from initial planning through final disposition. Unlike the detailed project management directions and guidelines in the earlier DOE Order 4700.1, Project Management System (which LCAM replaced), LCAM contains only general guidelines for project management. The process, it says (in the "Requirements" section), "shall be an integrated, systematic approach that shall ensure, but shall not be limited to . . . a project management system based on effective management practices that is sufficiently flexible to allow for the size and complexity of the project." The section concludes with 15 requirements that lead to conceptual design, project execution, and project operation.
LCAM does not address the issues of reporting, federal and contractor responsibilities, or accountability. It does not specify an evaluation, reporting, or monitoring formula and does not provide guidelines for making specific critical decisions. The 430.1A revision to LCAM includes as an attachment the Contractor Requirements Document, which contains requirements for contractors that are similar to those imposed on DOE. LCAM concludes with a list of the responsibilities of the organizational elements of DOE. The program office responsibilities include: (1) reviewing infrastructure activity of field elements in coordination with FM; (2) reviewing field element performance, including design, scope, and cost; and (3) peer reviews of programs under its authority. Field elements are responsible for (1) coordinating all review and external oversight activities of the contractors; and (2) conducting independent design, scope, and cost reviews when the size and complexity of a project warrant them.
Joint Program Office Direction on Project Management

The JPODPM, finalized in January 1996, resulted from LCAM's assignment of authority to the program offices to provide guidance to the field and is intended to supplement the limited project management guidance found in LCAM (DOE, 1996b). The JPODPM applies only to the participating programs that have signed the directive (i.e., EM, ER, DP, and RW). A program-level distribution memo describes the document:

The JPODPM provides joint program direction to their respective field project management organizations. EM, ER, DP, and RW have consolidated their basic project requirements in the areas of planning, reporting, approval and change control thresholds. The JPODPM provides supplementary requirements to the LCAM Order in areas where the programs are in need of specific products or documents. The JPODPM may be used for project-specific technical reviews at the discretion of the participating Program Project Management Team members (DOE, 1996b).

The directive assigns responsibilities for critical decisions for major systems and other line-item projects, based on a "graded" approach to meet the needs of individual projects. The JPODPM applies to major systems, other line-item projects, operating/expense-funded projects, and general plant projects and capital equipment. According to JPODPM guidelines, however, the DOE headquarters program office determines the degree of planning and documentation required.

The JPODPM requires that the outcome of the conceptual phase of a project be documented in a conceptual design report (CDR) or other appropriate document. The approved CDR is the basis for project design and planning. Following the CDR, the JPODPM requires that the field element develop either a strategic system execution plan (SSEP), a project execution plan (PEP), or, for environmental restoration projects, a management action process (MAP) document.
The SSEP, PEP, or MAP constitutes an agreement between the headquarters organization and the field element and describes the management responsibilities and commitments for the project. The PEP also includes the schedule milestones, costs, project deliverables, project baselines (including project controls, change controls, and thresholds for change), and, if applicable, a performance-based measurement process. The PEP and the CDR together are the foundation for setting the scope, cost, and schedule baselines for a project. The JPODPM includes an implementation schedule and requires that all projects be in compliance; any departure must be reported to the appropriate headquarters program office. Table 2-1 shows the project management decision and approval levels for all programs except environmental restoration.

Energy Systems Acquisition Advisory Board Notice

The ESAAB Notice focuses on the authority and responsibility for making the four
critical decisions (CDs), rather than on the technical and documentary material on which the decisions are based. The CDs are: (1) approval of mission need; (2) approval of baseline; (3) start of construction; and (4) completion or start-up of operations. The stated purpose of the ESAAB Notice is as follows:

The ESAAB Notice provides supplemental requirements to the LCAM Order that "dovetail" with the Department's physical asset management processes and clarifies Secretarial level processes in executive decision-making. The ESAAB Notice addresses the Headquarters' decision-making process for project critical decisions and baseline change control (DOE, 1997b).

TABLE 2-1 Project Management Decision, Documentation, and Approval Levels Based on Total Estimated Cost

Decision/Document | Strategic Systems (as designated by the DOE secretary) | Other Line-Item Projects (> $50M) | Other Line-Item Projects (< $50M) | General Plant Projects (< $2M)
Approval of mission need (Critical Decision 1) | DOE secretary or as delegated | HQ program office | HQ program office | Field element
Conceptual design report | DOE secretary or as delegated | HQ program office | Field element | Field element
Approval of baseline (Critical Decision 2) | DOE secretary or as delegated | HQ program office | Field element | Field element
Approval to start construction (Critical Decision 3) | DOE secretary or as delegated | Field element | Field element | Field element
Completion/start of operations (Critical Decision 4) | DOE secretary or as delegated | Field element | Field element | Field element

Source: DOE, 1996b, Attachment 1.

Good Practice Guides

FM has issued more than 30 LCAM Good Practice Guides as references for project managers. The guides, which range from "Test and Evaluation" to "Comprehensive Land-Use Planning," provide information for meeting ESAAB, LCAM, and JPODPM requirements. These nonmandatory guides are intended to assist operations and field offices in developing performance-based management systems.
The guides are based on industry practices and are not prescriptive.
Lack of Systematic Policy Application

DOE's policy guidelines for project management have not been applied widely or systematically. Because the guidelines are largely voluntary and because there is no central management authority, they do not provide the basis for a professional project management organization. Most DOE projects are carried out by contract, and many contractors have their own management systems, which they use in lieu of DOE's. The scope of the Contractor Requirements Document addition to LCAM is broad enough that almost any project management system deployed by a contractor will comply. The guidelines for project management are not clear enough to ensure effective oversight by DOE in its role of "managing the contractor."

Finding. DOE does not have adequate policies and procedures for managing projects. No single authority is responsible for enforcing or ensuring that project management tools are used.

DOE employees associated with projects told the committee that the JPODPM is the most useful of the DOE project management documents, but many admitted that they did not feel constrained to follow its directions in their particular applications. In the committee's judgment, the project management documents are not detailed enough to ensure the effective implementation of a project management system. For example, DOE projects do not consistently use project management plans (PMPs) to define the organization of projects and the roles and responsibilities of the parties involved, although such plans are standard in industry and the JPODPM calls for them (see examples in Chapter 3). Although the committee appreciates that guidelines must allow flexibility to meet special circumstances and make room for innovation, the voluntary nature of the guidelines has become an invitation to nonadherence and a license for each headquarters program office, operations office, and field office to proceed in its own way.
In fact, individual offices throughout DOE have issued their own project management documents, which vary in scope and quality. In general, the early stages of projects have been overlooked. For example, DOE has had limited success in implementing value engineering practices, which often must be completed during the early phases of a project to have a significant impact on costs. DOE has developed (although not consistently applied) comprehensive practice guides for the design and construction phases of projects but has not developed comparable guidelines for the early conceptual and preconceptual phases, when the potential for substantial savings in both time and cost is high (NRC, 1998). The committee considers CD-1 (approval of mission need) and CD-2 (approval of baseline) essential for defining credible and achievable project baselines.
Lack of Focus on Project Management Expertise

DOE does not have a central direction for project management. FM was intended to have line authority over field managers but was never given that authority; today FM has only advisory and oversight functions. Field managers continue to report both to the director of field management and to one or more program assistant secretaries. In short, FM provides staff support but is not a major influence on project management practices (Peters, 1998). In fact, given its present organizational position, FM cannot effectively direct or oversee project management. Its main instruments of oversight are a quarterly report on the status of major and strategic systems and administration of the annual project validation process to support the chief financial officer in the budget process.

Studies by IPA (1996), the Construction Industry Institute (CII, 1991, 1994), and the Business Roundtable (BRT, 1997) have found that better-than-average project systems have some form of central organization responsible for controlling project definition, maintaining discipline, and integrating management activities. In the absence of an organization to maintain control over project management and carry out uniform policies, DOE has relied on program and field elements to accomplish projects.

In addition to modifying its organizational framework to support project management, DOE should benchmark project management against generally accepted industry practices. Based on the collective expertise of its members, the committee developed a list of characteristics that contribute to the successful completion of large, often one-of-a-kind projects. The characteristics, presented in Appendix C, do not define a process but are formatted as a checklist that could be applied to DOE projects.
Quality Performance Standards Based on the ISO 9000 Process

Many organizations that have recently reorganized to improve quality have sought certification under the ISO 9000 quality performance standards established by the International Organization for Standardization (ISO), a Geneva-based international agency responsible for global standardization. These standards have been widely embraced by private-sector and government organizations in the United States and abroad. In its simplest application, the ISO 9000 process requires that an organization define what it does, how it will do it, what records will be kept, and who the responsible parties are for all operations. The organization must then show that its policies and procedures are (1) consistent with the organization's purpose; (2) universally applied, understood, and followed; and (3) maintained as the basis of doing business. For an organization to be certified, it must clearly define its purposes, missions, and goals; purge excessive procedures
and policies; and replace them with simple, straightforward documents that provide only essential instructions to staff. The organization then undergoes a certification review conducted by an accredited registrar. Reviewers note extraneous or conflicting instructions and shortcomings in the quality performance program. Certification shows that the operation has a clear plan, procedures, and policies. ISO 9000 certification tends to reduce paperwork, eliminate nonessential activities, reduce operating costs, and improve performance, but it requires a sustained effort by the leaders and staff of the organization. Annual recertification requires that the organization continue to operate in the certified mode. A process like ISO 9000 certification could help DOE remake the operating policies, procedures, and standards that have accumulated over the past 50 years.

Finding. ISO 9000 provides a certification process by which an organization can measure itself against its stated processes, but DOE has not obtained certification. The certification process would help DOE remake its operating procedures and standards and make its practices consistent with its procedures.

Recommendation. DOE, as an organization, should obtain and maintain ISO 9000 certification for all of its project management activities. To accomplish this, DOE should name one office and one individual responsible for acquiring and maintaining ISO 9000 certification for the whole department and should require that consultants and contractors involved in the engineering, design, and construction of projects also be ISO 9000 certified.

Project Planning

DOE's project planning has not been effective, although there are exceptions, such as the successful Advanced Photon Source project at Argonne National Laboratory and the B-Factory at the Stanford Linear Accelerator Center.
Recurrent problems with project management have raised questions about the credibility of DOE's conceptual designs and cost estimates (NRC, 1998). Findings by the Business Roundtable (BRT, 1997) and the Construction Industry Institute (CII, 1991, 1994), and several years of research on hundreds of projects by IPA (1993, 1995, 1996), show that preconstruction planning is one of the most important factors in successful projects.

Preconstruction Planning

In March 1998, a government-industry forum on capital facilities and core competencies was held in Washington, D.C. The forum was sponsored by the Business Roundtable, the Naval Facilities Engineering Command, and the Federal Facilities Council, of which DOE is a member. The forum report concluded:
The best capital project systems maintain the in-house resources necessary to develop and shape projects in the advance planning phase and to bind the owner functions together to find the right project and prepare for efficient execution. Finally, they all maintain some form of central organization responsible for preparing the work process, for advance planning, to provide the skills and resources, to pull in critical core competencies, and to provide the interpersonal organizational structure that binds the operations, business, engineering, maintenance, outside organizations, and affected project systems (Federal Facilities Council, 1998).

Through research and practice, the construction industry has documented the benefits of preconstruction planning in terms of cost and schedule, and many committee members have firsthand experience with preconstruction planning and its associated processes and documentation. Effective preconstruction planning involves several steps:

1. Key personnel for all aspects of the project (design, construction, start-up, maintenance, and operations) must be involved from the outset.
2. A strategic plan must be developed that defines mission needs and relates them to project requirements.
3. An integrated project plan must be prepared that addresses the overall strategy for acquiring the end product or service, identifies the interfaces, and establishes measures of success.
4. An integrated regulatory plan must be developed to identify regulatory interface points and requirements and to establish constraints and boundary conditions that must be accommodated.
5. A detailed project execution plan can then be prepared to establish the tactics, organizational relationships, roles and responsibilities, and precise steps for executing the various aspects of the project.

Finding. DOE preconstruction planning is inadequate and ineffective, even though preconstruction planning is one of the most important factors in achieving project success.

Recommendation. DOE should require that strategic plans, integrated project plans, integrated regulatory plans, and detailed project execution plans be completed prior to the establishment of project baselines. To ensure facility user and program involvement in the preconstruction planning process, DOE should require written commitments to project requirements from the ultimate users.

Setting Baselines Prematurely

A baseline is a set of technical, scope, cost, and schedule parameters that describe the expected capabilities, cost, and duration of a project. In principle, baselines are based on adequate definition of engineering designs, with appropriate allowances for uncertainties (contingencies), and are included in budget
submissions. However, there is often considerable pressure to adjust estimated costs to fit the anticipated budget authorization. For example, some DOE personnel stated that Congress would not approve project capital requests of more than $1 billion, so estimates were adjusted to fall below this number. Such practices inevitably result in cost overruns and/or subsequent reductions in project scope.

DOE generally establishes project baselines after only 2 or 3 percent of the design work has been completed (Tavares, 1998), which the committee believes is premature. The consensus in private industry, the Department of Defense's Military Construction Program, and the U.S. Army Corps of Engineers Civil Works Program is that the appropriate point to develop a project estimate suitable for budgeting is at the 30 to 35 percent design completion stage (McGinnis, 1998).

DOE has no designated source of funding for the preparation of baselines and apparently does not explicitly budget for prebaseline engineering, making it difficult for project managers to support preconstruction planning. Other federal agencies have recognized the value of prebaseline engineering and budget for it accordingly. For example, the Department of Defense's Military Construction Program includes a line item for planning-and-design funds in the preliminary design stage in its budget requests to Congress.

DOE has recently made some attempts to set baselines at a more credible stage of design. According to a recent report, "Managing to the Baseline," FM has made improved baselines its highest priority (DOE, 1998b), and corresponding efforts are under way with Congress and the Office of Management and Budget (OMB) to revise upward the required design content of project baselines and to provide the appropriate design funding in the FY 2001 budget (Peters, 1998). DOE's chief financial officer has supported this effort by including the revision of project baselines as part of the FY 2001 budget call.
If total funds for baseline preparation are not increased, fewer projects will probably be undertaken by each program. Changing the baseline funding levels might appear to reduce the flexibility of program managers to manage their programs. Nevertheless, the committee believes that the possibility of reducing the overall number of projects would be much less damaging than premature baselines have been. The committee believes that agreements among DOE, OMB, and Congress on better baseline definition should be established as soon as possible.

Finding. DOE often sets project baselines too early, usually at the 2- to 3-percent design stage, sometimes even lower. (An agreement between Congress and DOE's chief financial officer to establish baselines at the 20- to 30-percent design stage is scheduled to be implemented beginning with fiscal year 2001.)

Recommendation. DOE should significantly increase the percentage of design completed prior to establishing baselines. Depending on the complexity of the
project, the point at which project baselines are established should be between the completion of conceptual design and the completion of preliminary design, which should fall between 10 and 30 percent of total design. The committee supports continuing efforts by Congress and DOE to develop project baselines at a point of adequate definition beginning with fiscal year 2001.

Recommendation. Baseline validation should be assigned specifically to the project management office recommended in this report. The Military Construction Program of the Department of Defense, which requests planning and design funds for all projects in the preliminary design stage on the basis of total program size, is a potential model for DOE.

Project Estimates and Budgets

Project estimates are not the same as budgets. An estimate is a forecast of project costs. A budget includes the estimate plus contingency factors to cover future uncertainties, modified by professional judgment. A budget may be a target figure for the project, a cap that the project must meet, and a tool for project discipline. (Discussions of DOE's project costs, estimates, contingencies, and durations can be found in Appendix A and Appendix B.)

It is often erroneously believed that reducing budgets can reduce costs, but this is rarely true; reducing budgets may only increase budget overruns. Costs can be reduced, however, by faster project completion, reductions in project scope, better project definitions, redesigns, value engineering, rigorous change control, better quality control, more effective management, and more efficient design and construction through stronger incentives, shared lessons learned, and more effective competition. Considerable pressure has been generated from within DOE and from OMB to reduce project budgets to fit the funds expected to be available (OMB, 1997). Cost overruns are inevitable, however, if project budgets are arbitrarily reduced without associated reductions in scope.
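The estimate-versus-budget distinction, and the futility of arbitrary budget cuts, can be made concrete with a minimal sketch; all figures below are hypothetical illustrations, not DOE data:

```python
# Minimal sketch of the estimate/contingency/budget distinction.
# All figures are hypothetical illustrations, not DOE data.

def budget_from_estimate(estimate: float, contingency_rate: float) -> float:
    """A budget is the base estimate plus a contingency allowance."""
    return estimate * (1.0 + contingency_rate)

base_estimate = 100.0  # forecast cost of the defined scope ($ millions)
budget = budget_from_estimate(base_estimate, contingency_rate=0.25)

# Arbitrarily capping the budget does not change the underlying cost:
# with scope and execution unchanged, the cut reappears as an overrun.
capped_budget = 90.0
likely_cost = base_estimate
overrun = max(0.0, likely_cost - capped_budget)

print(f"budget={budget:.1f}, overrun under cap={overrun:.1f}")
```

A real budget would of course derive its contingency from risk analysis rather than the flat 25 percent used here for illustration, as the next section argues.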
Managing Risk and Contingency

Contingency allowances are required to pay for unforeseen but inevitable circumstances that cause costs to increase during the course of a project. Confidence bands can be established statistically on the basis of experience with other projects, taking into account location, project complexity, and outside influences. Unjustified reduction of contingency is not a way to save money; it often results in extra costs associated with delays while additional funds are acquired, or with reductions in project scope to pay for unforeseen occurrences.

Unfortunately, experience suggests that risk management is not central to DOE's planning, budgeting, and acquisition process (DOE, 1998c; OMB, 1997). DOE, as a government agency, should be risk neutral. In general, however, DOE
appears to be risk-averse in terms of cost and schedule accountability. At the same time, DOE has often taken risks on projects that private industry, or even other government agencies, would not have taken:

- commitments to cleanup and remediation projects based on unproven technologies
- commitments to a single cleanup technology without investigating alternatives
- awarding fixed-price contracts without clearly defined scope or conditions
- initiating remediation projects before wastes have been adequately characterized
- initiating projects based on estimates made at very early stages of definition, with very low degrees of confidence
- initiating projects without adequate preconstruction planning
- initiating projects before project managers and other required staff have been identified

DOE cost estimates violate OMB's policy on contingency allowances in a number of ways (OMB, 1992):

- the effects of uncertainty are not analyzed
- no probability distributions are given for costs
- past biases or over-optimism in cost estimates are not considered in preparing new cost estimates
- no sensitivity analyses are performed
- cost estimates are not expected values

Based on the committee's observations, DOE, Congress, and other stakeholders do not communicate with each other effectively about estimated project costs and durations. Even the definitions of cost estimates and contingencies are inconsistent. Sometimes the estimated cost of a project is confused with the baselined TPC or the appropriated budget for the project. In many cases, the estimated cost for a project (based on preliminary designs), the baseline TPC (a more accurate number), and the project budget (allocated funds) all differ. In the end, of course, the TPC will equal the funds allocated, but DOE often complicates the situation by failing to communicate its difficulties. Although there is a great deal of talk about risk, definitions of the term vary.
Because DOE has no standard method for assessing project risk, comparing projects, methods, and contracts is difficult. DOE often bases contingency allowances on a fixed percentage of the total estimated cost rather than on an assessment of the risks of success and failure, and contingency allowances on DOE projects are routinely low (see Appendix B). Contingencies should accurately reflect the risks and uncertainties inherent in the work and should relate to the degree of uncertainty (e.g., the fewer the unknowns, the lower the contingency). One-of-a-kind projects, which have many unknown characteristics and may involve unproven technologies, have a higher probability
of not meeting initial estimates. Allowances for cost growth and unknown costs should be developed through risk, contingency, or scenario analyses. Sources of bias should be addressed through sensitivity analyses and independent reviews that evaluate the assumptions used in the cost and duration estimates, and those estimates should be robust against changes in assumptions. An explanation of the basis for the contingency allowances in a cost estimate should be submitted with the project proposal. Confidence factors, or the likelihood that a given budget or schedule will not be exceeded, should be associated with all cost and duration estimates at all stages, so that project proponents, participants, and sponsors all have a clear idea of the risks of overruns. DOE project estimates, however, do not contain all of this information. DOE often presents point estimates with no indication of their reliability and seems to take no cognizance of the fact that some projects have higher risks of overrunning estimated costs and schedules and should therefore have higher contingency allowances.

The availability of small yet powerful computers has ushered in a new era of risk assessment and analysis. Many firms and government agencies are using risk analysis techniques to identify, measure, and manage project risks. Yet DOE seldom conducts formal risk assessments or analyses for its projects. Results from risk analyses can be used to define contingency amounts for both budget and schedule. Technical performance risk assessments can be used to guide design decisions away from risky alternatives. Risk assessments conducted early in the project life cycle can afford a project team an opportunity to identify project risks and mitigate their effects. Periodic project risk assessments can be used to suggest effective contracting arrangements for shifting and sharing project risks.
Considering the size and technology of DOE projects, it is very surprising that DOE conducts so few formal project risk assessments.

Finding. DOE often sets project contingencies too low because they are based on the total estimated cost of a project rather than on the risk of performing the project.

Finding. DOE does not always use proven techniques for assessing the risks of major projects in terms of cost, schedule, and scope.

Recommendation. DOE should establish contingency levels for each project based on acceptable risk, degree of uncertainty, and desired confidence levels for meeting the baselines. The responsibility and authority for managing contingencies should be assigned to the project manager responsible for doing the work. In evaluating potential projects, DOE should apply risk assessment and probabilistic estimating techniques, as required by the Office of Management and Budget.
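A risk-based contingency of the kind recommended here can be sketched with a simple Monte Carlo cost model. The work-element cost ranges, the triangular distributions, and the 80 percent confidence target below are hypothetical assumptions for illustration, not DOE figures:

```python
# Hedged sketch: sizing contingency from a probabilistic cost model
# at a chosen confidence level, instead of a flat percentage.
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# (low, most_likely, high) cost per work element, $ millions (hypothetical)
work_elements = [(8, 10, 15), (18, 20, 30), (45, 50, 80), (9, 10, 14)]

def one_trial() -> float:
    """One simulated total project cost (independent triangular draws)."""
    return sum(random.triangular(low, high, mode)
               for low, mode, high in work_elements)

samples = sorted(one_trial() for _ in range(10_000))

# PERT-style point estimate: (low + 4*mode + high) / 6, summed over elements
point_estimate = sum(lo + 4 * m + hi for lo, m, hi in work_elements) / 6

p80 = samples[int(0.80 * len(samples))]  # cost not exceeded with 80% confidence
contingency = p80 - point_estimate       # risk-based contingency allowance

print(f"point estimate {point_estimate:.1f}, P80 {p80:.1f}, "
      f"contingency {contingency:.1f}")
```

The skewed ranges (the third element can run from $45 million to $80 million) are what push the high-confidence cost above the point estimate; a flat-percentage contingency would ignore exactly that asymmetry.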
Tracking Project Progress

Earned value is a method for tracking progress that relates the actual cost of the work performed to the budgeted cost and schedule (Abba, 1997). Comparisons of planned values to actual performed (earned) values provide an objective assessment of cost performance. The Department of Defense and other organizations use earned value as a metric and an early warning system for managing projects. DOE, however, does not use it. Moreover, DOE has no consistent system for objectively, rather than subjectively, tracking progress and predicting cost and schedule overruns.

Finding. DOE does not effectively use available tools, such as earned value management, to track the progress of projects with respect to budget and schedule.

Recommendation. DOE should establish minimum requirements for a cost-effective earned-value performance measurement system that integrates information on a project's work scope (technical baseline), cost, and schedule. These requirements should be included in the request for proposal.

Change Control

Change control is the systematic evaluation, coordination, and approval or disapproval of proposed changes to the established design baselines. It includes verification that approved changes have been incorporated into the technical configuration baseline and the contract budget baseline. Because changes to the scope, cost, or schedule of a project may be proposed after a baseline has been established, the project must have a clear and efficient process for managing these changes. Once a baseline has been set, a rigorous change control procedure should be established to maintain technical, cost, and schedule discipline. Change control requires that a current working cost estimate be kept up to date with all approved changes and that all impacts of all proposed changes be fully evaluated and priced out prior to approval.
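The running working estimate that change control requires can be sketched as a simple change log in which no change moves the estimate until it has been priced and approved. The class name, project figures, and change descriptions below are hypothetical:

```python
# Hedged sketch of a current working cost estimate under change control.
# A change adjusts the estimate only after it is priced and approved,
# which keeps every departure from the baseline traceable.

class WorkingEstimate:
    def __init__(self, baseline_cost: float):
        self.baseline_cost = baseline_cost
        self.approved_changes: list[tuple[str, float]] = []

    def propose(self, description: str, priced_impact: float,
                approved: bool) -> bool:
        """Record a priced change only if it has been formally approved."""
        if approved:
            self.approved_changes.append((description, priced_impact))
        return approved

    @property
    def current(self) -> float:
        """Baseline plus the sum of all approved, priced changes."""
        return self.baseline_cost + sum(c for _, c in self.approved_changes)

we = WorkingEstimate(baseline_cost=200.0)  # $ millions, hypothetical
we.propose("added shielding per safety review", 12.5, approved=True)
we.propose("informal headquarters request", 8.0, approved=False)
print(f"working estimate: {we.current:.1f}")  # baseline + approved changes only
```

In this discipline the informal request leaves the working estimate untouched; it cannot become a de facto directive without being priced, approved, and logged.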
The LCAM Order (DOE Order 430.1), DOE's broad guidance for project management, includes this requirement, but there seems to be no mechanism for producing and maintaining a running estimate. For example, a 1997 LCAM self-assessment report by the Savannah River Operations Office noted that there was no policy or procedural requirement to maintain a working estimate throughout the life of a project (DOE, 1997c). Without an up-to-date working estimate, cost and schedule increases may not surface until a periodic review, which usually results in unanticipated cost growth, delays, and rework.

DOE's baseline management and change control systems have evolved from a very formal, detailed system defined in superseded documents, such as DOE Orders 4700.1 and 4300.1. Today the DOE change control process is
rather loosely guided by DOE's LCAM Order and its implementing directives, the Joint Program Office Direction on Project Management (JPODPM) and ESAAB Notice N 430.1. The JPODPM provides the most detailed guidance, including explicit decision-making authority.

DOE's policy guidance states that it is the responsibility of the field office, not DOE headquarters, to direct and oversee the contractor, because the field office is accountable for the contractor's performance. Despite this, managers in field offices told the committee that many managers in DOE headquarters continue to communicate directly with contractors. Even if they are primarily seeking information, these conversations often become de facto directives that can lead to changes in scope, cost, or schedule. Even a request for information outside normal reporting channels may divert contractor personnel and raise costs.

The absence of a disciplined change control process makes it difficult to hold contractors accountable for delivering projects that meet agreed cost, schedule, and scope requirements. It also diverts funds from necessary maintenance of operational and site infrastructure and erodes the confidence of DOE's stakeholders, including Congress and regulators. With a properly functioning change control system, project managers can monitor and control changes, limiting them to changes based on authorized internal project replanning or contractual obligations.

Finding. DOE does not have a consistent system for controlling changes in project baselines.

Recommendation. DOE should establish a system for managing change that provides control, traceability, and visibility for all baseline changes. Change control requirements should apply to the contractor, the field elements, and headquarters.

References

Abba, W.F. 1997. Earned value management: reconciling government and commercial practices. Program Manager Magazine (Special Issue) 1:58-63.

BRT (The Business Roundtable). 1997. The Business Stake in Effective Project Systems. Washington, D.C.: The Business Roundtable.

CII (Construction Industry Institute). 1991. Organizing for Project Success. Austin, Tex.: Construction Industry Institute.

CII. 1994. Pre-Project Planning Handbook. Austin, Tex.: Construction Industry Institute.

Conway, J. 1998. Defense Nuclear Facilities Safety Board's experience with U.S. Department of Energy project management. Presentation by John Conway, chair of the Defense Nuclear Facilities Safety Board, to the Committee to Assess the Policies and Practices of the Department of Energy to Design, Manage, and Procure Environmental Restoration, Waste Management, and Other Construction Projects, June 23, 1998, National Research Council, Washington, D.C.

DNFSB (Defense Nuclear Facilities Safety Board). 1993. Improving DOE Technical Capability in Defense Nuclear Facilities Programs. Washington, D.C.: Defense Nuclear Facilities Safety Board.
DNFSB. 1997. Review of the Hanford Spent Nuclear Fuel Project. Technical Report 17. Washington, D.C.: Defense Nuclear Facilities Safety Board.

DOE (U.S. Department of Energy). 1995a. Report on the Audit of the Department of Energy's Environmental Molecular Sciences Laboratory. DOE/IG-0371. Washington, D.C.: U.S. Department of Energy, Office of Inspector General.

DOE. 1995b. Life Cycle Asset Management. DOE Order 430.1. Revised October 14, 1998. Washington, D.C.: U.S. Department of Energy, Office of Field Management.

DOE. 1996a. Special Report on the Audit of the Management of Department of Energy Construction Projects. DOE/IG-0398. Washington, D.C.: U.S. Department of Energy, Office of Inspector General.

DOE. 1996b. Joint Program Office Direction on Project Management for Environmental Management, Energy Research, Defense Programs, and the Office of Civilian Radioactive Waste Management. January 1996 Issue and Distribution Memorandum. Washington, D.C.: U.S. Department of Energy.

DOE. 1997a. Audit of Renovation and New Construction Projects at Lawrence Livermore National Laboratory. WR-B-97-06. Washington, D.C.: U.S. Department of Energy, Office of Inspector General.

DOE. 1997b. Energy Systems Acquisition Board Procedures and Distribution Memorandum. DOE Notice N 430.1, approved October 28, 1997. Washington, D.C.: U.S. Department of Energy.

DOE. 1997c. Savannah River Life Cycle Asset Management Self-Assessment Report, 1997. Savannah River, S.C.: U.S. Department of Energy, Savannah River Operations Office.

DOE. 1998a. Audit Report: The U.S. Department of Energy's Value Engineering Program. HQ-B-98-01. Washington, D.C.: U.S. Department of Energy, Office of Inspector General.

DOE. 1998b. Managing to the Baseline. A Report to the Secretary of Energy. February 26, 1998; revised October 5, 1998, and December 1998. Washington, D.C.: U.S. Department of Energy, Office of Field Management.

DOE. 1998c. Report to Congress: Treatment and Immobilization of Hanford Radioactive Tank Waste. Phase I: Privatization Project Description. Washington, D.C.: U.S. Department of Energy, Office of Environmental Management.

Federal Facilities Council. 1998. Government Industry Forum on Capital Facilities and Core Competencies. Washington, D.C.: National Academy Press.

GAO (General Accounting Office). 1996. Department of Energy: Opportunities to Improve Management of Major System Acquisitions. Report to the Chairman, Committee on Governmental Affairs, U.S. Senate. GAO/RCED-97-17. Washington, D.C.: Government Printing Office.

GAO. 1997. Nuclear Waste: Department of Energy's Project to Cleanup Pit 9 at Idaho Falls Is Experiencing Problems. Report to the Committee on Commerce, U.S. House of Representatives. GAO/RCED-97-180. Washington, D.C.: Government Printing Office.

GAO. 1999. Major Management Challenges and Program Risks: Department of Energy. GAO/OCG-99-6. Washington, D.C.: Government Printing Office.

IPA (Independent Project Analysis). 1990. U.S. Department of Energy, Office of Defense Programs Project Cost Growth Study: Briefing. Reston, Va.: Independent Project Analysis, Inc.

IPA. 1993. U.S. Department of Energy, Office of Environmental Restoration and Waste Management, Project Performance Study. Reston, Va.: Independent Project Analysis, Inc.

IPA. 1995. U.S. Department of Energy, Office of Environmental Restoration and Waste Management, Project Performance Study, Waste Management Addendum. Reston, Va.: Independent Project Analysis, Inc.

IPA. 1996. U.S. Department of Energy, Office of Environmental Restoration and Waste Management, Project Performance Study Update. Reston, Va.: Independent Project Analysis, Inc.
McGinnis, C. 1998. Issues of construction project management and delivery. Presentation by Charles McGinnis, associate director (retired), Construction Industry Institute, to the Committee to Assess the Policies and Practices of the Department of Energy to Design, Manage, and Procure Environmental Restoration, Waste Management, and Other Construction Projects, August 3, 1998, National Research Council, Washington, D.C.

NRC (National Research Council). 1998. Assessing the Need for Independent Project Reviews in the Department of Energy. Board on Infrastructure and the Constructed Environment, National Research Council. Washington, D.C.: National Academy Press.

OMB (Office of Management and Budget). 1992. Revised Guidelines and Discount Rates for Benefit-Cost Analysis of Federal Programs. Circular No. A-94. Washington, D.C.: Executive Office of the President.

OMB. 1997. Capital Programming Guide. Supplement to OMB Circular A-11, Part 3: Planning, Budgeting, and Acquisition of Capital Assets. Washington, D.C.: Executive Office of the President.

Peters, F. 1998. Project management at the Department of Energy. Presentation by F. Peters, deputy director, Office of Field Management, to the Committee to Assess the Policies and Practices of the Department of Energy to Design, Manage, and Procure Environmental Restoration, Waste Management, and Other Construction Projects, August 3, 1998, National Research Council, Washington, D.C.

Tavares, A. 1998. The Office of Field Management's project management system. Presentation by A. Tavares, director, Project and Fixed Asset Management, Office of Field Management, to the Committee to Assess the Policies and Practices of the Department of Energy to Design, Manage, and Procure Environmental Restoration, Waste Management, and Other Construction Projects, June 22, 1998, National Research Council, Washington, D.C.

U.S. Congress. 1998a. Testimony before the Subcommittee on Oversight and Investigations, May 12, 1998, regarding the Hanford Spent Nuclear Fuel Project, by John T. Conway, chairman, Defense Nuclear Facilities Safety Board, and Ernest Moniz, under secretary, U.S. Department of Energy. Washington, D.C.: Government Printing Office.

U.S. Congress. 1998b. Energy and Water Development Appropriations Bill for Fiscal Year 1999. HR 105-581. Washington, D.C.: Government Printing Office.