5
Project Management Metrics
This chapter reviews the use of project management metrics by the Office of Environmental Management (EM) of the U.S. Department of Energy (DOE).1 EM, in coordination with DOE’s Office of Project Management (PM), has developed detailed processes and methods for tracking project-level outcomes and success measures for activities it has defined as projects (DOE, 2015a). For example, EM’s headquarters staff use earned value management (EVM) techniques to track and monitor project cost and schedule performance. Key measures, discussed in detail below, include the Cost Performance Index (CPI) and the Schedule Performance Index (SPI), along with other typical EVM measures, such as management reserve (MR), estimate at completion (EAC), total project cost (TPC), and funding profile. Additional project management metrics that are typically specific to a given situation, such as objectives linked to safety performance, removal of a specific amount of waste, or compliance with consent decrees, are not tracked by EVM techniques.
EM contractors are responsible for reporting project-level outcomes and their key measures through DOE’s Project Assessment and Reporting System (PARS) II. In addition to cost and schedule, PARS II supports tracking of project-specific metrics.2 This system provides up-to-date and reasonably
___________________
1 Project management metrics necessarily roll up into program performance metrics. However, this interim report focuses on project metrics; a future report by this committee will examine and discuss in detail EM’s program performance metrics.
2 See P. Bosco, “Project Management (PM) Governance, Systems and Training,” presentation to the committee May 6, 2020, Washington, D.C.
timely information to EM on a monthly basis so it can monitor, assess, and, if need be, take action to correct project problems as time elapses.
As with any project management measurement system, the PARS II and EM’s EVM systems are only as good as the information that the contractor puts into them. This has been an ongoing criticism of EM by the U.S. Government Accountability Office (GAO) as referenced in a number of reports (GAO, 2019b, c). For example, a lack of adequate scope definition during the front-end planning process creates an unstable baseline in which the scope changes or “creeps” as the project or program proceeds. This can lead to a situation in which the baseline is updated and the original baseline is lost, hence the metrics are not really indicative of the critical decisions (CDs).
EM’s portfolio of projects subject to DOE Order 413.3B, Program and Project Management for the Acquisition of Capital Assets, accounts for approximately 25 percent of its overall yearly budget.3 A large majority of activities are not defined as “projects” or fall outside the applicability of Order 413.3B in other respects and are therefore not similarly tracked and managed. For instance, projects characterized as operations—the majority of EM’s work, including activities such as groundwater remediation—are not tracked by EVM systems certified by the Office of Project Management (PM) per the requirements of Order 413.3B. This latter requirement is invoked for projects of more than $50 million that are currently classified as capital investments (DOE, 2015a; 2018a). As of October 31, 2020, 33 of 34 capital asset projects (CAPs) that were post CD-2 were tracked by an EVM system (EVMS). (The one project that was not had been approved for alternative project controls.4) EM advises but does not require contractors5 performing projects costing between $20 million and $50 million to use an EVMS, per Appendix C of Order 413.3B (DOE, 2018a), but nonetheless tracks earned value data for these smaller projects. EM defines projects smaller than $20 million as minor capital projects; they are aggregated into programs (i.e., not tracked separately), further limiting the reach of EVM requirements across its activities.
GAO states that:
EM manages most of its cleanup work as operations activities, under less stringent oversight requirements than capital asset projects. EM manages its cleanup work under different requirements, depending on whether it classifies the work as a capital asset project or an operations activity. EM currently manages most of its work as operations activities. In its fiscal year 2019 budget, operations activities accounted for 77 percent of EM’s budget (about $5.5 billion), and capital asset projects accounted for 18 percent (about $1.3 billion). (GAO, 2019a, p. 12)
___________________
3 Rodney Lehman, Director of Project Management, Office of Corporate Services, Office of Environmental Management (EM), Department of Energy (DOE), comments during the committee’s July 21, 2020, public data-gathering session.
4 Paul Bosco, DOE Office of Project Management, email to Martin Offutt, committee staff, November 11, 2020.
5 Rodney Lehman, Director of Project Management, Office of Corporate Services, DOE EM, comments during the committee’s July 21, 2020, public data-gathering session.
The following sections explore project management metrics in more detail.
EM’S REQUIREMENTS FOR THE USE OF PROJECT METRICS
In its discussions with EM and review of the documents provided and existing websites, the committee has identified five primary performance management approaches used by EM on its projects:
- EVMS and PARS (described in Chapter 3)
- Project dashboards
- Project evaluation and measurement plans (PEMPs)
- Contract and project performance metrics and targets
- Progress reports to Congress
In general, EVMS is an organization’s system for monitoring project/program management that integrates a defined set of associated work scopes, schedules, and budgets. An organization’s leadership uses performance management information, produced from the EVMS, to plan, direct, and control the execution and accomplishment of contract/project cost, schedule, and technical performance objectives (scope of work). EVMS is a robust approach to project management and is well defined for use government-wide. As described earlier, the EVMS approach is used for EM’s cost-based6 projects with contract values that exceed $50 million. By integrating scope, cost, budget, schedule, and risk, it can assess current performance and project future trends. Data are reported to and warehoused in DOE’s PARS II.7
Project dashboards8 are prepared monthly and provide a green, yellow, or red assessment of each active capital project, reflecting EM’s expectation that the project will meet its expected baseline cost (e.g., Monthly Cleanup Portfolio Report, DOE EM-5.22, Office of Project Management, January 2020). The color coding is assessed against cost, schedule, and scope. Evaluation criteria for these ratings were not identified, but as noted in Chapter 4 in the discussion of Order 413.3B Section 3c(4), point 3, allowing for a 10 percent overrun is undesirable.
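Because the evaluation criteria behind these ratings were not identified, the following is only a hypothetical sketch of how a threshold-based color rating might be derived from EVM indices; the thresholds and decision rule are illustrative assumptions, not EM’s actual method.

```python
def dashboard_color(cpi, spi, warn=0.95, alert=0.90):
    """Map EVM indices to a green/yellow/red rating.

    The 0.95/0.90 thresholds are purely illustrative; EM's actual
    evaluation criteria were not identified.
    """
    worst = min(cpi, spi)  # rate on the weaker of cost and schedule
    if worst >= warn:
        return "green"
    if worst >= alert:
        return "yellow"
    return "red"

print(dashboard_color(1.02, 0.97))  # green
print(dashboard_color(0.93, 1.00))  # yellow
print(dashboard_color(0.85, 0.92))  # red
```

A rule of this shape makes the rating reproducible from reported data, which the binary or color-only metrics discussed later in this chapter do not.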
PEMPs measure the contractors’ performance and are the primary tool to establish incentive and award fees earned by each contractor. They are
___________________
6 Fixed-price, lump-sum, and guaranteed maximum price (GMax) contract types are excluded.
7 The committee will evaluate the effectiveness of PARS as it relates to the EM program in its second report.
8 DOE, Office of Project Management, 2020, “Project Dashboard - June 2020,” June, https://www.energy.gov/sites/prod/files/2020/06/f76/June%202020%20Project%20Dashboard.pdf.
established at each cleanup site with EM HQ’s review based on the size of the contract. Chapter 7 provides detail on the criteria and rating methodology used for PEMPs. EM uses the phrase “key performance parameters” (KPPs) and describes KPP principles and their use in two documents, DOE Guide: U.S. Department of Energy Performance Baseline Guide (DOE, 2015b) and Special Notice—Modification to End State Contracting Model (DOE, 2018b, App. C9). The documents focus on establishing baseline project definition and design basis and suggest that KPPs be established for any area where changes will have a major impact. The documents do not offer sample KPPs.9 A best practices reference providing examples of KPPs used on successful projects would perhaps be helpful to practitioners.
EM has identified a list of performance metrics used to assess project performance. This list was originally titled the Overall Contract/Project Management Performance (OCPMP) and is reported quarterly10,11 (see Table 5.1). The goal of the metrics is to measure progress toward completing the scope of work for the contract and the entire life of an operations activity. Notable is that the number of metrics has decreased from 17 in 2008 to 7 in 2020, and the title of the report has been changed to “Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics.”12
REPORTING OF PROJECT METRICS
The committee has reviewed sample copies of EM’s project management reports,13 among other provided documents. These reports show EM extensively using EVM project control practices along with capital asset project dashboards, and corporate performance measures. Regarding EVM, EM routinely calculates the following indices: SPI, CPI, EAC, budget at completion (BAC), budgeted cost of work scheduled (BCWS), budgeted cost of work performed (BCWP), and actual cost of work performed (ACWP).
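These indices derive from three base quantities (BCWS, BCWP, and ACWP) plus the budget at completion. A minimal sketch of the standard calculations, using illustrative figures rather than EM data:

```python
def evm_indices(bcws, bcwp, acwp, bac):
    """Compute standard EVM measures from the base quantities
    (all in the same currency units)."""
    cpi = bcwp / acwp  # cost efficiency: value earned per dollar spent
    spi = bcwp / bcws  # schedule efficiency (dollar-based)
    eac = bac / cpi    # estimate at completion, forecast from CPI
    return {"CPI": round(cpi, 3), "SPI": round(spi, 3), "EAC": round(eac, 1)}

# Hypothetical mid-project snapshot, in millions of dollars:
# $120M of work scheduled, $100M earned, $110M actually spent,
# against a $400M budget at completion.
print(evm_indices(120.0, 100.0, 110.0, 400.0))
# {'CPI': 0.909, 'SPI': 0.833, 'EAC': 440.0}
```

Here both indices fall below 1.0, signaling that the project is over cost and behind schedule, and the EAC forecast grows accordingly.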
Effective implementation of an EVMS requires a transparent and reliable process and approaches that explicitly and clearly highlight the project’s temporal
___________________
9 The committee did not see examples of DOE’s KPPs.
10 DOE, “FY 2020 Second Quarter Report: Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics,” https://www.energy.gov/sites/prod/files/2020/05/f74/FY%202020%20Q2%20Project%20Management%20Performance%20Metrics%20Report.pdf, accessed August 11, 2020.
11 DOE, “FY 2008 4th Quarter Metrics: Overall Contract and Project Management Performance Metrics and Targets,” https://www.energy.gov/sites/prod/files/FY2008%204th%20Quarter%20RCA_GAO_OMB%20Attachmentv02%202008-11-17.pdf, accessed August 11, 2020.
12 The committee will explore what led to the reduced number of reporting metrics and changed title in its second report.
13 Catherine Bohan, DOE-EM, “NAS 3133 Response to Request for Additional Information #1 dated 03062020 (Item 14)” email April 8, 2020.
TABLE 5.1 Comparison of the DOE’s Office of Environmental Management’s (DOE-EM’s) Project Management Performance Metrics and Targets from 2008 to 2020
[Table rows are not recoverable in this copy; see the FY 2020 and FY 2008 quarterly reports cited in the source note for the full metric listings.]
NOTE: FY = fiscal year.
SOURCE: Data from DOE EM, “FY 2020 Second Quarter Report: Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics” and “FY 2008 4th Quarter Metrics: Overall Contract and Project Management Performance Metrics and Targets” (see footnotes 10 and 11 for URLs).
status. Such approaches bring transparency to cost and schedule overruns. Some observations on these documents include:
- The calculation of the SPI is based on dollars, not time. By extracting two new variables from the progress reports, namely: actual time of work performed (ATWP) and scheduled time of work performed (STWP), a revised SPI(t) (equivalent to STWP/ATWP) could be created and would better track schedule performance. Figure 5.1 depicts these two new variables.
The difference between these two methods can be dramatic. For example, if a project scheduled for 4.5 years with a BCWS of $300 million actually finishes in 6 years, the traditional SPI at completion is 1.0, whereas the revised SPI(t) at completion is 0.75; the traditional schedule variance at completion is 0, whereas the revised schedule variance(t) is negative 1.5 years. SPI(t) captures the impact of increased time on performance whereas SPI does not, and thus SPI(t) may be used to forecast schedule delays.
- Including the percentage of cost over (under) run, compared to the baseline (i.e., original critical decision (CD)-2 TPC) in the project success metrics would provide more clarity. Some projects have significant cost overruns (e.g., some EM projects have more than doubled their baseline
cost and are not yet complete) and others have lower cost overruns. There are also some projects that finished exactly at the estimated cost. Currently, EM integrates all cost overruns into binary success metrics of Yes/No, which does not provide information on the magnitude of a cost overrun or underrun (i.e., the variance).14 The variance can be calculated as the difference between BAC and EAC, with the latter determined as BAC divided by CPI.
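The two adjustments sketched above can be expressed directly. The schedule figures below reproduce the worked example from the text; the cost figures (a $400 million baseline and a CPI of 0.8) are illustrative assumptions.

```python
def spi_time(stwp, atwp):
    """Time-based schedule index SPI(t): scheduled time of work
    performed divided by actual time of work performed."""
    return stwp / atwp

def percent_cost_variance(bac, cpi):
    """Forecast overrun (+) or underrun (-) at completion as a
    percentage of baseline, using EAC = BAC / CPI."""
    eac = bac / cpi
    return (eac - bac) / bac * 100.0

# Schedule example from the text: planned for 4.5 years, finished in 6.
print(spi_time(4.5, 6.0))  # 0.75, whereas dollar-based SPI is 1.0
print(4.5 - 6.0)           # -1.5 years schedule variance(t)

# Hypothetical cost example: $400M baseline, CPI of 0.8.
print(percent_cost_variance(400.0, 0.8))  # 25.0 -> 25% forecast overrun
```

Unlike a binary Yes/No success flag, the signed percentage preserves the magnitude of the overrun or underrun for comparison across projects.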
A robust, reliable, effective, and efficient governance process for the EVMS provides EM headquarters with more clarity on projects’ status. However, several reviews of EM’s EVMS indicate issues with its implementation and governance process. Examples provided by EM include: the certified EVMS is not fully used; a governance process is not in place; and some datasets provided by contractors are not accurate, complete, repeatable, and auditable (see Table 5.3 for more examples and references). Further investigation of the linkage between the governance and data collection processes, on the one hand, and effective implementation of EVMS, on the other, could be of assistance to EM.
Throughout the review of documents that EM shared with the committee, DOE made several statements that led to specific concerns associated with EVMS and its implementation. Table 5.3 contains a list of statements that were made in the existing documents by EM and its contractors related to EVMS.
All of these issues indicate the need for a robust, reliable, effective, and efficient governance process for EVMS. Therefore, for the second phase of this study, the committee plans to review EVMS governance in more detail, including:
- Current EVMS governance process, the involved parties, and their roles and responsibilities;
- Current EVMS certification process and enforcement of such certification;
- Current data collection processes for EVMS to ensure they are current, accurate, complete, repeatable, and auditable; and
- Current project control systems that EM actively uses.
Over the past 10 years, major projects around the world have adopted some form of digital design and workflow processes. Computer-aided design (CAD) and building information modeling (BIM) are the primary digital systems that improve collaboration, cost estimating, project visualization, scheduling, and project handover, among other functions. DOE participates in the U.S. Army Corps
___________________
14 Rodney Lehman, Director of Project Management, Office of Corporate Services, DOE EM, “Overview of DOE O[rder] 413.3B and EM Project Management Protocol for Demolition Projects,” presentation during the committee’s February 24, 2020, Washington, D.C., public data-gathering session, Slide 11. Also see Cathy Bohan, DOE, “Project Success List.xls” in “NAS 3133 Response to Request for Additional Information #1 dated 03062020 (Item 6)” email to committee staff, March 25, 2020.
TABLE 5.3 List of Earned Value Management System (EVMS)-Related Issues that Were Explicitly Stated in the Documents Provided to the Committee
[Issue descriptions are not recoverable in this copy; source (a), (b), or (c) below identifies the document in which each issue was stated.]
SOURCE: (a) “14_January 2020 Master Segment Quad Charts 03.02.20.pdf,” slide 25; (b) “14 January 2020 Monthly CAP quad charts 03.03.20.pdf,” slide 12; (c) “14 January 2020 Monthly CAP quad charts 03.03.20.pdf,” slide 16.
of Engineers CAD/BIM Technology Center and the A/E/C CAD Standard.15 The committee’s review of the Central Plateau Cleanup Contract–Final Request for Proposal (RFP)16 and Section H of that RFP did not find DOE requirements17 for a BIM execution plan or other forms of digital delivery.
Projects in the United States have begun to follow ISO 19650, following its successful use in the United Kingdom.18 These standards are best practices for BIM collaboration and production. The standard integrates the project’s work and organizational breakdown structures (WBS and OBS) and enhances project estimating, scheduling, and status. EM may want to investigate ISO 19650 and, moving forward, determine a consistent requirement for inclusion in its contracts.
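A hedged illustration of the WBS/OBS integration described above (the structure, codes, and budgets here are hypothetical, not drawn from ISO 19650 or any EM project): intersecting the two structures yields control accounts, the level at which cost and schedule status rolls up.

```python
# Hypothetical sketch: intersecting a work breakdown structure (WBS)
# with an organizational breakdown structure (OBS) yields control
# accounts, the level at which cost/schedule status is rolled up.
wbs = {"1.1": "Site preparation", "1.2": "Foundations", "2.1": "Waste retrieval"}
obs = {"ENG": "Engineering", "CON": "Construction Operations"}

# Control accounts: (WBS element, OBS unit) -> budgeted cost ($M).
control_accounts = {
    ("1.1", "CON"): 12.0,
    ("1.2", "CON"): 30.0,
    ("2.1", "ENG"): 45.0,
}

# Roll budgets up by responsible organization, as a model-integrated
# reporting system might.
by_org = {}
for (w, o), budget in control_accounts.items():
    by_org[o] = by_org.get(o, 0.0) + budget

print(by_org)  # {'CON': 42.0, 'ENG': 45.0}
```

The same intersection supports rolling status up the WBS instead, which is what enables consistent estimating, scheduling, and status reporting across contractors.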
___________________
15 See Whole Building Design Guide, “CAD/BIM Technology Center: A/E/C CAD Standard,” https://www.wbdg.org/ffc/army-coe/cad-bim-technology-center.
16 See DOE, Environmental Management Consolidated Business Center, “CPCC Section H IDIQ,” https://www.emcbc.doe.gov/SEB/CPCC/Documents/RFP/CPCC_Section_H_IDIQ.pdf.
17 Every design and construction contractor has in-house digital standards and these are likely established through the contractors’ quality programs.
18 Organization and digitization of information about buildings and civil engineering works, including building information modeling (BIM)—Information Management Using Building Information Modelling—Part 1: Concepts and Principles.
Recent GAO Findings and Recommendations
In a 2019 report, GAO stated that EM follows only 25 percent (3 of 12) of the Project Management Institute’s (PMI’s) project management guidelines (GAO, 2019b). Among the project management guidelines identified as not met, or minimally met, were (1) developing and maintaining an integrated master schedule using GAO best practices; and (2) establishing project-reporting systems/databases to provide a clear picture of project performance to management and to hold the contractor accountable. In its response to the GAO, EM stated it would issue an update to the policy. EM issued the new policy by memorandum19 in November 2020. Chapter 4 of the present report also considers PMI guidelines.
GAO further stated:
EM relies on contractors’ EVM systems to measure the performance of its contractors’ operations activities, but EM has not followed (i.e., has not met, has minimally met, or has partially met) best practices to ensure that these systems are (1) comprehensive, (2) provide reliable data, and (3) are used by EM leadership for decision-making—which are the three characteristics of a reliable EVM system. Moreover, EM has allowed the contractors to categorize a large portion of their work in a way that limits the usefulness of the EVM data. (GAO, 2019b, p. 36)
A further example of project progress tracking and its impact on closure and perception is the Hanford Waste Treatment and Immobilization Plant (WTP) during the early 2010s. The facility was employing a strategy of feeding liquid tank waste into a pretreatment facility at the WTP that would separate the feed into two streams—low activity waste (LAW) and high activity waste—for subsequent treatment and immobilization in respective facilities for each type of waste.20 However, EM stopped the construction of the facility in 2012 due to technical challenges. Following a period of rework, the contractor proceeded under a new strategy that would allow LAW sourced directly from the tanks to be pretreated to remove cesium and solids in a new purpose-built facility, the LAW Pretreatment System (LAWPS).21 From there it would be fed to the Low Activity Waste Facility, which would immobilize the waste. Over half of the $752 million EM spent on the pretreatment facility of the WTP in fiscal years 2013 through 2018 was for overhead, oversight, procurements, and facility maintenance. According to the contractor’s EVM reports, 43 percent was spent resolving technical challenges
___________________
19 William I. White, DOE EM, 2020, “Issuance of the Environmental Management Program Management Protocol,” Memorandum for Distribution, Washington, D.C., November 6.
20 DOE, Office of River Protection, 2016, “Low Activity Waste Pretreatment System: RCRA Notice of Intent Meeting,” November 14, https://www.hanford.gov/files.cfm/Attachment_1_LAWPS_NOI_presentation_Nov_20161.pdf.
21 DOE, “Direct Feed Low-Activity Waste,” https://www.hanford.gov/page.cfm/DFLAW, accessed November 10, 2020.
(GAO, 2020a). Despite the halt in construction and rework, an EM press release on August 4, 2020, stated that the project remained on schedule.22
METHODS FOR TRACKING PERFORMANCE VALUE
As examples of project metrics in a performance-based approach, the committee provides the following, which may be of use in developing an organization-wide, consistent method of assessing the value gained by this relatively new approach. Through the information and documents it reviewed, the committee observed that these metrics differed from site to site and even within sites.
Project Performance Measures and Outcomes
In late 2018, EM changed its primary contracting method to a performance-based approach, ostensibly to reinvigorate and accelerate cleanup and reduce risk and financial liability (DOE, 2018b). Performance-based contracts focus on outcomes and results, in contrast to a focus on the processes used to achieve the results. EM introduced the use of an indefinite delivery/indefinite quantity (IDIQ)23 delivery model to allow flexibility in the scope, duration, and type of contractual commitment.
This section will focus on performance metrics and benchmarks and the ability to define project outcomes and performance measures. As noted previously, EM is familiar with KPP principles, which are referenced in two of its documents (DOE, 2015b; 2018b). DOE’s recent change in contracting method is a good time to reevaluate its metrics and KPPs at the project and program level. Some areas to consider:
- Does the flexibility of fixed price and cost-based contract types within an IDIQ conflict with exceptions for EVMS and PARS? For example, are too many or too few projects included?
- What is the median size of an IDIQ project, and does it exceed the $50 million contract exception?
___________________
22 The title of the article does not explain how the progress made on the facility could be related to project metrics and performance goals. This example also shows the difficulty of determining technology requirements for a first-of-its-kind facility, but also the very large costs of delay, especially once construction is under way. See DOE EM, 2020, “Hanford Tank Waste Pretreatment System on Schedule,” August 4, https://www.energy.gov/em/articles/hanford-tank-waste-pretreatment-system-schedule.
23 See J.S. Gansler, W. Lucyshyn, and A. Carl, 2012, An Evaluation of IDIQ Contracts for Service, Center for Public Policy and Private Enterprise, University of Maryland, January, https://jocexcellence.org/wp-content/uploads/2017/02/UMD_09014_An-Evaluation-of-IDIQ-Contracts-for-Service_January-2012.pdf, for an industry survey of IDIQ strengths and weaknesses.
- For the large number of projects that are below the threshold for EVMS and PARS, is there a guidance document for the field offices to ensure that minimum requirements are met? For example, is there consistency in safety metrics and in design, construction, and demolition performance measurement? How are those data aggregated and reported?
- How will changes in the IDIQ delivery model affect historical benchmarks established for large and small contracts?
- Does guidance on adjectival ratings exist and remain consistent?
Based on the committee members’ many years of experience working on major capital programs, the committee offers four performance measurement principles for developing a robust set of performance metrics:
Principle 1 Establish performance metrics consistent with delivery and contract forms and that can stand the test of time.
Successful outcomes are largely the result of a sound program management strategy. The strategy is necessary to translate the vision and intent across the enterprise or program to deliver desired outcomes. The program strategy is implemented by an organization through program-wide performance metrics and key performance indicators (KPIs) that measure project components. While benchmark performance expectations may change over time, the primary KPIs and metrics remain consistent and narrowly defined. This is particularly important for programs that are long-lived with multiple contractors and project managers.
To be clear, certain metrics’ importance may change throughout the project, but the individual metric should not. Such consistency allows for comparison across programs, projects, and tasks.
Principle 2 Limit KPIs to a handful at each level of execution.
At the project level,24 focus metrics on tasks, schedule, and costs. General categories for metrics and KPIs include:
___________________
24 Project KPIs are well established by all contractors. Autodesk performed a survey of 200 US-based contractors that measured frequently used KPIs in seven areas. They were:
- Consistency in capturing constructability issues in the bid documents;
- Logging requests for information;
- Documenting change order root cause and schedule impacts;
- Frequent schedule updates;
- Technology to manage safety and inspections;
- Labor productivity due to poor coordination, documents, and schedule; and
- Software to manage closeout activities. See Autodesk, Inc., “KPIs of Construction: Benchmarking the Industry,” https://www.autodesk.com/bim-360/kpi-construction-data-report-infographic.
- Financial
- Schedule
- Safety and Operational
- Quality
- Risk
Sound performance management is a data-intensive effort requiring data capture, data storage, normalization, and analysis. Since most data today are captured electronically at the source of production on large projects, it is easy to compile large sets of spreadsheets and performance measures for every operational issue. A common error is measuring too many details with far too many metrics. No metric is perfect, and all have some unintended consequences. Good metrics are actionable, easy to visualize, and supportive of the program strategy. For metrics, “less is more”: having fewer increases focus on desired outcomes.
Principle 3 Use benchmarks and metrics to foster competition.
Team competition among and across projects and programs will encourage productivity and innovation. Benchmark thresholds are often established to reflect expectations ranging from minimum to exceptional. Such threshold levels are difficult to determine, and securing stakeholder agreement and buy-in is often tedious and unproductive. In contrast, competition among similar teams offers an elegant way to challenge productivity and foster continuous improvement.
Principle 4 Capture, share, and train successes.
Allow top performance techniques to be shared program-wide. Except for a few patented processes, planning, design, construction, and operational innovations are short-lived. New approaches are shared via joint ventures, talent migration, and technical trade associations and papers. As early as the RFP stage in a project, processes to share technical ideas should be established by the customer.25 Contractual incentives can reward innovation but also demand that innovation be shared for future EM use.
The major takeaway concerning project metrics is that the relationship and importance of key metrics to driving program strategy is central to overall strategy attainment. The IDIQ delivery approach places greater emphasis on EM’s program management staff to establish metrics that improve performance and complement strategy. The importance of using metrics as a driver of continuous improvement and behavioral change cannot be overemphasized.
___________________
25 See discussion of Infrastructure Ontario, below.
In 2018, when Assistant Secretary White described the end-state contracting model (ESCM),26 her goals were to reduce risk and financial liability, accelerate cleanup, and share risk between government and industry. Guidelines for the ESCM focus on process, time to complete the procurement, and a post-award incentive fee. The ESCM guidelines do not offer guidance on how EM should address the lack of cost and schedule competition post-award, a strategy to share innovation, or the use of metrics or methods to assess best value. Other large infrastructure programs that rely on EM’s list of primary contractors may offer EM an opportunity to review their best practices to prepare thoughtful metrics prior to issuing a request for quote (RFQ). One such organization is Infrastructure Ontario (IO), described in Box 5.1; another is the environmental cleanup program at the Department of Defense’s (DoD’s) base realignment and closure (BRAC) and formerly used defense sites (FUDS), described in Box 5.2.
FINDINGS AND RECOMMENDATIONS
FINDING: DOE’s Office of Environmental Management, with the help of DOE’s Office of Project Management, has developed detailed processes and methods for tracking project-level outcomes and success measures. EM headquarters staff manage EM’s projects using earned value management, including key measures such as CPI, SPI, management reserve, EAC, TPC, and funding profile, among others. DOE-EM contractors report project-level outcomes and their key measures through DOE’s Project Assessment and Reporting System (PARS) II. This system provides EM monthly data on the projects it tracks and provides up-to-date and reasonably timely information that EM can monitor, assess, and act on. However, the committee found evidence that EM and its contractors are not following best practices in EVM reporting. Further, the committee found that the current metric (i.e., SPI) does not effectively track schedule performance.
FINDING: EM’s portfolio of projects (work subject to Order 413.3B) accounts for approximately 25 percent of its annual budget. The percentage of projects actively tracked using certified EVM systems (required for capital investment projects greater than $100 million) is even smaller. EM could similarly track a larger share of its activities, but does not.
FINDING: Joint task forces are common to military operations and are now used throughout the government.
___________________
26 Anne Marie White, Assistant Secretary for Environmental Management, Written Statement Before the Subcommittee on Strategic Forces, Committee on Armed Services, United States House of Representatives, April 9, 2019, https://docs.house.gov/meetings/as/as29/20190409/109269/hhrg-116as29-wstate-whitea-20190409.pdf.
RECOMMENDATION 5-1: The committee recommends that as the Office of Environmental Management (EM) increases its project management (PM) and Office of Project Management responsibilities using indefinite delivery/indefinite quantity (IDIQ) contracts, it should share and compare best PM practices with others across the U.S. government. To implement this, EM should form a “Joint Task Force” or less formal cooperative structure with Naval Facilities Engineering Systems Command (NAVFAC) and other base
realignment and closure (BRAC) and formerly used defense sites (FUDS) program management organizations.
RECOMMENDATION 5-2: The Department of Energy Office of Environmental Management (EM) should implement a modification to its earned value management system that captures the project’s temporal status more clearly and explicitly. Specifically, EM should immediately require that a revised Schedule Performance Index, SPI(t), the ratio of scheduled time of work performed (STWP) to actual time of work performed (ATWP), be reported to accurately track schedule performance.
RECOMMENDATION 5-3: The Department of Energy Office of Environmental Management should explicitly include the percentage of cost overrun or underrun in the project success metrics dashboard, rather than the current “green/yellow/red” metric, to bring more transparency to cost performance.
REFERENCES
DOE (U.S. Department of Energy). 2015a. DOE Guide: Earned Value Management System (EVMS). DOE G 413.3-10A Chg 1 (Admin Chg). October 22, https://www.directives.doe.gov/directivesdocuments/400-series/0413.3-EGuide-10a-admchg1/@@images/file.
______. 2015b. DOE Guide: U.S. Department of Energy Performance Baseline Guide. DOE G 413.3-5A Chg 1 (Admin Chg 1). https://www.directives.doe.gov/directives-documents/400series/0413.3-EGuide-05-chg1-admchg.
______. 2018a. DOE O[rder] 413.3B Chg 5 (MinChg), Program and Project Management for the Acquisition of Capital Assets: Change 5. April 12. https://www.directives.doe.gov/directivesdocuments/400-series/0413.3-BOrder-B-chg5-minchg/@@images/file.
______. 2018b. “Special Notice: Modification to End State Contracting Model.” Issued December 12 by DOE’s Office of Environmental Management. https://www.emcbc.doe.gov/SEB/em_escm/Documents/Document%20Library/ESCM%20Special%20Notice%20Combined%20Final%20Version%20X.pdf.
GAO (U.S. Government Accountability Office). 2019a. Department of Energy: Environmental Liability Continues to Grow, and Significant Management Challenges Remain for Cleanup Efforts. GAO-19-460T. Published May 1, 2019; publicly released May 1, 2019. https://www.gao.gov/products/GAO-19-460T.
______. 2019b. DOE Could Improve Program and Project Management by Better Classifying Work and Following Leading Practices. GAO-19-223. Published February 19, 2019; publicly released March 5, 2019. https://www.gao.gov/products/GAO-19-223.
______. 2020a. Hanford Waste Treatment Plant: DOE Is Pursuing Pretreatment Alternatives, but Its Strategy Is Unclear While Costs Continue to Rise. GAO-20-363. https://www.gao.gov/assets/710/706854.pdf.