5 Project Management Metrics

PROJECT METRICS

This chapter reviews the Department of Energy's Office of Environmental Management's (DOE-EM's) use of project management metrics.1 EM, in coordination with DOE's Office of Project Management (PM), has developed detailed processes and methods for tracking project-level outcomes and success measures for activities it has defined as "projects" (DOE, 2015a). For example, EM's Headquarters (HQ) staff use Earned Value Management (EVM) techniques to track and monitor project cost and schedule performance. Key measures, discussed in detail later in this chapter, include the Cost Performance Index (CPI) and Schedule Performance Index (SPI), along with other typical EVM measures such as Management Reserve (MR), Estimate at Completion (EAC), Total Project Cost (TPC), and funding profile. Additional project management metrics that are typically specific to a given situation, such as objectives linked to safety performance, removal of a specific amount of waste, or compliance with consent decrees, are not tracked by EVM techniques.

EM contractors are responsible for reporting project-level outcomes and their key measures through DOE's Project Assessment and Reporting System (PARS) II. In addition to cost and schedule, PARS II helps with tracking project-specific metrics.2 This system provides up-to-date and reasonably timely information to EM on a monthly basis so it can monitor, assess, and, if need be, take action to correct project problems as time elapses.

As with any project management measurement system, PARS II and EM's EVM systems are only as good as the information that the contractor puts into them. This has been an ongoing criticism of EM by the Government Accountability Office (GAO), as referenced in a number of reports (GAO, 2019b, 2019c). For example, a lack of adequate scope definition during the front-end planning process creates an unstable baseline in which the scope changes or "creeps" as the project or program proceeds. This can lead to a situation in which the baseline is updated and the original baseline is lost; hence, the metrics are not really indicative of the Critical Decisions (CDs).

EM's portfolio of projects subject to DOE Order 413.3B, Program and Project Management for the Acquisition of Capital Assets (hereafter "Order 413.3B"), accounts for approximately 25 percent of its overall yearly budget.3 A large majority of activities are not defined as "projects" or fall outside the applicability of Order 413.3B in other respects and are therefore not similarly tracked and managed. For instance, projects characterized as operations (the majority of EM's work, including activities such as groundwater remediation) are not tracked by EVM systems (EVMS) certified by the project manager per the requirements of Order 413.3B. This latter requirement is invoked for projects greater than $50 million that are currently classified as capital investments (DOE, 2015a; 2018). As of October 31, 2020, 33 of 34 capital asset projects (CAPs) that were post CD-2 were tracked by EVMS; the one project that was not had been approved for alternative project controls.4

1 Project management metrics necessarily roll up into program performance metrics. However, this interim report focuses on project metrics; a future report by this committee will examine and discuss in detail EM's program performance metrics.
2 See "Project Management (PM) Governance, Systems and Training," presentation to the committee by P. Bosco, May 6, 2020.
3 Comments by Rodney Lehman, Director of Project Management, Office of Corporate Services, Office of Environmental Management (EM), during the committee's July 21, 2020 public data-gathering session.
4 Paul Bosco, DOE Office of Project Management, email to Martin Offutt, NASEM, November 11, 2020.

EM advises but does not require contractors5 who perform projects costing between $20 million and $50 million to use EVMS, per Appendix C of Order 413.3B (DOE, 2018a), but it nonetheless tracks EV data for these smaller projects. EM defines projects smaller than $20 million as minor capital projects; these are aggregated into programs (i.e., not tracked separately), further limiting the EVM requirements on its activities. GAO states that:

EM manages most of its cleanup work as operations activities, under less stringent oversight requirements than capital asset projects. EM manages its cleanup work under different requirements, depending on whether it classifies the work as a capital asset project or an operations activity. EM currently manages most of its work as operations activities. In its fiscal year 2019 budget, operations activities accounted for 77 percent of EM's budget (about $5.5 billion), and capital asset projects accounted for 18 percent (about $1.3 billion). (GAO, 2019a, p. 12)

The following sections explore project management metrics in more detail.

EM'S REQUIREMENTS FOR THE USE OF PROJECT METRICS

In its discussions with EM and its review of the documents provided and existing websites, the committee has identified five primary performance management approaches used by EM on its projects:

• EVM System (EVMS) and PARS (described in Chapter 3)
• Project Dashboards
• Project Evaluation and Measurement Plans (PEMPs)
• Contract and project performance metrics and targets
• Progress reports to Congress

In general, an EVMS is an organization's system for monitoring project/program management that integrates a defined set of associated work scopes, schedules, and budgets. An organization's leadership uses performance management information produced from the EVMS to plan, direct, and control the execution and accomplishment of contract/project cost, schedule, and technical performance objectives (scope of work). EVMS is a robust approach to project management and is well defined for use government-wide. As described earlier, the EVMS approach is used for EM's cost-based6 projects with contract values that exceed $50 million. By integrating scope, cost, budget, schedule, and risk, it can assess current performance and project future trends. Data are reported to and warehoused in DOE's PARS II.7

Project Dashboards8 are prepared monthly and provide a green, yellow, or red assessment of each active capital project, measuring EM's expectation that the project will meet its expected baseline cost (for example, Monthly Cleanup Portfolio Report, DOE EM-5.22, Office of Project Management, January 2020). The color coding is assessed against cost, schedule, and scope. Evaluation criteria for these ratings were not identified, but as noted in Chapter 4 in the discussion of Order 413.3B Section 3c(4), point 3, allowing for a 10 percent overrun is undesirable.

5 Comments by Rodney Lehman, Director of Project Management, Office of Corporate Services, Office of Environmental Management (EM), during the committee's July 21, 2020 public data-gathering session.
6 Fixed-price, lump-sum, and guaranteed maximum price (GMax) contract types are excluded.
7 The committee will evaluate the effectiveness of PARS as it relates to the EM program in its second report.
8 Project Dashboard – June 2020, https://www.energy.gov/sites/prod/files/2020/06/f76/June%202020%20Project%20Dashboard.pdf.
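Because the evaluation criteria for these ratings were not identified, the following Python sketch is purely illustrative: it shows one plausible way such a color rating could be derived from EVM cost data, using an assumed 10 percent threshold motivated by the Order 413.3B discussion in Chapter 4. The function name and both thresholds are assumptions, not DOE practice.

```python
# Illustrative sketch only: DOE's actual dashboard criteria were not identified.
# Maps a project's forecast cost overrun against its baseline (CD-2 TPC) to a
# green/yellow/red rating, using hypothetical 5% and 10% thresholds.

def dashboard_rating(baseline_tpc: float, estimate_at_completion: float,
                     yellow_threshold: float = 0.05,
                     red_threshold: float = 0.10) -> str:
    """Return 'green', 'yellow', or 'red' from the forecast cost overrun.

    baseline_tpc: approved CD-2 Total Project Cost.
    estimate_at_completion: current EAC forecast.
    Thresholds are assumed values, not DOE's published criteria.
    """
    overrun = (estimate_at_completion - baseline_tpc) / baseline_tpc
    if overrun <= yellow_threshold:
        return "green"
    if overrun <= red_threshold:
        return "yellow"
    return "red"

# Example: a $400M baseline project now forecast at $436M (+9%) rates "yellow".
print(dashboard_rating(400e6, 436e6))
```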

PEMPs measure the contractors' performance and are the primary tool to establish the incentive and award fees earned by each contractor. They are established at each cleanup site, with EM HQ review based on the size of the contract. Chapter 7 provides detail on the criteria and rating methodology used for PEMPs.

EM uses the phrase "key performance parameters" (KPPs) and describes KPP principles and their use in two documents, DOE Guide: U.S. Department of Energy Performance Baseline Guide (DOE, 2015b) and Special Notice - Modification to End State Contracting Model (DOE, 2018b, Appendix C). The documents focus on establishing baseline project definition and design basis and suggest that KPPs be established for any area where changes will have a major impact. The documents do not offer sample KPPs.9 A best practices reference providing examples of KPPs used on successful projects would perhaps be helpful to practitioners.

EM has identified a list of performance metrics used to assess project performance. This list was originally titled the Overall Contract/Project Management Performance (OCPMP) and is reported quarterly10,11 (see Table 5.1). The goal of the metrics is to measure progress toward completing the scope of work for the contract and the entire life of an operations activity. Notably, the number of metrics has decreased from 17 in 2008 to 7 in 2020, and the title of the report has been changed to "Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics."12

9 The committee did not see examples of DOE's KPPs.
10 FY 2020 Second Quarter Report: Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics, https://www.energy.gov/sites/prod/files/2020/05/f74/FY%202020%20Q2%20Project%20Management%20Performance%20Metrics%20Report.pdf (accessed August 11, 2020).
11 FY 2008 4th Quarter Metrics: Overall Contract and Project Management Performance Metrics and Targets, https://www.energy.gov/sites/prod/files/FY2008%204th%20Quarter%20RCA_GAO_OMB%20Attachmentv02%202008-11-17.pdf (accessed August 11, 2020).
12 The committee will explore what led to the reduced number of reporting metrics and the changed title in its second report.

TABLE 5.1 Comparison of the Department of Energy's Office of Environmental Management's (DOE-EM's) Project Management Performance Metrics and Targets from 2008 to 2020

| FY 2020, 2nd Qtr Metric | FY 2008 Metric and Target |
|---|---|
| 1. Capital Asset Project Success: Complete 90% of capital asset projects at original scope and within 110% of CD-2 TPC. | 1. Capital Asset Line Item Projects: 90% of projects completed within 110% of CD-2 TPC by FY11. |
| | 2. EM Cleanup (Soil and Groundwater Remediation, D&D, and Waste Treatment and Disposal). |
| 2. Certified EVM Systems: Post CD-3, greater than $100 million. | 3. Certified EVM Systems: Post CD-3, 95% of line item projects and EM cleanup projects by FY11 and FY12, respectively. |
| | 4. PDRI Use: By the end of FY11, 80% of projects (>$100M) will use PDRI methodologies no later than CD-2. |
| | 5. TRA Use: By the end of FY11, 80% of projects >$750M will implement TRA no later than CD-2. |
| | 6. Federal Staffing: By the end of FY11, federal contract and project management positions (based on new model) are staffed at 80% of the desired level. |
| 3. Certified FPDs at CD-1: Projects have certified FPDs no later than CD-1. | 7. Certified FPDs at CD-1: By the end of FY11, 95% of projects have certified FPDs no later than CD-1. |
| 4. Certified FPDs at CD-3: Projects have FPDs certified at the appropriate level assigned to projects no later than CD-3. | 8. Certified FPDs at CD-3: By the end of FY11, 90% of projects have FPDs certified at the appropriate level assigned to projects no later than CD-3. |
| 5. Certified Contracting Staff: By the end of FY 2011, 85% of the 1102 contracting series will be certified. | 9. Certified Contracting Staff: By the end of FY11, 85% of the 1102 contracting series will be certified. |
| | 10. Projects Completed Below TPC: By the end of FY11, for all capital asset line item projects that are completed at CD-4, 50% are completed below their currently approved TPC. |
| | 11. Full Funding: By the end of FY13, 80% of capital asset line item projects (less than $50 million) are fully funded in one Fiscal Year (one Appropriation). |
| | 12. Cost Estimating Staffing: By the end of FY10, establish and staff (at 80% of authorized FTEs) a cost estimating and analysis organization in the Chief Financial Officer, Office of Cost Analysis (CF-70) organization. |
| | 13. Award Contracts within 25% of IGE: By the end of FY11, 80% of contract awards are within plus or minus 25% of the independent government cost estimate. |
| | 14. Contract Specialist Staffing: By the end of FY11, achieve a contract specialist to contract value ratio of 1 per $X* million or less. |
| | 15. FPD Staffing: By the end of FY12, achieve a FPD (including Deputy FPD(s), as applicable) to annual work in place ratio of 1 per $X* million or less, and/or in accordance with the staffing study. |
| 6. Schedule Compliance, Projects Greater Than 5 Years Duration: Projects will meet the project schedule metric that follows: from CD-3 to CD-4, projects greater than five years duration will be completed within 20% of the original CD-3/4 duration. | 16. Schedule Compliance, Projects Less Than 5 Years Duration: By the end of FY11, on a program portfolio basis, 90% of all projects will meet the project schedule metric that follows: from CD-3 to CD-4, projects less than five years duration will be completed within 12 months of the original CD-3/4 duration. |
| 7. Schedule Compliance, Projects Greater Than 5 Years Duration: Projects will meet the project schedule metric that follows: from CD-3 to CD-4, projects greater than five years duration will be completed within 20% of the original CD-3/4 duration. | 17. Schedule Compliance, Projects Greater Than 5 Years Duration: By the end of FY11, on a program portfolio basis, 90% of all projects will meet the project schedule metric that follows: from CD-3 to CD-4, projects greater than five years in duration will be completed within 20% of the original CD-3/4 duration. |

SOURCE: DOE EM, FY 2020 Second Quarter Report (see footnote 10) and FY 2008 4th Quarter Metrics (see footnote 11).

REPORTING OF PROJECT METRICS

The committee has reviewed sample copies of EM's project management reports,1 among other provided documents. These reports show EM extensively using EVM project control practices along with Capital Asset Project Dashboards and Corporate Performance Measures. Regarding EVM, EM routinely calculates the following indices: Schedule Performance Index (SPI), Cost Performance Index (CPI), Estimate at Completion (EAC), Budget at Completion (BAC), Budgeted Cost of Work Scheduled (BCWS), Budgeted Cost of Work Performed (BCWP), and Actual Cost of Work Performed (ACWP).

Effective implementation of an EVMS requires a transparent and reliable process and approaches that explicitly and clearly highlight the project's temporal status. Such approaches bring transparency to cost and schedule overruns. Some observations on these documents include:

1. The calculation of the SPI is based on dollars, not time. By extracting two new variables from the progress reports, namely Actual Time of Work Performed (ATWP) and Scheduled Time of Work Performed (STWP), a revised SPI(t) (equivalent to STWP/ATWP) could be created and would better track schedule performance. Figure 5.1 depicts these two new variables. The difference between these two methods can be dramatic. For example, if a project scheduled for 4.5 years with a Budgeted Cost of Work Scheduled (BCWS) of $300 million actually finishes in 6 years, the traditional SPI at completion is 1.0, whereas the revised SPI(t) at completion is 0.75; the traditional Schedule Variance at completion is 0, whereas the revised Schedule Variance(t) is negative 1.5 years. SPI(t) captures the impact of increased time on performance whereas SPI does not, and thus SPI(t) may be used to forecast schedule delays.

2. Including the percentage of cost overrun (or underrun) compared to the baseline (i.e., the original Critical Decision (CD)-2 Total Project Cost (TPC)) in the Project Success metrics would provide more clarity. Some projects have significant cost overruns (e.g., some EM projects have more than doubled their baseline cost and are not yet complete) and others have lower cost overruns. There are also some projects that finished exactly at the estimated cost. Currently, EM integrates all cost overruns into a binary success metric of Yes/No, which does not provide information on the magnitude of a cost overrun or underrun (i.e., the variance).2 The variance can be calculated as the difference between Budget at Completion (BAC) and the Estimate at Completion (EAC), with the latter determined as BAC divided by CPI.

1 Email from Catherine Bohan, Office of Environmental Management, "NAS 3133 Response to Request for Additional Information #1 dated 03062020 (Item 14)," April 8, 2020.
2 "Overview of DOE O 413.3B and EM Project Management Protocol for Demolition Projects," presentation by Rodney Lehman, Director of Project Management, Office of Corporate Services, Office of Environmental Management (EM), during the committee's February 24, 2020 public data-gathering session, slide 11. Also see "Project Success List.xls," Cathy Bohan, DOE, email to NASEM staff, March 25, 2020, "NAS 3133 Response to Request for Additional Information #1 dated 03062020 (Item 6)."
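The arithmetic behind these two observations can be made concrete with a short Python sketch, assuming only the definitions given above (SPI = BCWP/BCWS, SPI(t) = STWP/ATWP, and EAC = BAC/CPI). The first example reproduces the 4.5-year, $300 million case from observation 1; the BAC and CPI figures in the second example are hypothetical, chosen to show how a percentage variance conveys what a binary Yes/No metric hides.

```python
# Illustrative EVM arithmetic for the two observations above.

def spi_traditional(bcwp: float, bcws: float) -> float:
    """Dollar-based SPI = BCWP / BCWS; equals 1.0 at completion by construction."""
    return bcwp / bcws

def spi_time_based(stwp_years: float, atwp_years: float) -> float:
    """Time-based SPI(t) = STWP / ATWP, as proposed in observation 1."""
    return stwp_years / atwp_years

def cost_variance(bac: float, cpi: float) -> float:
    """Variance = BAC - EAC, with EAC = BAC / CPI (observation 2)."""
    eac = bac / cpi
    return bac - eac

# Observation 1: a 4.5-year, $300M project that actually finishes in 6 years.
print(spi_traditional(300e6, 300e6))  # 1.0 -- all budgeted work eventually earned
print(spi_time_based(4.5, 6.0))       # 0.75 -- the 1.5-year slip is visible
print(4.5 - 6.0)                      # Schedule Variance(t) = -1.5 years

# Observation 2 (hypothetical figures): a $400M BAC project running at CPI = 0.8
# forecasts EAC = $500M, a variance of -$100M (a 25% overrun), which a binary
# Yes/No success metric would conceal.
print(cost_variance(400e6, 0.8))      # -100000000.0
```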

FIGURE 5.1 Example of the Schedule Performance Index (SPI), which tracks only cost, and SPI(t), which tracks time. NOTE: BCWP = Budgeted Cost of Work Performed; BCWS = Budgeted Cost of Work Scheduled; SPI = Schedule Performance Index; STWP = Scheduled Time of Work Performed; ATWP = Actual Time of Work Performed. SPI(t) is the ratio of STWP to ATWP.

A robust, reliable, effective, and efficient governance process for the EVMS provides EM HQ with more clarity on projects' status. However, several reviews of EM's EVMS indicate issues with its implementation and governance process. Examples provided by EM include: the certified EVMS is not fully used; a governance process is not in place; and some datasets provided by contractors are not accurate, complete, repeatable, and auditable (see Table 5.3 for more examples and references). Further investigation of the linkage between the governance and data collection processes, on the one hand, and effective implementation of EVMS, on the other, could be of assistance to EM.

Throughout the review of documents that EM shared with the committee, DOE made several statements that led to specific concerns associated with EVMS and its implementation. Table 5.3 contains a list of statements related to EVMS made in the existing documents by EM and its contractors. All of these issues indicate the need for a robust, reliable, effective, and efficient governance process for EVMS. Therefore, for the second phase of this study, the committee plans to review EVMS governance in more detail, including:

• The current EVMS governance process, the involved parties, and their roles and responsibilities.
• The current EVMS certification process and the enforcement of such certification.
• The current data collection processes for EVMS, to ensure they are current, accurate, complete, repeatable, and auditable.
• The current project control systems that EM actively uses.

TABLE 5.3 List of EVMS-Related Issues That Were Explicitly Stated (verbatim) in the Documents Shared with the Committee

| Issue | Notes |
|---|---|
| 1. The certified EVMS was not fully used to develop the PMB and PB for TSCR. | (a) |
| 2. Several EVMS areas need further attention to ensure EIA-748 EVMS [compliance]. | (a) |
| 3. A governance process is not in place for reviewing the health of the EVMS. | (a) |
| 4. The review team determined that the Bechtel National, Inc. (BNI) EVMS data is not current, accurate, complete, repeatable, or auditable, and neither the current project status nor forecast completion cost and schedule are credible. | (a) |
| 5. Given the magnitude, breadth, and nature of the findings, BNI's ability to retain its March 4, 2008 DOE EVMS certification of compliance is in jeopardy. | (b) |
| 6. DOE PM completed a surveillance review of the contractor's (BNI) project controls system (EVMS) and issued the final report on December 2, 2019. The report concluded that BNI has not maintained their EVMS compliant system. As a result of the noted deficiencies, the government cannot have confidence in BNI's report on their project control system. | (c) |
| 7. BNI submitted their corrective action plan on January 17, 2020 focusing on developing a credible PMB and a disciplined change control process. | (c) |

NOTES: See the following documents provided to the committee: (a) 14_January 2020 Master Segment Quad Charts 03.02.20.pdf, slide 25; (b) 14 January 2020 Monthly CAP quad charts 03.03.20.pdf, slide 12; (c) 14 January 2020 Monthly CAP quad charts 03.03.20.pdf, slide 16.
SOURCE: DOE-provided documents (see NOTES).

Over the past 10 years, major projects around the world have adopted some form of digital design and workflow processes. Computer-Aided Design (CAD) and Building Information Modeling (BIM) are the primary digital systems that improve collaboration, cost estimating, project visualization, scheduling, and project handover, among other functions. DOE participates in the U.S. Army Corps of Engineers CAD/BIM Technology Center and the A/E/C CAD Standard.3 The committee's review of the Central Plateau Cleanup Contract Final RFP,4 including Section H of that RFP, did not find DOE requirements5 for a BIM execution plan or other forms of digital delivery. Projects in the United States have begun to follow ISO 19650, following its successful use in the United Kingdom.6 These standards are best practices for BIM collaboration and production. The standard integrates the project's work and organizational breakdown structures (WBS and OBS) and enhances project estimating, scheduling, and status. EM may want to investigate ISO 19650 and, moving forward, determine a consistent requirement for its inclusion in its contracts.

3 See https://www.wbdg.org/ffc/army-coe/cad-bim-technology-center.
4 See https://www.emcbc.doe.gov/SEB/CPCC/Documents/RFP/CPCC_Section_H_IDIQ.pdf.
5 Every design and construction contractor has in-house digital standards, and these are likely established through the contractors' quality programs.
6 Organization and digitization of information about buildings and civil engineering works, including building information modelling (BIM) — Information management using building information modelling — Part 1: Concepts and principles.

Recent GAO Findings and Recommendations

In a 2019 report, GAO stated that EM follows only 25 percent (3 of 12) of PMI's project management guidelines (GAO, 2019b). Among the project management guidelines identified as not met, or minimally met, were (1) developing and maintaining an integrated master schedule using GAO best practices; and (2) establishing project-reporting systems/databases to provide a clear picture of project performance to management and to keep the contractor accountable. In its response to GAO, EM stated it would issue an update to the policy. EM issued the new policy by memorandum7 in November 2020. Chapter 4 of the present report also considers PMI guidelines. GAO further stated:

EM relies on contractors' EVM systems to measure the performance of its contractors' operations activities, but EM has not followed (i.e., has not met, has minimally met, or has partially met) best practices to ensure that these systems are (1) comprehensive, (2) provide reliable data, and (3) are used by EM leadership for decision-making—which are the three characteristics of a reliable EVM system. Moreover, EM has allowed the contractors to categorize a large portion of their work in a way that limits the usefulness of the EVM data. (GAO, 2019b, p. 36)

A further example of project progress tracking and its impact on closure and perception is the Hanford Waste Treatment and Immobilization Plant (WTP) during the early 2010s. The facility was employing a strategy of feeding liquid tank waste into a pretreatment facility at the WTP that would separate the feed into two streams, low activity waste (LAW) and high activity waste, for subsequent treatment and immobilization in respective facilities for each type of waste.8 However, EM stopped construction of the facility in 2012 due to technical challenges. Following a period of rework, the contractor proceeded under a new strategy that would allow LAW sourced directly from the tanks to be pretreated to remove cesium and solids in a new purpose-built facility, the LAW Pretreatment System (LAWPS).9 From there it would be fed to the Low Activity Waste Facility, which would immobilize the waste. Over half of the $752 million EM spent on the pretreatment facility of the WTP in fiscal years 2013 to 2018 was for overhead, oversight, procurements, and facility maintenance. According to the contractor's EVM reports, 43 percent was spent resolving technical challenges (GAO, 2020a). Despite the halt in construction and rework, an EM press release on August 4, 2020 stated that the project remained on schedule.10

METHODS FOR TRACKING PERFORMANCE VALUE

As examples of project metrics in a performance-based approach, the committee provides the following, which may be of use in developing an organization-wide consistent method of assessing the value gained by this relatively new approach. Through the information it was provided and the documents it reviewed, the committee observed that these metrics differed from site to site and even within sites.

7 William I. White, Office of Environmental Management, November 6, 2020, "Issuance of the Environmental Management Program Management Protocol," Memorandum for Distribution.
8 U.S. Department of Energy, Office of River Protection, November 14, 2016, "Low Activity Waste Pretreatment System: RCRA Notice of Intent Meeting," https://www.hanford.gov/files.cfm/Attachment_1_LAWPS_NOI_presentation_Nov_20161.pdf.
9 U.S. Department of Energy, undated, "Direct Feed Low-Activity Waste," https://www.hanford.gov/page.cfm/DFLAW (accessed November 10, 2020).
10 The title of the article does not explain how the progress made on the facility could be related to project metrics and performance goals. This example also shows the difficulty of determining technology requirements for a first-of-its-kind facility, but also the very large costs of delay, especially once construction is underway. Available at: https://www.energy.gov/em/articles/hanford-tank-waste-pretreatment-system-schedule (accessed October 27, 2020).

Project Performance Measures and Outcomes

In late 2018, EM changed its primary contracting method to a performance-based approach, ostensibly to reinvigorate and accelerate cleanup and reduce risk and financial liability (DOE, 2018b). Performance-based contracts focus on outcomes and results, in contrast to a focus on the processes used to achieve the results. EM introduced the use of an indefinite delivery/indefinite quantity (ID/IQ or IDIQ)11 delivery model to allow flexibility in the scope, duration, and type of contractual commitment. This section focuses on performance metrics and benchmarks and the ability to define project outcomes and performance measures.

As noted previously, EM is familiar with KPP principles, which are referenced in two of its documents (DOE, 2015b; DOE, 2018b). DOE's recent change in contracting method is a good time to reevaluate its metrics and KPPs at the project and program level. Some areas to consider:

• Does the flexibility of fixed-price and cost-based contract types within an IDIQ conflict with exceptions for EVMS and PARS? For example, are too many or too few projects included?
• What is the median size of an IDIQ project, and does it exceed the $50 million contract exception?
• For the large number of projects that are below the threshold for EVMS and PARS, is there a guidance document for the field offices to assure that minimum requirements are met? For example, is there consistency in safety metrics and in design, construction, and demolition performance measurement? How are those data aggregated and reported?
• How will changes in the IDIQ delivery model affect historical benchmarks established for large and small contracts?
• Does guidance on adjectival ratings exist and remain consistent?

Based on the committee's many years of experience working on major capital programs, it offers the following performance measurement principles for developing a robust set of performance metrics:

Principle #1: Establish performance metrics that are consistent with delivery and contract forms and stand the test of time. Successful outcomes are largely the result of a sound program management strategy. The strategy is necessary to translate the vision and intent across the enterprise or program to deliver desired outcomes. The program strategy is implemented by an organization through program-wide performance metrics and key performance indicators (KPIs) that measure project components. While benchmark performance expectations may change over time, the primary KPIs and metrics remain consistent and narrowly defined. This is particularly important for programs that are long-lived with multiple contractors and project managers. To be clear, a given metric's importance may change throughout the project, but the individual metric should not. Such consistency allows for comparison across programs, projects, and tasks.

Principle #2: Limit KPIs to a handful at each level of execution. At the project level,12 focus metrics on tasks, schedule, and costs. General categories for metrics and KPIs include:

• Financial
• Schedule
• Safety and Operational
• Quality
• Risk

Sound performance management is a data-intensive effort requiring data capture, data storage, normalization, and analysis. Since most data today are captured electronically at the source of production on large projects, it is easy to compile large sets of spreadsheets and performance measures for every operational issue. A common error is measuring too many details with far too many metrics. No metric is perfect, and all have some unintended consequences. Good metrics are actionable, easy to visualize, and support the program strategy. For metrics, "less is more," and having fewer increases focus on desired outcomes.

Principle #3: Use benchmarks and metrics to foster competition. Team competition among and across projects and programs will encourage productivity and innovation. Benchmark thresholds are often established to reflect expectations ranging from minimum to exceptional. Such threshold levels are difficult to determine; securing stakeholder agreement and buy-in is often tedious and unproductive. In contrast, competition among similar teams offers an elegant way to challenge productivity and foster continuous improvement.

Principle #4: Capture, share, and train on successes. Allow top performance techniques to be shared program-wide. Except for a few patented processes, planning, design, construction, and operational innovations are short-lived. New approaches are shared via joint ventures, talent migration, and technical trade associations and papers. As early as the RFP stage in a project, processes to share technical ideas should be established by the customer.13 Contractual incentives can reward innovation but also demand that innovation be shared for future EM use.

The major takeaway concerning project metrics is that the relationship and importance of key metrics to driving program strategy is central to overall strategy attainment. The IDIQ delivery approach places greater emphasis on EM's program management staff to establish metrics that improve performance and complement strategy. The importance of using metrics as a driver of continuous improvement and behavioral change cannot be overemphasized. In 2018, when Assistant Secretary White described the End State Contracting Model (ESCM),14 her goals were to reduce risk and financial liability, accelerate cleanup, and share risk between government and industry. Guidelines for the ESCM focus on process, time to complete the procurement, and a post-award incentive fee. The ESCM guidelines do not offer guidance on how EM should address the lack of post-award cost and schedule competition, a strategy to share innovation, or the use of metrics or methods to assess best value. Other large infrastructure programs that rely on EM's list of primary contractors may offer EM an opportunity to review their best practices to prepare thoughtful metrics prior to the RFQ.

11 See https://jocexcellence.org/wp-content/uploads/2017/02/UMD_09014_An-Evaluation-of-IDIQ-Contracts-for-Service_January-2012.pdf for an industry survey of IDIQ strengths and weaknesses.
12 Project KPIs are well established by all contractors. Autodesk performed a survey of 200 US-based contractors that measured frequently used KPIs in seven areas (https://www.autodesk.com/bim-360/kpi-construction-data-report-infographic). They were: consistency in capturing constructability issues in the bid documents; logging Requests for Information (RFIs); documenting change order root cause and schedule impacts; frequent schedule updates; technology to manage safety and inspections; labor productivity due to poor coordination, documents, and schedule; and software to manage closeout activities.
13 See the discussion of Infrastructure Ontario below.
14 Written Statement of Anne Marie White, Assistant Secretary for Environmental Management, Before the Subcommittee on Strategic Forces, Committee on Armed Services, United States House of Representatives, April 9, 2019. Available at: https://docs.house.gov/meetings/as/as29/20190409/109269/hhrg-116-as29-wstate-whitea-20190409.pdf.

One such organization is Infrastructure Ontario (IO) (see Box 5.1); another is the environmental cleanup activities at the Department of Defense (DoD) Base Realignment and Closure (BRAC) and Formerly Used Defense Sites (FUDS) (see Box 5.2).

BOX 5.1 Infrastructure Ontario

Infrastructure Ontario (IO) acts as procurement and commercial lead for all major public infrastructure projects in Ontario, Canada. Its four lines of business are major projects, real estate services, infrastructure lending, and commercial projects. As such, it is the program manager for most large projects in Ontario.15 IO's procurement process (RFQ, RFP, and contract award) has a strong "value-for-money"16 focus and aims to achieve quality17 at a low cost. For some of its smaller infrastructure projects, where the design-builder did not have an equity, financing, or long-term maintenance role, it relied on benchmarking techniques to drive its value-for-money strategy. For example:

1. RFQs required the contractor to submit:
   a. Resumes for IO-defined key project positions available for the project.
   b. Non-proprietary technology and innovation the contractor planned to use.
   c. IO project experience.
2. From the above, IO would typically shortlist three to five prime contractors on a "pass-fail" basis, i.e., with no future advantage for superior technical scores. IO's RFP would:
   a. Encourage non-proprietary alternative technical concepts (ATCs).
   b. Request unit costs for major quantities.
   c. Request salary rates and markup for the key staff proposed in the RFQ.
   d. Offer schedule incentives.
   e. Request a fixed-price bid for the base program and any approved ATC modifications.

IO selected contractors based on best value using an undisclosed formula. This approach drove IO's best-value outcome in several ways:

• IO discouraged expensive personnel that exceeded requirements.
• IO predetermined the cost basis for scope growth.
• IO shared ATCs and their costs and enabled the selected contractor to use them if desired.
• The bid detail offered IO a range of schedule and cost estimates with which to assess owner contingencies.

IO's approach for certain infrastructure projects is unlikely to be suitable for EM. The example is intended to show that a strategic outcome of "best value" starts with the agency's RFQ and is driven by competitive benchmarks and metrics.

15 The 2018 annual audit of IO performance can be found at https://www.infrastructureontario.ca/WorkArea/DownloadAsset.aspx?id=36507222863.
16 Value for money is based on the minimum purchase price and on the maximum efficiency and effectiveness of the purchase over its life cycle.
17 As defined by Crosby as "meeting requirements."

BOX 5.2 Department of Defense (DoD) Base Realignment and Closure (BRAC) and Formerly Used Defense Sites (FUDS): A Joint Task Force Idea

Generally considered a success, the environmental cleanups at Department of Defense (DoD) Base Realignment and Closure (BRAC) and Formerly Used Defense Sites (FUDS) provide examples of challenges similar to those faced by EM (EPA, 2017). Both programs use a variety of contract forms and procurement processes to fit the project need. DoD manages the sites as decentralized projects, which are closer in size and term (5 to 10 years) to EM's new approach of "chunkable" IDIQ contracts. Due to these similarities, EM may want to form a "Joint Task Force" or less formal cooperative structure with the Naval Facilities Engineering Systems Command (NAVFAC) and other BRAC and FUDS program management organizations to share experiences and best practices with their IDIQ approach. Joint Task Forces are common to military operations18 but are now used throughout the government.

For BRAC, program management and program management oversight (PMO) are typically performed internally; for example, NAVFAC performs them for the Department of the Navy (DON) BRAC.a Under FUDS, the U.S. Army Corps of Engineers (USACE) is the overall program manager on behalf of the U.S. Army and DoD.b USACE manages closures at thousands of Army sites and prioritizes the work based on exposure to the human population. It manages stakeholder relationships with the Environmental Protection Agency, state environmental and regulatory agencies, and the local community. In 2005, it began to use performance-based contracting methods.

a See https://www.bracpmo.navy.mil/ (accessed October 27, 2020).
b See https://www.usace.army.mil/Missions/Environmental/Formerly-Used-Defense-Sites/ (accessed October 27, 2020).

FINDINGS AND RECOMMENDATIONS

FINDING: DOE's Office of Environmental Management, with the help of DOE's Office of Project Management (PM), has developed detailed processes and methods for tracking project-level outcomes and success measures. EM's Headquarters staff manage EM's projects using Earned Value Management, including key measures such as the Cost Performance Index (CPI), Schedule Performance Index (SPI), Management Reserve (MR), Estimate at Completion (EAC), Total Project Cost (TPC), funding profile, and others. DOE-EM contractors report project-level outcomes and their key measures through DOE's Project Assessment and Reporting System (PARS) II. This system provides EM monthly data on the projects it tracks, giving EM up-to-date and reasonably timely information it can monitor, assess, and act upon. However, the committee found evidence that EM and its contractors are not following best practices in EVM reporting. Further, the committee found that the current metric, i.e., SPI, does not effectively track schedule performance.

FINDING: EM's portfolio of projects (work subject to Order 413.3B) accounts for approximately 25 percent of its annual budget. The percentage of projects actively tracked using certified EVM systems is even smaller (certification is required only for capital investment projects greater than $100 million). EM could similarly track a larger share of its activities, but does not.

FINDING: Joint Task Forces are common to military operations and are now used throughout the government.

18 Further information available at https://en.wikipedia.org/wiki/Joint_task_force.

RECOMMENDATION 5-1: The committee recommends that as EM increases its project management responsibilities under Indefinite Delivery/Indefinite Quantity (IDIQ) contracts, it should share and compare best project management (PM) practices with others across the U.S. government. To implement this, EM should form a "Joint Task Force" or less formal cooperative structure with the Naval Facilities Engineering Systems Command (NAVFAC) and other Base Realignment and Closure (BRAC) and Formerly Used Defense Sites (FUDS) program management organizations.

RECOMMENDATION 5-2: DOE-EM should implement a modification to its EVMS that captures the project's temporal status more clearly and explicitly. Specifically, EM should immediately require that a revised Schedule Performance Index, SPI(t), which is the ratio of Scheduled Time of Work Performed (STWP) to Actual Time of Work Performed (ATWP), be reported to accurately track schedule performance.

RECOMMENDATION 5-3: DOE-EM should explicitly include the percentage of cost overrun (or underrun) in its Project Success metrics dashboard, rather than the current "green/yellow/red" metric, to bring more transparency to cost performance.

REFERENCES

DOE. 2015a. DOE Guide: Earned Value Management System (EVMS). DOE G 413.3-10A Chg 1 (Admin Chg). October 22, 2015. https://www.directives.doe.gov/directives-documents/400-series/0413.3-EGuide-10a-admchg1/@@images/file (accessed August 11, 2020).

DOE. 2015b. DOE Guide: U.S. Department of Energy Performance Baseline Guide. DOE G 413.3-5A Chg 1 (Admin Chg 1). https://www.directives.doe.gov/directives-documents/400-series/0413.3-EGuide-05-chg1-admchg (accessed August 11, 2020).

DOE. 2018a. DOE O 413.3B Chg 5 (MinChg), Program and Project Management for the Acquisition of Capital Assets. April 12, 2018. https://www.directives.doe.gov/directives-documents/400-series/0413.3-BOrder-B-chg5-minchg/@@images/file (accessed August 11, 2020).

DOE. 2018b. "Special Notice - Modification to End State Contracting Model." Issued December 12, 2018, by DOE's Office of Environmental Management. https://www.emcbc.doe.gov/SEB/em_escm/Documents/Document%20Library/ESCM%20Special%20Notice%20Combined%20Final%20Version%20X.pdf (accessed August 11, 2020).

DOE EM. 2020. Monthly Cleanup Portfolio Report. DOE EM-5.22, Office of Project Management, January 2020.

EPA. 2017. BRAC and EPA's Federal Facility Cleanup Program: Three Decades of Excellence, Innovation and Reuse. EPA 505-R-17-001. November 2017. https://www.epa.gov/sites/production/files/2017-12/documents/brac_v9_11_2_2017_508.pdf.

GAO. 2019a. Department of Energy: Environmental Liability Continues to Grow, and Significant Management Challenges Remain for Cleanup Efforts. GAO-19-460T. May 1, 2019. https://www.gao.gov/products/GAO-19-460T.

GAO. 2019b. DOE Could Improve Program and Project Management by Better Classifying Work and Following Leading Practices. GAO-19-223. February 19, 2019. https://www.gao.gov/products/GAO-19-223.

GAO. 2019c. DOE Faces Project Management and Disposal Challenges with High-Level Waste at Idaho National Laboratory. GAO-19-494. September 9, 2019. https://www.gao.gov/products/GAO-19-494.

GAO. 2020a. Hanford Waste Treatment Plant: DOE Is Pursuing Pretreatment Alternatives, but Its Strategy Is Unclear While Costs Continue to Rise. GAO-20-363. https://www.gao.gov/assets/710/706854.pdf.

GAO. 2020b. Program-Wide Strategy and Better Reporting Needed to Address Growing Environmental Cleanup Liability. GAO-19-28. January 29, 2020. https://www.gao.gov/assets/700/696632.pdf.

Office of Project Management Oversight and Assessments. 2020. Project Dashboard, Post CD-2 Projects. June 2020.
