Suggested Citation:"C Focus Areas." National Research Council. 1999. Decision Making in the U.S. Department of Energy's Environmental Management Office of Science and Technology. Washington, DC: The National Academies Press. doi: 10.17226/9448.


Appendix C
Focus Areas

In the OST Action Plan of 1994 (DOE, 1994a), national Focus Areas were established to integrate technology development activities across DOE-EM sites complex-wide, both to prioritize efforts addressing the greatest technology needs and to benefit from the results of program efforts at individual sites. By 1996, OST had divided problems into four major Focus Area categories: Subsurface Contaminants, Mixed Waste, High-Level Waste in Tanks, and Decontamination and Decommissioning. Areas of technology development applicable across several Focus Areas are implemented through Crosscutting Programs, which until 1997 were managed through DOE headquarters (DOE-HQ).

A complex-wide inventory of problems and technology needs was conducted from 1994 to 1995 to provide information for decision making on a national basis. These needs were compiled and reported in National Technology Needs Assessment reports, which documented the prevalence of problem types and identified generally similar ranges of technology needs across the DOE complex. This national inventory of needs formed the technical basis for the initial design of the Focus Area programs. The problem statements were analyzed by the Focus Areas for commonality of technical issues, an effort that also resulted in the transfer of some problem sets between Focus Areas. The resulting problem sets in each Focus Area were further grouped into distinct needs statements by combining similar projects to form work packages, which were prioritized in a technology needs portfolio. After preparation of a draft budget for the prioritized technology needs portfolio, Focus Area representatives met with the STCGs, Technical Program Officers, and other stakeholders to present the prioritized needs programs and obtain feedback prior to presentation of the final program to DOE headquarters.
SUBSURFACE CONTAMINANTS FOCUS AREA

The Subsurface Contaminants Focus Area (SCFA, also commonly referred to as "Subcon") is situated at the Savannah River Site, one of many DOE-EM sites with subsurface contamination by radionuclides, toxic metals, and organic compounds. The SCFA combines into one national Focus Area two former Focus Areas, Contaminant Plumes and Landfills, which were separately managed until 1996. To address site needs and DOE program priorities to identify and develop environmental technologies for soil and ground water remediation problems, the SCFA has identified the following four strategic goals:

1. ability to contain and/or stabilize contamination sources that pose an imminent threat to surface and ground waters;
2. ability to delineate dense nonaqueous phase liquid (DNAPL) contamination in the subsurface and to remediate DNAPL-contaminated soils and ground water;

3. ability to remove a full range of metal and radionuclide contamination in soils and ground water; and
4. ability to remediate landfills that pose a continuing threat to surface and ground waters.

To meet these goals, the SCFA funds the development and deployment of innovative remediation technologies that have the potential to significantly reduce site cleanup costs, reduce risk, or provide benefits beyond those of the baseline technology. These technology development activities are conducted in accordance with Accelerating Cleanup: Paths to Closure (DOE, 1998a). The SCFA obtains information on problems and technology needs from the STCGs using a standardized needs template developed by the STCGs. Additionally, various activities are conducted by the SCFA to support DOE-HQ or interdepartmental initiatives.

Status in December 1996

In 1996, the national needs inventory resulted in more than 520 needs statements, which were consolidated into approximately 34 "problem sets" and approximately half that number of "work packages" (Table C.1) based on similarity of problems and needs. Subsequent solicitations are made annually to the STCGs regarding new or updated problems and technology needs. As of December 1996, most of the portfolio of SCFA projects were ongoing, late-stage projects initiated prior to the development of the prioritization criteria process. The suite of SCFA projects in December 1996 primarily addressed treatment of contamination by volatile organic compounds. The SCFA intends to obtain complete engineering cost and performance data for the existing portfolio technologies by 1999 so that they can be moved into the private sector. The SCFA decided to continue funding these more "mature" projects to completion within two to three years, after which the portfolio is anticipated to change to better represent national priorities.
Process Description

Innovative technology need projects have been identified and evaluated using recently developed prioritization criteria and decision-making procedures.

From Aggregated Needs to Prioritized Work Packages

The Focus Area work begins with the definition and validation of specific remediation needs at the sites. Problem or needs statements provided by the STCGs are reviewed and prioritized by the Focus Area into a selected number of consolidated work packages for eventual funding, depending on budget availability. Actions taken by the SCFA include the designation of specific lead staff positions responsible for gathering site needs statements, validating the needs statements, and developing a needs database that matches needs to technologies and identifies technology gaps. The Stakeholder Coordinator (SHC) is responsible for gathering and validating site needs through interaction with STCGs and other stakeholders. The Technical Team is responsible for reviewing and evaluating needs statements to determine whether the needs are within the mission of the SCFA and for identifying possibly available technologies. The Technical Team is composed of DOE personnel from SRS, INEEL, the Richland (RL) field office, and the Albuquerque (AL) field office. The Systems Engineering Lead is responsible for matching needs to technologies and identifying technology gaps through the use of a computerized decision support system.
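The needs-to-technology matching and gap identification described above can be sketched as a simple set-matching step. This is a minimal illustrative sketch only; the record fields, names, and matching rule are hypothetical assumptions, not the actual SCFA decision support system.

```python
# Hedged sketch: match site needs statements to candidate technologies
# and flag unmatched needs as technology gaps. All field names and data
# here are hypothetical illustrations, not the real SCFA database schema.

def match_needs(needs, technologies):
    """Return (matches, gaps): matches maps a need id to candidate
    technology names; gaps lists need ids with no candidate at all."""
    matches, gaps = {}, []
    for need in needs:
        candidates = [t["name"] for t in technologies
                      if t["problem_type"] == need["problem_type"]]
        if candidates:
            matches[need["id"]] = candidates
        else:
            gaps.append(need["id"])
    return matches, gaps

# Illustrative data: one need has a candidate technology, one does not.
needs = [
    {"id": "N-001", "problem_type": "DNAPL characterization"},
    {"id": "N-002", "problem_type": "landfill containment"},
]
technologies = [
    {"name": "Cone penetrometer sensor",
     "problem_type": "DNAPL characterization"},
]
matches, gaps = match_needs(needs, technologies)
# N-001 is matched to the sensor; N-002 is flagged as a technology gap.
```

In a real system the matching rule would be far richer (performance requirements, site conditions, maturity), but the gap list, needs without any candidate technology, is the output that drives development solicitations.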

TABLE C.1 The FY 1997 Work Packages of the Subsurface Contaminants Focus Area, Listed in Priority Order

Designation  Description
WP2          Portable Selective Hot Spot Removal System Demo
WP3          In Situ Stabilization for Contamination or Removal
WP4          DNAPL Characterization (transferred to CMST)
WP5          In Situ Destructive Treatment Technologies for DNAPLs
WP6          Advanced Subsurface Containment Systems Design and Performance Validation
WP7          Mobilization, Extraction, Removal of Metals and Radioactive Contaminants
WP8          Long-Term Containment Systems Monitoring and Maintenance
WP9          Soil Removal, Segregation, and Treatment of Waste Unit Secondary Waste
WP10         Treatment of Extracted Ground Water
WP11         Reaction Zone Barrier Systems for Metals and Radioactive Contaminants
WP12         Innovative Alternative Containment System Deployment
WP13         Reaction Zone Barrier Systems for DNAPLs
WP14         Mobilization, Extraction, Removal Technologies for DNAPLs
WP15         In Situ Bulk Waste Treatment

NOTE: Not shown is WP1, program management, which comprises approximately 5% of the program budget. These work packages were used by program managers to build the program and budget to be aligned with user needs.
SOURCE: Adapted from Wright, 1996.

The process of identifying needs begins with the STCGs' defining needs from a technical standpoint and assessing these needs in the context of Accelerating Cleanup: Paths to Closure (DOE, 1998a). The problem or needs statements are submitted to the SCFA, which evaluates, prioritizes, and integrates the needs into an overall technology development program. When the needs statements are received, the SCFA evaluates whether technologies are available within DOE or commercially to address them. If technologies are commercially available, the Technical Team contacts vendors to discuss performance requirements and, if appropriate, puts the vendor in contact with the applicable STCG.
Following review by the SCFA, needs typically are segregated into five categories:

1. needs that satisfy 2006 Plan requirements and have a high end user commitment, warranting continued SCFA technology deployment;
2. needs that do not justify additional action, either because the baseline technology is adequate or because the potential risk of developing a new, more economical technology is excessive;
3. needs that currently have commercially available technology solutions;
4. needs being addressed, or that should be addressed, by another program (e.g., another Focus Area or Crosscutting Program); and
5. needs that are currently not high enough in Accelerating Cleanup: Paths to Closure (DOE, 1998a) priority to warrant additional action.
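The five-way segregation above can be illustrated as a simple rule-based triage. The check ordering and flag names below are assumptions made for the sketch; the source text does not specify how reviewers actually sequence the categories.

```python
# Hedged sketch of the five-category SCFA needs triage described above.
# Flag names and the order of the checks are illustrative assumptions.

def triage(need):
    """Return the disposition category (1-5) for a reviewed need."""
    if need.get("commercial_solution"):
        return 3  # a commercially available technology exists
    if need.get("other_program"):
        return 4  # belongs to another Focus Area or Crosscutting Program
    if not need.get("high_priority"):
        return 5  # not high enough in Paths to Closure priority
    if need.get("baseline_adequate") or need.get("development_risk_excessive"):
        return 2  # additional action not justified
    return 1  # satisfies 2006 Plan requirements with committed end user

# A high-priority need with no disqualifying flags warrants deployment.
category = triage({"high_priority": True})
```

The point of the sketch is only that categories 2 through 5 act as exclusion filters, so category 1 (continued SCFA deployment) is what remains after every reason not to act has been ruled out.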

The SCFA Technical Team evaluates the needs and combines them into work packages based on similarity of purpose and characteristics. A ranking and rating process, described in more detail below, is used to prioritize the list of technology needs in preparing final work packages, which are communicated to the site STCGs through the Stakeholder Coordination Manager. The resulting prioritized list of work packages (Table C.1) forms the basis for soliciting technology development proposals.

Input From Non-DOE Sources

A subcontractor (Scientech, Inc.) is used by the SCFA to evaluate the availability of existing technologies by searching public databases for technologies that address site problems and needs.

Communication Plans

The SCFA communicates the results of the needs evaluation to the STCGs and stakeholders. Future plans for the SCFA include posting the needs template on the Internet to improve communications with the field offices and to provide information regarding performance requirements and program processes. Other plans include increased industry participation to identify technology gaps and more private-sector involvement in the technology development and procurement process.

The Development of SCFA Prioritization Criteria

The STCG problem or needs statements form the basis of program design. These problems and needs are ranked using criteria developed by various groups at various times. The SCFA has employed a variety of criteria and weighting factors in its process of prioritizing technical needs and allocating budgets. This history is reviewed briefly here.

Initially, criteria based on DOE's Risk Data Sheets were used to establish Focus Area priorities. These criteria, selected by the SCFA and DOE-HQ, were public safety and health, site personnel safety and health, environmental protection, mortgage reduction, pervasiveness of the problem, regulatory compliance, social or cultural impact, economic risk reduction, and mission impact. A scoring methodology using these criteria to prioritize SCFA problems and technology needs was developed in March 1996.

Subsequently, additional Strategic Investment Criteria were developed to evaluate whether a technology was within the mission and strategy of the SCFA before it was proposed for funding. These Strategic Investment Criteria were used to determine whether existing technology was adequate and whether the proposed technology development was consistent with SCFA goals and EM-50 policy. Additional questions addressed whether there was a customer committed to implementation, whether the basic science and technical or performance requirements were understood, and whether the proposed activity would meet the customer's deadline and result in costs commensurate with benefits received. These Strategic Investment Criteria were used in April and May 1996 to develop a draft FY 1997 plan and budget. This draft plan and budget were presented to stakeholders in May 1996 at a meeting

attended by TPOs, stakeholders, Community Leaders Network (CLN) members, and SCFA members, which resulted in the development of modified criteria and weighting factors to be used in developing the final FY 1997 PEG. These modified criteria were technical credibility (30 points), cost reduction (18 points), advantage over baseline (10 points), public, personnel, and/or environmental risk reduction (10 points), deployability (10 points), regulatory or stakeholder acceptance (9 points), return on investment (8 points), user support (8 points), regulatory compliance (8 points), secondary waste reduction (4 points), applicability to multiple sites (3 points), not a duplicative effort (2 points), and industry or federal agency leveraging (2 points).

The process of applying these criteria begins with a screening evaluation of technical credibility to determine whether a work proposal is appropriate based on the approach, previous performance, personnel expertise, resources, and so forth. If the proposal does not pass this evaluation, it receives a score of zero. If it passes the evaluation, it receives a score of 30 multiplied by the various criteria weighting factors. Several assumptions are built into these criteria. For example, the highest-rated criterion is technical credibility, not risk reduction, because the baseline technology is assumed to address the required reduction in risk; if a new technology outperforms the baseline technology, it is assumed to achieve the necessary risk reduction.

Several issues were identified in using these priority-setting criteria to finalize the FY 1997 PEG. These included redundancies involving criteria such as regulatory acceptance, and exclusion issues relating to the cost and baseline technology criteria. As a result, the SCFA solicited additional comments in an October 1996 stakeholder workshop.
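The screening-then-weighting mechanics described above can be sketched as follows. The criterion names and point weights come from the text; treating each per-criterion rating as a fraction combined in a weighted sum is a simplifying assumption, since the text does not fully specify the arithmetic actually used.

```python
# Hedged sketch of the modified FY 1997 SCFA prioritization scheme:
# a pass/fail technical-credibility screen, then a weighted sum over
# the thirteen criteria and point weights listed in the text.

WEIGHTS = {
    "technical credibility": 30,
    "cost reduction": 18,
    "advantage over baseline": 10,
    "risk reduction": 10,
    "deployability": 10,
    "regulatory or stakeholder acceptance": 9,
    "return on investment": 8,
    "user support": 8,
    "regulatory compliance": 8,
    "secondary waste reduction": 4,
    "applicability to multiple sites": 3,
    "not a duplicative effort": 2,
    "industry or federal agency leveraging": 2,
}

def score_proposal(passes_screening, ratings):
    """Screen on technical credibility first; a failed screen scores zero.
    Otherwise combine per-criterion ratings (0.0-1.0, an assumed scale)
    with the point weights above."""
    if not passes_screening:
        return 0.0
    return sum(w * ratings.get(criterion, 0.0)
               for criterion, w in WEIGHTS.items())
```

Under this sketch a proposal failing the screen scores 0 regardless of its other merits, and a proposal rated perfectly on every criterion scores the sum of all point weights (122), which mirrors the dominant role the text assigns to technical credibility.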
More than 22 issues were identified in the workshop, including the need for increased end user or stakeholder input in the scoring, more emphasis on STCG-defined high-risk sites, greater clarification of DOE-specific needs (i.e., higher ranking for radionuclides than for other contaminants), the need for a method to assess the practicality of technology deployment, and the need to avoid duplication of regulatory criteria. A revised prioritization process is reportedly being developed for the FY 1998 PEG and FY 1999 IRB. Additionally, the SCFA has indicated that the prioritization criteria will be revisited at critical points in the budgeting and technology selection process.

Portfolio Management Decision Processes

Management of the technology development program includes regular review and evaluation of technology selection. Once the portfolio of technology development activities is developed, the SCFA uses two major types of review as decision-making processes in the course of administering the technology development program:

· independent, external program and technology reviews (peer and gate reviews); and
· formal Focus Area reviews (project technical reviews).

Additionally, a variety of review procedures are used within EM-50 in association with program management, planning, budget authorization, and administration. This includes administration of the program management budget (5 percent of the total SCFA budget).

Peer and Gate Reviews

As of December 1996, peer reviews and gate reviews were conducted outside the Focus Area to provide independent technical input into the decision-making process (Bauer, 1996; Frolio, 1996; Heeb, 1996). Peer reviews (ASME, 1997a) are conducted under contract to DOE by the American Society of Mechanical Engineers (ASME). ASME assembles peer review groups on an as-needed basis to resolve technical or stakeholder issues or to confirm that the technical approach selected is appropriate, and it prepares a report to be submitted to DOE-HQ and the Focus Area.

Gate reviews (DOE, 1996n), which evaluate the maturity of a technology to pass into more costly engineering and field development phases (stage 4: Engineering Development), were to be conducted and managed by DOE-OR as of December 1996. The gate review uses the following criteria:

· technology need;
· technical merit of the system under development;
· cost of developing the technology;
· safety, environmental protection, and risk;
· stakeholder and regulatory acceptance; and
· commercial viability.

DOE-OR would prepare a report of opinions and recommendations that is submitted to the Focus Area and incorporated in the SCFA decision-making process.

Technical Project Reviews

For reviews at other technology development gates, the SCFA performs a technical project review using the same general criteria as the gate review, but managed by the SCFA Lead Office. The stated goal of these reviews is to assess the technical status of a project and evaluate whether it should be continued. Reviewers are selected by the Technical Team Lead, Product Line Managers (PLMs), and Product Line Integrators and may include experts from both within and outside DOE.
The Deployment Plan input from the Technical Team includes periodic reviews to evaluate compliance with federal and state agreements. The peer review team is to be composed of technical personnel knowledgeable about the problem and technology, and the peer review report is submitted to the SCFA Lead Office. Technical project reviews may be stand-alone reviews of a technology project or precursors or follow-ups to a gate or peer review. This is a primary analysis tool for making program design and management decisions in the Focus Area.

Responsibility for SCFA Decision Making

Decision making in the SCFA is conducted by various individuals and groups within the SCFA team. This team includes the SCFA Lead Office, SCFA technical and systems engineering teams, headquarters sponsors, PLMs and Product Line Integrators, and technical support contractors.

Technical Analysis and Project Review Teams integrate input and provide recommendations to the Lead Office Manager (LOM), who has ultimate decision-making authority. The Technical Team Lead first recommends whether the SCFA should develop the technology or whether the solution is available from private industry. The Technical Team also coordinates the program with associated funding agencies such as the Crosscutting Programs, the EM Science Program, Industry Programs, and other federal agencies. The LOM makes policy decisions regarding program design and development. The Program Execution Manager (PEM) develops action lists to implement these policy decisions in the Annual Performance and Deployment Plans. Program activities are tracked by the PEM and PLMs, and recommendations are made to the LOM, who issues final budget authorization. Other specific LOM decision-making issues relate to congressionally mandated programs and to developing plans and performance specifications for activities involving industry participation.

Funding

Because of the heavily mortgaged nature of the program (i.e., commitments to provide future funding to ongoing technology development projects), SCFA funding was not available in FY 1997 beyond that needed to continue ongoing projects through the development process. In anticipation of funding above the mortgage level for the FY 1998 program, the SCFA solicited proposals to address needs in support of strategic goals that were not satisfied by the FY 1997 program.

Some funding is allocated by the SCFA before prioritization of the work packages. This includes fixed program management costs and the costs to support activities under the jurisdiction of DOE-HQ, such as international interagency initiatives. Funding for these activities is authorized and directly allocated by DOE-HQ.
Committee Observations

The work packages are stated in general language (e.g., "In Situ Destructive Treatment Technologies for DNAPLs") that is of only limited use in specifying what types of projects within that subject area should be funded. Hence, extra effort at refining prioritizations of general topical statements would be less constructive than the use of expert technical opinion to aid in the process of project development and selection.

MIXED WASTE FOCUS AREA

The Mixed Waste Focus Area (MWFA) is a national program with a scope covering the mixed waste technology needs at all DOE-EM sites. It is managed at INEEL. The MWFA has developed a prioritized list of technology deficiencies to form the basis for the program's requests for proposals to national laboratories and industry. These deficiencies are associated with "treatment trains," large-scale engineering systems that would be needed to treat various categories of mixed waste. The process by which these technical deficiencies were identified is discussed briefly below; it is based on OST documents provided to the committee and on a site visit to Idaho Falls in November 1996. A fuller description of this process is found in Beitel (1996).

Structure and Composition

A key part of the MWFA is the role of the Waste Type Managers, contractors at various DOE-EM sites who are knowledgeable about their site's mixed waste streams and the technical issues associated with their management, characterization, storage, and treatment. Each Waste Type Manager is supported by various Waste Type Teams that provide contacts to site operations end user personnel and to site personnel responsible for meeting regulatory requirements. These Waste Type Managers and Waste Type Teams provide important technical input (Conner and Connolly, 1996); however, DOE employees make the final decisions on any funding allocations. The MWFA conducts outreach to stakeholder groups at a national level, such as the National Technical Workgroup, the Interstate Technology Regulatory Committee, and the CLN.

Process for Technology Development Decision Making

The major steps of the process by which the MWFA conducts its business are described in further detail below.

Inputs to the Process: Statements of Needs

The MWFA obtains technology development needs from the STCGs of the major DOE-EM sites and from the sites' program development efforts associated with Accelerating Cleanup: Paths to Closure (DOE, 1998a). Another key input was the collection of Site Treatment Plans.

Use of EM-30 Site Treatment Plans. The EM-30 Site Treatment Plans (STPs), developed from 1992 to 1995 as the DOE-EM response to the Federal Facility Compliance Act (FFCA), were a useful resource for the MWFA in interfacing with EM-30 program plans. Each STP identifies each mixed waste stream and the proposed treatment facility or process that would treat the waste in accordance with RCRA requirements. The treated waste would then be ready for disposal in a suitable facility. These treatment options were forged between EM-30 and its regulators; OST was not a partner to these agreements.
The regulatory deadlines and program requirements for each mixed waste stream are shown in the Integrated Master Schedule (IMS), an MWFA product.

Systems Engineering Approach to Establishing a Technical Baseline

The MWFA diagrammed the STP treatment options and combined similar treatment processes to arrive at a manageable number (approximately 24) of "treatment trains," generic processing flowsheets depicting engineering systems, each of which treats a separate class of waste stream. The individual components of these treatment trains were examined to determine whether the technology needed to fulfill each function already existed as a full-scale demonstration or the equivalent, or whether further technology development work would be needed to provide a proof-of-concept test. The components in the latter category became technology deficiencies; 7 of the 24 treatment trains had no such deficiencies, and 17 had at least one deficiency.

Prioritizing Process Flowsheets. Four criteria were developed to prioritize these 17 treatment trains, also called process flowsheets.

1. Impact was measured by the volume of waste to be treated in this manner, the number of site customers needing this process, the number of affected waste streams in the mixed waste inventory, and a measure of the hazard of the waste.
2. Potential savings was measured by comparison with the projected baseline cost.
3. Maturity of the flowsheet was measured by the number of technical deficiencies it contained and the estimated time it would take to remedy each (i.e., the time needed for a technology development project to result in a pilot-scale demonstration resolving the deficiency).
4. DOE commitments were measured by the degree of commitment of the DOE problem owner (DOE line office management) to solving the problem. A remediation job for which legal orders or regulatory permits were already written received the highest weight, while internally planned DOE initiatives that lacked regulatory drivers received the lowest. This criterion was used to include considerations of regulatory milestones and other schedule requirements.

Each flowsheet was rated on a scale of 1 to 5 against these four criteria. These ratings were combined in a weighted manner using relative weights of 40, 15, 25, and 25 percent, respectively. The result of this exercise was a prioritized list of flowsheets.

Prioritizing Technical Deficiencies. Separate criteria were developed to prioritize the technical deficiencies, defined as those components of at least one flowsheet representing steps needing technology development work. The five criteria are the following:

1. impact, as measured by the number of flowsheets needing the step, the number of times it was needed in each flowsheet, and the severity of the hazard addressed by the step;
2. critical path deficiency, as measured by an estimate of the number of years lost while a job remained undone as a result of the deficiency's not being addressed;
3.
maturity, equivalent to the stage it is at in the "stage-and-gate" model;
4. functional requirements, a measure of the degree to which the performance specifications were known (in a rough conceptual way versus a quantified, documented fashion); and
5. DOE commitments, a measure of the urgency of the job to the DOE-EM line office management responsible for the remedial action.

Each deficiency was rated on a scale of 1 to 5 against these five criteria. These ratings were combined in a weighted manner using relative weights assigned to each criterion. The result of this exercise was a prioritized list of technical deficiencies.

The Needs Matrix. The MWFA displays these two prioritization results together in a "needs matrix" (Figure C.1) showing both the flowsheets and the deficiencies. The columns are the rank-ordered process flowsheets, and the rows are the rank-ordered deficiencies. The entries of the matrix are simply "X" marks to show which deficiencies are integral component parts of which process flowsheets.

Technology Development Requirements Documents. Each need or deficiency is recorded as a separate Technology Development Requirements Document (TDRD) that specifies the technical requirements needed, stakeholder and regulatory inputs, and user (i.e., EM-30) schedule for treating mixed waste inventories. This schedule determines the window of opportunity for technology development. Eighteen such TDRDs exist on the MWFA homepage. Three examples of technology deficiency areas identified

[Figure C.1: needs matrix relating rank-ordered process flowsheets (columns) to rank-ordered technical deficiencies (rows); the matrix entries are not legible in this scan.]

by the above procedure and recorded as TDRDs are mercury amalgamation (DOE, 1996n), mercury stabilization (DOE, 1996m), and chemical oxidation (DOE, 1996l). These results form the MWFA Technical Baseline, useful for planning what type of technology development work to solicit and award contracts to, as described below.

Contracting for Technology Development Work

For each technical deficiency, the MWFA solicits additional information on such issues as technical performance requirements, stakeholder and regulatory issues, and availability of off-the-shelf technology. The latter is done in part via a Request for Information (RFI), which is published in Commerce Business Daily. Program managers next decide whether a solicitation will go to national laboratories or to the private sector (i.e., industry or universities). This decision is currently made in part based on an estimate of the maturity (i.e., what stage it has reached in the stage-and-gate model) of the technology needed (with lower-stage work often designated for universities and higher-stage work designated for industry). No overt competition exists between the private sector and national laboratories; separate deficiencies are announced to each group. The work slated for national laboratories is keyed to their specific facilities. Based on technical specifications to meet program requirements, a call for proposals is made. The proposals received in response to this call are evaluated by a review team. The team first screens proposals based on the following five screening criteria:

1. consistency with MWFA scope,
2. consistency with MWFA strategy,
3. lack of commercial availability,
4. technical credibility, and
5. duplication (i.e., whether the work is duplicative of other ongoing work).

Next, the review team scores each proposal quantitatively against each of the following five evaluation criteria:

1. priority of deficiencies addressed (25 percent);
2.
technical effectiveness (25 percent);
3. implementability (25 percent);
4. environment, safety, and health (15 percent); and
5. regulatory or permitting (10 percent).

The relative weights shown in parentheses indicate how the final weighted score is assigned to each proposal. Program managers use these scores as input information to inform their selection of which proposals to fund.

Monitoring Progress of Ongoing Projects

The PI of each project interacts with MWFA program managers who monitor progress.
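The two-stage review just described (pass/fail screening, then weighted scoring) can be sketched as follows. The evaluation weights (25/25/25/15/10 percent) are the ones quoted in the text; the rating scale of 1 to 5 and the sample proposals are assumptions for illustration only.

```python
# Sketch of the MWFA two-stage proposal review: five pass/fail screening
# criteria, then a weighted score over five evaluation criteria. Weights
# are from the report; the 1-5 rating scale and sample data are assumed.

WEIGHTS = {                      # evaluation criteria and relative weights
    "priority of deficiencies": 0.25,
    "technical effectiveness":  0.25,
    "implementability":         0.25,
    "ES&H":                     0.15,
    "regulatory/permitting":    0.10,
}

def weighted_score(ratings):
    """Combine per-criterion ratings (assumed 1-5) into one weighted score."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

def review(proposals):
    """Drop proposals failing any screening criterion; rank the rest by score."""
    passed = {name: p for name, p in proposals.items() if all(p["screens"])}
    return sorted(passed, key=lambda n: weighted_score(passed[n]["ratings"]),
                  reverse=True)

proposals = {
    "A": {"screens": [True] * 5,
          "ratings": {"priority of deficiencies": 5, "technical effectiveness": 4,
                      "implementability": 3, "ES&H": 4, "regulatory/permitting": 2}},
    "B": {"screens": [True, True, False, True, True],   # fails one screen
          "ratings": {"priority of deficiencies": 5, "technical effectiveness": 5,
                      "implementability": 5, "ES&H": 5, "regulatory/permitting": 5}},
}
print(review(proposals))   # -> ['A']  (B is screened out despite high ratings)
```

The same weighted-sum pattern, with different criteria and weights, underlies the flowsheet and deficiency prioritizations described earlier.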

Appendix C Focus Areas

The End Point of MWFA Efforts

The end of a technology development activity that is desired by MWFA program managers is a close-to-full-scale demonstration of a technology on real or surrogate waste. The goal, in a privatization context, is to provide the private sector with the technology necessary to bid on future privatized mixed waste cleanup jobs. The output of OST technology development work would be available to bidders as proven, demonstrated technology that can be built and implemented as one process component of any full-scale operation. With this strategy, demonstrations at the end of the technology development work are done in accordance with written test plans, and the demonstration results are recorded in a Technical Performance Report (TPR).

Committee Observations

The process by which technical deficiencies are identified, ranked, and used as a basis for RFPs by the MWFA is the most rigorous of those practiced by all the Focus Areas. The MWFA has assessed that most mixed waste technology needs are in advanced stages of engineering development (i.e., high stage and gate numbers) and therefore that the Focus Area might go out of business within seven years, based on the FY 1997 level of funding, if the projects funded within that time provide satisfactory results. This assessment is based on assumptions that may change. Many of these assumptions, such as

· the requirements of the Waste Isolation Pilot Plant waste acceptance criteria,
· the actual configuration of mixed waste facilities that will be constructed and operated (as compared with the planned operations of the STPs), and
· potential changes at local and national levels to mixed waste regulations,

have yet to be fully resolved.

DECONTAMINATION AND DECOMMISSIONING FOCUS AREA

The Decontamination and Decommissioning Focus Area is administered at the Federal Energy Technology Center in Morgantown, West Virginia.
The following information comes from a site visit by the committee in May 1997 and from Focus Area publications. The DDFA mission is to "develop, demonstrate, and facilitate implementation of systems to solve EM-40's and EM-60's identified needs for acceptable decontamination and decommissioning of DOE's radiologically contaminated surplus facilities" (Hart, 1997). The DOE-EM D&D-related tasks include the deactivation of approximately 7,000 contaminated buildings, the decommissioning of approximately 700 buildings, and the dispositioning of radioactively contaminated materials (including more than 600,000 tons of metal, 23 million cubic meters of concrete in contaminated buildings, and 400,000 tons of metal currently in scrap piles). The major drivers for some kind of action are the high safety and health risks associated with working in aged and contaminated facilities, and the high cost associated with facility deactivation, surveillance, and maintenance (i.e., high mortgage costs).

Structure and Composition

The DDFA is run by DOE program managers.

Decision Processes

Recent decisions within the DDFA have been associated with two activities: identifying and prioritizing D&D-related needs within the DOE-EM complex and conducting the LSDP. The LSDP represents more of a priority for the DDFA than individual technology development projects, which were funded only at a level of approximately $1 million in FY 1997. The methods used to gather needs and to initiate the LSDP are described in further detail below.

Needs Assessment Activities

In pre-Focus Area days (Focus Areas were formed in April 1994, replacing the former Integrated Demonstrations Program), D&D-related technical deficiencies were identified based on a workshop involving outside groups. The report of this workshop is no longer used. The DDFA's management was moved from DOE-HQ to FETC in 1995, when the LSDP program was begun. In early 1996, DDFA personnel visited existing STCGs and site "end users" to collect prioritized site needs. To compare these needs, a short list of scoring criteria and their relative weighting factors were developed in a separate process involving interactions between the DDFA, STCGs, and the CLN. The criteria and weighting factors (DOE, 1996aa) were the following:

· public safety and health (12 percent), which addresses the potential negative impact of the stated problem on safety and health of the public;
· site personnel safety and health (15 percent), which addresses potential negative impact of the problem on the safety and health of on-site personnel;
· environmental protection (10 percent), which addresses the potential of the problem to cause release of hazardous or radioactive materials on-site or off-site, including the potential damage to environmental resources, habitats, and populations;
· compliance (10 percent), which relates the problem to compliance with regulatory requirements, laws, court orders, binding agreements, DOE orders, and administrative notifications by a regulatory agency;
· mission impact (10 percent), which evaluates the problem relative to the existing DOE-EM mission;
· mortgage reduction (23 percent), which addresses the problem's impact on the D&D life-cycle cost;
· social, cultural, economic (7 percent), which addresses the problem's impact on social, cultural, or economic concerns in areas or regions surrounding the site, including potential impacts on stakeholder trust; and
· pervasiveness of need (13 percent), which addresses the problem in terms of magnitude (e.g., severity and volume of contamination, number of facilities affected) and applicability across the DOE complex.

National D&D Workshop. The site needs and the sets of criteria and weighting factors were used in a national priority-setting workshop (DOE, 1996aa) sponsored by the DDFA. Focus Area personnel aggregated the 102 site needs into 31 major areas. Each of approximately two dozen participants scored each of these 31 D&D needs against each of the criteria. The result was a prioritized list (Figure C.2) of 31 national D&D technology needs, a nonbinding informative input (to create awareness) to the LSDP Integrated Contractor Teams and the Crosscutting, Industry, and University Programs, for their

[Figure C.2: prioritized list of the 31 national D&D technology needs with their criterion scores; the table entries are not legible in this scan.]

consideration, to guide their selection of technologies. This prioritization exercise will be superseded by FY 1998 priorities derived from Accelerating Cleanup: Paths to Closure (DOE, 1998a).

Funding Technology Development Activities

The DDFA funds some specific PI-type technology development projects as well as a select handful of LSDPs.

Technology Development Projects. The DDFA is open to any technology showing a cost, schedule, or risk reduction, or anything that would do a job that cannot be done by current methods. DDFA program managers can steer technology developers to the right program to seek development funds. An outside developer would be directed to the Industry Program, which has a separate budget (because, legally, national laboratories cannot compete with private industry) and well-prescribed rules of interaction governing the solicitations for proposals and the type of feedback that can be given. An in-house DOE developer (e.g., someone at a national laboratory) would be steered to funds controlled by one of the Focus Areas. Essentially all DDFA technologies are at the gate 5-6 level. The funds for specific projects comprise about 10 percent of the FY 1997 DDFA budget of $10 million (see Box C.1), with additional D&D-related technology development work represented by approximately $13 million to $14 million in the budgets of the Industry Programs and Crosscutting Programs.

Large-Scale Demonstration Projects

The cornerstone of DDFA's technology development program is a series of LSDPs, which are conducted at unused nuclear facilities already slated for D&D (by EM-40 or 60) that serve as EM-50 demonstration test beds for new technologies. The intent of the LSDP is to demonstrate the potential advantages of innovative technologies over commercial, baseline approaches during D&D operations.
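The national workshop scoring described above (each of roughly two dozen participants rating each of the 31 aggregated needs against the weighted criteria) amounts to averaging per-participant weighted scores. The criterion weights below are the ones quoted in the text; the rating scale of 1 to 5, the needs, and the participant ratings are invented for illustration.

```python
# Sketch of the 1996 national D&D workshop prioritization: each participant
# scores each need against eight weighted criteria, and averaging across
# participants yields the national ranking. Weights (in percent) are from
# the report; needs, participants, and the 1-5 scale are hypothetical.
from statistics import mean

CRITERIA_WEIGHTS = {
    "public safety and health":         12,
    "site personnel safety and health": 15,
    "environmental protection":         10,
    "compliance":                       10,
    "mission impact":                   10,
    "mortgage reduction":               23,
    "social, cultural, economic":        7,
    "pervasiveness of need":            13,
}

def participant_score(ratings):
    """One participant's weighted score for one need (weights sum to 100)."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items()) / 100.0

def rank_needs(ratings_by_need):
    """ratings_by_need: need -> list of per-participant rating dicts."""
    averaged = {need: mean(participant_score(r) for r in ratings)
                for need, ratings in ratings_by_need.items()}
    return sorted(averaged, key=averaged.get, reverse=True)
```

Because the weights sum to 100, a need rated uniformly at r by every participant scores exactly r; the weights matter only when ratings differ across criteria.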
LSDPs were instituted to address two DDFA issues: 1. EM-30, 40, and 60 are not willing to risk using technology that has not been demonstrated at a large scale, and 2. the private sector is reluctant to take the risk of developing a product for DOE that may not be used or may only have a single use. As a result, DDFA program priorities and expenditures have shifted emphasis from FY 1995, when

BOX C.1 D&D Technology Database Software

In 1997, OST, through FETC, funded the development of a database by a company called Nuclear Expertise, Inc. (NEXT). This NEXT concept, entitled "Integrated D&D Planning and Estimating Software Tools for Nuclear Utilities," was part of a contract with DOE to develop planning and cost-estimating tools for DOE facilities slated for decommissioning. Several databases have been developed by a number of private industries, utilities, government agencies, laboratories, and other countries that are conducting decommissioning. A review of the NEXT product indicates that it provides little if anything beyond what is already in existence. Although this specific example was not a high dollar expenditure, it is one example of how the OST decision process did not screen efforts already in existence prior to initiating a project.

100 percent of the approximately $10 million budget for DDFA was for technology development. In FY 1997, 90 percent of the DDFA funding was for LSDPs, with only 10 percent for individual technology development projects.

The DDFA selects the LSDP facilities from among site proposals on a competitive basis, with criteria to evaluate how well suited each structure is to demonstrating a suite of technologies relevant to complex-wide D&D challenges. The Focus Area sets up a team of several contractors to run the LSDP, allowing windows of time and opportunity in the schedule for cost and performance data to be collected on each innovative technology to be demonstrated. This Integrating Contractor Team (also called a "Strategic Alliance") selects which new technologies are demonstrated, choosing candidates from among EM-50-funded projects as well as from technologies outside the DOE system (i.e., the private sector and abroad). The DDFA showcases the LSDP approach as the solution to the problem of how to bring private-sector companies and technologies into the DOE-EM complex while educating the private sector on how to do business with DOE. The cost of an LSDP is shared between EM-50 and EM-40 or EM-60. The DDFA philosophy is that using EM-50 money to underwrite LSDPs is a way to

· provide a full-scale demonstration of a new technology to EM-40 and 60 federal employees and contractors,
· obtain side-by-side demonstrations of innovative technologies with baseline methods,
· reduce supplier risk and liability associated with the first-time use of new technologies, and
· introduce these new technologies to the community of contractors who will be bidding on future work at other DOE sites and who need to know of the technologies in order to use them.
As a measure of success, 30 technologies had been demonstrated as of August 1996 in the first three LSDPs, and 11 of these have been retained for use to complete the D&D of that LSDP facility, having proven themselves, in the judgment of the contractor teams managing the jobs, superior to the baseline methods. The new technologies have not yet been applied to jobs outside the LSDP where they were first demonstrated (as of May 1997), perhaps because the first demonstrations were done relatively recently (in August 1996). The self-imposed strategic goal of the DDFA is to perform eight LSDPs in appropriate facilities by FY 2002 that would demonstrate the technical capability to handle 90 percent of DOE-EM's D&D problems, as calculated in some yet-to-be-determined way to account for problem needs and their technical solutions.

The first three LSDPs, selected by the DDFA in the first solicitation during the summer-fall of 1995, have slightly different arrangements for the Integrating Contractor Teams or Strategic Alliances. For the Chicago Pile 5 (CP-5; see Box C.2) test reactor at Argonne National Laboratory-East, the contractors entered into a legally arranged cooperative agreement. For the Fernald Plant 1 Uranium Handling Complex, a hybrid model was used in which the fixed-price site management and operations (M&O) contractor teamed with others (Babcock & Wilcox, Foster Wheeler, and others). For the safe storage of the Hanford C Production Reactor, the Bechtel Hanford prime site contractor led the job, with others operating as subcontractors. These LSDP management arrangements are an experiment to get improved technology implemented in the DOE-EM complex. The past practice of having a test bed demonstration somewhere, followed by an EM-50 push to encourage use of the newly demonstrated technology, resulted in few implementations.
Two more LSDP facilities, Building 779 (a plutonium-processing lab replete with glove boxes) at Rocky Flats and Building K-27 at Oak Ridge, were selected in 1996 in a second competitive solicitation by the DDFA. These planned LSDPs were subsequently canceled (NRC, 1998a).

Selection of LSDP Facilities from Among Site Proposals. The competitive selection process by which LSDP facilities are chosen is handled like an official solicitation, not because this is required but because it is deemed to provide a suitably rigorous way to make a defensible and fair decision in a site

BOX C.2 The CP-5 Large-Scale Demonstration Project

The CP-5 demonstration focused on the decontamination and dismantlement of the CP-5 test reactor facility at Argonne National Laboratory. The project was "an aggressive campaign to screen and evaluate potential technologies for demonstration in four problem areas: characterization, decontamination, dismantlement, and worker health and safety" (DOE, 1997c), integrating the technology demonstrations with the schedule of ongoing D&D work. The CP-5 LSDP focused on technologies emphasizing characterization, worker protection, robotics and remote systems, concrete decontamination, and storage pool filtration. The LSDP included six demonstration "sets" (DOE, 1996c). The first set of demonstrations, the Validation Set, was intended to fine-tune the overall planning, execution, assessment, and reporting processes. Set 5 relates to the dismantlement of the research reactor, which includes the demonstration of the Mobile Work System and a number of "innovative and commercially available end effectors and tools" (www.strategic-alliance.org/). Twenty-five separate technologies were planned to be demonstrated before the project terminated in mid-1997. The centerpiece of the demonstration was the dismantlement of the bioshield and reactor core. Tools and equipment to be demonstrated included a mobile work system (Rosie), a Dual Arm Work Platform, and a Swing Reduced Crane Control system. Other technologies demonstrated at the CP-5 LSDP included the Pipe Explorer™ characterization tool, the Mobile Automated Characterization System, an X-Ray Fluorescence Analyzer, Empore Membrane Filtration, Surface Contamination Monitor, Pipe Crawler, Gamma Cam, and Concrete Milling/VAC-PAC®, all part of the strategic alliance objectives of FETC. Data are provided from these activities to the U.S. Army Corps of Engineers for cost analysis (DOE, 1997c).
The Integrating Contractor Team was led by Duke Engineering Services. Other members of the team include 3M, Commonwealth Edison, Duke Power, ICF Kaiser, Florida International University, and Argonne National Laboratory. In addition to project management and integration, this entity is responsible for technology transfer. The CP-5 LSDP was intended to examine potential application of qualified technologies to other DOE sites with similar needs or to private industry D&D projects such as Commonwealth Edison's Dresden 1. Benefits of the LSDPs are that they

· achieved meaningful technology demonstrations that qualify for commercialization and/or wider application throughout the DOE complex;
· expedited deployment of D&D technologies required to meet specific customer needs while meeting OST-established guidelines;
· identified technology activities that should be reviewed for continuing DOE support; and
· introduced commercial business practices to technology deployment, thereby illustrating DOE's commitment to performance-based strategies and contracting reform.

However, many of the technologies and teamworking activities have already been demonstrated on other decommissioning projects, in both commercial and DOE sectors (e.g., the Shippingport Station Decommissioning Project, the University of California, Berkeley research reactor, Yankee Rowe, Shoreham, Fort St. Vrain, Tuxedo Park, and other sites in the United States and abroad). A peer review panel (ASME, 1997a; Love, 1997) found that some of the technologies termed "innovative" are in fact fully developed (see also Box C.3). The redemonstration of these technologies on DOE-EM D&D projects is a standard "on-line engineering" practice to adapt equipment to new applications.

competition. The DDFA issues a call for proposals from DOE site operations offices for candidate LSDP locations. The proposals received are evaluated as follows:

1. Each person on a Technical Advisory Committee independently reviews each proposal based on criteria such as the significance of the demonstration, its cost-saving potential, the potential for complex-wide application of any useful results, the existence of a variety of challenging technical problems (to maximize potential application elsewhere in DOE), the site commitment to funding the D&D project, and favorable management arrangements. This 7-to-10-person team consists of DOE managers from EM-30, 40, 50, and 60, plus representatives of the U.S. Army Corps of Engineers (the Corps was not included in the first solicitation in 1995). Each Technical Advisory Committee member provides written qualitative comments on each of the criteria, documenting the strengths and weaknesses of the proposal.
2. An Evaluation Team of three people develops consensus on a final ranking for each proposal, by assigning a numerical rating (weight) for each criterion, quantitatively scoring each proposal against the criteria, and averaging the three member scores.
3. A Selection Official receives these numerical results and issues a letter to make the award, but not before the final step.
4. The selection results are presented to the Focus Area Steering Committee (Deputy Assistant Secretaries of EM-30, 40, 50, and 60), for their review and approval.

In summary, proposals are prioritized using, first, qualitative comments by the Technical Advisory Committee, and later quantitative scores by the Evaluation Team. The approval of the Focus Area Steering Committee is obtained before the Selection Official makes the final selection. The decision reached is a team effort and a product of Focus Area management efforts in which many participate.
The following are evaluation criteria and their weighting factors that were developed (DOE, 1996f) from this process:

· significance of the demonstration (20 percent): scale and scope, potential to reduce cost and risk over baseline in similar future projects, end state;
· readiness of demonstration (20 percent): current status of characterization and decommissioning plans, D&D contractor under contract;
· site commitment (30 percent): funding from other organizations, consideration of stakeholder concerns;
· project management (30 percent): Integrating Contractor Team in place, with perceived strengths; and
· program policy factors: no weighting factor appears to be assigned to this criterion; it is used by the Selection Official to consider needs such as distributing projects among a greater geographical area; optimizing the use of available funds; addressing federal, state, and local political sensitivities; diversifying the types of facilities hosting LSDPs; and considering projects that enhance existing or planned activities of DDFA, including collaboration with STCGs.

Selection of Individual Technologies to Be Demonstrated at an LSDP Facility. The Integrating Contractor Teams and Strategic Alliances (SAs) at existing LSDPs have their own processes to identify, screen, and select the technologies for demonstration, but in general each uses the following evaluation criteria (Hart, 1997) to decide how to allocate the funds they have been authorized by FETC to achieve the objectives stated in their proposal:

· technology maturity,
· application to facility needs,

· application to DOE complex needs,
· ability to adequately measure performance,
· compatibility with baseline D&D schedule,
· demonstration cost,
· expected improvement over baseline,
· waste minimization, and
· technology provider participation.

The Integrating Contractor Team decides which technologies are to be demonstrated, at a typical cost of $100,000 to $300,000 per demonstration, and presents this selection as a proposal to two DOE employees (one from the DDFA and one from the DOE Operations Office for the LSDP site). The DOE employees exercise line-by-line veto power. One issue they consider is to ensure that there is no duplicative demonstration at another LSDP. The approved demonstrations are conducted. This process has varied from one LSDP to another, based on the DDFA's experience and the different legal arrangements that define the interaction between the Integrating Contractor Team and DOE managers. Each LSDP has a target number of demonstrations to be done during the D&D job, and payment of EM-50 funds to the contractor teams is made in stages as individual technologies are demonstrated (according to pre-approved test plans), rather than in a lump sum. The U.S. Army Corps of Engineers is used as an independent party to validate cost and performance data. The full-scale engineering cost and performance data generated from each individual technology demonstration are recorded in an Innovative Technology Summary Report (ITSR), also known as an EM-50 "green book."

Cost Sharing Among LSDP Participants. The full costs of performing technology demonstrations are shared by the DDFA (EM-50), vendors of innovative technologies, and the owner of the surplus facility (EM-40 or EM-60). The DDFA contributes several million dollars to each LSDP.
In general, DDFA funds represent the incremental costs associated with demonstration of innovative technologies (i.e., the difference between the cost to demonstrate the new technology on a section of a facility and the cost to use a commercial baseline technology on the same section of the facility). Costs for the use of commercial baseline technologies are borne by the problem owner. The new technology vendor is expected to share the cost of the demonstration since successful demonstration will validate the product and provide a rapid avenue to commercialization. Funds may also be contributed from other sources such as the private sector, other DOE departments, and other federal or state agencies. Some of the technologies used in an LSDP can also be submitted for ASTD funding. Since the ASTD program is for deployment of already-demonstrated technologies, technologies successfully demonstrated at an LSDP are well positioned for ASTD funds.

Impact of Accelerating Cleanup: Paths to Closure on the DDFA. The current 2006 Plan calls for $2 billion to be spent on D&D in DOE-EM until 2006, with $18 billion spent after that date. The crafting of site 2006 Plans has generated more and better data on DOE-EM D&D jobs than the DDFA originally had. For example, it requires EM-40 and 60 to show the schedule of D&D operations. The level of detail is also sufficient for the private sector to make a rational decision on investment in going after these jobs. Such additional information (provided as part of 2006 Plan activities) has helped OST program managers (Hart, 1997) justify the need for each of the LSDPs in the second round. Accelerating Cleanup: Paths to Closure (DOE, 1998a) is also credited with helping to identify a cost and schedule estimate for each baseline technology.
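The incremental-cost convention described above can be stated arithmetically: EM-50 pays only the difference between the demonstration cost and what the baseline method would have cost on the same facility section, and the baseline portion falls to the problem owner. The dollar figures below are placeholders, not amounts from the report.

```python
# Sketch of the LSDP cost split described above: the DDFA (EM-50) covers the
# incremental cost of demonstrating the innovative technology; the facility
# owner (EM-40/60) covers what the commercial baseline would have cost on
# the same section. Dollar figures are placeholders.

def cost_shares(demo_cost, baseline_cost):
    """Return (EM-50 share, problem-owner share) for one demonstration."""
    increment = max(demo_cost - baseline_cost, 0.0)  # EM-50 pays only the extra
    return increment, baseline_cost

em50, owner = cost_shares(demo_cost=250_000.0, baseline_cost=180_000.0)
print(em50, owner)   # -> 70000.0 180000.0
```

Vendor and other-agency contributions, which the text notes can also offset demonstration costs, would further reduce these two shares.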
1Demonstration performance indicators are identified (Hart, 1997) as worker radiation exposure, health and safety, cost or performance, schedule savings, training requirements, secondary waste generation, equipment mobilization or demobilization, and ease of equipment decontamination.

DDFA Bringing Technologies into the DOE-EM Complex. As described to the committee (Hart, 1997), the term "innovative" as applied to the DDFA might be best defined as "never been used before in the DOE complex," meaning that a technology does not necessarily have to be new in the non-DOE world for it to be considered for DDFA funding. For the CP-5 Reactor LSDP, where 20 innovative technologies were planned to be demonstrated, 10 of these were mature full-scale hardware systems from OST-funded technology development projects and 10 came from outside DOE (i.e., they were already in use in the private sector and abroad; see Box C.3). Of these 20 projects, 14 had been demonstrated as of May 1997. The 20 were screened from a beginning list of approximately 100 alternative technologies.

D&D-Related Mortgage Reduction. The decontamination and decommissioning of unused buildings is not a compliance-driven activity in general; therefore, only a handful of projects are done each year. The DDFA sees technical innovation to cut D&D costs as an excellent opportunity to achieve mortgage reduction. The rationale is that, with so few D&D projects slated for the near future, opportunity exists to develop new technologies to impact the baseline costs of future jobs.

Committee Observations

The 1996 prioritization criteria are stated in general terms (e.g., public safety and health, environmental protection, and compliance), which has the disadvantage that they are subject to a wide range of possible interpretations. This can cause an evaluator to react to the term used, thereby introducing the evaluator's own value system judgments into the evaluation of technical need. The correction would be to provide expanded statements defining these criteria in more explicit terms.

BOX C.3
Characterization Hardware: The Pipe Explorer™ System

The Pipe Explorer™ device was an OST development that had been demonstrated at FUSRAP (Adrian, Michigan); Albuquerque, New Mexico; and Grand Junction, Colorado (DOE, 1996q) by 1996, but which was redemonstrated in the CP-5 LSDP. The Pipe Explorer™ device was developed by Science and Engineering Associates, Inc., under contract with FETC. It has been used to transport various characterizing sensors into piping systems that have been radiologically contaminated. The device can be deployed through constrictions in the pipe, around 90° bends, vertically up and down, and in slippery conditions. The device is protected from contamination to eliminate cross-contamination and false readings by means of an airtight membrane, which is disposable. When the canister is pressurized, the membrane inverts and deploys inside the pipe. As the membrane is deployed within the pipe, the detector and its cabling are towed into the pipe inside the membrane. Measurements are taken from within the protective membrane. Once the measurements are complete, the process is reversed to retrieve the characterization tools. Characterization sensors that had been demonstrated as of the 1996 report were gamma detectors, beta detectors, video cameras, and pipe locators. Alpha capability had been developed by that time, but had not yet been demonstrated. The system is capable of deploying in pipes as small as 2 inches in diameter and up to 250 feet long for pipe of diameter 3 inches or greater.

The redemonstration of the Pipe Explorer™ device at CP-5, after its development and demonstration at three other DOE sites, is an indication of a problem within DOE-EM, which might be cast as a deficiency in obtaining the broad knowledge and adoption or utilization of innovative technology as it becomes known and demonstrated. If this function were conducted by another program office, technology development funds would not have to be used for this purpose.

134 Decision Making in the DOE-OST

The committee also notes that not all criteria must be combined in the same weighted analysis. For example, the criterion of worker risk could be handled as a go or no-go judgment based on some safety threshold (e.g., whether the innovative technology entails more worker risk than the baseline method), instead of in a weighted fashion with other criteria. As a second example, the criterion of mortgage reduction could be taken out (i.e., not weighted with the others, as was done in the 1996 effort) and evaluated separately by DOE program managers considering the policy angles, rather than being combined with the weighted judgments associated with the other criteria. This issue arises because the weight of this category may be different in 1998 than in 1996, and such a difference may affect the outcome. The concept of mortgage reduction is key to Accelerating Cleanup: Paths to Closure (DOE, 1998a).

According to the committee's perception, the descriptors of the 31 needs of the 1996 needs assessment exercise (Figure C.2) are too general to define the technical program. If the program goal is to engender private sector participation in DOE-EM D&D operations (rather than to develop technology in deficiency areas), then such a mission should be formulated only after thorough research (and documentation) to ascertain the validity of the underlying assumption that this is what OST needs to do.

TANKS FOCUS AREA

This section discusses the decision-making approaches of the Tanks Focus Area (TFA), which develops technology to remediate tanks at four DOE-EM sites2: the SRS, INEEL, the Oak Ridge Reservation (ORR), and the Hanford Site.

Structure and Composition

The TFA is run by DOE program managers, who work as a team with contractors, many at the Pacific Northwest National Laboratory (PNNL).
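The committee's suggested alternative of treating worker risk as a go or no-go safety gate, rather than folding it into the weighted sum, can be sketched as follows. All criterion names, weights, and scores here are illustrative assumptions, not values from the report.

```python
# Hypothetical sketch: worker risk as a go/no-go gate applied before a
# conventional weighted scoring of the remaining criteria.

def score_technology(scores, weights, baseline_worker_risk):
    """Return a weighted score, or None if the safety gate rejects the technology."""
    # Go/no-go judgment: reject outright if worker risk exceeds the baseline method's.
    if scores["worker_risk"] > baseline_worker_risk:
        return None
    # Remaining criteria are combined in a weighted sum.
    return sum(weights[c] * scores[c] for c in weights)

# Illustrative weights and scores (hypothetical).
weights = {"cost_savings": 0.5, "schedule": 0.3, "public_acceptance": 0.2}
candidate = {"worker_risk": 2, "cost_savings": 8, "schedule": 6, "public_acceptance": 7}
print(round(score_technology(candidate, weights, baseline_worker_risk=3), 2))  # → 7.2
```

A criterion handled this way never trades off against the others: no amount of cost savings can compensate for exceeding the safety threshold.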
Process Description

Tanks Focus Area decision making involves four major steps: technology need identification, technology need prioritization, proposal solicitation, and proposal selection. Since its inception, the TFA has employed a well-defined approach to these tasks, an approach that continues to be modified and improved. The current overall TFA decision-making process is summarized in Figure C.3 (Frey and Brouns, 1997) and discussed below. This discussion is in the context of decision making for an arbitrary fiscal year N.

Identification of Technology Needs

The TFA technology needs identification for year N begins in the first quarter of the prior fiscal year (FY N - 1) by sending each of the four relevant site STCGs a call for identification of prioritized technology development needs. The format for recording needs is a template originally created by the

2The tanks at the Nuclear Fuel Services Plant near Buffalo, New York, are not addressed by the TFA.

[FIGURE C.3 Overall Tanks Focus Area decision-making process (Frey and Brouns, 1997). Figure not reproduced.]

TFA. With the maturation and coordination of the STCGs, this template has been refined and standardized across all sites and Focus Areas. The call for technology needs results in considerable STCG activity to assess, establish, and prioritize site needs, as described in Appendix B.

Aggregation and Prioritization of Technology Needs

The TFA sorts the prioritized responses from the STCGs by primary technical category (e.g., retrieval, characterization, pretreatment, and immobilization). The STCG priority valuations are retained. The needs are then analyzed to identify duplicate or similar needs (a common occurrence), which are aggregated into higher-level needs in a "roll-up" process. The priority of a higher-level need is assessed based on the STCG priorities of the constituent needs represented within the higher-level need category. This analysis and roll-up of needs is performed by the TFA Technical Integration Managers (TIMs), who are the second tier of the TFA technical organization and are staffed by individuals from a number of DOE sites.

The aggregated needs are then subjected to a complex series of activities to prioritize them. First, the needs are prioritized by the TFA Technical Team. This draft prioritization is then reviewed by representatives of the four tank site STCGs and representatives of site contractor users via discussions and meetings. The result is a prioritized set of aggregated needs that is reviewed and approved by the various site STCG representatives. The prioritized needs are documented in an annual informal report containing the basic needs, aggregated needs, and final prioritization. This report is distributed to all affected sites. A prioritized list of site needs is achieved by negotiation among the four tank sites and the TFA technical staff using the criteria and trade-offs that each of these parties provides.
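The roll-up of duplicate site needs into higher-level needs can be sketched as below. The grouping key and the rule for deriving the higher-level priority (here, the best constituent STCG priority, with 1 as highest) are assumptions for illustration; the text does not specify the TFA's actual rules, and the site needs shown are invented.

```python
# Hypothetical sketch of the TFA "roll-up": similar site needs are grouped
# under one higher-level need whose priority is derived from its constituents.
from collections import defaultdict

site_needs = [
    # (site, technical category, need, STCG priority) -- illustrative data only
    ("SRS", "retrieval", "sludge mobilization", 1),
    ("Hanford", "retrieval", "sludge mobilization", 2),
    ("ORR", "characterization", "in-tank sampling", 1),
]

# Group duplicate/similar needs; STCG priority valuations are retained per site.
rolled_up = defaultdict(list)
for site, category, need, priority in site_needs:
    rolled_up[(category, need)].append((site, priority))

# Assess each higher-level need's priority from its constituents (assumed rule:
# take the best, i.e., lowest-numbered, constituent priority).
for (category, need), constituents in rolled_up.items():
    level = min(p for _, p in constituents)
    print(f"{category}/{need}: priority {level}, sites {[s for s, _ in constituents]}")
```

Whatever the actual aggregation rule, the committee's concern about the roll-up (discussed later in this appendix) is visible even in this toy form: the aggregated label may no longer match the wording of any single site's original need.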
A structured decision process and explicit criteria are not evident; in fact, the subjectivity appears to be desired by the participants. Thus, the participants appear to be satisfied with the process, and TFA performance continues to be highly regarded by other review groups and by DOE management (Berkey, 1998; Surles, 1997).

Creating a Work Plan of Proposed Projects

The prioritized, aggregated needs provide the basis for the TFA Technical Team to devise a technical work plan to meet these needs. The needs are apparently first assessed to determine whether they are carry-over needs that are already being addressed or can be addressed by ongoing projects. Ongoing projects appear to receive the highest priority; then new starts are considered. The Technical Team identifies projects, including ongoing projects, to meet the needs. Each need is evaluated to determine the way that projects to fulfill it will be solicited. Proposals may be sought from the private sector (industries and universities) or national laboratories by requests published in appropriate media, such as Commerce Business Daily, the Federal Register, Internet sites, and letters within the DOE complex. In some cases, a proposal is solicited from only one organization. These "sole-source" solicitations appear to be used for projects that require capabilities (e.g., facilities or access to wastes) unique to a specific organization. It is not clear to the committee how the Tanks Focus Area determination of unique capability is made and controlled.

The projects proposed (representing both the ongoing projects and the new starts), also called "technical responses," are prioritized with a quantitative procedure based on four criteria:

1. extent of multisite benefit,
2. cost or mortgage reduction,
3. support of TFA strategic goals, and
4. user commitment to deploy.

Appendix C Focus Areas 137

This prioritization process produces enthusiastic discussion, because the participants are normally aware of a nominal budget, which establishes a de facto cut line that provides motivation for proponents of specific proposals. The result is an initial prioritization that is documented and subjected to another review by the same group (representatives of the four tank site STCGs and representatives of site contractor users) that reviewed the aggregated list of prioritized site needs. The result is a prioritized list of TFA projects to be executed.

The results of this process are recorded during FY N - 1 in a formal document, a multiyear program plan, that outlines the midrange (several years) plans for the TFA. Additional information is provided during FY N - 1 to the DOE IRB, which is the planning basis for the fiscal year N + 1 budget, and to the PEG, which defines the specific work to be performed in the upcoming fiscal year N.

Process Evaluation and Committee Observations

As noted above, the TFA program is regarded relatively highly by other reviewers, which can be attributed in part to its strongly user-driven decision-making process. However, the process is complex and, in parts, opaque. Some of the specific concerns that it engenders include the following:

· The roll-up (aggregation) of site needs often renames and redefines the need of a specific site to the point that (1) the site does not recognize the aggregated need, even though the solution may be adequate; or (2) the original intent of a site need has been lost and the aggregated need does not meet this original need even though it is claimed to do so. Recent revisions of the needs template have attempted to address this problem by providing specific feedback on each site's stated need. However, it is not clear that the interaction between the TFA and the sites is sufficient to ensure that projects meet all the needs that are claimed.
· The means by which the TFA accommodates midcourse corrections necessitated by budget changes (apparently all too common) or changing needs at user sites (also common) are not clear to the committee. · The TFA has been relatively successful in transitioning to a customer-focused technology development program. However, it is not clear that the TFA has effective mechanisms in place to terminate successful development projects and transfer emphasis to other areas. In some areas, investigators have developed yet-to-be-implemented technologies and are now proposing to pursue "second-generation" technologies while other aspects of tank remediation require more resources to develop first-generation technologies.
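The quantitative prioritization of technical responses and the de facto budget cut line described above can be sketched as follows. The four criteria are those named in the text, but the weights, scores, costs, and budget figure are hypothetical; the report does not give the TFA's actual formula.

```python
# Hypothetical sketch: score "technical responses" on the four stated criteria,
# rank them, then walk the ranked list until the nominal budget is exhausted.

CRITERIA = ["multisite_benefit", "mortgage_reduction", "strategic_goals", "user_commitment"]
WEIGHTS = {"multisite_benefit": 0.3, "mortgage_reduction": 0.3,   # illustrative weights
           "strategic_goals": 0.2, "user_commitment": 0.2}

proposals = [  # illustrative proposals (costs in $K, scores 0-10)
    {"name": "A", "cost": 400, "scores": {"multisite_benefit": 9, "mortgage_reduction": 7,
                                          "strategic_goals": 8, "user_commitment": 9}},
    {"name": "B", "cost": 600, "scores": {"multisite_benefit": 6, "mortgage_reduction": 9,
                                          "strategic_goals": 7, "user_commitment": 5}},
    {"name": "C", "cost": 300, "scores": {"multisite_benefit": 8, "mortgage_reduction": 6,
                                          "strategic_goals": 9, "user_commitment": 8}},
]

def weighted(p):
    return sum(WEIGHTS[c] * p["scores"][c] for c in CRITERIA)

ranked = sorted(proposals, key=weighted, reverse=True)

# The nominal budget acts as a de facto cut line down the ranked list.
budget, funded = 800, []
for p in ranked:
    if p["cost"] <= budget:
        budget -= p["cost"]
        funded.append(p["name"])
print(funded)  # → ['A', 'C']
```

The cut line explains the "enthusiastic discussion" the report notes: a small change in a score or weight can move a proposal across the funding boundary.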
