Committee Findings

In its statement of task, the D&D committee was asked to review the approaches used by the DDFA and to assess their utility and effectiveness. The review was to include the DDFA's means of selecting and evaluating alternative technologies and its methods of seeking deployment of new technologies, especially the LSDP. In addition, the committee was to review the processes that the DDFA is using to compare the advantages expected from technology development with existing (within DOE) processes.

To accomplish its task, the committee held a series of fact-finding, discussion, and writing meetings, as described in Appendix B. Some of the committee members also visited the sites hosting the LSDPs. Reports of these site visits are given in Appendixes C-F. The committee's findings are based on literature made available by OST and the DDFA during the review, presentations at its meetings, and the site visits. Findings relevant to the statement of task are discussed in six topical areas:

  • Strategic planning,
  • Evaluation and prioritization,
  • End-state specification,
  • Dissemination of results,
  • Technology assessment, and
  • Technology implementation.

General Findings

Since its inception, OST has been subjected to a great deal of external criticism, focused primarily on the lack of significant new technology deployment commensurate with the amount of money spent (GAO, 1996, 1997). The OST responded to this criticism by an almost continuous change in programs, processes, and stated priorities. These are discussed in Chapter 1 and summarized in Table 1 (p. 7).

Within the DDFA, the committee found that continual superficial changes, for example, changing the names of its key programs (the Technology Deployment Initiative is now the Accelerated Site Technology Deployment Program, and the Large-Scale Demonstration Program became the Large-Scale Demonstration and Deployment Program), were detrimental to the DDFA's effectiveness. Such changes gave the impression that the programs were ad hoc and lacked follow-up. This was symptomatic of the lack of strategic planning, prioritization, specification of objectives, and documentation described in the following sections of this chapter. By implying significant changes in program content when there were none, the DDFA undermined its credibility with its potential DOE customers and industrial partners.

In addition to the perception of the DDFA's programs as a "moving target," the committee found the complexity of DOE purchasing, contracting, and other procedures to be a significant impediment to participation in DDFA programs by outsiders. Greater participation by private industry and universities might have provided a greater diversity of innovative technologies to the LSDPs. The DDFA's Requests for Proposals for the LSDP were addressed only to DOE offices and not to the commercial sector. Information on potential industrial partnerships for the LSDPs appears to have been available mainly to contractors already familiar with the sites. For example, no notice was published in Commerce Business Daily when the Strategic Alliance for CP-5 was being formed (Appendix C).

In response to external criticism and budgetary pressure, however, recent efforts by OST have been promising.
The committee acknowledges that these recent efforts (described by Gerald Boyd in OST's final presentation [Boyd, 1997b] to the committee) to formalize the prioritization process, to involve technology users, and to improve communications and information dissemination represent necessary and long overdue improvements in OST and the DDFA. These efforts are still in the planning stages, and it is too early to judge whether they will be successful. The committee can only comment on programs that were in place during the review period covered by this report, which ended in December 1997.

Strategic Planning

Implementation of the DDFA program has suffered from the lack of a current strategic plan on which to base discussions, select alternatives, and manage the program. The planning documents provided to the original CEMT D&D committee in 1995 were never issued in final form, nor were revised planning documents provided during the committee's review.17 Numerous changes in policy, objectives, schedules, and funding since 1995, especially the Ten Year Plan with its increased emphasis on technology deployment, have made previous DDFA planning obsolete (DOE, 1997c). Up-to-date planning documents were deemed necessary by the committee to define an ordered and structured sequence of activities for the DDFA's technology development and implementation.

In the absence of an up-to-date OST-DDFA strategy, new programs were initiated without documented evaluation of their expected results or of their impacts on existing efforts. The increased number of DDFA-related initiatives by EM-OST during 1996-97 is shown in Table 1 (p. 7). Shifts in approach, such as the emerging emphasis on deployment on top of the existing efforts at technology demonstration (the LSDPs), caused competition among new and existing programs for limited resources. As a result, the DDFA has experienced increased difficulty in meeting its previously announced objectives (for example, the continued schedule slippage of the LSDPs and their documentation, discussed in Chapter 1).

A conclusion of the 1995 CEMT report was that DOE should address planning in terms of a process, since "many organizations have found that the most valuable aspect of any strategic planning exercise is the process of assembling the plan rather than its specific details" (NRC, 1996). The CEMT report recommended that the planning "receive the undivided attention of the highest-level decision makers." In the committee's view, a successful plan should provide sufficient guidance for technology development to ensure the timely availability of technologies needed to achieve a specified end state for each facility undergoing D&D.
It would include analysis of the capabilities of existing technologies and provide for development of new technologies where existing technology is lacking or where a new approach could perform faster, cheaper, and more effectively. Such strategic planning for the DDFA has not been accomplished.

17   The documents, in draft form, were: Strategic Plan: Decontamination and Decommissioning Focus Area, Pre-Decisional Draft, July 14, 1995; and Implementation Plan, Decontamination and Decommissioning Focus Area, prepared by U.S. DOE Morgantown Energy Technology Center, July 18, 1995.

Near-Term (Pre-2006) Strategy

The current DOE approach to the cleanup of its radiologically contaminated weapons complex facilities is to make maximum use of existing and proven techniques to achieve as much cleanup as practicable in the shortest time. The stated goal of EM's Ten Year Plan is to clean up 90 percent of its sites within the next 10 years (DOE, 1997c, 1998b). This is a commendable goal, but it clearly leaves the more difficult, the more intractable, and the more costly cleanup matters for later (long-term) treatment (DOE, 1997c, 1997d).18

For near-term D&D projects, the committee agrees with the DDFA's approach of identifying and validating, by on-site demonstrations, existing technologies and commercially available practices that are currently not used on DOE sites. However, information presented to the committee indicated that capabilities already residing in the private, academic, and foreign sectors were not being incorporated efficiently into the DDFA's short-term planning. In fact, it became clear during several question sessions that such sources were mostly untapped. In cases where these "new" technologies are demonstrably superior to baseline or "available" technologies, they can be applied successfully to enhance EM's near-term D&D capability.

Further, the committee believes that there is insufficient time to develop, test, and apply unproved, truly innovative technologies to effect major cleanup tasks by 2006. OST's shift in emphasis from technology development toward technology deployment assistance is consistent with accelerated cleanup. If the expected benefits from innovative technologies proposed by the DDFA are to be realized, the DDFA will need to achieve greater effectiveness in technology deployment than was seen by the committee during its review.

Long-Term (Post-2006) Strategy

While the emphasis of the Ten Year Plan is on proceeding as quickly as possible with actual cleanup work, the plan also considers development of advanced technologies necessary, both to address cleanup tasks not amenable to available techniques and to reduce the costs and risks of the cleanup program. The committee believes that ten years or more is a realistic time frame for development, demonstration, and deployment of truly innovative (i.e., unproved or presently unknown) technologies.
The committee found that an opportunity exists now for the DDFA to develop an effective long-term strategy that leads to the deployment of advanced technologies after 2006. It seems prudent for the DDFA to concentrate its technology development activities on addressing longer-term needs. To focus this activity, both site-specific and complex-wide problems that are either intractable or very difficult (e.g., expensive) with current technologies must be identified and prioritized. An effective strategy will target those problems with the highest complex-wide return on investment and include the research and development capabilities of industry, the national laboratories, and research universities as appropriate.

18   It is noted that about 60 percent of the total cleanup cost will be incurred after 2006. For D&D activities, coming generally near the end of the overall program, it has been estimated that 80 percent of the costs will be incurred after 2006.

A review of OST's programs, including the DDFA, by the Strategic Laboratory Council (composed of one representative from each national laboratory) presented suggestions for greater use of the national laboratories as sources of new technologies, especially through the EM Science Program (SLC, 1997). Because D&D is a growing need for all industrialized countries, non-U.S. programs are another source of technical innovations.

Evaluation and Prioritization

The sine qua non of effective technology development is the establishment of the specific needs that the technology is to meet. The "National Needs Assessment," compiled from needs identified by the STCGs and issued in July 1996, was a significant step in this direction (DOE, 1996a). This document, which was intended to represent complex-wide DOE technology needs, resulted in the identification of 102 D&D needs, grouped into 31 categories. In several of the DDFA's presentations to this committee, site needs were listed as a criterion for selecting technologies to be demonstrated in the LSDPs; however, no presentation or publication from the DDFA has shown a clear relationship between these categories and the technologies selected for the LSDPs or TDIs.

The committee finds that effective overall evaluation of site needs and prioritization of those needs are missing from the present DDFA approach. In addition to the initial "bottom-up" input from the STCGs, "top-down" prioritization for program planning is necessary. Such an exercise requires the judgment and participation of senior D&D subject-matter experts both external and internal to DOE, who can provide input on existing commercial and other government agency technologies and thus develop an overall, prioritized, top-down evaluation of needs to feed into the DDFA program.
The committee also believes that the emphasis of this effort should be to identify and prioritize only those technologies needed beyond 2006 that may provide significant cost and/or safety benefits in the long term, as well as those few problem areas where no technology currently exists.

End-State Specification

The 1995 CEMT report stated that "DOE is in urgent need of defining criteria by which the level of site cleanup on a 'necessary and sufficient' basis within regulatory constraints can be determined" (NRC, 1996). This level of cleanup, or "end state," is the final condition of the building, facility, or site after cleanup is accomplished. In the case of D&D, an agreed-upon end state may require little cleanup for a building planned for re-use by DOE, or it may require demolition of the building and return of the land to unlimited use. End states must be known and specified in order to develop a realistic and effective cleanup program.

Determination of end states for DOE sites or facilities is clearly beyond the authority of OST and is not even entirely within the province of DOE, considering stakeholder and regulatory agency involvement. Nevertheless, the committee would have expected the DDFA to use its best judgment to define reasonable target end states as a standard for determining needs for new technologies, comparing technologies, and measuring their degrees of success. In its present review, the committee found no evidence of significant progress in the DDFA in defining or even proposing end states to be targeted by its new technologies, nor did there appear to be any sense of urgency in the DDFA to address this need.

The committee believes that a coherent program to develop new D&D technologies cannot be formulated unless the target end states to be achieved are specified. Without defining "how clean is clean enough" for a given D&D task, the technology needs, costs, and schedules cannot validly be established. The defined end state also determines the characteristics of secondary waste from the D&D activity. Volume, radioactivity, and other chemical and physical properties of the waste stream will determine recycling and disposal options, cost, and, ultimately, whether the proposed D&D technology is adequate. The technology required to reach one level of residual radioactivity may be entirely different from the technology required to reach a different level, either higher or lower. Depending on the required end state, there may or may not be a need to develop a new technology to meet the requirement.

By developing reasonable target end states, the DDFA will be better able to justify its technology development program. For example, presently available technology may not be able to reach a target end state, in which case improved technology clearly is needed.
Alternatively, in seeking to deploy a new "cheaper and faster" technology, demonstration that it can achieve the same end state as the baseline technology is essential. As in a previous report (NRC, 1996), the committee notes that before undertaking any technology development program, it is essential to define and specify the particular problem that is to be solved, rather than to develop a solution and then look for a problem to which the technology can be applied.

Dissemination of Results

OST does not have direct responsibility or authority for implementing or using the technology it demonstrates. Before the DOE field offices or site contractors will be willing to use its technology, OST/DDFA must sell the technologies through effective communication. Potential users must be aware not only of the technical results of the DDFA's developments and demonstrations but also of the factors that make a given technology more attractive, or less attractive, than the baseline technology. The lack of success in gaining user support for its initiatives that the committee observed during its review period indicates that the DDFA is not communicating effectively with its customers.

As described in Chapter 1, the Innovative Technology Summaries, or "Green Books," were planned to be the primary vehicle for disseminating results of the LSDPs. The DDFA planned to describe each successful technology demonstration in sufficient detail, including comparative cost data, to convince potential users complex-wide to adopt the new technology in place of a previous baseline technology. It is a significant deficiency that no Green Book had been published by the end of the committee's review period, December 1997. Several technologies had been demonstrated more than a year previously.19 Only five of an expected total of about 50 Green Books had been published as of February 1998. Although in his final presentation to the committee Gerald Boyd stated that, of 39 demonstrated technologies, EM-40 had adopted 14 without requiring Green Books (Boyd, 1997a), the committee found that the DDFA has fallen short of its initial intent to disseminate information resulting from the LSDPs promptly.

In addition to timely dissemination of results, the Green Books must provide important new information to the DDFA's customers. However, many of the technologies demonstrated in the LSDPs that will appear in Green Books are simply off-the-shelf commercial products; examples include the cutting and surface ablation techniques listed in Table 3. The committee found that the DDFA may be harming its own credibility by promoting these as innovative technologies, and it questions whether simply testing and providing information about commercial technologies is a suitable role for the DDFA.
Technology Assessment

With increasing demand for justification of OST programs, methods must be clearly defined for determining the potential advantages of an innovative technology over the baseline.20 Although the potential for direct cost savings in the D&D operation is of primary importance, there are other technical and institutional issues that can have a major impact on the decision process.

19   For example, the Kelly Steam/Vacuum decontamination system, the AEA High-Pressure Water Sponge Jet, and the Vector Technologies VecLoader for asbestos removal were demonstrated at Fernald in August 1996. Green Books for these technologies had not been published as of April 1998.

20   Hearings by the Bliley committee and GAO audits were noted in Chapter 1. Testifying before the Bliley committee on May 7, 1997, Dr. Edgar Berkey, Chairman of the Committee on Technology Development and Transfer of DOE's Environmental Management Advisory Board (EMAB), stated that one of his committee's findings was that better performance metrics for EM's technology program are likely to yield better results (Berkey, 1997).

A key feature of the LSDP was to be the direct, side-by-side comparison of baseline and new technology, as discussed in Chapter 1. During the site visits, committee members discussed the procedures for selecting baseline and innovative technologies for each LSDP with DDFA and site personnel (Appendixes C-F). These discussions indicated that the technology selection procedures were not consistent from site to site. In addition, the DDFA's baseline technology, against which its innovative technology was to be compared, was not necessarily the technology used by the site contractor for the actual D&D of the facility. In some cases the baseline technology was not demonstrated as part of the LSDP at all, but was a technology that had been used elsewhere. The committee found that actual side-by-side comparisons of technologies were achieved in a convincing manner in relatively few cases. The LSDPs therefore failed to provide the solid basis for technology comparison and assessment promised at the program's inception.

Cost Estimation

Since only technologies that have been proven safe and that meet regulatory criteria can be considered for implementation, the choice among technologies is based almost entirely on economics. During its review, the committee placed considerable emphasis on evaluating the DDFA's methods for estimating the cost savings that could result from implementing innovative technologies rather than the baseline technologies. Common costing and performance assessment methodologies are essential if comparison of baseline and innovative technologies is to be done.

The committee acknowledges that the DDFA has made a conscientious effort to provide accurate cost estimates of technologies tested in the LSDPs through its agreement with the U.S. Army Corps of Engineers (see Chapter 1). When the Corps first described its role to the committee in June 1997, the committee felt that the task assigned to the Corps was insufficient to provide reliable comparative cost estimates. Specifically, the Corps was not tasked with developing cost estimates of the baseline technology, but rather was to accept these data from DOE.
Subsequently, in the course of developing its comparative cost estimates, the Corps found it necessary to perform cost estimates of both the baseline and the innovative technology (Kessinger, 1997). The Corps therefore requested a change in its scope of work and will provide all cost estimates that are to be used in the DDFA's Green Books.

The committee, however, has reservations about the reliability of the comparative cost estimates and believes that future technology demonstrations in the LSDPs (as well as new deployments in the TDIs) can be planned and conducted to provide better data for cost estimation. The committee's reservations arise from three observations: first, many of the baseline technologies were not actually demonstrated;21 second, the scope of the baseline and the innovative technology tests was different in some cases; and third, cost and bidding procedures differ among the DOE sites (U.S. Army Corps of Engineers, 1997).

21   This is acknowledged in four of the five Green Books that were published in February 1998, which state that the baseline costs are not based on observed data.

In the 1995 CEMT report, the committee recommended that DOE base its estimates of cleanup costs on a standard methodology and explicit criteria that would prevail across all DOE programs, sites, and projects (NRC, 1996). Such a common basis is essential for valid economic comparison of alternative technologies.

Other Considerations

In addition to comparative cost, there are other factors that affect the selection of a new technology, and these factors can have an overriding impact on the choice of the appropriate technology to accomplish a specific D&D task. The more important factors include technological risk, regulatory acceptance, conflicts with previously established interagency and state agreements on environmental impact, likely end-user acceptance, waste management implications, schedule impact, and D&D worker/public safety. New technologies may be at a disadvantage compared to the baseline simply because their use could require altering contractual requirements. Secondary waste produced by a new technology may lead to difficulties, since it may well be different from that of the baseline technology and could require a new risk assessment and permitting process. The committee found that the LSDP fails to address these important institutional barriers to the deployment of new technologies.

Technology Implementation

The introduction of new methods and processes understandably brings some apprehension compared to the use of older, familiar approaches, especially at the larger DOE sites that have well-established procedures and customs for their operation. There are also the realities of schedules, regulations, and agreements with stakeholders that constrain site operators. The committee, as well as others, recognizes the reality of the "not invented here" syndrome.
The LSDP has fallen behind its original, ambitious schedule and in general has not met with the success initially expected. This lack of success is a clear signal that voluntary acceptance of new technologies by those responsible for implementing the actual cleanup will not occur until the user site organizations are convinced that there are definite advantages in doing so. Deployment strategies so far generally have failed to demonstrate the advantages of using the new technologies.

The case in point is the LSDP, which was intended to introduce and demonstrate innovative D&D technologies at full scale alongside existing baseline commercial technologies. Unfortunately, the concept of demonstrating new technologies came to individual cleanup programs late in their schedules and work programs. D&D work at Fernald and at CP-5 had begun well before the idea of testing new technologies at these locations was put forth. As a consequence, the D&D plans previously established for these existing projects had to be changed, schedules revised, and budgets reallocated. In many cases the baseline technology specified for the LSDP differed from that in use by the contractor and was never itself demonstrated; in these cases, valid comparison of the baseline and innovative technologies was not possible. In addition, cancellation by field offices of the two LSDPs scheduled to begin in 1997 was seen as a clear and negative signal by the committee.

To encourage site managers to accept new technologies, OST established the TDI in 1997 to provide financial incentives to potential users of its selected technologies. By the end of its review period, the committee had not been made aware of any documented technology deployment, nor could the committee find evidence to suggest that limited financial assistance will achieve the acceptance that eluded the LSDP. Although the committee felt that it was premature to evaluate the TDI (now called the Accelerated Site Technology Deployment Program) formally, the committee expresses reservations about this approach and notes that similar concerns have been raised by other review groups (DOE, 1997d; GAO, 1997).