Executive Summary

The Office of Science and Technology (OST) of the U.S. Department of Energy's (DOE's) Environmental Management (EM) Program promotes the development of new and improved technologies to lower cleanup costs and risks and to improve cleanup capabilities throughout the nation's nuclear weapons complex. The annual budget for technology development activities within OST in fiscal year 1998 is approximately $220 million, which supports more than 200 research and development (R&D) projects at universities, national laboratories, and private-sector companies. These projects are chosen for new and continued funding through a complex technology selection process, which uses the results from various types of reviews, including programmatic reviews, technical assessment reviews, and peer reviews.

Several National Research Council (NRC) committees evaluated DOE-OST's technology selection process and recommended that OST develop and apply an independent, external review process to all of its technology development programs (NRC, 1995b,c, 1996). These findings were echoed in a subsequent General Accounting Office (GAO) report, which concluded that "although the lead sites used significantly different systems to select projects, none of them used disinterested reviewers to determine the technical merit of the proposed work" (GAO, 1996, p. 7). In response to these NRC and GAO reports, in October 1996, OST instituted a peer review program to assess the scientific merit of its technology projects. According to OST, the peer review program "is designed to provide unimpeachable technical reviews on a timely basis to assist in decision making" (DOE, 1998b, p. 1). In establishing this peer review program, OST chose to use the American Society of Mechanical Engineers (ASME), with administrative and technical support provided by the Institute for Regulatory Science (RSI), to conduct peer reviews of technologies (or groups of technologies) at various stages of development.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.





OST asked the NRC to convene a committee to evaluate the effectiveness of its new peer review program and to make specific recommendations to improve the program, if appropriate. This is the second of two reports prepared by this committee. In its interim report, published in October 1997 (NRC, 1997b), the committee conducted a preliminary assessment of OST's new peer review program. In this report, the committee provides a more complete assessment of the program after its first complete annual cycle.

Definition of Peer Review

Peer review is used throughout the scientific and engineering communities to evaluate the technical merit of research proposals, projects, and programs. In this report the committee has used the following definition of peer review from the U.S. Nuclear Regulatory Commission (USNRC), which articulates the scientific and engineering communities' sense of the term:

A peer review is a documented, critical review performed by peers [defined in the USNRC report as "a person having technical expertise in the subject matter to be reviewed (or a subset of the subject matter to be reviewed) to a degree at least equivalent to that needed for the original work"] who are independent of the work being reviewed. The peer's independence from the work being reviewed means that the peer (a) was not involved as a participant, supervisor, technical reviewer, or advisor in the work being reviewed, and (b) to the extent practical, has sufficient freedom from funding considerations to assure the work is impartially reviewed. (USNRC, 1988, p. 2)

Peer review so defined has the following characteristics: it is expert (including national and international perspectives on the issue), independent, external, and technical.

Benefits of Peer Review

Peer review is recognized as an effective tool that R&D program managers can use to obtain high-quality technical input to decisions on allocating their resources (NRC, 1995a; Committee on Economic Development, 1998).

This is especially important in situations of constrained funding, where program managers are required to make decisions on the relative merit of projects within their program's R&D portfolio. If its results are used as a significant input into programmatic decision making, peer review can improve both the technical quality of projects in an R&D program and the credibility of the decision-making process. In the case of OST, such improvements may increase the likelihood that the program will produce technologies that prove effective in cleaning up contaminated sites throughout the nation's nuclear weapons complex. The 1995 report by the NRC's Committee on Environmental Management Technologies recommended the development and implementation of such a peer review program for OST's technology development program for just this reason (NRC, 1995b).

Improving Technical Quality

The independence of peer reviewers makes them more effective than internal reviewers because experts who are newly exposed to a project often can recognize technical strengths and weaknesses, and can suggest ways to improve the project, that may have been overlooked by those close to it (Bozeman, 1993). Peer review can improve the technical quality of projects in an R&D program in two ways: (1) by identifying projects that lack technical merit (or are technically inferior to other feasible alternatives) so that they can be discontinued early in the R&D cycle, before large investments of funds are made; and (2) by identifying specific ways to improve proposed or ongoing projects. As a result, a greater number of alternative projects can be supported in the early stages of the development cycle, thus increasing options and the chances of ultimate success in meeting the program's objectives.

Improving the Credibility of the Decision-Making Process

When peer review results are used to improve the quality of a decision process (e.g., selection of proposals, prioritization of projects for funding), they also enhance the credibility of the decisions. External experts often can be more open, frank, and challenging to the status quo than internal reviewers, who may feel constrained by organizational concerns. Evaluation by external reviewers thus can enhance the credibility of the review process by avoiding both the reality and the appearance of conflict of interest (Kostoff, 1997a). In addition, peer reviews that are conducted publicly, using known reviewers and following an

established process that provides immediate feedback in the reviewers' own words, can enhance credibility by increasing confidence in the review process (NRC, 1997a; Royal Society, 1995). For all of these reasons, the use of peer review increases the likelihood that decisions are consistent with the best available scientific and technical information. Of course, peer reviews in and of themselves cannot ensure the success of a project or program. Effective peer review, however, can increase the probability of project and program success. Realization of the benefits of peer review requires that the process of peer review be effective and credible and that its results be used as important input in making decisions regarding future support for the reviewed project (Chubin and Hackett, 1990).

Peer Review Process

The peer review process can be broken down into five general steps. For a peer review process to be credible and effective as a whole, each of these steps should be performed following well-defined procedures that are understood and accepted by those involved with the process.

Selection of proposals, projects, or programs to be reviewed. In cases where funding limitations or other factors do not allow all projects (or proposals) to be reviewed regularly, peer review program managers must have a systematic and credible approach for selecting which projects are peer reviewed. An effective selection process employs well-defined criteria to prioritize those activities to be peer reviewed.

Definition of objectives of the peer review and selection of specific review criteria. The goals, or objectives, of the peer review also must be spelled out clearly so that they are understood by those involved in the process (Chubin, 1994; Chubin and Hackett, 1990; Kostoff, 1997b). For peer reviews of projects, the objectives and utility of peer review vary with the stage of the technology development, adoption, and implementation processes. Although peer review is especially useful at the outset of a project, it can play an important role even at later stages of development and in the implementation phase, where the objectives of the peer review might be to enable late-stage refinements of the technology or to validate expectations of performance. The specific review criteria should be defined prior to the selection of peer reviewers to ensure that reviewers as a whole have the appropriate expertise. Because peer reviews are by definition technical in nature, both the objectives of the review and the review criteria should focus on technical considerations.

Selection of the peer review panel. The process for selecting reviewers must consider the fundamental characteristics of peer review and the specific objectives and criteria for the particular review being organized, and should be conducted by a person or group independent of the group being reviewed[1] (Cozzens, 1987; Koning, 1990). Peer reviewers should be selected in accordance with formally established qualification criteria that include, at a minimum, the following: relevant demonstrated experience, peer recognition, knowledge of the state of the art of the subject matter under review, absence of a real or perceived conflict of interest, and bias[2] such that the panel as a whole is balanced.

Preparing and conducting the peer review. For the peer review to be objective and effective, reviewers should receive written documentation that describes the proposed project and its significance, as well as a focused charge that describes the purpose of the peer review and the review criteria. These materials should be provided to the reviewers well in advance of the review. In cases where a review panel is convened, the panel should be provided with clear presentations by the project team, as well as adequate time to assess the project comprehensively, so that the panel is able to write a report that effectively summarizes and supports its conclusions and recommendations. Complete respect for the confidentiality of proprietary information during the review is vital. Confidentiality issues can be dealt with through panel selection (i.e., by avoiding reviewers with conflicts of interest) and by requiring panel members to formally agree not to use any such information without written permission from the author or proposer.

Use of peer review results in decision making. A peer review program will be effective only if its results are an important factor in making program decisions (Bozeman, 1993; Cozzens, 1987). Peer review reports that clearly provide the rationale for their conclusions and recommendations are an essential first step in achieving this objective. If a peer review has been planned for use in decision making, as recommended in this report, this use should be straightforward.

[1] Reviewers should not be selected by persons connected to the projects being reviewed (e.g., principal investigators, project managers). In cases where program managers are experts in the subject matter of the peer review and are not involved in the projects themselves (e.g., at the National Science Foundation or the National Institutes of Health), however, they can be involved in the reviewer selection process.

[2] "Bias" refers to an inclination of one's outlook or point of view due to the nature of one's background, experience, and connections.

Uses of Peer Review

Peer review can be employed for a variety of purposes. The following examples are illustrative, not prescriptive, and could be applied to many types of technology development programs in addition to OST's:

- proposal evaluations: peer reviews of proposed R&D projects;
- project maturity evaluations: peer reviews conducted as a project develops from a research idea to a technology that can be demonstrated, for example, at the following stages: entrance into applied research, entrance into engineering development, entrance into demonstration, and predeployment;
- program balance evaluations: peer reviews that assess whether the technology development program adequately addresses the technology needs, given the resources available in the context of other competing programs; and
- "needs" determinations: peer reviews of technology development needs; in OST's case, a review of R&D needs to address environmental problems at contaminated sites in the context of generally available technologies in the public and private sectors (both national and international).

These potential applications of peer review, along with examples of peer review programs in select organizations, are described more fully in Chapter 3.

Assessment of OST's Peer Review Program

The committee finds that OST has made significant improvements in its peer review process since the program began in October 1996. Throughout the committee's study, OST has continued to change its peer review procedures in an effort to improve the program's effectiveness. In particular, OST has revised its review criteria to focus on technical issues, has developed a more systematic approach for selecting projects to be reviewed, and has modified its list of required documentation for the peer reviews. OST also has made a number of policy changes since this committee issued its interim report in October 1997 (see Table 1 and the main body of this report for more details on these policy changes). Although in many cases it is too early to judge the actual results of

these changes, the committee is encouraged that OST senior management appears to be committed to this improvement process.

Linkage of Peer Reviews to Management Decisions

Despite the marked improvements in the procedures for conducting peer reviews over the past year, OST's peer review program still has not fully achieved its stated objective of providing high-quality technical input to assist in decision making. The committee has found that in many cases the results of the peer reviews have not been used as input into program management and decision making. Peer reviews have been conducted on projects after OST already had decided and committed to fund the project's next stage of development, or even after a project was virtually completed. In other cases, projects on which millions of dollars had already been spent to construct pilot plants were peer reviewed before the pilot plants had collected any data, that is, at a point in the project's development when no decision was to be made. In sum, peer review has in at least some instances not been used as an OST management tool, which is the stated objective of the peer review program. The committee made several recommendations to address this issue in its interim report, and OST has made a number of policy changes in response (see Table 1). Although it is still too early to evaluate the impact of these policy changes, it is encouraging that OST leadership has responded to the committee's interim report by taking such actions.

The linkage between peer review results and OST's decision-making process also could be improved by explicitly identifying, before the review is conducted, where and how the results of peer reviews will be used. Therefore, the committee recommends that, as part of the documentation provided to peer review program management during the process of selecting projects for review, OST program managers be required to clearly identify the upcoming decision or milestone for which the results of the peer review will be used. This information also should be provided to peer reviewers as part of the documentation that they receive in preparation for the review.

Selection of Projects for Review

OST has conducted peer reviews on only a small percentage of the projects currently funded within its program. As of May 1, 1998, 43 of 226 active projects had been peer reviewed. As a result, OST developed three

specific criteria to select candidate technologies for the peer review program in planning for fiscal year 1998 (FY98) peer reviews: (1) projects that were at Gate 4 or higher, (2) projects that had been supported for more than three years without being peer reviewed, or (3) projects that were new starts in FY97 or FY98. After applying these criteria, OST's peer review program staff ranked the projects in each focus area/crosscutting area (FA/CC)[3] based on the amount of funding received by each project. These ranked lists were then used by the FA/CC program managers to determine which projects were to be peer reviewed in FY98.

OST's approach addresses two primary issues identified by the committee in its interim report: the need to focus short-term peer review efforts on high-budget, late-stage projects that have never been peer reviewed, and the need to review all proposals for new technologies as they enter the project development cycle. Although OST's three selection criteria are reasonable and should help OST choose projects to be reviewed, they do not explicitly address two issues that have been emphasized recently by OST management: (a) the deployment of new technologies in the field, and (b) the need to reduce funding levels due to budget cuts. To address these issues, the committee recommends that OST adopt two additional criteria to choose from among those projects that satisfy one of the three existing selection criteria: (1) technologies that are being considered for deployment, and (2) technologies for which a request for further funding has been received or is anticipated.

Although the two additional selection criteria recommended by the committee would assist OST in identifying those projects for which peer review is of highest priority, application of these criteria would still leave a large number of projects that are not peer reviewed. To address this issue, the committee recommends that OST expand its practice of evaluating a number of related technologies in a single peer review whenever possible. This issue is addressed more fully in Chapter 6 (see also "Reducing the Backlog," below).

[3] OST's FA/CCs are administrative units used by OST to manage and coordinate its technology development activities. The four focus areas (based on OST's major problems) are (1) mixed waste characterization, treatment, and disposal; (2) radioactive tank waste; (3) subsurface contamination; and (4) decontamination and decommissioning. The five crosscutting and supporting technology areas (technologies that apply to multiple focus areas) are (1) robotics; (2) efficient separations; (3) characterization, sensors, and monitors; (4) industry and university programs; and (5) technology integration.

Review Criteria

A successful peer review program requires well-defined criteria for evaluating the project or program being reviewed. In response to this committee's interim report, in February 1998 OST issued a list of four "core technical criteria" for the peer review program: (1) relevance, (2) scientific and technical validity, (3) nonduplicative or superior to alternatives, and (4) data validity (see Chapter 4 for details of the criteria). The new policy states that these core technical criteria must be augmented and particularized by technology-specific criteria, to be developed by FA/CC program managers and approved by the ASME Peer Review Committee (DOE, 1998b). The committee finds that these revised general criteria and the procedure for developing technology-specific criteria are a meaningful improvement over the original review criteria because they allow OST's peer review program staff to focus the reviews on important technical issues. The procedure also has sufficient flexibility to allow the review criteria to vary as a function of the stage of development of a technology (see Chapter 5).

Selection of Reviewers

The selection of reviewers is one of the most important steps in the peer review process. In assessing an individual's qualifications for participation as a peer reviewer, all relevant career experience, published papers, patents, and participation in professional activities should be considered. It is also important to consider the individual's experience with peer review itself, especially in selecting the chair of a peer review panel. The group of peer reviewers should be balanced by including individuals with an appropriate range of knowledge and experience to provide a credible and effective peer review of the technologies being judged. This is particularly important for the diverse and complex technologies being developed for environmental cleanup of the DOE complex. Although the range of expertise on the peer review panels observed by this committee has been acceptable, the committee believes that the current databases used to identify potential reviewers (see Chapter 4, "Selection of Reviewers") may not have adequate scope to identify the broad range of reviewers likely to be necessary for some complex reviews. The committee therefore recommends that OST establish a more systematic approach for accessing reviewer information from other databases (e.g., of chemical engineers, geologists, physicists, materials scientists, and biologists) or from other professional

societies, as needed to ensure the appropriate range of expertise for all review panels.

In the ideal situation, peer reviewers, being fully independent and external, should have no conflicts of interest. That is, they should have no current or previous relationships with the principal investigators, their organization, their proposed project, or competing projects or technologies that would impair their ability to provide an objective review. However, for various real-world reasons (for example, because contractors have many divisions and technical professionals change jobs), there can be at least appearances of conflicts. OST recently revised its conflict-of-interest policy to explicitly exclude "all DOE staff and contractors with real or potential conflicts" from consideration as peer reviewers (DOE, 1998, p. 7); in practice, OST has interpreted this policy to exclude all DOE staff, whether or not there is a real or potential conflict of interest with the projects under review. This interpretation goes beyond the committee's recommendation in its interim report (see Table 1) but is consistent with the ASME conflict-of-interest policy. Concerns over conflicts of interest should not necessarily preclude all DOE staff and contractors from serving on a peer review panel, however. The committee believes that DOE staff from organizations outside OST and its contractors (e.g., staff at DOE national laboratories) could be used in special circumstances, when the appropriate expertise is not available outside DOE and when these individuals have had no connection with the projects under review. In general, however, the reviewer selection process should avoid DOE staff as peer reviewers and should ensure that DOE-affiliated persons never constitute more than a small fraction of a panel's membership.

Documentation for Peer Reviews

The documentation required for an OST peer review is listed in Chapter 4. This list identifies some of the documents required to evaluate the technical merit of a project and, if implemented, should improve the quality of the background materials provided to peer review panels. One document that is not included in this revised list of required materials is a statement of work, or proposal, describing the specific activities that will be carried out if the project is funded. The committee recommends that a detailed proposal or statement of work be required for all peer reviews.

Openness of Peer Reviews

The committee also encourages OST to continue to promote openness of its peer reviews and to fully inform the public and others attending the reviews of their nature. The committee believes that the strengths of open reviews (e.g., enhanced credibility of the process and the potential for more constructive evaluations) far outweigh the potential weaknesses (e.g., possible lack of candor by some reviewers when evaluating weak proposals), especially for the peer review of projects or programs (see "Anonymous Versus Open Peer Reviews," Chapter 2).

Reducing the "Backlog"

As noted previously, OST had peer reviewed only 43 of the 226 projects that were receiving funding from the program as of May 1, 1998. Thus, there is a large backlog of projects that have never been peer reviewed. OST's current practice, in which nearly all peer reviews include formal presentations by the project team, followed by deliberations by the panel and further question-and-answer sessions over the course of two to three days, places significant limits on the number of projects that can be peer reviewed. Even if the number of projects peer reviewed during a single review could be increased by improved efficiency, OST's backlog of technologies that have never been peer reviewed still would take years to address through the current process. If OST is to fulfill its policy that "all projects are to be peer reviewed" in the short term (i.e., the next year), it will have to make significant changes in how peer reviews are conducted.

The committee recommends that OST consider adopting a "triage" approach that would allow far greater numbers of technologies to be peer reviewed. This approach would involve a formal prescreening of projects by peer reviewers based exclusively on the written documentation on the projects (in effect, a "mail review"), followed by a formal meeting of the panel to discuss and rank them. During the prescreening review, panel members would be asked to rank all related technologies in a given area that are being considered for additional development or deployment.[4] Rankings from the panel as a whole could then be used by OST program managers to determine those highly ranked, low-budget projects that should be considered for funding without additional peer review; those highly ranked projects that should receive a more detailed evaluation (including presentations by the project team and question-and-answer sessions); and those technically weak projects that should not be considered for funding. This approach would provide OST program managers with a basis for discontinuing funding for technically weak projects, and might provide them with sufficient technical input (to be supplemented by input on nontechnical factors) to make a decision to fund a low-budget project. The prescreening evaluations should not be used as the sole means of providing technical input into decisions to fund high-budget environmental remediation projects, however. Peer reviews involving presentations by the project team and question-and-answer sessions should be carried out for all projects involving significant capital investment by OST.

Because prescreening evaluations require only written documentation on the projects to be reviewed, the triage approach could be used to evaluate a large number of projects at a single review. For example, a single prescreening review could evaluate all projects developed to address a specific type of environmental management problem, or an entire OST FA/CC program. The approach also could include related projects from OST's inventory of nearly 600 projects that are not currently funded. This might allow OST to identify especially promising technologies within its inventory that should be funded for demonstration or deployment.

[4] If the prescreening review involves technologies at very different stages of development, it might be necessary to use somewhat different review criteria for each general stage of development. The same reviewers could perform all of the reviews, however.

Evaluation and Improvement Mechanisms

In order to continue to develop and achieve a more effective peer review program, OST leadership will have to commit to a process of continuous assessment and improvement involving cycles of planning, execution, and evaluation.
One approach for guiding the development of an internal evaluation procedure for the peer review program would be for OST peer review program managers to proactively seek out and learn from other organizations that have more mature peer review processes. This process of learning from the practices of other organizations is called "benchmarking." Peer review programs in some organizations that could be used by OST in such a benchmarking process are described in Chapter 3. In addition, metrics can be used to assist in the measurement of effectiveness and can help evaluate the success of a program in realizing its objectives. The committee recommends that OST management develop an effective evaluation and improvement process for the peer review program that includes regular benchmarking against other peer review

programs and the collection of activity and performance metrics (benchmarking and metrics are discussed more fully in Chapter 7).

Potential Applications of Peer Review Within OST

Because OST has chosen to focus its new peer review program on the review of individual projects at various stages of development, the committee also has focused its reports on the peer review of projects. However, the fundamental principles of peer review (i.e., independent, external, expert, technical) also can be applied to programs or subsets of programs rather than individual projects. One potential application of such a peer review would be to evaluate R&D efforts that are needed to address the environmental problems at contaminated sites in the context of technologies available internationally in both the private and the public sectors. Another potential application would be to assess the technical balance of OST programs in the context of other programs, both within DOE (and OST in particular) and outside the DOE complex. Chapter 3 includes an overview of special characteristics of these types of reviews, including the types of reviewers and review criteria required for credible peer reviews. Although the committee discusses these issues at some length in this report, additional peer reviews should not be added to the large number of reviews currently used within OST without first evaluating the objectives and effectiveness of existing reviews. The committee recommends that OST carefully evaluate the objectives and roles of all of its existing reviews, and then determine the most effective use of peer reviews (of various types) in meeting its overall objectives.

OST's "Organizational" Culture and Leadership

Improving the linkage of peer review results to OST's decision-making process will require more than the procedural changes recommended in this report.
Realization of the potential benefits of peer review to the technology development program also will require a "culture change" within OST, whereby staff at all levels understand the potential benefits of peer review and incorporate peer review as an essential part of the decision-making process. Individual members of the OST organization will value peer review when they see beneficial results (e.g., which might be disseminated by using case histories), when management gives them logical messages of the value of peer review, and/or when they have incentives to use it or disincentives not to use it (Kostoff,

1997b). The committee recommends that OST leadership develop an explicit strategy to accomplish a change in its organizational culture by distributing (1) educational materials that summarize the basic principles and benefits of peer review as a tool for decision making, (2) case histories illustrating how peer review input served to improve specific projects, and (3) summaries of key performance metrics that demonstrate how peer reviews are used to meet the overall objectives of OST's program.

TABLE 1 Recommendations from Interim Report, OST's Response, and New Findings and Recommendations

Overall Assessment

Interim Report Recommendation(s): OST has made progress in its implementation of the peer review program. To fully achieve the objectives of the program, however, OST must continue to address a number of key issues that hinder the program's successful implementation.

OST Response(s): OST has revised its review criteria to focus on technical issues, has developed a more systematic approach for selecting projects to be reviewed, and has modified its list of required documentation for peer reviews (see details below). OST also has made a number of policy changes (see details below).

New Finding(s) and Recommendation(s): Although in many cases it is too early to judge the actual results of these changes, the committee is encouraged that OST senior management appears to be committed to this improvement process. OST has made significant improvements in its peer review process since the program began in October 1996. Despite the marked improvements in the procedures for conducting peer reviews over the past year, however, OST's peer review program still has not fully achieved its stated objective of providing high-quality technical input to assist in decision making.

Linkage to Decision Making

Interim Report Recommendation(s): The committee . . . encourages OST to enforce the 30-day requirement for written responses, and to require more detailed responses that fully describe how the recommendations of the peer review reports were implemented or considered.

OST Response(s): The EM-53 Office Director now coordinates with program managers regarding their written responses to peer review reports. Detailed responses to the review panel's recommendations are now required.

New Finding(s) and Recommendation(s): The committee recommends that, as part of the documentation provided to peer review program management during the process of selecting projects for review, OST program managers be required to clearly identify the upcoming decision or milestone for which the results of the peer review will be used.

Application of Peer Review

Interim Report Recommendation(s): OST should restrict the term "peer review" to only those technical reviews conducted by independent, external experts. OST should adopt alternative terms, such as "technical review," for its internal reviews of scientific merit and pertinency.

OST Response(s): OST's Implementation Guidance states that OST is restricting the term peer review to only those technical reviews conducted by independent, external experts.

New Finding(s) and Recommendation(s): Although the committee discusses a number of potential applications of peer review, additional peer reviews should not be added to the large number of reviews currently used within OST without first evaluating the objectives and effectiveness of existing reviews. The committee recommends that OST carefully evaluate the objectives and roles of all of its existing reviews, and then determine the most effective use of peer reviews (of various types) in meeting its overall objectives.

Selection of Projects to Be Reviewed

Interim Report Recommendation(s): OST should develop a rigorous process for selecting projects to be peer reviewed. To be fair and credible, this process should employ well-defined project selection criteria, and OST peer review program staff should be directly involved in making decisions regarding which projects will be reviewed.

OST Response(s): The Peer Review Coordinator is to recommend a rigorous process for selecting projects to be peer reviewed. In FY98, OST developed three specific criteria to select candidate projects for peer review: (1) projects that were at Gate 4 or higher, (2) projects that had been funded for more than three years, or (3) projects that were new starts in FY97 or FY98. Program managers then prioritized projects for peer review from among the list of candidate projects.

New Finding(s) and Recommendation(s): The three selection criteria are reasonable and should help OST choose projects to be reviewed; however, they do not explicitly address OST's goal of deploying new technologies or the need to reduce funding levels due to budget cuts. The committee recommends that OST apply two additional criteria to choose from among those projects that satisfy one of the three existing selection criteria: (1) technologies that are being considered for deployment, and (2) technologies for which a request for further funding has been received or is anticipated. The committee also recommends that OST expand its practice of evaluating a number of related technologies in a single peer review, whenever possible.

Review Criteria

Interim Report Recommendation(s): The committee . . . recommends that OST revise . . . general nontechnical criteria to focus on technical aspects of these issues, or to remove them from the list of review criteria.

OST Response(s): OST generated a new list of general review criteria that do not include nontechnical issues. Core criteria must be supplemented by technology-specific criteria developed by the FA/CC program manager requesting the review.

New Finding(s) and Recommendation(s): The committee finds that OST's revised general criteria and the procedure for developing technology-specific criteria are a meaningful improvement over the original review criteria because they allow OST peer review program staff to focus the reviews on important technical issues. The procedure also has sufficient flexibility to allow the review criteria to vary as a function of the stage of development of a technology. The committee recommends that OST develop a well-defined general set of technical criteria for peer reviews, to be augmented by technology-specific criteria as needed for particular reviews.

Objectives of Reviews

Interim Report Recommendation(s): The committee encourages OST to use the statements of desired "scope" and "purpose" that are developed by the FA/CC program managers to communicate the objectives of the review to all involved in the peer review. The committee also recommends that the objectives of the review be clearly stated in the review report.

OST Response(s): OST's new policy states that the objectives of the reviews (or "Charter of the Review Panel") are to be included in every peer review report, although none of the reports issued through April 1998 included such a description.

New Finding(s) and Recommendation(s): The committee concludes that OST has developed an appropriate policy but has not yet implemented it, and therefore encourages OST to follow up on this new requirement.

Selection of Reviewers

Interim Report Recommendation(s): OST should consider modifying reviewer selection criteria to emphasize expertise relevant to the review. In addition, the size of the review panel should depend on a number of factors, including the complexity and number of projects being reviewed and the specific review criteria established for the peer review. One approach that could be used to help identify reviewers with relevant expertise would be to develop and use a large database of potential reviewers.

OST Response(s): RSI uses several sources of names of potential reviewers, including a database of past reviewers, reviewer nominees, an RSI database of contacts, and the ASME membership list.

New Finding(s) and Recommendation(s): The committee recommends that OST establish a more systematic approach for accessing reviewer information from other databases (e.g., of chemical engineers, geologists, physicists, materials scientists, and biologists) and other professional societies, as needed to ensure the appropriate range of expertise for all review panels.

Conflict of Interest

Interim Report Recommendation(s): In order to ensure the independence of the peer reviewers . . . the committee recommends that OST also include a criterion that explicitly excludes EM staff and contractors with real or potential conflicts of interest, including all OST staff and contractors, from consideration as peer reviewers.

OST Response(s): All DOE staff and contractors with real or potential conflicts of interest are explicitly excluded from consideration as reviewers.

New Finding(s) and Recommendation(s): In practice, OST has interpreted this policy to exclude all DOE staff, whether or not there is a real or potential conflict of interest. This restrictive standard goes beyond the committee's recommendation but is consistent with the ASME Peer Review Committee's interpretation of ASME's conflict-of-interest policy. Concerns over conflicts of interest should not necessarily preclude all DOE staff and contractors from serving on a peer review panel, however. The committee believes that DOE staff from organizations outside OST and its contractors (e.g., staff at DOE national laboratories) could be used in special circumstances when the appropriate expertise is not available outside DOE and when these individuals have had no connection with the projects under review.

Documentation for Peer Reviews

Interim Report Recommendation(s): The committee encourages OST to refine its list of required materials to include only technical documents relevant to the peer review criteria established for the peer review.

OST Response(s): OST has modified the required documentation to fit with the new general review criteria and has removed nontechnical documents from the list.

New Finding(s) and Recommendation(s): One document that is not included in the revised list of required materials is a statement of work, or proposal, describing the specific activities that will be carried out if the project is funded. The committee recommends that a detailed proposal or statement of work be required for all peer reviews.

Peer Review Reports

Interim Report Recommendation(s): The committee believes that the peer review reports could be improved further . . . by also including a statement of the objective of the review and a list of references used in the analysis.

OST Response(s): OST policy states that the format of the review report will be changed to include these items, although the committee has not reviewed a technical report prepared completely in the new format.

New Finding(s) and Recommendation(s): The committee concludes that OST has developed an appropriate policy but has not yet implemented it, and therefore encourages OST to follow up on this new requirement.

Reducing the Backlog

Interim Report Recommendation(s): Because the majority of projects are in the late stages of development (past Gate 4), OST should concentrate reviews on high-budget, late-stage projects that have never been reviewed, but that still face upcoming decisions with major programmatic and/or funding issues.

OST Response(s): To develop its FY98 schedule from a list of 182 projects that met one of its three new selection criteria, OST sorted the projects by amount of funding and then provided lists to FA/CC managers, who were asked to prioritize the lists by identifying those projects for which a peer review would be most valuable for decision making.

New Finding(s) and Recommendation(s): The committee recommends that OST consider adopting a triage approach that would allow far greater numbers of technologies to be peer reviewed.

Management of Peer Review Program

Interim Report Recommendation(s): The committee recommends that OST develop a targeted plan for the peer review program. Such a plan should consider factors such as how many of OST's technology projects can be peer reviewed, realistic schedules for the reviews, and the peer review program budget. To be effective, this plan also must ensure that peer reviews are conducted early enough in the budget cycle to allow peer review results to be used as an input into meaningful funding decisions.

OST Response(s): The Peer Review Coordinator has been tasked with developing a targeted plan for the peer review program and has developed a draft schedule for FY98 peer reviews. The EM-53 Office Director now coordinates with the program managers regarding their written responses to peer review reports. Detailed responses to the review panel's recommendations are now required.

New Finding(s) and Recommendation(s): The committee recommends that OST management develop an effective evaluation and improvement process for the peer review program that includes regular benchmarking against other peer review programs and the collection of activity and performance metrics.

Organizational Culture and Leadership

Interim Report Recommendation(s): The peer review program will be meaningful only if OST acknowledges the benefits of effective peer review and makes peer review a vital part of the decision and management process throughout the organization. Although attaining these benefits will require a sustained effort from management, the entire organization will be rewarded through enhancement of the technology development program.

OST Response(s): OST has improved its documentation of the peer review program, including its "Implementation Guidance for the Office of Science and Technology Technical Peer Review Process" document and "The Office of Science and Technology Peer Review Process: At a Glance" flyer.

New Finding(s) and Recommendation(s): The committee recommends that OST leadership develop an explicit strategy to accomplish a change in its organizational culture by distributing (1) educational materials that summarize the basic principles and benefits of peer review as a tool for decision making, (2) case histories illustrating how peer review input served to improve specific projects, and (3) summaries of key performance metrics that demonstrate how peer reviews are used to meet the overall objectives of OST's program.