Building an Effective Environmental Management Science Program: Final Assessment

4
PROPOSAL SELECTION AND FUNDING

In its Initial Assessment Report, the committee devoted considerable attention to the Department's process for proposal solicitation and selection. The committee's comments in that report focused on the FY96 solicitation and proposal review process, which was well under way when the committee began its work.1 The committee's Letter Report focused primarily on the content and structure of the FY97 program announcement, but the committee also offered suggestions on the FY97 review process.

The purpose of this chapter is to summarize and extend the comments from these previous reports to address the committee's charge (Appendix A) to provide advice on the structure and operation of the program. The comments in this chapter address the following issues: the review process, program funding, and the role of stakeholders in the program. Additional comments on program management can be found in the next chapter.

1 As noted in Chapter 1, the FY96 program announcement was published in February 1996, and full proposals were due in May, during the early stages of the committee's study.

PROPOSAL REVIEW PROCESS

The Environmental Management Science Program (EMSP) employs a two-stage review process to evaluate proposals submitted to the program—a review of scientific and technical merit followed by a review to assess relevance to the cleanup mission. The merit reviews are performed by panels of scientists and engineers convened by Office of Energy Research (ER) staff, whereas the relevance reviews are performed by panels of Office of Environmental Management (EM) program managers who are familiar with the Department's cleanup problems.

This proposal review process has received considerable scrutiny from the committee in its previous reports. In general, the committee has been satisfied with the design of the review process—as noted, for example, in the following excerpt from page 6 of its Letter Report:

    The committee reaffirms its endorsement (from the Initial Assessment Report) of the two-phase review process used in the FY1996 competition that first evaluates the scientific and technical merit of the proposals and then examines more closely the relevance of the proposed work to the clean-up mission. The committee believes that this two-phase review process should continue in FY1997 and that it should continue to be managed as a partnership between ER and EM.

However, this satisfaction is based entirely on the results of the FY96 program competition, which may or may not be typical of future competitions. As noted in Chapter 1, the committee received extensive written documentation on successful proposals from the Department, including principal investigator (P.I.) and co-P.I. names and affiliations, biographical sketches of P.I.s, abstracts of funded projects, and amounts of other current DOE funding. The committee reviewed these data, and individual committee members paid particular attention to those projects within their areas of expertise. Based on this review, the committee reached the following conclusions about the FY96 proposal competition:

- Meritorious projects appear to have been selected in the FY96 proposal competition. This is a qualified judgment, however, because the committee was not able to examine the unsuccessful proposals to determine whether they were qualitatively different from funded proposals. The committee was unable to ascertain what criteria were used in the selection process and, as a consequence, whether these criteria were ones with which it would agree.
- Collaborative efforts were well represented among the successful projects. As shown in Table 4.1, about two-thirds of the projects supported in the FY96 competition involved collaborations.
- At least 33 of the 140 P.I.s supported in the FY96 competition currently have no other Department of Energy (DOE) funding, suggesting that the Department was successful in attracting some "new" researchers to the program.
- The committee was able to obtain firsthand information on the membership of one of the review panels and was able to confirm its overall quality.

The success of this joint review process can be attributed in large part to good communication and coordination between EM and ER staff. In the committee's opinion, a continuing partnership between EM and ER is essential to maintain the effectiveness of the review process.

The committee remains concerned about some elements of the review process, particularly the interaction of the merit and relevance review panels. Basic research, by its very nature, is not usually measured against the yardstick of "relevance." Thus, the relevance review, unless carefully managed, has the potential to compromise the outcome of the merit review process. This could happen if, for example, the relevance review panels were to select many proposals that ranked lower in the merit review instead of more highly ranked proposals. This would diminish the overall quality of the science in the EMSP, which could reduce the long-term value of the EMSP to the cleanup effort. It also would diminish the influence of merit review panelists on the final outcome of the competition and could discourage highly regarded scientists from serving on EMSP merit review panels.
The committee has two concerns about the transparency and technical credibility of the merit review process, concerns that were expressed in its Letter Report. First, as presently managed, the merit review process is "opaque" to those who submitted proposals to the program, to merit review panelists, and to the broader research community. The names of the merit review panelists are kept confidential by the Department, so there is no way for P.I.s to evaluate the intrinsic quality of the proposal reviews.2 Additionally, the merit review panelists were asked to provide individual scores on proposals, but they were not told how their scores were used by ER program managers to make award decisions.

2 The committee understands that the Department is not required to keep the names confidential, but it has been its practice to do so.

3 Federal Advisory Committee Act, Public Law 92-463.

TABLE 4.1 Investigator Collaborations in the FY96 Proposal Competition, Based on Data Received from the Office of Science and Technology, U.S. Department of Energy

Type of Collaboration                                                      Number   Percent
Partnerships involving a single university                                     27        20
Partnerships involving multiple universities                                    7         5
Partnerships involving a single national laboratory                            22        16
Partnerships involving multiple national laboratories                           3         3
Partnerships involving universities and national laboratories                  31        21
Partnerships involving universities and industry                                1        <1
Partnerships involving universities, national laboratories, and industry        1        <1
No partnerships (i.e., single-investigator awards)                             47        34
Information not available                                                       1        <1
Total                                                                          140       100

Second, the merit review panels are not constituted as FACA3 committees. Consequently, the merit review panelists are allowed to discuss and provide individual scores on each proposal, but the panels as a whole are not allowed to reach consensus on individual proposals or to
provide ER program managers with a ranking of proposals or to make comparative assessments of proposals. Such assessments become especially important when large numbers of proposals are being reviewed but only a small number can be supported—a problem that is likely to get worse in the next few years if funding for the program is not increased.4 Collectively, the panelists have much greater knowledge of the subjects of the proposals than individual program managers, and it makes good sense to take full advantage of this expertise in the review process. The current process allows ER program managers to operate fairly autonomously, with relatively little visibility in the research community for the decisions being made in the program.

In its Letter Report the committee recommended that ER constitute its merit review panels as FACA committees. In subsequent discussions with ER staff, the committee learned that DOE is prohibited by law from convening FACA committees that are closed to the public. FACA permits agencies to close meetings to the public if sensitive personal or other information is being discussed—as would be the case for proposal reviews. However, the Department's statutory legislation prohibits it from closing any committee meetings, including those constituted under FACA, except for purposes of protecting national security.5 A 1991 U.S. General Accounting Office (GAO) report,6 which also recommended that the Department convene its peer review committees under FACA, acknowledged these legal barriers but recommended that the Department seek a change in its legislation to make the use of such committees possible. In its response, which was included at the end of the GAO report, the Department agreed to seek such a change. To the committee's knowledge, however, no change was ever sought.

4 DOE received 810 full proposals in the FY96 competition. Based on individual scores from the merit review panelists, DOE program managers grouped these proposals into one of three categories: 77 proposals were rated as "must fund," 111 as "should fund," and 622 as "don't fund." A total of 140 awards were made, including 73 awards to "must fund" proposals and 67 awards to "should fund" proposals.

5 15 U.S.C. § 776(b) provides the applicable language.

6 U.S. General Accounting Office, 1991, Peer Review: Compliance with the Privacy Act and Federal Advisory Committee Act, GAO/GGD-91-48, 30 pp. (Washington, D.C.: GAO).
ER staff have asserted that the FACA process would impose a heavy paperwork burden on the Department. The committee does not doubt that FACA would entail some extra paperwork but notes that other federal agencies, such as the National Science Foundation (NSF) and the National Institutes of Health (NIH), routinely meet these paperwork requirements.

The committee recommends that the Department examine the entire review process for the EMSP with the goal of increasing its transparency and technical credibility. To this end, the committee recommends that the Department carry through on its stated intention (in its response to the 1991 GAO report) to seek a change in its legislation to allow FACA proposal review panels—and to convene the EMSP merit review panels under FACA once this change is made.

The committee also is concerned about the lack of timely feedback to proposers—both successful and unsuccessful—on the results of the merit and relevance reviews. In discussions with EM and ER staff at its open meetings, the committee learned that in the FY96 proposal competition panelist reviews were not sent to P.I.s unless requested, and these reviews did not always reflect the discussions in the panel meetings.7 Consequently, some of the reviews were of limited usefulness to P.I.s in understanding why their proposals were declined or how they could be improved. The committee recommends that in future competitions the proposal reviews be modified to reflect the discussions at the panel meetings and, further, that applicants receive feedback on the content and results of the reviews in a timely fashion.

PROGRAM FUNDING

The issue of program funding received considerable attention from the committee in its previous reports, as noted in Chapter 1.
The committee's Initial Assessment Report provided comments on the program's annual budget, the Department's initial allocation of funding for non-DOE (i.e., university and industry) and DOE (i.e., national laboratory) proposals, and full funding of successful proposals out of current-year funds. The committee recommended that awards in the 1996 program be fully funded up front to ensure that there would be a relatively constant number of new starts in succeeding years of the program.

7 These written reviews were prepared by the merit and relevance review panelists before the panel meetings, and they were not updated to reflect any changes that occurred as a result of the panel discussions before being sent to the P.I.s.

In its Letter Report the committee returned to the issue of full funding of proposals and also addressed the developing "mortgage" on future-year budgets. This mortgage developed because the Department was unable to fully fund awards to national laboratory investigators but instead had to commit funding from future-year budgets. In the Letter Report the committee presented a financial analysis for the EMSP based on the funding commitments from the FY96 competition. This analysis provided two scenarios for future funding of the EMSP to illustrate the committee's concerns about the future levels of funding for the program given current commitments on future-year program funds.

The steady-state funding scenario,8 which is shown in Table 4.2, was generated using the following set of assumptions:

- Funding of new awards for non-DOE performers (i.e., university, industry, and other nonprofit performers) is continued at the FY96 level of $43 million for three-year grants, and these awards are funded fully in the first year, as was the case in the FY96 proposal competition.
- The ratio of dollars committed each year to awards to non-DOE performers to the dollars committed each year to new awards to national laboratory performers remains constant at FY96 levels.
- Awards to national laboratory performers are paid in equal installments over three years.
- Total annual funding for the EMSP is allowed to increase as necessary to satisfy the foregoing assumptions.
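The arithmetic behind this scenario can be made concrete in a few lines of code. This is our own illustrative reconstruction, not part of the committee's analysis; the cohort amounts ($43 million per year for non-DOE performers, $23 million per national laboratory installment) and the FY96 national laboratory payout schedule are taken from Table 4.2, and the function name is our own.

```python
# Illustrative sketch of the steady-state scenario (Table 4.2).
# Assumptions: non-DOE awards of $43M per year are fully funded up
# front; each national laboratory cohort from FY97 on is paid in three
# equal $23M installments; the FY96 national laboratory cohort follows
# its actual payout of $4M, $23M, $23M, and $19M over FY96-FY99.

def yearly_totals(years):
    """Total program funds (millions of dollars) distributed each year."""
    totals = {y: 43 for y in years}          # non-DOE, fully funded up front
    for y, amount in {1996: 4, 1997: 23, 1998: 23, 1999: 19}.items():
        totals[y] += amount                  # FY96 national laboratory cohort
    for cohort in range(1997, max(years) + 1):
        for offset in range(3):              # three equal installments
            if cohort + offset in totals:
                totals[cohort + offset] += 23
    return totals

print(yearly_totals(range(1996, 2003)))
# {1996: 47, 1997: 89, 1998: 112, 1999: 131, 2000: 112, 2001: 112, 2002: 112}
```

The printed totals match the TOTAL row of Table 4.2: the program peaks at $131 million in FY99, when the unusually back-loaded FY96 cohort overlaps three later cohorts, and settles at $112 million per year thereafter.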
As shown in Table 4.2, to maintain funding for new starts at FY96 levels, the total annual funding for the program would almost triple, to $131 million in FY99, before declining to a steady-state value of $112 million in FY2000. This amount is roughly 225 percent of the current annual budget for the program.

8 Referred to as the unconstrained funding scenario in the committee's Letter Report.

The constrained funding scenario, which is shown in Table 4.3, was generated using the following set of assumptions:

- Total annual program funding is constrained to the FY96 level of $50 million.
- As in the steady-state funding scenario, the ratio of dollars committed each year to awards to non-DOE performers to the dollars committed to new awards to national laboratory performers remains essentially constant at FY96 levels.
- As in the steady-state funding scenario, awards to national laboratory performers are paid in equal installments over three years. The first installment is paid during the fiscal year in which the awards were made; the two remaining installments are paid in the two succeeding fiscal years. As shown in Table 4.3, for example, the $27 million awarded to national laboratories in FY97 would be paid in three equal installments of $9 million each in FY97, FY98, and FY99.

This scenario illustrates the full effects of the mortgage when national laboratory performers receive funding one year at a time and non-DOE performers receive all of their funding up front. As shown in Table 4.3, the mortgage from the FY96 award cycle creates a significant drain on program funds through FY99. In FY97, for example, only $27 million in new funds is available—$18 million to non-DOE performers and $9 million to DOE performers.9 Indeed, by FY99 only $10 million in new funds is available to non-DOE performers and $6 million in new funds is available to national laboratory performers, about a quarter of the funding available in FY96.
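The mortgage arithmetic in this scenario can likewise be sketched in code. This is our own illustrative reconstruction of Table 4.3, not the Department's accounting: we assume each year's uncommitted funds are split roughly 2:1 between fully funded non-DOE awards and the first national laboratory installment (mirroring the FY96 commitment ratio), with installments rounded up to whole millions; under those assumptions the sketch reproduces the table's values.

```python
import math

# Illustrative reconstruction of the constrained scenario (Table 4.3).
# Assumptions: total annual funding capped at $50M; the FY96 national
# laboratory cohort is paid out as $23M, $23M, and $19M in FY97-FY99;
# each later national laboratory cohort is paid in three equal annual
# installments; whatever is not already committed funds new awards.

CAP = 50  # millions of dollars

def constrained_scenario(first_year=1997, last_year=2002):
    """Return {year: (new non-DOE funds, first national lab installment)}."""
    fy96_mortgage = {1997: 23, 1998: 23, 1999: 19}
    installments = {}                 # year -> that cohort's annual installment
    results = {}
    for year in range(first_year, last_year + 1):
        committed = fy96_mortgage.get(year, 0)
        # second and third installments owed to the two preceding cohorts
        committed += installments.get(year - 1, 0)
        committed += installments.get(year - 2, 0)
        new_money = CAP - committed
        lab_installment = math.ceil(new_money / 3)   # one-third, rounded up
        non_doe = new_money - lab_installment        # remainder, fully funded
        installments[year] = lab_installment
        results[year] = (non_doe, lab_installment)
    return results

print(constrained_scenario())
# {1997: (18, 9), 1998: (12, 6), 1999: (10, 6),
#  2000: (25, 13), 2001: (20, 11), 2002: (17, 9)}
```

Each pair matches the first-year entries in Table 4.3, showing how the FY96 mortgage squeezes new awards to a low of $16 million total in FY99 before the pressure eases.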
Based on this analysis, the committee draws the following conclusions about funding for the EMSP: (1) the budget for the EMSP will have to increase significantly to maintain a reasonable number of new starts and competitive renewals with a reasonable distribution of funding between DOE and non-DOE performers; or (2) if the budget remains at current levels, both non-DOE and DOE performers could see about a 75 percent drop in funding for new and competitive renewal projects.

9 The FY97 program announcement was released just before this report entered review. It indicates that only $20 million in new funding is available, not the $27 million indicated in the calculation shown in Table 4.3.

TABLE 4.2 Hypothetical Funding for the EMSP When Annual Program Funding Is Allowed to Reach a Steady State

                                Program Funds Distributed During Fiscal Year (millions of dollars)
Fiscal Year of Award        1996    1997    1998    1999    2000    2001    2002
Non-DOE performers
  1996a                       43       0       0       0       0       0       0
  1997                                43       0       0       0       0       0
  1998                                        43       0       0       0       0
  1999                                                43       0       0       0
  2000                                                        43       0       0
  2001                                                                43       0
  2002                                                                        43
National laboratory performers
  1996a                        4      23      23      19       0       0       0
  1997                                23      23      23       0       0       0
  1998                                        23      23      23       0       0
  1999                                                23      23      23       0
  2000                                                        23      23      23
  2001                                                                23      23
  2002                                                                        23
TOTAL                         47      89     112     131     112     112     112

a Results from the FY96 proposal competition.

In discussions with the committee, EMSP staff have stated that DOE financial practices do not permit them to provide full funding for multiyear proposals from DOE performers. ER staff told the committee that the Director of the Office of Energy Research would like to change these practices and provide full funding for national laboratory proposals
in some of its programs but has so far been unable to do so. Indeed, ER staff indicated that they are finding it increasingly difficult to provide multiyear funding for university proposals, even in regular ER programs.

TABLE 4.3 Hypothetical Funding for the EMSP When Annual Program Funding Is Constrained to $50 Million

                                Program Funds Distributed During Fiscal Year (millions of dollars)
Fiscal Year of Award        1996    1997    1998    1999    2000    2001    2002
Non-DOE performers
  1996a                       43       0       0       0       0       0       0
  1997                                18       0       0       0       0       0
  1998                                        12       0       0       0       0
  1999                                                10       0       0       0
  2000                                                        25       0       0
  2001                                                                20       0
  2002                                                                        17
National laboratory performers
  1996a                        4      23      23      19       0       0       0
  1997                                 9       9       9       0       0       0
  1998                                         6       6       6       0       0
  1999                                                 6       6       6       0
  2000                                                        13      13      13
  2001                                                                11      11
  2002                                                                         9
TOTAL                         47      50      50      50      50      50      50

a Results from the FY96 proposal competition.

The committee believes that it is beyond its charge to evaluate the Department's current financial practices or to assess the likelihood that these practices can be changed in time to affect the FY97 proposal competition. Nevertheless, the committee continues to be very concerned about the full funding issue because of its potentially significant impacts on future project awards. Simply put, the program must be large enough to support a significant number of "new starts" (i.e., new projects or competitive renewals) each year if it is to be successful in attracting
innovative proposals from outstanding researchers who are not now doing research relevant to EM's problems. The committee believes that, without some assurance that funding will be available to support a reasonable number of new awards annually, the EMSP will simply not be viewed as "worth the effort" by potential proposers. Over time this situation is very likely to affect the quality of the program adversely and to diminish its potential benefit to the overall EM program.

The committee notes that DOE itself recognized that the EMSP should be a significantly larger program, on the order of $150 million (as expressed by Thomas Grumbly, then-Assistant Secretary for Environmental Management, in the document entitled Summary of Workshop to Initiate the Development of a Science Program to Support the Department of Energy's Office of Environmental Management10). The committee appreciates the difficult budget environment in which DOE now finds itself and recognizes that any increases in the budget for the EMSP may come at the expense of other Department programs. In the committee's view, however, this funding should not come from existing ER programs, which are vital to the Department's long-term mission and are an important part of the nation's basic research portfolio. Nevertheless, the EMSP cannot live up to its potential without careful consideration by DOE of both total funding levels and funding patterns (i.e., the balance between new and continuing awards). The committee urges DOE to find a solution to the problem of not being able to "forward fund" projects at national laboratories and reiterates its recommendation from previous reports to fully fund all awards in the first year.
10 This workshop was held at the Holiday Inn, Washington Dulles Airport, on July 21, 1995.

ROLE OF STAKEHOLDERS IN PROPOSAL REVIEW AND SELECTION

During the course of this study, the Department held workshops at three of its sites—Hanford, Savannah River, and Idaho—to inform
stakeholders11 about the EMSP and obtain feedback on the kinds of cleanup problems that would benefit from basic research. The workshops were attended by DOE staff, contractors, national laboratory and university researchers, members of citizens' advisory groups, and other interested members of the public. The committee did not participate formally in any of these workshops, but two members of the committee and one member of the staff attended two of the workshops as observers. They found the workshops useful for providing information to stakeholders about the EMSP and for generating some enthusiasm among the stakeholders for the program, but less useful for obtaining feedback on research needs.

These workshops were organized because EM staff recognize that stakeholders have legitimate fiscal and programmatic concerns about the EMSP. In particular, stakeholders have an interest in ensuring that the EMSP is using its financial resources effectively—resources that might otherwise be used for cleanup—and that the research sponsored by the program is addressing important problems at the sites. In the committee's opinion, Department staff have a responsibility to keep stakeholders informed about this program and to seek their input in defining the site problems for the EMSP science plan. The committee suggested a process in Chapter 3 for obtaining this input. At the same time, some stakeholder groups, particularly industry and government agencies, can assist with the transfer of research results into cleanup. The committee suggests a process for this transfer in Chapter 5.

The committee does not believe that stakeholders should be involved in the day-to-day management of the program, particularly the proposal review and selection process.
Proposal review and selection should be based primarily on expert judgments of the intrinsic merit of the proposed research, the feasibility of the technical approach, the competence of the principal investigators to undertake the proposed research, and the adequacy of the facilities for carrying out the proposed work. To be effective and credible, the review and selection process should be carried out by technical experts and should remain free of local concerns and special-interest pressures.

11 A stakeholder is defined by the Department as anyone with an interest in DOE activities or anyone who may be affected by DOE activities. This definition was taken from the EM Primer, which is posted on the Department's Web page.
Having said this, the committee also believes that participation of EMSP investigators in the proposal selection process would be very helpful in future years. As the program matures, these individuals can bring an important perspective that links the EMSP more closely to the broad research community and helps shape the longer-term character of the program.

DOE should also improve the ways in which it informs potential users of EMSP results (e.g., technology managers at the various sites) about the process and the outcome of EMSP proposal selection. In this way the problem holders will become more aware of the kinds of research and the quality of the people that the EMSP supports. The hoped-for result of such improved information flow is that these problem holders will become more attuned to the long-term benefits of the EMSP to their efforts.