
Evaluating Research Efficiency in the U.S. Environmental Protection Agency (2008)

Chapter: Appendix G: OMB's Research and Development Program Investment Criteria

Suggested Citation:"Appendix G: OMB's Research and Development Program Investment Criteria." National Research Council. 2008. Evaluating Research Efficiency in the U.S. Environmental Protection Agency. Washington, DC: The National Academies Press. doi: 10.17226/12150.


Appendix G

OMB’s Research and Development Program Investment Criteria [1,2]

[1] (OMB 2007).
[2] To assist agencies with significant research programs, additional instructions were added to the PART Guidance and titled the “Research and Development Program Investment Criteria.” The R&D Investment Criteria are found in Appendix C of the PART instructions. Unlike the main body of the PART instructions, which apply to all federal agencies and programs, the R&D Investment Criteria attempt to clarify OMB’s expectations specifically for R&D programs.

As another initiative of the President’s Management Agenda, the development of explicit R&D investment criteria builds on the best of the planning and assessment practices that R&D program managers use to plan and assess their programs. The Administration has worked with experts and stakeholders to build upon lessons learned from previous approaches. Agencies should use the criteria as broad guidelines that apply at all levels of Federally funded R&D efforts, and they should use the PART as the instrument to periodically evaluate compliance with the criteria at the program level. To make this possible, the R&D PART aligns with the R&D criteria. The R&D criteria are reprinted here as a guiding framework for addressing the R&D PART.

The R&D criteria address not only planning, management, and prospective assessment but also retrospective assessment. Retrospective review of whether investments were well directed, efficient, and productive is essential for validating program design and instilling confidence that future investments will be wisely made. Retrospective reviews should address continuing program relevance, quality, and successful performance to date.

While the criteria are intended to apply to all types of R&D, the Administration is aware that predicting and assessing the outcomes of basic research in particular is never easy. Serendipitous results are often the most interesting and ultimately may have the most value. Taking risks and working toward difficult-to-attain goals are important aspects of good research management, and innovation and breakthroughs are among the results. However, there is no inherent conflict between these facts and a call for clearer information about program goals and performance toward achieving those goals. The Administration expects agencies to focus on improving the management of their research programs and adopting effective practices, not on predicting the unpredictable.

The R&D investment criteria have several potential benefits:

• Use of the criteria allows policy makers to make decisions about programs based on information beyond anecdotes, prior-year funding levels, and lobbying by special interests.
• A dedicated effort to improve the process for budgeting, selecting, and managing R&D programs is helping to increase the return on taxpayer investment and the productivity of the Federal R&D portfolio.
• The R&D investment criteria will help communicate the Administration’s expectations for proper program management.
• The criteria and subsequent implementation guidance will also set standards for information to be provided in program plans and budget justifications.
• The processes and collected information promoted under the criteria will improve public understanding of the possible benefits and effectiveness of the Federal investment in R&D.

DETAILS ON THE CRITERIA

The Relevance, Quality, and Performance criteria apply to all R&D programs. Industry- or market-relevant applied R&D must meet additional criteria. Together, these criteria can be used to assess the need, relevance, appropriateness, quality, and performance of Federal R&D programs.

Relevance

R&D investments must have clear plans, must be relevant to national priorities, agency missions, relevant fields, and “customer” needs, and must justify their claim on taxpayer resources.
Programs that directly support Presidential priorities may receive special consideration if their relevance is adequately documented. Review committees should assess program objectives and goals for their relevance to national needs, “customer” needs, agency missions, and the field(s) of study the program strives to address. For example, the Joint DOE/NSF Nuclear Science Advisory Committee’s Long Range Plan and the Astronomy Decadal Surveys are the products of good planning processes because they articulate goals and priorities for research opportunities within and across their respective fields.

OMB will work with some programs to identify quantitative metrics to estimate and compare potential benefits across programs with similar goals. Such comparisons may be within an agency or among agencies.

Programs Must Have Complete Plans, With Clear Goals and Priorities

Programs must provide complete plans, which include explicit statements of:

• specific issues motivating the program;
• broad goals and more specific tasks meant to address the issues;
• priorities among goals and activities within the program;
• human and capital resources anticipated; and
• intended program outcomes, against which success may later be assessed.

Programs Must Articulate the Potential Public Benefits of the Program

Programs must identify potential benefits, including added benefits beyond those of any similar efforts that have been or are being funded by the government or others. R&D benefits may include technologies and methods that could provide new options in the future if the landscape of today’s needs and capabilities changes dramatically. Some programs and subprogram units may be required to quantitatively estimate expected benefits, including metrics that permit meaningful comparisons among programs that promise similar benefits. While all programs should try to articulate potential benefits, OMB and OSTP recognize the difficulty of predicting the outcomes of basic research. Consequently, agencies may be allowed to relax this requirement for basic research programs.

Programs Must Document Their Relevance to Specific Presidential Priorities to Receive Special Consideration

Many areas of research warrant some level of Federal funding. Nonetheless, the President has identified a few specific areas of research that are particularly important. To the extent that a proposed project can document how it directly addresses one of these areas, it may be given preferential treatment.
Program Relevance to the Needs of the Nation, of Fields of Science and Technology (S&T), and of Program “Customers” Must Be Assessed Through Prospective External Review

Programs must be assessed for their relevance to agency missions, fields of science or technology, or other “customer” needs. A customer may be another

program at the same or another agency, an interagency initiative or partnership, or a firm or other organization from another sector or country. As appropriate, programs must define a plan for regular reviews by primary customers of the program’s relevance to their needs. These programs must provide a plan for addressing the conclusions of external reviews.

Program Relevance to the Needs of the Nation, of Fields of S&T, and of Program “Customers” Must Be Assessed Periodically Through Retrospective External Review

Programs must periodically assess the need for the program and its relevance to customers against the original justifications. Programs must provide a plan for addressing the conclusions of external reviews.

Quality

Programs should maximize the quality of the R&D they fund through the use of a clearly stated, defensible method for awarding a significant majority of their funding. A customary method for promoting R&D quality is the use of a competitive, merit-based process. NSF’s process for the peer-reviewed, competitive award of its R&D grants is a good example. Justifications for processes other than competitive merit review may include “outside-the-box” thinking, a need for timeliness (e.g., R&D grants for rapid-response studies of Pfiesteria), unique skills or facilities, or a proven record of outstanding performance (e.g., performance-based renewals).

Programs must assess and report on the quality of current and past R&D. For example, NSF’s use of Committees of Visitors, which review NSF directorates, is an example of a good quality-assessment tool. OMB and OSTP encourage agencies to provide the means by which their programs may be benchmarked internationally or across agencies, which provides one indicator of program quality.
Programs Allocating Funds Through Means Other Than a Competitive, Merit-Based Process Must Justify Funding Methods and Document How Quality Is Maintained

Programs must clearly describe how much of the requested funding will be broadly competitive based on merit, providing compelling justifications for R&D funding allocated through other means. (See OMB Circular A-11 for definitions of competitive merit review and other means of allocating Federal research funding.) For all funds allocated through means other than unlimited competition, programs must document the processes they will use to distribute funds to each type of R&D performer (e.g., Federal laboratories, Federally funded R&D

centers, universities, etc.). Programs are encouraged to use external assessment of the methods they use to allocate R&D funding and maintain program quality.

Program Quality Must Be Assessed Periodically Through Retrospective Expert Review

Programs must institute a plan for regular, external reviews of the quality of the program’s research and research performers, including a plan to use the results of these reviews to guide future program decisions. Rolling reviews performed every 3-5 years by advisory committees can satisfy this requirement. Benchmarking of scientific leadership and other factors provides an effective means of assessing program quality relative to other programs, other agencies, and other countries.

Performance

R&D programs should maintain a set of high-priority, multi-year R&D objectives with annual performance outputs and milestones that show how one or more outcomes will be reached. Metrics should be defined not only to encourage individual program performance but also to promote, as appropriate, broader goals, such as innovation, cooperation, education, and dissemination of knowledge, applications, or tools.

OMB encourages agencies to make the processes they use to satisfy the Government Performance and Results Act (GPRA) consistent with the goals and metrics they use to satisfy these R&D criteria. Satisfying the R&D performance criteria for a given program should serve to set and evaluate R&D performance goals for the purposes of GPRA. OMB expects goals and performance measures that satisfy the R&D criteria to be reflected in agency performance plans.

Programs must demonstrate an ability to manage in a manner that produces identifiable results. At the same time, taking risks and working toward difficult-to-attain goals are important aspects of good research management, especially for basic research.
The intent of the investment criteria is not to drive basic research programs to pursue less risky research that has a greater chance of success. Instead, the Administration will focus on improving the management of basic research programs. OMB will work with some programs to identify quantitative metrics to compare performance across programs with similar goals. Such comparisons may be within an agency or among agencies.

Construction projects and facility operations will require additional performance metrics. Cost and schedule earned-value metrics for the construction of R&D facilities must be tracked and reported. Within DOE, the Office of Science’s formalized independent reviews of technical cost, scope, and schedule baselines and project management of construction projects (“Lehman Reviews”)

are widely recognized as an effective practice for discovering and correcting problems involved with complex, one-of-a-kind construction projects.

REFERENCES

OMB (Office of Management and Budget). 2007. Research and development program investment criteria. Pp. 72-77 in Guide to the Program Assessment Rating Tool (PART). Program Assessment Rating Tool Guidance No. 2007-02. Office of Management and Budget, Washington, DC. January 29, 2007 [online]. Available: http://stinet.dtic.mil/cgi-bin/GetTRDoc?AD=ADA471562&Location=U2&doc=GetTRDoc.pdf [accessed Nov. 14, 2007].

