Suggested Citation:"4 Conclusions and Recommendations." National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. 2001. Implementing the Government Performance and Results Act for Research: A Status Report. Washington, DC: The National Academies Press. doi: 10.17226/10106.

CHAPTER 4

Conclusions and Recommendations

Over the last 4 years, federal agencies that support research in science and engineering have moved by stages toward full implementation of GPRA. The central objective of the act is to elicit from the agencies a regular accounting of the planning, performance, and results of their research activities. Agencies have spent substantial time and effort in devising ways to implement the act. However, both the agencies and the oversight bodies must still refine how they interpret, implement, and communicate with each other about GPRA. To assist in the complex process of implementing GPRA, this report has attempted to summarize and interpret the experiences of agencies and oversight bodies. In particular, its major sections examine the current process and recommend the most appropriate methods of evaluating basic- and applied-research programs, the criteria that agencies can and should use to perform their evaluations, and the experiments and difficulties experienced by agencies in communicating their evaluation results internally and externally.

After its study of GPRA with agency and oversight personnel, the panel has concluded that the manner of planning and evaluating research programs carries great importance. It is apparent that inappropriate methods and inadequate communication can harm the very programs that the law seeks to strengthen. We hope that the general observations, conclusions, and recommendations in this report help agencies and oversight groups as they continue to take the incremental steps necessary to implement GPRA for the country's federal research programs.

Chapters 2 and 3 each contain specific recommendations for agencies and oversight bodies that are designed to expedite the implementation of GPRA. This chapter offers a brief set of more general conclusions and recommendations that consolidate the major themes of the preceding text.

The panel offers the following 10 conclusions:

Some agencies stated that GPRA compliance has added substantially to the cost of their planning and evaluation activities in the form of staff time and resources. Others report that they have been able to integrate GPRA with their traditional budget and planning processes, although at some cost of time and effort. These agencies report benefits in strengthening program management and in enhancing communication about their programs to the users of research and the general public.

The need to do so depends on the goals of the agency and the degree to which there is concern about a given field of research or about new and emerging programs.

In promoting greater accountability, the act calls for firmer alignment of research programs with overall strategic planning. Agencies report progress on both counts: in strengthening the management of their programs and in enhancing their ability to communicate the value of their programs to the users of research and the public. However, while some agencies report that they have been able to derive their GPRA requirements from the same management processes that they traditionally use for internal control and budgeting, others see GPRA requirements as extra burdens that add to the planning and reporting workload, with lost opportunities in terms of the staff time and resources devoted to this requirement.

Agency approaches to GPRA research programs demonstrate the utility of expert review using the same criteria of quality and relevance outlined in COSEPUP's original report. The international leadership criterion is generally not evaluated by most federal agencies at this time, although several are interested in such a measure.

However, given the diversity in mission, complexity, culture, and structure of the federal agencies that support research, it is not surprising that their approaches to GPRA have varied. One size definitely does not fit all.

In particular, oversight bodies expressed a desire for a better understanding of the methodology and results of expert-review evaluations. Some agencies provide evaluations on a field-specific or program-specific basis; others do so for the research program in its entirety. Aggregating at a high level can make it difficult for oversight bodies to see and understand clearly the methods and programs that are the focus of the analyses.

When this objective is explicit, it not only affirms the value of the US tradition of including graduate students in the research programs of their advisers but also shows how reductions in research funding can jeopardize the preparation of the scientists and engineers the nation will need in the future.

For example, one agency made an effort to tie its GPRA reports more closely to its annual budget, as required by the act, only to be told by a congressional committee to return to a previously used format; another agency was told the reverse. As a result, the potential benefit of GPRA in providing a mechanism for incorporating the performance results of previous years into performance plans for later years is limited.

During the focus groups, the workshop, and the interviews, it was consistently clear that improved communication between these two sectors could reduce the difficulties and misunderstandings experienced by some agencies.

Are the results of the "results act" being used? In particular, agencies have not yet seen the use of their reports in the congressional decision-making that determines the size and priorities of their budgets.

On the basis of these observations, the panel offers the following general recommendations:

The language of the act strongly urges agencies to evaluate their programs annually through the use of quantitative measures so that progress can be followed with clear numerical indicators. The panel reaffirms COSEPUP's earlier assertion that research programs, especially those supporting basic research, cannot be meaningfully evaluated this way annually. Instead, these programs can be evaluated over a somewhat longer term through expert review, which has a long tradition of effectiveness and objectivity.

Transparency refers to the ability to see readily how and why an agency decides to emphasize or de-emphasize a particular program or area of research. When an agency describes its performance plans and reports from an agencywide point of view (for example, an agency might describe its efforts to reduce global warming as though they were a single program), it is difficult for oversight bodies or the public to understand the process of priority-setting. Although oversight bodies or agents of the public would not be expected to review the thousands of subentities that perform their own planning and reviewing within agencies, they can reasonably expect access to documents that help them answer specific questions.

Although GPRA requires annual reporting on all programs, basic research often does not produce useful results in a single year and must be monitored over several years before outcomes become apparent. Agencies should experiment with alternative reporting formats, as permitted by GPRA, that provide realistic evaluations of long-term research.

Although expert review has long been the accepted method for evaluating research in the science and engineering communities, some aspects of its performance are unclear to outside observers. Agencies should make clear how they validate their research-evaluation methods, such as the means by which they select expert reviewers and choose to aggregate research programs for review.

Agencies have a large stake in the education and training of scientists and engineers, but this objective might not receive explicit emphasis or visibility in GPRA plans and reports. The objective must be explicit not only because it affirms the value of educating young scientists and engineers in the context of research, but also because it demonstrates how reductions in research funding could weaken the corps of human resources that is essential to the nation's future.

Whenever possible, agencies should use procedures already in place without adding steps. GPRA should not add unnecessarily to the workload of agencies, and oversight bodies should help agencies ensure that it does not. At the same time, effective linkage of GPRA reporting with budgets may help agencies explain their needs to Congress and justify funding levels during periods of restrained budgets.

A principal purpose of GPRA is to improve how agencies communicate their results to oversight groups, the "users" of research, and the general public. More effective communication will enhance the value of the act to all of its constituents.

As indicated in COSEPUP's first report, GPRA is potentially useful because it "provides an opportunity for the research community to ensure the effective use of the nation's research resources in meeting national needs and to articulate to policymakers and the public the rationale for and results of research." However, the act will not fulfill its intended objectives unless the Senate and House Operations committees, working with OMB, identify and respond to agency concerns through open discussion. Unless the agency responses to GPRA are useful to Congress in the urgent task of setting priorities and budgeting, the value of the act might not warrant the time and effort it requires of the federal government.

Provided below are the specific recommendations that are scattered throughout this report:

Much has been learned about the procedures of planning, evaluation, and management in the last several years, and some value will have been gained by the agencies from their own discussion of accountability. However, one key remaining question is the degree to which oversight groups are using the results of the "results act" for programmatic decision-making. Unless the agency responses to GPRA are useful to Congress in the urgent task of setting priorities and budgeting, the value of the act might not warrant the time and effort it requires of the federal government. But by working more closely together than they have in the past, the federal agencies and the oversight bodies can implement the letter and the spirit of GPRA in ways that lead to greater efficiency, lower cost, and more effective research programs that are demonstrably conducted in the national interest.

