IMPLEMENTING THE GOVERNMENT PERFORMANCE AND RESULTS ACT FOR RESEARCH
A Status Report

Committee on Science, Engineering, and Public Policy
National Academy of Sciences
National Academy of Engineering
Institute of Medicine
Policy and Global Affairs

NATIONAL ACADEMY PRESS, Washington, D.C.
NATIONAL ACADEMY PRESS, 2101 Constitution Avenue, NW, Washington, D.C. 20418

NOTICE: This volume was produced as part of a project approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences (NAS), the National Academy of Engineering (NAE), and the Institute of Medicine (IOM). It is a result of work done by a panel of the Committee on Science, Engineering, and Public Policy (COSEPUP). The members of the panel responsible for the report were chosen for their special competences and with regard for appropriate balance. COSEPUP is a joint committee of NAS, NAE, and IOM. It includes members of the councils of all three bodies. For more information on COSEPUP, see www.nationalacademies.org/cosepup.

This material is based upon work supported by the National Science Foundation under Grant No. OIA-0073616, the National Research Council, the Department of Defense under Purchase Order SP4700-99-M-0510, the Department of Energy under Grant No. DE-FG02-00ER45803, the Department of Health and Human Services/National Institutes of Health under Contract No. N01-OD-4-2139, Task Order No. 60, and the National Aeronautics and Space Administration under Grant No. NASW-9937, Task Order No. 112. Additionally, this project, N01-OD-4-2139, Task Order No. 60, received support from the evaluation set-aside section 513, Public Health Service Act. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation, the Department of Defense, the Department of Energy, the Department of Health and Human Services, or the National Aeronautics and Space Administration.
International Standard Book Number: 0-309-07557-2

Implementing the Government Performance and Results Act for Research: A Status Report is available from the National Academy Press, 2101 Constitution Avenue, NW, PO Box 285, Washington, D.C. 20055 (1-800-624-6242, or 202-334-3313 in the Washington metropolitan area; Internet: http://www.nap.edu).

Copyright 2001 by the National Academy of Sciences. All rights reserved. This document may be reproduced solely for educational purposes without the written permission of the National Academy of Sciences.

Printed in the United States of America.
THE NATIONAL ACADEMIES
National Academy of Sciences
National Academy of Engineering
Institute of Medicine
National Research Council

The National Academy of Sciences is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Bruce M. Alberts is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. William A. Wulf is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Kenneth I. Shine is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy's purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Bruce M. Alberts and Dr. William A. Wulf are chairman and vice chairman, respectively, of the National Research Council.
PANEL ON RESEARCH AND THE GOVERNMENT PERFORMANCE AND RESULTS ACT (GPRA) 2000

ENRIQUETA C. BOND (Cochair), President, The Burroughs Wellcome Fund, Research Triangle Park, North Carolina
ALAN SCHRIESHEIM (Cochair), Director Emeritus, Argonne National Laboratory, Argonne, Illinois
JOHN E. HALVER, Professor Emeritus in Nutrition, School of Fisheries, University of Washington, Seattle, Washington
BRIGID L. M. HOGAN, Investigator and Professor, Vanderbilt University Medical Center, Howard Hughes Medical Institute, Nashville, Tennessee
WESLEY T. HUNTRESS, JR., Director, Geophysical Laboratory, Carnegie Institution of Washington, Washington, D.C.
LOUIS J. LANZEROTTI, Distinguished Member, Technical Staff, Bell Laboratories, Lucent Technologies, Murray Hill, New Jersey
RUDOLPH A. MARCUS, Arthur Amos Noyes Professor of Chemistry, California Institute of Technology, Pasadena, California
STUART A. RICE, Frank P. Hixon Distinguished Service Professor, James Franck Institute, The University of Chicago, Illinois
HERBERT H. RICHARDSON, Associate Vice Chancellor of Engineering and Director, Texas Transportation Institute, The Texas A&M University System, College Station, Texas
MAX D. SUMMERS, Distinguished Professor, The Texas A&M University, College Station, Texas
MORRIS TANENBAUM, Retired Vice Chairman and Chief Financial Officer, AT&T, Short Hills, New Jersey
BAILUS WALKER, JR., Professor of Environmental and Occupational Medicine, Howard University, Washington, D.C.
ROBERT M. WHITE, University Professor and Director, Data Storage Systems Center, Carnegie Mellon University, Pittsburgh, Pennsylvania

Principal Study Staff:
DEBORAH D. STINE, Study Director and Associate Director, COSEPUP
SUSAN E. COZZENS, GPRA Consultant
DAVID M. HART, Case Study Consultant
ALAN ANDERSON, Consultant Science Writer
DAVID BRUGGEMAN, Intern, Christine Mirzayan Internship Program
CARL PICCONATTO, Intern, Christine Mirzayan Internship Program
REBECCA BURKA, Administrative Associate
KEVIN ROWAN, Project Assistant
NORMAN GROSSBLATT, Editor
COMMITTEE ON SCIENCE, ENGINEERING, AND PUBLIC POLICY

MAXINE F. SINGER (Chair), President, Carnegie Institution of Washington, Washington, D.C.
BRUCE M. ALBERTS,* President, National Academy of Sciences, Washington, D.C.
ENRIQUETA C. BOND, President, The Burroughs Wellcome Fund, Research Triangle Park, North Carolina
LEWIS M. BRANSCOMB, Professor Emeritus, Center for Science and International Affairs, John F. Kennedy School of Government, Harvard University, Cambridge, Massachusetts
GERALD P. DINNEEN,* Vice President, Science and Technology, Honeywell, Inc. (retired), Edina, Minnesota
JAMES J. DUDERSTADT, President Emeritus and University Professor of Science and Engineering, Millennium Project, University of Michigan, Ann Arbor, Michigan
MARYE ANNE FOX, Chancellor, North Carolina State University, Raleigh, North Carolina
RALPH E. GOMORY, President, Alfred P. Sloan Foundation, New York, New York
RUBY P. HEARN, Senior Vice President, The Robert Wood Johnson Foundation, Princeton, New Jersey
SAMUEL H. PRESTON, Dean, University of Pennsylvania School of Arts and Sciences, Philadelphia, Pennsylvania
KENNETH I. SHINE,* President, Institute of Medicine, Washington, D.C.
EDWARD H. SHORTLIFFE, Professor and Chair, Department of Medical Informatics, Columbia University, New York, New York
HUGO F. SONNENSCHEIN, Charles H. Hutchenson Distinguished Service Professor, University of Chicago, Chicago, Illinois
PAUL E. TORGERSEN, John W. Hancock, Jr. Chair and President Emeritus, Virginia Polytechnic Institute and State University, Blacksburg, Virginia
IRVING L. WEISSMAN, Karele and Avice Beekhuis Professor of Cancer Biology and Professor of Pathology, Stanford University School of Medicine, Palo Alto, California
SHEILA E. WIDNALL, Abby Rockefeller Mauze Professor of Aeronautics, Massachusetts Institute of Technology, Cambridge, Massachusetts
WILLIAM JULIUS WILSON, Lewis P. and Linda L. Geyser University Professor, Harvard University, Cambridge, Massachusetts
WILLIAM A. WULF,* President, National Academy of Engineering, Washington, D.C.

Staff
RICHARD E. BISSELL, Executive Director
DEBORAH D. STINE, Associate Director
MARION RAMSEY, Administrative Associate

*Ex officio member.
PREFACE

In February 1999, the Committee on Science, Engineering, and Public Policy (COSEPUP) released a report titled Evaluating Federal Research Programs: Research and the Government Performance and Results Act (see Appendix E). The report recommended a set of criteria by which federal agencies might evaluate their programs of research in science and engineering. The criteria were intended to help agencies respond to the Government Performance and Results Act (GPRA), enacted in 1993 (see Appendix F).

The National Academies were later asked by Congress, as part of the 1999 VA-HUD Independent Agencies Authorization Act, to undertake another study, titled "Accountability of Federally Funded Research." Because many of the issues raised by Congress had been addressed by COSEPUP in the original study, the Academies worked with the White House Office of Science and Technology Policy (OSTP), as indicated in the legislation, to craft a study that would be most useful to all involved. In a letter dated April 6, 1999, Dr. Neal Lane, director of OSTP, asked the Academies to undertake a more in-depth study of the actual application of GPRA to research programs, as the agencies were shortly to release their first performance reports under GPRA. The study plan was endorsed by the House Committee on Science and by Senators William Frist, John Rockefeller, Jeff Bingaman, and Joseph Lieberman, who were cosponsors of the original legislation.

The specific charge to the panel was as follows:

As requested by Congress and the White House Office of Science and Technology Policy, this study would assist federal
agencies in crafting plans and reports that are responsive to the Government Performance and Results Act (GPRA), OMB Guidance, and agency missions. The study would undertake independent assessments via case studies of the strategic and performance plans federal agencies have developed and of the responsiveness of their performance reports (which are due in March 2000) to the Government Performance and Results Act. The assessment would take into account the agencies' missions and how science and technology programs and human resource needs are factored into agency GPRA plans. In addition, the study would suggest specific applications of recommendations from COSEPUP's earlier report entitled "Evaluating Federal Research Programs: Research and the Government Performance and Results Act." In addition, workshops would be conducted where the agencies could share best practices regarding their performance reports and stakeholders' views could be heard.

The Senators also requested that the Academies evaluate the extent to which independent merit-based evaluation achieves the goal of eliminating unsuccessful or unproductive programs and projects, and that they investigate and report on the validity of using quantitative performance goals for administrative management of these activities. COSEPUP decided not to pursue these analyses for the time being and instead to focus on the task above.

The National Academies formed the Panel on Research and the Government Performance and Results Act 2000 under the auspices of COSEPUP to respond to the request. This panel, which we chair, began its work by examining the GPRA performance reports each federal agency released in March 2000. These performance reports provided the public with the first opportunity to see the implementation of GPRA. In May, project staff, at the behest of panel members, met with staff at 11 federal agencies to gain a better understanding of the methodology each used for its research programs.
At this stage, problems with the charge to the panel emerged, based on the panel's
discussions with the agency staff and the consultants and its review of the agency performance plans. Specifically, at its initial meeting in June, the panel determined that it was not appropriate to indicate the degree to which a given agency's work was acceptable, nor was it possible to conduct the in-depth review of each agency's program activities that would have been required for an independent assessment of strategic and performance plans. In the first instance, agencies were still at an experimental stage in evaluating research programs in response to GPRA. In the second, no single committee could mobilize the expertise necessary to conduct an in-depth review of the selected agencies, given the tremendous diversity of the research programs each supported. In sum, the panel determined that it was not possible to provide the "independent assessment" of each agency's strategic and performance plan anticipated by Dr. Lane.

In the spirit of the OSTP request, the panel instead decided to focus on the general methods and approaches each agency undertook. It also intentionally decided not to make agency-specific analyses beyond the summary of each agency's approach presented in Appendix C. Therefore, instead of attempting an investigation for which it was not equipped, the panel chose to take a "snapshot" of the current state of agencies' responses to GPRA.

After reviewing the process used by the 11 federal agencies, the panel decided to select for review the five agencies that provide the most financial support for federal research programs: the National Science Foundation (NSF), National Institutes of Health (NIH), Department of Defense (DOD), Department of Energy (DOE), and National Aeronautics and Space Administration (NASA).
The panel then convened five focus groups—one on the process used by each agency—and a workshop to discuss overarching issues that affected all the agencies. Participants in the
focus groups and the workshop included several panel members, members of agency scientific advisory groups, and staff from the agencies, the Office of Management and Budget (OMB), the General Accounting Office (GAO), and the Congressional Research Service (CRS). Congressional committee staff were invited, but none attended.

During each focus group, agencies were asked to respond to the following questions:

- What methodology is used for evaluating research programs under GPRA?
- What level of unit is the focus of the evaluation?
- Who does the evaluation of the research program under GPRA?
- What criteria are used for the evaluation?
- How are the selection and evaluation of projects related to the evaluation of the research program?
- How is the result communicated to different audiences (such as the S&T community, advisory committees, agency leadership, the administration, and Congress)?
- How is the result used in internal and external decision-making?

Their responses are summarized in Appendix C.

During the workshop, a number of overarching issues were discussed, including these:

- Criteria for evaluation.
- Aggregation of research programs for purposes of evaluation.
- Usefulness of GPRA.
- GPRA and the workload of agencies.
- Issues of timing.
- Verification and validation.
The results of the workshop are summarized in Appendix D.

The report itself should be considered a cross section, or "snapshot," of agency responses to GPRA based on the agencies' own descriptions. We hope that the observations and recommendations presented here will be useful to other agencies in their efforts to implement GPRA and to oversight bodies in their efforts to supervise and facilitate the implementation. We believe, on the basis of first-hand observation, that the interactions during the focus groups and workshop were useful to all participants.

In the end, this panel does not attempt to recommend a single strategy to be used by all federal agencies in developing their plans to respond to GPRA. Instead, the panel, as requested by OSTP, has worked with individual agencies to focus on observations that could facilitate their responses to GPRA. Ideally, these lessons can be discussed and extended by all agencies and their oversight bodies to begin assembling agency-appropriate, broadly helpful strategies for GPRA compliance beyond those in COSEPUP's original report.

Enriqueta Bond
Alan Schriesheim
Panel Cochairs
ACKNOWLEDGMENTS

This report is the product of many individuals. First, we would like to thank all the agency, congressional, and White House staff and the agency advisory group members for their input (see Appendix C for focus group and workshop participants). The panel also extends thanks to Bob Simon and John Jennings, Senate staff, and Richard Russell and Beth Sokol, House staff, for their guidance. Without the help of all of them, the panel would have had great difficulty understanding the intricacies of the issues surrounding the implementation of GPRA for federal research programs.

Second, we would like to thank the reviewers of this report. This report has been reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the National Research Council's Report Review Committee. The purpose of this independent review is to provide candid and critical comments that will assist the institution in making the published report as sound as possible and to ensure that the report meets institutional standards for objectivity, evidence, and responsiveness to the study charge. The review comments and draft manuscript remain confidential to protect the integrity of the deliberative process.

We wish to thank the following individuals for their participation in the review of this report:

JOHN F. AHEARNE, Director, Ethics Programs, Sigma Xi, The Scientific Research Society, Research Triangle Park, North Carolina
EUGENE W. BIERLY, Senior Scientist, American Geophysical Union, Washington, D.C.
NICHOLAS P. BIGELOW, Professor, The University of Rochester, New York
RADFORD BYERLY, JR., Boulder, Colorado
THEODORE J. CASTELE, Fairview Park, Ohio
MELANIE C. DREHER, Dean, College of Nursing, The University of Iowa, Iowa City, Iowa
DAVID W. ELLIS, President and Director, Museum of Science, Boston, Massachusetts
FRANCIS B. FRANCOIS, Consultant, Bowie, Maryland
CUTBERTO GARZA, Vice Provost, Cornell University, Ithaca, New York
VICTORIA FRANCHETTI HAYNES, President, Research Triangle Institute, Research Triangle Park, North Carolina
SCOTT RAYDER, Director of Government Affairs, Consortium for Oceanographic Research and Education, Washington, D.C.
MICHAEL J. SAILOR, Professor of Chemistry, University of California, San Diego, La Jolla, California

Although the reviewers listed above have provided many constructive comments and suggestions, they were not asked to endorse the conclusions or recommendations, nor did they see the final draft of the report before its release. The review of this report was overseen by William G. Howard, an independent consultant in Scottsdale, Arizona. Appointed by the National Research Council, he was responsible for making certain that an independent examination of this report was carried out in accordance with institutional procedures and that all review comments were carefully considered. Responsibility for the final content of this report rests entirely with the authoring committee and the institution.

Finally, we would like to thank the staff for this project, including Deborah Stine, associate director of COSEPUP and study director; Alan Anderson, consultant writer, who worked with the
panel to develop the text of the report; Susan Cozzens and David Hart, who also served as our GPRA and case study consultants, respectively, on the report; Rebecca Burka, administrative associate, and Kevin Rowan, project assistant, who provided project support; David Bruggeman and Carl Picconatto, interns who provided research support; Norman Grossblatt, editor; and Richard Bissell, executive director of COSEPUP.
CONTENTS

EXECUTIVE SUMMARY, 1

1 THE CHALLENGE OF EVALUATING RESEARCH, 7
   1.1 Barriers to Evaluating Research and the Solution, 10
   1.2 COSEPUP's Evaluation Criteria, 13
       1.2.1 Quality, 13
       1.2.2 Relevance, 14
       1.2.3 Leadership, 14
   1.3 Organization of this Report, 16

2 AGENCY METHODS, 17
   2.1 Expert Review, 18
   2.2 Evaluation Criteria, 20
       2.2.1 Quality, 20
       2.2.2 Relevance, 21
       2.2.3 Leadership, 21
   2.3 Human Resources, 23
   2.4 Aggregation, 23
   2.5 Validation and Verification, 25
   2.6 Summary, 26

3 COMMUNICATION ISSUES, 27
   3.1 Communication Between Agencies and Oversight Groups, 28
   3.2 Communication by Agencies with User Groups and the Public, 30
   3.3 Communication by Oversight Groups, 31
   3.4 Communication Within Agencies, 31
   3.5 The Issue of Timing, 32
   3.6 Summary, 34

4 CONCLUSIONS AND RECOMMENDATIONS, 35
   4.1 General Conclusions, 36
   4.2 General Recommendations, 39
   4.3 Specific Recommendations, 42
       4.3.1 Agency Methods, 42
       4.3.2 Communication, 43
   4.4 Summary, 44

APPENDIXES
A Panel and Staff Biographical Information, 47
B White House and Congressional Correspondence, 57
C Summaries of Agency Focus Group Presentations, 63
   C-1 Summary of Department of Defense Focus Group, 65
   C-2 Summary of National Institutes of Health Focus Group, 81
   C-3 Summary of National Aeronautics and Space Administration Focus Group, 97
   C-4 Summary of the National Science Foundation Focus Group, 113
   C-5 Summary of the Department of Energy Focus Group, 125
D Summary of Workshop, 137
E Executive Summary of Evaluating Federal Research Programs: Research and the Government Performance and Results Act, 155
F Government Performance and Results Act, 167
G Federal Agency GPRA Web Sites, 189