Appendix D: Summary of Workshop
Pages 137-154



From page 137...
... It also reiterates the findings of the first report, with emphasis on the first four recommendations: research programs, including basic research, should be evaluated regularly; the methodology of evaluation should match the character of the research; the primary method for evaluating research programs should be expert review; and agencies should describe in their GPRA plans and reports the goal of developing human resources. See Appendix C for summaries of focus groups with Department of Defense, Department of Energy, National Aeronautics and Space Administration, National Institutes of Health, and National Science Foundation.
From page 138...
... But the panel heard from agency representatives that they could not find useful quantitative metrics to evaluate the results of basic research.

Limits of Quantitative Metrics

It is true that quantitative measures are used to evaluate researchers, research proposals, and research programs throughout science.
From page 139...
... that is relevant to the mission of the agency supporting the work. Panelists offered several illustrations of the difficulty of trying to evaluate basic research annually with quantitative metrics.
From page 140...
... They also noted that once oversight groups recognized the basic principles of science, they might understand better why micromanaging of agency research programs does not necessarily lead to better science.

Criteria for Evaluation

In its first report on GPRA, COSEPUP recommended the use of the following criteria to evaluate research programs: the quality of the research, the relevance of the research to the agency's mission, and leadership, that is, the level of the work being performed compared with the level of the finest research done in the
From page 141...
... Because of such variations, the panel acknowledged that there are multiple approaches for gathering information on quality and for setting priorities and that agencies should be free to design their own approaches. There was also acknowledgment of how much work is asked of the science and engineering communities in serving on review panels.
From page 142...
... For example, the users of results of NIH research include pharmaceutical companies, hospital administrators, applied researchers, and doctors. NIH was asked whether it heard from such users, and a representative responded that the agency hears from them through its national advisory councils, which include scientists, health-care providers, and members of the public.
From page 143...
... The discussants encouraged agencies to experiment with ways to increase the international perspective in their evaluation procedures. One panelist cited an earlier COSEPUP experiment with international benchmarking, in which the United States was deemed to be the overall world leader in materials science and engineering, but the study revealed some fields in which the United States was not ahead.
From page 144...
... A workshop participant noted that some committees do not use GPRA documents when research activities are too highly aggregated. She noted that the law requires research activities to be described at the program and financing budget levels.
From page 145...
... Nearly all NSF's budget is spent on research, but only about 1.5% of DOD's budget is. And yet each has to respond to the same GPRA requirements, even though the overall DOD GPRA plan barely has space to mention research at all, let alone deliver a detailed analysis of planning and evaluation methods.
From page 146...
... Each enterprise has a portion of a kind of research, and that portion must be integrated with the other activities of the enterprise, such as building hardware and planning space missions. It is difficult to explain the different qualities of scientific research within a GPRA document that comprehends an entire enterprise with all its diverse activities and goals.
From page 147...
... But agency representatives described a considerable amount of extra workload in the form of meetings, workshops, and other activities. One representative offered an unofficial estimate that one-fourth to one-third of the time of some middle- and high-level officials was devoted to GPRA compliance.
From page 148...
... They also urged oversight bodies to help agencies to develop reporting formats that minimize the extra effort required.

Two issues of timing

Linking performance plans with the budget cycle

Most agency representatives reported difficulties in complying with the timing of GPRA requirements.
From page 149...
... Evaluating basic-research programs annually

Like the focus groups, this workshop featured extended discussions on the difficulty of evaluating the results of basic research each year. Such a requirement, several participants said, puts unrealistic pressure on a principal investigator to come up with the "next great discovery of the last 12 months." One participant noted that the "output" of good research is original knowledge, as measured by publications and perhaps new products, but that the "outcome" of that knowledge might be unknown for years.
From page 150...
... How much quantitative information is included? The absence of such information from most GPRA reports leads to some suspicion about the objectivity of expert review and the independence of reviewers.
From page 151...
... Some oversight personnel complained that agencies did not explain the special needs of science adequately, did not reveal their specific planning and reporting mechanisms with sufficient transparency, did not adequately align the program activities with budget line items, and did not explain their validation and verification procedures for evaluating research programs. One agency representative reported that he had never been contacted by a representative of Congress about GPRA.
From page 152...
... agency representatives reported the same difficulty with committees and sometimes with their own agencies. One agency representative acknowledged that the process was still relatively young, and participants were still learning what the others wanted.
From page 153...
... Richardson, Associate Vice Chancellor of Engineering and Director, Texas Transportation Institute, The Texas A&M University System, College Station, Texas
Max D. Summers, Professor of Entomology, Department of Entomology, Texas A&M University, College Station, Texas
Morris Tanenbaum, Retired Vice Chairman and Chief Financial Officer, AT&T, Short Hills, New Jersey
Bailus Walker, Jr.
From page 154...
... Washington, D.C.
Harriet Ganson, Chief, Planning, Evaluation and Legislation Branch, National Institute of Dental and Craniofacial Research, NIH, Bethesda, Maryland
Robin I.

