Thinking Strategically: The Appropriate Use of Metrics for the Climate Change Science Program
program performance.2 Such an evaluation is needed to justify continued taxpayer support, especially in an era of declining budgets.
Studies in industry, academia, and government suggest that metrics can be developed both to document progress from past research programs and to evaluate future research performance.3 The challenge is to create meaningful and effective metrics that accomplish the following:
Convey an accurate view of scientific progress. A metric commonly used to evaluate advances in climate models, for example, is the reduction of uncertainty in a projection or forecast.4 However, progress in scientific and technical understanding can increase as well as decrease uncertainty estimates.
Strike a balance between high-risk research with potential long-term gain and success in specific applications that are more easily measured.
Accommodate the long time scales necessary for achieving results in basic research.
The following additional challenges are specific to the CCSP:
To develop a methodology for creating metrics that can be applied to the entire CCSP. This is especially challenging because of the scope and diversity of the program. Thirteen agencies participate in the program, which encompasses a wide range of natural and social science disciplines, each with its own research approaches and outputs, and activities ranging from observations, to basic research, to assessments and decision support (Box 1.1).
To collect consistent data that can be used to assess and manage programs at the interagency level.
2. Presentation to the committee by J. Mahoney, CCSP director, on December 17, 2003.
3. Army Research Laboratory, 1996, Applying the Principles of the Government Performance and Results Act to the Research and Development Function: A Case Study Submitted to the Office of Management and Budget, 27 pp., <http://govinfo.library.unt.edu/npr/library/studies/casearla.pdf>; National Science and Technology Council, 1996, Assessing Fundamental Science, <http://www.nsf.gov/sbe/srs/ostp/assess/start.htm>; General Accounting Office, 1997, Measuring Performance: Strengths and Limitations of Research Indicators, GAO/RCED-97-91, Washington, D.C., 34 pp.; National Academy of Engineering and National Research Council, 1999, Industrial Environmental Performance Metrics: Challenges and Opportunities, National Academy Press, Washington, D.C., 252 pp.; National Research Council, 1999, Evaluating Federal Research Programs: Research and the Government Performance and Results Act, National Academy Press, Washington, D.C., 80 pp.; National Research Council, 2001, Implementing the Government Performance and Results Act for Research: A Status Report, National Academy Press, Washington, D.C., 190 pp.
4. Presentations to the committee by J. Kaye, National Aeronautics and Space Administration, on December 17, 2003, and J. Rothenberg, OMB, on March 4, 2004.