Thinking Strategically: The Appropriate Use of Metrics for the Climate Change Science Program (2005)


Appendix A
Measuring Government Performance

A number of federal laws and policies require government agencies to measure and report the performance of their programs. These include the Government Performance and Results Act (GPRA), the research and development (R&D) investment criteria, and the Program Assessment Rating Tool (PART). GPRA establishes a broad statutory framework for management and accountability, whereas the R&D investment criteria and PART focus on simpler measures of performance to inform budget decisions.

GOVERNMENT PERFORMANCE AND RESULTS ACT

The Government Performance and Results Act of 1993 [1] was intended to increase the effectiveness, efficiency, and accountability of the federal government. It requires federal agencies to set strategic goals and to measure program performance against those goals. Reporting takes three forms:

  1. a strategic plan, which states the agency mission, goals, and objectives and describes how the goals and objectives will be achieved over the next five or more years;

  2. an annual performance plan, which establishes performance goals as well as performance indicators for measuring or assessing the outputs, service levels, and outcomes of each program activity; and

  3. an annual performance report, which compares actual accomplishments with the performance goals.

[1] Public Law 103-62.

The GPRA does not apply to interagency programs such as the Climate Change Science Program (CCSP). However, agency contributions to such programs are subject to GPRA, although they may be formulated in agency terms, rather than interagency terms.

Agencies that cannot express performance goals in an objective, quantifiable, and measurable form can seek Office of Management and Budget (OMB) approval for alternative forms. Science agencies have generally adopted both quantitative (e.g., publication count) and qualitative (e.g., progress in understanding) indicators [2].
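To make the structure of these GPRA reporting requirements concrete, the sketch below models the three documents as simple data types. It is purely illustrative: the class and field names (StrategicPlan, PerformanceGoal, and so on) and the compare method are hypothetical, not drawn from GPRA or OMB guidance, and a quantitative target is optional because agencies may rely on qualitative indicators.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical data model for the three GPRA reporting documents.
# GPRA prescribes content, not a schema; names here are illustrative only.

@dataclass
class PerformanceGoal:
    description: str
    indicator: str                    # output, service-level, or outcome indicator
    target: Optional[float] = None    # quantitative target, if one applies
    actual: Optional[float] = None    # filled in for the annual performance report

@dataclass
class StrategicPlan:
    mission: str
    goals_and_objectives: List[str]
    approach: str                     # how the goals and objectives will be achieved
    horizon_years: int = 5            # strategic plans cover five or more years

@dataclass
class AnnualPerformancePlan:
    fiscal_year: int
    program_activity: str
    performance_goals: List[PerformanceGoal]

@dataclass
class AnnualPerformanceReport:
    plan: AnnualPerformancePlan

    def compare(self) -> List[str]:
        """Compare actual accomplishments with the performance goals."""
        results = []
        for goal in self.plan.performance_goals:
            if goal.target is None or goal.actual is None:
                results.append(f"{goal.description}: assessed qualitatively")
            else:
                status = "met" if goal.actual >= goal.target else "not met"
                results.append(
                    f"{goal.description}: {status} "
                    f"({goal.actual} vs. target {goal.target})"
                )
        return results
```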

RESEARCH AND DEVELOPMENT INVESTMENT CRITERIA AND THE PROGRAM ASSESSMENT RATING TOOL

In 2002, two White House management initiatives, the R&D investment criteria and PART, were introduced in part to inform budget decisions. The R&D investment criteria were intended to improve the process for budgeting, selecting, and managing research and development programs [3]. Managers must demonstrate the extent to which their programs meet the tests of relevance, quality, and performance (see Box A.1). The criteria also address retrospective review of whether investments were well directed, efficient, and productive.

PART focuses on the subset of long-term and annual performance measures that capture the most important aspects of the program’s mission and priorities. Based on a set of yes or no questions (see Box A.2), each program is assigned a score, which is translated into a qualitative rating: effective, moderately effective, adequate, ineffective, or results not demonstrated. The rating is intended to be used to tie program performance to budget appropriations [4]. Twenty percent of federal programs are being rated each year, beginning with the fiscal year (FY) 2004 budget request [5].

[2] General Accounting Office, 1997, Measuring Performance: Strengths and Limitations of Research Indicators, GAO/RCED-97-91, Washington, D.C., 34 pp.

[3] Memorandum on FY 2004 interagency research and development priorities, from John H. Marburger III, director of the Office of Science and Technology Policy, and Mitchell Daniels, director of the Office of Management and Budget, May 30, 2002, <http://www.ostp.gov/html/ombguidmemo.pdf>. The guidelines drew heavily from National Research Council, 2001, Implementing the Government Performance and Results Act for Research: A Status Report, National Academy Press, Washington, D.C., 190 pp. OMB also developed guidelines for applied research, using the Department of Energy’s (DOE’s) applied energy technology programs as a pilot. See <http://www7.nationalacademies.org/gpra/Applied_Research.html>.

Box A.1
R&D Investment Criteria

The following criteria apply to all federal research and development programs.


Relevance

  • Programs must have complete plans, with clear goals and priorities.

  • Programs must articulate the potential public benefits of the program.

  • Programs must document their relevance to specific presidential priorities to receive special consideration.

  • Program relevance to the needs of the nation, of fields of science and technology, and of program “customers” must be assessed through prospective external review.

  • Program relevance to the needs of the nation, of fields of science and technology, and of program “customers” must be assessed periodically through retrospective external review.

Quality

  • Programs allocating funds through means other than a competitive, merit-based process must justify funding methods and document how quality is maintained.

  • Program quality must be assessed periodically through retrospective expert review.

Performance

  • Programs may be required to track and report relevant program inputs annually.

  • Programs must define appropriate output and outcome measures, schedules, and decision points.

  • Program performance must be retrospectively documented annually.

SOURCE: Office of Management and Budget, 2003, Budget Procedures Memorandum No. 861, Completing the Program Assessment Rating Tool (PART) for the FY 2005 Review Process, 60 pp., <http://www.whitehouse.gov/omb/part/bpm861.pdf>.


Box A.2
PART Questions and Relation to the R&D Investment Criteria

Program Purpose and Design (20 percent weighting)


Questions address R&D investment criteria of program relevance:


1.1 Is the program purpose clear?

1.2 Does the program address a specific and existing problem, interest, or need?

1.3 Is the program designed so that it is not redundant or duplicative of any other federal, state, local or private effort?

1.4 Is the program design free of major flaws that would limit the program’s effectiveness or efficiency?

1.5 Is the program design effectively targeted so that resources will address the program’s purpose directly and will reach intended beneficiaries?


Strategic Planning (10 percent weighting)


Questions address prospective aspects of the R&D investment criteria:


2.1 Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

2.2 Does the program have ambitious targets and time frames for its long-term measures?

2.3 Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program’s long-term goals?

2.4 Does the program have baselines and ambitious targets for its annual measures?

2.5 Do all partners (including grantees, subgrantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

2.6 Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

2.7 Are budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program’s budget?

2.8 Has the program taken meaningful steps to correct its strategic planning deficiencies?


Additional questions for R&D programs:


2.RD1 If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

2.RD2 Does the program use a prioritization process to guide budget requests and funding decisions?


Program Management (20 percent weighting)


Questions address prospective aspects of program quality and performance in the R&D investment criteria, as well as general program management issues:


3.1 Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

3.2 Are federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule, and performance results?

3.3 Are funds (federal and partners’) obligated in a timely manner and spent for the intended purpose?

3.4 Does the program have procedures (e.g., competitive sourcing or cost comparisons, information technology improvements, appropriate incentives) to measure and achieve efficiencies and cost-effectiveness in program execution?

3.5 Does the program collaborate and coordinate effectively with related programs?

3.6 Does the program use strong financial management practices?

3.7 Has the program taken meaningful steps to address its management deficiencies?


Additional question for R&D programs:


3.RD1 For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?


Program Results and Accountability (50 percent weighting)


Questions address retrospective aspects of the R&D investment criteria, with emphasis on performance:


4.1 Has the program demonstrated adequate progress in achieving its long-term performance goals?

4.2 Does the program (including program partners) achieve its annual performance goals?

4.3 Does the program demonstrate improved efficiencies or cost-effectiveness in achieving program goals each year?

4.4 Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

4.5 Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?


SOURCE: Office of Management and Budget, 2005, Guidance for Completing the Program Assessment Rating Tool (PART), 64 pp., <http://www.whitehouse.gov/omb/part/fy2005/2005_guidance.doc>.
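
As a rough illustration of how the section weightings in Box A.2 combine into the overall PART score and rating described above, consider the sketch below. The computation and the rating cutoffs are assumptions for illustration only; this appendix does not give OMB's scoring formula or band thresholds, and the handling of "results not demonstrated" is simplified.

```python
from typing import Dict, Tuple

# Illustrative sketch of a weighted PART score (not OMB's official tool).
# Section weights come from Box A.2; the rating cutoffs below are assumed
# for illustration and may not match OMB's published bands.

SECTION_WEIGHTS = {
    "program_purpose_and_design": 0.20,
    "strategic_planning": 0.10,
    "program_management": 0.20,
    "program_results_and_accountability": 0.50,
}

RATING_BANDS = [          # (assumed lower bound, rating), checked high to low
    (85, "effective"),
    (70, "moderately effective"),
    (50, "adequate"),
    (0, "ineffective"),
]


def part_rating(section_scores: Dict[str, float],
                results_demonstrated: bool = True) -> Tuple[float, str]:
    """Combine per-section scores (0-100) into a weighted total and a rating.

    A program lacking acceptable measures or performance data would be rated
    "results not demonstrated" regardless of its numeric score.
    """
    total = sum(SECTION_WEIGHTS[name] * section_scores[name]
                for name in SECTION_WEIGHTS)
    if not results_demonstrated:
        return total, "results not demonstrated"
    for cutoff, label in RATING_BANDS:
        if total >= cutoff:
            return total, label
    return total, "ineffective"


# Example: strong purpose, planning, and management but weaker demonstrated results.
score, rating = part_rating({
    "program_purpose_and_design": 90,
    "strategic_planning": 80,
    "program_management": 85,
    "program_results_and_accountability": 60,
})
print(f"{score:.0f} -> {rating}")  # 73 -> moderately effective
```

The heavy weight on Program Results and Accountability (50 percent) means that demonstrated outcomes dominate the overall rating, consistent with the emphasis on retrospective performance in the R&D investment criteria.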


A 2004 General Accounting Office (GAO) report found that PART had helped structure OMB’s use of performance information for program analysis and internal review [6]. However, budget allocations were not always tied to program ratings. Programs rated as “effective” or “moderately effective” did not always receive increased funding, and programs rated as “ineffective” did not always lose funding. The report also noted that by using PART to shape GPRA measures, OMB is influencing agency program goals, to the detriment of a wide range of stakeholders. It concluded that although PART is useful for program-level budget analysis, it cannot substitute for GPRA’s longer-term, strategic focus on thematic goals. Nevertheless, goals and performance measures relevant to the R&D criteria and PART are being incorporated into future GPRA agency performance plans [7].

[4] Office of Management and Budget, 2003, Performance Measurement Challenges and Strategies, 13 pp., <http://www.whitehouse.gov/omb/part/challenges_strategies.pdf>.

[5] Agency programs relevant to climate change that were evaluated in the FY 2004 budget include Department of Defense (DOD) Basic Research; DOE’s Basic Energy Sciences, Biological and Environmental Research, Environmental Management, and Office of Science; U.S. Agency for International Development (USAID) Climate Change; and National Science Foundation (NSF) Geosciences. In FY 2005, relevant agency programs include the U.S. Department of Agriculture’s (USDA’s) National Resources Inventory and Soil Survey; Department of Interior’s (DOI’s) Science and Technology; Environmental Protection Agency’s (EPA’s) Ecological Research; and National Aeronautics and Space Administration’s (NASA’s) Biological Sciences Research and Earth Science Applications. See <http://www.whitehouse.gov/omb/part/program_assessments_planned_2005.html>.

[6] General Accounting Office, 2004, Performance Budgeting: Observations on the Use of OMB’s Program Assessment Rating Tool for the Fiscal Year 2004 Budget, GAO-04-174, Washington, D.C., 67 pp.

[7] Office of Management and Budget, 2003, Budget Procedures Memorandum No. 861, Completing the Program Assessment Rating Tool (PART) for the FY 2005 Review Process, 60 pp., <http://www.whitehouse.gov/omb/part/bpm861.pdf>.
