
Appendix C: Summaries of Agency Focus Group Presentations
Pages 63-136



From page 63...
... responsible for review of GPRA performance plans and reports from research agencies. A similar agenda was followed during each focus group.
From page 64...
... How is the result communicated to different audiences (e.g., the S&T community, advisory committees, agency leadership, the administration, Congress)
From page 65...
... the Joint Warfighting Science and Technology Plan (JWSTP), and the Defense Technology Area Plan (DTAP)
From page 66...
... Bush argued that such investments in basic research are acts of both faith and patience, but the investments are justified many times over by returns of great value. DOD's BRP is described as "the cutting edge of the Defense Science and Technology Program." This plan is realized primarily by directly funding research in universities, federal laboratories, and industry and by keeping "a watchful eye on research activities all over the world to prevent technological surprise." The BRP contains an overview of the entire DOD research program, most of which can be described in 12 disciplinary categories.
From page 67...
... The BRP is coupled with two complementary plans that focus on applied research and advanced technology development: the Joint Warfighting Science and Technology Plan (JWSTP) and the Defense Technology Area Plan (DTAP)
From page 68...
... TARAs are held every two years for a particular technology area. This year, evaluations are being done in biomedical, battlespace environments, ground/sea vehicles, materials and processes, space platforms, chemical and biological defense, and sensors, electronics, and electronic warfare.
From page 69...
... The following two examples from the 2000 Chemical and Biological Defense Science and Technology TARA illustrate how evaluation works by using the DTOs as the unit of focus. For example, the TARA process gave the "Modeling and Simulation" DTO a yellow grade because of management problems.
From page 70...
... implies a linear model, this is often honored more in the breach than the practice. The 'push' of the linear process is augmented in DOD by a feedback process, whereby changing operational requirements and new results from multidisciplinary research continually keep the Basic Research Program on target." DOD Basic Research Plan, 1999, p.
From page 71...
... The evaluation of basic and applied research is carried out by both internal agency panels of experts and by TARA review panels. Each panel consists of 10-12 technical experts from academe, industry, and nonprofit research organizations.
From page 72...
... At a higher level, annual performance goals are evaluated in two ways. First, results for each of the subordinate measures and indicators are evaluated within the context of overall program performance.
From page 73...
... Retrospective studies are intended to build support for the process, not for individual projects. It is not possible to point to the outcome of an on-going individual project.
From page 74...
... One aspect of the problem is that manpower is considered to be "hard-wired" into the budget process, but there is no evaluation of the educational component itself and thus no incentive structure for good teaching, research training, or mentoring. For example, the substantial cuts in the 6.1 budget from 1993 to 1998 brought reductions in the number of graduate students who could be supported by research grants at universities, but the GPRA process did not report this result.
From page 75...
... The BRP is evaluated by the director of defense research and engineering, with feedback to the agencies after the annual program review. The services and defense agencies also conduct other periodic program reviews to assess quality, relevance, and .
From page 76...
... ? The results of TARA reviews are communicated to agency leadership by "TARA outbriefings" for each technology area (6.2 and 6.3)
From page 77...
... The TARA review process is used at all levels of decisionmaking. For example, the TARA 2000 review of the chemical-biological defense S&T program (against such agents as mustard gas, nerve gas, anthrax, plague, and smallpox)
From page 78...
... Basic research in cognitive science is overseen by a combination of NSF and the Office of Science and Technology Policy in the White House. Other important decision-making that is not addressed by GPRA concerns the choice of basic research fields and the transition of a 6.1 program to a 6.2 and 6.3 program.
From page 79...
... White, University Professor and Director, Data Storage Systems Center, Carnegie Mellon University, Pittsburgh, Pennsylvania; James Baker, Internal Medicine, University of Michigan, Ann Arbor, Michigan; Greg Henry, Program Examiner, National Security Division, Office of Management and Budget, Washington, D.C.; Genevieve Knezo, Congressional Research Service, Library of Congress, Washington, D.C.
From page 80...
... IMPLEMENTING THE GOVERNMENT PERFORMANCE AND RESULTS ACT FOR RESEARCH Agency Representatives: Robert Foster, Director, Biosystems, US Department of Defense, Rosslyn, Virginia; Joanne Spriggs, Deputy Under Secretary of Defense (S&T), US Department of Defense, Washington, D.C.
From page 81...
... Preparation of NIH's annual GPRA performance plans and performance reports is the responsibility of the Office of Science Policy within the Office of the Director, NIH.
From page 82...
... The performance goals in NIH's annual performance plans address both the long-term, intended results or outcomes of NIH core program activities and the management and administrative processes that facilitate the core program activities and lead to the achievement of outcomes. For example, within the Research Program, outcome goals include increased understanding of biological processes and behaviors, as well as the development of new or improved methods for the prevention, diagnosis, and treatment of disease and disability.
From page 83...
... In such cases, GPRA provides an avenue for an agency to define performance goals that rely on criteria that are more descriptive and to use an alternative form of assessment. A small subset of the annual performance goals, related to the NIH Research Program, are more qualitative in nature, and the NIH has used the alternative form, as allowed by the GPRA, for these five goals:
From page 84...
... . The overall DHHS performance plan is the total of the plans of its 13 subagencies, which in turn stem from the DHHS strategic plan.
From page 85...
... Therefore, to develop its approach to GPRA, NIH developed an independent assessment process for evaluating program outcomes and compared them with the performance goals for the
From page 86...
... One-paragraph snapshots of the breadth and scope of individual NIH research program outcomes. Their brevity allows for a greater number of vignettes, each offering a thumbnail description of an advance and its significance, so that the overall picture created by the capsules is more nearly representative of the research effort as a whole.2 · Stories of discovery.
From page 87...
... The resulting assessment materials were considered to provide an extensive illustration of NIH's FY1999 research outcomes that address the five qualitative research-program performance goals. 1.4.3 Evaluating outcomes.
From page 88...
... · The NIH biomedical research enterprise has substantially exceeded this goal when, in addition to fulfilling the above criteria, any of the following apply: Discoveries result in significant new understanding. Research yields answers to long-standing, important questions.
From page 89...
... COSEPUP has suggested, and tested, the use of "international benchmarking" to measure leadership, a technique discussed in its full report.4 1.5 How does the selection and evaluation of projects relate to the evaluation of the research program? Selection criteria were developed by NIH on the basis of the decision to aggregate its individual research projects and to evaluate them as part of broad biomedical goals.
From page 90...
... The reasons for this approach are derived from the unique challenges for agencies whose missions include basic and clinical research. As proposed in NIH's report, Assessment of Research Program Outcomes, scientists and the practice of science "exist because of what we do not know.
From page 91...
... In addition, the annual reporting requirements of GPRA present a problem. The outcomes of fundamental science commonly unfold over a period of years.
From page 92...
... Finally, OMB is both a participant and an audience in the GPRA process for NIH. OMB receives the budget requests and performance plans, engages with the agency, and asks questions about how well the plan reflects the agency's priorities.
From page 93...
... The linkages are not made dollar for dollar, especially in research, but the information gathered for GPRA is useful to help make decisions earlier in the budgeting process. Performance plans are also used by the DHHS budget review board, which has made a commitment to using GPRA.
From page 94...
... Such situations are not clearly dealt with in some GPRA reporting mechanisms. 3.3 Changing the process.
From page 95...
... Theodore Castele, M.D., FACR, Member, NIH GPRA Assessment Working Group, Fairview Park, Ohio; Melanie C. Dreher, Dean and Professor, The University of Iowa College of Nursing, Iowa City, Iowa; Marc Garuli, Program Examiner, Health Programs and Services Branch, Office of Management and Budget, Washington, D.C.
From page 96...
... Agency Representatives: Robin I. Kawazoe, Director, Office of Science Policy & Planning, National Institutes of Health, Bethesda, Maryland; Lana R
From page 97...
... and targets achieved (e.g., "several spacecraft have been successfully developed and launched with a 3.8 percent average overrun"). For its FY2001 performance plan, NASA has instituted several major changes.
From page 98...
... 1.1.1 Internal and external reviews. To take a broader view of NASA's evaluation techniques, the agency uses extensive internal and external reviews to assess its research efforts against its performance plans.
From page 99...
... At the workshop, NASA devoted about half its time to presenting the research programs in the three strategic enterprises with science missions, and the GPRA responses of those specific research programs: space science, earth science, and biological and physical science. Each has different requirements, and each should have its own methods for complying with GPRA.
From page 100...
... For example, NASA is currently developing the performance plan for the budget planning year (02) , tracking the current plan for the current budget year (00)
From page 101...
... NASA divides its activities into five "enterprises": the Space Science Enterprise, the Earth Science Enterprise, the Human Exploration and Development of Space Enterprise, the Biological and Physical Research Enterprise, and the Aero-Space Technology Enterprise. Each enterprise then evaluates its mission by several strategic plan goals.
From page 102...
... These metrics are reviewed by the independent NASA Advisory Council. The GPRA coordinators take this input, integrate it with the rest of the performance plan, and send it to the Office of Management and Budget (OMB)
From page 103...
... (Because of the relationship between the budget and the performance plans, the NASA CFO has primary responsibility for the conduct of the performance plan and reporting process.) Several external groups also help to guide the process.
From page 104...
... In FY1990, nearly 5000 mail reviews were received, and peer review panels in FY1999 and FY2000 involved nearly 300 people. Also, NASA Earth Science Enterprise programs have extensive interaction with the community, including National Research Council panels, the US Global Change Research Program, and international science organizations (such as the World Climate Research Program and IGBP)
From page 105...
... NASA will continue to report annual GPRA-type metrics for enabling activities, such as satellite development, as well as annual science accomplishments in the GPRA performance report. It would review one-third of the research program annually, providing regular scrutiny.
From page 106...
... Similarly, unmeasurable or difficult-to-measure programs give the perception that their progress and ability to produce useful results are not being tracked regularly. The use of expert review to track program performance could be accurately reflected in the performance report.
From page 107...
... Summary of the National Aeronautics and Space Administration Focus Group
... light of how science is performed. The use of milestones, for example, implies a one-directional progression toward a goal.
From page 108...
... In addition, GPRA criteria should allow the agency to retain some flexibility and not place it in a straitjacket in its dealings with Congress and other oversight bodies. GPRA reports, in general, must be understandable to a wide array of people, but compliance requirements must recognize the need for technical discussion to capture the full reasoning behind the science.
From page 109...
... 2.3 Communicating with the public. Several participants congratulated NASA on the fullness and diversity of its communication with the public, including research publications, data archives, and Web sites.
From page 110...
... In fact, a research portfolio is more like the stock market, featuring many short-term ups and downs and the occasional long-term "home run." An effective GPRA reporting process would better communicate the high-risk, high-reward nature of research and provide convincing evidence of its value and continuing contributions to society.
From page 111...
... Jack Kaye, Director, Earth Sciences Research Division, NASA Headquarters, Washington, D.C.
From page 112...
... Guenter Riegler, Director, Space Sciences Research Program Management Division, NASA Headquarters, Washington, D.C.; Eugene Trinh, Director, Physical Sciences Research Division, NASA Headquarters, Washington, D.C.
From page 113...
... The National Science Foundation (NSF) is an independent agency of the US government, established by the NSF Organic Act of 1950 to "promote the progress of science; to advance the national health, prosperity and welfare; and to secure the national defense." NSF is governed by the National Science Board (24 part-time members)
From page 114...
... The format relies on a mixture of quantitative and qualitative measures, relying primarily on expert review at the project and program levels.
From page 115...
... That is one reason, as several participants pointed out, why NSF's "results" cannot be matched with budgetary line items. The core of the research enterprise is the individual research project; most of them are university-based.
From page 116...
... At the "grass roots" level, some 95% of individual research projects are approved and reviewed by independent expert reviewers (a small number are initiated internally by the director or others)
From page 117...
... In a broad sense, NSF relies on multiple criteria in evaluating programs. These include peer (expert)
From page 118...
... COSEPUP also addressed the issue of whether to rely more heavily on qualitative or quantitative means to evaluate research. It suggested that basic research can best be evaluated by expert review, which makes use of quantitative measures wherever appropriate.
From page 119...
... The agency-wide evaluation is performed on a different scale from a single-project evaluation, but by the same principles. Hence, NSF's GPRA performance plan for FY2001 includes the statements that "even the best proposals are guesses about the future," "true research impact is always judged in retrospect," and "the greatest impacts are often 'non-linear' or unexpected." In contrast, some of NSF's projects are easily quantifiable and are evaluated on that basis.
From page 120...
... Similarly, the agency is about to begin work on the performance plan for 2002, but it does not yet have the report for 2000 to know where it should be making changes for 2002. An NSF representative said, "In addition, it may take a year or two to put a solicitation out, receive proposals, evaluate them, and make awards.
From page 121...
... One consequence of complying with the GPRA performance plan has been to simplify NSF goals. Five broad agency goals have been whittled down to three:
From page 122...
... Before, the agency had a "visionary plan," but it was not very well connected to implementation. Now, NSF has to connect that strategic plan all the way down to program level and assessment.
From page 123...
... engineering, University of Virginia, Charlottesville, Virginia; Genevieve Knezo, Specialist, Science and Technology Policy, Congressional Research Service, Library of Congress, Washington, D.C.; Robin Camaro, Assistant Director, US General Accounting Office, Washington, D.C.
From page 124...
... Simpson, The IRIS Consortium, Washington, D.C.; Robert Eisenstein, Assistant Director for Math and Physical Sciences, The National Science Foundation, Arlington, Virginia; Paul
From page 125...
... Large laboratories and user facilities (26 major facilities and 12 collaborative research centers) receive over 30% of the office's budget; smaller portions go to major construction projects (currently featuring the Spallation Neutron Source and a high-energy physics project to study neutrino oscillations)
From page 126...
... Planners knew that a new planning model would have to be flexible because each new administration formulates different policies. But GPRA requires uniformity and a clear linkage between the performance plan and the performance report.
From page 127...
... Below the level of programs (and of the GPRA performance reports), much of DOE's research budget is allocated to support individual, investigator-driven research projects in universities.
From page 128...
... For the university grant program, virtually all individual projects are evaluated by regular peer review under the Office of Science's Merit Review System guidelines. This external process conforms to standard independent peer-review procedures.
From page 129...
... For each of the five SC programs, the evaluation procedure also includes advisory committees. For example, the 26-member Basic Energy Sciences Advisory Committee (BESAC)
From page 130...
... Some excellent research may not be relevant to the agency's mission, and some relevant research may not be of excellent quality. 1.4.2 ISO measures.
From page 131...
... Missions need flexibility in choosing research directions because peer review by itself is inherently conservative.
From page 132...
... ? In its report to Congress on the usefulness of agency performance plans, GAO noted that SC's FY2000 plan was "moderately improved" over the FY1999 plan but still bore little relationship to budgeting.
From page 133...
... GAO randomly sampled 100 BES research projects and concluded that the agency was performing merit review properly and following the established procedures.
From page 134...
... The agency noted that its reviews do have results: a poor review of the construction of the Spallation Neutron Source had resulted in substantial changes in senior management.
From page 135...
... Program Examiner, Office of Management and Budget, Washington, D.C.; Genevieve Knezo, Specialist, Science and Technology Policy, Congressional Research Service, Library of Congress, Washington, D.C.
From page 136...
... Agency Representatives: Patricia Dehmer, Associate Director, Office of Basic Energy Sciences, US Department of Energy, Germantown, Maryland; William J. Valdez, Director of the Office of Planning and Analysis, US Department of Energy, Washington, D.C.

