in a supplemental questionnaire developed by the committee. It is important to note that both of these sources involved grantee self-reports, which could be susceptible to social desirability bias. Moreover, the APR is designed as a grant monitoring tool rather than as a source of information for a program evaluation. Because the information supplied on the APR and the questionnaire was not always sufficient to inform the quality ratings, additional methods are needed to ensure complete information for such reviews.

Reviewer Expertise

The committee was directed to assess the quality of four types of prespecified outputs. While the most common output type was publications, NIDRR grants produce a variety of other outputs, including tools and measures, technology devices and standards, and informational products. These outputs vary widely in their complexity and the investment needed to produce them. The criteria used by the committee to assess the quality of outputs were based on the cumulative literature reviewed and the committee members’ own research expertise in diverse areas of disability and rehabilitation research, medicine, and engineering, as well as their expertise in evaluation, economics, knowledge translation, and policy. However, the committee’s combined expertise did not include every possible content area in the broad field of disability and rehabilitation research.

Recommendation 6-4: If future evaluations of output quality are conducted, the process developed by the committee should be implemented with refinements to strengthen the design related to the diversity of outputs, timing of evaluations, sources of information, and reviewer expertise.

Improving Use of the Annual Performance Report

The APR data set provided to the committee by NIDRR at the outset of the evaluation was helpful in profiling the grants for sampling and in listing all of the grantees' projects and outputs. In addition, the narrative information provided in the reports was useful to the committee in compiling descriptions of the grants; however, the reports varied with respect to the quality of the information they contained.

Recommendation 6-5: NIDRR should consider revising its APR to better capture the information needed to routinely evaluate the quality and impacts of outputs, grants, or program mechanisms. It might consider efforts such as consolidating existing data elements

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.