COMMITTEE’S CHARGE

In 2007, at the request of the National Science Foundation (NSF), the National Research Council (NRC) appointed an ad hoc committee to assess NSF’s program of Grants for Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE) (see Appendix A for biographies of the committee members). The committee was given the following tasks:

  1. Review the goals of the VIGRE program and evaluate how well the program is designed to address those goals;

  2. Evaluate past and current practices at NSF for steering and assessing the VIGRE program;

  3. Draw tentative conclusions about the program’s achievements based on the data collected to date;

  4. Evaluate NSF’s plans for future data-driven assessments and identify data collection priorities that will, over time, build understanding of how well the program is attaining its goals; and

  5. Offer recommendations for improvements to the program and NSF’s ongoing monitoring of it.

SCOPE AND APPROACH OF THE EVALUATION

The NRC’s Committee to Evaluate the NSF’s Vertically Integrated Grants for Research and Education (VIGRE) Program held its first meeting in June 2007. To carry out its charge, the committee began by asking which components of the VIGRE program could be evaluated. Program evaluation assesses the processes or outcomes of a program in order to determine whether improvements can be made. Evaluations are sometimes planned during the creation of a program; they may also be designed while the program is under way, or even retrospectively after the program has ended. These latter types of evaluation can be more difficult to conduct if the assessment’s data needs were not planned for in advance. Finally, evaluations can be longitudinal (for example, comparing an outcome such as the number of students pursuing a mathematics major before, during, and after a program’s lifetime) or cross-sectional (for example, comparing two departments, one that had a particular program, such as an undergraduate research experience, and one that did not).

To organize an answer to these questions, the committee sought to describe the VIGRE program, as captured in the diagram it developed, shown in Figure 1-1. As the topmost boxes of that figure illustrate, the state of the mathematical sciences, along with other factors (e.g., NSF-wide goals, the NSF budget), motivates the VIGRE program, and each year NSF’s Division of Mathematical Sciences (DMS) releases a call for proposals from PhD-granting departments in the mathematical sciences in the United States. Departments of applied mathematics, mathematics, and statistics may submit proposals. The proposals are then subjected to a review process at NSF, at the end of which some proposals are funded (their departments becoming the “VIGRE awardees”). The awardees carry out the plans developed in their proposals over the first 3 years of the award, submitting annual reports of their progress. In the third year, NSF conducts site visits to determine whether each individual award should be continued for 2 more years. If continuation is approved, the departments proceed with their programs and submit additional annual reports and a final report.

The diagram in Figure 1-1 shows several feedback mechanisms. Recall that the VIGRE program has evolved over time and that most of the processes shown in the diagram can be thought of as occurring annually. The actions of awardees are supposed to have a positive impact on the mathematical sciences, and, to the extent that they do, they might alter DMS’s goals for the program. Proposal submissions might also affect DMS directly: in response to the submissions themselves, DMS could change the submission process, and in response to the programs that individual departments propose to carry out, DMS could change the goals of the VIGRE program.


