Suggested Citation:"4 Administering, Monitoring, and Assessing the Program." National Research Council. 2009. Evaluation of NSF's Program of Grants for Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE). Washington, DC: The National Academies Press. doi: 10.17226/12716.

4
Administering, Monitoring, and Assessing the Program

This chapter evaluates the Grants for Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE) proposal process. Beginning with the program’s 1998 request for proposals (RFP), there were two noteworthy requirements: proposals should include a program evaluation component, and they should contain a data appendix:

Performance assessment. Each proposal should describe a performance evaluation plan that includes goals, objectives, indicators, and specific measurements for assessing the progress toward the achievement of the goals. This plan will form the basis of the required annual progress reports as well as an in depth review to be conducted by NSF during the third year. Examples of indicators that may be useful are shortening time-to-degree, broadening career opportunities, assessment of the postdoctoral fellows’ and graduate trainees’ performance, impact of the research experience on the career plans of undergraduates, placement of graduate students and postdoctoral fellows upon completion of the program, and the participation of women and members of underrepresented groups.


Each proposal should include an appendix (Appendix 1) indicating (a) the number of baccalaureate degrees in the mathematical sciences in the past five years, (b) the number of full-time graduate students for each of the previous five years, (c) the PhD recipients during the past five years, their placements, and thesis advisors, (d) the names of postdoctoral fellows (e.g. holders of named instructorships) during the past five years and their mentors and placements, (e) the dollar amount of non-teaching support of graduate students supplied by the university for each of the previous five years, and (f) the anticipated size of the graduate program should this award be received. This information will provide baseline data to be used in subsequent performance assessments.1

According to the 1999 RFP, the requested data appendix was to include a new element, the amount of funding by federal agencies for graduate students and for postdoctorates in each of the previous 5 years. The 2000 and 2001 RFPs retained this language.

1. From the first program solicitation: “Grants for Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE),” NSF 97-155, available at http://www.nsf.gov/pubs/1997/nsf97155/nsf97155.htm. Accessed June 12, 2009.

In 2002, a number of changes to the solicitation language were adopted. First, the indicators suggested for the performance assessment were dropped. However, the RFP called for an additional component, a “Post-VIGRE plan.” The RFP noted: “The VIGRE program is intended to help stimulate and implement permanent positive changes in education and training within the mathematical sciences in the U.S. Thus it is critical that a VIGRE site adequately plan how to continue the pursuit of VIGRE goals when funding terminates.”2 In 2003, similar language was used. It was noted in the section on the data appendix that “[e]xisting VIGRE institutions should also include data for five years prior to the beginning of their existing award,”3 so that the data history for those departments would be longer than what was required of departments not already holding a VIGRE award.

The 2004 RFP made that last point more emphatic and also added a dissemination requirement:

Dissemination. The VIGRE program is intended to have a positive impact at the national level on the mathematical sciences community. Broad dissemination of VIGRE site activities, experiences, and insights is critical to achieve this. Each proposal must include a plan for this dissemination. It is important to disseminate both successful activities as well as information on less successful activities and mid-course corrections.4


Results from Prior Support. Existing VIGRE departments should include a summary of what has been accomplished with a previous VIGRE award. This should include information on career paths of VIGRE-supported graduate students and postdocs.5

The most current proposal solicitations state these information requirements as follows:

Performance Assessment Plan. Each proposal should describe a plan to assess the progress towards the achievement of the EMSW21 [Enhancing the Mathematical Sciences Workforce in the 21st Century] goals. This plan should describe the quantitative and qualitative information that will be used to monitor the EMSW21 activities and determine necessary mid-course corrections. The performance assessment of a VIGRE proposal will form part of the basis for the comprehensive third year review that will be conducted by NSF of the VIGRE sites.


Dissemination. The EMSW21 program is intended to have a positive impact at the national level on the mathematical sciences community. Broad dissemination of EMSW21 site activities, experiences, and insights is critical to achieve this. Each proposal must include a plan for this dissemination. It is important to disseminate both successful activities as well as information on less successful activities and mid-course corrections. A minimum form of dissemination is a web page devoted to EMSW21 describing its activities. The department’s web page should contain an easily seen link to its EMSW21 page.


Post-EMSW21 plan. The EMSW21 program is intended to help stimulate and implement permanent positive changes in education and training within the mathematical sciences in the U.S. Thus, it is critical that an EMSW21 site adequately plan how to continue the pursuit of EMSW21 goals when funding terminates….

2. From the program solicitation: “Grants for Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE),” NSF 02-120, available at http://www.nsf.gov/pubs/2002/nsf02120/nsf02120.pdf. Accessed June 29, 2009.

3. From the program solicitation: “Enhancing the Mathematical Sciences Workforce in the 21st Century (EMSW21),” NSF 03-575, available at http://www.nsf.gov/pubs/2003/nsf03575/nsf03575.pdf. Accessed June 29, 2009.

4. From the program solicitation: “Enhancing the Mathematical Sciences Workforce in the 21st Century (EMSW21),” NSF 04-600, available at http://www.nsf.gov/pubs/2004/nsf04600/nsf04600.pdf. Accessed June 29, 2009.

5. Ibid.

Outcome of curriculum review. Describe the nature of the curriculum review and any planned or implemented changes based on it.


Results from Prior Support. Existing VIGRE departments should include a summary of what has been accomplished with a previous VIGRE award. This should include information on career paths of VIGRE-supported graduate students and postdocs.


Trainee Data. All EMSW21 proposals must supply the following data unless such data is irrelevant to the proposed activities: (a) a list of Ph.D. recipients during the past five years (ten years for those seeking a second VIGRE award), along with each individual’s citizenship status, baccalaureate institution, time-to-degree, post-Ph.D. placement, and thesis advisor; (b) the names of postdoctoral associates (including holders of named instructorships and 2- or 3-year terminal assistant professors) during the past five years (ten years for those seeking a second VIGRE award), their Ph.D. institutions, postdoctoral mentors, and post-appointment placements; (c) the dollar amount of funding by federal agencies for REUs, for graduate students, and for postdoctoral associates in each of the past five years (ten years for those seeking a second VIGRE award).6

VIGRE proposals must also include the data listed in Box 1-1 of Chapter 1 in this report for each of the previous 5 academic years, or for each of the previous 10 academic years if the department is applying for a renewal grant.

The committee draws the following four conclusions:

  1. Producing a proposal for a VIGRE program grant involves a substantial amount of work, and the requirements for preparing one are fairly onerous. While some departments have implemented what they consider to be positive change as a result of the application process,7 spending the time and energy to produce a proposal is of limited value if the department does not receive an award.

  2. The performance assessment requirement is problematic. The Division of Mathematical Sciences (DMS) at the National Science Foundation (NSF) never established a formal, consistent evaluation paradigm for the VIGRE program, one that could guide an analysis of how the contents and demands of the program are linked to the program’s long-term goals. Besides not making those linkages explicit, NSF did not identify even a short list of basic indicators or measures that would reflect progress toward the goals, for which it might then have requested baseline data. As for data collection, DMS did not think through, early in the life of the VIGRE program, what minimal core of data would be needed to determine, at least in very general terms, whether or not the VIGRE program was a success. A small but specific and carefully targeted set of data would not have imposed an onerous responsibility on grant recipients. However, leaving it to individual applicants and grantees to identify measures for assessing performance ensured that there would be no common template against which to compare outcomes and to relate those outcomes to the goals of the program.

6. From the program solicitation: “Enhancing the Mathematical Sciences Workforce in the 21st Century (EMSW21),” NSF 05-595, available at http://www.nsf.gov/pubs/2005/nsf05595/nsf05595.pdf. Accessed June 29, 2009.

7. For example, in its survey (see Appendix C), the committee asked departments that had applied for, but not received, VIGRE grants whether the process of applying for a VIGRE award led to any changes in the department. The response from 19 of 45 respondents was “yes.” Positive effects included, for example: “A new graduate course was initiated, inspired by discussions during the grant proposal preparation”; “Along with our process of program reviews, it motivated changes in the undergraduate curriculum—a capstone course and gateway courses—and in the graduate program, firming up required core courses”; “We launched a curriculum review and revision of our graduate program.”
  3. The dissemination component of the proposals raises two concerns. Most important is that there does not seem to be a clear plan for VIGRE awardees to share their successes with anyone other than NSF. Given that NSF does not have the budget to assist all mathematics, applied mathematics, and statistics departments, it would seem critical for NSF to leverage its resources by using VIGRE awardees as testbeds for piloting new and innovative solutions to the problems facing higher education and to the workforce issues facing the mathematical sciences, and then to seek ways for other departments to copy or learn from those solutions. There does not seem to be a mechanism in place to make this happen. A second concern is that information about the VIGRE awardees is essentially contained within NSF and is not sufficiently public. The committee noted with frustration the lack of a central database of information on departments that had received VIGRE awards, as well as the lack of Web-based information provided by some of the VIGRE awardees. It was difficult, for example, to find annual or final reports on VIGRE awardee Web sites.

  4. The sustainability of the VIGRE program at VIGRE sites is problematic. An infusion of NSF funding cannot and was never meant to last indefinitely. When awardees take on expensive new initiatives, such as expanding the number of postdoctorals supported by the department, it is not at all clear that such efforts can last beyond the VIGRE grant. In fact, in the committee’s request for information from VIGRE awardees, it became clear that a number of efforts would not remain in place “post VIGRE.” In its survey, the committee asked those whose awards had ended if they planned to continue all the VIGRE-funded activities. About 40 percent said no. Following are comments from these departments:

    • “Anything that costs money for which we haven’t been able to find alternate funding [will go away]: we’ve cancelled undergraduate stipends for participation in working groups, undergraduate stipends for participation in our summer REU-like program, new graduate traineeships (fellowships and relief-from-teaching quarters) for graduate students; and about half of our postdoctoral positions have been downgraded to high-teaching-load lectureships. One reason for these rather drastic cutbacks has been our decision to keep our commitments to current VIGRE postdocs and VIGRE graduate trainees.”

    • “Obviously, our graduate and postdoctoral fellowships will not be maintained.”

    • “We had to drastically decrease the number of admitted graduate students. This had a profound effect on our graduate program.”8

This is very disconcerting, because for maximum effect, the grant money provided by the VIGRE program should serve as seed money.

8. Responses to the survey by the Committee to Evaluate the NSF’s Vertically Integrated Grants for Research and Education (VIGRE) Program; for information see Appendix C.

PROPOSAL AND AWARD REVIEW PROCESS

The NSF review of VIGRE proposals consists of two elements: the panel review conducted by DMS prior to the award (which the committee considered to be outside the scope of its charge) and the two site visits, one prior to making an award and one during the 3rd-year review of awardees. The deliberations of the proposal review panels are observed by DMS staff, who then recommend which proposals should advance to the next stage of consideration, the pre-award site visit. A site-visit team consists of two or more DMS VIGRE program directors and one mathematical scientist from outside NSF. Site visits take place over the course of 1 day. During the site visit, the team meets with the principal investigator(s), undergraduate students, graduate students, postdoctoral fellows, faculty, and relevant administrators.

In advance of a pre-award site visit, DMS sends the department to be visited a set of basic questions to be addressed, as shown in Box 4-1. Sometime during the evolution of the VIGRE program, DMS began to include additional, site-specific questions.

BOX 4-1

Guidance from the National Science Foundation to Departments Preparing for a Pre-Award Site Visit

The following questions are quoted from the standard letter that the National Science Foundation (NSF) sends to mathematical sciences departments that have submitted VIGRE grant proposals and have been recommended, after a proposal review panel’s deliberations, for a pre-award site visit.

  1. Elaborate on your plans to recruit U.S. students and postdocs to careers in the mathematical sciences, including individuals who might otherwise choose other careers. Specify any special attention to be given to recruitment of people from groups underrepresented in the mathematical sciences.

  2. For each of the past ten entering cohorts … of graduate students in your program, provide longitudinal retention data by filling in the attached Excel spreadsheet. The matrix to be filled in is on the “Template” tab, with directions on the “Instructions” tab. If some additional clarification of the data would be useful, include a brief written statement. We are doing this because questions about retention and tracking students through graduate programs are being examined more closely, and it is best if we have such data in a consistent form. We hope that the information that you have already assembled for the proposal submission will make this task relatively straightforward.

  3. Elaborate on your plans for mentoring:

    a. undergraduate students;

    b. graduate students;

    c. postdocs.

  4. Elaborate on planned activities to help graduate students and postdoctoral researchers improve their instructional and communication skills.

  5. Elaborate on plans for broadening the education of students.

  6. Provide detailed evidence of faculty plans for participation in VIGRE activities.

    a. What percentage of the faculty has agreed to participate in the research groups?

    b. What percentage of the faculty has agreed to participate in the VIGRE project in some mentoring capacity?

  7. Discuss your plans for dissemination of the results of the VIGRE project, in terms of national impact on best practices for training the mathematical sciences workforce.

  8. What will you be able to accomplish with a VIGRE award that would not be possible without an award?

  9. To what extent will the accomplishments be sustainable when VIGRE funding ceases?

  10. Report significant changes, if any, relevant to the VIGRE project since submission of the proposal.


During the 3rd year of a VIGRE award, the awardee is visited by NSF to determine the awardee’s eligibility for continuation of the award for the 4th and 5th years. Again, awardees are requested to answer a set of questions as well as to provide data to NSF, as detailed in Chapter 1.

As indicated in Chapter 1 (see the subsection entitled “Information Collected by the Committee”), to understand the adequacy of the site visits, the committee held a conference call with seven mathematical scientists (excluding NSF staff members) who had participated in site visits; it also sent an e-mail to all non-NSF site-visit team members. That e-mail asked team members what component of the site visit they found most/least helpful for the purpose of evaluating a proposal or awardee and whether they had any suggestions for how NSF could improve the value of the site visits. Finally, National Research Council (NRC) staff working on this study looked at both pre-award site-visit reports and at the 3rd-year site-visit reports and recorded their impressions of the structure of the reports.

In talking with the reviewers from outside NSF, the committee heard the following messages:

  • Site visits were well planned.

  • The duration of the site visits was about right.

  • NSF provided appropriate guidance to the site visitors as to how to conduct the visit, and site visitors thought that NSF program managers participated at the appropriate level during the site visits.

  • Site-visit teams met with all the appropriate groups at the institution being evaluated.

    • Some team members, however, seemed to think that they were meeting mostly with people who like the program and so missed hearing some negative comments; and

    • Some team members commented that they would like to interact also with some people (presumably faculty or postdoctoral fellows) who are not involved in VIGRE activities, in order to calibrate their impressions of the VIGRE program.

  • The information collected during site visits is appropriate to perform the review, and there is no need to collect additional information.

  • Concerns expressed by site visitors included the following:

    • Assembling data in support of a site visit may impose a burden on departments. Data gathering was also a burden for the site-visit team, whose members sometimes were overwhelmed by the amount of information provided. Site visitors would like to see the assembly of the data needed for an efficient evaluation streamlined.

    • Site visitors are not anonymous, and being the only reviewers in the NSF proposal review process who are not anonymous can be a bit awkward for them.

    • It is assumed that the institution’s dean will be a cheerleader for the proposal, so not everyone believed it to be useful for the site-visit team to meet with the dean.

    • Some, but not all, would have preferred the team to have two non-NSF members to complement the two NSF program officers.

  • Site visitors thought that NSF appropriately included input from them in the site-visit report.

  • Site visitors noted that lunch with students was a particularly worthwhile component of the visit, as was talking with students and postdoctorals, comparing trends in the department, and seeing interactions among students at different levels.

The committee also asked for comments from site visitors in an e-mail request. The comments received in response should be taken as illustrative, as they represent only those who responded—that is, only a fraction of all those who participated in site visits. For those who had participated in a pre-award site visit, comments relating to the most helpful elements of the site visit included these:

  • “The most important part of the proposal was the data provided: number of majors, number of PhDs and their placement, postdocs and their placement, years to degree for PhD students. During the visit, meeting the students and the postdocs was the most useful part.”

  • “Meetings with the faculty, postdocs, and graduate students; data on the recruitment and retention of students. All information was useful for the overview of the success or problems of the previous VIGRE award.”

  • “Two NSF program officers talked with me about questions regarding the VIGRE program before we went on the site visits. One stressed the importance of mentoring the postdocs and graduate students. I think we reviewed two departments together. It was important for me to be told what VIGRE means for the DMS program, i.e., what the program officers envision. Each person wants to see something else stressed, and it’s important to have that discussed, say at a casual meal, before the visits.”

  • “Original proposal, preliminary panel evaluations, departmental responses to issues raised in the preliminary evaluation. I thought these were all essential to have and were helpful. During the site visit itself, meetings with current undergraduate students, graduate students, and postdocs were all especially helpful in understanding the reality of education and training at the institution.”9

Comments from the e-mail respondents on the least helpful elements of the site visits included the following:

  • “Speaking to various deans. You can predict what they will say.”

  • “The VIGRE program description at the time, online through nsf.gov, was so general that I wasn't sure what it was asking for, in terms of DMS.”10

In terms of possible improvements to the process, it is clear that site visitors who responded to the committee’s e-mail were quite positive about the experience. The major issue raised by respondents focused on the amount of work to be done in the time allotted:

  • “The visits are rushed, with the need to produce a document on-site (at least a first draft). Having said this, the reality is that it is not feasible to spend more time on a visit.”

  • “I am not sure that it could be improved, in that the program officers had specifically asked for a certain schedule, and that schedule was pretty tight. I think it was a fair way to compare different departments by asking for the same schedule from each place.”11

For those respondents who had been on 3rd-year site visits, the most helpful elements included the following:

  • “I found the interviews with the key personnel to be quite important, as was a careful analysis of how the funds were spent. Information on the recruiting pool and the status of the trainees was useful, but in an intermediate review there is little to report. What is hardest to judge, but essential, I think, is what exceptional experiences are the trainees getting and what is being institutionalized as a result of the program. Again, this is hard to judge in a 3rd-year review. I also found the interviews with the trainees to be important to understand how committed they were and what their true role was. It is easy to make things look good on paper.”

9. Responses to the e-mail request to site visitors from the committee.

10. Ibid.

11. Ibid.

  • “Talking to the managers and participants in the program. Having the original proposal and communications between the program officers and PIs [principal investigators].”12

Only one respondent mentioned a “least-helpful” element, which reinforces the impression noted above that most respondents were quite satisfied with the process:

  • “In general, detailed CVs and course syllabi are of limited value unless they are included to make a point. To the extent that institutional support is provided, a leveling of how things are reported would be helpful.”13

Suggestions of respondents regarding improving the 3rd-year site process included the following:

  • “The review teams need to provide some level of anonymity. I would also recommend that the site visit rules be established generally so that, for example, the PI isn’t in all meetings. It would probably also be appropriate to have a pre-meeting of the site visit team where pre-specified topics are discussed prior to the site visit. Included in this could be NSF-mandated ground rules, site specific concerns and key questions that NSF wants answered.”

  • “Assuming the NSF staffers act as observers, the outsider participation needs to be increased and the NSF staffers need to talk less and listen more.”

  • “It would be worth considering the option of allowing the site visit team to send questions in areas where clarity is required to the institution prior to the visit so that any appropriate information can be accumulated.”

  • “Making sure that as many participants in the project as possible are available for interviews, unfettered.”14

Finally, the NRC staff analyzed a number of site-visit reports. The pre-award site-visit reports from 1999 through 2006 were of widely varying quality: there was little consistency of detail, topic, or format, either within a single year or across years. The structure of a report appears to have depended a great deal on the composition of the visiting committee and, in particular, on who chaired the site-visit team. Many of the reports were merely a brief, or even bulleted, version of the university’s proposal—often without comment—whereas others contained very detailed evaluations. Many of the site-visit reports do not appear to contain a recommendation for action, or the recommended action is hard to interpret (for instance, “The award should be made if there are sufficient funds.”).

By contrast, as noted previously, the 3rd-year site-visit reports are very uniform; they are much more detailed and contain well-reasoned recommendations.

12. Ibid.

13. Ibid.

14. Ibid.

STRUCTURE OF ANNUAL AND FINAL REPORTS

The award letter that accompanies a VIGRE grant contains a requirement to provide data to the National Science Foundation. Originally, the following list of indicators was identified, and the 1998 award letter added that the awardee is “free to identify additional indicators that it deems appropriate to the process”:

  • The number of students taking each of the “new” graduate and undergraduate courses;

  • Summary of the course evaluations;

  • A list of the VIGRE participants (including both students and mentors) and the topics for that year’s research experiences for undergraduates;

  • Time to PhD of any graduate student who graduated that year and was supported by VIGRE funds;

  • The name and baccalaureate institution of each graduate student supported by VIGRE funds that year, the cumulative amount of funding, and the graduate’s mentor;

  • A list of the postdoctoral fellows supported by VIGRE during that year, the mentor of each, and their PhD institutions;

  • A list of publications emanating from the activities;

  • The next position of each undergraduate, graduate student, and postdoctoral fellow supported under VIGRE and leaving the institution that year;

  • A list of significant VIGRE-related presentations that year by VIGRE-supported undergraduates and graduate students;

  • A description of the outreach activities; and

  • The number of women and members of underrepresented groups involved in each aspect (undergraduate, graduate, postdoctoral) of the VIGRE program during that year.

Midway through the VIGRE program’s history, however, data collection became much less prescriptive. By 2000, the award letters contained the following description of the data needed in each annual report:

  • “The previous institution and the placement institution for each recipient of a VIGRE stipend during the past year.”

  • “A list of the faculty who participated in the VIGRE program during the past year, and their roles in the project.”

The committee notes that the information collected by NSF during the application process differs from the information collected during the reporting phase, and it is not clear how helpful some of the data in the annual reports are. Tracking student placement, for example, might not be a good indicator, because students in programs with VIGRE awards are often at top institutions and so would likely be well placed regardless of whether or not the department had a VIGRE program. Also, without knowing students’ preferences, it is difficult to draw inferences from their subsequent placement. It might be more instructive to see whether students in VIGRE programs considered a broader range of career options than other students did.


