
Evaluation of the Sea Grant Program Review Process (2006)

Chapter: 4 Program Oversight and Management


4 Program Oversight and Management

In the United States and around the world, research programs are funded through a variety of mechanisms and use different project selection and program evaluation approaches to maintain quality of performance. The trend across research programs worldwide, however, is toward more competitive project funding and stronger retrospective evaluation processes, with stronger links to resource allocation. In keeping with this trend, and in compliance with congressional directive, National Sea Grant College Program (NSGCP) funds are distributed through (1) centrally managed competitive awards to investigators; (2) awards based on historical factors of individual Sea Grant programs, subsequently distributed as competitive awards to investigators; and (3) competitive bonus awards to individual Sea Grant programs based on their relative ranking in the program review process.

Although the program oversight, structure, and management processes followed by the NSGCP are somewhat unusual, the program reviews conducted by other federal, state, and private grant programs share both similarities and dissimilarities with Sea Grant's review process. This chapter discusses those similarities and dissimilarities, especially with regard to the six main elements of administration used by the Sea Grant program (introduced in Chapter 2): (1) annual reports prepared by the individual Sea Grant programs; (2) sporadic interactions with National Sea Grant Office (NSGO) administrators and program officers; (3) periodic assessments by high-level external review teams (Program Assessment Teams [PATs]); (4) certification reviews for aspirant and deficient programs; (5) the development, approval, and implementation of strategic plans at the national and individual levels; and (6) annual allocation of federal funds (base, merit, bonus, national initiative, special projects).

INTRODUCTION TO PROGRAM MANAGEMENT

Program management and program oversight are interrelated processes that directly affect the success of a program in achieving its mission, goals, and objectives as described in the strategic plan. Effective program management requires acquisition of information about program performance, outcomes, and impacts through a combination of continuous and periodic processes. Program administrators use this and other information to evaluate progress, allocate resources, and make decisions that influence the direction and focus of the program in implementing the strategic plan. Effective program oversight includes both internal and external mechanisms operating at a variety of time scales. Internal monitoring and oversight occurs on continuous or short time scales (days, weeks, months) to drive short-term decision-making at the local program level, while external monitoring and reviews take place on longer time scales (semiannual, annual, quadrennial, etc.) to inform national program decisions and long-term local program decisions. A key element of effective management and oversight in any program, especially a dispersed national program such as Sea Grant, is a strategic plan that is integrated throughout the program structure, bringing cohesion to the effort without eliminating the focus on local challenges, opportunities, and networks.

MANAGEMENT AND OVERSIGHT OF RESEARCH AND OUTREACH PROGRAMS

The Sea Grant program is certainly unique within the National Oceanic and Atmospheric Administration (NOAA), and perhaps within the federal government as a whole, in its use of annual ratings and rankings to determine eligibility for and the size of bonus funding. However, the use of formal performance reviews of individual programs is common. Federally supported research, outreach, and education programs are funded through a variety of mechanisms, including broadly competed awards, formula-funded block grants, and funds budgeted to and expended within federal agencies. In addition, funds can be awarded for short or long periods to individual investigators, small teams, centers and institutes, or national laboratories. This section describes a handful of existing federal programs to show the diversity of program oversight and assessment processes.

Sea Grant is one of a handful of federal research programs that provide block funding, on either a fixed or formula basis, to universities for research, outreach, and education. The largest such federal program, and the conceptual model for Sea Grant, is the Cooperative State Research, Education, and Extension Service administered by the U.S. Department of Agriculture (USDA-CSREES), which uses formulas to allocate about $550 million annually to public land grant colleges and universities pursuant to several federal laws1 (see Box 4.1). While some of the USDA-CSREES formula-funded programs are of recent origin, new funding for research and outreach has been increasingly directed to competitive programs, and USDA-CSREES has come under increasing pressure to shift funding from formula programs to competitive programs.2 The initial FY06 budget proposal, for example, included a $250 million increase for National Research Initiative (NRI) competitive grants, funded in part by a $104 million reduction in formula funds.

Annual reporting in USDA-CSREES is handled through the Current Research Information System (CRIS). The annual CRIS report, distinct from the follow-up report described in Box 4.1, is required and is prepared by principal investigators. It includes descriptions of each research project, project outcomes, and impacts. The CRIS database can be queried to parse reports by geographic region, subject of investigation, individual investigator, and so on, and the annual reports are used by the State Experiment Station directors to oversee and assess the productivity of funded projects. The CSREES reviews have considerable flexibility in scope and timing (see Box 4.1) because the need for intercomparison among reviewed programs is less pressing and the programs are not ranked for the purpose of making funding decisions.

1 Hatch Act; Smith-Lever: 1862 Institution; Smith-Lever Act 3(d); Food and Agriculture Defense Initiative; Renewable Resources Extension Act; McIntire-Stennis Cooperative Forestry; Animal Health and Disease Formula; Aquaculture Centers; Evans-Allen 1890 Research Formula; 1890 Extension Formula; 1890 Facilities Grants; 1890 Institutions Teaching and Research Capacity Building Grants; Tribal Colleges Endowment Fund; Tribal Colleges Education Equity Grants; Extension Services at the 1994 Institutions; Tribal Colleges Research Grants; Hispanic Serving Institutions Education Grants; Resident Instruction for Insular Areas; Alaska-Native Serving and Native-Hawaiian Serving Institutions Education Grants; and Agriculture in the Classroom.

2 Some examples of comparable competitive programs: National Research Initiative; Sustainable Agriculture Research and Extension; Outreach and Assistance for Socially Disadvantaged Farmers and Ranchers; Organic Agriculture Research and Extension Initiative; Higher Education Challenge Grants; Secondary and Two-Year Postsecondary Agriculture Education Challenge Grants; Food and Agricultural Sciences National Needs Graduate and Postgraduate Fellowship Grants; Multicultural Scholars; International Science and Education Competitive Grants; Biotechnology Risk Assessment Research; Small Business Innovation Research; Community Food Projects; and Risk Management Education.

Box 4.1
Case Study: Program Review Processes in a Formula-Based Block-Funded Program: U.S. Department of Agriculture Cooperative State Research, Education, and Extension Service (USDA-CSREES)

The periodic review and evaluation process for CSREES programs is similar to the Sea Grant review process, but with significant differences. In both programs, an outside panel of experts is convened to review pertinent program documents, conduct a site visit to the program, and report findings and recommendations based on a set of predetermined review criteria. However, unlike the Sea Grant program reviews, which are scheduled in advance by the national office, a CSREES program review is requested by the local program administrator 18 to 24 months in advance, often in conjunction with a university review. The purposes of a CSREES review are specifically to "assess the benefits that the programs provide to the agricultural industry, rural communities/environment, and consumers, and in meeting other social goals stated in congressional authorizing legislation" (CSREES, 1999). While most reviews comprehensively address research, outreach, and instructional programs, some are designed around specific issues or programs.

Unlike a Sea Grant program review, in which the scope and timing are determined by the NSGO and expressed in the PAT Manual, the scope, purpose, and timing of a CSREES review are determined through consultations among the Experiment Station/Extension administrator, the program leader, and the CSREES team leader. Specific objectives for the review, a general timeframe for the site visit, and the size of the review team and specific areas of needed team expertise are all determined through these consultations, and all parties reach a verbal agreement.

The composition of CSREES review teams is similar to that of Sea Grant PATs, although the teams are selected through a different process. Review teams include four or five members, selected by the CSREES team leader based, in part, on nominations by the local program administrator and program leader. The team usually includes one member as an institutional representative (a department head or program leader from a related department) of the program being reviewed. Other team members are selected for their recognized knowledge and experience relevant to the review objectives and usually include department heads, program leaders, or Experiment Station/Extension administrators from peer institutions.

The review team approves the site visit agenda four to five months prior to the review. Site visits usually last two to three days, including time for deliberation and writing the report. Reviews may include visits to laboratory and other facilities, slide and video presentations, structured meetings with faculty and administrators, and unstructured interactions with faculty, students, and staff.

At the end of the site visit, the review team presents preliminary findings and recommendations to campus administrators and to the faculty. The review team has four weeks to submit a final written report to the CSREES administrator, who then writes and forwards a final report to the Experiment Station/Extension administrator within an additional two weeks. CSREES requests that the program leader/department head or Experiment Station/Extension administrator submit a follow-up report about one year after the review outlining actions taken in response to it. However, because this follow-up report is not required, it is rarely completed.

The Sea Grant Review Process Compared to Other Federal Programs

The stability of base funding for individual Sea Grant programs is like the funding stability enjoyed by government laboratories. In some government laboratories, such as the intramural laboratories of the National Institutes of Health (NIH) and the Agricultural Research Service (ARS), researchers are federal employees, so personnel evaluation procedures help to maintain quality. But such laboratories typically also have strong program review processes. NIH laboratories, for example, are reviewed by external teams every 4 years (M. Gershengorn, NIH, personal communication, 2005), and ARS laboratories every 5 years (www.ars.usda.gov). ARS funds activities through proposals that are externally reviewed and that must meet minimum quality requirements. The national laboratories funded by the Department of Energy and managed by contractors also review all projects before they are approved for funding.

The individual Sea Grant programs resemble federally funded centers in some additional respects. For example, NIH, the National Science Foundation (NSF), and other U.S. federal agencies provide multiyear awards to centers that perform multiple functions, usually including research, education, outreach, the provision of central infrastructure, and, in some cases, service (e.g., in NIH centers, translation into clinical practice). These awards are usually made for four or five years at a time. At approximately mid-award cycle, NSF centers receive a site visit, and performance is closely examined in relation to milestones specified in the original proposal. At both NSF and NIH, centers must submit applications for continuing funding every five years, and these requests are considered in competition with other projects proposed for funding. At NSF, most programs have sunset clauses that limit funding for individual centers to a maximum of ten years, or two five-year awards. Centers are expected to be self-sufficient after that time.

Implications of Review for Funding and Competition

The individual Sea Grant programs could be described as centers funded by the NSGCP. They differ from NIH and NSF centers in that their locations and base budgets were not openly competed, the base funding is not subject to regular recompetition, and they are not subject to a sunset clause unless their performance warrants decertification. Individual Sea Grant programs also differ from NIH and NSF centers in the extent to which nonfederal funds, especially state funds,3 are used for their support.

3 The average reported share of state funding across all the state programs from 1995 to 2003 is 35.2 percent. Because many individual programs report to the NSGO only those nonfederal funds needed to demonstrate the match to federal funds, this amount is likely a minimum.

The initial certification of an individual Sea Grant program does not involve a national request for proposals (RFP) and open competition among prospective institutions. Instead, the candidate program prepares a proposal that describes institutional and state support, program leadership and organization, an analysis of stakeholder needs, and the identification of priority research, outreach, and education programs. Programs that fail to demonstrate excellent potential are not certified. Further program evaluation may also occur in the course of annual program review if there is a persistent failure to address shortcomings identified in the periodic review or if there is a loss of critical personnel or institutional support. To date, no individual Sea Grant College program has been decertified as a result of poor performance.

The principal alternatives to block funding of research, education, and outreach programs are (1) the funding of in-house research by agency or organization staff and (2) project funding through peer review of competing proposals. Most research grant programs supported by NSF and NIH, and other federal research programs (such as the U.S. Environmental Protection Agency Science to Achieve Results [EPA-STAR] and USDA-NRI grant competitions), are characterized by a broadly distributed open public solicitation for proposals; peer or panel review of the proposals and of the credentials of the principal investigators; and administrative review of the highest-rated proposals and determination of funding levels. The grants may be awarded to individuals, teams, or research centers. NSF, NIH, and USDA-NRI typically fund 20 to 30 percent of the proposals submitted; the EPA-STAR program funds about 15 percent. For a concise description of the oversight and management processes of these and related federally funded research programs, see NRC (2001). Although Sea Grant sponsors some nationwide open competitions for research funding, most Sea Grant research funds are allocated as formula-based block grants to the individual Sea Grant programs, where funds are distributed to outreach and education programs, to program administration, and through competitive awards to investigators.

Allocation of Funds, Peer Review, Competition, and Awards to Meritorious Projects

In the case of individual Sea Grant programs and USDA-CSREES formula-funded programs, there is a second-stage allocation of funds at the state level, which usually relies on peer review of competing proposals. For example, individual Sea Grant programs hold biannual competitions for research funding. While the specific details of the competition vary across programs, all of the competitions include anonymous peer review of the scientific merit of the project and the qualifications of the principal investigator (PI) or investigators; many include review by a panel of stakeholders, who score the proposals for their relevance to critical regional needs; and all include a final technical review by a panel of peers supervised by the program director. Projects to be included in the program's omnibus proposal to the NSGCP are selected during the final panel review (a schematic sketch of such a competition follows this section).

In the case of USDA-CSREES, the formula funds allocated to state agriculture experiment stations include a second-stage allocation that typically involves peer evaluation of research proposals, but the peer evaluation is usually in-house (at the institution) and is designed to provide advice about improving projects rather than to screen and identify projects to be funded. Within the land grant system, USDA-CSREES formula funds have been built into the base funding for tenured and tenure-track faculty, so it is problematic for state experiment station directors to deny salary funding for projects, and less problematic for them to deny operating funds. Because operating funds are typically small relative to salary funds, the state experiment station directors have limited ability to promote strong projects or eliminate weak projects.

In contrast, because Sea Grant projects have not been captured into the base salaries of research faculty, individual Sea Grant program directors have more flexibility to shift funds to meritorious projects. Although the competition for Sea Grant research funds is not strictly a national competition, the competition at the individual Sea Grant program level is intense: less than 20 percent of proposals are funded, and many are funded at less than the requested level. Moreover, proposals are broadly solicited and often involve PIs and co-PIs who are not associated with the university or consortium that hosts the individual program. While there is within-program competition for research funds in the formula-funded USDA-CSREES programs and the individual Sea Grant programs, there has not been, but could be, a comparable within-program competition for outreach and education funds.
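The schematic sketch below makes the two-stage competition described above concrete: anonymous merit scores and stakeholder relevance scores are blended, proposals are ranked, and the list is funded until the research budget is exhausted. The proposal names, the 0-5 scales, the 60/40 weighting, and the greedy budget fill are all illustrative assumptions, not documented Sea Grant procedure.

from dataclasses import dataclass

@dataclass
class Proposal:
    title: str
    merit: float       # anonymous peer-review score for scientific merit, 0-5 (hypothetical scale)
    relevance: float   # stakeholder-panel score for regional relevance, 0-5 (hypothetical scale)
    requested: float   # requested funding, dollars

def select_for_omnibus(proposals, budget, merit_weight=0.6):
    """Rank proposals by a weighted blend of merit and relevance,
    then fund down the ranked list until the budget is exhausted."""
    ranked = sorted(
        proposals,
        key=lambda p: merit_weight * p.merit + (1 - merit_weight) * p.relevance,
        reverse=True,
    )
    funded, remaining = [], budget
    for p in ranked:
        if p.requested <= remaining:
            funded.append(p)
            remaining -= p.requested
    return funded

if __name__ == "__main__":
    pool = [
        Proposal("Estuarine nutrient cycling", 4.6, 3.9, 120_000),
        Proposal("Harbor dredging outreach", 3.8, 4.7, 80_000),
        Proposal("Aquaculture disease screening", 4.2, 4.4, 150_000),
    ]
    for p in select_for_omnibus(pool, budget=250_000):
        print(p.title)

In this example the highest-ranked proposal is funded first; the second-ranked proposal exceeds the remaining budget and is skipped, and the third is funded, mirroring how a final panel might assemble an omnibus proposal under a fixed allocation.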

STRATEGIC PLANNING AS A PROGRAM DEVELOPMENT AND EVALUATION FRAMEWORK

Strategic planning is a cornerstone of effective program management. A well-designed strategic plan reflects the goals and objectives that the program intends to accomplish within the planning horizon. In a disaggregated and regionally dispersed program, strategic planning could help integrate individual programs into a national whole while supporting regional and local differences in program emphasis. Weaknesses in the strategic planning process and the lack of effective integration of local and national strategic plans were recognized by Duce et al. (2002). The NSGO has responded to some of the recommendations of Duce et al. (2002) and to the increased emphasis that the federal Office of Management and Budget (OMB) and NOAA have placed on strategic planning (NSGO, 2004b). Similarly, some individual Sea Grant programs have developed strategic plans that reflect active collaboration with the NSGO as well as with their local constituents. However, other individual Sea Grant programs have been slow to develop strategic plans, or have strategic plans that are poorly designed, poorly integrated with the national strategic plan, or lacking in specificity for addressing local and regional needs.

As noted in NRC (1994) and Duce et al. (2002), information from the individual Sea Grant programs is extremely valuable in the development of national priorities and objectives. The individual Sea Grant programs have direct contact with researchers, educators, outreach specialists, and stakeholders in the marine community and are thus well positioned to identify emerging issues. While the goals and objectives expressed in each program's strategic plan can be expected to address issues of uniquely local importance, it is essential that they also be placed in the context of the NSGCP strategic plan, which is revised every four years to comply with the U.S. Code. Conversely, the formation of a cohesive, integrated national program with discernible regional goals that could be addressed through the combined efforts of individual Sea Grant programs would require that the strategic plan for each individual Sea Grant program include elements in common with the national plan as well as elements unique to the locale, including those that address needs identified by the states and other sources of financial support. Hence, there is a need for both top-down and bottom-up integration of strategic plans. While integration is important for overall program coordination and oversight, the NSGCP strategic plan should be more than a simple collation of the strategic plans developed by the individual Sea Grant programs, and the individual strategic plans should be more than simple subsets of the NSGCP strategic plan. Development of strategic plans for individual Sea Grant programs presents a prime opportunity to strengthen interactions with the NSGO and with regionally or thematically relevant sister programs.

An effective integrated strategic planning process could begin with the development of an appropriately ambitious draft strategic plan with input from key stakeholders, the university or consortium administration, and the NSGO. When formally approved by the National Director, the individual Sea Grant program's strategic plan represents a compact between the individual program and the network as a whole.

Approval by the National Director signifies that the program's strategic plan is sufficiently ambitious and attentive to local, regional, and national priorities that successful and timely accomplishment of the goals and objectives outlined in the plan can be expected to result in superior or outstanding ratings for the corresponding elements of the annual and periodic program reviews. In turn, when an approved strategic plan is in place, annual reports and periodic program reviews can be framed in the context of accomplishments relative to the goals and objectives outlined in the strategic plans in effect during the review period. Programs that achieve the identified goals should be assured of receiving superior or outstanding ratings.

Because the NSGCP is required to prepare a new strategic plan every four years, there are advantages to having the individual Sea Grant programs prepare or update their strategic plans on a coincident cycle. Harmonizing the periodicity of the strategic planning process with the periodicity of program review would allow the review process to look back at performance relative to the strategic plans in place during the review period. The program review process could also look forward to the strategic plan developed for future activities and could comment on the significance of the activities proposed and the availability of resources to support them.

ROLE OF THE NATIONAL SEA GRANT OFFICE

The role of the NSGO was examined recently by Duce et al. (2002). A comprehensive reexamination of the NSGO is outside the scope of this review, but an evaluation of the Sea Grant program review process cannot be entirely decoupled from consideration of the role of the NSGO in that review process. Effective program administration within a diverse and decentralized organization such as Sea Grant requires a clear and consistent process for providing the central organization with accurate and comparable information about the objectives and activities of the decentralized elements of the organization, and about their performance relative to those objectives. In addition, there must be a clear and consistent vehicle for conveying information about current and anticipated goals and objectives from the center of the organization back to the individual programs. The National Director, working through the NSGO, is responsible for ensuring that there are effective conduits for these top-down and bottom-up information flows. However, based on discussions with individual Sea Grant program directors and with NSGO administrators and program officers, it is evident that NSGO personnel have limited interaction with the individual Sea Grant program directors and that the top-down and bottom-up information conduits are less than fully effective.

NSF and many other federally funded research programs rely on program officers for ongoing communication between distributed programs and national program administrators. To be effective as the primary top-down and bottom-up information conduits, program officers must receive training in program evaluation and administration and must have backgrounds in the technical disciplines of the programs with which they interact. Whether or not the NSGO relies on program officers to serve as the primary information conduit, it is essential that some structure be in place to serve this function. Effective program evaluation depends on the degree to which assessment is normalized by the national office based on the objectives and program planning of individual programs rather than on some preconceived standard. Because of differences among individual programs (financial resources, talent pools with various specialties, issues, approaches, geographic and demographic characteristics) and their unique institutional environments, assessment should be tailored to take program variability into account. The NSGO program officer could be the link between the NSGCP and the individual program directors, providing the perspective needed to assess program effectiveness annually while considering institutional characteristics.

ANNUAL AND PERIODIC ASSESSMENT PROCESSES AS INTEGRAL ELEMENTS OF PROGRAM ADMINISTRATION

Periodic program assessment within the NSGCP is intended to serve two related purposes. The first, narrower purpose is to fulfill the congressional mandate to rank programs for the competitive award of merit and bonus funds. The second purpose is to identify areas for improvement in individual programs. These are related insofar as competition for funds serves as an incentive for the individual programs to improve, and the periodic program assessment process provides information that can be used to direct improvements where necessary. The periodic assessment process, as it has evolved, appears to be aimed disproportionately at the narrow goal of ranking programs and distributing competitive funds. Although this is understandable given the congressional mandate, an assessment process that is excessively geared toward ranking the programs may do a poor job of supporting other aspects of program improvement. For example, while the episodic interactions between the NSGO and the programs may be sufficient for ranking, they may not provide sufficiently timely information for directing program improvements. Similarly, the National Sea Grant Review Panel (NSGRP) has become overly concerned with the periodic assessment process. All this implies that simply tinkering with the PAT manual and eliminating discontinuities in the way competitive funds are distributed will not solve the problem.

For the program to improve, and in particular for it to become and be perceived as a truly national program, there is a need for more individual Sea Grant program involvement with the NSGO and for more NSGO involvement with the individual Sea Grant programs. At the same time, the role of the NSGRP in the assessment process needs to evolve. The extent to which the NSGRP has become involved in the details of the periodic assessments is a reflection of the overreliance on these assessments. The Director of the National Sea Grant College Program, in consultation with the National Sea Grant Review Panel, should work to establish an independent body to carry out the periodic assessments under the supervision of the National Sea Grant Review Panel. By removing itself from direct involvement in the individual program assessments, the NSGRP will be better positioned to comment on issues of broader significance to the overall program, including efforts by the NSGO and the individual Sea Grant directors to strengthen the partnership aspects of the NSGCP. The NSGRP should continue to monitor the process closely, but should be perceived as a neutral body whose sole function is to promote the effectiveness of the program as a whole.

The purpose of program oversight and management is to ensure that program managers are aware of the array of activities being undertaken and have a basis for program assessment, so that resources can be managed to improve the capacity and performance of program components. Strong program oversight and management systems blend ongoing and annual assessment of program activities and outcomes with periodic assessments that explore the long-term effectiveness of programs and consider the summation of accomplishments, outcomes, and impacts. In addition, a periodic assessment provides external validation of the annual program assessment. (Box 4.2 gives an example of another federal program with periodic assessments and external reviews, the Louis Stokes Alliances for Minority Participation [LSAMP] grant program, and illustrates a reverse review concept.)

Although the NSGCP has annual reporting requirements, ongoing interactions between the NSGO and the individual Sea Grant programs, and a periodic program assessment process, the information provided through the annual reports and the ongoing interactions could play a more prominent role in the annual assessment of programs and, specifically, could provide information for program oversight and management. Ongoing and annual assessments are essential for effective program management. To administer the program effectively, the NSGO must be aware of the activities and accomplishments of, and the opportunities and challenges faced by, the individual Sea Grant programs. The National Director cannot effectively convey information about these activities, accomplishments, opportunities, and challenges to NOAA, the Department of Commerce (DOC), or Congress unless the information is readily available.

As recommended in Chapter 3, performance metrics are needed that can be readily validated and that can assess the quality and significance of program activities, outcomes, and impacts. If the annual reports describe activities, outcomes, and impacts in terms of the same metrics that form the basis for the periodic PAT reviews, then the annual ranking of the individual Sea Grant programs could be based on a combination of information submitted in the annual reports, information available to the NSGO through other reporting requirements, and interactions between NSGO representatives and the individual Sea Grant programs, augmented by the PAT reports and the individual program directors' responses to them. Viewing the PAT report as only one, albeit important, source of information feeding into program assessment will lessen concerns about the asynchronicity of the periodic assessments. At the same time, more fully incorporating the annual reports and other ongoing information regarding program activities and progress (communicated via miscellaneous documentation, e-mails, phone conversations, general site visits by the program officer, and other program interactions) will ensure that the annual ranking is based on the most recent information about each individual program.

Box 4.2
Case Study: Program Review Processes in a Competitive Grant Program: Louis Stokes Alliances for Minority Participation

The objective of LSAMP, a program managed by NSF, is to increase the quality and quantity of undergraduate baccalaureate recipients in the natural sciences, mathematics, engineering, and technology, with a particular focus on students underrepresented in these areas. Each LSAMP project is administered through a five-year cooperative agreement between the academic institution and NSF. Although the awards of $2.5 to $5 million are for a full five-year cycle, the cooperative agreements are administered as five one-year contracts, with continuing funding contingent on achieving satisfactory progress as determined by a three-part annual evaluation.

The program review and evaluation process for LSAMP programs shares some commonalities with Sea Grant's evaluation process. In both, an outside panel of experts conducts a site visit and reports its findings and recommendations based on an assessment of the effectiveness of each activity in supporting program goals. However, unlike the Sea Grant program assessments, the site visits for LSAMP programs are conducted annually and represent only one part of the program review process. In addition to the annual site visit, the annual evaluation process for LSAMP programs includes an annual report and a reverse site visit (the LSAMP program team visits NSF). NSF reviews the results of all three components of the evaluation, in toto, before deciding whether the next year of funding should be awarded. This enables the agency to terminate projects that fail to achieve agreed-upon goals. NSF assigns a single program officer to the LSAMP program, whose sole responsibility is the administration of LSAMP projects. The program officer serves as chair of the annual site visit team.

The LSAMP site visits are short (one to two days) and focus on assessing the effectiveness of new outcomes that would not have occurred in the absence of the allocated resources. As in the Sea Grant process, the site visits follow standardized protocols and assess performance relative to well-defined criteria, including degree production and enrollment data, expenditures of NSF and nonfederal matching funds, and programmatic activities. Site visit teams focus on (1) pre-activity status, (2) post-activity results, and (3) net changes as they relate to enrollment, retention, and degree production goals. Unlike the Sea Grant PATs, the LSAMP site review teams do not produce a report that reifies performance into a single numeric rating; instead, the review team provides a report that discusses the strengths and weaknesses of the program.

The LSAMP annual reports, the second component of the annual review process, follow a template defined by NSF. While these reports provide flexibility for programs to report on a wide variety of activities, the template prescribes the inclusion of standardized metrics, the "Minimum Obligatory Set" (MOS), which includes key performance indicators. The MOS data are used to track program performance through time and to make interproject comparisons. Unlike Sea Grant, project rankings are not a part of the evaluation process.

At the reverse site visits, the third component of the annual evaluation process, each LSAMP program participates in a one-hour review at NSF headquarters with the program officer and other NSF administrators. These individual sessions consist of a brief presentation by the LSAMP program team followed by a discussion of the data reported in the program's annual report and the findings and recommendations of the site visit team. All reverse site visits are scheduled during a single week, so that all programs are evaluated within the same period of time.

SOURCE: NSF, 2003.
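The value of a minimum obligatory set is that every project reports the same core indicators on the same schedule, which makes longitudinal tracking and interproject comparison mechanical. The sketch below illustrates the idea with a hypothetical record type and a year-over-year comparison; the field names and figures are invented for illustration and are not the actual MOS indicators defined by NSF.

from dataclasses import dataclass

@dataclass
class AnnualMOS:
    project: str
    year: int
    enrollment: int          # hypothetical indicator: target-population STEM enrollment
    degrees_awarded: int     # hypothetical indicator: baccalaureate degrees awarded
    nonfederal_match: float  # hypothetical indicator: matching funds expended, dollars

def year_over_year(records, project, field):
    """Return (year, change) pairs for one indicator of one project."""
    series = sorted((r for r in records if r.project == project),
                    key=lambda r: r.year)
    return [(later.year, getattr(later, field) - getattr(earlier, field))
            for earlier, later in zip(series, series[1:])]

reports = [
    AnnualMOS("Alliance A", 2003, 410, 55, 90_000.0),
    AnnualMOS("Alliance A", 2004, 455, 63, 95_000.0),
]
print(year_over_year(reports, "Alliance A", "degrees_awarded"))  # [(2004, 8)]

Because every record shares one schema, the same comparison runs unchanged across projects, which is precisely what standardized reporting buys the agency.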

FINDINGS AND RECOMMENDATIONS REGARDING PROGRAM OVERSIGHT AND MANAGEMENT

As discussed in Chapter 3, the periodic assessment process, as currently carried out through PAT site visits and the NSGO Final Evaluation Review (FE), will require some modification to increase its reliability and credibility for the purposes of rating and ranking individual programs in a manner that will support the distribution of merit and bonus funding. The more important issue identified here, however, is the need to fit periodic assessment into a larger effort to continually improve and enhance the overall program as it strives to provide "an appropriately balanced response to local, regional, and national needs" (33 U.S.C. 1123).

Sea Grant Program Administration

Since the 1994 report, the NSGCP has significantly improved the structure of its management and oversight processes. The improvements include a stronger strategic planning process, decentralized and professional review of project proposals, and a robust program review process. Altogether, the NSGCP oversight processes include annual allocation of federal funds (base, merit, bonus, national initiative, special projects); periodic reviews of national and individual Sea Grant program processes and outcomes (PAT visits and reports, certification reviews for aspirant and deficient programs); regular monitoring of national and individual Sea Grant programs (annual reporting, interactions with program officers); and the development, approval, and implementation of strategic plans at the national and individual Sea Grant program levels. In practice, however, many elements of the NSGCP's program oversight system have atrophied, and program oversight has essentially been reduced to the PAT site visits and the ratings and report that derive from them. This overreliance on periodic review of outcomes and impacts fails to provide timely, ongoing feedback to the NSGO throughout the review cycle and diminishes the effectiveness of program oversight. The Director of the National Sea Grant College Program should ensure that program administration carried out by the National Sea Grant Office makes full and consistent use of annual reporting, frequent and meaningful interactions with individual Sea Grant programs by National Sea Grant Office program officers, and the development, approval, and implementation of strategic plans to monitor and assess the performance of the individual Sea Grant programs on an ongoing basis. Reverse site visits (see the LSAMP case study, Box 4.2) appear to be a viable mechanism for connecting individual Sea Grant program directors with program officers and NSGCP administrators, and would likely provide an opportunity for the National Director to evaluate the nature of the relationships between NSGO staff and the individual Sea Grant programs, as well as an opportunity for collective discussion of near-term planning and information exchange.

The intent of the reverse site visit suggested here is to ensure that the NSGO is responsive to its state and local partners; the reverse site visit should not be used as a substitute for NSGO program officer visits to individual Sea Grant programs.

Periodic program assessment is an important external check on the effectiveness of both the individual Sea Grant programs and the NSGO's ability to facilitate and coordinate their efforts. The Director of the National Sea Grant College Program, working with the National Sea Grant Review Panel, should redirect the focus of periodic external Program Assessment Team reviews toward identifying areas and mechanisms for improving the individual Sea Grant programs as well as the National Sea Grant Office's efforts to facilitate and coordinate program efforts. External periodic review can thus provide an independent snapshot of program performance in areas assessed annually by the NSGO. The Director of the National Sea Grant College Program, in consultation with the National Sea Grant Review Panel, should create a process for determining the underlying causes of disagreement in instances where a Program Assessment Team review appears to reach conclusions at odds with the most recent annual assessment provided by the National Sea Grant Office.

Role of the National Sea Grant Office

The NSGO does not currently play a sufficient role in ongoing program assistance, monitoring, communication, and assessment, nor does it maintain close ongoing working relationships with the individual Sea Grant programs. There were more interactions and better relationships between the NSGO and the individual Sea Grant programs prior to 1995. As noted in Duce et al. (2002), closer and more frequent interaction with the NSGO would help integrate individual Sea Grant programs into the national program. In order to administer the Sea Grant program effectively, the Director of the National Sea Grant College Program should take steps to ensure that sufficient qualified staff are available to interact with the individual Sea Grant programs, to ensure effective two-way communication, and to monitor and assess program performance on an ongoing basis.

Strategic Planning Process

Strategic planning is key to effective management and oversight of the individual Sea Grant programs. Strategic planning is not well integrated into the NSGCP despite the fact that strategic plans are a specific criterion in the program assessment process.

Sea Grant program strategic plans do not reflect active collaboration among the NSGO, the individual Sea Grant program, and the institutional representative. Many individual Sea Grant programs have strategic plans, but the quality varies widely. Although some programs submit their strategic plans to the NSGO, those plans are neither formally reviewed nor approved by the NSGO except as part of the PAT review. Each individual Sea Grant program, in collaboration with its local network and the National Sea Grant Office, should develop an appropriately ambitious, high-quality strategic plan that meets local and institutional needs while simultaneously reflecting the individual program's role in addressing the regional and national needs identified in the strategic plans of NOAA and the National Sea Grant College Program. The plan should include clearly articulated goals, tailored to the individual program, that can form the basis of annual and periodic performance evaluation. In other words, the benchmarks of performance in each area should be jointly developed by the NSGO and the individual Sea Grant program and incorporated into the strategic plan of each program through a process separate from either the annual or the periodic performance evaluation. Coordination between the individual Sea Grant program director and the NSGO on strategic planning can also provide the NSGO with feedback on local trends and shifts in local and regional perspectives, which could improve the content of future NSGCP strategic plans. The Director of the National Sea Grant College Program, in consultation with the National Sea Grant Review Panel, should formally review and approve each individual strategic plan. The approved strategic plan would then serve as the basis for annual and periodic evaluation of the performance of each program, with the accomplishment of objectives identified in the strategic plan constituting effective performance.

Increasing Reliability and Transparency of Annual and Periodic Assessment

Periodic assessment should be based on the same criteria as ongoing annual program assessment. Program attributes, activities, outcomes, and impacts that are sufficiently important to warrant annual or ongoing assessment are important enough to evaluate on a periodic basis. Review material prepared for the periodic review should be a compilation of the annual reports, bookended by material that demonstrates the extent to which the annual activities combine to form a cohesive, ongoing program of activity organized to accomplish the objectives of an appropriately ambitious set of strategic plans, and that demonstrates effective progress toward accomplishing the goals and objectives identified in those strategic plans.

Currently, the individual Sea Grant programs are ranked each year at the conclusion of the FE review. However, only one-quarter of the programs are actually rated in a given year (those that underwent a PAT review in the previous calendar year), so the rankings change only insofar as the ratings for that one-quarter of the programs changed. The rankings thus reflect ratings that are as much as three years out of date for one-quarter of the programs, and three-quarters of the ratings are at least one year out of date (the arithmetic is worked through in the sketch below). The frequency of periodic assessment (once every four years) and the number of programs reviewed in a given year (one-quarter) are therefore insufficient to support meaningful annual rankings of the programs as required by Congress. The Director of the National Sea Grant College Program, in consultation with the National Sea Grant Review Panel and the directors of the individual Sea Grant programs, should modify the NSGO Final Evaluation review process so that every individual Sea Grant program is rated and ranked each year. The rating (and subsequent ranking) should be based on an assessment of each program's progress for the reporting year, drawing on annual reports of activities, outcomes, and impacts in the context of the unique strategic plan approved for each program. This is referred to as "Annual Assessment" in the Summary and in Chapter 5, and it differs from the current FE process.
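The worked sketch below makes the staleness arithmetic concrete. It assumes a strict four-year rotation in which one-quarter of the programs is PAT-rated each year; the cohort labels and years are hypothetical.

# Under a four-year review cycle, the 2006 rankings rest on ratings of
# very different vintages: only one cohort carries a current rating.
current_year = 2006
last_rated = {
    "cohort A": 2006,  # rated this year
    "cohort B": 2005,
    "cohort C": 2004,
    "cohort D": 2003,  # rating is three years old
}

for cohort, year in last_rated.items():
    print(f"{cohort}: rating is {current_year - year} year(s) old")

# Output:
# cohort A: rating is 0 year(s) old
# cohort B: rating is 1 year(s) old
# cohort C: rating is 2 year(s) old
# cohort D: rating is 3 year(s) old
# Three of the four cohorts, i.e., three-quarters of the programs, are
# ranked on ratings at least one year old, and one-quarter on ratings
# three years old, which is the gap the Annual Assessment would close.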

Finally, as the functions of the annual and periodic assessments evolve, they will provide different and independent sources of information about the state of the program as a whole. This information should provide important insights about the status of the overall Sea Grant program to the Secretary of Commerce, the National Director, and potentially Congress. Thus, there is a need to synthesize and analyze the results of these assessments every four years, including a synthesis of the most recent periodic reviews of the individual programs and a systematic review of the NSGO. Developing such a "state of the program" report would seem to be an obvious role for the NSGRP. The Director of the National Sea Grant College Program, acting under the authority of the Secretary, should direct the National Sea Grant Review Panel to undertake a systematic review of the "state of the Sea Grant program" once every four years. The review should rely extensively on information collected during the annual and periodic assessments, augmented with a site visit to the National Sea Grant Office, and should focus on how the program is functioning as a whole. In addition to commenting on how the program is performing in terms of the various criteria used during the assessments, the "state of the program" report could identify needed changes in program administration, in the conduct of the assessment process, or in other areas deemed valuable by the Secretary of Commerce or the National Director. The ability of the NSGRP to be seen as a credible source of such insight and advice to all parties may require that the NSGRP redefine its role in carrying out some components of the assessment. For example, greater consideration could be given to changing the NSGRP's role to that of an observer, rather than an actual evaluator, during the periodic assessments.
