

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




2
History of Sea Grant Program Review and Assessment

The National Sea Grant College Program's (NSGCP) processes for overseeing and evaluating the individual Sea Grant programs have evolved since the program's inception. One of the more notable changes in the process has been the increasing use of peer review and the administrative level at which it is carried out. Another notable change has been the emergence of external periodic assessment as a tool to support the distribution of merit and bonus funding, rather than simply to identify areas for program improvement. These changes have affected the make-up and role of the National Sea Grant Office (NSGO). Prior to 1994, the NSGO was organized around program officers and specialists assigned to monitor institutional programs and exercise general oversight over research, education, and outreach. The National Sea Grant Review Panel (NSGRP) had the responsibility of reviewing the NSGO and offering advice for the conduct of the NSGCP. The NSGO solicited omnibus proposals from each individual Sea Grant program. The omnibus proposals included project proposals for individual research, outreach, and education projects and associated management proposals for implementation of program activities for the upcoming funding cycle. Funding levels for the omnibus proposals were based on the peer reviews and NSGO evaluations. Individual Sea Grant program directors then operated within the limits of their omnibus award and nonfederal funding. Site visits were conducted every two years by an NSGO review team to evaluate the program management process. Although the individual Sea Grant programs were not assured "base" funding (i.e., a stable level of annual funding to support program activities), changes in response to reviews were relatively small; in practice, funding was fairly stable from year to year. Most individual Sea Grant programs conducted peer reviews to identify research project proposals to include in their omnibus proposals.

THE TRANSITION: 1994-1998

In 1993, the National Research Council (NRC) conducted a review of NSGO oversight and evaluation of the individual Sea Grant programs (NRC, 1994). The NRC review explored the roles of the NSGO, the individual Sea Grant program directors, and the NSGRP, and their respective responsibilities for program review and evaluation. The resulting report, A Review of the NOAA National Sea Grant College Program, was released in 1994 and recommended:

- The review process for research proposals should be decoupled from the NSGO evaluation of state programs prior to the 1995 reauthorization.
- Standard scientific and peer review procedures should be implemented for all state [individual] Sea Grant programs.
- The review process and all aspects of program implementation, including administration, should be streamlined prior to FY 1996.
- NSGO should evaluate the success of each state program on a four-year cycle, using, in part, retrospective information on recent achievements, based on measures for each of the three areas of research, education, and outreach.

NSGO began implementation of these recommendations in 1995. Program review was decoupled from the review of project proposals, and institutional program directors implemented a standardized peer review and selection process for project proposals submitted to their programs. Congress reauthorized the NSGCP in 1998 and codified many of the recommendations of the 1994 NRC report, particularly with regard to program evaluation (for more details see http://www.sga.seagrant.org).
The 1994 NRC report recommended that a certain level of core funding be provided to each individual Sea Grant program to support an ongoing program of research, education, and outreach as long as the program performed at an "expected level of performance." The NRC report also recommended that changes in overall program funding be linked to past performance, with new funds awarded to individual Sea Grant programs on a competitive basis determined by the program review and evaluation process.

To establish a process for program evaluation, the National Director tasked the Committee on Procedures and Operations with developing recommendations for the protocol, criteria, and scheduling of a process for reviewing the individual Sea Grant programs (Copeland et al., 1997). The list of recommendations included: (1) a four-year cycle of external program reviews (25 percent of the programs each year) and ongoing assessment of the program by NSGO throughout the four-year period; (2) that the evaluations be framed in the context of a well-developed strategic plan, agreed to by the individual Sea Grant program director and the NSGO program officer, and based on input by identifiable program advisors representing program constituents and institutional representatives (e.g., NSGO, 2004b); (3) that institutional implementation plans be developed on a two-year cycle; (4) that each individual Sea Grant program devise an internal review process to identify progress relative to strategic plan objectives; (5) that regular progress reports, written by the individual Sea Grant programs, be provided to the assigned NSGO program officer; and (6) that Topical Advisory Team (TAT) assessments be organized by an individual Sea Grant program director and NSGO program officer to address specific concerns that might arise during the review cycle. It is not clear how fully or uniformly these recommendations were implemented across the entire program. As emphasized in this report, strategic planning continues to be an area of concern with regard to program evaluation and the level of interaction between the NSGO and the individual programs.

PROGRAM REVIEW: 1998 AND BEYOND

Beginning in 1998, the NSGO implemented the quadrennial program review process recommended by Copeland et al. (1997). The first round of quadrennial reviews--Cycle 1--began in 1998 and was completed in 2001.[1] The second round of quadrennial reviews--Cycle 2--began in 2003 and will be completed by the end of 2006. While the basic framework of the quadrennial Program Assessment Team (PAT) reviews has been retained throughout Cycle 1 and Cycle 2, specific details of the program review process have been modified pursuant to internal and external reviews (e.g., Toll et al., 2001; Duce et al., 2002; Kudrna et al., 2005)[2] and congressional directives (33 U.S.C. 1121-1131; see Appendix H). The following timeline gives key events (Table 2.1).

[1] The number of Sea Grant programs evaluated in Cycle 1 was 29, and the number in Cycle 2 was 30. In Cycle 1, Maine and New Hampshire operated and were evaluated as a single bi-state program. Before the start of Cycle 2, the joint Maine/New Hampshire program split into two programs that are now evaluated separately. There are currently 30 Sea Grant programs located in all of the coastal and Great Lakes states except Pennsylvania and the U.S. Commonwealth of Puerto Rico, with 3 additional programs in development stages.

[2] The Kudrna et al. (2005) review was released during the NRC study (November 2005), too late to assess the NSGO response for inclusion in this report. The summary of the Kudrna et al. review is reprinted in Appendix J.

Table 2.1 Timeline of Key Events Cited in This Report

Original Review Process
  1990
  1991
  1992
  1993  NRC Study Begins
  1994  NRC Report 1
  1995  NSGO Response
  1996
  1997  NSGO Response; Copeland, Griswold, Fetterolf Report

Cycle 1 Reviews
  1998  National Sea Grant College Program Reauthorization Act of 1998 (P.L. 105-160)
  1999
  2000
  2001  NSGRP Review of Cycle 1: Review and Recommendations: Sea Grant Program Evaluation Process ("Toll Report")

Revisions to Review Process
  2002  NSGRP Report: Building Sea Grant: The Role of the National Sea Grant Office ("Duce Report"); NSGO Response; National Sea Grant College Program Act Amendments (P.L. 107-299)

Cycle 2 Reviews
  2003
  2004
  2005  NRC Study Begins; NSGRP Program Evaluation Committee Report: Review and Recommendations: Sea Grant Program Evaluation Process (Kudrna et al., 2005)
  2006  NRC Report 2

Following Cycle 1, the Toll Committee (see Box 2.1), named by the NSGRP, evaluated the procedures and made recommendations for modifications to address a variety of issues raised during Cycle 1. The NSGO subsequently made numerous changes in the details of the review and evaluation process (NSGO, 2005c) that were implemented in Cycle 2 (Duce et al., 2002; see Box 2.2). Differences in the criteria and evaluation processes under Cycle 1 and Cycle 2, and, perhaps more importantly, the lack of an independent assessment prior to the implementation of these changes to establish a baseline, make it difficult, if not impossible, to directly compare the effectiveness of the evaluation processes used in each of the two cycles or to specifically tie improvement in the individual programs, or the program overall, to changes in the evaluation process.

Box 2.1 The "Toll Report"

The National Sea Grant Review Panel's Program Evaluation Committee, chaired by John Toll, was charged in December 2000 with reviewing the Cycle 1 program reviews conducted pursuant to changes instituted in 1998. The resulting report, Review and Recommendations: Sea Grant Program Evaluation Process (often referred to as the "Toll Report"), was published in October 2001. That report contained 40 recommendations, grouped into thirteen categories:

- NSGO Final Program Review and Merit Fund Allocation Process
- PAT Program Assessment Metrics
- Identification of Best Practices/Best Management Practices
- Public Notification of Upcoming Program Assessments
- Program Assessment Evaluation Criteria
- PAT Grades
- Alternative #1: A Case for Eliminating Scores Assigned by the PATs
- Alternative #2: Improved Standards for Program Assessment
- The Role of the NSGO Program Officer
- Effective and Aggressive Long-Range Planning
- The Biennial Implementation Plan
- Developing Guidelines for Self Evaluation
- TATs: Topical Advisory Teams
- Phase II of the Program Assessment Process

While the NSGO instituted many of the recommendations of the Toll Report (see NSGO, 2005c), several issues identified by that committee continue to be of concern, in particular the reliability of assessments conducted by different groups of individuals assessing different programs, the limited nature of constructive, ongoing interaction with NSGO staff, and the lack of a comprehensive planning process that can be implemented at both the local and national level.

Program Assessment Team (External Review)

The PAT[3] is a principal element of the evaluation process created by the NSGO in 1998.
The PAT is a high-level "external" review team comprised of an NSGRP member as the chair, almost always an NSGRP member as vice chair, and 3 to 5 other members, including an individual Sea Grant program director (of a program not under review) and other highly regarded scientists, educators, and administrators from academia, government, and industry.

[3] The PAT process will be discussed in some detail in Chapters 3 and 4; it is introduced here simply to help the reader develop an understanding of the overall nature of the Sea Grant program.

Box 2.2 The "Duce Report"

The National Sea Grant Office Review Committee, chaired by Robert Duce, was appointed by the NSGRP in early 2001 to "conduct a comprehensive review of the NSGO and how it serves its many stakeholders, including its university partners, NOAA, the Department of Commerce, and other federal agencies." Released in 2002, the resulting report, Building Sea Grant: The Role of the National Sea Grant Office (often referred to as the "Duce Report"), made several recommendations that were intended to strengthen the NSGCP by improving the strategic planning process, encouraging cooperation among the individual programs to address regional challenges, and clarifying the roles and responsibilities of the NSGO. Four overarching points were made with direct relevance to the motivation behind and means utilized by the NSGO to carry out oversight and evaluation of the various individual programs. Specifically, the report recommended that NSGO:

- Lead in developing a comprehensive strategic plan for NSGCP and a national Sea Grant agenda.
- Provide leadership in communicating the NSGCP agenda, the achievements, and the opportunities of Sea Grant to Congress, the executive branch, and the public.
- Streamline and better manage the myriad administrative details essential to the operation of the NSGCP.
- Continue to seek adequate funding to effectively carry out the functions of the NSGO, utilizing the findings of this report.

The "Duce Report" is widely seen as having a positive impact on the process. While many of the highest-order recommendations were adopted, concerns about the ability of the NSGO to more fully and meaningfully engage in the network development process remain.
The PAT receives training (NSGO, 1998) and is guided by detailed procedural and evaluation criteria (NSGO, 1998, 1999a, 2000, 2001, 2003a, 2004a, 2005a) prepared by the NSGO and the NSGRP and compiled to create the PAT Manual. Based on those guidelines and the presentations and documentation provided before and during a 3- to 5-day site visit, the PAT prepares a report outlining its findings and rating the program's performance in a number of areas.

PAT Guidelines

The NSGO has prepared a detailed manual with criteria and procedures to guide the PAT review and evaluation (for the most recent PAT Manual see NSGO, 2005a). The PAT Manual also provides guidance to individual Sea Grant program directors for preparing a briefing book to assist the PAT in its assessment process. The PAT uses materials provided in the briefing book and during the site visit to assess the:

1. overall productivity and accomplishments of the program relative to its strategic plan and level of support (NOTE: in both Cycle 1 and Cycle 2, both the adequacy of the strategic plan and the progress made in implementing it were evaluated simultaneously);
2. overall scientific strength (e.g., the significance of scientific advances, the rigor of planning and internal review processes, the level to which available university talent and resources have been brought to bear on program goals and objectives, success in meeting program goals and objectives, publications and other output);
3. outreach and educational productivity and effectiveness;
4. management team effectiveness in planning and meeting stated goals and objectives, and in providing overall leadership for the program;
5. use of internal linkages among program elements and the ability to integrate these elements to address priorities (e.g., research, education, extension, and information dissemination);
6. position and role in its academic setting;
7. linkages with other Sea Grant programs, state and regional academic institutions, state and federal agencies, and the private sector;
8. linkages to industrial and user groups; and
9. potential for growth, considering all the above.

These nine assessment areas provide the framework for a more detailed set of review and evaluation criteria and benchmarks that is included in the PAT Manual and which frames the scoring by the PAT members to generate the PAT overall score (see Box 2.3 for an example of a Cycle 1 scorecard).
Although much of the site visit is public, it is standard practice for the PAT to meet privately throughout the site visit (days, evenings, and whenever else is possible). The PAT discusses the review; comes to agreement on the evaluation scoring, findings, and recommendations; and writes its conclusions in an initial draft report (the final version of this report is called the "PAT Report," and it is discussed in the next section). Before concluding the site visit, the PAT meets with the individual Sea Grant program director and institutional representatives to discuss preliminary findings.

Box 2.3 Cycle 1 Program Score Sheet (Reprinted from NSGO, 2001, p. 17)

Evaluation Criteria and Benchmarks for Performance--Summary

I. EFFECTIVE & AGGRESSIVE LONG-RANGE PLANNING: The most effective programs will use the strategic planning framework from the NSGCP as a basis for developing their own strategic plan based on needs at the state and local level as identified in collaboration with a constituency advisory group. Effective planning may also involve regional programs. (10%) Rating__________

II. ORGANIZING AND MANAGING FOR SUCCESS (4 Criteria):

MANAGING THE PROGRAM AND INSTITUTIONAL SETTING: Sea Grant programs are located within or work closely with university systems that are sites of major research and administrative activity. Each program must be managed to maximize the recruitment of outside resources to address Sea Grant problems and issues, as well as to build capability in the university system to address coastal problems and opportunities.

MERITORIOUS PROJECT SELECTION: The program carries out a good peer review and evaluation process for research, education, and outreach projects, and selects those which receive consistently high marks for merit, application, and priority fit. The review must take into account how well a prospective project targets an issue.

RECRUITING AND FOCUSING THE BEST TALENT AVAILABLE: Every Sea Grant program has a variety of talent available for program development. The best efforts will involve the best talent. The program must have mechanisms in place to identify and attract the best talent available.

MERITORIOUS INSTITUTIONAL PROGRAM COMPONENTS: It is imperative that research projects, advisory programs, communications and education activities, and management use state-of-the-art methods and work to advance their disciplines. (20%) Rating__________

III. CONNECTING SEA GRANT WITH USERS: Effective information transfer occurs most often when the end users are involved in the planning and development stages, the program has an extension process in the field, and there is a mechanism for follow-up with users. The program management team should interact at the state, regional, and national policy levels. At the university level, the Sea Grant program must occupy an appropriate administrative and leadership position and be involved in decision making. (20%) Rating__________

IV. PRODUCING SIGNIFICANT RESULTS: The program must be managed to produce significant results. A basic mission of Sea Grant is to integrate research and outreach to address and significantly impact the identified needs of its constituency and of the nation. (50%) Rating__________

OVERALL PROGRAM RATING__________

PAT Report and Program Director's Formal Response

The NSGO has adopted a review process that directs the PAT to provide the individual Sea Grant program director and the NSGO with a comprehensive written report within 30 days of the site visit. This report should contain:

- documentation of the program's strengths and weaknesses;
- specific recommendations for program improvement; and
- an overall evaluation using the evaluation criteria and benchmarks for performance in the PAT Manual.

After receiving this final written PAT report, the individual Sea Grant program director has a reasonable time (until January of the following calendar year) to respond in writing. Most program directors do so, particularly in response to findings or conclusions with which the director disagrees or for which the director has additional information or perspectives. The PAT report and the director's response become part of the permanent record for the individual program and serve as the basis for the NSGO Final Evaluation Review (FE) (discussed below). It is envisioned that the PAT report will also establish a baseline for subsequent PAT assessments.
The NSGO and the NSGRP continue to work on guidance and training for PATs to improve the quality of the PAT reports, to ensure that they are effective in informing the FE, and to guide the individual Sea Grant program director in making improvements during the next review cycle (NSGO, 2005c).

PAT Review Criteria

During Cycle 1 (1998-2001), PAT reviews were framed around four criteria, weighted as follows:

1. Producing significant results (50 percent)
2. Organizing and managing for success (20 percent)
   - Managing the program and institutional setting
   - Meritorious project selection
   - Recruiting the best talent available
   - Meritorious institutional program components
3. Connecting Sea Grant with users (20 percent)
4. Effective and aggressive long-range planning (10 percent)

Four categories were used in Cycle 1 program ratings (NSGO, 2001, p. 16):

1. Excellent--If the benchmarks of "Expected Performance" are substantially exceeded, the program will be rated as excellent.
2. Very Good--If the benchmarks of "Expected Performance" are generally exceeded, the program will be rated as very good.
3. Good--A program which generally meets the benchmarks of "Expected Performance" should be given a rating of good.
4. Needs Improvement--A program which does not reach the benchmarks should be given a rating of needs improvement.

Changes were made in the PAT review criteria and benchmarks based on Toll et al. (2001) and the Metrics Committee recommendations (see Appendix B of NSGO, 2005a), and in response to congressional requirements in the National Sea Grant College Program Act Amendments of 2002 (P.L. 107-299). These changes were implemented in Cycle 2, beginning in 2003 (see Table 2.2). (Details are included in the PAT Manuals; NSGO, 1999a, 2000, 2001, 2003a, 2004a, 2005a.) Cycle 2 PAT reviews have utilized a ratings sheet (score sheet) based on the four criteria of Cycle 1, but with far more detailed sub-criteria for each criterion. Programs are rated as "Needs Improvement," "Meets Benchmark," "Exceeds Benchmark," or "Highest Performance" for each benchmark.
The PAT Manual has a detailed discussion of each sub-criterion (this report uses the term "sub-criteria" to refer to what the 2005 PAT Manual calls "sub-elements"); the benchmarks and the percentage weight for each are shown in Table 2.2. Here are the brief descriptions of the four benchmark ratings from Cycle 2 as printed in the PAT Manual (NSGO, 2005a):

1. Highest Performance--Performance goes well beyond the benchmark for this sub-element and is outstanding in all areas.
2. Exceeds Benchmark--In general, performance goes beyond what would be required to simply meet the benchmark for this sub-element.
3. Meets Benchmark--In general, performance meets, but does not exceed, the benchmark for this sub-element.
4. Needs Improvement--In general, performance does not reach the benchmark for this sub-element. The PAT will identify specific program areas that need to be addressed.

The Metrics Committee

Following the Toll Committee report and its recommendation for improved metrics to fairly and uniformly evaluate programs across time, the NSGO appointed a Metrics Committee to examine potential qualitative and quantitative indicators of program performance and to make specific recommendations. The report of that committee, Indicators of Performance for Program Evaluation, was issued in March 2003 (the Metrics Committee report is included as Appendix B in NSGO, 2004a, 2005a). Subsequently, NSGO incorporated its recommendations and many of those in the Toll Committee report regarding metrics for review and evaluation.

FINAL EVALUATION PROCESS

The final evaluation process is carried out over five consecutive days during what is termed the NSGO Final Evaluation Review (FE), generally held in February. Participants include the NSGO leadership and NSGO technical staff members, plus nonvoting participation by one or more (usually two) members of the NSGRP.[4] The review looks back over all programs that were visited by PATs during the prior calendar year (a single PAT cohort). The result of the FE is a summary letter from NSGO and a score upon which merit and bonus funding decisions are based. The FE differs from the PAT review in several ways because more information, collected over time (1 to 4 years), is incorporated from NSGO assessment during the review cycle.
According to Sea Grant program documentation and reports (discussed earlier in this chapter), the FE considers 7 or 8 programs simultaneously, thus providing a comparative perspective across programs, based on the following information:

[4] During the 2005 NSGO Final Evaluation Review, one member of the NRC review committee and one OSB staff member were included as observers.

Table 2.2 Ratings Sheet ("Score Sheet") with Criteria and Sub-Elements* for Cycle 2 Reviews and Assessments

Each sub-element is rated as Needs Improvement, Meets Benchmark, Exceeds Benchmark, or Highest Performance. Percentage weights are shown in parentheses.

1. ORGANIZING AND MANAGING FOR SUCCESS (20%)
   Sub-elements: Leadership (6%); Institutional Setting (4%); Project Selection (2%); Recruiting Talent (3%); Integrated Program (5%)

2. CONNECTING SEA GRANT WITH USERS (20%)
   Sub-elements: User Engagement (15%); Partnerships (5%)

3. EFFECTIVE AND AGGRESSIVE LONG-RANGE PLANNING (10%)
   Sub-elements: Strategic Planning Process (4%); Strategic Plan (4%); Implementation Plan (2%)

4. PRODUCING SIGNIFICANT RESULTS (50%)
   Sub-elements: Contributions to Science and Technology (10%); Contributions to Education and Outreach (10%); Impacts on the Economy, Environment, Society, and Communities (25%); Achieving Planned Outcomes (5%)

*The term "sub-criteria" in the text of this report refers to the "sub-elements" shown here.
SOURCE: NSGO, 2005a, pp. 16 to 19.

1. The documentary material used in the FE includes:
   - the PAT report, along with the institution's response;
   - the program's strategic plan/implementation plan;
   - annual progress reports;
   - information on major accomplishments;
   - trip and peer-review-panel reports by the Program Officer (if any);
   - Topical Advisory Team reports (if any);
   - Sea Grant funding information;
   - other material deemed to be relevant by the Program Officer;
   - a four-year project-by-project report on Sea Grant funding; and
   - copies of the PAT briefing books and omnibus proposals.

2. Insights (provided by the NSGO staff) into each program's performance, management, and results based on interactions with the programs over the entire four-year review period; and

3. Insights (provided by the NSGO staff) into the contributions of the individual programs in support of the total National Program. For example, whereas the PAT would evaluate program management in terms of the results and output, the NSGO would add considerations of how well the program supported NSGO and national initiatives and the degree to which the program functioned and identified itself as part of the national Sea Grant network. Collaborative efforts among the programs are given credit.

The manner in which these FE deliberations are carried out, with subsequent distribution of merit funds, was first described in a policy memorandum dated April 22, 1999, sent to the Sea Grant directors from the National Director (see Appendix D). This initial process was used until 2003 when, as a result of 2002 congressional action, the merit and bonus funding procedures were modified. Draft revisions of the policy document were circulated in 2004 for comment, and the new version was promulgated April 8, 2005 (NSGO, 2005c).

The bulk of the FE review week is spent, about half a day at a time, considering each of the individual programs reviewed the previous year.
The program officer for each program begins with a formal presentation, following a common template, describing various aspects of the program being considered. This is followed by a detailed discussion, facilitated by the National Director, of the performance of the program in each of the evaluation criteria listed in the PAT Manual (note that the four criteria used by the PATs in Cycle 1 were subdivided into 14 sub-criteria for Cycle 2; see Table 2.2). These criteria-based discussions are the foundation for scoring programs for purposes of merit and bonus funding.

At the conclusion of the discussion of each criterion, the group votes to assign one of the four ratings to that criterion (as stated earlier in this chapter, the ratings were slightly different from Cycle 1 to Cycle 2 ["good" vs. "exceeds benchmark," etc.]). The group vote in Cycle 1 was phrased in terms of agreeing with the original PAT rating or assigning a higher or lower evaluation (Schuler, 2005). In Cycle 1 and part of Cycle 2, a simple majority vote (more than half of the votes cast) was required to change a PAT rating. In response to concerns about the impact of assigning a score during the FE that differed from that provided by the PAT, steps were taken to revise the NSGO's role in the rating of individual programs. In 2005, this voting process changed to require a two-thirds majority (more than two-thirds of the votes cast) to assign a score different from the PAT rating. In effect, this reduces the role of the NSGO in rating individual programs; the significance of this change will be revisited in Chapters 3 and 4.

The final day of the FE week is spent reviewing the cohort of programs assessed by a PAT the previous year, criterion by criterion, primarily emphasizing instances in which the FE rating differed from the PAT rating, but including instances in which comparative judgments among programs might lead to different ratings for programs discussed in the early part of the week. A significant function that takes place in the FE is the assignment of numerical values (i.e., scoring) to the ratings for the 4 criteria or 14 sub-criteria (for Cycle 1 or Cycle 2, respectively). The ratings assigned to each criterion are then converted to a numerical equivalent on a four-point scale, so that the highest rating is given a 1.0 and the lowest a 4.0. Thus, the poorer the performance, the higher the score. These scores for the various criteria are then combined using the weightings described in the PAT Manual to create an overall program score (NSGO, 2005a, p. 16).
Program Performance Rating Based on the FE score, all 30 programs are divided into 4 categories. The scores of programs in each of the four categories vary considerably in age. Although one-quarter of the scores are at least three years old and three quarters of the scores more than a year old, the 1999 NSGO memo- randum makes no mention of numerical scores but defines the "rating categories" stating: "Ratings are based on grading of the same four cat- egories as the PAT evaluations" (NSGO, 1999b, p. 4). The 2005 version states "The NSGO final rating for the program is determined by locating a program's score along a fixed four-category rating scale for merit funding and a variable two-category rating scale for bonus funding" (NSGO, 2005b, p. 7). Historically, the actual scores were kept confidential from individual Sea Grant programs, but starting in

2004, programs were informed of their final scores, and any differences between the PAT and FE were explained. Beginning in 2005, each program was given a summary of the overall performance boundaries and the number of programs within each boundary, but individual program scores remain confidential. As a matter of practice, Category 4 is rarely assigned; such an assignment triggers special interactions to improve program management. The relevant policy memoranda are included in appendixes D and E in this report.

Final Report, Ranking, and Allocation of Funds

Following the FE, the National Director prepares a final report for each program and transmits it as a letter to the individual Sea Grant program director. This final report summarizes the evaluation results for each program for each of the four major evaluation categories. Even though the Cycle 2 evaluations subdivide the 4 categories into 14 sub-criteria, the final report for Cycle 2 reviews does not address the 14 sub-criteria separately, in an effort to maintain focus on the four major categories.

In Cycle 1, and for the first two years of Cycle 2, it was the practice of the National Director not to include a final rating (e.g., Highest Performance, Exceeds Benchmark, etc.) in the letter to the individual Sea Grant program director. Instead, the Program Officer would inform the individual program director that the program had been assigned to a merit category. As discussed earlier, starting in 2005 (part-way through Cycle 2), the letter from the National Director to the individual programs included the ratings for each of the 14 sub-criteria and specified into which merit category the program had been placed. During the Cycle 1 and partial Cycle 2 reviews, the NSGO director's letter informed the individual Sea Grant program director of the actual amount of merit funding awarded (NSGO, 2003b).
For Cycle 1 this funding was in two parts: one part remained fixed until the next review was completed (4 years), and a second, smaller amount varied from year to year as additional programs were evaluated and entered or exited the top three categories. In Cycle 1 this merit funding was based on the score and the number of programs with similar scores, not on the relative position of the program in the overall ranking.

Beginning in Cycle 2, evaluation, ranking, and the merit award process became more complicated. Not only did the number of evaluation criteria increase from 4 to 14 (as sub-criteria were specified), but Congress (P.L. 107-299) mandated a competitive ranking formula based on five categories, with no more than 25 percent of the programs ranked in the

top category (Merit Category 1) and no more than 25 percent in the second category (Merit Category 2). The NSGO responded with a ranking scheme that contained, in essence, a total of six categories: Merit Category 1 (the top-ranking category) was subdivided into 1A (containing the top seven programs, just under 25 percent), 1B (containing the next seven), and 1C (containing the remainder of programs in Merit Category 1). Merit categories 2 and 3 remained as required by Congress, and Category 4 represented the sixth category. To date, no program has been found to perform so poorly as to be assigned to Category 4. While the overall category (e.g., Category 1, Category 2) to which a program is assigned for determination of merit funding remains unchanged during the period between reviews, the programs ranked in the subdivisions 1A, 1B, and 1C change yearly as additional programs are reviewed and relative rankings within the category change.

For funding allocation, the amount of the merit award to a program remains unchanged throughout the period before the next PAT review. However, an additional bonus fund distributed to programs in sub-categories 1A and 1B may change annually. Details of the method of ranking and allocating the merit and bonus funds are given in a Policy Memorandum on NSGO Final Evaluation and Merit Funding (NSGO, 1999b), revised in 2005 (see Appendixes D and E). An example of how the Cycle 2 funding allocations might play out is shown in Figure 2.1. Because the allocations are for specific dollar amounts unrelated to the size of an individual program's core funding, the reward, and changes to it, are much more significant for the smaller programs (see Chapter 3 for more discussion).
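The six-category ranking scheme described above can be sketched as follows. The subdivision of Category 1 into 1A (top seven), 1B (next seven), and 1C (the remainder) follows the text; however, the score cutoffs separating the merit categories are purely illustrative assumptions, since the actual boundaries are set by NSGO policy memoranda and are not reproduced here.

```python
# Hypothetical sketch of the Cycle 2 ranking scheme.
# The score cutoffs are illustrative assumptions, NOT the actual NSGO
# boundaries; only the 1A/1B/1C subdivision rule comes from the text.

def assign_categories(scores, cat1_cutoff=2.0, cat2_cutoff=3.0, cat3_cutoff=3.5):
    """Assign each program to one of 1A, 1B, 1C, 2, 3, or 4.

    scores: dict mapping program name -> FE score (lower is better).
    """
    ranked = sorted(scores, key=scores.get)  # best (lowest) score first
    categories = {}
    # Merit Category 1, subdivided by rank: top seven -> 1A,
    # next seven -> 1B, remainder -> 1C.
    cat1 = [p for p in ranked if scores[p] <= cat1_cutoff]
    for i, p in enumerate(cat1):
        categories[p] = "1A" if i < 7 else ("1B" if i < 14 else "1C")
    # Remaining programs fall into categories 2, 3, or 4 by score.
    for p in ranked:
        if p in categories:
            continue
        s = scores[p]
        categories[p] = "2" if s <= cat2_cutoff else ("3" if s <= cat3_cutoff else "4")
    return categories
```

One consequence visible in this sketch is the yearly churn the text describes: a program's Category 1 membership (and hence its merit award) is fixed between reviews, but its placement in 1A versus 1B versus 1C depends on its rank relative to newly reviewed programs, so the bonus-eligible subdivisions can shift annually.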
CONCERNS WITH THE PROGRAM EVALUATION PROCESS

Given the complexity and diversity of individual Sea Grant programs and the complex funding strategies of each program, the National Director, NSGO staff, and the NSGRP have developed a detailed process resulting in meaningful review and evaluation. There are, however, several shortcomings that could be rectified to make the overall process more effective.

Since the reauthorization of the program in 2002, program evaluation within Sea Grant has evolved to serve two theoretically related purposes. The 2002 amendments redefined the purpose of evaluation from simply gauging and encouraging improvement in individual programs to rating programs "relative to each other for the purpose of determining performance-based funding." These dual purposes are related insofar as competition for funds serves as an incentive to the individual programs to

FIGURE 2.1 How a hypothetical $3 million merit funding pool with a $1 million bonus funding pool might be allocated among 20 programs that have been ranked in Category 1. It should be noted that it is possible for a particular category to have no programs assigned to it. For example, if there were 14 or fewer Category 1 programs, the third group (1C) would have no programs assigned to it (NSGO, 2005b, p. 13).

improve. However, an evaluation process that is well designed for identifying areas and mechanisms for program improvement may be inadequate for ranking programs. A process whose foremost purpose is to rank programs may do a poor job of encouraging aspects of program improvement. The process must be balanced so that efforts to achieve one objective do not undermine efforts to achieve the other. Furthermore, Sea Grant is often considered a network or partnership; thus the process must balance efforts to improve the effectiveness of individual components against improving the effectiveness of the network or partnership as a whole.

An "ideal" assessment process would include the following characteristics:

- Credible (uses professionally recognized methods or "best practices" within the field)
- Reliable (results should be reproducible)
- Meaningful (criteria, benchmarks, and indicators should reflect characteristics of an effective program defined in terms of national, regional, and local benefits)

- Cost-effective (cost of effort, in terms of human and fiscal resources, should not exceed a reasonable fraction of the annual budget of individual programs or the network as a whole)
- Comprehensive (should assess the effectiveness of individual components as well as the network itself)

The process could be fine-tuned to focus on the overall objectives. Designing or modifying the assessment process to achieve the characteristics of an "ideal" assessment process would require balancing recognition of outstanding performance against improvement of the network as a whole. The desire to stimulate competition among individual programs must be tempered to avoid creating barriers to improving the program as a whole. This could be achieved if emphasis were placed on rewarding the outstanding performer rather than on stigmatizing the acceptable performer. Approaches for achieving such balance, based on detailed analysis of the current process, are explored in chapters 3 and 4.