
Evaluation of the Sea Grant Program Review Process (2006)

Chapter: 5 Major Findings and Recommendations

Suggested Citation:"5 Major Findings and Recommendations." National Research Council. 2006. Evaluation of the Sea Grant Program Review Process. Washington, DC: The National Academies Press. doi: 10.17226/11670.


5 Major Findings and Recommendations

As pointed out in the previous chapters, the role of peer review and assessment within the National Sea Grant College Program (NSGCP or National Program) has evolved significantly since the program's inception, with many changes taking place since 1994. Collectively, there is evidence that these changes have led to a stronger program, although not all of the changes have been equally effective. In general, this report's analysis of efforts to "address the impact of the new procedures and evaluation process on Sea Grant as a whole" (see Box 1.2) suggests that changes in the evaluation process have been more successful in instituting competition as a mechanism for encouraging improvement in individual programs than in developing a national program that "provides an appropriately balanced response to local, regional, and national needs" (33 U.S.C. 1123). The following discussion summarizes the evolution of the evaluation process and makes recommendations for bringing greater balance to that process with regard both to appropriately directed competition and to development of a robust national program whose foundation is the network of local programs created and maintained by individual Sea Grant colleges and institutes and administered by the National Sea Grant Office (NSGO).

IMPACT OF CHANGES IN RESPONSE TO THE 1994 REPORT

Following the 1994 National Research Council (NRC) report A Review of the NOAA National Sea Grant College Program, the National Director instituted a number of changes in the way the program was evaluated.

Although it might be tempting to assume that simple quantitative measures such as publication counts would be useful, their value is marginal unless the collection and analysis of the information is carried out in a comprehensive manner. In order to carry out a direct assessment of the impacts of these changes on both the NSGCP and the individual Sea Grant programs, it would be necessary to conduct an independent assessment of the program, compare it to a similar pre-1994 assessment, and determine which differences are attributable to the changes related to the 1994 report. Even if it were possible to conduct such an exercise, doing so would be beyond the resources available during this study. There is, however, indirect evidence that the changes instituted after 1994 have strengthened the program.

First, one of the key recommendations of the 1994 report was to establish a process for strategic planning. Such a process was, in fact, established, and the individual Sea Grant programs have produced strategic plans. As discussed below, strategic planning within the NSGCP still needs to be improved, but on prima facie grounds, the adoption of a formal strategic planning process is an improvement over earlier practice.

Second, the current Sea Grant directors were asked whether the new evaluation process has led to improvements in their programs. The response was substantially in the affirmative. These responses cannot be taken as objective indicators of the effect of the new evaluation processes and must be interpreted with caution. On the other hand, the directors are by no means enthusiastic about all the details of the new process and were not reluctant to reveal this, so one might reasonably conclude that their responses to this particular question provide some useful information.
Third, other management practices employed by individual programs have demonstrably improved in direct response to periodic evaluation. These improvements include, but are not limited to, an enhanced relationship with the university administration, better internal reporting and accountability, a focus on documenting impacts and outcomes, and an awareness of long- and short-term goals. The prestige of the individual program is often increased by the visiting PAT members, leading to improved visibility and appreciation of the individual program within the administration of the home institution. In at least one case, the PAT report provided the necessary justification for creation of a full-time position to expand program efforts in education and outreach. The process of gathering materials necessary for a PAT visit also brought about increased effort for documenting impacts and outcomes. Many individual programs noted that the second visit was easier because they not only had an awareness of what materials were needed for the briefing books, but also had the opportunity to gather these materials during the years prior to the visit. In addition, the requirement (as mentioned above) to develop a strategic plan clearly aided the programs in terms of focusing the staff on goals, objectives, strategies, and outcomes. These responses from the individual program directors were tempered with concerns about the process, and some of the directors questioned whether any of the acknowledged improvements were worth the expense and time invested in preparing for and hosting a PAT visit.

Finally, several members of the committee have first-hand, long-term experience with the Sea Grant program, and it is their considered opinion that the changes instituted since 1994 have strengthened the program overall. As with the Sea Grant directors, the opinions of even knowledgeable individuals cannot be taken as objective indicators. But the unanimity of response to this issue--particularly in light of differences of opinion on other issues--suggests that real improvements have been made.

Effectiveness of Post-1998 Evaluation

As discussed in Chapter 4 and above, the most readily identified improvements in the NSGCP and the individual programs are directly attributable to administrative changes implemented in response to the 1994 NRC report and codified by the National Sea Grant College Program Reauthorization Act of 1998 (P.L. 105-160). The process subsequently established by the National Director and implemented by the NSGO to evaluate program performance and distribute merit funds as required by the National Sea Grant College Program Reauthorization Act of 2002 (P.L. 107-299) has also led to improvements in the overall program. However, several areas of concern remain.

Perhaps the foremost concern about the Sea Grant evaluation process is the reliance by the NSGO, working under the authority of the National Director, on periodic external assessments as the primary, if not only, means of evaluation and oversight.
The periodic assessments are based largely on information collected during quadrennial visits by PATs overseen by the National Sea Grant Review Panel (NSGRP). Because the members of the PATs and the NSGRP are not federal employees, the preponderance of program evaluation is external. As the level of routine engagement of the NSGO with individual programs is rather low, reliance on external review reduces the federal component of the partnership that is central to the Sea Grant program. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, should strengthen the ability of the National Sea Grant Office to carry out meaningful, ongoing internal assessment to complement the periodic, external assessment currently taking place. It is important to emphasize that this implies no criticism of the individuals who have participated as members of the PATs or the NSGRP, which operate in a highly professional and thorough manner. Program administration by the NSGO should make better use of annual reporting and regular interactions between the NSGO program officers and the directors of individual Sea Grant programs and administrators of their home institutions. These interactions should be centered on the development, approval, and implementation of strategic plans. The periodic, external reviews should continue because they provide an important opportunity to inject fresh perspectives and independent evaluation. Reverse site visits (see the LSAMP case study; Box 4.2) should be considered a potential mechanism for strengthening the connection between individual Sea Grant programs and the NSGO, allowing the perspectives offered by the individual programs to better shape the national and regional actions of the NSGO.

The reliance on the periodic assessments results in an unacceptable weighting of a single factor--the quadrennial PAT score--during the annual ranking of separate programs. The level of effort expended by all parties--the programs, the PAT members, and the NSGO--in evaluating a single program is so great that only 7 or 8 of the 31 programs have been assessed in any single year. Because the programs are ranked on an annual basis, the rankings are based on information that can be as much as 4 years out of date. As discussed in Chapter 3, the administrative rules established by the National Director (in partial response to P.L. 107-299) governing the distribution of merit funds create a situation in which closely ranked programs can receive substantially different awards (see Figure 3.4).
The inherent subjectivity of the PAT evaluation, coupled with questions of reliability rooted in the minimal overlap of PAT membership, means that the PAT scores cannot be relied upon to discriminate the performance of different programs in a sufficiently meaningful way to justify relatively large differences in merit awards. While steps can and should be taken to further increase the reliability of the performance assessment process to support the rating and ranking of the individual programs, many of the changes proposed in this report may reduce the influence of the external periodic assessment process currently in use as a vehicle for identifying ways for the individual programs and the NSGO to work together to achieve the goal of providing "an appropriately balanced response to local, regional, and national needs" (33 U.S.C. 1123). The remainder of this chapter explores a number of changes that may be made to improve the overall value of program assessment within the Sea Grant program.

Strategic Planning

The importance of strategic planning in program development, implementation, and evaluation was emphasized in the 1994 NRC report. Specifically, the report recommended that "State Sea Grant Directors and the Director of the NSGO must cooperate to develop a single strategic plan articulating a shared vision and strategies which must be fully integrated into, and reflective of, NOAA's strategic plan" (NRC, 1994, p. 2). Although strategic planning at the national level (as carried out by the NSGO) meets this recommendation, the degree to which the national plan translates into action by individual programs is unclear. As recommended by the U.S. Commission on Ocean Policy, greater attention should be paid to regional-scale issues. More effort is therefore needed to ensure that all of the individual programs develop strategic plans that are consistent with both national priorities and local and regional priorities. To ensure that strategic planning reflects a shared vision, NSGO program officers should participate in the local strategic planning process, just as the directors of individual Sea Grant programs now participate in the development of the national plan. The strategic plan of each individual Sea Grant program should serve as the basis upon which that program is evaluated. Steps should be taken by the Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, to strengthen strategic planning at both the national and individual program level. The strategic plans of the individual programs and the national program should represent a coordinated and collective effort to serve local, regional, and national needs.
As discussed in Chapter 4, actions by the National Director should include developing and implementing a process to assist individual programs in strategic planning, and creating a separate process for evaluating and approving appropriately ambitious strategic plans for the individual programs.

Performance Criteria

Performance criteria are a combination of quantitative and qualitative measures used to assess a selected program or activity, the program outcomes, and, in some instances, the system the program is intended to influence. In the case of assessing the effectiveness and impacts of individual Sea Grant programs, this involves setting benchmarks to describe the expected level of performance in a particular category (such as program organization and management) and indicators to help assess the performance of the individual program in that area. As discussed earlier, strategic planning is the critical basis for implementation, review, and evaluation of institutional programs. Yet at present, the strategic plans of each program are reviewed only as part of the periodic assessment of individual programs and concomitant with an assessment of the program's effectiveness in achieving the goals the plan describes. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, should modify the benchmarks and indicators, as needed, to ensure that the performance of each program is measured against the objectives outlined in the separately approved, program-specific strategic plan called for in the previous recommendation.

In addition, the current Sea Grant evaluation criteria do not sufficiently recognize the importance of individual programs in building cooperative efforts to address regional- and national-scale problems. The existing benchmarks tend to encourage program development at the local scale. Furthermore, the heavy emphasis on individual program performance in determining merit and bonus allocations may have resulted in lower levels of cooperative behavior between programs, which now see themselves as pitted against one another. Encouraging programs to undertake cooperative efforts to address regional-scale problems needs to be incorporated into the evaluation process.

This call to modify the evaluation criteria to place greater weight on cooperative efforts is not intended as a recommendation to increase the complexity of the criteria. In the current review cycle (the assessment of all 30 programs over four years), 14 scored sub-criteria are considered in four major categories.
As a consequence, considerable time and effort is devoted to assigning, and subsequently reviewing, a score in a criterion that may account for no more than 2 percent of the overall score. The current subdivision into 14 scored sub-criteria was not recommended by any of the major committees that have examined the process, nor is there evidence to suggest that 14 scored sub-criteria provide a more accurate assessment of program performance than a smaller number of less detailed criteria, as used in the first review cycle. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, should substantially reduce the overall number of scored criteria by combining various existing criteria, while adding cooperative, network-building activities as an explicitly evaluated, highly valued criterion. As discussed in Chapter 3, consideration should be given to reducing the number of scored criteria to be assessed in the next external, periodic review cycle. Rather than the existing 14 sub-criteria, ranging in weight from 2 percent to 25 percent, 4 to 6 broader criteria--weighted to reflect a balance among the production of meaningful results; outreach and education; planning; organization; management; and coordination among programs--would move assessment efforts toward more holistic judgments of program performance. Implementation of revised criteria should be postponed until the beginning of the next cycle of program review (the current review cycle will conclude in late 2006).

Program Assessment Team and Site Visit

Focusing the PAT visit on essential evaluation tasks would reduce the demand placed on PAT members and could allow members to participate in a larger number of reviews (thereby increasing reliability across evaluations) and reduce the cost of program assessment. Historically, the length and content of the PAT visits were largely determined through discussion between the director of the individual Sea Grant program under review and the chair of the PAT. Although the NSGCP has implemented changes to provide greater standardization, many individual Sea Grant programs have expressed concern that variability in program size (both in terms of geographic area covered and program budget and scope) requires significant flexibility in the length of the PAT visit and the amount of material provided to the PAT members.

No evidence was provided to substantiate concerns or claims that more complex (i.e., larger) programs required significantly longer PAT visits or greater volumes of supporting material. There is no reason to believe that greater standardization in the types and volume of information needed to characterize program performance would inappropriately handicap large programs. With regard to standardization of supporting material, it should be noted that the NSGO has made strides in the past year to reduce the amount and kinds of preparatory materials for PAT review.
New language was added to the 2005 PAT Manual in the section called "PAT Preparation, Structure, and Cost Control" that provides suggestions for ways to minimize the costs of the PAT visit without reducing the PAT's effectiveness (NSGO, 2005a). This report supports these changes and suggests more of the same in the future. With regard to the length of PAT visits, to some degree concerns in this area reflect the lack of clarity regarding what constitutes acceptable or exceptional performance in the various performance metrics used during the PAT process. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, should shorten the duration of and standardize the PAT site visits, based on the minimum time and material needed to cover essential, standardized elements of the program assessment. The length of the PAT visit should be no more than the length of time needed to gather information and carry out a relatively uniform evaluation of all the programs using the modified metrical evaluation called for above. Based on the committee's experience, the essential information could be conveyed in two days, with a third day used for the PAT to complete its assessment and report out to the director and institutional representatives.

Providing Coordination and Facilitation Through Informed, Ongoing Oversight

Greater involvement and ongoing oversight by the NSGO is needed to ensure that the program as a whole continues to improve while addressing local, regional, and national needs. Informed oversight is also needed to lend credibility to annual program rankings and the allocation of merit and bonus funds. These two goals can be simultaneously served by a meaningful, ongoing annual evaluation process that complements the periodic assessment carried out during the PAT review. Review material prepared for the periodic review should be a compilation of the annual reports of individual programs, supplemented by material that demonstrates the extent to which the annual activities combine to form a cohesive, ongoing program of activity organized to accomplish the objectives of an appropriately ambitious strategic plan and demonstrates effective progress towards accomplishment of the strategic plan's goals and objectives. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce, should rank the individual Sea Grant programs based on a program evaluation process that includes more robust, credible, and transparent annual assessments of each individual Sea Grant program.
Review of programs that have undergone periodic assessments in the preceding year should also include consideration of the PAT reports and the individual Sea Grant program directors' responses to the PAT reports. The additional effort required of individual Sea Grant programs to provide information on an annual basis can be offset to a degree by reducing the time required to prepare materials for the periodic review, if the majority of the information required by the latter is made up of materials submitted annually.

Fairness in Competition

Program ranking is often believed to be influenced by program size, age of the program, location, type of institutional administration linkages, term of the program officer, etc. With the exception of the term of the program officer with particular programs, statistical analysis failed to support these concerns. However, the current process produces a very narrow range of program scores, such that minute differences in assigned score may result in significant differences in the award of bonus funding. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce, should revise the calculation of bonus funding allocation relative to program rank to ensure that small differences in program rank do not result in large differences in bonus funding, while preserving or even enhancing the ability to competitively award bonus funds as required by the National Sea Grant College Program Act Amendments of 2002 (P.L. 107-299). For example, as discussed in Chapter 3, the bonus pool could be distributed to the top half of the programs in proportion to the amount by which each program's score exceeds that of the median-ranked program. Alternatively, the amount of bonus funding could be increased uniformly by rank, so that each program eligible for bonus funding received an amount in proportion to its ranking.

Improving Program Cohesion

The NSGO does not currently play a sufficient role in ongoing program assistance, communication, and assessment, nor does it maintain close ongoing working relationships with the individual Sea Grant programs. This limits the ability of the NSGO, and by extension the National Director, to "provide an appropriately balanced response to local, regional, and national needs, which is reflective of integration with the relevant portions of strategic plans of the Department of Commerce and of the Administration" (33 U.S.C. 1123). There is a consensus among NSGO personnel and the directors of individual Sea Grant programs that there was a greater level of interaction between the NSGO and the individual Sea Grant programs prior to 1995.
The expansion of external periodic review overseen by the NSGRP, in partial response to successive amendments to 33 U.S.C. Chapter 22, has coincided with a reduced engagement by the NSGO in the ongoing activities of individual Sea Grant programs. As noted in Duce et al. (2002), closer and more frequent interaction with the NSGO would help integrate individual Sea Grant programs and the National Program.

This reduced level of engagement by the NSGO staff appears to reflect several factors, including:

· conflicting mandates to NSGO staff as part of broader efforts by NOAA to integrate functions across the organization,
· less emphasis by individual NSGO staff on maintaining a high level of interaction with individual programs as individual programs assumed the responsibility for review of grant applications,
· a greater emphasis on external review of individual program performance, and
· turnover and attrition of the personnel in the NSGO.

In order for the NSGO to more effectively administer the program and coordinate and facilitate the efforts of the individual Sea Grant colleges and institutes, thus fulfilling the federal role within the Sea Grant partnership, the capabilities of the NSGO should be reevaluated and likely enhanced. The Secretary of Commerce, in consultation with the National Sea Grant Review Panel, should take steps to ensure that sufficient human and fiscal resources are available to allow robust, ongoing, and meaningful interaction among the Director of the National Sea Grant College Program, the staff of the National Sea Grant Office, the directors of individual Sea Grant programs, and the administrators of the institutional homes of the individual Sea Grant programs. This interaction will provide a solid foundation for the annual performance evaluation needed to annually rate and rank individual programs as required by law, and will help ensure that the various elements of the National Program are truly capable of providing "an appropriately balanced response to local, regional, and national needs, which is reflective of integration with the relevant portions of strategic plans of the Department of Commerce and of the Administration" (33 U.S.C. 1123).
The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce and in consultation with the National Sea Grant Review Panel and the individual Sea Grant programs, should undertake an evaluation of how work force capabilities and other components of effective program administration could be modified within the National Sea Grant Office to enhance its ability to coordinate and facilitate the actions of the individual Sea Grant programs. The implementation of changes in the NSGO that might be proposed from such an evaluation will likely span many months or even years. In the interim, the performance of the NSGO could benefit from the type of external perspectives provided by bodies such as the PATs or the NSGRP. Site visits conducted by the PATs could provide a useful venue for such discussions, and the resulting information could be channeled to the NSGRP for further consideration.

Based on comments received during information gathering meetings hosted by the committee, written correspondence submitted in response to committee requests, and various NSGO and NSGRP documents, it is apparent that an unacceptable number of individual Sea Grant program directors and their staff remain confused about key aspects of the periodic evaluation process, the annual evaluation process, and their impacts on program rankings and funding. Although responsibility for understanding this process rests with the individual Sea Grant program directors, the NSGO has a responsibility to make sure the process is reasonably straightforward and understandable. As discussed in Chapter 3, there should be greater attention and clarity regarding all aspects of program assessment. The Director of the National Sea Grant College Program, under supervision of the Secretary of Commerce, should take steps to ensure that the program assessment process (both the new annual assessment called for in this report and the Program Assessment Team review) is well described and understood by individual program directors, congressional staff, personnel at the Office of Management and Budget, university and state administrators, and the general public.

If the recommendations put forward above are implemented, the functions of the annual and periodic assessments will evolve such that both will provide different and independent sources of information about the state of the program as a whole. This information should provide important insights about the state of the Sea Grant program overall to the Secretary of Commerce, the National Director, and potentially Congress. Thus, there would seem to be a need to synthesize and analyze the results of these assessments every four years, including a synthesis of the most recent periodic reviews of the individual programs and a systematic review of the NSGO. Developing such a "state of the program" report would seem to be an obvious role for the NSGRP.
The Director of the National Sea Grant College Program, acting under authority of the Secretary, should direct the National Sea Grant Review Panel to undertake the development of a systematic review of the "state of the Sea Grant program" once every four years. The review should rely extensively on information collected during the annual and periodic reviews, augmented with a site visit to the National Sea Grant Office, and should focus on how the program is functioning as a whole. In addition to commenting on how the programs are performing in terms of the various criteria used during the assessments, the "state of the program" report could address needed changes in how the program is administered, how the assessment process is carried out, or other areas deemed valuable by the Secretary or the National Director. The ability of the NSGRP to be seen as a credible source of such insight and advice to all parties may require evolution of the role of the NSGRP in carrying out some components of the assessment. Greater consideration, for example, may need to be given to changing the NSGRP's role to that of an observer, rather than the actual evaluator, during the periodic assessments.
