Chapter 5: Recommendations for Next Steps

Since this is the first formal evaluation of the Jennings Randolph Fellowship Program, the committee placed significant emphasis on providing advice to USIP on how to ensure that monitoring and evaluation become an established part of the program in the future. It also has a few recommendations, based on the survey of former Fellows, for steps that could improve the Fellowship itself. These recommendations are presented below, along with supporting material where further explanation is needed to clarify the committee's suggestions.
Gathering Additional Data

USIP has accumulated a substantial amount of information about applicants and Fellows, but the committee also encountered some significant limitations. The spreadsheet created by USIP (data categories are presented in Table 1-2) is a useful tool for collecting and organizing data on applicants and Fellows. The committee recommends that:
• USIP continue to collect these data for new applicants and Fellows.
• USIP contact Fellows to collect data currently missing from the spreadsheet.
• USIP collect new data to better describe applicants and Fellows. In particular, USIP could include a longer project description in the spreadsheet and could record whether Fellows consider themselves to be scholars or practitioners.
Understanding How Fellows' Research Advances USIP and U.S. Foreign Policy Goals

For a number of reasons discussed earlier in the report, the committee was not able to make much progress in meeting this part of its charge beyond presenting a basic overview of Fellows' research. To complete the second part of the committee's charge and to better interpret the findings above, the committee recommends the following strategy:
• USIP should conduct interviews or expert panels with former and current staff and board members to trace and assess the evolution of USIP's goals with respect both to the Fellowship program and to the USIP mandate.
• USIP may wish to take a similar approach to collecting information from external actors (e.g., government officials and academic experts). Although the program should ultimately be evaluated based on USIP's rationale, it would nevertheless be interesting to see how these actors judge the purpose of the Fellowship. (A start at
this approach is that both the survey of Fellows and the survey of peace and security experts included questions on this topic, as presented in Chapters 3 and 4.)
• USIP should take steps to identify U.S. foreign policy goals in order to see how the working of the program relates to broader U.S. foreign policy goals. The committee suggests a strategy of identifying important foreign policy challenges or goals and examining which of those areas Fellows are researching, both before and after the challenges or goals are identified by policymakers and other "thought leaders." This would enable USIP to begin to examine whether the research done under its aegis lags or leads larger policy issues.

There are a number of ways to describe changes in U.S. foreign policy goals from 1987 to the present; there is no single authoritative source of information about U.S. foreign policy goals on which USIP could rely. If, as a government-funded institution, USIP's primary concern is with official goals, it could make use of the statements of foreign and security policy strategy that the White House has issued under most recent presidents. These are not always updated annually, but they do represent the product of an extensive interagency process. USIP could also undertake a content analysis of key speeches by government leaders, which would offer the option of including the views of Congress. To move beyond official documents, an analysis of frequently cited terms in media reports would offer another way to track changes in goals over time. Survey data could capture the perceptions of foreign policy elites inside and outside government; an example is the series of surveys, "American Public Opinion and Foreign Policy," conducted by the Chicago Council on Foreign Relations every four years since 1978 (CCFR 2004).
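The media content analysis mentioned above could begin as simply as tracking the frequency of selected policy terms over time. The sketch below is a minimal illustration of that idea; the corpus, years, and terms are hypothetical placeholders, not data drawn from this report.

```python
from collections import Counter

# Hypothetical corpus: year -> text drawn from key speeches or media coverage.
# A real analysis would use a much larger body of documents per year.
corpus = {
    1990: "arms control and trade dominated the agenda with arms talks stalled",
    2002: "terrorism reshaped policy as terrorism and proliferation led the agenda",
}

# Terms tracked as a rough proxy for shifting foreign policy goals
terms = ["arms", "terrorism", "trade"]

def term_counts(text, terms):
    """Count occurrences of each tracked term in a lowercased, whitespace-split text."""
    counts = Counter(text.lower().split())
    return {t: counts[t] for t in terms}

for year in sorted(corpus):
    print(year, term_counts(corpus[year], terms))
```

A real analysis would need stemming, phrase matching (e.g., treating "arms control" as a unit), and normalization by corpus size, but the same frequency-over-time framing applies.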
The survey results could be employed to create a framework for comparing applicants' and Fellows' research to broader foreign policy concerns. The surveys involve both interviews of leaders and a survey of public opinion; it is the elite opinions that are relevant here. Leaders "with foreign policy power, specialization, and expertise" include "Congressional members or their senior staff, university administrators and academics who teach in the area of international relations, journalists and editorial staff who handle international news, administration officials and other senior staff in various agencies and offices dealing with foreign policy, religious leaders, senior business executives from FORTUNE 1,000 corporations, presidents of the largest labor unions, presidents of major private foreign policy organizations, and presidents of major special interest groups relevant to foreign policy" (CCFR 2005). The relevant survey question for the committee's purposes is: "What do you feel are the two or three biggest foreign policy problems facing the United States today?" Over the course of the surveys from 1978 to 2002, 67 different problems were identified. The top five issues for each survey from 1986 through 2002 are summarized in Table 5-1; the complete list may be found in Appendix D.

The next step would be to examine the body of work produced by Fellows in the three years before and after each survey. This could be done most efficiently by surveying each Fellow and asking which of the identified problems his or her work focused on (and perhaps which was the most important). This may be a more elaborate effort than USIP would want to undertake; the example
is offered to suggest that there are a variety of ways in which USIP could assess U.S. foreign policy goals that could be relevant to the Fellowship.

Table 5-1 Top 5 foreign policy problems identified in Chicago Council on Foreign Relations surveys, 1986-2002

1986: Russia/dealings with Russia (46 percent); Arms Control (33 percent); Latin/South/Central America (28 percent); Balance of Payments (17 percent); Mid-East Situation (non-specific) (11 percent); Terrorism (11 percent)
1990: Iraq (Saddam Hussein) (44 percent); Mid-East Situation (non-specific) (29 percent); Russia/dealings with Russia (21 percent); International Trade (18 percent); World Economy (14 percent)
1994: International Trade (24 percent); Russia/dealings with Russia (23 percent); Weak Leadership (19 percent); Stronger U.S. Foreign Policy Needed (16 percent); Our Relationship with Bosnia (16 percent)
1998: World Economy (21 percent); Iraq (Saddam Hussein) (18 percent); Arms Control (15 percent); Russia/dealings with Russia (13 percent); Japan/Asian Economy/Crisis (13 percent)
2002: Terrorism (50 percent); Mid-East Situation (non-specific) (38 percent); Unrest in Israel/Israel-Palestine (16 percent); India and Pakistan Issues (14 percent); Arms Control (9 percent)

The survey findings also raise an issue about the purpose of the Fellowship that could be further explored: whether USIP should seek Fellows who advance cutting-edge thinking in targeted areas, or focus on the application of such thinking to USIP priority issues.

Making Monitoring and Evaluation a Regular Part of the Fellowship

The committee feels strongly that USIP should undertake more rigorous and systematic monitoring and evaluation (M&E) of the Fellowship in the future.
There are a number of approaches that USIP could take to develop a useful M&E strategy:
• Conduct an evaluation midway through the Fellowship to assess the match between resources and the Fellow's productivity, and to ascertain whether flexibility in timing and travel is needed.
• Hold an exit interview with all Fellows at the conclusion of the Fellowship. An interview could focus on such topics as:
1. the various activities that Fellows pursued and how much time they spent on them; and
2. Fellows' output, in particular what the Fellows believe to be their most important work. This could be documented by collecting Fellows' CVs.
• Conduct an impact assessment of Fellows' work completed during the Fellowship period. Ideally, such an assessment would consist of (1) identifying all the products a Fellow produced during, or directly related to, the Fellowship, and (2) quantifying the impact of those works.
In practice, this ideal is unlikely to be met. Fellows' communications with others, briefings, and other informal interactions may inspire others, lead to policies, or spur research, and it is very difficult to ascertain the impact of, for instance, a Fellow's briefing of a Congressional staff member. However, as noted above, many Fellows who responded to the survey reported that a written work was their most important contribution, and written output is a good place to start because it can be quantified. Three different measures serve to illustrate possible approaches: Scopus and Google Scholar both count citations to authors' work,1 and Web hits (using the Fellow's name and the title of a publication as search terms) could also be examined. Relatedly, one could look at where Fellows publish: in particular, which of their products were published by USIP Press, and which works funded by the Fellowship were published by "top" academic presses or in "top" journals in the field. A CV analysis would be helpful to this end.
• Conduct an impact assessment of the Fellowship on Fellows' careers. Once an initial assessment was undertaken, the process could be updated periodically. There are a number of possible directions that USIP might pursue. Since these are senior fellowships, the spotlight could be on Fellows' research or on their collaborations with other peace and security experts, in both cases comparing the period before the Fellowship with the period after it, rather than focusing on their employment. That said, it might also be instructive to ask former Fellows how the Fellowship helped them advance in their careers (e.g., for academics, whether the Fellowship had a positive impact on receiving tenure or a promotion). This information could be collected from Fellows' CVs or via a survey.
The survey approach is more efficient in that one could ask Fellows what work before and after the Fellowship they consider to be related to their work at USIP. One could also survey Fellows to get a sense of how they view the research they conducted during the Fellowship: as a unique project, or as the beginning or the culmination of a research agenda.

A social network analysis could also be undertaken. Social network analysis maps the relationships between individuals in networking or collaborative activities. One purpose of such an analysis could be to test the hypothesis that the Fellowships increase Fellows' networks; another could be to examine in more detail the groups of people Fellows interact with (e.g., academics, practitioners, media, or government officials).

In implementing any of the M&E suggestions above, it would be worthwhile to further disaggregate the Fellows into demographic categories to see whether different groups of Fellows have significantly different views or outcomes. One category would be U.S. versus foreign Fellows. Although this is a "senior" fellowship, it might also be possible to differentiate between more junior and more senior individuals within the group. A third category would be the Fellows' employment sector (e.g., academia, media, etc.).

1 An alternative to Scopus is ISI's Web of Science.
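The before-and-after network comparison described above can be sketched with plain Python; all records and names below are hypothetical placeholders, and a fuller analysis could use a dedicated graph library such as NetworkX.

```python
# Hypothetical collaboration records: (fellow, collaborator, year).
collaborations = [
    ("Fellow A", "Scholar X", 1995),
    ("Fellow A", "Scholar Y", 1999),
    ("Fellow A", "Practitioner Z", 2001),
    ("Fellow A", "Official W", 2002),
]

FELLOWSHIP_YEAR = 1998  # hypothetical year Fellow A held the Fellowship

def collaborators(records, fellow, keep_year):
    """Distinct collaborators of `fellow` among records whose year satisfies keep_year."""
    return {partner for f, partner, year in records if f == fellow and keep_year(year)}

before = collaborators(collaborations, "Fellow A", lambda y: y < FELLOWSHIP_YEAR)
after = collaborators(collaborations, "Fellow A", lambda y: y >= FELLOWSHIP_YEAR)

# A simple indicator for the hypothesis that the Fellowship enlarged the network:
print("distinct collaborators before:", len(before))  # 1
print("distinct collaborators after:", len(after))    # 3
```

The same records could also be tagged by collaborator type (academic, practitioner, media, government) to examine the composition, not just the size, of Fellows' networks.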
Understanding External Perceptions of the Fellowship

The committee also makes several recommendations intended to help USIP gain further knowledge about perceptions of the Fellowship in the wider expert community.
• USIP should continue to probe the external peace and security community about its perceptions of the program's impact. The information collected can assist USIP in reaching out to a broader audience, better tailoring its message, and improving competition for the Fellowship by increasing the number of qualified applicants.
1. The information collected should include topics from the survey of experts. Additional topics might include whether peace and security experts had collaborated on research or other projects with Fellows; whether they used work produced by Fellows in their research, teaching, or practice; or particular Fellows' products that experts thought were especially useful or influential.
2. Information should be collected from a broad range of experts, including academics, nongovernmental/nonprofit organization employees, and government employees. The committee's survey focused on academics at peace and conflict centers, but many more academics would be included in the relevant population; USIP may wish to partner with relevant professional associations or develop its own list of relevant academics. An important point is that, for much of this information to be useful, USIP needs to pre-identify for inclusion in any future surveys those individuals who have some familiarity with the program. Such a list could start with, but must go beyond, participants in USIP events and activities. Likewise, the committee's survey focused on representatives of NGOs with a focus on peace and conflict; one could also look to NGOs with a regional focus that work on conflict issues as a subtheme.
• USIP should consider mixed modes of data collection, reflecting the challenges of tapping different types of respondents' views.
Getting in touch with government employees proved the most difficult, and they were the least likely to answer the survey. Future efforts to reach government employees would be better accomplished through expert panels or face-to-face interviews, which is facilitated by the fact that USIP staff and government employees are co-located in Washington, DC. It should be noted that for some potential respondents, confidentiality could be an issue. Again, USIP needs to pre-identify individuals who have some familiarity with the program. Academics responded well to the survey, so it may be worthwhile to conduct a more targeted survey of a broader range of academics, assuming a better population can be identified; academics can also be reached for expert panels or interviews at major conferences or other venues. NGO employees are, like academics, spread throughout the nation, so they too may best be reached via a survey, although they may require more follow-ups than academics to ensure an adequate response rate.
• USIP's future research on the views of the expert community should seek more in-depth commentary on the impact of the program. The committee asked a question on familiarity; a logical follow-up is to probe how experts hear about the program and their connections to the
Fellowship (e.g., attending Fellows' briefings, reading Fellows' reports, etc.). The committee also asked a question on prestige; follow-up questions might focus on what makes the Fellowship prestigious and what it is about Fellows or their work that stands out. The committee did not ask about issues of balance or priorities, although it received some open-ended comments. One comment noted earlier had to do with the appropriate balance between scholars and practitioners (to the extent that there is a divide between them); related to this is the question of what types of people should be Fellows (e.g., from which disciplines). Finally, future research could explore experts' views on what regions or topics USIP Fellows ought to be covering. Such research could provide valuable information on how the direction of USIP matches the perceptions of the external community.

Improving the Fellowship Experience

Based on the survey results, the committee recommends that certain steps be considered to improve the Fellowship:
• Explore setting up an alumni network for former Fellows. Such a network could take advantage of the current USIP website or involve a new product, for example a social networking site. One way to facilitate a network would be to hold a meeting of Fellows designed to build it.
• Consider establishing support from businesses or associations in the community to help Fellows and their families cope with the expenses of life in the D.C. area.
• Consider the potential for, and ramifications of, allowing extensions of the Fellowship in individual cases. Some Fellows, and USIP itself, may benefit greatly from having individual fellowships extended for a few months. In addition, USIP might consider greater flexibility in travel and support options for research outside D.C., especially internationally, during the Fellowship.