5
Preliminary Results and Recommendations

PRELIMINARY RESULTS

An overall conclusion of the report is that some of the data that would support a full-fledged evaluation of the NIST/NRC RAP are simply not collected at this time. Some data are collected, but the number of RAs and advisers filling out the forms remains small, which may mean that the results based on those responses are not representative of all RAs or all advisers. Thus, caution must be exercised in reading the results. With those caveats firmly in mind, there are a number of interesting findings.

Turning first to applicants, the application form has proved a very useful data collection instrument. Among the three current instruments (application form, final report, adviser's evaluation), the application form has produced the most data. Further, personal communication is the most likely means by which applicants hear about the RAPs, including the NIST/NRC RAP. Key findings regarding how applicants heard about the program were:

  • Applicants to the NIST/NRC RAP were twice as likely as applicants to the other RAPs to hear about the position initially from their Ph.D. advisor or other professor and somewhat more likely to hear about the program from colleagues or fellow graduate students, but less likely to hear about the program from a research advisor or other scientific staff at the federal laboratory.

  • The most common sources of information for applicants to the NIST/NRC RAP were professors or colleagues.

  • Male and female applicants heard about the NIST/NRC RAP similarly, except via presentations at professional meetings, which women cited twice as often as men.

  • There were no differences by race/ethnicity in how applicants to the NIST/NRC RAP heard about it.

Outreach efforts produce more qualified applicants than NIST has slots to fill, and the pool of applicants includes many from top research institutions and is increasingly diverse. Overall, 22 percent of applicants were awarded an appointment, a lower ratio than for the other RAPs. Women are increasingly applying to the NIST/NRC RAP and being awarded research associateships; the NIST/NRC RAP appears to be as popular among women as the other RAPs. Underrepresented minorities are likewise increasingly applying and being awarded research associateships. At least half of both applicants to and awardees of the NIST/NRC RAP came from 20 of the top doctoral-granting institutions in the United States.

Applicants and awardees to the NIST/NRC RAP differ from their counterparts in the other RAPs. Since 1990, underrepresented minorities have been proportionately more likely to be awarded a NIST/NRC Research Associateship than another research associateship. Applicants to, and awardees of, the NIST/NRC RAP are younger on average than those who apply to the other programs. NIST/NRC RAP applicants and awardees are more likely to be single, and they are more likely to hold Ph.D.s in the physical sciences than in the biological sciences. The majority of awards go to



doctorates from the physical sciences, but because there are so many applications from this discipline, only about one in five applicants with this background receives an award. Preliminary analysis suggests that labs receive different numbers of applications and that awards are not made uniformly across labs. Some awardees do decline NIST/NRC Research Associateships, though the percentage of declined offers is often lower than that for the other RAPs and has declined over time.

Turning now to an assessment of the experiences of Research Associates, currently available data do not allow for a program evaluation of the immediate outcomes of the Program. Little data are collected on Research Associates' experiences or on research advisors' evaluations of RAs. Data are also not collected on the value of the program to NIST or to the broader scientific and engineering community.

Second, with the caveat that this conclusion is based on very limited data that may be biased by nonresponse, NIST/NRC RAs are as productive as RAs in the other Programs. NIST/NRC RAs were statistically more likely to receive an award or give domestic presentations than RAs in other Programs; conversely, they published fewer journal articles. While these differences were statistically significant, however, they were not substantively large. NIST/NRC RAs patent and give international presentations at rates comparable to RAs in other Programs.

Finally, and subject to the same caveat, RAs are quite satisfied with the Program. On a scale of 1 to 10, with 10 being excellent, NIST/NRC RAs rated the short-term and long-term value of the program, as well as lab, advisor, and administrative (NIST and NRC) support, between 7.7 and 8.5. In half the categories, NIST/NRC RAs and Research Associates in other programs reported statistically similar levels of satisfaction; in the other half, other RAs reported higher levels of satisfaction.
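The distinction drawn here between statistical significance and substantive size can be illustrated with a pooled two-proportion z-test. The counts below are hypothetical, not taken from the report; they are chosen only to show that, with large samples, a small gap in rates can clear the conventional |z| > 1.96 threshold while remaining modest in practical terms.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test.

    Returns the z statistic and the raw difference in rates, so
    significance and substantive size can be judged separately.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se, p1 - p2

# Hypothetical counts (not from the report): 52% vs. 48% of two
# groups of 2,000 postdocs giving domestic presentations.
z, diff = two_proportion_z(1040, 2000, 960, 2000)
# |z| exceeds 1.96, so the gap is significant at the 5 percent
# level, yet the substantive difference is only 4 percentage points.
```

With samples this large, the test flags a difference that a program manager might still reasonably regard as small, which mirrors the pattern reported above for publication counts and presentations.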
Finally, looking at the careers of former RAs, preliminary evidence, which is quite limited, suggests that RAs contribute to the pool of qualified applicants for permanent positions at NIST. About 45 percent of RAs reported that their immediate post-tenure position was at NIST as a permanent, temporary, or contract employee, a higher percentage than for RAs at other federal agencies. A survey of former RAs found that a higher percentage of former NIST/NRC RAs stayed at NIST than RAs at other federal agencies stayed at their host agencies (37.6 versus 28.1 percent).

Second, evidence on the outcomes of the Program is largely lacking. Little data are collected on the career outcomes of former RAs or on the value of the program to NIST or to the broader scientific and engineering community.

RECOMMENDATIONS

1. NIST should conduct a more thorough evaluation of the NIST/NRC Research Associateship Program.

   a. As a first step, NIST and the NRC should review the specific goals of the program.

   b. The evaluation should include the following components: an assessment of outreach to potential applicants; an assessment of individuals who decline to accept a Research Associate position; an assessment of the benefits of the program to RAs after they complete their appointments; an assessment of the benefits to NIST of hosting RAs; and an assessment of the benefits of the Program to the broader scientific and engineering community.

2. NIST should conduct an evaluation of outreach efforts.

   a. To conduct such an evaluation, data need to be collected. In this regard, the question on the application about how applicants heard about the program is helpful and should be retained. However, the "Other" category should be further analyzed, and "Website" should be added as a response category.

   b. Additional data could be collected from NIST personnel and former or current NIST RAs. Such data could be used to answer questions such as:

      i. What mechanisms do NIST personnel and RAs use to interact with potential applicants?
      ii. Which mechanisms seem to work best?
      iii. Has there been any effort to focus specifically on diversity? If so, how?

   Such research could be undertaken via a combination of expert panels or surveys of NIST staff and current or former RAs to answer the first and third questions and to provide information for an assessment of the second. Information should also be collected on the costs of individual outreach efforts (e.g., money spent on advertisements, time spent meeting with graduate students) to compare against the benefits (how many applicants each outreach type produces).

   c. A second step to facilitate an evaluation of outreach efforts is to identify metrics for quantifying the value obtained from different outreach strategies, such as hits to the website or the number of graduate students met at professional meetings.

   d. Examine individual outreach strategies for return on investment. This could include assessing the NIST website for usability and informational content or assessing the return on advertising in publications. As part of the website assessment, NIST could consider adding contact information for Research Advisors to facilitate a dialogue between potential applicants and relevant NIST staff.

   e. Consider whether there might be other outreach strategies that are currently underused and that might have potential value, such as direct mail to deans, department heads, and other university administrators.

   f. Finally, determine whether any groups of graduate students, and thus potential applicants, who would make good candidates for the NIST/NRC RAP are unaware of the Program and how to apply. It would be difficult to craft a random sample of graduate students, but a limited survey might be possible.

3. NIST should conduct an evaluation of individuals who decline offers of Research Associateships. This could be done as a telephone interview or via a survey. As only a few people decline each cycle, the burden would be relatively small. Two basic questions should be asked of those who are awarded but decline: (1) Why are you declining? and (2) What are you planning to do instead?

4. The NRC should amend the application form. The number of fields should be reduced, in particular by collapsing very similar labels and by removing labels that span multiple fields (e.g., "Biophysics Physics Biochemistry"). At least with regard to Ph.D. fields, an example of a smaller field list is found in the NSF's Survey of Earned Doctorates (see Appendix B).

5. The NRC should update the DataRAP database to replace organizational names (e.g., institutes or labs) that no longer exist at NIST with their current equivalents.

6. NIST should conduct a more thorough assessment of RAs' experiences during the postdoctoral appointment, their satisfaction with and views on the benefits of the Program, and NIST staff's satisfaction with and views on the benefits of the Program.

   a. To assist in this, the NRC should redesign the final report and the Research Advisor's evaluation form to maximize the data collected from these instruments (see Box 3-1 and Box 3-2 for suggested questions).

   b. The final report and the Research Advisor's evaluation should be made mandatory.

   c. Some elements of the currently collected data could be subjected to further analysis.

      i. For example, NIST may wish to analyze peer-reviewed journal articles further, for example by:

         1. asking whether the RA was sole or lead author,
         2. examining whether RAs publish with NIST staff, and
         3. examining the quality of the journals in which RAs publish, although this requires some ranking of journals.

      ii. NIST may wish to conduct an impact analysis of RAs' productivity, for example by:

         1. conducting a citation analysis to see how often RAs' publications are referenced by others (this can be accomplished using citation indexes), or
         2. assessing the type or size of grants postdocs receive.

      iii. NIST may wish to conduct a more thorough review of its support of RAs, asking how familiar RAs are with NIST administrative offices, how often they turn to those offices for help, and for what reasons.

   d. NIST could also conduct a social network analysis of the collaborations of RAs (or of NIST employees) to see how the Research Associateship Program facilitates new or wider collaboration among scientists and engineers.

   e. When data allow, NIST could consider disaggregating productivity and satisfaction measures for RAs by lab, gender, and race/ethnicity.

7. NIST should conduct a broad evaluation of the careers of former RAs to evaluate the impact of the Program on RAs' careers, on NIST, and on the broader science and engineering community. The best approach is a survey comparing the career outcomes of NIST/NRC RAs with those of similar postdocs. The survey would be directed at former RAs and a suitable control group.
Ideally, two comparisons could be made. First, one could construct a peer group: a matched or stratified sample of individuals who held postdocs similar to those at NIST. Although not ideal, one solution would be to take a stratified sample of former RAs from the Fellowships Office's Directory. The Directory is a census of former RAs, but, as noted earlier in the report, many of these individuals could not be found or failed to respond to an earlier survey designed to collect information on their current employment. A second comparison group would consist of similar doctorates. A roster could be assembled by tapping the group of applicants to RAPs who did not receive an award. These individuals will likely exhibit a diversity of career paths, including some who took postdocs (in academia or industry) and others who went straight into employment.47

47 An alternative approach is to construct a comparison group from the NSF's Survey of Doctorate Recipients by identifying a group of former postdocs. For an example of a report that uses this approach, see Oak Ridge Institute for Science and Education, 2003.
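The stratified sample described above could be drawn along the following lines. This is a minimal sketch: the roster, the field labels, and the 20 percent sampling fraction are illustrative assumptions, not data from the Fellowships Office's Directory.

```python
import random
from collections import defaultdict

# Hypothetical roster of former RAs; names and field labels are
# invented, not taken from the Fellowships Office's Directory.
roster = [
    {"name": f"RA_{i}", "field": field}
    for i, field in enumerate(
        ["physics"] * 50 + ["chemistry"] * 30 + ["biology"] * 20
    )
]

def stratified_sample(records, key, fraction, seed=0):
    """Draw the same fraction from each stratum so the sample
    mirrors the roster's composition on the stratifying variable."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for rec in records:
        strata[rec[key]].append(rec)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, round(len(group) * fraction)))
    return sample

sample = stratified_sample(roster, "field", 0.2)
# 100 records at a 20 percent fraction yield 20: 10 physics,
# 6 chemistry, and 4 biology, matching the roster's proportions.
```

Stratifying rather than sampling the roster at random guarantees that smaller fields are represented in proportion to their share of the population, which matters when comparing career outcomes across disciplines.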