
2

Overview of the Committee’s Information-Gathering Activities

As noted in Chapter 1, the committee undertook several information-gathering activities to gain an in-depth understanding of the Minerva Research Initiative and perceptions of the program. This chapter describes those activities, which included (1) reviews of Department of Defense (DoD) and other public records; (2) interviews with current and former DoD staff; (3) a grantee survey; (4) a survey of administrators of sponsored research at academic institutions; (5) the annual Minerva Conference; and (6) public information-gathering sessions held at the National Academies of Sciences, Engineering, and Medicine. Along with the data sources described in this chapter, the committee also issued a public call for comments, which was posted on the National Academies website and circulated in electronic newsletters. In this chapter, descriptions of the above information-gathering activities are followed by a discussion of the limitations of the committee’s evaluation.

Table 2-1 provides an overview of how the committee’s key research questions, discussed in Chapter 1, align with the main data sources that yielded information with which to address these questions. Table 2-2 shows the detailed list of questions included in the committee’s statement of task and the data sources used to address them.

REVIEWS OF DEPARTMENT OF DEFENSE AND OTHER PUBLIC RECORDS

The committee asked DoD to provide any available historical documentation about the program, including information about the grants that have been awarded over the years.


TABLE 2-1 Key Research Questions for the Minerva Program Evaluation and Relevant Data Sources

Data sources: DoD and Other Public Records; Interviews with DoD Staff; Grantee Survey and Grantee Conference; Survey of Sponsored Research Administrators; Input from National Security Experts and Other Stakeholders; and Discussions with Other Funders.

  1. How well does Minerva operate, and how does it compare with other basic social science research programs? (all six data sources)
  2. What research output has been supported by Minerva grants, and what is the quality of Minerva-supported research? (five of the six data sources)
  3. Should and how can DoD make better use of the insights and tools/products of Minerva-supported research? (five of the six data sources)
  4. Should and how can DoD change (1) the vision of Minerva; (2) the process for setting priorities/selecting research topics; and (3) the selection of projects to fund so as to meet contemporary, changing national security challenges, as well as the needs of each service branch, more effectively? (five of the six data sources)
  5. Should and how can DoD increase—deepen and broaden—the engagement of social scientists in research relevant to national security? (all six data sources)

TABLE 2-2 Minerva Program Evaluation Detailed Questions from the Statement of Task and Relevant Data Sources

Data sources (as in Table 2-1): DoD and Other Public Records; Interviews with DoD Staff; Grantee Survey and Grantee Conference; Survey of Sponsored Research Administrators; Input from National Security Experts and Other Stakeholders; and Discussions with Other Funders.

Part I. Quality and Impact

  1. What has been accomplished after eight years of the program in terms of (a) basic science advances; (b) policy-relevant insights or tools for the security community? (four of the six data sources)
  2. What is the quality of research funded and its impact on the social science knowledge base, as well as on public understandings of the problems addressed by the researchers? (four of the six data sources)
  3. What challenges has Minerva confronted in generating interest in participation in the program among basic social scientists and how has it addressed those challenges? (three of the six data sources)
  4. Has Minerva effectively fostered the development of communities working on social science issues around security, and the creation of organizational structures and processes to advance this research? (five of the six data sources)
  5. What communities have benefited from Minerva-supported research and how would those benefits be characterized? (five of the six data sources)
  6. Is Minerva unique as a funding source or are there other agencies/organizations funding similar research at similar levels? (five of the six data sources)
  7. What is the relationship between basic research and applied insights of the research that Minerva seeks to generate? (four of the six data sources)

Part II. Program and Function

  1. How does the proposal review process compare to similar programs at NSF, DHS, and processes that the service branch research agencies use? (five of the six data sources)
  2. How does the project implementation and management process compare to similar programs at NSF, DHS, and the service branch research agencies? (five of the six data sources)
  3. Are the right projects being prioritized for (a) national security needs, generally speaking; and (b) the particular missions of the service branch research agencies? (three of the six data sources)
  4. Is the program successful in connecting researchers to policymakers? (three of the six data sources)
  5. How might the program improve outreach and integration of basic research insights into DoD? (four of the six data sources)

Part III. Direction and Vision

  1. Has the vision that initiated MRI evolved, or are there ways in which it needs to evolve to better address contemporary security concerns? (three of the six data sources)
  2. How can Minerva shape the future of basic research in social science around the issues of security? (four of the six data sources)
  3. How is Minerva influencing academic disciplines in their engagement with security and facilitating interdisciplinary and cross-disciplinary research, and are there opportunities for improving these efforts? (four of the six data sources)
  4. How might Minerva cultivate the interests of young scholars in working with DoD on social science security issues? (four of the six data sources)

The documents received from DoD, including memorandums, speeches, and presentations, provided valuable background and contextual information about the program. To complement this information, National Academies staff compiled additional background information and available records relevant to the Minerva program from public sources.

The committee was unable to obtain a definitive list of all the Minerva grants from DoD because no centralized database of the funded studies exists. The primary reason for the lack of such a database is that, as described in Chapter 1, the Minerva program is a collaboration among several units within DoD, and individual grants are executed and overseen by program managers based at the basic research organizations of the three military service branches: the Air Force Office of Scientific Research (AFOSR), the Army Research Office (ARO), and the Office of Naval Research (ONR). As noted in Chapter 1 and discussed in detail in Chapter 3, the role of the service branches has changed over the years; however, the program managers have always followed their own branch’s contracting, record keeping, and reporting practices. DoD began work on developing a central database of the Minerva grants around the time when the committee’s evaluation began, but this work was still ongoing when the evaluation was completed.

To compensate for the lack of a database of grants, National Academies staff worked with the Minerva program director to compile a list of grantees and information about the characteristics of the grants based on public records and other DoD materials, such as the Minerva Research Summaries from Minerva Conferences. The list of grants derived in this way is included in Appendix C and was used for the committee’s grantee survey (described below), as well as for other analyses included in this report. However, questions remained throughout the course of the study about whether this list of grantees was truly comprehensive.

The committee also attempted to obtain progress reports and final reports submitted to DoD by grantees as part of their reporting requirements. Again, these records are stored not in a central repository but at the service branches, in accordance with their differing practices. The committee was able to obtain an annual or final report for only about half of the grants, and these were primarily grants managed by ARO and ONR. While the reports obtained provided information useful to the committee, the missing reports and the inconsistent content in the reports that were available made them inadequate as a source for a systematic understanding of the quantity and quality of the outputs produced by the grants. For this reason, the committee requested a list of outputs from the grantees as part of the grantee survey discussed below.

INTERVIEWS WITH DEPARTMENT OF DEFENSE STAFF

To complement the information obtained from available records related to the Minerva program and learn about the perspectives of program staff, the committee conducted individual telephone interviews with current and former DoD staff involved with the Minerva Research Initiative over the years. Interviewees included current and former Minerva program directors, staff from the Basic Research Office and the Office of Policy, Minerva program managers within the military service branches, and others affiliated with the program.

A total of 14 telephone interviews were conducted between September 4 and October 8, 2018. Depending on the extent of current involvement in the program, interviews lasted from 40 minutes to 2 hours. The interviews were conducted by a National Academies senior program officer using a semistructured interview guide developed by the committee. Appendix D shows the detailed interview guide; the broad topics covered in the interviews were

  • structure and management of the program,
  • successes and challenges of the program,
  • quality of the research funded,
  • use of the research funded,
  • selection of topics to fund,
  • selection of projects to fund,
  • broadening of engagement among social scientists, and
  • vision of Minerva.

In addition to answering the questions asked of all interviewees, Minerva program managers and the current Minerva program director were asked to describe in detail the steps involved in managing the program, including the processes for

  • selecting topics to fund,
  • soliciting submissions,
  • reviewing proposal submissions,
  • selecting projects to fund,
  • awarding the grants,
  • managing the grants and monitoring grant progress and performance,
  • supporting dissemination activities, and
  • supporting translation activities.

GRANTEE SURVEY

The grantee survey was a census of all Minerva grantees on the list of grantees compiled in collaboration with DoD, including projects administered through the National Science Foundation (NSF). The survey was conducted by the National Opinion Research Center (NORC) at the University of Chicago.

For each of the grants, the principal investigator (PI) of record was asked to respond to the survey. PIs who had been awarded more than one Minerva grant over the years (six cases) received one invitation to complete the survey and were asked to think about all of their grants when answering the questions. The result was a universe of 102 PIs.

Appendix E includes the wording of each of the questions on the survey, which covered

  • experiences with Minerva grants compared with other social science grant programs,
  • broadening of engagement among social scientists,
  • challenges associated with conducting research relevant to national security,
  • opportunities resulting from the grant, and
  • outputs resulting from the research.

The survey was conducted via the web, and potential respondents were initially contacted by email. NORC sent all individuals identified as PIs an email inviting them to participate in the study and conducted locating searches to find valid email addresses for those whose contact information (initially compiled by National Academies staff) was invalid. Nonrespondents were sent up to four additional emails urging them to complete the survey: the first reminder went out 2 days after the initial invitation, and each subsequent reminder was sent 1 week after the previous one over the following 3 weeks. Those who still had not completed the survey after the four email reminders received a hard-copy letter via FedEx encouraging their participation.

Initially identified PIs who felt that a colleague (such as a co-PI) was better equipped to complete the survey were offered the opportunity to delegate the task accordingly. In the initial recruitment and subsequent reminder emails, the originally identified PIs were informed that they could provide the study team with contact information for such a colleague. Those to whom the survey was delegated then received a tailored recruitment email explaining that a colleague had delegated the survey to them. Delegated grantees who failed to complete the survey received up to the maximum number of nonresponder emails in accordance with the study’s nonrespondent prompting protocol.

In addition to responding to the survey questions, grantees were asked to provide a list of all of the publications and other outputs (such as manuscripts, presentations, briefings, and testimony) that had resulted from their Minerva grant. To make this task as easy as possible, respondents were offered several mechanisms for completing it. One option was to submit this information as part of the survey, by typing it or copying and pasting it into the web survey fields. The other options offered were to upload a curriculum vitae (CV) with the grant-related items highlighted or to email a list or a CV to a Minerva survey email inbox. The data collection contractor also accommodated additional requests, such as providing a link to a grant website. Given the higher burden associated with submitting this information (relative to answering the survey questions), grantees who provided responses to the survey questions but did not submit their lists of outputs received up to two additional email reminders to provide this further information. The information obtained about the grant outputs was then compiled from the different sources and organized into a database.

Table 2-3 shows the completion rates for the survey, along with the numbers of partials and nonresponse/refusals. Of the 102 grantees asked to complete the survey, 76 did so (a 75% completion rate). Three additional grantees submitted partially completed surveys, which were included in the analyses described in this report whenever possible.

Of those who completed the survey, 67 (88%) also responded to the request for information on grant outputs. This number includes 4 grantees who as yet had no outputs. For such a relatively complex and burdensome request on a web survey, some amount of misreporting is to be expected. It is possible that some PIs inadvertently reported outputs that were closely related to their Minerva research but, technically, were supported by another source of funding. It appears more likely, however, that Minerva-supported outputs were unintentionally omitted from the lists provided by PIs, and thus underreported. Comparison of the lists provided in response to the grantee survey with the lists included in progress and final reports to DoD suggests that this may be the case. However, the previously discussed limitations of the information obtained from those reports prevented a more complete analysis of, or correction for, survey misreporting.

TABLE 2-3 Rates of Completion, Partial Completion, and Nonresponse/Refusal for the Grantee Survey

                           Number of Grantees    Percent
Total                      102                   100
Completes                   76                    75
  Grant Outputs Reported    67                    88
Partial Completes            3                     3
Nonresponse/Refusals        23                    22

NOTES: Grant Outputs Reported is a subset of Completes; the 67 grantees who reported grant outputs include 4 whose grants had recently been awarded and as yet had no outputs. The percentage for Grant Outputs Reported (n = 67) is shown as a percentage of the number of completes (n = 76); all other percentages are based on the total (n = 102).
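
The table thus uses two different denominators. A minimal arithmetic sketch of how the displayed percentages follow from the counts (the counts are taken from the table; the whole-number rounding and helper name are illustrative assumptions):

```python
# Counts from Table 2-3 (grantee survey).
total_grantees = 102
completes = 76
outputs_reported = 67  # subset of completes
partial_completes = 3

def pct(numerator, denominator):
    """Percentage rounded to a whole number, as displayed in the table."""
    return round(100 * numerator / denominator)

assert pct(completes, total_grantees) == 75         # 76 of 102
assert pct(partial_completes, total_grantees) == 3  # 3 of 102
assert pct(outputs_reported, completes) == 88       # 67 of 76, not 67 of 102
```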


The lack of a fully accurate and comprehensive list of all papers, presentations, and other materials and activities resulting from each Minerva grant due to both nonresponse and misreporting on the grantee survey limited the committee’s ability to quantify the outputs of the grants. Further limiting what could be learned about the quantity of outputs was the fact that the program is relatively new and that many of the grants were still in progress at the time of this study (grants awarded from 2015 through 2017 represented approximately 40 percent of the projects funded). In addition, it is possible that grant-funded research tends to be more highly cited on average than research that is not grant-funded if grants make more resources available to projects, especially for dissemination. Thus, one limitation of the committee’s analysis is that the outputs could not be rigorously compared with outputs from other research, and had to be discussed in the context of academic research productivity in general. Nonetheless, the committee was able to gain a solid understanding of the breadth and quality of the outputs (as discussed in Chapter 4) and to draw conclusions that should be relatively robust to the limitations of the available data.

Table 2-4 shows the rates of response to the survey questions and to the request for information on grant outputs by several key grantee characteristics. Response rates overall were high, but there were some differences by subgroup. The rates were highest among the grantees who received funding during the first few years of the program (2009–2012) and lowest among those who received funding during later years but prior to 2017 (2013–2016). Response rates were particularly high among the early NSF grantees and low among grantees whose projects were overseen by AFOSR.

To enable comparison of response rates by the PI’s academic discipline, the field of doctoral degree was used as a proxy. Some fields that were represented by only one or two grantees were grouped together to maintain confidentiality. “Other social science fields” includes such fields as anthropology, demography, criminology, and geography, while “other fields” includes earth and ocean sciences, engineering, law, and physics. Response rates were lowest among those with degrees in mathematics and computer sciences.

To compare response rates by the seniority of the PIs, the Scopus database was consulted to determine the year of the PI’s first peer-reviewed publication, which was used as a rough measure of seniority (for discussion of the limitations of the Scopus database, see Chapter 4). Response rates were highest among PIs whose first peer-reviewed publication appeared between 1995 and 2008, in other words, among midcareer researchers.


TABLE 2-4 Rates of Response to the Grantee Survey, by Grantee Characteristics

                                            Completed Survey    Provided Information    Number of
                                            Questions (%)       on Outputs (%)          Grantees
Grant Start Year
  2009–2012                                 84                  65                      37
  2013–2016                                 73                  65                      52
  2017                                      77                  69                      13
Program Manager
  Air Force Office of Scientific Research   50                  39                      18
  Army Research Office                      79                  64                      33
  Office of Naval Research                  85                  76                      33
  National Science Foundation               89                  78                      18
Principal Investigator’s Academic Discipline
  Economics                                 75                  63                       8
  Mathematics and Computer Sciences         58                  58                      12
  Political Science                         81                  77                      53
  Psychology                                73                  36                      11
  Sociology                                 83                  83                       6
  Other Social Science Fields               83                  50                       6
  Other Fields                              83                  33                       6
Principal Investigator’s Earliest Peer-Reviewed Publication (Seniority)
  1968–1994                                 72                  58                      36
  1995–2008                                 81                  72                      54
  2009–present                              75                  58                      12
Number of Publications in Scopus
  1–17                                      82                  68                      34
  18–40                                     79                  73                      33
  41–421                                    71                  57                      35
Number of Citations in Scopus
  1–295                                     82                  67                      33
  296–1,219                                 79                  76                      33
  1,220–15,825                              70                  52                      33

The number of publications in Scopus was used as a proxy for researcher productivity, and the number of citations in Scopus was used as a proxy for influence. Rates of response to both the survey and the request for information on outputs were lowest among researchers with the most publications and highest number of citations.
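
Table 2-4 splits the PIs into roughly equal thirds on these Scopus counts. A minimal sketch of how such terciles might be formed and response rates tabulated, using synthetic data (the column names, values, and cut points are illustrative only, not the committee’s actual files):

```python
import numpy as np
import pandas as pd

# Synthetic per-PI Scopus counts and survey outcomes, for illustration only.
rng = np.random.default_rng(0)
pis = pd.DataFrame({
    "scopus_publications": rng.integers(1, 422, size=102),
    "scopus_citations": rng.integers(1, 15_826, size=102),
    "completed_survey": rng.random(102) < 0.75,
})

# Split PIs into terciles (three roughly equal groups), as in Table 2-4.
pis["pub_tercile"] = pd.qcut(pis["scopus_publications"], q=3,
                             labels=["low", "middle", "high"])

# Response rate (percentage completing the survey) within each tercile.
print(pis.groupby("pub_tercile", observed=True)["completed_survey"]
         .mean().mul(100).round(0))
```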

Open-ended responses were coded with a coding scheme developed by NORC and approved by the committee. Responses were double-coded and then reviewed to ensure intercoder reliability. When coders disagreed about a code, the project team met to discuss and reconcile the code.
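
Double coding of this kind is typically checked by comparing the two coders’ labels item by item and flagging disagreements for reconciliation. A minimal sketch of such a check, using hypothetical codes and a simple percent-agreement measure (the report does not specify which agreement statistic NORC used):

```python
# Hypothetical codes assigned independently by two coders to the same open-ended responses.
coder_a = ["funding", "policy", "dissemination", "funding", "other"]
coder_b = ["funding", "policy", "funding", "funding", "other"]

# Flag items where the coders disagree so the project team can reconcile them.
disagreements = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
agreement = 1 - len(disagreements) / len(coder_a)
print(f"Percent agreement: {agreement:.0%}; items to reconcile: {disagreements}")
```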

The survey was designed to allow respondents to skip any questions they did not wish to answer, including those set up in a Yes/No format. As is typical for survey questions of this type, which read much like a “check all that apply” request, some respondents selected Yes for a few items and then moved on to the next screen without selecting a response for the remaining items. Based on the context and intent of these questions and on prior experience with similar questions and missing-data patterns, the data collection contractor recommended recoding missing responses as No whenever the respondent had selected at least one Yes response to a question of this type.
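
This recode amounts to a simple conditional fill. A minimal sketch of the logic, assuming one pandas column per Yes/No item (the column names and data are hypothetical, not NORC’s actual processing code):

```python
import pandas as pd

# Hypothetical responses to a multi-item Yes/No question; None means the item was skipped.
items = ["item_a", "item_b", "item_c"]
df = pd.DataFrame({
    "item_a": ["Yes", None, None],
    "item_b": [None, None, "Yes"],
    "item_c": ["Yes", None, None],
})

# Recode skipped items to "No" only for respondents who answered Yes to at least one
# item in the question; respondents who skipped the entire question remain missing.
answered_any_yes = df[items].eq("Yes").any(axis=1)
df.loc[answered_any_yes, items] = df.loc[answered_any_yes, items].fillna("No")
print(df)
```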

NORC provided a fully labeled datafile of the survey data for the committee. Frequency distributions are shown in Appendix E.

SURVEY OF ADMINISTRATORS OF SPONSORED RESEARCH AT ACADEMIC INSTITUTIONS

The survey of administrators of sponsored research was a census of 222 administrators at institutions with “highest research activity” and “higher research activity” based on the Carnegie Classification of Institutions of Higher Education, a framework for categorizing academic institutions in the United States (Indiana University Center for Postsecondary Research, 2016). The person asked to complete the survey was the director of the office of sponsored programs (or equivalent) at universities where a position of this type existed. In cases in which this position did not exist, the vice president for research or dean of research was contacted. In some cases, it was necessary to make a judgment call about the most appropriate person to contact, but as with the grantee survey, sample members were given the opportunity to delegate the survey to someone else if one of their colleagues was better suited to respond.

Appendix F includes the wording of each of the survey questions, which addressed

  • experiences with Minerva grants compared with other social science grant programs,
  • broadening of engagement among social scientists, and
  • perceptions of research relevant to national security.

NORC carried out the data collection for this survey using a list of universities and target individuals identified by the committee. The data collection and processing procedures for this survey were similar to the approach used for the grantee survey. The only notable difference was that instead of a FedEx reminder, an additional email prompt was sent to nonresponders to this survey. Table 2-5 shows the completion rates for the survey. Out of 222 administrators, 88 completed the survey (a 40% completion rate). An additional 18 individuals submitted partially completed surveys, which were included in the committee’s analyses. Table 2-6 shows that the response rates by institution type (highest versus higher research activity) were similar.

Missing data for items in questions with a Yes/No format were handled as with the grantee survey. NORC provided a fully labeled datafile of the survey data for the committee. Frequency distributions, along with the wording of the questions, are shown in Appendix F.

TABLE 2-5 Rates of Completion, Partial Completion, and Nonresponse/Refusal for the Survey of Administrators of Sponsored Research

                       Number of Administrators    Percent
Total                  222                         100
Completes               88                          40
Partial Completes       18                           8
Nonresponse/Refusals   116                          52

TABLE 2-6 Rates of Response to the Survey of Administrators of Sponsored Research, by Institution Type

Institution Type             Number of Administrators    Completed or Partially Completed Survey (%)
Highest research activity    115                         46
Higher research activity     107                         50
Overall                      222                         48


MINERVA CONFERENCE

On September 26–27, 2018, three committee members and a National Academies staff member attended the annual conference of Minerva grantees, held in Washington, DC. The conference featured presentations from a subset of the grantees and breakout sessions to discuss the National Defense Strategy. Time also was set aside for the committee to describe its evaluation and solicit input in the form of a floor discussion. Participation in this event provided a valuable opportunity for the committee to gain an in-depth understanding of the grantees’ work, listen to their questions and comments, and experience firsthand the Minerva program’s main dissemination event.

PUBLIC INFORMATION-GATHERING SESSIONS

Input obtained from stakeholders and experts during public meetings with the committee was an important source of information for the committee’s evaluation. Efforts were made to reach out to a broad range of stakeholders with diverse viewpoints. The committee met with grantees, national security experts, representatives of social science organizations, staff from other government agencies and organizations with similar social science grant programs, and others. These stakeholders provided valuable background information and perspectives that informed the committee’s deliberations. Appendix G lists the individuals who provided input during these public meetings, held on January 16, April 12, July 19, and October 2, 2018.

LIMITATIONS OF THE COMMITTEE’S EVALUATION

In addition to the data availability and quality challenges described above, several overall limitations are associated with the committee’s evaluation.

First, as is always the case with a study of this type that does not involve a randomized or quasi-experimental design, it was not possible to draw causal inferences about the impact or effectiveness of the Minerva program. The committee considered constructing a comparison group of academic researchers in social science fields, but ultimately determined that it was not feasible to do so in a rigorous way that would hold up to scrutiny. Thus there was no valid basis for ascertaining the counterfactual—the research that would have been conducted (and by whom) in the absence of the Minerva program.

Second, while 10 years into the program’s operation might appear to be a timely point for an evaluation, significant changes to the program’s structure and functioning were introduced in the same year that this study was launched (see Chapters 1 and 3). Thus it was difficult to obtain a good understanding of the processes involved in the program, and particularly to disentangle perceptions about aspects of the program that predated the changes from the program’s current characteristics. More important, because the changes were so recent, it was not possible to assess how well the new approach was functioning.

Third, because of DoD’s confidentiality policies, the evaluation did not include a survey of those who had applied for Minerva funding but did not receive grants. Nor did the evaluation include a survey of social science researchers who conduct research relevant to national security but have never applied for a Minerva grant.

Acknowledging these limitations, the committee did reach out to a broad range of stakeholders and issued a call for comments during the course of the study. Through the mechanisms described in this chapter, the committee was able to amass a substantial volume of information and input reflecting diverse perspectives. As with any National Academies study, the findings and recommendations in this report are based on the collective judgment and expertise of the committee, as informed by the available information.
