9 Funding and Evaluation of Team Science
Pages 197-214

The Chapter Skim interface presents the single chunk of text algorithmically identified as most significant on each page of the chapter.


From page 197...
... Recognizing this problem, the National Science Foundation (NSF) commissioned the current study to enhance its own understanding of how best to fund, evaluate, and manage team science, as well as to inform the broader scientific community (Marzullo, 2013)
From page 198...
... academic institutions that provide seed money or infrastructure, and (5) nonprofit organizations that obtain funding from private donors and/or the general public and use it to fund team science research (e.g., Stand Up to Cancer).
From page 199...
... Prioritizing Research Topics and Approaches Public and private funders work closely with both the scientific community and policy makers to establish research priorities and approaches. Federal agencies that fund research are led and staffed by scientists, convene scientific advisory bodies (e.g., the Department of Energy's High Energy Physics Advisory Panel)
From page 200...
... The Growing Role of Private Funders Individual philanthropists and private foundations are beginning to play a larger role in establishing research priorities, and the continued debate regarding how much funders should influence the directions of science extends to these private entities. A policy analyst at the American Association for the Advancement of Science recently commented: "For better or for worse, the practice of science in the 21st century is becoming shaped less by national priorities or by peer review groups and more by the particular preferences of individuals with huge amounts of money" (Broad, 2014)
From page 201...
... The growth of private funding raises questions for federal policy makers and research funding agencies. One concern is that scientists funded by philanthropists with particular research agendas, who also sit on funding agency advisory panels and peer review panels, may have a significant influence over the priorities set by federal funding agencies.
From page 202...
... Agencies often use public announcements, referred to as Funding Opportunity Announcements (FOAs)
From page 203...
... EVALUATION OF TEAM SCIENCE
Funders evaluate science teams and larger groups throughout the evolution of a research endeavor, beginning with the proposal review ...
Footnote 3: Professional leadership development to increase agency employees' understanding of team science, as recommended in Chapter 6, could help improve the clarity of communication in research solicitations involving team science.
From page 204...
... NSF, solicitation NSF 13-500, Cyber-Enabled Sustainability Science and Engineering (CyberSEES): standard grant; small to medium ($12M total for ~12–20 awards); team must include at least two investigators from distinct disciplines; "team composition must be synergistic and interdisciplinary"; "focus on interdisciplinary, ..."
From page 205...
... collaborating investigators (PIs/Co-PIs) working in different disciplines is required."
NSF, solicitation NSF 12-011, CREATIV (Creative Research Award for Transformative Interdisciplinary ...): new grant mechanism for special projects in any NSF-supported topic area that are interdisciplinary, high-risk, novel, and potentially transformative; medium (up to $1M total, up to 5 years); "must integrate across multiple disciplines"; "the proposal must identify and justify how the project is interdisciplinary"
From page 206...
... NOTES: DoE = U.S. Department of Energy; FOA = Funding Opportunity Announcement; NASA = National Aeronautics and Space Administration; NSF = National Science Foundation; RFA = Request for Applications.
From page 207...
... The authors found that members of peer review panels systematically gave lower scores to research proposals closer to their own areas of expertise and to highly novel research proposals. They suggested that, if funders wish to support novel research, they should prime reviewers with information about the need for and value of novel research approaches in advance of the review meeting.
From page 208...
... issued a revised review policy based on "the increasingly multi-disciplinary and collaborative nature of biomedical and behavioral research." As discussed in the previous chapter, larger and more complex projects are also at greater risk for collaborative challenges after funding, yet there are typically no sections of the grant application devoted to describing management or collaboration plans. Review criteria are typically focused on the technical and scientific merit of the application, and not the potential of the team to collaborate effectively.
From page 209...
... This report has highlighted evidence related to factors at many levels that influence the effectiveness of team science. The primary goal of collaboration plans is to engage teams and groups in formally considering the various relevant factors that may influence their effectiveness and deliberately and explicitly planning actions that can help maximize their effectiveness and research productivity.
From page 210...
... However, a recent review of more than 60 evaluations of NIH center and network projects from the past three decades found that while a majority of evaluation studies included some type of evaluation of the research process, this important dimension often was represented with either a single variable or a limited set of variables that were not linked to one another or to program outcomes in any conceptually meaningful way (The Madrillon Group, 2010)
From page 211...
... Evaluators can also use a range of methods to judge how successful particular team science projects have been, such as citation analysis and the use of quasi-experimental comparison samples and research designs (Hall et al., 2012b)
From page 212...
... In addition, little research to date has used experimental designs, comparing team science approaches or interventions with control groups to identify impacts. The recent development of "altmetrics" provides helpful data that may be used to improve evaluation of team science projects (Priem, 2013; Sample, 2013)
From page 213...
... At the same time, the peer review process used to evaluate proposals typically focuses on technical and scientific merit, and not the potential of the team to collaborate effectively. Including collaboration plans in proposals, along with guidance to reviewers about how to evaluate such plans, would help ensure that projects include infrastructure and processes that enhance team science effectiveness.
From page 214...
... The Funding Opportunity Announcements they use to solicit team science proposals often include vague language about the type of collaboration and the level of knowledge integration they seek in proposed research.

