
Currently Skimming:

7 Measurement Efforts
Pages 43-56

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 43...
... As a labor economist, she previously worked with confidential micro datasets, and in her view, one major challenge has been that the current data infrastructure and traditional measures (unemployment, gross domestic product) have become less salient since their initial development.
From page 44...
... Through two open-source competitions, Lane's team challenged the computer science community to develop a machine learning and natural language processing algorithm that could be used to identify datasets used in publications (in other words, to create what Lane's team calls a "dataset publication dyad")
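The excerpt above names the task but not a method. As a rough illustration only (not Lane's team's approach or the competition entries, which relied on machine learning and NLP models), the sketch below pairs a publication with datasets it mentions by matching a small, hypothetical alias list against the publication's text, producing the kind of dataset-publication dyads described. All names, aliases, and identifiers here are made up for illustration.

import re
from dataclasses import dataclass

@dataclass(frozen=True)
class DatasetPublicationDyad:
    dataset: str      # canonical dataset name
    publication: str  # publication identifier, e.g., a DOI

# Hypothetical alias list; a production system would draw on a curated registry
# and learned models rather than a hand-written dictionary.
DATASET_ALIASES = {
    "Agricultural Resource Management Survey": ["ARMS", "Agricultural Resource Management Survey"],
    "Current Population Survey": ["CPS", "Current Population Survey"],
}

def find_dyads(publication_id: str, full_text: str) -> list[DatasetPublicationDyad]:
    """Emit one dyad per dataset whose alias occurs as a whole word in the text.

    Case-sensitive, word-boundary matching keeps an acronym like "ARMS" from
    matching ordinary words; real systems use NLP to handle context and ambiguity.
    """
    dyads = []
    for dataset, aliases in DATASET_ALIASES.items():
        if any(re.search(rf"\b{re.escape(alias)}\b", full_text) for alias in aliases):
            dyads.append(DatasetPublicationDyad(dataset=dataset, publication=publication_id))
    return dyads

if __name__ == "__main__":
    sample_text = "We estimate farm household income using ARMS microdata."
    print(find_dyads("doi:10.0000/example", sample_text))
    # -> one dyad linking the Agricultural Resource Management Survey to the publication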
From page 45...
... Lane described the work she has done with the Agricultural Resource Management Survey (ARMS), which began with a set of metrics that includes publication topics, publications that cite ARMS, the researchers involved, institutional representation, and the ecosystem where operations occurred.
From page 46...
... Second, researchers should use rigorous evaluative reasoning to avoid logical "dead ends." Laursen explained that to move logically from a description of a measured criterion to a diagnostic value judgment of that criterion, a researcher needs an explicit standard or scale of value. Moreover, each of these components must be explicit, yet at least 83 percent of pathways surveyed by Laursen and her team that aimed to form evaluative judgments did not include these minimum elements of rigorous evaluative reasoning.
From page 47...
... Fourth, Laursen encouraged researchers to use the dataset of pathways developed by her team to identify criteria, standards, measures, methods, and research approaches that may be appropriate for assessing convergence. Laursen and colleagues have built an interactive dashboard that allows the user to filter and show dominant modes and approaches that have previously been used to assess interdisciplinarity.1 Laursen closed by arguing that the published literature on assessing interdisciplinarity generally lacks evaluation basics.
From page 48...
... McFarland has explored potential links between intellectual and social convergence, including the directionality of this relationship (whether social connections promote intellectual convergence or intellectual interests foster social convergence), as well as factors that facilitate both forms of convergence.
From page 49...
... Wang and Schneider have recently reported similar findings regarding inconsistency of universal indicators.2 Rafols concluded that using a single or even a few atomistic indicators to measure complex research activities capable of addressing societal problems is misguided. Rafols proposed shifting from an atomistic to a portfolio approach modeled on ecology.
From page 50...
... Linkages across fields can be investigated using bibliometric data to determine whether expectations of crossover are met or exceeded. Rafols' approach allows one to see not only the fact of interdisciplinarity but also the directions in which it is proceeding, which cannot be determined from unidimensional measures of interdisciplinarity.
From page 51...
... Any study of convergence involves multiple dimensions, including disciplinary interactions and topics; the problem-orientedness of the research; people and organizations involved; institutional context; tools, equipment, and methods involved; the learning and educational dimension; and infrastructure.
From page 52...
... , the MacArthur Foundation, and the Santa Fe Institute -- to understand how they define successful interdisciplinary collaboration. The interviews focused on how members of the group define markers and facilitators of success; how they experience the construction of group membership and collective work; and the research infrastructure created by funding organizations that enables success.
From page 53...
... Emotional success was enabled by feelings of group belonging, respect and admiration of peers, a climate of conviviality, and effective leadership by individuals who understood the cognitive, emotional, and social demands of successful interdisciplinary collaboration. Although many remarks emphasized the positives of interdisciplinary research, Lamont and colleagues found that participants did mention markers of unsuccessful collaboration, including the failure to frame a problem in ways that are shared by network participants and the failure to establish relatively shared expectations.
From page 54...
... Laursen noted that her team extracted 890 unique measures from the 142 publications investigated. She said that the most promising measures were combinations or aggregations of quantitative and qualitative measures because they unite a countable output (such as bibliometric data)
From page 55...
... Both Rafols and Schnell were asked about the prospects for providing interactive maps and data visualizations that would be useful to various stakeholders. Schnell stated that interactive tools could help policy makers and the public understand various policy questions better.
From page 56...
... He also emphasized the importance of ensuring that these practices are captured in bibliometric analyses. Rafols suggested that intentional and serendipitous trajectories do not always exist in dichotomous form.


This material may be derived from roughly machine-read images, and so is provided only to facilitate research.