Advancing Open Science Practices: Stakeholder Perspectives on Incentives and Disincentives
Proceedings of a Workshop—in Brief
Open science “aims to ensure the free availability and usability of scholarly publications, the data that result from scholarly research, and the methodologies, including code or algorithms that were used to generate those data.”1 The actual and potential benefits of open science include strengthened rigor and reliability, the ability to address new questions, faster and more inclusive dissemination of knowledge, broader participation in research, effective use of resources, improved performance of research tasks, and open publication for public benefit.2
Yet, achieving open science will require overcoming several significant barriers. For example, the structure of the scholarly publications market limits open access to articles. Other barriers include the cost and accessibility of open infrastructure and researcher incentives. In addition, access to some types of research data may continue to be restricted due to privacy, proprietary, or national security concerns. As one effort to increase the contributions of open science among many, the Board on Research Data and Information (BRDI) of the National Academies of Sciences, Engineering, and Medicine (the National Academies) established the Roundtable on Aligning Incentives for Open Science (see Box 1). On September 20, 2019, the Roundtable organized a public symposium in Washington, DC, to consider some of the barriers and challenges to open science, as well as ways to overcome them. Key external stakeholders—including researchers, librarians, learned societies, publishers and infrastructure developers—shared their insights on the current state of the research ecosystem, as well as their visions for how open science can function at scale.
To open the symposium, Keith Yamamoto, Roundtable co-chair, noted the goals of open science to develop the best science, democratize information, and make discoveries and outcomes accessible to all. “We are far from these ideals,” he acknowledged, given the current system of hiring, funding, and promoting individual scientists. The Roundtable convened the symposium to listen to a broad range of stakeholders who, he said, “are not just talking about open science but are really working within their own contexts to make it the norm.” It is expected that these inputs will help the Roundtable further define immediate and longer term priorities.
1National Academies of Sciences, Engineering, and Medicine. 2018. Open Science by Design: Realizing a Vision for 21st Century Research. Washington, DC: The National Academies Press. https://doi.org/10.17226/25116.
The first panel consisted of four researchers who explored the implications of proposed changes related to open science in credit/reward systems for individual researchers, as well as for departments, institutions, and disciplines.
Making Incentives for Openness Explicit
Juan Pablo Alperin, Simon Fraser University and associate director of research for the Public Knowledge Project, described himself as simultaneously a subject, analyst, and builder of open science. He is part of a team that is quantitatively analyzing review, promotion, and tenure (RPT) guidelines across U.S. and Canadian colleges and universities. By text mining for concepts, terms, and words, they have looked at how documents from 129 institutions (baccalaureate, master’s level, and research-intensive) address open access and open science in the RPT process. The guidelines range across the life sciences; physical sciences and math; social sciences and the humanities; and multidisciplinary units, and, in many cases, different departments within the same institution.
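The kind of term-frequency analysis Alperin described can be sketched in a few lines. The concept lists and sample documents below are hypothetical illustrations, not the study's actual coding scheme, which was considerably more elaborate.

```python
# Illustrative sketch of counting how many RPT guideline documents
# mention terms tied to a given concept. The term lists and example
# documents are invented for illustration only.

TERMS = {
    "citation_metrics": ["impact factor", "citation", "h-index"],
    "open_access": ["open access", "open-access", "open data"],
}

def count_mentions(documents):
    """Return, per concept, how many documents mention any of its terms."""
    counts = {concept: 0 for concept in TERMS}
    for doc in documents:
        text = doc.lower()
        for concept, phrases in TERMS.items():
            if any(p in text for p in phrases):
                counts[concept] += 1
    return counts

docs = [
    "Candidates are evaluated on citation counts and journal Impact Factor.",
    "Publishing in predatory open access journals is discouraged.",
    "Scholarship is assessed on its merits regardless of venue.",
]
print(count_mentions(docs))  # {'citation_metrics': 1, 'open_access': 1}
```

Note that a simple substring count like this cannot distinguish a positive mention of open access from a caution against predatory journals, which is why the study also had to classify the context of each mention.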
As an example of how different guidelines address the issue, citation metrics are an RPT criterion in 75 percent of guidelines of research-intensive universities versus 5 percent that mention open access. Even among those that explicitly mention open access, Alperin noted, one-third are in the form of a strong caution against publishing in what are known as “predatory open access journals.” In only very rare cases do the guidelines positively acknowledge the role of open-access, peer-reviewed publications.
The team is writing up its analysis on the kinds of outputs that RPT guidelines cover across fields. Some continue to stress publications, funding, and other conventional outputs such as posters or book reviews. Others, according to Alperin, take a less traditional view, crediting the organization of and participation in conferences, data creation and management, and sharing of preprints prior to publication. Disparities can occur within the same institution: for example, one academic department judged output in a hierarchical order (e.g., refereed books carrying the most weight), while another department in the same institution states, “All research, scholarly and other creative activities shall be assessed on the merits of the work, regardless of the form in which they appeared.”
Combining the analysis with his own experience, Alperin shared his perspective on how to change perceptions and make open access easier to implement. First, he said, the complexity shows that simply adding “open” to the long list of things faculty need to do is not enough to change the culture. Although he acknowledged that surveys do not show a strong relationship between the guidelines and faculty perceptions, the guidelines are still a place to signal what matters. He suggested encouraging conversations that highlight the value of openness. He noted that open science requires extra time and work, and people are already under pressure to perform. Funding positions to provide support for open science, as is currently done for administrative support, could help make open access the norm. He expressed the hope that journals where people already want to publish will flip to open access, but urged that the change not happen in ways that hamper faculty and institutions with fewer resources.
Open Access in the Neurosciences
Jean-Baptiste Poline, associate professor in the Department of Neurology and Neurosurgery at McGill University, described an open science experiment called the Tanenbaum Open Science Institute (TOSI), aimed at changing the culture of research at the Montreal Neurological Institute-Hospital. TOSI encompasses open early drug discovery, an open biorepository, open data, open access, and intellectual property management compatible with open science. The project has met with successes and challenges.
Among TOSI’s successes, he said, are a strong institutional commitment, including an official “blessing” from McGill leadership, philanthropic funding as a kick-starter and key driver, and robust strategic partnerships with other initiatives.
The primary challenges relate to culture. Poline noted that open science is a bit of a buzzword: there is great hype within academia, but the practice is still hampered by perceived risks, such as concerns about being scooped. Cultural change is a long process and must counter ingrained biases against openness. Concerns expressed relate to ethics, public support, the reaction of clinicians, the need for patents before creating a spin-off company, and other perceived obstacles. However, he said these concerns are not borne out by reality. He provided several examples of ways forward. First, the Aperture Project (www.apertureproject.org) rests on five principles: (1) it is an open-source platform; (2) governance is by a scientific society, rather than a commercial enterprise; (3) it is nonprofit and low-cost; (4) the review process is transparent; and (5) all research products are published, including datasets, protocols, and the like. He expressed the hope that Aperture could change the culture of research if scientific societies support it (see Figure 1).
A second example is NeuroLibre. This platform, which checks that open computational notebooks are running, operates at the technical level but also has publishing implications. Other promising initiatives include the Canadian Open Neuroscience Platform and NeuroHub.
Interoperability is important to open science, Poline stressed. The International Neuroinformatics Coordinating Facility (INCF) is developing standards and best practices in neuroscience. Sociological tools like those created by INCF can support open platforms and resolve interoperability barriers. Incentives for openness, as Carole Goble noted at a 2019 INCF conference, include love, money, fame, and rules (e.g., by a funder). To Poline, a number of “knobs” can be turned to satisfy these incentives, such as new publishing platforms, support from funding agencies and international organizations, infrastructure and tools, and training and community building.
Open Access in Policy Settings
David Yokum, director of the Policy Lab at Brown University, explained how the lab works with government agencies on data projects and field experiments. In this position and in his previous work, he said he brings the perspective of a researcher working with policy makers to do research that can change decisions about budget, law, staffing, and other issues. Examples include how data collected from police body-worn cameras impact community-police interactions, redesign of sanitation services, and mapping patterns of opioid use.
He offered three insights from the work. First, capacity in government for this type of work is low, especially at the state and local levels, extending even to such basic functions as accessing articles. Local governments do not have scientists on staff to work on technical issues, and mayors and administrative directors usually articulate broad goals, such as curbing homelessness or increasing economic mobility, rather than applied research questions.
Second, the kinds of projects done at the Policy Lab represent a commitment to a type of “radical openness.” They are published and open, including pre-analysis plans and results, even if there is a null result or negative finding. He noted the value of seeing why some projects fail.
Third, he noted the powerful use of pre-analysis plans, which he described as advance write-ups of exactly what will happen, including how to do an experiment in a political setting. Knowing the plans will be public means people pay attention to them. “The preregistration moment can actually facilitate a more authentic conversation about what we’re trying to do with political decision makers,” he commented. Scientists should be running the experiments but not making value judgments, such as how large an effect size is meaningful enough to invest in something, he stressed.
Although long-term support for more scientists in local government should come from the taxpayers, Yokum said, the philanthropic community can play a pivotal role by providing “a spark of funding and a year or two to make the case.” There is a need for scientists who want to work in government, as well as ways to make better use of those already in government to take on this type of work.
He urged anchoring research in policy settings around issues, rather than more abstract concepts, which can also lead to greater access to data. Rather than request administrative data in the abstract, he suggested talking with those who hold the data about the kinds of questions the data can help answer. From there, he said, agencies are more likely to provide access and even develop codebooks for future researchers.
Shared resources, such as attorneys embedded in government offices to help write Data Use Agreements and memoranda of understanding, would be useful, he said. A replication service to ensure the rigor of data would also be useful, such as a team of data scientists or funding to take a dataset and re-do the analysis.
Advancing Open Science: A Perspective from a Disciplinary Standpoint
Meredith Niles, assistant professor in the Food Systems program and the Department of Nutrition and Food Sciences at University of Vermont, said she developed an interest in open access through her research on food system sustainability. Farmers could not read what she was publishing on climate change and, thus, were unable to act on the research. She urged seeking support from top-level leadership beyond junior and other pre-tenured faculty who are most interested in the RPT process.
Niles elaborated on the open access analysis described by Alperin (see above). Her team surveyed faculty at the institutions where guidelines were examined. They were asked which outputs they think matter most in RPT decisions. Overall (across all faculty levels), the top four were total number of publications, number of publications per year, name recognition of the journals, and impact factor. Preprints, open-access journals, and blog posts and other public outputs were perceived as the least valuable. Older faculty were more likely to value blogs, book chapters, performances and open access journals. Tenured faculty were more likely to value books and book chapters.
She suggested senior faculty could set an example by publishing in open access journals. RPT guidelines need revisions: as mentioned by Alperin, a large percentage stress impact factors and the few that mention open access are often negative. Open access could be made easier. For example, uploading articles into repositories is time-consuming, and fees further stymie efforts. Faculty may also lack knowledge of how to curate data or comply with federal mandates, she observed.
Niles reiterated fellow panelists’ points about culture change. People claim they want to publish to reach readers, whether leaders in the field or their peers. Yet they assume that others place factors like prestige and impact factor above readership. Niles urged an honest dialogue to help faculty recognize their shared interest in reaching readers, rather than focusing on the impact factors of the journals where they publish.
Another aspect of changing culture relates to the cost of publishing. She serves on the Library Advisory Committee and noted most faculty do not consider the costs of subscriptions when they do not have to pay them. Faculty may balk at approaches requiring author payments, but there is a cost of publishing, if not on the front end, in the form of ever-increasing subscription costs. Educating faculty about the real cost of publishing is needed.
Her own institution emphasizes impact factors in its RPT guidelines. What, she asked, if the guidelines instead stressed how candidates have made their work open? A coalition is needed to promulgate this, not just one institution.
Most participants agreed on the importance of changing the minds of faculty about publishing in quality open-access journals. In some cases, citations are a good measure of readership, but other criteria, such as the effect on policy, are not captured in impact factors. Niles said the broader question is how to judge the quality of scientific publications. There may not be a single metric, and RPT guidelines could reflect a suite of factors. Poline noted that every measure can be gamed by those intent on doing so, but it is harder to “game the system” with more measures. Panelists urged societies and scientific communities to take the lead.
Alperin and Niles pointed out that a change in RPT guidelines in and of itself will not change perceptions and practices, but guidelines do signal values. Yamamoto suggested not just removing items (such as impact factors) but also adding what is desired, such as the quality of the work (rather than where it was published), team-based research, collaborations, data release, open access, and other items that reflect institutional values. While external reviewers do not have depth in all the fields in which they are judging cases, a participant added that guidelines need to reflect the culture and encompass varied signals of excellence.
OPEN REPORTING AND COMMUNICATION OF SCIENCE
Libraries, societies, and publishers are three important components of the evolving environment around open science practice, as touched upon in the first panel. The second panel carried this exploration further, in consideration of the key accelerators and roadblocks that could advance or impede progress toward open science.
Transformative Agreements as an Open Access Accelerator
Ivy Anderson, associate executive director and director of collections at the University of California (UC) System, talked about transformative agreements as an open access accelerator. Transformative agreements, according to ESAC (Efficiencies and Standards for Article Charges), “are those contracts negotiated between institutions (libraries, national and regional consortia) and publishers that transform the business model underlying scholarly journal publishing, moving from one based on toll access (subscriptions) to one in which publishers are remunerated a fair price for their open access publishing services.”3 Anderson clarified that a transformative agreement is not just a changed relationship with publishers, but also with authors. She said transformative agreements are just one element of open access, but can be a breakthrough. UC’s first agreement was signed with Cambridge University Press, and others are being discussed.
Returning to the question of why progress toward open access has been slow, despite the launch of the movement about two decades ago, she identified misaligned incentives relating to authors, libraries, funders, and publishers.
• Authors: The Pay It Forward Study ranked eight factors authors consider when deciding on where to publish, with quality and reputation at the top and open access at the bottom. She said, “There’s a trust factor with the existing canon that people don’t necessarily have yet with open access.” Today, 81 percent of UC articles are published by 23 publishers; further, 50 percent of output is published by just five. The key takeaway, she said, is open access will not be achieved at scale by ignoring the gravitational pull of the existing literature.
• Libraries: Different parts of the library have mixed incentives. Collections librarians want to purchase materials to make their users happy and provide as much content as possible. Scholarly communication librarians want to break down the old order and fill their repositories. Library managers, meanwhile, worry about costs. These different communities do not always communicate.
• Funders: Different policies related to open access exist in North America and Europe. In North America, policies include those from the government and from universities. In Europe, they include the Finch Report, Horizon 2020, OA2020, APC Offset Agreements, and Plan S. Researchers must respond to different incentives and policies.
• Publishers: Publishers maintain subscriptions but are also gaining revenue from article processing charges (APCs). Analysis showed that the UC libraries are spending $40 million in subscriptions, and UC authors are spending an additional $10 million in APCs. She characterized this as unsustainable.
Transformative agreements are a way to realign incentives, Anderson said. They bring open access to researchers, put the financial agency of libraries in the service of open access, allow more holistic financial management by bringing subscriptions and OA expenditures together in a single agreement, and place library funding in the service of authors, not publishers.
The model involves a library subvention that is a baseline for every article. Authors contribute if their funding allows it; if they cannot, the library pays. Total fees are capped to manage risk. It is sustainable, she said, because authors choose a preferred platform, publishers will compete in the face of elastic author demand, and competition will lower the cost of scholarly communication.
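The multi-payer arrangement Anderson outlined, a library subvention paid for every article, an author top-up when grant funds allow, and a cap on the library's total exposure, can be illustrated with some arithmetic. All figures and parameter names below are invented for illustration; real agreements negotiate these terms per publisher.

```python
# Hypothetical sketch of the multi-payer transformative-agreement model:
# the library pays a baseline subvention for every article, funded
# authors add a top-up, the library covers the top-up for unfunded
# authors, and the library's total annual exposure is capped.
# All numbers are invented for illustration.

def library_cost(articles, subvention, author_topup, cap):
    """articles: list of booleans, True if the author has grant funds.

    Returns (library_total, author_total), where library_total is
    capped at the negotiated maximum.
    """
    library_total = 0.0
    author_total = 0.0
    for funded in articles:
        library_total += subvention
        if funded:
            author_total += author_topup
        else:
            library_total += author_topup  # library covers unfunded authors
    return min(library_total, cap), author_total

# 3 funded and 2 unfunded articles, $1,000 baseline, $1,500 top-up:
lib, auth = library_cost([True, True, True, False, False],
                         subvention=1000, author_topup=1500, cap=20000)
print(lib, auth)  # library pays 8000.0, funded authors pay 4500.0
```

The cap is what "manages risk" in Anderson's description: however many unfunded authors publish, the library's exposure cannot exceed the negotiated ceiling.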
Transformative agreements can be agents of future change, but face policy, financial, and market-based challenges. Making them work requires changes from all parties. Libraries and publishers must integrate licensing and open access approaches and support a cost-neutral transition, Anderson observed. Funders will need to create policies that support funding for research publication and encourage authors to allocate funds. Systems are needed to support shared funding. And authors will have to accommodate new workflows adapted to the models.
In closing, Anderson urged that stakeholders build coalitions aimed at making open access the default option. She also urged investing in innovation to allow newer dissemination models to emerge.
Open Science in Scientific Societies
Brooks Hanson, executive vice president for science for American Geophysical Union (AGU), spoke on the role of scientific societies in open science, drawing on his experience at AGU, the Smithsonian Institution, AAAS, and other organizations. Global sustainability challenges, Hanson began, mean that the 21st century and beyond will be a time of convergent science for the benefit of humanity. Silos must be broken to make progress.
AGU’s evolving open science encompasses not only publications, but also meetings, data, and more. Scientific societies can promote openness through research assessment; awards; ethics and equity; diversity and inclusion; and addressing implicit bias. These are all symbiotic at AGU. As an example, there are about 100,000 presentations in earth and space sciences each year made at AGU meetings and those of other organizations. About half of these are posters that are not shared outside of the meeting. Access is not open to non-attendees and even attendees have limited access. AGU has developed a preprint service that archives posters. It has also developed virtual programs and expanded remote participation so a broader audience can access posters and presentations.
AGU is working with its publishing partner, Wiley, to expand access to publications. Hybrid OA is growing with Projekt DEAL in Germany. Six new journals launched since 2010 are open access, as is additional content in the form of news stories, social media, journal commentaries, and other AGU communications. AGU has piloted a liberal green open access policy and public library access in California and the United Kingdom, although they have seen limited uptake.
According to Hanson, APCs can incentivize predatory and volume publishing, and they inhibit participation from unfunded scientists. Societies have a role in addressing this, particularly in issues related to integrity. Key is dialogue among societies, funders, libraries, institutions, and publishers around support for high-integrity publishing.
AGU considers its ethics, equity, diversity, and inclusivity efforts a part of its open access work. Opening the field has enriched publications, changed peer review practices, and improved dialogue across the community. It has also made meetings safer and more inclusive. Hanson urged looking at all the pieces together beyond publications. He advocated for a common, collective voice and urgency to support infrastructure and culture change internationally. In his view, a strong statement about the importance of open access issues by the National Academies, societies, and universities would make a collective impact.
PLOS Perspectives on Aligning Incentives for Open Science
Veronique Kiermer, publisher and executive editor at the Public Library of Science (PLOS), observed that some of the discussion identifies publishers as part of the problem, not the solution. But, she countered, publishers are a potentially huge part of the solution. To answer why there is not more rapid progress toward open science, access is only part of the issue; the other is assessment (see Figure 2). Publishers are trying to deal with the access issue, but researchers are currently assessed and evaluated on the basis of where they publish their work. “As long as that remains the dominant factor, we maintain the status quo,” she said. The research community is shackled by this dichotomy.
Kiermer reported a growing number of efforts to develop new article-level metrics (ALMs) by PLOS and others. But hiring, tenure and promotion committees still give the most weight to journal impact factors. “As long as the assessment doesn’t change, even if you have a successful product market, we’re not moving [away from impact factors],” she commented.
PLOS promotes open science in various ways. Its policies state that data must be accessible at the time of publication. It facilitates partnerships with data repositories and others to put tools in front of researchers to make their science more open. PLOS journals are exploring more inclusive criteria for submissions and are experimenting with registered reports so that pre-analysis plans can be evaluated and peer reviewed. Finally, they have created transparent processes to make preprints available and publish peer review history.
PLOS developed a policy in 2014 in which authors need to make data underlying their findings fully available. Yet, it has been observed that this policy makes PLOS less competitive in the minds of some authors who seek a more lenient data availability policy. Kiermer shared a vision of an open data workflow that would involve funders, institutions, and researchers, and not just the journals. Funders would set the expectation about sharing data at the time of grant application and evaluate proposals with that in mind. Researchers and institutions would know they had to share data from the outset and have the tools, resources, and training to do so. If the data were collected, stored, and curated, it would be easy to include the data in a journal submission. Currently, publishers are usually involved late in the process. Even if they have measures to promote open science, the research has already been done.
Another advantage to this workflow is that the information is sent to the ORCID (Open Researcher and Contributor ID) record of a researcher. Researchers could use their ORCID record to fill in their grant reporting forms, which lessens the burden on them. This process could become a virtuous cycle and contribute to broader alignment with university mandates and to breaking down silos. Funders and research institutions have the power to shift incentives earlier in the process, Kiermer noted. Research institutions could change the criteria used by hiring, tenure, and promotion committees, and provide support for open science practices. Funders could align requirements and make it easier to practice and report on open science.
While some publishers are proactive in promoting open science practices, Kiermer concluded, the competitive landscape for publishing is not yet conducive to open science.
The Role of Funders in Promoting Open Science
Kiermer’s call for funders to become more involved led to the final presentation of this session. Bodo Stern, chief of strategic initiatives for the Howard Hughes Medical Institute (HHMI), discussed two relevant initiatives: promoting academic incentives and transforming scientific publishing. Although HHMI spends most of its funds supporting its own scientists, funders in general have two levers as “keepers of the purse.” They can design best practices or policies, such as cOAlition S. They can also fund solutions that promote open science, such as the open access journal eLife, supported by HHMI and the Wellcome Trust.
Why isn’t open science a no-brainer, Stern asked. It is not, he explained, because of the perceived cost to individual scientists, who worry that peer recognition and career advancement will suffer if they practice open science. Yet times change. Before 1665, when the Royal Society created the first scientific journal, scientists kept their discoveries secret for fear their competitors would take them. The paradigm shifted when publication with authorship helped to support priority claims; the disadvantage of disclosure turned into an advantage. He called for a similar shift to turn open science into an advantage for individual scientists.
The good news is that open science articles are more used and cited, Stern said. However, “name-brand journals” shape careers, even if careers should not rely on journal metrics.
Also, on the publishing side, contributions must be discoverable and usable. Scientists like to collaborate, but an article with 20 or more authors makes it difficult to know who did what. A credit taxonomy begins to address that. Another example is peer review, which is hidden from view today. To be useful, peer review reports should also be discoverable and usable. This is a mindset shift in how research outputs are shared.
The focus on content as the major product of publishers is outdated, he said. Rather, their primary value has shifted to the credentialing of scientific work, which includes peer review, evaluation and curation, and selection, even if they still operate as if they were primarily content providers. A journal that sees itself as a credentialing service would want its evaluation services to be discoverable. It would publish peer review reports regardless of the decision, to showcase the quality and rigor of its review process.
Stern suggested uncoupling evaluation and publication. By separating dissemination and evaluation, authors would be in charge of dissemination of their own work, including preprints, revised versions, and final articles. Separately, an evaluation layer would evaluate these steps. Instead of a simple accept/reject, evaluation at the different stages would be available. Peer review could be distilled into a short editorial statement attached to a paper, and “badges” could be created for curation service. Although futuristic, aspects of this approach are being implemented by the Wellcome Trust and the Bill & Melinda Gates Foundation in partnership with the F1000Research open-access platform in an effort to accelerate the publication of articles and data sets. The “sweet spot” would be to show that this platform mode of publishing can operate at the high end. He expressed hope that scientific societies will develop methods of peer review distillation and curation that would be suitable for a platform model.
He suggested the following for the Roundtable. First, fix the root causes, not the symptoms. One root cause, as discussed, is academic incentives; another is a print mindset that is not fit for the digital world. He called for a shared vision to separate dissemination and evaluation. Finally, he urged that open priorities be adopted at scale: funders, institutions (including libraries), societies, and nonprofit publishers can come together to support the desired shared infrastructure and processes.
Discussion centered around the need for trust. One participant noted the difficulty in obtaining cost data for publishing. Kiermer and Hanson agreed, but said it is hard to disentangle and account for costs such as the true cost of peer review, overhead, and other expenses.
Another participant asked about accountability of policies already on the books. Implementing accountability is difficult, especially with limited resources. Looking at the results can help measure whether policies are being acted upon, several people suggested. Positive discrimination can also take place, in which researchers are rewarded for making their preprints, data, code and other outputs available.
The final panel looked at the tools, technology and services necessary to save, catalog, and safeguard research outputs. Accommodating open outputs requires interoperability, discoverability, and metadata standardization. How should stakeholders ensure that policies are translated into effective and efficient practices?
How Open Infrastructures Advance Discovery
Sergio Baranzini, distinguished professor of neurology at the University of California, San Francisco (UCSF), noted in his introduction that he works across a number of programs, each with a different culture. Great value, but also challenges, come with belonging to different consortia, in his case in multiple sclerosis (MS) research. The genetic landscape of MS has expanded exponentially as former competitors have begun to collaborate. A group of 40 labs around the world has created the International MS Genetics Consortium. Two hundred genomic associations that confer susceptibility to MS have been discovered, he said, a discovery that could only have been achieved through collaboration and sharing of data.
As another example of how infrastructure built on open science can lead to discovery, Baranzini described a project he is involved with called SPOKE (scalable precision medicine open knowledge network). It integrates disparate and heterogeneous types of data with a common key to develop an artificial intelligence (AI) system to track and predict disease outcomes, akin to a “Google Maps” for health. It relies solely on data, based on the premise that data lead to information, and information to knowledge.
To give a sense of the extent of data generated worldwide, 90 percent of the world’s data was created in the last 2 years, according to one source. And while all fields must contend with vast amounts of data, biomedicine faces a particular challenge. SPOKE aims to create more integration across different biomedical disciplines. It combines databases related to genes, proteins, genetic predispositions, symptoms, treatments, side effects, and much more. One application of many relates to precision medicine, which can integrate knowledge at the population level with knowledge about an individual.
SPOKE recently received a Convergence Accelerator grant from the National Science Foundation. The grant is designed to support solving a problem using an open knowledge network platform.
Values and Infrastructure
As the final formal presenter of the symposium, Kristen Ratan, founder of Strategies for Open Science (Stratos), described the “good, bad, and dangerous” related to infrastructure. Infrastructure is taken for granted when it works, she said, with technology representing the largest new wave of infrastructure in the last 40 years. She expressed support for community-driven open infrastructure, which means that communities have a say in their own infrastructure and are involved in designing, building, testing, using, and correcting it. Open infrastructure offers such benefits as opportunities for collaboration and reuse, she said, and its transparency makes it safer. In contrast, closed systems have customers, not communities. They embody the values, ideas, and ethics of their creators.
A report by SPARC (the Scholarly Publishing and Academic Resources Coalition) analyzes the current infrastructure landscape and how it is changing. Academic publishing is undergoing a major transition from a content-provision to a data analytics business, with new kinds of outputs. A shift is occurring in which many of the tools that support workflows and processes are being consolidated into a few large corporations. She expressed concern about the data being held by relatively few companies. For those thinking of building, commissioning, or using a new system, she asked whether they have thought through the implications of who owns the data, where the data reside, and who the affected people are. Corporate interests may be more focused on collecting data than on making research work more effectively, and they may sell the data to third parties. She noted that the report contains recommendations for establishing detailed data policies, creating mechanisms for ensuring compliance, clarifying what data ownership looks like, and encoding contracts with values from the start.
It is important to understand the rights to the data and who controls them, Ratan noted. As an example of this issue, she pointed to Google Docs, which she uses even though the terms of service give Google a worldwide license to use an individual’s content. It is a tradeoff for convenience. Amazon, Facebook, and other companies have similar terms of service or have used algorithms that go against their stated values but maximize revenue.
There is a growing realization that technology is not a neutral construct. The way it was built, the licensing of the code, how it is rolled out, the business model, and other aspects can combine to serve different purposes, positive or not. The impact of big data is bringing some of these issues to light. Efforts are underway to introduce more fairness into how infrastructures are created and tested and to incorporate fairness from the beginning. “There is a direct line between infrastructures and values,” Ratan emphasized.
As a takeaway, she asserted that open source and community-driven infrastructure reduces risks and engages the community. She suggested insisting on a values-based software development process in building or commissioning software. She also noted the importance of diversity and collaboration to improve the final product.
In response to a question from the audience about ethical concerns in combining and releasing data, Baranzini clarified that SPOKE is open but does not contain individual patient data; it brings together knowledge at the population level. In the context of the Roundtable topic on realigning incentives, another person asked how credit is given in these large collaborations. Baranzini commented that these kinds of projects tend to attract people who are comfortable with the sharing aspect. In the end, if the work is valuable and appropriate credit is given to all who contributed, more can be accomplished than any individual could achieve alone. This creates resources for the collaborators and others to use.
A participant asked Ratan about the aggregation of data from different projects, noting that commercial entities profit by taking data from disparate sources and commingling them. Ratan responded that the academic community could benefit, too: this is regularly done with science and could also happen with research and scholarly infrastructures.
Roundtable co-chair Yamamoto concluded the session by noting that success stories are needed to demonstrate the ways that cooperation and open sharing will lead to victories. For example, SPOKE, as described by Baranzini, is in its early stages but has the potential to transform the field of precision medicine. Open integration of data is the wave of the future.
DISCLAIMER: The Proceedings of a Workshop—in Brief was prepared by Paula Whitacre as a factual proceedings of what occurred at the meeting. The statements made are those of the author or individual meeting participants and do not necessarily represent the views of all meeting participants, the planning committee, or the National Academies of Sciences, Engineering, and Medicine.
REVIEWERS: To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Amy Brand, MIT Press, and Sarah Nusser, Iowa State University. Marilyn Baker, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.
PLANNING COMMITTEE: Keith Yamamoto (NAS/NAM), University of California, San Francisco (Chair); Heather Joseph, SPARC; and Thomas Kalil, Schmidt Futures. Staff: Thomas Arrison, program director, Policy and Global Affairs; Greg Tananbaum, consultant; George Strawn, director, Board on Research Data and Information (BRDI); Ester Sztein, deputy director, BRDI; Emi Kameyama, associate program officer, BRDI; and Reginald Hayes, senior program assistant, BRDI.
ROUNDTABLE MEMBERS: Thomas Kalil (Co-Chair), Chief Innovation Officer, Schmidt Futures; Keith Yamamoto (Co-Chair) (NAS/NAM), Vice Chancellor for Science Policy and Strategy, University of California, San Francisco; Elizabeth Albro, Commissioner, National Center for Education Research, U.S. Department of Education*; Danny Anderson, President, Trinity University; Roslyn Artis, President, Benedict College; Chris Bourg, Director of Libraries, Massachusetts Institute of Technology; Courtney Brown, Vice President for Strategic Impact, Lumina Foundation*; Stuart Buck, Vice President of Research, Arnold Ventures*; Jean-Claude Burgelman, Head of Unit C2, Directorate-General for Research and Innovation, European Commission*; Mary Sue Coleman (NAM), President, Association of American Universities*; Anne-Marie Coriat, Head of UK and Europe Research Landscape, Wellcome Trust*; Michael Crow, President, Arizona State University; Mark Cullen (NAM), Director, Center for Population Health Sciences, Stanford University; Ronald Daniels, President, Johns Hopkins University; Tashni-Ann Dubroy, Executive Vice-President and Chief Operations Officer, Howard University; Susan Fitzpatrick, President, James S. McDonnell Foundation*; Maryrose Franko, Executive Director, Health Research Alliance*; Nicholas Gibson, Senior Program Officer, Human Sciences, John Templeton Foundation*; Daniel Goroff, Vice President and Program Director, Alfred P. Sloan Foundation*; Randolph Hall, Vice President for Research and Professor, University of Southern California; Robert Hanisch, Director, Office of Data and Informatics, National Institute of Standards and Technology*; Patricia Hswe, Program Officer for Scholarly Communications, Andrew W. 
Mellon Foundation*; Adam Jones, Program Officer, Science Program, Gordon and Betty Moore Foundation*; Renu Khator, President, University of Houston; Boyana Konforti, Director, Scientific Strategy and Development, Howard Hughes Medical Institute*; Richard McCullough, Vice Provost for Research, Harvard University; Peter McPherson, President, Association of Public and Land-grant Universities*; Ross Mounce, Director of Open Access Programs, Arcadia*; Lisa Nichols, Assistant Director for Academic Engagement, Office of Science and Technology Policy*; Loretta Parham, Chief Executive Officer and Library Director, Robert W. Woodruff Library, Atlanta University Center; Heather Pierce, Senior Director, Science Policy and Regulatory Counsel, Association of American Medical Colleges*; Dawid Potgieter, Senior Program Officer, Templeton World Charity Foundation*; Brian Quinn, Assistant Vice President, Research-Evaluation-Learning, Robert Wood Johnson Foundation*; Robert Robbins, President, University of Arizona; Jerry Sheehan, Deputy Director, National Library of Medicine, National Institutes of Health*; Shirley Tilghman (NAS/NAM), President Emerita, Princeton University; Alan Tomkins, Deputy Director, Social, Behavioral & Economic Sciences, National Science Foundation*; Roger Wakimoto, Vice Chancellor for Research, University of California at Los Angeles; Thomas Wang, Chair, Open Science Committee, American Heart Association and Chair, Department of Internal Medicine, University of Texas Southwestern Medical Center; Jennifer Weisman, Chief of Staff, Global Health Division, Bill & Melinda Gates Foundation*; Richard Wilder, General Counsel and Director of Business Development, Coalition for Epidemic Preparedness Innovations*; Duncan Wingham, Executive Chair, Natural Environment Research Council, United Kingdom Research and Innovation*; Garabet Yeretssian, Program Director, Crohn’s Disease Program, Leona M. and Harry B. Helmsley Charitable Trust*
* denotes ex-officio member
SPONSORS: This workshop was supported by Arcadia, Arnold Ventures, Eric & Wendy Schmidt Fund for Strategic Innovation, Leona M. and Harry B. Helmsley Charitable Trust, National Library of Medicine, Open Research Funders Group, Open Society Foundations, Robert Wood Johnson Foundation, and the Wellcome Trust.
For additional information, visit http://www.nas.edu/brdi.
Suggested citation: National Academies of Sciences, Engineering, and Medicine. 2020. Advancing Open Science Practices: Stakeholder Perspectives on Incentives and Disincentives: Proceedings of a Workshop—in Brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/25725.
Policy and Global Affairs
Copyright 2020 by the National Academy of Sciences. All rights reserved.