For the final session of the workshop, participants broke into small groups to consider the roles and responsibilities of researchers, publishers, institutions, and funders in improving transparent reporting of biomedical research.
In preparation for the small group activity, Harvey Fineberg shared his impressions from the workshop presentations and discussions thus far. There is simultaneously readiness and reluctance to act on improving transparency, he observed. He noted that ideas and attitudes about transparency seem to be converging in the right direction, and resources are being mobilized. However, he said, there remain “cultural barriers to introducing, implementing, and fulfilling the aspirations of open science through transparency.” Tools are available to researchers, publishers, and funders to promote rigorous research and transparent reporting. The task now is to determine what may be missing, what needs to be improved, and how stakeholder needs can be addressed, Fineberg said. During the small group discussions, he asked participants to identify what they consider the most important opportunities for moving forward.
Over the course of the workshop, perspectives were shared by researchers—from early career scientists to senior investigators—and research support staff, as well as by representatives of publishers, institutions, and funders. Fineberg asked participants to consider how stakeholder interests could be harmonized, and operations aligned, to realize the shared objective of increased transparency and rigor in research. He also asked participants to consider what leadership role each stakeholder plays in driving transparent reporting in biomedical research.
Upon reconvening in plenary session, a facilitator from each small group summarized the individual participants’ discussion (a summary of points made by small group participants is provided in Box 7-1). Group discussions were guided by the following prompts:
- What actions should funders, researchers, institutions, and journals take to drive widespread adoption of minimal reporting standards?
- Are reporting categories in guidelines for publishing (e.g., materials, design, analysis, and reporting) relevant for funders? For institutions? For small publishers/professional societies?
- What other information or reporting categories would be relevant?
- How should funders instruct reviewers of grant applications to reinforce transparent reporting? How much information should funders request (i.e., to what level of detail) in grant applications? Is it possible to obtain sufficient information about transparent reporting in grant applications without dramatic expansion of the application?
1 This small group exercise was intended to engage participants in using what they had learned thus far in the workshop. This section of the proceedings summarizes the small group discussions based on the report by each group’s designated rapporteur and should not be construed as reflecting any consensus of the group. All group responses and proposals reported are for discussion purposes only.
Michael Keiser reported for a small group that discussed the roles and responsibilities of researchers in improving transparent reporting of biomedical research. The small group discussions focused on potential actions that various stakeholders could take to help promote widespread adoption of minimal standards for transparency and rigor.
The first proposal, Keiser described, would be for researchers to report, in a standardized way, the reasoning and decision points that went into a given research project as well as relevant gaps in information. He acknowledged that few tools exist to facilitate this type of transparent reporting, so the development of new, more dynamic tools may be needed. Keiser suggested that this information should be included instead of or in addition to minimal reporting standards when submitting research applications to funders or manuscripts to journals.
The second proposal, Keiser outlined, would be to leverage the expertise of university librarians and create a resource for investigators starting a new laboratory. This resource might be similar in concept to the University of California’s QB3 “startup in a box” framework—a combination of hands-on support, services, and mentorship—which is designed to lower the barriers to innovation for university entrepreneurs interested in starting a company.2 This type of resource could provide biomedical researchers with a basic set of tools along with guidance on data access and sharing, standardization, compliance, and other information that early stage investigators may need to establish a new lab. Such a resource could also be useful for more senior investigators seeking to implement new tools or approaches.
Keiser shared additional points raised during the small group discussion regarding actions specific to trainees, early career researchers, and senior investigators that could help promote reproducibility and replication.
For trainees, a few participants suggested that reproducibility and replicability could be incorporated into graduate training programs. For example, graduate students, in consultation with faculty mentors, could be required to reproduce or replicate a study as part of their training grant or qualifying exam. Keiser said the feasibility of reproducing a study might vary by discipline, so perhaps a compilation of existing studies for this type of exercise could be a useful resource. Participants in the small group discussions pointed out that aligning training with a desired change in research practice would help support a culture of reproducibility rather than add administrative burden. Additionally, Keiser noted that participants in the small group discussed the value of implementing formal coursework in experimental research design for undergraduates across institutions.
Participants in the small group said that early career researchers may be more directly connected to the research and the data and more comfortable adopting new tools and practices than senior investigators, Keiser reported. Participants suggested that more substantive feedback, particularly for early stage research proposals, through peer review—internally within institutions as well as through external grant applications—would benefit investigators at all stages of their careers.
Finally, Keiser relayed that small group participants discussed the influential role of senior investigators in promoting a culture of openness (e.g., code review within the lab, experimental reproducibility and replicability by others within the group) and data-sharing practices (e.g., open source tools, preprints, repositories for sharing data with the broader research community).
Franklin Sayre and Deborah Sweet reported for the two small groups that discussed ways institutions can help promote research practices that improve transparency and reproducibility.
Sayre laid out several institutional impediments to transparency and reproducibility raised by small group participants. In particular, he noted that the real and perceived incentives for tenure and promotion decisions do not necessarily align with incentives for transparent reporting. The tenure process can be a source of great anxiety for early career researchers. Participants suggested that this anxiety and focus on particular metrics, such as publication in high-impact journals, may promote a culture of poor research practices. Participants from both small groups discussed the need for more clarity for early career researchers on how tenure and promotion decisions are made at institutions. Sweet added that when funders show they value transparency, institutions will follow, which is why cooperation across stakeholders is critical. Sweet reported that small group participants discussed the possibility of assigning value to transparent reporting and team science. For example, she suggested that institutions could request tenure support letters to address a nominee’s “quality of contribution” to the research environment.
Small group participants also discussed how standards for awarding doctoral degrees should not be based on publication of a given number of papers, but should instead reward “transparency, openness, contribution to the research corpus, datasets, [and] even being honest about your negative results,” Sweet said. She added that positive incentives for applying best practices in research transparency might include awards (e.g., the National Institute of Neurological Disorders and Stroke’s Landis Mentorship Award3) or performance bonuses. She highlighted the importance of training doctoral students from the start and providing training to the broader scientific community as well. Training should make researchers aware of the availability of statistical advice and the value of consulting statisticians at the start of the research process. Steven Goodman added that institutions have a responsibility to provide adequate capacity for statistical and computational science support for researchers.
3 Available at https://www.ninds.nih.gov/Funding/About-Funding/landis-award-for-outstanding-mentorship (accessed January 10, 2020).
Sayre described another impediment discussed by small group participants: the perception that activities to support transparent reporting constitute an administrative burden and that reproducibility studies are separate from the general “conduct of science.” Participants highlighted the need to reframe practices that promote transparency and reproducibility (e.g., checklists, data sharing) as a routine part of the scientific research process. Sweet shared a suggestion by small group participants that institutions might consider developing an inventory of existing research practices from across different laboratories to help identify and share approaches for enabling data sharing and transparent reporting. Institutions could then consider ways to incentivize the uptake of best practices, Sweet said, and perhaps create disincentives for not meeting particular research practice standards.
Sayre relayed a third impediment discussed by small group participants, which was the gap between researchers and institution-level policy, incentives, infrastructure, and research support. Participants suggested there is a need for guidance and better mechanisms to connect researchers with institutional resources. Sweet highlighted one main message raised during the small group discussion: the research culture set by leadership matters, and permeates throughout the organization. She added that institutions should create a “culture of openness” in which there are mechanisms for researchers to raise concerns about reproducibility and consequences for noncompliance with transparent reporting. Sayre also acknowledged that institutional leadership is needed to support widespread adoption of transparent reporting. However, he relayed that “[early career] researchers are often more interested in reproducibility, in transparency, and in sharing their work openly.” Small group participants discussed how small laboratories or investigators are often the driving force for grassroots change, so engaging these individuals could help build momentum for reproducibility and transparency within institutions. Sweet shared that small group participants raised the importance of approaching issues with transparent reporting from the perspective that most researchers want to “do the right thing and perform science honestly,” rather than taking a punitive approach.
Richard Nakamura reported for a small group that discussed the roles and responsibilities of funders in improving transparent reporting of biomedical research. Participants in the small group considered the benefit of harmonizing a set of minimal reporting standards across organizations so that investigators are not put in a position of having to meet multiple requirements when applying to different funders. Nakamura relayed the consideration that the application of a common set of minimal reporting standards would need to be flexible across organizations. As discussed throughout the workshop, participants in this small group underscored the need to prioritize a core set of standards based on what is most important for enhancing transparent reporting in biomedical research.
Having tools available will be the key to helping facilitate interoperability across systems and reduce the burden on researchers, Nakamura reported. Small group participants pointed out that funders should adapt to support the development of tools and platforms for transparent reporting as well as replication studies.
The small group discussed how funding organizations can play a significant role in promoting culture change. Small group participants emphasized the need for training, Nakamura said. He added that National Institutes of Health training requirements on this topic should be forthcoming. Similar to other small group discussions, participants highlighted the role of leadership, including those at funding organizations, scientific societies, and academia. Department chairs and senior scientists as well as early career scientists will need to speak up about the problems related to reproducibility and replicability and push for culture change.
Finally, Nakamura said the small group discussed the importance of having metrics to measure whether a given approach leads to improvement. He added that there should be a feedback loop to evaluate whether a given intervention leads to the intended outcome.
Veronique Kiermer and Valda Vinson reported for two small groups, which considered the roles and responsibilities of publishers in improving transparent reporting of biomedical research. Vinson summarized the takeaway message of one small group discussion by saying, “If you want science better, make better science easier.” Kiermer summarized that institutions can help facilitate transparent reporting practices and make them normative, while journals and funders may “bookend the process” by aligning reporting requirements so that expectations are clear for researchers from the beginning.
One key topic discussed by small group participants was the need for coordinated action across sectors to address areas of highest need. One small group focused on study design and statistics, which might benefit from training modules developed for researchers and institutions as well as for journals and funders, Vinson said. Kiermer shared two areas of need discussed by the other small group: data archiving and sourcing biological materials. Data archiving is essential for transparent reporting, Kiermer said, but efforts are not coordinated. Publishers find that researchers reach the publication stage without having considered from the beginning how they intend to store datasets, including images. As reported by Kiermer, small group participants suggested that the research flow could be improved (1) if funders set expectations from the beginning (e.g., requiring a data management plan as a condition of funding; providing guidance); (2) if institutions provided guidance to researchers on developing a data management plan and use of data repositories; and (3) if journals required reporting of raw data, which could help facilitate review or reproducibility studies following publication. Biological materials are also a source of variability in preclinical studies that contributes to the lack of reproducibility, said Kiermer. Small group participants suggested that clear identification of reagents, authentication of biological material, and archiving of reagents in a repository are all ways to help address this issue, Kiermer reported. For example, participants said the MD Anderson Cancer Center requires authentication of cell lines every year and has established a core facility to help carry out this work.
The small groups both discussed practical approaches for harmonizing a set of minimal reporting standards to facilitate better science. Vinson and Kiermer said that as discussed throughout the workshop, small group participants acknowledged that checklists on their own are not a solution—there would need to be a pragmatic and inclusive approach to implementation. Participants suggested that establishing a coordinated framework, which clearly outlines stakeholder expectations (e.g., editors, authors, reviewers, and the public), would help to drive the adoption of interventions, such as checklists, Vinson said. “Less is more” when promoting compliance with checklists, Kiermer added. Participants acknowledged the challenges of tracking compliance and noted the importance of metadata and persistent identifiers for digital objects “so that there is a clear provenance” from grant application to final publication, Kiermer said.
Finally, small group participants emphasized the need for cross-sector leadership and broad sector inclusion, Kiermer relayed. Participants noted the important role of institutional libraries, and the need to include the perspective of researchers. In this regard, Kiermer said that several small group participants supported the establishment of an ongoing National Academies forum as discussed by Marcia McNutt (see Chapter 2). Vinson suggested the concept of a “sea change” initiative to improve reproducibility in biomedical research that could be modeled after the American Association for the Advancement of Science STEM Equity Achievement (SEA)4 Change initiative on diversity, equity, and inclusion in science, or the Athena Scientific Women’s Academic Network (SWAN) Charter5 promoting gender equity in science in the United Kingdom. In essence, Vinson explained, institutions could self-assess their status on the issue, chart a path toward improvement, and be rewarded for taking transformative action.
5 Available at https://www.ecu.ac.uk/equality-charters/athena-swan (accessed January 10, 2020).
In closing the workshop, Fineberg observed that “there is a movement toward higher levels of rigor and transparency in science.” He encouraged participants to continue to foster this movement and accelerate progress toward improving reproducibility and replicability across the biomedical research life cycle through transparent reporting.