
Appendix C: Recommendations Grouped by Stakeholder
Pages 209-220



From page 209...
... SCIENTISTS AND RESEARCHERS

RECOMMENDATION 4-1: To help ensure the reproducibility of computational results, researchers should convey clear, specific, and complete information about any computational methods and data products that support their published results in order to enable other researchers to repeat the analysis, unless such information is restricted by nonpublic data policies. That information should include the data, study methods, and computational environment:
• the input data used in the study either in extension (e.g., a text file or a binary)
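As a minimal illustration of the "computational environment" part of this recommendation (this sketch is not from the report; the manifest fields are assumptions about what a researcher might usefully record), one could archive basic environment details alongside the data and scripts being shared:

```python
# Illustrative sketch: capture basic computational-environment details
# to archive with published results. Field names are illustrative choices,
# not a standard prescribed by the report.
import json
import platform
import sys


def environment_manifest() -> dict:
    """Collect basic environment details to store with study outputs."""
    return {
        "python_version": sys.version,
        "implementation": platform.python_implementation(),
        "os": platform.platform(),
        "machine": platform.machine(),
    }


if __name__ == "__main__":
    # Write the manifest next to the shared data and analysis scripts.
    print(json.dumps(environment_manifest(), indent=2))
```

A fuller manifest would also pin library versions (e.g., from a lockfile) and record random seeds, which this sketch omits for brevity.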
From page 210...
... RECOMMENDATION 6-2: Academic institutions and institutions managing scientific work, such as industry and the national laboratories, should include training in the proper use of statistical analysis and inference. Researchers who use statistical inference analyses should learn to use them properly.
From page 211...
... • Researchers should collaborate with expert colleagues when their education and training are not adequate to meet the computational requirements of their research. • In line with the National Science Foundation's (NSF's)
From page 212...
... RECOMMENDATION 4-2: The National Science Foundation should consider investing in research that explores the limits of computational reproducibility in instances in which bitwise reproducibility is not reasonable in order to ensure that the meaning of consistent computational results remains in step with the development of new computational hardware, tools, and methods.
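The distinction the recommendation draws can be illustrated with floating-point arithmetic (a sketch, not from the report): bitwise reproducibility demands identical bits in the output, while a tolerance-based notion accepts the small differences that hardware, compilers, or summation order can introduce.

```python
# Illustrative sketch: bitwise equality vs. reproducibility within a
# tolerance. The function names are illustrative, not report terminology.
import math


def bitwise_equal(a: float, b: float) -> bool:
    # Exact comparison: fails if even one bit of the result differs.
    return a == b


def reproducible_within_tolerance(a: float, b: float, rel_tol: float = 1e-9) -> bool:
    # Accepts tiny discrepancies from rounding or reordering.
    return math.isclose(a, b, rel_tol=rel_tol)


# Summing the same values by different methods can round differently.
xs = [0.1] * 10
naive = sum(xs)            # left-to-right accumulation
careful = math.fsum(xs)    # correctly rounded summation

print(bitwise_equal(naive, careful))                  # may be False
print(reproducible_within_tolerance(naive, careful))  # True
```

Here both sums are "the same" for any scientific purpose, yet they can fail a bitwise check, which is why the recommendation asks what "consistent results" should mean when bitwise identity is unreasonable.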
From page 213...
... Through these repository criteria, NSF would enable discoverability and standards for digital scholarly objects and discourage an undue proliferation of repositories, perhaps through endorsing or providing one go-to website that could access NSF-approved repositories. RECOMMENDATION 6-6: Many stakeholders have a role to play in improving computational reproducibility, including educational institutions, professional societies, researchers, and funders.
From page 214...
... RECOMMENDATION 6-10: When funders, researchers, and other stakeholders are considering whether and where to direct resources for replication studies, they should consider the following criteria: • The scientific results are important for individual decision making or for policy decisions. • The results have the potential to make a large contribution to basic scientific knowledge.
From page 215...
... • Researchers should collaborate with expert colleagues when their education and training are not adequate to meet the computational requirements of their research. • In line with its priority for "harnessing the data revolution," the National Science Foundation (and other funders)
From page 218...
... The strength of the claims made in a journal article or conference submission should reflect the reproducibility and replicability standards to which an article is held, with stronger claims reserved for higher expected levels of reproducibility and replicability. Journals and conference organizers are encouraged to:
• set and implement desired standards of reproducibility and replicability and make this one of their priorities, such as deciding which level they wish to achieve for each Transparency and Openness Promotion guideline and working toward that goal;
• adopt policies to reduce the likelihood of non-replicability, such as considering incentives or requirements for research materials transparency, design and analysis plan transparency, enhanced review of statistical methods, study or analysis plan preregistration, and replication studies; and
• require as a review criterion that all research reports include a thoughtful discussion of the uncertainty in measurements and conclusions.
From page 219...
... Particular care in reporting on scientific results is warranted when
• the scientific system under study is complex and with limited control over alternative explanations or confounding influences;
• a result is particularly surprising or at odds with existing bodies of research;
• the study deals with an emerging area of science that is characterized by significant disagreement or contradictory results within the scientific community; and
• research involves potential conflicts of interest, such as work funded by advocacy groups, affected industry, or others with a stake in the outcomes.

MEMBERS OF THE PUBLIC AND POLICY MAKERS

RECOMMENDATION 7-3: Anyone making personal or policy decisions based on scientific evidence should be wary of making a serious decision based on the results, no matter how promising, of a single study.

