References

Albers, C., and Lakens, D. (2018). When Power Analyses Based on Pilot Data Are Biased: Inaccurate Effect Size Estimators and Follow-up Bias. Journal of Experimental Social Psychology, 74, 187-195. doi:10.1016/j.jesp.2017.09.004.

Alberts, B., Kirschner, M.W., Tilghman, S., and Varmus, H. (2014). Rescuing U.S. Biomedical Research from Its Systemic Flaws. Proceedings of the National Academy of Sciences of the United States of America, 111(16), 5773-5777.

Allison, D.B., Brown, A.W., George, B.J., and Kaiser, K.A. (2016). Reproducibility: A Tragedy of Errors. Nature, 530(7588), 27-29.

Alogna, V.K., Attaya, M.K., Aucoin, P., Bahník, Š., Birch, S., Birt, A.R., Bornstein, B.H., Bouwmeester, S., Brandimonte, M.A., Brown, C., Buswell, K., Carlson, C., Carlson, M., Chu, S., Cislak, A., Colarusso, M., Colloff, M.F., Dellapaolera, K.S., Delvenne, J.-F., Di Domenico, A., Drummond, A., Echterhoff, G., Edlund, J.E., Eggleston, C.M., Fairfield, B., Franco, G., Gabbert, F., Gamblin, B.W., Garry, M., Gentry, R., Gilbert, E.A., Greenberg, D.L., Halberstadt, J., Hall, L., Hancock, P.J.B., Hirsch, D., Holt, G., Jackson, J.C., Jong, J., Kehn, A., Koch, C., Kopietz, R., Körner, U., Kunar, M.A., Lai, C.K., Langton, S.R.H., Leite, F.P., Mammarella, N., Marsh, J.E., McConnaughy, K.A., McCoy, S., McIntyre, A.H., Meissner, C.A., Michael, R.B., Mitchell, A.A., Mugayar-Baldocchi, M., Musselman, R., Ng, C., Nichols, A.L., Nunez, N.L., Palmer, M.A., Pappagianopoulos, J.E., Petro, M.S., Poirier, C.R., Portch, E., Rainsford, M., Rancourt, A., Romig, C., Rubínová, E., Sanson, M., Satchell, L., Sauer, J.D., Schweitzer, K., Shaheed, J., Skelton, F., Sullivan, G.A., Susa, K.J., Swanner, J.K., Thompson, W.B., Todaro, R., Ulatowska, J., Valentine, T., Verkoeijen, P.P.J.L., Vranka, M., Wade, K.A., Was, C.A., Weatherford, D., Wiseman, K., Zaksaite, T., Zuj, D.V., and Zwaan, R.A. (2014). Registered Replication Report: Schooler and Engstler-Schooler (1990). Perspectives on Psychological Science, 9(5), 556-578.

American Association for the Advancement of Science. (2018). Science Journals: Editorial Policies. Available: http://www.sciencemag.org/authors/science-journals-editorial-policies [January 2019].

American Statistical Association. (2016). American Statistical Association Releases Statement on Statistical Significance and P-Values. Available: https://www.amstat.org/asa/files/pdfs/p-valuestatement.pdf [January 2019].

Amrhein, V., Trafimow, D., and Greenland, S. (2019a). Inferential Statistics as Descriptive Statistics: There Is No Replication Crisis If We Don’t Expect Replication. American Statistician, 73(Suppl. 1), 262-270. doi:10.1080/00031305.2018.1543137.

Amrhein, V., Greenland, S., and McShane, B. (2019b). Scientists Rise Up Against Statistical Significance. Nature, 567(7748), 305-307. doi:10.1038/d41586-019-00857-9.

Anderson, S., and Williams, R. (2017). LIGO Data Management Plan, June 2017. Available: https://dcc.ligo.org/public/0009/M1000066/025/LIGO-M1000066-v25.pdf [April 2019].

Annis, J., Yong, Z., Voeckler, J., Wilde, M., Kent, S., and Foster, I. (2002, November 16-22). Applying Chimera Virtual Data Concepts to Cluster Finding in the Sloan Sky Survey. Paper presented at the SC ‘02: Proceedings of the 2002 ACM/IEEE Conference on Supercomputing, Baltimore, MD. Los Alamitos, CA: IEEE Computer Society Press. Available: https://ieeexplore.ieee.org/document/1592892 [April 2019].

Aschwanden, C. (2015). Science Isn’t Broken: It’s Just a Hell of a Lot Harder Than We Give It Credit For. FiveThirtyEight, August 19. Available: https://fivethirtyeight.com/features/science-isnt-broken/#part1 [January 2019].

Association for Computing Machinery. (2018). Artifact Review and Badging. Available: https://www.acm.org/publications/policies/artifact-review-badging [December 2018].

Association for Psychological Science. (2018). Registered Replication Reports. Available: https://www.psychologicalscience.org/publications/replication [December 2018].

Association of American Universities and Association of Public and Land-grant Universities. (2017). AAU-APLU Public Access Working Group Report and Recommendations. Available: https://www.aau.edu/key-issues/aau-aplu-public-access-working-group-report-and-recommendations [January 2019].

Bacon, F. ([1620] 1889). Novum Organum. Oxford, UK: Clarendon Press.

Baer, D.R., and Gilmore, I.S. (2018). Responding to the Growing Issue of Research Reproducibility. Journal of Vacuum Science & Technology A, 36(6), 068502. doi:10.1116/1.5049141.

Bailey, D.H., Barrio, R., and Borwein, J.M. (2012). High-Precision Computation: Mathematical Physics and Dynamics. Applied Mathematics and Computation, 218(20), 10106-10121. doi:10.1016/j.amc.2012.03.087.

Baker, M. (2016). 1,500 Scientists Lift the Lid on Reproducibility. Nature News, 533(7604), 452-454. doi:10.1038/533452a.

Barba, L.A. (2018). Terminologies for Reproducible Research. arXiv, 1802.03311. Available: https://arxiv.org/pdf/1802.03311 [December 2018].

Barba, L.A. (2019). Praxis of Reproducible Computational Science. Computing in Science & Engineering, 21(1), 73-78. doi:10.1109/MCSE.2018.2881905. (preprint on Authorea, https://doi.org/10.22541/au.153922477.77361922).

Barba, L.A., Clementi, N.C., and Forsyth, G.F. (2017, January 3-6). Essential Skills for Reproducible Research Computing. Presented at Universidad Técnica Federico Santa María, Valparaíso, Chile. Available: https://barbagroup.github.io/essential_skills_RRC [January 2019].

Barba, L.A., Clementi, N.C., and Forsyth, G.F. (2018). Essential Skills for Reproducible Research Computing: A Four-Day, Intensive, Hands-on Workshop on the Foundational Skills That Everyone Using Computers in the Pursuit of Scientific Research Should Have. Available: https://barbagroup.github.io/essential_skills_RRC [December 2018].

Barrett, B. (2018). Intel’s New Processors Are Built for the High-Powered Future of PCs. WIRED, May 30. Available: https://www.wired.com/2017/05/intels-new-processors-built-high-powered-future-pcs [December 2018].

Bauer, P., Thorpe, A., and Brunet, G. (2015). The Quiet Revolution of Numerical Weather Prediction. Nature, 525(7567), 47-55. doi:10.1038/nature14956.

Begley, C.G., and Ellis, L.M. (2012). Raise Standards for Preclinical Cancer Research. Nature, 483(7391), 531-533. doi:10.1038/483531a.

Bench, S.W., Rivera, G.N., Schlegel, R.J., Hicks, J.A., and Lench, H.C. (2017). Does Expertise Matter in Replication? An Examination of the Reproducibility Project: Psychology. Journal of Experimental Social Psychology, 68, 181-184. doi:10.1016/j.jesp.2016.07.003.

Benjamin, D.J., Berger, J.O., Johannesson, M., Nosek, B.A., Wagenmakers, E.J., Berk, R., Bollen, K.A., Brembs, B., Brown, L., Camerer, C., Cesarini, D., Chambers, C.D., Clyde, M., Cook, T.D., De Boeck, P., Dienes, Z., Dreber, A., Easwaran, K., Efferson, C., Fehr, E., Fidler, F., Field, A.P., Forster, M., George, E.I., Gonzalez, R., Goodman, S., Green, E., Green, D.P., Greenwald, A.G., Hadfield, J.D., Hedges, L.V., Held, L., Hua Ho, T., Hoijtink, H., Hruschka, D.J., Imai, K., Imbens, G., Ioannidis, J.P.A., Jeon, M., Jones, J.H., Kirchler, M., Laibson, D., List, J., Little, R., Lupia, A., Machery, E., Maxwell, S.E., McCarthy, M., Moore, D.A., Morgan, S.L., Munafó, M., Nakagawa, S., Nyhan, B., Parker, T.H., Pericchi, L., Perugini, M., Rouder, J., Rousseau, J., Savalei, V., Schönbrodt, F.D., Sellke, T., Sinclair, B., Tingley, D., Van Zandt, T., Vazire, S., Watts, D.J., Winship, C., Wolpert, R.L., Xie, Y., Young, C., Zinman, J., and Johnson, V.E. (2018). Redefine Statistical Significance. Nature Human Behaviour, 2(1), 6-10. doi:10.1038/s41562-017-0189-z.

Berger, J.O., and Delampady, M. (1987). Testing Precise Hypotheses. Statistical Science, 2(3), 317-335.

Berlin, J.A., and Ghersi, D. (2018). Preventing Publication Bias: Registries and Prospective Meta-Analysis. In H.R. Rothstein, A.J. Sutton, and M. Borenstein (Eds.), Publication Bias in Meta-Analysis: Prevention, Assessment and Adjustments (pp. 35-48). Hoboken, NJ: Wiley.

Besley, J.C., and Nisbet, M. (2013). How Scientists View the Public, the Media and the Political Process. Public Understanding of Science, 22(6), 644-659.

Bhattacharjee, Y. (2010). NSF Board Draws Flak for Dropping Evolution from Indicators. Science, 328(5975), 150-151.

Binder, A.R., Hillback, E.D., and Brossard, D. (2016). Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies. Risk Analysis, 36(4), 831-846.

Blischak, J.D., Davenport, E.R., and Wilson, G. (2016). A Quick Introduction to Version Control with Git and GitHub. PLOS Computational Biology, 12(1), e1004668. doi:10.1371/journal.pcbi.1004668.

Blum, D., Knudson, M., Henig, R.M., and National Association of Science Writers. (2005). A Field Guide for Science Writers: The Official Guide of the National Association of Science Writers. New York and Oxford: Oxford University Press.

Boettiger, C. (2015). An Introduction to Docker for Reproducible Research. ACM SIGOPS Operating Systems Review—Special Issue on Repeatability and Sharing of Experimental Artifacts, 49(1), 71-79.

Bollen, K., Cacioppo, J.T., Kaplan, R.M., Krosnick, J.A., and Olds, J.L. (2015). Social, Behavioral, and Economic Sciences Perspectives on Robust and Reliable Science. Report of the Subcommittee on Replicability in Science Advisory Committee to the National Science Foundation Directorate for Social, Behavioral, and Economic Sciences. Available: https://www.nsf.gov/sbe/AC_Materials/SBE_Robust_and_Reliable_Research_Report.pdf [April 2019].

Boos, D.D., and Stefanski, L.A. (2011). P-Value Precision and Reproducibility. The American Statistician, 65(4), 213-221.

Borenstein, M., Hedges, L.V., Higgins, J.P., and Rothstein, H.R. (2010). A Basic Introduction to Fixed-Effect and Random-Effects Models for Meta-Analysis. Research Synthesis Methods, 1(2), 97-111.

Bosco, F.A., Aguinis, H., Singh, K., Field, J.G., and Pierce, C.A. (2015). Correlational Effect Size Benchmarks. Journal of Applied Psychology, 100(2), 431-449.

Boulbes, D.R., Costello, T.J., Baggerly, K.A., Fan, F., Wang, R., Bhattacharya, R., Ye, X., and Ellis, L.M. (2018). A Survey on Data Reproducibility and the Effect of Publication Process on the Ethical Reporting of Laboratory Research. Clinical Cancer Research, 24(14), 3447-3455. doi:10.1158/1078-0432.CCR-18-0227.

Bowers, S., and Ludäscher, B. (2005, October 24-28). Actor-Oriented Design of Scientific Workflows. Paper presented at the International Conference on Conceptual Modeling, Klagenfurt, Austria. doi:10.1007/11568322_24.

Boykoff, M.T., and Boykoff, J.M. (2004). Balance as Bias: Global Warming and the U.S. Prestige Press. Global Environmental Change, 14(2), 125-136.

Brainard, J. (2018). Rethinking Retractions. Science, 362(6413), 390-393.

Brehm, J. (1993). The Phantom Respondents: Opinion Surveys and Political Representation. Ann Arbor: The University of Michigan Press.

Brion, M.J. (2010). Commentary: Assessing the Impact of Breastfeeding on Child Health: Where Conventional Methods Alone Fall Short for Reliably Establishing Causal Inference. International Journal of Epidemiology, 39(1), 306-307.

Broom, D.P., and Hirscher, M. (2016). Irreproducibility in Hydrogen Storage Material Research. Energy & Environmental Science, 9(11), 3368-3380.

Brown, A.W., Kaiser, K.A., and Allison, D.B. (2018). Issues with Data and Analyses: Errors, Underlying Themes, and Potential Solutions. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2563-2570.

Buckheit, J., and Donoho, D.L. (1995). WaveLab and Reproducible Research. Available: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.53.6201 [December 2018].

Budescu, D.V., Broomell, S., and Por, H.H. (2009). Improving Communication of Uncertainty in the Reports of the Intergovernmental Panel on Climate Change. Psychological Science, 20(3), 299-308.

Buongiorno, J., Venerus, D.C., Prabhat, N., McKrell, T., Townsend, J., Christianson, R., Tolmachev, Y.V., Keblinski, P., Hu, L.-W., Alvarado, J.L., Bang, I.C., Bishnoi, S.W., Bonetti, M., Botz, F., Cecere, A., Chang, Y., Chen, G., Chen, H., Chung, S.J., Chyu, M.K., Das, S.K., Paola, R.D., Ding, Y., Dubois, F., Dzido, G., Eapen, J., Escher, W., Funfschilling, D., Galand, Q., Gao, J., Gharagozloo, P.E., Goodson, K.E., Gutierrez, J.G., Hong, H., Horton, M., Hwang, K.S., Iorio, C.S., Jang, S.P., Jarzebski, A.B., Jiang, Y., Jin, L., Kabelac, S., Kamath, A., Kedzierski, M.A., Kieng, L.G., Kim, C., Kim, J.-H., Kim, S., Lee, S.H., Leong, K.C., Manna, I., Michel, B., Ni, R., Patel, H.E., Philip, J., Poulikakos, D., Reynaud, C., Savino, R., Singh, P.K., Song, P., Sundararajan, T., Timofeeva, E., Tritcak, T., Turanov, A.N., Vaerenbergh, S.V., Wen, D., Witharana, S., Yang, C., Yeh, W.-H., Zhao, X.-Z., and Zhou, S.-Q. (2009). A Benchmark Study on the Thermal Conductivity of Nanofluids. Journal of Applied Physics, 106(9), 094312. doi:10.1063/1.3245330.

Bush, R. (2018). Perspectives on Reproducibility and Replication of Results in Climate Science. Paper prepared for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine.

Buttliere, B., and Wicherts, J. (2018). What Next for Scientific Communication? A Large-Scale Survey of Psychologists on Problems and Potential Solutions. PsyArXiv Preprints. Available: https://psyarxiv.com/972eu [December 2018].

Button, K.S., Ioannidis, J.P.A., Mokrysz, C., Nosek, B.A., Flint, J., Robinson, E.S.J., and Munafò, M.R. (2013). Power Failure: Why Small Sample Size Undermines the Reliability of Neuroscience. Nature Reviews Neuroscience, 14, 365-376. doi:10.1038/nrn3475.

Byrne, M. (2017). Making Progress toward Open Data: Reflections on Data Sharing at PLOS ONE. PLOS Blogs, May 8. Available: https://blogs.plos.org/everyone/2017/05/08/making-progress-toward-open-data [January 2019].

Callahan, S.P., Freire, J., Santos, E., Scheidegger, C.E., Silva, C.T., and Vo, H.T. (2006, June 27-29). VisTrails: Visualization Meets Data Management. Paper presented at the Proceedings of the 2006 ACM SIGMOD International Conference on Management of Data, Chicago, IL. Available: https://link.springer.com/chapter/10.1007/978-3-540-89965-5_13 [April 2019].

Callahan, S.P., Freire, J., Scheidegger, C.E., Silva, C.T., and Vo, H.T. (2008, June 17-18). Towards Provenance-Enabling Paraview. Paper presented at the Provenance and Annotation of Data and Processes, Salt Lake City, UT. Available: https://www.springer.com/us/book/9783540899648 [April 2019].

Callier, V. (2018). Yes, It Is Getting Harder to Publish in Prestigious Journals If You Haven’t Already. Science, December 10. Available: https://www.sciencemag.org/careers/2018/12/yes-it-getting-harder-publish-prestigious-journals-if-you-haven-t-already [December 2018].

Camerer, C.F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B.A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., Isaksson, S., Manfredi, D., Rose, J., Wagenmakers, E.-J., and Wu, H. (2018). Evaluating the Replicability of Social Science Experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637-644. doi:10.1038/s41562-018-0399-z.

Camerer, C.F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Almenberg, J., Altmejd, A., Chan, T., Heikensten, E., Holzmeister, F., Imai, T., Isaksson, S., Nave, G., Pfeiffer, T., Razen, M., and Wu, H. (2016). Evaluating Replicability of Laboratory Experiments in Economics. Science, 351(6280), 1433-1436. doi:10.1126/science.aaf0918.

Carter, E.C., and McCullough, M.E. (2014). Publication Bias and the Limited Strength Model of Self-Control: Has the Evidence for Ego Depletion Been Overestimated? Frontiers in Psychology, 5, 823-823. doi:10.3389/fpsyg.2014.00823.

Casadevall, A., and Fang, F.C. (2016). Rigorous Science: A How-To Guide. mBio, 7(6), e01902-16.

Cassey, P., and Blackburn, T.M. (2006). Reproducibility and Repeatability in Ecology. BioScience, 56(12), 958-959.

Center for Open Science. (2018). Open Science Framework (OSF). Available: https://osf.io/tvyxz/wiki/5.%20Adoptions%20and%20Endorsements/?_ga=2.111781956.11451633.1525987388-1384377334.1525987388 [December 2018].

Chambers, C.D., Feredoes, E., Muthukumaraswamy, S.D., and Etchells, P. (2014). Instead of “Playing the Game” It Is Time to Change the Rules: Registered Reports at AIMS Neuroscience and Beyond. AIMS Neuroscience, 1(1), 4-17.

Chang, A.C., and Li, P. (2018). Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say “Often Not.” Critical Finance Review, 7. Available: https://www.nowpublishers.com/article/Details/CFR-0053 [July 2019].

Chen, X., Dallmeier-Tiessen, S., Dasler, R., Feger, S., Fokianos, P., Gonzalez, J.B., Hirvonsalo, H., Kousidis, D., Lavasa, A., Mele, S., Rodriguez, D.R., Šimko, T., Smith, T., Trisovic, A., Trzcinska, A., Tsanaktsidis, I., Zimmermann, M., Cranmer, K., Heinrich, L., Watts, G., Hildreth, M., Lloret Iglesias, L., Lassila-Perini, K., and Neubert, S. (2018). Open Is Not Enough. Nature Physics, 15(2), 113-119. doi:10.1038/s41567-018-0342-2.

Cheung, I., Campbell, L., LeBel, E.P., Ackerman, R.A., Aykutoğlu, B., Bahník, Š., Bowen, J.D., Bredow, C.A., Bromberg, C., Caprariello, P.A., Carcedo, R.J., Carson, K.J., Cobb, R.J., Collins, N.L., Corretti, C.A., DiDonato, T.E., Ellithorpe, C., Fernández-Rouco, N., Fuglestad, P.T., Goldberg, R.M., Golom, F.D., Gündoğdu-Aktürk, E., Hoplock, L.B., Houdek, P., Kane, H.S., Kim, J.S., Kraus, S., Leone, C.T., Li, N.P., Logan, J.M., Millman, R.D., Morry, M.M., Pink, J.C., Ritchey, T., Root Luna, L.M., Sinclair, H.C., Stinson, D.A., Sucharyna, T.A., Tidwell, N.D., Uysal, A., Vranka, M., Winczewski, L.A., and Yong, J.C. (2016). Registered Replication Report: Study 1 from Finkel, Rusbult, Kumashiro, and Hannon (2002). Perspectives on Psychological Science, 11(5), 750-764. doi:10.1177/1745691616664694.

Chirigati, F., and Freire, J. (2018). Provenance and the Different Flavors of Reproducibility. IEEE Data Engineering Bulletin, 41(1), 15-26.

Chirigati, F., Shasha, D., and Freire, J. (2013). ReproZip: Using Provenance to Support Computational Reproducibility. Paper presented at the Proceedings of the 5th USENIX Workshop on the Theory and Practice of Provenance. Available: https://www.usenix.org/conference/tapp13/technical-sessions/presentation/chirigati [April 2019].

Chirigati, F., Rampin, R., Shasha, D., and Freire, J. (2016, June 26-July 1). ReproZip: Computational Reproducibility with Ease. Paper presented at the Proceedings of the 2016 International Conference on Management of Data, San Francisco, CA. New York: Association for Computing Technology. doi:10.1145/2882903.2899401.

Chue Hong, N., Hettrick, S., Antonioletti, M., Carr, L., Crouch, S., De Roure, D., Emsley, I., Goble, C., Hay, A., Inupakutika, D., Jackson, M., Nenadic, A., Parkinson, T., Parsons, M.I., Pawlik, A., Peru, G., Proeme, A., Robinson, J., and Sufi, S. (2015). UK Research Software Survey 2014 [Dataset]. Available: https://datashare.is.ed.ac.uk/handle/10283/785 [December 2018].

Claerbout, J.F., and Karrenbach, M. (1992). Electronic Documents Give Reproducible Research a New Meaning. SEG Technical Program Expanded Abstracts, 601-604. doi:10.1190/1.1822162.

Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Collberg, C., Proebsting, T., Moraila, G., Shankaran, A., Shi, Z., and Warren, A. (2014). Measuring Reproducibility in Computer Systems Research. University of Arizona Department of Computer Science. Available: http://reproducibility.cs.arizona.edu/tr.pdf [January 2019].

Collins, H.M. (1975). The Seven Sexes: A Study in the Sociology of a Phenomenon, or the Replication of Experiments in Physics. Sociology, 9(2), 205-224.

Computational Fluid Dynamics Committee. (1998). Guide for the Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)). doi:10.2514/4.472855.001.

Conniff, R. (2018). When Continental Drift Was Considered Pseudoscience. Smithsonian Magazine, June. Available: https://www.smithsonianmag.com/science-nature/when-continental-drift-was-considered-pseudoscience-90353214 [December 2018].

Cooper, H., Hedges, L.V., and Valentine, J.C. (2009). The Handbook of Research Synthesis and Meta-Analysis (2nd ed.). New York: Russell Sage Foundation.

Corley, E.A., Kim, Y., and Scheufele, D.A. (2011). Leading U.S. Nano-Scientists’ Perceptions about Media Coverage and the Public Communication of Scientific Research Findings. Journal of Nanoparticle Research, 13(12), 7041-7055.

Council on Governmental Relations. (2018) COGR Survey Report on Institutional Resources for Promoting Research Quality. Available: https://www.cogr.edu/cogr-survey-report-institutional-resources-promoting-research-quality [July 2019].

Cova, F., Strickland, B., Abatista, A., Allard, A., Andow, J., Attie, M., Beebe, J., Berniūnas, R., Boudesseul, J., Colombo, M., Cushman, F., Diaz, R., N’Djaye, N., van Dongen, N., Dranseika, V., Earp, B.D., Gaitán Torres, A., Hannikainen, I., Hernández-Conde, J.V., Hu, W., Jaquet, F., Khalifa, K., Kim, H., Kneer, M., Knobe, J., Kurthy, M., Lantian, A., Liao, S., Machery, E., Moerenhout, T., Mott, C., Phelan, M., Phillips, J., Rambharose, N., Reuter, K., Romero, F., Sousa, P., Sprenger, J., Thalabard, E., Tobia, K., Viciana, H., Wilkenfeld, D., and Zhou, X. (2018). Estimating the Reproducibility of Experimental Philosophy. Review of Philosophy and Psychology, 1-36. doi:10.1007/s13164-018-0400-9.

Cox, D.R. (2006). Principles of Statistical Inference. Cambridge, UK: Cambridge University Press. doi:10.1017/CBO9780511813559.

Cummings, M. (2018). Project Revives Old Software, Preserves “Born-Digital” Data. Available: https://news.yale.edu/2018/02/13/project-revives-old-software-preserves-born-digital-data [December 2018].

Dauben, J.W. (1988). Georg Cantor and the Battle for Transfinite Set Theory. Available: http://heavysideindustries.com/wp-content/uploads/2011/08/Dauben-Cantor.pdf [December 2018].

Davidson, S.B., and Freire, J. (2008). Provenance and Scientific Workflows: Challenges and Opportunities. Paper presented at the Proceedings of the ACM SIGMOD International Conference on Management of Data, Vancouver, BC. Available: https://vgc.poly.edu/~juliana/pub/freire-tutorial-sigmod2008.pdf [April 2019].

Davies, S.R. (2008). Constructing Communication: Talking to Scientists about Talking to the Public. Science Communication, 29(4), 413-434.

de Groot, A.D. (2014). The Meaning of “Significance” for Different Types of Research. Acta Psychologica, 148, 188-194. doi:10.1016/j.actpsy.2014.02.001.

Deelman, E., Peterka, T., Altintas, I., Carothers, C.D., van Dam, K.K., Moreland, K., Parashar, M., Ramakrishnan, L., Taufer, M., and Vetter, J. (2018). The Future of Scientific Workflows. The International Journal of High Performance Computing Applications, 32(1), 159-175.

de Vrieze, J. (2018). Meta-Analyses Were Supposed to End Scientific Debates. Often, They Only Cause More Controversy. Science, September 18. doi:10.1126/science.aav4617.

Dewald, W.G., Thursby, J.G., and Anderson, R.G. (1986). Replication in Empirical Economics: The Journal of Money, Credit and Banking Project. The American Economic Review, 76(4), 587-603.

Di Tommaso, P., Palumbo, E., Chatzou, M., Prieto, P., Heuer, M.L., and Notredame, C. (2015). The Impact of Docker Containers on the Performance of Genomic Pipelines. PeerJ, 3, e1273. doi:10.7717/peerj.1273.

Diethelm, K. (2012). The Limits of Reproducibility in Numerical Simulation. Computing in Science & Engineering, 14(1), 64-72.

Dillman, D.A., Smyth, J.D., and Christian, L.M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken, NJ: John Wiley & Sons.

Donoho, D.L. (2010). An Invitation to Reproducible Computational Research. Biostatistics, 11(3), 385-388.

Donoho, D.L., Maleki, A., Rahman, I.U., Shahram, M., and Stodden, V. (2009). Reproducible Research in Computational Harmonic Analysis. Computing in Science & Engineering, 11(1), 8-18.

Druckman, J.N., and Bolsen, T. (2011). Framing, Motivated Reasoning, and Opinions About Emergent Technologies. Journal of Communication, 61(4), 659-688.

Duvendack, M., Palmer-Jones, R.W., and Reed, W.R. (2015). Replications in Economics: A Progress Report. Econ Journal Watch, 12(2), 164-191.

Earp, B.D., and Trafimow, D. (2015). Replication, Falsification, and the Crisis of Confidence in Social Psychology. Frontiers in Psychology, 6(621), 1-11.

Ebersole, C.R., Atherton, O.E., Belanger, A.L., Skulborstad, H.M., Allen, J., Banks, J.B., Baranski, E., Bernstein, M.G., Bonfiglio, D.B.V., Boucher, L., Brown, E.R., Budiman, N.I., Cairo, A.H., Capaldi, C.A., Chartier, C.R., Chung, J.M., Cicero, D.C., Coleman, J.A., Conway, J.G., Davis, W.E., Devos, T., Fletcher, M.M., German, K., Grahe, J.E., Hermann, A.D., Hicks, J.A., Honeycutt, N., Humphrey, B., Janus, M., Johnson, D.J., Joy-Gaba, J.A., Juzeler, H., Keres, A., Kinney, D., Kirshenbaum, J., Klein, R.A., Lucas, R.E., Lustgraaf, C.J.N., Martin, D., Menon, M., Metzger, M., Moloney, J.M., Morse, P.J., Prislin, R., Razza, T., Re, D.E., Rule, N.O., Sacco, D.F., Sauerberger, K., Shrider, E., Shultz, M., Siemsen, C., Sobocko, K., Sternglanz, R.W., Summerville, A., Tskhay, K.O., van Allen, Z., Vaughn, L.A., Walker, R.J., Weinberg, A., Wilson, J.P., Wirth, J.H., Wortman, J., and Nosek, B.A. (2016a). Many Labs 3: Evaluating Participant Pool Quality across the Academic Semester via Replication. Journal of Experimental Social Psychology, 67, 68-82. doi:10.1016/j.jesp.2015.10.012.

Ebersole, C.R., Axt, J.R., and Nosek, B.A. (2016b). Scientists’ Reputations Are Based on Getting It Right, Not Being Right. PLOS Biology, 14(5), e1002460.

Ecklund, E.H., James, S.A., and Lincoln, A.E. (2012). How Academic Biologists and Physicists View Science Outreach. PLOS ONE, 7(5), e36240.

Edwards, M.A., and Roy, S. (2017). Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science, 34(1), 51-61.

Eerland, A., Sherrill, A.M., Magliano, J.P., Zwaan, R.A., Arnal, J.D., Aucoin, P., et al. (2016). Registered Replication Report: Hart and Albarracín (2011). Perspectives on Psychological Science, 11(1), 158-171.

Epskamp, S., and Nuijten, M.B. (2016). statcheck: Extract Statistics from Articles and Recompute p Values. Available: http://CRAN.R-project.org/package=statcheck [January 2019].

Ernst, E. (2002). A Systematic Review of Systematic Reviews of Homeopathy. British Journal of Clinical Pharmacology, 54(6), 577-582.

Error Prone. (2012). Editorial. Nature, 487(7408), 406. doi:10.1038/487406a.

Erway, R., and Rinehart, A. (2016). If You Build It, Will They Fund? Making Research Data Management Sustainable. Dublin, OH: OCLC Research. Available: https://www.oclc.org/content/dam/research/publications/2016/oclcresearch-making-research-datamanagement-sustainable-2016.pdf [April 2019].

Etz, A., and Vandekerckhove, J. (2016). A Bayesian Perspective on the Reproducibility Project: Psychology. PLOS ONE, 11(2), e0149794.

Fanelli, D. (2009). How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data. PLOS ONE, 4(5), e5738.

Fanelli, D. (2012). Negative Results Are Disappearing from Most Disciplines and Countries. Scientometrics, 90(3), 891-904.

Fanelli, D. (2018). Opinion: Is Science Really Facing a Reproducibility Crisis, and Do We Need It To? Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2628-2631.

Fanelli, D., Costas, R., and Ioannidis, J.P.A. (2017). Meta-Assessment of Bias in Science. Proceedings of the National Academy of Sciences of the United States of America, 114(14), 3714-3719.

Fang, F.C., Steen, R.G., and Casadevall, A. (2012). Misconduct Accounts for the Majority of Retracted Scientific Publications. Proceedings of the National Academy of Sciences of the United States of America, 109(42), 17028-17033.

Federation of American Societies for Experimental Biology. (2016). Enhancing Research Reproducibility: Recommendations from the Federation of American Societies for Experimental Biology. Available: https://www.faseb.org/Portals/2/PDFs/opa/2016/FASEB_Enhancing%20Research%20Reproducibility.pdf [January 2019].

Ferguson, C.J., and Brannick, M.T. (2012). Publication Bias in Psychological Science: Prevalence, Methods for Identifying and Controlling, and Implications for the Use of Meta-Analyses. Psychological Methods, 17(1), 120-128.

Fiedler, K., Kutzner, F., and Krueger, J.I. (2012). The Long Way from α-Error Control to Validity Proper: Problems with a Short-Sighted False-Positive Debate. Perspectives on Psychological Science, 7(6), 661-669.

Finkel, E.J., Eastwick, P.W., and Reis, H.T. (2015). Best Research Practices in Psychology: Illustrating Epistemological and Pragmatic Considerations with the Case of Relationship Science. Journal of Personality and Social Psychology, 108(2), 275-297.

Finkel, E.J., Eastwick, P.W., and Reis, H.T. (2017). Replicability and Other Features of a High-Quality Science: Toward a Balanced and Empirical Approach. Journal of Personality and Social Psychology, 113(2), 244-253.

Fischhoff, B., Brewer, N.T., and Downs, J.S. (Eds.). (2011). Communicating Risks and Benefits: An Evidence-Based User’s Guide. Available: https://www.fda.gov/aboutfda/reportsmanualsforms/reports/ucm268078.htm [December 2018].

Fischhoff, B., and Davis, A.L. (2014). Communicating Scientific Uncertainty. Proceedings of the National Academy of Sciences of the United States of America, 111(Suppl. 4), 13664-13671.

Fisher, R.A. (1935). The Design of Experiments. Oxford, UK: Oliver & Boyd.

Fomel, S., and Claerbout, J.F. (2009). Guest Editors’ Introduction: Reproducible Research. Computing in Science & Engineering, 11(1), 5-7.

Foster, I., Vockler, J., Wilde, M., and Yong, Z. (2002, July 24-26). Chimera: A Virtual Data System for Representing, Querying, and Automating Data Derivation. Paper presented at the Proceedings 14th International Conference on Scientific and Statistical Database Management. Washington, DC: IEEE Computer Society. doi:10.1109/SSDM.2002.1029704.

Fraley, R.C., and Vazire, S. (2014). The N-Pact Factor: Evaluating the Quality of Empirical Journals with Respect to Sample Size and Statistical Power. PLOS ONE, 9(10), e109019.

Franco, A., Malhotra, N., and Simonovits, G. (2014). Publication Bias in the Social Sciences: Unlocking the File Drawer. Science, 345(6203), 1502-1505.

Franco, A., Malhotra, N., and Simonovits, G. (2015). Underreporting in Psychology Experiments: Evidence from a Study Registry. Social Psychological and Personality Science, 7(1), 8-12.

Fraser, H., Parker, T., Nakagawa, S., Barnett, A., and Fidler, F. (2018). Questionable Research Practices in Ecology and Evolution. PLOS ONE, 13(7), e0200303.

Freedman, M.H., Gukelberger, J., Hastings, M.B., Trebst, S., Troyer, M., and Wang, Z. (2011). Galois Conjugates of Topological Phases. arXiv, 1106.3267. doi:10.1103/PhysRevB.85.045414.

Freire, J., and Silva, C.T. (2012). Making Computations and Publications Reproducible with VisTrails. Computing in Science & Engineering, 14(4), 18-25. doi:10.1109/MCSE.2012.76.

Freire, J., Silva, C.T., Callahan, S.P., Santos, E., Scheidegger, C.E., and Vo, H.T. (2006, May 3-5). Managing Rapidly-Evolving Scientific Workflows. Paper presented at the Provenance and Annotation of Data, Chicago, IL. Available: https://link.springer.com/chapter/10.1007/11890850_2 [April 2019].

Freire, J., Koop, D., Santos, E., and Silva, C.T. (2008). Provenance for Computational Tasks: A Survey. Computing in Science & Engineering, 10(3), 11-21.

Freire, J., Fuhr, N., and Rauber, A. (2016). Reproducibility of Data-Oriented Experiments in e-Science (Dagstuhl Seminar 16041). Dagstuhl Reports, 6, 108-159. doi:10.4230/DagRep.6.1.108.

Frewer, L., Hunt, S., Brennan, M., Kuznesof, S., Ness, M., and Ritson, C. (2003). The Views of Scientific Experts on How the Public Conceptualize Uncertainty. Journal of Risk Research, 6(1), 75-85.

Fugh-Berman, A.J. (2010). The Haunting of Medical Journals: How Ghostwriting Sold “HRT”. PLOS Medicine, 7(9), e1000335. doi:10.1371/journal.pmed.1000335.

Funk, C., and Rainie, L. (2015). Public and Scientists’ Views on Science and Society. Pew Research Center, January 29. Available: http://www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society [December 2018].

Funk, C., Gottfried, J., and Mitchell, A. (2017). Science News and Information Today. Pew Research Center, September 20. Available: http://www.journalism.org/2017/09/20/science-news-and-information-today [December 2018].

Gadbury, G.L., and Allison, D.B. (2012). Inappropriate Fiddling with Statistical Analyses to Obtain a Desirable P-Value: Tests to Detect Its Presence in Published Literature. PLOS ONE, 7(10), e46363.

Galtung, J., and Ruge, M.H. (1965). The Structure of Foreign News. Journal of Peace Research, 2(1), 64-91.

Garfield, E. (2006). The History and Meaning of the Journal Impact Factor. Journal of the American Medical Association, 295(1), 90-93.

Gervais, W.M., Jewell, J., Najle, M.B., and Ng, B. (2015). A Powerful Nudge? Presenting Calculable Consequences of Underpowered Research Shifts Incentives toward Adequately Powered Designs. Social Psychological and Personality Science, 6.

Gigerenzer, G., Swijtink, Z., Porter, T., Daston, L., Beatty, J., and Kruger, L. (1989). The Empire of Chance: How Probability Changed Science and Everyday Life. Cambridge, UK: Cambridge University Press.

Gil, Y., Deelman, E., Ellisman, M., Fahringer, T., Fox, G., Gannon, D., Goble, C., Livny, M., Moreau, L., and Myers, J. (2007). Examining the Challenges of Scientific Workflows. Computer, 40(12), 24-32.

Gilbert, D.T., King, G., Pettigrew, S., and Wilson, T.D. (2016). Comment on “Estimating the Reproducibility of Psychological Science.” Science, 351(6277), 1037.

Global Young Academy. (2018, May 31). Young Scientist Perspectives on Replicability and Reproducibility in Science and Engineering. Presented by K. Vermeir, L. Fierce, and A. Coussens on behalf of the GYA Working Groups for Scientific Excellence and Open Science for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine. Available: https://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_192050.pdf [April 2019].

Goldin-Meadow, S. (2016). Why Preregistration Makes Me Nervous. APS Observer, August 31. Available: https://www.psychologicalscience.org/observer/why-preregistration-makesme-nervous/comment-page-1 [April 2019].

Goodchild van Hilten, L. (2015). Why It’s Time to Publish Research “Failures.” Elsevier Connect, May 5. Available: https://www.elsevier.com/connect/scientists-we-want-your-negative-results-too [December 2018].

Goodman, S.N. (1992). A Comment on Replication, p-Values and Evidence. Statistics in Medicine, 11(7), 875-879.

Goodman, S. (2018). Research Reproducibility and Statistics. Presentation to the Committee on Reproducibility and Replicability in Science, February 22, National Academies of Sciences, Engineering, and Medicine, Washington, DC.

Goodman, S.N., Fanelli, D., and Ioannidis, J.P.A. (2016). What Does Research Reproducibility Mean? Science Translational Medicine, 8(341), 341ps12. doi:10.1126/scitranslmed.aaf5027.

Goodman, S., and Greenland, S. (2007). Assessing the Unreliability of the Medical Literature: A Response to “Why Most Published Research Findings Are False.” Working Paper 135. Baltimore MD: Johns Hopkins University, Department of Biostatistics.

Goodstein, D. (2018). On Fact and Fraud: Cautionary Tales from the Front Lines of Science. Princeton, NJ: Princeton University Press.

Gøtzsche, P.C., Hróbjartsson, A., Marić, K., and Tendal, B. (2007). Data Extraction Errors in Meta-Analyses That Use Standardized Mean Differences. Journal of the American Medical Association, 298(4), 430-437.

Grieneisen, M.L., and Zhang, M. (2012). A Comprehensive Survey of Retracted Articles from the Scholarly Literature. PLOS ONE, 7(10), e44118.

Gundersen, O.E., Gil, Y., and Aha, D.W. (2018). On Reproducible AI: Towards Reproducible Research, Open Science, and Digital Scholarship in AI Publications. AI Magazine, 39(3), 56-68.

Gunning, D. (2018). Explainable Artificial Intelligence (XAI). Defense Advanced Research Projects Agency. Available: https://www.darpa.mil/program/explainable-artificial-intelligence [January 2019].

Guo, P.J. (2012). CDE: A Tool for Creating Portable Experimental Software Packages. Computing in Science & Engineering, 14(4), 32-35.

Guo, P.J., and Seltzer, M.I. (2012). BURRITO: Wrapping Your Lab Notebook in Computational Infrastructure. Paper presented at TaPP’12 Proceedings of the 4th USENIX Conference on Theory and Practice of Provenance, Boston, MA. Available: https://www.usenix.org/system/files/conference/tapp12/tapp12-final10.pdf [April 2019].

Hagger, M.S., Chatzisarantis, N.L.D., Alberts, H., Anggono, C.O., Batailler, C., Birt, A.R., Brand, R., Brandt, M.J., Brewer, G., Bruyneel, S., Calvillo, D.P., Campbell, W.K., Cannon, P.R., Carlucci, M., Carruth, N.P., Cheung, T., Crowell, A., De Ridder, D.T.D., Dewitte, S., Elson, M., Evans, J.R., Fay, B.A., Fennis, B.M., Finley, A., Francis, Z., Heise, E., Hoemann, H., Inzlicht, M., Koole, S.L., Koppel, L., Kroese, F., Lange, F., Lau, K., Lynch, B.P., Martijn, C., Merckelbach, H., Mills, N.V., Michirev, A., Miyake, A., Mosser, A.E., Muise, M., Muller, D., Muzi, M., Nalis, D., Nurwanti, R., Otgaar, H., Philipp, M.C., Primoceri, P., Rentzsch, K., Ringos, L., Schlinkert, C., Schmeichel, B.J., Schoch, S.F., Schrama, M., Schütz, A., Stamos, A., Tinghög, G., Ullrich, J., vanDellen, M., Wimbarti, S., Wolff, W., Yusainy, C., Zerhouni, O., and Zwienenberg, M. (2016). A Multilab Preregistered Replication of the Ego-Depletion Effect. Perspectives on Psychological Science, 11(4), 546-573.

Hall, A., and Stouffer, R.J. (2001). An Abrupt Climate Event in a Coupled Ocean–Atmosphere Simulation without External Forcing. Nature, 409(6817), 171-174. doi:10.1038/35051544.

Hallock, R. (2015). Is Solid Helium a Supersolid? Physics Today, 68(5), 30-35.

Hansen, J., Lacis, A., Rind, D., Russell, G., Stone, P., Fung, I., Lerner, J., et al. (1984). Climate Sensitivity: Analysis of Feedback Mechanisms. Climate Processes and Climate Sensitivity, 130-163.

Harris, R.F. (2017). Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions. New York: Basic Books.

Hartl, D.L., and Fairbanks, D.J. (2007). Mud Sticks: On the Alleged Falsification of Mendel’s Data. Genetics, 175(3), 975-979.

Hartling, L., Featherstone, R., Nuspl, M., Shave, K., Dryden, D.M., and Vandermeer, B. (2017). Grey Literature in Systematic Reviews: A Cross-Sectional Study of the Contribution of Non-English Reports, Unpublished Studies and Dissertations to the Results of Meta-Analyses in Child-Relevant Reviews. BMC Medical Research Methodology, 17(1), 64. doi:10.1186/s12874-017-0347-z.

Head, M.L., Holman, L., Lanfear, R., Kahn, A.T., and Jennions, M.D. (2015). The Extent and Consequences of P-Hacking in Science. PLOS Biology, 13(3), e1002106.

Herndon, T., Ash, M., and Pollin, R. (2013). Does High Public Debt Consistently Stifle Economic Growth? A Critique of Reinhart and Rogoff. Political Economy Research Institute Working Paper Series Number 322. Available: https://www.peri.umass.edu/fileadmin/pdf/working_papers/working_papers_301-350/WP322.pdf [April 2019].

Heroux, M.A., Barba, L.A., Parashar, M., Stodden, V., and Taufer, M. (2018). Toward a Compatible Reproducibility Taxonomy for Computational and Computing Sciences. Sandia Report SAND2018-11186. Available: https://cfwebprod.sandia.gov/cfdocs/CompResearch/docs/SAND2018-11186.pdf [December 2018].

Hettrick, S. (2017). A Journey of Reproducibility from Excel to Pandas. Available: https://www.software.ac.uk/blog/2017-09-06-journey-reproducibility-excel-pandas [December 2018].

Hey, A.J.G., Tansley, S., and Tolle, K.M. (Eds.). (2009). The Fourth Paradigm: Data-Intensive Scientific Discovery. Redmond, WA: Microsoft Research. Available: https://www.immagic.com/eLibrary/ARCHIVES/EBOOKS/M091000H.pdf [April 2019].

Hines, W.C., Su, Y., Kuhn, I., Polyak, K., and Bissell, M.J. (2014). Sorting out the FACS: A Devil in the Details. Cell Reports, 6(5), 779-781.

Ho, S.S., Brossard, D., and Scheufele, D.A. (2008). Effects of Value Predispositions, Mass Media Use, and Knowledge on Public Attitudes toward Embryonic Stem Cell Research. International Journal of Public Opinion Research, 20(2), 171-192.

Hollenbeck, J.R., and Wright, P.M. (2016). Harking, Sharking, and Tharking: Making the Case for Post Hoc Analysis of Scientific Data. Journal of Management, 43(1), 5-18.

How Science Goes Wrong. (2013). The Economist. Available: https://www.economist.com/leaders/2013/10/21/how-science-goes-wrong [December 2018].

Howe, B. (2012). Virtual Appliances, Cloud Computing, and Reproducible Research. Computing in Science & Engineering, 14(4), 36-41.

Howell, E. (2018). Public Perceptions of Scientific Uncertainty and Media Reporting of Reproducibility and Replication in Science. Paper prepared for the Committee on Reproducibility and Replicability in Science, National Academies of Sciences, Engineering, and Medicine, Washington, DC.

Hutson, M. (2018). Missing Data Hinder Replication of Artificial Intelligence Studies. Science, February 15. doi:10.1126/science.aat3298.

Hynek, S., Fuller, W., and Bentley, J. (1997). Hydrogen Storage by Carbon Sorption. International Journal of Hydrogen Energy, 22(6), 601-610.

Ingraham, P. (2016). Ioannidis: Making Medical Science Look Bad Since 2005. PainScience.com, September 15. Available: https://www.painscience.com/articles/ioannidis.php [January 2019].

Institute of Medicine. (2011). Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: The National Academies Press.

Institute of Medicine. (2012). Evolution of Translational Omics: Lessons Learned and the Path Forward. Washington, DC: The National Academies Press. doi:10.17226/13297.

Ioannidis, J.P. (2005). Why Most Published Research Findings Are False. PLOS Medicine, 2(8), e124.

Ioannidis, J.P. (2009). Population-Wide Generalizability of Genome-Wide Discovered Associations. Journal of the National Cancer Institute, 101(19), 1297-1299. doi:10.1093/jnci/djp298.

Ioannidis, J.P. (2012). Why Science Is Not Necessarily Self-Correcting. Perspectives on Psychological Science, 7(6), 645-654.

Ioannidis, J.P., and Trikalinos, T.A. (2007). An Exploratory Test for an Excess of Significant Findings. Clinical Trials, 4(3), 245-253.

Ioannidis, J.P., Munafo, M.R., Fusar-Poli, P., Nosek, B.A., and David, S.P. (2014). Publication and Other Reporting Biases in Cognitive Sciences: Detection, Prevalence, and Prevention. Trends in Cognitive Sciences, 18(5), 235-241.

Ioannidis, J.P., Fanelli, D., Dunne, D.D., and Goodman, S.N. (2015). Meta-Research: Evaluation and Improvement of Research Methods and Practices. PLOS Biology, 13(10), e1002264.

Ip, S., Chung, M., Raman, G., Chew, P., Magula, N., DeVine, D., Trikalinos, T., and Lau, J. (2007). Breastfeeding and Maternal and Infant Health Outcomes in Developed Countries. Evidence Report/Technology Assessment, 153, 1-186. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4781366 [April 2019].

Iqbal, S.A., Wallach, J.D., Khoury, M.J., Schully, S.D., and Ioannidis, J.P.A. (2016). Reproducible Research Practices and Transparency across the Biomedical Literature. PLOS Biology, 14(1), e1002333.

Jachimowicz, J., Duncan, S., Weber, E.U., and Johnson, E.J. (2018). When and Why Defaults Influence Decisions: A Meta-Analysis of Default Effects. SSRN. doi:10.2139/ssrn.2727301.

Jacoby, W.G. (2017, December 12-13). Perspectives on Reproducibility and Replication: Scientific Societies: Behavioral and Social Sciences. Presented for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine.

Jamieson, K.H. (2018). Crisis or Self-Correction: Rethinking Media Narratives About the Well-Being of Science. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2620-2627.

John, L.K., Loewenstein, G., and Prelec, D. (2012). Measuring the Prevalence of Questionable Research Practices with Incentives for Truth Telling. Psychological Science, 23(5), 524-532.

Joint Committee for Guides in Metrology. (2012). The International Vocabulary of Metrology—Basic and General Concepts and Associated Terms (VIM) (3rd ed.). 200:2012. Available: https://www.bipm.org/utils/common/documents/jcgm/JCGM_200_2012.pdf [April 2019].

Kahan, D., and Landrum, A.R. (2017). A Tale of Two Vaccines—and Their Science Communication Environments. In K.H. Jamieson, D.M. Kahan, and D.A. Scheufele (Eds.), The Oxford Handbook of the Science of Science Communication (pp. 165-172). New York: Oxford University Press.

Kaplan, R.M., and Irvin, V.L. (2015). Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time. PLOS ONE, 10(8), e0132382. doi:10.1371/journal.pone.0132382.

Kass, R.E., and Raftery, A.E. (1995). Bayes Factors. Journal of the American Statistical Association, 90(430), 773-795.

Kidwell, M.C., Lazarević, L.B., Baranski, E., Hardwicke, T.E., Piechowski, S., Falkenberg, L.-S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C., Errington, T.M., Fiedler, S., and Nosek, B.A. (2016). Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency. PLOS Biology, 14(5), e1002456.

King, G. (1995). Replication, Replication. PS: Political Science and Politics, 28(3), 444-452. doi:10.2307/420301.

Kitzes, J., Turek, D., and Deniz, F. (2017). The Practice of Reproducible Research: Case Studies and Lessons from the Data-Intensive Sciences. Berkeley: University of California Press.

Klein, R.A., Ratliff, K.A., Vianello, M., Adams, R.B., Bahník, Š., Bernstein, M.J., Bocian, K., Brandt, M.J., Brooks, B., Brumbaugh, C.C., Cemalcilar, Z., Chandler, J., Cheong, W., Davis, W.E., Devos, T., Eisner, M., Frankowska, N., Furrow, D., Galliani, E.M., Hasselman, F., Hicks, J.A., Hovermale, J.F., Hunt, S.J., Huntsinger, J.R., Ijzerman, H., John, M.-S., Joy-Gaba, J.A., Barry Kappes, H., Krueger, L.E., Kurtz, J., Levitan, C.A., Mallett, R.K., Morris, W.L., Nelson, A.J., Nier, J.A., Packard, G., Pilati, R., Rutchick, A.M., Schmidt, K., Skorinko, J.L., Smith, R., Steiner, T.G., Storbeck, J., Van Swol, L.M., Thompson, D., van ’t Veer, A.E., Ann Vaughn, L., Vranka, M., Wichman, A.L., Woodzicka, J.A., and Nosek, B.A. (2014). Investigating Variation in Replicability: A “Many Labs” Replication Project. Social Psychology, 45(3), 142-152.

Klein, R.A., Vianello, M., and Nosek, B.A. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443-490. doi:10.1177/2515245918810225.

Kluyver, T., Ragan-Kelley, B., Perez, F., Granger, B., Bussonnier, M., Frederic, J., Kelley, K., Hamrick, J., Grout, J., Corlay, S., Ivanov, P., Avila, D., Abdalla, S., Willing, C., and Jupyter Development Team. (2016). Jupyter Notebooks—A Publishing Format for Reproducible Computational Workflows. In F. Loizides and B. Schmidt (Eds.), Positioning and Power in Academic Publishing: Players, Agents and Agendas (pp. 87-90). Fairfax, VA: IOS Press. doi:10.3233/978-1-61499-649-1-87.

Knight, W. (2017). The Dark Secret at the Heart of AI. MIT Technology Review, April 11. Available: https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai [January 2019].

Koerth-Baker, M. (2019). Forget the Black Hole Picture—Check Out the Sweet Technology that Made It Possible. FiveThirtyEight, April 11. Available: https://fivethirtyeight.com/features/forget-the-black-hole-picture-check-out-the-sweet-technology-that-made-itpossible [April 2019].

Konkol, M., Kray, C., and Pfeiffer, M. (2019). Computational Reproducibility in Geoscientific Papers: Insights from a Series of Studies with Geoscientists and a Reproduction Study. International Journal of Geographical Information Science, 33(2), 408-429.

Kotlikoff, M.I. (2018). Statement of Cornell University Provost Michael I. Kotlikoff. Available: http://statements.cornell.edu/2018/20180920-statement-provost-michael-kotlikoff.cfm [April 2019].

Kühberger, A., Fritz, A., and Scherndl, T. (2014). Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size. PLOS ONE, 9(9), e105825.

Kupferschmidt, K. (2018). More and More Scientists Are Preregistering Their Studies. Should You? Science, September 21. Available: https://www.sciencemag.org/news/2018/09/more-and-more-scientists-are-preregistering-their-studies-should-you [April 2019].

Lau, J., Ioannidis, J.P.A., Terrin, N., Schmid, C.H., and Olkin, I. (2006). The Case of the Misleading Funnel Plot. BMJ, 333(7568), 597-600.

Lazzeroni, L.C., Lu, Y., and Belitskaya-Lévy, I. (2016). Solutions for Quantifying P-Value Uncertainty and Replication Power. Nature Methods, 13(2), 107-108.

Le, L., Lee, E.H., Hardy, D.J., Truong, T.N., and Schulten, K. (2010). Molecular Dynamics Simulations Suggest That Electrostatic Funnel Directs Binding of Tamiflu to Influenza N1 Neuraminidases. PLOS Computational Biology, 6(9), e1000939.

LeBel, E.P., Campbell, L., and Loving, T.J. (2017). Benefits of Open and High-Powered Research Outweigh Costs. Journal of Personality and Social Psychology, 113(2), 230-243.

Leeflang, M.M.G., Bossuyt, P.M.M., and Irwig, L. (2009). Diagnostic Test Accuracy May Vary with Prevalence: Implications for Evidence-Based Diagnosis. Journal of Clinical Epidemiology, 62(1), 5-12.

Lepper, M.R., and Henderlong, J. (2000). Turning “Play” into “Work” and “Work” into “Play”: 25 Years of Research on Intrinsic Versus Extrinsic Motivation. In C. Sansone and J.M. Harackiewicz (Eds.), Intrinsic and Extrinsic Motivation (Ch. 10, pp. 257-307). San Diego, CA: Academic Press.

Levelt Committee, Noort Committee, and Drenth Committee. (2012). Flawed Science: The Fraudulent Research Practices of Social Psychologist Diederik Stapel. Available: https://poolux.psychopool.tu-dresden.de/mdcfiles/gwp/Reale%20F%C3%A4lle/Stapel%20-%20Final%20Report.pdf [July 2019].

Lijmer, J.G., Bossuyt, P.M., and Heisterkamp, S.H. (2002). Exploring Sources of Heterogeneity in Systematic Reviews of Diagnostic Tests. Statistics in Medicine, 21(11), 1525-1537.

Lin, W., and Green, D.P. (2016). Standard Operating Procedures: A Safety Net for Pre-Analysis Plans. PS: Political Science & Politics, 49(3), 495-500. doi:10.1017/S1049096516000810.

Lin, X. (2018). Reproducibility and Replicability in Large Scale Genetic Studies. Paper prepared for the Committee on Reproducibility and Replicability in Science, National Academies of Sciences, Engineering, and Medicine, Washington, DC.

Loftus, E.F., Dysart, J.E., and Newirth, K.A. (2017). Eyewitness Testimony: Civil and Criminal (5th ed.). Charlottesville, VA: Lexis Law.

Ludäscher, B., Altintas, I., Berkley, C., Higgins, D., Jaeger, E., Jones, M., Lee, E.A., Tao, J., and Zhao, Y. (2006). Scientific Workflow Management and the Kepler System. Concurrency and Computation: Practice and Experience—Workflow in Grid Systems, 18(10), 1039-1065.

Lupia, A. (2017). Now Is the Time: How to Increase the Value of Social Science. Social Research: An International Quarterly, 84, 689-715.

Lupia, A., and Elman, C. (2014). Openness in Political Science: Data Access and Research Transparency: Introduction. PS: Political Science & Politics, 47(1), 19-42.

Luttrell, A., Petty, R.E., and Xu, M. (2017). Replicating and Fixing Failed Replications: The Case of Need for Cognition and Argument Quality. Journal of Experimental Social Psychology, 69, 178-183. doi:10.1016/j.jesp.2016.09.006.

MacInnis, B., and Krosnick, J.A. (2016). Trust in Scientists’ Statements About the Environment and American Public Opinion on Global Warming. In J.A. Krosnick, I.-C.A. Chiang, and T.H. Stark (Eds.), Political Psychology: New Explorations (Ch. 13, pp. 487-526). New York: Psychology Press.

Malle, B.F. (2006). The Actor-Observer Asymmetry in Attribution: A (Surprising) Meta-Analysis. Psychological Bulletin, 132(6), 895-919.

Marsden, A.L. (2015). Cardiovascular Blood Flow Simulation: From Computation to Clinic. SIAM News, December 1. Available: https://sinews.siam.org/Details-Page/cardiovascular-blood-flow-simulation [December 2018].

Marshall, D. (2018). An Overview of the California Earthquake Authority. Risk Management and Insurance Review, 21(1), 73-116.

Marwick, B. (2017). Computational Reproducibility in Archaeological Research: Basic Principles and a Case Study of Their Implementation. Journal of Archaeological Method and Theory, 24(2), 424-450.

Maxwell, J.C. (1954). Treatise on Electricity and Magnetism. Oxford, UK: Clarendon Press.

Maxwell, S.E., Lau, M.Y., and Howard, G.S. (2015). Is Psychology Suffering from a Replication Crisis? What Does “Failure to Replicate” Really Mean? The American Psychologist, 70(6), 487-498.

McCook, A. (2018). One Publisher, More Than 7000 Retractions. Science, 362(6413), 393.

McCullough, B.D., and Vinod, H.D. (2003). Verifying the Solution from a Nonlinear Solver: A Case Study. American Economic Review, 93(3), 873-892. doi:10.1257/000282803322157133.

Merton, R.K. (1973). The Normative Structure of Science. In The Sociology of Science: Theoretical and Empirical Investigations (Ch. 13, pp. 267-278). Chicago, IL: University of Chicago Press.

Mesnard, O., and Barba, L.A. (2017). Reproducible and Replicable Computational Fluid Dynamics: It’s Harder Than You Think. Computing in Science & Engineering, 19(4), 44-55.

Miller, J. (2018). Metrology. In R. Leach and S.T. Smith (Eds.), Basics of Precision Engineering (Ch. 2, pp. 25-50). Boca Raton, FL: CRC Press.

Mischel, W. (1961). Father-Absence and Delay of Gratification. The Journal of Abnormal and Social Psychology, 63(1), 116-124.

Mitchell, A., Funk, C., and Gottfried, J. (2017). Most Americans See Science-Related Entertainment Shows and Movies in Either a Neutral or Positive Light. Pew Research Center, September 20. Available: http://www.journalism.org/2017/09/20/most-americans-see-science-related-entertainment-shows-and-movies-in-either-a-neutral-or-positive-light [January 2019].

Mohr, P.J., Newell, D.B., and Taylor, B.N. (2016). CODATA Recommended Values of the Fundamental Physical Constants: 2014. Reviews of Modern Physics, 88, 035009.

Moraila, G., Shankaran, A., Shi, Z., and Warren, A.M. (2013). Measuring Reproducibility in Computer Systems Research. Available: https://www.researchgate.net/publication/267448249_Measuring_Reproducibility_in_Computer_Systems_Research [April 2019].

Moreau, L., Clifford, B., Freire, J., Futrelle, J., Gil, Y., Groth, P., Kwasnikowska, N., Miles, S., Missier, P., Myers, J., Plale, B., Simmhan, Y., Stephan, E., and Van den Bussche, J. (2011). The Open Provenance Model Core Specification (V1.1). Future Generation Computer Systems, 27(6), 743-756.

Moreau, L., Freire, J., Futrelle, J., McGrath, R., Myers, J., and Paulson, P. (2007). The Open Provenance Model (V1.00). Available: https://eprints.soton.ac.uk/264979/1/opm.pdf [December 2018].

Morey, R. (2019). You Must Tug That Thread: Why Treating Preregistration as a Gold Standard Might Incentivize Poor Behavior. Psychonomic Society, January 16. Available: https://featuredcontent.psychonomic.org/you-must-tug-that-thread-why-treating-preregistration-as-a-gold-standard-might-incentivize-poor-behavior [April 2019].

Munafò, M.R., Nosek, B.A., Bishop, D.V.M., Button, K.S., Chambers, C.D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J.J., and Ioannidis, J.P.A. (2017). A Manifesto for Reproducible Science. Nature Human Behaviour, 1, 0021. Available: https://www.nature.com/articles/s41562-016-0021 [April 2019].

Mutz, D.C., and Reeves, B. (2005). The New Videomalaise: Effects of Televised Incivility on Political Trust. The American Political Science Review, 99(1), 1-15.

Nakagawa, S., and Parker, T.H. (2015). Replicating Research in Ecology and Evolution: Feasibility, Incentives, and the Cost-Benefit Conundrum. BMC Biology, 13(88), 1-6.

Nangia, U., and Katz, D.S. (2017, October 24-27). Understanding Software in Research: Initial Results from Examining Nature and a Call for Collaboration. Paper presented at the 2017 IEEE 13th International Conference on e-Science (e-Science), Auckland, New Zealand. doi:10.1109/eScience.2017.78.

Nangia, U., and Katz, D.S. (2017). Track 1 Paper: Surveying the U.S. National Postdoctoral Association Regarding Software Use and Training in Research. Available: http://danielskatz.org/papers/postdocsurveyfull_WSSSPE5.1.pdf [April 2019].

National Academies of Sciences, Engineering, and Medicine. (2016a). Genetically Engineered Crops: Experiences and Prospects. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. (2016b). Science Literacy: Concepts, Contexts, and Consequences. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. (2016c). Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results: Summary of a Workshop. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. (2017). Fostering Integrity in Research. Washington, DC: The National Academies Press.

National Academies of Sciences, Engineering, and Medicine. (2018). Open Science by Design: Realizing a Vision for 21st Century Research. Washington, DC: The National Academies Press.

National Academy of Sciences and Committee on the Conduct of Science. (1989). On Being a Scientist. Washington, DC: National Academy Press.

National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. (2009). On Being a Scientist: A Guide to Responsible Conduct in Research (3rd ed.). Washington, DC: The National Academies Press.

National Institutes of Health. (2015). Implementing Rigor and Transparency in NIH & AHRQ Research Grant Applications. NOT-OD-16-011. Available: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-16-011.html [December 2018].

National Institutes of Health. (2018a). Enhancing Reproducibility through Rigor and Transparency. Available: https://grants.nih.gov/policy/reproducibility/index.htm [December 2018].

National Institutes of Health. (2018b). NIH Enhancing Reproducibility Guidelines: What You Need to Know. Available: https://grants.nih.gov/reproducibility/documents/grantguideline.pdf [December 2018].

National Institutes of Health. (2018c). NIH Sharing Policies and Related Guidance on NIH-Funded Research Resources. Available: https://grants.nih.gov/policy/sharing.htm [December 2018].

National Institutes of Health. (2018d). Rigor and Reproducibility. Available: https://www.nih.gov/research-training/rigor-reproducibility [December 2018].

National Institutes of Health. (2018e). Rigor and Reproducibility in NIH Applications: Resource Chart. Available: https://grants.nih.gov/grants/RigorandReproducibilityChart508.pdf [December 2018].

National Library of Medicine. (2018). Key MEDLINE Indicators. Available: https://www.nlm.nih.gov/bsd/bsd_key.html [January 2019].

National Science Foundation. (2016a). Dear Colleague Letter: Encouraging Reproducibility in Computing and Communications Research. NSF 17-022. Available: https://www.nsf.gov/pubs/2017/nsf17022/nsf17022.jsp [December 2018].

National Science Foundation. (2016b). Dear Colleague Letter: Reproducibility and Robustness of Results. NSF 16-083. Available: https://www.nsf.gov/pubs/2016/nsf16083/nsf16083.jsp [December 2018].

National Science Foundation. (2018a). Dear Colleague Letter: Achieving New Insights through Replicability and Reproducibility. NSF 18-053. Available: https://www.nsf.gov/pubs/2018/nsf18053/nsf18053.jsp [December 2018].

National Science Foundation. (2018b). Dissemination and Sharing of Research Results. Available: https://www.nsf.gov/bfa/dias/policy/dmp.jsp [December 2018].

National Science Foundation. (2018c). Harnessing the Data Revolution (HDR): Institutes for Data-Intensive Research in Science and Engineering-Ideas Labs (I-DIRSE-IL). NSF 19-543. Available: https://www.nsf.gov/pubs/2019/nsf19543/nsf19543.htm?WT.mc_id=USNSF_179 [December 2018].

National Science Foundation. (2018d). Reports & Publications. Available: https://www.nsf.gov/oig/reports [December 2018].

National Science Foundation. (2018e). Science & Engineering Indicators 2018. Available: https://www.nsf.gov/statistics/2018/nsb20181 [December 2018].

Nauenberg, M. (2015). Solution to the Long-Standing Puzzle of Huygens “Anomalous Suspension.” Archive for History of Exact Sciences, 69(3), 327-341.

Nelson, L.D., Simmons, J., and Simonsohn, U. (2018). Psychology’s Renaissance. Annual Review of Psychology, 69(1), 511-534.

Netherlands Organisation for Scientific Research. (2016). NWO Makes 3 Million Available for Replication Studies Pilot. Available: https://www.nwo.nl/en/news-and-events/news/2016/nwo-makes-3-million-available-for-replication-studies-pilot.html [June 2019].

Nieuwland, M. (2018). Nature Says It Wants to Publish Replication Attempts. So What Happened When a Group of Authors Submitted One to Nature Neuroscience? Available: https://retractionwatch.com/2018/05/08/nature-says-it-wants-to-publish-replication-attempts-so-what-happened-when-a-group-of-authors-submitted-one-to-nature-neuroscience [December 2018].

Nisbet, M.C., Brossard, D., and Kroepsch, A. (2003). Framing Science: The Stem Cell Controversy in an Age of Press/Politics. Harvard International Journal of Press/Politics, 8(2), 36-70.

Noah, T., Schul, Y., and Mayo, R. (2018). When Both the Original Study and Its Failed Replication Are Correct: Feeling Observed Eliminates the Facial-Feedback Effect. Journal of Personality and Social Psychology, 114(5), 657-664.

Normand, S.-L.T. (1999). Meta-Analysis: Formulating, Evaluating, Combining, and Reporting. Statistics in Medicine, 18(3), 321-359.

Nosek, B.A. (2016). Let’s Not Mischaracterize Replication Studies: Authors. Available: https://retractionwatch.com/2016/03/07/lets-not-mischaracterize-replication-studies-authors [December 2018].

Nosek, B.A., and Errington, T.M. (2017). Making Sense of Replications. eLife, 6, e23383. doi:10.7554/eLife.23383.

Nosek, B.A., Spies, J.R., and Motyl, M. (2012). Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth over Publishability. Perspectives on Psychological Science, 7(6), 615-631. doi:10.1177/1745691612459058.

Nosek, B.A., Alter, G., Banks, G.C., Borsboom, D., Bowman, S.D., Breckler, S.J., Buck, S., Chambers, C.D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D.P., Hesse, B., Humphreys, M., Ishiyama, J., Karlan, D., Kraut, A., Lupia, A., Mabry, P., Madon, T., Malhotra, N., Mayo-Wilson, E., McNutt, M., Miguel, E., Paluck, E.L., Simonsohn, U., Soderberg, C., Spellman, B.A., Turitto, J., VandenBos, G., Vazire, S., Wagenmakers, E.J., Wilson, R., and Yarkoni, T. (2015). Promoting an Open Research Culture. Science, 348(6242), 1422-1425.

Nosek, B.A., Ebersole, C.R., DeHaven, A.C., and Mellor, D.T. (2018). The Preregistration Revolution. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2600-2606. doi:10.1073/pnas.1708274114.

Nuijten, M.B., Hartgerink, C.H., van Assen, M.A., Epskamp, S., and Wicherts, J.M. (2016). The Prevalence of Statistical Reporting Errors in Psychology (1985-2013). Behavior Research Methods, 48(4), 1205-1226.

O’Collins, V.E., Macleod, M.R., Donnan, G.A., Horky, L.L., van der Worp, B.H., and Howells, D.W. (2006). 1,026 Experimental Treatments in Acute Stroke. Annals of Neurology, 59(3), 467-477.

O’Donnell, M., Nelson, L.D., Ackermann, E., Aczel, B., Akhtar, A., Aldrovandi, S., Alshaif, N., Andringa, R., Aveyard, M., Babincak, P., Balatekin, N., Baldwin, S.A., Banik, G., Baskin, E., Bell, R., Białobrzeska, O., Birt, A.R., Boot, W.R., Braithwaite, S.R., Briggs, J.C., Buchner, A., Budd, D., Budzik, K., Bullens, L., Bulley, R.L., Cannon, P.R., Cantarero, K., Cesario, J., Chambers, S., Chartier, C.R., Chekroun, P., Chong, C., Cleeremans, A., Coary, S.P., Coulthard, J., Cramwinckel, F.M., Denson, T.F., Díaz-Lago, M., DiDonato, T.E., Drummond, A., Eberlen, J., Ebersbach, T., Edlund, J.E., Finnigan, K.M., Fisher, J., Frankowska, N., García-Sánchez, E., Golom, F.D., Graves, A.J., Greenberg, K., Hanioti, M., Hansen, H.A., Harder, J.A., Harrell, E.R., Hartanto, A., Inzlicht, M., Johnson, D.J., Karpinski, A., Keller, V.N., Klein, O., Koppel, L., Krahmer, E., Lantian, A., Larson, M.J., Légal, J.-B., Lucas, R.E., Lynott, D., Magaldino, C.M., Massar, K., McBee, M.T., McLatchie, N., Melia, N., Mensink, M.C., Mieth, L., Moore-Berg, S., Neeser, G., Newell, B.R., Noordewier, M.K., Özdoğru, A.A., Pantazi, M., Parzuchowski, M., Peters, K., Philipp, M.C., Pollmann, M.M.H., Rentzelas, P., Rodríguez-Bailón, R., Röer, J.P., Ropovik, I., Roque, N.A., Rueda, C., Rutjens, B.T., Sackett, K., Salamon, J., Sánchez-Rodríguez, Á., Saunders, B., Schaafsma, J., Schulte-Mecklenbeck, M., Shanks, D.R., Sherman, M.F., Steele, K.M., Steffens, N.K., Sun, J., Susa, K.J., Szaszi, B., Szollosi, A., Tamayo, R.M., Tinghög, G., Tong, Y.-Y., Tweten, C., Vadillo, M.A., Valcarcel, D., Van der Linden, N., van Elk, M., van Harreveld, F., Västfjäll, D., Vazire, S., Verduyn, P., Williams, M.N., Willis, G.B., Wood, S.E., Yang, C., Zerhouni, O., Zheng, R., and Zrubka, M. (2018). Registered Replication Report: Dijksterhuis and Van Knippenberg (1998). Perspectives on Psychological Science, 13(2), 268-294.

Office of Science and Technology Policy. (2000). Federal Policy on Research Misconduct. Federal Register, 65, 76260-76264. Available: https://ori.hhs.gov/federal-research-misconduct-policy [April 2019].

Oinn, T., Addis, M., Ferris, J., Marvin, D., Senger, M., Greenwood, M., Carver, T., Glover, K., Pocock, M.R., Wipat, A., and Li, P. (2004). Taverna: A Tool for the Composition and Enactment of Bioinformatics Workflows. Bioinformatics, 20(17), 3045-3054.

Open Science Collaboration. (2015). Estimating the Reproducibility of Psychological Science. Science, 349(6251), aac4716. doi:10.1126/science.aac4716.

Paluck, E.L. (2018, December 10). The State of Social Science. Keynote address presented at the Berkeley Initiative for Transparency in the Social Sciences Annual Meeting, Berkeley, CA. Available: https://cega.berkeley.edu/resource/the-state-of-social-science-betsy-levy-paluck-bitss-annual-meeting-2018 [June 2019].

Park, J., Howe, J.D., and Sholl, D.S. (2017). How Reproducible Are Isotherm Measurements in Metal–Organic Frameworks? Chemistry of Materials, 29(24), 10487-10495. doi:10.1021/acs.chemmater.7b04287.

Pashler, H., and Wagenmakers, E.J. (2012). Editors’ Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspectives on Psychological Science, 7(6), 528-530.

Patil, P., and Parmigiani, G. (2018). Training Replicable Predictors in Multiple Studies. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2578-2583.

Patil, P., Peng, R.D., and Leek, J.T. (2016). What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science. Perspectives on Psychological Science, 11(4), 539-544.

Peng, R.D. (2011). Reproducible Research in Computational Science. Science, 334(6060), 1226-1227.

Peng, R.D. (2016). A Simple Explanation for the Replication Crisis in Science. Simply Statistics, August 24. Available: https://simplystatistics.org/2016/08/24/replication-crisis [January 2019].

Peng, R.D., Dominici, F., and Zeger, S.L. (2006). Reproducible Epidemiologic Research. American Journal of Epidemiology, 163(9), 783-789.

Perez-Riverol, Y., Gatto, L., Wang, R., Sachsenberg, T., Uszkoreit, J., Leprevost, F.D.V., Fufezan, C., Ternent, T., Eglen, S.J., Katz, D.S., Pollard, T.J., Konovalov, A., Flight, R.M., Blin, K., and Vizcaíno, J.A. (2016). Ten Simple Rules for Taking Advantage of Git and GitHub. PLOS Computational Biology, 12(7), e1004947.

Perkel, J. (2017). Techblog: My Digital Toolbox: Lorena Barba. Naturejobs, April 17. Available: http://blogs.nature.com/naturejobs/2017/04/17/techblog-my-digital-toolbox-lorena-barba [December 2018].

Perkel, J. (2018a). Techblog: Git: The Reproducibility Tool Scientists Love to Hate. Naturejobs, June 11. Available: http://blogs.nature.com/naturejobs/2018/06/11/git-the-reproducibility-tool-scientists-love-to-hate [December 2018].

Perkel, J. (2018b). Why Jupyter Is Data Scientists’ Computational Notebook of Choice. Nature, 563, 145-146. doi:10.1038/d41586-018-07196-1.

Perrin, S. (2014). Preclinical Research: Make Mouse Studies Work. Nature, 507(7493), 423-425. doi:10.1038/507423a.

Peters, E., Hibbard, J., Slovic, P., and Dieckmann, N. (2007). Numeracy Skill and the Communication, Comprehension, and Use of Risk-Benefit Information. Health Affairs, 26(3), 741-748.

Peters, H.P., Brossard, D., de Cheveigné, S., Dunwoody, S., Kallfass, M., Miller, S., and Tsuchida, S. (2008). Science Communication: Interactions with the Mass Media. Science, 321(5886), 204-205.

Peto, R. (2011). Current Misconception 3: That Subgroup-Specific Trial Mortality Results Often Provide a Good Basis for Individualising Patient Care. British Journal of Cancer, 104(7), 1057-1058.

Peto, R., and Early Breast Cancer Trialists’ Collaborative Group. (1988). Effects of Adjuvant Tamoxifen and of Cytotoxic Therapy on Mortality in Early Breast Cancer. New England Journal of Medicine, 319, 1681-1692.

Pew Research Center. (2018). News Use Across Social Media Platforms 2018. Available: https://www.journalism.org/2018/09/10/news-use-across-social-media-platforms-2018/ [August 2019].

Phillips, P., Lithgow, G.J., and Driscoll, M. (2017). A Long Journey to Reproducible Results. Nature, 548(7668), 387-388.

Plant, A., and Hanisch, R. (2018). Reproducibility and Replicability in Science, a Metrology Perspective. Paper prepared for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine.

Plant, A.L. (2018). Reproducibility in the Physical Sciences. Presentation to the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine.

PLOS Collections. (2018). The Missing Pieces: A Collection of Negative, Null and Inconclusive Results. Available: https://collections.plos.org/missing-pieces [December 2018].

PLOS ONE. (2018). Data Availability. Available: https://journals.plos.org/plosone/s/data-availability [January 2019].

Popper, K. (2005). The Logic of Scientific Discovery. London, UK: Routledge.

Possolo, A.M. (2015). Simple Guide for Evaluating and Expressing the Uncertainty of NIST Measurement Results. NIST Technical Note 1900. Available: https://www.nist.gov/publications/simple-guide-evaluating-and-expressing-uncertainty-nist-measurementresults [January 2019].

Possolo, A.M., and Iyer, H.K. (2017). Invited Article: Concepts and Tools for the Evaluation of Measurement Uncertainty. Review of Scientific Instruments, 88, 011301. doi:10.1063/1.4974274.

Prinz, F., Schlange, T., and Asadullah, K. (2011). Believe It or Not: How Much Can We Rely on Published Data on Potential Drug Targets? Nature Reviews Drug Discovery, 10(9), 712. doi:10.1038/nrd3439-c1.

Pigliucci, M. (2010). Nonsense on Stilts: How to Tell Science from Bunk. Chicago, IL: University of Chicago Press.

Ragan-Kelley, B., Walters, W.A., McDonald, D., Riley, J., Granger, B.E., Gonzalez, A., Knight, R., Perez, F., and Caporaso, J.G. (2013). Collaborative Cloud-Enabled Tools Allow Rapid, Reproducible Biological Insights. The ISME Journal, 7(3), 461-464. doi:10.1038/ismej.2012.123.

Rampin, R., Chirigati, F., Shasha, D., Freire, J., and Steeves, V. (2016). ReproZip: The Reproducibility Packer. The Journal of Open Source Software, 1(8).

Rampin, R., Chirigati, F., Steeves, V., and Freire, J. (2018). ReproServer: Making Reproducibility Easier and Less Intensive. arXiv:1808.01406. Available: https://arxiv.org/abs/1808.01406 [April 2019].

Read, K.B., Sheehan, J.R., Huerta, M.F., Knecht, L.S., Mork, J.G., Humphreys, B.L., and NIH Big Data Annotator Group. (2015). Sizing the Problem of Improving Discovery and Access to NIH-Funded Data: A Preliminary Study. PLOS ONE, 10(7), e0132735. doi:10.1371/journal.pone.0132735.

Reichenbach, H. (1938). Experience and Prediction: An Analysis of the Foundations and the Structure of Knowledge. Chicago, IL: University of Chicago Press.

Reinhart, C.M., and Rogoff, K.S. (2010). Growth in a Time of Debt. NBER Working Paper Number 15639. Cambridge, MA: National Bureau of Economic Research. doi:10.3386/w15639.

Reinhart, C.M., and Rogoff, K.S. (2013). Response to Herndon, Ash and Pollin. Paper contributed to News Documents, The New York Times. Available: https://archive.nytimes.com/www.nytimes.com/interactive/2013/04/17/business/17economix-response.html [June 2019].

Richter, S.H., Garner, J.P., and Würbel, H. (2009). Environmental Standardization: Cure or Cause of Poor Reproducibility in Animal Experiments? Nature Methods, 6(4), 257-261. doi:10.1038/nmeth.1312.

Rind, D., Schmidt, G.A., Jonas, J., Miller, R., Nazarenko, L., Kelley, M., and Romanski, J. (2018). Multicentury Instability of the Atlantic Meridional Circulation in Rapid Warming Simulations with GISS ModelE2. Journal of Geophysical Research: Atmospheres, 123(12), 6331-6355.

Rosenthal, R. (1979). The File Drawer Problem and Tolerance for Null Results. Psychological Bulletin, 86(3), 638-641.

Rothstein, H.R. (2006). Use of Unpublished Data in Systematic Reviews in the Psychological Bulletin 1995-2005. Unpublished manuscript.

Rous, B. (2018). The ACM Task Force on Data, Software, and Reproducibility in Publication. Available: https://www.acm.org/publications/task-force-on-data-software-and-reproducibility [January 2019].

Rowhani-Farid, A., Allen, M., and Barnett, A.G. (2017). What Incentives Increase Data Sharing in Health and Medical Research? A Systematic Review. Research Integrity and Peer Review, 2(1), 4. doi:10.1186/s41073-017-0028-9.

Royal Netherlands Academy of Arts and Sciences. (2018). Replication Studies—Improving Reproducibility in the Empirical Sciences. Amsterdam: Author. Available: https://www.knaw.nl/shared/resources/actueel/publicaties/pdf/20180115-replication-studies-web [January 2019].

Rutherford, F.J., and Ahlgren, A. (1991). Science for All Americans. Project 2061. American Association for the Advancement of Science. New York: Oxford University Press. Available: http://www.project2061.org/publications/sfaa/online/chap1.htm [January 2019].

Rutjes, A.W., Reitsma, J.B., Di Nisio, M., Smidt, N., van Rijn, J.C., and Bossuyt, P.M. (2006). Evidence of Bias and Variation in Diagnostic Accuracy Studies. CMAJ: Canadian Medical Association Journal, 174(4), 469-476.

Scheufele, D.A. (2013). Communicating Science in Social Settings. Proceedings of the National Academy of Sciences of the United States of America, 110(Suppl. 3), 14040-14047.

Scheufele, D.A. (2014). Science Communication as Political Communication. Proceedings of the National Academy of Sciences of the United States of America, 111(Suppl. 4), 13585-13592.

Scheufele, D.A., and Krause, N.M. (2019). Science Audiences, Misinformation, and Fake News. Proceedings of the National Academy of Sciences of the United States of America, 116(16), 7662-7669. doi:10.1073/pnas.1805871115.

Scheufele, D.A., Corley, E.A., Dunwoody, S., Shih, T.-J., Hillback, E., and Guston, D.H. (2007). Scientists Worry About Some Risks More Than the Public. Nature Nanotechnology, 2(12), 732-734. doi:10.1038/nnano.2007.392.

Scheufele, D.A., Brossard, D., Dunwoody, S., Corley, E.A., Guston, D.H., and Peters, H.P. (2009). Are Scientists Really Out of Touch? Available: https://www.the-scientist.com/daily-news/are-scientists-really-out-of-touch-43968 [December 2018].

Schimmack, U. (2012). The Ironic Effect of Significant Results on the Credibility of Multiple-Study Articles. Psychological Methods, 17(4), 551-566.

Schimmack, U. (2014). R-Index and the Test of Insufficient Variance, TIVA. Available: https://replicationindex.org [June 2019].

Schönbrodt, F.D. (2018). P-Checker: One-for-All P-Value Analyzer. Available: http://shinyapps.org/apps/p-checker [January 2019].

Sedlmeier, P., and Gigerenzer, G. (1989). Do Studies of Statistical Power Have an Effect on the Power of Studies? Psychological Bulletin, 105(2), 309.

Setti, G. (2018, February 22-23). Reproducibility Issues in Engineering: Experiences within the IEEE. Paper prepared for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine.

Shen, H. (2014). Interactive Notebooks: Sharing the Code. Nature, 515(7525), 151-152.

Shiffrin, R.M., Börner, K., and Stigler, S.M. (2018). Scientific Progress Despite Irreproducibility: A Seeming Paradox. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2632-2639. doi:10.1073/pnas.1711786114.

Sholl, D. (2017). Testing Reproducibility in Materials Chemistry via Literature Meta-Analysis. Paper prepared for the Committee on Reproducibility and Replicability in Science at the National Academies of Sciences, Engineering, and Medicine. Available: https://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_184249.pdf [April 2019].

Shoemaker, P.J., and Reese, S.D. (1996). Mediating the Message: Theories of Influences on Mass Media Content. White Plains, NY: Longman.

Shrout, P.E., and Rodgers, J.L. (2018). Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis. Annual Review of Psychology, 69(1), 487-510.

Silberzahn, R., Uhlmann, E.L., Martin, D.P., Anselmi, P., Aust, F., Awtrey, E., Bahník, Š., Bai, F., Bannard, C., Bonnier, E., Carlsson, R., Cheung, F., Christensen, G., Clay, R., Craig, M.A., Dalla Rosa, A., Dam, L., Evans, M.H., Flores Cervantes, I., Fong, N., Gamez-Djokic, M., Glenz, A., Gordon-McKeon, S., Heaton, T.J., Hederos, K., Heene, M., Hofelich Mohr, A.J., Högden, F., Hui, K., Johannesson, M., Kalodimos, J., Kaszubowski, E., Kennedy, D.M., Lei, R., Lindsay, T.A., Liverani, S., Madan, C.R., Molden, D., Molleman, E., Morey, R.D., Mulder, L.B., Nijstad, B.R., Pope, N.G., Pope, B., Prenoveau, J.M., Rink, F., Robusto, E., Roderique, H., Sandberg, A., Schlüter, E., Schönbrodt, F.D., Sherman, M.F., Sommer, S.A., Sotak, K., Spain, S., Spörlein, C., Stafford, T., Stefanutti, L., Tauber, S., Ullrich, J., Vianello, M., Wagenmakers, E.-J., Witkowiak, M., Yoon, S., and Nosek, B.A. (2018). Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results. Advances in Methods and Practices in Psychological Science, 1(3), 337-356. doi:10.1177/2515245917747646.

Silva, C.T., Freire, J., and Callahan, S.P. (2007). Provenance for Visualizations: Reproducibility and Beyond. Computing in Science & Engineering, 9(5), 82-89.

Simmons, J.P., Nelson, L.D., and Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science, 22(11), 1359-1366.

Simonsohn, U. (2015). Small Telescopes: Detectability and the Evaluation of Replication Results. Psychological Science, 26(5), 559-569.

Simonsohn, U., Nelson, L.D., and Simmons, J.P. (2014a). P-Curve and Effect Size: Correcting for Publication Bias Using Only Significant Results. Perspectives on Psychological Science, 9(6), 666-681.

Simonsohn, U., Nelson, L.D., and Simmons, J.P. (2014b). P-Curve.com. Available: http://www.p-curve.com [January 2019].

Smaldino, P.E., and McElreath, R. (2016). The Natural Selection of Bad Science. Royal Society Open Science, 3(9), 160384.

Soto, C. (2019). How Replicable Are Links Between Personality Traits and Consequential Life Outcomes? The Life Outcomes of Personality Replication Project. Psychological Science, 30(5), 711-727. doi:10.1177/0956797619831612.

Sripada, C., Kessler, D., and Jonides, J. (2014). Methylphenidate Blocks Effort-Induced Depletion of Regulatory Control in Healthy Volunteers. Psychological Science, 25(6), 1227-1234.

Stanley, T.D., Carter, E., and Doucouliagos, H. (2018). What Meta-Analyses Reveal about the Replicability of Psychological Research. Psychological Bulletin, 144(12), 1325-1346.

Stark, P.B. (2016). A Noob’s Guide to Reproducibility (& Open Science). Available: https://www.stat.berkeley.edu/~stark/Seminars/reproNE16.htm#1 [December 2018].

Steen, R.G. (2011). Retractions in the Scientific Literature: Is the Incidence of Research Fraud Increasing? Journal of Medical Ethics, 37(4), 249-253.

Steen, R.G., Casadevall, A., and Fang, F.C. (2013). Why Has the Number of Scientific Retractions Increased? PLOS ONE, 8(7), e68397.

Stefanescu, C., Puig-Montserrat, X., Samraoui, B., Izquierdo, R., Ubach, A., and Arrizabalaga, A. (2017). Back to Africa: Autumn Migration of the Painted Lady Butterfly Vanessa cardui Is Timed to Coincide with an Increase in Resource Availability. Ecological Entomology, 42(6), 737-747.

Sterne, J.A.C., and Egger, M. (2006). Regression Methods to Detect Publication and Other Bias in Meta-Analysis. In H.R. Rothstein, A.J. Sutton, and M. Borenstein (Eds.), Publication Bias in Meta-Analysis (pp. 99-110). Hoboken, NJ: Wiley. doi:10.1002/0470870168.ch6.

Sterne, J.A.C., Sutton, A.J., Ioannidis, J.P.A., Terrin, N., Jones, D.R., Lau, J., Carpenter, J., Rücker, G., Harbord, R.M., Schmid, C.H., Tetzlaff, J., Deeks, J.J., Peters, J., Macaskill, P., Schwarzer, G., Duval, S., Altman, D.G., Moher, D., and Higgins, J.P.T. (2011). Recommendations for Examining and Interpreting Funnel Plot Asymmetry in Meta-Analyses of Randomised Controlled Trials. BMJ, 343, d4002. doi:10.1136/bmj.d4002.

Stocking, S.H. (1999). How Journalists Deal with Scientific Uncertainty. In S.M. Friedman, S. Dunwoody, and C.L. Rogers (Eds.), Communicating Uncertainty: Media Coverage of New and Controversial Science (pp. 23-41). Mahwah, NJ: Lawrence Erlbaum.

Stodden, V., and Miguez, S. (2014). Best Practices for Computational Science: Software Infrastructure and Environments for Reproducible and Extensible Research. Journal of Open Research Software, 2(1). doi:10.5334/jors.ay.

Stodden, V., Leisch, F., and Peng, R.D. (2014). Implementing Reproducible Research. Boca Raton, FL: CRC Press.

Stodden, V., Krafczyk, M.S., and Bhaskar, A. (2018a, June 11). Enabling the Verification of Computational Results: An Empirical Evaluation of Computational Reproducibility. Paper presented at the Proceedings of the First International Workshop on Practical Reproducible Evaluation of Computer Systems, Tempe, AZ. New York: Association for Computing Machinery. doi:10.1145/3214239.3214242.

Stodden, V., Seiler, J., and Ma, Z. (2018b). An Empirical Analysis of Journal Policy Effectiveness for Computational Reproducibility. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2584-2589.

Stone, J.E., Sener, M., Vandivort, K.L., Barragan, A., Singharoy, A., Teo, I., Ribeiro, J.V., Isralewitz, B., Liu, B., Goh, B.C., Phillips, J.C., MacGregor-Chatwin, C., Johnson, M.P., Kourkoutis, L.F., Hunter, C.N., and Schulten, K. (2016). Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing. Parallel Computing, 55, 17-27. doi:10.1016/j.parco.2015.10.015.

Strack, F., Martin, L., and Stepper, S. (1988). Inhibiting and Facilitating Conditions of the Human Smile: A Nonobtrusive Test of the Facial Feedback Hypothesis. Journal of Personality and Social Psychology, 54, 768-777.

Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Bott, L., Adams, R., Venetis, C.A., Whelan, L., Hughes, B., and Chambers, C.D. (2016). Exaggerations and Caveats in Press Releases and Health-Related Science News. PLOS ONE, 11(12), e0168217.

Sutton, A.J. (2009). Publication Bias. In H. Cooper, L.V. Hedges, and J.C. Valentine (Eds.), The Handbook of Research Synthesis and Meta-Analysis (2nd ed., pp. 435-451). New York: Russell Sage Foundation.

Szucs, D., and Ioannidis, J.P.A. (2017). Empirical Assessment of Published Effect Sizes and Power in the Recent Cognitive Neuroscience and Psychology Literature. PLOS Biology, 15(3), e2000797.

Tang, J.L., and Liu, J.L. (2000). Misleading Funnel Plot for Detection of Bias in Meta-Analysis. Journal of Clinical Epidemiology, 53(5), 477-484.

Taylor, B.N., and Kuyatt, C.E. (1994). Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results. NIST Technical Note 1297. Available: https://www.nist.gov/pml/nist-guide-evaluating-and-expres-meas-uncertainty-cover [April 2019].

Trafimow, D., and Marks, M. (2015). Editorial. Basic and Applied Social Psychology, 37(1), 1-2.

Trouble at the Lab. (2013). The Economist. Available: https://www.economist.com/briefing/2013/10/18/trouble-at-the-lab [December 2018].

Tukey, J.W. (1980). We Need Both Exploratory and Confirmatory. The American Statistician, 34(1), 23-25.

U.S. Department of Health and Human Services and Office of Research Integrity. (2012-2013). Office of Research Integrity Annual Report. Washington, DC: Author.

Van Bavel, J.J., Mende-Siedlecki, P., Brady, W.J., and Reinero, D.A. (2016). Contextual Sensitivity in Scientific Reproducibility. Proceedings of the National Academy of Sciences of the United States of America, 113(23), 6454-6459.

Vandewalle, P., Barrenetxea, G., Jovanovic, I., Ridolfi, A., and Vetterli, M. (2007). Experiences with Reproducible Research in Various Facets of Signal Processing Research. Paper presented at the 2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP ’07), Honolulu, HI.

Vaughan-Nichols, S.J. (2018). What Is Docker and Why Is It So Darn Popular? ZDNet, March 21. Available: https://www.zdnet.com/article/what-is-docker-and-why-is-it-so-darn-popular [December 2018].

Verhagen, J., and Wagenmakers, E.J. (2014). Bayesian Tests to Quantify the Result of a Replication Attempt. Journal of Experimental Psychology: General, 143(4), 1457-1475.

Vilhuber, L. (2018). Reproducibility and Replicability in Economics. Paper prepared for the Committee on Reproducibility and Replicability in Science, National Academies of Sciences, Engineering, and Medicine, Washington, DC.

Vitorino, A., Nepomuceno, E.G., Resende, D.F., and Lacerda, M.J. (2017, March 19-22). Evaluating the Reproducibility of Multiagent Systems. Paper presented at the 2017 IEEE World Engineering Education Conference (EDUNINE), Santos, Brazil. doi:10.1109/EDUNINE.2017.7918184.

Wagenmakers, E.-J., Beek, T., Dijkhoff, L., Gronau, Q.F., Acosta, A., Adams, R.B., Albohn, D.N., Allard, E.S., Benning, S.D., Blouin-Hudon, E.-M., Bulnes, L.C., Caldwell, T.L., Calin-Jageman, R.J., Capaldi, C.A., Carfagno, N.S., Chasten, K.T., Cleeremans, A., Connell, L., DeCicco, J.M., Dijkstra, K., Fischer, A.H., Foroni, F., Hess, U., Holmes, K.J., Jones, J.L.H., Klein, O., Koch, C., Korb, S., Lewinski, P., Liao, J.D., Lund, S., Lupiáñez, J., Lynott, D., Nance, C.N., Oosterwijk, S., Özdoğru, A.A., Pacheco-Unguetti, A.P., Pearson, B., Powis, C., Riding, S., Roberts, T.-A., Rumiati, R.I., Senden, M., Shea-Shumsky, N.B., Sobocko, K., Soto, J.A., Steiner, T.G., Talarico, J.M., van Allen, Z.M., Vandekerckhove, M., Wainwright, B., Wayand, J.F., Zeelenberg, R., Zetzer, E.E., and Zwaan, R.A. (2016). Registered Replication Report: Strack, Martin, and Stepper (1988). Perspectives on Psychological Science, 11(6), 917-928.

Waltemath, D., and Wolkenhauer, O. (2016). How Modeling Standards, Software, and Initiatives Support Reproducibility in Systems Biology and Systems Medicine. IEEE Transactions on Bio-Medical Engineering, 63(10), 1999-2006.

Walton, G.M., and Wilson, T.D. (2018). Wise Interventions: Psychological Remedies for Social and Personal Problems. Psychological Review, 125(5), 617-655.

Warner, F., Dhruva, S.S., Ross, J.S., Dey, P., Murugiah, K., and Krumholz, H.M. (2018). Accurate Estimation of Cardiovascular Risk in a Non-Diabetic Adult: Detecting and Correcting the Error in the Reported Framingham Risk Score for the Systolic Blood Pressure Intervention Trial Population. BMJ Open, 8(7), e021685. doi:10.1136/bmjopen-2018-021685.

Washburn, A.N., Hanson, B.E., Motyl, M., Skitka, L.J., Yantis, C., Wong, K.M., Sun, J., Prims, J.P., Mueller, A.B., Melton, Z.J., and Carsel, T.S. (2018). Why Do Some Psychology Researchers Resist Adopting Proposed Reforms to Research Practices? A Description of Researchers’ Rationales. Advances in Methods and Practices in Psychological Science, 1(2), 166-173. doi:10.1177/2515245918757427.

Wasserstein, R.L., and Lazar, N.A. (2016). The ASA’s Statement on p-Values: Context, Process, and Purpose. The American Statistician, 70(2), 129-133. doi:10.1080/00031305.2016.1154108.

Wasserstein, R.L., Schirm, A.L., and Lazar, N.A. (2019). Moving to a World Beyond “p < 0.05”. The American Statistician, 73(Suppl. 1), 1-19. doi:10.1080/00031305.2019.1583913.

Weingart, P. (2017). Is There a Hype Problem in Science? If So, How Is It Addressed? In K.H. Jamieson, D. Kahan, and D. Scheufele (Eds.), The Oxford Handbook of the Science of Science Communication (pp. 111-118). New York: Oxford University Press.

White, K.E., Robbins, C., Khan, B., and Freyman, C. (2017). Science and Engineering Publication Output Trends: 2014 Shows Rise of Developing Country Output While Developed Countries Dominate Highly Cited Publication. NSF 18-300. Available: https://www.nsf.gov/statistics/2018/nsf18300/nsf18300.pdf [December 2018].

Wilkinson, M.D., Dumontier, M., Aalbersberg, I.J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L.B., Bourne, P.E., Bouwman, J., Brookes, A.J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C.T., Finkers, R., Gonzalez-Beltran, A., Gray, A.J.G., Groth, P., Goble, C., Grethe, J.S., Heringa, J., Hoen, P.A.C., Hooft, R., Kuhn, T., Kok, R., Kok, J., Lusher, S.J., Martone, M.E., Mons, A., Packer, A.L., Persson, B., Rocca-Serra, P., Roos, M., van Schaik, R., Sansone, S.-A., Schultes, E., Sengstag, T., Slater, T., Strawn, G., Swertz, M.A., Thompson, M., van der Lei, J., van Mulligen, E., Velterop, J., Waagmeester, A., Wittenburg, P., Wolstencroft, K., Zhao, J., and Mons, B. (2016). The FAIR Guiding Principles for Scientific Data Management and Stewardship. Scientific Data, 3, 160018. doi:10.1038/sdata.2016.18.

Wilson, B.M., and Wixted, J.T. (2018). The Prior Odds of Testing a True Effect in Cognitive and Social Psychology. Advances in Methods and Practices in Psychological Science, 1(2), 186-197.

Wilson, G., Aruliah, D.A., Brown, C.T., Chue Hong, N.P., Davis, M., Guy, R.T., Haddock, S.H.D., Huff, K.D., Mitchell, I.M., Plumbley, M.D., Waugh, B., White, E.P., and Wilson, P. (2014). Best Practices for Scientific Computing. PLOS Biology, 12(1), e1001745.

Wilson, G., Bryan, J., Cranston, K., Kitzes, J., Nederbragt, L., and Teal, T.K. (2017). Good Enough Practices in Scientific Computing. PLOS Computational Biology, 13(6), e1005510.

Winchester, S. (2018). The Perfectionists: How Precision Engineers Created the Modern World. New York: HarperCollins.

Winsberg, E. (2010). Science in the Age of Computer Simulation. Chicago, IL: University of Chicago Press.

Wood, A.C., Wren, J.D., and Allison, D.B. (2019). The Need for Greater Rigor in Childhood Nutrition and Obesity Research. JAMA Pediatrics, 173(4), 311-312. doi:10.1001/jamapediatrics.2019.0015.

Wood, W., and Carden, L. (2014). Elusiveness of Menstrual Cycle Effects on Mate Preferences: Comment on Gildersleeve, Haselton, and Fales (2014). Psychological Bulletin, 140(5), 1265-1271.

Wood, W., and Neal, D.T. (2016). Healthy through Habit: Interventions for Initiating and Maintaining Health Behavior Change. Behavioral Science & Policy, 2(1), 89-103.

Yale Law School Roundtable on Data and Code Sharing. (2010). Reproducible Research. Computing in Science & Engineering, 12(5), 8-13.

Yarkoni, T. (2009). Big Correlations in Little Studies: Inflated fMRI Correlations Reflect Low Statistical Power-Commentary on Vul et al. (2009). Perspectives on Psychological Science, 4(3), 294-298. doi:10.1111/j.1745-6924.2009.01127.x.

Yong, E. (2016). Psychology’s Replication Crisis Can’t Be Wished Away. The Atlantic, March 4. Available: https://www.theatlantic.com/science/archive/2016/03/psychologysreplication-crisis-cant-be-wished-away/472272 [December 2018].

Zhao, G., Perilla, J.R., Yufenyuy, E.L., Meng, X., Chen, B., Ning, J., Ahn, J., Gronenborn, A.M., Schulten, K., Aiken, C., and Zhang, P. (2013). Mature HIV-1 Capsid Structure by Cryo-Electron Microscopy and All-Atom Molecular Dynamics. Nature, 497(7451), 643-646. doi:10.1038/nature12162.

Ziai, H., Zhang, R., Chan, A.W., and Persaud, N. (2017). Search for Unpublished Data by Systematic Reviewers: An Audit. BMJ Open, 7(10), e017737. doi:10.1136/bmjopen-2017-017737.

Ziletti, A., Kumar, D., Scheffler, M., and Ghiringhelli, L.M. (2018). Insightful Classification of Crystal Structures Using Deep Learning. Nature Communications, 9(1), 2775. doi:10.1038/s41467-018-05169-6.

Zwaan, R.A., Etz, A., Lucas, R.E., and Donnellan, M.B. (2018). Making Replication Mainstream. Behavioral and Brain Sciences, 41, e20. doi:10.1017/S0140525X17001972.
