Suggested Citation: "Appendix B: Workshop Agenda." National Academies of Sciences, Engineering, and Medicine. 2016. Statistical Challenges in Assessing and Fostering the Reproducibility of Scientific Results: Summary of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/21915.

B

Workshop Agenda

DAY 1
FEBRUARY 26, 2015

Session I: Overview and Case Studies

8:30 a.m. Introductions from the Workshop Co-Chairs
Constantine Gatsonis, Brown University
Giovanni Parmigiani, Dana-Farber Cancer Institute
8:45 Perspectives from Stakeholders
Lawrence Tabak, National Institutes of Health
Irene Qualters, National Science Foundation
Justin Esarey, Rice University and The Political Methodologist
Gianluca Setti, University of Ferrara, Italy, and IEEE
Joelle Lomax, Science Exchange
9:45 Overview of the Workshop
Victoria Stodden, University of Illinois, Urbana-Champaign
10:30 Case Studies
Yoav Benjamini, Tel Aviv University
Justin Wolfers, University of Michigan

Session II: Conceptualizing, Measuring, and Studying Reproducibility

1:30 p.m. Definitions and Measures of Reproducibility
Speaker:

Steven Goodman, Stanford University

Discussant:

Yoav Benjamini, Tel Aviv University

2:30 Reproducibility and “Statistical Significance”
Speaker:

Dennis Boos, North Carolina State University

Discussants:

Andreas Buja, Wharton, University of Pennsylvania

Val Johnson, Texas A&M University

3:45 Assessment of Factors Affecting Reproducibility
Speaker:

Marc Suchard, University of California, Los Angeles

Discussants:

Courtney Soderberg, Center for Open Science

John Ioannidis, Stanford University

4:45 Reproducibility from the Informatics Perspective
Speaker:

Mark Liberman, University of Pennsylvania

Discussant:

Micah Altman, Massachusetts Institute of Technology

5:45 Adjourn Day 1

DAY 2
FEBRUARY 27, 2015

Session III: The Way Forward: Using Statistics to Achieve Reproducibility

8:30 a.m. Panel Discussion: Open Problems, Needs, and Opportunities for Methodologic Research
Moderator:

Giovanni Parmigiani, Dana-Farber Cancer Institute

Panelists:

Lida Anestidou, National Academies of Sciences, Engineering, and Medicine

Tim Errington, Center for Open Science

Xiaoming Huo, National Science Foundation

Roger Peng, Johns Hopkins Bloomberg School of Public Health

10:00 Panel Discussion: Reporting Scientific Results and Sharing Scientific Study Data
Moderator:

Victoria Stodden, University of Illinois, Urbana-Champaign

Panelists:

Keith Baggerly, MD Anderson Cancer Center

Ronald Boisvert, Association for Computing Machinery and National Institute of Standards and Technology

Randy LeVeque, Society for Industrial and Applied Mathematics and University of Washington

Marcia McNutt, Science magazine

11:45 Panel Discussion: The Way Forward from the Data Sciences Perspective: Research
Moderator:

Constantine Gatsonis, Brown University

Panelists:

Chaitan Baru, National Science Foundation

Philip Bourne, National Institutes of Health

Rafael Irizarry, Harvard University

Jeff Leek, Johns Hopkins University

1:00 p.m. Adjourn Workshop

Questions about the reproducibility of scientific research have been raised in numerous settings and have gained visibility through several high-profile journal and popular press articles. Quantitative issues contributing to reproducibility challenges have been considered (including improper data measurement and analysis, inadequate statistical expertise, and incomplete data, among others), but there is no clear consensus on how best to approach or minimize these problems.

A lack of reproducibility of scientific results has created some distrust in scientific findings among the general public, scientists, funding agencies, and industries. While studies fail for a variety of reasons, many factors contribute to the lack of perfect reproducibility, including insufficient training in experimental design, misaligned incentives for publication and their implications for university tenure, intentional manipulation, poor data management and analysis, and inadequate statistical inference.

The workshop summarized in this report was designed not to address the social and experimental-design challenges but instead to focus on the statistical issues: improper data management and analysis, inadequate statistical expertise, incomplete data, and difficulties in applying sound statistical inference to the available data. Many efforts have emerged in recent years to draw attention to and improve the reproducibility of scientific work. This report focuses uniquely on the statistical perspective on three issues: the extent of reproducibility, the causes of reproducibility failures, and the potential remedies for those failures.
