OBSERVATIONAL STUDIES
IN A LEARNING HEALTH SYSTEM

Workshop Summary

Claudia Grossmann and Joe Alper, Rapporteurs

A Learning Health System Activity

Roundtable on Value & Science-Driven Health Care

INSTITUTE OF MEDICINE
OF THE NATIONAL ACADEMIES

THE NATIONAL ACADEMIES PRESS

Washington, D.C.

www.nap.edu





THE NATIONAL ACADEMIES PRESS    500 Fifth Street, NW    Washington, DC 20001

NOTICE: The workshop that is the subject of this workshop summary was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine.

This activity was supported by an unnumbered agreement between the National Academy of Sciences and the Patient-Centered Outcomes Research Institute. The views presented in this publication do not necessarily reflect the views of the organizations or agencies that provided support for the activity.

International Standard Book Number-13: 978-0-309-29081-4
International Standard Book Number-10: 0-309-29081-3

Additional copies of this workshop summary are available for sale from the National Academies Press, 500 Fifth Street, NW, Keck 360, Washington, DC 20001; (800) 624-6242 or (202) 334-3313; http://www.nap.edu.

For more information about the Institute of Medicine, visit the IOM home page at: www.iom.edu.

Copyright 2013 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America

The serpent has been a symbol of long life, healing, and knowledge among almost all cultures and religions since the beginning of recorded history. The serpent adopted as a logotype by the Institute of Medicine is a relief carving from ancient Greece, now held by the Staatliche Museen in Berlin.

Suggested citation: IOM (Institute of Medicine). 2013. Observational studies in a learning health system: Workshop summary. Washington, DC: The National Academies Press.

“Knowing is not enough; we must apply.
Willing is not enough; we must do.”
—Goethe

Advising the Nation. Improving Health.

The National Academy of Sciences is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Ralph J. Cicerone is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. C. D. Mote, Jr., is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Harvey V. Fineberg is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy’s purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Ralph J. Cicerone and Dr. C. D. Mote, Jr., are chair and vice chair, respectively, of the National Research Council.

www.national-academies.org

PLANNING COMMITTEE ON OBSERVATIONAL STUDIES IN A LEARNING HEALTH SYSTEM1

RALPH I. HORWITZ (Co-Chair), Senior Vice President, Clinical Sciences Evaluation, GlaxoSmithKline
JOE V. SELBY (Co-Chair), Executive Director, Patient-Centered Outcomes Research Institute
ANIRBAN BASU, Associate Professor and Director, Health Economics and Outcomes Methodology, University of Washington
TROYEN A. BRENNAN, Executive Vice President and Chief Medical Officer, CVS/Caremark
STEVEN N. GOODMAN, Associate Dean for Clinical & Translational Research, Stanford University School of Medicine, Stanford University
LOUIS B. JACQUES, Director, Coverage and Analysis Group, Centers for Medicare & Medicaid Services
JEROME P. KASSIRER, Distinguished Professor, Tufts University School of Medicine
MICHAEL S. LAUER, Director, Division of Cardiovascular Sciences, National Heart, Lung, and Blood Institute
DAVID MADIGAN, Chair of Statistics, Columbia University
SHARON-LISE T. NORMAND, Professor, Department of Biostatistics and Health Care Policy, Harvard University
RICHARD PLATT, Chair of Ambulatory Care and Prevention and Chair of Population Medicine, Harvard Pilgrim Health Care Institute
BURTON H. SINGER, Professor, Emerging Pathogens Institute, University of Florida
JEAN R. SLUTSKY, Director, Center for Outcomes and Evidence, Agency for Healthcare Research and Quality
ROBERT TEMPLE, Deputy Director for Clinical Science, Center for Drug Evaluation and Research, U.S. Food and Drug Administration

IOM Staff
KATHERINE BURNS, Program Assistant
CLAUDIA GROSSMANN, Senior Program Officer
DIEDTRA HENDERSON, Program Officer
ELIZABETH JOHNSTON, Program Assistant
ELIZABETH ROBINSON, Research Associate
VALERIE ROHRBACH, Senior Program Assistant

JULIA SANDERS, Senior Program Assistant (through August 2013)
ROBERT SAUNDERS, Senior Program Officer
BARRET ZIMMERMANN, Program Assistant (through October 2013)
J. MICHAEL McGINNIS, Senior Scholar, Executive Director, Roundtable on Value & Science-Driven Health Care

Consultant
JOE ALPER, Consulting Writer

1 Institute of Medicine planning committees are solely responsible for organizing the workshop, identifying topics, and choosing speakers. The responsibility for the published workshop summary rests with the workshop rapporteurs and the institution.

ROUNDTABLE ON VALUE & SCIENCE-DRIVEN HEALTH CARE1

MARK B. McCLELLAN (Chair), Senior Fellow and Director, Health Care Innovation and Value Initiative, The Brookings Institution
RAYMOND BAXTER, Senior Vice President, Community Benefit, Research and Health Policy, Kaiser Permanente
DAVID BLUMENTHAL, President, The Commonwealth Fund
BRUCE G. BODAKEN, Former Chairman and Chief Executive Officer, Blue Shield of California
PAUL CHEW, Chief Science Officer and Chief Medical Officer, Sanofi U.S.
FRANCIS COLLINS, Director, National Institutes of Health (Ex Officio) (designee: Kathy Hudson)
HELEN DARLING, President, National Business Group on Health
SUSAN DEVORE, Chief Executive Officer, Premier, Inc.
JUDITH FAULKNER, Founder and Chief Executive Officer, Epic Health Systems
THOMAS R. FRIEDEN, Director, Centers for Disease Control and Prevention (Ex Officio) (designee: James Galloway)
PATRICIA A. GABOW, Former Chief Executive Officer, Denver Health
ATUL GAWANDE, General and Endocrine Surgeon, Brigham and Women’s Hospital
GARY L. GOTTLIEB, President and Chief Executive Officer, Partners HealthCare System
JAMES A. GUEST, President and Chief Executive Officer, Consumers Union
GEORGE C. HALVORSON, Chairman and Chief Executive Officer, Kaiser Permanente
MARGARET A. HAMBURG, Commissioner, U.S. Food and Drug Administration (Ex Officio) (designee: Peter Lurie)
JAMES HEYWOOD, Cofounder and Chairman, PatientsLikeMe
RALPH I. HORWITZ, Senior Vice President, Clinical Evaluation Sciences, GlaxoSmithKline
PAUL HUDSON, Executive Vice President, AstraZeneca
BRENT C. JAMES, Chief Quality Officer, Intermountain Healthcare
CRAIG JONES, Director, Vermont Blueprint for Health
GARY KAPLAN, Chairman and Chief Executive Officer, Virginia Mason Health System
DARRELL G. KIRCH, President and Chief Executive Officer, Association of American Medical Colleges

RICHARD KRONICK, Director, Agency for Healthcare Research and Quality (Ex Officio)
RICHARD C. LARSON, Mitsui Professor, Massachusetts Institute of Technology
JAMES L. MADARA, Chief Executive Officer, American Medical Association
FARZAD MOSTASHARI, National Coordinator, Office of the National Coordinator for Health IT (Ex Officio)
MARY D. NAYLOR, Director, NewCourtland Center, University of Pennsylvania
WILLIAM D. NOVELLI, Former Chief Executive Officer, AARP; Professor, Georgetown University
SAM R. NUSSBAUM, Chief Medical Officer, WellPoint, Inc.
JONATHAN B. PERLIN, President, Clinical and Physician Services and Chief Medical Officer, HCA, Inc.
ROBERT A. PETZEL, Under Secretary for Health, U.S. Department of Veterans Affairs (Ex Officio)
RICHARD PLATT, Chair, Population Medicine, Harvard Medical School
MICHAEL ROSENBLATT, Executive Vice President and Chief Medical Officer, Merck and Co.
JOHN W. ROWE, Former Chairman and Chief Executive Officer, Aetna; Professor, Columbia University
JOE V. SELBY, Executive Director, Patient-Centered Outcomes Research Institute
MARK D. SMITH, President and Chief Executive Officer, California HealthCare Foundation
GLENN D. STEELE, President and Chief Executive Officer, Geisinger Health System
MARILYN TAVENNER, Administrator, Centers for Medicare & Medicaid Services (Ex Officio) (designee: Patrick Conway)
REED V. TUCKSON, Former Executive Vice President, UnitedHealth Group; Managing Director, Tuckson Health
MARY WAKEFIELD, Administrator, Health Resources and Services Administration (Ex Officio)
DEBRA B. WHITMAN, Executive Vice President, Policy, Strategy, and International Affairs, AARP
JONATHAN WOODSON, Assistant Secretary for Health Affairs, U.S. Department of Defense (Ex Officio)

1 Institute of Medicine forums and roundtables do not issue, review, or approve individual documents. The responsibility for the published workshop summary rests with the workshop rapporteurs and the institution.

Institute of Medicine
Roundtable on Value & Science-Driven Health Care
Charter and Vision Statement

Vision: Our vision is for the development of a continuously learning health system in which science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the care process, patients and families active participants in all elements, and new knowledge captured as an integral by-product of the care experience.

Goal: By the year 2020, 90 percent of clinical decisions will be supported by accurate, timely, and up-to-date clinical information and will reflect the best available evidence. We believe that this presents a tangible focus for progress toward our vision, that Americans ought to expect at least this level of performance, that it should be feasible with existing resources and emerging tools, and that measures can be developed to track and stimulate progress.

Context: As unprecedented developments in the diagnosis, treatment, and long-term management of disease bring Americans closer than ever to the promise of personalized health care, we are faced with similarly unprecedented challenges to identify and deliver the care most appropriate for individual needs and conditions. Care that is important is often not delivered. Care that is delivered is often not important. In part, this is due to our failure to apply the evidence that we have about the medical care that is most effective—a failure related to shortfalls in provider knowledge and accountability, inadequate care coordination and support, lack of insurance, poorly aligned payment incentives, and misplaced patient expectations. Increasingly, it is also a result of our limited capacity for timely generation of evidence on the relative effectiveness, efficiency, and safety of available and emerging interventions. Improving the value of the return on our health care investment is a vital imperative that will require much greater capacity to evaluate high-priority clinical interventions, stronger links between clinical research and practice, and reorientation of the incentives to apply new insights. We must quicken our efforts to position evidence development and application as natural outgrowths of clinical care to foster health care that learns.

Approach: The Institute of Medicine Roundtable on Value & Science-Driven Health Care serves as a forum to facilitate the collaborative assessment of and action around issues central to achieving the vision and goal stated above. The challenges are myriad and include issues that must be addressed to improve evidence development, evidence application, and the capacity to advance progress on both dimensions. To address these challenges, as leaders in their fields, Roundtable members work with their colleagues to identify the issues not being adequately addressed, the nature of the barriers and possible solutions, and the priorities for action, and they marshal the resources of the sectors represented on the Roundtable to work for sustained public-private cooperation for change. Activities include collaborative exploration of new and expedited approaches to assessing the effectiveness of diagnostic and treatment interventions, better use of the patient care experience to generate evidence on the effectiveness and efficiency of care, identification of assessment priorities, and communication strategies to enhance provider and patient understanding of and support for interventions proven to work best and deliver value in health care.

Core concepts and principles: For the purpose of the Roundtable activities, we define science-driven health care broadly to mean that, to the greatest extent possible, the decisions that shape the health and health care of Americans—by patients, providers, payers, and policy makers alike—will be grounded in a reliable evidence base, will account appropriately for individual variation in patient needs, and will support the generation of new insights on clinical effectiveness. Evidence is generally considered to be information from clinical experience that has met some established test of validity, and the appropriate standard is determined according to the requirements of the intervention and clinical circumstance. Processes that involve the development and use of evidence should be accessible and transparent to all stakeholders.

A common commitment to certain principles and priorities guides the activities of the Roundtable and its members, including the commitment to the right health care for each person; putting the best evidence into practice; establishing the effectiveness, efficiency, and safety of the medical care delivered; building constant measurement into our health care investments; the establishment of health care data as a public good; shared responsibility distributed equitably across stakeholders, both public and private; collaborative stakeholder involvement in priority setting; transparency in the execution of activities and reporting of results; and subjugation of individual political or stakeholder perspectives in favor of the common good.

Reviewers

This workshop summary has been reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the National Research Council’s Report Review Committee. The purpose of this independent review is to provide candid and critical comments that will assist the institution in making its published workshop summary as sound as possible and to ensure that the workshop summary meets institutional standards for objectivity, evidence, and responsiveness to the study charge. The review comments and draft manuscript remain confidential to protect the integrity of the process. We wish to thank the following individuals for their review of this workshop summary:

John Concato, U.S. Department of Veterans Affairs
Sheldon Greenfield, University of California, Irvine
Harold Sox, Dartmouth Geisel School of Medicine
Alexander Walker, Harvard School of Public Health

Although the reviewers listed above have provided many constructive comments and suggestions, they did not see the final draft of the workshop summary before its release. The review of this workshop summary was overseen by Eric Larson, Group Health Research Institute. Appointed by the Institute of Medicine, he was responsible for making certain that an independent examination of this workshop summary was carried out in accordance with institutional procedures and that all review comments were carefully considered. Responsibility for the final content of this workshop summary rests entirely with the rapporteurs and the institution.

Foreword

Clinical research strains to keep up with the rapid and iterative evolution of medical interventions, clinical practice innovation, and the increasing demand for information on the clinical effectiveness of these advancements. Given the growing availability of archived and real-time digital health data and the opportunities these data provide for research, as well as the increasing number of studies using prospectively collected clinical data, the Institute of Medicine’s (IOM’s) Roundtable on Value & Science-Driven Health Care, with the support of the Patient-Centered Outcomes Research Institute (PCORI), convened a workshop on Observational Studies in a Learning Health System, which is summarized in this publication. Participants included experts from a wide range of disciplines and sectors—clinical researchers, statisticians, biostatisticians, epidemiologists, health care informaticians, health care analysts, research funders, the health products industry, clinicians, payers, and regulators.

The workshop explored leading-edge approaches to observational studies, charted a course for the use of the growing health data utility, and identified opportunities to advance progress. This publication summarizes discussions that considered concepts of rigorous observational study design and analysis, emerging statistical methods, opportunities and challenges of observational studies to complement evidence from experimental methods, treatment heterogeneity, and effectiveness estimates tailored toward individual patients.

The work of the Roundtable is focused on moving toward a continuously learning health system, one where every health care encounter is an opportunity for learning and evidence is applied to ensure and improve best care practices. Since its inception in 2006, the Roundtable has set out to help realize this vision through the involvement and support of senior leadership from key health care stakeholders. In engaging the nation’s leaders in workshops and other activities, Roundtable members and colleagues contribute to progress on issues important to advancing the development and use of a digital health data utility for knowledge generation and continuous improvement.

Building on this groundwork, the objectives of this workshop were to explore the role of observational studies in the generation of evidence to guide clinical and health policy decisions. Issues of rigor, internal validity, and bias were engaged, as well as opportunities for using observational studies to generalize findings from randomized controlled trials (RCTs) and to better understand treatment heterogeneity. Workshop speakers and individual participants strove to identify stakeholder needs and barriers to the broader application of evidence generated by observational studies for decision making by engaging colleagues from disciplines typically underrepresented in clinical evidence discussions.

A number of specific issues were identified by speakers and participants in the course of the workshop as especially important for accelerating progress in the appropriate use of observational studies for evidence generation. In the following sections we highlight some of the key points that emerged from each of the four topics in the workshop.

The first theme covered in the workshop focused on the challenge of mitigating the potential effects of bias in the absence of randomization. Two of the speakers, Small and Basu, emphasized the role of instrumental variables (IVs) as “natural randomizers” that achieve similarity between compared groups and thereby strengthen causal inferences. Their contributions included examples of potential IVs, including distance to a hospital or health care provider, timing of hospital admission, and insurance plan coverage, among others. The current lack of efficient IVs was noted by several individual workshop participants to be a major limitation of the method.
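As a brief schematic illustration of the idea (a sketch for orientation, not a result presented at the workshop; the notation Z, A, and Y is ours), consider a binary instrument Z, such as living near versus far from a hospital that offers a procedure, a treatment A, and an outcome Y. If Z influences Y only through its effect on A and is itself unrelated to the confounders of treatment choice, the simplest IV analysis estimates the treatment effect with the Wald ratio:

\[
\hat{\beta}_{\mathrm{IV}} \;=\; \frac{\mathrm{E}[Y \mid Z=1] - \mathrm{E}[Y \mid Z=0]}{\mathrm{E}[A \mid Z=1] - \mathrm{E}[A \mid Z=0]}
\]

Under these assumptions, if patients living near the hospital receive the procedure 30 percentage points more often than those living far away and their outcome rates differ by 3 percentage points, the implied effect of treatment is 0.03 / 0.30 = 0.10. A weak instrument makes the denominator small and the estimate unstable, which is one way to see why the scarcity of efficient IVs noted above limits the method.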

A presentation from Ryan of the Observational Medical Outcomes Partnership discussed an empirical approach to measuring bias and error in observational research. The finding that bias is common and differs by design, analysis, data source, and outcome definition was accompanied by straightforward strategies to measure and mitigate the effects of bias.

Heterogeneity of treatment effect (HTE) was the second theme of the workshop and was discussed both in theory and in specific examples. The infrequency with which HTE is detected in the analysis of many RCTs led some participants to wonder how much HTE is actually present in clinical research. This view was challenged by Kent, who attributed the lack of reliably measured HTE to failures of information and to the low analytical power of conventional analytic methods. It was also noted that many “traditional” RCTs use exclusion and inclusion criteria that may remove HTE from the study. This point was illustrated by an example, presented by Hlatky, that described HTE in a comparative effectiveness study of coronary artery bypass grafting and percutaneous coronary intervention using a 20 percent sample of Medicare data.

The third topic engaged by the workshop was generalizing RCT results to broader populations. A presentation by Hernán shifted the emphasis of discussion from the validity of the answer (bias reduction) to the quality of the question. One of the points highlighted by this discussion was how central the research question is to issues in methods, analysis, and inferences from clinical research. A resulting suggestion, made by individual workshop participants who spoke, was that the analysis of observational studies and RCTs be the same, except for adjustment for any potential baseline confounding.

The final session was on individual risk prediction. The sentiment expressed by many workshop participants was captured in a talk whose first slide was titled “When the average applies to no one.” Speakers took on the pragmatic issues of prediction, including the observation that most risk prediction tools are not tailored to reflect the variability among patients in age, comorbidities, extent and severity of disease, or other relevant features. Among the hopeful suggestions that arose from this session was that electronic health records may be useful for building prediction models, but this was balanced by the acknowledgment that incomplete patient follow-up remains the largest barrier to creating prediction rules that are helpful to patients and physicians. Tatonetti presented data-driven prediction of drug effects and interactions using observational data, addressing “synthetic” associations that occur when co-prescribed drugs become associated with each other’s adverse risks. The authors described a new method, the Statistical Correction of Uncharacterized Bias, to minimize these synthetic associations and validated the method by returning to the laboratory for experimental confirmation of drug–disease interactions.
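To make the contrast with a single average risk concrete (an illustrative sketch of the kind of individualized prediction discussed in this session, not a model presented at the workshop; the covariates shown are hypothetical), a regression-based risk tool maps each patient’s own characteristics to a predicted probability, for example via logistic regression:

\[
\Pr(Y = 1 \mid x) \;=\; \frac{1}{1 + \exp\{-(\beta_0 + \beta_1\,\mathrm{age} + \beta_2\,\mathrm{comorbidities} + \beta_3\,\mathrm{disease\ severity})\}}
\]

Fitting such a model from electronic health record data presupposes follow-up complete enough to observe the outcome Y for most patients, which is the barrier to useful prediction rules noted above.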

Multiple individuals donated valuable time toward the development of this publication. We would like to acknowledge and offer strong appreciation to the contributors to this volume for their presence at the workshop and their efforts to further develop their presentations into the summaries contained in this publication. We are especially indebted to those who provided sterling expert guidance as members of the Planning Committee: Anirban Basu (University of Washington), Troyen Brennan (CVS/Caremark), Steven Goodman (Stanford University), Louis Jacques (Centers for Medicare & Medicaid Services), Jerome Kassirer (Tufts University School of Medicine), Michael Lauer (National Heart, Lung, and Blood Institute), David Madigan (Columbia University), Sharon-Lise Normand (Harvard University), Richard Platt (Harvard Pilgrim Health Care Institute), Burton Singer (University of Florida), Jean Slutsky (Agency for Healthcare Research and Quality), and Robert Temple (U.S. Food and Drug Administration).

Various IOM Roundtable staff played instrumental roles in coordinating the workshop and translating the workshop proceedings into this summary, including Claudia Grossmann, Elizabeth Johnston, Valerie Rohrbach, Julia Sanders, Rob Saunders, and Barret Zimmermann. We would like to recognize Joe Alper for his assistance in drafting this publication. Finally, we want to thank Daniel Bethea, Marton Cavani, Laura Harbold DeStefano, and Chelsea Frakes for helping to coordinate various aspects of review, production, and publication.

An effective and efficient health care system requires a continually evolving evidence base to guide clinical decisions at the patient level and policy decisions at the population level. Observational studies play an important role in complementing other research methods and building this evidence base. We believe Observational Studies in a Learning Health System: Workshop Summary will be a valuable resource in efforts to ensure that learning from digital health data is a crucial part of any health system.

Ralph Horwitz, Co-Chair
Planning Committee on Observational Studies in a Learning Health System
Senior Vice President, Clinical Sciences Evaluation
GlaxoSmithKline

Joe Selby, Co-Chair
Planning Committee on Observational Studies in a Learning Health System
Executive Director
Patient-Centered Outcomes Research Institute

J. Michael McGinnis
Executive Director, Roundtable on Value & Science-Driven Health Care
Institute of Medicine

Contents

ACRONYMS AND ABBREVIATIONS  xix

1  INTRODUCTION  1
   The Role of Observational Studies in a Learning Health System, 2
   The Roundtable and the Learning Health System Series, 4
   Workshop Scope and Objectives, 5
   Organization of the Summary, 6
   References, 7

2  ISSUES OVERVIEW FOR OBSERVATIONAL STUDIES IN CLINICAL RESEARCH  9
   Pressing Questions for Consideration, 10
   Discussion, 14
   References, 15

3  ENGAGING THE ISSUE OF BIAS  17
   An Introduction to the Issue of Bias, 18
   Instrumental Variables and Their Sensitivity to Unobserved Biases, 20
   An Empirical Approach to Measuring and Calibrating for Error in Observational Analyses, 23
   Comment, 26
   Discussion, 27
   References, 29

4  GENERALIZING RANDOMIZED CLINICAL TRIAL RESULTS TO BROADER POPULATIONS  31
   Introduction to the Issue, 32
   Generalizing the Right Question, 34
   Use of Observational Studies to Determine Generalizability of RCTs, 37
   Comment, 39
   Discussion, 40
   References, 42

5  DETECTING TREATMENT-EFFECT HETEROGENEITY  45
   Key Concepts in Heterogeneity, 46
   Example of Comparative Effectiveness, 48
   Identification of Effect-Heterogeneity Using Instrumental Variables, 50
   Comment, 52
   Discussion, 54
   References, 56

6  PREDICTING INDIVIDUAL RESPONSES  57
   Introduction to Individual Response Prediction, 58
   Data-Driven Prediction Models, 60
   Individualized Prediction of Risk, 62
   Comment, 64
   Discussion, 65
   References, 68

7  STRATEGIES GOING FORWARD  69
   A Journal Editor’s Perspective, 70
   Issues to Consider Moving Forward, 71
   Lessons from PCORI, 73
   Discussion, 74
   Closing Remarks, 76

8  COMMON THEMES FOR PROGRESS  79
   Methods, 80
   Policy, 81
   Stakeholder Engagement, 82

APPENDIXES
A  Biographies of Workshop Speakers  85
B  Workshop Agenda  101
C  Workshop Participants  111

Acronyms and Abbreviations

AHRQ      Agency for Healthcare Research and Quality
CABG      coronary artery bypass grafting
CMS       Centers for Medicare & Medicaid Services
Csxover   case-crossover study
EHR       electronic health record
FDA       U.S. Food and Drug Administration
IOM       Institute of Medicine
IPSM      implicit propensity score matching
ITT       intent-to-treat analysis
IV        instrumental variable
OMOP      Observational Medical Outcomes Partnership
PCI       percutaneous coronary intervention
PCORI     Patient-Centered Outcomes Research Institute
PSA       prostate-specific antigen
RCT       randomized controlled (clinical) trial
SCCS      self-controlled case series
