Appendix B
Workshop Agendas and Questions to Panelists

AGENDA—WORKSHOP 1

INSTITUTE OF MEDICINE


COMMITTEE ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES


NOVEMBER 7, 2006


National Academy of Sciences Building

2101 Constitution Avenue, NW, Lecture Room, Washington, DC


Workshop Objective: To review three case studies that reveal the challenges that decision makers face when trying to determine the clinical effectiveness of healthcare technologies.

8:30

Welcome and introductory remarks—Barbara McNeil, Chair, Institute of Medicine (IOM) Committee

8:45

Panel 1—PET Scan for Alzheimer’s Disease. Moderator: Dick Justman (UnitedHealthcare)

 

Marilyn Albert, Johns Hopkins University School of Medicine, Division of Cognitive Neuroscience

David Matchar, Duke University Medical School, Center for Clinical Health Policy Research



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Sean Tunis, Center for Medical Technology Policy

Susan Molchan, Alzheimer’s Disease Neuroimaging Initiative, National Institute on Aging (NIH)

Question & Answer/Open Discussion

10:15

Break

10:30

Panel 2—Avastin and Lucentis for Age-Related Macular Degeneration. Moderator: Diana Petitti (Kaiser Permanente, Southern California)

Reginald Sanders, American Society of Retina Specialists

Winifred Hayes, Hayes, Inc.

Steve Phurrough, CMS Coverage and Analysis Group

Dan Martin, Emory University School of Medicine

Question & Answer/Open Discussion

12:15

Lunch

1:00

Panel 3—Screening and Treating Colorectal Cancer—The Fecal DNA Test and an Assay for Irinotecan Toxicity. Moderator: Steve Shak (Genomic Health, Inc.)

Barry Berger, Exact Sciences Corporation

Margaret Piper, Blue Cross and Blue Shield Association Technology Evaluation Center

Richard Goldberg, University of North Carolina, Chapel Hill

Atiqur Rahman, Food and Drug Administration, Center for Drug Evaluation and Research

Question & Answer/Open Discussion

3:00

Break

3:15

Panel 4—Experts React. Moderator: Hal Sox, Vice Chair, IOM Committee

Daniel Cain, Cain Brothers

Peter Juhn, Health Policy and Evidence, Johnson & Johnson

Cindy Mulrow, University of Texas and the American College of Physicians

David Ransohoff, University of North Carolina Chapel Hill, School of Medicine

Earl Steinberg, Resolution Health

5:00

Adjourn

Questions for the Panelists—Workshop 1

November 7, 2006

Panel 1—PET Scans for Alzheimer’s Disease

• What does this experience show about the feasibility of coverage with evidence development under Medicare?

• What roles did evidence assessment and political pressure have in this coverage decision? How is this experience instructive for future cases?

• What challenges were involved in ensuring that the evidence available on PET was applicable to everyday clinical practice?

Panel 2—Lucentis/Avastin

• Was the substantial uptake in Avastin use for wet AMD justifiable given the lack of evidence and the needs of the patient population?

• How do evidence reviewers and payers address the relative effectiveness of Lucentis and Avastin given the limited data?

• Given the state of the evidence base, what role should cost play in payer decisions?

• What does this case study say about the societal need for more clinical data and information and the mechanisms by which data development is financed?

• How will the head-to-head trial supported by NIH alter their role in terms of assessing cost effectiveness?

Panel 3—Genetic Tests

• Do you think that more of these types of tests will be developed [toxicity and non-invasive screening]? How will experiences with these technologies affect the development of similar tests?

• Are non-invasive screenings (genetic byproduct screening) and toxicity testing the “wave(s) of the future”? What types or levels of evidence are needed to recommend replacement of current therapies? Will comparative testing be done as newer technologies emerge?

• Are there specific challenges due to the nature of the populations qualified for testing? (For example, are the populations so small as to affect the feasibility of large clinical trials?)

• As the evidence for these tests is emerging, how do gaps in evidence compare with those for more traditional technologies?

• What are the labeling issues for these technologies? What needs to be included, and what would prompt a change?

• Are patient compliance and invasiveness considered when determining effectiveness?

Panel 4—Experts React

• What role did evidence play in these examples (as compared with other influences, such as political pressure and provider experience)?

• Was the process of data collection and assessment able to keep pace with consumer and provider demand?

• In what ways did lack of data influence the process?

• What is the likelihood that the gaps in data will be filled (and in a timely manner)?

• How should current standards and methods in evidence assessment change in this era of personalized medicine, new biologic therapies, and advanced imaging techniques (if at all)?

• What needs to change to expedite the introduction of clinical services that are potentially highly effective (e.g., expediting the clinical trial process and obtaining different funding mechanisms for investigations)?

• How might information about new technologies be made more accessible for patients?

AGENDA—WORKSHOP 2

INSTITUTE OF MEDICINE

COMMITTEE ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES

JANUARY 25, 2007

National Academy of Sciences Building

2101 Constitution Avenue, NW, Lecture Room, Washington, DC

8:30

Welcome and introductory remarks—Barbara McNeil, Chair, Institute of Medicine Committee

8:35

Panel 1—Using Systematic Reviews to Develop Clinical Recommendations. Moderator: Richard Marshall

Carolyn Clancy, Agency for Healthcare Research and Quality

Mary Barton, U.S. Preventive Services Task Force

Steven Findlay, Consumers Union

Ray Gibbons, American Heart Association

Question & Answer/Open Discussion

9:55

Panel 2—Using Systematic Reviews to Develop Quality Measures and Practice Standards. Moderator: Lisa Simpson

Janet Corrigan, National Quality Forum

Greg Pawlson, National Committee for Quality Assurance

Dennis O’Leary, Joint Commission on Accreditation of Healthcare Organizations

Cary Sennett, AMA-convened Physician Consortium for Performance Improvement

Question & Answer/Open Discussion

11:15

Panel 3—Approaches to Priority Setting: Identifying Topics and Selection Criteria. Moderator: Dana Goldman

Richard Justman, UnitedHealthcare

Kay Dickersin, Cochrane USA

Jean Slutsky, AHRQ Effective Health Care Program

Naomi Aronson, BCBSA Technology Evaluation Center

Question & Answer/Open Discussion

12:30

Lunch

1:00

Panel 4—Stakeholders Forum. Moderator: Robert Galvin

Kathy Buto, Johnson & Johnson

Art Small, Genentech

Vivian Coates, ECRI

Jim Weinstein, Dartmouth-Hitchcock Medical Center

Question & Answer/Open Discussion

INSTITUTE OF MEDICINE WORKSHOP ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES

JANUARY 25, 2007

PANEL 1—USING SYSTEMATIC REVIEWS TO DEVELOP CLINICAL RECOMMENDATIONS

Moderator: Richard Marshall (Harvard Vanguard Medical Associates)

Panelists: Carolyn Clancy (AHRQ), Mary Barton (USPSTF), Steven Findlay (Consumers Union), and Ray Gibbons (American Heart Association)

The objective of this panel discussion is to learn about the experiences of well-regarded organizations that use systematic reviews or other syntheses of bodies of evidence to develop clinical recommendations. AHRQ serves multiple roles: generator of evidence (e.g., DEcIDE, CERTs), synthesizer of evidence (e.g., the Evidence-based Practice Center Program), and developer of clinical recommendations (e.g., USPSTF). The USPSTF assesses and synthesizes the evidence on preventive services and issues clinical recommendations based on these bodies of evidence. Consumers Union’s Best Buy Drugs Program relies on evidence-based analyses of the safety and effectiveness of prescription drugs to help consumers choose the drug best suited to their medical needs. The American Heart Association (in collaboration with the American College of Cardiology) synthesizes bodies of evidence on selected topics and draws from other reviews to develop clinical practice guidelines for cardiovascular care.

Questions for the Panelists

1. Who is the principal audience for your clinical recommendations? Have you assessed their use of the recommendations? Approximately how often does your audience follow your clinical recommendations:

• In full, _____ percent

• In part, _____ percent

2. How do you identify and prioritize areas for which clinical recommendations are necessary?

3. How do you identify sources of evidence on clinical effectiveness? Which criteria (if any) do you use to judge quality of evidence?

4. How would you characterize the available evidence on the clinical questions you address? Does it sufficiently cover services and populations of interest? What are the critical gaps (if any)?

5. Do you incorporate observational and other nonrandomized data in your evidence syntheses? If yes, what are the parameters for their use?

6. How do you respond to pressing demands for clinical recommendations when the body of evidence is insufficient or when the available evidence is relevant to only a subset of patients? Approximately how often do you make recommendations in the absence of sufficient data in these cases?

7. What resources does your organization dedicate to developing evidence-based clinical recommendations (e.g., staff time, special committee responsibility, conferences)?

INSTITUTE OF MEDICINE WORKSHOP ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES

JANUARY 25, 2007

PANEL 2—USING SYSTEMATIC REVIEWS TO DEVELOP QUALITY MEASURES AND PRACTICE STANDARDS

Moderator: Lisa Simpson (Cincinnati Children’s Hospital Medical Center)

Panelists: Janet Corrigan (NQF), Greg Pawlson (NCQA), Dennis O’Leary (JCAHO), Cary Sennett (AMA-convened Physician Consortium)

The objective of this panel discussion is to learn about the experiences of leading organizations that are at the forefront of U.S. efforts to develop, implement, and improve evidence-based clinical practice standards. The mission of the NQF is to promote quality improvement in healthcare by endorsing national performance measures. NCQA’s HEDIS measures are used to assess health plan performance on various dimensions of care. JCAHO’s ORYX initiative incorporates outcome performance measurement into the accreditation process for health care organizations. The AMA-convened Physician Consortium for Performance Improvement® (Consortium) has developed more than 00 performance measures for practicing physicians.
The Consortium includes more than 00 medical specialty and state medical societies, the Council of Medical Specialty Societies, the American Board of Medical Specialties and its member boards, experts in methodology and data collection, the Agency for Healthcare Research and Quality, and the Centers for Medicare & Medicaid Services.

Questions for the Panelists

1. How do you identify areas for which quality measures or practice standards are necessary?

2. Who is the principal audience for your evidence-based measures/standards? What factors promote your credibility and establish you as a trusted source of information for these groups?

3. How do you respond to pressing demands for measures/standards when the relevant body of evidence is insufficient? Approximately how often do you make recommendations in the absence of sufficient data in these cases?

4. Do you have a mechanism for influencing research to produce needed evidence? What are your thoughts about the emerging practice of coverage with evidence development?

5. How would you characterize the available evidence on the clinical questions you address? Does it sufficiently cover services and populations of interest? What are the critical gaps (if any)?

6. Do you consider observational and other nonrandomized data on clinical effectiveness? If yes, what are the parameters for their use?

7. What resources does your organization use to identify, assess, and incorporate evidence syntheses in your clinical quality measures/standards (e.g., staff time, special committee responsibility, conferences)?

INSTITUTE OF MEDICINE WORKSHOP ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES

JANUARY 25, 2007

PANEL 3—APPROACHES TO PRIORITY SETTING: IDENTIFYING TOPICS AND SELECTION CRITERIA

Moderator: Dana Goldman

Panelists: Richard Justman (UnitedHealthcare), Kay Dickersin (Cochrane USA), Jean Slutsky (AHRQ Effective Health Care Program), Naomi Aronson (BCBSA TEC)

The objective of this panel discussion is to learn how these leading organizations prioritize their efforts to conduct systematic reviews on clinical effectiveness. UnitedHealthcare is one of the nation’s largest health plans, with an estimated  million covered lives. The Cochrane Collaboration is an international, not-for-profit organization that produces and disseminates systematic reviews of health care interventions. AHRQ’s Effective Health Care Program is the leading federal agency charged with systematically reviewing and synthesizing evidence on clinical effectiveness. BCBSA TEC, an AHRQ-designated Evidence-based Practice Center, is a highly respected source of evidence-based assessments of the clinical effectiveness of medical procedures, devices, and drugs.

Questions for the Panelists

1. How do you identify topics? Please provide a detailed outline of your approach, using, for example, last year’s topics.

2. How would you characterize the yield from your efforts to identify topics? Does it capture a broad spectrum of services? What about surgical procedures? Existing services? Behavioral health? Disease management? Children’s health?

3. What are your criteria for selecting topics, and how are the criteria implemented (e.g., through a formal process and quantitative method)?

4. Do you have a mechanism for picking up missed but important topics? How often (in retrospect) has your horizon scanning failed to identify a key service? Consider the past two years in answering this question.

5. How do you rank priorities? Please be specific as to the criteria used. How many of the priority topics are addressed each year?

6. What factors, if any, can override already determined priorities?

7. Has this approach worked well given your objectives? What are the strengths and weaknesses of the process?

8. What resources are involved in this activity (e.g., staff time, special committee responsibility, conferences)?

INSTITUTE OF MEDICINE WORKSHOP ON REVIEWING EVIDENCE TO IDENTIFY HIGHLY EFFECTIVE CLINICAL SERVICES

JANUARY 25, 2007

PANEL 4—STAKEHOLDERS FORUM

Moderator: Robert Galvin (General Electric)

Panelists: Kathy Buto (J&J), Vivian Coates (ECRI), Art Small (Genentech), James Weinstein (Dept. of Orthopedics, Dartmouth-Hitchcock Medical Center)

The objective of this panel discussion is to learn key stakeholders’ views on how highly effective clinical services are identified. Johnson & Johnson is one of the world’s largest manufacturers of medical devices, drugs, and equipment. ECRI, an AHRQ-designated Evidence-based Practice Center, is a highly respected source of evidence-based assessments of the clinical effectiveness of medical procedures, devices, and drugs. Genentech is one of the world’s leading biotech companies. Dartmouth-Hitchcock’s department of orthopedics is the primary site of a -year, $ million trial comparing surgical to nonsurgical treatments for certain back problems.

Questions for the Panelists

This IOM Committee has been charged with recommending an approach to identifying highly effective clinical services across the spectrum of care—from prevention, diagnosis, treatment, and rehabilitation to end-of-life care and palliation. In light of this charge and from the perspective of your organization, please answer the following:

1. How do you think that priorities should be set for services that need evidence development or synthesis?

2. What is your organization’s current role in the development, use, and analysis of evidence on the clinical effectiveness of health care services (including drugs, devices, procedures, and other methods used to promote health or rehabilitation)?

3. Several groups and individuals—perhaps most recently Gail Wilensky in a Health Affairs piece1—have proposed the establishment of a sizable entity to effect a quantum leap in the national capacity to assess the comparative effectiveness of health care services. How might such a venture provide benefits, what would be the key concerns, and what would be the implications for your organization?

4. The U.S. Preventive Services Task Force evaluates evidence and develops recommendations for clinical preventive services.2 How would your organization respond to the formation of a similar task force that provided the same function for clinical interventions, e.g., diagnostic testing, treatment, etc.?

1 Wilensky, G. 2006. Health Affairs Web Exclusive w572-w585. Bethesda, MD: Project Hope.

2 Harris, R. P., M. Helfand, S. H. Woolf, K. N. Lohr, C. D. Mulrow, S. M. Teutsch, and D. Atkins. 2001. Current methods of the U.S. Preventive Services Task Force: A review of the process. American Journal of Preventive Medicine 20(3 Suppl):21-35.
