Proceedings of a Workshop
Pages 1-64



From page 1...
... To enable these molecular tumor characterizations to effectively and safely inform cancer care, the cancer community is working to develop and validate multiparameter omics tests and imaging tests as well as software and computational methods for interpretation of the resulting datasets. To examine opportunities to improve cancer diagnosis and care in the new precision oncology era, the National Cancer Policy Forum developed a two-workshop series.
From page 2...
... The workshop included presentations and panel discussions on the current state of computational precision oncology and its opportunities, challenges, and limitations. Topics explored included • Data quality, completeness, sharing, and privacy; • Preclinical and clinical validation of the reliability, safety, and effectiveness of diagnostic tests and clinical decision support tools; • Regulatory oversight and reimbursement; • Communication of omics findings to clinicians and patients; and • Lessons from the use of computational precision oncology in clinical practice.
From page 3...
... Addressing Regulatory Oversight of Computational Precision Oncology • Increase federal regulation of computational precision oncology. (Butte, Shah)
From page 4...
... (Levy, McLeod) • Enhance training for clinicians to help them understand, evaluate, and implement computational precision medicine tools.
From page 5...
... Creating Data Standards for Computational Precision Oncology • Adopt policies to ensure transparency about data reliability, quality, and completeness. (Abernethy)
From page 6...
... (Levy, McGraw) Improving Data Security •  Implement standards for data sharing to protect patient privacy.
From page 7...
... "You are a much better radiologist when you work in a multidisciplinary team and can issue clinically relevant reports," she noted. Improving and Increasing the Use of Clinical Decision Support Clinical decision support tools should be efficiently embedded into a clinician's workflow and should provide brief, actionable, and continued
From page 8...
... Cogle stressed that the quality of computational modeling output depends on the quality and completeness of the data used. McShane agreed and added that there is a misconception that powerful computational methods can compensate for poor-quality data.
From page 9...
... Howard McLeod, medical director of the DeBartolo Family Personalized Medicine Institute, said genetic testing of a patient's tumor and germline genome will be pivotal to the success of precision medicine. He noted that genetic information provides diagnostic, prognostic, treatment, and toxicity risk information, and can also suggest appropriate clinical trials for patients.
From page 10...
... We need computational methods to drive us in the direction of making more subtle clinical decisions." Hricak stressed, "Only by proper predictive modeling are we going to have personalized treatment." Levy added, "We're at this precipice of change in how we think about clinical decision support in medicine, which has evolved from the evidence-driven paradigm, to protocol-driven care, to data-driven approaches." Computational Technologies Machine Learning Pratik Shah, principal research scientist and a principal investigator at the Massachusetts Institute of Technology (MIT) Media Lab who leads the Health 0.0 research program, classified computational technologies used in precision oncology in a hierarchy of three major types.
From page 11...
... that makes certain diagnostic features more prominent and facilitates cancer diagnosis. Shah and colleagues used a machine learning program to detect those features in tissue samples without staining, and then digitally altered the images akin to what a physical H&E stain would do when applied to the tissue slices.
From page 12...
... . Giovanni Parmigiani, associate director for population sciences at the Dana-Farber/Harvard Cancer Center, noted that machine learning and genetic tests could identify primary care patients at high risk for cancer and might motivate behavior change that could prevent cancer.
From page 13...
... TRANSLATION CHALLENGES Several participants described challenges in translating computational technologies for clinical use, including ensuring data quality and completeness, identifying methods to validate novel computational methods, ensuring appropriate regulatory oversight, communicating results and potential risks to patients, and achieving appropriate reimbursement under patients' insurance plans. Data Quality and Completeness Reliable algorithms depend on reliable data, stressed Amy Abernethy, chief medical officer, scientific officer, and senior vice president of oncology at Flatiron Health.4 She noted that many factors affect the reliability of data used to train an algorithm (i.e., completeness, quality, diversity, relevancy, 4 In February 2019, Dr.
From page 14...
... Abernethy stressed that potential data for precision oncology should be assessed against each of these requirements before they are used in any clinical or regulatory context, and added that inadequate data quality will result in poorly performing algorithms. McShane agreed, and highlighted the importance of involving individuals with appropriate expertise to make assessments regarding whether data quality is sufficient to be used in the development of algorithms.
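The assessment Abernethy describes can be made concrete with a small sketch. This is a minimal, hypothetical example (the field names, records, and 80 percent threshold are illustrative choices, not from the workshop): a per-field completeness report that flags fields too sparse to support algorithm training.

```python
# Minimal sketch, assuming hypothetical field names and an illustrative
# 80% completeness threshold: screen records before model training.
records = [
    {"stage": "II", "biomarker": 0.8, "outcome": "CR"},
    {"stage": None, "biomarker": 0.4, "outcome": "PD"},
    {"stage": "IV", "biomarker": None, "outcome": None},
]

required = ["stage", "biomarker", "outcome"]

def completeness(records, fields):
    """Fraction of records with a non-missing value, per field."""
    n = len(records)
    return {f: sum(r[f] is not None for r in records) / n for f in fields}

report = completeness(records, required)
# Fields falling below the threshold are flagged before any training.
flagged = [f for f, frac in report.items() if frac < 0.8]
print(report, flagged)
```

A real assessment would also cover the other dimensions Abernethy names, such as quality, diversity, and relevancy, which require more than a missing-value count.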
From page 15...
... SOURCES: Abernethy presentation, October 29, 2018; Daniel et al., 2018. Risk of Bias Several participants stressed that inadequately representative datasets that do not include diverse populations lead to the creation of invalid and biased algorithms.
From page 16...
... . Ferryman noted that since the passage of the 1993 Revitalization Act, which aimed to increase diversity in clinical trials, less than 2 percent of the more than 10,000 cancer clinical trials funded by the NCI included enough minority participants to meet the goals of the National Institutes of Health (NIH)
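The concern about unrepresentative datasets implies a simple audit developers can run: compare a model's accuracy across population subgroups. The sketch below uses simulated data; the group labels, accuracy rates, and 5-point tolerance are all assumptions for illustration, not figures from the workshop.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-patient prediction results for two subgroups.
# Group B is under-represented, so we simulate a lower accuracy there.
group = np.array(["A"] * 800 + ["B"] * 200)
correct = np.concatenate([
    rng.random(800) < 0.90,  # ~90% of majority-group predictions correct
    rng.random(200) < 0.70,  # ~70% for the under-represented group
])

def subgroup_accuracy(groups, outcomes):
    """Accuracy computed separately for each subgroup label."""
    return {g: outcomes[groups == g].mean() for g in np.unique(groups)}

acc = subgroup_accuracy(group, correct)
gap = abs(acc["A"] - acc["B"])

# Flag the model for review if the accuracy gap exceeds a tolerance
# (the 0.05 threshold here is an arbitrary illustrative choice).
flagged = gap > 0.05
print(acc, "gap:", round(gap, 3), "flagged:", flagged)
```

An audit like this only detects a disparity; fixing it requires collecting more representative training data, as the participants note.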
From page 17...
... One challenge for validation, identified by Gatsonis, is the "moving target" nature of machine learning algorithms. "The software evolves constantly so there's a big moving target problem," he said, raising the question "At what point do you evaluate such an algorithm?
From page 18...
... He noted, "If you are training a classifier using a machine learning technique, the ability of that classifier to do well on a set-aside piece of the original dataset is going to be way too optimistic in predicting what is going to happen when you take it elsewhere. That is well established." He added that "if you teach your classifiers on multiple datasets, they will not only learn about what predicts patient outcome, but we will also learn about how that will vary from one context to another and we will only retain the features that are more stable across multiple studies." Parmigiani continued, "The ability to take classifiers trained in one context and take them to different contexts is an essential part of the translational process that goes from machine learning to bedside," adding, "This is one of the important lessons the clinical community has been learning often the hard way over the past few years." He also noted that the datasets used to train algorithms for health care are often much smaller than those used to develop other applications, and thus the predictions made by these machine learning systems may be less reliable.
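Parmigiani's warning that a set-aside piece of the original dataset is "way too optimistic" can be illustrated numerically. The sketch below is hypothetical: it uses simulated data and a deliberately simple nearest-centroid classifier (not any method discussed at the workshop), trains on one cohort, and shows accuracy dropping on a simulated external cohort whose distribution has shifted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical biomarker data: two classes drawn from shifted Gaussians.
n, d = 500, 10
X = np.vstack([rng.normal(0.0, 1.0, (n, d)),   # class 0
               rng.normal(0.8, 1.0, (n, d))])  # class 1
y = np.array([0] * n + [1] * n)

# Shuffle, then split into a training set and an internal holdout.
idx = rng.permutation(2 * n)
X, y = X[idx], y[idx]
X_tr, y_tr, X_in, y_in = X[:700], y[:700], X[700:], y[700:]

# A deliberately simple nearest-centroid classifier.
c0 = X_tr[y_tr == 0].mean(axis=0)
c1 = X_tr[y_tr == 1].mean(axis=0)

def predict(Z):
    d0 = ((Z - c0) ** 2).sum(axis=1)
    d1 = ((Z - c1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

internal_acc = (predict(X_in) == y_in).mean()

# Simulated "external" cohort with a distribution shift: the class
# means sit closer together (e.g., a different assay calibration),
# so performance degrades relative to the internal holdout.
X_ext = np.vstack([rng.normal(0.2, 1.0, (n, d)),
                   rng.normal(0.6, 1.0, (n, d))])
y_ext = np.array([0] * n + [1] * n)
external_acc = (predict(X_ext) == y_ext).mean()

print(f"internal holdout: {internal_acc:.2f}, external cohort: {external_acc:.2f}")
```

In practice the external cohort would be a genuinely independent dataset; the point of the sketch is that the internal holdout estimate overstates how well the classifier transports.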
From page 19...
... Lockdown refers to holding the test completely fixed when conducting the validation, including all steps in the data preprocessing and prediction algorithm, and the computer code used to create it.
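One way to operationalize a lockdown in software, offered here as an illustrative sketch rather than a method described at the workshop, is to serialize every component of the pipeline and record a checksum; validation then refuses to proceed if any component has changed.

```python
import hashlib
import pickle

# Hypothetical frozen pipeline: preprocessing parameters learned on the
# training data plus model coefficients. A real lockdown would cover
# every preprocessing step and pin the exact code version as well.
pipeline = {
    "normalize_mean": [0.12, -0.03, 1.4],
    "normalize_std": [1.0, 0.9, 2.1],
    "coefficients": [0.5, -1.2, 0.8],
    "intercept": -0.1,
    "code_version": "v1.0.0",
}

blob = pickle.dumps(pipeline, protocol=4)
checksum = hashlib.sha256(blob).hexdigest()

def verify_lockdown(artifact_bytes, expected_checksum):
    """Refuse to proceed with validation if the artifact changed."""
    return hashlib.sha256(artifact_bytes).hexdigest() == expected_checksum

# The unchanged artifact passes; any modification is detected.
tampered = pickle.dumps({**pipeline, "intercept": 0.0}, protocol=4)
print(verify_lockdown(blob, checksum), verify_lockdown(tampered, checksum))
```

The checksum does not make the pipeline correct; it only guarantees that the artifact being validated is byte-for-byte the one that was locked down.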
From page 20...
... Lukas Amler, senior director of the Late-Stage Oncology Biomarker Development Department at Genentech, agreed with McShane, noting, "We need very significant scale as far as data go." He suggested that clinical validation for some cancers will likely require a combination of real-world data and data from clinical trials. With regard to machine learning algorithms that are continually evolving, Gatsonis noted that there is no known way to guarantee that these algorithms will always meet a minimum level of performance.
From page 21...
... You cannot make a transparent treatment decision on the basis of a piece of information that is not transparent to you." Gatsonis stressed that algorithms are decision support tools, and ultimately the clinician retains responsibility for making the appropriate diagnosis or treatment selection. Cogle reinforced this notion, stating, "If a medical oncologist is legally responsible for using the data in an app, we want to make sure we understand how the app or the computational system came to the conclusion it did." Clinicians need to be able to demonstrate the face validity of decision support algorithms to justify their prescribing choices to insurers, he said.
From page 22...
... With computational technologies, even the best of clinical and biologic researchers may have no idea what is going on in the computer and do not know how to look over the shoulders of their colleagues who do have this expertise." He said in one online survey of 1,576 researchers, nearly 90 percent of respondents agreed that reproducibility of research findings could be improved by having a better understanding of statistics (Baker, 2016)
From page 23...
... " FDA has multiple regulations and standards relevant to omics tests, algorithms, and decision support tools, including regulation of digital data quality, performance standards for diagnostic tests, and regulation of devices, as described below. Digital Data Standards FDA regularly receives outcomes data from sponsors from registration of clinical trials as well as from postmarketing surveillance.
From page 24...
... Class II: General controls; special controls; premarket notification (510(k)) or de novo. Class III (highest risk): General controls; premarket approval.
From page 25...
... . This software (e.g., machine learning algorithms used to diagnose or monitor disease)
From page 26...
... . Petrick noted that FDA has received substantial input regarding machine learning and AI tools, especially related to imaging, but their approach to regulating these technologies is still evolving.
From page 27...
... FIGURE 2  Risk-based approach to assessing importance of independent review. NOTES: SaMD = Software as a Medical Device.
From page 28...
... Schilsky noted that clinicians want assurance that an algorithm can help them make better decisions for their specific clinical context. Schilsky asked if software would be subject to regulatory oversight if it was used as a clinical decision support tool to select drug treatments for a cancer patient based on the results of a multiplex genomic test.
From page 29...
... He also stressed that the performance of a single marker, such as CT lesion volume assessment, should be consistent when the same tool is used at different clinical sites and incorporated into devices made by different companies. Petrick noted that technical assessment of quantitative imaging devices includes evaluation of accuracy and precision.
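The accuracy-and-precision assessment Petrick describes can be sketched with simulated measurements. The phantom volumes, 3 percent bias, and 2 percent measurement noise below are hypothetical values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical repeated lesion-volume measurements (mm^3) of phantoms
# with known true volumes, 10 repeats each, as in a bench assessment.
true_volume = np.repeat([500.0, 1000.0, 2000.0], 10)
measured = true_volume * (1 + rng.normal(0.03, 0.02, true_volume.size))

# Accuracy: mean percent bias of measurements relative to truth.
percent_error = 100 * (measured - true_volume) / true_volume
bias = percent_error.mean()

# Precision: spread of the percent error across repeated measurements
# (a simplified stand-in for a formal repeatability coefficient).
precision_sd = percent_error.std(ddof=1)

print(f"bias = {bias:.2f}%, precision SD = {precision_sd:.2f}%")
```

Consistency across sites and vendors, which Petrick also raises, would require repeating this assessment with the same phantoms on each site's implementation of the tool.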
From page 30...
... However, most organizations prefer to de-identify data to satisfy privacy requirements. McGraw explained that there is also a trend for regulators to enable 9 See https://aspe.hhs.gov/report/studies-welfare-populations-data-collection-andresearch-issues/fair-information-practices (accessed January 7, 2019)
From page 31...
... There is some relaxation of individual rights provisions when data are "pseudonymized" or "coded," but the GDPR does not define "de-identified" data. McGraw noted that GDPR uses a broader definition of personal data than the Privacy Rule10 promulgated under the Health Insurance Portability 10 The HIPAA Privacy Rule "establishes national standards to protect individuals' medical records and other personal health information and applies to health plans, health care clearinghouses, and those health care providers that conduct certain health care transactions electronically." See https://www.hhs.gov/hipaa/for-professionals/privacy/index.html (accessed March 29, 2019)
From page 32...
... McGraw noted that CCPA provides some exemptions for health care entities, including for limited types of data collected as part of a clinical trial, although this is a narrow exemption that does not apply broadly to medical research. McGraw also noted that protected health information (PHI)
From page 33...
... Data portability:
• GDPR: Right to receive personal data in a structured, commonly used, and machine-readable format, and the right to transmit that data to another controller without hindrance, where processing is based on consent and is carried out by automated means.
• HIPAA:a Right to a digital copy of information maintained digitally; right to a copy in the form and format requested if reproducible in that form/format.
a Health Insurance Portability and Accountability Act of 1996. SOURCES: McGraw presentation, October 29, 2018; European Commission, 2016; Federal Register, 2003; U.S.
From page 34...
... Joseph described a study on communication of genetic breast cancer risk in which she and colleagues identified "a profound mismatch between what the genetic counselors talked about and what the women actually wanted to know about" (Joseph et al., 2017)
From page 35...
... Joseph concluded by noting that equal access to advanced therapies and technologies is not sufficient to ensure health care equity -- effective communication is necessary for the ethical implementation of precision oncology. When considering how to broadly integrate precision oncology into clinical practice, the health care community must develop communication strategies to ensure that information is interpretable by all patients and clinicians.
From page 36...
... , suggested that clinical data from the VA could be used to conduct a retrospective proof-ofconcept study that would approximate the clinical trial suggested by Newcomer. Newcomer noted that the complexity of genomic sequencing tests also presents a reimbursement challenge for precision oncology.
From page 37...
... Newcomer responded that a value-based reimbursement model would incentivize precision oncology as a method of identifying effective therapies, and would circumvent the need to find payment within a fee-for-service paradigm. He said that bundled care could provide financial support for these computational tools "because if the tools you use lead to more efficient care, the decision support algorithms would be paid for by the shared savings component." Newcomer also suggested that claims data could be used for post-approval surveillance.
From page 38...
... " i McLeod noted that a retrospective study of patients with advanced non-small cell lung cancer identified no statistically significant difference in 12-month mortality between patients who underwent broad-based genomic sequencing and those who underwent routine genetic testing (Presley et al., 2018) , although a difference in survival was observed at cancer centers that offered access to diverse clinical trials.
From page 39...
... Levy suggested conducting pragmatic studies to assess the impact of decision support tools, and noted that she is currently conducting a randomized, prospective study of a tool used to match cancer patients to clinical trials. Weichold also suggested designing more pragmatic and adaptive clinical trials in order to bridge the gap between clinical research and real-world clinical care.
From page 40...
... Shah agreed, adding that prospective adaptive clinical trials can be used to enrich patient populations with rare genetic variants. Parmigiani added, "The general concept is that a substantially increased degree of adaptivity is what is needed to come to much smaller patient strata that are much more homogeneous and where the level of prediction is much more accurate." EXAMPLES OF CARE DELIVERY MODELS FOR COMPUTATIONAL PRECISION MEDICINE Several participants discussed care delivery models for computational precision medicine, including models that have been implemented at the Moffitt Cancer Center, Intermountain Healthcare, the VA, the University of California, and the Vanderbilt–Ingram Cancer Center.
From page 41...
... Payers are interested in data and want to make good decisions, so they can be good partners on this." Intermountain Healthcare Precision Oncology Program Nadauld reported on the Intermountain Healthcare system for precision oncology decision support. Intermountain is composed of 23 hospitals, 180 medical group clinics, and nearly 1 million patients insured by its health plan.
From page 42...
... SOURCE: Nadauld presentation, October 30, 2018. Used with permission from Intermountain Healthcare.
From page 43...
... He said the VA does not have many clinical trials in which to enroll these patients, and is currently trying to increase the opportunities to offer off-label treatments in clinical trials by collaborating with the NCI and other partners. Kelley noted that in addition to genomic testing, the VA employs an ondemand and patient-specific electronic consult service that is facilitated by its unified EHR system.
From page 44...
... University of California Butte reported on the University of California's health data research effort, funded by the Chan Zuckerberg Initiative, that is working to integrate precision medicine in clinical care. He said that all 10 University of California campuses are partnered with UnitedHealth Group, with the plan to combine the campuses into a single accountable care organization over the next decade.
From page 45...
... FIGURE 5  Infrastructure of the Center for Data-Driven Insights and Innovation. 45 SOURCE: Butte presentation, October 29, 2018.
From page 46...
... Financial Support Several participants discussed the need for institutional financial support in the implementation of computational precision oncology, including funds for establishing and maintaining databases, conducting validation studies, and developing health care systems that can seamlessly integrate precision oncology data. McShane noted that although the NCI has worked to establish databases for precision oncology, the effort has been complicated by the lack of resources available for the creation and maintenance of these databases.
From page 47...
... Patient-Centered and Clinician-Friendly Design Workshop participants discussed the importance of designing computational precision oncology applications that are user friendly for both clinicians and patients. Noting that the burden posed by EHRs contributes to clinician burnout, Levy stated that it is important to design precision oncology systems to fit into the workflow of clinical care.
From page 48...
... He suggested that patients could be engaged in the design and implementation of computational precision oncology tools to ensure they address outcomes important to the patient. McLeod said one outcome important to patients is treatment toxicity, noting that it is a frequent reason for stopping therapy.
From page 49...
... Hricak agreed, saying, "That's why we call it augmented intelligence and not artificial intelligence." Standards Workshop participants suggested several ways in which greater standardization could improve the field of computational precision oncology. These suggestions included standardization across databases and system interoperability, the development of standards for validating new technologies, and the development of standards for cancer genomic testing.
From page 50...
... "Most haven't and so there are existing barriers to this," he said. A few participants emphasized the need for consistent evidence standards for data and interpretation in computational precision oncology.
From page 51...
... Amler pointed out that making genetic testing a standard of care for patients diagnosed with cancer would also provide evidence that could be used to validate diagnostics, decision support tools, and new targeted therapies. Such standards might also facilitate the entry of patients into clinical trials because many of these trials require such testing.
From page 52...
... . 19 See https://www.ncbi.nlm.nih.gov/clinvar/intro (accessed January 8, 2019)
From page 53...
... The project expects to include 100,000 cases within a few years, including 10,000 cases that will be manually annotated for linkage to clinical data. Mia Levy, director of the Rush University Cancer Center, noted that a limitation of the project is a lack of participant diversity.
From page 54...
... . b See http://sagebionetworks.org/who-we-are-s (accessed January 8, 2019)
From page 55...
... In the United States, data submitted to FDA becomes public information as soon as the product for which it was collected has obtained approval to go on the market. Depending on how certain FDA statutes are interpreted, patient data under 21 See https://bluebutton.cms.gov (accessed January 8, 2019)
From page 56...
... However, there is confusion about the meaning of "research purposes" and what types of data processing are allowed without obtaining consent. In addition, McGraw noted that CCPA provides an exemption for health data shared in clinical trials, but does not discuss other types of research.
From page 57...
... "We need to bring the machine learning community and the clinical community closer so they both talk the same language and so machine learning people can validate their tools on relevant data," Parmigiani said. Hricak agreed, stating: There is a complexity of cancer diagnosis and treatment that requires a multidisciplinary approach on both the human side and the data side.
From page 58...
... WRAP-UP Cogle identified several key points from the workshop. He said computational technologies that interpret patient data should be evaluated for clinical utility and should be subject to FDA oversight.
From page 59...
... He also stressed the importance of reproducibility of computational precision oncology findings across different datasets and contexts, and noted concerns that data used to develop decision support algorithms may not be representative of diverse populations. In addition, Cogle said that current computational precision oncology technologies often fail to consider heterogeneity within patients' tumors, and he emphasized the power in sharing data among institutions.
From page 60...
... : Laying the groundwork for improving minority clinical trial accrual: Renewing the case for enhancing minority participation in cancer clinical trials. Cancer 120(Suppl 7)
From page 61...
... https://www.fda.gov/medicaldevices/ productsandmedicalprocedures/invitrodiagnostics/laboratorydevelopedtests/default. htm (accessed January 11, 2019)
From page 62...
... 2018. Improving cancer diagnosis and care: Patient access to oncologic imaging and pathology expertise and technologies: Proceedings of a workshop.
From page 63...
... https://dam-prod.media.mit. edu/x/2018/09/06/MLHC%20paper-Pratik%20Shah.pdf (accessed January 14, 2019)

