2 Ethically Leveraging Digital Technology for Health
Pages 11-34

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 11...
... (Mello) • Many of the ethical concerns associated with emerging digital technologies cannot be adequately addressed within the existing regulatory system, and efforts to address them should take into account different views on data privacy and intergenerational shifts in privacy perceptions.
From page 12...
... • Data scientists and digital technology developers operate under a very different set of cultural norms, ethical commitments, and incentive structures than those of biomedical research and health care practice. (Estrin, Mello, Ossorio)
From page 13...
... The increasing ability of machine learning algorithms to interpret the data collected by wearables is enhancing the utility of those data for individuals in self-care decision making as well as for use in guiding clinical care and informing research. For example, Estrin suggested, a wearable might help an individual better understand how exercise, diet, and alcohol consumption contribute to his or her poor sleep patterns; the clinician might use the data to evaluate the effectiveness of interventions to reduce the impacts of poor sleep quality on cognition or metabolism; and the data can help inform research on interventions to improve sleep quality.
From page 14...
... • Access to Clinical Health Records -- Mobile apps are also used to provide individuals with access to their clinical health records. Estrin said that Apple HealthKit and Android CommonHealth are developer platforms that take advantage of data interoperability standards, such as Fast Healthcare Interoperability Resources, to provide access to electronic health records (EHRs)
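As a rough illustration of how an app can read clinical records through a FHIR interface, the sketch below shows a plain FHIR REST call in Python. This is not the HealthKit or CommonHealth API itself, and the base URL, patient ID, and access token are hypothetical placeholders.

```python
# Minimal sketch of reading an EHR record over a FHIR REST API.
# Illustrative only: the endpoint, patient ID, and token are hypothetical.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # hypothetical FHIR R4 endpoint
PATIENT_ID = "12345"                          # hypothetical patient identifier
TOKEN = "replace-with-oauth2-access-token"    # obtained via an authorization flow

# FHIR resources are plain JSON retrieved with ordinary HTTP GET requests.
resp = requests.get(
    f"{FHIR_BASE}/Patient/{PATIENT_ID}",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/fhir+json",
    },
)
resp.raise_for_status()
patient = resp.json()

# A Patient resource carries demographics; other resources (Observation,
# MedicationRequest, etc.) carry the clinical data an app would display.
print(patient.get("resourceType"), patient.get("id"))
```

In practice, consumer apps typically obtain authorization through SMART on FHIR (an OAuth 2.0 profile) managed by the platform rather than handling raw tokens as shown here.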
From page 15...
...

Risks and Concerns Related to Digital Technologies

Potential ethical risks and concerns associated with the use of digital technologies in research and clinical care include privacy exposure when these technologies are used for health-related surveillance, data use, and transparency around AI-assisted agents, Estrin said. How the data should be controlled depends on the context of use, she explained, and laws and system architectures addressing how data are shared for surveillance need to take that context into account.
From page 16...
... These concerns now encompass data generated by digital technologies, including whether such data can be shared or sold for research purposes. The digital data of interest for research might include data from user interactions with apps and websites and clinical data generated by digital technologies in the care setting (e.g., ambient listening devices such as surgical black boxes)
From page 17...
... • The ethical adolescence of data science -- Although training and acculturation in science and medicine convey a strong, clear set of ethical norms and sense of professionalism, this is not yet the case for computer scientists, yet digital technology companies enjoy a high degree of autonomy, with few external ethical controls.

SOURCE: Michelle Mello workshop presentation, February 26, 2020.
From page 18...
... A current example is health care organizations transferring large volumes of EHR data to technology companies for use in developing commercial products and services. Addressing potential context transgressions has generally involved clearly disclosing that individuals do not have any rights to a share of the ...

7 For more information, see https://www.hopkinsmedicine.org/henriettalacks/upholding-the-highest-bioethical-standards.html (accessed April 20, 2020)
From page 19...
... Furthermore, digital technology companies have developed sufficient analytic capacity that they no longer need to interact with academic biomedical researchers for anything except to acquire patient data. The need for that interaction is also declining since digital product developers can often obtain health information directly from consumers or from direct-to-consumer companies.
From page 20...
...

Emerging Issues for Digital Technologies

The Scale of Data Collection

Mobile devices, ambient listening devices, and other passive data-collection technologies have the capability to collect vast amounts of data with minimal cost and effort, Mello said. There are benefits to this scale of data collection, but there are also concerns.
From page 21...
... In addition, efforts to address these concerns need to engage people of younger generations and to take into consideration their perspectives on privacy and tradeoffs.

USING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING IN RESEARCH AND CLINICAL CARE

Artificial Intelligence, Machine Learning, and Bias

The future of AI, Saria said, is in augmenting health care providers' capabilities to more efficiently offer higher-quality care.
From page 22...
...

Evolving Health Care Practice

Provider practice patterns evolve over time, Saria said, and if predictive algorithms are not robust to this type of dataset shift, this can lead to false alerts. As an example, an algorithm for the early detection of sepsis based its predictions on the laboratory tests being ordered by providers and, in ...

10 Dataset shift is a condition that occurs when data inputs and outputs differ between the training and testing stages.
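The footnoted definition of dataset shift can be made concrete with a simple input-distribution check. The sketch below is illustrative and not from the workshop; the feature (a lab-test ordering rate), sample sizes, and alert threshold are hypothetical. It flags a statistically significant difference between the training-period and deployment-period distributions of a single model input.

```python
# Illustrative dataset-shift check on one model input (hypothetical values).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Hypothetical feature: daily rate of a lab test ordered per 100 patients.
train_rate = rng.normal(loc=12.0, scale=2.0, size=500)    # training period
deploy_rate = rng.normal(loc=18.0, scale=2.5, size=500)   # after a practice change

# Two-sample Kolmogorov-Smirnov test: do the two samples share a distribution?
result = ks_2samp(train_rate, deploy_rate)

ALPHA = 0.01  # hypothetical alert threshold
if result.pvalue < ALPHA:
    print(f"Possible dataset shift (KS={result.statistic:.2f}, p={result.pvalue:.2g}): "
          "recalibrate or retrain before trusting the model's alerts.")
else:
    print("No significant shift detected in this input.")
```

A check like this only monitors the model's inputs; when provider practice changes the relationship between inputs and outcomes, as in the sepsis example, the model would additionally need recalibration or retraining against fresh labels.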
From page 23...
... However, the study found that the high users of health care identified by the algorithm tended to be white, with black individuals using health care less frequently. This resulted in health systems unknowingly offering more care to those already accessing care and thereby further widening the disparities in care.
From page 24...
...

ETHICAL ISSUES IN MACHINE LEARNING AND DEEP LEARNING NEURAL NETWORKS

Sharing Health Care Data

Many of the ethical issues associated with machine learning involve concerns about data sharing, Ossorio said.
From page 25...
...

Developing and Implementing Responsible Artificial Intelligence for Health Care

Based on her experience, Ossorio said that many of the companies developing AI do not fully understand the scope or context of health care data. For example, in the case of a machine learning algorithm to aid in the interpretation of clinical laboratory test results, to improve that algorithm after deployment one would need data about how the clinical laboratory is using that test as well as patient clinical outcomes data.
From page 26...
...

DISCUSSION

Ethics Training in Data Science and Artificial Intelligence

Are there efforts, Lo asked, to incorporate a discussion of ethical issues into the training of data scientists and AI researchers? Individuals in data science have learned norms and behaviors in the context of the companies they work for and the incentive structures they are presented with, Estrin said, and these do not translate to the health care context.
From page 27...
... Mello said a given field will go through three stages of ethical maturation: recognizing that there are ethical issues, developing a framework for solving those problems, and gaining traction and leadership buy-in so that those who are trained in ethics are supported in taking ethical actions. The field of data science is currently in the first stage, she said, and is just beginning to enter the second.
From page 28...
...

Allowing Patients to Consent or Opt Out of Data Sharing

Should patients be able to opt out of the sharing of their health care data for secondary purposes, Lo asked, and how might that impact the datasets and the ability of researchers to develop and validate digital technologies? Individuals should be given the choice to opt out of data sharing, Saria said, adding that having some patients choose to not share data should not create technical problems for researchers.
From page 29...
... In many cases it is difficult to avoid a free technology or service because it has become part of the digital technology infrastructure, and in a capitalist economy consumers cannot vote with their purchasing power when something is already free. Unlike patients in integrated health systems, many people do not have the ability to transfer their health data from one provider to another, Estrin noted.
From page 30...
... There are similar examples in medicine of existing structural inequalities being perpetuated by algorithms, Roberts continued, such as the study of the Optum algorithm discussed by Saria. In that case, an algorithm designed to identify high-risk users of health care in need of additional services was trained using payment data.
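The mechanism Roberts and Saria describe, using payment data as a proxy label for health need, can be illustrated with a toy simulation. The sketch below is not the Optum algorithm or its data; the group labels, need distribution, and 40 percent spending gap are invented assumptions used only to show how ranking patients by spending under-selects a group that spends less at the same level of need.

```python
# Toy illustration of proxy-label bias (not the actual study or its data).
# Assumption: two groups have the same distribution of health need, but one
# group incurs 40% less spending at equal need because of unequal access.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, size=n)               # two hypothetical groups, 0 and 1
need = rng.gamma(shape=2.0, scale=1.0, size=n)   # same need distribution for both

# Hypothetical access gap: group 1 generates less spending at the same need.
spending = need * np.where(group == 1, 0.6, 1.0) + rng.normal(0.0, 0.1, size=n)

def share_of_group1(scores, top_frac=0.03):
    """Fraction of group 1 among the top `top_frac` of patients ranked by `scores`."""
    cutoff = np.quantile(scores, 1.0 - top_frac)
    return group[scores >= cutoff].mean()

print("Group 1 share when ranked by need:    ", round(share_of_group1(need), 2))
print("Group 1 share when ranked by spending:", round(share_of_group1(spending), 2))
```

Under these assumptions, ranking by need selects the two groups roughly equally, while ranking by spending selects far fewer patients from the lower-spending group, mirroring the disparity described above.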
From page 31...
... This was the next question Lo posed.

Views on Health Data Sharing and Privacy

Research is needed to better understand how patients would respond if given the choice to opt out of having their clinical data shared with digital technology companies, said Benjamin Wilfond, the director of the Treuman Katz Center for Pediatric Bioethics at Seattle Children's Hospital and Research Institute and a professor in and the chief of the Division of Bioethics and Palliative Care in the Department of Pediatrics at the University of Washington School of Medicine.
From page 32...
... Developing ethics training programs for computer scientists and educational materials for consumers should not be difficult, Mello said; the challenge is gaining and holding the attention of consumers who are already bombarded with opportunities to consider information and make decisions about data sharing. Ossorio said that an educational approach being developed at Duke provides information about an algorithm in the form of prescribing information (e.g., recommended use, contraindications)
From page 33...
... Potential ethical issues need to be addressed up front, Mello said, before digital technologies are released for use, while Estrin underscored the need to understand the incentive structures that currently drive digital technology development and deployment.

