2
The Value and Importance of Health Information Privacy

Ethical health research and privacy protections both provide valuable benefits to society. Health research is vital to improving human health and health care. Protecting patients involved in research from harm and preserving their rights is essential to ethical research. The primary justification for protecting personal privacy is to protect the interests of individuals. In contrast, the primary justification for collecting personally identifiable health information for health research is to benefit society. But it is important to stress that privacy also has value at the societal level, because it permits complex activities, including research and public health activities, to be carried out in ways that protect individuals' dignity. At the same time, health research can benefit individuals, for example, when it facilitates access to new therapies, improved diagnostics, and more effective ways to prevent illness and deliver care.

The intent of this chapter1 is to define privacy and to delineate its importance to individuals and society as a whole. The value and importance of health research will be addressed in Chapter 3.

1 Sections of this chapter were adapted from a background paper by Pritts (2008).

CONCEPTS AND VALUE OF PRIVACY

Definitions

Privacy has deep historical roots (reviewed by Pritts, 2008; Westin, 1967), but because of its complexity, privacy has proven difficult to
BEYOND THE HIPAA PRIVACY RULE

define and has been the subject of extensive, and often heated, debate by philosophers, sociologists, and legal scholars. The term "privacy" is used frequently, yet there is no universally accepted definition of the term, and confusion persists over the meaning, value, and scope of the concept of privacy. At its core, privacy is experienced on a personal level and often means different things to different people (reviewed by Lowrance, 1997; Pritts, 2008). In modern society, the term is used to denote different, but overlapping, concepts such as the right to bodily integrity or to be free from intrusive searches or surveillance. The concept of privacy is also context specific, and acquires a different meaning depending on the stated reasons for the information being gathered, the intentions of the parties involved, as well as the politics, convention, and cultural expectations (Nissenbaum, 2004; NRC, 2007b). Our report, and the Privacy Rule itself, are concerned with health informational privacy.

In the context of personal information, concepts of privacy are closely intertwined with those of confidentiality and security. However, although privacy is often used interchangeably with the terms "confidentiality" and "security," they have distinct meanings. Privacy addresses the question of who has access to personal information and under what conditions. Privacy is concerned with the collection, storage, and use of personal information, and examines whether data can be collected in the first place, as well as the justifications, if any, under which data collected for one purpose can be used for another (secondary)2 purpose. An important issue in privacy analysis is whether the individual has authorized particular uses of his or her personal information (Westin, 1967).

Confidentiality safeguards information that is gathered in the context of an intimate relationship.
It addresses the issue of how to keep information exchanged in that relationship from being disclosed to third parties (Westin, 1976). Confidentiality, for example, prevents physicians from disclosing information shared with them by a patient in the course of a physician–patient relationship. Unauthorized or inadvertent disclosures of data gained as part of an intimate relationship are breaches of confidentiality (Gostin and Hodge, 2002; NBAC, 2001).

2 The National Committee on Vital and Health Statistics has noted that the term "secondary uses" of health data is ill defined and therefore urged abandoning it in favor of precise description of each use. Consequently, the IOM committee has chosen to minimize use of the term in this report.

Security can be defined as "the procedural and technical measures required (a) to prevent unauthorized access, modification, use, and dissemination of data stored or processed in a computer system, (b) to prevent any deliberate denial of service, and (c) to protect the system in its entirety from physical harm" (Turn and Ware, 1976). Security helps keep health
records safe from unauthorized use. When someone hacks into a computer system, there is a breach of security (and also potentially, a breach of confidentiality). No security measure, however, can prevent invasion of privacy by those who have authority to access the record (Gostin, 1995).

The Importance of Privacy

There are a variety of reasons for placing a high value on protecting the privacy, confidentiality, and security of health information (reviewed by Pritts, 2008). Some theorists depict privacy as a basic human good or right with intrinsic value (Fried, 1968; Moore, 2005; NRC, 2007a; Terry and Francis, 2007). They see privacy as being objectively valuable in itself, as an essential component of human well-being. They believe that respecting privacy (and autonomy) is a form of recognition of the attributes that give humans their moral uniqueness.

The more common view is that privacy is valuable because it facilitates or promotes other fundamental values, including ideals of personhood (Bloustein, 1967; Gavison, 1980; Post, 2001; Solove, 2006; Taylor, 1989; Westin, 1966) such as:

• Personal autonomy (the ability to make personal decisions)
• Individuality
• Respect
• Dignity and worth as human beings

The bioethics principle nonmaleficence3 requires safeguarding personal privacy. Breaches of privacy and confidentiality not only may affect a person's dignity, but can cause harm. When personally identifiable health information, for example, is disclosed to an employer, insurer, or family member, it can result in stigma, embarrassment, and discrimination. Thus, without some assurance of privacy, people may be reluctant to provide candid and complete disclosures of sensitive information even to their physicians.
Ensuring privacy can promote more effective communication between physician and patient, which is essential for quality of care, enhanced autonomy, and preventing economic harm, embarrassment, and discrimination (Gostin, 2001; NBAC, 1999; Pritts, 2002). However, it should also be noted that perceptions of privacy vary among individuals and various groups. Data that are considered intensely private by one person may not be by others (Lowrance, 2002).

3 The ethical principle of doing no harm, based on the Hippocratic maxim, primum non nocere, first do no harm.

But privacy has value even in the absence of any embarrassment or
tangible harm. Privacy is also required for developing interpersonal relationships with others. Although some emphasize the need for privacy to establish intimate relationships (Allen, 1997), others take a broader view of privacy as being necessary to maintain a variety of social relationships (Rachels, 1975). By giving us the ability to control who knows what about us and who has access to us, privacy allows us to alter our behavior with different people so that we may maintain and control our various social relationships (Rachels, 1975). For example, people may share different information with their boss than they would with their doctor.

Most discussions on the value of privacy focus on its importance to the individual. Privacy can be seen, however, as also having value to society as a whole (Regan, 1995). Privacy furthers the existence of a free society (Gavison, 1980). For example, preserving privacy from widespread surveillance can be seen as protecting not only the individual's private sphere, but also society as a whole: Privacy contributes to the maintenance of the type of society in which we want to live (Gavison, 1980; Regan, 1995).

Privacy can foster socially beneficial activities like health research. Individuals are more likely to participate in and support research if they believe their privacy is being protected. Protecting privacy is also seen by some as enhancing data quality for research and quality improvement initiatives. When individuals avoid health care or engage in other privacy-protective behaviors, such as withholding information, inaccurate and incomplete data are entered into the health care system. These data, which are subsequently used for research, public health reporting, and outcomes analysis, carry with them the same vulnerabilities (Goldman, 1998).
The bioethics principle of respect for persons also places importance on individual autonomy, which allows individuals to make decisions for themselves, free from coercion, about matters that are important to their own well-being. U.S. society also places a high value on individual autonomy, and one way to respect persons and enhance individual autonomy is to ensure that people can make the choice about when, and whether, personal information (particularly sensitive information) can be shared with others.

Public Views of Health Information Privacy

American society places a high value on individual rights, personal choice, and a private sphere protected from intrusion. Medical records can include some of the most intimate details about a person's life. They document a patient's physical and mental health, and can include information on social behaviors, personal relationships, and financial status (Gostin and Hodge, 2002). Accordingly, surveys show that medical privacy is a major concern for many Americans, as outlined below (reviewed by Pritts, 2008;
Westin, 2007). As noted in Chapter 1, however, there are some limits to what can be learned from surveys (Tourangeau et al., 2000; Wentland, 1993; Westin, 2007). For example, how the questions and responses are worded and framed can significantly influence the results and their interpretation. Also, responses are biased when respondents self-report measures of attitudes, behavior, and feelings in such a way as to represent themselves favorably.

In a 1999 survey of consumer attitudes toward health privacy, three out of four people reported that they had significant concerns about the privacy and confidentiality of their medical records (Forrester Research, 1999). In a more recent survey, conducted in 2005 after the implementation of the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, 67 percent of respondents still said they were concerned about the privacy of their medical records, suggesting that the Privacy Rule had not effectively alleviated public concern about health privacy. Ethnic and racial minorities showed the greatest concern among the respondents. Moreover, the survey showed that many consumers were unfamiliar with the HIPAA privacy protections. Only 59 percent of respondents recalled receiving a HIPAA privacy notice, and only 27 percent believed they had more rights than they had before receiving the notice (Forrester Research, 2005). One out of eight respondents also admitted to engaging in behaviors intended to protect their privacy, even at the expense of risking dangerous health effects. These behaviors included lying to their doctors about symptoms or behaviors, refusing to provide information or providing inaccurate information, paying out of pocket for care that is covered by insurance, and avoiding care altogether (Forrester Research, 2005).
A series of polls conducted by Harris Interactive suggests, however, that the privacy of health information has improved since implementation of the Privacy Rule. Prior to its creation, a 1993 survey by Harris Interactive showed that 27 percent of Americans believed their personal medical information had been released improperly in the past 3 years. In contrast, 14 percent and 12 percent of respondents believed this had happened to them in 2005 and 2007, respectively (Harris Interactive, 2005, 2007). In the 2005 survey, about two-thirds of respondents reported having received a HIPAA privacy notice, and of these people, 67 percent said the privacy notice increased their confidence that their medical information is being handled properly (Harris Interactive, 2005).

Responses to other questions on recent public opinion polls conducted by Harris Interactive only partially corroborate these findings. In one survey, 70 percent of respondents indicated that they are generally satisfied with how their personal health information is handled with regard to privacy protections and security. Nearly 60 percent of the respondents reported that they believe the existing federal and state health privacy protection laws provide a reasonable level of privacy protection for their health information (Harris Interactive, 2005). Nonetheless, half of the respondents also believed that "[P]atients have lost all control today over how their medical records are obtained and used by organizations outside the direct patient health care such as life insurers, employers, and government health agencies." In another survey, 83 percent of respondents reported that they trust health care providers to protect the privacy and confidentiality of their personal medical records and health information (Westin, 2007). However, in that survey, 58 percent of respondents believed the privacy of personal medical records and health information is not protected well enough today by federal and state laws and organizational practices.

A number of studies suggest that the relative strength of privacy, confidentiality, and security protections can play an important role in people's concerns about privacy (reviewed by Pritts, 2008). When presented with the possibility that there would be a nationwide system of electronic medical records, one survey found 70 percent of respondents were concerned that sensitive personal medical record information might be leaked because of weak data security, 69 percent expressed concern that there could be more sharing of medical information without the patient's knowledge, and 69 percent were concerned that strong enough data security will not be installed in the new computer system.

Confidentiality is particularly important to adolescents who seek health care. When adolescents perceive that health services are not confidential, they report that they are less likely to seek care, particularly for reproductive health matters or substance abuse (Weddle and Kokotailo, 2005).
In addition, the willingness of a person to make self-disclosures necessary to mental health and substance abuse treatment may decrease as the perceived negative consequences of a breach of confidentiality increase (Petrila, 1999; Roback and Shelton, 1995; Taube and Elwork, 1990). These studies show that protecting the privacy of health information is important for ensuring that individuals seek and obtain quality care.

The potential for economic harm resulting from discrimination in health insurance and employment is also a concern for many people (reviewed by Pritts, 2008). Polls consistently show that people are most concerned about insurers and employers accessing their health information without their permission (Forrester Research, 2005; PSRA, 1999). This concern arises from fears about employer and insurer discrimination. Concerns about employer discrimination based on health information, in particular, increased 16 percent between 1999 and 2005, with 52 percent of respondents in the later survey expressing concern that their information might be seen by an employer and used to limit job opportunities (Forrester Research, 2005; PSRA, 1999). Reports alleging that major employers such as Wal-Mart base
some of their hiring decisions on the health of applicants suggest that these concerns may be justified (Greenhouse and Barbaro, 2005).

Studies show that individuals are especially concerned about genetic information being used inappropriately by their insurers and employers (reviewed by Pritts, 2008). Even health care providers appear to be affected by these concerns. In a survey of cancer-genetics specialists, more than half indicated that they would pay out of pocket rather than bill their insurance companies for genetic testing, for fear of genetic discrimination (Hudson, 2007). Although surveys do not reveal a significant percentage of individuals who have experienced such discrimination, geneticists have reported that approximately 550 individuals were refused employment, fired, or denied life insurance based on their genetic constitution (NBAC, 1999). In addition, a study in the United Kingdom suggested that life insurers in that country do not have a full grasp on the meaning of genetic information and do not assess or act in accord with the actuarial risks presented by the information (Low et al., 1998). There is, therefore, some legitimate basis to individuals' concerns about potential economic harm and the need to protect the privacy of their genetic information. Recent passage of the Genetic Information Nondiscrimination Act in the United States will hopefully begin to address some of these concerns.4

Patient Attitudes About Privacy in Health Research

Ideally, there would be empirical evidence regarding the privacy value of all the specific Privacy Rule provisions that impact researchers, but there are only limited data on this topic from the consumer/patient perspective. A few studies have attempted to examine the public's attitudes about the use of health information in research.
However, few have attempted to do so with respect to the intricacies of the protections afforded by the Privacy Rule or the Common Rule,5 which are likely not well known to the public.

4 The Genetic Information Nondiscrimination Act of 2008 establishes some protections to prevent discrimination based on a patient's genetic background.

5 The "Common Rule" is the term used by 18 federal agencies that have adopted the same regulations governing the protection of human subjects of research. See Chapter 3 for a detailed description of the rule.

6 These surveys were undertaken by a wide range of sponsors (Markle Foundation, Equifax, Institute for Health Freedom, Geneforum, Privacy Consulting Group) and a wide range of surveyors (Harris Interactive, Public Opinion Strategies, Genetics and Public Policy Center).

A review by Westin of 43 national surveys with health privacy questions fielded between 1993 and September 2007 identified 9 surveys6 with one or more questions about health research and privacy (Westin, 2007). In some, the majority of respondents were not comfortable with their health
information being provided for health research except with notice and express consent. But in others, a majority of respondents were willing to forgo notice and consent if various safeguards and specific types of research were offered. For example, a recent Harris Poll found that 63 percent of respondents would give general consent to the use of their medical records for research, as long as there were guarantees that no personally identifiable health information would be released from such studies (Harris Interactive, 2007). This is similar to the percentage of people willing to participate in a "clinical research study" (Research!America, 2007; Woolley and Propst, 2005) (see also Chapter 3). A 2006 British survey also found strong support for the use of personally identifiable information without consent for public health research and surveillance, via the National Cancer Registry (Barrett et al., 2007).

Westin noted that opinions varied in the surveys according to developments on the health care scene and with consumer privacy trends. He concluded from this review that the majority of consumers are positive about health research, and if asked in general terms, support their medical information being made available for research. However, he also noted that most of these surveys presented the choice in ways that did not articulate the key permission process, and that there was much ambiguity in who "researchers" are, what kind of "health research" is involved, and how the promised protection of personal identities would be ensured (Westin, 2007).
Reviewing the handful of detailed studies examining patient views of the use of their medical information in research through surveys, structured interviews, or focus groups, Pritts determined that a number of common themes emerge (reviewed by Pritts, 2008):

• Patients were generally very supportive of research provided safeguards are established to protect the privacy and security of their medical information (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Westin, 2007; Willison et al., 2007).
• Patients were much more comfortable with the use of anonymized data (e.g., where obvious identifiers have been removed) than fully identifiable data for research (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Whiddett et al., 2006).
• Patients were less comfortable with sharing information about "sensitive" conditions such as mental health with researchers (Damschroder et al., 2007; Robling et al., 2004).

In studies where patients were able to provide unstructured comments, they expressed concern about the potential that anonymized data would be reidentified. They were also concerned that insurers or employers or others who could discriminate against subjects could potentially access information maintained by researchers (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004). Some feared that researchers would sell information to drug companies or other third parties (Damschroder et al., 2007).

Although supportive of research, the majority of patients in these studies expressed a desire to be consulted before their information was released for research (Damschroder et al., 2007; Kass et al., 2003; Robling et al., 2004; Westin, 2007; Whiddett et al., 2006; Willison et al., 2007). Some surveys also show that even if researchers would receive no directly identifying information (e.g., name, address, and health insurance number), the majority of respondents still wanted to have some input before their medical records were disclosed (Damschroder et al., 2007; Robling et al., 2004; Willison et al., 2007). For example, in a 2005 Australian survey, 67 percent of respondents indicated they would be willing to allow their deidentified health records to be used for medical research purposes, but 81 percent wanted to be asked first (Flannery and Tokley, 2005).

Studies indicate that public support for research and willingness to share health information can vary with the purpose or type of activity being conducted (reviewed by Pritts, 2008). Studies have found there was less support for activities that were primarily for a commercial purpose, or that might be used in a manner that would not help patients (Damschroder et al., 2007; Willison et al., 2007). Some participants expressed concern that some researchers were motivated by monetary rewards and that decision makers would act out of self-interest (Damschroder et al., 2007). One recent study suggests that the biggest predictor of whether patients are willing to share their medical records with researchers is the patients' trust that their information will be kept private and confidential (Damschroder et al., 2007).
In this study, the patients who most trusted the Veterans Affairs system to keep their medical records private were more likely to accept less stringent requirements for informed consent. Thirty-four percent of veterans who participated in intensive focus groups using deliberative democracy were willing to allow researchers associated with the Veterans Health Administration to use their medical records without any procedures for patient input, subject to Institutional Review Board (IRB) approval, and another 17 percent reported that patients should have to ask for their medical records to be excluded from research studies (opt-out).

But participants in focus groups also have expressed a desire to be informed of how their health information was used for research. This desire was tied to a sense of altruism: they wanted to know that their information was useful and that they may have contributed to helping others by allowing their medical records to be used for research (Damschroder et al., 2007; Robling et al., 2004). The veterans also recommended methods to give research participants more control over how their medical records are used in research. These recommendations included requiring that participants are fully informed about how their medical records are being used
in research; providing assurances that the research being conducted will benefit fellow veterans; updating research participants about findings and ongoing research; and setting out clear and consistent consequences for anyone who violates a patient's privacy (Damschroder et al., 2007).

The recent Harris poll7 commissioned by the Institute of Medicine (IOM) committee for this study found that 8 percent of respondents had been asked to have their medical information used in research, but declined. When asked why, 30 percent indicated they were concerned about the privacy and confidentiality of their personal information, but many other reasons were also commonly cited (ranging from 5 to 24 percent of respondents), including worry that participation would be risky, painful, or unpleasant; lack of trust in the researchers; or belief that it would not help their condition or their family (Westin, 2007).

Some studies also suggest that individuals' attitudes toward the use of their medical records in research may be influenced by a person's state of health. Although the commissioned Harris Poll found that people who are in only fair health, who have a disability, or who had taken a genetic test were slightly more concerned than the public about health researchers seeing their medical records (55 percent versus 50 percent), other data suggest that people with health concerns may be more supportive of using medical records in research. For example, qualitative market research by the National Health Council showed that individuals with chronic conditions have a very favorable attitude toward the implementation of electronic personal health records (EPHRs). During the focus group discussions, participants noted that EPHRs could be very advantageous in medical research and were supportive of this use even though many had expressed concern about the privacy and confidentiality of EPHRs (Balch et al., 2005, 2006).
7 The survey was conducted online by Harris Interactive between September 11 and 18, 2007, with 2,392 respondents. The methodology for the survey is described in Appendix B.

Although the Council did not specifically ask about attitudes toward health research and privacy, these results suggest that individuals with chronic conditions may be more likely to grant researchers access to their medical records, and to place less emphasis on protecting privacy than members of the general population.

Also, a Johns Hopkins University survey of patients having, or at risk for, serious medical conditions examined these patients' attitudes about the use of their medical records in research, and compared those results to polls from the general population. Thirty-one percent of respondents stated that medical researchers should have access to their medical records without their permission if it would help to advance medical knowledge. In contrast, the recent Harris poll of the public found that 19 percent of respondents would be willing to forgo consent to use personal medical and health information, as long as the study never revealed their identity
and it was supervised by an IRB (Westin, 2007). An additional 8 percent indicated they would be willing to give general consent in advance to have personally identifiable medical or health information used in future research projects without the researchers having to contact them, and 1 percent said researchers should be free to use their personal medical and health information without their consent at all. Thus, 28 percent of respondents would be willing to grant researchers access to their medical records without giving specific consent for each research project. Thirty-eight percent believed they should be asked to consent to each research study seeking to use their personally identifiable medical or health information, and 13 percent did not want researchers to contact them or to use their personal or health information under any circumstances. However, those who preferred not to be contacted at all were actually less likely than those who would grant conditional permission to have declined participating in a research study. Notably, 20 percent of respondents were unsure how to respond to the question about notice and consent for research.

Among the 38 percent who said they wanted notice and consent, 80 percent indicated that they would want to know the purpose of the research, and 46 percent wanted to know specifically whether the research could help their health condition or those of family members. Sixty-two percent indicated that knowing about the specific research study and who would be running it would allow the respondent to decide whether to trust the researchers. A little more than half of the respondents (54 percent) said they would be worried that their personally identifiable information may be disclosed outside the study.
Among those 54 percent, three-quarters agreed with the statement "I would feel violated and my trust in the researchers betrayed." Between 39 and 67 percent were concerned about discrimination in a government program, by an employer, or in obtaining life or health insurance (Westin, 2007). However, about 70 percent of all respondents indicated that they trusted health researchers to protect the privacy and confidentiality of the medical records and health information they obtain about research participants. Furthermore, among respondents who had participated in health research, only 2 percent reported that any of their personally identifiable medical information used in a study was given to anyone outside the research staff, and half of those disclosures were actually made to other researchers or research institutions (Westin, 2007).

In summary, very limited data are available to assess the privacy value of the Privacy Rule provisions that impact researchers. Surveys indicate that the public is deeply concerned about the privacy and security of personal health information, and that the HIPAA Privacy Rule has perhaps reduced, but not eliminated, those concerns. Patients were generally very supportive of research, provided safeguards were established to protect the privacy and security of their medical information, although some surveys
indicate that a significant portion of the public would still prefer to control access to their medical records via consent, even if the information is anonymized. Studies indicate that public support for research and willingness to share health information varies with health status and the type of research conducted, and depends on the patients' trust that their information will be kept private and confidential. An understanding of the public's attitude toward privacy is important throughout the rest of this report, because many of the IOM committee's recommendations affect the nature of the privacy protections afforded by the federal health research regulations.

HISTORICAL DEVELOPMENT OF LEGAL PROTECTIONS OF HEALTH INFORMATION PRIVACY

The medical community has long recognized the importance of protecting privacy in maintaining public trust in doctors and researchers, and codes of medical ethics reflect a desire to increase this public trust. Since the time of Hippocrates, physicians have pledged to keep information about their patients private and confidential (Feld and Feld, 2005). The Hippocratic Oath states, "What I may see or hear in the course of the treatment or even outside of the treatment in regard to the life of men, which on no account one must spread abroad, I will keep to myself. . . ." This pledge to privacy has been included in the code of ethics of nearly all health care professionals in the United States. For example, the first Code of Ethics of the American Medical Association in 1847 included the concept of confidentiality (OTA, 1993). The value of health information privacy has also been recognized by affording it protection under the law (reviewed by Pritts, 2008).
The rules for protecting the privacy of health information in the clinical care and health research contexts developed along fairly distinct paths until the promulgation of the federal privacy regulations under HIPAA.8 Prior to HIPAA, health information in the clinical setting was protected primarily under a combination of federal and state constitutional law, as well as state common law and statutory protections (Box 2-1). In contrast, research practices have been governed largely by federal regulations called the Common Rule, which have historically focused on protecting individuals from physical and mental harm in clinical trials (see subsequent sections of this chapter). Although the standards apply to research that uses personally identifiable health information, the protection of information is not their primary focus.

8 Health Insurance Portability and Accountability Act, Public Law 104–191 (1996) (most relevant sections codified at 42 U.S.C. §§ 1320(d)–1320(d)(8)).
HEALTH INFORMATION PRIVACY

BOX 2-1 Overview of Privacy Protections in the Law

Constitutional Protections

Both federal and state constitutions generally afford citizens some protection for the privacy of their health information. However, with limited exceptions, individuals are only protected against governmental intrusions into their personal health information and may not raise constitutional concerns about private action. Even when state action is involved, individuals rarely prevail on claims premised on constitutional rights to informational privacy because state interests generally outweigh the individual's privacy interest. The U.S. Constitution does not expressly provide a right to privacy, but the courts have determined that various constitutional provisions implicitly create zones of privacy that are protected by the Constitution. The privacy interests recognized include both the individual's interest in making certain kinds of important decisions, and the individual's interest in avoiding disclosure of personal matters. With respect to informational privacy, the courts have afforded limited constitutional protections, although the right is not absolute. The courts weigh factors such as the type of record and the information that it contains, the potential for harm in an unauthorized disclosure, and the injury from disclosure to the relationship in which the record was generated against the public interest or need for the disclosure, as well as the adequacy of safeguards to prevent unauthorized access or disclosure. Several federal courts have expressly recognized the constitutional right of privacy in connection with medical and prescription records. All states have constitutional provisions similar to those in the U.S. Constitution, which give rise to an implied right of privacy. Unlike the U.S. Constitution, however, constitutions in 10 states grant individuals an express right to privacy.
Courts have consistently determined that health or medical information is an area of privacy that is protected by state constitutions.

Common Law Protections

State common law generally recognizes that some health care relationships are based on maintaining the confidentiality of information obtained in the course of care and affords a remedy when that confidentiality is breached. Traditionally, the law's regulation of "privacy" consisted essentially of the protection of confidentiality within the doctor–patient relationship. Courts have found that actions may be maintained against private parties for unauthorized disclosures of health information under a number of legal theories, including invasion of privacy, implied breach of contract, breach of confidentiality, and breach of fiduciary relationship. Obtaining a remedy for disclosure of health information under any of these theories, however, is difficult. In the health care context, the promise of confidentiality is intended to encourage patients to fully disclose their most personal information to assist in accurate
diagnosis and treatment. Courts have thus found that the duty of confidentiality applies to physicians, hospitals, psychiatrists, and social workers. The underlying duty of confidentiality is not absolute, and the courts have indicated that there is no breach of confidentiality when a disclosure is made as required by statute (e.g., mandatory reporting to state officials of infectious or contagious diseases) or common law (e.g., a duty to disclose information concerning the safety of third persons). The extent to which state common law protects the confidentiality of health information in the evolving health care paradigm, where many people and organizations that receive and maintain health information do not have a direct relationship with the patient, is unclear. In most states, common law protections, particularly in tort, have been codified in statute.

Statutory and Regulatory Protections

Since the 1970s, the trend has been to augment existing constitutional and common law rights with statutory protections specifically designed to protect the privacy and confidentiality of health information (see Table 2-1). Although the common law continues to be important, the federal and state governments have increasingly focused on promulgating distinct standards for the protection of health information. The shift to statutory and regulatory protections for health information was largely a response to the changing nature of recordkeeping in general, and of the provision of health care in particular.
As noted by the 1977 Privacy Protection Study Commission, "The emergence of third-party payment plans; the use of health care information for non-health care purposes; the growing involvement of government agencies in virtually all aspects of health care; and the exponential increase in the use of computers and automated information systems for health care record information have combined to put substantial pressure on traditional confidentiality protections."

SOURCES: Bodger (2006); Gostin (1995); Magnussen (2004); NCSL (2008); Pritts (2002, 2008); Privacy Protection Study Commission (1977); Richards and Solove (2007); Terry and Francis (2007).
TABLE 2-1 Federal Health Privacy Statutes and Executive Orders That Regulate the Collection and Disclosure of Information

Freedom of Information Act (FOIA), 1966: Prevents personally identifiable health information from being included in the release of information as part of a FOIA request.

Privacy Act, 1974: Protects the privacy of health, research, and other records held by federal agencies.

Family Educational Rights and Privacy Act, 1974: Requires schools to have written permission from a parent or student prior to releasing information from a student's education record.

Veterans Omnibus Health Care Act, 1976: Protects the privacy of medical records relating to the treatment of drug abuse, alcohol abuse, or infection with AIDS or sickle cell anemia in the Department of Veterans Affairs.

Protection of Pupil Rights Amendment, 1978: Protects the rights of pupils and the parents of pupils in programs funded by the Department of Education.

Social Security Act, Section 1106, 1986: Prohibits unauthorized disclosure of individually identifiable records held by the Department of Health and Human Services, the Social Security Administration, and their contractors.

Clinical Laboratory Improvement Amendments, 1988: Requires clinical laboratories to protect the confidentiality of test results and reports, including information on patients and clinical study subjects; medical information may only be disclosed to authorized persons as defined by state or federal law.

Public Health Service Act, Health Omnibus Program Extension, 1988: Provides for Certificates of Confidentiality that protect personally identifiable research information.

Americans with Disabilities Act, 1990: Employers must treat employees' and applicants' medical information and medical conditions confidentially.

Public Health Service Act, Section 543, Federal Confidentiality Requirements for Substance Abuse Patient Records, 1992: Federally assisted alcohol or substance abuse programs must keep patient alcohol and drug abuse treatment records confidential, absent patient consent or a court order.

Health Insurance Portability and Accountability Act (HIPAA), Privacy Rule, 1996: Protects the privacy of individually identifiable information held by covered entities.

Balanced Budget Act, 1997: Added language to the Social Security Act to require Medicare+Choice organizations to establish safeguards for the privacy of individually identifiable patient information.

Clinton's Executive Order 13145, 2000: Bans the use of genetic information in federal hiring and promotion decisions.

Confidential Information Protection and Statistical Efficiency Act, 2002: Ensures that information supplied by individuals or organizations to a federal agency for statistical purposes under a pledge of confidentiality is used exclusively for statistical purposes.

Medicare Prescription Drug, Improvement, and Modernization Act, 2003: Requires prescription drug plan sponsors to comply with the HIPAA Privacy Rule and the Security Rule requirements.

Genetic Information Nondiscrimination Act, 2008: Prohibits discrimination against individuals based on their genetic information in health insurance and employment.
Principles of Fair Information Practice

The framework from which detailed statutory and regulatory protections of privacy originated was the 1973 report of an advisory committee to the U.S. Department of Health, Education, and Welfare (HEW), "designed to call attention to issues of recordkeeping practice in the computer age that may have profound significance for us all" (HEW, 1973). The principles were intended to "provide a basis for establishing procedures that assure the individual a right to participate in a meaningful way in decisions about what goes into records about him and how that information shall be used" (HEW, 1973). In addition to affording individuals the meaningful right to control the collection, use, and disclosure of their information, the fair information practices also impose affirmative responsibilities to safeguard information on those who collect it (reviewed by Pritts, 2008). The fundamental principles of fair information practice articulated in the report have since been amplified and adopted in various forms at the international, federal, and state levels (Gelman, 2008). The fair information practices endorsed by the Organisation for Economic Co-operation and Development (OECD), which have been widely cited, include the following principles (OECD, 1980):

• Collection Limitation: There should be limits to the collection of personal data, and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.

• Data Quality: Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete, and kept up to date.
• Purpose Specification: The purposes for which personal data are collected should be specified not later than at the time of data collection, and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.

• Use Limitation: Personal data should not be disclosed, made available, or otherwise used for purposes other than those specified in accordance with [the Purpose Specification] except: (a) with the consent of the data subject; or (b) by the authority of law.

• Security Safeguards: Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification, or disclosure of data.

• Openness: There should be a general policy of openness about developments, practices, and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

• Individual Participation: An individual should have the right to know whether a data controller has data relating to him or her; to obtain a copy of the data within a reasonable time, in a form that is intelligible to him or her; to obtain a reason if a request for access is denied; to challenge such a denial; to challenge data relating to him or her; and, if the challenge is successful, to have the data erased, rectified, completed, or amended.

• Accountability: A data controller should be accountable for complying with measures which give effect to the principles stated above.

These principles have been adopted at the federal and state levels to varying degrees. The United States has taken a sector-driven approach toward adopting the principles of fair information practices, with the federal and state governments promulgating statutes and regulations that apply only to specific classes of record keepers or categories of records.9,10 At the federal level, the fair information practices were first incorporated into the Privacy Act of 1974, which governs the collection, use, and disclosure of personally identifiable data held by the federal government and some of its contractors. Hospitals operated by the federal government and health care or research institutions operated under federal contract are subject to the Privacy Act, while other health care entities remained outside its scope (Gostin, 1995).
9 The original 1973 HEW Advisory Committee contemplated and rejected the creation of a centralized, federal approach to regulating the use of all automated personal data systems (see HEW, 1973).

10 Europe, in contrast, has adopted fair information practices in a broad, more uniform fashion by incorporating them into the European Union (EU) Directive, which protects individuals with regard to the processing of any personal data and the free movement of such data. The EU Directive applies to personal data of many types, including medical and financial, and applies widely to all who process such data, resulting in broad protections (Gelman, 2008).

Nevertheless, the Privacy Act afforded perhaps the broadest
protection for health information at the federal level until the promulgation of the HIPAA Privacy Rule.

For their part, states have adopted (and continue to adopt) laws that not only mirror the Privacy Act in protecting government-held records, but also afford broader protections for personally identifiable health information held by private parties. However, these principles have not been adopted uniformly among states, resulting in a patchwork of state health privacy laws that provide little consistency from entity to entity or from state to state. For example, the states have enacted the fair information practice restriction on use and disclosure of information in varying ways (reviewed by Pritts, 2008). Some allow the disclosure of health information for research without the individual's permission, and others require such permission. Still others require such permission only to release certain types of information for research. Similarly, state statutes vary widely in how they have applied the accountability principle, both in the way they provide remedies for breaches of confidentiality and security and with respect to the standard imposed for initiating a suit. Also, only a few states have statutorily required providers to undertake security measures to ensure that health information is used and disclosed properly.

SECURITY OF HEALTH DATA

Protecting the security of data in health research is important because health research requires the collection, storage, and use of large amounts of personally identifiable health information, much of which may be sensitive and potentially embarrassing. If security is breached, the individuals whose health information was inappropriately accessed face a number of potential harms. The disclosure of personal information may cause intrinsic harm simply because that private information is known by others (Saver, 2006). Another potential danger is economic harm.
Individuals could lose their job, health insurance, or housing if the wrong type of information becomes public knowledge. Individuals could also experience social or psychological harm. For example, the disclosure that an individual is infected with HIV or another type of sexually transmitted infection can cause social isolation and/or other psychologically harmful results (Gostin, 2008). Finally, security breaches could put individuals in danger of identity theft (Pritts, 2008). Protecting the privacy of research participants and maintaining the confidentiality of their data have always been paramount in research and a fundamental tenet of clinical research. However, several highly publicized examples of stolen or misplaced computers containing health data have heightened the public's concerns about the security of health data (for a list
of security breaches in health research, see Table 2-2). The extent to which these breaches have caused tangible harm to the individuals involved is difficult to quantify (Pritts, 2008). A Government Accountability Office (GAO) report studying major security breaches involving nonmedical personal information concluded that most security breaches do not result in identity theft (GAO, 2007). However, the lack of identity theft resulting from past breaches is no guarantee that future breaches will not result in more serious harm. A recent report from the Identity Theft Resource Center found that identity theft was up by 69 percent in the first half of 2008, compared to the same period in 2007 (ITRC, 2008). Also, regardless of actual harm, security breaches are problematic for health research because they undermine public trust, which is essential for patients to be willing to participate in research (Hodge et al., 1999). A recent study found that patients believe that requiring researchers to have security plans encourages researchers to take additional precautions to protect data (Damschroder et al., 2007). Moreover, data security is important to protect because it is a key component of comprehensive privacy practices.

The HIPAA Security Rule and Its Limitations

The goals of security are threefold: to ensure that (1) only authorized individuals see stored data; (2) they see the data only when they need to use it for an authorized purpose; and (3) what they see is accurate. Traditionally, these goals have been pursued through protections intended to make data processing safe from unauthorized access, alteration, deletion, or transmission. The HIPAA Security Rule employs this traditional approach to protecting security, and sets a floor for data security standards within covered entities (Box 2-2).11 The HIPAA Security Rule has several major gaps in security protection.
First, like the HIPAA Privacy Rule, the HIPAA Security Rule applies only to covered entities. Many researchers who rely on protected health information (PHI)12 to conduct health research are not covered entities, and thus are not required to implement any of the security requirements outlined in the Security Rule. Although federal research regulations include protections of privacy, no other laws specifically require researchers to implement security protections for research data. Second, the HIPAA Security Rule only protects electronic medical records; it does not require covered entities

11 Security Standards, 45 C.F.R. parts 160, 162, and 164 (2003). The final standards were adopted on February 20, 2003. Covered entities were required to be in compliance with the regulation by April 21, 2005 (and by April 21, 2006, for small health plans).

12 Protected health information (PHI) refers to all personally identifiable health information maintained by a HIPAA covered entity. 45 C.F.R. § 160.103 (2002).
TABLE 2-2 Research Security Breaches: 2006–2008

3/3/06, Georgetown University (41,000 records affected). Event: A cyber attack on a server exposed the personal information of elderly District of Columbia (DC) residents. The compromised server was used by researchers to monitor services provided to the elderly for the DC Office on Aging. Consequence: The person making the attack came from outside the University and was not authorized to access the data. However, there is no evidence that personal data have been misused.

6/20/06, University of Alabama School of Medicine (9,800 records affected). Event: A computer containing the personal information of donors, recipients, and potential recipients from the university's kidney transplant program was stolen. Consequence: The computer was stolen in February, but individuals were not notified until June, because it took months for the University to reconstruct the missing data.

9/29/06, University of Iowa, Department of Psychology (14,500 records affected). Event: A computer was attacked that stored the personal information of research subjects participating in a study on maternal and child health from 1995 to the present. Consequence: There is no evidence that any personal information on the computer was accessed.

10/24/06, Jacobs Neurological Institute, Buffalo, NY (records affected: unknown). Event: A laptop containing patient records and research data was stolen from a researcher's locked office. Consequence: The chief technology officer reported that no personally identifiable information was stored on the laptop.

1/4/07, SickKids, Ontario (records affected: unknown). Event: A laptop containing personal health information on research participants from 10 research studies was stolen. Consequence: The computer was password protected, so it is unlikely that any data were accessed.

2/16/07, U.S. Department of Veterans Affairs (VA) (records affected: unknown). Event: An unencrypted computer hard drive disappeared from a VA research center in Alabama. Consequence: The Secretary of the VA shut down the Research Enhancement Award Programs until proper security standards are in place.

3/30/07, University of California–San Francisco (3,000+ records affected). Event: A computer that contained the personal information of cancer research subjects was stolen from a locked research office. Consequence: There is no evidence that any information on the computer was used by unauthorized persons.

6/7/07, U.S. Marines/Pennsylvania State University (10,554 records affected). Event: A researcher posted the personal information of many U.S. Marines online. This information then turned up in a cache on Google's search engine. Consequence: During the time period that the information was online, the website was accessed by only one individual, a Marine whose information was released.

8/17/07, Walter Reed Army Institute of Research (records affected: unknown). Event: Boxes of documents containing personal information, which should have been shredded, were found in the dumpster at an apartment building. Consequence: The boxes were intact when discovered, so it is unlikely that any of the personal information was accessed.

2/23/08, National Institutes of Health (2,500 records affected). Event: A laptop computer that contained sensitive medical information on patients enrolled in a clinical trial was stolen. The information was not encrypted, in violation of the government's data security policy. Consequence: Because the information was not encrypted, it is possible that private health information was accessed on the computer.

SOURCES: ITRC (2006, 2007); PRC (2008).
BOX 2-2 The HIPAA Security Rule

The final HIPAA Security Standards were adopted on February 20, 2003. Covered entities were required to be in compliance with the regulation by April 21, 2005 (and by April 21, 2006, for small health plans). In designing the HIPAA Security Rule, the Department of Health and Human Services (HHS) recognized that covered entities affected by the Rule vary in size, sophistication of technology use, and relative risk. Rather than dictate specific technological solutions, HHS deliberately made the Rule flexible and usable by all covered entities regardless of size and purpose. HHS also specifically refrained from requiring any particular technology solutions to protect security; the Rule was intended to allow covered entities to adopt new technology as it develops.

Unlike the HIPAA Privacy Rule, the HIPAA Security Rule protects only electronic protected health information (EPHI). The Security Rule requires covered entities that process EPHI to maintain sufficient security measures to ensure the confidentiality, integrity, and availability of all EPHI. The Rule enumerates specific administrative, technical, and physical security safeguards for covered entities to implement. Each safeguard is classified as either "addressable" or "required." For the former, a "covered entity must conduct a risk analysis to determine whether each specification is reasonable and appropriate for its unique situation," and only those safeguards that are "reasonable and appropriate" must be implemented. Required security safeguards are those mandated by the Rule. The Rule gives covered entities the responsibility for training their workforces to comply with the security regulation and for having written security policies and procedures in place.
However, covered entities are only required to protect against reasonably anticipated threats or hazards to the security of the data, and reasonably anticipated uses or disclosures of such information that are not permitted under the Privacy Rule.

SOURCES: 68 Fed. Reg. 8333, 8334 (2003); 45 C.F.R. § 164.306; 45 C.F.R. § 164.316 (2007).

to implement any security protections for health information stored in paper records. There is an ongoing effort to implement electronic health records; however, many health records still exist only in paper form and may not be securely protected. Third, many covered entities apparently are not yet in full compliance with all the requirements of the HIPAA Security Rule, based on surveys13 of

13 Since 2004, the American Health Information Management Association has annually surveyed health care privacy officers and others whose jobs relate to the HIPAA privacy function to gain an understanding of where health care organizations stand with regard to implementing the Privacy and Security Rules required by HIPAA (AHIMA, 2006).
health care privacy officers and other individuals responsible for implementing the HIPAA regulations, conducted by the American Health Information Management Association (AHIMA). The surveys found that although the percentage of respondents who believe their facilities are in full compliance with the HIPAA Security Rule is increasing yearly, the number is still not 100 percent. In 2006, 1 year after implementation of the HIPAA security regulations, 25 percent of respondents described themselves as fully compliant with the Security Rule, and 50 percent described themselves as 85 to 95 percent compliant (compared to 17 percent of respondents in 2005 reporting they were fully compliant, and 43 percent describing themselves as 85 to 95 percent compliant). More than half (54 percent) of respondents reported that their covered entity had upgraded its electronic software system to comply with the HIPAA Security Rule. All of the respondents reported that their covered entity has an individual responsible for assessing data protection needs and implementing solutions and staff training (compared to 89 percent in 2005), but the number of facilities reporting that they have an entire committee or task force related to security decreased from 2005 (59 percent versus 78 percent) (AHIMA, 2006).

The Centers for Medicare & Medicaid Services (CMS) has the authority to enforce the HIPAA Security Rule, and as of 2008 had received 378 security complaints without issuing any fines or penalties. A recent report issued by the HHS Office of Inspector General evaluated CMS's oversight and enforcement of the HIPAA Security Rule and "found that CMS had taken limited steps to ensure that covered entities adequately implement security protections" (OIG, 2008). However, a 2008 Resolution Agreement entered into by the U.S.
Department of Health and Human Services (HHS) and CMS with Seattle-based Providence Health & Services for breaches of the HIPAA Privacy and Security Rules may indicate that CMS is starting to take a more affirmative approach to enforcement. The agreement requires Providence Health & Services to pay $100,000 and to implement a corrective action plan to ensure that electronic patient information is appropriately safeguarded against future security breaches (OCR, 2008). In addition, CMS has recently partnered with PricewaterhouseCoopers to conduct security audits of covered entities to examine how well they are implementing the requirements of the HIPAA Security Rule. Ten to 20 assessments are planned for 2008 (Conn, 2008). Together, these actions may have a positive effect on the percentage of covered entities fully compliant with the HIPAA Security Rule.

Regardless of whether the HIPAA Security Rule is actively enforced, the other gaps in the HIPAA Security Rule's protection of personal health information are problematic, because enhanced security is necessary to reduce the risk of data theft and to reinforce the public's trust in the research community by diminishing anxiety about the potential for unintentional disclosure
of information. Thus, the IOM committee recommends that all institutions (both covered entities and non-covered entities) in the health research community that are involved in the collection, use, and disclosure of personally identifiable health information take strong measures to safeguard the security of health data. Given the differences among the missions and activities of institutions in the health research community, some flexibility in the implementation of specific security measures will be necessary.

Examples of measures that institutions should implement include appointment of a security officer on IRBs and Privacy Boards to be responsible for assessing data protection needs and implementing solutions and staff training; use of encryption and encoding techniques, especially for laptops and removable media containing personally identifiable health information; and implementation of a breach notification requirement, so that patients may take steps to protect their identity in the event of a breach (IOM, 2000). More generally, institutions should implement layers of security protections, so that if security fails at one layer, the breach will likely be stopped by another layer of security protection. The publication of best practices, combined with a cooperative approach to compliance with security standards, such as self-evaluation, security audits, and certification programs, would also promote progress in this area. Research sponsors could play a role in the adoption of best practices in data security by requiring researchers to implement appropriate security measures prior to providing funding. In addition, the federal government should support the development of technologies to enhance the security of health information. Examples of security standards and guidelines already exist in some sectors, but they are not widely applied in health research.
For instance, the National Institute of Standards and Technology has developed standards and guidance for the implementation of the Federal Information Security Management Act of 2002, which was meant to bolster computer and network security within the federal government and affiliated parties (e.g., government contractors). These include standards for minimum security requirements for information and information systems, as well as guidance for assessing and selecting appropriate security controls for information systems, for determining security control effectiveness, and for certifying and accrediting information systems (NIST, 2007). However, two recent GAO reports found that although the federal government is improving information security performance, a number of significant information security control deficiencies remain (GAO, 2008a,b). HHS, working through its Office of the National Coordinator for Health Information Technology,14 could play an important role in developing or adapting standards for health

14 See http://www.hhs.gov/healthit/onc/mission/.
research applications, and then encourage and facilitate broader use of such standards in the health research community.

POTENTIAL TECHNICAL APPROACHES TO HEALTH DATA PRIVACY AND SECURITY

The security of data will continue to grow in importance as the health care industry moves toward greater implementation of electronic health records, and Congress has already proposed numerous bills to facilitate and regulate that transition (see also Chapter 6). Advances in information technology will likely make it easier to implement such measures as audit trails and access controls in the future. Although the committee does not recommend a specific technology solution, there are at least four technological approaches to enhancing data privacy and security that have been proposed by others as having the potential to be particularly influential in health research: (1) privacy-preserving data mining and statistical disclosure limitation, (2) personal electronic health record devices, (3) independent consent management tools, and (4) pseudonymization. Each seeks to minimize or eliminate the transfer of personally identifiable data (Burkert, 2001). The advantages, limitations, and current feasibility of each are described briefly below.

Privacy-preserving data mining and statistical disclosure limitation. In recent years, a number of techniques have been proposed for modifying or transforming data in such a way as to preserve privacy while statistically analyzing the data (reviewed in Aggarwal and Yu, 2008; NRC, 2000, 2005, 2007b,c). Typically, such methods reduce the granularity of representation in order to protect confidentiality. There is, however, a natural trade-off between information loss and confidentiality protection, because this reduction in granularity diminishes the accuracy and utility of the data and of the methods used in their analysis.
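The granularity reduction at the heart of these methods can be sketched in a few lines. The example below (hypothetical records and field names, not drawn from the report) generalizes ages into buckets and truncates ZIP codes, then reports the size of the smallest group of records left indistinguishable on those quasi-identifiers; coarser generalization enlarges the groups but degrades the data's utility, which is exactly the trade-off noted above.

```python
from collections import Counter

# Hypothetical patient records (illustrative only).
records = [
    {"age": 34, "zip": "20037", "diagnosis": "asthma"},
    {"age": 36, "zip": "20036", "diagnosis": "diabetes"},
    {"age": 35, "zip": "20037", "diagnosis": "asthma"},
    {"age": 52, "zip": "20052", "diagnosis": "hypertension"},
    {"age": 57, "zip": "20052", "diagnosis": "asthma"},
]

def generalize(rec, age_width, zip_digits):
    """Reduce granularity: bucket the age, truncate the ZIP code."""
    low = (rec["age"] // age_width) * age_width
    return (f"{low}-{low + age_width - 1}", rec["zip"][:zip_digits])

def smallest_group(age_width, zip_digits):
    """Size of the smallest set of records sharing generalized quasi-identifiers."""
    groups = Counter(generalize(r, age_width, zip_digits) for r in records)
    return min(groups.values())

# Fine granularity keeps more utility but leaves unique, re-identifiable records.
print(smallest_group(5, 5))   # -> 1: some record is one of a kind
print(smallest_group(10, 3))  # -> 2: every record now blends into a group
```

In k-anonymity terms, the value returned is the k actually achieved: generalization is increased until every record is indistinguishable from at least (k − 1) others.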
Thus, a key issue is to maintain maximum utility of the data without compromising the underlying privacy constraints. In addition, there are a very large number of definitions of privacy and its protection in the statistical disclosure limitation and privacy-preserving data mining literatures, in part because of the varying goals. Examples of statistical disclosure limitation and privacy-preserving data mining methods include perturbation methods such as noise addition, which attempts to mask the identifiable attributes of individual records; aggregation methods such as k-anonymity, which attempts to reduce the granularity of representation of the data in such a way that a given record cannot be distinguished from at least (k − 1) other records; the release of summary statistics that can be used for actual statistical analyses, such as marginal
totals from contingency tables; and various approaches to the generation of synthetic data. Several of these are reviewed in Aggarwal and Yu (2008). Other technologies include cryptographic methods for distributive privacy protection, which operate by allowing researchers to query various databases online using cryptographic algorithms (Brands, 2007; reviewed in Aggarwal and Yu, 2008), query auditing techniques, and output perturbation using a methodology known as differential privacy (many of these techniques are reviewed in Aggarwal and Yu, 2008, and Dwork, 2008). These technologies aim to protect privacy by minimizing the outflow of information to researchers, as the providers of the databases do not make any of the actual data available to the researchers. The principal drawback of many of these methods relates to the potentially limited utility of the released information, especially for secondary analyses not planned in advance. Each of the methods referred to above has strengths and weaknesses for specific kinds of statistical analyses. Precisely how this body of developing methodologies may be effectively used in the types of health research envisioned in this report remains an open question, and this is an area of active research. Thus, alternative mechanisms for data protection, going beyond the removal of obvious identifiers and the application of limited modifications of data elements, are required. These mechanisms need to be backed up by legal penalties and sanctions.

Personal electronic health record devices. The use of personal electronic health record devices requires that all individuals possess a personal electronic device, such as a personal digital assistant (PDA) or personal computer, to manage their health information. The electronic device is intended to be used by individuals to aggregate all of their health information into one location (i.e., the electronic device).
The infrastructure for implementing this privacy-enhancing technology exists, but there are several serious problems with relying on this technology in health research. First, it is unclear who would provide individuals with the devices, how they would be maintained, and who would bear the cost of the maintenance. Second, it is impossible for researchers to query every single individual for permission to access his/her personal electronic health record device in order to determine if he/she meets the criteria for the relevant study. Only individuals who are on the Internet and are involved in health research could easily be queried. Third, the use of personal electronic devices would make it almost impossible to aggregate data because of the difficulty of accessing data from multiple sources. These problems are sufficiently serious that the use of this technology is unlikely to offer a satisfactory solution to the privacy and security concerns in health research (Brands, 2007).

Independent consent management tools. The independent consent management tool (or infomediary) relies on a health trust to store all of an individual's health data. When researchers are interested in accessing an individual's health information for a study, the researchers must contact the health trust. The health trust will then approach the individual and ask whether he/she is willing to give consent for the research. Examples of this technology include Microsoft's HealthVault, Google Health, and Revolution Health. Independent consent management tools allow individuals to make blanket consents for their health information to be released for certain types of research. For example, an individual can have a standing consent that his/her information can be released to all researchers at the Mayo Clinic, or for all research on cancer. Thus, the use of a health trust allows an individual to have the power of consent for all uses of his/her health information, but does not require a specific consent in all instances (Brands, 2007). Some privacy advocates are very favorable about the use of this technology because they see it as a way to give patients complete control over who can see and use their health information (PPR, 2008). However, the use of this technology in health research has several major problems.

The first problem is that the health trust in this system becomes a "honey pot" (i.e., the health trust holds all of an individual's data). This creates serious trust and security issues because a person's entire health record is stored in a single entity (Brands, 2007). A 2006 survey of global financial services institutions found that respondents reported that nearly 50 percent of all security breaches were a result of an internal failure (e.g., a virus or worm originating inside the organization, insider fraud, or inadvertent leakage of consumer data) (Melek and MacKinnon, 2006). Many security breaches in health care are likely also a result of internal failures.
In addition, these organizations are currently not regulated by the HIPAA Privacy Rule, so there are no federal legal privacy restrictions preventing these entities from releasing individuals' data to the government, marketing companies, or others, and no mandatory data security requirements. New legislation or regulation making health trusts liable for security breaches may be necessary before the public is willing to trust these organizations to store personal health data (Metz, 2008).

The second major impediment to the widespread adoption of independent consent management tools is the difficulty of providing individuals with secure online access to view their health information. The companies marketing this technology need to develop a mechanism by which individuals can access their medical information held by the health trust without endangering its security and privacy. The current methods for individual authentication online do not work well (NRC, 2003), but the use of a strong authentication system in a single domain may solve this problem. The companies will also need to address the fact that a significant portion of the population does not have online access at all (Brands, 2007).
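Setting the access problems aside, the blanket-consent matching that such tools would perform is, at its core, a simple rule lookup. The sketch below (hypothetical identifiers and field names; the report does not specify any data model) records standing consents by institution and/or research topic and checks whether a proposed release matches one of them.

```python
# Hypothetical standing (blanket) consent rules per patient. A rule field
# of None means "any" for that dimension. (All names are illustrative.)
consents = {
    "patient-001": [
        {"institution": "Mayo Clinic", "topic": None},  # any Mayo Clinic study
        {"institution": None, "topic": "cancer"},       # any cancer study
    ],
}

def is_release_permitted(patient_id, institution, topic):
    """A study may receive the data if any standing rule matches it."""
    for rule in consents.get(patient_id, []):
        institution_ok = rule["institution"] in (None, institution)
        topic_ok = rule["topic"] in (None, topic)
        if institution_ok and topic_ok:
            return True
    return False

print(is_release_permitted("patient-001", "Mayo Clinic", "diabetes"))     # True
print(is_release_permitted("patient-001", "Johns Hopkins", "cancer"))     # True
print(is_release_permitted("patient-001", "Johns Hopkins", "diabetes"))   # False
```

The point of the sketch is that the health trust can answer release requests without a fresh patient contact for every study, which is precisely the convenience, and the concentration of control, that the infomediary model trades on.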
The final problem with using independent consent management systems in health research is the inability to ensure the authenticity and integrity of responses. There is no existing method for the health trusts to provide the researchers with a guarantee that the information contained in their database is accurate. If data are authenticated using existing methods, such as through the use of digital signing, then it is impossible to truly protect the privacy of the individuals' information being disclosed (NRC, 2003). Cryptographic selective disclosure techniques may be able to solve this problem, but the technology does not exist yet (Brands, 2007).

Pseudonymization. Pseudonymization is a method "used to replace the true identities (nominative) of individuals or organizations in databases by pseudo-identities (pseudo-IDs) that cannot be linked directly to their corresponding nominative identities" (Claerhout and De Moor, 2005). The benefit of using pseudonymization in health research is that it protects individuals' identities while allowing researchers to link personal data across time and place by relying on the pseudo-IDs. Most pseudonymization methods use a trusted third party to perform the pseudonymization process. This results in at least three entities being involved in the creation of each database: the data source, which has access to nominative personal data (e.g., PHI); the trusted third party; and the data register, which uses the pseudonymized data for research.

Two methods of pseudonymization are batch data collection and interactive data collection. In batch data collection, the data supplier splits the data into two parts: (1) the identifiers that relate to a specific person (e.g., Social Security number, name), and (2) the payload data, which includes all the nonidentifiable data associated with each individual.
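The split between identifiers and payload can be illustrated with a short sketch. The example below (hypothetical field names and keys) uses keyed hashing (HMAC) as one possible way to derive pseudo-IDs in two stages, one keyed step at the data source and one at the trusted third party; actual pseudonymization services use more elaborate protocols (Claerhout and De Moor, 2005).

```python
import hashlib
import hmac

# Illustrative secrets: in practice the data source and the trusted
# third party would each guard their own key. (Hypothetical values.)
SOURCE_KEY = b"data-source-secret"
TTP_KEY = b"trusted-third-party-secret"

def keyed_pseudonym(key, value):
    """Derive a pseudo-ID with a keyed hash (one possible technique)."""
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()

def split_record(record, identifier_fields):
    """Separate identifiers from the nonidentifiable payload data."""
    identifiers = {f: record[f] for f in identifier_fields}
    payload = {f: v for f, v in record.items() if f not in identifier_fields}
    return identifiers, payload

record = {"ssn": "123-45-6789", "name": "Jane Doe", "lab_result": 7.1}
identifiers, payload = split_record(record, {"ssn", "name"})

# The data source prepseudonymizes the identifiers ...
prepseudonym = keyed_pseudonym(SOURCE_KEY, identifiers["ssn"] + identifiers["name"])
# ... and the trusted third party converts the prepseudonym into the
# final pseudo-ID, which is stored with the payload in the data register.
final_pseudo_id = keyed_pseudonym(TTP_KEY, prepseudonym)
print(final_pseudo_id[:16], payload)
```

Because the derivation is deterministic, repeated submissions for the same person yield the same final pseudo-ID, so the data register can link records across time and place without ever seeing a nominative identity; reversing a pseudo-ID would require the cooperation of both key holders.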
The data are prepseudonymized at the data source and transferred to the trusted third party, which converts the prepseudonyms into final pseudo-IDs. Both the final pseudo-ID and payload data are transferred to the data register, where they are stored and used for research; no data are stored with the trusted third party. Privacy concerns are minimized because the only version of the data that is available to researchers is pseudonymized data.

Interactive data collection is used in situations where neither the data supplier nor the data register has a need for local storage of the data. All the data are stored by a trusted third party in pseudonymous form. Both the data supplier and the data register must query the trusted third party to access the data (Claerhout and De Moor, 2005; De Moor et al., 2003).

It is unclear how technologies relying on pseudonymization would be implemented under the requirements of the HIPAA Privacy Rule. In order for information to be considered deidentified, the HIPAA Privacy Rule specifically states that covered entities can assign a code or other means of
record identification (such as a pseudo-ID), but the code cannot be derived from, or related to, information about the subject of the information.15 This means that any pseudo-IDs created using this technology must be based entirely on nonpersonal information. Alternatively, any researchers using the pseudonymized data must go through the normal IRB/Privacy Board review process.

CONCLUSIONS AND RECOMMENDATIONS

Based on its review of the information described in this chapter, the committee agreed on an overarching principle to guide the formation of recommendations. The committee affirms the importance of maintaining and improving the privacy of health information. In the context of health research, privacy includes the commitment to handle personal information of patients and research participants with meaningful privacy protections, including strong security measures, transparency, and accountability.16 These commitments extend to everyone who collects, uses, or has access to personally identifiable health information of patients and research participants. Practices of security, transparency, and accountability take on extraordinary importance in the health research setting: Researchers and other data users should disclose clearly how and why personal information is being collected, used, and secured, and should be subject to legally enforceable obligations to ensure that personally identifiable information is used appropriately and securely. In this manner, privacy protection will help to ensure research participation and public trust and confidence in medical research.

As part of the process of implementing this principle into the federal oversight regime of health research, the committee recommends that all institutions in the health research community that are involved in the collection, use, and disclosure of personally identifiable health information should take strong measures to safeguard the security of health data.
For example, institutions could:

• Appoint a security officer responsible for assessing data protection needs and implementing solutions and staff training.
• Make greater use of encryption and other techniques for data security.
• Include data security experts on IRBs.

15 Standards for Privacy of Individually Identifiable Health Information: Final Rule, 67 Fed. Reg. 53182, 53232 (2002).
16 This is derived from the principles of fair information practices (see Chapter 2 for more detail).
• Implement a breach notification requirement, so that patients may take steps to protect their identity in the event of a breach.
• Implement layers of security protection to eliminate single points of vulnerability to security breaches.

In addition, the federal government should support the development and use of:

• Genuine privacy-enhancing techniques that minimize or eliminate the collection of personally identifiable data.
• Standardized self-evaluations, security audits, and certification programs to help institutions achieve the goal of safeguarding the security of personal health data.

Effective health privacy protections require effective data security measures. The HIPAA Security Rule (which entails a set of regulatory provisions separate from the Privacy Rule) already sets a floor for data security standards within covered entities, but not all institutions that conduct health research are subject to HIPAA regulations. Also, the survey data presented in this chapter show that neither the HIPAA Privacy Rule nor the HIPAA Security Rule has directly improved public confidence that personal health information will be kept confidential. Therefore, all institutions conducting health research should undertake measures to strengthen data protections. For example, given the recent spate of lost or stolen laptops containing patient health information, encryption should be required for all laptops and removable media containing such data. In general, however, given the differences among the missions and activities of institutions in the health research community, some flexibility in the implementation of specific security measures will be necessary. Enhanced security would reduce the risk of data theft and reinforce the public's trust in the research community by diminishing anxiety about the potential for unintentional disclosure of information.
The publication of best practices and outreach to all stakeholders by HHS, combined with a cooperative approach to compliance with security standards, such as self-evaluation and audit programs, would promote progress in this area. Research sponsors could also play a role in fostering the adoption of best practices in data security.

REFERENCES

Aggarwal, C. C., and P. S. Yu, eds. 2008. Privacy-preserving data mining: Models and algorithms. Boston, MA: Kluwer Academic Publishers.
AHIMA (American Health Information Management Association). 2006. The state of HIPAA privacy and security compliance. http://www.ahima.org/emerging_issues/2006StateofHIPAACompliance.pdf (accessed April 20, 2008).
Allen, A. 1997. Genetic privacy: Emerging concepts and values. In Genetic secrets: Protecting privacy and confidentiality in the genetic era, edited by M. Rothstein. New Haven, CT: Yale University Press. Pp. 31–59.
Balch, G. I., L. Doner, M. K. Hoffman, and E. Macario. 2005. An exploration of how patients and family caregivers think about counterfeit drugs and the safety of prescription drug retail outlets for the National Health Council. Oak Park, IL: Balch Associates.
Balch, G. I., L. M. A. Doner, M. K. Hoffman, M. P. Merriman, E. Monroe-Cook, and G. Rathjen. 2006. Concept and message development research on engaging communities to promote electronic personal health records for the National Health Council. Oak Park, IL: Balch Associates.
Barrett, G., J. A. Cassell, J. L. Peacock, and M. P. Coleman. 2007. National survey of British public's view on use of identifiable medical data by the National Cancer Registry. British Medical Journal 332(7549):1068–1072.
Bloustein, E. 1967. Privacy as an aspect of human dignity: An answer to Dean Prosser. New York Law Review 39:34.
Bodger, J. A. 2006. Note, Taking the sting out of reporting requirements: Reproductive health clinics and the constitutional right to informational privacy. Duke Law Journal 56:583–609.
Burkert, H. 2001. Privacy-enhancing technologies: Typology, critique, vision. In Technology and privacy: The new landscape, edited by P. E. Agre and M. Rotenberg. Cambridge, MA: The MIT Press. Pp. 125–142.
Claerhout, B., and G. J. E. De Moor. 2005. Privacy protection for clinical and genomic data: The use of privacy-enhancing techniques in medicine. Journal of Medical Informatics 74:257–265.
Conn, J. 2008. CMS' HIPAA watchdog presents potential conflict. Modern Healthcare. http://www.modernhealthcare.com (accessed July 28, 2008).
Damschroder, L. J., J. L. Pritts, M. A. Neblo, R. J. Kalarickal, J. W. Creswell, and R. A. Hayward. 2007.
Patients, privacy and trust: Patients' willingness to allow researchers to access their medical records. Social Science & Medicine 64(1):223–235.
De Moor, G. J. E., B. Claerhout, and F. De Meyer. 2003. Privacy enhancing techniques: The key to secure communication and management of clinical and genomic data. Methods of Information in Medicine 42:148–153.
Dwork, C. 2008. An ad omnia approach to defining and achieving private data analysis. Proceedings of the First SIGKDD International Workshop on Privacy, Security, and Trust in KDD (invited). Lecture Notes in Computer Science 4890.
Feld, A. D. 2005. The Health Insurance Portability and Accountability Act (HIPAA): Its broad effect on practice. American Journal of Gastroenterology 100(7):1440–1443.
Flannery, J., and J. Tokley. 2005. AMA poll shows patients are concerned about the privacy and security of their medical records. Australian Medical Association. http://www.ama.com.au/web.nsf/doc/WEEN-6EG7LY (accessed December 10, 2007).
Forrester Research. 1999. National survey: Confidentiality of medical records. http://www.chcf.org (accessed February 12, 2007).
Forrester Research. 2005. National consumer health privacy survey 2005. http://www.chcf.org/topics/view.cfm?itemID=115694 (accessed February 12, 2007).
Fried, C. 1968. Privacy. Yale Law Journal 77:475–493.
GAO (Government Accountability Office). 2007. Personal information: Data breaches are frequent, but evidence of resulting identity theft is limited. Washington, DC: GAO.
GAO. 2008a. Information security: Although progress reported, federal agencies need to resolve significant deficiencies: Statement of Gregory C. Wilshusen, Director, Information Security Issues. Washington, DC: GAO.
GAO. 2008b. Information security: Progress reported, but weaknesses at federal agencies persist: Statement of Gregory C. Wilshusen, Director, Information Security Issues. Washington, DC: GAO.
Gavison, R. 1980. Privacy and the limits of the law. Yale Law Journal 89:421–471.
Gellman, R. 2008. Fair information practices: A basic history. http://bobgellman.com/rg-docs/rg-FIPshistory.pdf (accessed April 15, 2008).
Goldman, J. 1998. Protecting privacy to improve health care. Health Affairs 17(6):47–60.
Gostin, L. O. 1995. Health information privacy. Cornell Law Review 80:101–184.
Gostin, L. 2001. Health information: Reconciling personal privacy with the public good of human health. Health Care Analysis 9:321.
Gostin, L. 2008. Surveillance and public health research: Personal privacy and the "right to know." In Public health law: Power, duty, restraint. 2nd ed. Berkeley, CA: University of California Press.
Gostin, L. O., and J. G. Hodge. 2002. Personal privacy and common goods: A framework for balancing under the national health information Privacy Rule. Minnesota Law Review 86:1439.
Greenhouse, S., and M. Barbaro. 2005. Walmart memo suggests ways to cut employee benefit costs. The New York Times. http://www.nytimes.com/2005/10/26/business/26walmart.ready.html?pagewanted=1&_r=1 (accessed April 14, 2008).
Harris Interactive. 2005. Health Information Privacy (HIPAA) notices have improved public's confidence that their medical information is being handled properly. http://www.harrisinteractive.com/news/printerfriend/index.asp?NewsID=849 (accessed April 3, 2007).
Harris Interactive. 2007. Many U.S. adults are satisfied with use of their personal health information. http://www.harrisinteractive.com/harris_poll/index.asp?PID=743 (accessed May 15, 2007).
HEW (Department of Health, Education and Welfare). 1973. Records, computers and the rights of citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems.
http://aspe.hhs.gov/datacncl/1973privacy/tocprefacemembers.htm (accessed July 12, 2008).
Hodge, J. G., Jr., L. O. Gostin, and P. D. Jacobson. 1999. Legal issues concerning electronic health information: Privacy, quality, and liability. JAMA 282(15):1466–1471.
Hudson, K. L. 2007. Prohibiting genetic discrimination. New England Journal of Medicine 356:2021.
IOM (Institute of Medicine). 2000. Protecting data privacy in health services research. Washington, DC: National Academy Press.
ITRC (Identity Theft Resource Center). 2006. 2006 disclosures of U.S. data incidents. http://idtheftmostwanted.org/ITRC%20Breach%20Report%202006.pdf (accessed July 7, 2008).
ITRC. 2007. 2007 breach list. http://idtheftmostwanted.org/ITRC%20Breach%20Report%202007.pdf (accessed July 7, 2008).
ITRC. 2008. Security breaches. http://www.idtheftcenter.org/artman2/publish/lib_survey/ITRC_2008_Breach_List_printer.shtml (accessed July 22, 2008).
Kass, N. E., M. R. Natowicz, S. C. Hull, R. R. Faden, L. Plantinga, L. O. Gostin, and J. Slutsman. 2003. The use of medical records in research: What do patients want? Journal of Law, Medicine & Ethics 31:429–433.
Low, L., S. King, and T. Wilkie. 1998. Genetic discrimination in life insurance: Empirical evidence from a cross sectional survey of genetic support groups in the United Kingdom. British Medical Journal 317:1632–1635.
Lowrance, W. W. 1997. Privacy and health research: A report to the U.S. Secretary of Health and Human Services. http://aspe.hhs.gov/DATACNCL/PHR.htm (accessed May 10, 2008).
Lowrance, W. W. 2002. Learning from experience, privacy and the secondary use of data in health research. London: The Nuffield Trust.
Magnussen, R. 2004. The changing legal and conceptual shape of health care privacy. The Journal of Law, Medicine & Ethics 32:681.
Melek, A., and M. MacKinnon. 2006. Deloitte global security survey. http://www.deloitte.com/dtt/cda/doc/content/us_fsi_150606globalsecuritysurvey(1).pdf (accessed July 23, 2008).
Metz, R. 2008. Google makes health service publicly available. Associated Press. http://biz.yahoo.com/ap/080519/google_health.html (accessed August 13, 2008).
Moore, A. 2005. Intangible property: Privacy, power and information control. In Information ethics: Privacy, property, and power, edited by A. Moore. Seattle, WA: University of Washington Press.
NBAC (National Bioethics Advisory Commission). 1999. Research involving human biological materials: Ethical issues and policy guidance, report and recommendations. Vol. 1. Rockville, MD: NBAC.
NBAC. 2001. Ethical and policy issues in research involving human participants. Rockville, MD: NBAC.
NCSL (National Conference of State Legislatures). 2008. Privacy protections in state constitutions. http://www.ncsl.org/programs/lis/privacy/stateconstpriv03.htm (accessed June 10, 2008).
Nissenbaum, H. 2004. Privacy as contextual integrity. Washington Law Review 79:101–139.
NRC (National Research Council). 2000. Improving access to and confidentiality of research data: Report of a workshop. Washington, DC: National Academy Press.
NRC. 2003. Who goes there?: Authentication through the lens of privacy. Washington, DC: The National Academies Press.
NRC. 2005. Expanding access to research data: Reconciling risks and opportunities. Washington, DC: The National Academies Press.
NRC. 2007a. Engaging privacy and information technology in a digital age. Washington, DC: The National Academies Press.
NRC. 2007b.
Privacy and information technology in a digital age. Washington, DC: The National Academies Press.
NRC. 2007c. Putting people on the map: Protecting confidentiality with linked social-spatial data. Washington, DC: The National Academies Press.
OCR (Office for Civil Rights). 2008. HIPAA compliance and enforcement. http://www.hhs.gov/ocr/privacy/enforcement/ (accessed August 13, 2008).
OECD. 1980. Guidelines on the protection of privacy and transborder flows of personal data. http://www.oecd.org/document/0,2340,en_2649_34255_1815186_1_1_1_1,00.html (accessed August 13, 2008).
OIG (Office of Inspector General). 2008. Nationwide review of the Centers for Medicare & Medicaid Services Health Insurance Portability and Accountability Act of 1996 oversight. Washington, DC: Department of Health and Human Services.
OTA (Office of Technology Assessment). 1993. Protecting privacy in computerized medical information. Washington, DC: OTA.
Petrila, J. 1999. Medical records confidentiality: Issues affecting the mental health and substance abuse systems. Drug Benefit Trends 11:6–10.
Post, R. 2001. Three concepts of privacy. Georgetown Law Journal 89:2087–2089.
PPR (Patient Privacy Rights). 2008 (October 4). Press release: Microsoft raises the bar for privacy in electronic health record solutions. http://www.patientprivacyrights.org/site/PageServer?pagename=HealthVault_PressRelease/ (accessed August 13, 2008).
PRC (Privacy Rights Clearinghouse). 2008. A chronology of data breaches. http://www.privacyrights.org/ar/ChronDataBreaches.htm (accessed July 8, 2008).
Pritts, J. L. 2002. Altered states: State health privacy laws and the impact of the federal health Privacy Rule. Yale Journal of Health Policy, Law & Ethics 2(2):327–364.
Pritts, J. 2008. The importance and value of protecting the privacy of health information: Roles of HIPAA Privacy Rule and the Common Rule in health research. http://www.iom.edu/CMS/3740/43729/53160.aspx (accessed March 15, 2008).
Privacy Protection Study Commission. 1977. Personal privacy in an information society. http://epic.org/privacy/ppsc1977report/ (accessed April 21, 2008).
PSRA (Princeton Survey Research Associates). 1999. Medical privacy and confidentiality survey. http://www.chcf.org/topics/view.cfm?itemID=12500 (accessed August 11, 2008).
Rachels, J. 1975. Why privacy is important. Philosophy and Public Affairs 4:323–333.
Regan, P. 1995. Legislating privacy: Technology, social values, and public policy. Chapel Hill, NC: University of North Carolina Press.
Research!America. 2007. America speaks: Poll summary. Vol. 7. Alexandria, VA: United Health Foundation.
Richards, N. M., and D. J. Solove. 2007. Privacy's other path: Recovering the law of confidentiality. Georgetown Law Journal 96:124.
Roback, H., and M. Shelton. 1995. Effects of confidentiality limitations on the psychotherapeutic process. Journal of Psychotherapy Practice and Research 4:185–193.
Robling, M. R., K. Hood, H. Houston, R. Pill, J. Fay, and H. M. Evans. 2004. Public attitudes towards the use of primary care patient record data in medical research without consent: A qualitative study. Journal of Medical Ethics 30:104–109.
Saver, R. 2006. Medical research and intangible harm. University of Cincinnati Law Review 74:941–1012.
Solove, D. J. 2006. A taxonomy of privacy. University of Pennsylvania Law Review 154:516–518.
Taube, D. O., and A. Elwork. 1990. Researching the effects of confidentiality law on patients' self-disclosures. Professional Psychology: Research and Practice 21:72–75.
Taylor, C. 1989. Sources of the self: The making of modern identity. Cambridge, MA: Harvard University Press.
Terry, N. P., and L. P. Francis. 2007. Ensuring the privacy and confidentiality of electronic health records. University of Illinois Law Review 2007(2):681–736.
Tourangeau, R., L. J. Rips, and K. Rasinski. 2000. The psychology of survey response. Cambridge, UK: Cambridge University Press.
Turn, R., and W. H. Ware. 1976. Privacy and security issues in information systems. The RAND Paper Series. Santa Monica, CA: The RAND Corporation.
Weddle, M., and P. Kokotailo. 2005. Confidentiality and consent in adolescent substance abuse: An update. Virtual Mentor, American Medical Association Journal of Ethics. http://virtualmentor.ama-assn.org/2005/03/pdf/pfor1-0503.pdf (accessed August 1, 2008).
Wentland, E. J. 1993. Survey responses: An evaluation of their validity. San Diego, CA: Academic Press.
Westin, A. 1966. Science, privacy and freedom. Columbia Law Review 66(7):1205–1253.
Westin, A. 1967. Privacy and freedom. New York: Atheneum.
Westin, A. 1976. Computers, health records, and citizen rights. http://eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED143358&ERICExtSearch_SearchType_0=no&accno=ED143358 (accessed July 30, 2008).
Westin, A. 2007. How the public views privacy and health research. Institute of Medicine. http://www.iom.edu/Object.File/Master/48/528/%20Westin%20IOM%20Srvy%20Rept%2011-1107.pdf (accessed November 11, 2007).
Whiddett, R., I. Hunter, J. Engelbrecht, and J. Handy. 2006. Patients' attitudes towards sharing their health information. International Journal of Medical Informatics 75(7):530–541.
Willison, D. J., L. Schwartz, J. Abelson, C. Charles, M. Swinton, D. Northrup, and L. Thabane. 2007 (September 25–28). Alternatives to project-specific consent for access to personal information for health research: What do Canadians think? Paper presented at 29th International Conference of Data Protection and Privacy Commissioners, Montreal, Canada.
Woolley, M., and S. M. Propst. 2005. Public attitudes and perceptions about health related research. JAMA 294:1380–1384.