Appendix B

Commissioned Survey Methodology

A review of the literature demonstrated a dearth of systematic data to determine what impact the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule was having on health research. As a result, the Institute of Medicine (IOM) committee sought larger surveys with national coverage. In consultation with committee members, the IOM took the unusual step of commissioning[1] several surveys to assess current perceptions among health researchers of the effect of the Privacy Rule on research and to measure the public's perception of and expectations for privacy in health research.

The first survey entailed a national web-based survey of U.S. epidemiologists, overseen by Dr. Roberta Ness at the University of Pittsburgh. A second project, undertaken by Sarah Greene and Dr. Ed Wagner at the Group Health Center for Health Studies in Seattle, involved a survey of HMO Research Network (HMORN) investigators and a survey of HMORN Institutional Review Boards (IRBs). The third survey was a Harris Interactive Poll of the public, developed by Alan Westin of the Privacy Consulting Group. The methods used in conducting these surveys are described below. The results are described in Chapters 2 and 5 of this report and are reported in more detail in Ness (2007), Westin (2007), and Greene et al. (2008).

[1] The surveys were commissioned with private funding. No federal funds were used to support collection of survey data.
BEYOND THE HIPAA PRIVACY RULE

NATIONAL SURVEY OF U.S. EPIDEMIOLOGISTS

Participants

Epidemiologists were surveyed because they are an identifiable professional group of scientists engaged in human subjects research, and their research often involves the use of medical records. Support was enlisted from all professional groups that were known to represent U.S. epidemiologists employed in academia, industry, government, and nongovernment organizations. These included the American Academy of Pediatrics, Section on Epidemiology; American College of Epidemiology; American College of Preventive Medicine; American Diabetes Association, Council on Epidemiology & Statistics; American Public Health Association, Epidemiology Section; International Genetic Epidemiology Society; International Society for Environmental Epidemiology; International Society for Pharmacoepidemiology; Society for Clinical Trials; Society for Epidemiologic Research; Society for Healthcare Epidemiology; Society for Pediatric and Perinatal Epidemiology; and Society for the Analysis of African-American Public Health Issues. Of 14 societies approached, the 13 listed above participated.

Each society e-mailed all its active members and requested that they respond to a web-based survey on the Privacy Rule. E-mail lists are updated annually for dues collection. Identical e-mails requesting participation in the survey were sent to the membership of each society three times, once a month during a 3-month period (January–April 2007). To avoid response duplication (a substantial number of epidemiologists belong to more than one organization), respondents were asked, both in the cover e-mail and in the introduction to the survey, to respond only once. Individual responses were submitted anonymously over the Internet so that they could not be linked to any individual.
IRB approval as an exempt protocol was obtained at the University of Pittsburgh, and the protocol was also reviewed and approved by the National Academies' IRB.

The 13 participating epidemiology societies sent e-mails to a total of 10,347 individual addresses. A cover e-mail asked professionals who were engaged in the conduct of U.S.-based human subjects research and who recognized the term Health Insurance Portability and Accountability Act or HIPAA to respond. A total of 2,805 individuals accessed the website, and 2,376 individuals answered a screening question that asked, "Since HIPAA was implemented in April 2003, how many new applications involving human subjects have you submitted to a U.S. IRB?" Respondents answering zero were thanked for their time, and no further questions were asked. The 1,527 respondents who provided a response of one or more are the participants in these analyses.
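The recruitment funnel above can be summarized with a short script. This is an illustrative sketch only: the stage labels are our own, and the counts are those reported in the text.

```python
# Illustrative summary of the epidemiologist survey recruitment funnel,
# using the counts reported above. Stage labels are our own, not the survey's.
funnel = [
    ("e-mail addresses contacted", 10_347),
    ("accessed the website", 2_805),
    ("answered the screening question", 2_376),
    ("eligible (one or more post-HIPAA IRB applications)", 1_527),
]

contacted = funnel[0][1]
for stage, n in funnel:
    # Express each stage as a share of all addresses contacted.
    print(f"{stage}: {n:,} ({n / contacted:.1%} of addresses contacted)")
```

As the script shows, the 1,527 analyzed respondents represent roughly 15 percent of the addresses originally contacted, a denominator that includes undeliverable addresses and members outside the survey's eligibility criteria.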
Survey Content

The survey questionnaire was developed by Roberta Ness with input and review by the IOM committee. Questions were asked about both positive and negative potential influences of the HIPAA Privacy Rule, including the influence of the Privacy Rule on participant privacy, confidentiality, and public trust, as well as on research procedures. Four general approaches were used to ascertain information.

First, questions with quantitative response categories were asked. These questions addressed issues such as the frequency of various types of data collection undertaken by respondents; changes in participant recruitment before and after the implementation of the Privacy Rule; frequency of IRB modifications secondary to Privacy Rule provisions and their effect; level of difficulty in obtaining deidentified data and waivers; familiarity with covered entities' opting out of research because of the Privacy Rule; studies conceived but not submitted to IRBs because of Privacy Rule concerns; and perceived effect of the Privacy Rule on patient confidentiality. Survey respondents were also asked about their gender, training, employment, and sources of funding.

Second, researchers were asked for their perceptions, rated on a 5-point Likert scale, about issues such as the ease and difficulty of conducting research under the Privacy Rule and the effect the Privacy Rule has had on participant privacy/confidentiality.

Third, respondents were asked whether and under what circumstances their IRB would approve each of five case studies. These involved retrieval of historical identified medical records; access to identified participants in a hospital-based cancer registry; access to deidentified data in a hospital-based tissue bank; review of medical records of deceased individuals; and a request for a limited dataset (as defined by the Privacy Rule) from a nonaffiliated hospital.
Finally, respondents were asked open-ended qualitative questions, including a final request: "Please tell us your stories about HIPAA. These will help us to understand all of the circumstances in which HIPAA has affected your research."

After development of a draft instrument, survey content was vetted and modified by members of the IOM committee. In a pilot phase, questions were distributed to 10 epidemiologists at the University of Pittsburgh. After completing the survey, the respondents were debriefed to identify ambiguities, streamline the instrument, and determine how readily a typical epidemiologist could answer the questions. After the instrument was finalized, timed pilot tests took 10 to 15 minutes to complete.

Statistical Analysis

Simple descriptive statistics, retaining each distinctive response category, were used to analyze these data. The 5-point Likert scales that were anchored by "none" and "a great deal" were collapsed into 1 to 2, 3, and 4 to 5. Categories were reported rather than central tendencies in order to retain the full and unedited character of the data. Only univariate analyses were reported because the focus was on describing the self-reported impact of the Privacy Rule, rather than on predictors of responses.

HARRIS INTERACTIVE POLL OF THE PUBLIC

Successive drafts of the survey questionnaire were prepared by Alan Westin, with input and review by the IOM committee members. The final version was then reviewed and edited by David Krane, vice president, Harris Interactive. The survey was conducted online by Harris Interactive between September 11 and 18, 2007, with 2,392 respondents. Both closed and open-ended questions were used. The results were adjusted by Harris to represent the total adult U.S. population in 2007, estimated at 225 million persons, not just those who go online. Figures for age, gender, race/ethnicity, education, region, and household income were weighted where necessary to bring them in line with their actual proportions in the population. Propensity score weighting was also used to adjust for respondents' propensity to use the Internet.

Respondents for this survey were selected from among those who have agreed to participate in Harris Interactive surveys. Because the sample is based on those who agreed to participate in the Harris Interactive panel, no estimates of theoretical sampling error can be calculated. According to Harris Interactive, all sample surveys and polls, whether or not they use probability sampling, are subject to multiple sources of error that most often cannot be quantified or estimated: sampling error, coverage error, error associated with nonresponse, error associated with question wording and response options, and postsurvey weighting and adjustments.
Therefore, Harris Interactive avoids the term "margin of error" because it is misleading. All that can be calculated are different possible sampling errors with different probabilities for pure, unweighted, random samples with 100 percent response rates; these are only theoretical because no published polls come close to this ideal.

For analytic purposes, standard demographics for cross-tabulations were collected for region, age, generation, gender, race, party affiliation, education, income, marital status, children in the household, sexual orientation, disabilities, political philosophy, and employment. In addition, a set of custom health demographics was created from respondents' answers to questions about their overall health status, whether they have been caregivers, whether they have or have had six specified types of health conditions, and whether they have had a genetic test.

Generally, a group whose response varied by 5 percent or more from the total public's response, within one of the demographic, health-related, or attitudinal subsets, was reported as a significant demographic variation. When the public total was 18 percent or less, a variation of 3 or 4 percent was used instead.

SURVEY OF HMORN RESEARCHERS AND IRB ADMINISTRATORS

Researcher Survey

A web-based survey was used to collect data about researchers' experience with the HIPAA Privacy Rule (e.g., how their research protocols may have been affected by HIPAA, knowledge of and attitudes toward the HIPAA Privacy Rule regulations, and limited demographic information). To be eligible, participants had to be members of the scientific faculty (i.e., assistant, associate, or full investigator level, or the equivalent ranking system, plus research associates or staff scientists) at one of the HMO Research Network sites listed below:

1. Geisinger Health System, Center for Health Research and Rural Advocacy
2. Group Health Center for Health Studies
3. Harvard Pilgrim Health Care, Department of Ambulatory Care & Prevention
4. HealthPartners Research Foundation
5. Henry Ford Health System, Center for Health Services Research & the Research Epidemiology Programs in cancer and biostatistics
6. Kaiser Permanente Colorado, Clinical Research Unit
7. Kaiser Permanente Northern California, Division of Research
8. Kaiser Permanente Northwest, The Center for Health Research (includes investigators from Kaiser Permanente Georgia & Kaiser Permanente Hawaii)
9. Kaiser Permanente Southern California, Department of Research and Evaluation
10. Lovelace Clinic Research Foundation, Health Services Research Division
11. Marshfield Clinic Research Foundation, Epidemiology Research Center
12. Meyers Primary Care Institute
13. Scott & White Health System, Research and Education Department

Successive drafts of the survey questionnaire were prepared by Sarah Greene, with input and review by the IOM committee.
An invitation e-mail was sent to all faculty members with a link to the web-based survey. Each respondent received a unique website address taking him or her to the survey, and once a respondent completed the survey, that website address could not be reused. The recipients also received two e-mail reminders to complete the web-based survey. A total of 235 investigators were invited to complete the survey. Of those, 26 of the e-mail invitations bounced back, and 2 individuals actively refused to complete the survey. A total of 89 individuals completed the survey, and the remaining 118 individuals never responded to the invitations.

The information obtained from the investigators included:

• The degree to which a study protocol was affected by the HIPAA Privacy Rule;
• Characteristics of the affected study;
• Attitudes toward the HIPAA Privacy Rule provisions;
• Specific structures or personnel created at their site to address HIPAA;
• Studies considered, but not implemented, due to real or perceived HIPAA-related concerns;
• Open-ended "comments" fields to allow researchers to elaborate on their responses;
• Select demographic items, including number of years in research;
• HMORN site membership; and
• A request to contact the researcher for a follow-up interview if web survey answers warranted it.

Nineteen respondents who reported a HIPAA-affected study in the web survey indicated a willingness to participate in a follow-up interview. These subjects were contacted via e-mail to schedule a telephone interview at a mutually convenient time. Three individuals opted out when they were contacted about scheduling the interview, and two could not be reached after 4 weeks of both e-mail and phone attempts. Twelve interviews were completed. The interviews were semi-structured to ensure systematic collection of key study details while also allowing each individual to describe his or her unique experience of conducting research under the HIPAA Privacy Rule.
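The invitation disposition reported for the researcher survey can be checked with a few lines of code. This is an illustrative sketch only: the category labels are our own, and the counts are those given in the text.

```python
# Illustrative check of the HMORN researcher survey invitation disposition,
# using the counts reported above (category labels are our own).
invited = 235
disposition = {"bounced": 26, "refused": 2, "completed": 89, "no response": 118}

# The four categories account for every invitation sent.
assert sum(disposition.values()) == invited

# Completion rate among invitations that were actually delivered.
delivered = invited - disposition["bounced"]
print(f"completion rate among delivered invitations: "
      f"{disposition['completed'] / delivered:.1%}")
```

Excluding the 26 undeliverable invitations from the denominator, as is common when reporting survey dispositions, yields a completion rate of about 43 percent among the 209 investigators who received the invitation.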
At least two members of the study team were present for each interview to assist with note taking and to capture all relevant information. The principal investigator then reviewed the qualitative data for common themes and unique issues. Each response was rated as positive, negative, or neutral. The information obtained from these investigators included:

• A general description of the study (e.g., purpose, design, protocol, data sources, intervention)
• What types of changes were made to the study as a result of the HIPAA regulations
• Whether these changes had a negative impact on the study design or timeline
• For multisite studies, whether differences arose across sites, and the nature of those differences
• Perceptions about their site's interpretations of HIPAA

For the web survey, descriptive characteristics were reported as means, medians, or frequencies. Frequencies were generated for categorical variables, and chi-square tests were used to analyze continuous variables. Survey responses were stratified by site, years of experience, and number of studies affected by HIPAA provisions. For the semi-structured interviews, qualitative information provided by the respondents was synthesized to determine the characteristics of the affected studies. The content of the interviews was also analyzed to identify recurrent themes.

IRB Administrator Survey

A mailed survey was used to collect data about IRB administrators' experience with the HIPAA Privacy Rule. The IRB administrators surveyed were the 15 who work at the HMO Research Network sites. Responses were received from 11 of the 15 sites. The survey was developed by Sarah Greene, with input and review by the IOM committee.
The survey asked questions regarding:

• The role of the IRB as it relates to HIPAA Privacy Rule compliance
• Knowledge of and attitudes toward the research-related provisions of the HIPAA Privacy Rule
• Procedures in place (or planned) to ensure adherence to HIPAA Privacy Rule provisions
• Approaches (e.g., training, new staff) established at the site to address HIPAA compliance
• Specific types of HIPAA Privacy Rule-related training/education developed by the site's IRB
• Sample scenarios of privacy breaches to see how each IRB would respond
• Impact of the HIPAA Privacy Rule on IRB process flow
• Desired training/guidance from federal agencies specifically about the HIPAA Privacy Rule
• Open-ended "comments" fields to allow respondents to elaborate on their responses
Given the small sample size (maximum n = 11), the analysis was limited primarily to reporting frequencies, means, and medians. Where warranted, selected frequencies were stratified based on site characteristics (e.g., volume of IRB applications and perceived impact of the Privacy Rule).

REFERENCES

Greene, S. M., S. Bennett, B. Kirlin, K. R. Oliver, R. Pardee, and E. Wagner. 2008. Impact of the HIPAA Privacy Rule in the HMO Research Network. Seattle, WA: Group Health Cooperative Center for Health Studies.

Ness, R. 2007. Influence of the HIPAA Privacy Rule on health research. JAMA 298(18):2164–2170.

Westin, A. 2007. How the public views privacy and health research. http://www.iom.edu/Object.File/Master/48/528/%20Westin%20IOM%20Srvy%20Rept%2011-1107.pdf (accessed November 11, 2007).