The Learning Healthcare System: Workshop Summary

4
New Approaches—Learning Systems in Progress

OVERVIEW

Incorporation of data generation, analysis, and application into healthcare delivery can be a major force in the acceleration of our understanding of what constitutes “best care.” Many existing efforts to use technology and create research networks to implement evidence-based medicine have produced scattered examples of successful learning systems. This chapter includes papers on the experiences of the Veterans Administration (VA) and the practice-based research networks (PBRNs) that demonstrate the power of this approach as well as papers outlining the steps needed to knit together these existing systems and expand these efforts nationwide toward the creation of a learning healthcare system. Highlighted in particular are key elements—including leadership, collaboration, and a research-oriented culture—that underlie successful approaches and their continued importance as we take these efforts to scale.

In the first paper, Joel Kupersmith discusses the use of the EHR to further evidence-based practice and research at the VA. Using diabetes mellitus as an example, he outlines how VistA (Veterans Health Information Systems and Technology Architecture) improves care by providing patient and clinician access to clinical and patient-specific information as well as a platform from which to perform research. The clinical data within electronic health records (EHRs) are structured such that data can be aggregated from within VA or with other systems such as Medicare to provide a rich source of longitudinal data for health services research (VA Diabetes Epidemiology Cohort [DEpiC]).

PBRNs are groups of ambulatory



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
practices, often partnered with hospitals, academic health centers, insurers, and others to perform research and improve the quality of primary care. These networks constitute an important portion of the clinical research enterprise by providing insight from research at the “coalface” of clinical care. Robert L. Phillips suggests that many lessons could be learned about the essential elements of building learning communities—particularly the organization and resources necessary—as well as how to establish such networks across many unique practice environments. The electronic component of the Primary Care Research Network PBRN has the potential to extend the capacity of existing PBRNs by providing an electronic connection that would enable the performance of randomized controlled trials (RCTs) and many other types of research in primary care practices throughout the United States.

While the work of the VA and PBRNs demonstrates immense potential for the integration of research and practice within our existing, fragmented healthcare system, the papers that follow look at how we might bring their success to a national scale. George Isham of HealthPartners lays out a plan to develop a national architecture for a learning healthcare system and discusses some recent activities by the AQA (formerly the Ambulatory Care Quality Alliance) to promote needed systems cooperation and use of data to bring research and practice closer together. In particular, AQA is focused on developing a common set of standardized measures for quality improvement and a strategy for their implementation; a unified approach to the aggregation and sharing of data; and common principles to improve public reporting. Citing a critical need for rapid advance in the evidence base for clinical care, Lynn Etheredge makes the case for the potential to create a rapidly learning healthcare system if we build wisely on existing resources and infrastructure.
In particular, he focuses on the potential for creating virtual research networks and the improved use of EHR data. Through the creation of national standards, the many EHR research registries and databases from the public and private sectors could become compatible. When coupled with the anticipated expansion of databases and registry development, these resources could be harnessed to provide insights from data that span populations, health conditions, and technologies. Leadership and stable funding are needed, along with a shift in how we think about access to data. Etheredge advances the idea of the “economics of the commons” as one to consider for data access, in which researchers would give up exclusive access to some data but benefit from access to a continually expanding database of clinical research data.

IMPLEMENTATION OF EVIDENCE-BASED PRACTICE IN THE VA1

Joel Kupersmith, M.D.
Veterans Administration

As the largest integrated delivery system in the United States, the Veterans Health Administration serves 5.3 million patients annually across nearly 1,400 sites of care. Although its patients are older, sicker, and poorer than the general U.S. population, VA’s performance now surpasses that of other health systems on standardized quality measures (Asch et al. 2004; Kerr et al. 2004; Jha et al. 2003). These advances are in part related to VA’s leadership in the development and use of the electronic health record, which has fostered veteran-centered care, continued improvement, and research.

Human and system characteristics have been essential to the transformation of VA care. Adding computers to a care delivery system unprepared to leverage the advantages of health information can create inefficiency and other negative outcomes (Himmelstein and Woolhandler 2005). In contrast, during the period in which VA deployed its EHR, the number of veterans seen increased from less than 3 million to nearly 5 million, while costs per patient and full-time employees per patient both decreased (Evans et al. 2006; Perlin et al. 2004). To understand how this could be possible, it is important to highlight historical and organizational factors that were important to the adoption of VA’s EHR.

VA health care is the product of decades of innovation. In 1930, Congress consolidated programs for American veterans under VA. Facing more than 1 million returning troops following World War II, VA partnered with the nation’s medical schools, gaining access to faculty and trainees and adding research and education to its statutory missions. That bold move created an environment uniquely suited to rapid learning. At present, VA has affiliations with 107 medical schools and trains almost 90,000 physicians and associated health professionals annually.
The VA was originally based on inpatient care, and administrative and legal factors created inefficiency and inappropriate utilization. By the 1980s, the public image of the VA was poor. In 1995, facing scrutiny from Congress, VA reorganized into 22 integrated care networks. Incentives were created for providing care in the most appropriate setting, and legislation established universal access to primary care.

1 This paper is adapted from an article copyrighted and published by Project HOPE/Health Affairs as Kupersmith et al., “Advancing Evidence Based Care in Diabetes Through Health Information Technology: Lessons from the Veterans Health Administration,” Health Affairs 26(2):w156-w168, 2007. The published article is archived and available online at www.healthaffairs.org.

These changes resulted in a reduction of 40,000 inpatient beds and an increase of 650 community-based care sites. Evidence-based practice guidelines and quality measures were adopted, and safeguards were put in place for vulnerable groups such as the mentally ill and those needing chronic care, while VA’s performance management system held senior managers accountable for evidence-based quality measures. All of these changes created a strong case for robust information systems and spurred dramatic improvements in quality (Jha et al. 2003; Perlin et al. 2004).

VistA: VA’s Electronic Health Record

Because VA was both a payer and a provider of care, its information system was developed to support patient care and its quality with clinical information, rather than merely capture charges and facilitate billing. In the early 1980s, VA created the Decentralized Hospital Computer Program (DHCP), one of the first EHRs to support multiple sites and healthcare settings. DHCP developers worked incrementally with a network of VA academic clinicians across the country, writing and testing code locally and transmitting successful products electronically to other sites, where they could be further refined. Over time, the group created a hospital information system prototype employing common tools for key clinical activities. The system was launched nationally in 1982, and by 1985, DHCP was operational throughout VA.

DHCP evolved to become the system now known as VistA, a suite of more than 100 applications supporting clinical, financial, and administrative functions. Access to VistA was made possible through a graphical user interface known as the Computerized Patient Record System (CPRS). With VistA-CPRS, providers can securely access patient information at the point of care and, through a single interface, update a patient’s medical history, place orders, and review test results and drug prescriptions.
Because VistA also stores medical images such as X-rays and photographs directly in the patient record, clinicians have access to all the information needed for diagnosis and treatment. As of December 2005, VistA systems contained 779 million clinical documents, more than 1.5 billion orders, and 425 million images. More than 577,000 new clinical documents, 900,000 orders, and 600,000 images are added each workday—a wealth of information for the clinician, researcher, or healthcare administrator.

Clinicians were engaged at the onset of the change process. This meant working incrementally to ensure usability and integration of the EHR with clinical processes. Both local and national supports were created (e.g., local “superusers” were designated to champion the project), and a national “Veterans Electronic Health University” facilitated collaboration among local, regional, and national sponsors of EHR rollout. National

performance measures, as well as the gradual withdrawal of paper records, made use of the EHR an inescapable reality. With reductions in time wasted searching for missing paper records and other benefits, staff came over time to view VistA-CPRS as an indispensable tool for good clinical care (Brown et al. 2003).

VistA-CPRS allows clinicians to access and generate clinical information about their individual patients, but additional steps are needed to yield insights into population health. Structured clinical data in the EHR can be aggregated within specialized databases, providing a rich source of data for VA administrators and health services researchers (Figure 4-1). Additionally, unstructured text data, such as clinician notes, can be reviewed and abstracted electronically from a central location. This is of particular benefit to researchers—VA multisite clinical trials and observational studies are facilitated by immediate 100 percent chart availability. Furthermore, VA has invested in an External Peer Review Program (EPRP), in which an independent external contractor audits the electronic text records to assess clinical performance using evidence-based performance criteria.

FIGURE 4-1 The sources and flow of the data most often used by VA researchers for national studies. Most data originate from the VistA system, but VA data can also be linked with external data sources such as Medicare claims data and the National Death Index.

Finally, data derived

from the EHR can be supplemented by information from other sources, such as Medicare utilization data or data from surveys of veterans.

Leveraging the EHR: Diabetes Care in VA

Much of the work that follows has been supported by VA’s Office of Research and Development through its Health Services Research and Development and Quality Enhancement Research Initiative (QUERI) programs (Krein 2002). Diabetes care in the VA illustrates the advantages of a national EHR supported by an intramural research program. Veterans with diabetes comprise about a quarter of those served, and the VA was an early leader in using the EHR for a national diabetes registry containing clinical elements as well as administrative data.

While VA’s EHR made a diabetes registry possible, operationalizing data transfer and transforming it into useful information did not come automatically or easily. In the early 1990s, VA began extracting clinical data from each local VA database into a central data repository. By the year 2000, the VA diabetes registry contained data for nearly 600,000 patients receiving care in the VA, including medications, test results, blood pressure values, and vaccinations. This information has subsequently been merged with Medicare claims data to create the DEpiC (Miller et al. 2004). Of diabetic veterans, 73 percent are eligible for Medicare, and 59 percent of dual eligibles use both systems. Adding Medicare administrative data results in less than 1 percent loss to follow-up, and while it is not as rich as the clinical information in VA’s EHR, its addition fills gaps in follow-up, complication rates, and resource utilization (Miller, D. Personal communication, March 10, 2006). Combined VA and Medicare data also reveal a prevalence of diabetes among veterans exceeding 25 percent.
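The gap-filling effect of adding Medicare claims to VA registry data can be sketched with a toy linkage. All field names and records below are hypothetical illustrations, not the actual DEpiC schema or linkage method.

```python
# Sketch: filling follow-up gaps by linking a VA registry extract with
# Medicare claims. Records and field names are invented for illustration.
va_registry = {
    "p1": {"last_va_visit": 2003},
    "p2": {"last_va_visit": 1999},   # looks lost to follow-up in VA data alone
    "p3": {"last_va_visit": 2004},
}
medicare_claims = {
    "p2": {"last_claim": 2004},      # dual eligible, still in care via Medicare
}

def followed_up(pid: str, cutoff: int = 2002) -> bool:
    """A patient counts as followed up if either system saw them after the cutoff."""
    va_year = va_registry.get(pid, {}).get("last_va_visit", 0)
    cms_year = medicare_claims.get(pid, {}).get("last_claim", 0)
    return va_year >= cutoff or cms_year >= cutoff

va_only = [p for p in va_registry if va_registry[p]["last_va_visit"] >= 2002]
combined = [p for p in va_registry if followed_up(p)]
# VA data alone: 2 of 3 patients followed up; with Medicare added: 3 of 3
```

The same logic, scaled to the full registry, is what drives the reported drop in loss to follow-up when claims data are added.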
The impact of the diabetic population on health expenditures is considerable, including total inpatient expenditures (VA plus Medicare) of $3.05 billion ($5,400 per capita) in fiscal year 1999 (Pogach and Miller 2006).

The rich clinical information made possible through the EHR yields other insights. For example, VA has identified a high rate of comorbid mental illness (24.5 percent) among patients with diabetes and is using that information to understand the extent to which newer psychotropic drugs, which promote weight gain, as well as mental illness itself, contribute to poor outcomes (Frayne et al. 2005). The influence of gender and race or ethnicity can also be more fully explored using EHR data (Safford et al. 2003). The EHR also facilitates delineating and tracking diabetic complications, for example, the progression of diabetic kidney disease. Using clinical data from the EHR allows identification of early chronic kidney disease in a third of veterans with diabetes, less than half of whom have renal

impairment indicated in the record (Kern et al. 2006). VA is able to use the EHR to identify patients at high risk for amputation and is distributing that information to clinicians in order to better coordinate their care (Robbins, J. Personal communication, February 17, 2006).

EHR-Enabled Approaches to Monitoring Quality and Outcomes

Traditional quality report cards may give health providers incentives to disenroll the sickest patients (Hofer et al. 1999). VA’s EHR provides a unique opportunity to construct less “gameable” quality measures that assess how well care is managed for the same individual over time for diseases such as diabetes, where metrics of process quality, intermediate outcomes, and complications (vision loss, amputation, renal disease) are well defined. Using the VA diabetes registry, longitudinal changes within individual patients can be tracked. In Figure 4-2, case-mix-adjusted glycosylated hemoglobin values among veterans with diabetes changed by −0.314 percentage points (range −1.90 to 1.03, p < .0001) over two years, indicating improved glycemic control over time, rather than simply the enrollment of healthier veterans (Thompson et al. 2005). These findings provide a convincing demonstration of effective diabetic care.

FIGURE 4-2 Trends in mean hemoglobin A1c levels.
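The distinction between a falling cross-sectional average and genuine within-patient improvement can be illustrated with a toy calculation; all values below are invented for illustration and are not the VA figures.

```python
# Toy illustration: a cross-sectional average can "improve" merely because
# healthier patients enroll, while within-patient change isolates real change.
year1 = {"a": 9.0, "b": 8.0}             # HbA1c (%) by patient, year 1
year2 = {"a": 9.0, "b": 8.0, "c": 6.5}   # same values plus a new, healthier enrollee

mean1 = sum(year1.values()) / len(year1)
mean2 = sum(year2.values()) / len(year2)
cross_sectional_shift = mean2 - mean1    # negative: looks like improvement

shared = year1.keys() & year2.keys()     # patients observed in both years
within_patient_change = sum(year2[p] - year1[p] for p in shared) / len(shared)
# within_patient_change is 0.0 here: no patient actually improved
```

Tracking the same individuals over time, as the registry permits, is what makes the measure hard to game through selective enrollment.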

Longitudinal data have other important uses. For example, knowledge of prior diagnoses and procedures can distinguish new complications from preexisting conditions. This was shown to be the case for estimates of amputation rates among veterans with diabetes, which were 27 percent lower once prior diagnoses and procedures were considered. Thus longitudinal data better reflect the effectiveness of the management of care and can help health systems avoid being unfairly penalized for adverse selection (Tseng et al. 2005). Longitudinal data from the EHR are also important for evaluating the safety and effectiveness of treatments, which are critical insights for national formulary decisions.

Advancing Evidence-Based Care

Figure 4-3 shows the trends in VA’s national performance scorecard for diabetes care based on EHR data. In addition to internal benchmarking, this approach has been used to compare VA performance to commercial managed care (Kerr et al. 2004; Sawin et al. 2004). While these performance data are currently obtained by abstracting the electronic chart, the completion of a national Health Data Repository with aggregated relational data will eventually support automatic queries about quality and outcomes, ranging from the individual patient to the entire VA population (see below). The richness of EHR data allows VA to refine its performance measures.

FIGURE 4-3 Improvement in VA diabetes care (based on results from the VA External Peer Review Program).
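The use of prior history to separate new (incident) complications from preexisting (prevalent) ones, described above for amputation rates, can be sketched as follows. The records and the history flag are hypothetical, and the counts are illustrative rather than the 27 percent figure reported in the text.

```python
# Sketch: longitudinal history distinguishes incident amputations from
# prevalent ones. Without history, every recorded event looks new.
events_2004 = [
    {"patient": "p1", "prior_amputation": False},
    {"patient": "p2", "prior_amputation": True},   # preexisting, not a new event
    {"patient": "p3", "prior_amputation": False},
]

naive_count = len(events_2004)  # counts every recorded amputation as new
incident = [e for e in events_2004 if not e["prior_amputation"]]
# Excluding patients with documented prior events lowers the apparent rate,
# which is the mechanism behind the adjustment described in the text.
```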

VA investigators were able to demonstrate that annual retinal screening was inefficient for low-risk patients and inadequate for those with established retinopathy (Hayward et al. 2005). As a consequence, VA modified its performance metrics and is developing an approach to risk-stratified screening that will be implemented nationally.

The greatest advantage of the EHR in the VA is its ability to improve performance by influencing the behavior of patients, clinicians, and the system itself. For instance, VA’s diabetes registry has been used to construct performance profiles for administrators, clinical managers, and clinicians. These profiles included comparisons between facilities and identified the proportion of veterans with substantial elevations of glycosylated hemoglobin, cholesterol, and blood pressure. Patient lists also facilitated follow-up with high-risk individuals. Additionally, the EHR allowed consideration of the actions taken by clinicians to intensify therapy in response to elevated levels (e.g., starting or increasing a cholesterol medication when the low-density lipoprotein cholesterol is elevated). This approach credits clinicians for providing optimal treatment and also informs them about what action might be required to improve care (Kerr et al. 2003).

Data from the EHR and diabetes registry also demonstrated the critical importance of defining the level of accountability in diabetes quality reporting. EHR data show that for most measures in the VA, only a small fraction (2 percent) of the variance is attributable to individual primary care providers (PCPs), and unless panel sizes are very large (200 diabetics or more), PCP profiling will be inaccurate. In contrast, much more variation (12-18 percent) was attributed to overall performance at the site of care, a factor of relevance for the design of approaches to rewarding quality.
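One standard way to see the panel-size point is the Spearman-Brown relationship between the share of variance that lies between providers and the reliability of a provider's panel mean. This is a textbook psychometric identity, offered here as an illustration of the reported variance shares, not necessarily the exact method used in the VA analyses.

```python
# Reliability of a provider-level profile as a function of panel size, under a
# one-way random-effects view: a fraction rho of measure variance lies between
# providers, the rest within panels (Spearman-Brown prophecy formula).
def profile_reliability(rho: float, panel_size: int) -> float:
    return panel_size * rho / (1 + (panel_size - 1) * rho)

# rho around 0.02 for individual PCPs (per the text): profiles are mostly
# noise until panels are very large.
r_50 = profile_reliability(0.02, 50)       # roughly 0.51: half noise
r_200 = profile_reliability(0.02, 200)     # roughly 0.80

# rho in the 0.12-0.18 range at the site level: far fewer patients needed
# for a dependable site-level comparison.
r_site_50 = profile_reliability(0.15, 50)  # roughly 0.90
```

With only 2 percent of variance between PCPs, a panel of about 200 diabetic patients is needed before a profile reaches the conventional 0.8 reliability threshold, which matches the caution in the text.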
It also highlights the important influence of organizational and system factors on provider adherence to guidelines (Krein 2002).

The EHR can identify high-risk populations and can facilitate targeted interventions. For instance, poor blood pressure control contributes significantly to cardiovascular complications, the most common cause of death in diabetics. VA investigators are currently working with VA pharmacy leaders to find gaps in medication refills or lack of medication titration and thereby proactively identify patients with inadequate blood pressure control due to poor medication adherence or inadequate medication intensification. Once identified, those patients can be assigned proactive management by clinical pharmacists integrated into primary care teams and trained in behavioral counseling (Choe et al. 2005). Other approaches currently being tested and evaluated using EHR data are group visits, peer counseling, and patient-directed electronic reminders.

VistA-CPRS provides additional tools to improve care at the point of service. For example, PCPs get reminders about essential services (e.g., eye

exams, influenza vaccinations) at the time they see the patient, and CPRS functions allow providers and patients to view trends in laboratory values and blood pressure control. Perhaps most importantly, the VA’s EHR allows for effective care coordination across providers in order to communicate patients’ needs, goals, and clinical status as well as to avoid duplication of services.

Care Coordination and Telehealth for Diabetes

In-home monitoring devices can now collect and transmit vital data for high-risk patients from the home to a care coordinator, who can make early interventions that might prevent the need for institutional care (Huddleston and Cobb 2004). Such a coordinated approach is possible only with an EHR. Based on promising pilot data as well as needs projections, VA has implemented a national program of Care Coordination through Home Telehealth (CCHT) (Chumbler et al. 2005).

Information technology also supports cost-effective access to specialized services. VA recently piloted the use of digital retinal imaging to screen for diabetic retinopathy and demonstrated that it could be a cost-effective alternative to ophthalmoscopy for detecting proliferative retinopathy (Conlin et al. 2006). Diabetic retinopathy is not only a preventable complication but also a biomarker for other end-organ damage (e.g., kidney damage). In October 2005, VA began implementing a national program of teleretinal imaging to be available on VistA-CPRS for use by clinicians and researchers. In the future, computerized pictorial analysis and new tools for mining text data across millions of patient records have the potential to transform the clinical and research enterprise by identifying biomarkers of chronic illness progression.
Limits of the EHR in VA

Although VA has one of the most sophisticated EHRs in use today, VistA is not a single system but rather a set of 128 interlinked systems, each with its own database (i.e., a decentralized system with central control). This limits its ability to make queries against all of a patient’s known data. In addition, lack of standardization for laboratory values such as glycosylated hemoglobin and other data elements creates challenges for aggregating available data for administrative and research needs. The VA diabetes registry, while a product of the EHR, took years of effort to ensure data integrity. A national data standardization project is currently under way to ensure that data elements are compliant with emerging health data standards and data management practices. Extracting data from free-text data fields,

a challenge for all electronic records, will be addressed by defining moderately structured data elements for public health surveillance, population health, clinical guidelines compliance, and performance monitoring. Mapping of legitimate local variations to standard representations will allow easier creation of longitudinal registries for a variety of conditions.

The care of diabetes is complex and demanding, and delivering all indicated services may require more time than is typically available in a follow-up visit (Parchman, Romero, and Pugh 2006). Studies of the impact of the EHR on workflow and efficiency in VA and other settings have shown conflicting results (Overhage et al. 2001). While it is unlikely that the EHR saves time during the office encounter, downstream benefits such as better care coordination, reduction of duplicative and administrative tasks, and new models of care (e.g., group visits) translate into a “business case” when the reimbursement structure favors population management.

Creating Patient-Centered, Community-Based Care: My HealtheVet

VA’s quality transformation since 1996 has involved shifting from inpatient to integrated care. The next phase will involve empowering patients to be more actively engaged and moving care from the clinic to the community and home. Again, health information technology has been designed to support the new delivery system. My HealtheVet (MHV) is a nationwide initiative intended to improve the overall health of veterans and support greater communication between VA patients and their providers. Through the MHV web portal, veterans can securely view and manage their personal health records online, as well as access health information and electronic services.
Veterans can request copies of key portions of their VA health records and store them in a personal “eVAult,” along with self-entered health information and assessments, and can share this information with their healthcare providers and others inside and outside VA. The full functionality of MHV will help patients plan and coordinate their own care through online access to care plans, appointments, laboratory values, and reminders for preventive care (U.S. Department of Veterans Affairs 2006).

Research itself can be facilitated by MHV—patients will be able to identify ongoing clinical studies for which they are eligible to enroll, communicate with investigators via encrypted e-mail, have their outcomes tracked through computer-administered “smart surveys,” and even provide suggestions for future studies. In addition, the effectiveness of patient-centered care can be evaluated.

directing the state agencies to use incentives in the purchasing of health care that are based on established statewide goals, against the achievement of performance using specific targets and standard quality measures based on evidence-based clinical standards (Office of Governor Tim Pawlenty 2006). This initiative (QCARE) was designed by a work group that used the seven-step model in developing its recommendations.

Institutions in place in Minnesota that enable this model include the Institute for Clinical Systems Improvement (ICSI), a collaborative of physician medical groups and health plans that develops and implements evidence-based clinical practice guidelines and provides technical assistance to improve clinical care (Institute for Clinical Systems Improvement 2006). ICSI is an effective way to engage the support of local practicing physicians for evidence-based standards of care; it involves group practices large and small that represent 75 percent of the nonfederal physicians practicing in the state. Minnesota Community Measurement, a second Minnesota collaborative of medical providers, health plans, purchasers, and consumers, collects and publicly reports clinical quality using standard measures grounded in ICSI’s evidence-based clinical standards work; it is another critical component of the support system for quality in Minnesota (MN Community Measurement 2006). Incentives for working on improving quality are provided not only by the governor’s QCARE program, but also by the private health plans in their pay-for-performance programs and by Bridges to Excellence, a national pay-for-performance program (Bridges to Excellence 2006).
All of these use the Institute for Clinical Systems Improvement and Minnesota Community Measurement as the common mechanism for creating incentives founded on evidence-based clinical standards, guidelines, targets, and measures. The individual organizational quality results are publicly reported.

The Minnesota experience can inform the effort to create a national system to support the improvement of quality of care across the country. NQIPA would be most effective as a federation of regional systems, able to engage local providers of care, knit together by a national system of standards and rules, and supported by mechanisms to aggregate data from national and regional sources for two purposes. The first is to report quality improvement progress on a national level for Medicare and other national purchasers. The second is to feed back performance at a regional level, enabling local healthcare organizations and providers to be engaged in and part of the process of actually improving the quality of care over time. It is also critical that Medicare, national medical specialty societies, national purchasers, and others with national perspectives and needs be a part of, and served well by, this system. It is important, therefore,

that NQIPA implement national standards and priorities uniformly in each region of the country and be able to aggregate data and information at the multiregional and national levels.

Progress has been made at the national level, although there is much yet to be done. National goals have been suggested (IOM 2003). What is needed next is identification of the top 10 quality issues and problems by specialty, including underuse, overuse, and misuse, to drive the development of evidence-based guidelines, measures, and incentives by specialty. Many groups produce useful evidence-based recommendations and guidelines, but what is needed now are more evidence-based reviews of acute care, chronic care, and comparative drug effectiveness.

In addition, many organizations use incentives. In 2005, 81 commercial health plans had pay-for-performance programs, and the Centers for Medicare and Medicaid Services (CMS) was sponsoring six pay-for-performance demonstrations (Raths 2006). In August 2006, the President signed an executive order promoting quality and efficient health care in federal government-administered or sponsored healthcare programs (The White House 2006). Already mentioned above is the effort by large employers to support incentives through the national Bridges to Excellence program, not only in Minnesota but in many states across the country. Needed now are more healthcare purchasing organizations synchronizing their incentives to standard targets, against standard measures, that address the most important quality issues across the private and public sectors.

There are effective national and regional efforts that engage healthcare organizations and individual physicians in improving the quality of care (Institute for Clinical Systems Improvement 2006; Institute for Healthcare Improvement 2006).
Unfortunately, not all physician practices and regions of the country are taking advantage of these healthcare improvement resources. More regional collaboratives are necessary to facilitate improvement in care in all regions of the country. Above all, these individual efforts need to be knit together to form a national strategy and support system: NQIPA.

The AQA Alliance

The AQA Alliance (www.aqaalliance.org) is a broad-based national collaborative of physicians, consumers, purchasers, health insurance plans, and others that was founded to improve healthcare quality and patient safety through a collaborative process. Key stakeholders agree on a strategy for measuring the performance of physicians and medical groups, collecting and aggregating data in the least burdensome way, and reporting meaningful information to consumers, physicians, and stakeholders to inform choices and improve outcomes. The effort's goals are to reach consensus as soon as possible on:

- a set of measures for physician performance that stakeholders can use in private health insurance plan contracts and with government purchasers;
- a multiyear strategy to roll out additional measurement sets and implement measures in the marketplace;
- a model (including framework and governing structure) for aggregating, sharing, and stewarding data; and
- the critical steps needed for reporting useful information to providers, consumers, and purchasers.

Currently there are more than 125 AQA Alliance-affiliated organizations, including the American College of Physicians, American Academy of Family Physicians, American College of Surgeons, American Association of Retired Persons, Pacific Business Group on Health, America's Health Insurance Plans, and many others. Much progress has been made since the AQA Alliance was established in late 2005.

The performance measures workgroup (www.aqaalliance.org/performancewg.htm) has established a framework for the selection of measures; principles for the use of registries in clinical practice settings; a guide for the selection of measures for medical subspecialty care; principles for efficiency measures, along with a starter set of conditions for which cost-of-care measures should be developed first; and a starter set of 26 primary care measures. In addition, eight cardiology measures, as well as measures for dermatology, rheumatology, clinical endocrinology, radiology, neurology, ophthalmology, surgery, and orthopedic and cardiac surgery, have been approved. The AQA parameters for the selection of measures emphasize that "measures should be reliable, valid and based on sound scientific evidence. Measures should focus on areas which have the greatest impact in making care safe, effective, patient-centered, timely, efficient or equitable. Measures which have been endorsed by the NQF should be used when available. The measure set should include, but not be limited to, measures that are aligned with the IOM's priority areas. Performance measures should be developed, selected and implemented through a transparent process" (AQA Alliance 2006).

The data-sharing and aggregation workgroup (www.aqaalliance.org/datawg.htm) has produced principles for data sharing and aggregation; provided a recommendation for a National Health Data Stewardship Entity (NHDSE) to set standards, rules, and policies for data sharing and aggregation; described desirable characteristics of an NHDSE; developed guidelines and key questions for physician data aggregation projects; and established principles to guide the use of information technology systems that support performance measurement and reporting, so as to ensure that electronic health record systems can report these data as part of routine
practice. As a consequence of this workgroup's effort, six AQA pilot sites were announced in March 2006: the California Cooperative Healthcare Reporting Initiative, Indiana Health Information Exchange, Massachusetts Health Quality Partners, Minnesota Community Measurement, Phoenix Regional Healthcare Value Measurement Initiative, and Wisconsin Collaborative for Healthcare Quality. These pilots are to serve as learning labs that link public and private datasets and assess clinical quality, cost of care, and patient experience. Each of these sites has strong physician leadership, a rich history of collaboration on quality and data initiatives, and the necessary infrastructure and experience to support public and private dataset aggregation. The collaboration across health plans and providers in these six pilot efforts yields a comprehensive view of physician practice. The lessons from the pilot sites can provide valuable input for the establishment of a national framework for measurement, data sharing, and reporting (NQIPA).

The third AQA Alliance workgroup is the reporting workgroup (www.aqaalliance.org/reportingwg.htm). It has produced principles for public reporting as well as principles for reporting to clinicians and hospitals, and it has held discussions on reporting models and formats.

There are significant opportunities and challenges for the work of the AQA Alliance. Among the opportunities are expansion of the measurement sets to address the critical quality issues in all specialties and expansion of the six pilot sites to form a national network of regional data aggregation collaboratives covering all regions of the country. The engagement and support of all medical and surgical specialty groups, as well as physicians and their organizations, are critical to the success of this work. Determining a business model and funding sources for the expansion of the pilot sites and the operation of the NHDSE is a significant challenge. The expansion of the measurement set to address cost of care, access to care, equity, and patient-centered issues represents a major opportunity as well as a significant methodological challenge. Determining the best legal structure and positioning of the NHDSE between the public and private sectors will be critical to its success in setting standards and rules for data aggregation for the public and private sectors. Establishing a common vision for the NQIPA will be important for mobilizing the effort necessary to maximize the value of priority setting, evidence-based medicine, target setting, measurement development, data aggregation, incentives for improved performance, and the public reporting of performance. Getting on with the task of implementing this vision is urgent. Every year that goes by without effective action represents another year of a quality chasm not bridged, of lives lost needlessly, and of quality of life diminished unnecessarily.

ENVISIONING A RAPID LEARNING HEALTHCARE SYSTEM

Lynn Etheredge
George Washington University

The United States can develop a rapid learning healthcare system. New research capabilities now emerging, namely large electronic health record databases, predictive computer models, and rapid learning networks, will make it possible to advance clinical care from the experience of tens of millions of patients each year. With collaborative initiatives in the public and private sectors, a national goal could be for the health system to learn about the best uses of new technologies at the same rate that it produces new technologies. This could be termed a rapid learning health system (Health Affairs 2007).

There is still much to be done to reach that goal. Biomedical researchers and technology firms are expanding knowledge and clinical possibilities much faster than the health system can assess these technologies. Already, there are growing concerns about the evidence base for clinical care and its gaps and biases (Avorn 2004; Kassirer 2005; Abramson 2004; Ioannidis 2005; Deyo and Patrick 2005). Technological change is now the largest factor driving our highest-in-the-world health spending, which exceeds $2 trillion per year. With advances in the understanding of the human genome and a doubling of the NIH research budget to more than $28 billion, there may be an even faster stream of new treatment options. Neither government regulation, healthcare markets, consumers, physicians, nor health plans will be able to deal with these technology issues, short of rationing, unless there are more rapid advances in the evidence base for clinical care.

The "inference gap" concept, described by Walter Stewart earlier in this volume, incisively captures the knowledge issues that confront public officials, physicians, and patients for the 85 million enrollees in the Medicare and Medicaid programs.
As he notes, the clinical trials database is built from randomized clinical trials (RCTs) that mostly use younger populations with single diagnoses. RCT patients are very different from Medicare and Medicaid enrollees, who are mostly older patients with multiple diagnoses and treatments, women and children, and individuals with seriously disabling conditions. With Medicare and Medicaid now costing more than $600 billion annually, and projected to cost $3.5 trillion over the next five years, there is a fiscal, as well as a medical, imperative to learn more rapidly about what works in clinical care. As a practical matter, we cannot learn all that we would like to know, as rapidly as we need to know it, through RCTs; we need to find powerful and efficient ways to learn rapidly from practice-based evidence.

Large EHR research databases are the key development that makes it possible to create a rapid learning health system. The VA and Kaiser Permanente are the public- and private-sector leaders; their new research databases each hold more than 8 million EHRs, and both are likely to add genomic information. New networks with EHR databases, "virtual research organizations," add even more to these capacities. For instance, the HMO Research Network (HMORN), with 14 HMOs and 40 million enrollees, is sponsoring the Cancer Research Network (about 10 million patient EHRs) with the National Cancer Institute, as well as the Vaccine Safety Datalink (about 6 million patient records) with the Centers for Disease Control and Prevention (CDC). Institutions with EHR databases and genome data include Children's Hospital of Philadelphia, Marshfield, Mayo, and Geisinger.

Large research projects that need to access paper health records from multiple sites are now administratively complicated, time-consuming, expensive, and done infrequently. In contrast, studies with computerized EHR databases and new research software will be done from a computer terminal in hours, days, or a few weeks. Thousands of large population studies will be doable quickly and inexpensively.

A fully operational national rapid learning system could include many such databases, sponsors, and networks. It could be organized in many different ways: by enrolled population (the VA, Medicare, Medicaid, private health plans); by healthcare profession (specialist registries); by institution (multispecialty clinics and academic health centers); by health condition (disease registries and national clinical studies databases); by technology (drug safety and efficacy studies, coverage-with-evidence-development studies); by geographic area (the Framingham study); by age cohort (the National Children's Study); or by special population (minorities, genomic studies).
With national standards, all EHR research registries and databases could be compatible and multiuse.

The key short-term issues for advancing a rapid learning strategy include leadership and the development of learning networks, the development of research programs, and funding. As reflected in the spectacularly rapid advances of the Human Genome Project and its sequels, we should be thinking about creating a number of leading-edge networks that cut across traditional organizational boundaries. Among potential new research initiatives, it is notable that large integrated delivery systems, such as Kaiser and the VA, have been early leaders, and that many parts of NIH could be doing much more to develop ongoing national research networks (see paper by Katz, Chapter 5) and EHR databases. With respect to new databases, the NIH could require reporting of all its publicly funded clinical studies, in EHR-type formats, into national computer-searchable NIH databanks; peer-reviewed journals could require that the datasets of the clinical studies they publish
also be available to the scientific community through such NIH databanks (National Cancer Institute 2005). This rapid learning strategy would take advantage of what economists term the "economics of the commons": each individual researcher would give up exclusive access to some data but would benefit, in return, from access to a vast and expanding treasure trove of clinical data from the international research community. With these carefully collected, rich data resources, powerful mathematical modeling approaches will be able to advance systems biology, "virtual" clinical trials, and scientific prediction-based health care much more rapidly. There will also be benefits for research on the heterogeneity of treatment responses and for the design of "practical clinical trials" to fill evidence gaps (see papers by Tunis, Chapter 1, and by Eddy and Greenfield, Chapter 2).

Another important research initiative would be to develop "fast track" learning strategies to evaluate promising new technologies. One suggested model is to establish study designs for new technologies when they are first approved and to review the evidence from patient experience at a specified date (e.g., three years later) to help guide physicians, patients, and future research as these technologies diffuse into wider use (Avorn 2004).

To implement a national learning strategy, the Department of Health and Human Services (HHS) health agencies and the VA could be designers and funders of key public or private initiatives.
HHS first-year initiatives could include:

- expanding on the National Cancer Institute's (NCI's) Cancer Research Network with NIH networks for heart disease and diabetes;
- starting national computer-searchable databases for NIH, Food and Drug Administration (FDA), and other clinical studies;
- broadly expanding AHRQ's research to address Medicare Rx, Medicaid, national health spending, socioeconomic and racial disparities, effectiveness, and quality issues;
- expanding CDC's Vaccine Safety Datalink network and FDA's post-market surveillance into a national FDA-CDC program for evaluating drug safety and efficacy, including pharmacogenomics;
- starting national EHR research programs for Medicaid's special-needs populations; and
- initiating a national "fast track" learning system for evaluating promising new technologies.

A first-year budget of $50 million for these initiatives takes into account that research capabilities are still constrained by the state of EHR database and research tool development. Within five years, a national rapid learning strategy could be taken to scale with about $300 million a year.

To move forward, a national learning strategy also needs vision and consensus. The IOM is already playing a key catalytic role through this workshop and the publication of these papers. This paper identifies many opportunities for the public and private sectors to collaborate in building a learning healthcare system.

REFERENCES

Abramson, J. 2004. Overdosed America. New York: HarperCollins.
AQA Alliance. 2006 (April). AQA Parameters for Selecting Measures for Physician Performance. Available from http://www.aqaalliance.org/files/AQAParametersforSelectingAmbulatoryCare.doc (accessed April 4, 2007).
Asch, S, E McGlynn, M Hogan, R Hayward, P Shekelle, L Rubenstein, J Keesey, J Adams, and E Kerr. 2004. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Annals of Internal Medicine 141(12):938-945.
Avorn, J. 2004. Powerful Medicines. New York: Alfred A. Knopf.
Bridges to Excellence. 2006. Bridges to Excellence Overview. Available from http://www.bridgestoexcellence.org/bte/about_us/home.htm (accessed November 30, 2006).
Brown, S, M Lincoln, P Groen, and R Kolodner. 2003. VistA: U.S. Department of Veterans Affairs national-scale HIS. International Journal of Medical Informatics 69(2-3):135-156.
Choe, H, S Mitrovich, D Dubay, R Hayward, S Krein, and S Vijan. 2005. Proactive case management of high-risk patients with Type 2 diabetes mellitus by a clinical pharmacist: a randomized controlled trial. American Journal of Managed Care 11(4):253-260.
Chumbler, N, B Neugaard, R Kobb, P Ryan, H Qin, and Y Joo. 2005. Evaluation of a care coordination/home-telehealth program for veterans with diabetes: health services utilization and health-related quality of life. Evaluation and the Health Professions 28(4):464-478.
Conlin, P, B Fisch, J Orcutt, B Hetrick, and A Darkins. 2006. Framework for a national teleretinal imaging program to screen for diabetic retinopathy in Veterans Health Administration patients. Journal of Rehabilitation Research and Development 43(6):741-748.
Deyo, R, and D Patrick. 2005. Hope or Hype. New York: American Management Association/AMACOM Books.
Evans, D, W Nichol, and J Perlin. 2006. Effect of the implementation of an enterprise-wide electronic health record on productivity in the Veterans Health Administration. Health Economics, Policy, and Law 1(2):163-169.
Frayne, S, J Halanych, D Miller, F Wang, H Lin, L Pogach, E Sharkansky, T Keane, K Skinner, C Rosen, and D Berlowitz. 2005. Disparities in diabetes care: impact of mental illness. Archives of Internal Medicine 165(22):2631-2638.
Green, L. 2000. Putting practice into research: a 20-year perspective. Family Medicine 32(6):396-397.
Green, L, and S Dovey. 2001. Practice based primary care research networks. They work and are ready for full development and support. British Medical Journal 322(7286):567-568.
Green, L, G Fryer Jr., B Yawn, D Lanier, and S Dovey. 2001. The ecology of medical care revisited. New England Journal of Medicine 344(26):2021-2025.
Greenfield, S, and S Kaplan. 2004. Creating a culture of quality: the remarkable transformation of the Department of Veterans Affairs Health Care System. Annals of Internal Medicine 141(4):316-318.
Hayward, R, C Cowan Jr., V Giri, M Lawrence, and F Makki. 2005. Causes of preventable visual loss in type 2 diabetes mellitus: an evaluation of suboptimally timed retinal photocoagulation. Journal of General Internal Medicine 20(5):467-469.
Health Affairs. 2007. A rapid learning health system. Health Affairs (collection of articles, special web edition) 26(2):w107-w118.
Himmelstein, D, and S Woolhandler. 2005. Hope and hype: predicting the impact of electronic medical records. Health Affairs 24(5):1121-1123.
Hofer, T, R Hayward, S Greenfield, E Wagner, S Kaplan, and W Manning. 1999. The unreliability of individual physician "report cards" for assessing the costs and quality of care of a chronic disease. Journal of the American Medical Association 281(22):2098-2105.
Huddleston, M, and R Cobb. 2004. Emerging technology for at-risk chronically ill veterans. Journal of Healthcare Quality 26(6):12-15, 24.
Institute for Clinical Systems Improvement. 2006. Available from http://www.icsi.org/about/index.asp (accessed November 30, 2006).
Institute for Healthcare Improvement. 2006. Available from http://www.ihi.org/ihi (accessed December 3, 2006).
IOM (Institute of Medicine). 1996. Primary Care: America's Health in a New Era. Washington, DC: National Academy Press.
———. 1999. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press.
———. 2001. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press.
———. 2003. Priority Areas for National Action: Transforming Health Care Quality. Washington, DC: The National Academies Press.
———. 2005. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press.
———. 2006. The Richard and Hilda Rosenthal Lectures 2005: The Next Steps Toward Higher Quality Health Care. Washington, DC: The National Academies Press.
Ioannidis, J. 2005. Contradicted and initially stronger effects in highly cited clinical research. Journal of the American Medical Association 294(2):218-228.
Isham, G, and G Amundson. 2002 (October). A seven step process model for quality improvement. Group Practice Journal 40.
Jha, A, J Perlin, K Kizer, and R Dudley. 2003. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. New England Journal of Medicine 348(22):2218-2227.
Kassirer, J. 2005. On the Take. New York: Oxford University Press.
Kern, E, M Maney, D Miller, C Tseng, A Tiwari, M Rajan, D Aron, and L Pogach. 2006. Failure of ICD-9-CM codes to identify patients with comorbid chronic kidney disease in diabetes. Health Services Research 41(2):564-580.
Kerr, E, D Smith, M Hogan, T Hofer, S Krein, M Bermann, and R Hayward. 2003. Building a better quality measure: are some patients with "poor quality" actually getting good care? Medical Care 41(10):1173-1182.
Kerr, E, R Gerzoff, S Krein, J Selby, J Piette, J Curb, W Herman, D Marrero, K Narayan, M Safford, T Thompson, and C Mangione. 2004. Diabetes care quality in the Veterans Affairs Health Care System and commercial managed care: the TRIAD study. Annals of Internal Medicine 141(4):272-281.
Kleinke, J. 2005. Dot-gov: market failure and the creation of a national health information technology system. Health Affairs 24(5):1246-1262.
Krein, S. 2002. Whom should we profile? Examining diabetes care practice variation among primary care providers, provider groups, and health care facilities. Health Services Research 35(5):1160-1180.
Miller, D, M Safford, and L Pogach. 2004. Who has diabetes? Best estimates of diabetes prevalence in the Department of Veterans Affairs based on computerized patient data. Diabetes Care 27(Suppl. 2):B10-B21.
MN Community Measurement. 2006. Our Community Approach. Available from http://mnhealthcare.org/~wwd.cfm (accessed November 30, 2006).
Mold, J, and K Peterson. 2005. Primary care practice-based research networks: working at the interface between research and quality improvement. Annals of Family Medicine 3(Suppl. 1):S12-S20.
National Cancer Institute. 2005 (June). Restructuring the National Cancer Clinical Trials Enterprise. Available from http://integratedtrials.nci.nih.gov/ict/ (accessed May 24, 2006).
Office of Governor Tim Pawlenty. 2006 (July 31). Governor Pawlenty Introduces Health Care Initiative to Improve Quality and Save Costs. Available from http://www.governor.state.mn.us/mediacenter/pressreleases/PROD007733.html (accessed November 30, 2006).
Overhage, J, S Perkins, W Tierney, and C McDonald. 2001. Controlled trial of direct physician order entry: effects on physicians' time utilization in ambulatory primary care internal medicine practices. Journal of the American Medical Informatics Association 8(4):361-371.
Parchman, M, R Romero, and J Pugh. 2006. Encounters by patients with Type 2 diabetes—complex and demanding: an observational study. Annals of Family Medicine 4(1):40-45.
Perlin, J. 2006. Transformation of the U.S. Veterans Health Administration. Health Economics, Policy, and Law 1(2):99-105.
Perlin, J, R Kolodner, and R Roswell. 2004. The Veterans Health Administration: quality, value, accountability, and information as transforming strategies for patient-centered care. American Journal of Managed Care 10(11.2):828-836.
Pogach, L, and D Miller. 2006. Merged VHA-Medicare Databases: A Tool to Evaluate Outcomes and Expenditures of Chronic Diseases. Poster session, VHA Health Services Research and Development Conference, February 26, 2006.
Raths, D. 2006 (February). 9 Tech Trends: Pay for Performance. Healthcare Informatics Online. Available from http://healthcare-informatics.com/issues/2006/02/48 (accessed December 2, 2006).
Safford, M, L Eaton, G Hawley, M Brimacombe, M Rajan, H Li, and L Pogach. 2003. Disparities in use of lipid-lowering medications among people with Type 2 diabetes mellitus. Archives of Internal Medicine 163(8):922-928.
Sawin, C, D Walder, D Bross, and L Pogach. 2004. Diabetes process and outcome measures in the Department of Veterans Affairs. Diabetes Care 27(Suppl. 2):B90-B94.
Thompson, W, H Wang, M Xie, J Kolassa, M Rajan, C Tseng, S Crystal, Q Zhang, Y Vardi, L Pogach, and M Safford. 2005. Assessing quality of diabetes care by measuring longitudinal changes in hemoglobin A1c in the Veterans Health Administration. Health Services Research 40(6.1):1818-1835.
Tseng, C, M Rajan, D Miller, G Hawley, S Crystal, M Xie, A Tiwari, M Safford, and L Pogach. 2005. Use of administrative data to risk adjust amputation rates in a national cohort of Medicare-enrolled veterans with diabetes. Medical Care 43(1):88-92.
U.S. Department of Veterans Affairs. 2006. My HealtheVet. Available from http://www.my-health.va.gov (accessed August 21, 2006).
White, K, T Williams, and B Greenberg. 1961. The ecology of medical care. New England Journal of Medicine 265:885-892.
White House. 2006 (August 22). Executive Order: Promoting Quality and Efficient Health Care in Federal Government Administered or Sponsored Health Care Programs. Available from http://www.whitehouse.gov/news/releases/2006/08/20060822-2.html (accessed April 4, 2007).