Suggested Citation:"Appendix A Supporting Tables." Institute of Medicine. 2006. Medicare's Quality Improvement Organization Program: Maximizing Potential. Washington, DC: The National Academies Press. doi: 10.17226/11604.
Appendixes

A Supporting Tables

TABLE A.1 Literature Review on Impact of Quality Improvement

Barr J, et al. "A Randomized Intervention to Improve Ongoing Participation in Mammography." The American Journal of Managed Care. 2001.
Data source, sample size, and time frame:
· 1,908 women aged 50­–75 enrolled in a northeast HMO who had a mammogram with no subsequent visits for the next 18–21 months
· 1994–1996
Study purpose, methodological approach, and outcome measures:
· Effectiveness of various interventions for breast cancer screening guidelines
· Randomized controlled trial with three groups: (1) received mailings, (2) received a telephone call with the option to schedule an appointment, and (3) received a regular publicity campaign
· Outcome measure: the number of mammograms received after the intervention period and within 2 years of the initial mammogram
Findings:
· The telephone call with the option to schedule an appointment was the most effective intervention (relative risk = 1.39); researchers suspect that its success was due to the convenience of scheduling and its personal aspect
· Mailings were not found to be useful
· Limitations: this group of women may have been hard to motivate or may have had mammograms outside of the health plan

Berner, et al. "Do Local Opinion Leaders Augment Hospital Quality Improvement Efforts?" Medical Care. 2003.
Data source, sample size, and time frame:
· Unit of analysis: acute care hospitals in Alabama with more than 100 patients with unstable angina (UA) as the primary or secondary diagnosis; 22 hospitals were willing to participate
· Baseline: 1997–1998; Follow-up: 1999–2000
Study purpose, methodological approach, and outcome measures:
· Assess whether or not physician opinion leaders (OLs) helped implementation of CMS's HCQIP
· Three-armed randomized controlled trial: no intervention, HCQIP (CMS's quality improvement plan) only, and OL-HCQIP (HCQIP plus the addition of a physician OL); the HCQIP and OL arms administered change through education on guidelines, presentation of hospital-specific data, and clinical reminders
· Measured adherence to five of AHRQ's UA guidelines (electrocardiography within 20 minutes, antiplatelet medication at discharge, antiplatelet medication within 24 hours, use of heparin, and use of beta-blockers)
· Outcome measure: percent change in compliance with guidelines before and after the intervention for all five indicators
Findings:
· Use of OLs resulted in small, inconsistent effects
· Use of OLs resulted in significant improvement only for the indicator of antiplatelet medication within 24 hours
· Many caveats and reasons why the OLs did not show more influence were presented:
  – The study was limited to chart review data
  – A physician leader may have stepped up in the no-intervention and HCQIP groups
  – Hospital type may lead to bias
  – Hospitals may concurrently participate in other QI projects
  – The quality-of-care indicators chosen

Bradley E, et al. "A Qualitative Study of Increasing Beta-blocker Use After Myocardial Infarction." JAMA. 2001.
Data source, sample size, and time frame:
· Interviews with hospital staff
· 45 respondents of various disciplines, staff levels, and hospitals
· October 1996–September 1999
Study purpose, methodological approach, and outcome measures:
· Identify factors that may improve beta-blocker use (i.e., hospital size, geographic region, and changes in beta-blocker use rates) and develop a method for classifying them
· Qualitative study based on interviews with hospital staff; data analyzed via qualitative coding techniques
· Outcome measures: methods to improve care; coded qualitative data
Findings:
· Importance of physician leadership
· Similar initiatives were used to enhance use among hospitals with various MI volumes
· No factors were found to directly correlate with higher performance

Bradley E, et al. "From Adversary to Partner: Have Quality Improvement Organizations Made the Transition?" Health Services Research. 2005.
Data source, sample size, and time frame:
· Primary data in the form of interviews
· 105 randomly selected hospital quality management directors
· 2002
Study purpose, methodological approach, and outcome measures:
· Describe the impact of QIOs on AMI quality of care
· Created a survey instrument asking about the following: amount of contact between hospital quality departments and the QIO, number of AMI-related QIO-supported or -led interventions, and whether QIO interventions had affected AMI quality
Findings:
· Interviewees generally found the QIOs' quality improvement efforts to be useful (more than 60% rated interventions as helpful or very helpful)
· Many thought the impact of QIOs was low in that quality of care would not be different in the absence of QIO efforts (only 25% thought care would be worse without QIOs)
· QIOs are now seen more as collaborative partners than as the adversaries they were stigmatized as in the past
· Many believed that QIOs could be more effective at attaining more support from physicians and senior management of hospitals

Burwen D. "National and State Trends in Quality of Care for Acute Myocardial Infarction Between 1994–1995 and 1998–1999." Archives of Internal Medicine. 2003.
Data source, sample size, and time frame:
· Medicare patients with AMI without contraindications per state guidelines
· 1994–1995: 234,754 patients; 1998–1999: 35,713 patients
· Baseline: 1994–1995; Follow-up: 1998–1999
Study purpose, methodological approach, and outcome measures:
· Determine improvement in quality of care for AMI
· Analyzed data from the CCP; quality indicators studied: early administration of aspirin, aspirin prescribed at discharge, early administration of beta-blockers, beta-blocker prescribed at discharge, ACE inhibitor prescribed at discharge, and smoking cessation counseling; used r² and chi-squared analyses
· Outcome measure: probability of patients studied for whom quality indicators were documented
Findings:
· Quality improved overall between the two periods
· In practice, some types of quality indicators are more readily improved than others (i.e., reperfusion therapy and smoking cessation counseling) due to challenges in implementation (e.g., improvements in an indicator cannot always be accomplished through behavioral changes initiated by a single physician)
· Improvement was not due to geographic or regional differences or patient characteristics
· Diffusion of evidence-based therapies into practice is not optimal

Centor R. "Diffusion of Troponin Testing in Unstable Angina Patients: Adoption Prior to Guideline Release." Journal of Clinical Epidemiology. 2003.
Data source, sample size, and time frame:
· Medicare patients with suspected cardiac ischemia in 22 volunteer Alabama hospitals
· Baseline: 1,272 patients; Follow-up: 1,302 patients
· Baseline: March 1997–February 1998; Follow-up: January 1999–December 1999
Study purpose, methodological approach, and outcome measures:
· Determine the status of quality indicators before implementation of guidelines
· Examined changes in troponin use before the ACC/AHA presented their clinical guidelines in 2000; quality measures: receipt of aspirin within 24 hours of admission, receipt of aspirin at discharge, receipt of beta-blocker during hospitalization, receipt of heparin during hospitalization for patients at moderate to high risk of AMI or death, performance of EKG within 20 minutes after arrival, and admission to a hospital bed with cardiac monitoring; logistic regression analyses were used to determine the appropriateness of troponin use
· Outcome measures: troponin ordered, troponin positive when ordered, previously developed quality measures for unstable angina, use of ACE inhibitors, and procedure rates
Findings:
· The guidelines released in 2000 reflected already accepted practice, not dissemination of new knowledge
· Troponin tended to be ordered for higher-risk patients, which may have been an indicator of more aggressive clinical management

Chu L, et al. "Improving the Quality of Care for Patients with Pneumonia in Very Small Hospitals." Archives of Internal Medicine. 2003.
Data source, sample size, and time frame:
· Medical record abstraction
· 36 hospitals, mostly rural community hospitals, in Oklahoma
· Cycle 1: April 1995–June 1995; Cycle 2: November 1996–March 1997
Study purpose, methodological approach, and outcome measures:
· Demonstrate that a QIO can be an effective external change agent driving improvement in adherence to pneumonia treatment guidelines
· Hospitals were split into two groups, with two intervention cycles; interventions consisted of the QIO providing hospitals feedback via face-to-face meetings with medical staff and individual hospital profiles; hospitals had to provide the QIO with quality improvement plans
· Cycle 1: the first group of hospitals received the intervention, and results were compared with those for a control (Group 2); Cycle 2: the second group (the control group in Cycle 1) received the intervention
· Chi-squared test for proportions, two-tailed t-tests, ANOVA, regression coefficients; p < 0.05
Findings:
· Intervention versus control groups (Cycle 1):
  – The intervention group was found to be more likely to show statistical improvement in process measures than the control group
  – No statistically significant differences in outcome measures (unadjusted mortality, p = 0.39; length of stay, p = 0.47)
  – The control group showed no significant change during Cycle 1, supporting the view that the differences in process measures were not due to external confounders related to the condition
· Intervention in the control group (Cycle 2):
  – Statistically significant improvement was made in four of five measures after the intervention
· Results may not be duplicated in large hospitals
· CMS policy did not allow randomization of hospitals

Coleman E, et al. "Preparing Patients and Caregivers to Participate in Care Delivered Across Settings: The Care Transitions Intervention." Journal of the American Geriatrics Society. 2004.
Data source, sample size, and time frame:
· Colorado integrated delivery system
· Patients age 65+ with at least one of nine conditions; Control: 1,235 patients; Intervention: 158 patients
· July 2001–September 2002
Study purpose, methodological approach, and outcome measures:
· Determine if transitions between health care settings can be enhanced by more active roles for patients and caregivers
· Intervention: designate a transition coach to work with the patient and caregiver via visits and phone calls postdischarge; coaches also teach patients about personal health records; patient records are tracked for rehospitalizations for 6 months after discharge
· Outcome measures: postdischarge hospital use rates at 30, 90, and 180 days (rehospitalization and emergency room)
Findings:
· Use of a transition coach and personal health record is promising for reducing rehospitalization rates postdischarge
· Odds ratio (OR) at 30 days: 0.52; OR at 90 days: 0.43; OR at 180 days: 0.57
· Actual cost of the transition coach over 8 months: $47,133

Cortes L. "The Impact of Quality Improvement Programs in Long Term Care." Texas Department of Human Services. 2004.
Data source, sample size, and time frame:
· MDS reports of restraint use
· Statewide population in LTC facilities, 69,590–70,814 patients
· 2002–2003
Study purpose, methodological approach, and outcome measures:
· Determine the extent to which the Texas Department of Human Services (DHS) program and the QIO program each contributed to reduced use of restraints among LTC residents
· Attributable fraction: 139 facilities were enrolled in the QIO technical assistance (TA) program, while all 1,050 facilities in the state received TA from DHS; the difference in observed improvement between the QIO subgroup and the remaining facilities is the fraction attributable to the QIO intervention
· Outcome measure: change in restraint prevalence among facilities receiving QIO TA and those receiving only DHS TA
Findings:
· Facilities receiving both DHS and QIO assistance showed a 55.1% reduction in restraint use
· Facilities receiving only DHS assistance showed a 35.3% reduction in restraint use
· Estimated excess fraction of improvement attributable to the QIO program: 19.8%
· Statewide, 90% of the improvement is attributable to the DHS program and 10% to the QIO, because the QIO served only 13% of facilities statewide
· Conclusion: the state and QIO programs are not redundant; the programs are complementary

Daniel D, et al. "A State-Level Application of the Chronic Illness Breakthrough Series: Results from Two Collaboratives on Diabetes in Washington State." Joint Commission Journal on Quality and Safety. 2004.
Data source, sample size, and time frame:
· 47 teams (representing the public health delivery system, community care, large clinics, hospital systems, and private practices)
· Collaborative I: October 1999–November 2000; Collaborative II: February 2001–March 2002
Study purpose, methodological approach, and outcome measures:
· Assess the effect of collaboratives at the state level; test what efforts may be associated with quality improvement
· Teams independently collected data on process and outcome clinical indicators of diabetes care; over a 13-month test period, teams congregated at four conferences, sharing lessons learned
· Indicators of success: absolute improvement (from baseline to remeasurement) and improvement in remeasurement values
Findings:
· State-level collaboratives were effective: they provided more technical support and increased participation
· Higher absolute improvement was associated with teams with lower baseline levels
· Process measures had greater absolute improvement than outcome measures, perhaps because outcome improvement requires behavioral changes by both providers and patients
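The attributable-fraction arithmetic described in the Cortes entry can be sketched numerically. The figures below come from the table itself; the decomposition formula and variable names are an illustrative reading of the method, not the study's own calculation:

```python
# Illustrative sketch (not the study's code) of the Cortes attributable-
# fraction logic, using figures reported in the table above.

qio_facilities = 139       # facilities enrolled in the QIO TA program
all_facilities = 1050      # all state facilities received DHS TA

reduction_both = 55.1      # % reduction in restraint use, DHS + QIO assistance
reduction_dhs_only = 35.3  # % reduction in restraint use, DHS assistance only

# Excess improvement attributable to the QIO intervention (percentage points):
excess_qio = reduction_both - reduction_dhs_only
print(f"excess attributable to QIO: {excess_qio:.1f} points")  # 19.8

# Statewide, that excess applies only to the QIO's ~13% of facilities,
# so the QIO's share of total statewide improvement is small:
qio_coverage = qio_facilities / all_facilities  # ~0.13
statewide = qio_coverage * reduction_both + (1 - qio_coverage) * reduction_dhs_only
qio_share = qio_coverage * excess_qio / statewide
print(f"QIO share of statewide improvement: {qio_share:.0%}")  # ~7%
```

This simple weighted decomposition lands in the same range as the report's 90%/10% DHS/QIO split; the report's exact 10% figure presumably reflects details of the study's estimation not reproduced here.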

Ellerbeck E, et al. "Quality of Care for Medicare Patients with Acute Myocardial Infarction: A Four-State Pilot Study from the Cooperative Cardiovascular Project." JAMA. 1995.
Data source, sample size, and time frame:
· Medicare's National Claims History File
· All Medicare hospitalizations with AMI as the primary diagnosis
· June 1, 1992–February 28, 1993
Study purpose, methodological approach, and outcome measures:
· Test the CCP quality-of-care indicators for AMI
· Compared data abstracted from medical records for 26 quality indicators; physicians dealing with AMI were asked to check the validity of care received based on each indicator; ideal patients for the CCP (those without contraindications) were identified
· Outcome measure: percentage of patients who received appropriate care according to quality indicators based on ACC and AHA guidelines (i.e., aspirin during hospitalization, heparin doses, timing of medication delivery)
Findings:
· Seventeen to 72 percent of Medicare patients who should ideally receive therapies for AMI did not receive them, either at all or in a timely manner
· Treatments are underused for those who do receive them
· Many Medicare AMI patients are not "ideal"
· Need to improve medical record documentation
· The measures used cannot be fully validated due to a lack of standard criteria
· The CCP quality indicators showed areas for improvement, but the quality indicators need to be refined

Ellerbeck E, et al. "Impact of Quality Improvement Activities on Care for Acute Myocardial Infarction." International Journal for Quality in Health Care. 2000.
Data source, sample size, and time frame:
· 117 acute care hospitals in Iowa
· Baseline: June 1992–December 1992; Follow-up: August 1995–November 1995
Study purpose, methodological approach, and outcome measures:
· Assess the relationship between PRO-involved quality improvement activities and improvement in quality of care for AMI treatment
· Two groups: hospitals with no plan or no systematic change to improve AMI care (73 hospitals) and hospitals undergoing systems change and measurement (44 hospitals)
· Indicators: reperfusion, thrombolytics <60 minutes; aspirin treatment during hospital stay; aspirin treatment at discharge; treatment with beta-blockers, ACE inhibitors, and calcium blockers; smoking cessation counseling
· Outcome measure: change from baseline in the percentage of patients receiving indicated treatment
Findings:
· Found significant (p < 0.05) improvement from baseline for only three indicators (aspirin treatment during stay, aspirin treatment at discharge, and beta-blocker treatment at discharge)
· Systems change and measurement hospitals tended to have higher values at baseline
· Suggested caveats: the control group was not completely devoid of PRO activities; QI activities were not reported to the PRO; process measures rather than outcome measures were measured

Gould B, et al. "Improving Patient Care Outcomes by Teaching Quality Improvement to Medical Students in Community-Based Practices." Academic Medicine. 2002.
Data source, sample size, and time frame:
· 77 second-year medical students
· 1999–2000

Halm E, et al. "Limited Impact of a Multicenter Intervention to Improve the Quality and Efficiency of Pneumonia Care." Chest. 2004.
Data source, sample size, and time frame:
· Medical record abstraction
· Preintervention: 1,013 patients; Postintervention: 1,081 patients
· Preintervention: December 1999–April 2000; Postintervention: November 2000–March 2001

Holman W, et al. "Alabama Coronary Artery Bypass Grafting Project." JAMA. 2001.
Data source, sample size, and time frame:
· Medical record abstraction
· Alabama: 5,784 patients; Comparison state: 3,214 patients; National sample: 3,758 patients
· Baseline: July 1995–June 1996; Follow-up: July 1998–December 1998

Gould et al. (2002)
Study purpose, methodological approach, and outcome measures:
· Determine impact of implementing quality improvement as a component of medical school curriculum
· Students used chart abstraction and continuous quality improvement methods in community practice sites; measured improvement in quality of care and surveyed students on their perspectives of the program
· Quality of care: improvement in rate of documentation from baseline; students were surveyed using qualitative, open-ended questions
Findings:
· Quality of care: rate of documentation increased (p < 0.001)
· Students were more aware of CQI efforts but did not necessarily find value in CQI programs

Halm et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Impact of multidisciplinary team in four academic medical centers in New York City to enhance care for community-acquired pneumonia
· Matched pre- and postintervention patients; intervention: opinion leaders formed teams to develop guidelines, run educational sessions, produce pocket reminder cards, and promote standardized orders; analyzed antibiotic use, discharge rates prior to clinical stability, length of stay, timely switch to oral antibiotics, and timely discharge
· Statistical difference between pre- and postintervention process and outcomes measures
Findings:
· Slight increase in adherence to process measures
· No change in outcomes measures
· Change in academic medical centers may require a more systems-based approach; process measures may have been more successful due to more evidence and fewer confounders

Holman et al. (2001)
Study purpose, methodological approach, and outcome measures:
· Assess quality improvement efforts for CABG in 20 Alabama hospitals
· Held meetings with all hospitals in Alabama that performed CABG to provide peer-based feedback and share care processes; measured process and outcomes indicators from baseline to follow-up and compared them with those from a national sample and a comparison state
· Mean change from baseline to follow-up; ORs calculated for mortality
Findings:
· Significant differences were seen in Alabama's improvement in comparison with those of both the comparison state and the national standard (p < 0.02 for all measures, p < 0.001 for some)
· Risk-adjusted mortality OR: 0.72 and 0.76 compared with comparison state and national sample, respectively

Jamtvedt G, et al. "Audit and Feedback: Effects on Professional Practice and Health Care Outcomes." Cochrane Library. 2004.
Data source, sample size, and time frame:
· Reviewed 85 studies of randomized control trials published between 1997 and 2001

Jencks S, et al. "Quality of Medical Care Delivered to Medicare Beneficiaries." JAMA. 2000.
Data source, sample size, and time frame:
· Medical record abstraction, Medicare hospital claims, and BRFSS
· Random sample of all Medicare fee-for-service populations diagnosed with particular conditions
· 1997–1999

Jencks S, et al. "Change in the Quality of Care Delivered to Medicare Beneficiaries, 1998–1999 to 2000–2001." JAMA. 2003.
Data source, sample size, and time frame:
· Medical record abstraction
· Results from 52 QIOs (does not include Virgin Islands)
· Data collection: 1998–1999 and 2000–2001

Joseph A, et al. "Results of a Randomized Controlled Trial of Intervention to Implement Smoking Guidelines in Veterans Affairs Medical Centers." Medical Care. 2004.
Data source, sample size, and time frame:
· Patient surveys, medical record review, surveys of site leaders and pharmacy benefit managers
· 20 Veterans Affairs centers; 5,678 people
· 2000–2001

Jamtvedt et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Assess whether audit and feedback (a summary of clinical performance over a period of time returned to the responsible party in written, electronic, or verbal form) to health care professionals and patients was effective
· Meta-analysis
Findings:
· Audit and feedback can be useful but yield inconsistent results (from negative effects to strong positive effects)
· Coupling of audit and feedback with other quality improvement efforts does not necessarily enhance effectiveness

Jencks et al. (2000)
Study purpose, methodological approach, and outcome measures:
· Describe and report baseline values on 24 QIO measures; focus on processes of care, not outcomes
· Measure performance of the median state (not the national average), rank of states for each measure, and average overall ranking, with geographic trends also evaluated; clinical topics were chosen for their potential for substantial effect on quality
· Percentage of people receiving appropriate care for 22 indicators of heart failure, stroke, pneumonia, breast cancer, and diabetes, as defined by CMS, ACC/AHA, ATS, BRFSS, HEDIS, DQIP, and CDC
Findings:
· No state consistently performs highly or poorly on quality measures
· Much room for improvement, according to the 24 measures
· Need to focus on systems change, not individual practitioners
· General geographic trend: higher-quality ranking associated with northern and less populated states
· Impossible to attribute changes in indicators to QIO activities

Jencks et al. (2003)
Study purpose, methodological approach, and outcome measures:
· Track changes for 22 quality measures at state and national levels
· Compared results in 1998–1999 per state per measure with results in 2000–2001; states were also ranked based on performance improvement
· Relative improvement (reduction in failure rate) = (change in performance from baseline to follow-up)/(baseline performance − perfect performance)
Findings:
· Care for Medicare fee-for-service and outpatient beneficiaries increased for 20 of 22 measures
· Generally, states with lower baselines yielded greater relative improvements compared with those with higher baselines
· National patterns of care were found: northern states generally had higher state rankings and larger relative improvement rates than southern states

Joseph et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Improve guideline implementation for tobacco use cessation by randomized controlled trial using AHRQ guidelines
· Intervention: training to improve documentation of tobacco use status in medical record, presentation of intervention to all smokers, and liberal use of smoking cessation medications
· Odds ratios calculated for pre- and postintervention
Findings:
· Interventions had little effect on smoking cessation (smoking cessation or increase in medication use) (p > 0.51)
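The relative improvement metric used by Jencks et al. (2003), the share of the remaining gap to perfect performance that is closed between baseline and follow-up, can be sketched in a few lines. This is an illustration only: the numbers are hypothetical, and the denominator is written in the equivalent positive form (perfect performance minus baseline) so that improvement yields a positive ratio.

```python
def relative_improvement(baseline: float, followup: float, perfect: float = 100.0) -> float:
    """Reduction in failure rate: (follow-up - baseline) / (perfect - baseline).
    All arguments are percentages of patients receiving appropriate care."""
    return (followup - baseline) / (perfect - baseline)

# Hypothetical state (not study data): performance rises from 70% to 82%,
# closing 40% of the remaining gap to perfect (100%) performance.
print(relative_improvement(70.0, 82.0))  # 0.4
```

Because the metric normalizes by the remaining gap, it explains the study's observation that states with lower baselines can post larger relative improvements from the same absolute gain.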

Kiefe C, et al. "Improving Quality Improvement Using Achievable Benchmarks for Physician Feedback." JAMA. 2001.
Data source, sample size, and time frame:
· Medical chart review
· 70 community physicians treating 2,978 fee-for-service Medicare patients with diabetes in Alabama
· December 1996–December 1998

Luthi J, et al. "Variations among Hospitals in the Quality of Care for Heart Failure." Effective Clinical Practice. 2000.
Data source, sample size, and time frame:
· 2,077 heart failure patients in 69 hospitals in Colorado, Connecticut, Georgia, Oklahoma, and Virginia
· June 1995–September 1996

Marciniak T, et al. "Improving Care for Medicare Patients with AMI." JAMA. 1998.
Data source, sample size, and time frame:
· Medical records from hospitals and PROs with ICD-9 code 410 (AMI)
· Base group: Medicare patients in Alabama, Connecticut, Iowa, and Wisconsin whose primary discharge diagnosis was AMI
· Comparison group: Random sample of Medicare patients from the other 46 states with AMI as the primary diagnosis
· Mortality comparisons of all Medicare patients with claims of AMI
· June 1992–July 1996

Kiefe et al. (2001)
Study purpose, methodological approach, and outcome measures:
· Determine if providing feedback to physicians for quality improvement is enhanced by use of benchmarking
· Randomized controlled trial where the control group physicians received feedback based on chart review and the intervention group feedback also included benchmarking
· Odds ratios were calculated
Findings:
· Physician performance improved when feedback was combined with benchmarks
· Significant improvements were made by the experimental group over the control group in reception of flu vaccine (OR = 1.57), foot exams (OR = 1.33), and long-term HbA1c management (OR = 1.33)

Luthi et al. (2000)
Study purpose, methodological approach, and outcome measures:
· Determine if care for heart failure varies among hospitals
· Calculated percentage of patients receiving care matching the following indicators: assessment of left ventricular function, use of ACE inhibitors for systolic dysfunction, prescription of target dose, consumption of low-sodium diet, and daily weight monitoring
· Interquartile range, overall range, and extremal quotient (ratio of highest and lowest values)
Findings:
· Large variations exist in accordance with quality-of-care guidelines for patients with heart failure
· Extremal quotient reached a high of 5.5 (assessment of left ventricular function) and low of 1.7 (use of ACE inhibitors for systolic dysfunction)

Marciniak et al. (1998)
Study purpose, methodological approach, and outcome measures:
· Improve quality of care for AMI patients via CCP
· Compare measures for Medicare AMI patients in four CCP states with those in all other states; looked at rates for eligible patients, as well as ideal patients (those with no contraindication)
· Quality improvement for 13 clinical guidelines (i.e., reperfusion, aspirin treatment during hospitalization), length of stay, and mortality, all measured by percent difference between pilot group and comparison group
Findings:
· Measures for CCP patients in Alabama, Connecticut, Iowa, and Wisconsin improved significantly more than those for the other patients
· CCP results were significantly different for aspirin use at discharge, beta-blocker use at discharge, and mortality
· Results for eligible and ideal patients were not statistically different; therefore, results for the eligible group (for which it is easier to gather data) can be used
· Lowered length of stay was probably not due to CCP
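The extremal quotient reported by Luthi et al., the ratio of the highest to the lowest hospital value for an indicator, is a simple spread measure; the sketch below uses hypothetical hospital adherence rates, not data from the study.

```python
def extremal_quotient(rates):
    """Ratio of the highest to the lowest value across hospitals,
    the variation measure used by Luthi et al. (2000)."""
    return max(rates) / min(rates)

# Hypothetical percentage-adherence rates for one heart failure indicator
# across four hospitals: the best performer is 5.5x the worst.
print(extremal_quotient([55.0, 10.0, 32.0, 41.0]))  # 5.5
```

A quotient near 1 means hospitals perform similarly on the indicator; the study's high of 5.5 indicates more than fivefold variation between the best- and worst-performing hospitals.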

Mark B, et al. "A Longitudinal Examination of Hospital Registered Nurse Staffing and Quality of Care." Health Services Research. 2004.
Data source, sample size, and time frame:
· Previous research findings on five variables: hospital characteristics, market characteristics, financial performance, staffing, and quality of care (American Hospital Association Annual Survey, CMS surveys, HCUP, OSCAR, Solucient, InterStudy, Area Resources)
· 442 hospitals (from 1990–1995 HCUP National Inpatient Survey)
· 1990–1995

Massing M. "Prevalence and Care of Diabetes Mellitus in the Medicare Population of North Carolina." NC Medical Journal. 2003.
Data source, sample size, and time frame:
· 83,913 Medicare patients with diabetes
· Baseline: July 1997–June 1999

McClellan W, et al. "Improved Diabetes Care by Primary Care Physicians: Results of a Group-Randomized Evaluation of the Medicare Health Care Quality Improvement Program." Journal of Clinical Epidemiology. 2003.
Data source, sample size, and time frame:
· 22,971 Medicare diabetes patients; 477 PCPs in 123 counties in rural Georgia
· Baseline: January–December 1996; Follow-up: January 1998–December 1999

Ornstein S, et al. "A Multimethod Quality Improvement Intervention to Improve Preventive Cardiovascular Care." Annals of Internal Medicine. 2004.
Data source, sample size, and time frame:
· Data extracted from electronic medical records
· 87,291 patients from 20 different primary care settings in 14 states
· January 2001–January 2003

Mark et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Find effects of change in nursing staffing on changes in quality of care
· Used first-difference transformation procedure to analyze data (the procedure studies relationships between changes in the five variables [Anderson and Hsiao, Journal of the American Statistical Association, 1981]), as well as ordinary least-squares regression
· Changes in nursing staffing, changes in quality of care
Findings:
· Increases in staffing of registered nurses correlate to lower mortality rates
· Increased staffing has broad relationship with decreased mortality rates
· Increases in staffing have diminishing marginal effects
· The relationship between increased staffing and lowered mortality is not causal; confounders cannot positively be identified

Massing (2003)
Study purpose, methodological approach, and outcome measures:
· HbA1c test, eye exam, lipid profile
· Percentage of diabetic population claiming one of the indicators; data abstracted from Medicare Part A and B claims
Findings:
· Treatment levels for target indicators do not meet guidelines, leaving much room for improvement: HbA1c = 77%, lipid profile = 53%, eye exam = 70%
· African Americans disproportionately received less diabetes care

McClellan et al. (2003)
Study purpose, methodological approach, and outcome measures:
· Determine whether feedback reports from claims-based data enhance quality improvement activities, similar to QIO function
· Random comparison groups: control group versus intervention group; intervention group received mailings with information about clinical guidelines and implementation tools for HbA1c, urine protein, and dilated eye tests; measured difference in percentage of patients receiving test at baseline and follow-up for both groups
· Absolute difference in rates of improvement for each indicator
Findings:
· PCPs in intervention group had higher rates of improvement than control group
· Clinical guidelines were not met by either group
· Absolute difference in rate of improvement for HbA1c was 4% (95% confidence, p = 0.02); results were not significant for other indicators
· Physician disagreement with clinical guidelines cited as potential reason for not meeting guidelines

Ornstein et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Assess if intensity of quality interventions influences physician adherence to 21 clinical practice guidelines for stroke and cardiovascular disease
· Distributed practice guidelines and performance reports quarterly to all providers; randomized controlled trial: half of the practices participated in quarterly visits and annual meetings to share best practices
· Percentage of indicators meeting target of 90% adherence for each indicator
Findings:
· Only marginal change associated with intervention group; absolute difference = 6% (p > 0.2)
· 22.4% and 16.4% improvement by intervention and control groups, respectively
· Limitation of small n and lack of true control group

Pai C, et al. "The Combined Effect of Public Profiling and Quality Improvement Efforts on Heart Failure Management." Journal on Quality Improvement. 2002.
Data source, sample size, and time frame:
· 30 hospitals in southeast Michigan
· Baseline: 5,871 patients; Follow-up: 4,716 patients
· Baseline: January 1998–December 1998; Follow-up: March 2000–August 2000

Schade C, et al. "Using Statewide Audit and Feedback to Improve Hospital Care in West Virginia." Joint Commission Journal on Quality and Safety. 2004.
Data source, sample size, and time frame:
· 44 acute care hospitals in West Virginia
· Preintervention: July 1998–June 2000; Postintervention: July 2000–December 2001

Sheikh K, et al. "Evaluation of Quality Improvement Interventions for Reducing Adverse Outcomes of Carotid Endarterectomy." Medical Care. 2004.
Data source, sample size, and time frame:
· Medical record abstraction
· 14 states (7 control states, 7 intervention states) matched by number of beneficiaries and procedure rates
· Intervention: 1997–1998; Time series: 1991–2001

Pai et al. (2002)
Study purpose, methodological approach, and outcome measures:
· Assess effectiveness of public disclosure of performance data and quality improvement activities on heart failure management
· Quality indicators for management of heart failure: documentation of ejection fraction for patients with LVSD, patients prescribed ACE inhibitors after LVSD profiling indicator, patients with LVSD prescribed ACE or used ACE in hospital; public reporting to employees in November 1999; hospitals with above- or below-average rates at 95% confidence were reported by name
· Change in quality indicators (measured in proportions of patients) from the baseline
Findings:
· Public profiling and quality improvement efforts not indicative of changes in clinical practice
· Indicators for ACE inhibitor use showed regression toward the mean (hospitals with high baseline rates showed declines from the baseline rates, whereas hospitals with low baseline rates showed improvements from baseline rates)
· Potential lack of physician understanding and support of quality improvement efforts

Schade et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Evaluate effectiveness of audit and feedback systems in improving care in hospitals for five conditions: AMI, heart failure, pneumonia, stroke, and atrial fibrillation
· Three groups, based on number of discharges of target conditions; intervention: audit and feedback through reports released after collection of data every 6 months
· Pooled pre- and postintervention data from all hospitals' quality indicator rates using simple and weighted paired t-tests, with significance being a p value <0.05; chi-squared tests performed for trend data
Findings:
· Improvements in quality indicators were achieved in most hospitals; improvement could not be entirely attributed to audit and feedback
· Improvement trends were identified mostly in postintervention period
· 20% average improvement in quality indicator performance

Sheikh et al. (2004)
Study purpose, methodological approach, and outcome measures:
· Determine effectiveness of quality interventions in reducing mortality rates and adverse outcomes of CEA
· PRO provided clinical guidelines and quality indicators to hospitals and physicians, aided in development of improvement plans, and facilitated dissemination of best practices among physicians
· Comparison of in-hospital and 30-day post-CEA stroke and mortality rates by using t-tests and chi-square tests; time series from 1991 to 2001 was used to analyze trends
Findings:
· No decrease found in 30-day mortality and stroke rates
· No change in trends studied by the 1991–2001 time series
· Lack of improvement may be due to little opportunity for change (i.e., hospitals and physicians already made improvements to CEA procedure before PRO intervention)
· Recognizes importance of provider-led change

Sutherland D, et al. "Diabetes Management Quality Improvement in a Family Practice Residency Program." Journal of the American Board of Family Practice. 2001.
Data source, sample size, and time frame:
· Iowa Medicare patients diagnosed with diabetes
· 1997: 313 patients; 1998: 268 patients
· 1997–1998

Thomson O'Brien M, et al. "Local Opinion Leaders: Effects on Professional Practice and Health Care Outcomes." Cochrane Library. 2005.
Data source, sample size, and time frame:
· Reviewed eight randomized control trials from a search spanning from 1966 to 1998

a. This table was used in the committee's analysis of the QIO program; analyses are presented in Chapter 9.
ABBREVIATIONS: ACC = American College of Cardiology; ACE = angiotensin-converting enzyme; AHA = American Heart Association; AHRQ = Agency for Healthcare Research and Quality; AMI = acute myocardial infarction; ANOVA = analysis of variance; ATS = American Thoracic Society; BRFSS = Behavioral Risk Factor Surveillance System; CABG = coronary artery bypass grafting; CCP = Cooperative Cardiovascular Project; CDC = Centers for Disease Control and Prevention; CEA = carotid endarterectomy; CMS = Centers for Medicare and Medicaid Services; DQIP = Diabetes Quality Improvement Project; EKG = electrocardiography;

Sutherland et al. (2001)
Study purpose, methodological approach, and outcome measures:
· Determine if use of quality interventions at a local level targeting resident physicians improves quality of care
· Retrospective cohort with the use of chi-square and t-tests; p values <0.05 were significant; compared local results to statewide results
· Utilization rate of accepted diabetes care indicators
Findings:
· Significant improvement (p < 0.001) in documenting target indicators for diabetes care achieved at local level
· Statewide results also indicated improvement in diabetes care but not in other indicators

Thomson O'Brien et al. (2005)
Study purpose, methodological approach, and outcome measures:
· Determine the impact of local opinion leaders (educationally influential peer-nominated providers)
· Meta-analysis
Findings:
· The roles and definitions of opinion leaders are largely vague and without consensus, making determination of effectiveness difficult
· Can be effective in combination with other types of interventions (e.g., audit and feedback, mailings, etc.) and by itself; may be more effective than audit and feedback
· Results are mixed

HbA1c = hemoglobin A1c; HCQIP = Health Care Quality Improvement Program; HCUP = Healthcare Cost and Utilization Project; HEDIS = Health Plan Employer Data and Information Set; HMO = health maintenance organization; ICD-9 = International Classification of Diseases-9; LTC = long term care; LVSD = left ventricular systolic dysfunction; MDS = Minimum Data Set; OR = odds ratio; OSCAR = Online Survey, Recertification, and Reporting; PCP = primary care provider; PRO = Peer Review Organization; QI = quality improvement; QIO = Quality Improvement Organization; TA = technical assistance.

TABLE A.2 Literature Methodology Table
Columns: Methodology and Reference; Quality Improvement (QI) or QIO study; Conclusion About Effectiveness on Quality (Yes, Maybe, No).

Systematic review
· Jamtvedt, QI, x
· Thomson O'Brien, QI, x
Randomized controlled trial
· Barr, QIO, x
· Berner, QIO, x
· Joseph, QI, x
· Kiefe, QIO, x
· McClellan, QIO, x
· Ornstein, QI, x
Quasiexperimental
· Coleman, QI, x
· Holman, QIO, x
· Sheikh, QIO, x
· Snyder, QIO, x
Cohort study
· Burwen, QIO, x
· Chu, QIO, x
· Cortes, QIO, x
· Daniel, QIO, x
· Dellinger, QI, x
· Ellerbeck (2000), QIO, x
· Gould, QIO, x
· Halm, QI, x
· Marciniak, QIO, x
· Mark, QI, x
· Pai, QIO, x
· Schade, QIO, x
· Sutherland, QIO, x
Cross-sectional study
· Centor, QIO, x
· Jencks (2000), QIO, x
· Jencks (2003), QIO, x
· Luthi, QIO, x
· Massing, QIO, x
Ecologic study
· Ellerbeck (1995), QIO, x
Qualitative study
· Bradley (2005), QIO, x
· Bradley (2001), QI, x
NOTE: QI = quality improvement; QIO = Quality Improvement Organization.

TABLE A.3a Comparison of Quality Improvement Organization (QIO) Performance Measures and Measures Recommended by Institute of Medicine(a) for Task 1a--Nursing Homes
Columns: QIO 8th SOW (Statewide; CPOCC IPG); QIO 7th SOW (Statewide and Identified Participants); Performance Measure Starter Set(b).

Chronic care
· Basic tasks: Toileting R; Transferring R; Eating R; Mobility decline R; Bedfast
· Pain R (optional) R (optional) R
· Infections R
· Pressure sores: High risk R R R; Low risk R
· Restraint use R R R
· Depression or anxiety worsening R R
· Incontinence
· Urinary tract infections
· Indwelling catheters
· Weight loss
Post acute care
· Pain R
· Pressure sores R
· Delirium symptoms R
· Improved in walking R

NOTE: = required performance measure; R = required performance measure that is reported to the public by the Centers for Medicare and Medicaid Services (CMS); CPOCC IPG = Clinical Performance and Organizational Culture Change Identified Participant Group.
a. See the Performance Measurement report of this series. IOM (Institute of Medicine). 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press.
b. The performance measure set recommended by the Performance Measurement report of this series (see footnote a).

TABLE A.3b Comparison of Quality Improvement Organization (QIO) Performance Measures and Measures Recommended by Institute of Medicine(a) for Task 1b--Home Health
Columns: QIO 8th SOW (Statewide Clinical Performance; IPG(b)); QIO 7th SOW (Statewide and Identified Participants); Performance Measure Starter Set(c).

Chronic care
· Activities of daily living: Stabilization in bathing R
Post acute care
· Activities of daily living: Improvement in dressing upper body R; Improvement in bathing R R; Management of oral medications R R
· Getting around: Improvement in toileting R; Improvement in ambulation/locomotion R R; Improvement in transferring R R; Improvement in pain interfering with activity R R
· Physical health: Improvement in dyspnea; Improvement in status of surgical wounds; Improvement in urinary incontinence
· Mental health: Improvement in cognitive functioning; Improvement in confusion frequency
· Staying at home without home care: Discharged to community
Prevalence measures
· Acute care hospitalization R R
· Emergent care R

NOTE: = required performance measure; R = required performance measure that is reported to the public by the Centers for Medicare and Medicaid Services (CMS).
a. See the Performance Measurement report of this series. IOM (Institute of Medicine). 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press.
b. IPG = Identified Participant Group.
c. The performance measure set recommended by the Performance Measurement report of this series (see footnote a).

TABLE A.3c Comparison of Quality Improvement Organization (QIO) Performance Measures and Measures Recommended by Institute of Medicine(a) for Task 1c--Hospital Setting
Columns: QIO 8th SOW (Statewide; Task 1c1: ACM IPG; Task 1c1: SCIP IPG; Task 1c2: CAH); QIO 7th SOW (Statewide and Identified Participants); Performance Measure Starter Set(b).

Surgical complications
Infection:
· On-time prophylactic antibiotic administration *
· Appropriate selection of prophylactic antibiotics *
· Prophylactic antibiotics discontinued within 24 hours after surgery *
· Controlled perioperative serum glucose (<201 mg/dL) among major cardiac surgery patients *
· Postoperative wound infection diagnosed during index hospitalization
· Appropriate hair removal *
· Perioperative normothermia among colorectal surgical patients *
· Controlled perioperative serum glucose (<201 mg/dL) among noncardiac major surgery patients
· Major surgical patients without planned hypothermia who maintained normothermia during the perioperative period

TABLE A.3c Continued
Cardiovascular:
· Major noncardiac surgery patients received beta-blockers during perioperative period *
· Major surgery patients received beta-blocker perioperatively if they were maintained on a beta-blocker prior to surgery *
· Intra- or postoperative acute myocardial infarction diagnosed during index hospitalization and within 30 days of surgery
· Intra- or postoperative cardiac arrest diagnosed during index hospitalization and within 30 days of surgery
Thromboembolic:
· Thromboembolism prophylaxis *
· Appropriate venous thromboembolism prophylaxis *
· Intra- and postoperative pulmonary embolism
· Intra- and postoperative deep venous thrombosis

TABLE A.3c Continued
Respiratory:
· Postoperative orders and documentation of elevation of Head of Bed *
· Postoperative ventilator-associated pneumonia during index hospitalization
· Peptic ulcer disease prophylaxis received *
· Ventilator-weaning protocol *
Vascular access:
· Permanent hospital end-stage renal disease vascular access procedures that are autogenous arteriovenous fistulas
Global:
· Mortality within 30 days of surgery
· Readmission within 30 days of surgery
Acute myocardial infarction:
· Aspirin at arrival * *
· Aspirin prescribed at discharge * *
· Angiotensin-Converting Enzyme for left ventricular systolic dysfunction (LVSD) at discharge * *
· Smoking cessation
· Beta-blocker at arrival * *

TABLE A.3c Continued
Acute myocardial infarction (continued):
· Beta-blocker prescribed at discharge * *
· Thrombolytic agent within 30 min of arrival
· Percutaneous Coronary Intervention within 120 minutes of arrival
· Time to EKG *
Heart failure:
· Left ventricular function assessment * *
· ACE therapy for LVSD at discharge * *
· Detailed discharge instructions *
· Smoking cessation advice/counseling
Pneumonia:
· Blood culture performed within 24 hours prior to or after arrival at the hospital
· Blood culture collected prior to first antibiotic administration
· Patients with pneumonia who receive influenza screening or vaccination
· Patients with pneumonia who receive pneumococcal screening or vaccination * *
· Antibiotic treatment timing * *
· Oxygenation assessment * *
· Initial antibiotic treatment consistent with current recommendations

TABLE A.3c Continued
Pneumonia (continued):
· Smoking cessation advice/counseling
· Smoking cessation advice/counseling for pediatric pneumonia patients

NOTE: = required performance measure; * = required performance measure; the Centers for Medicare and Medicaid Services (CMS) evaluates the QIOs only on these particular measures in assessing identified participant groups of Task 1c1 and all providers for Task 1c2. ACM IPG = Appropriate Care Measure Identified Participant Group (Task 1c1); SCIP IPG = Surgical Care Improvement Project Identified Participant Group (Task 1c1); CAH = Critical Access Hospital/Rural Hospital Quality Improvement Measures (Task 1c2).
a. See the Performance Measurement report of this series. IOM (Institute of Medicine). 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press.
b. The performance measure set recommended by the Performance Measurement report of this series (see footnote a).

TABLE A.3d Comparison of Quality Improvement Organization (QIO) Performance Measures and Measures Recommended by Institute of Medicine(a) for Task 1d--Physician Office
Columns: QIO 8th SOW (Statewide; Task 1d1: IPG (1 and 2)(b); Task 1d2: UP); QIO 7th SOW (Statewide and Identified Participants); Performance Measure Starter Set(c).

Preventive care
· Tobacco cessation counseling
· Tobacco use
Prevention:
· Cholesterol screening R
· Blood pressure R
· Colorectal cancer screening R
· Breast cancer screening R,V R
· Cervical cancer screening
· Pneumococcal vaccine R,V R
· Influenza vaccine R,V R
Prenatal care:
· Anti-D immune globulin
· Screening for human immunodeficiency virus
Acute care
Acute myocardial infarction:
· Aspirin treatment at arrival for acute myocardial infarction V
· Beta-blocker treatment at time of arrival for acute myocardial infarction V
Pneumonia:
· Antibiotic administration timing for patient hospitalized for pneumonia V
Surgery:
· Antibiotic prophylaxis V
· Thromboembolism prophylaxis V
· Use of internal mammary artery in coronary artery bypass graft (CABG) surgery V
· Preoperative beta-blocker for patient with isolated CABG V

TABLE A.3d Continued
Surgery (continued):
· Prolonged intubation in isolated CABG V
· Surgical reexploration in CABG V
· Aspirin or clopidogrel treatment on discharge for isolated CABG V
Chronic disease care
Diabetes(d):
· Hemoglobin A1c (HbA1c) test R R
· HbA1c control V R
· Urine protein testing R
· Lipid profile R R
· Low-density lipoprotein (LDL) cholesterol screening
· LDL control V
· Adults diagnosed with diabetes with most recent blood pressure <140/90 mm Hg
· High blood pressure control V
· Eye exam R R
· Foot exams R
End-stage renal disease:
· Dialysis dose V
· Hematocrit level V
· Receipt of autogenous arteriovenous fistula V
Coronary artery disease(e):
· Antiplatelet therapy V R
· Drug therapy for lowering LDL cholesterol R
· LDL control V
· Beta-blocker therapy--prior myocardial infarction V R
· Angiotensin-Converting Enzyme inhibitor therapy R

TABLE A.3d Continued
Heart failure:
· Weight measurement R
· Patient education R
· Beta-blocker therapy V R
· Warfarin therapy for patients with atrial fibrillation V R
· Left ventricular ejection fraction testing R
· Left ventricular function assessment
· ACE inhibitor/Angiotensin II-Receptor Blocker therapy for left ventricular systolic dysfunction V
Asthma:
· Use of appropriate medications
· Pharmacologic therapy
Depression:
· Acute: Antidepressant medication management V
· Chronic: Antidepressant medication management V
Osteoporosis:
· Screening in elderly female patient V
· Prescription of calcium and vitamin D supplements V
· Antiresorptive therapy or parathyroid hormone treatment, or both, in patients with newly diagnosed osteoporosis V
· Bone mineral density testing and osteoporosis treatment and prevention following osteoporosis-associated nontraumatic fracture V

APPENDIX A 397

TABLE A.3d Continued

Columns: Task and Performance Measure; QIO--7th SOW Statewide; QIO--8th SOW Task 1d1: Statewide; Task 1d1: IPG (1 and 2)^b; Task 1d2: UP Identified Participants; Starter Set^c

Osteoarthritis
· Annual assessment of function and pain: V

Chronic obstructive pulmonary disease
· Smoking cessation intervention: V

Long-term care
· Screening of elderly patients for falls: V
· Screening of hearing acuity in elderly patients: V
· Screening for urinary incontinence in elderly patients: V

Quality measures addressing overuse or misuse
· Appropriate treatment for children with upper respiratory infection
· Appropriate testing for children with pharyngitis

NOTE: UP = underserved populations; ✓ = required performance measure; R = required performance measure that is reported to the public by the Centers for Medicare and Medicaid Services (CMS); V = measure included in the Physician Voluntary Reporting Program.
aSee the Performance Measurement report of this series: IOM (Institute of Medicine). 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press.
bIdentified Participant Groups are responsible for reporting these measures to CMS.
cThe performance measure set recommended by the Performance Measurement report of this series (see footnote a).
dIdentified participant groups are evaluated in part on the basis of having met target levels of performance for diabetes.
eIdentified participant groups are evaluated in part on the basis of having met target levels of performance for coronary artery disease.

398 APPENDIX A TABLE A.3e Crosscutting Performance Measures and Other Settings Patients' reports of care CAHPS family of surveys, as they become validated Hospital CAHPS Ambulatory CAHPS End-stage renal disease Dialysis patients registered on a waiting list for transplantation Patients with treated chronic kidney failure receiving transplant within 3 years of renal failure Patient survival rate Hemodialysis patients with urea reduction ratio of 65 or greater Patients with hematocrit of 33 or greater Efficiency measures After diagnosis of acute myocardial infarction One-year mortality rate Resource use Functional status Health plans and accountable health organizations HEDIS integrated delivery system measures Effectiveness Access/availability of care Satisfaction with experience of care Health plan stability Use of service Cost of care, informed health care choices, health plan descriptive information Structural measures Computerized physician order entry Intensive care unit intensivists Evidence-based hospital referrals NOTE: Measures recommended by the Performance Measurement report, but not in Quality Improvement Organization measure sets. IOM (Institute of Medicine). 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press. CAHPS = Consumer Assessment of Healthcare Providers and Systems; HEDIS = Health Plan Employer Data and Information Set.

APPENDIX A 399

TABLE A.4a Support Contracts for the 7th SOW

Project Title / QIO / Award
Task 1a--Nursing Home Support QIO RI $4,229,692
Task 1a--Working with Nursing Home Chains CO $2,331,171
Task 1b--Home Health Support QIO MD $6,350,000
Task 1b--Home Health Public Reporting Pilots CO, WA $2,223,521
Task 1c--Infectious Disease/Surgical Site Support QIO OK $3,000,000
Task 1c--Infectious Disease QIOSC for Surgical Care Improvement OK $543,896
Task 1c--Qualis Health IHI Collaborative Project (SIP) WA $914,468
Task 1c--Cardiovascular Support QIO CO $2,971,390
Task 1d--Physician Office Support QIO VA $2,400,000
Task 1d--Communities of Practice VA $506,092
Task 1d and f--Data Support QIO IA $2,367,468
Task 1d and f--Outpatient Data Additional Funding IA $440,000
Task 1e--Task 1e Support QIO TN $2,731,607
Task 1f--Task 1f Support QIO VA $502,773
Task 2a--Task 2a Support QIO WA $3,100,000
Task 2b--Hospital-Generated Data Support IA $2,698,628
Task 3a and c--Medicare Beneficiary Protection Support QIO CA $3,760,032
Task 3b--Hospital Payment Monitoring QIO TX $2,303,910
Hospital Payment Monitoring Program (HPMP) Projects Multiple $9,257,769
MedQIC Website--Transition from Delmarva to IFMC IA $3,980,548
Quality Improvement Interventions Support QIO VA, MD $1,263,774
Training QIOs in human factors UT $350,000
Nursing Home Initiative Ads IA $2,800,000
Home Health Initiative Ads IA $3,000,000
Hospital Initiative Ads WA $3,000,000
TOTAL SUPPORT SEVENTH SOW CORE WORK $67,026,739
Learning from innovative quality improvement approaches in nursing homes WI $975,000
Health Care Collaborative Network Project CO $200,000
Doctor's Office Quality--Information Technology CA $11,000,000
HHA Outcomes-Based Quality Improvement Evaluation UT $350,000
Surgical Complications OH, KY $3,009,672
Achievable NH targets for pressure ulcers & restraints RI $218,008
Depression projects NY, MI $1,046,000
CMS colorectal cancer screening NC $58,914
Physician's office registry development MD $35,000
Health outcomes survey AZ $3,600,000
Hospital public reporting pilot projects AZ $3,131,453
Patient safety learning pilot projects: IN, NV, UT, WI Multiple $1,874,056
Medicare Patient Safety Monitoring System (excluding CDACs) CT $786,031
Medicare Patient Safety QIOSC CT $1,998,290
Rural Antibiotic (RADAR) Project ID $33,107
Doctor's office quality CA, IA, NY $4,249,864
continues

400 APPENDIX A

TABLE A.4a Continued

Project Title / QIO / Award
Chronic Kidney Disease Pilot Intervention GA $299,957
Rural Hospital Quality Measures MN $351,436
Quality of Care in Community Health Centers Using Health Care Facilitators TN $146,377
Identify New Areas of Disparity Work for Eighth SOW TN $131,104
Information Collection on Past/Potential Disparity Projects TN $118,380
Cervical Cancer Mortality TN $18,161
Physician Office Registry Development MT $4,000,000
Rebuild MedQIC Website like IHI's QHC.org IA $910,000
Best Practices in QIOs to Help Providers Improve Quality Measures WA $2,795,610
Case Studies--High Performers AZ $800,000
Statistical Support IA $636,710
Process Improvement QIOSC WA $1,287,910
New England Complex Systems Institute UT $100,000
Negative or Positive Public Reporting of Measures MD $93,979
Review of Managed Care Organization Required National Quality Projects NY, CA, MD $1,032,526
BIPA Notice of Proposed Rulemaking Grijalva IN $403,000
ESRD Facility Specific Reports (Dialysis Compare) WA $1,449,000
CAHPS Nursing Home AZ $1,300,000
Presenting Accurate Nursing Home Staffing Ratios CO $671,049
Continue Hospital Core PM Project MS $553,360
Measures Management AZ $1,301,638
Continuation of Pharmaceuticals Project MS $1,200,000
Development of Robust Measure Set Phase I NY $412,722
Voluntary Hospital Reporting (Setting Priorities) CT, MD $315,000
Risk Adjustment for HF outcomes CO $247,082
ESRD Facility Specific Reports CO $803,207
Admission Decisions: Developing Best Practices--NV/MO/CA Multiple $2,378,628
Continuing Medical Education IA $10,491
Continuing Medical Education CO $13,441
Process improvement--additional funds WA $125,375
Evaluating the framing of Publicly Reported Quality Performance Measures MD $65,000
Dev. Risk Adj. Models ($40,000 to cover shortfall) CO $40,000
Quality Improvement Recommendations MA, OH $420,969
Hospital Leadership's Impact on Performance MD $82,400
Dave squared RI $75,000
Culture Change Pilot/Workforce Study RI $985,494
SFF Study OK $100,000
Hospital PrU Cross-Setting Project CO $115,791
Supplemental Funding for the Task 2B QIOSC (Total $1,288,265 less $819,000 existing IA funds) IA $469,265
Alternative Methods for Resolving Beneficiary Complaints CO, NY $366,738
Development of Risk Adjustment Models CO $200,000

APPENDIX A 401

TABLE A.4a Continued

Project Title / QIO / Award
Field Testing of the HSPREAT (Human Subjects Protection Research Exemption Assessment Tool) VA $50,000
Quality Expert Panel VA $99,679
Call for Proposals for Creating an Environment for Quality (projects must support transitional work for Task 2 under eighth SOW using $800,000 IA Surplus transferred to Qualis) IA $821,075
Hospital Leadership and Systems Improvement IN $784,925
OBQI Web-Based Training ($250,000 to come from existing HH QIOSC) MD $200,000
Accelerating Hospital QI through Team Based Organizational Culture MD $128,000
Spreading Team Based Organization Culture to QIOs MD $98,500
Academic Medical Centers and Chronic Kidney Disease Collaboration GA $560,448
Rural Hospital Measure Development MN $196,547
Remaking American Medicine Website (PBS) WA $100,000
Cross-Setting Collaborative to Enhance Home Health Service Utilization MD $330,000
Spreading the Patient Safety Learning Pilot--IN/WI IN, WI $425,000
Optimizing the HCQIP Strategic Plan--Process Improvement Training WA $249,222
Determination of payment errors for improper billing of short-stay outliers for long-term acute care stays Multiple $209,876
Emergency Department Quality Measures Pilot Test WA $150,000
TOTAL DEVELOPMENTAL/SPECIAL PROJECTS $63,795,467
TOTAL APPROVED DEVELOPMENTAL/SUPPORT QIO PROJECTS $130,822,206
SOURCE: Personal communication, C. Lazarus, March 17, 2005.

402 APPENDIX A

TABLE A.4b Special Studies for the 7th SOW

Contract Number / Proposed Activity/Project / 3-Year Total
27 QIO Standard Data Processing System $31,003,638
28 Clinical Data Abstraction Centers $50,237,606
32 Medicare Surveillance System Data Collection $2,844,134
33 Health Care Quality Improvement Prog (HCQIP) $1,683,146
37 Quality Improvement and Evaluation System (QIES) $29,623,860
74 Facilities Management Contract Support $6,000,000
79 QIES MDS MDCN Charges $3,014,568
193 Health Plan Management System (HPMS) $3,173,948
331 Project to Integrate the ESRD System $6,428,030
5004 Seventh SOW Training/Development Meetings $31,660
5006 CDAC Pass-Thru $4,854,997
5026 Hospital Core Measurement Set $2,989,859
5029 Medicare HEDIS Quality-of-Care Performance Measures $3,849,519
5032 Measurement Indicators & Improvement Quality of Life in NHs $695,220
5038 MEGA QI Project $2,133,000
5046 QIO Audit Support $1,627,498
5059 Technology Assessment $4,800,000
5061 Usefulness of Quality Indicators in Survey Process $1,199,100
5077 Quality Forum Membership $47,250
5079 Immunization Remeasurement (Telesurvey) $534,240
5080 Healthy Aging Project $5,886,150
5081 Citizen Advocacy Center Training and Support $266,978
5082 Study and Development of QIO Best Practices $1,489,678
5083 PRO Mediation Training & Internal Quality Control $792,238
5084 Vista $100,000
5085 Clinical Data Abstraction Center (CDACs) Abstraction for CHF QAPI $1,107,250
5086 HL7 Standards Setting Process $100,000
5087 QIO Subtask Certification $149,986
5100 Data Accuracy and Verification $1,766,593
5200 ESRD CAHPS $165,000
5202 CAHPS $33,439,343
5217 Prevention Initiatives $799,993
5218 Website Quality Support $3,087,000
5220 Promotion, Quality, Consumer Research $5,381,777
5402 Influenza/Pneumococcal Vaccination Campaign $1,708,164
5403 Mammography Campaign $1,523,745
5501 National Quality Forum Collaboration $749,524
5502 Doctors Office Quality Improvement Project Collaboration with AMA $20,000
5503 Physician Measurement in Managed Care and Fee for Service $1,422,803
5505 ESRD Performance Measures $2,061,236
5506 Home Health Outcomes Based Quality Improvement $300,000
continues

APPENDIX A 403 TABLE A.4b Continued Contract Number Proposed Activity/Project 3-Year Total 5507 Home Health Quality Measurement & Refinement $1,299,673 5508 Minimum Data Set (MDS) 3.0 Development $4,420,840 5509 HEDIS Health Outcomes Survey $4,249,170 5510 ESRD Patient Survey $500,000 5511 Pittsburgh Research Initiative $1,499,740 5513 ESRD Public Reporting Initiative $248,532 5514 Analysis Contract $449,864 5515 Senior Risk Reduction $3,291,258 5516 Hospital Satisfaction Survey $1,700,000 6149 Validation of Managed Care Data for Risk Adjustment $6,388,706 5081 B Systematized Nomenclature of Medicine $350,000 Total $243,486,514 SOURCE: Personal communication, C. Lazarus, March 17, 2005.

404 APPENDIX A

TABLE A.5 Evaluation of Task 1 in 7th SOW

Task 1a: Nursing home
Domains and performance measures:
· Chronic care (percentage of residents with listed condition): pain; infections (pneumonia, urinary tract infections, etc.); pressure sores; pressure sores (with additional risk adjustment); loss of ability in some basic daily tasks; physical restraints
· Postacute care (percentage of short-stay residents with listed condition): pain; walk as well or better; delirium; delirium (with additional risk adjustment)
· Provider satisfaction

Task 1b: Home health
CMS priority and performance measures:
· Health status improvement: 11 OBQI/OASIS measures^c (getting dressed, bathing, confusion, medication management, ambulation, toileting, transferring, pain when moving, emergency care, acute hospitalization)
· Provider satisfaction

APPENDIX A 405

Task 1a scoring:
· Statewide improvement target: 8% averaged improvement on three to five publicly reported quality-of-care measures. Scoring weight^a: 0.8 – identified participant score
· Identified participant improvement target: 8% averaged improvement on three to five publicly reported quality-of-care measures. Scoring weight^a: 0.44 × (actual improvement/target improvement)
· Statewide satisfaction: 80% "satisfied" response rate. Scoring weight: 0.05
· Identified participant satisfaction: 80% "satisfied" response rate. Scoring weight: 0.15

Task 1b scoring:
· Statewide improvement target: N/A
· Identified participant improvement target: 30% of HHAs in the state must have statistically significant improvement in at least one OBQI/OASIS measure. Scoring weight^b: 0.8
· Statewide satisfaction: 80% "satisfied" response rate. Scoring weight: 0.05
· Identified participant satisfaction: 80% "satisfied" response rate. Scoring weight: 0.15

continues

406 APPENDIX A

TABLE A.5 Continued

Task 1c: Hospital
CMS priority and performance measures:
· Clinical measures: acute myocardial infarction, heart failure, pneumonia, and surgical infection
· Provider satisfaction

Task 1d: Physician's office
CMS priority and performance measures:
· Chronic disease care (diabetes): biennial retinal exam, annual hemoglobin A1c testing, biennial testing of lipid profile
· Preventive services (cancer screening): biennial screening mammography
· Preventive services (adult immunization): influenza immunization, pneumococcal immunization
· Provider satisfaction

APPENDIX A 407

Task 1c scoring:
· Statewide improvement target: 8% improvement in combined topic average (average score for a condition, based on improvement in the four sets of indicators). Scoring weight^d: 0.75
· Identified participant improvement target: N/A
· Satisfaction: 80% "satisfied" response rate. Scoring weight: 0.25

Task 1d scoring:
· Statewide improvement target: 8% improvement in combined topic average^f. Scoring weight^e: 0.8 – identified participant score
· Identified participant improvement target: 8% improvement in diabetes and cancer screening measures. Scoring weight^e: 0.44 × (actual improvement/target improvement)
· Satisfaction: 80% "satisfied" response rate. Scoring weight: 0.2

continues

408 APPENDIX A

TABLE A.5 Continued

Task 1e: Rural or underserved population
CMS priority and performance measures:
· Primary evaluation: QIO must show reduction in disparity between a nonunderserved reference group and a targeted underserved group
· Secondary evaluation: full description of targeted intervention group demographics and characteristics; documentation of rationale for why a specific intervention was chosen for a particular population; quantitative demonstration of intervention effectiveness compared with the outcome for a reference group

Task 1f^h: Medicare+Choice Organizations
CMS priority and performance measures:
· All areas: will use Medicare+Choice quality review organizations or accreditation organization evaluations of QAPI projects to determine if expected improvement was demonstrated
· Provider satisfaction

NOTE: If a Quality Improvement Organization (QIO) scores below 0.6 on any quantitative subtask (Tasks 1a to 1e and 2b), its contract will be reevaluated by a Centers for Medicare and Medicaid Services panel. OBQI/OASIS = Outcome-Based Quality Improvement/Outcome and Assessment Information Set; N/A = not applicable; HHA = Home Health Agency; QAPI = Quality Assessment and Performance Improvement.
aA passing score is a score ≥1.0. The total possible score for Task 1a is 1.0, which is equal to the statewide score + identified participant score + satisfaction rate = (0.8 – identified participant score) + [0.44 × (identified participant actual improvement/target improvement)] + (0.2 satisfaction rate).
bA passing score is a score ≥1.0. The total possible score for Task 1b is 1.0, which is equal to the (statewide score) + (satisfaction rate) = (0.8 statewide score) + (0.2 satisfaction rate).
cSee Table A.3 for measures.
dA passing score is a score of 1.0. The total possible score for Task 1c is 1.0, which is equal to the (statewide score) + (satisfaction rate) = (0.75 statewide score) + (0.25 satisfaction rate).

APPENDIX A 409

Task 1e scoring:
· Statewide improvement target: demonstrated reduction in disparity. Scoring weight^g: 1.0
· Secondary evaluation elements: scoring weight^g of 0.2 each (0.2, 0.2, 0.2)
· Identified participant improvement target: N/A

Task 1f scoring:
· Statewide target: technical assistance given; QAPI improvement. Scoring weight: 0.5
· 80% "satisfied" response rate. Scoring weight: 0.5
· Identified participant improvement target: N/A

eA passing score is a score of 1.0. The total possible score for Task 1d is 1.0, which is equal to the (statewide score) + (identified participant score) + (satisfaction rate) = (0.8 – identified participant score) + [0.44 × (identified participant actual improvement/target improvement)] + (0.2 satisfaction).
fElements of combined topic average: administrative claims (used to measure diabetes and mammography measure rates for fee-for-service beneficiaries). The weighted average of Health Plan Employer Data and Information Set data will be used to derive diabetes and mammography measure rates for Medicare+Choice organizations (if applicable). The Consumer Assessment of Healthcare Providers and Systems survey will be used to derive immunization rates statewide.
gA passing score is a score ≥1.0. The total possible score for Task 1e is 1.6, which is equal to (primary evaluation score) + (secondary evaluation score) = (1.0 primary evaluation score) + (0.6 secondary evaluation score).
hTask 1f is not a pass-fail task.
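The pass/fail arithmetic described in the lettered footnotes to Table A.5 is a simple weighted sum of component subscores. The following is an illustrative sketch only, not CMS's actual scoring code: the weights (0.8/0.2 for Task 1b in footnote b; 1.0/0.6 for Task 1e in footnote g) and the 1.0 passing threshold come from the footnotes, while the function names are invented and each subscore is assumed to be normalized to the 0–1 range.

```python
def task_score(subscores, weights):
    """Weighted sum of component subscores (each assumed to lie in [0, 1])."""
    return sum(w * s for w, s in zip(weights, subscores))

def passes(score, threshold=1.0):
    """The footnotes describe a passing score as one meeting 1.0."""
    return score >= threshold

# Task 1b (footnote b): 0.8 weight on the statewide score, 0.2 on satisfaction.
score_1b = task_score([1.0, 1.0], [0.8, 0.2])

# Task 1e (footnote g): 1.0 on the primary evaluation, 0.6 on the secondary;
# full primary credit alone reaches the 1.0 passing threshold.
score_1e = task_score([1.0, 0.0], [1.0, 0.6])
```

With full marks on both Task 1b components the score is exactly 1.0, while full marks on both Task 1e components reach the 1.6 ceiling stated in footnote g.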

410 APPENDIX A

TABLE A.6 Evaluation of Task 1a in 8th SOW

Task 1a: Nursing home
· Dimension of performance: clinical performance measure results^b
· Performance measures: pressure ulcers among high-risk residents; physical restraints; management of depressive symptoms; management of pain in chronic (long-stay) residents

APPENDIX A 411

Task 1a scoring. Statewide improvement targets carry 18% of the total^a; identified participant improvement targets carry 82% of the total^a.

Pressure ulcers among high-risk residents -- Identified Participant Groups 1 and 2:
· Baseline <10.5%: achieve RFR of 15%
· Baseline 10.5%–15%: achieve RFR of 25%
· Baseline >15%: achieve RFR of 35%
Scoring weights: IPG 1: 0.094 (8.5% of total score); IPG 2: 0.0625 (5.7% of total score)

Physical restraints -- Identified Participant Group 1:
· Baseline <4%: achieve RFR of 15%
· Baseline 4%–10%: achieve RFR of 35%
· Baseline >10%: achieve RFR of 60%
Scoring weight: IPG 1: 0.094 (8.5% of total score)
Identified Participant Group 2: achieve 10% RFR. Scoring weight: IPG 2: 0.0625 (5.7% of total score)

Management of depressive symptoms -- Identified Participant Group 1:
· Baseline <10%: achieve RFR of 30%
· Baseline ≥10%: achieve RFR of 40%
Scoring weight: IPG 1: 0.094 (8.5% of total score)

Management of pain in chronic (long-stay) residents -- Identified Participant Group 1:
· Baseline <5%: achieve RFR of 25%
· Baseline 5%–8%: achieve RFR of 35%
Scoring weight: IPG 1: 0.094 (8.5% of total score)

continues

412 APPENDIX A

TABLE A.6 Continued

· Dimension of performance: process improvement. Performance measure: extra credit: process change implementation
· Dimension of performance: organization culture change. Performance measure: target setting^b

APPENDIX A 413

Management of pain (continued):
· Baseline >8%: achieve RFR of 50%

Process improvement (extra credit): document at least one of the following processes of care for 50% of new admissions:
· Skin inspection and pressure ulcer risk assessment
· Depression screening and treatment
· Evaluation of the necessity and alternatives for the use of physical restraints
· Pain assessment and treatment
Extra credit: 0.05 for each, totaling 0.2 (18% of total score)

Target setting:
· Statewide: at least 25% of nursing homes in the state set targets for high-risk pressure ulcers and physical restraints; QIO sets its own statewide target for high-risk pressure ulcers and physical restraints. Scoring weight: 0.1 (9% of total score)
· Identified participants: all participants in Identified Participant Groups 1 and 2 must set targets for high-risk pressure ulcers and physical restraints

continues

414 APPENDIX A

TABLE A.6 Continued

· Dimension of performance: data collection on experience of care^b
· Dimension of performance: satisfaction and knowledge/perception^b

Task 1b: Home health
· Dimension of performance: clinical performance measure results^d. Performance measures: OASIS publicly reported measures^e; acute care hospitalization

APPENDIX A 415

Data collection on experience of care:
· Identified Participant Groups 1 and 2 must have 90% of nursing homes collect and monitor a satisfaction/experience-of-care survey for each of the following: residents, annually; staff, annually; retention of certified nursing assistants, annually
· Scoring weights: IPG 1: 0.0925 for each survey (8.4% of total for each survey); IPG 2: 0.0375 for each survey (3.4% of total for each survey)

Satisfaction and knowledge/perception:
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (9% of total score)

Task 1b scoring. Statewide improvement targets carry 53% of the total^c; identified participant improvement targets carry 47% of the total^c.

OASIS publicly reported measures:
· Statewide: meet or exceed target RFR for one QIO-selected OASIS publicly reported measure. Scoring weight: 0.1 (0.13 max); 10% of total score
· Identified participant: average rate of group must meet or exceed identified participant group target RFR for one home health agency–selected measure^b. Scoring weight: 0.09 (0.11 max); 9% of total score

Acute care hospitalization:
· Statewide: meet or exceed 30% RFR for acute care hospitalization measure^b. Scoring weight: 0.19 (0.22 max); 19% of total score
· Identified participant: average rate of group must meet or exceed identified participant group target RFR for acute care hospitalization measure^b. Scoring weight: 0.27 (0.32 max); 27% of total score

continues

416 APPENDIX A

TABLE A.6 Continued

· Dimension of performance: systems improvement^b. Performance measure: telehealth
· Dimension of performance: process improvement. Performance measures: immunization assessment survey^b; incorporation of immunizations into computer
· Dimension of performance: organization culture change. Performance measure: survey tool to measure organizational culture change

APPENDIX A 417

Telehealth (systems improvement):
· Implementation by identified participant group of telehealth meeting CMS telehealth guidelines. Scoring weight: 0.05 (5% of total score)

Immunization assessment survey (process improvement):
· 50% minimum response rate. Scoring weight: 0.05 (5% of total score)
· Achieve 50% RFR (or 80% statewide performance) on the percentage of home health agencies that incorporated influenza or pneumococcal immunizations, or both, into comprehensive patient assessments. Scoring weight: 0.09 (0.11 max); 9% of total score

Organization culture change:
· Implement CMS survey tool. Scoring weight: 0.02 (2% of total score)
· Implementation of quality improvement activity and submission of a plan of action based on results of organizational culture change survey. Scoring weight: 0.04 (4% of total score)

continues

418 APPENDIX A

TABLE A.6 Continued

· Dimension of performance: organization culture change. Performance measure: extra credit: target setting
· Dimension of performance: satisfaction and knowledge/perception

Task 1c1: Hospital
· Clinical performance measurement results: appropriate care measure^b,g
· Clinical performance measurement and reporting: measures reporting^b; assistance to hospitals to ensure data are timely, valid, and complete^b
· Process improvement: Surgical Care Improvement Project (SCIP)

APPENDIX A 419

Target setting (extra credit):
· At least 25% of non-identified participant group home health agencies set targets for acute care hospitalization and other OASIS measures. Extra credit: 0.07
· At least 50% of identified participant group home health agencies set targets for acute care hospitalization and other OASIS measures. Extra credit: 0.05

Satisfaction and knowledge/perception:
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (10% of total score)

Task 1c1 scoring. Statewide improvement targets carry 27% of the total^f; identified participant improvement targets carry 73% of the total^f.
· Appropriate care measure^b,g: at least 75% of hospitals must achieve 50% RFR. Scoring weight: 0.3 (0.4 max^h); 27% of total score
· Measures reporting: 25% of hospitals must report on the set of 22 HQA measures^g. Scoring weight: 0.1 (9% of total score)
· Data timeliness, validity, and completeness: more than 95% of hospitals submitting data to QIO Data Warehouse. Scoring weight: 0.1 (9% of total score)
· SCIP: at least 50% of identified participant group hospitals achieve an overall RFR ≥25% on SCIP process measures for surgical site infections and venous thromboembolism^b,g. Scoring weight: 0.3 (27% of total score)

continues

420 APPENDIX A

TABLE A.6 Continued

· Dimension of performance: systems improvement. Performance measure: use of CPOE, bar coding, or telehealth^b
· Dimension of performance: satisfaction and knowledge/perception^b

Task 1c2: Critical access hospital or rural hospital
· Clinical performance measure results: one quality improvement measure selected by each critical access hospital
· Clinical performance measurement and reporting: reporting of Hospital Quality Alliance measure set^h

APPENDIX A 421

Extra credit: achieve overall RFR ≥25% for other SCIP process measures. Extra credit^i: 0.1 max

Use of CPOE, bar coding, or telehealth:
· Percentage of hospitals completing assessment tool; percentage of hospitals demonstrating improvement at remeasurement. Scoring weight: 0.2 (18% of total score)

Satisfaction and knowledge/perception:
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (9% of total score)

Task 1c2 scoring. Statewide improvement targets carry 64% of the total^j; identified participant improvement targets carry 36% of the total^j.

Clinical performance measure results:
· RFR >10% (extra credit: RFR >20%)
· 50% of nonreporting critical access hospitals report on at least one Hospital Quality Alliance measure topic
· Weight = number of critical access hospitals reporting/total number of critical access hospitals
· Score = (weight × 0.5) + {weight × 0.1 × [(actual RFR – 0.1)/0.1]}
· 0.6 max for both clinical

continues

422 APPENDIX A

TABLE A.6 Continued

· Extra credit: reporting on transfer measures for new acute myocardial infarction and/or emergency department
· Dimension of performance: systems improvement. Performance measure: use of CPOE, bar coding, or telehealth
· Dimension of performance: organizational change. Performance measure: hospital safety culture assessment
· Dimension of performance: satisfaction and knowledge/perception^b

APPENDIX A 423

The 0.6 maximum applies to both clinical performance measure results and reporting.

Extra credit: 100% of nonreporting critical access hospitals report on at least one Hospital Quality Alliance measure topic
· Weight = number of critical access hospitals not reporting/total number of critical access hospitals
· Score = (weight × 0.5) + {weight × 0.1 × [(% newly reporting – 0.5)/0.1]}

Transfer measures: extra credit^k (0.2 max) for working with critical access hospitals to promote reporting on these measures and identifying a quality improvement project

Use of CPOE, bar coding, or telehealth: extra credit (0.05) if at least one nonreporting critical access hospital works on CPOE, bar coding, or telehealth and achieves the evaluation criteria

Hospital safety culture assessment:
· Percentage achieving RFR ≥1% from results of safety culture assessment. Scoring weight: 0.4^j,k (36% of total score)

Satisfaction and knowledge/perception:
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (9% of total score)

continues

424 APPENDIX A

TABLE A.6 Continued

Task 1d1: Physician practice
· Clinical performance measure results: statewide support for Physician Voluntary Reporting Program^g; statewide quality improvement by working with public health, provider groups, and others to support prevention and disease-based care processes; assistance to Medicare Advantage plans; assistance to End-Stage Renal Disease Networks; Medicare Management Demonstration Project
· Clinical performance measurement and reporting^m: export data
· Process improvement^m: care management process to meet individual's health needs through the practice site systems survey
· Systems improvement^m: production and use of information from electronic systems
· Satisfaction and knowledge/perception^b

APPENDIX A 425

Task 1d1 scoring. Statewide improvement targets carry 17% of the total^l; identified participant improvement targets carry 83% of the total^l.
· Statewide: improvement, as evaluated by project officer. Scoring weight: 0.1 (8.3% of total score)
· Report on at least one DOQ measure: preexisting electronic systems (10% of sites that did not have them; 20% of sites that did). Scoring weights: 0.2, 0.2
· Adoption of care management process: electronic clinical information systems (30% of sites that did not have them; 75% of sites that did). Scoring weights: 0.2, 0.2
· Produce and use electronic clinical information for 75% of sites without preexisting electronic clinical information systems^b. Scoring weight: 0.2 (17% of total score)
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (8.3% of total score)

continues

426 APPENDIX A

TABLE A.6 Continued

Task 1d2: Underserved populations
· Clinical performance measure results: claims-based clinical measures^g
· Clinical performance measurement and reporting: Task 1d1 activities
· Systems improvement: promotion of culturally and linguistically appropriate service (CLAS) standards
· Process improvement: cultural competency education
· Satisfaction and knowledge/perception^b

APPENDIX A 427

Task 1d2 scoring. Statewide improvement targets carry 35% of the total^n; identified participant improvement targets carry 65% of the total^n.
· 4% absolute improvement for all underserved populations for diabetes, mammography, and adult immunization measures. Scoring weight: 0.25 (25% of total score)
· Promote improvement in rates for applicable underserved populations; select underserved populations that at least equal the underserved population in the state to complete Task 1d1 activities
· Use Office of Minority Health Theme 3 tool with 80% completion rate to promote adoption of CLAS standards^b. Scoring weight: 0.25 (25% of total score)
· 80% of primary care physicians complete both Themes 1 and 2 of Office of Minority Health tool^b. Scoring weight: 0.4 (40% of total score)
· At least 80% score on satisfaction and knowledge/perception surveys. Scoring weight: 0.1 (10% of total score)

continues

428 APPENDIX A

TABLE A.6 Continued

Task 1d3: Part D prescription drug benefit
· Dimension of performance: clinical performance measure results

NOTE: RFR = reduction in failure rate; IPG = identified participant group; QIO = Quality Improvement Organization; OASIS = Outcome and Assessment Information Set; CMS = Centers for Medicare and Medicaid Services; CPOE = Computerized Provider Order Entry; CAHPS = Consumer Assessment of Healthcare Providers and Systems.
aThe Task 1a score is equal to (0.5 clinical performance measure scores) + (0.5 organization culture change scores) + (0.1 satisfaction and knowledge/perception score) + (0.2 extra credit); total score = 1.1; total possible score = 1.3.
bCore activities. If a QIO does not complete these specific activities, its contract may be subject to reevaluation by a Centers for Medicare and Medicaid Services panel.
cThe Task 1b score is equal to (0.65 clinical performance measure score) + (0.05 systems improvement score) + (0.14 process improvement score) + (0.06 organization culture change score) + (0.1 satisfaction and knowledge/perception score) + (0.27 extra credit); total score = 1.0; total possible score = 1.27.
dThe total points for these measures are scaled on the basis of percent improvement above or below the target RFR. Extra credit is available for scoring above the target RFR, indicated here by (max).
eExcept acute care hospitalization and emergent care; see Table A.3 for measures.
fThe Task 1c1 score is equal to (0.3 clinical performance measure score) + (0.2 clinical performance measurement and reporting scores) + (0.3 process improvement score) + (0.2 systems improvement score) + (0.1 satisfaction and knowledge/perception score); total score = 1.1; total possible score = 1.3.

APPENDIX A 429 Statewide Improvement Identified Participant Improvement Targets Scoring Weightso Targets Scoring Weightso Measures to be Implementation of a To be determined by developed by quality improve- Government Task consensus review ment project Leader process CAHPS For QIOs electing to work on self- management of medication therapy gSee Table A.3 for measures. hExtra credit for the Appropriate Care Measure Identified Participant Group is based on recruitment of hospitals. iPartial credit is also given. QIOs achieving at least 25% RFR on three measures will receive 0.05 point; QIOs achieving at least 25% RFR on four measures will receive the full 0.1 point. jThe Task 1c2 score is equal to (0.6 clinical performance measure score and clinical perfor- mance measurement and reporting score) + (0.4 organization culture change) + (0.1 satisfac- tion and knowledge/perception score); total possible score = 1.35. kExtra credit for these activities are scaled on the basis of the percentage of critical access hospitals achieving the target RFR. lThe Task 1d1 score is equal to (0.1 clinical performance measure score) + (0.4 clinical performance measurement and reporting score) + (0.4 process improvement score) + (0.2 sys- tems improvement score) + (0.1 satisfaction and knowledge/perception score); total score = 1.2. m The total points for these activities are scaled on the basis of the ability of participants without electronic clinical information systems to produce clinical information. nThe Task 1d2 score is equal to (0.25 clinical performance measure score) + (0.25 systems improvement score) + (0.4 process improvement score) + (0.1 satisfaction and knowledge/ perception score); total score = 1.0. o"Passing" for Task 1d3 is to be determined by the Task 1d government task leader.
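Each footnoted scoring formula is a simple weighted sum of component scores. As an illustrative sketch only (the variable names, function name, and 0-to-1 component scale below are descriptive choices made here, not official CMS terminology), the Task 1d2 formula from footnote n can be computed as:

```python
# Illustrative sketch of the Task 1d2 scoring formula (footnote n):
# total = 0.25*clinical + 0.25*systems + 0.40*process + 0.10*satisfaction
# The weight labels are descriptive names chosen for this sketch,
# not official CMS contract terminology.

TASK_1D2_WEIGHTS = {
    "clinical_performance_measure": 0.25,
    "systems_improvement": 0.25,
    "process_improvement": 0.40,
    "satisfaction_knowledge_perception": 0.10,
}

def task_score(component_scores, weights):
    """Combine component scores (each assumed on a 0-1 scale) as a weighted sum."""
    return sum(weights[name] * component_scores[name] for name in weights)

# A QIO fully meeting every component target would receive the full total score:
perfect = {name: 1.0 for name in TASK_1D2_WEIGHTS}
print(task_score(perfect, TASK_1D2_WEIGHTS))  # 1.0, matching footnote n's total
```

The same structure applies to the other footnoted tasks; only the weight values and the presence of extra-credit terms (footnotes a, c, k) differ.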

TABLE A.7 Comparison of Deliverables for the 7th and 8th Scopes of Work

Task 1a: Nursing Homes

7th SOW deliverables:
- Development and implementation of a quality improvement plan in which 3 to 5 of the 10 nursing home quality-of-care measures were targeted for statewide improvement
- Development and implementation of a plan to partner with nursing home stakeholders
- List of the identified participants
- Contact name for each identified participant
- Documentation of PARTner activity codes

8th SOW deliverables:
- Development of alternative Task 1a criteria (applicable to WY, AK, DC, and PR)
- Lists of the identified participants for groups 1 and 2
- Indication of whether the QIO will work on process improvement measures and which nursing homes will submit data for these measures
- Targets, set with the help of nursing homes at the statewide level, for the measures for high-risk pressure ulcers and for physical restraints (management of depressive symptoms and management of pain in patients with chronic pain are optional)
- Submission of statewide targets for the measures of high-risk pressure ulcers and for physical restraints; submissions for measures of management of depressive symptoms and management of pain in chronic pain are optional
- Documentation of baseline and annual remeasurement rates for resident satisfaction
- Documentation of baseline and annual remeasurement rates for staff satisfaction
- Documentation of annual certified nursing assistant or nursing aide turnover rate
- Quarterly submission of mandatory process of care data (optional)

Task 1b: Home Health

7th SOW deliverables:
- QIO training of home health agencies on OBQI
- List of identified participants
- List of contact information for each participant
- Documentation of PARTner activity codes

8th SOW deliverables:
- Lists of the clinical performance identified participant group and their plans of action
- Lists of the systems improvement and organization culture change identified participant group
- Selected statewide OASIS measure
- Acute care hospitalization strategic plan
- Acute care hospitalization strategic plan final report
- Systems improvement and organization culture change identified participant group survey results
- Systems improvement and organization culture change identified participant group plans of action
- Statewide survey results of immunization practices

Task 1c1: Hospitals

7th SOW deliverables:
- List of contact information for every hospital in the state

8th SOW deliverables:
- Updated data on the Provider Reporting System
- List of identified participants for the acute care measure, surgical care improvement project, and systems improvement and organization culture change identified participant groups
- Documentation of contact with the local American College of Surgeons president
- Results of baseline readiness/adoption tool for CPOE, bar coding, or telehealth
- Results of remeasurement readiness/adoption tool for CPOE, bar coding, or telehealth
- Systems improvement and organization culture change hospitals' plans for CPOE, bar coding, and telehealth implementation

Task 1c2: Critical Access Hospitals

7th SOW deliverables: N/A

8th SOW deliverables:
- Submission of critical access hospital measure set
- Report of quality improvement activities on at least one critical access hospital measure
- List of participants for identified participant group
- Final report of quality improvement activities with all reporting critical access hospitals
- Submission of the Rural Organizational Safety Culture Change interventions and change models tested/implemented
- Baseline results and methods of safety culture survey
- Report of Rural Organizational Safety Culture Change interventions and change models implemented
- Remeasurement results of safety culture survey

Task 1d1: Physician Practice

7th SOW deliverables:
- List including each identified participant along with his or her Unique Physician Identification Number, via PARTner
- List of contact information for each participating physician office
- Recruitment plan
- Strategy and assistance for electronic submission of DOQ measures
- Office System Survey of identified participant groups

8th SOW deliverables:
- Assistance given to Medicare Advantage plans
- Assistance provided to support the Physician Voluntary Reporting Program and other statewide work
- Work plan indicating the technical assistance activities offered to identified participant physician practice sites, including those sites in Task 1d2
- List of physician practice sites receiving QIO assistance
- Office System Survey assessing status of the identified participant group for electronic clinical information production and use
- Updated environmental scan
- List of physician practice sites with applications of interest for QIO assistance
- List of physician practice sites using EHRs due to the work of the QIO
- Information depicting QIO efficiencies

Task 1d2: Physician Practice: Underserved Populations

7th SOW deliverables: N/A

8th SOW deliverables:
- Identify Task 1d1 underserved identified participants
- Identify CLAS identified participants
- Report efforts to reach underserved populations
- Report CLAS results

Task 1d3: Physician Practice/Pharmacy: Part D Prescription Drug Benefit

7th SOW deliverables: N/A

8th SOW deliverables:
- Assessment of environment for electronic prescribing and continuous quality improvement
- QIO staff/training plan
- Baseline levels of performance
- Submission of two concept papers for quality projects to be developed with Medicare Advantage and other prescription drug plans
- Submission of one project proposal for a quality project to be developed with Medicare Advantage and other prescription drug plans
- Plan interventions and develop interventional materials
- Identify annual quality measure targets
- Report required information on providers involved in projects
- Directory of contacts within each prescription drug plan

Task 1e: Underserved and Rural Beneficiaries

7th SOW deliverables:
- Submission of approved 6th SOW plans targeting an underserved population
- Submission of plan if new project was chosen
- Report of final results

8th SOW deliverables: N/A

Task 1f: Medicare Advantage

7th SOW deliverables:
- Plan of action to invite Medicare+Choice organizations to participate in Tasks 1a to 1e
- Submit list of contacts for all Medicare+Choice organizations

8th SOW deliverables: N/A

NOTE: SOW = scope of work; QIO = Quality Improvement Organization; PARTner = Program Activity Reporting Tool; OBQI = Outcome-Based Quality Improvement; OASIS = Outcome and Assessment Information Set; CPOE = computerized provider order entry; N/A = not applicable; DOQ = Doctor's Office Quality; EHR = electronic health record; CLAS = culturally and linguistically appropriate services.
