
Performance Measurement: Accelerating Improvement (2006)

Suggested Citation:"Appendix C Case Studies." Institute of Medicine. 2006. Performance Measurement: Accelerating Improvement. Washington, DC: The National Academies Press. doi: 10.17226/11517.

Appendix C
Case Studies

THEMES FROM CASE STUDIES

Even the most carefully designed health policy cannot be realized without a sound implementation strategy that anticipates and prepares to address issues that could impede the envisioned change. In this section, the committee reviews the issues likely to arise in different health care settings, especially small practices, during the implementation of a national system for performance measurement and reporting. This national system, to be led by the National Quality Coordination Board (NQCB), will require a major shift in the current culture of health care in the United States: a shift away from the traditional provision of care within care settings and toward stronger involvement of patients in their care. Along with this redesign of health care delivery, a coordinated system will entail public reporting of performance measures, as well as a greater emphasis on shared accountability among providers and patients for improving the quality of care delivered. Many providers will need assistance as they undertake performance measurement activities. To implement the new processes in their practices and prepare for participation in a national system, providers may need to invest financial and personal resources in the short run for long-term gain.

Health care organizations will need to be prepared to commit the resources necessary to change their operations to accommodate the measurement tasks called for by the NQCB and to retool their internal processes. These tasks will likely strain existing resources, which will need to be redeployed within organizations. Part of the implementation strategy for a national system for measurement and reporting is a phase-in period that will allow providers to learn the necessary procedures and protocols once the infrastructure is in place, and that will allow provider support and feedback on performance measurement activities to be obtained. Economies of scale may allow larger organizations to respond more quickly than smaller organizations with fewer resources to devote to these tasks.

To examine the experience of practices currently implementing performance measurement, the committee sought input from a small sample of practices across various regions and communities. Major themes emerged from these case studies, revealing potential issues associated with implementing a national system for performance measurement and reporting. Specifically, the IOM committee sought to address two main questions: (1) What will it take to obtain provider support? and (2) How feasible is it to implement the proposed NQCB? In addition, the committee wished to address the issues associated with implementation of performance measurement, especially those faced by small practices, with particular attention to barriers encountered and successes achieved in overcoming them.

Three main themes emerged from the case studies: (1) the need to obtain physician support, (2) the need to obtain the necessary resources (human, technical, and financial), and (3) the importance of sustaining change. These themes are discussed below, followed by a review of the barriers faced and successes achieved by the practices studied (see Table C-1).

Obtain Physician Support

The first step in implementing performance measurement within clinical practices is to obtain physicians’ agreement to participate. Omitting this important step could delay or undermine the success of the NQCB. The NQCB can obtain physician support in a number of ways, as indicated by the providers contacted in the committee’s case studies:

  • Support provider participation in federal, state, and local collaborative arrangements for data collection and interpretation of results.

  • Encourage those providers who are not already implementing quality improvement to seek help from their colleagues who are doing so or from their professional organizations.

  • Promote use of practice guidelines by measure developers to achieve consensus on measures, and include multiple stakeholders in the consensus process.

  • Encourage provider innovation in measurement activities that are clinically meaningful and specific to the practice setting, in addition to meeting national requirements.


TABLE C-1 Key Themes for Implementing Performance Measurement

                                                    Case Study^a
Key Themes                                          A   B   C   D   E   F   G

Seek physician support
• Use of practice guidelines to reach consensus
  on measures                                       X   X   X   X   X   X   X
• Provider ownership of data                        X   X   X   X   X   X   X
• Prior exposure to performance measurement and
  quality improvement through professional
  organizations                                     X   X       X   X   X
• Participation in federal, state, and local
  collaboratives                                    X   X   X   X   X
• Use of pay for performance                        X   X   X

Obtain resources
Human (hiring new staff)
• Additional clinical staff, such as physician
  assistants and nurses, for internal quality
  improvement efforts                               X   X   X   X   X   X
• Outside vendors for system maintenance                X   X   X
• Full-time technicians for data system management
  or part-time staff to help with data collection   X       X   X       X
Technical assistance
• Recruiting of staff with prior training in
  quality improvement, such as a quality assurance
  coordinator                                       X   X   X   X       X
• Provider assistance received from an outside
  organization (e.g., an academic health center
  helps with data collection and interpretation)        X   X       X   X   X
• Provider collaboration with federal, state, and
  local organizations for assistance with data
  collection and feedback                               X       X   X
Information technology
• Purchase of hardware and software within the
  practice                                          X   X   X   X   X   X   X
• Implementation of EHRs                            X   X       X   X   X   X
Financial
• Up-front investments to get the practice ready
  for performance measurement                       X   X   X   X   X   X   X
• Cost sharing through affiliations or
  partnerships with local collaboratives                X   X       X   X
• Grants for performance measurement activities         X

Successes
• Better patient care                               X   X   X   X   X   X   X
• Provider and staff satisfaction                   X   X   X   X   X   X   X
• Improvement shown by all practices that measure
  performance                                       X   X   X   X   X   X   X
• Ability of small practices, even a solo
  physician, to measure performance successfully        X   X   X   X
• Increased office efficiency                       X   X       X       X
• Increased revenue                                     X       X   X       X

Barriers^b
• Provider resistance
• Difficulty in demonstrating a business case
• Time required to get EHRs fully operational
• Requirement of additional resources to redesign
  practice care teams and ancillary personnel

Sustain change
• Review of performance measures annually to
  adjust criteria and practice goals                X   X   X   X   X   X   X
• Updating of the use of information technology to
  support quality efforts                           X   X   X   X   X   X   X
• Increase in staff as needs dictate to continue
  internal quality improvement                      X   X   X   X
• Continuation or increase of bonus payments for
  meeting care targets                              X   X   X
• Creation of internal committees to review
  performance measurement                           X   X   X

Uses of performance measurement
• Internal quality improvement                      X   X   X   X   X   X   X
• Pay for performance                               X   X   X
• Public reports                                    X

^aCase studies are masked here because the focus is on the synthesis of key themes. This synthesis is based on responses by the 7 case study subjects to a list of questions prepared by the IOM committee.

^bBarriers were not included in the committee’s initial list of questions; however, the practices indicated they had overcome these difficulties over the past 3–5 years, when they began using performance measurement.

  • Ensure that providers have ownership of their data. Allow them to check their data before sending it to a repository, and provide a process for them to dispute what they believe to be inaccurate or inappropriate data.

  • Encourage the use of pay for performance to help offset some of the personal and financial investments required of participating providers.

Obtain Needed Resources

Many practices examined by the committee indicated that they needed additional human, technical, and financial resources in order to implement performance measurement. The NQCB, in collaboration with other organizations, could help providers locate these resources. The assistance provided might include the following:

  • Human resources—support additional staff to manage data systems and input data.

  • Technical assistance

    • Assistance from other organizations at the national and subnational levels in educating practice staff in quality improvement.

    • Help with implementation of electronic health records (EHRs) or practice management systems that link billing and medical records, possibly at the national or subnational level.

    • Help with data interpretation early in the process, especially before public reporting.

    • Adaptation of data feedback to increase its usability by the practice.

  • Financial support—Offer financial incentives to providers, from either private or public funds, for implementing performance measurement and achieving higher quality of care as demonstrated by their data. A common concern is that performance measurement activities may reduce patient volume because the focus of care is on providing quality, not on increasing office visits.

Sustain Change

Results of the committee’s case studies indicated that performance measurement based on temporary support, such as a short-term grant or individual experimentation within a practice, does not lead to successful implementation. The decision to implement quality improvement and performance measurement activities must begin with physician support and with the necessary human, technical, and financial resources for the long term, as discussed above. Moreover, initial gains must be translated into long-term success. The small practices examined by the committee provided several examples of ways to sustain quality improvement, such as the following:

  • Providers can participate in an annual review process to update performance measures, adjusting criteria and goals to align with the practice’s quality improvement efforts.

  • Practices need consistency in the levels of reimbursement tied to their performance and established goals, as well as in the associated procedures and policies.

  • Several providers mentioned that their community reputation for providing high-quality care is important to them and is a key reason for wanting to continue with performance measurement. They value having data that demonstrate the quality of the care they provide.

  • Public and private financial assistance can be provided for quality improvement and performance measurement activities. As noted above, providing high-quality care is expensive.

Barriers to Implementation

The case studies revealed a number of barriers to the implementation of performance measurement, regardless of practice size: provider resistance, difficulty in demonstrating a business case, time required to get EHRs or paper record systems ready for use, and the need for additional resources (as discussed above) to restructure practice care teams and ancillary personnel.

As noted earlier, since the NQCB tasks include collecting and reporting measures based on both administrative and medical record data, most small practices will need more help in this area than larger organizations. In addition, the small practices in the committee’s case studies emphasized both technological and fiscal barriers.

For example, a technological problem that can occur when implementing EHRs was cited by GreenField Health in Oregon (case study 5, described below). GreenField Health noted that the first step for a practice after implementing EHRs is to customize them for its own use and standardize the way clinicians enter the data, facilitating the collection of accurate measures. Thus, it may be useful to have someone in the office familiar with the clinical and database issues to help address these needs. Other major barriers reported by small practices are presented below in Box C-1.

Often it is difficult to collect the data needed by the practice to carry out quality improvement activities. For example, GreenField Health reported that laboratory results, such as hemoglobin A1c for diabetics or mammogram readings, are not currently retrievable from claims data. This small practice hired a physician with advanced computer skills who streamlined its data sources for performance measurement activities, including database programming when necessary. Likewise, a solo internist at Prime Care Family Practice in Oklahoma (case study 3) hired a part-time college student to scan laboratory results into his EHR system. Another example of practices being required to report data that are difficult to collect is proof of eye exams for diabetics, which is required by recognition programs for diabetes care. To overcome this difficulty, Community Medicine Associates (case study 6) bought a photo machine to administer eye exams that are interpreted by an ophthalmologist’s practice off site and returned for documentation purposes.

BOX C-1
Barriers to Performance Measurement in Small Practices

  • Large cost of setting up the infrastructure; difficulty of hiring and retaining physicians and other staff who understand the goals of quality improvement.

  • Lack of standard software that can collect data for multiple purposes (often gathered using multiple paper forms to respond to different requests).

  • Lack of private contracts that reward high performers at low cost to the plan (e.g., high-quality care is not low cost if the guidelines require a certain number of tests and follow-up visits that drive up costs).

  • Lack of reimbursement for treating patients via telephone and e-mail (e.g., there is no financial incentive for fewer patient visits).

  • Continual costs for maintenance of data systems (in addition to the purchase cost, and usually paid on a monthly basis).

GreenField Health also noted that additional technical costs are likely to be incurred once a practice has identified problems with its performance based on the data collected. These costs include the technical assistance and resources required to mine the data from registries or other data sources so as to maximize the information obtained. Moreover, once performance data have been collected voluntarily, there is no source in the market that will pay a small practice for such data; thus a business case or financial incentive for collecting the data does not currently exist.

All of the barriers discussed above were overcome by small practices with perseverance and a commitment to providing the highest-quality care possible for their patients, even when they encountered challenges such as staff turnover, high out-of-pocket expenses, and limitations of technology. Several practices reported that setbacks result in learning that leads to improved internal processes, which eventually make it possible to achieve success.

Successes Achieved

All the practices in the committee’s case studies emphasized that they were able to overcome most of the above barriers, and they gave reasons for continuing to believe in and participate in performance measurement and quality improvement. Regardless of practice size, performance measurement can be implemented successfully given the right cultural environment, resources, and tools. The case study practices reported such general successes as increased office efficiency, provider and staff satisfaction, better patient care, higher quality of life (because practitioners could work at home and save time at the office by using EHRs), and in some cases increased revenue after the initial short-term investments.

Specific successes are detailed in the next section. For example, Prime Care Family Practice (case study 3), a small rural practice, was informed by its state Quality Improvement Organization that only 8 percent of its eligible patients had been referred for a mammogram. To address this problem, the practice scheduled times for its patients to receive mammograms at the local hospital every Friday. Within 1 year, 100 percent of eligible patients had been referred for a mammogram, and 76 percent had a mammogram result documented in their patient record. This is a clear example of the improved patient care that can result from access to performance measurement data, without which providers would have been unaware of and thus unable to address the problem.

The case study practices also shared with the committee nonfinancial motivators that attract providers to participate in pay-for-performance arrangements, such as clinician satisfaction, data available for innovative tracking of patients, improvement of one’s local reputation, increased billing compliance, and decreased liability. For example, North Texas Medical Group (case study 7) stated that it was able to use performance measurement to implement an innovative approach to improving blood test monitoring of patients taking Coumadin. By using performance measurement, North Texas Medical Group was able to carefully monitor some of its high-risk patients, which increased its practitioners’ satisfaction with their clinical performance. As a result of other performance measurement activities, North Texas Medical Group was also able to develop a local reputation for providing excellent care, and has been able to decrease its liability concerns over the last several years. As an example of how a nonfinancial motivator can be linked to performance measurement, Community Medicine Associates (case study 6) used performance measurement primarily to improve productivity and billing compliance, which rests on billing correctly, and not overbilling, for services. Because of this increased productivity, the organization was able to provide bonuses of up to $5,000 per quarter, or $20,000 per year, to its participating providers.

All practices indicated that neither their financial and personal investments nor any frustrations experienced along the way detract from the value of performance measurement; the effort is worth the time and investment for them and their patients. Thus there is a clear need for a national system for performance measurement and reporting to foster such quality improvement efforts, especially for those small practices that are already struggling with competing market demands.

FULL DESCRIPTIONS OF CASE STUDIES

Case Study 1: HealthPartners, Inc.

HealthPartners, Inc., is a large nonprofit health care organization structured as a mixed-model health plan serving 630,000 members in group practices throughout Minnesota. Organized as a broad network of physicians and hospitals, HealthPartners provides services in practices with 10 to 600 physicians. Among its members, 30 percent receive care from HealthPartners Medical Group and Clinics, a staff-model group, and 70 percent from other contracted medical groups. HealthPartners serves its members across a range of health needs, from preventive to chronic disease services.

In addition to tracking performance on individual measures, HealthPartners calculates a composite score for a set of critical aspects of care received by the patient for a given condition. Data for these composite measures—addressing diabetes, cardiovascular disease, preventive care, and depression—are derived from administrative data and chart abstraction based on either electronic or paper records. Computer-based and paper registries are maintained separately from medical records and are not currently used to report performance. Rigorous validation of measures consists of four functions: drafting technical specifications, testing the measures, applying appropriate sampling methodology, and modifying the measures as needed.

A quality measurement steering committee, including medical group representatives, oversees measurement development and reporting at HealthPartners. The committee develops the composite measures mentioned above to align with provider-developed, evidence-based guidelines of the Institute for Clinical Systems Improvement (ICSI). ICSI is a not-for-profit collaborative in Minnesota consisting of medical groups and hospital systems, and serving as a driving force for improvement in the delivery of health care. The association between ICSI and HealthPartners has facilitated providers’ acceptance of and involvement in performance measurement.

The cost to the plan for record review is $12 per review and approximately $0.014 per member per month (PMPM) across all health plan members (see Table C-2). In comparison, the plan’s review cost for Health Plan Employer Data and Information Set (HEDIS) commercial reporting in 2004 was approximately $0.013 PMPM (see Table C-3). Additional resources needed for data collection activities include staff time for identifying patient samples


TABLE C-2 Total Cost Estimate per Review for Plan Members of HealthPartners, Inc.

Performance Measure                        Sample   No. of   No. of       No. of
(total members = 631,780)                  Size     Groups   Components   Records   Cost^a

Optimal depression care                    60       16       3            960       $11,500
Optimal diabetes care                      80       27       5            2,160     $26,000
Optimal cardiovascular disease care        80       26       4            2,080     $25,000
Preventive care up to date (adults)        80       26       7            2,080     $25,000
Preventive care (children and
  adolescents)                             60       27       13           1,620     $19,500
Tobacco assessment^b                       NA^c     27       1            0         $0
Body mass index assessment^b               NA^c     27       1            0         $0
Total                                      360               34           8,900     $107,000

^aThe actual cost of chart review is under $12 per record.

^bTobacco assessment and body mass index measures are collected through chart review on the preventive care up-to-date samples, at minimal incremental cost. For a full list of measures, see the 2004 Clinical Indicators Report, available at http://www.healthpartners.com/files/23463.pdf.

^cNA = not applicable.
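The roughly $0.014 PMPM figure cited in the text can be checked against the totals in Table C-2. The short calculation below is illustrative arithmetic only, not part of the original report:

```python
# Illustrative check of the per-member-per-month (PMPM) review cost,
# using the Table C-2 totals: $107,000 for 631,780 plan members.
total_review_cost = 107_000               # total chart-review cost, dollars
members = 631_780                         # all health plan members
pmpm = total_review_cost / members / 12   # spread the annual cost over 12 months
print(f"PMPM: ${pmpm:.4f}")               # → PMPM: $0.0141
```

The result, about $0.0141 per member per month, agrees with the approximately $0.014 PMPM reported for all health plan members.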

TABLE C-3 Total Cost Estimate per Review for Commercial Plan Members

Performance Measure                                      No. of     No. of
(total commercial members = 531,186)                     Measures   Records   Cost^a

Childhood immunization                                   8          411       $5,000
Adolescent immunization                                  5          411       $5,000
Colorectal cancer screening                              1          411       $5,000
Beta-blocker treatment after a heart attack              1          411       $5,000
Cholesterol screening after an acute event               3          411       $5,000
Comprehensive diabetes care                              7          411       $5,000
Timeliness of prenatal and postnatal care                2          411       $5,000
Well-child visits in the first 15 months of life         7          411       $5,000
Well-child visits in the third, fourth, fifth, and
  sixth years of life                                    1          411       $5,000
Well-adolescent visits                                   1          411       $5,000
Total                                                    36         4,110     $50,000

^aThe actual cost of chart review is under $12 per record.


BOX C-2
Key Lessons Learned from HealthPartners, Inc.

  • Performance measurement is a powerful tool. It should be focused on what is important, not what is easy to measure.

  • Measures must be clinically relevant to engage clinicians.

  • Composite measures provide a better assessment of system performance than multiple single-service measures.

  • Composite measures with aligned incentives engage medical groups in improving systems and implementing team-based care.

  • Data displays should not waste a viewer’s time. It should take no longer than 30 seconds to understand the “call to action.”

and calculating and validating performance rates, training of abstractors, maintenance of measurement specifications, and development and publication of results. Medical group costs relate to record retrieval, internal measurement and reporting, and quality improvement changes.

Performance improvement has been demonstrated for all composite measures. Optimal diabetes care (hemoglobin A1c ≤8, LDL cholesterol <130, blood pressure <130/85, not smoking, and daily aspirin) increased from 6.2 percent in 2000 to 18.4 percent in 2004. Optimal coronary artery disease care (LDL cholesterol <130, blood pressure <140/90 for age ≤60 and <160/90 for age >60, not smoking, and daily aspirin) increased from 21.3 to 51.0 percent in the same time period. The overall preventive care up-to-date rate (percentage of members within the sample who receive all preventive screenings appropriate to the member’s age and gender) rose from 44 percent in 1997 to over 70 percent in 2004.

HealthPartners’ performance measurement focuses on medical groups and comparative public reporting. Many, though not all, medical groups also report individual provider performance on the same measures internally but not publicly. The goal of medical group performance reporting is to achieve improvements in individual patient care and overall population health. In addition to the incentive created by public reporting, medical groups are eligible for bonus payments when performance targets are met. Key lessons learned from HealthPartners, Inc., are summarized in Box C-2.

Case Study 2: Internal Medicine Solo Practice

James P. Wilson is an internal medicine provider in Fort Walton Beach, Florida, who owns a small solo practice serving approximately 1,800 patients, 35 percent of whom are Medicare beneficiaries accounting for two-thirds of his 5,200 patient visits annually. A significant number of Dr. Wilson’s patients are retired military personnel who receive treatment ranging from preventive to chronic disease services.

In 1999, Dr. Wilson began participating in a quality improvement consortium, Accelerating Translation of Research into Practice (ATRIP), sponsored by the Agency for Healthcare Research and Quality and designed to maximize physicians’ capacity to provide high-quality care through the use of information technology. ATRIP provides routine performance data and suggests ways to improve the use of templates and other features in the EHR. It also provides quarterly practice reports showing data trends for measures based on clinical practice guidelines. Dr. Wilson receives practice reports on more than 80 measures in areas including diabetes, heart disease, stroke, asthma, infectious diseases, mental health, substance abuse, immunizations, and inappropriate prescribing for the elderly.

Dr. Wilson was using EHRs and developing his own quality measures when he joined the ATRIP consortium. One of the services provided by ATRIP is periodic site visits from staff members of the Medical University of South Carolina to help in implementing national practice guidelines. Assistance is provided in the development of templates, measurement structures, and patient information handouts. Computer hardware and software support is purchased locally on a contractual basis.

In 1994, Dr. Wilson purchased an electronic medical record system, including hardware and software, for $24,000. He has received periodic upgrades to the software. In the last 6 months, he has spent $20,000 to upgrade the server and three workstations and to purchase patient education software. Software support costs approximately $350 per month. Local computer specialists charge $60 per hour, for an average total cost of $4,000 per year. A part-time employee loads data on blood chemistries and scans incoming mail, radiology reports, and paper forms.

BOX C-3
Key Lessons Learned from an Internal Medicine Solo Practice

  • A solo physician can successfully introduce electronic medical records and a quality-of-care program.

  • Introducing electronic medical records is not simple and requires perseverance.

  • Gaining buy-in from staff and giving them routine feedback and encouragement are vital.

  • The physician and staff must remain very flexible in the face of unexpected technical problems.

  • A learning network such as ATRIP permits expansion of the capabilities of electronic medical records and performance measurement, allowing serial analysis and comparison with national benchmarks.

  • A consortium with similar practice guidelines and goals provides support to sustain enthusiasm.

  • Risk management is an important benefit of electronic medical records.
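The recurring technology costs quoted above can be combined into a rough annual figure. This is an illustrative calculation under the stated assumptions (monthly support fee plus the reported annual specialist total), not a figure from the report:

```python
# Rough annual recurring IT cost for the solo practice, combining the
# figures quoted in the text (illustrative arithmetic only).
software_support_monthly = 350   # software support, dollars per month
specialist_annual = 4_000        # local computer specialists, dollars per year
annual_recurring = software_support_monthly * 12 + specialist_annual
print(f"${annual_recurring:,} per year")   # → $8,200 per year
```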

Performance improvement has been demonstrated in Dr. Wilson’s practice among patients with diabetes and cardiovascular disease. For example, the percentage of diabetic patients with hemoglobin A1c below 7 increased from 20 percent in 2002 to 55 percent in 2003. The percentage of patients with coronary heart disease receiving lipid-lowering medications increased from 66 percent in 2002 to 83 percent in 2003. The percentage of patients with coronary heart disease whose LDL cholesterol was measured increased from 42 percent in 1999 to 70 percent in 2000. The percentage of patients in the general population (those without a specific disease) with cholesterol measured in the past 5 years increased from 50 percent in 1999 to 65 percent in 2000. Key lessons learned from Dr. Wilson’s solo practice are summarized in Box C-3.

Case Study 3: Prime Care Family Practice

Prime Care Family Practice is a small internal medicine clinic located in Clinton, Oklahoma, that serves approximately 4,500 patients annually. The practice consists of one internist, a licensed practical nurse, and a medical technician. To improve its efficiency and track its patients’ chronic conditions, the practice adopted EHRs 5 years ago. Prime Care works voluntarily with its state Quality Improvement Organization (QIO), the Oklahoma Foundation for Medical Quality, to improve patient care.

Six years ago, Prime Care began its involvement in quality improvement by initiating data collection from paper-based records to qualify for a recognition program in diabetes care. Currently, the practice routinely submits Medicare administrative claims data tracking measures in such areas as diabetes, mammography, adult immunizations, and cardiovascular disease. These performance data are collected annually by the Centers for Medicare and Medicaid Services and subsequently reported back to Prime Care by the state QIO. The QIO also helps Prime Care choose quality areas for improvement.

In addition, the Oklahoma Foundation for Medical Quality offers technical support to Prime Care through educational programs designed to improve internal quality by increasing work efficiency and to help the practice treat its diabetic patients more effectively. For example, all staff members at Prime Care attended several diabetes education programs with the QIO. The QIO does not charge Prime Care for technical or educational assistance designed to help improve work efficiency or diabetes care.

Prime Care has invested $30,000 in software and hardware over the past 3.5 years and pays a $350 per month maintenance charge for its Web-based software. The adoption of EHRs, along with the use of a wireless tablet computer, has saved Prime Care’s internist 4 to 5 hours per day in documentation time. Additional costs include staff salaries and the hiring of part-time staff, such as college students, to scan laboratory reports into the medical record so they can be read electronically.

Performance improvement has been demonstrated by Prime Care in the management of diabetes. Examples of improvement in the practice’s diabetes measures from September 2003 to August 2004 include hemoglobin A1c (12 percent), complete lipid testing in-house (36 percent), eye consults in chart (44 percent), tobacco cessation counseling (44 percent), patient self-management (54 percent), and administration of pneumonia vaccines (61 percent). Additionally, it was found that only 8 percent of eligible patients had been referred for a mammogram. To address this problem, the practice scheduled times for its patients to receive mammograms at the local hospital every Friday, resulting in a 100 percent referral rate.

The practice is expanding its quality improvement efforts based on its experience with diabetes to address other chronic care areas, such as pain management and depression, in collaboration with other organizations. Prime Care’s use of EHRs has led to an increase in revenue due to increased work efficiency. The practice received the Oklahoma Outpatient Quality Award for 2 consecutive years from Oklahoma’s QIO, and earned dual recognition for diabetes and heart/stroke care from the National Committee for Quality Assurance. Key lessons learned from Prime Care Family Practice are summarized in Box C-4.

BOX C-4
Key Lessons Learned from Prime Care Family Practice

  • Incorporating technology into a practice, regardless of its size, can increase the practice’s revenue, patient satisfaction, and employee satisfaction and performance.

  • Frustration should be expected when initiating implementation of EHRs, but they are the only effective approach to chronic disease management in the long run.

  • Staff should be educated in how to help manage patients with chronic disease through conferences or other educational programs to reduce provider burden.

  • The investment is worthwhile because paper chart expenses run higher than those incurred using EHRs, and an increase in practice revenues may result.

  • There is no perfect system, so practices should expect to learn from mistakes.


Case Study 4: Rochester Individual Practice Association

Founded in 1977, Rochester Individual Practice Association (RIPA) is a large nonprofit physician organization in Rochester, New York, that contracts with managed care companies to provide professional medical services. RIPA’s membership of group practices and individual clinicians represents approximately 3,000 practitioners, including 900 primary care providers, across more than 20 specialties. Currently, RIPA contracts with Excellus Blue Cross Blue Shield and serves 300,000 Blue Choice enrollees for acute and chronic conditions.

In 2002, RIPA created an individual physician profiling program, the Value of Care Plan, which reports performance at the individual provider level three times a year. Data are collected in three areas of measurement and weighted as follows: patient satisfaction (20 percent); quality of care, comparing practice patterns with recommended care (40 percent); and efficiency (40 percent). Measures are collected using administrative data, and validation testing has shown them to be 92–95 percent accurate. Measures are based on communitywide guidelines established by the Rochester Health Commission, a nonprofit community-based organization representing all insurers, physician organizations, large employers, and hospital systems in Rochester.
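The 20/40/40 weighting described above amounts to a weighted average of three domain scores. A sketch of that arithmetic follows; the 0–100 scaling of each domain is an assumption for illustration, since RIPA's actual scoring formula is not given in this appendix.

```python
# Domain weights from the Value of Care Plan as described in the text:
# patient satisfaction 20%, quality of care 40%, efficiency 40%.
WEIGHTS = {"satisfaction": 0.20, "quality": 0.40, "efficiency": 0.40}

def composite_score(domain_scores):
    """Weighted average of the three domain scores, each on a 0-100 scale
    (the scale itself is an assumption, not RIPA's published method)."""
    assert set(domain_scores) == set(WEIGHTS), "need exactly the three weighted domains"
    return sum(WEIGHTS[d] * domain_scores[d] for d in WEIGHTS)

# A provider strong on satisfaction but weak on efficiency:
print(round(composite_score({"satisfaction": 90.0, "quality": 80.0,
                             "efficiency": 70.0}), 2))  # prints 78.0
```

Because quality and efficiency each carry twice the weight of satisfaction, a provider cannot compensate for poor clinical or cost performance with high satisfaction scores alone.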

As a part of RIPA’s profiling system, registry data are available for family medicine practitioners, obstetricians, and cardiologists with patients who have coronary artery disease, diabetes mellitus, or asthma, as well as those eligible for mammography. Each provider receives the following registry data: the rate of patient adherence to expected care, the costs of patient care, comparative data against the specialty average, and a target rate set by communitywide guidelines.

Technical support for RIPA, encompassing information technology and data analysis, is provided by Excellus Health Plans. RIPA and Excellus medical directors evaluate and propose measures, analyze variation patterns, and educate and meet with practitioners. Provider buy-in was obtained through the program’s explicit goal of reducing underuse, misuse, and overuse, moving toward a more balanced, data-driven incentive system.

The estimated annual cost to RIPA and Excellus for supporting the profiling program is approximately $1.2 million, which includes staff costs. Additional expenses are accrued for time spent correcting the patient-specific data and for developing and implementing improvement programs. The overall cost is $0.33 per member per month (PMPM).

BOX C-5
Key Lessons Learned from Rochester Individual Practice Association

  • Employ a process for introducing measures that engages practitioners in creating and reporting measures that make clinical sense from the start.

  • Deliver understandable reports.

  • Anticipate, solicit, and address practitioner concerns.

  • Set realistic targets for evidence-based measures.

  • Make measure specifications available.

  • Develop the performance measurement program with the advice and guidance of multispecialty physician committees.

  • Make data issues actionable by developing tools to identify and address unnecessary variations, and do not assume that outliers are poor performers.

  • Incorporate an appeal process in pay-for-performance programs.

  • Make sure the plan and payer executives have a long-term commitment to the program.

  • Do not rush a program: test measures for reliability and accuracy, educate practitioners about what is expected, and evaluate the reporting of results. Try to introduce measures over a year.

RIPA demonstrates improvement in a practitioner’s performance that ultimately benefits the entire practice. For example, an ophthalmologist requested data to improve his efficiency index. The efficiency index is the ratio of actual episode costs to the specialty average episode costs for a given case mix; an episode is the cluster of medical services received by a given patient for a particular condition. The ophthalmologist switched his prescriptions to generics, reducing his efficiency index by 10 percent and yielding a savings of $90,000. As a result, in 2003 RIPA provided similar data and counsel to more than 50 practitioners. By 2005, this approach had saved the plan $1.4 million.
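The efficiency-index arithmetic is straightforward: divide a practitioner's average episode cost by the specialty average for the same case mix. The sketch below uses invented dollar figures; only the 10 percent reduction comes from the case.

```python
def efficiency_index(avg_actual_episode_cost, specialty_avg_episode_cost):
    """Ratio of a practitioner's average episode cost to the specialty average
    for the same case mix: 1.0 is exactly average, above 1.0 is costlier."""
    return avg_actual_episode_cost / specialty_avg_episode_cost

# Hypothetical figures: episodes averaging $550 against a $500 specialty mean,
# falling to $495 after the switch to generic prescriptions.
before = efficiency_index(550.0, 500.0)  # 1.10
after = efficiency_index(495.0, 500.0)   # 0.99
reduction = (before - after) / before
print(round(reduction, 2))  # prints 0.1, i.e., a 10 percent drop
```

The savings then follow from multiplying the cost reduction per episode by episode volume, which is how a modest index change can translate into a $90,000 figure.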

RIPA provides financial rewards to physicians for improved patient satisfaction, quality of care, and efficiency. These rewards, totaling $15 million, are distributed to RIPA providers each year and equal $4.00 PMPM. A busy internist may receive additional performance-based payments of $5,000–$15,000. The data are reported privately to each provider to improve care and are not publicly reported. Key lessons learned from RIPA are summarized in Box C-5.

Case Study 5: GreenField Health

Founded in 2001, GreenField Health in Portland, Oregon, is a small primary care practice with four internists serving 1,600 patients. It provides care to adults with all levels of health care needs, from preventive care to care for chronic conditions. GreenField’s practice is divided into two main functions: (1) serving its patient base, and (2) participating in research and development on the design of medical practice systems, with a focus on new ways of interacting with and delivering services to patients. Currently, GreenField incorporates performance measurement as a part of its routine clinical practice to improve patient care.

Data for performance measures are generated from claims data linked to EHRs. GreenField has also designed a large registry in Microsoft Access, separate from the EHRs, covering 10 distinct diseases or preventive screenings. Registry performance measures are generated and collected at the practice level, with quarterly reports reflecting evidence-based guidelines provided to each physician. For example, a report provides information on a given diabetic patient for the following performance measures: hemoglobin A1c, LDL cholesterol, blood pressure, eye exam, foot exam, and urine microalbumin.
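A quarterly report of this kind reduces to a per-patient check of when each measure was last performed. The sketch below illustrates the logic; the field names, and the one-year lookback window, are assumptions rather than details of GreenField's Access registry.

```python
from datetime import date, timedelta

# The six diabetes measures named in the text, keyed by invented field names.
DIABETES_MEASURES = ["hba1c", "ldl_cholesterol", "blood_pressure",
                     "eye_exam", "foot_exam", "urine_microalbumin"]

def overdue_measures(last_done, today, max_age_days=365):
    """Return the measures with no recorded result inside the lookback window.
    last_done maps measure name -> date of the most recent result (missing or
    None means the measure was never performed)."""
    cutoff = today - timedelta(days=max_age_days)
    return [m for m in DIABETES_MEASURES
            if last_done.get(m) is None or last_done[m] < cutoff]

# A patient with a recent A1c but a stale lipid panel and nothing else on file:
patient = {"hba1c": date(2005, 11, 1), "ldl_cholesterol": date(2004, 2, 1)}
print(overdue_measures(patient, today=date(2006, 1, 15)))
# everything except hba1c is overdue
```

Running this check for every registry patient each quarter yields exactly the kind of per-patient, per-measure report the text describes.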

One of GreenField’s physicians provides technical support and also serves as information technology director for performance measurement. Approximately 3 weeks was required for him to develop the current performance measurement system. Additionally, this physician spends 2 to 3 hours per week maintaining the system and producing data reports. Provider support for quality and performance improvement was built into the recruitment process, which favored providers experienced in quality and performance measurement practices. Since 70 percent of GreenField’s patients have access to e-mail accounts, a secure e-mail reminder is generated automatically from the registry; for example, a patient receives a reminder to schedule a hemoglobin A1c test or an overdue eye exam.

These technical functions entail up-front costs, as does the reporting of results to support quality improvement efforts once data have been collected. The estimated cost for the entire system approaches $40,000 over the past 3 years. As a result of the system’s use, the practice’s work efficiency has increased: 80 percent of patients can now be contacted by phone or e-mail, and only 20 percent of patient contacts require an office visit, the most time-intensive form of service.

Trend data produced by GreenField’s patient registry document the results of internal quality improvement efforts that occurred from March 2004 to January 2005. For example, there was an estimated average decrease of 33 percent in the rate of diabetic patients with end-stage renal disease, a complication of uncontrolled diabetes mellitus. The average LDL cholesterol count for diabetics decreased approximately 10 percent from January 2004 to January 2005. Other diabetes quality measures showed similar positive trends. Key lessons learned from GreenField Health are summarized in Box C-6.


BOX C-6
Key Lessons Learned from GreenField Health

  • Develop data systems even though they are time-consuming and expensive because the technology makes quality improvement feasible.

  • Use internal data systems to collect and manage data with integrity, ensuring that providers’ performance information is valid.

  • Do not expect that it will be easy to use data for quality improvement in practice. Having an infrastructure in place is necessary to improve identified deficiencies in care.

  • Accept that performance will never be perfect.

Case Study 6: Community Medicine Associates

Community Medicine Associates (CMA) is a medium-sized primary care practice located in San Antonio, Texas, in which approximately 60–70 percent of the patient population is uninsured and lower-income. The practice consists of 33 primary care physicians and 11 midlevel nurse practitioners and physician assistants who provide care to more than 180,000 patients. Staff are hired and employed by the local Bexar County hospital district, which is connected to the larger university health system. CMA’s mission is to care for county residents regardless of income level or insurance status.

CMA collects performance measurement data based on measures and criteria specified by the U.S. Preventive Services Task Force (USPSTF), a panel of experts in primary and preventive care that systematically reviews and develops recommendations for clinical preventive services.1 CMA’s preventive care and quality review committees convene annually to review and set performance measurement criteria guided by USPSTF practice goals. All measures are based on chart review, with the exception of adult immunizations, for which data are collected by a statewide registry. Each quarter, five quality measures—influenza vaccination, hemoglobin A1c, systolic blood pressure, foot exam, and patient satisfaction—are reported to each physician group.

Technical assistance, such as that needed for changes and updates to CMA’s registry database (created in Microsoft Access), is provided by a technician employed by the university health system. Chart abstraction is performed by a nurse who serves as the practice’s quality assurance coordinator. Physician support for performance measurement was greatly enhanced by CMA’s quality incentive program. Bonuses involve a weighting scheme whereby a provider group receives a score for each quality indicator, ranging from 2 (highest level of care) to −2 (worst level of care).

1  U.S. Preventive Services Task Force. 1996. Guide to Clinical Preventive Services. Baltimore, MD: Williams & Wilkins.

CMA’s estimated cost to implement performance measurement by chart review totals $14,330 per year. The quality assurance coordinator examines charts on a quarterly basis for more than 30 providers, reviewing a minimum of 15 charts per quality indicator. The overall cost for data collection is $2.76 PMPM, which encompasses staff salaries (including the cost of the medical director’s time), database maintenance, and support costs.
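The minimum chart-review workload behind these costs can be estimated directly from the sampling rule: five quarterly indicators at a minimum of 15 charts each. The per-chart abstraction time in the sketch below is an assumption for illustration, not a figure from the case.

```python
def charts_reviewed_per_year(indicators, min_charts_per_indicator, quarters=4):
    """Minimum annual chart-review volume implied by a quarterly sampling rule."""
    return indicators * min_charts_per_indicator * quarters

# Five quarterly indicators, minimum 15 charts each:
annual_charts = charts_reviewed_per_year(5, 15)
print(annual_charts)  # prints 300

# At an assumed 10 minutes of abstraction per chart, that is 50 hours per year:
print(annual_charts * 10 / 60)  # prints 50.0
```

Arithmetic like this helps explain why chart-review-based measurement, unlike claims-based measurement, carries a visible annual staffing cost for even a midsized practice.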

Table C-4 shows CMA’s improvement on performance measures for patients with heart disease and diabetes mellitus. For example, patients with congestive heart failure who were prescribed an ACE inhibitor during 2002–2004 increased by 10 percent. Similarly, the percentage of patients with coronary artery disease who were prescribed aspirin increased by 10 percent from 2002 to 2004. For diabetic patients, annual hemoglobin A1c testing increased 6 percent from 2002 to 2004, while annual microalbuminuria testing increased by 13 percent over the same period. The greatest increase in performance for diabetes patients was a 32 percent increase in those having LDL cholesterol levels below 100 from 2002 to 2004.

TABLE C-4 Community Medicine Associates’ Illustrative Improvement in Performance Measures for Patients with Heart Disease and Diabetes Mellitus

Measure                                                                2002 (%)  2003 (%)  2004 (%)
Patients with congestive heart failure prescribed ACE inhibitor            87        97        97
Patients with coronary artery disease prescribed aspirin                   74        74        84
Patients prescribed beta-blocker after myocardial infarction               81        79       100
Diabetes mellitus patients with annual hemoglobin A1c test                 91        91        97
Diabetes mellitus patients receiving annual microalbuminuria testing       71        78        84
Diabetes mellitus patients with LDL cholesterol below 100                  58        65        90

Bonuses earned can amount to up to $5,000 per quarter, or $20,000 per year. The bonus structure is based primarily on productivity and billing compliance (70 percent), which rests entirely on billing correctly and not overbilling patients for services. The remaining 30 percent of the bonus covers patient satisfaction (10 percent), quality indicators (10 percent), and unit cost-efficiency (10 percent). Key lessons learned from CMA are summarized in Box C-7.

BOX C-7
Key Lessons Learned from Community Medicine Associates

  • Be sure to properly align economic incentives with performance measurement to motivate providers to aim for quality improvement.

  • Set incentives at a high enough level to engage providers in the program. For example, the CMA bonus structure allows providers to increase their salary by up to 15 percent per year.
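CMA's 70/10/10/10 bonus split can be expressed as a weighted sum capped at the quarterly maximum. In the sketch below, the weights and the $5,000 cap come from the text; scoring each component on a 0-to-1 scale is an assumption for illustration.

```python
BONUS_CAP_PER_QUARTER = 5_000.0  # dollars, from the text

# Component weights from the text: billing compliance 70 percent, then
# 10 percent each for satisfaction, quality indicators, and cost-efficiency.
WEIGHTS = {"billing": 0.70, "satisfaction": 0.10,
           "quality": 0.10, "efficiency": 0.10}

def quarterly_bonus(scores):
    """scores: each component on a 0.0-1.0 scale (1.0 = full credit).
    Missing components earn nothing. The 0-1 scaling is an assumption."""
    fraction = sum(WEIGHTS[k] * scores.get(k, 0.0) for k in WEIGHTS)
    return round(BONUS_CAP_PER_QUARTER * fraction, 2)

# Full billing compliance but mixed performance elsewhere:
print(quarterly_bonus({"billing": 1.0, "satisfaction": 0.8,
                       "quality": 0.9, "efficiency": 0.5}))  # prints 4600.0
```

Because billing compliance carries 70 percent of the weight, a provider's quality scores can move the bonus only within a fairly narrow band, which is worth noting when judging how strongly such a scheme rewards quality per se.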

Case Study 7: North Texas Medical Group

North Texas Medical Group is a small primary care clinic located in Plano, Texas, serving approximately 14,000 patients annually. The practice consists of six providers who are board certified in internal medicine and family practice. Additional support staff include medical assistants, a physician assistant, and a nurse practitioner. North Texas Medical Group serves its patients’ health needs, from preventive to chronic disease care.

Three years ago, the practice adopted EHRs for billing compliance as well as quality improvement purposes. The practice relies heavily on clinical guidelines for measuring its performance; when standard measures are not available, it relies on clinical judgment regarding best treatment practices for its patient population. Data are collected using EHRs for the following measures: diabetes, hypertension, cholesterol, and use of the high-risk medication Coumadin (warfarin). Providers in the group receive monthly feedback on their performance on these measures.

North Texas Medical Group designed its practice around EHRs and hired providers who were comfortable working in a technology-driven practice. Two full-time information technology staff were hired to manage the practice’s databases, thus allowing providers to focus on patient care as opposed to technical issues. These technicians input laboratory data into database elements that can be read by EHRs.

The total investment made by the practice to date is approximately $250,000 for hardware and software. The estimated annual cost for the two technicians is $110,000. Additional costs include software support ($40,000/year) and providers’ time spent designing reports and undertaking quality improvement efforts.

BOX C-8
Key Lessons Learned from North Texas Medical Group

  • Incorporating technology into practice helps practitioners provide better care to patients.

  • There is more control over patient data when the data are collected within the practice, instead of by outside sources. This control allows greater focus on quality improvement efforts specific to the care of patients within the practice.

  • There may not be an economic balance between initial investments in technology and the consequences of not implementing EHRs; however, providers within the practice will be satisfied that they are providing the best possible care.

An innovative use of performance measurement by the North Texas Medical Group is improved monitoring of blood testing for patients prescribed Coumadin. The measure is the number of patients prescribed Coumadin without blood test data in the preceding 30 days. Patients taking Coumadin must be monitored closely because improper dosing can prove fatal. Using its EHRs, the practice was able to demonstrate its ability to monitor the drug and achieve improvement on the measure, from 5 of 48 patients prescribed Coumadin without blood tests in 2002 to 0 of 71 patients in 2004. Additionally, the practice achieved a 6 percent improvement from 2002 to 2004 in the number of patients diagnosed with diabetes, hypertension, dyslipidemia, or coronary disease who had their cholesterol measured in the preceding year.
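This measure is effectively a query over the EHR: patients on the drug with no blood test result in the last 30 days. A sketch of that query follows; the roster layout and patient identifiers are invented for illustration.

```python
from datetime import date, timedelta

def unmonitored_coumadin_patients(roster, today, window_days=30):
    """roster: iterable of (patient_id, on_coumadin, last_blood_test_date_or_None).
    Returns ids of Coumadin patients with no blood test inside the window."""
    cutoff = today - timedelta(days=window_days)
    return [pid for pid, on_coumadin, last_test in roster
            if on_coumadin and (last_test is None or last_test < cutoff)]

roster = [
    ("pt-01", True, date(2006, 1, 10)),  # tested within the window
    ("pt-02", True, date(2005, 11, 2)),  # overdue
    ("pt-03", True, None),               # never tested
    ("pt-04", False, None),              # not on Coumadin, excluded
]
print(unmonitored_coumadin_patients(roster, today=date(2006, 1, 20)))
# prints ['pt-02', 'pt-03']
```

Run routinely, a query like this turns a patient-safety rule into a worklist the staff can clear, which is how the practice drove the measure from 5 of 48 patients to 0 of 71.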

North Texas Medical Group does not publicly report its data and uses its performance measurement activities exclusively for internal quality improvement purposes. Although pay for performance is not currently a part of the practice, a small disincentive is applied to the physician with the lowest percentage of treatment goals met each month: that physician takes the other providers out to lunch. Key lessons learned from the North Texas Medical Group are summarized in Box C-8.

CASE STUDY QUESTIONS

The Institute of Medicine Performance Measures Subcommittee is currently seeking examples of “real-life” case studies from entities that have successfully implemented performance measurement programs in physician practices. We are particularly interested in learning how more cumbersome measures (such as those requiring either chart review or registries) have been implemented in ways that are acceptable to physicians while ensuring complete and valid data collection. We want to hear how you became successful and at what cost. Your organization has been identified as an innovator in this field. If you are willing, we respectfully request a succinct narrative of your initiative (no more than 2 pages, single spaced) describing your implementation process and specifically addressing as many of the following questions as you can without major effort.

  1. What kinds of data have you used for your performance measures in physician practices (e.g., claims data, chart review, paper-based registries, computer-based registries, full EHRs)?

  2. Could you give a brief “real-life” description of how one or more practices adopted chart-review-based measures? Paper-based registries for reporting measures?

  3. How was the data collection system validated (e.g., field testing, provider engagement, a feedback loop for refinement)?

  4. What was the level of technical support that needed to be provided to physicians’ offices?

  5. How did you obtain provider buy-in (i.e., cultural and attitudinal change)?

  6. What was the cost to individual practices (particularly smaller practices with fewer than 5 physicians) of implementing the performance measures that required either chart review or in-office registries?

  7. What was the cost estimate to your organization as a whole for data collection? (Ideally this could be estimated “per-member per month” or per year.)

  8. Can you provide illustrative examples of observed improvement?

  9. Has improved performance been linked to any payment incentives? If yes, how?

  10. Are data currently publicly reported?

  11. Overall, what were your key “lessons learned”?
