8
Looking Ahead to the Implementation of Quality-of-Cancer-Care Measures
In this report, the Institute of Medicine (IOM) committee has recommended several types of measures to monitor cancer outcomes in Georgia. The committee’s main focus has been on the selection of evidence-based quality indicators that can help the Georgia Cancer Coalition (GCC) direct and evaluate progress in cancer prevention, early detection, diagnosis, and treatment (including palliative and end-of-life care). GCC will now be faced with the challenging job of organizing a system to implement the quality-of-cancer-care measures identified in previous chapters. Precisely how implementation should best occur is a question well beyond the scope of this report, but implementing the recommended measures will be a critically important undertaking.
In this chapter, the IOM committee offers advice about some principles and approaches to implementing the complex set of cancer measures. Specifically, it presents six principles to guide GCC:
- The use of quality-of-cancer-care measures has a dual purpose: evaluating progress and motivating change.
- GCC should develop a cancer surveillance, monitoring, and evaluation plan that incorporates a strategy for promotion and dissemination.
- Georgia’s quality-of-cancer-care monitoring system should be transparent and very public.
- Monitoring of the quality-of-cancer-care indicators should be managed by the highest level of GCC.
- GCC’s quality-of-cancer-care infrastructure should build on Georgia’s existing measurement and reporting systems.
- Credibility of the system will be paramount: collect, interpret, and present the results carefully.
Each of these principles is discussed further below. In addition, the committee recommends that GCC look to the growing literature examining lessons learned from numerous quality measurement and improvement projects around the nation (California HealthCare Foundation, 2001; Hermann et al., 2002; Lorenzi, 2003; Mills and Weeks, 2004; McGlynn, 2004; Kanouse et al., 2004; Landon et al., 2004; Murphy-Smith et al., 2004; Bradley et al., 2004a; Bradley et al., 2004b).
1. The use of quality-of-cancer-care measures has a dual purpose: evaluating progress and motivating change.
GCC’s quality-of-cancer-care monitoring system will both evaluate progress in improving outcomes and fuel interest in investigating the explanations behind what is observed. The emphasis on outcome assessment should not overshadow the equally important purpose of motivating change. Fear that progress will be slow may discourage investment in monitoring. However, the measurement process itself can raise awareness in ways that lead, even indirectly, to improvements beyond the specific indicator being assessed. There need not be an expectation, therefore, that every indicator will improve with each evaluation. In other words, surveillance is not just a “report card” on how much progress has been made. Rather, surveillance should be viewed as a process of push, pull, give, and take that will serve to enlighten many interested parties, across many sectors, about the needs and opportunities for improving cancer care in Georgia.
2. GCC should develop a cancer surveillance, monitoring, and evaluation plan that incorporates a strategy for promotion and dissemination.
That plan should be a blueprint for a cancer surveillance, monitoring, and evaluation unit, with a strategic plan to assure both oversight and independence. See Figure 8-1 for a proposed schematic of a GCC Quality Monitoring, Surveillance, and Evaluation Unit. This unit will need long-term, sustainable funding support. The unit will require a full-time director who understands data systems, epidemiology, and health communications, as well as an adequate support staff. The unit will need to work closely with other surveillance systems, such as those of the State Health Department, the cancer registries, and medical data systems, whose operators should be full partners in the plan. The unit could be housed within an organization or a university, but to assure credibility it must remain an independent unit answerable to GCC.
Outputs of the monitoring system should be well defined by the plan. In most instances the outputs will be annual reports. All outputs will be public reports, with the exception of some provider-specific data that are best kept confidential to the providers and health care systems involved.
3. Georgia’s quality-of-cancer-care monitoring system should be transparent and very public.
The assessment process needs to be clearly described and well understood by GCC collaborators, and the general public should come to expect regular annual reports on the state of cancer in Georgia. The experience of several quality monitoring projects around the country suggests that credible reporting and data feedback will be key to achieving “buy-in” from clinical leaders, medical groups, health plans, business leaders, consumers, and other stakeholders (California HealthCare Foundation, 2001; Lorenzi, 2003; Kanouse et al., 2004; Mills and Weeks, 2004; Bradley et al., 2004a, 2004b). The National Cancer Institute’s (NCI’s) annual cancer progress report and the Cancer Control PLANET are two models worth considering.1 The collateral improvements that can occur with monitoring are best realized through a very open and public process of monitoring and surveillance. The general public may not act directly on the reports, but the reports’ presence in the public domain could stimulate quality improvement among providers. Surveillance that is not understood, not respected, or too private cannot be effective in fostering change. Findings from the monitoring system should be disseminated on a regular schedule and widely publicized, though some of the information in the system (e.g., provider-specific findings) will need to remain confidential.
Georgia’s quality-of-cancer-care monitoring system will yield an increasingly rich and unique dataset. As it will be beyond GCC’s means to fully exploit this new treasure trove of information, the state should consider making the data available to researchers for a wide range of health services and scientific research. Outside funding sources, including the federal government, might be willing to support such an effort as it is likely to be a valued resource for the nation. GCC should establish a standing scientific review committee to develop and oversee the implementation of a carefully considered policy for public access to Georgia’s quality-of-cancer-care data. There are numerous models for such an activity; see, for example, the national Consumer Assessment of Health Plans (CAHPS) benchmarking database, New York state’s Statewide Planning and Research Cooperative System (SPARCS), and the American College of Surgeons’ National Cancer Database (NCDB), among others.2
4. Monitoring of the quality-of-cancer-care indicators should be managed by the highest level of GCC.
Monitoring and surveillance should not be regarded as a secondary activity to be delegated to a low-level unit. Although various elements of the monitoring system will likely be implemented by organizations or entities that are themselves independent of GCC, the overall system should be managed by GCC as a high-profile public activity.
5. GCC’s quality-of-cancer-care infrastructure should build on Georgia’s existing measurement and reporting systems.
The building blocks of the system will include routinely collected data from the central cancer registries, the state Behavioral Risk Factor Surveillance System (BRFSS), the linked Surveillance, Epidemiology, and End Results (SEER)/Medicare dataset, and state vital records systems. Parties principally responsible for collecting those data will be the current operators of those systems. In some cases, special data elements might be included (e.g., BRFSS special questions or oversampling of special populations), or special contractual arrangements may be needed to assure data timeliness and completeness. However, in most cases the data inputs will be routinely collected data, compiled once per year. Other data inputs will be specifically constructed for the monitoring system, such as the data from hospital records and the special samples of clinical data collected for the treatment and outcomes monitoring elements that will be specially designed for this system. Those special inputs might well be collected more frequently than annually.
1 See these websites for further information: NCI annual progress reports, www.cancer.gov/nci-annual-report/; NCI Cancer Control PLANET, cancercontrolplanet.cancer.gov/index.html.
2 See these websites for further information: CAHPS, www.ncbd.cahps.org/Home/index.asp; NCDB, www.facs.org/cancer/ncdb/index.html; SPARCS, www.health.state.ny.us/nysdoh/sparcs/sparcs.htm.
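The once-per-year compilation step described above might look something like the following minimal sketch, which aggregates records into annual indicator rates by region. All field names, regions, and values here are invented for illustration; the actual data elements would come from the registries and surveys named above.

```python
# Hypothetical sketch of annual compilation of one quality indicator from
# routinely collected records. All field names and data are invented.
from collections import defaultdict

def annual_indicator_rates(records, year):
    """For one calendar year, compute the share of eligible cases that
    received the recommended care, grouped by region."""
    eligible = defaultdict(int)
    met = defaultdict(int)
    for r in records:
        if r["year"] != year or not r["eligible"]:
            continue
        eligible[r["region"]] += 1
        if r["received_recommended_care"]:
            met[r["region"]] += 1
    return {region: met[region] / n for region, n in eligible.items()}

# Invented example records standing in for registry data.
records = [
    {"year": 2004, "region": "Atlanta",  "eligible": True, "received_recommended_care": True},
    {"year": 2004, "region": "Atlanta",  "eligible": True, "received_recommended_care": False},
    {"year": 2004, "region": "Savannah", "eligible": True, "received_recommended_care": True},
    {"year": 2003, "region": "Atlanta",  "eligible": True, "received_recommended_care": True},
]
print(annual_indicator_rates(records, 2004))
# → {'Atlanta': 0.5, 'Savannah': 1.0}
```

A real pipeline would add the case-finding, eligibility, and data-completeness rules that the plan defines for each indicator, but the basic shape, filter to the reporting year, count eligible cases, and divide, is the same.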
6. Credibility of the system will be paramount: collect, interpret, and present results carefully.
Quality monitoring and surveillance will require careful interpretation and presentation. The data inputs and the results must be perceived as valid to motivate change. Some data sources will be more reliable and complete than others. Case-mix or risk adjustment of some measures will be needed to compare providers fairly. Furthermore, the measures should be periodically reviewed and updated as needed. It will be important to have a basis for considering new quality measures over time and retiring existing measures if they prove to be ineffective or no longer relevant.
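One common form of the case-mix adjustment mentioned above is indirect standardization: compare a provider's observed event count to the count expected if statewide rates applied to that provider's patient mix. The sketch below illustrates this with invented risk strata and rates; the report does not prescribe a particular adjustment method, so this is one plausible approach rather than the committee's.

```python
# Hypothetical illustration of indirect standardization (observed/expected
# ratio) for fair provider comparison. All strata, rates, and panel data
# below are invented for illustration.

def expected_events(patients, reference_rates):
    """Sum each patient's expected event probability, taken from
    reference (e.g., statewide) rates for the patient's risk stratum."""
    return sum(reference_rates[p["stratum"]] for p in patients)

def standardized_ratio(observed, patients, reference_rates):
    """Observed events divided by expected events; a ratio above 1.0
    means more events than the reference rates would predict for
    this provider's case mix."""
    return observed / expected_events(patients, reference_rates)

# Invented statewide reference event rates by risk stratum.
reference = {"low": 0.02, "medium": 0.10, "high": 0.30}

# A provider whose panel is mostly high-risk patients.
panel = [{"stratum": "high"}] * 8 + [{"stratum": "low"}] * 2

ratio = standardized_ratio(observed=2, patients=panel, reference_rates=reference)
# expected = 8 * 0.30 + 2 * 0.02 = 2.44, so ratio = 2 / 2.44 ≈ 0.82:
# slightly fewer events than this case mix would predict.
print(round(ratio, 2))
```

Without such adjustment, a provider treating sicker patients would look worse on raw event counts even while performing better than expected, which is exactly the kind of misleading comparison that would undermine the system's credibility.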
SUMMARY
In this chapter, the IOM committee has offered six principles to guide GCC as it takes on the daunting challenge of implementing a quality-of-cancer-care monitoring system in Georgia. In addition, the committee has recommended that GCC look to the growing literature on lessons learned from numerous quality measurement and improvement projects around the country as it undertakes the challenge of implementing the quality-of-cancer-care measures recommended in this report (California HealthCare Foundation, 2001; Hermann et al., 2002; Lorenzi, 2003; Mills and Weeks, 2004; McGlynn, 2004; Kanouse et al., 2004; Landon et al., 2004; Bradley et al., 2004a, 2004b).
REFERENCES
ACoS (American College of Surgeons). 2004. National Cancer Database. [Online] Available: http://www.facs.org/cancer/ncdb/index.html [accessed December 20, 2004].
Bradley EH, Holmboe ES, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM. 2004a. Data feedback efforts in quality improvement: lessons learned from US hospitals. Qual Saf Health Care. 13(1): 26-31.
Bradley EH, Webster TR, Baker D, Schlesinger M, Inouye SK, Barth MC, Lapane KL, Lipson D, Stone R, Koren MJ. 2004b. Translating Research into Practice: Speeding the Adoption of Innovative Health Care Programs. Issue Brief. New York: The Commonwealth Fund.
California HealthCare Foundation. 2001. Voices of Experience: Case Studies in Measurement and Public Reporting of Health Care Quality. Oakland, CA: California HealthCare Foundation.
Cancer Control PLANET. 2004. [Online] Available: http://cancercontrolplanet.cancer.gov/about.html [accessed December 20, 2004].
Hermann RC, Leff HS, Lagodmos G. 2002. Selecting Process Measures for Quality Improvement in Healthcare. Cambridge, MA: The Evaluation Center@HSRI.
Kanouse DE, Spranca M, Vaiana M. 2004. Reporting about health care quality: a guide to the galaxy. Health Promot Pract. 5(3): 222-31.
Landon BE, Wilson IB, McInnes K, Landrum MB, Hirschhorn L, Marsden PV, Gustafson D, Cleary PD. 2004. Effects of a quality improvement collaborative on the outcome of care of patients with HIV infection: the EQHIV study. Ann Intern Med. 140(11): 887-96.
Lorenzi NM. 2003. Strategies for Creating Successful Local Health Information Infrastructure Initiatives. Nashville, TN: Vanderbilt University.
McGlynn EA. 2004. Localize the remedy: community efforts can ameliorate poor quality of care. RAND Review. 28(2): 12-6.
Mills PD, Weeks WB. 2004. Characteristics of successful quality improvement teams: lessons from five collaborative projects in the VHA. Jt Comm J Qual Saf. 30(3): 152-62.
Murphy-Smith M, Meyer B, Hitt J, Taylor-Seehafer MA, Tyler DO. 2004. Put Prevention into Practice implementation model: translating practice into theory. J Public Health Manag Pract. 10(2): 109-15.
NCI (National Cancer Institute). 2003. The Nation’s Progress in Cancer Research: An Annual Report for 2003. [Online] Available: http://www.cancer.gov/nci-annual-report/ [accessed December 20, 2004].
New York State Department of Health. 2004. Statewide Planning and Research Cooperative System. [Online] Available: http://www.health.state.ny.us/nysdoh/sparcs/sparcs.htm [accessed December 20, 2004].
Westat. 2004. NCBD: The National CAHPS Benchmarking Database. [Online] Available: http://ncbd.cahps.org/Home/index.asp [accessed December 20, 2004].