13
CMS Oversight

CHAPTER SUMMARY

This chapter examines how the Centers for Medicare and Medicaid Services (CMS) oversees and manages the Quality Improvement Organization (QIO) program as a whole. First, the integration of the program into CMS's organizational structure is discussed, including the use of personnel who help with oversight. Next, communications, information technology, and data services are discussed both in the context of how they are used in the operations of the program and how they are used as a resource for management. Then, contract issues are presented, including how contracts are competed, awarded, implemented, and monitored. Finally, there is an examination of how CMS provides overall guidance to the Quality Improvement Organization program through strategic planning, policy decision making, coordination, and overall program evaluation.

ORGANIZATIONAL STRUCTURE OF QIO PROGRAM IN CMS

Oversight of the Quality Improvement Organization (QIO) program involves coordination of the efforts of multiple personnel in several offices within the Centers for Medicare and Medicaid Services (CMS), each of which has distinct roles. The administrative office of CMS, located in Baltimore, Maryland, is commonly referred to as the "Central Office." Two offices within CMS's Central Office share the responsibility for management of the QIO program: the Office of Clinical Standards and Quality, the "Program Office," and the Office of Acquisition and Grants Management, the "Contracts Office." Other groups have indirect roles in the management of the QIO program. The QIO and End-Stage Renal Disease Steering Committee manages the daily operations of the QIO program. The membership on the QIO and End-Stage Renal Disease Steering Committee comprises the Associate Regional Administrator for each of the four Regional
Offices affiliated with the QIO program and both the director and the deputy director of each of three groups within the Office of Clinical Standards and Quality: the Quality Improvement Group, the Information Systems Group, and the Quality Measurement and Health Assessment Group. The committee, currently chaired by the director of the Quality Improvement Group, meets weekly and primarily discusses operational issues (personal communication, J. V. Kelly, June 28, 2005).

Program Office

Overall responsibility for the QIO program lies in CMS's Office of Clinical Standards and Quality, with direct oversight provided by the Quality Improvement Group (Jost, 1991; CMS, 2004c) and with support provided by other groups within that office. The Program Office monitors the QIO program, coordinates with the Office of Internal Customer Support on financial matters, and creates and interprets policy related to the QIO program's operations. The office is divided into six groups, each of which may have one or more divisions:

· Quality Improvement Group,
· Quality Measurement and Health Assessment Group,
· Information Systems Group,
· Quality Coordination Team,
· Coverage and Analysis Group, and
· Clinical Standards Group.

In the Institute of Medicine (IOM) committee's web-based data collection tool, 52 QIOs rated the Program Office on several functions. Overall, the office received higher scores on "clarity" than on "timeliness" (Table 13.1). Concerns over clarity and timeliness also arose during the IOM committee and staff site visits. Four QIOs mentioned that the information that they receive is often ambiguous, and eight related frustration with the timeliness of access to information or data related to their tasks (referred to here as data lags). Data lags, however, may also be attributable to the measurement process, which is based on claims (this is discussed later in this chapter).
Contracts Office

Many groups contribute to the development of a QIO contract, including the Office of Clinical Standards and Quality, the Office of Acquisition and Grants Management, and Regional Office Divisions of Quality Improvement (CMS, 2004b). However, responsibility for the QIO contract ultimately rests with the Acquisition and Grants Group of the Office of
Acquisition and Grants Management.

TABLE 13.1 QIO Ratings of CMS Program Office

             Overall      Overall      Clarity of       Timeliness of
             Program      Support of   Information      Information
Ratings      Direction    QIO Work     on Core Tasks    on Core Tasks
Excellent      2            3            1                0
Good          33           33           26                8
Fair          15            9           22               30
Poor           2            7            3               14

NOTE: The data in the table represent the number of QIOs responding as indicated. Data are for a total of 52 QIOs.
SOURCE: IOM committee web-based data collection tool.

The Contracting Officer, a representative of the Acquisition and Grants Group, is the only person with the authority to release the contract or make modifications to it. The Contracting Officer oversees all contracts for the QIO program, and several contract specialists are each assigned to specific QIOs. As of June 2005, nine contract specialists were each assigned to work directly with between five and seven QIOs (personal communication, J. V. Kelly, June 30, 2005).

The QIOs expressed frustration with their interactions with the Contracts Office. During the site visits, two QIOs raised issues about conflicting messages between the Program and Contracts Offices. Additionally, at CMS's annual technical conference for the QIO program (QualityNet 2004), many QIO staff related difficulties with being asked to perform duties not specified within their contracts. They were asked to perform these duties by different sources, such as their Project Officer or Government Task Leader, or through a Transmittal of Policy System (TOPS) document (all of these are described later in this chapter). Although CMS presenters clarified that the Contracting Officer has the final say on required duties, the QIOs expressed frustration with conflicting messages from different individuals and groups at CMS (Hughes, 2004).

The QIOs rated the Contracts Office on many functions. Thirty-five of 52 QIOs stated that they had interaction with the Contracts Office only on an as-needed basis.
The majority of QIOs rated the Contracts Office as "good" or "fair" on all questions (Table 13.2).

Regional Offices

CMS has 10 Regional Offices around the country. In four of these Regional Offices (Boston, Dallas, Kansas City, and Seattle), CMS established
Divisions of Quality Improvement that act as liaisons between the QIOs and CMS's Central Office (Jost, 1991; CMS, 2004b).

TABLE 13.2 QIO Ratings of CMS Contracts Office

             Overall          Overall          Timeliness      Expertise/
             Clarity of       Timeliness of    of Contract     Understanding
Ratings      Communications   Communications   Modifications   of QIO Tasks
Excellent      6                6                3               4
Good          26               25               27              19
Fair          14               17               15              19
Poor           6                4                7              10

NOTE: The data in the table represent the number of QIOs responding as indicated. Data are for a total of 52 QIOs.
SOURCE: IOM committee web-based data collection tool.

The remaining six CMS Regional Offices do not have any direct responsibility for the QIO program. The four Regional Offices with Divisions of Quality Improvement (referred to in the QIO program as "Regional Offices") assist QIOs with technical issues on a daily basis by interpreting CMS policy, monitoring finances, and providing feedback.

The staff of the Divisions of Quality Improvement include an Associate Regional Administrator, Project Officers, and Scientific Officers. The Associate Regional Administrator oversees daily operations, including development and implementation of goals, participation in consortium meetings, maintenance of stakeholder relationships, and management of funds (CMS, 2004b). Before the 7th SOW, Divisions of Quality Improvement existed in all 10 CMS Regional Offices and were generally staffed only by Project Officers. As the program focus shifted toward quality improvement, oversight was condensed into the four Regional Offices mentioned above, as new skills were needed to parallel the skills needed at the QIO level. New staff included epidemiologists, clinicians, biostatisticians, data managers, and communications specialists (CMS, 2004b). Today, staffing at each Regional Office varies in terms of both the numbers of personnel and the skill sets of those personnel (CMS, 2004b).
CMS also divided the country into four consortiums that correlate with the four Regional Offices with Divisions of Quality Improvement. These consortiums (Northeast, Midwest, Southern, and Western) each include the one Regional Office with QIO oversight in that area and any other Regional Offices in that area that are not directly involved in the QIO program. The consortiums act to improve communications and share resources among the 10 Regional Offices and enhance consistency in the QIO program as a whole (CMS, 2004b).
Project Officers

Project Officers monitor technical aspects of the QIO core contract (CMS, 2004b). All Project Officers participate in a week-long basic training session, with some officers completing optional advanced Project Officer training or performance-based contracting training. Each QIO is assigned one Project Officer, but a single Project Officer works with multiple QIOs. The QIOs reported that they have frequent contacts with their Project Officers: half (26 of 52) reported weekly contact, and 92 percent (48 of 52) reported at least monthly contact (Figure 13.1).

FIGURE 13.1 Frequency of Project Officer contact with QIOs reported by 52 QIOs: weekly, 26; semi-monthly, 8; monthly, 14; as needed, 3; other, 1. The numbers represent the number of QIOs responding as indicated.
SOURCE: IOM committee web-based data collection tool.

The Project Officer provides direct technical assistance to each QIO, serves as the advocate for the QIO within CMS, and is an expert resource for the QIOs in terms of contract content and CMS policy. The Project Officers manage QIO contracts by monitoring the progress of the QIOs, acting as a direct liaison to the Contracting Officer at CMS, and participating in strategic planning. Monitoring activities include scheduled calls with individual QIOs and review of the data on the Dashboard section of CMS's intranet site (see below). Official monitoring visits are discussed in greater detail later in this chapter. The Project Officers also have communications and coordination responsibilities at both the local and the national levels. Table 13.3 shows the number of full-time Project Officers at each Regional
Office, as well as the total number of QIO contracts monitored in that region as of June 2004 (CMS, 2004b).

TABLE 13.3 Numbers of Project Officers and Contracts for Each CMS Regional Office

Regional       Number of          Number of       Average Number of Contracts
Office         Project Officers   QIO Contracts   per Project Officer
Boston           5                  16              3.2
Dallas           4.6                11              2.4
Kansas City      4                  13              3.25
Seattle          4                  13              3.25

SOURCE: CMS (2004b).

The QIOs rated the Project Officers on various functions. Overall, the Project Officers received high ratings in all areas, with the majority of QIOs rating their Project Officers as "excellent" or "good" in each area (Table 13.4).

TABLE 13.4 QIO Ratings of Project Officers

                                             Expertise/        Expertise/
             Clarity of      Timeliness      Understanding     Understanding
Rating       Responses^a     of Responses^a  of Review Tasks   of HCQIP Tasks
Excellent      34              36              25                21
Good           13              13              19                24
Fair            4               3               7                 6
Poor            1               0               1                 1

NOTE: The data in the table represent the number of QIOs responding as indicated. Data are for a total of 52 QIOs. HCQIP = Health Care Quality Improvement Program.
^a Responses to questions raised or issues posed by the QIO.
SOURCE: IOM committee web-based data collection tool.

Scientific Officers

The Scientific Officers support the Project Officers by providing scientific and clinical expertise (CMS, 2004b). Scientific Officers are not assigned to specific QIOs but, instead, assist all QIOs in the region covered by the Regional Office with specific technical needs. They also assist the
QIOs in other regions, if they are requested to do so. As of June 2004, the Boston Regional Office had five Scientific Officers on staff, and the three other Regional Offices each had four Scientific Officers (CMS, 2004b). The QIOs reported extremely variable interactions with the Scientific Officers (Figure 13.2).

FIGURE 13.2 Frequency of Scientific Officer contact with QIOs reported by 48 QIOs: monthly, 22; only as needed, 17; semi-weekly, 6; quarterly, 3. The numbers represent the number of QIOs responding as indicated.
SOURCE: IOM committee web-based data collection tool.

Scientific Officers evaluate measurement methodologies and surveys, analyze QIO data, review manuscripts, provide clinical expertise, and manage special studies (CMS, 2004b). Scientific Officers possess specific skills in areas such as statistics, epidemiology, clinical science (Medical Officer), and data management. Scientific Officers may complete any of the training sessions described for Project Officers, but they are not required to do so. Scientific Officers also participate in official monitoring visits, described later in this chapter. In addition to their basic duties, Scientific Officers often serve as Government Task Leaders (see below).

Table 13.5 shows the QIO ratings of Scientific Officers on a variety of functions. In general, QIOs rated Scientific Officers highly in all areas, with most QIOs providing "excellent" or "good" ratings for their Scientific Officers.

TABLE 13.5 QIO Ratings of Scientific Officers

             Clarity of      Timeliness       Timeliness of
Rating       Responses^a     of Responses^b   Manuscript Reviews^c
Excellent      17              17               18
Good           24              29               14
Fair            6               2                1
Poor            0               0                2

^a Data are for a total of 47 QIOs.
^b Data are for a total of 48 QIOs.
^c Data are for a total of 35 QIOs.
SOURCE: IOM committee web-based data collection tool.

Government Task Leaders

Each task of the QIO contract and each special study is assigned a single Government Task Leader to provide direct oversight. The Government Task Leader may be located in either the Regional or the Central Office. In the IOM committee telephone interviews, 11 of 20 QIO chief executive officers (CEOs) expressed problems with Government Task Leaders. Three CEOs, including two with QIO support center (QIOSC) contracts, specifically mentioned, unprompted, that many Government Task Leaders lack substantive expertise in their topic areas. Some of their comments were as follows:

· "What QIOSCs need to do the best job are exceptional CMS Government Task Leaders. They blend a knowledge of breaking research with pragmatism and good political instincts."
· "There should be better coordination among the Government Task Leaders at CMS. They tend to get siloed in their specialties and do not understand the scope of what QIOs are doing."
· "You can usually attribute the difference [in timeliness] to the relationship with the CMS Government Task Leader; if it is positive, you get things approved in a timely manner."

Difficult relationships with Government Task Leaders were echoed in interviews with staff from five organizations representing seven QIOSCs. All of them believed that the relationship often depended on the Government Task Leader's experience in the topic area. One staff member stated that the Government Task Leader used the QIOSC as an extension of his or her personal staff. Two staff members indicated that the rate of turnover of their Government Task Leaders was high and that their skills and experience with their assigned topic areas varied.
Full-Time Employees

As of September 2005, the full-time employee count for the QIO program was 131.95 (personal communication, J. V. Kelly, September 8, 2005). This includes all CMS employees who work on the core contract, special studies, or developmental work. Most employees (62 percent) work in one of the groups of the Office of Clinical Standards and Quality. The breakdown is presented in Table 13.6.

TABLE 13.6 Full-Time CMS Employees for the QIO Program

Area of CMS                                          FTE^a Count   Percentage of Total
Regional Offices (total)                               42.45         32
  Dallas                                                9.2
  Boston                                               11.25
  Seattle                                              11.0
  Kansas City                                          11.0
Quality Improvement Group (Office of                   36.5          28
Clinical Standards and Quality) (total)
  Division of Contract Operations and Support          14.5
  Division of Quality Improvement Policy               14.0
  for Acute Care
  Division of Quality Improvement Policy                4.5
  for Chronic and Ambulatory Care
  Front office staff                                    3.5
Information Systems Group (Office of                   24.0          18
Clinical Standards and Quality)
Quality Measurement and Health Assessment              20.5          16
Group (Office of Clinical Standards and Quality)
Office of Acquisition and Grants Management             8.5           6

^a FTE = full-time equivalent.
SOURCE: Personal communication, J. V. Kelly, September 8, 2005.

COMMUNICATIONS AND INFORMATION TECHNOLOGY SERVICES

Communications

QIO Manual and Contract

Many conduits of communication exist within the QIO program (CMS, 2004b). A primary source of program information is the QIO manual,
which lays out basic program policy on the basis of legal and agency requirements and which is unlikely to change during the course of a contract. The QIO contract itself is another source of information for QIOs. The contract includes a statement of work, a document that delineates detailed work requirements, a list of deliverables, evaluation criteria, and a budget. The scope of work (SOW) is a section of the statement of work that provides an overall nontechnical description of the required activities during the contract cycle. According to the J-1 attachment of the QIO contract (the glossary), the abbreviation "SOW" can be used to refer to either the scope of work or the statement of work, but the terms themselves are not interchangeable (CMS, 2002).

Memos and Letters

CMS uses TOPS documents to inform the QIOs quickly about anticipated changes in policy, including draft statements (Jost, 1991; CMS, 2004b). Whereas TOPS documents deal with policy changes, Standard Data Processing System (SDPS) memos inform QIOs about operational concerns. Examples include one-time requests for information, emergency alerts, and administrative announcements (CMS, 2004b). SDPS memos may come from different sources, but all memos must be cleared by the Information Systems Group of the Office of Clinical Standards and Quality.

CMS uses contractor clarification letters to inform QIOs of alterations or additions to their contracts. The letters may also clarify requirements or respond to specific questions. Two types of clarification letters are used. The first type is an unofficial letter that explains an issue or question but does not result in a contract modification (personal communication, J. V. Kelly, May 31, 2005). The second type is a precursor to a contract modification; it informs QIOs of forthcoming contract changes and ultimately results in a contract modification.
No matter the source, all letters must be cleared by the Contracting Officer in the Acquisition and Grants Group. For day-to-day work and specific questions, CMS may use e-mail or conference calls to communicate with the QIOs. These formal letters and memos are all sent by e-mail to each QIO and are also posted in appropriate sections of QIONet, CMS's internal intranet website (described later in this chapter). Figure 13.3 shows QIO satisfaction with the clarity and the timeliness of TOPS memos. Overall, most QIOs rated clarity as "good" but timeliness as only "fair."

Regional Office Communications

The Regional Offices coordinate much of the communication between CMS and the QIOs (CMS, 2004b). Informal interactions often occur daily
via e-mail and telephone. Formal interactions occur at the 9- and 18-month evaluations (discussed later in this chapter). Some Project Officers expressed frustration that limited travel budgets do not permit more than two on-site evaluation visits. Finally, the Regional Offices interact with each other as well as with CMS Regional Offices that do not oversee the QIOs. The Project Officers of the four Regional Offices that oversee QIOs participate in a monthly community-of-practice call; this is a regularly scheduled teleconference that allows officers to exchange ideas and information. Interaction with CMS Regional Offices not associated with the QIO program is less formalized but still occurs, especially when national programs (like Nursing Home Compare) are launched.

FIGURE 13.3 Clarity (A) and timeliness (B) of TOPS memos reported by 52 QIOs. Clarity: excellent, 1; good, 36; fair, 12; poor, 3. Timeliness: excellent, 1; good, 11; fair, 36; poor, 4.
SOURCE: IOM committee web-based data collection tool.

Medicare Quality Improvement Community

The Medicare Quality Improvement Community (MedQIC) (formerly known as the Medicare Quality Improvement Clearinghouse) is a public website available to anyone via the Internet at http://www.medqic.org (CMS, 2004b). MedQIC currently features support for seven areas: structural and systems change, physicians' offices, hospitals, home health agencies, nursing homes, underserved populations, and managed care organizations. These areas are subject to change with the evolution of the SOWs and refinement of the website. The site serves as a resource for quality improvement efforts and includes bibliographies, tool kits, flowcharts, and suggestions. The site also provides information for consumers, including lists of all QIOs and activities of the QIO program, but it does not divulge provider- or beneficiary-specific information (CMS, 2003). As discussed in Chapter 11, in the 7th SOW, the Iowa Foundation for Medical Care (Iowa's QIO) acted as a virtual QIOSC for the operation of MedQIC.

Figure 13.4 shows QIO assessments of the value and the ease of use of MedQIC. More than half of the QIOs rated MedQIC as "fair" in each case. Because MedQIC was redesigned in early 2005, an effort spearheaded by the 7th SOW's Quality Improvement Interventions and Related Resources QIOSC, Figure 13.4 does not reflect the value or ease of use of the new version of MedQIC.

FIGURE 13.4 Value (A) and ease of use (B) of MedQIC reported by 52 QIOs. Value: excellent, 1; good, 11; fair, 30; poor, 10. Ease of use: excellent, 5; good, 12; fair, 28; poor, 7.
SOURCE: IOM committee web-based data collection tool.

QIONet

QIONet is a protected intranet website of CMS used by the QIO community to share task-specific information, provide forums and training resources, archive memos, and display data and progress reports (CMS, 2004b). Only preapproved users may gain access to the site. The Iowa Foundation for Medical Care (Iowa's QIO) maintains QIONet. All the tools of the SDPS (see later in this chapter) may be accessed via QIONet. The majority of QIOs rated QIONet as "excellent" or "good" on the dimensions of value and ease of use (Figure 13.5).
FIGURE 13.5 Value (A) and ease of use (B) of QIONet reported by 52 QIOs. Value: excellent, 12; good, 31; fair, 9; poor, 0. Ease of use: excellent, 10; good, 26; fair, 15; poor, 1.
SOURCE: IOM committee web-based data collection tool.

Information and Communication Technology Systems and Tools

SDPS is the information system for the QIO program and includes hardware and software developed by the SDPS team for use by the QIO community (CMS, 2004b). MedQIC and QIONet (described above) also fall under the umbrella of SDPS. SDPS became operational in May 1997 in response to the needs of the QIO program and interfaces with the Central Office, the 53 QIOs, and the Clinical Data Abstraction Centers (CDACs) (CMS, 2003). As mentioned above, the Iowa Foundation for Medical Care (Iowa's QIO) acted as the QIOSC for data collection and SDPS issues.

In the web-based data collection, the QIOs rated the value of SDPS to their core contract work. Thirty-three of 52 QIOs (63 percent) rated its value as "excellent" or "good." The QIOs also rated SDPS on timeliness and overall ease of use, with slightly higher ratings for ease of use than timeliness (Table 13.7).

Dashboard

Data from the CMS Dashboard, a part of QIONet, show the results of each QIO's work on the contract tasks (CMS, 2004b). Many Dashboard reports include quarterly trends and provider participation rates. Project Officers use the Dashboard to monitor the progress of the QIOs under their
management, and QIOs may use it to compare their results with those of other QIOs around the country. As described above, however, many QIOs express frustration with the time delays that they encounter when they try to access different types of data, including the data presented on the Dashboard. Forty-three of 52 QIOs (83 percent) rated the timeliness of Dashboard data as "fair" or "poor" (Figure 13.6).

TABLE 13.7 QIO Ratings of SDPS

Rating       Value   Timeliness of Support   Overall Ease of Use
Excellent      8       4                       6
Good          25      19                      21
Fair          12      18                      20
Poor           7      11                       5

NOTE: The data in the table represent the number of QIOs responding as indicated. Data are for a total of 52 QIOs.
SOURCE: IOM committee web-based data collection tool.

FIGURE 13.6 QIO satisfaction with timeliness of Dashboard data reported by 52 QIOs: excellent, 2; good, 7; fair, 24; poor, 19. The numbers represent the number of QIOs responding as indicated.
SOURCE: IOM committee web-based data collection tool.

Program Activity Reporting Tool

The Program Activity Reporting Tool (PARTner) is an application that QIOs use to report on their deliverables (CMS, 2004b), including regular
reports on activities and projects, information on publications, data on identified participants, and project proposals. CMS Central and Regional Office staff use PARTner to monitor these deliverables or approve the project plans submitted by QIOs. CMS staff warned the IOM, however, that some of the data sets were not complete and consistent enough for analytical purposes (personal communication, J. V. Kelly, January 11, 2005). Figure 13.7 shows the QIO ratings of the value and ease of use of PARTner. More than half of the QIOs rated PARTner as "fair" or "poor."

FIGURE 13.7 Value (A) and ease of use (B) of PARTner reported by 52 QIOs. Value: excellent, 1; good, 16; fair, 24; poor, 11. Ease of use: excellent, 5; good, 18; fair, 26; poor, 3.
SOURCE: IOM committee web-based data collection tool.

Case Review Information System

The Case Review Information System (CRIS) is an application that the QIOs use to track and report data on case review activities (CMS, 2003, 2004b). The QIOs also use CRIS to describe other activities, such as the number or type of helpline calls received. CRIS allows the QIOs and CMS to organize and monitor these activities. Project Officers use CRIS to monitor the timeliness of the case review activities of each QIO. Figure 13.8 shows the QIO ratings of the value and ease of use of CRIS. Thirty-two of 52 QIOs (62 percent) rated CRIS as "excellent" or "good" on value, but 35 of 52 (67 percent) rated its ease of use as "fair" or "poor."

CMS Abstraction and Reporting Tool

Providers, QIOs, and CDACs use the CMS Abstraction and Reporting Tool (CART) to collect and analyze data on quality indicators related to the
hospital tasks (CMS, 2004b). The tool was developed by a team that included CMS, the Joint Commission on Accreditation of Healthcare Organizations, and the QIOs themselves.

FIGURE 13.8 Value (A) and ease of use (B) of CRIS reported by 52 QIOs. Value: excellent, 8; good, 24; fair, 20; poor, 0. Ease of use: excellent, 5; good, 12; fair, 27; poor, 8.
SOURCE: IOM committee web-based data collection tool.

FIGURE 13.9 Value (A) and ease of use (B) of CART reported by 51 QIOs for value and by 49 QIOs for ease of use. Value: excellent, 7; good, 20; fair, 15; poor, 9. Ease of use: excellent, 5; good, 20; fair, 17; poor, 7.
SOURCE: IOM committee web-based data collection tool.

Figure 13.9 shows the QIO ratings of the value and ease of use of CART. The QIOs appear to be evenly divided as to its value and ease of use, with about half of the QIOs rating CART as
"excellent" or "good" and half rating it as "fair" or "poor" for both parameters.

DATA FLOW

Clinical Data Abstraction Centers

In the 7th SOW, CMS contracted with two CDACs, AdvanceMed and DynKePRO, to abstract clinical data from medical records (CMS, 2004b). CMS contracted with these companies directly on behalf of the QIOs for the Hospital Payment Monitoring Program (see Chapter 12) as well as for other surveillance and validation needs (described later in this chapter). The contracts with the CDACs lasted for 5-year periods (personal communication, M. Krushat and W. Matos, CMS, October 25, 2004). The most recent contract period began in September 2004 and was granted to DynKePRO alone. This contract is for 5 years at a cost of $74 million. The previous 5-year contract was for $125 million. The contract cost was reduced for several reasons, including the availability of improved data collection and reporting tools, decreased abstraction needs, and the fact that the use of only one CDAC will lead to more efficient operations (personal communication, W. Matos, CMS, July 7, 2005).

Nursing Homes and Home Health Agencies

In the 7th SOW, CMS obtained performance data for nursing homes and home health agencies from the Center for Medicaid and State Operations, which generated nursing home measures from data collected with the Minimum Data Set tool and home health agency measures from data collected with the Outcome and Assessment Information Set tool. In the 7th SOW, the measures were available to QIOs and CMS in two ways. First, the measures were available through an internal electronic information system. Second, the Office of Clinical Standards and Quality of CMS received a data file containing the measures, which was posted onto Dashboard for use by the QIOs. In the 8th SOW, the Information Systems Group is working on a tool (modeled after CART) to track clinical processes associated with positive outcomes.
The data will be submitted to a warehouse, and comparative performance feedback will be provided back to nursing homes to help them target areas for improvement.

Hospitals

In the 6th SOW, the CDACs collected data on performance measures for the hospital setting, but these data were collected only at the baseline
and the time of remeasurement. If the QIOs wanted information at earlier intervals, they had to collect their own data. QIOs interested in interim data used various tools, which led to inconsistent results. For the baseline and final measurements, the data flowed from the CDACs to the Clinical Area Support Peer Review Organization (the predecessor to the QIOSCs) to the QIO. In this case the data were often too old to be helpful to the QIOs for their interventions. Furthermore, the sample size was targeted at the state level and not the provider level.

By the beginning of the 7th SOW, the CDACs had increased data collection to a quarterly basis. However, there was still a lag from the time of service to the time of data availability, in part because the sample relied on claims filed by the provider and processed before abstraction. Efficiency was improved through the creation of the CART tool and the creation of a centralized data repository (instead of the use of the QIOSCs as intermediaries). Also, under Task 2b of the 7th SOW, Hospital Public Reporting, hospitals began to collect and report their own data via the CART tool directly to the warehouse on a quarterly basis (see Chapter 11).

In the 7th SOW, the CDACs abstracted a surveillance sample of records for hospital quality measures (~52,000 records annually), stroke and atrial fibrillation measures (~3,800 records annually), and patient safety measures (~27,000 records annually) (personal communication, M. Krushat and W. Matos, CMS, October 25, 2004). The average cost of a single record abstraction in 2003 was $56 per chart, with a range (depending on the type of review) of $47 to $103 per chart. Surveillance samples were not large enough to allow users to assess individual providers.
In the 8th SOW, because of the duplicative efforts of the Hospital Quality Alliance and the reporting requirements of the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (P.L. 108-173), abstractions for surveillance have been eliminated or greatly reduced because data are available from public reports. Instead, the CDACs perform validation for a number of records sampled by SDPS. Patient safety measures are abstracted and validated separately.

Although there has been a centralization of the data collection efforts and a standardization of the tools and processes in the hospital setting, CMS does not believe that there will be a significant improvement in the data lag (personal communication, W. Matos, CMS, July 7, 2005). As part of measures alignment among stakeholders, CMS and the Joint Commission on Accreditation of Healthcare Organizations agreed to collect measures using the same timeline, which limits the availability of the information reported. Hospitals do have the ability to concurrently submit their data and generate their own reports. The QIOs and hospitals are able to look at these results in real time if the hospital does immediate reporting, but they would be unable to compare those data with statewide or national
CMS OVERSIGHT 343

results. In the hospital setting, the data lag is mostly attributable to the chart abstraction process.

Physicians' Offices

CMS currently collects claims-based measures for physicians' offices (personal communication, W. Matos, CMS, July 7, 2005). Some of these measures lack reliability because of reporting issues, such as incomplete records and services delivered but not billed separately. In the 8th SOW, a Doctor's Office Quality-Information Technology warehouse has been established, and the QIOs will help physicians with the reporting of measures data.

DATA LAG ISSUES

In the IOM committee telephone interviews with the QIO CEOs, all the CEOs commented that the timeliness of the data available for the different settings is a problem. Most CEOs focused on how a lack of timeliness generally hindered improvement because of a lack of availability of up-to-date baseline data and rapid feedback to QIOs so that they could alter their interventions or motivate providers to continue their system changes. A lack of timeliness also affected their views on contract length and the fairness of the evaluation process. This was confirmed during the IOM site visits, in which 8 of 11 QIOs independently cited data lag as a problem in their work. Many CEOs claimed that the data were often too old to reflect the effects of the quality improvement interventions and did not reflect the QIOs' efforts during their 3-year contracts due to the timing of evaluations (see later in this chapter for more on the evaluation period).

Other studies also confirm the CEOs' concerns over data lag times. A random national sample of hospital quality improvement managers interviewed in 2002 raised concerns about the use of data for quality improvement interventions because physicians perceived questions of validity and substantive problems with the data and because the data were too old (several months to a year old) to be helpful (Bradley et al., 2005).
However, a recent study of hospitals not participating in specific quality improvement interventions showed no difference in performance measures between hospitals that received immediate feedback and those that received data that were delayed 17 months (Beck et al., 2005).

Many CEOs expressed the need for QIOs to supplement CMS data with more timely data for feedback to providers. Fourteen of 20 CEOs mentioned the continuing need to collect data themselves, although they
recognize that they cannot abstract information at all facilities or for all providers because of limited resources.

Nursing Homes and Home Health Agencies

Five of 20 CEOs mentioned problems with nursing home data, and four mentioned delays with home health data. In general, because of the tools used for data collection in these settings, data lag is not a major issue.

Hospitals

Thirteen of the 20 CEOs mentioned problems with hospital data timeliness. Ostensibly, with hospitals collecting their own data, the burden would be lifted from the QIOs, feedback would be more immediate, and the tracking of changes in the hospital setting would be easier. However, seven CEOs said that data availability was more timely under the 6th SOW. Four CEOs specifically mentioned problems with CART. However, others affirmed progress in the hospital arena: "If three years ago you would have said that every hospital in the state would report to CMS, I would have been surprised." One CEO said that CART data are "a good first step for facilities."

The CEOs had different opinions on the ability to get data more frequently. One stated, "Hospitals might scream, but if they can provide it quarterly, they can do it monthly and that would allow even more timely evaluation of improvement." However, another CEO said that getting CMS data more often than quarterly was not realistic because in some states the provider pool and sample of patients would be too small on a monthly basis. Also, he was not sure the QIO or the hospitals could deal with the process of getting and sharing data monthly or the emotional gear-up and reaction to data.

Physicians' Offices and Outpatient Settings

Half of the CEOs related that they encountered problems with lags in data from physicians' offices and outpatient practices.
They considered these settings to be the most difficult from which to obtain data, both from CMS claims and directly from provider offices. Two CEOs offered specific comments about the CMS data. One CEO stated, "Physician office data [were] nonexistent, and we were already 18 to 24 months into the 7th SOW." Another CEO indicated, "In physician offices, the evaluation strategy for 7th SOW was seriously flawed. At 14 months we had at most a couple of weeks of data that would reflect anything we had done in the 14 months. There [were] no data early on to show us to correct our course. We had to
use proxy measures like the number of improvement plans drawn up; CMS does not give enough credit for these proxy measures." With respect to gaining access to data directly from physicians' offices, one CEO believed that "CMS doesn't trust QIOs with physician data, but the CMS data [are] old when we get [them]; QIOs have the capacity to handle the physician data." Another stated, "It is difficult to get into physician offices to abstract data; physicians do not have the space or the time to accommodate persons collecting data." Data lag in this setting is often attributable to confidentiality restrictions or abstraction issues; however, future public reporting efforts and requirements will ameliorate these difficulties.

QIO CONTRACTS

Competition for Contracts

QIOs that qualify as in-state organizations (CMS, 2004d) may have their contracts automatically renewed upon successful completion of the previous contract, known as a "noncompetitive renewal." If the organization holding the contract does not qualify as in state, CMS must announce the contract's expiration date in the Federal Register at least 6 months before the end of the contract. In-state organizations that express interest in the contract are given priority, even if they did not hold the previous contract. Noncompetitive renewals are not allowed for out-of-state organizations, even if they are successful in the completion of the previous contract, unless no qualified in-state organization applies (CMS, 2002, 2004d, 2005a).

If a QIO fails to successfully complete all parts of the contract, it may present arguments to a CMS review panel as to why it was not successful. CMS may elect to renew the contract noncompetitively if it finds exceptional circumstances; or it may decide to not renew the contract, which will then go out for competition, known as a "competitive renewal" (CMS, 2004b).
At the end of the 6th SOW, the CMS panel reviewed 16 contracts and recommended that a recompetition be conducted for 9 of them for the 7th SOW. Of those 9, CMS reversed the decision for 2, which were renewed noncompetitively. CMS put the remaining 7 contracts up for competition, but only 2 contracts were awarded to new organizations. Of the five organizations that ultimately regained their contracts, three had no other bidders, one won the contract against other bidders, and one had only one other bidder that ended up not qualifying (CMS, 2004b).

At the end of the 7th SOW, CMS determined that six QIOs had unsuccessfully completed their contracts. After three of the six QIOs went before the evaluation panel in the first round of QIO contracts, CMS decided that
a recompetition would be conducted for all failing QIOs unless there were extremely unusual circumstances. Competition, per se, was highly valued; and the potential of bringing new organizations into the program or stimulating creative changes in the current QIOs apparently outweighed the possible loss of long-established working relationships with providers and stakeholder groups (personal communication, W. Rollow and E. Freund, CMS, March 9, 2005). The incumbent QIOs retained the contracts for the three Round 1 QIO contracts up for competition (personal communications, C. Lazarus, November 1, 2005, and November 9, 2005; personal communication, J. V. Kelly, October 25, 2005). Two of the contracts received only one proposal each, and the third received three proposals, all of which came from organizations that held at least one QIO contract in the 7th SOW.

Of the two Round 2 contracts that were put up for competition, one was retained by the incumbent and had no other competitors. The second contract had three proposals and was won by an organization that holds other QIO contracts. The contract that was up for competition in Round 3 was retained by the incumbent QIO and had no other competitors (personal communications, J. Kelly, January 31, 2006, and March 27, 2006).

QIO View of Competition

In the IOM committee telephone interviews, the QIO CEOs responded to questions about automatic recompetition for each new QIO contract and what impact that might have on their operations.

CEO views on routine recompetition Twelve of 19 CEOs opposed recompetition for any reason other than nonperformance of contract requirements. They reiterated that the CMS evaluation needed to be fair, and many expressed concern that the QIOs are called before the CMS panel to address matters beyond their control (see the discussion of program evaluation later in this chapter).
The major reasons against routine competition that the CEOs cited were the potential for the loss of momentum in quality improvement, the loss of knowledgeable staff, the length of time needed to develop relationships with the provider community, decreased sharing, and perhaps even less innovation. The following are the specific comments of three of the CEOs:

· "It is appropriate to compete if the evaluation is a fair one and the QIO does not pass, but to compete all QIOs is a waste of resources and a diversion from our work. There would be a loss of momentum, as we would be acting in survival mode rather than continuing to improve in the later part of the contract."
· "Gaining trust and knowing the right people in the medical community is a time-consuming process and if you changed every 5 years, you lose momentum."

· "QIOs will not be as innovative because they would be less inclined to take risks."

Decreased sharing Of the seven CEOs asked about the impact on sharing, all believed that recompeting each contract would have a dampening effect. However, one CEO qualified that by saying, "Competition will not impair sharing on best practices for quality improvement but will impact sharing on organizational operations." Two of the seven CEOs said that some of their peers were already cautious about sharing: "You may not want to share something with a neighbor QIO that you think is going to try to take your work away." Some QIOs are perceived as having a growth philosophy with a "predatory" design on other QIO territories.

Timeline Seven of 19 CEOs said that they could accept recompetition at each contract cycle, as long as there was a longer contract period. Of those seven, three favored having everyone compete on a 5-year basis. The other four said that they could accept competition but believed that the QIO program is better off with the incumbent, as long as there is not a nonperformance issue. One of these CEOs commented that a "Baldrige award winner said winning is a culmination of a 10-year journey; it is not something that happens overnight. The same is true for QIOs."

Nine- and 18-Month Monitoring Visits

Much of the monitoring of QIOs occurs on a regular basis through official memos, e-mails, teleconferences, and interactions with project officers, as discussed above; but CMS performs formal monitoring visits at the 9- and 18-month points in the contract cycle (CMS, 2004b). A group of Regional Office staff visit each QIO.
This group generally includes the Project Officer in charge of that QIO, a second Project Officer, and one Scientific Officer, but the makeup of the team may vary and can include other Regional or Central Office staff. In general, the 9-month visit serves to clarify contract requirements and to ensure that the QIOs are heading in the right direction. At the 18-month visit, the QIO's performance on the contract thus far is evaluated, and the team may offer input on how the QIO can improve its activities in those areas in which it is not performing well. Project and Scientific Officers work to develop standardized monitoring forms and streamline these visits. The officers also undergo training by teleconference or WebEx before the visit cycles.
Before the visits, QIOs complete extensive standardized monitoring reports, and the Project Officers and Government Task Leaders subsequently comment on those reports (CMS, 2004b). During the visit, CMS and QIO staff may discuss items from the report or the difficulties with the performance of the contract that the QIO faced. The visiting team also assesses samples of cases that the QIO reviewed. After the visit, the Project Officer (with input from other team members) finishes the report and prepares a letter with the team's observations and findings. The letter is ultimately reviewed by the Associate Regional Administrator and is sent to the QIO and the Contracting Officer.

During the IOM site visits, many QIOs stated that they believed that the 9- and 18-month visits were not well timed, as the 9th month was too late to change any work already initiated and the 18th month was too late to do anything if it looked like the QIO might fail on a task. However, the IOM committee's web-based data collection tool showed that the QIOs believe that there is intrinsic value in the process of preparation for these monitoring visits, as well as in the feedback that they receive. Forty-two of 52 QIOs (81 percent) rated the value to the QIO of its own preparation for the 9-month monitoring visit as either excellent or good. Forty-three of 52 QIOs (83 percent) said the same about the preparation for the 18-month visit. Forty-two of 52 QIOs (81 percent) rated the value of the feedback received from the 9-month visit as excellent or good, and 43 of 52 QIOs (83 percent) said the same for the value of the feedback received from the 18-month visit.

Contract Implementation and Length

CMS divides the QIO contracts into three rounds for staggered implementation over a 6-month period, from August 2005 to February 2006.
For the 8th SOW, Round 1 QIO contracts had an official start date of August 1, 2005; however, the QIOs whose contracts were to begin in Round 2 and Round 3 also began working on 8th SOW activities on the same date through modifications to their 7th SOW contracts (personal communication, S. Pazinski, November 14, 2005). If a QIO contract is up for competition but the decision has not yet been made to award the contract to that QIO (as is the case for the one Round 3 QIO contract up for competition as of this writing), the incumbent QIO begins 8th SOW activities along with all the other QIOs. If the contract is eventually awarded to a different organization, the incumbent assists the successor through a transition plan that familiarizes the new contractor with state activities, including the provision of materials for case review and quality improvement activities. CMS also adjusts the contracts for new contractors to allow extra time for certain deliverables.
During the IOM committee site visits, 7 of 11 QIOs indicated that they believed that the contract cycle was too short. Some believed that a longer contract cycle would help address some of the difficulties associated with the monitoring visits and data lag (as discussed above), which they believed limited their abilities to prove success during the time period of the contract cycle. Three of those QIOs independently suggested the use of a 5-year cycle. The QIOs also expressed concern that the evaluations focused too much on quantitative results and that the evaluation guidelines were too rigid. (See the discussion on program evaluation later in this chapter.)

In the telephone interviews, the QIO CEOs related that a lack of provision of data in a timely manner has implications for the length of the QIO contract and the perceived unfairness of the CMS evaluation. Eleven of the 20 CEOs mentioned that the lack of timeliness made the 3-year contract time frame inappropriately short because, first, it did not allow sufficient time for providers to receive feedback data on their quality improvement changes and, second, the data that CMS uses to monitor whether the QIOs had met their performance requirements did not reflect the work that they had done. The QIOs in the first round believed that they were at a particular disadvantage. Some CEOs commented:

· "The way things are structured now, we don't have data that [reflect] but a short intervention period. We should really be doing an intensive 3-year period of intervention. We'd do better if we had feedback on identified participants sooner so we could adapt."

· "We need a longer time horizon; with a longer contract we would have time to use our data to course correct."

· "We really need 24 months of actual intervention which is impossible within a 3-year contract; we need at least a 4-year contract."
QIOSC Contracts

In the IOM committee telephone interviews, 7 of 20 QIO CEOs raised the question of whether the QIOSCs can ever be ahead of the curve if they are trying to sort out task content at the same time that implementation of the QIO contract is required. CEOs suggested that the QIOSCs need a head start by receiving their contract tasks 6, 9, or even 12 months before the QIOs start new tasks, or that all new tasks should be pilot tested before the QIOs are assigned them. As one CEO put it, "Timeliness is a problem because we [QIOs and QIOSCs] are working on the same issue in a parallel time frame." Another CEO said, "Given the time constraints of our own contracts, we need quick answers. Sometimes, if we are trying to get a new project off the ground, we can't get direction from QIOSCs." All CEOs indicated that if the QIOSC is not ready to go on day one, the QIO
must move ahead on its own. The QIOs in the first round were particularly affected and had to start the new tasks with no QIOSC materials.

Award Fees

In the QIO contract for the 8th SOW, the Award Fee Plan involves a combination of cost-plus-fixed-fee and cost-plus-award-fee mechanisms. CMS also built in several types of incentives in addition to the base fee. The base fee is 1 percent of contract costs, excluding pass-through costs (reimbursable expenses) and special studies (CMS, 2005a). The Full Pass Performance Award Fee is an award of 1 percent of contract costs per applicable subtask (excluding pass-through and special studies costs) for QIOs that meet all full pass performance expectations. The Excellent Pass Performance Award Fee is an additional award of 1 percent of contract costs per applicable subtask (excluding pass-through and special studies costs) for QIOs that meet the excellent pass criteria. Finally, a Group Award Fee of 2 percent of contract costs per subtask is awarded to QIOs that meet the following three criteria:

· the QIO receives a full pass on evaluation standards for that subtask,

· no more than five QIOs have failed to achieve at least a conditional pass on that subtask, and

· the composite scores for all QIOs meet or exceed specific achievement standards delineated for each subtask in the J-2 attachment of the QIO contract for the 8th SOW.

The Group Award Fee is designed to encourage sharing and collaboration among QIOs, for it is in the best interest of each QIO to ensure that all QIOs pass so that all will receive the additional fee. The contract also specifies that the QIOs will be paid a fixed fee for information systems, contractual requirements, and special studies costs. In the 8th SOW, QIOs may also qualify for an interim award fee based on performance as of January 2007 (CMS, 2005c).
This includes up to 50 percent of the Full Pass Performance Award Fee and up to 50 percent of the Group Award Fee. QIOs that do not qualify for Interim Award Fee payments may receive full award amounts in November 2007. QIOs qualifying for an Interim Award Fee may receive the remainder of the fee at that time as well. In the QIO contract for the 8th SOW, CMS presents detailed information on the measures and calculations used to assess performance for these payments.
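The fee mechanics described above can be sketched as a simple calculation. This is an illustrative model only: the subtask names, the cost base, and the pass results below are hypothetical, and the contract's J-2 attachment defines the exact measures and formulas.

```python
# Illustrative sketch of the 8th SOW award-fee structure described above.
# Subtask names, the cost base, and the pass results are hypothetical.

def award_fee(contract_costs, subtasks):
    """contract_costs excludes pass-through and special-studies costs.
    subtasks maps a subtask name to a dict of booleans: 'full_pass',
    'excellent_pass', and 'group_criteria_met'."""
    fee = 0.01 * contract_costs  # base fee: 1% of contract costs
    for result in subtasks.values():
        if result["full_pass"]:
            fee += 0.01 * contract_costs       # Full Pass: +1% per subtask
            if result["excellent_pass"]:
                fee += 0.01 * contract_costs   # Excellent Pass: +1% more
            if result["group_criteria_met"]:
                fee += 0.02 * contract_costs   # Group Award: +2% per subtask
    return fee

# Hypothetical QIO: $5 million in eligible costs, two subtasks.
subtasks = {
    "task_1a": {"full_pass": True, "excellent_pass": True,
                "group_criteria_met": True},
    "task_1b": {"full_pass": True, "excellent_pass": False,
                "group_criteria_met": False},
}
print(award_fee(5_000_000, subtasks))  # 300000.0
```

The Group Award term is gated on a full pass for that subtask, mirroring the first of the three criteria listed above; the other two criteria depend on the results of all QIOs and are collapsed here into a single flag.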
OVERALL PROGRAM GUIDANCE

For a public program as diverse and multifaceted as the QIO program, some of the most important functions at the federal level include strategic planning, broad policy guidance and priority setting, coordination with other programs of the U.S. Department of Health and Human Services (DHHS), and evaluation of the whole program. In a federal program, even one that does not require an annual appropriation, these guidance functions take place in the context of the federal budget. That context inevitably creates some uncertainties. Also, as the QIO program becomes more integrated with other CMS activities, the independence of the program's planning and operations will likely be affected.

Strategic Planning

In the months preceding the start of the 8th SOW, CMS began an ambitious long-range planning process for the QIO program with the help of a consultant and the Process Improvement QIOSC (Qualis Health, under its contract as the QIO for Washington state). CMS was looking well beyond the 8th SOW and considering the program over the next 10 to 12 years. After considerable internal discussion, external stakeholder groups offered advice on how transformational change could be achieved. The meetings of the stakeholder groups, including representatives of QIOs, were organized according to the main provider settings addressed by the QIO program: home health agencies, hospitals, nursing homes, and physicians' practices. The discussion at the physicians' meeting was wide ranging and touched on many of the issues raised by the 8th SOW, such as the role of QIOs in promoting the use of health information technology in physicians' practices. Discussions among the stakeholders and CMS indicated much uncertainty about the roles of QIOs in the 8th SOW, as well as in the future. CMS planned to prepare a report on the substance of the meetings and to provide feedback to the participants.
Policy Direction

As described in the discussion of the evolution of the QIO program in Chapter 2, there have been significant changes in policy direction with each new contract, including occasional additional changes within a 3-year contract period. In the past, CMS released a version of the new QIO contract well in advance of the request for proposal for the first contract cycle so that the QIOs had time to plan their work and respond to the request for proposal. The transition from the 7th to the 8th SOW was not easy because negotiations within DHHS and among DHHS, CMS, and the Office of Management and Budget over the new QIO contract and funding took longer than they did in previous years. The contract for the 8th SOW, which had been expected to be available in the summer of 2004 for implementation on August 1, 2005, was not formally released until April 2005; and all of the final budget figures were not available until May 20, 2005. Subsequent altered versions of the contract were released in June, September, and November 2005. All QIOs bidding in the first round received guidance from CMS on most but not all of the QIO budget before their responses were due to CMS (personal communication, D. Adler, American Health Quality Association, May 16, 2005).

In the 8th SOW, the tasks of the QIO program reflect a major change from measurement-based quality improvement to assisting providers with achieving transformational change (Rollow, 2004). Early summaries of the 8th SOW, as well as the request for proposal, raised many questions. The QIOs speculated about what "transformational change" really meant and how it would be accomplished (CMS, 2004a; AHQA, 2005). CMS had to post questions and answers on its website to clarify its intent for the QIO contract bidders. The long-range strategic planning meetings mentioned above, which were held after the release of the QIO contract for the 8th SOW, defined the goal as soliciting advice from the stakeholders on how to achieve transformational change in their care settings, how to measure it, and how CMS and QIOs could support that change.

Priority setting is also a key need for the QIO program; beyond the goal of "transformational change" and the six quality aims for health care established by the IOM (safety, timeliness, effectiveness, efficiency, equity, and patient-centeredness), it is difficult to discern priorities within the 8th SOW.
Compared with the 7th SOW, the 8th SOW involves considerably more tasks, more measures for evaluation, and more identified participant groups. The evaluation formulas are complex, with many different subscores and many different weights, making it impossible to determine where a QIO should focus its time and resources (see Table A.6 in Appendix A). Because failure on any one task could jeopardize noncompetitive renewal for the next SOW, one might assume that all tasks are of equal priority.

The QIOs voiced concerns about CMS's priorities and focus in the 7th SOW. During the IOM committee site visits, 6 of 11 QIOs described frustration with CMS's inconsistent or changing priorities, including contract changes in the middle of an SOW. One QIO specifically criticized the lack of continuity between SOWs, describing difficulty with a "stop-start" effect. Several QIOs expressed a desire for fewer, better-defined priority areas so that the QIOs could focus their efforts on just a few priorities.

The participants of a focus group held by the IOM committee discussed the direction that the QIO program is taking in the 8th SOW. Themes of concern included the challenge of working with an increased number of
identified participants with limited resources, overly complex evaluation formulas, and the lack of flexibility in the contract. The participants also believed that increased competition might lead to decreased collaboration among QIOs, that the contract length was too short to create culture change, and that administrative reporting requirements should be decreased. Overall, the focus group participants believed that DHHS as a whole needs to align its priorities to provide incentives for quality improvement, such as through the implementation of regulatory requirements and pay for performance.

Program Coordination

The QIO program is only one of several health care quality-related efforts under way within CMS, which increases the need for coordination within Medicare and CMS as a whole. Some of that coordination may take place when other offices within CMS desire to use the QIO apportionment to fund their research or other activities or to use the apportionment for policy planning at broader levels.

Support Contracts and Special Studies

Chapter 7 described the various review and funding mechanisms for the special studies and support contracts. At the beginning of the SOW, the program indicates priorities for special studies, but unsolicited proposals may be considered and funded later in the contract period for the SOW. However, no apparent mechanism exists for coordinating projects and funding priorities among those projects. Also, it is unclear how CMS shares information about ongoing studies with the QIO community or what it does with the results of all studies. As one QIOSC representative stated in an interview, "I have no idea what CMS does with special studies' results."

In the IOM committee telephone interviews, 8 of 20 QIO CEOs commented on the pros and cons of pilot testing. All found pilot testing to be favorable from the standpoint of having experience with the task at hand before all QIOs approach that task.
However, two CEOs cautioned that pilot studies are not always the answer, as they can be too state specific and may not have been translated for a wider audience. One CEO commented further that the oversight of special projects is sometimes assigned to a middle manager at CMS with no expertise in the topic area of the project.

Quality Coordination Team

Recently, CMS made efforts to coordinate the quality improvement efforts across all programs of CMS. On September 14, 2004, CMS announced the creation of a new Quality Coordination Team to support and act as staff to the redesigned Quality Council (CMS, 2004c). The administrator of CMS, Mark McClellan, chairs the Quality Council, and its members include the director of each major CMS office. The Quality Council strives to coordinate all CMS efforts related to quality as well as to align those efforts with the quality improvement activities of other public and private organizations (Jencks, 2004). The Quality Coordination Team is led by Steve Jencks, former director of the Quality Improvement Group. Almost all team members are staff from CMS.

In July 2005, CMS released the Quality Council's Quality Improvement Roadmap for improving the quality of care (CMS, 2005b). This roadmap included five major strategies: working through partnerships, public reporting, paying for quality performance, promoting efficient systems (such as electronic health systems), and increasing the availability and improved use of innovative technologies. Major activities of the Quality Coordination Team include direct support of the Quality Council, such as monitoring of work groups, facilitation of partnerships and collaboratives, and participation in breakthrough projects. Topics chosen for focus in the work groups include performance measures and pay for performance, health information technology, the Medicare Part D prescription drug benefit, and CMS Regional Office-Central Office communications. Breakthrough projects include Fistula First, which seeks to improve vascular access in patients with end-stage renal disease; a project that seeks to raise immunization rates in specific settings; and the Institute for Healthcare Improvement's 100,000 Lives campaign (personal communication, S. Jencks, CMS, July 21, 2005).
The Quality Coordination Team strives to facilitate partnerships within CMS, with other federal agencies (such as the Centers for Disease Control and Prevention and the Agency for Healthcare Research and Quality), and with nongovernmental organizations (such as the Institute for Healthcare Improvement). The Quality Council and Quality Coordination Team have no direct responsibility for the QIO program, but many CMS staff on the team are directly responsible for the operation of the QIO program.

Measures Selection and Coordination

Some QIO functions, such as the selection of quality measures, require coordination with national stakeholder organizations as well as with various offices in CMS. CMS identified four criteria for the selection of measures:

· the measures must be scientifically and clinically sound,
· the measures must be reproducible,
· the measures should not add burden to the provider, and
· the measures should use existing data sources (CMS, 2004b).

CMS worked collaboratively with the CMS Survey and Certification program as well as with the Joint Commission on Accreditation of Healthcare Organizations, the American Medical Association, and the National Committee for Quality Assurance to align measure specifications to minimize reporting burdens on providers. Ultimately, the groups seek endorsement of the selected measures by the National Quality Forum. In the 7th SOW, CMS contracted with the Health Services Advisory Group (Arizona's QIO) to maintain measures by identifying, standardizing, and seeking endorsement of measures and by updating and retiring measures as needed (CMS, 2004b). The QIO program also funded work through other parts of CMS that contributed to the development and refinement of other measures, such as the support contract for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) family of surveys ($33.4 million during the 7th SOW) (personal communication, C. Lazarus, March 17, 2005).

Roles and Relationships

Overall, the QIOs express great concern about the relationships between and among QIOs, Project Officers, Government Task Leaders, QIOSCs, and CMS. The QIOs want better definitions of the roles of each of these individuals or groups and streamlining of the management process.

During the IOM committee's site visits, the QIOs discussed many challenges related to CMS oversight of the QIO program. Seven of 11 QIOs expressed concerns over the relationship between CMS and the QIOs and communications problems. Specific examples included references to isolation of groups within CMS and poor communication between CMS, the Regional Offices, and the QIOs. The QIOs referred to "tension" in these interactions. Three QIOs specifically mentioned difficulties with ambiguous or poorly defined information, and three were frustrated with "micromanagement" of the program.
Five QIOs wanted more flexibility in the program in terms of either quality improvement topic areas or how goals are achieved.

In interviews with five organizations representing seven QIOSCs, QIOSC staff members also mentioned difficulties in their relationships with CMS's Central Office and with their Government Task Leaders. As described above, all QIOs believed that the relationship with their Government Task Leaders was key, and they provided a range of responses as to whether or not that relationship was positive. QIOSC staff believed that they had limited to no direct interaction with CMS's Central Office and wanted to see a reduction of silos and increased communication among and between the Government Task Leaders and CMS's Central Office.
Elaborating on this issue during the telephone interviews, one QIO CEO said, "CMS develops these contracts to develop expertise in an area, but then CMS doesn't listen to or take the advice of experts, so we end up with a program directed by Government Task Leaders rather than experts. This structure could use improvement." To get better service, this CEO stated, "There should be direct knowledge transfer [to QIOs] rather than having to get approval by a Government Task Leader at every step." The CEO asserted that "the Government Task Leaders don't seem to have any great urgency in approving materials, so lots of time can transpire. Government Task Leader delay cuts into the time available for technical assistance to QIOs." Another CEO reported that "there is often lack of clear direction from CMS on desired outcomes or that expectations change during the project. Sometimes change is inevitable as information is gathered, but that can substantially change the QIOSC resources. Reasons for change are more palatable if they are clinical rather than political."

In these telephone interviews, the QIO CEOs also expressed concern about the interaction between the QIOSCs and CMS. CEOs pointed out that the QIOSCs have dual audiences, both the QIOs and CMS, and thus have a sense of being caught in the middle. Four respondents whose organizations held QIOSC contracts mentioned having difficulty in being able to respond to QIOs as a direct consequence of CMS delay. One said, "QIOSCs are caught between a rock and a hard place, with CMS being the rock that they have to be responsive to over the QIOs. What the QIOs want goes nowhere until CMS wants it to."
Overall Program Evaluation

Compared with the considerable effort that CMS put into designing complex formulas to evaluate the contract performance of each QIO (see Chapter 10), creating the databases necessary to evaluate their performance, and monitoring the progress of each QIO, it appears to have spent little time on evaluating other aspects of the program and the program as a whole. Although priorities for special studies are set by the Science Council of the Office of Clinical Standards and Quality, as described in Chapter 7, CMS has not developed a system for tracking all the various special study and support contracts, considering the balance of topics and spending, and determining how they might serve program priorities. Also, no system exists for broadly sharing the knowledge acquired through the studies or even letting all the QIOs and other Project Officers and Government Task Leaders know which QIO is working on a particular special study topic.

The Quality Improvement Group was unable to provide the IOM committee with information on the various contracts at a level of detail sufficient for the committee to know what the contracts are supposed to accomplish and whether accomplishments have been made, although individual Project Officers are required to assess their own projects. This lack of information prevents an assessment of the overall value and impact of the spending on special studies and support contracts. In the 7th SOW, nearly 31 percent ($355 million) of the total program's apportionment ($1,154.3 million) was spent on special studies and support contracts (personal communication, C. Lazarus, March 17, 2005).

CMS does not have a mechanism or formula with which it can evaluate individual QIOs overall. Although the program can determine whether a specific QIO has achieved a passing score on its contract performance, it cannot distinguish outstanding QIOs from mediocre QIOs in a holistic sense. Although the QIOs vary widely on many organizational criteria, it is unclear which, if any, of those factors contribute to better performance, as it was not feasible to identify the better-performing QIOs. For example, at the end of the 7th SOW, one of the three QIO contracts for which recompetition was conducted in the first round was held by an organization that had been awarded one of the highest numbers of special study contracts. Also, as mentioned in Chapter 10, the committee's attempts to group QIOs according to their overall performance on the quality improvement subtasks were unsuccessful. The web-based data collection tool attempted to gather opinions about other QIOs from the QIO community itself, but the results were inconclusive.

More importantly, neither CMS nor independent researchers have performed a conclusive evaluation of the impacts of the 53 QIOs on quality improvement nationally. Also, CMS has not performed a programwide evaluation to examine in detail the synergy, or lack thereof, between the spending on special studies, QIOSCs, and support contracts and the spending on the core contracts.
During the IOM committee's site visits, 4 of 11 QIOs independently related frustration with the contract evaluation process. They believed that the goals were too stringent and that too much emphasis was placed on short-term quantitative results. In the web-based data collection tool, 52 QIOs rated the clarity and timeliness of the evaluation process. Overall, the process did not receive high marks, with only one QIO giving a score of "excellent" on one of the three dimensions indicated in Table 13.8. More than half of the QIOs answered "fair" or "poor" for each of the three dimensions. In contrast to the QIO evaluations of the 7th SOW, QIOSC evaluations were informal and were primarily based on the completion of a set of deliverables, according to interviews with five organizations representing seven QIOSCs. All believed that their "success" was very subjective and based on the personal satisfaction of their Government Task Leaders.
TABLE 13.8 QIO Ratings of Evaluation Process

            Clarity of              Clarity of              Overall Timeliness of
            Quantitative Portion    Qualitative Portion     Information About Evaluation
Rating      of Evaluation           of Evaluation           (methodology, process, etc.)

Excellent            1                       0                           0
Good                18                      21                          10
Fair                19                      19                          19
Poor                14                      12                          23

NOTE: The data in the table represent the number of QIOs responding as indicated. Data are for a total of 52 QIOs.
SOURCE: IOM committee web-based data collection tool.

SUMMARY

This chapter has discussed CMS's oversight of the QIO program. The following are some of the main themes of this chapter, which are reflected in the findings and conclusions presented in Chapter 2:

· Multiple offices and divisions within CMS have responsibility for the QIO program. QIOs expressed frustration with the lack of coordination and communication between and among personnel and with the timeliness of the information provided to them. They criticized the lack of coordination by CMS, which leads to competing agendas for different managers within the QIO program.
· One of the greatest concerns for QIOs was the time lag to the receipt of performance data because it affects their quality interventions as well as their contract performance assessments.
· QIOs oppose routine recompetition for the core contract because of the loss of momentum that it causes, the decreased incentive that QIOs have to share knowledge, and the chance that they might lose their contract.
· Overall, QIOs believe that the 3-year contract period is too short to achieve measurable change and is complicated by the concurrent lag in the time to receipt of performance data. They also believe that the timeline for QIOSC contracts should begin earlier so that the QIOSCs may help the QIOs immediately upon the start of a new SOW.
· Although CMS is developing a strategic plan for the QIO program 1 to 2 years into the future, the program still lacks distinct, focused priorities. Neither the core contracts nor the associated evaluation schemes prioritize the QIO activities.
· Evaluations of the QIO core contract are based on overly complex formulas. They hold the QIOs accountable for short-term quantitative improvements by providers on specified measures. In contrast, QIOSC evaluations are mainly subjective and are based primarily on the satisfaction of the Government Task Leader and completion of a set of deliverables.
· CMS lacks any formal means of evaluating the QIO program as a whole, its success in improving quality, or the performance of one QIO relative to another.

REFERENCES

AHQA (American Health Quality Association). 2005. Proceedings of the AHQA Annual Meeting and Technical Conference. San Francisco, CA: American Health Quality Association.
Beck C, Richard H, Tu J, Pilote L. 2005. Administrative data feedback for effective cardiac treatment: AFFECT, a cluster randomized trial. Journal of the American Medical Association 294(3):309–317.
Bradley EH, Carlson MDA, Gallo WT, Scinto J, Campbell MK, Krumholz HM. 2005. From adversary to partner: Have quality improvement organizations made the transition? Health Services Research 40(2):459–476.
CMS (Centers for Medicare and Medicaid Services). 2002. 7th Statement of Work (SOW). [Online]. Available: http://www.cms.hhs.gov/qio [accessed April 9, 2005].
CMS. 2003. HHS Privacy Impact Assessment (PIA). November 18. [Online]. Available: www.cms.hhs.gov/privacyact/hcqis.pdf [accessed April 21, 2005].
CMS. 2004a. Proceedings of the QualityNet 2004 Conference. Washington, DC: Centers for Medicare and Medicaid Services.
CMS. 2004b. The Quality Improvement Organization Program: CMS Briefing for IOM Staff. [Online]. Available: http://www.medqic.org/dcs/ContentServer?cid=1105558772835&pagename=Medqic%2FMQGeneralPage%2FGeneralPageTemplate&c=MQGeneralPage [accessed December 26, 2005].
CMS. 2004c. Notice from CMS Concerning Positions of Steve Jencks and Bill Rollow. Unpublished. Baltimore, MD: Centers for Medicare and Medicaid Services.
CMS. 2004d. Quality Improvement Organization Manual. September 16. [Online]. Available: http://www.cms.hhs.gov/manuals/110_qio/qio110index.asp [accessed May 11, 2005].
CMS. 2005a. 8th Statement of Work (SOW). [Online]. Available: http://www.cms.hhs.gov/qio [accessed April 9, 2005].
CMS. 2005b. Executive Summary. In: Quality Improvement Roadmap. [Online]. Available: http://www.cms.hhs.gov/quality/quality%20roadmap.pdf [accessed September 30, 2005].
CMS. 2005c. 8th Statement of Work (SOW), Version #080105-1. [Online]. Available: http://www.cms.hhs.gov/qio [accessed November 4, 2005].
Hughes E. 2004. 8th SOW Contract Issues. Presentation at QualityNet 2004 Conference. Washington, DC: Centers for Medicare and Medicaid Services.
Jencks SF. 2004. The Health Care Quality Improvement Partnership. PowerPoint presentation to the Quality Improvement Organization Subcommittee, October 4, Baltimore, MD.
Jost TS. 1991. Policing cost containment: The Medicare Peer Review Organization Program. University of Puget Sound Law Review 3:483–526.
Rollow WC. 2004. Evaluating the HCQIP Program. PowerPoint presentation to the Quality Improvement Organization Subcommittee, October 4, Baltimore, MD.