Medicare: A Strategy for Quality Assurance, Volume II: Sources and Methods (1990)

Suggested Citation: "7. Medicare Conditions of Participation and Accreditation for Hospitals." Institute of Medicine. 1990. Medicare: A Strategy for Quality Assurance, Volume II: Sources and Methods. Washington, DC: The National Academies Press. doi: 10.17226/1548.

7
Medicare Conditions of Participation and Accreditation for Hospitals

Michael G. H. McGeary

Since the passage of Medicare legislation in 1965, Section 1861 of the Social Security Act has stated that hospitals participating in Medicare must meet certain requirements specified in the act and that the Secretary of the Department of Health, Education and Welfare (HEW) [now the Department of Health and Human Services (DHHS)] may impose additional requirements found necessary to ensure the health and safety of Medicare beneficiaries receiving services in hospitals. On this basis, the Conditions of Participation, a set of regulations setting minimum health and safety standards for hospitals participating in Medicare, were promulgated in 1966 and substantially revised in 1986.

Also since 1965, under authority of Section 1865 of the Social Security Act, hospitals accredited by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO or the Joint Commission) or the American Osteopathic Association (AOA) have been automatically "deemed" to meet all the health and safety requirements for participation except the utilization review requirement, the psychiatric hospital special conditions, and the special requirements for hospital providers of long-term-care services. As a result of this deemed status provision, most hospitals participating in Medicare do so by meeting the standards of a private body governed by representatives of the health providers themselves. Currently, about 5,400 (77.1 percent) of the 7,000 or so hospitals participating in Medicare are accredited. The 1,600 or so participating hospitals that are unaccredited tend to be small and located in nonurbanized areas. A 1980 study found that about 70 percent of the unaccredited hospitals had fewer than 50 beds, compared with only 13 percent of the accredited hospitals (see Table 7.1).

The current federal standards for hospitals participating in Medicare are presented in the Code of Federal Regulations (CFR) as 24 "Conditions of Participation," containing 75 specific standards (see Table 7.2).2

TABLE 7.1 Medicare Participating Hospitals, 1980

Number of Beds   Total Participating Hospitals   JCAHO/AOA (a) Accredited Hospitals (b)   Unaccredited Hospitals
<50              1,772                           679                                      1,093
50-99            1,607                           1,253                                    354
100-199          1,444                           1,366                                    78
200-299          786                             761                                      25
300-399          444                             433                                      11
400-499          293                             288                                      5
500-999          343                             338                                      5
1,000+           56                              54                                       2
Total            6,745                           5,172                                    1,573

(a) JCAHO is the Joint Commission on Accreditation of Healthcare Organizations; AOA is the American Osteopathic Association.
(b) Includes hospitals accredited by AOA.

SOURCE: DHHS, 1980.

The responsibility for revising the Conditions of Participation lies with the Bureau of Eligibility, Reimbursement and Coverage of the Health Care Financing Administration (HCFA). A separate HCFA unit, the Bureau of Health Standards and Quality (HSQB), is responsible for administering and enforcing the Conditions of Participation. In addition to overseeing about 1,600 certified and 5,400 accredited hospitals, HSQB enforces separate sets of Conditions of Participation for over 25,000 other Medicare providers, including approximately 10,000 skilled nursing facilities, 5,700 home health agencies, and 4,775 laboratories. The actual compliance of hospitals with the Conditions of Participation is monitored for the federal government by each state through periodic on-site surveys by personnel of the state agency that licenses hospitals and other health facilities (or, in a few cases, by an equivalent agency).

The Joint Commission on Accreditation of Hospitals (JCAH) was created in 1951 to accredit hospitals that met its minimum health and safety standards. In 1987, JCAH changed its name to the Joint Commission on Accreditation of Healthcare Organizations in recognition that since 1970 it had developed accreditation programs for additional health services organizations delivering long-term care, ambulatory health care, home care, hospice care, mental health care, and "managed" care [for example, health maintenance organizations (HMOs) and preferred provider organizations (PPOs)].

TABLE 7.2 Current Medicare Conditions of Participation and Standards for Hospitals

1. Provision of emergency services by nonparticipating hospitals
2. Compliance with federal, state, and local laws: (a) Federal laws; (b) State licensure; (c) Personnel licensure
3. Governing body: (a) Medical staff; (b) Chief executive officer; (c) Care of patients; (d) Institutional plan and budget; (e) Contracted services; (f) Emergency services
4. Quality assurance: (a) Clinical plan; (b) Medically related patient care services; (c) Implementation
5. Medical staff: (a) Composition of the medical staff; (b) Medical staff organization and accountability; (c) Medical staff bylaws; (d) Autopsies
6. Nursing services: (a) Organization; (b) Staffing and delivery of care; (c) Preparation and administration of drugs
7. Medical record services: (a) Organization and staffing; (b) Form and retention of record; (c) Content of record
8. Pharmaceutical services: (a) Pharmacy management and administration; (b) Delivery of services
9. Radiologic services: (a) Radiologic services; (b) Safety for patients and personnel; (c) Personnel; (d) Records
10. Laboratory services: (a) Adequacy of laboratory services; (b) Laboratory management; (c) Personnel; (d) Blood and blood products; (e) Proficiency testing; (f) Quality control
11. Food and dietetic services: (a) Organization; (b) Diets
12. Utilization review: (a) Applicability; (b) Composition of utilization review committee; (c) Scope and frequency of review; (d) Determination regarding admissions or continued stays; (e) Extended stay review; (f) Review of professional services

13. Physical environment: (a) Buildings; (b) Life safety from fire; (c) Facilities
14. Infection control: (a) Organization and policies; (b) Responsibilities of chief executive officer, medical staff, and director of nursing services
15. Surgical services: (a) Organization and staffing; (b) Delivery of service
16. Anesthesia services: (a) Organization and staffing; (b) Delivery of services
17. Nuclear medicine services: (a) Organization and staffing; (b) Delivery of service; (c) Facilities; (d) Records
18. Outpatient services: (a) Organization; (b) Personnel
19. Emergency services: (a) Organization and direction; (b) Personnel
20. Rehabilitation services: (a) Organization and staffing; (b) Delivery of services
21. Special provisions applying to psychiatric hospitals
22. Special medical record requirements for psychiatric hospitals: (a) Development of assessment and diagnostic data; (b) Psychiatric evaluation; (c) Treatment plan; (d) Recording progress; (e) Discharge planning and discharge summary
23. Special staff requirements for psychiatric hospitals: (a) Personnel; (b) Director of inpatient psychiatric services; medical staff; (c) Availability of medical personnel; (d) Nursing services; (e) Psychological services; (f) Social services; (g) Therapeutic activities
24. Special requirements for hospital providers of long-term-care services ("swing-beds"): (a) Eligibility; (b) Skilled nursing facility services

SOURCE: 42 CFR Part 482, effective September 15, 1986

The Joint Commission's standards for the 5,400 hospitals it accredits currently are contained in the Accreditation Manual for Hospitals, some sections of which are revised each year through an elaborate process of professional consensus coordinated by its department of standards (see Table 7.3 for the outline of the Joint Commission's hospital standards). The Joint Commission currently is governed by a board of 24 commissioners, 7 each appointed by the American Medical Association (AMA) and the American Hospital Association (AHA), 3 each by the American College of Surgeons (ACS) and the American College of Physicians, 1 by the American Dental Association, and 3 private citizens appointed by the board to add the consumer perspective (JCAHO, 1988a).3 As of late 1988, the Joint Commission had a staff of 320 at its headquarters in Chicago and 310 surveyors located around the country.

Both governmental regulation by HCFA and professional self-regulation by the Joint Commission are aimed at assuring the quality of care provided in hospitals.4 Both sets of standards have evolved from efforts to assure a minimum capacity to provide adequate care to more ambitious efforts to make hospitals assess and improve their organizational and clinical performance in a comprehensive and continuous manner.

HOSPITAL STANDARDS: ORIGIN AND DEVELOPMENT

Private, voluntary efforts to improve the quality of care in hospitals by setting minimum, and later, optimum standards date from 1918. However, federal facility standards have inevitably accompanied any significant federal expenditures on hospital services or construction, beginning with the first grant-in-aid program for maternal and child health services, the Sheppard-Towner Act of 1921. The two approaches were formally joined in 1965, when the Social Security Act amendments creating Medicare specified that accreditation by JCAH meant that a participating hospital was automatically deemed to meet the federal Conditions of Participation in the Medicare program. Initially, about 60 percent of participating hospitals qualified through accreditation; today about four-fifths of the participating hospitals are accredited by the Joint Commission or, in some cases, the AOA.

Development of Early Voluntary Standards by the ACS and JCAH

The first standards for the organization and operation of hospitals were set forth by the ACS in 1918 (Davis, 1973; Stephenson, 1981; Roberts et al., 1987). The founders of the ACS considered conditions in many hospitals to be deplorable for patients and physicians alike, and hospital standardization was a stated purpose of the organization at its founding in 1912.

TABLE 7.3 Joint Commission on Accreditation of Healthcare Organizations' Hospital Standards, 1990

Alcoholism and Other Drug Dependence Services (AL): AL.1 Objectives and scope; AL.2 Assessment; AL.3 Treatment planning; AL.4 Monitoring and evaluation; AL.5 Discharge planning

Diagnostic Radiology Services (DR): DR.1 Direction and staffing; DR.2 Policies and procedures; DR.3 Diagnostic studies and therapeutic procedures; DR.4 Monitoring and evaluation

Dietetic Services (DT): DT.1 Organization, direction, staffing, and integration; DT.2 Orientation, education, and training; DT.3 Policies and procedures; DT.4 Facility design and equipment; DT.5 Medical record; DT.6 Quality control mechanisms; DT.7 Monitoring and evaluation

Emergency Services (ER): ER.1 Plan; ER.2 Organization, direction, and staffing; ER.3 Integration; ER.4 Training and education; ER.5 Policies and procedures; ER.6 Facility design and equipment; ER.7 Medical record; ER.8 Quality control mechanisms; ER.9 Monitoring and evaluation

Governing Body (GB): GB.1 Responsibilities; GB.2 Conflict of interest; GB.3 Fulfillment of responsibilities

Hospital-Sponsored Ambulatory Care Services (HO): HO.1 Availability; HO.2 Education and training; HO.3 Policies and procedures; HO.4 Safety, equipment, and utilities management and life safety; HO.5 Medical record; HO.6 Quality control mechanisms; HO.7 Monitoring and evaluation

Infection Control (IC): IC.1 Program; IC.2 Committee; IC.3 Management; IC.4 Policies and procedures; IC.5 Support services/departments

Management and Administration (MA): MA.1 Responsibilities

Medical Record Services (MR): MR.1 Purposes; MR.2 Content; MR.3 Confidentiality and completeness; MR.4 Direction, staffing, and facilities; MR.5 Staff role in committee functions

Medical Staff (MS): MS.1 Membership; MS.2 Bylaws and rules and regulations; MS.3 Organization; MS.4 Privilege delineation; MS.5 Reappointment and reappraisal; MS.6 Monitoring and evaluation; MS.7 Continuing education

Nuclear Medicine Services (NM): NM.1 Direction and staffing; NM.2 Policies and procedures; NM.3 Diagnostic studies and therapeutic procedures; NM.4 Monitoring and evaluation

Nursing Services (NR): NR.1 Responsibilities; NR.2 Direction and integration; NR.3 Organization; NR.4 Assignments; NR.5 Care; NR.6 Education and training; NR.7 Policies and procedures; NR.8 Monitoring and evaluation

Pathology and Medical Laboratory Services (PA): PA.1 Availability; PA.2 Facility design and equipment; PA.3 Communication; PA.4 Records and reports; PA.5 Quality control systems; PA.6 Additional specific requirements; PA.7 Monitoring and evaluation

Pharmaceutical Services (PH): PH.1 Direction and staffing; PH.2 Facility design and equipment; PH.3 Scope of service; PH.4 Intrahospital drug distribution system; PH.5 Administration of drugs; PH.6 Monitoring and evaluation

Physical Rehabilitation Services (RH): RH.1 Availability; RH.2 Services; RH.3 Comprehensive physical rehabilitation services; RH.4 Monitoring and evaluation

Plant, Technology, and Safety Management (PL): PL.1 Safety management program; PL.2 Life safety management program; PL.3 Equipment management program; PL.4 Utilities management program

Professional Library Services (PR): PR.1 Availability; PR.2 Policies and procedures

Quality Assurance (QA): QA.1 Program; QA.2 Scope; QA.3 Monitoring and evaluation; QA.4 Administration and coordination

Radiation Oncology Services (RA): RA.1 Direction and staffing; RA.2 Policies and procedures; RA.3 Consultations and procedures; RA.4 Monitoring and evaluation

Respiratory Care Services (RP): RP.1 Availability; RP.2 Training and education; RP.3 Policies and procedures; RP.4 Facility design and equipment; RP.5 Documentation; RP.6 Monitoring and evaluation

Social Work Services (SO): SO.1 Availability; SO.2 Training and education; SO.3 Policies and procedures; SO.4 Documentation; SO.5 Monitoring and evaluation

Special Care Units (SP): SP.1 Availability; SP.2 Direction and staffing; SP.3 Training and education; SP.4 Policies and procedures; SP.5 Facility design and equipment; SP.6 Monitoring and evaluation; SP.7 Specific-purpose units

Surgical and Anesthesia Services (SA): SA.1 Availability; SA.2 Comparable quality; SA.3 Policies and procedures; SA.4 Monitoring and evaluation

Utilization Review (UR): UR.1 Program

SOURCE: JCAHO, 1989

Sixty percent of the applicants for fellowship in the first 3 years of the ACS were rejected because the information in their medical case records was inadequate to judge clinical competence. Thus, the ACS formally established the Hospital Standardization Program, which existed until it was superseded by the JCAH in 1951. Although the ACS initially promulgated only five requirements, called the "Minimum Standard," only 89 of the 692 hospitals inspected in 1919 met these requirements. The number of accredited hospitals increased steadily, however; by 1950 nearly 3,300 hospitals met the Minimum Standard, which accounted for more than half the hospitals in the United States.5

The Minimum Standard emphasized basic structural characteristics considered to be essential to "safeguard the care of every patient within a hospital" (Roberts et al., 1987, p. 937). It required an organized medical staff of licensed medical school graduates who were competent, worthy in character, and ethical. The medical staff had to develop policies and rules approved by the governing body that governed the professional work of the hospital. The rules had to require medical staff meetings at least monthly and periodic reviews of patient care in each department, based on patient records. The specifications for complete patient medical records were detailed, including condition on discharge, follow-up, and autopsy findings in the case of death. Finally, diagnostic and therapeutic facilities had to include at least a clinical laboratory and X-ray department (the entire Minimum Standard is reproduced in Roberts et al., 1987).

The Minimum Standard had dramatic results (Jost, 1983). By 1935, for example, the proportion of hospitals with organized medical staffs increased from 20 percent to 90 percent. The ACS standards were revised and expanded a number of times over the years. By 1941 an additional 16 standards addressing physical plant, equipment, and administrative organization supplemented the Minimum Standard. Eventually, however, the burden of accrediting several thousand hospitals became too great for the ACS to carry alone. In 1951 it joined with the American College of Physicians, the AHA, and the AMA to form the JCAH (Jost, 1983).6

JCAH carried on the ACS principles for improving health care in hospitals: voluntary private accreditation, minimum health and safety standards based on the consensus of health professionals, and confidential on-site surveys that involved education and consultation as well as evaluation (Roberts et al., 1987). In 1961 JCAH began to hire its own surveyors rather than use ACS and AMA staff, and in 1964 it began to charge a fee for inspections (Jost, 1983). By 1965, when the legislation creating Medicare and Medicaid was passed, JCAH was already accrediting 60 percent of the hospitals (4,308 of 7,123) with 66 percent of the beds (1.13 million of 1.7 million) (AMA, 1966).

Early Government Standards

State licensing programs for hospitals were not common until the early 1950s. Most were stimulated by federal requirements (the link in timing between federal requirements and state regulatory activity is evident from inspecting the tables in Fry, 1965). Fewer than a dozen states had hospital regulations before World War II (Worthington and Silver, 1970). Federal hospital standards were imposed in 1935 for maternity and children's services, under regulatory authority contained in Title V of the Social Security Act (Somers, 1969). In 1946 the Hospital Survey and Construction (Hill-Burton) Act required the states to establish minimum standards for maintaining and operating hospital buildings aided by the act. At that time the AHA, the Public Health Service (PHS), the Council of State Governments, and other organizations sponsored a model hospital licensing law. This model law was adopted in many states, especially after 1950 amendments to the Social Security Act required states using federal matching funds for the payment of health care for welfare recipients to designate an agency to establish and maintain standards for facilities providing the care (Somers, 1969).

In 1964 the Hill-Harris amendments to the Hill-Burton Act required state licensure programs that went beyond building conditions to the administration of services. Nevertheless, in 1965 one state (Delaware) still did not license hospitals, and Ohio and Wisconsin only licensed maternity hospitals and maternity units in general hospitals. Connecticut, on the other hand, had an extensive program for inspecting and licensing hospitals (Foster, 1965). New York and Michigan had just passed the first comprehensive hospital codes that addressed the quality of medical service organization and delivery (Worthington and Silver, 1970).

A series of studies and surveys in the late 1950s and early 1960s also found that the hospital survey programs of the states varied greatly in focus, intensity, and composition of the inspection team (Taylor and Donald, 1957; McNerney, 1962; Foster, 1965; Fry, 1965). Nearly all emphasized fire safety and sanitation, but fewer than 40 looked at nurse staffing and practices and fewer than 30 looked at medical staffing and practices. Just 37 states inspected hospitals annually. Nurses were on inspection teams in only 27 states, and the use of physicians in state licensure programs was rare (Foster, 1965).

Development of the Medicare Conditions of Participation, 1965-1966

The drafters of the Medicare legislation were aware of the variability in the extent and application of state licensure standards.

They knew that several thousand, primarily small rural or proprietary hospitals, with a third of the nation's bed supply, were not in JCAH's voluntary accreditation program. In order to maximize access of beneficiaries to services, they did not want to exclude unaccredited hospitals from participating in the Medicare program. They could not rely, therefore, on licensure or accreditation to ensure minimum health and safety conditions in all hospitals. At the same time, federal policymakers did not want to create a national licensure program with federal inspectors. Accordingly, the Medicare legislation outlined a program in which hospitals and other providers could participate voluntarily if employees of a state health facility inspection agency certified that the providers met certain federal statutory and regulatory requirements or if they were accredited by JCAH or another nationally recognized accreditation organization.

The 1965 amendments to the Social Security Act that established Medicare contained certain minimum requirements for hospitals, including the maintenance of clinical records, medical staff bylaws, a 24-hour nursing service supervised by a registered nurse, utilization review planning, institutional planning and capital budgeting, and state licensure. Hospitals also had to meet any other requirements the Secretary of HEW found necessary in the interest of the health and safety of individuals furnished services in the institution, provided that such other requirements were not higher than the comparable requirements prescribed for the accreditation of hospitals by JCAH. In addition, institutions accredited as hospitals by JCAH were "deemed" by the law to meet federal requirements without additional inspection or documentation (except the legislative requirements for utilization review, psychiatric hospital special conditions, and special requirements for hospitals providing long-term-care services).

The Bureau of Health Insurance (BHI) of the Social Security Administration's Medicare Bureau was responsible for drafting the Conditions of Participation. Staff of the Division of Medical Care Administration in the PHS served as technical advisors, and a task force made up of representatives of major hospitals and health care and consumer organizations participated in the drafting of the conditions (HCFA, personal communication, 1989). Although the opportunity existed to develop model national standards, the efforts were severely constrained by the wording of the law, political and time pressures, the need to rely on state agency surveyors to inspect unaccredited hospitals, and the lack of knowledge about how to measure and achieve quality of medical care (Cashman and Myers, 1967). Except for utilization review, Congress prohibited standards higher than those of JCAH, even though JCAH itself described its 1965 accreditation standards as the minimum ones necessary to assure an acceptable level of quality. Congressmen and administration officials had assured the hospital community since 1961 that JCAH-accredited hospitals would automatically be eligible for participation in Medicare.7

There was tremendous political pressure to deliver Medicare benefits quickly and universally and therefore to involve as many hospitals as possible in order that every Social Security recipient would have access to hospital care (Feder, 1977a, 1977b).8 The conditions and procedures for applying them had to be developed in a few months; the law passed on July 30, 1965, and the conditions were mailed to hospitals at the end of January 1966. The standards could not be too complicated because they had to be applied by state surveyors with widely varying experience and training, who, in most cases, were new to their jobs. Finally, even the best standards of the time were considered to be, at best, merely indicators of the structural and organizational capacity to deliver quality care. In the words of the PHS advisors on the conditions (Cashman and Myers, 1967, p. 1108), ". . . when a provider complies with the standards, it has demonstrated a capacity to furnish a stated level of quality of care. The key element here is that standards define a certain capacity for quality and not quality itself. We assume that, given this capacity, a level of quality will result. And experience informs us that without this capacity, achievement of quality is difficult, if not impossible."

BHI proceeded to draft Conditions of Participation that would be equivalent to those of JCAH. Except for utilization review, the 16 conditions corresponded to the areas covered in JCAH's 1965 hospital accreditation standards. The standards were mostly qualitative and subjective rather than quantitative. For example, they did not specify staffing ratios but referred to "adequate" staffing, "qualified" personnel, and an "effective" staff organization.

Next, procedures had to be worked out by which a number of hospitals that could not meet the standards, at least initially, could participate in Medicare while, hopefully, bringing themselves into compliance (Cashman and Myers, 1967). The solution was the concept of substantial compliance, which meant that a hospital could be certified for participation even if it had significant deficiencies in meeting one or more standards, as long as the significant deficiencies did not interfere with adequate care or represent a hazard to patient health and safety. Meanwhile, the hospital had to develop and make an adequate effort to complete a plan of correction. However, as the starting date of July 1, 1966, approached, the pressure to make the program universal was overwhelming, and there was notable resistance to denying certification to any hospital that could meet the basic statutory requirements, which were embodied in 8 of the 100 standards (Cashman and Myers, 1967). Also, provisions were made for special certification of hospitals in geographically remote areas where denial would have a major impact on the access of beneficiaries to services.9

The federal standard-setters expected and found widely varying state-to-state interpretations of the conditions (Cashman and Myers, 1967).

Of the 2,700 unaccredited hospitals applying by September 30, 1966, less than 8 percent could not meet the conditions according to state surveyors, but the rate of denial recommendations varied from 0 in 18 states to 20 percent or more in 7 states. In all, just 15 percent of the 2,400 unaccredited hospitals that were certified were in compliance without any significant deficiencies. Nearly two-thirds (1,556) were certified with correctable deficiencies, and more than a fifth (545) were not in compliance but were certified in the special categories to ensure access. Some states did not recommend special certification for any hospitals; others recommended special certification for half their hospitals. In all, some 700 hospitals had significant deficiencies in at least 6 of the conditions.10

Given that the federal requirements were minimum standards, the authors of the original Conditions of Participation concluded that future progress would have to take place through innovative leadership by professionals through the accreditation process. They called on professional standard-setters to establish optimal rather than minimum standards for medical care (Cashman and Myers, 1967).

JCAH and Medicare

In 1966, with its standards forming the basis for the hospital Conditions of Participation in the Medicare program, JCAH found that the federal government was "usurping" its traditional role of guaranteeing minimum hospital standards (Roberts et al., 1987). Already, in December 1965, the JCAH board of commissioners had adopted a utilization review standard. In August 1966, JCAH's board of commissioners decided to issue optimum achievable standards rather than minimum essential standards for hospital accreditation. The resulting 1970 Accreditation Manual for Hospitals contained 152 pages of standards, compared with just 10 pages of standards in 1965 (JCAH, 1965, 1971). Meanwhile, however, JCAH went through a period of negative publicity that culminated in legislative changes in 1972 that imposed federal oversight of the accreditation process. In 1969 the Health Insurance Benefits Advisory Council, the advisory group to the Social Security Administration on the implementation of Medicare, criticized JCAH's standards and inspection process in its first report. According to the report, some JCAH standards were too low, the inspection cycle (2 years at that time) was too infrequent, and the surveyors (then just physicians) were too narrowly focused on medical staff and medical record issues. The council recommended that the Secretary of HEW be given authority to set standards higher than those of JCAH and that state agencies be given the authority to inspect accredited hospitals (Health Insurance Benefits Advisory Council, 1969).

In 1969 and 1970, JCAH accredited (but with 1-year provisional certificates) Boston City Hospital, D.C. General Hospital, San Francisco General Hospital, St. Louis City Hospital, and other major urban hospitals despite extensive publicity about serious problems in patient care (Worthington and Silver, 1970). Consumer groups presented JCAH with demands for patient rights and consumer participation in the accreditation process (Silver, 1974). Some groups sued HEW, arguing that the delegation of Medicare certification to the private JCAH was unconstitutional, and legislation was even introduced to establish a federal accreditation commission (Jost, 1983).

In 1972 Congress responded with amendments to the Social Security Act that gave the HEW Secretary the authority to promulgate standards higher than those of JCAH, to conduct inspections of a random sample of accredited hospitals each year, to investigate allegations of substantial deficiencies in accredited hospitals, and, finally, to decertify hospitals that failed to meet federal requirements even though they were accredited. As a result of the first year of validation surveys, 107 of the 163 hospitals inspected by state agencies for HEW lost deemed status for being out of compliance with the Conditions of Participation. The state inspectors found 4,300 deficiencies where JCAH had only found 2,993 contingencies; moreover, only 7 percent of the deficiencies cited by both groups were similar. JCAH and the AHA responded that the discrepancies had more to do with differences in the size and composition of the survey teams and duration of the inspection visit than real differences in hospital conditions (Phillips and Kessler, 1975). For example, more than half of the deficiencies found by state inspectors (2,305) related to the Life Safety Code (LSC), which, JCAH argued, were not significantly related to quality of patient care or safety. In contrast, JCAH surveyors found more deficiencies than state inspectors concerning patient care; that is, in such areas as medical staff, medical records, and radiology. The first annual validation report strongly recommended that JCAH strengthen its capacity to evaluate and enforce fire safety requirements. As a result, JCAH introduced revised fire safety standards and procedures in October 1976.

A study of the situation by the General Accounting Office (GAO, 1979) was more critical of HEW and its loose oversight of state agency operations than of JCAH. The GAO found that JCAH was finding more violations of requirements identified as essential by HEW and obtaining faster compliance, although state agency surveyors often found some deficiencies that JCAH did not. The GAO report concluded that state survey results were less reliable and had less impact than those of JCAH because HEW guidelines for compliance were inadequate and federal specifications for survey team composition and training and survey duration were too weak to ensure consistency. Among alternatives for improving the certification process, GAO gave its highest recommendation to contracting with JCAH for the conduct of all certification surveys, subject to validation by federal surveyors, because "this arrangement would provide a better, more consistent evaluation of hospitals and eliminate the problems associated with having more than 50 independent decision makers" (GAO, 1979, p. 31).

The discrepancies between JCAH and state agency surveys were much reduced with the introduction of the Fire Safety Evaluation System (FSES), a system for evaluating alternative ways of meeting the intent of the LSC. The FSES was developed for HEW by the National Bureau of Standards. Although more recent annual reports on validation surveys continued to recommend improvements in JCAH surveying of the LSC, they concluded that JCAH's surveying of accredited hospitals is "equivalent" to state agency surveying of unaccredited hospitals (DHHS, 1988). For example, the proportion of JCAH-accredited hospitals subject to validation surveys that was found out of compliance with one or more conditions was 20 percent in fiscal year (FY) 1982, 15 percent in FY 1983, 20 percent in FY 1984, and 29 percent in FY 1985, compared with an average of 25 percent among unaccredited hospitals (Table 7.4). Also, the proportion of noncompliance with each condition is similar for accredited and unaccredited hospitals (Table 7.5).

In other words, HCFA has concluded that compliance with the Conditions of Participation is about the same in accredited and unaccredited hospitals.12 This does not, however, preclude the possibility that Joint Commission accreditation has a greater positive impact on quality of patient care than the federal-state survey and certification program, because in recent years, as will be seen below, the former's standards have been higher and much more detailed with regard to quality assurance processes than the conditions.

TABLE 7.4 Noncompliance of Joint Commission on Accreditation of Hospitals (JCAH)-Accredited and Unaccredited Hospitals with One or More Medicare Conditions of Participation, Fiscal Year 1985

                      Medicare-Certified Hospitals
                      JCAH-Accredited Hospitals     Unaccredited Hospitals Surveyed by State Agencies
                      Number      Percentage        Number      Percentage
In compliance         328         70.7              1,168       75.6
Out of compliance     136         29.3              377         24.4
Total                 464         100.0             1,545       100.0

NOTE: The JCAH-accredited, Medicare-certified hospitals surveyed by state agencies included 66 randomly selected for validation purposes and 398 hospitals surveyed on the basis of allegations of serious deficiencies that could affect the health and safety of patients.

SOURCE: DHHS, 1988.

TABLE 7.5 Noncompliance of Joint Commission on Accreditation of Hospitals (JCAH)-Accredited and Unaccredited Hospitals by Medicare Condition of Participation, Fiscal Year 1985

                             Noncompliant, JCAH-Accredited,      Noncompliant, Unaccredited,
                             Certified Hospitals                 Certified Hospitals
Condition of Participation   Frequency    Percentage (a)         Frequency    Percentage (b)
State and local law          2            3.0                    64           4.0
Governing body               2            3.0                    58           4.0
Physical environment         2            3.0                    81           5.0
Medical staff                4            6.0                    69           4.0
Nursing                      2            3.0                    92           6.0
Dietary                      0            0.0                    32           2.0
Medical record               3            5.0                    56           3.0
Pharmacy                     3            5.0                    77           5.0
Laboratories                 3            5.0                    89           6.0
Radiology                    0            0.0                    18           1.0
Complementary                2            3.0                    21           1.0
Outpatient                   0            0.0                    23           1.0
Emergency                    3            5.0                    88           6.0
Social work                  0            0.0                    15           1.0

(a) Of the total of 66 JCAH-accredited, Medicare-certified hospitals that were randomly selected to be surveyed by state agencies for compliance with the Medicare Conditions of Participation in fiscal year 1985.
(b) Of the total of 1,545 unaccredited, Medicare-certified hospitals that were surveyed by state agencies for compliance with the Medicare Conditions of Participation in fiscal year 1985.

SOURCE: DHHS, 1988.

Despite the drastic revision and expansion of the accreditation standards in 1970, the JCAH standards still emphasized the structure and process features of hospital organization and administration that were believed to create the capacity to deliver quality patient care rather than evaluating the hospital's actual performance (JCAHO, 1987). In the early 1970s, aware of criticism of the emphasis on organizational and clinical capacity rather than actual performance (Somers, 1969), and stimulated by the advent of the Professional Standards Review Organizations (PSROs) with their mandate to review quality of care, JCAH began to emphasize the medical audit as the mechanism for assuring quality of care and to specify the use of explicit criteria and formal procedures in place of the informal and subjective review processes already presumed to take place at the monthly medical staff and department meetings required since 1918 (Roberts and Walczak, 1984).

For example, JCAH sponsored the development of PEP, the Performance Evaluation Procedure for Auditing and Improving Patient Care, an elaborate medical audit system that was taught in workshops for accredited hospitals (JCAH, 1975; Jacobs et al., 1976). The PEP methodology was based on several decades of efforts to develop objective methods of appraising clinical performance through retrospective auditing of medical charts using explicit criteria (Sanazaro, 1980).

In 1976 a new section of the accreditation manual for hospitals on quality of professional services called for a certain number of medical audits depending on hospital size, but it soon became apparent that the methodology was being applied mechanistically with little impact on medical practice. Meanwhile, JCAH survey results indicated that surgical case review, drug and blood utilization review, and review of appointments and reappointments by the medical staff were subjective and informal and often ineffective in finding or resolving patient care and clinical performance problems (Affeldt et al., 1983).

In 1979, JCAH dropped numerical medical audit requirements and introduced a new quality assurance standard in a separate chapter of the accreditation manual. The new standard required the development of a hospitalwide program that not only identified specific problems in patient care and clinical performance but documented attempts to resolve them. Since 1979 the accreditation manual for hospitals has undergone substantial change in an effort to incorporate quality assurance activities in each clinical activity of a hospital. The revised standards are analyzed and recent efforts to develop explicit indicators of clinical and organizational performance are described in later sections of this chapter.

Evolution of the Hospital Conditions of Participation, 1966-1986

The final regulations on the original Conditions of Participation that were promulgated in late 1966 were basically the same as those issued earlier in the year, except they accorded deemed status to hospitals accredited by the AOA. Those regulations included 16 conditions, broken down into about 100 standards and several hundred explanatory factors (Table 7.6). The conditions were criticized from the beginning for only looking at the capacity of a hospital to provide adequate quality of care rather than its actual performance or effect on patient well-being. Nevertheless, the conditions were not revised in a significant way for 20 years.

Generally, the conditions in effect from 1966 until 1986 emphasized structure over process measures of organizational and clinical capacity, such as staff qualifications, written policies and procedures, and committee structure, which were usually specified at the standard level.

TABLE 7.6 Medicare Conditions of Participation for Hospitals, 1965

1. Compliance with state and local laws
2. Governing body
3. Physical environment
4. Medical staff
5. Nursing department
6. Dietary department
7. Medical record department
8. Pharmacy or drug room
9. Laboratories
10. Radiology department
11. Medical library
12. Complementary departments (surgery; anesthesia; dentistry and dental staff; rehabilitation, physical therapy, and occupational therapy)
13. Outpatient department
14. Emergency service or department
15. Social work department
16. Utilization review plan

The process aspects of quality-of-care standards were usually suggested as explanatory factors that could be used to evaluate compliance with the standard. For example, there was no quality-of-care or quality assurance condition or standard. Instead, the medical staff condition had a meetings standard, calling for regular meetings of the medical staff to review, analyze, and evaluate the clinical work of its members, using an adequate evaluation method. The explanatory factors that surveyors were supposed to use to determine compliance with the standard included attendance records at staff or departmental meetings and minutes that showed reviews of clinical practice at least monthly. The reviews were supposed to consider selected deaths, unimproved cases, infections, complications, errors in diagnosis, results of treatment, and review of transfusions, based on the hospital statistical report on admissions, discharges, clinical classifications of patients, autopsy rates, hospital infections, and other pertinent hospital statistics. The minutes were also supposed to contain short synopses of the cases discussed, the names of the discussants, and the duration of the meeting.

In the 1970s there were several unsuccessful efforts by the government to revise the conditions. In 1977, HCFA developed specifications for revising the Conditions of Participation and invited comments from interested parties in the Federal Register. After considering more than 2,000 comments, HCFA published draft revised conditions in the Federal Register in 1980 (Federal Register, 1980, p. 41794).

310 MICHAEL G. H. McGEARY

1980 (Federal Register, 1980, p. 41794). Generally, the new conditions proposed in 1980 would have eliminated a number of prescriptive requirements, especially those specifying personnel credentials and certain committees of the governing board and medical staff, replacing them with statements of the functions to be performed. The new conditions also recognized changes in medical practice by adopting JCAH definitions and standards in new conditions for nuclear medicine; for rehabilitative, respiratory, and psychiatric services; and for special care units.

The proposed 1980 regulations also included a new standard, Quality Assurance, in the governing body condition. The new standard would have required a hospitalwide quality assurance program involving the medical staff in peer review and requiring performance evaluations by each organized service.

Although the Reagan administration withdrew the proposed new Conditions of Participation for hospitals when it took office in January 1981, they were among the top five sets of regulations addressed by the Vice President's task force on deregulation. A committee of top political appointees and career staff in HCFA reviewed the Conditions of Participation line by line, developing detailed worksheets analyzing each condition and standard in terms of its statutory basis, pertinent public comments on the proposed 1980 regulations, and, in the several cases where they existed, research findings.13

The revised conditions that were proposed in 1983 (Federal Register, 1983, p. 299) and finalized in 1986 (Federal Register, 1986, p. 22010) were based in part on those proposed in 1980, although, in line with the Reagan administration's emphasis on deregulation, the resulting regulations carried further the process of eliminating prescriptive requirements specifying credentials or committees, departments, and other organizational arrangements. They were replaced with more general statements of desired performance or outcome in order to increase administrative flexibility (see statements on the proposed and final regulations in the Federal Registers cited above). On the other hand, the activities proposed for elevation to the condition level in 1980 to give them more emphasis in the certification process were retained as new conditions, including infection control and surgical and anesthesia services. In addition, quality assurance was made a separate condition. The possible impact of the new condition on quality of care is analyzed in a later section of this chapter, along with the JCAH quality assurance standards.

The new Conditions of Participation took effect on September 15, 1986. They were accompanied by interpretive guidelines and detailed survey procedures developed by HCFA to increase consistency of interpretation and application by the state agency surveyors (HCFA, 1986). Use of the new quality assurance condition as a basis for decertification was delayed for a

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 311

year. The state inspectors did survey the condition, however. After the first 2 years, 128 (9 percent) of the 1,420 hospitals surveyed were found to be out of compliance with the new quality assurance condition (data supplied by HSQB). The states with the most hospitals failing this condition were Texas, with 23 (15 percent) of its 150 unaccredited hospitals, and Montana, with 10 (23 percent) of its 43 unaccredited hospitals. Other states with smaller numbers of unaccredited hospitals had higher rates of noncompliance: 6 of 10 in South Carolina; 2 of 4 in Virginia; and 1 of 3 in New Jersey.

MEDICARE CERTIFICATION AND JOINT COMMISSION ACCREDITATION STANDARDS AND PROCEDURES FOR ASSURING QUALITY OF PATIENT CARE IN HOSPITALS

Although one is governmental and the other private, both HCFA and the Joint Commission are regulatory in their approach. Each attempts to assure quality of care by influencing individual and institutional behavior. As in any regulatory system, quality assurance in health delivery organizations has three components (IOM, 1986). First, standards have to be set that relate to quality of care. Second, the extent of compliance of hospitals with the standards must be monitored. Third, procedures for enforcing compliance are necessary. The HCFA and Joint Commission standards and their procedures for monitoring and enforcing compliance with the standards are described, analyzed, and compared in this section.

Standards

In 1966, at the time the Conditions of Participation were first drafted, Donabedian (1966) identified three aspects of patient care that could be measured in assessing the quality of care: structure, process, and outcome. Theoretically, structure, process, and outcome are related, and, ideally, a good structure for patient care (e.g., safe and sanitary buildings, necessary equipment, qualified personnel, and properly organized staff) increases the likelihood of a good process of patient care (e.g., the right diagnosis and best treatment available), and a good process increases the likelihood of a good outcome (e.g., the highest health status possible) (Donabedian, 1988).

Structure and Process Orientation of Hospital Standards

The original conditions of 1966, and the JCAH standards they were based on, were almost exclusively based on structural aspects of patient care, because structural measures are the easiest for standard-setters to specify, for surveyors to assess, and for enforcers to use in justifying their actions.

312 MICHAEL G. H. McGEARY Unfortunately, there is very little knowledge about the relations between structural characteristics and process features or outcomes of care. What knowledge exists on the relations between structure and process indicates that they are weak (Palmer and Reilly, 1979; Donabedian, 1985~. At best, then, the use of structurally oriented standards ensures that care is given in an environment that is conducive to good care (Donabedian, 1988~. Not meeting minimum structural standards may make it impossible to provide good care. Thus, structural standards may be necessary, but they are far from sufficient guarantors of good care. Clinical decision making is very complex, and, despite the development of complex clinical decision-making algorithms for assessing quality (Green- field et al.' 1975, 1977, 1981), it has proved to be difficult to develop objective criteria for assessing the quality of clinical processes in particular cases. In some instances, something is known about the relations between clinical processes and clinical outcomes, for example, where properly con- trolled experiments have been conducted. In most instances, however, stan- dards for best clinical practices are based on professional consensus, even though the relations between clinical practices considered by professional consensus to be best and favorable outcomes are generally weak (Schroe- der, 1987~. Outcome-based standards are the most difficult to apply or justify. Con- sider, for example, a standard that stated that the death rate should be no more than X percent during a specified time period among patients who had a particular diagnosis or who underwent a particular procedure. Because a number of factors influence death rates besides the clinical setting and processes used, death rates would have to be carefully adjusted for initial severity of illness and other case-mix differences before they could be used in setting regulatory standards. In any case, for compliance and enforce- ment purposes, outcome measures such as death rates, however adjusted, would have to be followed by assessment and documentation of the pro- cesses used in particular cases that caused the adverse outcomes. Both HCFA and the Joint Commission are severely constrained in their efforts to assure quality of care in hospitals or other health care organiza- tions by this fundamental lack of knowledge about relations between the aspects of care that can be most easily regulated (such as building specifica- tions, staff credentials, regular committee meetings, complete medical rec- ords, written quality assurance plans, and number of medical care audits) and those aspects of patient care that pertain more directly to quality (such as how well each patient is treated, how each patient's health status is affected by the care provided, or how the health status of the population served is being affected by a hospital's services). Traditionally, given these limitations, HCFA and the Joint Commission standard-setters did not try to assess the quality of care actually given.

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 313 Instead, they adopted standards that, if met, would indicate that a hospital had the capacity to provide a minimum level of quality of care. Both sets of standards have always included standards for the construction, mainte- nance, and safe operation of hospital buildings. Currently, for example, compliance with the 1981 LSC and infection control standards (elevated to a Medicare Condition of Participation in 1986) are required. Both sets of standards require an organized medical staff and appointment of a hospital administrator, although the requirements have become less prescriptive over the years. For example, rather than require certain committees or creden- tials, the standards specify the functions that must be carried out. By and large, these capacity-oriented standards are based on professional consensus, although some are based on research. The LSC is a set of consensus-based standards for fire safety developed by the National Fire Protection Association. Infection control was raised to a condition in 1986, in part because of research by the Centers for Disease Control showing that 5 percent of patients in acute care hospitals contracted nosocomial infec- tions, necessitating several days of additional hospitalization at a cost of $1 billion a year (Federal Register, 1983, p. 3039. The requirements that the medical staff be organized under bylaws and that the medical staff and hospital administrator be accountable to a governing body were retained in the 1986 revision of the conditions in part because of research indicating that medical care is better in well-organized and supervised hospitals (HCFA Task Force, 1982~. Shift from Capacity Standards to Performance Standards In recent years, HCFA and the Joint Commission have tried to revise their standards in ways that would impel hospitals to examine and, hope- fully, improve the quality of their organizational and clinical performance. Thus, for example, both organizations have adopted quality assurance stan- dards that call for hospitals to set up structures and processes for monitor- ing patient care, identifying and resolving problems, and evaluating the impact of quality assurance activities. Under these standards, the medical staff is required to develop or adopt indicators of quality of care, gather information on the indicators, select criteria for deciding when an indicator is signaling a possible problem, and act on those signals. The Joint Commission calls these quality assurance activities "outcome- oriented,~' although the main emphasis of the new standards is to make hospitals adopt processes for monitoring indicators of the quality of their performance. Only a few of the indicators are likely to be outcomes, and those are most likely to be intermediate outcomes. For example, a radiol- ogy department might agree that the accuracy of upper gastrointestinal (GI) contrast studies is an important indicator of quality (JCAH, 1986~. Data

314 MICHAEL G. H. McGEARY

from the records of 20 percent of the department's patients would be collected monthly and aggregated by the radiologist and physician ordering an upper GI series, to determine whether or not the criteria for upper GI series are being met. Some of the criteria might be: 100 (or 98) percent of the requisitions for upper GI series contain the pertinent history, physical findings, and suspected diagnosis, or that radiologic interpretations shall be consistent with endoscopic findings 100 (or 97) percent of the time. Other indicators (for other departments or hospitalwide) might be: hospital-acquired infections, severe adverse drug reactions, agreement of final pathology diagnoses with patients' previous diagnoses, or transfer of patients from postsurgical recovery units to operating rooms (JCAHO, 1988c).

Evolution of the Joint Commission's Quality Assurance Standards

The shift from prescriptive to performance-oriented standards began at JCAH in 1978, when the board of commissioners decided to replace the numerical medical audit requirement with a new quality assurance standard that mandated an ongoing, hospitalwide effort to monitor care, identify problems or ways to improve care, and resolve any problems (Affeldt et al., 1983). The new quality assurance program was to involve all departments and services, not just a quality assurance unit. It was to be problem-focused rather than mindlessly collecting vast quantities of data for their own sake, which the old medical audit standard had encouraged. The new standard was approved in 1979 but not implemented until 1981, to give hospitals time to develop systematic quality assurance programs.

In 1981 the JCAH board voted to revise all the hospital standards by 1983 according to five principles (JCAH, 1981):

1. The standards would be essential ones that any hospital should meet.
2. The standards should be statements of objectives, leaving the means to achieve their intent to the discretion of individual hospitals.
3. The standards should focus on elements essential to high-quality patient care, including the environment in which that care is given.
4. The standards must be reasonable and surveyable.
5. The standards should reflect the current state of the art.

The standards for governing bodies, medical staffs, management and administrative services, medical records, and quality and appropriateness review for support services were revised first. Despite the intention to simplify the standards and make them less prescriptive and more goal-oriented, the revision process ended up involving substantial expansion and formalization of quality assurance activities in each chapter of the hospital accreditation manual, including an increasing specification of processes needed to achieve the objectives of JCAH's new quality assurance standard.

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 315

In 1981 the new quality assurance chapter of the hospital accreditation manual had one standard: There shall be evidence of a well-defined, organized program designed to enhance patient care through the ongoing objective assessment of important aspects of patient care and the correction of identified problems. According to a standard in the governing body chapter, the governing body was to hold the medical staff responsible for establishing quality assurance mechanisms. One of the medical staff standards required regular review, evaluation, and monitoring of the quality and appropriateness of patient care provided by each member of the medical staff as well as surgical case (tissue) review, review of pharmacy and therapeutic activities, review of medical records, blood utilization review, review of the clinical use of antibiotics, and participation in hospitalwide functions such as infection control, safety and sanitation, and utilization review.

In 1984 uniform language for the monitoring and evaluation of quality and appropriateness of care was added into each of 14 chapters on specific clinical services, e.g., anesthesia, nursing, radiology, and social work services: "As part of the hospital's quality assurance program, the quality and appropriateness of patient care provided by the X department/service are monitored and evaluated, and identified problems are resolved" (JCAH, 1983, p. 6). The required characteristics of an acceptable process for carrying out the standard included: designation of the department head as responsible for the process, routine collection of data about important aspects of the care provided, periodic assessment of the data to identify problems or opportunities to improve care, use of objective criteria that reflect current knowledge and clinical experience, taking actions to address problems and documenting and reporting them to the hospitalwide quality assurance program, and, finally, evaluating the impact of the actions taken (JCAH, 1983).

In 1984, after four field reviews of several drafts, revised medical staff standards were included in the hospital accreditation manual but not used for accreditation decisions until 1985. The standard for medical staff monitoring and evaluation of the quality and appropriateness of patient care now included departmental review of the clinical performance of all individuals with clinical privileges and went on to specify the same required characteristics included in the other chapters on clinical services (JCAH, 1984a).

In 1985 the quality assurance chapter was revised to add three standards. The second standard codified the monitoring and evaluation functions already specified in the medical staff chapter and in each of the chapters on other services. It mandated certain hospitalwide activities (infection control, utilization control, and review of accidents, injuries, and safety hazards) and required that the relevant findings of quality assurance activities be considered in the reappraisal or reappointment of medical staff members and renewal of clinical privileges of independent practitioners. The third standard required the use of the same steps for carrying out monitor

316 MICHAEL G. H. McGEARY

ing and evaluation activities already listed as required characteristics in each of the clinical chapters in the 1984 manual. The fourth standard called for hospitalwide coordination and oversight of quality assurance activities (JCAH, 1984b) (see Table 7.7).

By 1985, then, an elaborate set of quality assurance processes had evolved as standards and required characteristics in every chapter of the hospital accreditation manual. These processes are aimed at making hospitals, through their medical staffs, review and assess the quality of care given by each person with clinical privileges and in each clinical department and act on the problems or opportunities that are identified. Most hospitals, however, have had significant problems complying with the standards. As already noted, the quality assurance standard adopted in 1979 was not implemented until 1981. Even then, hospitals only had to comply with the first three steps: assignment of authority and responsibility for quality assurance activities to a specific individual or group; progress in coordinating existing quality assurance mechanisms; and a written plan (JCAH, 1981). In 1982 more than 60 percent of the 12,000 contingencies given by JCAH to the 1,150 hospitals surveyed were for quality assurance problems. The proportion of hospitals with contingencies or recommendations for credentialing was 63 percent and for surgical case review was 45 percent (Roberts and Walczak, 1984).

Despite compliance problems, JCAH increased the level of compliance required with the quality assurance standard during 1983, requiring evidence that quality assurance information was being integrated, that patient care problems were being identified through the monitoring and evaluation activities of the medical staff and support services, and that the problems were being resolved (JCAH, 1982). Medical staff quality assurance activities still accounted for a large proportion of the contingencies and recommendations given in 1984, in areas such as the following: monthly department meetings to consider monitoring and evaluation findings (46 percent of hospitals surveyed); medical staff monitoring and evaluation actions are documented and reported (44 percent); and, when important problems in patient care or opportunities to improve care are identified, problems are resolved (32 percent) (Longo et al., 1986).

In 1985, JCAH introduced implementation monitoring, by which certain standards would be surveyed and recommendations made, but lack of compliance would not affect accreditation decisions. JCAH explained that some changes in standards were taking more than 3 years for full implementation because they were difficult for hospitals to meet and required more time for learning (and for education of surveyors) (JCAH, 1985). Not surprisingly, most of the standards placed on implementation-monitoring status initially, from January 1986 through June 1987, pertained to quality assurance: some parts of medical staff departmental monitoring and evaluation, use of medical

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 317

staff quality assurance findings, and quality and appropriateness review in support services.

In early 1988 the Joint Commission again eased implementation of the quality assurance standards. It no longer gave contingencies if hospitals were using only generic rather than department-specific indicators in monitoring and evaluating the quality and appropriateness of care in the various departments and services. The explanation for the change in contingency policies referred to the problems the Joint Commission itself had encountered in developing quality indicators for various types of care: "As the Agenda for Change activities have moved forward, it has become evident that the clinical literature does not provide sufficient information to permit health care organizations to select a full set of validated indicators for each area of clinical practice" (JCAHO, 1988b, p. 5).

The problems that many hospitals were having in complying with the Joint Commission standards for outcome-oriented monitoring and evaluating quality of care were part of the impetus for the Joint Commission effort, called the Agenda for Change, to develop indicators of organizational and clinical performance for the hospitals to use (JCAHO, 1988c, 1988d, 1988e). The data on such indicators would be transmitted by each hospital to the Joint Commission for use in developing empirical norms for hospitals to use in comparing their performance. Eventually, such indicator data could be used by the Joint Commission for monitoring compliance with accreditation standards.

Development of the Quality Assurance Condition of Participation

The quality assurance condition implemented in late 1986 by HCFA is similar in approach to, although less elaborate than, the Joint Commission's quality assurance standards. The task force of HCFA officials that developed the revised conditions in 1981-1982 consciously tried to make the new requirements consistent with JCAH standards. In the preface of the task force's recommendations, HCFA noted that the conditions had been similar to JCAH standards in 1966 but no longer were; JCAH had revised and updated its standards continuously while Medicare had not. The task force stated: "Another recent consideration is the movement toward providing hospitals with greater flexibility in determining how they can best assure the health and safety of patients. The current regulations are, in many cases, overly prescriptive and not sufficiently outcome oriented. This trend toward increased internal hospital accountability has been reflected in recent revisions to JCAH standards" (HCFA Task Force, 1982).

Task force members agreed that a quality assurance program aimed at the identification and correction of patient care problems should be a condition because it was important and cut across all aspects of direct patient

318 MICHAEL G. H. McGEARY

TABLE 7.7 Joint Commission on Accreditation of Healthcare Organizations Quality Assurance Standards for Hospitals

Standard QA.1: There is an ongoing quality assurance program designed to objectively and systematically monitor and evaluate the quality and appropriateness of patient care, pursue opportunities to improve patient care, and resolve identified problems.

Required characteristics

QA.1.1 The governing body strives to assure quality patient care by requiring and supporting the establishment and maintenance of an effective hospitalwide quality assurance program.

QA.1.2 Clinical and administrative staff monitor and evaluate the quality and appropriateness of patient care and clinical performance, pursue identified problems, and report information to the governing body that the governing body needs to assist it in fulfilling its responsibility for the quality of patient care.

QA.1.3 There is a written plan for the quality assurance program that describes the program's objectives, organization, scope, and mechanisms for overseeing the effectiveness of monitoring, evaluation, and problem-solving activities.

QA.1.4 There are operational linkages between the risk management functions related to the clinical aspects of patient care and safety and quality assurance functions.

QA.1.5 Existing information from risk management activities that may be useful in identifying clinical problems and/or opportunities to improve the quality of patient care is accessible to the quality assurance function.

Standard QA.2: The scope of the quality assurance program includes at least the activities listed in Required Characteristics QA.2.1 through QA.2.5.3 and described in other chapters of this Manual.

Required characteristics

QA.2.1 The following medical staff functions are performed:

QA.2.1.1 The monitoring and evaluation of the quality and appropriateness of patient care and clinical performance of all individuals with clinical privileges through

QA.2.1.1.1 monthly meetings of clinical departments or major clinical services (or the medical staff, for a nondepartmentalized medical staff) to consider findings from the ongoing monitoring activities of the medical staff;

QA.2.1.1.2 surgical case review;

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 319

QA.2.1.1.3 drug usage evaluation;

QA.2.1.1.4 the medical record review function;

QA.2.1.1.5 blood usage review;

QA.2.1.1.6 the pharmacy and therapeutics function.

QA.2.2 The quality and appropriateness of patient care in at least the following services are monitored and evaluated:

QA.2.2.1 Alcoholism and other drug dependence services, when provided;
QA.2.2.2 Diagnostic radiology services;
QA.2.2.3 Dietetic services;
QA.2.2.4 Emergency services;
QA.2.2.5 Hospital-sponsored ambulatory care services;
QA.2.2.6 Nuclear medicine services;
QA.2.2.7 Nursing services;
QA.2.2.8 Pathology and medical laboratory services;
QA.2.2.9 Pharmaceutical services;
QA.2.2.10 Physical rehabilitation services;
QA.2.2.11 Radiation oncology services;
QA.2.2.12 Respiratory care services;
QA.2.2.13 Social work services;
QA.2.2.14 Special care units; and
QA.2.2.15 Surgical and anesthesia services.

QA.2.3 The following hospitalwide functions are performed:

QA.2.3.1 Infection control;
QA.2.3.2 Utilization review; and
QA.2.3.3 Review of accidents, injuries, patient safety, and safety hazards.

QA.2.4 The quality of patient care and the clinical performance of those individuals who are not permitted by the hospital to practice independently are monitored and evaluated through the mechanisms described in Required Characteristics QA.2.1 through QA.2.3.3 or through other mechanisms implemented by the hospital.

TABLE 7.7 continues

320 MICHAEL G. H. McGEARY

TABLE 7.7 Continued

QA.2.5 Relevant findings from the quality assurance activities listed in Required Characteristics QA.2.1 through QA.2.3.3 are considered as part of

QA.2.5.1 the reappraisal/reappointment of medical staff members;

QA.2.5.2 the renewal or revision of the clinical privileges of individuals who practice independently; and

QA.2.5.3 the mechanisms used to appraise the competence of all those individuals not permitted by the hospital to practice independently.

Standard QA.3: Monitoring and evaluation activities, including those described in Standard QA.2, Required Characteristics QA.2.1 through QA.2.4, reflect the activities described in this standard, Required Characteristics QA.3.1 through QA.3.4.

Required Characteristics

QA.3.1 There is ongoing collection and/or screening of, and evaluation of information about, important aspects of patient care to identify opportunities for improving care and to identify problems that have an impact on patient care and clinical performance.

QA.3.1.1 Such information is collected and/or screened by a department/service or through the overall quality assurance program.

QA.3.2 Objective criteria that reflect current knowledge and clinical experience are used.

QA.3.2.1 Each department/service participates in

QA.3.2.1.1 the development and/or application of criteria relating to the care or service it provides; and

QA.3.2.1.2 the evaluation of the information collected in order to identify important problems in, or opportunities to improve, patient care and clinical performance.

QA.3.3 The quality of patient care is improved and identified problems are resolved through actions taken, as appropriate,

QA.3.3.1 by the hospital's administrative and supervisory staffs; and

QA.3.3.2 through medical staff functions, including

QA.3.3.2.1 activities of the executive committee,

QA.3.3.2.2 activities of departments/services,

QA.3.3.2.3 the delineation and renewal or revision of clinical privileges, and

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 321

QA.3.3.2.4 the enforcement of medical staff or department rules and regulations.

QA.3.4 The findings, conclusions, recommendations, actions taken, and results of actions taken are documented and reported through channels established by the hospital.

Standard QA.4: The administration and coordination of the hospital's overall quality assurance program are designed to assure that the activities described in Required Characteristics QA.4.1 through QA.4.5 are undertaken.

Required characteristics

QA.4.1 Each of the monitoring and evaluation activities outlined in Standards QA.2 and QA.3 is performed appropriately and effectively.

QA.4.2 Necessary information is communicated among departments/services when problems or opportunities to improve patient care involve more than one department/service.

QA.4.3 The status of identified problems is tracked to assure improvement or resolution.

QA.4.4 Information from departments/services and the findings of discrete quality assurance activities are used to detect trends, patterns of performance, or potential problems that affect more than one department/service.

QA.4.5 The objectives, scope, organization, and effectiveness of the quality assurance program are evaluated at least annually and revised as necessary.

SOURCE: JCAH, 1984b.

care. The task force suggested three minimal standards: (1) the organized, hospitalwide quality assurance program must be ongoing and have a written plan of implementation; (2) the hospital must take appropriate remedial action to address any deficiencies found; and (3) there must be evaluations of all organized services and of nosocomial infections, medicine therapy, and tissue removal.

The new quality assurance condition as finally promulgated calls for a formal, ongoing, hospitalwide program that evaluates all patient care services (Table 7.8), although the explicit references to nosocomial infections, medicine therapy, and tissue removal were dropped. The interpretive guide

322 MICHAEL G. H. McGEARY

TABLE 7.8 Medicare's Quality Assurance Condition of Participation

Condition of Participation: Quality Assurance (QA)

The governing body must ensure that there is an effective, hospital-wide QA program to evaluate the provision of patient care.

Interpretive guidelines: The condition requires that each hospital develop its own QA program to meet its needs. The methods used by each hospital for self-assessment (QA) are flexible. There are a wide variety of techniques used by hospitals to gather information to be monitored. These may include document-based review (e.g., review of medical records, computer profile data, continuous monitors, patient care indicators or screens, incident reports, etc.); direct observation of clinical performance and of operating systems; and interviews with patients and/or staff. The information gathered by the hospital should be based on criteria and/or measures generated by the medical and professional/technical staffs and reflect hospital practice patterns, staff performance, and patient outcomes.

(a) Standard: Clinical Plan. The organized hospital-wide QA program must be ongoing and have a written plan of implementation.

Interpretive guidelines: Ongoing means that there is a continuous and periodic collection and assessment of data concerning the important aspects of patient care. Assessment of such data enables areas of potential problems to be identified and indicates additional data which should be collected and assessed in order to identify whether a problem exists. The QA program must provide the hospital with findings regarding quality of care.

The QA plan should include at least the following: program objectives; organization involved; hospital-wide in scope; all patient care disciplines involved; description of how the program will be administered and coordinated; methodology for monitoring and evaluating the quality of care; ongoing; setting of priorities for resolution of problems; monitoring to determine effectiveness of action; oversight responsibility reports to governing body; documentation of the review of its own QA plan.

(1) All organized services related to patient care, including services furnished by a contractor, must be evaluated.

Interpretive guidelines: "All organized services" means all services provided to patients by staff accountable to the hospital through employment or contract. All patient care services furnished under contract must be evaluated as though they were provided by hospital staff. This means that all patient services must be evaluated as part of the QA program, that is: dietetic services; medical records; medical staff (care appropriateness and quality of diagnosis and treatment); laboratory service;

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 323

nursing service; pharmaceutical service; radiology service; and hospital-wide functions (infection control, utilization review [for hospitals under PRO review this requirement does not apply], and discharge planning programs). If the hospital offers these optional services, they must also be evaluated: anesthesia services; emergency services; nuclear medicine services; outpatient services; psychiatric services; rehabilitation services; respiratory services; surgical services. Each department or service should address: patient care problems; cause of problems; documented corrective actions; monitoring or follow-up to determine effectiveness of actions taken.

(2) Nosocomial infections and medication therapy must be evaluated.

(3) All medical and surgical services performed in the hospital must be evaluated as they relate to appropriateness of diagnosis and treatment.

Interpretive guidelines: All services provided in the hospital must be periodically evaluated to determine whether an acceptable level of quality is provided. The services provided by each practitioner with hospital privileges must be periodically evaluated to determine whether they are of an acceptable level of quality and appropriateness.

(b) Standard: Medically-related patient care services. The hospital must have an ongoing plan, consistent with available community and hospital resources, to provide or make available social work, psychological, and educational services to meet the medically-related needs of its patients. The hospital also must have an effective, ongoing discharge planning program that facilitates the provision of follow-up care.

Interpretive guidelines: To be considered effective, the discharge planning program must result in each patient's record being annotated with a note regarding the nature of post-hospital care arrangements.

(1) Discharge planning must be initiated in a timely manner.

(2) Patients, along with necessary medical information, must be transferred or referred to appropriate facilities, agencies, or outpatient services, as needed, for follow-up or ancillary care.

(c) Standard: Implementation. The hospital must take and document appropriate remedial action to address deficiencies found through the QA program. The hospital must document the outcome of the remedial action.

SOURCE: HCFA, 1986.

324 MICHAEL G. H. McGEARY lines state that information gathered by the hospital to monitor and evaluate the provision of patient care should be based on criteria and measures gen- erated by the medical and professional staffs and reflect hospital practice patterns, staff performance, and patient outcomes. The term outcome does not appear in the language of the conditions or standards, however, because the majority of the task force did not think that outcome measures could be used in the survey process. The discussion in the task force report of the new condition pointed out that outcomes were difficult to use because of the differences in the pre-operative condition of patients. Although out- come measures were desirable, because they promised maximum flexibility to hospitals, they were difficult to assess without undertaking longitudinal studies beyond the given episode of care, which would be too cumbersome for hospitals and surveyors and difficult to use in enforcement. One objective of the 1986 revision of the Conditions of Participation was simplification of the regulations, and overlapping language in different conditions was usually eliminated. Accordingly, the monitoring and evalu- ation activities in each department and service implied by the quality assur- ance condition are not repeated under the other conditions, whereas the appropriate quality assurance standards are repeated in the various chapters of the Joint Commission's hospital accreditation manual and are cross-ref- erenced with the quality assurance chapter. There are few other references to quality in the other conditions. However, the governing body condition has a standard for ensuring that the medical staff is accountable for the quality of patient care, and the medical staff condition has a parallel stan- dard: The medical staff must be well organized and accountable to the governing body for the quality of the medical care provided to the patients. The interpretive guidelines for the medical staff condition also require that periodic appraisals of staff include information on competence from the quality assurance program. The only other reference to the quality assur- ance program outside the quality assurance condition itself is in the infec- tion control condition, where a standard assigns responsibility to the chief executive officer, medical staff, and director of nursing services to assure that hospitalwide quality assurance and training programs address problems identified by the infection control officers. The 1986 revisions of the Conditions of Participation, including the new quality assurance condition, were based in part on work done in the late 1970s and very early 1980s. They resemble the evolution of the JCAH standards in the same time period, when JCAH adopted a quality assurance standard and began to revise the other standards to make them more flexible and less prescriptive. However, the Joint Commissionts standards have undergone substantial evolution since the early 1980s. The latter's quality assurance standard in particular has undergone a great deal of elaboration in

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 325 the process of trying to help hospitals understand how to comply with its intent. Survey Process Compliance with hospital regulatory standards is monitored and enforced through a process of on-site surveying by health professionals. The re- sources and procedures of Medicare and the Joint Commission for survey- ing are described and compared in this section. Surveyors and Survey Teams Section 1864 of the Social Security Act directs the Secretary of DHHS to enter into agreements with any "able and willing" state, under which the state health department or other appropriate state agency surveys health facilities wishing to participate in Medicare and certifies whether they meet the federal Conditions of Participation and other requirements. In return, the secretary agrees to pay for the reasonable costs of the survey and certifi- cation activities of the state agency. With very few exceptions, the same state agencies conduct state licensure and federal certification surveys of all health providers in their states, including nursing homes, laboratories, home health agencies, and hospitals. Most of the state agency survey load con- sists of nursing homes, because they are much more numerous than hospi- tals but do not have Joint Commission deemed status. Funding for Medicare certification activities comes from the Medicare trust funds. For FY 1990, HSQB has budgeted $91.2 million for state sur- veys of facilities participating in Medicare, about $10.0 million of it for surveys and follow-up visits to unaccredited hospitals. HSQB estimates av- erage survey costs by type of facility and allocates the funds to each federal regional office by its share of each type of facility. In FY 1990, for ex- ample, the unit cost for a survey of an unaccredited hospital was $7,500. Each regional office, however, uses a different method of distributing sur- vey funds to the states. The states are also reimbursed for surveys of Medicaid facilities and use state funds for licensure activities. An Institute of Medicine (IOM) study of nursing home regulations in 1986 found great variation in state survey agency budgets and policies. As a result, the number of surveyors and the intensity of the surveys, as measured by average person-days at a facility, varied tremendously (IOM, 1986~. Federal regulations and HCFA's state operations manual are very general regarding survey agency staffing levels and qualifications. As a result, there are large state-to-state differences in the experience and educational

326 MICHAEL G. H. McGEARY

backgrounds as well as numbers of the surveyors. This affects the composition of survey teams: for example, how many nurses, generalists, sanitarians, and other specialists such as pharmacists and physicians are on the teams or available as consultants. Nationally, about half are nurses, 20 percent are sanitarians, and most of the rest are engineers, administrators, and generalists (DHHS, 1983). But in 1983, eight states had only one or two licensed nurses on staff (Association of Health Facility Licensure and Certification Agency Directors, 1983). Only a few state agencies have physicians on staff.

The Joint Commission has 190 surveyors in its hospital accreditation program, 61 full-time, 74 part-time, and 55 consultants, who are based around the country (JCAHO, 1988f). Most of the consultants are physician rehabilitation and psychiatric specialists who survey rehabilitation and psychiatric hospitals and those same services in general hospitals, if provided. Joint Commission survey team composition for the typical general acute-care hospital is a physician, an administrator, a registered nurse, and a medical technologist. The survey team may be tailored for hospitals that offer psychiatric, substance abuse, or rehabilitation services by including or adding physician surveyors with the appropriate specialty to the team.

In 1988 the Joint Commission adopted a formula for determining survey costs, which are paid by the hospital desiring accreditation. The fee consists of a base fee and an additional charge that varies with the annual number of total patient encounters. A hospital with 150,000 inpatient and outpatient encounters a year would pay $8,652 for a full accreditation survey. A follow-up visit to verify correction of a problem (contingency) found in the full survey would cost $900 per surveyor. In recent years, fees have amounted to about 70 percent of the Joint Commission's revenues; most of the rest is derived from the sale of publications and educational services.

Survey Cycle

HCFA does not have a fixed survey cycle for hospitals. Beginning in FY 1991, state agencies were funded to survey 100 percent of unaccredited hospitals (currently, 75 percent). The visits are scheduled ahead of time. Once certified, a hospital stays certified unless and until a subsequent survey finds it out of compliance with one or more conditions, which could be more than a year.

Until 1982, hospitals meeting JCAH standards were accredited for 2 years or, if there were problems, 1 year. Since 1982, a hospital found to be in substantial compliance with Joint Commission standards has been awarded accreditation for 3 years. The surveys are scheduled in writing at least 4 weeks ahead of time.

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 327

Survey Procedures

Both state agency and Joint Commission surveyors use survey report forms. State agency surveyors fill out survey forms provided by HCFA (Form HCFA-1537), which permit the surveyor to mark as "met" or "not met" each condition, each standard under a condition, and each element of a standard if specified in the regulations. Altogether more than 300 items are checked as met or not met.

The surveyors may refer to interpretive guidelines in the HCFA state operations manual (HCFA, 1986), which provide further guidance for evaluating compliance with the regulation (condition, standard, or element) but do not have force of law. The interpretive guidelines also specify the survey procedures to be used in verifying compliance. For example, element (3) of the quality assurance standard, Clinical Plan, states: "All medical and surgical services performed in the hospital must be evaluated as they relate to appropriateness of diagnosis and treatment" (see Table 7.8 and HCFA, 1986). The language is further explicated in the interpretive guidelines: "All services provided in the hospital must be periodically evaluated to determine whether an acceptable level of quality is provided. The services provided by each practitioner with hospital privileges must be periodically evaluated to determine whether they are of an acceptable level of quality and appropriateness." Finally, a surveyor may refer to the survey procedures column: "Determine that the hospital is monitoring patient care including clinical performance. Determine that a review of medical records is conducted and that the records contain sufficient data to support the diagnosis and to determine that the procedures are appropriate to the diagnosis."

The Joint Commission survey report forms (one for each surveyor discipline, e.g., physician, nurse) list the hundreds of standards and associated required characteristics (350 items in the case of the physician surveyor) and provide a scale for rating compliance with most of them. The scale goes from 1 for substantial compliance to 5 for noncompliance. To help the surveyors determine the degree of compliance with an item, the Joint Commission has developed explicit scoring guidelines for most chapters in the hospital accreditation manual as well as for the monitoring and evaluation of quality and appropriateness of care in each of the clinical services chapters. The scoring guidelines have been published and are available for sale to the hospitals.

Table 7.9 provides an example of how the first nursing services standard should be scored. If the standard or required characteristic receives a score of 3 for partial compliance, 4 for minimal compliance, or 5 for noncompliance, the surveyor must document the findings on blank pages that face each page of items in the survey report form.

328 MICHAEL G. H. McGEARY

TABLE 7.9 Method of the Joint Commission on Accreditation of Healthcare Organizations for Scoring the First Nursing Services Standard

Defining the Standard

The following are elements of satisfactory performance for the first nursing standard, which is: "There is an organized nursing department/service":

A. The nursing department or service is organized with appropriate nursing direction;

B. The department or service provides quality care as shown by use of the nursing process, adequate professional nurse staffing, findings of monitoring and evaluation that indicate high-quality care is provided and that actions are taken to solve identified problems, and documentation of adequate participation in orientation and in-service education of nursing personnel; and

C. The department or service maintains optimal professional conduct and practice as shown by policies and procedures relating to ethical conduct and professional practices, monitoring and evaluation findings that identify instances of substandard practice and/or unethical conduct, and actions taken according to an established disciplinary process when problems in professional conduct and practice are identified.

Scoring the Standard [a]

Score 1 if these elements have been fulfilled for at least the previous 24 months.

Score 2 if the major requirements of all elements listed are fulfilled, but have been fulfilled for only 18 to 23 months.

Score 3 if two of the three elements are fulfilled OR the elements have been fulfilled for 12 to 17 months.

Score 4 if one of the three elements is fulfilled OR the elements have been fulfilled for 6 to 11 months.

Score 5 if none of the elements are fulfilled OR the elements have been fulfilled for less than 6 months.

[a] All Joint Commission standards and required characteristics are scored on a scale from 1 to 5, depending on degree of compliance:

1. Substantial compliance (the organization consistently meets all major provisions of the standard or required characteristic).
2. Significant compliance (meets most provisions).
3. Partial compliance (meets some provisions).
4. Minimal compliance (meets few provisions).
5. Noncompliance (fails to meet the provisions).

SOURCE: JCAH, 1987.
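The rules in Table 7.9 are mechanical enough to be restated as a short decision procedure. The sketch below (in Python) is purely illustrative: the function name, its two inputs, and the way overlaps between the "number of elements fulfilled" and "months fulfilled" tests are resolved are assumptions made for the example, not part of the Joint Commission's published method or any actual scoring software.

```python
def score_first_nursing_standard(elements_fulfilled: int, months_fulfilled: int) -> int:
    """Illustrative restatement of the Table 7.9 scoring rules.

    elements_fulfilled: how many of the three elements (A, B, C) are met (0-3).
    months_fulfilled: how long the applicable elements have been sustained.
    Returns a compliance score from 1 (substantial compliance) to 5 (noncompliance).
    """
    if elements_fulfilled == 3 and months_fulfilled >= 24:
        return 1  # substantial compliance
    if elements_fulfilled == 3 and 18 <= months_fulfilled <= 23:
        return 2  # significant compliance
    if elements_fulfilled == 2 or 12 <= months_fulfilled <= 17:
        return 3  # partial compliance
    if elements_fulfilled == 1 or 6 <= months_fulfilled <= 11:
        return 4  # minimal compliance
    return 5      # noncompliance


# Example: all three elements met, but sustained for only 20 months -> score 2.
print(score_first_nursing_standard(3, 20))
```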

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 329

State agency and Joint Commission survey teams present their findings at exit conferences, and hospitals with significant problems may begin to make corrections to head off a possible decertification or nonaccreditation action. Some state surveyors obtain plans of correction at this time, whereas others ask for them after reviewing the findings at the office.

Enforcement Procedures

Enforcement begins with a formal finding of noncompliance that necessitates correction. This is a deficiency in HCFA's lexicon, a contingency in the Joint Commission's. In both cases the facility may be, and usually is, certified or accredited on the basis of, or contingent on, a plan of correction that will, if carried out, bring the hospital into compliance. Depending on the nature and seriousness of the problem, the state agency or the Joint Commission may require written documentation of corrective action or may decide to schedule an on-site visit by a surveyor to verify compliance. In most cases, enforcement ends when the plan of correction is carried out, and more formal enforcement action is rarely taken.

In about 15 percent of the cases (100 of the 700 hospitals surveyed per year), problems are of such a nature or degree of seriousness that an unaccredited hospital may be found out of compliance with a Condition of Participation, and decertification proceedings are begun. If it is an "immediate and serious" deficiency, a fast-track termination process is triggered that results in decertification within 23 days. In other cases, and in fast-track cases when the immediate jeopardy is removed, the process takes 90 days. In most cases, the hospitals move to make the changes necessary to have the proceedings dropped, but about 10 to 20 are terminated each year.

Traditionally, the Joint Commission has denied accreditation to between 10 and 15 hospitals a year (about 1 percent of those surveyed). When the 3-year survey cycle with the contingency system was started in 1982, about 15 percent of hospitals were accredited without contingencies and the rest, 83 to 84 percent, were accredited with contingencies that had to be removed within a certain time period, usually 6 months. More recently, 99 percent of the accredited hospitals have been receiving contingencies, several hundred of them serious enough to trigger tentative nonaccreditation procedures, but, because of serious lags in computerizing the new procedures, only four hospitals lost accreditation in 1986 and five in 1987 (Bogdanich, 1988). As a result, several hospitals with very serious problems identified in Joint Commission surveys were able to retain their accreditation status for months and even years. Meanwhile, they had lost their Medicare certification as a result of validation surveys triggered by complaints.

330 MICHAEL G. H. McGEARY

Enforcement Criteria

HCFA, in its state operations manual or otherwise, provides little guidance to the state agencies on how to decide whether the deficiencies found by surveyors amount to noncompliance with a Condition of Participation. For example, Hospital A may have deficiencies in four of the five standards comprising a condition but still be judged in compliance with the condition, whereas Hospital B may have deficiencies in only three standards and be ruled out of compliance with the condition. The judgment is left to the state survey agency.

In contrast, the Joint Commission has developed a complex algorithm for converting the scores on completed survey report forms for each standard and required characteristic into summary ratings on a decision grid sheet for each of the major performance-related functions that are taken into account in making accreditation decisions and decisions on whether to assign contingencies. In some cases, such as medical staff appointment, clinical privileges, and monitoring functions (e.g., reviews of blood utilization, medical records, and surgical cases), the score is taken directly from the survey form. In most cases, a set of scores of related items on the survey report form is aggregated according to specific written rules into a summary score. For example, the summary score for "evidence of quality assurance actions taken" is aggregated from some 21 scores on related items in 18 chapters of the accreditation manual.

The accreditation decision grid, then, aggregates the hundreds of scores given by surveyors into 43 summary scores under 10 headings (e.g., medical staff, monitoring functions, nursing services, quality assurance, medical records). Another 7 scores, for standards on implementation-monitoring status, are listed but not used in making the accreditation decision. Another set of rules is then applied to determine whether the hospital should be accredited. This set of rules is also used to decide whether contingencies should be assigned, with what deadlines, and whether they are subject to a follow-up visit or just written documentation of corrective action. For example, a tentative nonaccreditation decision is forwarded to the Accreditation Committee of the Joint Commission's board of commissioners if the four elements under the medical staff heading are scored 4 or 5, or if five of the seven elements under the monitoring heading are scored 4 or 5, and so forth. Similarly specific rules determine whether 1-month, 3-month, 6-month, or 9-month written progress reports are required, or whether 6-month, 9-month, or 12-month on-site surveys are necessary.

These three sets of decision rules (surveyor scoring of individual items on the survey report form, aggregation of the individual surveyor scores into summary scores on the accreditation grid sheet, and the rules used to make nonaccreditation and contingency decisions) are new and constantly evolving as they are used in practice. They were adopted in response to complaints about variations in surveyor judgment and in Joint Commission decision making about accreditation; the advent of computers has made this approach possible.
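To make the character of these decision rules concrete, the sketch below encodes the two aggregation examples quoted above (the medical staff and monitoring headings) as a simple rule check, again in Python. The data structure, field names, and sample scores are hypothetical illustrations; only the two thresholds cited in the text come from the source, and the Joint Commission's actual grid covers 43 summary scores and many more rules.

```python
# Hypothetical miniature of the accreditation decision grid: summary
# scores (1 = substantial compliance ... 5 = noncompliance) grouped
# under grid headings.
sample_grid = {
    "medical staff": [4, 5, 4, 4],                   # four elements under this heading
    "monitoring functions": [5, 4, 2, 4, 4, 3, 5],   # seven elements under this heading
}

def count_poor_scores(scores):
    """Count summary scores of 4 (minimal compliance) or 5 (noncompliance)."""
    return sum(1 for score in scores if score >= 4)

def tentative_nonaccreditation(grid):
    """Apply the two example rules quoted in the text: all four medical
    staff elements scored 4 or 5, or at least five of the seven
    monitoring elements scored 4 or 5, forwards a tentative
    nonaccreditation decision to the Accreditation Committee."""
    return (count_poor_scores(grid["medical staff"]) == 4
            or count_poor_scores(grid["monitoring functions"]) >= 5)

# With the sample scores above, both rules fire and a tentative
# nonaccreditation decision would be forwarded.
print(tentative_nonaccreditation(sample_grid))  # True
```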

PARTICIPATION AND ACCREDITATION FOR HOSPITALS 331

CONCLUSIONS, ISSUES, AND OPTIONS

Conclusion: Quality Assurance Through Certification and Accreditation Is Limited

Federal and Joint Commission efforts to develop and apply quality assurance standards are hampered in several ways. First, despite 70 years of effort, we still do not have adequate and valid outcome standards.14 Because outcomes by themselves are affected by many factors besides what happens in hospitals, adverse or even improved outcomes can only be indicators of possible quality problems or opportunities that, in turn, trigger further investigation to see if some aspect of hospital care was involved (Donabedian, 1966, 1988; Lohr, 1988). Medicare and Joint Commission standard-setters therefore have tried to mandate quality assurance processes in which hospitals use indicators of quality (outcome-oriented if possible, but usually process or even structural in nature) to examine quality of care. However, few clinical indicators have been adequately validated through research. Even fewer indicators of the quality of organizational performance exist. Nevertheless, to the extent there is knowledge about how to improve quality or make quality assurance more effective, it should be reflected in the Medicare and Joint Commission standards and survey processes.

The only formal sanction is loss of certification or accreditation, a drastic step that officials are reluctant to take except in extreme cases. The due process protections of the legal system also discourage enforcement attempts, as do the difficulties of documenting quality problems more subtle than gross negligence or death. Thus, for a variety of reasons, officials are very reluctant to take formal enforcement actions, especially to the extent of terminating a facility, preferring instead to work with substandard or marginal facilities over time to bring them into compliance. This approach works well if the hospitals involved have the will and capacity to improve when shown how to do it, but it is ill-equipped to deal with facilities that cannot or will not improve.

Fourth, while the federal government has delegated much of the standard-setting and enforcement to private accreditation bodies on the one hand, it has given away much discretion to the states on the other. The states have always varied greatly in their interpretation of federal standards, and little has been done to increase consistency. HCFA requirements for state survey programs are very loose. Federal officials recognized from the beginning that who does the surveying is critical, "since this greatly influences what the emphasis will be, regardless of what the standard-setters think the emphasis should be" (Cashman and Myers, 1967, p. 1112), but little has been done to standardize state survey capacity or process. The development of interpretive guidelines and survey procedures for the new Conditions of Participation was a step in the right direction. HCFA could develop more sophisticated decision rules for state agencies to use in determining compliance and making enforcement decisions. It also could develop a more statistically credible survey validation program to check the performance of the Joint Commission and the states.15

Conclusion: Certification and Accreditation Could Play a Role in Quality Assurance

Many of the obstacles to more effective quality assurance facing HCFA's survey and certification efforts and the Joint Commission's accreditation efforts are those facing Medicare's Utilization and Quality Control Peer Review Organizations (PROs): lack of knowledge about the relations among structure, process, and outcome; distance; and political pressure. One of the advantages of the PRO program is its continuous access to information on individuals and the episodes of care they experience. Unlike the survey agencies or the Joint Commission (at least unless and until its plan to develop and then collect data on clinical and organizational indicators is carried out), PROs can actively screen data using indicators of poor quality or inappropriate care. This at least allows them to identify statistically aberrant hospitals and physicians through the use of aggregate profiles.

However, the PROs are not well able to make the in-depth on-site investigations of places the indicators may identify, especially small, remote hospitals in rural areas.

The survey agencies, on the other hand, can and do mandate certain minimum capacity characteristics of hospitals. In addition, they can require that hospitals have and use internal quality assurance standards and procedures. They can require those specific process characteristics that research has shown, or will show, are associated with favorable outcomes. In the meantime, the standards should be periodically revised in accord with expert consensus about best practices. Finally, survey agencies could be involved formally and systematically in investigations of hospitals where PRO-derived quality indicators signal possible quality problems and could use their legal authority to mandate needed changes.

Issues and Options

Major Issue 1: Role of Certification in Quality Assurance

The Conditions of Participation and the procedures for enforcing them are part of the federal government's quality assurance effort; as such, they should be the best possible, given the state of current knowledge and the availability of resources, and they should be consistent with and supportive of other federal quality assurance activities.

Pros:

· A large number of hospitals (1,600), with a significant number of beds, are outside the accreditation system, and they tend to be the only hospitals in their area. Hospitals that have lost accreditation have applied for and received certification.

· The conditions mandate some important basic structure and process standards (e.g., life safety code, sanitation and infection control) that can be enforced legally if related quality problems are found by PROs or otherwise (e.g., through complaints).

· State health facility surveyors are useful for investigating the causes of indicators of poor quality revealed through surveillance of case statistics.

· Quality is multifaceted, and multiple systems of surveillance and enforcement are useful.

Cons:

· The inherent limits on the ability of periodic facility inspections to find problems in the quality of patient care are too great (compared to, say, a peer review approach) to justify more investment in this approach.

· Quality-of-care problems in unaccredited hospitals could be effectively dealt with by the PROs or other programs based on systematic, ongoing review of cases.

· Political pressures on state health agencies and HCFA to keep hospitals open, especially in rural areas, are too great.

· The need to keep PRO data confidential precludes coordination with the certification process; potential triggering of regulatory enforcement would poison the peer review process.

Related issue: Improving the standards. If certification is considered to be an important part of the federal quality assurance effort, the standards (Conditions of Participation) should be revised to be consistent with and supportive of the overall federal quality assurance effort and kept up to date.

Pros:

· The current conditions and related standards and elements were developed in the early 1980s and do not reflect recent advances in measuring and assuring quality of care.

· State licensure standards, even for basic structural aspects of hospitals, vary widely, and certification assures conformity to a uniform set of standards.

Cons:

· It is not realistic to expect that the conditions, which must go through the formal federal rule-making process, can be updated continuously.

· Little or no relation has been shown between facility-based standards and quality of patient care.

Related issue: Improving enforcement. HCFA should take a number of steps to increase enforcement capacity (some of them already adopted in nursing home regulation), including the following: specification of survey team size and composition; use of survey procedures and instruments that focus more on patients and less on records; development of explicit decision rules for determining enforcement actions; adoption of intermediate sanctions, such as fines and bans on admissions, so the punishment can fit the crime; and more use of federal inspectors to evaluate state agency performance through validation surveys and to inspect state hospital facilities.

Pros:

· Increasing competition and price regulation (e.g., prospective payment) in the hospital sector call for more attention to quality assurance and enforcement, especially in small rural hospitals.

· Enforcement can be increased through these kinds of federal actions, as has been done with certified nursing homes.

Cons:

· These steps are not worth the cost, given the limits on their effectiveness.

Major Issue 2: Role of the Joint Commission in Assuring Quality of Care for Medicare Patients

Deemed status should continue, and the Joint Commission should be encouraged in its efforts to develop a state-of-the-art quality assurance program; at the same time, federal oversight of the Joint Commission should be increased to ensure accountability, and there should be more disclosure of information about hospitals with quality problems discovered by the Joint Commission.

Pros:

· Joint Commission standards are higher and more up-to-date than the Conditions of Participation.

· Accreditation is a positive incentive that motivates hospitals to improve more than certification does or can (the Joint Commission is planning to reinforce this by recognizing "superior" hospitals).

· Joint Commission inspectors have better clinical credentials and make more consistent decisions.

· The Joint Commission may achieve better compliance than the state agencies because accreditation is highly valued and the state agencies are hampered procedurally and politically (e.g., due process, lack of authority to deal with repeat deficiencies, political pressure to assure access to Medicare services); in fact, HCFA might contract with the Joint Commission to conduct all certification surveys, subject to closer monitoring, rather than deal with the inconsistencies and administrative costs of dealing with more than 50 state survey agencies.

· The Joint Commission is planning voluntarily to release information to HCFA on hospitals with significant quality problems whose continued accreditation is conditional on major changes. These would be the 7 to 8 percent of hospitals surveyed each year that trigger one or more of the Joint Commission's nonaccreditation decision rules.

Cons:

· Higher standards are not meaningful if they are not enforced vigorously.

· In any case, the Joint Commission is a private organization governed by associations of the providers it is regulating; its survey findings are confidential (except in 13 states, e.g., New York, Pennsylvania, and Arizona, where the survey is a public document under state law). The Joint Commission is not publicly accountable and, therefore, responsibility for assuring the health and safety of Medicare beneficiaries should not be delegated to it.

· The Joint Commission is still relatively weak in enforcing environmental and life safety code standards.

· HCFA must maintain a certification program with adequate standards and sufficient capacity (resources and procedures) in any case, to deal with small and rural hospitals that are not accredited, and this program could and should be applied to all hospitals (hospitals would still be encouraged to seek accreditation).

· The resources for increasing federal oversight (more funding for more intensive state inspections, more federal inspectors to conduct validation surveys) would be better used elsewhere in the federal quality assurance program.

Major Issue 3: Improving Coordination of Federal Quality Assurance Efforts

HCFA should develop criteria and procedures for referring cases in which there are indications of serious quality-of-care problems from PROs to the Office of Survey and Certification and vice versa.

Pros:

· The quality-of-care screens used by PROs include only indicators of quality-of-care problems, and the actual role of a hospital in producing adverse indicators has to be investigated further before changes can be required or sanctions applied. In many cases, on-site surveys by health facility inspectors could usefully supplement central reviews of records by PRO clinicians.

· The state inspection agencies and federal regional offices, in turn, could alert PROs when they find hospitals with possible quality-of-care problems; the PROs could then initiate focused reviews to document process-of-care or patient-outcome problems, if any.

Cons:

· Most state inspection agencies do not have physician inspectors, and some do not have many nurses, which limits their capacity to look at the quality of clinical care or to justify findings in court against a facility's physician consultants.

· Any additional resources for handling quality-of-care problems should go to building up PROs or some other peer review-oriented mechanism.

CONCLUDING REMARKS

About 7,000 hospitals provide services to Medicare patients. The Secretary of DHHS has the regulatory authority to promulgate standards, called Conditions of Participation, in order to assure the adequate health and safety of Medicare patients in those hospitals, although the 5,400 hospitals accredited by the private Joint Commission and the AOA are deemed to meet the federal standards without further inspection by a public agency (except for a small number of accredited hospitals that are subject to validation surveys each year). In effect, then, Joint Commission standards are the Medicare standards for most Medicare beneficiaries using hospital services. At the same time, the users of 1,600 hospitals rely on the standards in the Medicare Conditions of Participation. These are mostly small, primarily rural hospitals where Medicare beneficiaries do not have the alternative of going to an accredited hospital. Both sets of standards, therefore, affect a large number of people and should be as effective as possible in achieving the goal of assuring adequate care.

This chapter has examined the evolution of Medicare and Joint Commission hospital standards from mostly structural standards (aimed at assuring that a hospital has the minimum capacity to provide quality care) to mostly process standards (aimed at making hospitals assess in a systematic and ongoing way the actual quality of care provided on their premises). Also, certain structural standards, such as those for fire safety, that continue to be mandated and enforced through the certification and accreditation standards may not be closely related to patient care but are important factors in patient safety.

The certification and accreditation programs are inherently limited in their capacity to assure quality of care. They are hampered by the lack of knowledge about the interrelations between the structure and process features of a hospital and patient outcomes. They are limited because periodic inspections cannot reveal much about how well the process of care conforms to the standards of best practice, or what the outcomes of care are. They rely on the subjective judgment of their inspectors and the enforcement attitudes of the inspection agencies.

Certification and accreditation could play a significant role in Medicare's quality assurance efforts if several issues are addressed. Pros and cons of suggested strategies are identified for consideration.

NOTES

1. Throughout this chapter, we use the terms nonaccredited and unaccredited. Nonaccredited hospitals are those that have lost accreditation from the Joint Commission. Unaccredited hospitals are those that have never been accredited by the Joint Commission or that were accredited but subsequently lost accreditation and are not actively pursuing accreditation with the Joint Commission.

338 MICHAEL G. lI. McGEARY mission. Unaccredited hospitals are those hospitals that have never been accredited by the Joint Commission or who were accredited but subsequently lost accreditation and are not actively pursuing accreditation with the Joint Commission. 2. Another regulation automatically permits hospitals that meet the Medicare Conditions of Participation to participate in Medicaid. 3. One consumer representative has served on the board since 1981. In late 1989, two more public members were added to the Joint Commission board. 4. The author wishes to acknowledge the helpful comments provided by staff of the Joint Commission, HSQB, and HCFA's Office of Policy Development on earlier drafts of this chapter. 5. Most of the unaccredited hospitals had fewer than 25 beds and therefore were not eligible for accreditation under ACS rules at that time. 6. The Canadian Medical Association was also a founder of JCAH but withdrew In 1959 to develop the Canadian Council on Hospital Accreditation. The American Dental Association joined JCAH in 1980. 7. At 1961 hearings on health services for the aged, HEW Secretary Ribicoff said he would "hand down an order that any hospital that was accredited by the Joint Commission on Accreditation would be prima facie eligible" (quoted in Jost, 1983, p. 853~. The report of the Senate Finance Committee accompanying die Medicare bill said that hospitals accredited by JCAH would be "conclusively pre- sumed to meet all the conditions for participation, except for the requirement of utilization review" (quoted in Worthington and Silver, 1970, p. 314~. 8. Art Hess, first head of Medicare, told the American Public Health Associa- tion at its 1965 annual meeting that the Social Security Administration did not want to pay for services that did not meet "minimal quality standards," but "the intention . . . is not to impose requirements that cannot be met." He went on to say that "the program, through its definitions, provides support to what has now been achieved, and makes continued upgrading possible as progress in standards is made in the private sector through accreditation activities" (Hess, 1966, p. 14~. 9. Two special certification provisions were implemented in 1966 for certifying hospitals that did not meet the Conditions of Participation. The access provision allowed for the certifying of rural hospitals out of compliance with one or more conditions but in compliance with all statutory provisions provided the hospital was located in a rural area where access by Medicare enrollees to fully participating hospitals would be limited. The second provision, based upon the Burleson amend- ment, waived the statutory 24-hour registered nurse requirement for rural hospitals meeting all other requirements. Both provisions have since been terminated. 10. As of 1970, 98 hospitals that had applied in 1966 were still not in the program and 411 hospitals were participating through the special access certification provision (Worthington and Silver, 1970~. 11. JCAH apparently adopted the utilization review requirement (implemented in 1967) in the hope that accredited hospitals could be deemed to meet all federal requirements without state agency inspection. The Secretary of the DHHS, how- ever, has never agreed to let this accreditation standard be deemed to meet the federal utilization review requirement. 
More recently, however, hospitals have been able to meet the requirement if they are reviewed through Medicare's Utilization and Quality Control Peer Review Organization (PRO) program.

12. Even though compliance at the condition level may be similar, it is interesting to note that more detailed analyses in earlier reports found that only about 10 to 14 percent of the specific deficiencies cited were the same (DHHS, 1979, 1980; GAO, 1979).

13. These worksheets, which provide insight into the thinking that went into the revision of the Conditions of Participation for hospitals during the 1981-1983 period, are in the HCFA files (HCFA Task Force, 1982).

14. For example, comparative hospital mortality figures have no meaning without consideration of many factors, such as case mix, severity of illness, geographic differences, and patterns of care of the terminally ill among hospitals, hospices, nursing homes, and family homes.

15. As of late 1989, HCFA was considering a revision of its sampling methodology to improve the effectiveness of its validation efforts. Also, beginning in FY 1989, the number of validation surveys performed by state agency staff was increased to approximately 200 per year (HCFA, personal communication, 1989).

REFERENCES

Affeldt, J.E., Roberts, J.S., and Walczak, R.M. Quality Assurance: Its Origin, Status, and Future Direction: A JCAH Perspective. Evaluation and the Health Professions 6:245-255, 1983.

AHA (American Hospital Association). Hospitals, Journal of the American Hospital Association (Guide Issue, Part 2), 40 (August 1, 1966).

Association of Health Facility Licensure and Certification Agency Directors. Summary Report: Licensure and Certification Operations. Unpublished report submitted to Health Standards and Quality Bureau, Health Care Financing Administration, Baltimore, Md., 1983.

Bogdanich, W. Prized by Hospitals, Accreditation Hides Perils Patients Face. Wall Street Journal, October 12, 1988, pp. A1, A12.

Cashman, J.W. and Myers, B.A. Medicare: Standards of Service in a New Program: Licensure, Certification, Accreditation. American Journal of Public Health 57:1107-1117, 1967.

Davis, L. Fellowship of Surgeons: A History of the American College of Surgeons. Chicago, Ill.: American College of Surgeons, 1973.

DHHS (Department of Health and Human Services). Medicare Validation Surveys of Hospitals Accredited by the JCAH: Annual Report for FY 1979. Washington, D.C.: U.S. Department of Health and Human Services, 1979.

DHHS. Medicare Validation Surveys of Hospitals Accredited by the JCAH: Annual Report for FY 1980. Washington, D.C.: U.S. Department of Health and Human Services, 1980.

DHHS. Inventory of Surveyors of Medicare and Medicaid Programs, United States, 1983. Baltimore, Md.: Health Care Financing Administration, 1983.

DHHS. Report on Medicare Validation Surveys of Hospitals Accredited by the Joint Commission on Accreditation of Hospitals (JCAH): Fiscal Year 1985. In Report of the Secretary of DHHS on Medicare. Washington, D.C.: U.S. Government Printing Office, 1988.

Donabedian, A. Evaluating the Quality of Medical Care. Milbank Memorial Fund Quarterly 44:166-203, 1966.

Donabedian, A. The Epidemiology of Quality. Inquiry 22:282-292, 1985.

Donabedian, A. The Quality of Care: How Can It Be Assessed? Journal of the American Medical Association 260:1743-1748, 1988.

Feder, J. Medicare: The Politics of Federal Hospital Insurance. Lexington, Mass.: D.C. Heath, 1977a.

Feder, J. The Social Security Administration and Medicare: A Strategy for Implementation. Pp. 19-35 in Toward a National Health Policy. Friedman, K. and Rakoff, S., eds. Lexington, Mass.: D.C. Heath, 1977b.

Federal Register, Vol. 45, pp. 41794-41818, June 20, 1980.

Federal Register, Vol. 48, pp. 299-315, January 4, 1983.

Federal Register, Vol. 51, pp. 22010-22052, June 17, 1986.

Foster, J.T. States are Stiffening Licensure Standards. Modern Hospital 105:128-132, 1965.

Fry, H.G. The Operation of State Hospital Planning and Licensing Programs. American Hospital Association Monograph Series, No. 15. Chicago, Ill.: American Hospital Association, 1965.

GAO (General Accounting Office). The Medicare Hospital Certification System Needs Reform. HRD-79-37. Washington, D.C.: General Accounting Office, 1979.

Greenfield, S., Lewis, C.E., Kaplan, S.H., et al. Peer Review by Criteria Mapping: Criteria for Diabetes Mellitus: The Use of Decision-Making in Chart Audit. Annals of Internal Medicine 83:761-770, 1975.

Greenfield, S., Nadler, M.A., Morgan, M.T., et al. The Clinical Investigation and Management of Chest Pain in an Emergency Department: Quality Assessment by Criteria Mapping. Medical Care 15:898-905, 1977.

Greenfield, S., Cretin, S., Worthman, L.G., et al. Comparison of a Criteria Map to a Criteria List in Quality-of-Care Assessment for Patients With Chest Pain: The Relation of Each to Outcome. Medical Care 19:255-272, 1981.

HCFA Task Force (Health Care Financing Administration). HCFA Task Force Recommendations. Unpublished document in files of the Health Standards and Quality Bureau, Health Care Financing Administration, Baltimore, Md., 1982.

HCFA. Appendix A, Interpretive Guidelines: Hospitals. Pp. A1-A165 in State Operations Manual: Provider Certification. Transmittal No. 190. Health Care Financing Administration. Washington, D.C.: U.S. Department of Health and Human Services, 1986.

Health Insurance Benefits Advisory Council. Report Covering the Period July 1, 1966-December 31, 1967. Washington, D.C.: Social Security Administration, 1969.

Hess, A.E. Medicare: Its Meaning for Public Health. American Journal of Public Health 56:10-18, 1966.

IOM (Institute of Medicine). Improving the Quality of Care in Nursing Homes. Washington, D.C.: National Academy Press, 1986.

Jacobs, C.M., Christoffel, T.H., and Dixon, N. Measuring the Quality of Patient Care: The Rationale for Outcome Audit. Cambridge, Mass.: Ballinger, 1976.

JCAH (Joint Commission on Accreditation of Hospitals). Standards for Hospital Accreditation. Chicago, Ill.: Joint Commission on Accreditation of Hospitals, 1965.

JCAH. 1970 Accreditation Manual for Hospitals. Chicago, Ill.: Joint Commission on Accreditation of Hospitals, 1971.

JCAH. The PEP Primer: Performance Evaluation Procedure for Auditing and Improving Patient Care. Chicago, Ill.: Joint Commission on Accreditation of Hospitals, 1975.

JCAH. Guidelines Set for AMH Revision. JCAH Perspectives 1(5):3, 1981.

JCAH. New QA Guidelines Set. JCAH Perspectives 2(5):1, 1982.

JCAH. New Quality and Appropriateness Standard Included in 1984 AMH. JCAH Perspectives 3(5):5-6, 1983.

JCAH. JCAH Board Approves New Medical Staff Standards. JCAH Perspectives 4(1):1,3, 1984a.

JCAH. Quality Assurance Standards Revised. JCAH Perspectives 4(1):3, 1984b.

JCAH. "Implementation Monitoring" for Designated Standards. JCAH Perspectives 5(1):3, 1985.

JCAH. Monitoring and Evaluation of the Quality and Appropriateness of Care: A Hospital Example. Quality Review Bulletin 12:326-330, 1986.

JCAH. Hospital Accreditation Program Scoring Guidelines: Nursing Services, Infection Control, Special Care Units. Chicago, Ill.: Joint Commission on Accreditation of Hospitals, 1987.

JCAHO (Joint Commission on Accreditation of Healthcare Organizations). Overview of the Joint Commission's "Agenda for Change." Mimeo. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1987.

JCAHO. An Introduction to the Joint Commission: Its Survey and Accreditation Processes, Standards, and Services. Third edition. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1988a.

JCAHO. Rules Change on Monitoring and Evaluation Contingencies. Joint Commission Perspectives 8:5, 1988b.

JCAHO. Medical Staff Monitoring and Evaluation: Departmental Review. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1988c.

JCAHO. Proposed Clinical Indicators for Pilot Testing. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1988d.

JCAHO. Field Review Evaluation Form: Proposed Principles of Organizational and Management Effectiveness. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1988e.

JCAHO. Hospital Accreditation Program Surveyors, September 1988. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1988f.

JCAHO. 1990 Accreditation Manual for Hospitals. Chicago, Ill.: Joint Commission on Accreditation of Healthcare Organizations, 1989.

Jost, T.S. The Joint Commission on Accreditation of Hospitals: Private Regulation of Health Care and the Public Interest. Boston College Law Review 24:835-923, 1983.

Lohr, K.N. Outcome Measurement: Concepts and Questions. Inquiry 25:37-50, 1988.

Longo, D.R., Wilt, J.E., and Laubenthal, R.M. Hospital Compliance with Joint Commission Standards: Findings from 1984 Surveys. Quality Review Bulletin 12:388-394, 1986.

McNerney, W.J. Hospital and Medical Economics. Chicago, Ill.: American Hospital Association Hospital Research and Educational Trust, 1962.

Palmer, R.H. and Reilly, M.C. Individual and Institutional Variables Which May Serve as Indicators of Quality of Medical Care. Medical Care 17:693-717, 1979.

Phillips, D.F. and Kessler, M.S. Criticism of the Medicare Validation Survey. Hospitals, Journal of the American Hospital Association 49:61-62, 64, 66, 1975.

Roberts, J.S. and Walczak, R.M. Toward Effective Quality Assurance: The Evolution and Current Status of the JCAH QA Standard. Quality Review Bulletin 10:11-15, 1984.

Roberts, J.S., Coale, J.G., and Redman, R.R. A History of the Joint Commission on Accreditation of Hospitals. Journal of the American Medical Association 258:936-940, 1987.

Sanazaro, P.J. Quality Assessment and Quality Assurance in Medical Care. Annual Review of Public Health 1:37-68, 1980.

Schroeder, S.A. Outcome Assessment 70 Years Later: Are We Ready? New England Journal of Medicine 316:160-162, 1987.

Silver, L.H. The Legal Accountability of Nonprofit Hospitals. Pp. 183-200 in Regulating Health Facilities Construction. Havighurst, C.C., ed. Washington, D.C.: American Enterprise Institute for Public Policy Research, 1974.

Somers, A.R. Hospital Regulation: The Dilemma of Public Policy. Princeton, N.J.: Industrial Relations Section, Princeton University, 1969.

Stephenson, G.W. College History: The College's Role in Hospital Standardization. Bulletin of the American College of Surgeons (February):17-29, 1981.

Taylor, K.O. and Donald, D.M. A Comparative Study of Hospital Licensure Regulations. Berkeley, Calif.: School of Public Health, University of California, 1957.

Vladeck, B.C. Quality Assurance Through External Controls. Inquiry 25:100-107, 1988.

Worthington, W. and Silver, L.H. Regulation of Quality Care in Hospitals: The Need For Change. Law and Contemporary Problems 35:305-333, 1970.
