Suggested Citation:"Study Methods." Institute of Medicine. 1977. Reliability of Medicare Hospital Discharge Records: Report of a Study. Washington, DC: The National Academies Press. doi: 10.17226/9930.

Chapter 2

STUDY METHODS

In the Institute of Medicine's study of the reliability of the six items selected from the Medicare record, a field team independently abstracted selected patient records within participating study hospitals. The results of the independent abstracting were compared with Medicare data on the same hospitalizations obtained from HCFA. (The Medicare information had been compiled by HCFA from claim forms submitted by the study hospitals to the fiscal intermediary for payment.) Discrepancies between the Institute of Medicine (IOM) abstract and the Medicare record were noted, and the patient records were re-examined in an attempt to understand the reasons for discrepancies. For most records with discrepancies, an attempt was made to locate the hospital copy of the appropriate claim form (Medicare form number 1453) and transfer the claims information to the abstract form. These data, supplemented by information on hospital procedures for processing Medicare claims, constituted the basis for the analysis.

The sampling plan, research instruments, field work, data processing, and analytic techniques developed for the previous re-abstracting study are generally applicable here. Unique methodological aspects of the current study are discussed below.

SAMPLING PLAN

As before, a three-stage sampling plan was used. The plan included an initial national sample of hospitals suitable for use in the larger PSRO evaluation; a smaller subsample of hospitals included in this study; and, within each study hospital, a sample of Medicare records that were used in the abstracting.

Initial National Sample of Hospitals

The Institute of Medicine had previously drawn a national probability sample of short-term general hospitals using a two-way controlled selection process.[1] The sample consisted of ten panels of hospitals with less than 1,000 beds and an additional group of hospitals with 1,000 or more beds, which were included in the sample with certainty. Each panel constitutes a national sample in its own right and can be combined with other panels, plus the certainty hospitals, to create a range of representations of the national hospital universe, depending on the sample size and level of precision desired.

Two of the ten panels were randomly selected and combined with the certainty hospitals to serve as the sampling frame for the previous re-abstracting study. Therefore, eight panels were available for inclusion in this study. One panel was chosen at random and, together with a subsample of certainty hospitals, provided 303 hospitals that served as the sampling frame for this study.

Subsample of Hospitals for Abstracting

Within the sampling frame, a final sample of 125 hospitals was drawn, using an approximation to controlled selection. Although the initial goal of the study was to include seventy to seventy-five hospitals, oversampling was necessary because of the limited time for follow-up contacts and an expected high refusal rate. A letter to administrators of sampled hospitals requesting their participation is in Appendix B.

Eighty-four of the 125 hospitals (67 percent) agreed to participate. This number was reduced to seventy-two, using the stratification variables, in order to stay within the budget. The hospitals that finally were asked to participate had similar characteristics and were distributed in proportion to those in the original sample of 125. One hospital in the final sample was excluded because it was not possible to retrieve the necessary medical records with the available identifying information. Therefore, a total of seventy-one hospitals were included in the study. The hospitals declining to participate did not appear to differ from the participants in any systematic manner.
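A rough Python sketch may help make this two-part procedure concrete. The stratified draw below is a deliberately simplified stand-in for the study's "approximation to controlled selection" (which balanced on several variables at once), and the bed-size strata, frame layout, and weight function are illustrative assumptions, not the study's actual stratification.

```python
import random
from collections import defaultdict

def stratified_subsample(frame, stratum_of, n_target, seed=1974):
    """Draw a subsample with each stratum represented in proportion to
    its share of the frame -- a simplified stand-in for the study's
    approximation to controlled selection."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for hospital in frame:
        strata[stratum_of(hospital)].append(hospital)
    sample = []
    for members in strata.values():
        # each stratum contributes in proportion to its share of the frame
        k = round(n_target * len(members) / len(frame))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

def adjusted_weight(base_weight, n_sampled, n_participating):
    """Inflate a basic hospital weight so the participating hospitals
    stand in for the full draw (e.g. 71 participants for 125 sampled)."""
    return base_weight * n_sampled / n_participating

# hypothetical frame of 303 hospitals in three bed-size strata
frame = [{"id": i, "beds": ("<100", "100-499", "500+")[i % 3]}
         for i in range(303)]
subsample = stratified_subsample(frame, lambda h: h["beds"], 125)
print(len(subsample))                          # close to the 125 targeted
print(round(adjusted_weight(1.0, 125, 71), 2))
```

Proportional allocation with rounding rarely hits the target size exactly; controlled selection addresses precisely this kind of multi-way balancing problem, which is why the sketch is only an approximation.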
The basic hospital weights were adjusted to reflect the reduced sample size.

Sample of Medicare Records

The sample design for selecting records to be independently abstracted was guided by several considerations. Because the study was intended, at least in part, to determine the usefulness of Medicare records for evaluating the effects of PSROs, the diagnoses were selected to conform

[1] R. Goodman and L. Kish, "Controlled Selection - A Technique in Probability Sampling," Journal of the American Statistical Association 45 (September 1950): 350-72; see also Irene Hess, Donald C. Riedel, and Thomas B. Fitzpatrick, Probability Sampling of Hospitals and Patients, 2nd ed. (Ann Arbor: Health Administration Press, 1975).

with those specified in the PSRO Evaluation Plan as appropriate for determining PSRO impact on utilization of hospital services.[2] This had the added advantage of facilitating comparisons between the results of this study and those of the previous re-abstracting study. However, some diagnoses from the previous study were not appropriate for a Medicare population and were excluded--for example, hypertrophy of tonsils and adenoids. Other diagnoses especially important for a Medicare population were included, even though they had not been used in the previous study--for example, diseases of the prostate.

In addition to examining specific diagnoses, it is sometimes useful to aggregate information to reflect broader, homogeneous groupings of patients with generally similar conditions. In this regard, HCFA's Office of Policy, Planning and Research has analyzed utilization data using AUTOGRP categories as a research tool.[3] Therefore, an additional sampling consideration was the desirability of analyzing the reliability of data by AUTOGRP categories, as well as by specific diagnoses.

A sampling plan was developed that concentrated on the fifteen AUTOGRP Diagnosis Related Groups (DRGs) for conditions most frequently occurring in the Medicare population. Most DRGs were divided into specific and residual sub-groups. Specific diagnostic sub-groups include diagnoses identified for special study because of their importance for the Medicare population and/or their inclusion in the previous study. They were sampled at a greater frequency than the residual diagnostic sub-groups, which include everything in the DRGs except the specific diagnoses. As an example, cataract is a specific diagnosis from the DRG of "Diseases of the Eye" and was sampled with a higher frequency than the residual sub-group within that DRG.
In some cases, the DRG contained only the specific diagnosis - for example, diabetes or cerebrovascular diseases - and there was no residual sub-group. Where there were residuals, they often included diagnoses referred to as "satellites" in the previous study. "Satellite" diagnoses are frequently and erroneously coded as principal in place of the specific diagnosis with which they are associated. This is another feature of the sample design that facilitated comparisons between this study and its predecessor.

[2] U.S. Department of Health, Education, and Welfare, Office of the Assistant Secretary for Health, Office of Professional Standards Review, Program Evaluation Plan: Professional Standards Review Organizations, by Martin A. Baum et al. (22 September 1975), pp. 119-21.

[3] The AUTOGRP system, developed at Yale University, classifies patients into homogeneous categories that are clinically and statistically meaningful and reflect similar patterns of hospital resource consumption. See Ronald Mills, Robert B. Fetter, Donald C. Riedel, and Richard Averill, "AUTOGRP: An Interactive Computer System for the Analysis of Health Care Data," Medical Care 14 (July 1976): 603-615.

In order to give some chance of inclusion in the sample to all diagnoses, a sixteenth DRG was created that encompassed all diagnoses not included in any of the other fifteen DRGs. This sixteenth category, plus the residual sub-groups in the other DRGs, permits the calculation of net and gross difference rates, as was done in the previous study. The total sample represents the universe of all diagnoses included in the ICDA-8 classification system, as adapted by HCFA.[4] Figure 1 shows the diagnostic groups included in the sample and their ICDA-8 code numbers. A more comprehensive listing of code numbers is found in Appendix C.

A computerized sampling procedure was used to select the abstracts for inclusion in the study. The universe of abstracts eligible for selection in each hospital was known and used to develop the rates with which each diagnostic group within each hospital was sampled. All abstracts were for Medicare beneficiaries age 65 and over who were discharged from the study hospitals during calendar year 1974 and included in the twenty percent sample maintained by HCFA. The year 1974 was used because this is the baseline year for assessing the effects of PSROs.

This procedure was expected to yield a total of 4,908 abstracts from seventy-one hospitals. In some instances, however, the medical record was not available in the medical record department, and no substitutions were made. As a result, 4,745 medical records were actually independently abstracted by the IOM field team.

[4] Eighth Revision of International Classification of Diseases, Adapted for Use by the Social Security Administration, U.S. Department of Health, Education, and Welfare, Social Security Administration, Pubn. No. 23-72, undated.
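The net and gross difference rates that this design permits can be illustrated with a short sketch. The definitions below follow the usual usage in record-check studies (the chapter itself does not state formulas): for a given diagnostic category, the gross difference rate is the proportion of matched cases that the two sources classify differently with respect to that category, while the net difference rate is the difference between the two sources' marginal proportions, so offsetting errors cancel. The matched pairs are invented for illustration.

```python
def difference_rates(pairs, category):
    """Net and gross difference rates for one diagnostic category.

    Each pair is (code from the re-abstract, code from the Medicare
    record).  Gross rate: fraction of cases placed in the category by
    exactly one source.  Net rate: difference between the two sources'
    marginal proportions for the category."""
    n = len(pairs)
    only_first = sum(1 for a, b in pairs if a == category != b)
    only_second = sum(1 for a, b in pairs if b == category != a)
    return (only_first - only_second) / n, (only_first + only_second) / n

# hypothetical matched pairs of principal-diagnosis groups
pairs = [("AMI", "AMI"), ("AMI", "CHF"), ("CHF", "AMI"),
         ("AMI", "AMI"), ("CHF", "CHF"), ("AMI", "CHF")]
net, gross = difference_rates(pairs, "AMI")
print(net, gross)   # net 1/6, gross 3/6
```

The example shows why both rates matter: half the cases disagree on "AMI" (gross), but the disagreements largely offset, leaving a small net bias in the marginal count.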

Figure 1. Components of Sample of Medicare Records

Diagnosis related group (DRG) / Sub-groups / ICDA-8 code numbers*

1. Ischemic heart diseases except acute myocardial infarction (AMI)
   Specific diagnosis:
      Chronic ischemic heart disease**                        412.0; 412.9
   Residual diagnoses:
      Subacute ischemic heart disease (satellite)             411.0; 411.9
      Angina pectoris                                         413.0; 413.9
      Asymptomatic ischemic heart disease                     414.0; 414.9

2. Cerebrovascular diseases
   Specific diagnosis:
      Cerebrovascular diseases**                              430.0-438.9
   Residual diagnosis: none

3. Fractures
   Specific diagnosis:
      Fracture, neck of femur**                               820.0-820.9
   Residual diagnoses:
      Fracture of other and unspecified parts of femur
        (satellite)                                           821.0-821.9
      Fracture of skull, spine, and trunk                     800.0-809.9
      Fracture of upper limb                                  810.0-819.9
      Fractures of lower limb; excludes fracture,
        neck of femur                                         822.0-829.9

4. Diseases of the eye
   Specific diagnosis:
      Cataract**                                              374.0-374.9
   Residual diagnoses:
      Inflammatory diseases of the eye                        360.0-369.9
      Other diseases and conditions of the eye;
        excludes cataract and blindness                       370.0-373.9; 375.0-378.9

5. Acute myocardial infarction
   Specific diagnosis:
      Acute myocardial infarction**                           410.0-410.9
   Residual diagnosis: none

6. Hernia of abdominal cavity
   Specific diagnosis:
      Inguinal hernia without mention of obstruction**        550
   Residual diagnoses:
      Inguinal hernia with obstruction (satellite)            552
      Other hernia of abdominal cavity without
        mention of obstruction                                551.0-551.9
      Other hernia of abdominal cavity with
        mention of obstruction                                553.0-553.9

7. Diabetes mellitus
   Specific diagnosis:
      Diabetes mellitus**                                     250.0-250.9
   Residual diagnosis: none

8. Diseases of the prostate
   Specific diagnosis:
      Hyperplasia of prostate                                 600
   Residual diagnosis:
      Prostatitis and other diseases of the prostate          601; 602

9. Pneumonia
   Specific diagnosis:
      Bronchopneumonia, organism not specified, and
        pneumonia, organism and type not specified            485; 486
   Residual diagnosis:
      Pneumonia, type and organism specified                  480; 481; 482.0-482.9; 483; 484

10. Diseases of the gall bladder and bile duct
   Specific diagnosis:
      Cholelithiasis and cholecystitis**                      574.0-574.9; 575
   Residual diagnosis:
      Other diseases of the gall bladder and bile duct        576.0-576.9

11. Miscellaneous diseases of the intestine and peritoneum
   Specific diagnosis:
      Intestinal obstruction without mention of hernia        560.0-560.9
   Residual diagnosis:
      Peritonitis, peritoneal adhesions, and other
        diseases of intestine and peritoneum                  567.0-567.9; 568; 569.0-569.9

12. Heart failure
   Specific diagnosis:
      Congestive heart failure and left ventricular
        failure                                               427.0; 427.1
   Residual diagnosis:
      Acute heart failure, undefined                          782.4

13. Enteritis, diverticula, and functional disorders of intestine
   Specific diagnosis:
      Diverticulosis of intestine                             562.0; 562.1
   Residual diagnoses:
      Noninfectious gastroenteritis and colitis,
        except ulcerative                                     561
      Chronic enteritis and ulcerative colitis                563.0-563.9
      Functional disorders of intestine                       564.0-564.9

14. Bronchitis
   Specific diagnosis:
      Bronchitis                                              466; 490; 491
   Residual diagnosis: none
15. Malignant neoplasm of the respiratory system
   Specific diagnosis:
      Malignant neoplasm of bronchus and lung                 162.1
   Residual diagnosis:
      Primary malignant neoplasm of respiratory
        system, except of bronchus and lung                   160.0-160.9; 161.0-161.9; 162.0; 163.0-163.9

16. All else
   Specific diagnosis: all else
   Residual diagnosis: none

*Discrete and non-continuous codes are listed individually. Where a diagnosis has a lengthy list of continuous codes, however, only the first and last codes are given, separated by a dash. A complete listing of codes for each diagnosis is found in Appendix C. The reader may wish to refer to the Appendix before considering the influence of coding discrepancies in the analysis.

**Indicates specific or "target" diagnosis used in prior re-abstracting study.

Retrieval of Medicare Claim Forms

As noted earlier, the field team was instructed to consult the hospital's copy of the Medicare claim form when a discrepancy was found between the IOM abstract and the Medicare record that was not attributable to the field work. (Mistakes by the field team were expected to result in a determination that the Medicare record was correct, so there would be no need to consult the claim form.)

Retrieving claim forms was difficult because of the lack of uniformity in filing procedures. Some are filed according to the date on which the account is closed; others are physically removed from the hospital. Some hospitals had not retained the form. Since there were 2,878 IOM abstracts with at least one discrepancy not attributable to the field work, the same number of claim forms should have been available and examined. However, only 2,377, or 82.6 percent of those needed, could be retrieved. Ten hospitals were unable to produce any Medicare claim forms. They were located primarily in Standard Metropolitan Statistical

Areas, as was the general study population. However, they included fewer voluntary hospitals, more governmental hospitals, and a disproportionately lower share of hospitals with 200 to 500 beds than the rest of the study population. Because of the limited availability of claims forms, these data were analyzed as a separate data set.

RESEARCH INSTRUMENTS

Two research instruments were developed: the abstracting form was intended to record information obtained in the process of abstracting selected medical records (see Appendix D); the informal checklist was intended to describe the flow of claims information within hospitals and to ascertain the extent to which hospital definitions conform to those used by the field team and by HCFA (see Appendix E).

Abstracting Form

Information items to be abstracted included date of hospital admission, date of discharge, sex, admitting diagnosis, principal and other diagnoses, and principal and other procedures. Additional items appear on the claim form, such as date of surgery and charges for laboratory and other hospital services. Because of the complexities and costs of adding more items to the study, however, the information abstracted was restricted to that mentioned above. Age and/or date of birth was not abstracted because it is validated by HCFA when a person enrolls for benefits.

All abstracting was done in accord with item definitions developed for the Uniform Hospital Discharge Data Set (UHDDS, included in Appendix D), since PSROs use UHDDS definitions. Furthermore, consensus has been reached within the Department of Health, Education, and Welfare to require use of UHDDS in federal reporting systems. Diagnostic coding was based on ICDA-8 as modified by HCFA.[5] Procedural coding was based on four-digit Surgical Current Procedural Terminology (CPT), first edition, as modified by HCFA.[6]

[5] Eighth Revision of International Classification of Diseases, Adapted for Use by the Social Security Administration, U.S. Department of Health, Education, and Welfare, Social Security Administration, Pubn. No. 23-72, undated.

[6] Numerical Surgical Current Procedural Terminology, U.S. Department of Health, Education, and Welfare, Social Security Administration, Office of Program Policy and Planning, ORS, Pubn. No. 013-75 (9/75). Although the title specifically refers to surgical procedures, the contents include procedures which are not surgical--for example, transfusion--and which also appear on the Medicare claim form. Throughout this report the term "procedure" is used, rather than "surgical procedure," since it is clear that this broader concept is appropriate.

Although all diagnoses and procedures for a given hospitalization were recorded on the abstract form, only the admitting and principal diagnoses and principal procedure were coded. Strict guidelines were provided for designating an admitting diagnosis, relying on only the limited information available at the time of admission or soon thereafter. This information was used to explore the possibility that the primary diagnosis, contained in the Medicare record, may in reality be an admitting diagnosis, rather than a more definitive principal diagnosis, because of the urgency to submit a claim form for reimbursement.

The presence of additional diagnoses was also noted--again, in accord with specific guidelines. For example, a "history of" a particular diagnosis would not be included unless it were clinically significant for the hospitalization under review. A bone fracture with surgery or other treatment would be included, whereas a slight degree of osteoarthritis, noted only as an observation and for which a patient was not treated, would not be included. The notation of significant additional diagnoses was intended to determine whether they are appropriately recorded on the Medicare record. (All diagnoses may be recorded on the Medicare claim form. The Medicare record, however, includes a code for the primary diagnosis only; the presence of additional diagnoses is noted, but they are not coded.) It was hypothesized that the reliability of data for patients with additional diagnoses might be lower than for those with a single diagnosis only.

There were several steps involved in the abstracting process, each intended to increase understanding of the reasons for discrepancies and the relative degree of error that would be involved if such data were used for evaluative purposes.
After the field team had independently abstracted all information items from the hospital medical record, the results of that process were compared with corresponding information on the Medicare record. Where the two disagreed, the medical record was re-examined and an attempt was made to determine the correct data source, as well as the reason for the discrepancy. In all cases, these determinations were made before consulting the claim form. The options available to denote reasons for discrepancy varied, depending on the data item, as outlined below.

Reasons for discrepancy - date of hospital admission, discharge, and sex:

· Clerical - discrepancies attributable to mistakes of a coding clerk, such as obvious transposition of numbers in the year of admission.

· Completeness - discrepancies resulting from an inadequate review of the medical record. For example, an item may be missing from the admitting sheet, but clearly stated in the discharge summary.

Reasons for discrepancy - admitting vs. principal diagnoses (both determined by field team):

· Completeness - discrepancies between the admitting and principal diagnoses resulting from an incomplete review of those portions of the medical record designated for use in determining admitting diagnosis.

· Coding refinement - discrepancies resulting from the availability of additional details when the principal diagnosis was determined. Although the general diagnostic category does not change, the final determination is more refined. For example, the admitting diagnosis may be pneumonia (486.0), although the principal diagnosis is pneumococcal pneumonia (481.0).

· Investigation - discrepancies occurring when an admitting diagnosis (such as headache) is based on preliminary findings or symptoms, although the principal diagnosis (hypoglycemia) is based on more complete medical investigation.

· Other - all reasons not included in any of the above. For such cases, the field team member was instructed to explain briefly the reason for the difference.

Reasons for discrepancy - principal diagnosis and procedure:

These reasons for discrepancy were grouped into two broad categories: ordering and coding. The possibility of an ordering discrepancy had to be eliminated before considering a coding discrepancy. In general, ordering discrepancies stem from uncertainty about whether a diagnosis or procedure should be regarded as "principal" or "other," in accord with UHDDS definitions. The following specific types of ordering discrepancies were considered:

· Ordering: Medicare definition - discrepancies reflecting differences between the UHDDS definition and that required for the Medicare claim form, which should occur only if a hospital consciously and consistently used the Medicare definition in completing the claim forms in 1974. For example, a patient is admitted for an open fracture reduction and while on the operating table suffers an acute myocardial infarction which requires a three-month hospitalization.
Using the UHDDS definition, fracture would be chosen as the principal diagnosis, because fracture is the diagnosis explaining the cause of admission. If the definition in the "Medicare Hospital Manual" for principal diagnosis is used, however, the diagnosis might be AMI. The manual states: "the primary diagnosis is the diagnosis or the illness or condition which was the primary reason for the patient's hospitalization." It might be noted that this Medicare definition is appropriate primarily for patients included in the twenty percent sample. For other Medicare beneficiaries, the "intermediary may authorize the hospital to use the 'working' diagnosis rather than the final diagnosis if this

would reduce delays in the submission of billing forms."[7] In hospitals with this authorization, billing office personnel would have to determine explicitly whether a particular patient was in the twenty percent sample and report the diagnosis accordingly. Therefore, this reason for discrepancy was not expected to be frequently used.

· Ordering: Hospital list - discrepancies stemming from a routine hospital practice (in 1974) of choosing the first listed diagnosis or procedure on the face sheet as principal.

· Ordering: Completeness - discrepancies caused by selecting a narrative for principal diagnosis based on an incomplete review of the medical record. For example, the principal diagnoses recorded by IOM and Medicare refer to different diseases, each of which the patient had during the hospital episode under question. However, if the chart had been searched more thoroughly, it would have been clear that one, rather than the other, was the proper principal diagnosis in accord with UHDDS definitions.

· Ordering: Judgment - discrepancies representing an honest difference of opinion in determining which of several diagnoses is principal. One example of this might be a record for a patient with diabetes and glaucoma and insufficient documentation to decide whether either diagnosis would conform to the UHDDS definition for principal diagnosis. Similarly, a record may indicate carcinoma of several sites and may not be sufficiently documented to permit a determination of a principal diagnosis using the UHDDS definition.

· Ordering: Other - all reasons not included in any of the above. The field team was instructed to write a note explaining the necessity to use this reason.

· Ordering: Dependent - this option applied only to discrepancies on principal procedure and was used if an earlier discrepancy in the selection of principal diagnosis resulted in a corresponding or dependent discrepancy in the selection of principal procedure.
After the ordering options were eliminated as possible reasons for discrepancies between the Medicare record and the IOM abstract, coding options could be considered. In some respects, the term "coding" may be a slight misnomer for this type of discrepancy because the code should be assigned by HCFA personnel, who have access only to information on the claim form. A more accurate description of this type of discrepancy might refer to the specificity of detail in the narrative on which the code is later based, but this is rather cumbersome. In some hospitals a code is assigned in place of or in addition to the narrative, in which

[7] Medicare Hospital Manual, U.S. Department of Health, Education, and Welfare, HIM Pubn. 10 (6-66), Reprint, August 1975, pp. 81.1-82.2.

case the earlier use of "coding" discrepancies is quite appropriate. Occasionally fiscal intermediaries also assign codes. Furthermore, the distinction between ordering and coding discrepancies facilitates comparison with the prior study. Therefore, for abstracts where there was general agreement on what the principal diagnosis or procedure should be, but not the resultant code, the following reasons for discrepancy were available:

· Coding: Clerical - discrepancies caused by transposing diagnostic code numbers or using non-existent codes.

· Coding: Completeness - discrepancies caused by including incomplete narrative information for assigning a code, which resulted from an incomplete review of the medical record. For example, the presence of a nine as a fourth digit, indicating the diagnosis is not otherwise specified, when a more careful review of the chart would have resulted in a more specific fourth-digit code.

· Coding: Procedural - discrepancies caused by routine and systematic misuse or misunderstanding of the coding system. For example, reliance on the diagnostic index without reference to tabular listings, or failure to heed inclusion and/or exclusion advice from the tabular listing.

· Coding: Importance - this option applies only to the coding of procedures and was used to explain discrepancies caused by differences of opinion over how significant a procedure must be to warrant coding. This is particularly important in hospitals which routinely code all procedures without regard to the UHDDS definition.

· Coding: Judgment - discrepancies caused by absence of complete word-for-word correspondence between the recording of the diagnosis or procedure in the medical record and/or claim and the wording in the coding manuals, which requires relying on judgment. For example, a diagnosis may be recorded as "recurrent." It is unclear from the record whether "acute"
or "chronic" is the more appropriate qualifier, and these are the only two options available in the coding manual.

· Coding: Other - all reasons not included in any of the above. This option was expected to be used to explain differences when the field team coded a procedure, but Medicare coders did not because the information on the claim forms was illegible or inappropriate. A note of explanation was required if this option was selected.

Reasons for discrepancy - presence of additional diagnoses:

· Completeness - discrepancies resulting from an incomplete review of the medical record in hospitals where additional diagnoses are expected to be noted on Medicare claim forms.

· Hospital definition - discrepancies resulting from a hospital policy whereby only one diagnosis, presumably the principal one, is routinely entered on the Medicare claim form. By definition, no additional diagnoses would appear on the Medicare record.

· Importance - discrepancies resulting from IOM guidelines for determining additional diagnoses. For example, osteoarthritis may be listed as a diagnosis in the medical record, and the Medicare claim form might indicate the presence of an additional diagnosis. However, if osteoarthritis were noted in the record only as an observation, the field team would not regard it as an additional diagnosis. IOM guidelines define an additional diagnosis as one whose presence affects treatment methods or utilization of services. (See Specific Instructions in Appendix D.)

For each discrepancy, the field team was asked to code only one explanatory reason. Frequently a subjective assessment was required to determine which reason might be influential. It is particularly difficult to know whether coding errors should be traced to the hospital or to HCFA. Therefore, the reliability of these responses may be lower than for the remainder of the data. Nevertheless, the potential value of the information was judged to outweigh its partially subjective nature.

After all records had been abstracted and the reasons for discrepancy determined, information from the claim form was abstracted. The diagnoses and procedures were entered on the abstracting form in the order in which they were listed on the claim form. The first-listed diagnosis and related procedure were then coded by the field team, since this procedure is usually followed by Medicare coders.
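The record-by-record comparison of the field team's abstract against the Medicare record (performed by computer, as the next section notes) can be sketched as follows. The field names and the presence-only test for additional diagnoses are illustrative assumptions, not the study's actual file layout:

```python
# Hypothetical sketch of the comparison described above. Field names
# ("principal_dx_code", etc.) are invented for illustration; the study's
# actual abstract and Medicare-record layouts are not reproduced here.

def compare_records(iom_abstract, medicare_record):
    """Return a list of discrepancy labels for one hospital episode."""
    discrepancies = []
    if iom_abstract["principal_dx_code"] != medicare_record["principal_dx_code"]:
        discrepancies.append("principal diagnosis code differs")
    if iom_abstract["principal_proc_code"] != medicare_record["principal_proc_code"]:
        discrepancies.append("principal procedure code differs")
    # For additional diagnoses, only their presence (not identity) was compared.
    if bool(iom_abstract["additional_dx"]) != bool(medicare_record["additional_dx"]):
        discrepancies.append("presence of additional diagnoses differs")
    return discrepancies
```

Each discrepancy flagged this way would then be assigned a single explanatory reason from the taxonomy above.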
To expedite the field work, comparisons between the claims data and the Medicare record were performed by computer at the National Academy of Sciences (NAS), rather than by the field team.

When the narrative summary of diagnoses and procedures is transferred from the claim form into a numerical code by HCFA coders for inclusion in the twenty percent data base, special coding guidelines are occasionally used. The field team did not use these guidelines because of the potential loss of reliability that might result if they altered their usual coding practices. This raises the possibility that discrepancies between the field team's work and the Medicare record may stem from HCFA procedures rather than from inaccurate data submitted by the hospital. Therefore, claims data from all abstracts with discrepancies were coded by senior Registered Record Administrators (RRAs)

employed by HCFA to determine whether the HCFA codes had been correctly applied. In the analysis, special attention was given to those cases in which the IOM abstract and hospital claim agreed but the Medicare record differed. This enabled an estimate of the extent to which HCFA coding accounted for the discrepancy.

Information Checklist

The checklist was developed to describe the flow of information from the medical record department to the hospital billing office and fiscal intermediary (see Appendix E). It was not administered as a formal questionnaire. Instead, it provided an informal guide for the field team to help them understand hospital procedures and the reasons discrepancies might occur.

This information was needed for several reasons. During initial meetings with local hospital and Medicare officials to determine the feasibility of conducting this study, it became apparent that the paths by which claims information eventually enters the HCFA computer vary considerably. In some hospitals, when a patient is discharged the medical record department sends the billing office a discharge list, including the information on diagnoses and procedures needed to complete the claim form. Elsewhere, a copy of the face sheet of the medical record or the discharge summary may be sent to the billing office, where the necessary information is retrieved. If the principal diagnosis is determined by a billing clerk, short and easily spelled diagnoses may receive preference. HCFA requires that information on diagnosis and procedure be provided in narrative (not coded) form. In one hospital visited, diagnostic information was transformed to H-ICDA codes by the medical record department and sent to the billing office, which then translated that information back to narrative for forwarding to the intermediary.
In other instances, the hospital provided coded information to the intermediary, which then converted the codes to narrative for transmittal to HCFA. Occasionally, HCFA may receive claim forms for the twenty percent sample that are already coded. Nevertheless, Medicare coders are instructed to disregard the codes and enter their own.

Although the existence of variation was known, the permutations along that path and their influence on data reliability were not. It therefore became desirable to document the variability and to use that information in analyzing the reliability of the Medicare records. Thus, information was obtained on: the source from which the billing office obtained information on diagnoses and procedures, and the training levels of the persons retrieving that information; the number of days after discharge that diagnostic information was transmitted to the billing office for entry on the claim form; whether later, more definitive diagnostic information was forwarded to the billing office and intermediary; and whether the billing office received narrative or coded information and, if coded, whether it was translated back into narrative.
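The checklist items just enumerated amount to a small structured record per hospital. A minimal sketch of such a record follows; the field names are hypothetical, chosen only to mirror the items listed above, and do not reproduce the actual checklist form in Appendix E:

```python
# Hypothetical representation of the information-flow checklist items.
# Field names are illustrative assumptions, not the study's actual form.
from dataclasses import dataclass

@dataclass
class InformationChecklist:
    diagnosis_source: str        # e.g. "discharge list", "face sheet", "discharge summary"
    retriever_training: str      # training level of the person retrieving the information
    days_to_billing_office: int  # days after discharge that diagnoses reached billing
    late_info_forwarded: bool    # were later, more definitive diagnoses forwarded?
    info_was_coded: bool         # did the billing office receive codes rather than narrative?
    coded_translated_back: bool  # if coded, was it translated back to narrative?
```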

Information on the definition of principal diagnosis and procedure used by each hospital during 1974 was also obtained. This was needed to determine whether the hospital consciously followed Medicare's definition, which may have led to discrepancies, or whether the hospital had developed a unique definition that differed from both the UHDDS and Medicare definitions, which again may have resulted in discrepancies.

FIELD WORK

The field work was conducted by four Registered Record Administrators (RRAs), recruited because of their extensive experience in diagnostic coding, research, and administration. They were specially trained for this study by a consultant under the general guidance of the IOM staff. Two of the field team members and the technical consultant had participated in the previous re-abstracting study and were able to draw upon that experience in refining the methods for this study. The consultant reviewed HCFA's adaptation of the ICDA-8 classification scheme, which consisted primarily of an expansion of fourth-digit codes, and instructed the field team in its use.

Before the field work began, the supervisor of the medical record department in each study hospital was asked to locate and have available the selected medical records. When the field team member arrived, she first informally discussed the checklist items with the supervisor to acquaint herself with any unusual hospital procedures that might influence the abstracts, and then began abstracting.

The completed abstracts were compared with the Medicare records (contained in a sealed envelope to avoid conditioning the abstracting) and any differences between the two were noted. If a discrepancy between the Medicare record and the IOM abstract was observed, the patient's medical record was re-examined to determine what information appeared to be correct and what factors might account for the discrepancy.
For those discrepancies not attributable to the field team member, the claim forms were reviewed. A roster of all records under study had previously been sent to the director of the billing department, so that the claims were available if needed. Details on the re-abstracting process are found in Appendix D.

The field team's ability to fully understand the reasons for discrepancy was somewhat constrained by the break in continuity between the examination of the medical record and the review of the claim form. Ideally, after abstracting the medical record, one would compare the IOM abstract, the Medicare record, and the claim form by placing them side by side and referring to the medical record in hopes of understanding discrepancies. During the feasibility phase of the study it became apparent that this would not be possible. Such a comparison would have required copying or physically removing either the medical record or the claim form from its customary location, which created confidentiality problems.

Furthermore, the difficulty in obtaining claim forms often led to delays or return visits, which might have required accessing the medical record twice and, in any case, would have created logistical problems. Therefore, the method used was regarded as the best compromise.

To check the reliability of the field work, a subsample of abstracts was independently assessed by the consultant who trained the field team. Comparisons were made between these results and those initially compiled by the field team. In conducting this work, the consultant did not know which member of the field team had done the initial abstracting, or whether discrepancies were initially detected. The assessment process and its results are described in Appendix F.

CONFIDENTIALITY

To retrieve the medical records and claim forms within study hospitals, it was necessary for HCFA to provide the IOM with the health insurance claim number, name, and birth date of the beneficiary and the dates of hospital admission and discharge for each hospital episode reviewed. To review the accuracy of diagnostic and procedural coding, a partial review of the medical record was required. Thus, the protection of individual privacy became very important, and special measures were required to assure confidentiality.

All information provided by HCFA was carefully secured. The computer tapes on which the data were compiled for analytic purposes were stored in the NAS computer center, which is accessible only to authorized persons. All information that might identify individuals was removed and destroyed as tape files were created. The report contains statistical summaries only, which do not permit the identification of patient, physician, or hospital. These provisions were carefully delineated in a notice in the Federal Register dated October 14, 1976.

DATA PROCESSING

After completion of the field work, the abstracts and Medicare records were returned to the IOM for data processing.
Each abstract and informal checklist was scanned visually, keypunched and verified, and subjected to computer edits. After the accuracy of the raw data was assured, weights were added for use in the analysis. A single composite weight was assigned to each abstract; it reflected the abstract's probability of inclusion at each step in the sampling process and was adjusted to account for the non-participation of hospitals and the unavailability of medical records. The weights were applied throughout the analysis to permit generalizing to the broader national universe of all 1974 hospital discharges for Medicare beneficiaries age 65 and over who were eligible for inclusion in the Medicare utilization data file. Although the analyses are based on weighted data, the unweighted sample sizes are reported in the tables presented in the following chapter.
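As a rough illustration of such a composite weight, the sketch below takes the inverse of the overall inclusion probability and inflates it for non-response. The two-stage selection probabilities and adjustment factors are invented for illustration; the report does not give the actual values:

```python
# Illustrative sketch of a composite sampling weight. All numeric
# values below are hypothetical; the study's actual selection
# probabilities and adjustment factors are not reported here.

def composite_weight(p_stage1, p_stage2, hospital_participation, record_availability):
    """Inverse of the overall inclusion probability, inflated to
    compensate for non-participating hospitals and unavailable records."""
    base = 1.0 / (p_stage1 * p_stage2)  # probability of selection at each stage
    return base / (hospital_participation * record_availability)

# e.g. a 1-in-10 hospital draw, a 1-in-5 record draw, 90 percent
# hospital participation, 95 percent record availability
w = composite_weight(0.1, 0.2, 0.90, 0.95)
```

In a weighted analysis, each abstract's responses would be multiplied by its weight before tabulation, so that totals estimate the national universe of eligible 1974 discharges.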
