F
Verification of Completeness and Accuracy of Mortality Ascertainment

Assessment of Completeness

We describe the procedural steps, along with their success rates (Table F-1), involved in determining vital status information for the 73,910 Navy personnel considered in most of this study's analyses.

Comparing the rates of missing information for participants and controls indicates the possible influence these gaps might have on the statistics calculated and the inferences drawn from them.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.





TABLE F-1. Procedural Steps and Success Rates for Determination of Vital Status Information

Procedural Step                                            Participants (38,668), %   Controls (35,242), %   Total No.   Total %
All submitted to BIRLS                                     100.0                      100.0                  73,910      100.0
Of those submitted to BIRLS, % found on BIRLS              87.4                       88.1                               87.7
Of those found on BIRLS, % with indication of death        38.7                       37.9                               38.3
Of those with indication on BIRLS of death, indicated by:
  % date of death                                          96.9                       96.9                               96.9
  % FARC folder location only                              3.1                        3.1                                3.1
Of those with indication on BIRLS of death, % with
  claims folder location noted in BIRLS                    80.3                       81.3                               80.8
Of those with claims folder location noted on BIRLS:
  % in VA regional offices                                 71.1                       70.7                               70.9
  % in FARCs                                               28.6                       29.3                               28.9
Submitted to VAMI                                          100.0                      100.0                  24,762      100.0
Of those submitted to VAMI, % found                        92.6                       91.3                               91.9

A veteran could be "not found" on the Beneficiary Identification and Records Locator Subsystem (BIRLS) for several reasons: (a) the record existed, but the Medical Follow-up Agency (MFUA) submitted insufficient information, such as a misspelled name, to identify it; (b) the submitted information was correct, but the BIRLS record includes a misspelling; or (c) the veteran was never entered into BIRLS because neither the veteran nor a surviving dependent had filed a claim for medical, educational, loan, death, or other benefits. Similarly, a claims folder identified by BIRLS could be "not found" because (a) the request went to the wrong VA Regional Office (VARO), (b) the folder was misfiled, (c) the folder was transferred to another VARO, or (d) the folder was transferred to a Federal Archives and Records Center (FARC). Finally, a claims folder may be found yet not contain the death

certificate or the cause of death, or the certificate may have been unusable because it was illegible or poorly copied.

Completeness of Mortality Follow-Up

The completeness of follow-up is displayed in Tables F-2 and F-3 below.

TABLE F-2. Vital Status for Navy Personnel

Vital      Participants        Controls            All
Status     No.       %        No.       %        No.       %
Dead       12,092    31.3     10,804    30.8     22,896    31.1
Alive      21,771    56.3     20,321    58.0     42,092    57.1
LTFU       4,805     12.4     3,911     11.2     8,716     11.8
Total      38,668    100.0    35,036    100.0    73,704    100.0

TABLE F-3. Completeness of Mortality Information

Vital Status      Participants        Controls            All
Information       No.       %        No.       %        No.       %
Cause and date    10,436    86.3     9,649     89.3     20,085    87.7
Cause only        7         0.1      10        0.1      17        0.1
Date only         1,639     13.6     1,135     10.5     2,774     12.1
Neither           10        0.1      10        0.1      20        0.1
Total dead        12,092    100.0    10,804    100.0    22,896    100.0

Assessment of Validity

Mortality Ascertainment

To assess the quality of the vital status information in VA records, we conducted an independent search of two federal, non-VA databases. While none of the three sources is designated as the correct standard, similar findings would support the validity of each of them.

HCFA

All Lost-to-follow-up (LTFU) subjects and samples of Alive and Dead participants and controls were searched in Health Care Financing Administration (HCFA) files (Table F-4). Because HCFA does not record deaths prior to Medicare eligibility, we expected to find an undercount of deaths among the "Dead" and an overcount among the "LTFU." Deaths before 1976 are extremely unlikely to be picked up, as are deaths before age 65. Because HCFA is concerned with medical

benefits to living enrollees, its data are unlikely to confirm as "Alive" someone who is dead.

TABLE F-4. Summary of Vital Status for Study Records Submitted to HCFA

Vital Status from VA/MFUA Records   Participants   Controls   All
Alive                               1,035          891        1,926
Dead                                1,060          964        2,024
Lost to follow-up                   5,649          6,160      11,809
Total                               7,744          8,015      15,759

Methods. The HCFA enrollment files used in this examination of VA mortality ascertainment are of two kinds: the first is called an "alpha search file" and the second a "vital status file." Both contain information about HCFA beneficiaries, primarily Medicare beneficiaries, and their vital status. Other than formatting, the practical difference between the files is that the first is searched using a subject's name and date of birth, while the second uses the Social Security Number (SSN).

For this examination of the completeness of VA mortality reporting, all participants and controls who were lost to VA follow-up (i.e., LTFU, no record in the BIRLS file) as of July 1995 were matched against HCFA enrollment files, as were random samples of about 1,000 presumed dead and 1,000 presumed alive from both participants and controls. All subjects in this exercise were matched against the alpha search file, and subjects with an SSN were also matched against the vital status file.

The central practical methodological issue is the definition of a match. Because there were multiple matches against the alpha search and vital status files, a scheme was developed to identify and classify the "best" match for any single subject. Because everyone was matched against the alpha search file, matches against it were considered first; only if there was no "good" (see below) match against the alpha search file were matches against the vital status file considered. HCFA's algorithm for matching the alpha search file compares various data elements and assigns a point score to each individual matching element. The data elements matched and their respective point scores are as follows: last name (64 points), month of birth (32 points), sex (16 points), first name (8 points), year of birth (4 points), day of birth (2 points), and middle initial (1 point). Sex was not included on our input file, so HCFA awarded no points for matching that element on any individual. Matching scores therefore ranged from a maximum of 111 points (all elements matched, save sex) to zero (no elements matched). Practically speaking, however, matching scores sorted themselves into two obvious groups: 104 points or more (i.e., from last name, first name, and month of birth matches up

through perfect matches) and 103 points or less. Roughly three-quarters of the 23,200 potential matching records fell into the first category, with the other one-quarter in the second. Thus, we considered a good match to be one with a point score of 104 or more. In addition, because we provided an SSN on the alpha search file whenever one was available, it was possible to compare the SSN we provided with the SSN returned by HCFA. When these SSNs differed, the match was discarded as "bad," no matter what the matching score was.

After the alpha search file matching, many subjects still remained unmatched. If a subject had no match or only a bad match, the vital status (SSN search) file was consulted. If there was an exact match on both SSN and last name in the vital status file, that record was considered a good match and added to the file. The results tabulated below are based on the combined alpha search and vital status file matching.

Results. Table F-5 shows the results of the HCFA file matching process. Each of the three groups defined on the basis of VA follow-up (Lost-to-Follow-Up, Alive, and Dead) is shown separately for participants and controls. In this table, deaths after the study mortality cut-off date (31 December 1992) were counted as "Dead"; subsequent tables in this appendix limit "Dead" to veterans who died within the study period.

The LTFU constitute the largest groups of participants and controls. Roughly 80 percent of participant LTFU and 70 percent of control LTFU were matched at HCFA. Disregarding nonmatches, the overwhelming proportion of LTFU were found alive on HCFA: 84.7 percent of participants and 82.7 percent of controls (see discussion below). Roughly 85 percent of living participants and controls in the random sample were matched at HCFA. Again disregarding nonmatches, the overwhelming proportion of these subjects were found alive on HCFA: 92.1 percent of participants and 95.3 percent of controls.

Results for the random sample of VA deaths differed markedly from those for the other groups. Only about 55 percent of Dead participants in the random sample were found at HCFA, compared with around 40 percent of controls. Again disregarding nonmatches, there is still a general validation of the VA data: two-thirds of the participants and three-quarters of the controls were found dead on HCFA. Possible reasons for the high number of nonmatches among deaths are discussed below.

Discussion. In general, vital status data from HCFA validated the VA results. Disregarding nonmatches, the overwhelming proportion of subjects shown as dead by the VA were dead according to HCFA; the same was true for living subjects. In addition, the LTFU group has now been shown to consist mostly of living individuals (only about 8 percent were dead).
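The match-classification scheme described under Methods above (element point scores, the 104-point "good match" threshold, the SSN consistency check, and the vital-status-file fallback) can be sketched in a few lines of code. This is an illustrative reconstruction only; the field names, record layout, and function names are our own assumptions, not HCFA's actual file formats or software.

```python
# Illustrative sketch of the HCFA match-classification scheme described
# above. Field names and record layouts are assumptions, not the actual
# HCFA file formats.

# Point scores HCFA assigns to each matching data element.
POINTS = {
    "last_name": 64,
    "birth_month": 32,
    "sex": 16,          # absent from our input file, so never awarded
    "first_name": 8,
    "birth_year": 4,
    "birth_day": 2,
    "middle_initial": 1,
}
GOOD_MATCH_THRESHOLD = 104  # last name + month of birth + first name, or better

def alpha_score(subject, candidate):
    """Sum the points for every data element that matches exactly."""
    return sum(
        pts for field, pts in POINTS.items()
        if field != "sex"                       # sex was not submitted
        and subject.get(field) is not None
        and subject.get(field) == candidate.get(field)
    )

def classify(subject, alpha_candidates, vital_status_candidates):
    """Return ("good", record) for the best acceptable match, else ("none", None)."""
    # 1. Alpha search file first: keep the highest-scoring candidate.
    best = max(alpha_candidates, key=lambda c: alpha_score(subject, c), default=None)
    if best is not None and alpha_score(subject, best) >= GOOD_MATCH_THRESHOLD:
        # A disagreeing SSN makes the match "bad" regardless of score.
        if subject.get("ssn") and best.get("ssn") and subject["ssn"] != best["ssn"]:
            best = None
        else:
            return ("good", best)
    # 2. Fallback: vital status (SSN search) file; an exact match on both
    #    SSN and last name counts as a good match.
    for cand in vital_status_candidates:
        if (subject.get("ssn") is not None
                and subject.get("ssn") == cand.get("ssn")
                and subject.get("last_name") == cand.get("last_name")):
            return ("good", cand)
    return ("none", None)
```

Note that a perfect alpha-search score cannot rescue a record with a conflicting SSN; such a record falls through to the vital status file, mirroring the discard rule described in the text.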

TABLE F-5. Comparison of VA and HCFA Vital Status for Operation CROSSROADS Participants and Controls

                        Participants                           Controls
MFUA      HCFA
Status    Status        Number   % of Total   % of Matches    Number   % of Total   % of Matches
LTFU      Alive         3,883    68.7         84.7            3,628    59.0         82.7
          Dead*         701      12.4         15.3            757      12.3         17.3
          No match      1,065    18.9         —               1,775    28.8         —
          TOTAL         5,649    100.0        100.0           6,160    100.0        100.0
ALIVE     Alive         813      78.6         92.1            731      82.0         95.3
          Dead*         70       6.8          7.9             36       4.0          4.7
          No match      152      14.7         —               124      13.9         —
          TOTAL         1,035    100.0        100.0           891      100.0        100.0
DEAD      Alive         143      13.5         24.9            39       4.0          10.8
          Dead*         431      40.7         75.1            323      33.5         89.2
          No match      486      45.8         —               602      62.4         —
          TOTAL         1,060    100.0        100.0           964      100.0        100.0

* In this table, deaths after the study mortality cut-off date (31 December 1992) counted as "Dead." Subsequent tables in this appendix limit "Dead" to veterans who died within the study period.

Two additional topics merit further discussion. First, deaths occurring early in the follow-up may have occurred before the decedent could have been enrolled as a HCFA beneficiary (for example, at age 65, for Medicare), and so should not be expected in the HCFA files. This seems the most likely explanation for the low match rate among decedents.

Second, the fact that the bulk of the LTFU were found alive on HCFA could have been predicted. A number of studies (Beebe and Simon 1969; Page 1992; Page et al. 1995) have shown that VA death reporting is roughly 95 percent complete for World War II veterans. Add to this an estimated death rate of around 30 percent and an LTFU rate of 15 percent, and the arithmetic works out as follows. In a group of some 1,000 World War II veterans, 700 will be living and 300 dead. Of the 300 dead, 285 (95 percent) are known to BIRLS and 15 (5 percent) are unknown. Of the 1,000, 850 (85 percent) will be found on BIRLS and 150 (15 percent) will be LTFU. From this, it follows that there are only 15 LTFU deaths among the 150 LTFU subjects, a rate of 10 percent, which is one-third of the overall death rate and consistent with our HCFA data. That the bulk of the LTFU in our sample were confirmed alive is thus as one would have expected.

NDI

With related but different accuracy checks in mind, we submitted to the National Death Index (NDI) all 4,107 deaths for which, at the time, we had no death certificate (and, therefore, no cause of death) and which occurred after NDI coverage began (1979) and before the study follow-up cut-off date (31 December 1992). These were sent to obtain the information needed to request death certificates from state vital statistics offices. We also sent a sample (about 250 in each category) of alive and lost-to-follow-up participants and controls for verification of vital status.
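The back-of-the-envelope arithmetic in the HCFA discussion above can be checked in a few lines; the inputs (30 percent death rate, 95 percent completeness of VA death reporting, 15 percent LTFU) are the figures quoted in the text.

```python
# Check of the expected LTFU death rate derived in the text, for a
# hypothetical cohort of 1,000 World War II veterans.
cohort = 1000
dead = int(cohort * 0.30)          # 300 dead, so 700 living
known_dead = int(dead * 0.95)      # 285 deaths known to BIRLS
unknown_dead = dead - known_dead   # 15 deaths missed by BIRLS
ltfu = int(cohort * 0.15)          # 150 subjects lost to follow-up

# All missed deaths fall in the LTFU group, so:
ltfu_death_rate = unknown_dead / ltfu
print(ltfu_death_rate)  # 0.1, one-third of the overall 30% death rate
```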
Our NDI request, unlike our expectations of the HCFA data, was therefore structured so that all MFUA-classified "Dead" records should be found as "Dead" on NDI. However, we would expect some of the MFUA-classified "Alive" and a larger portion of the MFUA-classified "LTFU" to appear in the NDI database as "Dead." The reason is that for the former we had some positive information in BIRLS indicating that a death had not occurred, while for the latter we had no information at all. NDI characterized the 5,108 records MFUA submitted as matched, nonmatching, or rejected. Of those matching, a user (MFUA) record could match one or more NDI records (see the distribution in Table F-6).

TABLE F-6. Number of Potential Matches for Each Record Submitted to NDI

No. of NDI Matches Returned for Each Record Submitted   No. of MFUA Records Submitted
1 NDI record                                            2,259
2 NDI records                                           646
3                                                       386
4                                                       230
5                                                       150
6                                                       109
7                                                       90
8                                                       71
9                                                       53
10                                                      41
≥11 NDI records                                         424
MFUA records involved in matches                        4,459
Nonmatching user records                                540
Records rejected                                        9
Total submitted by user                                 5,108

Of the 5,108 records MFUA sent, NDI proposed at least one match for 4,459. MFUA accepted 3,656 of these as "good" for this analysis. The remaining 803 MFUA records with at least one NDI match can be, but have not been, reviewed by hand and eye to judge the discrepancies. The results of the NDI search are shown in Table F-7. Some of the 803 "questionable matches" would probably be found to be correct, that is, "Dead," on NDI; the percentages in columns 4, 7, and 10 would then go up, so view them as lower bounds.

External Validation Discussion

Mortality Information

Despite differences in the samples sent, the years of coverage, and the technical approaches to matching, the NDI and HCFA results are consistent with each other, both for those we considered alive and for those who were Lost-to-Follow-Up (Table F-8). Of participants whose MFUA records indicated vital status as "Alive," HCFA and NDI each found 3.6 percent to be "Dead" (positive information in their databases indicating a death). Both data sources reported fewer controls in this category (NDI, 1.2 percent; HCFA, 2.2 percent).

TABLE F-7. Comparison of NDI and MFUA Mortality Information

             Participants                  Controls                     Total
MFUA                    On NDI (= dead)              On NDI (= dead)              On NDI (= dead)
Status       Submitted  No.     %*         Submitted  No.     %         Submitted  No.     %
Alive        251        9       3.6        250        3       1.2       501        12      2.4
Dead         2,430      2,112   86.9       1,677      1,491   88.9      4,107      3,603   87.7
LTFU         250        22      8.8        250        19      7.6       500        41      8.2
Total        2,931      2,143              2,177      1,513             5,108      3,656

* Of the 803 "questionable matches," some would probably be found to be correct, that is, "Dead," on NDI. The percentages in columns 4, 7, and 10 would then go up; view them as lower bounds.

TABLE F-8. NDI and HCFA Comparisons

MFUA Status →          Participants       Controls
External Response      NDI      HCFA      NDI      HCFA
Alive → Dead           3.6%     3.6%      1.2%     2.2%
Dead → Dead            86.9%    34.2%     88.9%    28.3%
LTFU → Dead            8.8%     7.3%      7.6%     7.8%

HCFA was created in the 1970s to oversee the financial management of the Medicaid and, later, the Medicare programs. Its data quality is maintained for current administrative purposes, and records are updated when the information might be relevant to benefits decisions; hence, recent deaths are well recorded (tied to the stopping of Social Security benefits), but earlier ones might not be. While the HCFA data were consistent with expectations and therefore somewhat informative, they are not a good fit for a mortality study, such as the CROSSROADS study, in which a substantial number of deaths occurred before 1980. Less than a third of those MFUA considered "Dead" were reported as "Dead" on the HCFA enrollment tapes; many deaths in our study cohort occurred before 1980.

Vital status records in two databases external to the VA confirm our expectation that the individuals we label "Lost to follow-up" are probably "Alive." Looking at controls and participants together, NDI reported as "Dead" 2.4 percent of individuals MFUA considered "Alive," 8.2 percent of those MFUA considered LTFU, and 87.7 percent of those MFUA considered "Dead." For those categories, HCFA reported a similar trend with lower values (see the preceding paragraph). Therefore, we believe we are not introducing large additional bias into our analyses by considering the LTFU individuals to be "not dead."

Cause-of-Death Information

To evaluate the reliability of cause-of-death coding according to the ICD-9 classification system, we submitted already coded death certificates to a contract nosologist for recoding, without identifying them as anything other than routine work (Table F-9).
Because of specific concerns about the coding of leukemia, an endpoint of radiation exposure, all 166 death certificates available at the time that noted leukemia were recoded, along with all records noting hematopoietic or lymphatic cancers. All records noting other diseases of the blood and blood-forming organs were also included, to check whether any leukemias might have been so coded. Also recoded were samples of death certificates indicating all other malignant neoplasms, noncancer disease deaths, and external causes of death.

TABLE F-9. Distribution of Causes of Death for 641 Death Certificates Sent for Recoding

No. of Death Certificates   Disease Category
166                         All leukemias
244                         All other hematopoietic or lymphatic cancers
30                          All other diseases of the blood and blood-forming organs
102                         Sample of all other malignant neoplasms
83                          Sample of noncancer disease deaths
16                          Sample of external causes of death

Of the 410 records sent for recoding because they were initially coded with leukemia (ICD-9 204.0–208.9) or another lymphopoietic cancer (ICD-9 200.0–203.8)32 as the underlying cause of death, there were 33 discrepant pairs. Nineteen of those discrepancies did not affect the cause-of-death analysis category to which we would have assigned the case in our analyses. The remaining 14 discrepancies fit into three groups:

Five (5) discrepancies involved moving from one lymphopoietic cancer category to another (e.g., leukemia to cancer of other lymphatic tissue), so analysis of the combined lymphopoietic cancer group would not have been affected, although each specific category would have been.

Four (4) discrepancies involved moving from a lymphopoietic cancer category to another disease category that we will explore in the study analyses (e.g., multiple myeloma to disease of the circulatory system), so the deaths would be considered in the analysis, but not as lymphopoietic cancer deaths.

Five (5) discrepancies involved moving from a lymphopoietic cancer category to a disease category that will not be separately considered in this study (e.g., cancer of other lymphatic tissue to herpes zoster), so the deaths would be included only in the "all-cause" category.

Considering leukemias alone, there were 18 discrepant pairs (18/166 = 10.8 percent). Only one of these affected the analysis category (leukemia and aleukemia), with that case moving from leukemia to the more general all lymphopoietic cancers. In an analysis of leukemias alone, this death (1/166 = 0.6 percent) would have been missed.
Conversely, one death originally coded as lymphosarcoma or reticulosarcoma was recoded as a leukemia. The comparisons we make here between codings only identify discrepancies; they do not show which member of each pair is correct. Other proportions support a similar view of the discrepancies:

32   Code ranges chosen to match the NCI mortality tables (NCI 1995).

Of the lymphopoietic cancer deaths, 8.0 percent (33/410) had discrepancies in the coded underlying cause of death; all others matched exactly. A smaller proportion, 3.4 percent (14/410), of the lymphopoietic cancer deaths had discrepancies in the coded underlying cause of death that would have changed the cause-of-death category used in the study analyses.

Of the 201 certificates drawn as a sample from other causes of death (coded as all other malignant neoplasms, noncancer diseases, and external causes), 8 (4.0 percent) were in discrepancy categories with a high likelihood of changing the analysis category.

Thirty deaths from other diseases of the blood and blood-forming organs (ICD-9 282–289) were recoded to see whether any would turn up coded as leukemias. There were five discrepant pairs, but none of them resulted in a leukemia.
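As a quick sanity check, the discrepancy proportions quoted above follow directly from the counts given in the text; this trivial verification uses only those published numbers.

```python
# Verify the discrepancy proportions quoted in the text.
lymphopoietic_total = 166 + 244   # leukemias + other lymphopoietic cancers
assert lymphopoietic_total == 410

any_discrepancy = 33 / 410        # any discrepant underlying-cause code
category_change = 14 / 410        # discrepancies changing the analysis category
other_sample = 8 / 201            # other-cause sample, likely category change

print(round(100 * any_discrepancy, 1))   # 8.0
print(round(100 * category_change, 1))   # 3.4
print(round(100 * other_sample, 1))      # 4.0
```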