8
Mortality Ascertainment

Do atomic test participants have a reduced life expectancy compared to nonparticipants? Are they at increased risk for certain causes of death? Could this be related to radiation exposure? Our basis for addressing these questions is a comparison of death rate, timing, and cause of death for the two cohorts. Correct ascertainment of mortality data, therefore, is crucial to the validity of this epidemiologic study. In this chapter, we first describe ascertainment methods, verification activities, and validation analyses, and then proceed to an assessment of success.

FACT-OF-DEATH ASCERTAINMENT

As described in Chapter 4, the Department of Veterans Affairs (VA) Beneficiary Identification and Records Locator Subsystem (BIRLS) is the sole source of fact-of-death ascertainment for this report's dataset. If a person's record was not found in the BIRLS database, the VA Master Index was searched for additional descriptive information (e.g., military service number or a middle name) that might allow a connection to a BIRLS record. BIRLS information, then, results in a defined set of possible mortality ascertainment outcomes:

· known dead: indication of death in the BIRLS database, and
· not known dead: no indication of death in the BIRLS database.

Each of these is composed of subgroups described by the availability of other pieces of information. An individual is classified as known dead if the BIRLS database (1) explicitly refers to a death, giving a date or a cause, or (2) lists the location of the VA claims folder as the federal archives. Not known dead is the accurate way of referring to individuals who in other studies might be classified
as "alive" or "lost to follow-up." What we do know about these individuals is that either the BIRLS database (1) has a record of the individual but no reference to death or federal archives or (2) contains no reference at all to the individual.

The BIRLS procedure identified 38,055 deaths among the 132,949 members of the two cohorts. The 1,865 of these deaths that occurred after our defined end of follow-up (December 31, 1996) were treated as alive for the analyses presented in this report. Establishing a calendar cutoff for dates of death is necessary to allow time for adequate cause-of-death follow-up activities. The remaining 36,190 deaths constitute 27.2 percent of the combined cohorts. Tables 8-1 and 8-2 present the vital status categories for the participant and referent cohorts.

TABLE 8-1. Vital Status as of December 31, 1996

                        Participants       Referents          Total
                        (n = 68,168)       (n = 64,781)       (n = 132,949)
Vital Status            No.      % of      No.      % of      No.       % of
                                 Cohort             Cohort              Cohort
Not known dead (no)     49,651   72.8      47,108   72.7      96,759    72.8
Known to be dead (yes)  18,517   27.2      17,673   27.3      36,190    27.2
Total                   68,168   100.0     64,781   100.0     132,949   100.0

FACT-OF-DEATH VALIDATION

BIRLS is the only source of fact of death in this study. How complete is BIRLS as a record of veterans' deaths? If it does not capture almost all deaths, mortality studies based on these data would be inaccurate. If it selectively captures certain kinds of deaths or deaths of certain kinds of veterans, inferences based on its data could be biased. BIRLS was searched for a record of each member of the combined study cohorts. Not all individuals were found: 23.4 percent of the participants and 24.8 percent of the referents were not found in BIRLS.
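The bookkeeping behind these counts can be reproduced in a few lines. This is a quick arithmetic check using figures stated in the text, not study code:

```python
# Reproduce the chapter's fact-of-death bookkeeping (figures from the text).
total_cohort = 68_168 + 64_781      # participants + referents
birls_deaths = 38_055               # deaths identified via the BIRLS procedure
after_cutoff = 1_865                # deaths after December 31, 1996 (treated as alive)

known_dead = birls_deaths - after_cutoff
print(known_dead)                                   # -> 36190
print(f"{known_dead / total_cohort:.1%}")           # -> 27.2%
```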
A veteran might not be found in BIRLS for varied reasons: (1) the record existed, but MFUA submitted insufficient information, such as a misspelled name, to identify it; (2) the requesting information was correct, but the BIRLS record includes a misspelling; or (3) a veteran was never entered into BIRLS because neither the veteran nor a surviving dependent had filed a claim for medical, educational, loan, death, or other benefits. Similarly, a claims folder identified by BIRLS might not be found because (1) the request went to the wrong VA regional office (VARO), (2) misfiling had occurred, (3) the file was transferred to another VARO, or (4) the file was transferred to a regional archives center. Finally, a claims folder may be found but not contain the death certificate, the cause of death, or a legible copy of the certificate. For these reasons, we sought corroboration of fact of death from other sources.
TABLE 8-2. [Table contents not recoverable from the scanned text.]
Since 1979, the National Death Index (NDI), maintained by the National Center for Health Statistics, has assembled death certificate-derived mortality data from each of the 50 U.S. states, New York City, and the District of Columbia, as well as U.S. territories and protectorates. We first requested information on two 500-member samples of the participant and referent cohorts that had BIRLS records both without indication of death and with a BIRLS-noted Social Security number (SSN). Requiring an SSN allows for an efficient search within NDI and a check that the person identified in NDI is the same person in the study population. NDI identified as dead 1.4 percent of the not-known-dead participant cohort sample and 1.8 percent of the not-known-dead referent cohort sample. These two rates were not statistically different (p = .614). Applying these rates to all of the not-known-dead individuals with BIRLS-noted Social Security numbers (21,513 participants and 16,917 referents), we estimate that 301 participant cohort deaths and 305 referent cohort deaths were not identified by the BIRLS procedures. These additions would increase the BIRLS-based study mortality rate from 27.2 to 27.7 percent (participants, from 27.2 to 27.6 percent; referents, from 27.3 to 27.8 percent). However, we do not have SSNs for a large portion of the study population. Among the not-known-dead members of the two cohorts, approximately 41 percent of the participants and 63 percent of the referents do not have any SSN in our database. The participant data that the Defense Threat Reduction Agency provided for this study from the Nuclear Test Personnel Review (NTPR) Program database include SSNs for some of the participant cohort who did not have SSNs listed in the BIRLS database. This NTPR source of information was not available for the referent cohort.
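The sample-based projection above can be reproduced directly. The rates and denominators below come from the text; the rounding convention is an assumption on our part:

```python
# Project the NDI sample death rates onto all not-known-dead members
# with BIRLS-noted SSNs (rates and counts from the text).
participants_with_ssn = 21_513
referents_with_ssn = 16_917

extra_participant_deaths = round(0.014 * participants_with_ssn)  # 1.4% NDI hit rate
extra_referent_deaths = round(0.018 * referents_with_ssn)        # 1.8% NDI hit rate
print(extra_participant_deaths, extra_referent_deaths)           # -> 301 305

# Effect on the combined study mortality rate
known_dead, total_cohort = 36_190, 132_949
new_rate = (known_dead + extra_participant_deaths + extra_referent_deaths) / total_cohort
print(f"{new_rate:.1%}")                                         # -> 27.7%
```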
Because NTPR Social Security numbers exist only for participants, using them in NDI searches would have created nonequivalent mortality ascertainment procedures for the two cohorts and thereby biased the ascertainment of the outcome data; the study design therefore excluded them. However, the availability of NTPR Social Security numbers for participants did allow us to estimate how many deaths might have been ascertained if we had more complete SSN coverage. Thus, we submitted two additional 500-member samples of not-known-dead participants with NTPR SSNs to NDI. One group was in the BIRLS database without a BIRLS-noted SSN, and one group was not in the BIRLS database at all. Because both BIRLS identification and SSN availability are associated with both vital status and the ascertainment of vital status, we wanted to use these samples to estimate the size of any differential in mortality rates that might stem from differences in information ascertainment rather than from an effect of participation. Although these estimates were not used to adjust the analysis, they are useful in discussing the extent to which deaths have been missed and how imbalanced ascertainment could influence study findings. NDI identified as dead 4.6 percent of the not-known-dead participant cohort sample that was found in BIRLS without a BIRLS-noted SSN and 5.6 percent of the not-known-dead participant cohort sample that was not found in the BIRLS database at all.
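Combining these sample rates with the chapter's group counts yields its estimates of deaths missed by BIRLS and the implied ascertainment rates. All figures below are taken from the text; the script only checks the arithmetic:

```python
# Estimated deaths missed by BIRLS, by subgroup (figures from the text).
missed_participants = (
    301                      # BIRLS record with BIRLS-noted SSN (NDI sample, 1.4%)
    + round(0.046 * 3_957)   # BIRLS record, NTPR SSN only (4.6%) -> 182
    + round(0.056 * 2_896)   # no BIRLS record, NTPR SSN (5.6%) -> 162
    + 411                    # BIRLS record, no SSN at all
    + 593                    # no BIRLS record, no SSN at all
)
missed_referents = 305 + 609 + 900

known_dead_participants, known_dead_referents = 18_517, 17_673
asc_participants = known_dead_participants / (known_dead_participants + missed_participants)
asc_referents = known_dead_referents / (known_dead_referents + missed_referents)
print(missed_participants, missed_referents)        # -> 1649 1814
print(f"{asc_participants:.1%} {asc_referents:.1%}")  # -> 91.8% 90.7%
```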
THE FIVE SERIES STUDY

Although ascertainment was not complete, these estimates provide a not-so-alarming approximation of the underascertainment of deaths in this study. There are 3,957 participants in the first group and 2,896 in the second. Applying the 4.6 and 5.6 percent sample estimates to the full groups would yield 182 deaths in the first group and 162 in the second. Applying these same rates to the participants without NTPR (or BIRLS) SSNs, we estimate an additional 411 deaths among the participants in BIRLS with neither BIRLS nor NTPR SSNs and 593 deaths among the participants with no record in BIRLS and no NTPR SSNs. Adding all of these groups together, we estimate that BIRLS did not identify 1,649 participant deaths. Therefore, the estimated BIRLS ascertainment rate for participants is 91.8 percent.

For the referent cohort, which has no NTPR SSNs at all, we must use participant data to produce ascertainment estimates. Applying the 4.6 percent additional death ascertainment to referents in BIRLS but without an SSN yields 609 deaths; 5.6 percent additional deaths among referents not in BIRLS at all amounts to 900 deaths. Taken together, an estimated 1,814 referent cohort deaths were not ascertained by the BIRLS procedure, yielding an ascertainment rate of 90.7 percent for the referent cohort.

Relatively few formal studies have been undertaken to determine the completeness of veteran death reporting via the BIRLS system, most of them involving either World War II or Vietnam era veterans. Studies of deaths among World War II veterans (Page, 1992; Page et al., 1995) estimated, respectively, that 92 and 95 percent of veteran deaths could be found in BIRLS. Studies of deaths among Vietnam era veterans (Page, 1993; Page et al., 1996) generally showed slightly lower percentages of BIRLS completeness, about 90 percent, except that Boyle and Decoufle (1990) found BIRLS to be only 80 percent complete. A study by Fisher et al. (1995) of a group of hospitalized, largely pre-Vietnam-era veterans showed that BIRLS was 96 percent complete for death ascertainment. Although the methods employed across these studies varied, all except the Boyle and Decoufle study showed the completeness of veteran death reporting in BIRLS to be 90-95 percent. Although the veterans studied here are, for the most part, neither World War II nor Vietnam era veterans, we believe that the completeness of death reporting in BIRLS is roughly the same among the veterans in the present study.

DATE OF DEATH

BIRLS was the principal source of death dates for the study analyses (see Table 8-3). An actual date was noted for 97.2 percent of the known dead individuals. No date of death was identified for less than 0.1 percent of the known deaths. Another 2.1 percent of the death dates were obtained from the VA Master Index, the death certificate, or NDI. For most of the remaining deaths, we were able to calculate an approximate date of death based on the date a record was transferred from a VA regional office to a federal archives center. This estimate is possible because the VA sends to the archives only those VA benefit claims records that are inactive due to the death of the veteran and any surviving beneficiaries.
TABLE 8-3. [Source of date of death; table contents not recoverable from the scanned text.]
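The chapter imputes missing death dates from the folder-transfer date by measuring, on records that have both dates, the typical death-to-transfer lag per transfer year and subtracting it. Below is a minimal sketch of that idea with made-up dates and a median lag; the chapter does not specify which summary statistic it used, so treat both as illustrative assumptions:

```python
from collections import defaultdict
from datetime import date, timedelta
from statistics import median

# Records with both a BIRLS-noted death date and a folder-transfer date
# (all dates here are illustrative, not study data).
complete = [
    (date(1980, 3, 1), date(1981, 2, 1)),   # (death, transfer)
    (date(1980, 6, 1), date(1981, 8, 1)),
    (date(1985, 1, 1), date(1985, 9, 1)),
]

# Typical death-to-transfer lag in days, grouped by transfer year
lags_by_year = defaultdict(list)
for death, transfer in complete:
    lags_by_year[transfer.year].append((transfer - death).days)
median_lag = {yr: median(days) for yr, days in lags_by_year.items()}
overall_lag = median(d for days in lags_by_year.values() for d in days)

def impute_death_date(transfer: date) -> date:
    """Estimate the death date by backing out the typical lag for that year."""
    lag = median_lag.get(transfer.year, overall_lag)  # fall back if year unseen
    return transfer - timedelta(days=round(lag))
```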
We used the records with both a BIRLS-noted date of death and a date of folder transfer to the archives to calculate the lag time between death and record transfer. Because the efficiency of both the VA and the National Archives and Records Administration (NARA) may have varied over the years, we calculated these lags by year. These lags, generalized to multiyear periods as appropriate, were then applied to the 204 records that had only the record transfer date to impute a date of death.

CAUSE-OF-DEATH ACQUISITION

The two sources of cause-of-death information are both death certificate based: the death certificate itself and electronic tapes compiled from the death certificates. The BIRLS database provides the location of the claims folder: a specific VA regional office (VARO) or a specific federal archives center (FARC). Following established VA and NARA procedures, we requested that the VARO and archives staff pull the folder and send us a copy of the death certificate for each death. Our contract nosologist supplied codes for all causes of death listed and selected one as the underlying cause and the others, if any, as associated causes.

In cases in which the VAROs and FARCs could not produce a death certificate and for which we had a date of death, we requested death certificate information from NDI-Plus if the death occurred in 1979 or later. NDI-Plus returned an electronic tape with identifying information and underlying and associated causes of death.

Tables 8-4 and 8-5 are limited to those members of the study population who are known to have died (excluding those who died after December 31, 1996). Of these 36,190 individuals, a cause of death is not available for 5.9 percent. The difference between the participant cohort's 4.5 percent and the referent cohort's 7.3 percent is statistically significant.
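The reported significance of the 4.5 versus 7.3 percent gap can be checked with a two-proportion z-test. The counts come from Table 8-4; the choice of test is ours, since the chapter does not state which one was used:

```python
import math

# Missing cause-of-death counts (from Table 8-4).
x1, n1 = 842, 18_517     # participants with death indicated, cause missing
x2, n2 = 1_295, 17_673   # referents with death indicated, cause missing

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
print(f"p1={p1:.1%} p2={p2:.1%} z={z:.1f}")
```

The resulting z statistic is far beyond conventional significance thresholds, consistent with the text's statement.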
For the causes of death that we did obtain, 65.5 percent came from the death certificate and 34.5 percent from the National Death Index-Plus.

CAUSE-OF-DEATH VALIDATION

To determine the level of agreement between the two sources of cause-of-death codes, we processed a sample of 200 records through both ascertainment paths. Neither source (the contract nosologist nor the NDI-Plus database) was considered the standard; discrepancies were counted, not correct and incorrect codes. Eleven of the underlying cause-of-death codes were sufficiently different that the death would be assigned to a different cause-specific analysis group. (Another 10 had differences [e.g., in the fourth digit of the International Classification of Diseases code] that exceeded the level of detail examined in this report.) For 4 of the 11, the two sources had the same codes but specified different ones as the underlying cause of death.
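The 11-in-200 major-discrepancy rate carries sampling uncertainty that a confidence interval can make explicit. The sketch below uses a Wilson score interval; that choice is ours, not the chapter's:

```python
import math

# Major discrepancies: underlying cause assigned to a different
# cause-specific analysis group (11 of 200 dual-coded records).
disc, n = 11, 200
p = disc / n                      # 5.5% observed discrepancy rate
z = 1.96                          # 95% confidence level

# Wilson score interval for a binomial proportion
denom = 1 + z**2 / n
center = (p + z**2 / (2 * n)) / denom
half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
lo, hi = center - half, center + half
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

The interval is wide enough (roughly 3 to 10 percent) that the single-sample rate should be read as an order-of-magnitude estimate of coding disagreement.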
TABLE 8-4. Cause-of-Death Availability, Deaths Only

               Participants with     Referents with       Total with
               Death Indicated       Death Indicated      Death Indicated
Availability   No.       %           No.       %          No.       %
Available      17,675    95.5        16,378    92.7       34,053    94.1
Missing        842       4.5         1,295     7.3        2,137     5.9
Total          18,517    100.0       17,673    100.0      36,190    100.0

TABLE 8-5. Cause-of-Death Source, Deaths Only

                    Participants with     Referents with       Total with
                    Death Indicated       Death Indicated      Death Indicated
Source              No.       %           No.       %          No.       %
Death certificate   11,893    67.3        10,422    63.6       22,315    65.5
NDI-Plus            5,782     32.7        5,956     36.4       11,738    34.5
Total               17,675    100.0       16,378    100.0      34,053    100.0

NOTE: NDI = National Death Index.

We looked at the records that had a malignant neoplasm in any of the cause fields from either source to determine whether cancers, the prime endpoint of this study, were noted similarly by the two coding sources. There were 74 records with malignancy codes; of these, 6 were discrepant in the underlying cause-of-death field. Five of these involved the selection of the underlying cause from among all listed causes. Of the six discrepancies, three do not affect the analysis of the broad category of all-malignancy deaths but, because they select a different site-specific cancer, would affect that level of analysis.