

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




APPENDIX D

Verification of Completeness and Accuracy of the Participant Roster

In the early 1990s, the Defense Threat Reduction Agency (then the Defense Nuclear Agency) announced that the personnel dataset it had provided MFUA contained substantial errors of inclusion and exclusion. Because this dataset was the basis for MFUA's Five Series Study (FSS) published in 1985, the U.S. General Accounting Office (GAO), the congressional Office of Technology Assessment (OTA), concerned members of Congress and their staffs, and MFUA itself recommended redoing the mortality analyses using a corrected dataset. Using GAO estimates of required additions (28,215) and deletions (14,854), the corrected dataset would have 59,547 participants; OTA estimates (15,000 additions and 4,500 deletions) would yield 56,686 participants. These classification errors were discovered by NTPR in the process of updating its participant database following its 1987 consolidation of the databases previously maintained by each branch of service.

Verification of the completeness and accuracy of the participant file is important to any study and of special concern for this one given its history. In Chapter 5, we describe the development of the participant cohort used in the analyses for this report. This appendix presents the detailed verification and validation work we did to assure ourselves and the reader of the validity of this roster.

We pursued two avenues of validation. The first was a comparison of the 1985 participant roster with the 1999 participant roster.¹ In the second, we compared the 1999 roster with participant lists compiled independently of the Nuclear Test Personnel Review (NTPR) database.

¹For this chapter, we refer to the dataset on which the analyses reported in this publication are based as the 1999 data, in keeping with the report publication date and parallel to references to the 1985 data for the earlier report. The datasets for each of these reports, however, were constructed and frozen prior to the reported analyses.

COMPARISON TO THE PARTICIPANT ROSTER USED FOR THE 1985 STUDY

By comparing the current participant dataset to the 1985 version (Robinette et al., 1985) and seeking verification of participation for sampled individuals, we were able to describe the differences between the two rosters and comment on the reasons for the changed counts. We did not change the 1999 participant data based on our findings of the comparison with the 1985 data. Rather, we used the information to describe the completeness of the dataset and to comment on the way any incompleteness might affect the 1999 study findings.

Computer File Match

MFUA staff created computer programs to select participant records that matched on the DNA lists provided for both the 1985 study and the current study. Because military service numbers are printed in varied formats, we truncated the alphabetical prefixes and added leading zeros where necessary.

Method A: A match was sought for the complete military service number (MSN; looking at all four MSN fields on the R90 dataset) plus the first five characters of the last name and the initial character of the first name.

Method B: Matches were sought for the full first name and full last name; MSN was then checked by hand to detect similarities.

Comparing the 49,148 records in the 1985 data file (which includes clearly erroneous entries that correctly had been deleted from the cohort for the 1985 publication) and the 68,168 in the 1999 file, matches were found by methods A or B, above, for 38,729 individuals. These matching programs designated certain records as discrepancies. These records do not match exactly on all available variables (last name, first name, date of birth [DOB], Social Security number [SSN], military service number [MSN]) but do match on some loosely defined criteria (documented below).
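The truncation, zero-padding, and key construction behind methods A and B can be sketched roughly as follows. This is an illustrative reconstruction only: the record fields ("msn", "last", "first") and the 10-character pad width are assumptions, not the actual R90 file layout or the MFUA programs.

```python
# Illustrative sketch of the method A / method B matching described above.
# Field names and the pad width are assumptions; the R90 layout is not known.

def normalize_msn(msn):
    """Truncate alphabetical prefixes and add leading zeros."""
    digits = msn.upper().lstrip("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
    return digits.zfill(10)

def method_a_key(record):
    # Complete MSN plus first five characters of the last name and the
    # initial character of the first name.
    return (normalize_msn(record["msn"]),
            record["last"][:5].upper(),
            record["first"][:1].upper())

def method_b_key(record):
    # Full first and last names; MSN similarity is then checked by hand.
    return (record["last"].upper(), record["first"].upper())

def find_matches(roster_1985, roster_1999, key_fn):
    index = {}
    for rec in roster_1999:
        index.setdefault(key_fn(rec), []).append(rec)
    return [(rec, index[key_fn(rec)])
            for rec in roster_1985 if key_fn(rec) in index]

# Toy example: differently formatted service numbers normalize to one key.
r85 = [{"msn": "AF0765432", "last": "Berger", "first": "John"}]
r99 = [{"msn": "765432", "last": "Berger", "first": "James"}]
print(len(find_matches(r85, r99, method_a_key)))  # 1
```

Note that method A tolerates the formatting differences described in the text ("AF0765432" and "765432" both normalize to "0000765432"), while method B casts a wider net over name fields and defers the service-number comparison to human review.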
Reviewing Discrepancies by Hand

Reviewing the first few pages of discrepancy lists produced by the computer matching program, we noted for each discrepant pair an opinion: match, probably a match, could be a match (not enough information), probably not a match, and not a match. Table D-1 presents the criteria we set for use in judging whether two entries matched.

TABLE D-1. Instructions to Staff: Common Errors

Be alert for common errors based on:      One File        Other File
Number readability problems               9               0
Letter readability problems               M; D            N; P
Adjacent-digit typing errors
Missed hyphens                            MEDINADIAZ      MEDINA-DIAZ
Typist using familiar patterns            CK; MAC-;       C; MC-;
                                          -OR; -MAN;      -ER; -MEN;
                                          -L; BURGER      -LL-; BERGER
Formatting differences
  Leading zeros                           00001234        1234
  Ending zeros                            12340000        00001234
  Letters within a number string          AF42899         42899

For example, the examples from two files in Table D-2 would probably be true matches.

TABLE D-2. Instructions to Staff: Examples

Nature of Discrepancy                     One File        Other File
Understandable discrepancy on DOB         241100          241105
Understandable discrepancy on SSN         123-45-6789     123-45-6780
Understandable discrepancy on MSN         765432          0000765432
                                                          7654320000
                                                          AF0765432
                                                          765482
                                                          764582

NOTE: DOB = date of birth; MSN = military service number; and SSN = Social Security number.

Once we determined whether the MSN, SSN, and DOB information from the 1985 list and the 1999 listed record matched sufficiently, we judged whether the record matched, using a set of decision rules arrayed in Table D-3. Thirty-six combinations were possible. The decisions noted in uppercase occurred in the sample; the lowercase decisions are what we would have chosen had these combinations occurred.

TABLE D-3. Instructions to Staff: Availability and Consistency of Identification Data

Combination   MSN   SSN   DOB   Match Decision
 1            x     x     x     NO
 2            x     #     x     NO
 3            x     x     #     NO
 4            x     #     #     NO
 5            x     =     =     yes
 6            x     #     =     no
 7            x     =     #     yes
 8            x     =     x     yes
 9            x     x     =     NO
10            #     x     x     no
11            #     #     x     no
12            #     x     #     no
13            #     #     #     no
14            #     =     =     yes
15            #     #     =     no
16            #     =     #     yes
17            #     =     x     yes
18            #     x     =     no
19            =     x     x
20            =     #     x     yes
21            =     x     #
22            =     #     #     YES
23            =     =     =     YES
24            =     #     =     yes
25            =     =     #     yes
26            =     =     x     yes
27            =     x     =     yes
28            ~     x     x
29            ~     #     x     yes
30            ~     x     #
31            ~     #     #
32            ~     =     =     YES
33            ~     #     =     YES
34            ~     =     #     yes
35            ~     =     x     yes
36            ~     x     =     YES

NOTE: DOB = date of birth; MSN = military service number; SSN = Social Security number; "=" indicates an exact match; "#" indicates one or both are missing; "x" indicates that they are different; and "~" indicates that they are very similar (with understandable discrepancy). Uppercase letters indicate decisions that occurred in the sample; lowercase letters indicate decisions that we likely would have chosen had these combinations occurred.
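In code, the Table D-3 rules reduce to a lookup keyed on the comparison state of each identifier. Below is a minimal sketch, encoding only a few of the 36 combinations for illustration; the full table, including the rows left undecided, would be entered the same way.

```python
# Sketch of the Table D-3 decision lookup. State codes follow the table:
# "=" exact match, "~" very similar, "#" one or both missing, "x" different.
# Only a handful of the 36 combinations are encoded here for illustration.

DECISIONS = {
    # (MSN, SSN, DOB): decision
    ("x", "x", "x"): "no",   # combination 1
    ("x", "=", "="): "yes",  # combination 5
    ("#", "=", "="): "yes",  # combination 14
    ("=", "=", "="): "yes",  # combination 23
    ("~", "=", "="): "yes",  # combination 32
    ("~", "x", "="): "yes",  # combination 36
}

def judge(msn_state, ssn_state, dob_state):
    """Return the tabled decision, or flag the pair for hand review."""
    return DECISIONS.get((msn_state, ssn_state, dob_state), "review by hand")

print(judge("=", "=", "="))  # yes
print(judge("=", "x", "x"))  # review by hand (combination 19 has no tabled decision)
```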

Using the methods described above, we reviewed three discrepancy lists:

• Method C: Listed all records where the first four letters of the last name and the first three letters of the first name matched, regardless of the remaining letters.
• Method D: Listed all records, using the first three letters of the last name and the first three letters of the first name, where first and last names were reversed.
• Method E: Listed all records where complete last names and complete first names were reversed.

All three lists, generated after method A- and B-determined matches were culled, disregarded whether DOB, SSN, and MSN matched.

The first 10 pages of list C consist of possible matches involving 177 participants on the 1999 roster and 103 participants on the 1985 roster. (Numbers of records do not match because, for example, one 1985 list "Bil* Smit*" could have matched four 1999 list "Bil* Smit*"s.) Using this group as a sample, we identified 25 matches. Table D-4 shows the match results from each method's list.

TABLE D-4. Matching of Participant Names on the 1985 and 1999 Study Rosters by Types of Matching Methods Used

                  1985 List   1999 List   Matches*   % of 1985 List   % of 1999 List
A + B (all)       49,148      68,168      38,729     78.8             56.8
List C (sample)   103         177         25         24.3             14.1
List D (sample)   44          136         1          2.3              0.7
List E (sample)   23          24          20         87.0             83.3

*Based on staff judgment.

Sample for DTRA Verification

We drew a sample of 50 participants from each of the five series for each of the following categories:

• participants who were found in both the 1985 participant list and the current (1999) participant list were called matched;
• participants who are currently in the study but could not be matched to a 1985 participant were called new only;
• participants who were in the study in 1985 but could not be matched to a participant in the 1999 file were called old only.

MFUA requested documentation from the Defense Threat Reduction Agency (DTRA) to verify the status of each of the selected individuals.

Participants Found Only in the Current Dataset (New Only)

Among the sample of 250 new-only participants whose names were on the 1999 list but not found in the 1985 data, 239 were confirmed as appropriately included new participants. For nine individuals, documentation found during the validation process indicated that the individuals should have been deleted from the 1999 dataset. These were deleted subsequent to the submission of the list to DTRA, but before the verification research had been completed. For one participant, classified as an error, the verification research provided a dosimetry record for an individual that indicated participation; however, the serial number belonged to another participant. No personnel records were found to confirm participation of either the named individual or the participant whose serial number was assigned to the name listed.

In summary, the review of the sample of 250 participants added to the 1999 roster (new only) found 248 to be in the correct status in the current dataset (99 percent), one erroneously still included, and one of indeterminate status (considered an error).

Participants Found in Both the 1985 and 1999 Datasets (Matched)

Of the 250 matched participants for whom we requested DTRA documentation, 247 were verified as participants. One was a verified deletion who had not yet been posted when the validation sample was sent. Two were errors:

1. an individual who was found to have left the test site 3 weeks before the shot he was thought to have attended, and
2. another who had previously been identified as a crew member of a participating ship prior to the test series, but a detailed review of the ship's records during the test found no evidence he was actually there.
Participants Found Only in the 1985 Dataset (Old Only)

Of the sample of 250 participants found on the 1985 list but not matched to a name on the 1999 participant list, 125 (50 percent) were discovered actually to be represented on the 1999 list; the match had been obscured by inaccurate or missing identification information on one or both lists. They were recognized as matches when identification information (spelling of name or service number) was corrected during clean-up of the dataset. Another comparably sized group of 119 (48 percent) were confirmed deletions; records demonstrated that the individuals did not meet the definition of a participant. There were six errors:

• Two of the 250 were not included in the 1999 dataset but should have been.
• One had no documentation.
• Two were aboard contaminated ships after the operation but during the official post-operational period and should not have been dropped from the participant list.
• One was thought to be a civilian and dropped from the list, although later research found him to be in the military and therefore meeting participant cohort criteria.

In summary, 244 (98 percent) of the old-only group had been appropriately handled in developing the 1999 dataset.

Overlap of the 1985 and 1999 Participant Rosters

Eighty-four percent of the individuals included in the 1985 analysis (38,729 out of 46,186) are also included in the 1999 list. However, these people comprise only 57 percent of the 1999 list. If the 3,736 personnel whose qualifying service was only during the post-operational period (see Chapter 5) were excluded from this calculation, because they reflect a change in the inclusion criteria since the construction of the 1985 list rather than identification errors, there is still a 60 percent carryover. Table D-5 displays the extent of overlap between the 1985 and 1999 datasets.

TABLE D-5. Comparison of Current (1999) Five Series Participant Dataset and 1985 Datasetᵃ

                      1985      1999      Comment
Match                 38,729    38,729    Participants in both studies
Old only              8,877     NA        Not now considered participants
New onlyᵇ             NA        27,897    Newly found participants
Problem IDs           1,542     1,542     Insufficient data to positively identify
Old only + matches    47,606    NA        Size of the 1985 study (except problems)
New only + matches    NA        66,626    Size of the current study (except problems)
Total                 49,148    68,168    Total size including problem records

NOTE: NA = not applicable.
ᵃThis validation study was done by Medical Follow-up Agency staff with a preliminary participant list; the numbers do not match the participant counts reported in the report analyses.
ᵇIncludes 3,736 post-onlys (change in criteria accounts for mismatch, not error in the 1985 data).
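The overlap percentages quoted above can be reproduced from the Table D-5 counts. The denominator choices below are my inference from the text; in particular, the 57 percent figure appears to use the full 68,168-record file.

```python
# Reproducing the overlap percentages from the Table D-5 counts.
# Which totals serve as denominators is inferred, not stated in the text.

matched = 38_729      # participants in both studies
size_1985 = 46_186    # 1985 analysis cohort
total_1999 = 68_168   # 1999 file, including problem records
post_only = 3_736     # qualifying service only in the post-operational period

print(f"{matched / size_1985:.0%}")                 # 84% of the 1985 cohort carried over
print(f"{matched / total_1999:.0%}")                # 57% of the 1999 list
print(f"{matched / (total_1999 - post_only):.0%}")  # 60% after excluding post-onlys
```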

COMPARISON OF 1999 PARTICIPANT ROSTER WITH OTHER SOURCES

National Association of Atomic Veterans (NAAV) Mortality Study List

Estimating the number of persons erroneously left out of the 1999 participant list was more difficult than verifying the participation of those whose names were already known to be on the list. To estimate the rate of incorrect exclusions, that is, the proportion of actual five series participants who have been incorrectly excluded from the 1999 list, we needed to find an independent list of putative participants. We used three sources to find these additional participants.

NAAV provided us with a list of veterans (n = 1,859) who reported service in at least one of the five series. Using this list as a benchmark, we estimated a false negative rate by matching the NAAV participants against those in our current dataset, according to the criteria presented above. NAAV participants were classified as either "matches" or "insufficient data."

The NAAV database was compiled by Mr. Boley Caldwell, director, NAAV Medical History Survey, from a number of medical surveys that NAAV conducted of its members. The latest questionnaire was circulated in 1992 and has been documented elsewhere (Johnson, 1996). For this validation study, we accepted the NAAV database as it was presented to us, editing only as necessary to ensure consistency of format in fields such as date of birth and to eliminate obvious duplicate records and records of confirmed civilians. We have not attempted to contact individual veterans to verify or obtain additional identifying information.

The NAAV benchmark represents a highly selected population because it is based on health surveys that were intended to determine potentially radiogenic mortality and morbidity among atomic veterans.
It is conceivable that veterans in the database may have been more likely to have contacted the NTPR program or the VA and, consequently, are more likely to be on our list of participants. To avoid this possible bias, we also sought participants through sources that were not connected with NAAV.

Of the 1,784 individual veterans in the NAAV Medical Survey who indicated participation in at least one of the five series, we were able to match all but 195 (10.9 percent) to our current participant list. We provided the identifying information on these 195 individuals to DTRA, requesting verification of participant status. Searching service records, morning reports, unit diaries, and dosimetry records, DTRA traced the participation status of all but 31. Table D-6 shows the results of the MFUA and DTRA matching processes.

Participants Solicited Through Veterans' Journals (Write-Ins)

In order to obtain a group of veterans for comparison who were not associated with NAAV, we placed announcements of the MFUA studies of nuclear

test participants in several veterans' publications.² The periodicals that published our announcement (in some form) included the following: Journal of the Veterans of Foreign Wars, Journal of the American Legion, Journal of the Retired Enlisted Association, Journal of the Retired Officers Association, and NAAV Newsletter.

With the exception of the NAAV Newsletter, we were limited to a few lines of text inviting a response from five series veterans. The publications edited the announcement to suit their needs for format and availability of space. The NAAV accommodated us with a half-page form for its readers to fill out and send in. This enabled us to distinguish between respondents who were newsletter recipients, and most likely members of NAAV, and those who were not. We asked veterans to provide us with personal identification information and details of their nuclear test participation. We refer to this as the write-in verification sample. Because the readership of these journals is broader than the NAAV survey, which was targeted to veterans who were already concerned about their health, this write-in sample probably constitutes a less selected (and potentially less biased toward illness) comparison group.

Because more data were available for individuals in the write-in group, we were able to classify them in more detail when we matched them to the 1999 participant file:

• "Matches" corresponded to individuals in the NTPR participant file as defined above.
• "Not-five series" included individuals who mentioned the Five Series Study in their correspondence but provided documentation of participation (1) that definitely placed them at a different time and place, most often at another atomic test, or (2) as civilian personnel.
• "Insufficient information" describes those individuals who did not provide enough information to classify them into one of the above categories.
Typically, these responders provided only last name and initials or a nickname, with no other identifying information.

²We also asked for responses from veterans who participated in the CROSSROADS series.

[TABLE D-6: results of the MFUA and DTRA matching processes for the NAAV Medical History Survey list; the table text is not recoverable from the machine-read copy.]

The amount of information provided by those who responded to our inquiry varied widely. Some veterans provided detailed documentation of their participation, including both official government documents and their own narrative description of events they witnessed. Others provided only their name and a statement that they were present at one of the five series. In all, we received 531 responses that mentioned tests of the five series in one way or another. When we matched the respondents to our participant list, we obtained the results shown in Table D-7. We submitted to NTPR the 45 records with insufficient documentation for us to identify a match on the 1999 dataset. NTPR was able to confirm as participants or nonparticipants 40 of these individuals.

Participants from Public Meetings

In June 1993, when this study was at an early period of development, we held an open meeting. Members of the public, including atomic veteran representatives, and government officials were invited to attend. Many of the atomic veterans who were unable to attend in person provided written statements describing their involvement with the aboveground nuclear test program. We compiled a list of the subset of veterans who noted participation in at least one of the five series (n = 97) and compared them to participants on the 1999 list. We refer to this as the public meeting verification group. DTRA was able to identify as participants or nonparticipants all five individuals whom we could not (see Table D-8).

DISCUSSION

Comparison of the 1985 and 1999 NTPR-based participant rosters confirms the 1991 reports of substantial misclassification of participant status in the older roster.
Carefully researching 250-member samples of individual records for each of the three possible comparison results (old only, new only, and matched), we identified four people on the 1999 list who do not meet participant cohort criteria (two of whom were also on the 1985 list) and five people listed in 1985 who were erroneously not included on the 1999 list (see Table D-9). Applying these sample rates (5 out of 250 for old only, 2 out of 250 for matched, and 2 out of 250 for new only) to the entire old only, matched, and new only record groups, we estimated errors of inclusion and omission in the 1999 dataset.

[TABLE D-7: results of matching the write-in verification sample to the 1999 participant roster; the table text is not recoverable from the machine-read copy.]

[TABLE D-8: results of matching the public meeting verification group to the 1999 participant roster; the table text is not recoverable from the machine-read copy.]

TABLE D-9. Estimated Errors of Inclusion and Omission in the 1999 Dataset

Group      No. in Group   Sample Error Rate                                Estimated Errors
New only   27,897         2/250 (0.8%) in 1999 dataset who should not be   223.2
Matched    38,729         2/250 (0.8%) in 1999 dataset who should not be   309.8
Old only   8,877          5/250 (2.0%) should be in 1999 but are not       177.5

We then added information from three participant-identifying sources external to NTPR: the NAAV mortality study, veteran correspondence solicited by MFUA in veterans' publications, and veteran correspondence invited by MFUA in conjunction with its public meeting at the beginning of the Five Series Study. Ten individuals were confirmed by NTPR as five series participants who had not been included in its 1999 participant roster. For another 36 individuals who reported being five series participants, NTPR could neither confirm nor dismiss participant status because military records could not be found and other data sources, such as unit logs and dosimetry records, did not list these individuals. If we assume one extreme, that all 36 actually were five series participants, then there are 46 missed participants identified from non-NTPR sources (see Table D-10).

Comparing the validation information from both approaches provides evidence that the roster on which the analyses reported here are based has very few errors of omission or inclusion. All 533 estimated wrong inclusions constitute less than 1 percent (0.8%) of the 1999 participant cohort. All 46 veterans whom NTPR could not confirm as nonparticipants, plus the 178 individuals from the 1985 comparisons assessed to be wrong omissions, would add less than 1 percent (0.3%) to the 1999 participant cohort.

CONCLUSION

The participant roster on which the 1999 Five Series Study is based includes more than 99 percent of the military personnel who participated in any of the five series.
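As a cross-check, the Table D-9 estimates and the percentages cited in the discussion can be reproduced from the group sizes and sample error rates. The use of 66,626 (the 1999 study size excluding problem records, from Table D-5) as the denominator is my inference from the quoted percentages.

```python
# Reproducing the Table D-9 estimates and the discussion percentages.
# Group sizes and sample error counts are taken from the text; the choice
# of denominator (66,626, the 1999 study size excluding problem records)
# is an inference.

groups = {                      # group: (no. in group, errors per 250 sampled)
    "new_only": (27_897, 2),
    "matched":  (38_729, 2),
    "old_only": (8_877, 5),
}
estimated = {g: n * k / 250 for g, (n, k) in groups.items()}
print(estimated)  # new_only ~223.2, matched ~309.8, old_only ~177.5

wrong_inclusions = estimated["new_only"] + estimated["matched"]  # ~533
wrong_omissions = round(estimated["old_only"]) + 46              # 178, plus 46 from non-NTPR sources

cohort_1999 = 66_626
print(f"{wrong_inclusions / cohort_1999:.1%}")  # 0.8%
print(f"{wrong_omissions / cohort_1999:.1%}")   # 0.3%
```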

[TABLE D-10: five series participants identified from sources external to NTPR (NAAV mortality study, write-in sample, and public meeting group); the table text is not recoverable from the machine-read copy.]