Health Risks from Exposure to Low Levels of Ionizing Radiation: BEIR VII Phase 2
protracted exposures to ionizing radiation. They are generally presented as annual summaries of doses from different types of radiation (penetrating photons, beta particles, and, where appropriate and measured, tritium and neutrons).
These data were, however, compiled to monitor worker exposure for compliance with radiation protection guidelines, which have changed over time, and not specifically for epidemiologic purposes. Overall, the accuracy and precision of recorded individual doses and their comparability will therefore depend on:
the dosimetry technology, which includes the physical capabilities of the dosimetry system, such as the response to different types and energies of radiation, in particular in mixed radiation fields;
the radiation fields in the work environment, which may include mixed types of radiation, variations in exposure geometries, and environmental conditions; and
the administrative practices adopted by facilities to calculate and record personnel dose based on technical, administrative, and statutory compliance considerations.
Consequently, detailed examination of dosimetry practices, including sources and magnitude of errors, is important in considering whether sufficiently accurate and precise estimates of dose can be obtained for use in an epidemiologic study.
Information on internal contamination with radionuclides other than tritium is generally sparse, particularly in early years, and consists of information on the fact of monitoring or on a percentage of the annual limit of intake. Very few studies have attempted to reconstruct individual doses from nuclides other than tritium. One exception is the study of Sellafield workers in the United Kingdom, where efforts have been made to reconstruct plutonium exposures (Omar and others 1999).
In high-dose studies, the majority of excess deaths from cancer have been demonstrated in subjects exposed to doses of at least 1 Sv. There were approximately 3000 such subjects among atomic bomb survivors. Doses received by employees of nuclear industry facilities are considerably lower. In the Sellafield cohort (Douglas and others 1994), in which the highest doses among the nuclear industry worker studies have been reported, only about 60 out of more than 10,000 individuals monitored for external radiation exposure had received doses of 1 Sv or more, and these doses were accumulated over the course of a working life. The mean cumulative radiation dose in the three-country combined analyses was 40.2 mSv per worker and the collective dose was 3843 Sv (IARC 1995). Women comprised fewer than 15% of the workers, and their mean cumulative dose was low (6.2 mSv) compared to that of men (46.0 mSv). Overall, the distribution of doses was very skewed; almost 60% of subjects had cumulative doses less than 10 mSv, 80% were less than 50 mSv, and less than 2% had doses greater than 400 mSv.
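The summary figures quoted above are internally consistent: dividing the collective dose by the mean dose per worker back-calculates the approximate number of monitored workers. The check below is our own arithmetic on the quoted numbers, not a figure from the IARC report itself.

```python
# Figures quoted from the three-country combined analyses (IARC 1995).
collective_dose_sv = 3843.0   # person-Sv
mean_dose_msv = 40.2          # mSv per worker

# Back-calculate the implied number of monitored workers
# (collective dose = mean dose x number of workers).
implied_workers = collective_dose_sv / (mean_dose_msv / 1000.0)
print(f"implied cohort size = {implied_workers:,.0f}")  # roughly 95,600 workers
```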
The majority of cohort studies collected only information that could readily be obtained from employment and dosimetry records. In addition to individual annual radiation doses from different types of radiation, this typically includes date of birth, date and cause of death, sex, socioeconomic status based on occupational group or education, and dates of beginning and end of employment. Nested case-control studies have allowed the exploration of additional factors, including tobacco smoking and other occupational exposures.
In most of the nuclear industry worker studies, death rates among worker populations were compared with national or regional rates. In most cases, rates for all causes and all cancer mortality in the workers were substantially lower than in the reference populations. Possible explanations include the healthy worker effect and unknown differences between nuclear industry workers and the general population.
In most studies where external radiation dose estimates were available, death rates were also compared in relation to levels of radiation exposure within the study population. For all cancer mortality (excluding leukemia), the estimates of radiation-induced excess risk varied from negative to several times greater than those derived from linear extrapolation from high-dose studies (Table 8-3). Moreover, because of the large degree of uncertainty, many of these estimates were consistent with an even wider range of possibilities, from negative risks to excess risks at least an order of magnitude greater than those on which the current radiation protection recommendations have been based.
In most of the large studies of nuclear industry workers, estimates of ERR¹ per gray (ERR/Gy) have been derived, mostly using Poisson regression. Estimates of excess death rate per 10⁶ person-years (PY) per gray have also been presented in some studies. Results of such analyses are shown in Tables 8-3 and 8-4 for all cancers excluding leukemia and for leukemia, respectively. Table 8-5 lists the results from other studies of nuclear workers that could not be used in the computation of ERRs or EARs.²
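As a sketch of how an ERR/Gy estimate of this kind is obtained, the following fits a linear excess-relative-risk model to grouped death counts by Poisson maximum likelihood. The dose groups, person-years, and parameter values are entirely hypothetical, and the death counts are set to their model-expected values purely for illustration; real worker analyses stratify on age, sex, calendar period, and facility.

```python
import math

# Hypothetical grouped data: (mean dose in Gy, person-years).
groups = [
    (0.00, 200000),
    (0.05, 100000),
    (0.20, 50000),
    (0.50, 20000),
]
TRUE_BASELINE, TRUE_ERR = 0.001, 1.5  # baseline rate per PY; ERR per Gy

# Deaths set to their expected counts under the linear ERR model
# rate(d) = baseline * (1 + ERR * d), so the fit recovers the truth.
data = [(d, py, TRUE_BASELINE * py * (1 + TRUE_ERR * d)) for d, py in groups]

def profile_loglik(err):
    """Poisson log-likelihood with the baseline rate profiled out."""
    expected_py = sum(py * (1 + err * d) for d, py, _ in data)
    deaths = sum(obs for _, _, obs in data)
    lam0 = deaths / expected_py  # MLE of baseline rate given this ERR
    ll = 0.0
    for d, py, obs in data:
        mu = lam0 * py * (1 + err * d)
        ll += obs * math.log(mu) - mu
    return ll

# One-dimensional grid search for the ERR maximizing the likelihood.
candidates = [i / 100 for i in range(0, 501)]  # ERR from 0.0 to 5.0 per Gy
err_hat = max(candidates, key=profile_loglik)
print(f"fitted ERR/Gy = {err_hat:.2f}")  # recovers the true value of 1.5
```

Profiling out the baseline rate reduces the fit to a one-parameter search, which keeps the sketch dependency-free; production analyses would instead use a Poisson GLM routine.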
Cancer mortality was observed to increase significantly with increasing level of exposure in four studies: AWE (Beral and others 1988), ORNL (Wing and others 1991; Richardson and Wing 1998), Canadian NDR (Ashmore and others 1998), and Rocketdyne (Ritz and others 1999a). The ERR estimate based on the three-country combined analysis was close to zero, but was compatible with a range of possi-
¹ERR (excess relative risk) is the rate of disease in an exposed population divided by the rate of disease in an unexposed population, minus 1.0.
²EAR (excess absolute risk) is the rate of disease in an exposed population minus the rate of disease in an unexposed population.
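The two footnoted quantities can be written out directly; the rates below are hypothetical values chosen only to show the arithmetic.

```python
# Hypothetical disease rates (deaths per 100,000 person-years);
# illustrative numbers, not taken from any cited study.
rate_exposed = 12.0
rate_unexposed = 10.0

err = rate_exposed / rate_unexposed - 1.0  # excess relative risk
ear = rate_exposed - rate_unexposed        # excess absolute rate

print(f"ERR = {err:.2f}")                  # prints "ERR = 0.20"
print(f"EAR = {ear:.1f} per 100,000 PY")   # prints "EAR = 2.0 per 100,000 PY"
```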