Hormesis has been defined as “the stimulating effect of small doses of substances which in larger doses are inhibitory.” As stated by Wolff (1989) the meaning has been modified in recent times to refer not only to a stimulatory effect but also to a beneficial effect. In other words, hormesis now connotes a value judgment whereby a low dose of a noxious substance is considered beneficial to an organism.
The committee has reviewed evidence for “hormetic effects” after radiation exposure, with emphasis on material published since the previous BEIR study on the health effects of exposure to low levels of ionizing radiation. Historical material relating to this subject has been reviewed by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR 1994), and a special edition of Health Physics on hormesis is available (Sagan 1987). A recent publication reviews data for and against the concept of hormesis (Upton 2000), while noting that further research needs to be done at low-dose and low-dose-rate exposures to resolve the issue. Another recent review argues against the validity of a linear no-threshold model in the low-dose region (Cohen 2002). The committee also reviewed a compilation of materials submitted by Radiation, Science, and Health Inc., entitled Low Level Radiation Health Effects: Compiling the Data, and materials provided by Dr. Edward J. Calabrese, including the Belle Newsletter Vol. 8, no. 2, December 1999, and the article “Hormesis: a highly generalizable and reproducible phenomenon with important implications for risk assessment” (Calabrese and coworkers 1999).
Much of the historical material on radiation hormesis relates to plants, fungi, algae, protozoans, insects, and nonmammalian vertebrates (Calabrese and Baldwin 2000). For the purposes of this report on human health effects, the committee focused on recent information from mammalian cell and animal biology and from human epidemiology. In this context, some investigators have suggested that radiation exposure may enhance immune response (Luckey 1996; Liu 1997) or DNA repair processes (see “Adaptive Response” below). It has been postulated that such stimulation might result in a net health benefit after exposure, and these observations are sometimes offered as mechanisms for hormesis.
Pollycove and Feinendegen have made a theoretical argument that the hazards of radiation exposure are negligible in comparison to DNA damage that results from oxidative processes during normal metabolism. They argue that endogenous processes, autoxidation, depurination, and/or deamination can lead to cellular DNA damage resembling that produced by ionizing radiation. Oxidative damage is much more complex than they appreciate and involves predominantly proteins and mitochondrial targets associated with transcription, protein trafficking, and vacuolar functions (Thorpe and others 2004). The identity of the particular radical species generated endogenously in undamaged cells is unknown, and therefore yields of endogenous single-strand breaks (SSBs) and double-strand breaks (DSBs) cannot be estimated reliably a priori. Direct measurements of SSBs in unirradiated cells indicate levels several orders of magnitude lower than those estimated by Pollycove and Feinendegen. The authors’ hypothesis that endogenous processes within cells give rise to significant levels of DSBs from SSBs in close proximity is speculative and not supported by current experimental information. Exposure of cells to high levels of hydrogen peroxide, for example, produces high frequencies of SSBs but no DSBs, suggesting that overlap of SSBs does not occur to a significant extent experimentally (Ward and others 1985). They also hypothesize that low-dose radiation induces a specific repair mechanism that then acts to reduce both spontaneous and radiation-induced damage to below spontaneous levels, thus causing a hormetic effect. The evidence for such a repair mechanism is weak and indirect and is contradicted by direct measures of DSB repair foci at low doses (Rothkamm and Lobrich 2003).
Evidence from Cell Biology
Possible stimulatory effects have been reported for radiation exposure, such as mobilization of intracellular calcium (Liu 1994), gene activation (Boothman and others 1993), activation of signal transduction pathways (Liu 1994; Ishii and others 1997), increase in antioxidants such as reduced glutathione (GSH; Kojima and others 1997), increase in lipoperoxide levels (Petcu and others 1997), and increase in circulating lymphocytes (Luckey 1991). The general thesis presented is that stress responses activated by low doses of radiation, particularly those that would increase immunological responses, are more beneficial than any deleterious effects that might result from the low doses of ionizing radiation. Although evidence for stimulatory effects from low doses has been presented, little if any evidence is offered concerning the ultimate deleterious effects that may occur. As discussed in the section of this report on observed dose-response relationships at low doses, bystander effects and low-dose hyper-radiosensitivity have been observed in mammalian cells at doses in the 10–100 mGy range. End points for these deleterious effects include mutations, chromosomal aberrations, oncogenic transformation, genomic instability, and cell lethality. These deleterious effects have been observed for cells irradiated in vivo as well as in vitro.
The radiation-adaptive response in mammalian cells was demonstrated initially in human lymphocyte experiments (Olivieri and others 1984) and has been associated in recent years with the older concept of radiation hormesis. A more extensive treatment of adaptive effects is discussed in another section of this report. Radiation adaptation, as it was initially observed in human lymphocytes, is a transient phenomenon that occurs in some (but not all) individuals when a conditioning radiation dose lowers the biological effect of a subsequent (usually higher) radiation exposure. In lymphocyte experiments, this reduction occurs under defined temporal conditions and at specific radiation dose levels and dose rates (Shadley and others 1987; Shadley and Wiencke 1989). However, priming doses less than 5 mGy or greater than ~200 mGy generally result in very little if any adaptation, and adaptation has not been reported for challenge doses of less than about 1000 mGy. Furthermore, the induction and magnitude of the adaptive response in human lymphocytes is highly variable (Bose and Olivieri 1989; Hain and others 1992; Vijayalaxmi and others 1995), with a great deal of heterogeneity demonstrated between different individuals (Upton 2000). Also, the adaptive response could not be induced when the lymphocytes were given the priming dose during G0. Although inhibitor and electrophoretic studies suggest that alterations in transcribing messenger RNA and synthesis of proteins are involved in the adaptive response in lymphocytes, no specific signal transduction or repair pathways have been identified. A recent study (Barquinero and others 1995), which reported that chronic average occupational exposure of about 2.5 mSv per year over 7 to 21 years induced an adaptive response for radiation-induced chromosomal aberrations in human lymphocytes, also reported that the spontaneous level of aberrations was elevated significantly, presumably by the occupational exposure. 
(See Barquinero and others 1995 for references to six other reports that basal levels of chromosome abnormalities are in general higher in exposed human populations.) These results suggest that occupational exposure may have induced chromosomal damage in the worker population while protecting lymphocytes from a subsequent experimental radiation exposure administered years after initiation of the chronic exposure. It is unclear whether such competing events would result in a net gain, net loss, or no change in health status.
In general, to observe hormetic effects the spontaneous levels of these effects have to be rather high. The committee notes in the Biology section that a very low radiation dose was reported to cause a reduction in transformation in vitro below a relatively high spontaneous transformation frequency. However, problems and possible artifacts of the assay system employed are also discussed. When radioresistance is observed after doses that cause some cell lethality—for example, after chronic doses that continually eliminate cells from the population—the radioresistance that emerges may be caused either (1) by some inductive phenomenon or (2) by selecting for cells that are intrinsically radioresistant. Either process 1 or process 2 could occur as the radiosensitive cells are selectively killed and thus eliminated from the population as the chronic irradiation is delivered. In the end, an adaptive or hormetic response in the population may appear to have occurred, but this would be at the expense of eliminating the sensitive or weak components in the population.
In chronic low-dose experiments with dogs (75 mGy/d for the duration of life), vital hematopoietic progenitors showed increased radioresistance along with renewed proliferative capacity (Seed and Kaspar 1992). Under the same conditions, a subset of animals showed an increased repair capacity as judged by the unscheduled DNA synthesis assay (Seed and Meyers 1993). Although one might interpret these observations as an adaptive effect at the cellular level, the exposed animal population experienced a high incidence of myeloid leukemia and related myeloproliferative disorders. The authors concluded that “the acquisition of radioresistance and associated repair functions under the strong selective and mutagenic pressure of chronic radiation is tied temporally and causally to leukemogenic transformation by the radiation exposure” (Seed and Kaspar 1992).
Evidence from Animal Experiments
Life Span Data
In contrast to experiments showing that radiation shortens the life span, some early publications reported apparent radiation-induced life lengthening following exposure to low single or protracted doses of radiation (Lorenz 1950; Lorenz and others 1954). Statistical analyses of the distribution of deaths in these studies indicate that control animals usually show a greater variance around the mean survival time than groups exposed to low doses of radiation. In addition, the longer-living irradiated animals generally have a reduced rate of intercurrent mortality from nonspecific and infectious diseases during early adult life, followed by a greater mortality rate later in life. Since these investigations were conducted under conditions in which infectious diseases made a significant contribution to overall mortality, the interpretation of these studies with respect to radiation-induced cancer or other chronic diseases in human populations must be viewed with caution.
Variability in controls was a major difficulty in the early studies before animal maintenance and health care issues were addressed by transitioning to specific pathogen-free (SPF) facilities; this change substantially reduced interexperimental variability. For example, the cited data of Lorenz (1950) show a small difference in life span in mice exposed to 0.11 r/d compared to controls; the irradiated group lived somewhat longer than the unirradiated group, but the difference was not significant. A French study (Caratero and others 1998) shows life lengthening in irradiated mice compared to controls; unfortunately, the control life spans were significantly shorter, by 100–150 d, than any other published data for this mouse strain (Sacher 1955; Congdon 1987).
Tumor Incidence Data
Two studies have reported a significant reduction in tumor incidence of lymphoma in animals that have a high spontaneous tumor incidence (>40%; Covelli and others 1989; Ishii and others 1996). A paper by Ishii and colleagues (1996) describes a reduction in lymphoma incidence after chronic, fractionated, low-dose total-body irradiation of AKR mice with a spontaneous lymphoma incidence of 80.5%. The spontaneous lymphoma incidence was decreased significantly (to 48.6%) by 150 mGy X-irradiation delivered twice a week for 40 weeks. A protocol of 50 mGy three times a week gave a smaller (not statistically significant) decrease to 67.5% lymphoma incidence. The mean survival time was significantly prolonged from 283 d for the control animals to 309 d with the three-exposure-per-week protocol and to 316 d with the twice-a-week protocol.
In a study by Covelli and colleagues (1989), a decrease in incidence of malignant lymphoma at low doses of radiation (46 and 52% age-adjusted incidence at X-ray exposures of 500 and 1000 mGy versus 57% incidence in control animals) shows a reduction in tumor incidence relative to the control frequency. After peaking at 60% lymphoma incidence (3000 mGy), the frequencies decline, “possibly due to cell inactivation becoming predominant at higher doses over the initial transforming events.”
The reduction in spontaneous tumors noted in the previous two studies may in some way be related to the high spontaneous lymphoma incidence in these mouse strains. In the Ishii study, the authors speculate that possible mechanisms may include augmentation of the immune system or initiation of an “adaptive response.” One might also consider that the substantial total doses delivered to the animals in this study (6000 and 12,000 mGy) are effectively acting as radiotherapy in the reduction of spontaneous tumor incidence. Human populations, which have a wider spectrum of “spontaneous” tumors occurring at a lower incidence, may not be expected to respond to radiation in the same way as mouse strains with high lymphoma incidence.
HORMESIS AND EPIDEMIOLOGY
The term hormesis is not commonly used in the epidemiologic literature. Rather, epidemiologists discuss associations between exposure and disease. A positive association is one in which the rate of disease is higher among a group exposed to some substance or condition than among those not exposed, and a negative (or inverse) association is one in which the rate of disease is lower among the exposed group. If an association is judged to be causal, a positive association may be termed a causal effect and a negative association could be termed a protective effect.
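These definitions can be made concrete with a small illustrative calculation; the counts and person-years below are invented for illustration only, not drawn from any study cited in this report:

```python
# Toy illustration of positive versus negative (inverse) associations.
# All numbers are hypothetical.

def rate_ratio(exposed_cases, exposed_py, unexposed_cases, unexposed_py):
    """Disease rate in the exposed group divided by the rate in the
    unexposed group (cases per person-year in each group)."""
    return (exposed_cases / exposed_py) / (unexposed_cases / unexposed_py)

# Rate ratio > 1.0 indicates a positive association;
# rate ratio < 1.0 indicates a negative (inverse) association.
positive = rate_ratio(30, 10_000, 20, 10_000)  # about 1.5
negative = rate_ratio(10, 10_000, 20, 10_000)  # about 0.5
```

Whether such an association is then termed a causal or protective effect depends on the separate judgment of causality discussed above, not on the arithmetic itself.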
One type of epidemiologic study that has been used to evaluate the association between exposure to radiation and disease is the “ecologic” study in which data on populations, rather than data on individuals, are compared. These data have been used to argue for the existence of radiation hormesis.
Another example of an ecologic study is the evaluation of geographic areas with high background levels of radiation compared to areas with “normal” background levels. The fact that cancer rates in these high-background-radiation geographic regions are not elevated is sometimes cited as evidence against a linear no-threshold model (Jaworowski 1995).
It is also true that certain populations residing in high-background areas, such as occur at high altitudes, have lower levels of health problems than those residing at lower altitudes. This observation has been interpreted by some as evidence for a hormetic effect of radiation. BEIR V discussed the effect of confounders and the ecological fallacy in the evaluation of high-background-radiation areas and concluded that “these two problems alone are enough to make such studies essentially meaningless” (NRC 1990).
Another important consideration is the expected magnitude of the increase in health effects induced by excess background radiation. If one assumes a linear no-threshold response, a calculation can be made of the cancers expected to be induced by excess radiation in a high-background-radiation area. As an example, consider the elevated levels of gamma radiation in Guangdong Province, People’s Republic of China (PRC). In this study, a population receiving 3–4 mGy per year was compared to an adjacent control population receiving 1 mGy per year. No difference in cancers was noted between the high-background area and the control area (NRC 1990). One can estimate the expected excess percentage of cancers resulting from the 2–3 mGy difference in exposure per year using a linear no-threshold model and the lifetime risk estimates developed in this report. A calculation by this committee indicated that the expected percentage of cancers induced by the excess background radiation would be 1–2% above the cancers occurring from all other causes in a lifetime. Even if all confounding factors were accounted for, it is questionable whether one could detect an excess cancer rate of 1–2%. Excess cancers may indeed be induced by elevated radiation exposure in high-background areas, but the excess may not be detectable given the high lifetime occurrence of cancer from all causes.
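The order of magnitude of such an estimate can be sketched with round numbers. The risk coefficient and baseline lifetime cancer risk used below are assumed illustrative values, not the fitted lifetime risk estimates developed in this report:

```python
# Illustrative back-of-the-envelope sketch of the committee-style estimate.
# The coefficients are assumed round numbers for illustration only.

def excess_relative_to_baseline(annual_excess_gy, years, risk_per_sv,
                                baseline_lifetime_risk):
    """Excess lifetime cancer risk from chronic low-LET exposure,
    expressed as a fraction of the baseline lifetime cancer risk,
    under a linear no-threshold assumption."""
    lifetime_dose_sv = annual_excess_gy * years  # for gamma rays, 1 Gy ~ 1 Sv
    absolute_excess = lifetime_dose_sv * risk_per_sv
    return absolute_excess / baseline_lifetime_risk

# Assumed inputs: 2-3 mGy/year excess dose, ~70-year lifetime,
# ~5% lifetime cancer risk per Sv, ~40% baseline lifetime cancer risk.
low = excess_relative_to_baseline(0.002, 70, 0.05, 0.40)
high = excess_relative_to_baseline(0.003, 70, 0.05, 0.40)
print(f"excess of roughly {100 * low:.1f}% to {100 * high:.1f}% above baseline")
```

With these assumed inputs the excess falls in the low single-digit percent range, consistent with the qualitative point in the text that such an increment would be very difficult to detect against the high baseline occurrence of cancer.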
Ordinarily, epidemiologists do not consider ecologic data such as these sufficient for causal interpretations. Since the data are based on populations, no information is available on the exposure and disease status of individuals. Such data cannot be controlled adequately for confounding factors or for selection bias. Although ecologic data may be consistent with an inverse association between radiation and cancer, they may not be used to make causal inferences.
A second type of epidemiologic study that has been used to evaluate the association between exposure to radiation and disease is the retrospective cohort study. Persons who have had past exposure to radiation are followed forward in time, and the rate of disease is compared between exposed and nonexposed subjects or between exposed subjects and the general population. Especially valuable are occupational studies that include both unexposed and exposed subjects, so that a dose-response evaluation can be made of the relation between radiation exposure and health outcome. Typically, study populations in retrospective cohort studies include persons who have worked with radiation in medical facilities or in the nuclear industry or patients with cancer or other disease who have been treated with radiation.
It is common in cohort studies of occupational populations to observe that the overall mortality rate is lower than that of the general population, commonly by about 15%. This is not interpreted to mean that work per se reduces the risk of mortality, but rather that healthy persons enter the workforce more often than unhealthy persons (Monson 1990). The term “healthy worker effect” (HWE) is commonly used to describe this observation. Diseases such as cancer that develop in later life ordinarily show less of an HWE than noncancerous diseases. The HWE is observed in most occupational studies, including those of radiation workers, and should not be interpreted to mean that low doses of radiation prevent death from cancer or other causes.
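The healthy worker effect is usually quantified as a standardized mortality ratio; the death counts below are hypothetical numbers chosen only to illustrate a mortality deficit of about 15%:

```python
# Hypothetical illustration of a healthy worker effect of ~15%.
# The counts are invented, not taken from any study cited in the text.

def smr(observed_deaths, expected_deaths):
    """Standardized mortality ratio: observed deaths in the cohort divided
    by the deaths expected if the cohort died at general-population rates."""
    return observed_deaths / expected_deaths

# A worker cohort with 850 observed deaths where general-population
# rates would predict 1000 deaths:
ratio = smr(850, 1000)
print(ratio)  # 0.85, i.e., mortality about 15% below the general population
```

An SMR below 1.0 in such a cohort is therefore the expected consequence of selection into employment, not evidence that the occupational exposure is protective.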
A third type of epidemiologic study that has been used to evaluate the association between exposure to radiation and disease is the case-control study. Persons with a specific disease are compared to a control group of persons without the disease with respect to their past exposure to radiation. This type of study is unusual in radiation epidemiology, in that most general populations have relatively low exposure to radiation.
While no phenomenon similar to the HWE is observed in case-control studies, the play of chance is always operative, as it is in cohort studies. Thus, if some exposure does not cause cancer and a number of case-control studies are conducted, the odds ratios that describe the association between exposure and disease will be distributed around 1.0 simply by chance. Some studies will have an odds ratio less than 1.0; others will have an odds ratio greater than 1.0. In interpreting these studies, it is inappropriate to select only those that are consistent with an excess or deficit of disease. Rather, the entire distribution must be examined to assess the likely relationship between exposure and disease.
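This scatter of odds ratios under a true null can be demonstrated by simulation; the exposure prevalence and study sizes below are arbitrary illustrative choices:

```python
import random

# Simulation sketch: many small case-control studies of an exposure with no
# true effect on disease. Exposure prevalence (0.3) and study size
# (200 cases, 200 controls) are arbitrary illustrative assumptions.

def null_odds_ratio(n_cases, n_controls, exposure_prev, rng):
    """Odds ratio from one simulated study in which exposure is
    unrelated to disease status."""
    exposed_cases = sum(rng.random() < exposure_prev for _ in range(n_cases))
    exposed_controls = sum(rng.random() < exposure_prev for _ in range(n_controls))
    a, b = exposed_cases, n_cases - exposed_cases
    c, d = exposed_controls, n_controls - exposed_controls
    return (a * d) / (b * c)

rng = random.Random(0)  # fixed seed for reproducibility
ors = [null_odds_ratio(200, 200, 0.3, rng) for _ in range(1000)]
below = sum(v < 1.0 for v in ors)
print(below, 1000 - below)  # roughly half the studies fall on each side of 1.0
```

Selecting only the studies with odds ratios below 1.0 from such a collection would manufacture an apparent protective effect out of pure chance, which is exactly the interpretive error the text warns against.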
The studies discussed here illustrate the variability that is inherent in all epidemiologic studies and the need to evaluate the entire body of relevant literature in order to assess possible associations between radiation and disease, be they positive or negative. In its evaluation of the literature and in its discussions, the committee has found no consistent evidence in the epidemiologic literature that low doses of ionizing radiation lower the risk of disease or death. Some studies show isolated positive associations between radiation exposure and disease, and some show isolated negative associations. However, the weight of the evidence does not lead to the interpretation that low doses of radiation exert what in biological terms is called hormesis.
The committee concludes that the assumption that any stimulatory hormetic effects from low doses of ionizing radiation will have a significant health benefit to humans that exceeds potential detrimental effects from the radiation exposure is unwarranted at this time.