toxicity. Then an evaluation can be made of whether intake at the UL has the potential to affect the bioavailability of other nutrients.
Possible adverse nutrient-nutrient interactions, then, are considered as a part of setting a UL. Nutrient-nutrient interactions may be considered either as a critical endpoint on which to base a UL for that nutrient or as supportive evidence for a UL based on another endpoint.
Other Relevant Factors Affecting Bioavailability of Nutrients
In addition to nutrient interactions, other considerations have the potential to influence nutrient bioavailability, such as the nutritional status of an individual and the form of intake. These issues should be considered in the risk assessment. The absorption and utilization of most minerals, trace elements, and some vitamins are a function of the individual's nutritional status, particularly regarding the intake of other specific nutrients such as iron (Barger-Lux et al., 1995; Mertz et al., 1994).
With regard to the form of intake, minerals and trace elements often are less readily absorbed when they are part of a meal than when taken separately or when present in drinking water (NRC, 1989). The opposite is true for fat-soluble vitamins whose absorption depends on fat in the diet. ULs must therefore be based on nutrients as part of the total diet, including the contribution from water. Nutrient supplements that are taken separately from food require special consideration, since they are likely to have different availabilities and therefore may represent a greater risk of producing toxic effects.
Steps in the Development of the Tolerable Upper Intake Level
Step 1. Hazard Identification
Based on a thorough review of the scientific literature, the hazard identification step outlines the adverse health effects that have been demonstrated to be caused by the nutrient (see Box 2). The primary types of data used as background for identifying nutrient hazards in humans are as follows:
- Human studies. Human data provide the most relevant kind of information for hazard identification and, when they are of sufficient quality and extent, are given greatest weight. However, the number of controlled human toxicity studies conducted in a clinical setting is very limited for ethical reasons. Such studies are generally most useful for identifying very mild (and ordinarily reversible) adverse effects. Observational studies that focus on well-defined populations with clear exposures to a range of nutrient intake levels are useful for establishing a relationship between exposure and effect. Observational data in the form of case reports or anecdotal evidence are used for developing hypotheses that can lead to knowledge of causal associations. Sometimes a series of case reports, if it shows a clear and distinct pattern of effects, may be reasonably convincing on the question of causality.
BOX 2 Development of Tolerable Upper Intake Levels (ULs)
Components of Hazard Identification
Components of Dose-Response Assessment
- Animal studies. The majority of the available data used in regulatory risk assessments comes from controlled laboratory experiments in animals, usually mammalian species other than humans (for example, rodents). Such data are used in part because human data on food-derived substances, particularly nonessential chemicals, are generally very limited. Because well-conducted animal studies can be controlled, establishing causal relationships is generally not difficult. However, cross-species differences make the usefulness of animal data for establishing ULs problematic (see below).
Key issues that are addressed in the data evaluation of human and animal studies are the following:
- Evidence of adverse effects in humans. The hazard identification step involves the examination of human, animal, and in vitro published evidence addressing the likelihood of a nutrient or food component eliciting an adverse effect in humans. Decisions regarding which observed effects are adverse are based on scientific judgments. Although toxicologists generally regard any demonstrable structural or functional alteration to represent an adverse effect, some alterations may be considered of little or self-limiting biological importance.
As noted earlier, adverse nutrient-nutrient interactions are considered in the definition of an adverse effect.
- Causality. Is a causal relationship established by the published human data? The criteria of Hill (1971) are considered in judging the causal significance of an exposure-effect association indicated by epidemiologic studies. These criteria include: demonstration of a temporal relationship, consistency, narrow confidence intervals for risk estimates, a biological gradient or dose response, specificity of effect, biological plausibility, and coherence.
- Relevance of experimental data. Consideration of the following issues can be useful in assessing the relevance of experimental data.
Animal data. Animal data may be of limited utility in judging the toxicity of nutrients because of highly variable interspecies differences in nutrient requirements. Nevertheless, relevant animal data are considered in the hazard identification and dose-response assessment steps where applicable.
Route of exposure.3 Data derived from studies involving oral exposure (rather than parenteral, inhalation, or dermal exposure) are most useful for the evaluation of nutrients and food components. Data derived from studies involving parenteral, inhalation, or dermal routes of exposure may be considered relevant if the adverse effects are systemic and data are available to permit interroute extrapolation.
Duration of exposure. Because the magnitude, duration, and frequency of exposure can vary considerably in different situations, consideration needs to be given to the relevance of the exposure scenario (for example, chronic daily dietary exposure versus short-term bolus doses) to dietary intakes by human populations.
- Mechanisms of toxic action. Knowledge of molecular and cellular events underlying the production of toxicity can assist in dealing with the problems of extrapolation between species and from high to low doses. It may also aid in understanding whether the mechanisms associated with toxicity are those associated with deficiency. In most cases, however, because knowledge of the biochemical sequence of events resulting from toxicity and deficiency is still incomplete, it is not yet possible to state with certainty whether or not these sequences share a common pathway. Iron, the most thoroughly studied trace element, may represent the only exception to this statement. Deficient to near-toxic exposures share the same pathway, which maintains controlled oxygen transport and catalysis. Toxicity sets in when the exposure exceeds the specific iron-complexing capacity of the organism, resulting in free iron species initiating peroxidation.
- Quality and completeness of the database. The scientific quality and quantity of the database are evaluated. Human or animal data are reviewed for suggestions that the substances have the potential to produce additional adverse health effects. If suggestions are found, additional studies may be recommended.
- Identification of distinct and highly sensitive subpopulations. The ULs are based on protecting the most sensitive members of the general population from adverse effects of high nutrient intake. Some highly sensitive subpopulations have responses (in terms of incidence, severity, or both) to the agent of interest that are clearly distinct from the responses expected for the healthy population. The risk assessment process recognizes that there may be individuals within any life stage group who are more biologically sensitive than others and whose extreme sensitivities fall outside the range of sensitivities expected for the general population. The UL for the general population may not be protective for these subgroups. As indicated earlier, the extent to which a distinct subpopulation will be included in the derivation of a UL for the general population is an area of judgment to be addressed on a case-by-case basis.
Step 2. Dose-Response Assessment
The process for deriving the UL is described in this section and outlined in Box 2. It includes selection of the critical data set, identification of a critical endpoint with its NOAEL (or LOAEL), and assessment of uncertainty.
The data evaluation process results in the selection of the most appropriate or critical data set(s) for deriving the UL. Selecting the critical data set includes the following considerations:
- Human data are preferable to animal data.
- In the absence of appropriate human data, information from an animal species whose biological responses are most like those of humans is most valuable.
- If it is not possible to identify such a species or to select such data, data from the most sensitive animal species, strain, or gender combination are given the greatest emphasis.
- The route of exposure that most resembles the route of expected human intake is preferable. This includes considering the digestive state (for example, fed or fasted) of the subjects or experimental animals. Where this is not possible, the differences in route of exposure are noted as a source of uncertainty.
- The critical data set defines a dose-response relationship between intake and the extent of the toxic response known to be most relevant to humans. Data on bioavailability are considered, and adjustments in expressions of dose response are made, to determine whether any apparent differences in response can be explained. For example, it is known that different metal salts can display different degrees of bioavailability. If the database involves studies of several different salts (for example, iron or chromium valence states), and the effect of the nutrient is systemic, then apparent differences in the degree and/or form of the toxic response among different salts may simply reflect differences in bioavailability.
- The critical data set documents the route of exposure and the magnitude and duration of the intake. Furthermore, the critical data set documents the intake that does not produce adverse effects (the NOAEL), as well as the intake producing toxicity.
Identification of NOAEL (or LOAEL) and Critical Endpoint
A nutrient can produce more than one toxic effect (or endpoint), even within the same species or in studies using the same or different exposure durations. The NOAELs and LOAELs for these effects will differ. The critical endpoint used to establish a UL is the adverse biological effect exhibiting the lowest NOAEL (that is, the most sensitive indicator of a nutrient's or food component's toxicity). The derivation of a UL based on the most sensitive endpoint will ensure protection against all other adverse effects.
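The selection rule described above can be sketched in a few lines of code. In this illustrative sketch, the endpoint names and NOAEL values are entirely hypothetical and are not taken from the report; the point is only that the endpoint with the lowest NOAEL is the one that drives the UL.

```python
# Hypothetical endpoints and NOAELs (mg/day) for a single nutrient.
# These names and numbers are invented for illustration only.
endpoints = {
    "mild gastrointestinal upset": 40.0,
    "elevated liver enzymes": 120.0,
    "peripheral neuropathy": 500.0,
}

# The critical endpoint is the adverse effect with the lowest NOAEL;
# basing the UL on it protects against all the other effects as well.
critical_endpoint = min(endpoints, key=endpoints.get)
critical_noael = endpoints[critical_endpoint]

print(critical_endpoint)  # mild gastrointestinal upset
print(critical_noael)     # 40.0
```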
For some nutrients, there may be inadequate data on which to develop a UL. The lack of reports of adverse effects following excess intake of a nutrient does not mean that adverse effects do not occur. As the intake of any nutrient increases, a point (A, see Figure 2) is reached at which intake begins to pose a risk. Above this point, increased intake increases the risk of adverse effects. For some nutrients, and for various reasons, there are inadequate data to identify point A, or even to make any estimate of its location.
Because adverse effects are almost certain to occur for any nutrient at some level of intake, it should be assumed that such effects may occur for nutrients for which a scientifically documentable UL cannot now be derived. Until a UL is set or an alternative approach to identifying protective limits is developed, intakes greater than the RDA or AI should be viewed with caution.
Several judgments must be made regarding the uncertainties and thus the uncertainty factor (UF) associated with extrapolating from the observed data to
the general population (see Appendix B). Applying a UF to a NOAEL (or LOAEL) results in a value for the derived UL that is less than the experimentally derived NOAEL, unless the UF is 1.0. The larger the uncertainty, the larger the UF and the smaller the UL. This is consistent with the ultimate goal of the risk assessment: to provide an estimate of a level of intake that will protect the health of the healthy population (Mertz et al., 1994).
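The arithmetic behind this relationship is simple: the UL is obtained by dividing the NOAEL (or LOAEL) by the uncertainty factor. The following sketch, using hypothetical numbers rather than values from the report, shows why a UF of 1.0 leaves the UL equal to the NOAEL and why a larger UF yields a smaller, more conservative UL.

```python
def tolerable_upper_intake_level(noael: float, uf: float) -> float:
    """Derive a UL by dividing the NOAEL (or LOAEL) by the uncertainty
    factor (UF). A UF of 1.0 leaves the UL equal to the NOAEL; the
    larger the UF, the smaller (more conservative) the resulting UL."""
    if uf < 1.0:
        raise ValueError("UF must be at least 1.0")
    return noael / uf

# Hypothetical NOAEL of 100 mg/day:
print(tolerable_upper_intake_level(100.0, 1.0))  # 100.0 (no uncertainty adjustment)
print(tolerable_upper_intake_level(100.0, 2.0))  # 50.0 (greater uncertainty, smaller UL)
```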
Although several reports describe the underlying basis for UFs (Dourson and Stara, 1983; Zielhuis and van der Kreek, 1979), the strength of the evidence supporting the use of a specific UF will vary. Because the imprecision of the UFs is a major limitation of risk assessment approaches, considerable leeway must be allowed for the application of scientific judgment in making the final determination. Because data are generally available regarding intakes of nutrients and food components in human populations, data on nutrient toxicity may not be subject to the same uncertainties as data on nonessential chemical agents; as a result, UFs for nutrients and food components are typically less than 10. They are lower with higher-quality data and when the adverse effects are extremely mild and reversible.
In general, when determining a UF, the following potential sources of uncertainty are considered and combined into the final UF:
- Interindividual variation in sensitivity. Small UFs (close to 1) are used to represent this source of uncertainty if it is judged that little population variability is expected for the adverse effect, and larger factors (close to 10) are used if variability is expected to be great (NRC, 1994).