The Dietary Reference Intakes (DRIs) are developed by committees that are independent of the process by which the Dietary Guidelines for Americans (DGA) are established. The DRIs provide a set of values describing the nutrient needs of apparently healthy populations. Although the values pertain to single nutrients rather than to foods and dietary patterns, the DRIs have played a key role in the evidence review of the Dietary Guidelines Advisory Committee (DGAC) and in the recommendations of the DGA Policy Report. The DRIs underpin the DGA through the philosophy that a person who follows the recommendations of the DGA Policy Report will meet nearly all nutrient requirements established by the DRIs (Britten et al., 2006). When DGAC conclusions change over time, food pattern modeling is applied to determine whether DRI values remain attainable. Given the interface between the DRIs and the DGA, the sections that follow provide context by presenting a historical perspective on the DRIs, outlining how the DRIs are intended to be used to assess nutritional adequacy and excess, and describing the extent to which chronic disease end points have informed the DRIs.
A HISTORICAL PERSPECTIVE ON THE DIETARY REFERENCE INTAKES
Between 1941 and 1994, the Food and Nutrition Board (FNB) of what is now the National Academies of Sciences, Engineering, and Medicine issued 10 successive editions of the Recommended Dietary Allowances
(RDAs). As stated in the first of the reports, the RDAs were to act as a “guide to serve as a goal for good nutrition and as a ‘yardstick’ by which to measure progress toward that goal” (NRC, 1941, p. 1). These RDAs provided intake values for essential nutrients, which “were defined as chemical substances found in food that are necessary for human life and tissue growth and repair” (IOM, 1994, p. 8). RDA values were based, in large part, on prevention of the deficiency disease associated with lack of the specific essential nutrient, plus a margin of safety above this number to ensure good nutrition and protect all body tissues, termed “nutritional adequacy.” Canada had a parallel process that produced values called Recommended Nutrient Intakes (RNIs) and resulted in recommendations similar to those put forth in the United States.
As science progressed, the RDA reports expanded the number of nutrients included and the types of biochemical end points used for establishing values. The RDAs also expanded to cover both genders and all life stages. Concomitantly, there was growing interest in, and emphasis on, decreasing the risk of chronic disease through diet, culminating in the report Diet and Health: Implications for Reducing Chronic Disease Risk (NRC, 1989), which in turn was based in part on two major secondary sources: The Surgeon General’s Report on Nutrition and Health and Nutrition and Your Health: Dietary Guidelines for Americans (HHS, 1988; USDA/HHS, 1985). Diet and Health addressed the science base for the relationships among nutrients, foods, and dietary patterns and the leading diet-related causes of morbidity and mortality in the United States at that time (NRC, 1989) (see Box E-1).
When Diet and Health was released, the FNB was in the process of determining whether to revise the tenth edition of the RDAs. Opinions differed on “the challenge of whether to bring together the concepts of a health-promoting diet to reduce the risk of chronic disease and the nutrient-specific concepts underlying the RDAs” (IOM, 1994, p. 4). Ultimately, the FNB took a two-phase approach: (1) decide whether the RDAs should be revised; and (2) if so, determine the approach, strategy, and scope of work to revise them. To address the first phase, a public hearing was held in June 1993. One of the five questions posed to the speakers and audience was: “Should concepts of chronic disease prevention be included in the development of allowances?” (IOM, 1994, p. vi). At the end of the meetings, the speakers and testifiers unanimously agreed that it was time to revise the RDAs. The FNB produced a concept paper that summarized the symposium, public hearing, and discussions, stating:
The science of human nutrition stands at a pivotal point in its development. We now understand not only that nutrients are essential for growth and development and health maintenance, but also that some play a role in the reduction of risk of chronic disease. (IOM, 1994, p. 1)
One of the conclusions from the concept paper states: “Reduction in the risk of chronic disease is a concept that should be included in the formulation of future RDAs where sufficient data for efficacy and safety exist” (IOM, 1994, p. 18).
In contrast to the previous separate RDA reports in the United States and RNI reports in Canada, in 1994 the two countries agreed to develop one set of standards for both nations. The replacement and expansion of the RDAs and RNIs by the DRIs is recognized as a key paradigm shift in how nutrient intakes are evaluated (IOM, 2006; Murphy et al., 2016). In addition to setting values for determining nutrient adequacy, the new reference intakes placed emphasis on chronic disease prevention in apparently healthy populations. Another important difference in the DRIs was the expansion of the types of values (see Box 7-1). The introduction of the Estimated Average Requirement (EAR), for instance, allowed progress toward making population-based recommendations. A series of DRI reports has since been published, providing intake values for energy, macronutrients, and micronutrients (IOM, 1997, 1998, 2000a, 2001, 2005a,b, 2011). A variety of reports have also explored and explained how the DRIs should be operationalized (IOM, 2000b, 2003a,b).
ASSESSING NUTRIENT ADEQUACY USING DIETARY REFERENCE INTAKE VALUES
The DRIs serve as a reference against which dietary intakes can be compared and can be used to assess the intakes of individuals. The current guidance for application of the DRIs indicates that the EAR, RDA, and Adequate Intake (AI) can be used to gauge the likelihood of usual dietary intake being adequate (IOM, 2000b). The evaluation of an individual’s usual intake, however, “is imprecise and must be interpreted cautiously in combination with other types of information about the individual” (IOM, 2000b, p. 7).
Estimating the prevalence of inadequacy in groups requires different methods from those used for assessing individuals. The RDA, for instance, is not appropriate for assessing groups and should not be used for that purpose. For nutrients with an EAR, the analysis can be performed using the probability approach or the EAR cut-point method. The probability approach rests on two key assumptions: (1) independence of intake and requirement, and (2) a known distribution of requirements (IOM, 2000b). This approach can be computationally challenging because it requires selecting an appropriate probability model for the requirement distribution. The EAR cut-point method is a shortcut to the probability approach and is performed by determining the proportion of the group with usual intakes below the EAR. This method typically yields results similar to those of the probability approach, and works particularly well when
intakes are accurately measured, actual prevalence in the group is neither very low nor very high, estimated usual intakes of individuals are independent of each individual’s requirements, the distribution of requirements is approximately symmetrical, and variability in intakes among individuals in the group is greater than the variability in requirements of the individual. (IOM, 2000b, p. 81)
Prevalence of nutrient inadequacy calculated using the EAR cut-point method can be underestimated or overestimated if one or more of the aforementioned assumptions are not met. Iron requirements do not meet the assumption of symmetry around the EAR, particularly in menstruating women, and use of the EAR cut-point method may therefore lead to biased estimates of iron inadequacy (IOM, 2000b). An assessment of iron intake, especially among adolescent girls and premenopausal women, necessitates the use of the probability approach (IOM, 2000b). For nutrients with an AI, a mean group intake at or above the AI suggests that the prevalence of inadequacy is low.^1 The AI cannot, however, be used to determine the prevalence of inadequate nutrient intake for groups (IOM, 2000b).

^1 Not all AIs are established based on indicators of inadequacy. Assessment of inadequacy in groups using such nutrients is made with less confidence.
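The two group-level methods described above can be illustrated with a short sketch. The intake values, the EAR, and the assumption that requirements are normally distributed with a 10 percent coefficient of variation are all hypothetical, chosen only to show how the cut-point estimate approximates the probability-approach estimate; they are not drawn from any DRI report.

```python
from statistics import NormalDist

# Hypothetical usual intakes (mg/day) for a group; illustrative only.
intakes = [55, 62, 70, 48, 81, 90, 66, 58, 74, 103, 69, 77]

# Assumed requirement distribution: the EAR is the median requirement.
# A normal distribution with a 10% coefficient of variation is a common
# working assumption when the true requirement distribution is unknown.
ear = 65.0
req_dist = NormalDist(mu=ear, sigma=0.10 * ear)

# EAR cut-point method: the estimated prevalence of inadequacy is simply
# the proportion of individuals whose usual intake falls below the EAR.
cut_point_prev = sum(x < ear for x in intakes) / len(intakes)

# Probability approach: for each individual, compute the probability that
# a randomly drawn requirement exceeds that usual intake, then average.
prob_prev = sum(1 - req_dist.cdf(x) for x in intakes) / len(intakes)

# Under the symmetric-requirements assumption the two estimates are close.
print(f"Cut-point prevalence:   {cut_point_prev:.2f}")
print(f"Probability prevalence: {prob_prev:.2f}")
```

When requirements are skewed, as for iron in menstruating women, the normal model above is inappropriate; the probability approach would instead use the skewed requirement distribution, which is exactly the case where the cut-point shortcut breaks down.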
ASSESSING RISK CAUSED BY EXCESSIVE INTAKES USING DIETARY REFERENCE INTAKE VALUES
The Tolerable Upper Intake Level (UL) does not represent an optimal or recommended intake level. Instead, it describes the highest intake level of a nutrient that is likely to pose little to no risk of a selected critical adverse effect for a given life stage and gender group. Table E-1 lists the adverse effects used to establish the existing ULs and shows that the values are largely not based on chronic disease end points or on surrogate markers of chronic disease. Not all life stage groups or nutrients have a UL, which can make the assessment of excess consumption challenging. The rationale for not assigning a UL for saturated fat, for example, was that, given the positive linear trends, “any incremental increase in saturated fatty acid intake increases [coronary heart disease] risk” (IOM, 2005a, p. 485).
INCLUSION OF CHRONIC DISEASE END POINTS IN DIETARY REFERENCE INTAKE VALUES
The vision for the DRIs was to incorporate chronic disease risk reduction into the development of nutrient intake values as evidence emerged (IOM, 1994). The process for doing so has proven difficult. Determining a nutrient intake level that reflects the probability of developing a chronic disease does not work the same way as preventing a single-nutrient deficiency condition, because chronic diseases are multifactorial and the absolute risk of a chronic disease in a given population is rarely 100 percent. Only five nutrients have nutritional adequacy DRI values that integrate chronic disease end points or surrogate markers of chronic disease (see Table E-2). Despite the complexities, efforts are currently under way to move toward using chronic disease end points to establish DRIs.^2
^2 A multidisciplinary working group sponsored by the Canadian and U.S. government DRI steering committees met from late 2014 through April 2016 to consider how to base DRI values on chronic disease end points. The working group produced a report that provided extensive discussion of the issues and ideas for paths forward (Yetley et al., 2017). An ad hoc consensus committee of the National Academies of Sciences, Engineering, and Medicine recently released a report that reviews the options presented by Yetley et al. (2017) and recommends methods and guiding principles for including chronic disease end points in the DRI process (NASEM, 2017).
TABLE E-1 Critical Adverse Effects Used to Establish Tolerable Upper Intake Levels in the Dietary Reference Intakes
| Nutrient | Critical Adverse Effect Used to Establish Tolerable Upper Intake Level |
|---|---|
| Boron^a | Reproductive and developmental effects^b,c |
| Calcium | Calcium excretion,^d kidney stone formation^e,f,g |
| Chloride^a | Blood pressure status^h |
| Choline^a | Hypotension, fishy body odor^c,i |
| Fluoride | Enamel and skeletal fluorosis |
| Folate^a,j | Precipitating or exacerbating neuropathy in individuals deficient in B12^c |
| Iodine^a | Elevated TSH concentration^c |
| Iron | Gastrointestinal side effects |
| Magnesium^a,k | Diarrhea and other gastrointestinal issues^f |
| Manganese | Elevated blood concentrations and neurotoxicity^c |
| Nickel^a,l | Decreased body weight gain^b,f |
| Selenium | Hair and nail brittleness and loss |
| Sodium^a | Blood pressure status^q |
| Vitamin A | Teratogenicity,^s liver abnormalities,^t hypervitaminosis A^c,u |
| Vitamin B6^a,v | Sensory neuropathy^c |
| Vitamin C^a | Osmotic diarrhea and related gastrointestinal disturbances^c |
| Vitamin D | Hypercalcemia and related toxicity^w |
| Vitamin E^a | Hemorrhagic effects^b,c |
| Zinc | Adverse effect on copper metabolism (i.e., reduced copper status)^x |
NOTES: Tolerable Upper Intake Levels are established based on intake from food, water, and supplements, unless otherwise noted. TSH = thyroid-stimulating hormone.
^a UL not determinable for infants ages 0 to 12 months.
^b As observed in animal models.
^c UL for children and adolescents (ages 1 to 18 years) was derived from the UL for adults.
^d Used for infants.
^e Used for adults.
^f UL for children (ages 1 to 8 years) was derived from the UL for adults.
^g UL for older children (ages 9 to 18 years) was derived from the UL for adults, with an additional amount added to account for metabolic demand increases and pubertal growth spurts.
^h Chloride is assumed to be consumed in equimolar amounts with sodium. The UL, therefore, is the equimolar equivalent of the UL for sodium.
^i Considered a secondary consideration.
^j Limited to supplemental folate intake.
^k UL is established for magnesium from nonfood sources.
^l Derived from intake as soluble nickel salts.
^m UL for adults (ages 19 to 70 years) was derived by dividing the approximate upper boundary of normal serum inorganic phosphate levels in adults by an uncertainty factor.
^n UL for toddlers and children (ages 1 to 8 years) was derived by dividing the approximate upper boundary of normal serum inorganic phosphate levels in adults by a larger uncertainty factor than used for the adult UL, to account for smaller body size.
^o Because of the lack of evidence of greater susceptibility to adverse effects, the UL for adolescents is the same as for adults.
^p UL for older adults (ages > 70 years) was derived by dividing the approximate upper boundary of normal serum inorganic phosphate levels in adults by a larger uncertainty factor than used for the adult UL, to account for the increased prevalence of impaired renal function.
^q UL for children (ages 1 to 18 years) was extrapolated from the UL for adults, based on estimated energy intakes.
^r UL only determined for adults 19 years and older who are not pregnant or lactating.
^s For women of childbearing age.
^t For all other adults.
^u Case reports in infants were used to derive a UL.
^v UL for B6 is based on evidence from oral supplemental doses of pyridoxine.
^w UL for children (ages 1 to 8 years) was derived from the UL for adults. The UL for older children and adolescents (ages 9 to 18 years) is the same as for adults.
^x Because of a lack of available data, the ULs for older infants, children, and adolescents were extrapolated from the UL for young infants.
TABLE E-2 Nutritional Adequacy DRIs That Integrate Chronic Disease End Points and/or Surrogate Markers of Chronic Disease
| Nutrient | Adequacy DRI Value(s) | Chronic Disease End Point |
|---|---|---|
| Calcium | EAR, RDA | Bone health (accretion, maintenance, and loss)^a |
| Potassium | AI | Combination of end points (salt sensitivity, kidney stones, blood pressure) |
| Total fiber | AI | Risk of coronary heart disease |
| Vitamin D | EAR, RDA | Bone health (accretion, maintenance, and loss)^b |
NOTES: The UL for sodium is based on blood pressure, which is considered a surrogate marker for chronic disease. Because the UL does not reflect a level of nutritional adequacy but rather represents an intake level above which the risk of adverse effects increases, sodium is not included in this table. AI = Adequate Intake; DRI = Dietary Reference Intake; EAR = Estimated Average Requirement; RDA = Recommended Dietary Allowance.
^a Measures varied by DRI life stage group and incorporated data on calcium balance, which is not directly linked to a specific chronic disease end point.
^b Based on serum 25-hydroxyvitamin D concentrations needed to achieve bone health.
REFERENCES
Britten, P., K. Marcoe, S. Yamini, and C. Davis. 2006. Development of food intake patterns for the MyPyramid food guidance system. Journal of Nutrition Education and Behavior 38(6 Suppl):S78-S92.
HHS (U.S. Department of Health and Human Services). 1988. The Surgeon General’s report on nutrition and health. Washington, DC: U.S. Government Printing Office. https://profiles.nlm.nih.gov/ps/access/nnbcqh.pdf (accessed May 15, 2017).
IOM (Institute of Medicine). 1994. How should the Recommended Dietary Allowances be revised? Washington, DC: National Academy Press.
IOM. 1997. Dietary Reference Intakes for calcium, phosphorus, magnesium, vitamin D, and fluoride. Washington, DC: National Academy Press.
IOM. 1998. Dietary Reference Intakes for thiamin, riboflavin, niacin, vitamin B6, folate, vitamin B12, pantothenic acid, biotin, and choline. Washington, DC: National Academy Press.
IOM. 2000a. Dietary Reference Intakes for vitamin C, vitamin E, selenium, and carotenoids. Washington, DC: National Academy Press.
IOM. 2000b. Dietary Reference Intakes: Applications in dietary assessment. Washington, DC: National Academy Press.
IOM. 2001. Dietary Reference Intakes for vitamin A, vitamin K, arsenic, boron, chromium, copper, iodine, iron, manganese, molybdenum, nickel, silicon, vanadium, and zinc. Washington, DC: National Academy Press.
IOM. 2003a. Dietary Reference Intakes: Applications in dietary planning. Washington, DC: The National Academies Press.
IOM. 2003b. Dietary Reference Intakes: Guiding principles for nutrition labeling and fortification. Washington, DC: The National Academies Press.
IOM. 2005a. Dietary Reference Intakes for energy, carbohydrate, fiber, fat, fatty acids, cholesterol, protein, and amino acids. Washington, DC: The National Academies Press.
IOM. 2005b. Dietary Reference Intakes for water, potassium, sodium, chloride, and sulfate. Washington, DC: The National Academies Press.
IOM. 2006. Dietary Reference Intakes: The essential guide to nutrient requirements. Washington, DC: The National Academies Press.
IOM. 2011. Dietary Reference Intakes for calcium and vitamin D. Washington, DC: The National Academies Press.
Murphy, S. P., A. A. Yates, S. A. Atkinson, S. I. Barr, and J. Dwyer. 2016. History of nutrition: The long road leading to the Dietary Reference Intakes for the United States and Canada. Advances in Nutrition: An International Review Journal 7(1):157-168.
NASEM (National Academies of Sciences, Engineering, and Medicine). 2017. Guiding principles for developing Dietary Reference Intakes based on chronic disease. Washington, DC: The National Academies Press.
NRC (National Research Council). 1941. Recommended Dietary Allowances. Washington, DC: National Academy Press.
NRC. 1989. Diet and health: Implications for reducing chronic disease risk. Washington, DC: National Academy Press.
USDA/HHS (U.S. Department of Agriculture/U.S. Department of Health and Human Services). 1985. Nutrition and your health: Dietary Guidelines for Americans. 2nd ed. Washington, DC: U.S. Government Printing Office. https://health.gov/dietaryguidelines/1985.asp (accessed May 15, 2017).
Yetley, E. A., A. J. MacFarlane, L. S. Greene-Finestone, C. Garza, J. D. Ard, S. A. Atkinson, D. M. Bier, A. L. Carriquiry, W. R. Harlan, D. Hattis, J. C. King, D. Krewski, D. L. O’Connor, R. L. Prentice, J. V. Rodricks, and G. A. Wells. 2017. Options for basing Dietary Reference Intakes (DRIs) on chronic disease endpoints: Report from a joint US-/Canadian-sponsored working group. American Journal of Clinical Nutrition 105(1):249S-285S.