In the previous chapter, the committee reviewed the current process used by most authoritative bodies to derive nutrient reference values (NRVs) and offered a recommendation regarding which NRVs should be prioritized. The current process for deriving NRVs, while based on the King and Garza (2007) conceptual framework, has also been made more rigorous and transparent through the use of new tools and methods that were either unavailable or not used in the past. In this chapter, the committee examines and assesses in greater detail key steps in the nutrient review process, with a focus on how to assess relevant data from systematic reviews and other data resources and how to account for local context (e.g., culturally specific food choices and dietary patterns). Based on this assessment, the chapter concludes with the committee’s proposed framework for harmonizing approaches used to derive NRVs on a global scale. Throughout the chapter, the committee emphasizes the need for nutrient review panels to recognize and manage any uncertainties that may exist in either the data or methods used to derive NRVs. The committee also offers three separate recommendations to support a harmonized process for NRV development. It was beyond the committee’s task (see Box 1-2) and, thus, beyond the scope of this report to address specific ways to implement these recommendations, although some possible next steps are discussed in Chapter 5.
For methodological approaches to establishing NRVs to be adopted globally, certain key steps in the process should be consistent across countries. These key steps are illustrated as a flow diagram in Figure 3-1. Each step is discussed in detail in this chapter. The four steps of the King and Garza (2007) approach to deriving NRVs (see Chapter 2) are embedded in the flow diagram. In the figure, the option to update or adapt existing NRVs means that a full review is not necessary; instead, existing reference values are adjusted and the adjustments are documented. For new values, a full review is required.
Importantly, and as discussed throughout this chapter, the process to review any NRV must be transparent and include the following components:
- Document each stage of the process.
- Assess and document limitations in the data and methods.
- Consider uncertainties.
- Conduct a rigorous peer review of the nutrient review report.
The AGREE II instrument (see Appendix C) serves as an example of a generic tool that has been used by health care providers, guidelines developers, and policy makers to evaluate the quality of clinical practice guidelines, and it may serve as a useful template for similar applications in nutrition (Brouwers et al., 2010).
An Authoritative Body or Expert Panel Identifies a Nutrient for Review
The first step in deriving an NRV is identifying the nutrient(s) for review, as well as the age and sex groups in the relevant population. In some situations, the reference values may be reexamined only for select population subgroups, such as children or pregnant women. Reexamining existing NRVs requires justification, including either new evidence for the nutrient or policy changes, such as fortification, that make it necessary to reassess the values (NASEM, 2018). In most cases, an authoritative body, such as a sponsoring governmental agency, identifies the nutrient(s) to be reviewed from among those under consideration. The United States and Canada, for example, collaborate on the Dietary Reference Intakes (DRIs), and the European Commission sponsors the European Food Safety Authority’s (EFSA’s) scientific opinions on nutrient references (Atkinson, 2011; EFSA, 2017).
In the case of the joint U.S. and Canadian nutrient DRI review process, nutrient nominations were requested through formal notification. Candidate nutrients were accepted for consideration based on (1) a rationale and description of why a nutrient review was warranted, and how it would address a current public health concern, and (2) a documented literature search demonstrating new, relevant literature since the last nutrient DRI review. Another process that has been used is for the governmental agency or other authoritative body to request public input on candidate nutrients (as discussed by Peter Clifton at the Global Harmonization workshop) (NASEM, 2018).
The committee deliberated on the potential role of a central organizing body such as the World Health Organization (WHO) and the United Nations’ Food and Agriculture Organization (FAO), two international organizations responsible for facilitating cooperation in global health, nutrition, and agriculture. Setting and promoting international norms and standards is one of WHO’s responsibilities. FAO, similarly, is responsible for supporting international policies that make up-to-date nutrition information available (FAO, 2018). Given the overlap in their missions, WHO and FAO have a history of collaboration, most notably on the international food standards contained in the Codex Alimentarius (FAO/WHO, 2018a). The Codex Alimentarius Commission (Codex) is a vast and well-established international body. Its first formal meetings were in 1961, although regional Codex meetings started in the 1940s (FAO/WHO, 2018b).
WHO and FAO share funding for the Codex and have, over time, established a trust fund to support the capacity of participants in low- and middle-income countries to participate in the standard setting process (FAO/WHO, 2018d). Codex delegates set standards for, among other things, nutrition and labeling information, a topic inextricably related to NRVs (FAO/WHO, 2018c). This body is in a position to negotiate the logistics and funding of a similar effort for deriving NRVs. To illustrate, the Joint FAO/WHO Expert Committee on Food Additives is an international expert scientific committee that evaluates the safety of food additives. The committee conducts risk assessments and safety evaluations on a number of food additives and food contaminants. It is also concerned with developing principles for safety assessment of chemicals in foods that are aligned with current risk assessment approaches and methodologies.
Alternatively, a technical organization such as the International Union of Nutritional Sciences (IUNS) might have equally strong convening authority among the scientific experts needed for such a project. Its adhering bodies include scientific and nutrition societies in more than 80 countries, and one of the main objectives of IUNS is to encourage international scientific collaboration (IUNS, 2018a). IUNS is formally affiliated with various international and regional nutrition organizations, including
the African Nutrition Society, the Asia Pacific Clinical Nutrition Society, the International Zinc Consultative Group, and the Iodine Global Network (IUNS, 2018b).
Another possibility is that international collaboration may be more efficiently carried out at the regional level. Working through regional networks could be a particularly promising strategy in developing countries. Because regional economic communities such as the South African Development Community (SADC) and the Association of Southeast Asian Nations (ASEAN) are already committed to advancing regional development, there is precedent for such networks working on common problems related to nutrition and food security. The SADC Food and Nutrition Security Strategy 2015–2025 is meant to promote food fortification and good nutrition, goals that cannot be realized without suitable NRVs (SADC, 2014). The ASEAN nutrition program puts similar emphasis on tracking malnutrition in the region, something entirely dependent on accurate reference values (ASEAN, 2016, 2017). Elsewhere, the Mercosur trade partnership has harmonized nutrition labeling regulations and food labeling laws across South America.
Conclusions and Recommendation to Support the Process for Convening a Review of Nutrient Reference Values
The committee came to the following conclusions:
Two international organizations, WHO and FAO, both with missions to facilitate global cooperation in nutrition and health matters, present an opportunity for enabling harmonization of the process for deriving NRVs on a global scale by serving as a convener of an expert review panel. Convening a global expert panel would be ideal for promoting a harmonized process and making efficient use of the available resources. Alternatively, a technical organization such as IUNS might have equally strong convening authority among the scientific experts needed for such a project. Another possibility, one that is particularly promising in developing countries, is that international collaboration may be more efficiently carried out at the regional level.
The committee recognizes that political issues as well as country-specific conventions and traditions may prevent countries from fully committing to a shared process for deriving NRVs. For the near future, national governments may still convene their own expert panels for this process. In such cases, those national panels would make better use of their resources by adapting international
values to their context and making use of existing systematic reviews when available.
Recommendation 2. To set a nutrient reference value, ideally a global body, such as the World Health Organization (WHO), the United Nations’ Food and Agriculture Organization (FAO), or the International Union of Nutritional Sciences (IUNS), or secondarily, a regional consortium, should convene an expert panel to identify relevant outcome measures and request a systematic review for the nutrient of interest, and appoint a panel to advise on how to adapt the values to different population subgroups and settings.
Revise an Existing or Conduct a New Systematic Review
Key steps in the systematic review process are described in Chapter 2, beginning with the necessity of understanding the status of the evidence for the nutrient under review prior to initiating a review. This includes knowing whether there is an existing systematic review that can be updated or if a new review is needed. Evaluating the relevance of the evidence in an existing review involves careful selection of the eligibility requirements prior to implementing the review and consideration of clinical measures of deficiency or excess, measurable and validated biomarkers for the nutrient of interest, and other factors. If it is decided that a new systematic review is needed, there are several elements that can increase the strength, rigor, and transparency of the process to derive NRVs. These include identifying population, intervention/exposure, comparator, and outcome of interest (PICO/PECO) elements; assessing uncertainty in the evidence (i.e., appraising risk of bias in individual studies); assessing bias in the systematic review process; and grading the strength of the body of evidence. Each of these is discussed below.
Identify PICO/PECO Elements
As stated in Chapter 2, the evidence to be included in a systematic review is selected according to eligibility criteria defined a priori, including criteria specified for PICO/PECO tables. Indeed, the PICO/PECO process serves as a core organizing framework for a systematic review. PICO/PECO elements are used to build conceptual relationships among the research questions being addressed, guide the identification of a database to search (such as PubMed or Embase), and define the search strategy. If there is a previous systematic review on a nutrient, the PICO/PECO process may serve as the basis for an updated review.
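To make the organizing role of these elements concrete, the sketch below captures a PICO/PECO frame as a simple data structure; the field names, example values, and naive keyword list are illustrative assumptions, not part of any standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class PicoFrame:
    """Illustrative container for the a priori eligibility criteria of one review question."""
    population: str                # P: demographic group of interest
    exposure: str                  # I/E: nutrient intake route and dose range
    comparator: str                # C: placebo, nonintervention, or intake stratum
    outcomes: list = field(default_factory=list)  # O: outcomes, ranked by priority

    def search_terms(self):
        """Naive keyword list that could seed a PubMed or Embase search string."""
        return [self.population, self.exposure] + self.outcomes

# Hypothetical example: zinc requirements in young children
frame = PicoFrame(
    population="children 1-3 years",
    exposure="dietary zinc intake (food and supplements)",
    comparator="lowest intake stratum",
    outcomes=["plasma zinc concentration", "linear growth"],
)
```

In practice such a frame would be elaborated into full eligibility tables, one per age and sex group, as discussed below.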
In identifying the P component, that is, the population of interest,
relevant demographic factors, including age, sex, and ethnicity, need to be considered, as well as the public health significance. As an example of the latter, women of reproductive age are a vulnerable group, and as such, they require special consideration because of the complexity of determining the nutritional needs of a population subgroup with limited access to clinical research.
Deriving NRVs is usually based on data from a healthy reference population (see discussion in Chapter 2). When the population of interest is not healthy, such as when a population has a high prevalence of infection, or when the population has unique physiological characteristics such as obesity, advanced age, or pregnancy, reference values must be adjusted accordingly. Studies on individuals suffering from chronic infections may be included, but separated later by a subgroup analysis.
The I/E or intervention/exposure component of PICO/PECO refers to exposure to the nutrient. To identify I/E elements, expert review panels judge the quality of intake data from foods and other substances, such as supplements or water (e.g., water can be a dietary route for iodine and magnesium). In reviewing the literature, it may be necessary to exclude research when the dose of the nutrient in question is not within a biologically relevant range.
The C or comparator or control component facilitates assessment of the effect of an alternative to the intervention. For data from randomized controlled trials (RCTs), for example, comparisons can be made relative to a placebo group, to a nonintervention group, or an existing standard of care group. In observational cohort studies, the comparison could be performed between groups stratified by differences in usual intake.
The last step of the PICO/PECO process, the O component, involves the listing and ranking of outcomes of interest. Prioritizing the many health outcomes that can be influenced by exposure to a nutrient often compels difficult decision making and may require an iterative review of the population data available for different outcomes. Yet, it is a necessary process for managing the scope, time, and expense of a systematic review.
As stated in Chapter 2 (see Figure 2-2), proximal biomarkers of the outcome of interest may be useful in assessing and prioritizing health outcomes. Red blood cell folate levels, for example, can be an important biomarker of folic acid status, as can rates of neural tube defects and stillbirth in a population. When conducting this last step of the PICO/PECO process, separating reviews of observational studies from RCTs is generally necessary because RCTs tend to have shorter exposure windows and follow-up times, which may be more relevant to proximal outcomes than to some clinical outcomes. Clinical outcomes can require longer exposure times to see an effect. Additionally, RCTs may have more stringent criteria for assessing intake.
Additionally and importantly, a given outcome may not be relevant to all members of a population. In considering the relationship between sodium intake and chronic disease, for example, blood pressure is a surrogate marker of interest for both adults and children. PICO/PECO tables should be created separately for different age groups and sexes, in the population or subgroup of interest. PICO/PECO elements should be carefully considered and assessed when adapting existing NRVs to a different context or population subgroup.
Assess Uncertainty in the Evidence: Appraisal of Risk of Bias in Individual Studies
Evaluation of risk of bias is an inherent step in the systematic review process. As stated in Chapter 2, an evaluation of risk of bias requires assessment of both the internal and external validity of each individual study (Higgins et al., 2011). Bias occurs when systematic flaws or limitations in the design, conduct, or analysis of a review distort the review results or conclusions (Whiting et al., 2016). Bias can be introduced into a study at multiple levels: when estimating nutrient intakes (referred to as exposure assessment), when identifying outcomes (hazard identification), or when measuring outcomes (hazard characterization of biomarkers of nutrient status or adverse or beneficial health effects). Regardless of where it is introduced, bias can affect the conclusions drawn about the relationship between a nutrient and an outcome, as well as affect the interpretation of the study results. Many sources of bias are study design and outcome dependent and have to be assessed with this principle in mind; they are evaluated by a multidisciplinary team of experts, including both methodologists and domain experts.
Several tools have been developed in recent years to evaluate different types of risk of bias that can affect an individual study. These tools were originally developed for clinical interventions and subsequently adapted to other research areas. The tools comprise checklists of a series of items that have to be evaluated. Exemplar tools are shown in Appendix D.
Traditionally, risk of bias has been addressed using a qualitative approach, that is, by classifying studies into tiers that reflect the level of confidence or certainty in the evidence (e.g., low, medium, or high), and performing sensitivity scenario analysis, which involves rerunning the evidence synthesis under various conditions (e.g., including or excluding studies affected by high risk of bias) and assessing how much the conclusions change accordingly. When these analyses suggest serious bias, then bias-adjusted meta-analysis may be necessary (Dias et al., 2013; Turner et al., 2009).
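A sensitivity scenario analysis of this kind can be sketched with simple fixed-effect (inverse-variance) pooling; the study effects, variances, and risk-of-bias tiers below are hypothetical.

```python
import math

def pooled_estimate(effects, variances):
    """Fixed-effect (inverse-variance) pooled mean effect and its standard error."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return mean, math.sqrt(1.0 / sum(weights))

# Hypothetical study-level effects (e.g., mean change in a status biomarker),
# variances, and risk-of-bias tiers for five studies.
effects   = [0.30, 0.25, 0.35, 0.80, 0.28]
variances = [0.01, 0.02, 0.015, 0.05, 0.012]
rob_tier  = ["low", "low", "medium", "high", "low"]

all_mean, _ = pooled_estimate(effects, variances)
keep = [i for i, tier in enumerate(rob_tier) if tier != "high"]
trim_mean, _ = pooled_estimate([effects[i] for i in keep],
                               [variances[i] for i in keep])
# A material difference between all_mean and trim_mean signals that the
# high risk-of-bias study is driving the pooled conclusion.
```

Rerunning the synthesis under several such inclusion scenarios, and reporting how the pooled estimate shifts, is the quantitative core of the approach described above.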
Bias-adjusted meta-analyses are methods that allow for adjustments of the estimates of the response based on an assessment of the direction of the
bias and a quantitative expression of its magnitude. They rely on the use of expert judgment elicitation and modeling (Dias et al., 2013; Turner et al., 2009) and consist of probability distributions expressing the expected magnitude of the bias. The National Toxicology Program and the National Institute of Environmental Health Sciences’ Office of Health Assessment and Translation encourage this approach but warn that judgment regarding the effect of the bias on the estimate of the response must be based on sound data (National Toxicology Program, 2015).
In many fields, including nutrition, such data are still extremely scarce, and thus bias-adjusted meta-analyses might not be feasible. Although not ideal, a possible alternative in these cases is to exclude studies with a high risk of bias from the analysis. Because of this scarcity, researchers in nutrition are encouraged to collect such information and make it publicly available.
Assessing Uncertainties in the Systematic Review Process
One of the main shortcomings of systematic reviews is the heterogeneity that results from pooling studies that differ in the type of evidence (such as animal versus human data), study design, population subgroup, exposure dose, administration route, or level of compliance. Because such heterogeneity ultimately translates into uncertainty (Hill, 1965), studies should be pooled only if they meet the criteria of a meta-analysis protocol. Additionally, when using the systematic review approach, consideration also must be given to the language used to search the literature, as well as the search strings, the defined eligibility criteria, and the potential for editors to selectively publish papers with positive, rather than negative, results. Any of these elements could introduce a bias in the results of the systematic review.
One approach to mitigating such biases is evidence mapping. Evidence mapping allows for investigation before starting a systematic review, allowing reviewers, for instance, to identify evidence that could be excluded based on language (i.e., considering only a single language) or eligibility criteria, such as restricting the systematic review to a specific study design or analytical method. Evidence mapping also provides, via an iterative process, a means to identify the best search string, language, eligibility criteria, and other factors to consider when selecting studies to include. The results of the evidence map can be presented in a graphical or tabular format showing, for instance, how many citations would be retrieved using other languages in addition to English. Box 3-1 shows an example of an evidence map from the Institute of Medicine (IOM, 2011) report on DRIs for calcium and vitamin D. This evidence map shows that the evidence for adverse health effects of a nutrient, which forms the basis for deriving a UL, relies to a high degree on observational studies in humans.
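As a minimal sketch of the tabular form such a map can take, the snippet below counts hypothetical screening records by study design and language; the records themselves are invented for illustration.

```python
from collections import Counter

# Hypothetical screening records: (study design, language) for candidate citations.
records = [
    ("observational", "English"), ("observational", "English"),
    ("RCT", "English"), ("observational", "Spanish"),
    ("animal", "English"), ("observational", "Chinese"),
    ("RCT", "English"), ("observational", "English"),
]

by_design = Counter(design for design, _ in records)
by_language = Counter(language for _, language in records)
non_english = sum(n for language, n in by_language.items() if language != "English")
# by_design and by_language summarize the map; non_english is the number of
# citations that an English-only eligibility criterion would exclude.
```

Tabulations like these make visible, before the review begins, how much evidence a given eligibility choice would leave out.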
Grading the Strength of the Body of Evidence
While still an exploratory approach, Grading of Recommendations, Assessment, Development and Evaluation (GRADE) can also be used to summarize uncertainties that arise when evidence from heterogeneous sources is integrated to draw conclusions about causal relationships between a nutrient intake and a health outcome (Guyatt et al., 2008; WHO, 2014). GRADE provides a transparent means for evaluating a range of factors that can affect study quality, such as the limitations of each study considered, consistency of findings across studies, elements of directness and external validity, and the likelihood of publication bias. GRADE ranks the strength of the evidence into four categories: high, moderate, low, and very low.
Because of its qualitative nature, GRADE-based judgments do not account for the direction or magnitude of bias in the evidence (i.e., the biases in individual studies). Assessors using GRADE must evaluate whether evidence exists that allows for a probabilistic judgment of the direction and magnitude of bias that could affect the results of the assessment. In such cases, the methods discussed previously for quantitatively adjusting for bias could be applied.
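The basic GRADE logic can be caricatured in a few lines; the real process is a structured expert judgment (and also allows observational evidence to be upgraded under some conditions), so this sketch is illustrative only.

```python
# Simplified GRADE-style rating: evidence from RCTs starts "high," observational
# evidence starts "low," and each serious concern (e.g., inconsistency,
# indirectness, publication bias) moves the rating down one level.
LEVELS = ["very low", "low", "moderate", "high"]

def grade_rating(study_design, concerns):
    """study_design: 'rct' or 'observational'; concerns: list of downgrade reasons."""
    level = LEVELS.index("high") if study_design == "rct" else LEVELS.index("low")
    return LEVELS[max(0, level - len(concerns))]

rating = grade_rating("rct", ["inconsistency", "publication bias"])
```

Even this crude version shows why a body of RCT evidence with serious limitations can end up rated no higher than clean observational evidence.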
Convene an Expert Panel to Evaluate the Evidence
After selecting a nutrient for review and initiating a systematic review, an expert nutrient review panel is convened and given its charge. The review panel may, in some cases, be convened to assist with structuring the systematic review (NASEM, 2018). In Figure 3-1, the relationship between the review panel and the systematic review is indicated by dotted arrows.
Conducting nutrient reviews is constrained by competing policy priorities and limited funding and expertise, but the need for nutritional benchmarks is critical all over the world. This shared need, combined with the substantial effort and expense of deriving NRVs, is a strong justification for international cooperation. Already, neighboring countries with significant trade ties tend to share the work and expense of convening nutrient reference panels (NASEM, 2018). For example, as mentioned above, the United States and Canada collaborate on a single process for deriving a set of NRVs that is used by both countries.
Evaluating Existing Reference Values
The next step, as illustrated by the flow diagram in Figure 3-1, is to decide whether an existing NRV will be updated or adapted, or whether a new NRV will be established. Either way, regardless of the convening organization, there are certain key steps that should be implemented consistently across countries.
Option: Keep and Update or Adapt an Existing Nutrient Reference Value
Box 3-2 outlines the tasks that must be completed when updating or adapting existing average requirements (ARs) and upper limits (ULs) to a local context where unique physiological and environmental issues, such as a high prevalence of infection, may be contributing to population-wide nutritional problems. The challenge of extrapolation is discussed later in this chapter.
Option: Establish a New Nutrient Reference Value
Precision and clarity are extremely important in the early stages of the process of deriving NRVs because the parameters of the question dictate the breadth and depth of the work. A transparent, rigorous, and consistent process will facilitate the task of establishing a new reference value. Box 3-3 summarizes the general steps in the process for deriving new NRVs.
Establishing a New NRV: Define the Approach to Take
If it is decided that new NRVs will be established, the next step for the AR is to define the approach to take: dose–response modeling, the factorial
approach, or balance studies. Each of these approaches is discussed below. ULs, which are determined through risk assessment, are addressed later in the chapter.
The different purposes of a dose–response assessment were described in Chapter 2: preventing deficiency disease, assessing biomarkers for health or risk of disease, and determining a safe upper level of intake. Below, the methodology of dose–response assessment is described, including options for managing uncertainty.
Generally, dose–response modeling is used when there is a clear relationship between the intake of a nutrient and a metabolic or functional outcome. However, because functional tests tend to be less responsive than biochemical tests to small changes in nutrient status, they are rarely used, except for folate, to determine the AR for a population.
Relationships between nutrient intake and health outcomes are complex and difficult to estimate accurately and precisely, even when an advanced model or suite of models, rather than a single model, is used to estimate the quantitative relationship between nutrient intake and a response. Typically, several confounders or modifiers have the capacity to influence a given relationship between a nutrient and a health outcome. For example, sun exposure is known to influence the level of serum 25(OH)D and can therefore modify the observed effect of vitamin D intake at different intake levels. One way to address this challenge is to incorporate the potentially confounding factors into the model, adjusting the estimate of the parameters and the expected intake–response using a multivariate meta-regression approach (van Houwelingen et al., 2002). Yetley et al. (2017) described this challenge to using dose–response modeling to set NRVs, especially when dealing with chronic disease endpoints.
When multiple intake–response studies have been identified from a systematic review, several different modeling approaches are possible. One of these is multivariate dose–response meta-analysis (Crippa and Orsini, 2016), which is not only a valid methodological solution for integrating data from different sources in intake–response modeling but also adjusts for confounding factors that are assumed to influence the response and/or exposure. Orsini et al. (2012) illustrate several applications of this method to observational and experimental data, for linear and nonlinear models, and for either continuous or quantal response variables.
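As a toy version of pooling arm-level data into a single intake–response curve, the sketch below fits a weighted straight line by hand; a real multivariate dose–response meta-analysis would additionally model within-study covariance and nonlinearity. All values are hypothetical.

```python
def wls_fit(x, y, w):
    """Weighted least squares for y = a + b*x, with weights w (inverse variances)."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    return ybar - b * xbar, b  # intercept, slope

# Hypothetical arm-level data pooled from several studies:
intake   = [2.0, 4.0, 6.0, 8.0, 10.0]  # nutrient intake, mg/day
response = [0.5, 0.9, 1.3, 1.7, 2.1]   # biomarker response
weights  = [10, 8, 12, 9, 11]          # inverse-variance weights
intercept, slope = wls_fit(intake, response, weights)
```

Weighting arms by the inverse of their variance is what lets larger, more precise studies dominate the fitted intake–response relationship.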
Uncertainty in the dose/intake response relationship As emphasized throughout this chapter, expressing uncertainty is a key step in the process
of deriving NRVs. In assessing dose–response relationships, statistical uncertainties may stem from any number of different sources:
- The type of model used, such as linear versus nonlinear or Hill versus exponential to fit a continuous outcome;
- Normality assumptions of the model for either raw or transformed data;
- Sampling uncertainty, which is traditionally expressed using confidence/credible intervals for the parameter estimates;
- Identification of the response level associated with a risk–benefit outcome, such as the level of total calcium intake that leads to hypercalcemia or body weight development in infants and children corresponding to retarded growth; and
- Correlation among arms from the same study or among related outcomes, such as the same organ/physiological function.
A number of methods are available to address uncertainties in the intake–response approach and are further described in NASEM (2017) and Yetley et al. (2017). These include assessing the effect of heterogeneity using sensitivity scenario analysis, considering use of moderator variables to adjust intake–response relationships, average models, and estimating confidence intervals for reference values.
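One of these ideas, estimating a confidence interval for a reference value, can be sketched with a percentile bootstrap; here the AR is caricatured as the median of hypothetical individual requirements, whereas a real analysis would resample studies or model parameters.

```python
import random
import statistics

random.seed(1)  # fixed seed so the toy example is reproducible

# Hypothetical individual requirements (mg/day) from which an AR is estimated:
requirements = [7.2, 8.1, 6.9, 7.8, 8.4, 7.5, 7.0, 8.0, 7.6, 7.3]

boot_medians = sorted(
    statistics.median(random.choices(requirements, k=len(requirements)))
    for _ in range(2000)
)
ci_low, ci_high = boot_medians[49], boot_medians[1949]  # ~95% percentile interval
ar_estimate = statistics.median(requirements)
```

Reporting the interval alongside the point estimate makes the sampling uncertainty in the derived value explicit rather than leaving it implicit in expert judgment.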
The Factorial Approach
As explained in Chapter 2, the factorial approach is useful when biochemical indicators are insensitive to nutrient intake. This approach is generally used to set ARs for minerals because minerals cannot be metabolized and, therefore, losses of minerals via the urine, feces, skin, menstrual blood, or semen are in the same inorganic form as in the diet. With the factorial approach, quantification of losses is used to estimate the physiological requirement of a nutrient to ensure that basic physiological needs are met and that the level of intake will support growth and development from pregnancy through infancy and childhood. The process of quantifying nutrient losses is facilitated by the use of stable isotopes to calculate nutrient kinetics and to measure true absorption and retention rates.
As also stated in Chapter 2, the efficiency of mineral absorption (bioavailability) is one of several factors that needs to be considered when quantifying nutrient losses from a typical diet. For example, if total body losses of a given mineral total 5 mg per day, but only 20 percent of the dietary intake of that mineral is typically absorbed, then 25 mg would need to be consumed daily to replace total endogenous losses. The IOM (2001, pp. 69–70) defined bioavailability as a nutrient’s “accessibility to
normal metabolic and physiologic processes.” As described in that report, “Bioavailability influences a nutrient’s beneficial effects at physiologic levels of intake and also may affect the nature and severity of toxicity due to excessive intakes.”
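The arithmetic in that example generalizes directly: required intake is endogenous loss divided by the absorption fraction. A minimal sketch:

```python
def required_intake_mg(endogenous_losses_mg, absorption_fraction):
    """Daily intake needed to replace endogenous losses, given fractional absorption."""
    if not 0 < absorption_fraction <= 1:
        raise ValueError("absorption fraction must be in (0, 1]")
    return endogenous_losses_mg / absorption_fraction

# The example from the text: 5 mg/day lost, 20 percent absorbed -> 25 mg/day needed.
intake = required_intake_mg(5.0, 0.20)
```

Because required intake scales with the reciprocal of bioavailability, even modest differences in absorption between diets translate into large differences in the AR.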
A number of factors can affect nutrient bioavailability. These include nutrient concentration, dietary substances that can enhance or impair absorption, the chemical form of the nutrient, supplements, nutrition and health of the individual, excretory losses, and nutrient–nutrient interactions. In addition, pathophysiologic changes can affect bioavailability; for example, vitamin B12 absorption is reduced when secretion of gastric intrinsic factor is reduced or impaired.
As discussed in detail in Chapter 4, a dietary factor that affects nutrient bioavailability that is of particular concern for low- and middle-income countries is phytate, which progressively decreases the bioavailability of zinc and other minerals such as iron and calcium. Bioavailability of nutrients, particularly zinc and iron, is a concern globally because of its potential effect on nutritional status across populations.
Another approach useful for estimating mineral requirements, and the one generally used to estimate protein requirements, is the balance study. When the amount of a dietary mineral or dietary nitrogen balances (or equals) the amount lost in the feces, urine, and integument, it is assumed that the body is saturated, that all needs have been met, and that a higher intake of the nutrient would not have a beneficial effect. Protein ARs are estimated from nitrogen balance studies by identifying the nitrogen intake that just replaces the amount lost; this level is considered to be the nitrogen requirement and is converted to a protein requirement on the basis that the average nitrogen content of protein is 16 percent.
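The conversion from a nitrogen requirement to a protein requirement follows from that 16 percent figure (100/16 = 6.25); the sketch below uses a hypothetical balance point of 8 g nitrogen per day.

```python
NITROGEN_TO_PROTEIN = 100 / 16  # protein is, on average, 16 percent nitrogen by weight

def protein_requirement_g(nitrogen_balance_intake_g):
    """Protein requirement implied by the nitrogen intake that just replaces losses."""
    return nitrogen_balance_intake_g * NITROGEN_TO_PROTEIN

# Hypothetical: nitrogen balance is achieved at an intake of 8 g nitrogen/day,
# implying a protein requirement of 50 g/day.
protein_g = protein_requirement_g(8.0)
```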
Combining Approaches with Other Data to Reduce Chronic Disease Risk
Observational epidemiological studies can be used to determine whether usual intakes are in agreement with values obtained through any of these three approaches. This is useful when setting intake recommendations that support nutritional functions while also reducing the risk of chronic disease, and can also be useful for deriving a folate level that reduces the risk of neural tube defects. Usually, prospective cohort studies and, to a lesser degree, other observational studies, are combined with dose–response or factorial data.
While the importance of considering the role of nutrient intakes in reducing chronic disease endpoints was recognized when the DRI framework was first developed (IOM, 1994), data have been insufficient to set an AR; to date, only AIs have been derived, and only for a few nutrients (and disease endpoints): calcium (fracture rates), vitamin D (fracture rates), fluoride (dental caries), potassium (hypertension, kidney stones), and fiber (coronary heart disease) (Russell, 2010).
Establishing a New Nutrient Reference Value: Appraise the Evidence
The next step in deriving a new NRV, as illustrated in the flow diagram in Figure 3-1, is to synthesize and integrate the evidence while accounting for sources of uncertainty. Identifying the various sources of uncertainty, and developing and applying methods to manage them, increases the credibility of assessment results and facilitates decision making for public health managers. Identification of uncertainties can also stimulate research to fill knowledge gaps and support the development of models for the quantitative assessment of evidence.
Uncertainty in any assessment arises from two main sources: (1) limitations in the data used as evidence, including data scarcity, and (2) shortcomings in the methodology and processes applied to derive NRVs. Uncertainty caused by limitations in data was discussed earlier in this chapter in the section titled “Assess Uncertainty in the Evidence: Appraisal of Risk of Bias in Individual Studies.” Uncertainties caused by shortcomings in the methodology and processes to derive NRVs are described throughout this chapter. A third consideration is variability between individuals, as well as within individuals; although variability is distinct from uncertainty, it can be a major contributor to overall uncertainty when its magnitude and distribution are unknown. It is important to address variability in individual susceptibility to a nutrient’s effect due to age, sex, developmental stage, environment, disease, or genetic heterogeneity (i.e., the distribution in individual susceptibility around a median value of exposure) separately from uncertainty caused by limitations in data.
Possible strategies for addressing uncertainty vary depending on the step of the assessment process and on whether the objective of the analysis is to establish a causal relationship between a nutrient and an outcome (hazard identification) or to characterize that relationship (hazard characterization). Traditionally, in the assessment of risk related to nutrient intakes, uncertainty owing to a lack of data or a lack of knowledge of variability is addressed with uncertainty factors used to derive the UL from a no observed adverse effect level (NOAEL) or lowest observed adverse effect level (LOAEL). These uncertainty factors are discrete values derived in a toxicological context and are frequently based on very limited evidence. A probabilistic alternative, which expresses the uncertainty factor as a probability distribution grounded in physiological considerations, is more realistic and less conservative.
Generally, interindividual variability is greater in magnitude than intraindividual variability and consequently has a greater influence on the derivation of nutrient reference values. It must be taken into account through mathematical modeling, drawing on, among other inputs, justified expert assumptions about the likely shape of the distribution.
Considerations around interindividual variability with important implications for deriving NRVs include the fact that the distribution of biological endpoints is rarely normal; therefore, using the traditional approach of adding two standard deviations to the mean value of requirement is not always the most appropriate. In addition, the magnitude of the variability is frequently unknown, which is the case when NRVs are derived using a systematic review and the data are available only in an aggregated form. In particular, most of the variability in susceptibility to adverse health effects, which is one of the main components of interindividual variability, is unknown, because experimental trials with different doses in humans to identify potential thresholds for the occurrence of adverse effects are unethical. In a systematic review, a way to estimate the interindividual variability is to compute a weighted average of the sampling standard deviations of each study in the review. Alternatively, a value for the interindividual variability (e.g., expressed as a standard deviation) of between 10 and 15 percent of the AR could be used (Cashman and Kiely, 2013).
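The two estimation routes described above can be sketched in a few lines. Weighting the per-study standard deviations by sample size is an illustrative assumption on our part; other weighting schemes could be used:

```python
def pooled_interindividual_sd(study_sds, study_ns):
    """Approximate interindividual variability from aggregated data in a
    systematic review as a weighted average of per-study sampling SDs.
    Weighting by sample size is an illustrative choice, not a prescribed one."""
    total_n = sum(study_ns)
    return sum(sd * n for sd, n in zip(study_sds, study_ns)) / total_n

def sd_from_ar(average_requirement, fraction=0.125):
    """Fallback when SDs are unavailable: take the SD as a fixed fraction
    of the AR, between 0.10 and 0.15 (Cashman and Kiely, 2013)."""
    assert 0.10 <= fraction <= 0.15
    return average_requirement * fraction

print(pooled_interindividual_sd([2.0, 3.0], [40, 60]))  # 2.6
print(sd_from_ar(10.0))                                 # 1.25
```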
When establishing a recommended intake (RI), the distribution of a nutrient requirement among individuals within a population serves as the reference. The 50th percentile of the population distribution (AR), plus two standard deviations (or a percentage of the average requirement judged to be appropriate) assures, under a normal distribution, that the recommended intake is adequate for 97.5 percent of the population. If requirements are not normally distributed, then adding two standard deviations to the AR will not cover 97.5 percent of the population. Other methodologies have been used to address the problem (EFSA, 2010a); however, it remains a challenge.
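The coverage claimed for the “AR plus two standard deviations” rule can be checked numerically under the normality assumption; the AR and SD below are arbitrary illustrative values, not report figures:

```python
from statistics import NormalDist

ar = 100.0   # hypothetical average requirement (arbitrary units)
sd = 15.0    # hypothetical interindividual SD of the requirement

ri = ar + 2 * sd   # conventional recommended intake
# Share of a normally distributed population whose requirement is met at RI:
coverage = NormalDist(mu=ar, sigma=sd).cdf(ri)
print(round(coverage, 4))  # 0.9772
```

If the requirement distribution is skewed, this same rule no longer yields 97.5 percent coverage, which is why direct percentile estimation is needed in that case.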
Uncertainty affecting the estimate of an NRV can be expressed in several ways. It can be expressed as an uncertainty factor that incorporates all identified limitations in the data and possible shortcomings in the methodology. This is typically how it is expressed for the UL in order to maintain
a “margin of safety” and especially when extrapolation is used from one species to another or from certain experimental/observational conditions to others. Or, uncertainty can be expressed as the range of possible values for a parameter, either with or without explicit expression of the coverage of the true value. For instance, in the case of sodium and blood pressure, a UL might be expressed as the range of intake least likely to lead to blood pressure elevation in subjects with normal blood pressure. Another way to express uncertainty is as a full probability distribution of the parameter values. Finally, one could provide a qualitative description of the plausibility of a set of values for the parameter. However, more challenging quantitative methods to characterize and express uncertainty are considered to be better approaches as they allow the use of mathematical tools to combine diverse sources of uncertainty and avoid ambiguity in the interpretation of the conclusions (EFSA Scientific Committee et al., 2018).
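One way to make the “full probability distribution” option concrete is a small Monte Carlo sketch. The lognormal distributions, their parameters, and the NOAEL value below are purely illustrative assumptions, not figures from any assessment:

```python
import random

random.seed(1)
noael = 50.0  # hypothetical no-observed-adverse-effect level (arbitrary units)

# Express two uncertainty factors as probability distributions rather than
# discrete values, and propagate them to a distribution of candidate ULs.
candidate_uls = []
for _ in range(10_000):
    uf_extrapolation = random.lognormvariate(0.7, 0.3)  # e.g., study-to-population
    uf_variability = random.lognormvariate(0.7, 0.3)    # e.g., interindividual
    candidate_uls.append(noael / (uf_extrapolation * uf_variability))

candidate_uls.sort()
median_ul = candidate_uls[len(candidate_uls) // 2]
lower_5th = candidate_uls[int(0.05 * len(candidate_uls))]  # conservative choice
```

The resulting distribution, rather than a single discrete value, can then be reported, and a percentile chosen to reflect the desired margin of safety.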
Evaluating Usual Intake Levels
After describing the existing evidence base, when updating or adapting an existing NRV, or appraising the evidence, when establishing a new NRV, the next step in developing an NRV is to evaluate usual intakes. From one day to the next, individuals eat different foods in different amounts and consequently their intakes vary over time. Intakes on a single day usually greatly overestimate the usual intake of some nutrients and underestimate that of others. The recommended method of adjusting for this problem is to measure at least 2 nonconsecutive days of intake on a representative subset of each population (at least 35 people) (Tooze et al., 2006). The ratio of the within-person day-to-day variance to the variance between persons (i.e., intraindividual to interindividual variance) is then used to adjust the distribution of intakes statistically such that the lower and upper tails of the distribution move toward the median. Methods for making this adjustment have been described (Carriquiry, 1999; Nusser et al., 1996), and software is available for this purpose.
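A stylized version of this variance-ratio adjustment is sketched below. The published methods (Nusser et al., 1996) operate on transformed intakes and are implemented in dedicated software; this sketch conveys only the core idea of pulling the tails toward the center:

```python
import statistics as stats

def shrink_toward_mean(person_means, within_person_var, days_per_person):
    """Pull each person's observed mean intake toward the group mean so the
    adjusted distribution reflects (approximately) between-person variance
    only, moving the lower and upper tails toward the median."""
    grand_mean = stats.mean(person_means)
    observed_var = stats.variance(person_means)  # between + within/days
    between_var = max(observed_var - within_person_var / days_per_person, 0.0)
    factor = (between_var / observed_var) ** 0.5 if observed_var > 0 else 0.0
    return [grand_mean + (m - grand_mean) * factor for m in person_means]

# Hypothetical 2-day means (mg/day) with an assumed within-person variance of 150:
print(shrink_toward_mean([50.0, 60.0, 70.0], 150.0, 2))  # [55.0, 60.0, 65.0]
```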
In the event that only 1 day of intake data per person has been collected, intraindividual variance estimates can be calculated from other data sets in the region, or taken from the software that enables the adjustment of the intake distributions (Iowa State University, 2001). An example of this approach has been used to assess adequacy of nutrient intakes from a Mexican national survey (Sanchez-Pimienta et al., 2016).
Assess Needs of Populations in Consideration of Local Context
A final step in the process of deriving an NRV is to assess the local context. When reference values that are set in one country or region, based
on data from a well-characterized population (e.g., with respect to body size, lifestyle, and environment), are adopted by another country or region, comparability of the characteristics of the two populations must be taken into consideration. If the two populations differ significantly, adjustments must be made to account for differences.
In addition to body size, lifestyle, and other environmental characteristics, genetic factors can, in some populations, influence dose–response relationships and, therefore, the magnitude of the reference values. For example, being homozygous for methylene tetrahydrofolate reductase (MTHFR) deficiency changes the relationship between folate and homocysteine concentration in plasma and, presumably, the probability of any associated health outcome as well, such as cardiovascular disease (Stover, 2007). Thus, if the percentage of homozygous individuals differs significantly between populations, the estimated requirement for folate within these populations will differ. The greater challenge in this particular situation, however, is the correct characterization of a reference population, not the extrapolation of reference values.
When data are insufficient, the AR or adequate intake (AI) for infants or children must be extrapolated or interpolated from experimental data that came from adults. Nutrient recommendations for infants and children may be extrapolated from adult standards by either scaling down the requirements based on body weight or by using a metabolic factor (i.e., body weight to the 0.75 power). These estimates are then adjusted for growth or tissue deposition needs. As a final step, the estimated ARs should be reviewed across age groups to ensure that there are no abrupt, inappropriate increases or decreases between age groups. Extrapolation of data from outside an observed range may be useful for forming new reference values when dose–response data, intake data, biomarkers of function, or health outcomes are missing for a population subgroup. A composite of scaling methods to extrapolate from reference values of one age group to another is provided in Appendix E.
One method for extrapolation assumes that the physiologic requirement or the upper intake level is proportional to body mass; for example, adult requirements are often adjusted downward for children. When extrapolating downward, not only from adults to children, but from any other group with a higher absolute requirement (group 1 in the formula below) to one with a lower absolute requirement (group 2 in the formula), the general formula is:
Average Value2 = Average Value1 × (scaling factor).
When the need for a nutrient is proportional to body mass, the scaling factor is isometric: it is calculated simply as the quotient of the reference body weights, so the extrapolation is made on the basis of actual body weight. Isometric scaling is appropriate when the nutrient in question is distributed homogeneously across the whole body, or when it is concentrated in one or more specific tissues or organs, but only if the proportional relationship to body mass is preserved as body size changes.
For some nutrients, the proportional relationship between a nutrient and body mass is not preserved as body size changes. In such cases, allometric scaling is necessary. With allometric scaling, adjustments for body size are based on body mass modified by an exponent, typically 0.75 (Rucker, 2007). The 0.75 exponent is thought to account better for metabolically active tissue, the percentage of which is higher in infants (and possibly around puberty) than in adults. An example of allometric extrapolation is that a child weighing 22 kg would require 42 percent of what an adult weighing 70 kg would require; this is a higher percentage than what an isometric calculation would yield (32 percent) (IOM, 1998).
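The two scaling options reduce to a single line of arithmetic; the 22-kg child and 70-kg adult below are the example weights cited from IOM (1998):

```python
def scaling_factor(child_weight_kg, adult_weight_kg, exponent=1.0):
    """Downward scaling factor for extrapolating a reference value.
    exponent=1.0: isometric (proportional to body mass);
    exponent=0.75: allometric (metabolic body size)."""
    return (child_weight_kg / adult_weight_kg) ** exponent

iso = scaling_factor(22, 70)         # isometric result, roughly 31-32 percent
allo = scaling_factor(22, 70, 0.75)  # ~0.42, the IOM (1998) allometric example
```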
When extrapolating downward for children, in addition to the scaling factor, a growth factor can also be included to account for the additional nutritional demands of growth. The growth factor is calculated as the proportional increase in estimated additional protein requirements for growth relative to the maintenance requirement at the different ages (FAO/WHO, 1985). Growth factors used in EFSA’s extrapolations are shown in Appendix F; these were calculated on the assumption that nutrient requirements increase in proportion to the protein growth increment (West et al., 1997). When a growth factor is included in the extrapolation, the general formula is:
Average Valuechild = Average Valueadult × (scaling factor) × (1 + growth factor).
The average value for each age group corresponds to the mean of values for the included years (EFSA, 2014).
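Putting the scaling factor and the growth factor together, the full downward extrapolation can be sketched as follows; the adult AR, body weights, and growth factor below are hypothetical placeholders:

```python
def extrapolate_child_value(adult_value, child_wt_kg, adult_wt_kg,
                            exponent=0.75, growth_factor=0.0):
    """Average Value(child) = Average Value(adult) x scaling x (1 + growth).

    growth_factor is age-specific (EFSA derives it from the protein growth
    increment; see Appendix F); 0.0 disables the growth adjustment."""
    scaling = (child_wt_kg / adult_wt_kg) ** exponent
    return adult_value * scaling * (1.0 + growth_factor)

# Hypothetical adult AR of 100 units, 22-kg child, growth factor of 0.15:
child_ar = extrapolate_child_value(100.0, 22.0, 70.0, growth_factor=0.15)
print(round(child_ar, 1))  # ~48.3
```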
Findings, Conclusions, and Recommendations
The flow diagram in Figure 3-1 is based on availability of several new tools and methodologies that were either not available or not used in previous nutrient reviews. These include the process of systematic review, which
provides new data on nutrient requirements that facilitate decision making about whether a requirement should be updated or adapted or a new requirement is needed; application of the factorial approach, which takes into account bioavailability when determining an NRV from a typical diet; and new data sources that can be useful for assessing local factors that affect the requirement for the population under consideration, such as infection.
The committee came to the following conclusion:
The discrepancies between nutrient needs to support growth and development in infants and young children and the derivation of an AR or AI from extrapolation have posed major challenges to determining the composition of nutrient supplements and fortified products for this age group.
Recommendation 3. Expert groups should assess relevant evidence and, as needed, analyze existing or new data to assess the characteristics of various diets that can affect the bioavailability of specific nutrients. For example, iron bioavailability models should be validated using European data for factorial estimates, using iron intake and serum ferritin (adjusted where necessary) data from low- and middle-income countries.
Recommendation 4. When deriving nutrient reference values, countries or regions should look at existing values derived by expert panels and determine whether to accept, update, or adapt them to their context, if possible. If values are not relevant locally, an expert panel should adapt values to the local context or modify existing values from other experts.
In addition to the different types of uncertainty discussed earlier in this chapter, another type of uncertainty in estimating NRVs arises from the use of competing risk–benefit analyses when evaluating ULs based on chronic disease risk-related outcomes versus toxicity-related outcomes. The discussion below summarizes the differences between these competing risk–benefit analyses and methods for managing this type of uncertainty.
Competing Risk–Benefit Analyses
In contrast to nonessential food components, for which the process of risk assessment1 was originally developed, essential nutrients satisfy a physiological function. An evaluation of the effects of exposure to a nutrient includes beneficial or physiological effects in addition to toxicological effects. In the risk assessment process used to establish DRIs, for example, DRIs are set to avoid both the risk of deficiency (the AR) and risk of adverse effects from excess intake (the UL) (IOM, 2000). The assessment of a nutrient can, therefore, be done as a risk–benefit analysis (EFSA, 2010b), which means that similar hazard and benefit assessment steps are performed in parallel with the totality of evidence being assessed at the end (see Figure 3-2).
Modeling the prevention of deficiency as part of the derivation of an AR is a relatively more direct process than modeling the prevention of chronic disease or avoidance of long-term toxicity. Figure 3-3 shows that with increasing intake of a nutrient, the risk of chronic disease can either decrease abruptly (disease A, blue line), decrease continuously (disease C, green line), or increase (disease B, black line).
In the case of an abrupt increase in risk with increasing intake, the increase can be
1 Risk assessment is a process of (1) identification of risk of toxicity, (2) dose–response assessment, (3) assessment of intakes outside the reference values, and (4) characterization of risks associated with excess intake.
considered an adverse health effect and may be used, with a high degree of uncertainty, to define a UL related to chronic disease risk despite the inherent individual variability in susceptibility (NASEM, 2018). Notably, any change in risk of a chronic disease with increasing intake is relative to a baseline risk. In contrast, toxicity from intake of a nutrient arises when intake surpasses a threshold value or range of values and results from intakes that are outside homeostatic mechanisms.
Methods for Managing Uncertainty
It is important that NRV estimates clearly reflect any uncertainty encountered in their derivation. While, for practical purposes, point estimates of NRVs are preferred, interval estimates are better suited to expressing limitations in knowledge and shortcomings of the derivation methods. This conflict becomes even more evident when
prevention of chronic diseases is taken into consideration concurrently with the avoidance of adverse effects.
Based on its review of the strengths and weaknesses in available methodologies, as detailed throughout this chapter, the committee developed a new framework for harmonizing the process for deriving NRVs. The framework is shown in Figure 3-4. A harmonized process involves four major steps:
- Choose the appropriate tools and data resources—As shown on the left side of the framework, three primary tools are needed to develop NRVs: systematic reviews, comprehensive databases, and information about relevant local and regional factors (local context) that can influence NRVs.
- Collect data from those tools—Data that are essential for understanding and selecting biomarkers of status, including for pregnant and lactating women and young children; health outcomes; the influence of dietary factors on absorption; and the effects of factors such as infection and diarrhea that can affect nutrient requirements.
- Identify the best approach for the nutrient under consideration—There are several options for approaches that can be used to determine key reference values. Dose–response modeling is used to set an AR or UL when there is a clear relationship between the intake of a nutrient and a metabolic or functional outcome. The factorial approach is generally used for minerals because losses of minerals via the urine, feces, skin, menstrual blood, or semen are in the same organic form as in the diet.
- Derive the two key reference values, the AR and the UL—These are the core values from which other NRVs are derived. RIs for various populations within a region, country, or globally may also be developed, but they are derived from the AR.
These major steps represent key components of the flow diagram in Figure 3-1. Applications and uses of NRVs, which were included in the King and Garza (2007) framework, are not included in this framework. However, they are discussed at the end of Chapter 4.
Notably, the balance method, while mentioned at various places in this report, is not included in the committee’s framework. The focus of this report is on zinc, iron, and folate, nutrients of concern for young children and women of reproductive age, and the balance method is not appropriate for those nutrients (see Chapter 4).
The following discussion summarizes the committee’s findings and conclusion related to the overall process of deriving NRVs, including how to manage uncertainties that can arise at various points during the process, and provides a recommendation based on these findings and conclusions.
Findings on the Process of Setting Nutrient Reference Values
General principles for setting NRVs are summarized in Box 3-4. As noted in Box 3-4, and as emphasized earlier in this chapter, it is critical that all steps in the decision-making process be documented and transparent.
Findings for Uncertainties in the Nutrient Review Process
To maintain the credibility of the nutrient review process, it is essential that the numerous sources of uncertainty be taken into consideration. As discussed throughout this chapter, these sources of uncertainty include
- data limitations (e.g., interindividual variability in a study population);
- flaws in study design;
- risk of bias;
- extrapolation from one population subgroup to a different, unrelated population subgroup;
- heterogeneity in the evidence (e.g., biological and methodological heterogeneity across pooled studies);
- parameters of the search strategy for a systematic review;
- confounders or modifiers affecting intake–response in determining a causal relationship; and
- competing analyses of different outcomes, such as deficiency versus chronic disease.
The committee came to the following conclusion:
Identifying a strategy for managing uncertainties is critical to maintaining the accuracy and relevance of all NRVs as well as assuring
the credibility of the evidence analysis. A thorough understanding of the uncertainties that affect the nutrient review process underpins the process used by decision makers and public health officials in determining nutrition policy. Additionally, understanding the type of uncertainties that can influence a nutrient review is key to identifying research gaps and developing quantitative evidence assessment models.
Core Values of the Nutrient Reference Value Review Process to Support Harmonization
The committee identified the following characteristics of the NRV review process as being critical to supporting harmonization of the approaches to setting NRVs globally. NRVs must be:
- Regularly updated—Given the rapid pace of the generation of knowledge and data upon which NRVs are based, it is important to maintain the currency of information on nutrient outcomes as well as the nutritional profile of populations, particularly as supplement use increases and fortification programs are established.
- Clear and transparent—Confidence in the systematic reviews that lead to the establishment of reference values is necessary to ensure their use by policy makers and researchers alike. Such credibility requires the establishment of a transparent process including how members of the review panels are selected and the training and expertise of each; public availability of all material reviewed by the committee during its deliberations; and written protocols for systematic reviews, quality assessment of each study, assumptions made, and evidence synthesis leading up to the established reference values.
- Rigorous and relevant—Consistent methods need to be established and applied across nutrients, for various values (AR, RI, AI, and UL), and for various categories of age, sex, and life stage (e.g., pregnancy). A uniform approach for conducting systematic reviews and establishing values for nutrients where there is limited evidence is also needed. Given the changing epidemiologic and nutritional status across populations, values need to be relevant to contexts where chronic disease is rising, in addition to being relevant to the prevention of deficiency.
- Documented—At each stage in the process, all considerations or adjustments that influence the potential NRV must be documented, as well as the methodologies used and the assumptions behind them.
- Based on a determination of the strength of the evidence—Limitations in the data and methods are assessed and documented, and uncertainties are taken into account.
- Complete and efficient—Given the cost, time, and expertise required to undertake the development or revision of existing reference values even in high-income countries, low- and middle-income countries should consider using existing NRVs already developed by another international partner if they are deemed adequate.
Recommendation 5. After having adapted or created new nutrient reference values (NRVs), to achieve transparency the nutrient review expert panel should clearly report the reference population, adjustment factors, and the methodology used. Expert panels should also document the uncertainty in the evidence and in the methods used to develop the NRVs quantitatively. If this is not possible, then they should provide a qualitative evaluation of the confidence in the body of evidence and in the methods used.
In summary, the committee’s examination of the steps used to develop NRVs identified several new tools in the process that enhance the transparency, efficiency, and scientific rigor of the approach. These tools have either not been available or not used in past nutrient reviews. They are (1) systematic review, (2) larger and more accessible databases, and (3) information on factors affecting the culturally and context-specific food choices and dietary patterns among diverse populations. Finally, based on the availability of these new tools, the committee proposes a framework for a process to harmonize the approach used to derive NRVs. Chapter 4 examines the feasibility of a harmonized approach, based on this proposed framework, applying it to three exemplar nutrients.
References
ASEAN (Association of Southeast Asian Nations). 2016. ASEAN works towards ensuring good nutrition for all children in the region. ASEAN Secretariat News, March 28.
ASEAN. 2017. ASEAN to address malnutrition and all its forms. ASEAN Secretariat News, February 22.
Atkinson, S. A. 2011. Defining the process of Dietary Reference Intakes: Framework for the United States and Canada. American Journal of Clinical Nutrition 94(2):655S-657S.
Brouwers, M. C., M. E. Kho, G. P. Browman, J. S. Burgers, F. Cluzeau, G. Feder, B. Fervers, I. D. Graham, J. Grimshaw, S. E. Hanna, P. Littlejohns, J. Makarski, L. Zitzelsberger, and the ANS Consortium. 2010. AGREE II: Advancing guideline development, reporting and evaluation in health care. Canadian Medical Association Journal 182(18):E839-E842.
Carriquiry, A. L. 1999. Assessing the prevalence of nutrient inadequacy. Public Health Nutrition 2(1):23-33.
Cashman, K. D., and M. Kiely. 2013. EURRECA-estimating vitamin D requirements for deriving dietary reference values. Critical Reviews in Food Science and Nutrition 53:1097-1109.
Crippa, A., and N. Orsini. 2016. Multivariate dose-response meta-analysis: The dosresmeta R package. Journal of Statistical Software 72(Code Snippet 1):15.
Dias, S., A. J. Sutton, N. J. Welton, and A. E. Ades. 2013. Evidence synthesis for decision making 3: Heterogeneity—subgroups, meta-regression, bias, and bias-adjustment. Medical Decision Making 33(5):618-640.
EFSA (European Food Safety Authority). 2010a. Scientific opinion on principles for deriving and applying dietary reference values. EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA). EFSA Journal 8(3):1458.
EFSA. 2010b. Guidance on human health risk-benefit assessment of foods. EFSA Journal 8(7):1673.
EFSA. 2014. Essential composition of infant and follow-on formulae. EFSA Journal 12(7):3760.
EFSA. 2017. EFSA Journal: Special Edition on dietary reference values. http://www.efsa.europa.eu/en/press/news/171211 (accessed February 2, 2018).
EFSA Scientific Committee, D. Benford, T. Halldorsson, M. J. Jeger, H. K. Knutsen, S. More, H. Naegeli, H. Noteborn, C. Ockleford, A. Ricci, G. Rychen, J. R. Schlatter, V. Silano, R. Solecki, D. Turck, M. Younes, P. Craig, A. Hart, N. Von Goetz, K. Koutsoumanis, A. Mortensen, B. Ossendorp, L. Martino, C. Merten, O. Mosbach-Schulz, and A. Hardy. 2018. Guidance on uncertainty analysis in scientific assessments. EFSA Journal 16(1):5123-5162.
FAO (Food and Agriculture Organization). 2018. What we do. http://www.fao.org/about/what-we-do/en (accessed February 2, 2018).
FAO/WHO (Food and Agriculture Organization and World Health Organization). 1985. Energy and protein requirements. Report of a joint FAO/WHO/UNU expert consultation. World Health Organization Technical Reports Series 724:1-206.
FAO/WHO. 2018a. Codex Alimentarius international food standards. http://www.fao.org/fao-who-codexalimentarius/en (accessed February 2, 2018).
FAO/WHO. 2018b. Codex Alimentarius timeline. http://www.fao.org/fao-who-codexalimentarius/about-codex/history/en (accessed February 2, 2018).
FAO/WHO. 2018c. Codex Committee on Food Labeling (CCFL). http://www.fao.org/fao-who-codexalimentarius/committees/committee/en/?committee=CCFL (accessed June 1, 2018).
FAO/WHO. 2018d. FAO/WHO Codex trust fund. http://www.fao.org/fao-who-codexalimentarius/about-codex/faowho-codex-trust-fund/en (accessed June 1, 2018).
Guyatt, G. H., A. D. Oxman, R. Kunz, Y. Falck-Ytter, G. E. Vist, A. Liberati, H. J. Schunemann, and the GRADE Working Group. 2008. GRADE: An emerging consensus on rating quality of evidence and strength of recommendations. British Medical Journal 336(7652):1170-1173.
Higgins, J. P., D. G. Altman, P. C. Gotzsche, P. Juni, D. Moher, A. D. Oxman, J. Savovic, K. F. Schulz, L. Weeks, J. A. Sterne, and the Cochrane Bias Methods Group and Cochrane Statistical Methods Group. 2011. The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. British Medical Journal 343:d5928.
Hill, A. B. 1965. The environment and disease: Association or causation? Proceedings of the Royal Society of Medicine 58:295-300.
IOM (Institute of Medicine). 1994. How should the recommended dietary allowances be revised? Washington, DC: National Academy Press.
IOM. 1998. Dietary Reference Intakes: A risk assessment model for establishing upper intake levels for nutrients. Washington, DC: National Academy Press. https://doi.org/10.17226/6432.
IOM. 2000. Dietary Reference Intakes: Applications in dietary assessment. Washington, DC: National Academy Press. https://doi.org/10.17226/9956.
IOM. 2001. Dietary Reference Intakes for vitamin A, vitamin K, boron, chromium, copper, iodine, iron, manganese, molybdenum, nickel, silicon, vanadium and zinc. Washington, DC: National Academy Press. https://doi.org/10.17226/10026.
IOM. 2011. Dietary Reference Intakes for calcium and vitamin D. Washington, DC: The National Academies Press. https://doi.org/10.17226/13050.
Iowa State University. 2001. Software for intake distribution estimation. http://www.side.stat.iastate.edu (accessed March 20, 2018).
IUNS (International Union of Nutritional Sciences). 2018a. Adhering bodies. http://www.iuns.org/adhering-bodies (accessed February 2, 2018).
IUNS. 2018b. Affiliated bodies. http://www.iuns.org/affiliated-bodies (accessed February 2, 2018).
King, J. C., and C. Garza. 2007. Harmonization of nutrient intake values. Food and Nutrition Bulletin 28(Suppl. 1):S3-S12.
NASEM (National Academies of Sciences, Engineering, and Medicine). 2017. Guiding principles for developing Dietary Reference Intakes based on chronic disease. Washington, DC: The National Academies Press. https://doi.org/10.17226/24828.
NASEM. 2018. Global harmonization of methodological approaches to nutrient intake recommendations: Proceedings of a Workshop. Washington, DC: The National Academies Press. https://doi.org/10.17226/25023.
National Toxicology Program. 2015. Handbook for conducting a literature-based health assessment using OHAT approach for systematic review and evidence integration. Washington, DC: Office of Health Assessment and Translation (OHAT), Division of the National Toxicology Program, National Institute of Environmental Health Sciences.
Nusser, S. M., A. L. Carriquiry, K. W. Dodd, and W. A. Fuller. 1996. A semiparametric transformation approach to estimating usual daily intake distributions. Journal of the American Statistical Association 91:1440-1449.
Orsini, N., R. Li, A. Wolk, P. Khudyakov, and D. Spiegelman. 2012. Meta-analysis for linear and nonlinear dose-response relations: Examples, an evaluation of approximations, and software. American Journal of Epidemiology 175(1):66-73.
Rucker, R. B. 2007. Allometric scaling, metabolic body size and interspecies comparisons of basal nutritional requirements. Journal of Animal Physiology and Animal Nutrition (Berlin) 91(3-4):148-156.
Russell, R. M. 2010. Integration of epidemiologic and other types of data into dietary reference intake development. Critical Reviews in Food Science and Nutrition 50(Suppl. 1):33-34.
SADC (Southern African Development Community). 2014. Food and nutrition security strategy 2015-2025. Paper read at SADC Food and Nutrition Security Strategy 2015-2025. Lilongwe, Malawi.
Sanchez-Pimienta, T. G., N. Lopez-Olmedo, S. Rodriguez-Ramirez, A. Garcia-Guerra, J. A. Rivera, A. L. Carriquiry, and S. Villalpando. 2016. High prevalence of inadequate calcium and iron intakes by Mexican population groups as assessed by 24-hour recalls. Journal of Nutrition 146(9):1874S-1880S.
Stover, P. J. 2007. Human nutrition and genetic variation. Food and Nutrition Bulletin 28(Suppl. 1):S101-S115.
Tooze, J. A., D. Midthune, K. W. Dodd, L. S. Freedman, S. M. Krebs-Smith, A. F. Subar, P. M. Guenther, R. J. Carroll, and V. Kipnis. 2006. A new statistical method for estimating the usual intake of episodically consumed foods with application to their distribution. Journal of the American Dietetic Association 106(10):1575-1587.
Turner, R. M., D. J. Spiegelhalter, G. C. Smith, and S. G. Thompson. 2009. Bias modelling in evidence synthesis. Journal of the Royal Statistical Society, Series A (Statistics in Society) 172(1):21-47.
van Houwelingen, H. C., L. R. Arends, and T. Stijnen. 2002. Advanced methods in meta-analysis: Multivariate approach and meta-regression. Statistics in Medicine 21:589-624.
West, G. B., J. H. Brown, and B. J. Enquist. 1997. A general model for the origin of allometric scaling laws in biology. Science 276:122-126.
Whiting, P., J. Savović, J. Higgins, D. Caldwell, B. C. Reeves, B. Shea, P. Davies, J. Kleijnen, R. Churchill, and the ROBIS group. 2016. ROBIS: A new tool to assess risk of bias in systematic reviews was developed. Journal of Clinical Epidemiology 69:225-234.
WHO (World Health Organization). 2014. WHO handbook for guideline development, 2nd ed. http://www.who.int/publications/guidelines/guidelines_review_committee/en (accessed February 14, 2018).
Yetley, E. A., A. J. MacFarlane, L. S. Greene-Firestone, C. Garza, J. D. Ard, S. A. Atkinson, D. M. Bier, A. L. Carriquiry, W. R. Harlan, D. Hattis, J. C. King, D. Krewski, D. L. O’Connor, R. L. Prentice, J. V. Rodricks, and G. A. Wells. 2017. Options for basing Dietary Reference Intakes (DRIs) on chronic disease endpoints: Report from a joint US/Canadian-sponsored working group. American Journal of Clinical Nutrition 105(Suppl.):249S-285S.