Page 195

10 Trace Elements

IRON

Iron is a constituent of hemoglobin, myoglobin, and a number of enzymes and, therefore, is an essential nutrient for humans (Bothwell et al., 1979). In addition to these functional forms, as much as 30% of the body iron is found in storage forms such as ferritin and hemosiderin (mainly in the spleen, liver, and bone marrow), and a small amount is associated with the blood transport protein transferrin. Body iron content is regulated mainly through changes in the amount of iron absorbed by the intestinal mucosa (Finch and Cook, 1984). The absorption of iron is influenced by body stores (Bothwell et al., 1979; Cook et al., 1974), by the amount and chemical nature of iron in the ingested food (Layrisse et al., 1968), and by a variety of dietary factors that increase or decrease the availability of iron for absorption (Gillooly et al., 1983; Hallberg, 1981). When the dietary supply of absorbable iron is sufficient, the intestinal mucosa regulates iron absorption in a manner that tends to keep body iron content constant. In iron deficiency, the efficiency of iron absorption increases (Finch and Cook, 1984). However, this response may not be sufficient to prevent anemia in subjects whose intake of available iron is marginal. Similarly, intestinal regulation is not sufficient to prevent excessive body accumulation of iron in the presence of continued high levels of iron in the diet.

General Signs of Deficiency

Three stages of impaired iron status have been identified. In the first stage, iron depletion, iron stores are diminished, as reflected in
a fall in plasma ferritin to levels below 12 µg/liter, but no functional impairment is evident. The second stage, iron-deficient erythropoiesis, is recognized when the hemoglobin level is within the 95% reference range for age and sex but red cell protoporphyrin levels are elevated, transferrin saturation is reduced to less than 16% in adults, and work performance may be impaired. In the third stage, iron deficiency anemia, total blood hemoglobin levels are reduced below normal values for the age and sex of the subject. Severe iron deficiency anemia is characterized by small red blood cells (microcytosis) with low hemoglobin concentrations (hypochromia). Currently there is no single biochemical indicator available to reliably assess iron inadequacy in the general population. Three approaches for estimating the prevalence of impaired iron status were used by the Life Sciences Research Office (LSRO, 1985). One (the ferritin model) involved the use of three indicators (serum ferritin, transferrin saturation, and erythrocyte protoporphyrin) and required that at least two of these be abnormal. In another, mean cell volume (MCV) was substituted for ferritin, but there was also a requirement that at least two of the three indicators be abnormal. The third approach (hemoglobin percentile shift) was defined as the change in median hemoglobin concentration after exclusion of individuals with one or more abnormal iron status values. Operational definitions of anemia have been established by a World Health Organization (WHO) Expert Committee (WHO, 1968) in terms of hemoglobin level. For males and females age 14 years and over, anemia is defined as a hemoglobin level below 13 g/dl and 12 g/dl, respectively. For pregnant women, values below 11 g/dl for the first trimester, 10.5 g/dl for the second, and 11.0 g/dl for the third have recently been proposed by the Centers for Disease Control as levels defining anemia (CDC, 1989).
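These operational definitions amount to a simple lookup. The sketch below encodes the cutoffs quoted above (WHO, 1968, for adults; CDC, 1989, for pregnancy); the function name and interface are illustrative conveniences, not part of any cited source.

```python
# Hemoglobin cutoffs (g/dl) below which anemia is defined, as quoted in
# the text: WHO (1968) values for males and females age 14 and over, and
# CDC (1989) trimester-specific values for pregnant women.
WHO_CUTOFFS_G_DL = {"male": 13.0, "female": 12.0}
CDC_PREGNANCY_CUTOFFS_G_DL = {1: 11.0, 2: 10.5, 3: 11.0}

def is_anemic(hemoglobin_g_dl, sex, trimester=None):
    """Return True if hemoglobin falls below the applicable cutoff."""
    if trimester is not None:
        cutoff = CDC_PREGNANCY_CUTOFFS_G_DL[trimester]
    else:
        cutoff = WHO_CUTOFFS_G_DL[sex]
    return hemoglobin_g_dl < cutoff

print(is_anemic(11.8, "female"))               # below 12 g/dl -> True
print(is_anemic(10.8, "female", trimester=2))  # cutoff is 10.5 -> False
```

Note that, as the surrounding text stresses, a hemoglobin cutoff alone identifies only the third (anemic) stage; the earlier stages require ferritin, transferrin saturation, or protoporphyrin measurements.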
Since the range of normal hemoglobin values is rather broad (13 to 16 g/dl in men and 12 to 16 g/dl in women), the actual deficit in hemoglobin may vary considerably among individuals with a given level of hemoglobin below the cut-off point. The consequences of iron deficiency are usually ascribed to the resulting anemia, although some effects of deficiency have been found before reduced hemoglobin levels were observed (NRC, 1979). An association between hemoglobin concentration and work capacity is the most clearly identified functional consequence of iron deficiency (Viteri and Torun, 1974). However, there are reports of reduced physical performance in iron deficiency even before anemia is present (Dallman et al., 1978). Iron deficiency also has been associated with decreased immune function, as measured by changes in several components of the immune system. The functional
consequences of these immune system changes for actual resistance to infection remain to be determined. In children, iron deficiency has been associated with apathy, short attention span, irritability, and reduced ability to learn (Lozoff and Brittenham, 1986). The degree to which milder forms of iron deficiency, as opposed to severe anemia, result in impaired school performance by children is uncertain (Pollitt, 1987). In the United States, iron deficiency may be observed primarily during four periods of life: (1) from about 6 months to 4 years of age, because the iron content of milk is low, the body is growing rapidly, and body reserves of iron are often insufficient to meet needs beyond 6 months; (2) during the rapid growth of early adolescence, because of the needs of an expanding red cell mass and the need to deposit iron in myoglobin; (3) during the female reproductive period, because of menstrual iron losses; and (4) during pregnancy, because of the expanding blood volume of the mother, the demands of the fetus and placenta, and blood losses during childbirth. Analyses of data from the Second National Health and Nutrition Examination Survey (NHANES II) of the U.S. population, 1976-1980, indicated, depending on the assessment model used, a prevalence of impaired iron status ranging from 1 to 6% of the total population, including 9% of children 1 to 2 years of age, 4 to 12% of males ages 11 to 14 years, and 5 to 14% of females ages 15 to 44 (LSRO, 1985). The frequency of iron depletion as determined by measurement of serum ferritin, and of iron-deficient erythropoiesis as determined by transferrin saturation and protoporphyrin, is substantially greater than the frequency of iron deficiency anemia in the population surveyed in NHANES II (Bothwell et al., 1979; Dallman et al., 1984; Meyers et al., 1983).

Dietary Sources

Iron is widely distributed in the U.S.
food supply; meat, eggs, vegetables, and cereals (especially fortified cereal products) are the principal dietary sources. In examining food consumption data for women 18 to 24 years of age from NHANES II, Murphy and Calloway (1986) found that of a daily iron intake of 10.7 mg, 31% came from meat, poultry, and fish and that 25% was provided by iron added to foods, mainly cereals, as fortification or enrichment. Fruits, vegetables, and juices contain varying amounts of iron, but as a group represent another major source of dietary iron. Heme iron, a highly available source, represents from 7 to 10% of the dietary iron of girls and women and from 8 to 12% of dietary iron of boys and men,
according to data from the 1977-1978 Nationwide Food Consumption Survey (Raper et al., 1984). Iron availability may be enhanced by consumption of foods containing ascorbic acid. Ascorbic acid intake is relatively high in U.S. diets, ranging from 86 to 112 mg/day in various groups of men aged 15 years and older and from 76 to 92 mg/day for women aged 15 years and older (Raper et al., 1984).

Absorption of Dietary Iron

Heme and nonheme forms of iron are absorbed by different mechanisms (Bjorn-Rasmussen et al., 1974). Heme iron is highly absorbable. The proportion of heme iron in animal tissues varies, but it averages about 40% of the total iron in all animal tissues, including meat, liver, poultry, and fish. The remaining 60% of the iron in animal tissues and all the iron in vegetable products is present as nonheme compounds. The absorption of nonheme iron can be enhanced or inhibited by several factors. The two best-defined enhancers of nonheme iron absorption are some organic acids (especially ascorbic acid) (Gillooly et al., 1983) and the animal tissues present in each meal (Cook and Monsen, 1976). On the other hand, some dietary and medicinal substances such as calcium phosphate, phytates, bran, polyphenols in tea, and antacids may decrease nonheme iron absorption substantially (Gillooly et al., 1983; Monsen et al., 1978). Overall, nonheme iron absorption may vary up to tenfold, depending on the dietary content of such inhibiting and enhancing factors (Hallberg and Rossander, 1984). The percentage of iron absorbed from a meal decreases as the amount of iron present increases. Bezwoda et al. (1983) reported that mean absorption of nonheme iron decreased from 18 to 6.4% as the nonheme iron content of four meals increased from 1.52 to 5.72 mg, resulting in little variation in the actual amount absorbed from the different meals. This presumably reflects tight control of nonheme iron absorption by the intestinal mucosa.
On the other hand, 20% of the heme iron in all four meals was absorbed, despite a heme iron content ranging from 0.28 to 4.48 mg, suggesting that heme iron is less affected by other dietary components and at least partly bypasses intestinal mucosal control. On the basis of results of numerous studies of iron absorption in human subjects, Monsen et al. (1978) have suggested a method for planning and evaluating iron intakes that takes account of the enhancement of nonheme iron absorption by ascorbic acid and the presence of meat in the diet.
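As a rough illustration of how such a planning method works, the sketch below applies Monsen-style availability classes to a meal. The 3/5/8% nonheme and roughly 23% heme absorption figures, and the meat/ascorbate thresholds, are values commonly associated with the Monsen et al. (1978) framework for a person with moderate iron stores; they are reproduced here for illustration and should be checked against the original paper before use.

```python
# Hedged sketch of a Monsen-style estimate of absorbable iron in a meal.
# Constants are illustrative, not taken verbatim from the cited paper.

def nonheme_absorption(mfp_g, ascorbate_mg):
    """Fraction of nonheme iron absorbed, from meal composition.

    mfp_g:        grams of meat, fish, or poultry in the meal
    ascorbate_mg: milligrams of ascorbic acid in the meal
    """
    high = (mfp_g > 90 or ascorbate_mg > 75
            or (mfp_g >= 30 and ascorbate_mg >= 25))
    low = mfp_g < 30 and ascorbate_mg < 25
    if high:
        return 0.08   # high-availability meal
    if low:
        return 0.03   # low-availability meal
    return 0.05       # intermediate availability

def absorbed_iron_mg(heme_mg, nonheme_mg, mfp_g, ascorbate_mg):
    # Heme iron is treated as absorbed at a roughly constant ~23%,
    # consistent with its partial bypass of mucosal control noted above.
    return heme_mg * 0.23 + nonheme_mg * nonheme_absorption(mfp_g, ascorbate_mg)

# A meal with 1 mg heme and 10 mg nonheme iron, 100 g meat, 50 mg ascorbate:
print(absorbed_iron_mg(1.0, 10.0, 100, 50))   # ~1.03 mg (0.23 heme + 0.80 nonheme)
```

The point of the sketch is structural: the heme fraction is nearly constant, while the nonheme fraction swings severalfold with the enhancers present in the same meal.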
Absorption of iron also depends on the iron status of the individual. Mean absorption of dietary iron is relatively low when body stores are high but may be increased when stores are low (Bothwell et al., 1979). Therefore, iron deficiency may not occur to the extent that might be predicted from a given iron intake below recommended allowance levels.

Recommended Allowances

Adults

In calculating the RDA, the subcommittee assumed that there are some iron stores, but it concluded that the size of the iron store needed as a reserve against periods of negative iron balance is a value judgment rather than a scientific determination. In U.S. women, average iron stores are approximately 300 mg; in men, they are approximately 1,000 mg (Bothwell et al., 1979). The subcommittee concluded that a dietary intake that achieves a target level of 300 mg of iron stores meets the nutritional needs of all healthy people. This level would be sufficient to provide the iron needs of an individual for several months, even on a diet nearly devoid of iron. The average loss of iron in the healthy adult man is estimated to be approximately 1 mg/day (Green et al., 1968). In adult women, there is an additional loss of about 0.5 mg/day, the amount of iron in the average menstrual blood flow averaged over 1 month (Hallberg et al., 1966). In approximately 5% of normal women, however, menstrual losses of more than 1.4 mg/day have been observed. As menstrual losses deplete iron stores, absorption of dietary iron increases. Concordant figures are found using radiolabeled iron loss from circulating erythrocytes. This method gives reliable turnover information for adults and indicates that average requirements to replace daily losses for adults ages 20 to 50 are approximately 14 µg of iron per kilogram of body weight for males (1.10 mg/79 kg) and 22 µg for premenopausal females (1.38 mg/63 kg) (Bothwell and Finch, 1968).
There is little or no population-based information from which to assess variability of iron losses among individuals. A reasonable estimate of the coefficient of variation may be approximately 15% (NRC, 1986), but iron losses are not normally distributed among women. For men, absorbed iron would need to be sufficient to replace a potential loss of 1.3 mg/day (1.03 mg + 30%) to cover the needs of essentially the entire population. The variability estimate indicates that a replacement of 1.8 mg of iron per day would cover the needs
of most women, except for the 5% with the most extreme menstrual losses. Some impairment of iron status in 9.6 to 14.2% of nonpregnant females 15 to 44 years of age was suggested by population-based data showing two abnormal values among measurements of serum ferritin, erythrocyte protoporphyrin, and transferrin saturation (LSRO, 1985). When hemoglobin was used as an indicator, however, only 2.5 to 4% of the women showed evidence of iron deficiency. The average iron intake of this population group was found to be 10 to 11 mg/day in surveys conducted by the U.S. Department of Agriculture (USDA), the National Center for Health Statistics (NCHS), and the Food and Drug Administration (FDA) (Murphy and Calloway, 1986; Pennington et al., 1986; Raper et al., 1984). This suggests that a mean population consumption of about 10 mg/day is associated with adequate iron status in at least 86% of the population of women 15 to 44 years of age. Distribution analysis, considering both the variation in iron losses by menstruating women and the variation in population iron intake, indicates that an iron intake of 14 mg/day is sufficient to meet the needs of all but about 5% of menstruating women (NRC, 1986). From the available data, it seems reasonable to conclude that a daily intake of 10 to 11 mg of iron from typical U.S. diets is sufficient for most women. Those with high menstrual losses appear to compensate for those losses by improved absorption of dietary iron, since the prevalence of iron deficiency anemia in that group is quite low (Meyers et al., 1983). The most recent WHO recommendations (FAO, 1988) suggest that iron in the diets typical of most populations of industrialized countries is relatively highly available, with iron absorption ranging from 10 to 15%. Thus, at an intake of 15 mg/day, approximately 1.5 to 2.2 mg of absorbed iron could be available to replace iron losses in adult women. This level would be expected to replace the iron losses of most women.
The subcommittee concluded that an RDA of 15 mg/day would provide a sufficient margin of safety and should cover the needs of essentially all adult women in the United States, given usual dietary patterns, except for those with the most extreme menstrual losses. This is a reduction from the 1980 recommendation of 18 mg/day. In the United States, very little iron deficiency has been reported for 15- to 60-year-old males (LSRO, 1985), whose average intake is about 15 mg/day. Given usual diets in the United States, an allowance of 10 mg/day for adult males should be sufficient to replace losses of up to 1.5 mg/day, an amount exceeding estimates of usual daily iron loss by men.
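The arithmetic behind these allowances can be reconstructed by chaining the figures quoted above: per-kilogram daily losses, a 30% upper-bound margin for individual variation, and the FAO/WHO 10 to 15% absorption range for industrialized-country diets. The specific absorption fraction chosen for each sex below is an illustrative assumption, not a value stated in the text.

```python
# Sketch of the adult iron-allowance arithmetic described above.
# Loss figures (14 and 22 µg/kg per day) and body weights (79 and 63 kg)
# are from the text; the absorption fractions are assumed within the
# quoted 10-15% range.

def dietary_iron_needed(loss_per_kg_ug, weight_kg, variability=0.30, absorption=0.10):
    """Daily dietary iron (mg) needed to replace basal losses.

    loss_per_kg_ug: mean daily iron loss in micrograms per kg body weight
    variability:    upper-bound margin for individual variation in losses
    absorption:     assumed fraction of dietary iron absorbed
    """
    daily_loss_mg = loss_per_kg_ug * weight_kg / 1000.0   # µg -> mg
    upper_loss_mg = daily_loss_mg * (1 + variability)
    return upper_loss_mg / absorption

men   = dietary_iron_needed(14, 79, absorption=0.15)   # about 9.6 mg/day
women = dietary_iron_needed(22, 63, absorption=0.12)   # about 15.0 mg/day
print(round(men, 1), round(women, 1))
```

With these assumptions the computed requirements land close to the allowances the subcommittee adopted (10 mg/day for men, 15 mg/day for women), which is the consistency the text is arguing for.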
There is no evidence of a high prevalence of iron deficiency in the elderly (Lynch et al., 1982). Survey data suggest that inflammatory disease, rather than iron deficiency, is the main cause of anemia in this group (Dallman et al., 1984). In addition, after the menstrual years, the daily iron needs of women approximate those of men. Therefore, the subcommittee recommends the same iron RDA for elderly women and men: 10 mg/day.

Pregnancy and Lactation

Pregnant women need iron to replace the usual basal losses, to allow expansion of the red cell mass, to provide iron to the fetus and placenta, and to replace blood lost during delivery. Hallberg (1988) estimates that the total iron needed for a pregnancy is approximately 1,040 mg, of which 840 mg is lost from the body permanently and 200 mg is retained and serves as a reservoir of iron when blood volume decreases after delivery. Over the entire period of gestation, the amount of iron absorbed averages about 3 mg/day. There is little need for increased iron intake in the first trimester of pregnancy, since the cessation of menstrual iron loss compensates for any increased needs during this period. In later stages of pregnancy, however, the requirement increases substantially (INACG, 1981). During the later stages of pregnancy, the absorbability of dietary iron also increases (Apte and Iyengar, 1970). To ensure sufficient absorbed iron to satisfy the demands of a normal pregnancy, a daily increment of 15 mg of iron, averaged over the entire pregnancy, should satisfy the needs of most women. However, since the increased pregnancy requirement cannot be met by the iron content of habitual U.S. diets or by the iron stores of at least some women, daily iron supplements are usually recommended. Loss of iron through lactation is approximately 0.15 to 0.3 mg/day (Lonnerdal et al., 1981). This is less than menstrual loss, which often is absent during lactation (Habicht et al., 1985).
Thus, since the iron needs of lactating women are not substantially different from those of nonpregnant women, no additional iron allowance for this group is recommended.

Infants

Because of stored iron (Dahro et al., 1983), the normal term infant can maintain satisfactory hemoglobin levels on human milk, without other iron sources, during the first 3 months of life. From birth to age 3 years, infants who are not breastfed should have an iron intake of approximately 1 mg/kg per day. The RDA for 6 months to 3 years of age is set at 10 mg/day, a level considered adequate for most healthy children during this time. Low-birth-weight infants (1,000 to 2,500 g) and those with a substantial reduction in total
hemoglobin mass require 2 mg/kg per day, starting no later than 2 months of age (AAP, 1976; Dallman et al., 1980). For infants of normal or low birth weight, iron intake should not exceed a maximum of 15 mg/day.

Children and Adolescents

Children and adolescents need iron not only to maintain hemoglobin concentrations but also to increase their total iron mass during the period of growth. Because of the allowance for increases in iron mass related to growth in body size, the iron requirements of children and adolescents are considered to be slightly higher than those of adult men. To attain a target iron storage level of 300 mg for both sexes by age 20 to 25, an allowance of 10 mg/day is recommended for children. An additional 2 mg/day is recommended for males during the pubertal growth spurt, which occurs between the ages of 10 and 17, and an additional 5 mg/day for females starting with the pubertal growth spurt and menstruation, which begins at approximately age 10 or shortly thereafter, and continuing through the menstrual years.

Other Considerations

The RDAs have been established to be adequate for essentially all healthy people who daily consume diets containing 30 to 90 g of meat, poultry, or fish, or foods containing 25 to 75 mg of ascorbate after preparation. People who eat little or no animal protein, such as those whose diets consist largely of beans and rice, and those whose diets are low in ascorbate because of prolonged heating or storage of food (Kies, 1982) may require higher amounts of food iron or a reliable source of ascorbic acid.

Excessive Intakes and Toxicity

In people without genetic defects that increase iron absorption, there are no reports of iron toxicity from foods other than from long-term ingestion of home brews made in iron vessels (Walker and Arvidsson, 1953). Deleterious effects of daily intakes between 25 and 75 mg are unlikely in healthy persons (Finch and Monsen, 1972).
On the other hand, there are approximately 2,000 cases of iron poisoning each year in the United States, mainly among young children who ingest the medicinal iron supplements formulated for adults. The lethal dose of ferrous sulfate for a 2-year-old child is approximately 3 g; for adults, it ranges from 200 to 250 mg/kg body weight (NRC, 1979).
Some people are genetically at risk of iron overload, or hemochromatosis. Idiopathic hemochromatosis, which can result in the failure of multiple organ systems, is the result of an inborn error of metabolism (not yet elucidated) that leads to enhanced iron absorption. The disease is caused by an autosomal recessive gene. Reports suggest that the prevalence of this disease is higher than previously believed (Beaumont et al., 1979; Cartwright et al., 1979; Olsson et al., 1983). The studies by Cartwright et al. (1979) in Utah and Beaumont et al. (1979) in Brittany establish gene frequencies in the relatives of cases identified with the disease and so do not estimate general population prevalence. Olsson et al. (1983) studied a population of males ages 30 to 39 in central Sweden and reported an estimated gene frequency of 6.9%. This would result in a prevalence of heterozygotes in this population of 13.8%. The prevalence of the gene has not been reliably established in the United States. In NHANES II, five of the 3,540 people whose serum ferritin was assessed were diagnosed as having idiopathic hemochromatosis, i.e., they were assumed to be homozygous for the gene. This implies a gene frequency of 3.8% and a prevalence of heterozygotes of 7.5% (LSRO, 1985). Further work is needed to determine the prevalence of this gene in the population and to establish the risks of iron fortification of foods to both homozygotes and heterozygotes.

References

AAP (American Academy of Pediatrics). 1976. Iron supplementation for infants. Pediatrics 58:765-768.
Apte, S.V., and L. Iyengar. 1970. Absorption of dietary iron in pregnancy. Am. J. Clin. Nutr. 21:73-77.
Beaumont, C., M. Simon, R. Fauchet, J.-P. Hespel, P. Brissot, B. Genetet, and M. Bourel. 1979. Serum ferritin is a possible marker of the hemochromatosis allele. N. Engl. J. Med. 301:169-174.
Bezwoda, W.R., T.H. Bothwell, R.W. Charlton, J.D. Torrance, A.P. MacPhail, D.P. Derman, and F. Mayet. 1983.
The relative dietary importance of haem and nonhaem iron. S. Afr. Med. J. 64:552-556.
Bjorn-Rasmussen, E., L. Hallberg, B. Isaksson, and B. Arvidsson. 1974. Food iron absorption in man. Application of the two-pool extrinsic tag method to measure heme and nonheme iron absorption from the whole diet. J. Clin. Invest. 52:247-255.
Bothwell, T.H., and C.A. Finch. 1968. Iron losses in man. Pp. 104-114 in Occurrence, Causes and Prevention of Nutritional Anaemias. Symposia of the Swedish Nutrition Foundation, VI. Almquist and Wiksell, Uppsala.
Bothwell, T.H., R.W. Charlton, J.D. Cook, and C.A. Finch. 1979. Iron Metabolism in Man. Blackwell, Oxford.
Cartwright, G.E., C.Q. Edwards, K. Kravitz, M. Skolnick, D.B. Amos, A. Johnson, and L. Buskjaer. 1979. Hereditary hemochromatosis: phenotypic expression of the disease. N. Engl. J. Med. 301:175-179.
CDC (Centers for Disease Control). 1989. CDC criteria for anemia in children and childbearing-aged women. Morb. Mortal. Wkly. Rep. 38:400-404.
Cook, J.D., and E.R. Monsen. 1976. Food iron absorption in human subjects. III. Comparison of the effect of animal proteins on nonheme iron absorption. Am. J. Clin. Nutr. 29:859-867.
Cook, J.D., D.A. Lipschitz, L.E.M. Miles, and C.A. Finch. 1974. Serum ferritin as a measure of iron stores in normal subjects. Am. J. Clin. Nutr. 27:681-687.
Dahro, M., D. Gunning, and J.A. Olson. 1983. Variations in liver concentrations of iron and vitamin A as a function of age in young American children dying of the sudden infant death syndrome as well as of other causes. Int. J. Vit. Nutr. Res. 53:13-18.
Dallman, P.R., E. Beutler, and C.A. Finch. 1978. Effect of iron deficiency exclusive of anemia. Br. J. Haematol. 40:179-184.
Dallman, P.R., M.A. Siimes, and A. Stekel. 1980. Iron deficiency in infancy and childhood. Am. J. Clin. Nutr. 33:86-118.
Dallman, P.R., R. Yip, and C. Johnson. 1984. Prevalence and causes of anemia in the United States, 1976 to 1980. Am. J. Clin. Nutr. 39:437-445.
FAO (Food and Agriculture Organization). 1988. Requirements of Vitamin A, Iron, Folate, and Vitamin B12. Report of a Joint FAO/WHO Expert Consultation. FAO Food and Nutrition Series No. 23. Food and Agriculture Organization, Rome. 107 pp.
Finch, C.A., and J.D. Cook. 1984. Iron deficiency. Am. J. Clin. Nutr. 39:471-477.
Finch, C.A., and E.R. Monsen. 1972. Iron nutrition and the fortification of food with iron. J. Am. Med. Assoc. 219:1462-1465.
Gillooly, M., T.H. Bothwell, J.D. Torrance, A.P. MacPhail, D.P. Derman, W.R. Bezwoda, W. Mills, and R.W. Charlton. 1983. The effects of organic acids, phytates, and polyphenols on the absorption of iron from vegetables. Br. J. Nutr. 49:331-342.
Green, R., R.W. Charlton, H. Seftel, T.H. Bothwell, F. Mayet, E.B. Adams, C.A. Finch, and M. Layrisse. 1968. Body iron excretion in man: a collaborative study. Am. J.
Med. 45:336-353.
Habicht, J.-P., J. DaVanzo, W.P. Butz, and L. Meyers. 1985. The contraceptive role of breastfeeding. Popul. Stud. 39:213-232.
Hallberg, L. 1981. Bioavailability of dietary iron in man. Annu. Rev. Nutr. 1:123-147.
Hallberg, L. 1988. Iron balance in pregnancy. Pp. 115-126 in H. Berger, ed. Vitamins and Minerals in Pregnancy and Lactation. Raven Press, New York.
Hallberg, L., and L. Rossander. 1984. Improvement of iron nutrition in developing countries: comparison of adding meat, soy protein, ascorbic acid, citric acid, and ferrous sulphate on iron absorption from a simple Latin American-type of meal. Am. J. Clin. Nutr. 39:577-583.
Hallberg, L., A.M. Hogdahl, L. Nilsson, and G. Rybo. 1966. Menstrual blood loss: a population study. Variation at different ages and attempts to define normality. Acta Obstet. Gynecol. Scand. 45:320-351.
INACG (International Nutritional Anemia Consultative Group). 1981. Iron Deficiency in Women. A Report for INACG by T.H. Bothwell and R.W. Charlton. University of the Witwatersrand, Johannesburg, South Africa.
Kies, C., ed. 1982. Nutritional Bioavailability of Iron. ACS Symposium Series 203. American Chemical Society, Washington, D.C.
Layrisse, M., C. Martinez-Torres, and M. Roche. 1968. Effect of interaction of various foods on iron absorption. Am. J. Clin. Nutr. 21:1175-1183.
Lönnerdal, B., C.L. Keen, and L.S. Hurley. 1981. Iron, copper, zinc and manganese in milk. Annu. Rev. Nutr. 1:149-174.
Lozoff, B., and G.M. Brittenham. 1986. Behavioral aspects of iron deficiency. Prog. Haematol. 14:23-53.
LSRO (Life Sciences Research Office). 1985. Summary of a report on assessment of the iron nutritional status of the United States population. Am. J. Clin. Nutr. 42:1318-1330.
Lynch, S.R., C.A. Finch, E.R. Monsen, and J.D. Cook. 1982. Iron status of elderly Americans. Am. J. Clin. Nutr. 36:1032-1045.
Meyers, L.D., J.-P. Habicht, C.L. Johnson, and C. Brownie. 1983. Prevalences of anemia and iron deficiency anemia in black and white women in the United States estimated by two methods. Am. J. Public Health 73:1042-1049.
Monsen, E.R., L. Hallberg, M. Layrisse, D.M. Hegsted, J.D. Cook, W. Mertz, and C.A. Finch. 1978. Estimation of available dietary iron. Am. J. Clin. Nutr. 31:134-141.
Murphy, S.P., and D.H. Calloway. 1986. Nutrient intakes of women in NHANES II, emphasizing trace minerals, fiber, and phytate. J. Am. Diet. Assoc. 86:1366-1372.
NRC (National Research Council). 1979. Iron. A Report of the Subcommittee on Iron, Committee on Medical and Biologic Effects of Environmental Pollutants, Division of Medical Sciences, Assembly of Life Sciences. University Park Press, Baltimore. 248 pp.
NRC (National Research Council). 1986. Nutrient Adequacy: Assessment Using Food Consumption Surveys. Report of the Subcommittee on Criteria for Dietary Evaluation, Coordinating Committee on Evaluation of Food Consumption Surveys, Food and Nutrition Board, Commission on Life Sciences. National Academy Press, Washington, D.C. 146 pp.
Olsson, K.S., B. Ritter, U. Rosen, P.A. Heedman, and F. Staugard. 1983. Prevalence of iron overload in central Sweden. Acta Med. Scand. 213:145-150.
Pennington, J.A.T., B.F. Young, D.B. Wilson, R.D. Johnson, and J.E. Vanderveen. 1986. Mineral content of foods and total diets: the Selected Minerals in Foods Survey, 1982 to 1984. J.
Am. Diet. Assoc. 86:876-891.
Pollitt, E. 1987. Effects of iron deficiency on mental development: methodological considerations and substantive findings. Pp. 225-254 in F.E. Johnston, ed. Nutritional Anthropology. Alan R. Liss, New York.
Raper, N.R., J.C. Rosenthal, and C.E. Woteki. 1984. Estimates of available iron in diets of individuals 1 year old and older in the Nationwide Food Consumption Survey. J. Am. Diet. Assoc. 84:783-787.
Viteri, F.E., and B. Torun. 1974. Anaemia and physical work capacity. Clin. Haematol. 3:609-626.
Walker, A.R.P., and U.B. Arvidsson. 1953. Iron "overload" in the South African Bantu. Trans. R. Soc. Trop. Med. Hyg. 47:536-548.
WHO (World Health Organization). 1968. Nutritional Anaemias. Report of a WHO Scientific Group. WHO Technical Report Series No. 405. World Health Organization, Geneva.

ZINC

Zinc, a constituent of enzymes involved in most major metabolic pathways, is an essential element for plants, animals, and humans
Effects on Dental Caries

The negative correlation between tooth decay in children and the fluoride concentration of their drinking water was first demonstrated in a large study in the United States almost 50 years ago (Dean et al., 1942). Subsequently, many studies (see review by Burt, 1982) proved that fluoridation of public water supplies, wherever natural fluoride concentrations are low, is an effective and practical means of reducing dental caries (Council on Dental Therapeutics, 1982). Recommendations approved by virtually all national and international health organizations call for fluoride concentrations between 0.7 and 1.2 mg/liter, depending on average local temperature (as a predictor of water intake). There is evidence that dental health has been improving, even in communities with low water fluoride concentrations, presumably because of increased fluoride intake from other sources (e.g., foods processed with fluoridated water, topical fluoride applications by dentists, fluoride supplementation, and unintentional ingestion of fluoride dentifrices). Although no one theory completely explains the exact role of fluoride in reducing caries (Council on Dental Therapeutics, 1982), it is known that fluoride replaces hydroxyl ions in developing enamel prior to tooth eruption, thereby forming an apatite crystal that is less susceptible to solubilization by acid and, hence, more resistant to caries formation. Some topically applied fluoride is also taken up by the enamel. The protective effect against caries is greatest during maximal tooth formation, i.e., during the first 8 years of childhood, but there is evidence to suggest that adults as well as children continue to benefit from the consumption of fluoridated water (Council on Dental Therapeutics, 1982).
Effects on Bone Disease

Although it has been suggested that fluoride intakes greater than those recommended for caries control may have some benefit in protecting adult bone, definitive evidence for such an effect is lacking. Bernstein et al. (1966) found a higher prevalence of reduced bone density and of collapsed vertebrae in an area with low-fluoride water (0.15 to 0.30 mg/liter) than in an area with naturally high-fluoride water (4 to 5.8 mg/liter); these findings have not yet been confirmed.
Dietary Sources and Usual Intakes

A recent estimate of fluoride intake in the United States from food, beverages, and water ranged from approximately 0.9 mg/day in an area with unfluoridated water to 1.7 mg/day in an area with fluoridation (Singer et al., 1980). Daily fluoride intake was reported to be approximately 1.8 mg from a hospital diet in a fluoridated area of the United States (Taves, 1983), but drinking water was not taken into account. Most of the difference in fluoride intake between the fluoridated and unfluoridated areas was due to beverages, since foods marketed in different parts of the country contributed only 0.3 to 0.6 mg/day. This suggests that any effect of locally grown foods is largely negated by the supraregional distribution of the majority of foods in the United States. Food processing has a strong influence on the fluoride content of foods. The fluoride content of various foods can be increased severalfold by cooking them in fluoridated water (Marier and Rose, 1966; Martin, 1951). Even the type of cooking vessel can be important. Cooking in utensils treated with Teflon, a fluorine-containing polymer, can increase the fluoride content, whereas an aluminum surface can reduce it (Full and Parkins, 1975). The richest dietary sources of fluoride are tea and marine fish that are consumed with their bones (Kumpulainen and Koivistoinen, 1977). The bones of some land-based animals also contain high levels of fluoride. In countries where tea drinking is common, this beverage can make a substantial contribution to total fluoride intake. In the United Kingdom Total Diet Study, tea was the main source of dietary fluoride for adults, accounting for 1.3 mg of the total daily intake of 1.8 mg (Walters et al., 1983). Current intake estimates for 6-month-old infants range from 0.23 to 0.42 mg/day in different regions of the United States (Ophaug et al., 1985).
This small range is due to the agreement among the producers of infant formulas to use only water low in fluoride for all their products (Barness, 1981; Horowitz and Horowitz, 1983). The fluoride content of cow's milk is approximately 20 µg/liter (Taves, 1983). Mean reported values for human milk range from 5 to 25 µg/liter (Esala et al., 1982; Krishnamachari, 1987; Spak et al., 1983), reflecting maternal intake. The low and high concentrations in human milk were found in samples from mothers drinking water with fluoride concentrations of 0.2 and 1.7 mg/liter, respectively.

The quantitative estimates of fluoride intake discussed above give no indication of the relative absorption of dietary fluoride (see review by Subba Rao, 1984). In general, free fluoride as it exists in
water is more available than the protein-bound fluorine in foods, and the absorption of fluoride from sodium fluoride in aqueous solution is estimated to be 100%. In young adults, the absorption of fluoride from sodium fluoride added to milk or baby formula was only 72 and 65%, respectively, of that added to water in a study by Spak et al. (1982). An even poorer absorption, 37 to 54%, has been reported for the fluorine in bone meal (Krishnamachari, 1987). These differences in fluoride absorption indicate the difficulties in establishing dietary recommendations for fluoride based solely on quantitative data about the fluoride content of foods and drinking water.

Excessive Intakes and Toxicity

Fluorine, like other trace elements, is toxic when consumed in excessive amounts. Chronic toxicity (fluorosis) affects bone health, kidney function, and possibly muscle and nerve function (Krishnamachari, 1987). The condition occurs after years of daily exposures of 20 to 80 mg of fluorine, far in excess of the average intake in the United States. Mottling of the teeth in children has been observed at fluoride concentrations of 2 to 8 mg/kg in diet and drinking water (NRC, 1971). Many detailed epidemiological studies in the United States and abroad have failed to find any indication of an increased cancer risk associated with fluoride in the water supply (IARC, 1982). Acute fluoride toxicity resulting in death has been described in a 70-kg adult who ingested a single dose of 5 to 10 g of sodium fluoride (Heifetz and Horowitz, 1984). Recently, investigators have suggested that the use of pharmacological doses of fluoride (50 mg/day) for 3 months was helpful in the treatment of women with osteoporosis (Pak et al., 1989). At these doses there is a potential for toxicity, and these patients should be monitored carefully.

Estimated Safe and Adequate Daily Dietary Intakes

The estimated range of safe and adequate intakes of fluoride for adults is 1.5 to 4.0 mg/day.
This takes into account the widely varying fluoride concentrations of diets consumed in the United States and includes both food sources and drinking water. For younger age groups, the range is reduced to a maximal level of 2.5 mg in order to avoid mottling of the teeth. Ranges of 0.1 to 1 mg during the first year of life and 0.5 to 1.5 mg during the subsequent 2 years are suggested as adequate and safe.
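The relative-absorption figures reported above (100% for sodium fluoride in water, 72% for milk, 65% for infant formula, and 37 to 54% for bone meal) can be used to compare how much fluoride is actually absorbed from different vehicles. A minimal sketch, in which the 0.45 factor for bone meal is an assumed midpoint of the reported range:

```python
# Sketch: effective absorbed fluoride per source, using the relative
# availabilities reported in the text (Spak et al., 1982; Krishnamachari, 1987).
# Values are fractions of the absorption of NaF in aqueous solution (1.0);
# 0.45 for bone meal is an assumed midpoint of the reported 37-54% range.
RELATIVE_AVAILABILITY = {
    "water": 1.00,
    "milk": 0.72,
    "infant formula": 0.65,
    "bone meal": 0.45,  # assumed midpoint of 0.37-0.54
}

def absorbed_fluoride_mg(intake_mg: float, source: str) -> float:
    """Estimate absorbed fluoride (mg) for a given intake and source."""
    return intake_mg * RELATIVE_AVAILABILITY[source]

# Example: 1.0 mg of fluoride ingested with milk vs. with water
print(absorbed_fluoride_mg(1.0, "milk"))   # 0.72
print(absorbed_fluoride_mg(1.0, "water"))  # 1.0
```

This is why recommendations based solely on the fluoride content of foods and water are difficult: the same nominal intake can yield quite different absorbed amounts.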
Infants receiving human milk, ready-to-use formula, or concentrated formulas prepared with nonfluoridated water are all ingesting low levels of fluoride. In such cases, the American Academy of Pediatrics Committee on Nutrition advises the use of a fluoride supplement of 0.25 mg/day for children from 2 weeks to 2 years of age (Barness, 1981). In view of fluoride's beneficial effects on dental health and its safety at the prescribed intakes, the Food and Nutrition Board recommends fluoridation of public water supplies if natural fluoride levels are substantially below 0.7 mg/liter.

References

Barness, L.A. 1981. Fluoride in infant formulas and fluoride supplementation. Pediatrics 67:582-583.
Bernstein, D.S., N. Sadowsky, D.M. Hegsted, C.D. Guri, and F.J. Stare. 1966. Prevalence of osteoporosis in high- and low-fluoride areas in North Dakota. J. Am. Med. Assoc. 198:499-504.
Burt, B.A. 1982. The epidemiological basis for water fluoridation in the prevention of dental caries. J. Public Health Policy 3:391-407.
Council on Dental Therapeutics. 1982. Fluoride compounds. Pp. 344-368 in Accepted Dental Therapeutics, 39th ed. American Dental Association, Chicago, Ill.
Dean, H.T., F.A. Arnold, Jr., and E. Elvove. 1942. Domestic water and dental caries: additional studies of relation of fluoride domestic waters to dental caries experience in 4,425 white children aged 12 to 14 years, of 13 cities in 4 states. Public Health Rep. 57:1155-1179.
Ekstrand, J. 1978. Relationship between fluoride in the drinking water and the plasma fluoride concentration in man. Caries Res. 12:123-127.
Esala, S., E. Vuori, and A. Helle. 1982. Effect of maternal fluorine intake on breast milk fluorine content. Br. J. Nutr. 48:201-204.
Full, C.A., and F.M. Parkins. 1975. Effect of cooking vessel composition on fluoride. J. Dent. Res. 54:192.
Heifetz, S.B., and H.S. Horowitz. 1984. The amounts of fluoride in current fluoride therapies: safety considerations for children. J. Dent. Child. 51:257-269.
Hodge, H.C., and F.A. Smith. 1970. Minerals: fluorine and dental caries. Pp. 93-115 in R.F. Gould, ed. Dietary Chemicals vs. Dental Caries. Advances in Chemistry Series No. 94. American Chemical Society, Washington, D.C.
Horowitz, A.M., and H.S. Horowitz. 1983. Fluorides and dental caries. Science 220:142-144.
IARC (International Agency for Research on Cancer). 1982. Inorganic fluorides used in drinking-water and dental preparations. Pp. 237-303 in IARC Monographs on the Evaluation of the Carcinogenic Risk of Chemicals to Humans, Vol. 27. Some Aromatic Amines, Anthraquinones and Nitroso Compounds, and Inorganic Fluorides Used in Drinking-Water and Dental Preparations. IARC, Lyon, France.
Krishnamachari, K.A.V.R. 1987. Fluorine. Pp. 365-415 in W. Mertz, ed. Trace Elements in Human and Animal Nutrition, Vol. 1. Academic Press, San Diego, Calif.
Kumpulainen, J., and P. Koivistoinen. 1977. Fluorine in foods. Residue Rev. 68:37-57.
Maheshwari, U.R., J.T. McDonald, V.S. Schneider, A.J. Brunetti, L. Leybin, E. Newbrun, and H.C. Hodge. 1981. Fluoride balance studies in ambulatory healthy men with and without fluoride supplements. Am. J. Clin. Nutr. 34:2679-2684.
Marier, J.R., and D. Rose. 1966. The fluoride content of some food and beverages: a brief survey using a modified Zr-SPADNS method. J. Food Sci. 31:941-946.
Martin, D.J. 1951. The Evanston Dental Caries Study. VIII. Fluorine content of vegetables cooked in fluorine containing waters. J. Dent. Res. 30:676-681.
Messer, H.H., W.D. Armstrong, and L. Singer. 1973. Influence of fluoride intake on reproduction in mice. J. Nutr. 103:1319-1326.
Milne, D.B., and K. Schwarz. 1974. Effect of different fluorine compounds on growth and bone fluoride levels in rats. Pp. 710-714 in W.G. Hoekstra, J.W. Suttie, H.E. Ganther, and W. Mertz, eds. Trace Element Metabolism in Animals, 2. University Park Press, Baltimore.
NRC (National Research Council). 1971. Fluorides. Report of the Committee on Biologic Effects of Atmospheric Pollutants. National Academy of Sciences, Washington, D.C. 295 pp.
Ophaug, R.H., L. Singer, and B.F. Harland. 1985. Dietary fluoride intake of 6-month and 2-year-old children in four dietary regions of the United States. Am. J. Clin. Nutr. 42:701-707.
Pak, C.Y.C., K. Sakhaee, J.E. Zerwekh, C. Parcel, R. Peterson, and K. Johnson. 1989. Safe and effective treatment of osteoporosis with intermittent slow release sodium fluoride: augmentation of vertebral bone. J. Clin. Endocrinol. Metab. 68:150-159.
Schwarz, K., and D.B. Milne. 1972. Fluorine requirement for growth in the rat. Bioinorg. Chem. 1:331-338.
Singer, L., R.H. Ophaug, and B.F. Harland. 1980. Fluoride intake of young male adults in the United States. Am. J. Clin. Nutr. 33:328-332.
Spak, C.J., J. Ekstrand, and D. Zylberstein. 1982. Bioavailability of fluoride added to baby formula and milk. Caries Res. 16:249-256.
Spak, C.J., L.I. Hardell, and P. deChateau. 1983. Fluoride in human milk. Acta Paediatr. Scand. 72:699-701.
Spencer, H., D. Osis, and M. Lender. 1981. Studies of fluoride metabolism in man: a review and report of original data. Sci. Total Environ. 17:1-12.
Subba Rao, G. 1984. Dietary intake and bioavailability of fluoride. Annu. Rev. Nutr. 4:115-136.
Tao, S., and J.W. Suttie. 1976. Evidence for a lack of an effect of dietary fluoride level on reproduction in mice. J. Nutr. 106:1115-1122.
Taves, D.R. 1983. Dietary intake of fluoride ashed (total fluoride) v. unashed (inorganic fluoride) analysis of individual foods. Br. J. Nutr. 49:295-301.
Taves, D.R., and W.S. Guy. 1979. Distribution of fluoride among body compartments. Pp. 159-185 in E. Johansen, D.R. Taves, and T.O. Olsen, eds. Continuing Evaluation of the Use of Fluorides. Westview Press, Boulder, Colo.
Walters, C.B., J.C. Sherlock, W.H. Evans, and J.I. Read. 1983. Dietary intake of fluoride in the United Kingdom and fluoride content of some foodstuffs. J. Sci. Food Agric. 34:523-528.
Weber, C.W., and B.L. Reid. 1974. Effect of low-fluoride diets fed to mice for six generations. Pp. 707-709 in W.G. Hoekstra, J.W. Suttie, H.E. Ganther, and W. Mertz, eds. Trace Element Metabolism in Animals, 2. University Park Press, Baltimore.
CHROMIUM

Trivalent chromium is required for maintaining normal glucose metabolism in laboratory animals; it acts as a cofactor for insulin (Mertz, 1969). Experimental chromium deficiency has been induced in several animal species, resulting in impaired glucose tolerance in the presence of normal concentrations of circulating insulin and, in severe cases, in a diabetes-like syndrome (Schroeder, 1966). Three cases of pronounced chromium deficiency have been reported in patients on long-term total parenteral alimentation (Brown et al., 1986; Freund et al., 1979; Jeejeebhoy et al., 1977); all three had in common a relative insulin resistance and peripheral or central neuropathy. Chromium-responsive impairment of glucose tolerance has been reported in malnourished children, in some but not all studies of mild diabetics, and in middle-aged subjects with impaired glucose tolerance. (For a review, see IPCS, 1988.) Chromium concentrations in human tissues decline with age, except in the lungs, where chromium accumulates. Parity, juvenile diabetes, and coronary artery disease are associated with low chromium concentrations in hair or serum (IPCS, 1988).

The intestinal absorption of dietary chromium at daily intakes of 40 µg and more is approximately 0.5% of the total amount present; intakes of less than 40 µg/day are absorbed with increasing efficiency, up to about 2% of the total (Anderson and Kozlovsky, 1985). Absorbed chromium is excreted almost completely through the urine.

Usual Intakes

Chromium intake from typical Western diets varies widely, from a low of 25 µg/day reported for elderly persons in England to approximately 200 µg/day in Belgian and Swedish diets, but in the most recent international studies (IPCS, 1988), intakes below 100 µg/day were reported.
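The absorption figures given above can be combined with these intake estimates to gauge absorbed amounts. A minimal sketch: the source reports only the two endpoints (about 2% efficiency at very low intakes, 0.5% at 40 µg/day and above), so the linear ramp between them is an assumption:

```python
def absorbed_chromium_ug(intake_ug: float) -> float:
    """Estimate absorbed chromium (µg/day) from dietary intake (µg/day)."""
    if intake_ug >= 40:
        efficiency = 0.005  # ~0.5% at intakes of 40 µg/day and above
    else:
        # Efficiency rises toward ~2% as intake falls; the linear shape
        # between the two reported endpoints is an assumption.
        efficiency = 0.02 - (0.02 - 0.005) * (intake_ug / 40)
    return intake_ug * efficiency

# Absorbed amounts for intakes mentioned in the text (µg/day)
for intake in (25, 50, 200):
    print(intake, round(absorbed_chromium_ug(intake), 3))
```

Note how the rising efficiency at low intakes partly compensates for a low diet: a 25 µg intake still yields roughly half the absorbed chromium of a 50 µg intake, rather than one quarter.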
Two experimental diets prepared to meet the RDAs for all nutrients and furnishing 2,800 kcal contained 62 and 89 µg of chromium at fat contents of 43 and 25% of energy, respectively (Kumpulainen et al., 1979). This is in contrast to the average chromium intakes of 33 and 25 µg/day from self-selected diets of adults in Beltsville, Maryland, containing 2,300 and 1,600 kcal, respectively (Anderson and Kozlovsky, 1985).

Estimated Safe and Adequate Daily Dietary Intakes

Because of the lack of methods to diagnose chromium status, it is difficult to estimate a chromium requirement. In the majority of all
chromium supplementation studies in the United States, at least half the subjects with impaired glucose tolerance improved upon chromium supplementation, suggesting that the lower ranges of chromium intakes from typical U.S. diets are not optimal with regard to chromium nutriture (Anderson et al., 1983).

Experiments in vitro and in animals have demonstrated substantial differences in the biological activity of different chromium compounds. Although the chemical forms of chromium in foods are not known with certainty, a chromium-dinicotinic acid-glutathione complex with high bioavailability has been identified in brewer's yeast. The bioavailability of chromium in calf's liver, American cheese, and wheat germ is also relatively high. More precise data on the nutritional value of chromium in various foods are not yet available. Thus, the best assurance of an adequate and safe chromium intake is the consumption of a varied diet balanced with regard to other essential nutrients.

A range of chromium intakes between 50 and 200 µg/day is tentatively recommended for adults. This range is based on the absence of signs of chromium deficiency in the major part of the U.S. population consuming an average of 50 µg/day. The safety of an intake of 200 µg has been established in long-term supplementation trials in human subjects receiving 150 µg/day in addition to the dietary intake (Glinsmann and Mertz, 1966). Habitual dietary intakes of around 200 µg/day have been reported in several studies; no adverse effects of such intakes are known. The suggested range of chromium intake is predicated on the assumption that a varied diet providing an adequate intake of other essential micronutrients will furnish chromium with an average absorbability of 0.5%. The tentative recommendations for younger age groups are derived by extrapolation on the basis of expected food intake.
Until more precise recommendations can be made, the consumption of a varied diet, balanced with regard to other essential nutrients, remains the best assurance of an adequate and safe chromium intake.

Excessive Intakes and Toxicity

The toxicity of trivalent chromium, the chemical form that occurs in diets, is so low that there is a substantial margin of safety between the amounts normally consumed and those considered to have harmful effects. No adverse effects were seen in rats and mice consuming 5 mg/liter in drinking water throughout their lifetimes, and no toxicity was observed in rats exposed to 100 mg/kg in the diet (IPCS, 1988).
An increased incidence of bronchial cancer has been correlated with chronic occupational exposure of workers to dusts containing chromate, the hexavalent form of chromium (IPCS, 1988). Humans cannot oxidize the nontoxic trivalent food chromium to the potentially carcinogenic hexavalent chromate compounds. Therefore, the carcinogenicity of certain chromates bears no relevance to the nutritional role of trivalent chromium.

References

Anderson, R.A., and A.S. Kozlovsky. 1985. Chromium intake, absorption, and excretion of subjects consuming self-selected diets. Am. J. Clin. Nutr. 41:1177-1183.
Anderson, R.A., M.M. Polansky, N.A. Bryden, E.E. Roginski, W. Mertz, and W.H. Glinsmann. 1983. Chromium supplementation of human subjects: effects on glucose, insulin, and lipid variables. Metabolism 32:894-899.
Brown, R.O., S. Forloines-Lynn, R.E. Cross, and W.D. Heizer. 1986. Chromium deficiency after long-term total parenteral nutrition. Dig. Dis. Sci. 31:661-664.
Freund, H., S. Atamian, and J.E. Fischer. 1979. Chromium deficiency during total parenteral nutrition. J. Am. Med. Assoc. 241:496-498.
Glinsmann, W.H., and W. Mertz. 1966. Effect of trivalent chromium on glucose tolerance. Metabolism 15:510-520.
IPCS (International Programme on Chemical Safety). 1988. Chromium. Environmental Health Criteria 61. World Health Organization, Geneva.
Jeejeebhoy, K.N., R.C. Chu, E.B. Marliss, G.R. Greenberg, and A. Bruce-Robertson. 1977. Chromium deficiency, glucose intolerance and neuropathy reversed by chromium supplementation in a patient receiving long-term total parenteral nutrition. Am. J. Clin. Nutr. 30:531-538.
Kumpulainen, J.T., W.R. Wolf, C. Veillon, and W. Mertz. 1979. Determination of chromium in selected United States diets. J. Agric. Food Chem. 27:490-494.
Mertz, W. 1969. Chromium occurrence and function in biological systems. Physiol. Rev. 49:163-239.
Schroeder, H.A. 1966. Chromium deficiency in rats: a syndrome simulating diabetes mellitus with retarded growth. J. Nutr. 88:439-445.

MOLYBDENUM

Molybdenum plays a biochemical role as a constituent of several mammalian enzymes, such as aldehyde oxidase, xanthine oxidase, and sulfite oxidase (Rajagopalan, 1988); however, it has been difficult to produce characteristic pathological lesions in animals through nutritional molybdenum deficiency (Mills and Bremner, 1980). Although naturally occurring deficiency, uncomplicated by antagonists, is not known with certainty, molybdenum deficiency has been produced experimentally in goats by feeding them purified rations containing less than 0.07 µg of molybdenum per gram of diet. The consequences are retarded weight gain, decreased food consumption,
impaired reproduction, and shortened life expectancy (Anke et al., 1985). On the basis of these studies, Anke et al. (1985) suggested that the minimum molybdenum requirement of goats was approximately 100 µg/kg of ration dry matter. However, the molybdenum requirement of monogastric species may be less than that of ruminants because of the molybdenum needs of rumen microflora (Anke et al., 1985).

Two recent lines of investigation have suggested a role for molybdenum in human nutrition. The first involved a patient on long-term total parenteral nutrition who developed a variety of symptoms, including amino acid intolerance, irritability, and, ultimately, coma (Abumrad et al., 1981). This person also displayed hypermethioninemia, increased urinary excretion of xanthine and sulfite, and decreased urinary excretion of uric acid and sulfate. Treatment with 300 µg of ammonium molybdate (equivalent to about 163 µg of molybdenum) per day resulted in clinical improvement and normalization of sulfur metabolism and uric acid production. The authors concluded that this may be the first report of a feeding-induced molybdenum deficiency in humans.

The second line of evidence concerns a rare inborn error of metabolism that leads to a combined deficiency of sulfite oxidase and xanthine dehydrogenase (Rajagopalan, 1988). This metabolic disease is due to a lack of the molybdenum cofactor (molybdopterin), which is an essential constituent of these enzymes. Patients with this defect display severe neurological dysfunction, dislocated ocular lenses, and mental retardation. Biochemical abnormalities are similar to those observed in the intravenously fed patient cited above. Structural characterization of molybdopterin indicates the presence of a reduced pterin ring, a 4-carbon side chain containing an enedithiol, and a terminal phosphate ester (Rajagopalan, 1988).
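The dose equivalence quoted above (300 µg of "ammonium molybdate" corresponding to about 163 µg of molybdenum) is consistent with the molybdenum mass fraction of ammonium heptamolybdate tetrahydrate, the common laboratory salt; treating the salt as that specific compound is an assumption, but under it the arithmetic checks out:

```python
# Check the 300 µg ammonium molybdate -> ~163 µg Mo conversion, assuming the
# salt is ammonium heptamolybdate tetrahydrate, (NH4)6Mo7O24.4H2O (a common
# laboratory form; the simple molybdate (NH4)2MoO4 would give ~147 µg instead).
ATOMIC_MASS = {"N": 14.007, "H": 1.008, "Mo": 95.95, "O": 15.999}

# Atom counts in (NH4)6Mo7O24.4H2O
counts = {"N": 6, "H": 6 * 4 + 4 * 2, "Mo": 7, "O": 24 + 4}

molar_mass = sum(ATOMIC_MASS[el] * n for el, n in counts.items())
mo_fraction = 7 * ATOMIC_MASS["Mo"] / molar_mass  # ~0.543

dose_ug = 300
mo_ug = dose_ug * mo_fraction
print(round(mo_ug))  # 163
```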
Dietary Sources and Usual Intakes

The concentration of molybdenum in food varies considerably, depending on the environment in which the food was grown (Mills and Davis, 1987). Therefore, tabulations of expected molybdenum concentrations in various foods are of limited value (Chappell, 1977). Tsongas et al. (1980) calculated that the dietary intake of molybdenum in the United States ranged from 120 to 240 µg/day, depending on age and sex, and averaged about 180 µg/day. The foods that contributed the most to molybdenum intake were milk, beans, breads, and cereals. Pennington and Jones (1987) found a lower molybdenum content in the 1984 collection of the Food and Drug
Administration's Total Diet Study, ranging from 76 to 109 µg/day for adult females and males, respectively. Human milk contains very low levels of molybdenum and, after the first month of lactation, furnishes only approximately 1.5 µg/day (Casey and Neville, 1987). Little is known about the chemical form or nutritional bioavailability of molybdenum in foods. Most public water supplies would be expected to contribute 2 to 8 µg of molybdenum per day (NRC, 1980), which would constitute 10% or less of the lower limit of the provisional recommended intake. Since most diets should meet the molybdenum requirements of humans, supplements of additional molybdenum are not recommended.

Estimated Safe and Adequate Daily Dietary Intakes

Aside from the exceptions discussed above, no disturbances in uric acid or sulfate production have ever been related to molybdenum deficiency in humans. The human requirement apparently is so low that it is easily furnished by common U.S. diets. Therefore, the provisional recommended range for the dietary intake of molybdenum, based on average reported intakes, is set at 75 to 250 µg/day for adults and older children. The range for other age groups is derived through extrapolation on the basis of body weight.

Excessive Intakes and Toxicity

The toxicity of molybdenum presents a substantial problem for animal nutrition because molybdenum is antagonistic to the essential element copper (Mills and Davis, 1987). Adverse effects of high environmental concentrations of molybdenum were observed in humans living in a province of the USSR (Koval'skiy and Yarovaya, 1966). The authors suggested that the excessive dietary intake of 10 to 15 mg/day may be the cause of a high incidence of a goutlike syndrome associated with elevated blood levels of molybdenum, uric acid, and xanthine oxidase. Even a moderate dietary exposure of 0.54 mg/day has been associated with loss of copper in the urine (Deosthale and Gopalan, 1974).
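The characterization of drinking water as a minor molybdenum source, given under the intake estimates above, follows directly from the figures cited (2 to 8 µg/day from water versus a 75 µg/day lower limit of the provisional range); a quick sketch of that fraction:

```python
# Fraction of the provisional lower limit (75 µg/day of molybdenum) that
# public water supplies would contribute (2 to 8 µg/day, NRC, 1980).
lower_limit_ug = 75
water_low_ug, water_high_ug = 2, 8

frac_low = water_low_ug / lower_limit_ug
frac_high = water_high_ug / lower_limit_ug
print(f"{frac_low:.1%} to {frac_high:.1%}")  # roughly 3% to 11%
```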
References

Abumrad, N.N., A.J. Schneider, D. Steel, and L.S. Rogers. 1981. Amino acid intolerance during prolonged total parenteral nutrition reversed by molybdate therapy. Am. J. Clin. Nutr. 34:2551-2559.
Anke, M., B. Groppel, and M. Grun. 1985. Essentiality, toxicity, requirement and supply of molybdenum in humans and animals. Pp. 154-157 in C.F. Mills, I. Bremner, and J.K. Chesters, eds. Trace Elements in Man and Animals (TEMA 5). Commonwealth Agricultural Bureaux, Slough, United Kingdom.
Casey, C.E., and M.C. Neville. 1987. Studies in human lactation 3: molybdenum and nickel in human milk during the first month of lactation. Am. J. Clin. Nutr. 45:921-926.
Chappell, W.R., ed. 1977. Proceedings: Symposium on Molybdenum in the Environment, Vol. 2. Marcel Dekker, New York. 812 pp.
Deosthale, Y.G., and C. Gopalan. 1974. The effect of molybdenum levels in sorghum (Sorghum vulgare Pers.) on uric acid and copper excretion in man. Br. J. Nutr. 31:351-355.
Koval'skiy, V.V., and G.A. Yarovaya. 1966. Molybdenum-infiltrated biogeochemical provinces. Agrokhimiya 8:68-91.
Mills, C.F., and I. Bremner. 1980. Nutritional aspects of molybdenum in animals. Pp. 517-542 in M.P. Coughlan, ed. Molybdenum and Molybdenum-Containing Enzymes. Pergamon Press, Oxford.
Mills, C.F., and G.K. Davis. 1987. Molybdenum. Pp. 429-463 in W. Mertz, ed. Trace Elements in Human and Animal Nutrition, 5th ed., Vol. 1. Academic Press, Orlando, Fla.
NRC (National Research Council). 1980. The contribution of drinking water to mineral nutrition in humans. Pp. 265-404 in Drinking Water and Health, Vol. 3. Safe Drinking Water Committee, Board on Toxicology and Environmental Health Hazards, Assembly of Life Sciences. National Academy Press, Washington, D.C.
Pennington, J.A.T., and J.W. Jones. 1987. Molybdenum, nickel, cobalt, vanadium, and strontium in total diets. J. Am. Diet. Assoc. 87:1644-1650.
Rajagopalan, K.V. 1988. Molybdenum: an essential trace element in human nutrition. Annu. Rev. Nutr. 8:401-427.
Tsongas, T.A., R.R. Meglen, P.A. Walravens, and W.R. Chappell. 1980. Molybdenum in the diet: an estimate of average daily intake in the United States. Am. J. Clin. Nutr. 33:1103-1107.