Because of rapid growth, children from infancy through adolescence, as well as pregnant women, may fail to consume sufficient iron to meet their needs.
Iron absorption is affected by many factors. Heme iron is present in meats, poultry, and fish and is more efficiently absorbed than inorganic (nonheme) iron, which is found in plant as well as animal foods. Ascorbic acid facilitates the absorption of nonheme iron, but dietary fiber, phytates, and certain trace elements may diminish it. Food composition data provide no indicators concerning the efficiency with which the body absorbs iron from a given food. The publication Recommended Dietary Allowances (NRC, 1980) provides directions on how to calculate available iron.
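The distinction between heme and nonheme absorption described above underlies the available-iron calculation referenced in the 1980 RDA publication. A minimal sketch of such a meal-based availability model is given below; the absorption rates, the "enhancing unit" cutoffs, and the function name are illustrative assumptions, not the authoritative values from that publication.

```python
def available_iron(nonheme_mg, heme_mg, mpf_g, ascorbic_mg):
    """Estimate absorbable iron (mg) from a single meal.

    Sketch of a Monsen-style availability model: nonheme absorption
    rises with enhancing factors (meat/poultry/fish and ascorbic acid),
    while heme iron is absorbed at a fixed, much higher rate.
    All numeric values here are assumptions for illustration.
    """
    # "Enhancing units": grams of cooked meat/poultry/fish plus
    # milligrams of ascorbic acid consumed in the same meal.
    units = mpf_g + ascorbic_mg
    if units < 25:
        nonheme_rate = 0.03   # low-availability meal
    elif units <= 75:
        nonheme_rate = 0.05   # medium-availability meal
    else:
        nonheme_rate = 0.08   # high-availability meal
    heme_rate = 0.23          # heme iron is absorbed far more efficiently
    return nonheme_mg * nonheme_rate + heme_mg * heme_rate
```

For example, a meal supplying 3 mg nonheme iron, 1 mg heme iron, 90 g of meat, and 50 mg ascorbic acid would be scored as high availability under these assumed cutoffs, yielding roughly 0.47 mg of absorbable iron.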
The availability of iron in the food supply has increased since 1909, chiefly because of the enrichment of flour and cereal products. The 1977-1978 Nationwide Food Consumption Survey (NFCS) indicates that, on average, respondents of both sexes from 1 to 18 years old and females from 19 to 64 years old failed to meet their RDA for iron (USDA, 1984). The Continuing Survey of Food Intakes by Individuals (CSFII) conducted in 1985 and 1986 (USDA, 1987a,b) supports these findings. By itself, however, failure to meet the RDA is not an indicator of poor iron status.
Using data from the National Health and Nutrition Examination Survey (NHANES II), conducted from 1976 to 1980, an expert scientific group of the Federation of American Societies for Experimental Biology (FASEB) assessed iron status (LSRO, 1984a). Five indicators in three different models were used in the assessment. A relatively high prevalence of impaired iron status was found in children 1 to 2 years old, males 11 to 14 years old, and females 25 to 44 years old. Among those whose incomes were below poverty level, impaired iron status was highest in children 1 to 5 years old and females 25 to 54 years old (LSRO, 1984a).
Epidemiologic and Clinical Studies
Iron deficiency is a risk factor for the Plummer-Vinson (Paterson-Kelly) syndrome, which was once common in parts of Sweden but has been almost eliminated through improved nutritional status, especially with regard to dietary iron and vitamins (Larsson et al., 1975; Wynder et al., 1957). This condition is associated with an increased risk for cancers of the upper alimentary tract, especially the esophagus and stomach, suggesting that the underlying iron deficiency might be one of the factors that contribute to the occurrence of these cancers. However, epidemiologic studies have not implicated low dietary iron intake per se as a risk factor for cancers at these sites (Schottenfeld and Fraumeni, 1982).
In a correlation analysis of nutrition survey data and cancer mortality rates for 11 regions of the Federal Republic of Germany, Böing et al. (1985) found a positive association between estimated iron intake and mortality from colorectal and pancreatic cancer in men and from gallbladder cancer in women. In a prospective cohort of 21,513 Chinese men in Taiwan, ferritin levels were considerably higher in men over age 50 who developed cancer, especially primary hepatocellular carcinoma (PHC), than in controls without cancer, whereas serum transferrin levels were lower in the men who developed cancer (excluding PHC) (Stevens et al., 1986). These findings probably reflect an association of cancer risk with increased body iron stores, although iron stores were not directly assessed.
Occupational inhalation exposure to iron oxides has been associated with an increased risk for lung cancer in hematite miners and foundry workers (Kazantzis, 1981). In these occupational settings, however, there were other exposures to carcinogens, including ionizing radiation, polycyclic aromatic hydrocarbons (PAHs), and cigarette smoke. Thus, the increased cancer risk cannot be attributed specifically to iron (Doll, 1981; Kazantzis, 1981).
Clinical studies of patients with idiopathic hemochromatosis, a condition that includes abnormal deposition of iron in the liver and frequently cirrhosis, show a highly increased risk for hepatocellular carcinoma (Ammann et al., 1980; Bomford and Williams, 1976; Strohmeyer et al., 1988).
Overall, these studies in humans do not provide strong evidence for a role of iron exposure, whether by diet or by other routes, in the etiology of human cancer.
Iron-deficient rats given 1,2-dimethylhydrazine (DMH) developed neoplastic liver lesions within 4 months, compared to 6 months in an iron-sufficient group (Vitale et al., 1978). The authors noted that severe lack of iron appeared to promote carcinogenesis.
The effect of iron deficiency on tumor growth and host survival was studied in BALB/c mice with transplanted Merwin Plasma Cell-II tumors (Ben-