2
Key Elements in the Design and Implementation of Micronutrient Interventions

The 1991 Montreal Conference1 focused worldwide attention on "Hidden Hunger." Not only were millions of individuals affected by deficiencies of vitamin A, iron, and iodine, but solutions to these micronutrient deficiencies were technologically possible. Since the 1991 conference, a number of donors have committed substantial financial resources to solving the problem of "hidden hunger." Other efforts, including the 1992 International Conference on Nutrition,2 have helped stimulate national efforts to analyze nutritional problems and the resources available for their solution. Less attention, however, has been devoted to understanding the key elements needed to implement and sustain a micronutrient intervention on a fully operational scale (national, regional, or community), as opposed to a pilot project scale.

Experience to date has shown that "how" an intervention is implemented may be as important as, or in some cases more important than, "what" is implemented. Some research has already been conducted on elements of successful nutrition interventions generally; in 1989, the USAID International Nutrition Planners Forum held a workshop in Seoul, South Korea, on "Elements of Successful Community Nutrition Programs."3 Similarly, a World Bank-funded project entitled "Successful Nutrition Programs in Africa: What Makes Them Work?"4 evaluated elements of effective nutrition interventions in the region.

1. "Ending Hidden Hunger" UNICEF/WHO Conference, Montreal, Canada, October 1991.

2. International Conference on Nutrition, FAO/WHO, Rome, Italy, December 1992.

3. International Nutrition Planners Forum Workshop, "Elements of Successful Community Nutrition Programs," Seoul, Republic of Korea, August 1989.

4. Kennedy, E. 1991. Nutrition Interventions in Africa: What Makes Them Work? World Bank Working Series in Population, Health, and Nutrition. Washington, D.C.








Correspondingly little work, however, has been done on the elements of effective micronutrient interventions. While the design and implementation elements of effective micronutrient interventions may be identical to those of macronutrient nutrition interventions in general, it is worth taking a systematic look at the factors that account for, or constrain, their effectiveness.

This chapter draws on the discussions of the two working groups at the 5–7 December 1996 workshop (see the Appendix for the agenda). Working Group I was charged with evaluating past experience with approaches to the prevention or correction of micronutrient malnutrition. The approaches examined included food-based strategies such as dietary change and fortification, supplementation, and other public health measures, including parasite control and delayed umbilical cord ligation. Working Group II looked more broadly at the major elements of success and constraint in past programs.

This chapter first describes the importance of iron, vitamin A, and iodine to health. It then considers options for successful interventions based on the level of development of the target country. The costs of interventions are also briefly reviewed.

In drafting this summary, the committee followed two general rules. First, while elements of past experience may differ among the specific micronutrients, the committee paid special attention to successful examples of strategies incorporating more than one micronutrient or including improvements in public health measures. Second, the committee and workshop participants agreed to base all findings and recommendations in this report on the data provided in three background papers, because these documents provided the substantive basis for discussion at the workshop. "Conventional wisdom" was not considered a sound basis for judgment in the absence of acceptable evidence.
To streamline discussion, no references are provided in this chapter; the interested reader is encouraged to consult the supporting papers.

The Importance of Iron, Vitamin A, and Iodine to Health

The health and vitality of human beings depend on a diet that includes adequate amounts of certain vitamins and minerals that promote effective functioning of physiologic processes, including reproduction, immune response, brain and other neural functions, and energy metabolism. The body needs relatively minute quantities of these elements—measured in micrograms or milligrams—thus supporting their description as micronutrients. These elements are essential; they cannot be manufactured by the human body and must be obtained through the diet.

Deficiencies of most micronutrients are known to have devastating effects on health. They increase the risk of overall mortality and are associated with a variety of adverse health effects, including poor intellectual development and cognition,

decreased immunity, and impaired work capacity. The adverse effects of micronutrient malnutrition are most severe for children, pregnant women, and the fetus.

This report focuses on lessons learned from past interventions to address iron, vitamin A, and iodine malnutrition. The committee decided to limit its evaluation to these three micronutrients because it felt there was adequate experience with each and because iron, vitamin A, and iodine deficiencies alone are responsible for significant global mortality and morbidity. While the discussion relates to these three micronutrients, workshop participants agreed that the lessons learned for improving future intervention strategies would also be applicable to the prevention and control of malnutrition created by deficiencies of other micronutrients.

Iron

Iron is present in both heme and nonheme forms in the diet. Heme iron, the most bioavailable form, is found in the greatest quantities in animal sources such as red meat. Normal individuals absorb between 20 and 30 percent of dietary heme iron, while iron-deficient subjects absorb between 40 and 50 percent. Nonheme iron—which is absorbed less efficiently than heme iron—is most abundant in other sources of iron, including eggs and all vegetable roots, seeds, leaves, and fruits. Nonheme iron is also present in heme-iron-containing and other animal tissues. Nonheme iron generally constitutes over 90 percent of dietary iron, particularly in the developing world.

In contrast with heme iron, nonheme iron absorption is enhanced or inhibited by many dietary constituents. Heme-iron-containing proteins; ascorbic, malic, tartaric, and succinic acids; and many fermentation products are enhancers. Meat and alcohol, by promoting gastric acid production, also enhance nonheme iron absorption.
Inhibitors include fiber; phytic acid and other polyphosphates; calcium; manganese; polyphenols such as tannins and other compounds present in foods (seeds, other plant components, and many condiments); and beverages (e.g., tea, herbal infusions, coffee, and chocolate).

Iron deficiency has serious adverse consequences for health; the most evident is anemia. About one billion people worldwide suffer from clinical anemia. Severe anemia causes as many as one in five maternal deaths and is a major cause of childhood mortality in many developing countries. Other consequences of iron deficiency are impaired physical growth; potentially permanent adverse effects on neurological functions involving cognition, emotional behavior, reaction to and reception of stimuli, attention span, learning capacity, and neuromotor development and function; decreased capacity for physical work; lowered immunity, resulting in increased susceptibility to infections; and alterations in the reproductive process.
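The arithmetic behind these absorption figures can be made concrete with a small sketch. The 20–30 percent heme absorption range and the observation that nonheme iron is over 90 percent of dietary iron come from the text above; the 5 percent nonheme absorption rate, the 12 mg daily intake, and the function name are hypothetical placeholders for illustration only, since the chapter gives no single nonheme figure (absorption varies with the enhancers and inhibitors in the meal).

```python
# Illustrative sketch (not from the report): bounding daily absorbed iron
# for a normal individual. Heme absorption of 20-30% is from the text;
# the 5% nonheme rate and 12 mg intake are HYPOTHETICAL assumptions.

def absorbed_iron_mg(total_mg, heme_fraction, heme_abs, nonheme_abs=0.05):
    """Absorbed iron (mg) = heme iron * heme rate + nonheme iron * nonheme rate."""
    heme_mg = total_mg * heme_fraction
    nonheme_mg = total_mg * (1 - heme_fraction)
    return heme_mg * heme_abs + nonheme_mg * nonheme_abs

# A diet where nonheme iron is ~90% of a 12 mg daily intake: even at 30%
# absorption, the small heme share contributes less than the nonheme bulk.
low = absorbed_iron_mg(12, 0.1, 0.20)
high = absorbed_iron_mg(12, 0.1, 0.30)
print(round(low, 2), round(high, 2))
```

The point of the sketch is structural rather than numerical: because nonheme iron dominates the diet, total absorption is driven mainly by the nonheme rate, which is exactly the quantity the dietary enhancers and inhibitors above act upon.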

Vitamin A

Vitamin A activity is found in fruits and vegetables that contain green and yellow provitamin A carotenoid pigments, and as the preformed vitamin in liver and breast milk. Humans need less than 1 milligram of vitamin A a day to maintain health, yet in 1995 it was estimated that 3 million children annually exhibit xerophthalmia—that is, they are clinically vitamin A-deficient and at risk of blindness. An additional 250 million children under 5 years of age were estimated to be subclinically vitamin A-deficient (based on the prevalence of serum retinol distributions below 0.70 µmol/L) and at risk of severe morbidities and premature death. These estimates do not include pregnant and lactating women living in areas of childhood vitamin A deficiency (VAD) endemicity, who are also likely to be in poor status but for whom epidemiological data are quite limited. A high prevalence of maternal night blindness and low breast milk levels of vitamin A are reported in such areas. A lack of sensitive, survey-applicable, nonclinical indicators specific to VAD, however, has hampered population-based evaluation of status among reproductive-age women and other age and sex groups.

Iodine

Iodine must be obtained from the environment, but it has been depleted from the soil and water in many areas of the world. WHO has estimated that over 1.5 billion persons in the world reside in regions of environmental iodine deficiency and are at risk of iodine deficiency disorders (IDD). The only recognized role of iodine in mammalian biology is as a component of the thyroid hormones, although there are data suggesting that iodine deficiency may be involved in fibrocystic disease of the breast. IDD is associated with goiter, cretinism, mental and neuromotor retardation, and reproductive impairment. Fetal and pre- and postnatal survival are also reduced by iodine deficiency.

The Continuum of Population Risk

The nutrition status of all populations is in flux.
Groups are in continuous movement along a continuum of nutritional risk, extending from a situation of severe micronutrient malnutrition, through a wide spectrum of presumed nutrient adequacy, to one of nutrient overload and toxicity. The latter state is not emphasized in this report.

The committee has identified four levels of population risk. Level IV (severe deficiency) is characterized by populations with severe micronutrient malnutrition. These populations can be said to be in public health crisis with respect to vitamin A and iodine deficiency when clinical

manifestations (xerophthalmia and goiter) are prevalent at the levels indicated in Table 2-1. Immediate therapeutic and prophylactic programs are needed. There is a lack of consensus on the prevalence cutoff that determines the urgency of therapeutic public health programs for anemia defined by hemoglobin level alone.5

Level III (moderate deficiency) is characterized by populations with moderate to severe micronutrient malnutrition, where preventive or therapeutic public health programs geared to the level of severity are appropriate. This scenario is most frequently encountered in developing countries, but it can be found in regions of industrial countries as well.

Level II (mild and widespread deficiency) is characterized by populations with mild micronutrient malnutrition. This scenario is encountered in both developing and industrialized countries.

Level I (mild and clustered deficiency) is characterized by only selected, usually deprived, populations affected by micronutrient malnutrition. This scenario is most frequently encountered in regions of industrial countries.

Table 2-1 provides the criteria for classification of populations to Levels IV to I for iron, vitamin A, and iodine status. In using this framework, program funders—and implementers—should note that populations within a single country may be at different levels of risk for these three micronutrients. For example, a given country may have populations at Level II with respect to iodine status, while the same or other populations are at Level III with respect to iron status. The criteria in Table 2-1 are based on prevalence rates of specific clinical and subclinical signs of iron, vitamin A, and iodine deficiency in specified subpopulations.
In field studies, the standard clinical signs for deficiency of iron, vitamin A, and iodine, respectively, are: anemia in any high-risk group; xerophthalmia, including night blindness, in preschool-age children; and presence of goiter in school-age children as determined by palpation or ultrasound. The subclinical indicators for deficiency of iron, vitamin A, and iodine, respectively, are: an iron deficiency indicator (usually serum ferritin); serum retinol level; and median urinary iodine concentration. The highest level of population risk assigned by a prevalence value takes precedence.

5. Hemoglobin is a biochemical measurement that in some populations may not closely correlate with adverse health consequences at the cutoff used for defining anemia (e.g., when the hemoglobin distribution curve is narrow, reflecting most values at the cutoff or only slightly below). When the distribution curve is skewed to the left, the risk of adverse health effects and the need for public health interventions become increasingly urgent. Level IV, therefore, refers to a situation in which the hemoglobin distribution is substantially skewed toward the left.

TABLE 2-1 Population Prevalence of Clinical and Subclinical Signs of Iron, Vitamin A, or Iodine Deficiency by Level of Population Risk

Level of Population Risk             Indicator        Iron            Vitamin A      Iodine
IV (severe deficiency)               Clinical (a)     >40% or         >5% or         >30% or
                                     Subclinical (b)  >80–100%        >20%           >99%
III (moderate to severe deficiency)  Clinical         >20 to <40% or  >1 to <5% or   >20 to <30% or
                                     Subclinical      >50 to <80%     >10 to <20%    >50 to <99%
II (mild and widespread deficiency)  Clinical         >12 to <20% or  >0 to <1% or   >5 to <20% or
                                     Subclinical      >30 to <50%     >2 to <10%     >20 to <50%
I (mild and clustered deficiency)    Clinical         <5% and         0 and          <5% and
                                     Subclinical      <12%            <2%            <20%

NOTE: The highest level assigned by a prevalence value takes precedence. For example, a population with 15% of its people showing clinical signs of iron deficiency (a Level III designation), but with 60% of its people showing subclinical signs of iron deficiency (a Level IV designation), would be classified as Level IV.

a Clinical: iron = prevalence of anemia in any high-risk group; vitamin A = prevalence of xerophthalmia, including night blindness, in preschool-age children; iodine = prevalence of goiter in school-age children as determined by palpation or ultrasound.

b Subclinical: iron = prevalence of iron deficiency indicator below cutoff (usually serum ferritin); vitamin A = prevalence of serum retinol levels <0.7 µmol/L; iodine = prevalence of median urinary iodine values <100 µg/L.

SOURCES: WHO/UNICEF/UNU. 1997. Indicators for Assessing, and Strategies for Preventing, Iron Deficiency (WHO/NUT/96.12). Geneva: WHO. WHO. 1996. Indicators for Assessing Vitamin A Deficiency and Their Application in Monitoring and Evaluating Intervention Programs (WHO/NUT/96.10). Geneva: WHO. WHO/UNICEF/ICCIDD. 1994. Indicators for Assessing Iodine Deficiency Disorders and Their Control Through Salt Iodization (WHO/NUT/94.6). Geneva: WHO.
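The precedence rule stated in the table note can be sketched in code. The thresholds below are taken from the iron column of Table 2-1 as printed; the function name and the treatment of prevalences that fall in the gaps between the printed bands are assumptions of this illustration, not part of the report.

```python
# Sketch of the Table 2-1 precedence rule, using the iron column as printed.
# The function name and the handling of values falling between the printed
# bands are assumptions of this illustration.

def classify_iron_risk(clinical_pct, subclinical_pct):
    """Return the population risk level: 4 (severe) down to 1 (mild/clustered)."""
    # (level, clinical threshold %, subclinical threshold %): a prevalence
    # strictly above either threshold assigns at least that level, and the
    # highest level assigned by any prevalence value takes precedence.
    bands = [
        (4, 40, 80),  # Level IV: clinical >40% or subclinical >80%
        (3, 20, 50),  # Level III: clinical >20% or subclinical >50%
        (2, 12, 30),  # Level II: clinical >12% or subclinical >30%
    ]
    for level, clinical_cut, subclinical_cut in bands:
        if clinical_pct > clinical_cut or subclinical_pct > subclinical_cut:
            return level
    return 1  # Level I (values in the gaps between printed bands land here)

# 15% clinical prevalence alone would assign Level III, but an 85%
# subclinical prevalence assigns Level IV, so the higher level wins.
print(classify_iron_risk(15, 85))  # 4
print(classify_iron_risk(15, 35))  # 2
```

The same scan-from-the-top structure applies to the vitamin A and iodine columns; only the thresholds change.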

The rationale for classifying populations across the three micronutrients examined in this report is an important one. Although the interventions for the three will differ, it was the consensus of the workshop participants that micronutrient deficiencies tend to cluster in populations. Thus, intervention programs should always consider strategies that meet population needs with respect to multiple deficiencies where they may exist. While interventions on a single micronutrient may, in certain instances, be appropriate (for example, universal salt iodization, or USI), the committee believes that strategies that focus only on a single micronutrient, without consideration of other micronutrient needs, should no longer be supported without careful consideration and justification.

In addition, the focus on both clinical and subclinical signs as criteria for determining the severity of micronutrient status is a deliberate one. Workshop participants concluded that the prevailing focus of past intervention efforts on frank clinical signs of micronutrient malnutrition as a basis for determining need—while important for their severe health consequences and for serving as indices of risk in the community—has often resulted in neglect of the vast hidden problem of subclinical micronutrient malnutrition. Subclinical problems contribute to slowed human capital development and stagnant national economic development through reduced mental and work capacities and premature deaths. Future efforts should therefore place particular emphasis on the measurement, control, and ultimate prevention of subclinical deficiencies.

The purpose of any intervention program is to move at-risk populations along the continuum of risk, from higher to lower levels of risk (toward Level I).
As this chapter will demonstrate, experience has shown that the timeframe for movement will depend both on the context of overall national development of the target population's country and on the mix of interventions selected within that context.

Options for Successful Interventions

Past experience with strategies directed toward correction of iron, vitamin A, and iodine deficiencies demonstrates that there is a "toolchest" of potentially effective, complementary interventions available to both government and the private sector. Workshop participants agreed, however, that the key to past programmatic successes has not been just the availability of these tools—many programs have faltered even with their use. Experience suggests that it is the selection and adoption of the right mix of tools for a particular country or regional setting that can ensure success. Equally important, evaluation of limited past experience suggests that it is the persons and organizations using the tools that provide the critical means for building upon and extending the initial basis for success. Well-cited instances (see, for example, the description of the Thailand ivy gourd project in Chapter 4) support the contention that it is important that the people using the tools—ideally, providers at the local, regional, and

country level—have a say in their choice, are educated as to their strengths and limitations in different circumstances, and are assured the means and capabilities to maintain the tools long after their donors have moved elsewhere. This may be achieved by building local "ownership," as would be done for many food-based or supplementation interventions, or by utilizing a viable industrial base and market, as might be the case for fortification. Although the contention is logical and popular, workshop participants agreed that it requires additional confirmation, given its limited testing and the considerable additional costs associated with securing and maintaining broad-based community involvement.

Availability of the toolchest alone, however, has not been sufficient to ensure program success. Workshop participants agreed that careful consideration and application of design and management strategies suited to local conditions and needs are critical to success. Unfortunately, these strategies have not often been given adequate consideration in the design of past interventions.

The following sections describe some of the major tools available in the fight against micronutrient malnutrition. It should be emphasized that each has its own strengths, limitations, and domain of applicability, and together they provide a powerful mix of options for improving micronutrient status in a population over varying periods of time.

Supplementation

Supplementation refers to the addition of pharmaceutical preparations of nutrients—capsules, tablets, or syrups—to the diet. Research has shown supplementation of adequate dosage and duration to be efficacious in treating, correcting, and preventing deficiencies of iron, vitamin A, and iodine in groups in which there are serious health problems.
A major challenge is to scale up supplementation to a program level that achieves adequate target-group coverage, dosage, and frequency of dosing, since effectiveness reflects the combined impact of efficacy and the process of implementation. Supplementation has traditionally been considered "short-term," although it may usefully continue until effective alternatives are in place.

Iron supplementation and iron therapy are currently part of national health programs in the majority of developing countries and in many industrial countries. Periodic distribution of high-dose vitamin A supplements, either universally to all children of a specified age range or targeted to high-risk groups, has been the most widely applied intervention for treatment, prevention, and control of VAD. Iodine supplementation (iodized oil by injection, drops of Lugol's solution, and tablets of iodine salts, sometimes disguised with chocolate) has been used less frequently than supplementation for iron and vitamin A, but it can be an effective stopgap measure in populations with severe iodine deficiency until salt iodization becomes effective.

Strengths of supplementation include its immediate impact on micronutrient status, health, and survival. It can achieve rapid coverage of at-risk populations, it can be linked to the health care delivery system, and the cost of worker training is relatively low compared, for example, with that for dietary modification.

A key limitation of supplementation, whether it is used to correct deficiencies of iron, vitamin A, or iodine, is that because of inadequate targeting or coverage, deficient individuals may not be identified or reached routinely, and many at-risk persons, particularly in rural settings, can be missed. In addition, periodic high coverage has often not been sustained over time for a variety of reasons, including lack of sustained financial or political support or other overriding priorities in a limited health infrastructure. Finally, poor compliance by target individuals in taking a supplement has been a consistent reason for the low impact of many supplementation schemes. This is a particular problem in iron programs, which have traditionally relied on daily iron supplementation, although weekly dosing now appears to offer a cost-effective alternative.

Thus, supplementation should be considered an essential and complementary bridge to more sustained measures such as food fortification, food-based approaches, and other supportive public health interventions. The example of Indonesia, which successfully shifted from an almost exclusive reliance on vitamin A supplementation to more varied strategies, such as fortification (see Chapter 4), supports this conclusion.

Fortification

Fortification refers to the addition of needed micronutrients to foods. Adequate consumption of fortified food has been shown to improve micronutrient status.
The choice of a food vehicle or vehicles depends on a series of factors, including the target group, the food consumption patterns of that group, and the availability and characteristics of the possible vehicle. With respect to the target group, food vehicles may differ when directed to the population as a whole (general fortification), to specific target groups (e.g., infants, schoolchildren, and refugees), or to defined socioeconomic or geographical areas (for example, urban, rural, and ethnic groups). The fortified food should also be adjusted to match the food consumption practices of the target population to avoid delivering too little or too much of the nutrient. Foods should be selected for fortification on the basis of consumption practices, stability, production and marketing characteristics, and cost.

Experience has shown food fortification to be a useful bridge to sustainable, long-term dietary change in populations at moderate and low levels of iron and vitamin A deficiency (Levels III to I). If the program is made universal through a commonly consumed product, fortification requires little government involvement in the creation of consumer demand or in the training of service delivery

workers. In addition, it generally presents fewer logistical problems of supply than supplementation, and its costs end up being borne almost exclusively by the private rather than the public sector.

The experience with iron fortification is mixed. When cereal flour is used without long storage times, ferrous sulfate is cheap and effective, and ethylenediaminetetraacetic acid (EDTA) iron could potentially be a satisfactory fortificant once supply and cost issues are solved. Nevertheless, iron fortification of foods continues to be plagued by the absence of ideal compounds that would be favorably absorbed; stable and nonreactive, with little color and taste of their own; easily measurable for monitoring purposes; and inexpensive.

The experience with vitamin A fortification of foods has also been mixed. Successful vehicles for vitamin A fortification have included sugar in Guatemala and margarine in the Philippines. Experience with the use of vitamin A-fortified monosodium glutamate (MSG) in the Philippines and Indonesia—where the product was highly effective but showed color changes (yellowing) that the manufacturers feared would jeopardize sales—suggests that it is important to solve technical problems with the vehicle early on, while ensuring that the population with the micronutrient problem consumes the vehicle in stable quantities on a regular basis.

Selection of the form of the fortificant is important. For iodine and vitamin A, the options are more straightforward than for iron. With iron, no one form is superior in all vehicles and environmental conditions; careful evaluation of the options is thus required before the most appropriate iron fortificant is selected. The advantages outlined for salt in the following paragraphs also apply to iron fortification of flour in some countries and to vitamin A fortification of sugar, and the same quality assurance precautions are required.

Fortification of salt with iodine has been a major public health success.
Among the micronutrient interventions, it has a unique advantage: in most instances it requires no change in dietary habits, because everyone uses salt. USI has therefore become the cornerstone of IDD prevention. Programs, however, must take into account possible losses of iodine between the point of manufacture or importation and the consumer's table. Losses vary with the form of iodine used (iodide vs. iodate), heat, purity, humidity, packaging, shelf time, and cooking. Programs should also be designed around salt consumption patterns in order to ensure, as nearly as possible, an intake of iodine within the desired range.

Similarly, a major need in the salt iodization process is keeping the level of added iodine within safe and effective limits. This means that, at the very least, the iodine concentration of the commercial product must be measured at frequent intervals. Fortunately, the measurement technique is quite simple and reasonably accurate for practical purposes. Difficulties arise in implementing programs when the salt industry is widely dispersed among a large number of small producers: ensuring distribution of iodine to all local producers is difficult, and compliance is a problem. Other foods that have been used successfully as vehicles for iodine

fortification have included bread and water. Salt, however, remains the preferred vehicle for fortification.

A less traditional form of intervention is the broad class of plant breeding strategies emerging to deal with micronutrient deficiencies. A number of pilot trials are now under way worldwide to examine nutrient-modified crops—high-iron rice is an example—to address serious micronutrient deficiencies. In addition, reduced-phytate staple crops are being developed to increase the bioavailability of a range of micronutrients. If successful, these strategies are attractive because they do not require a modification of typical dietary patterns.

Food-Based Approaches

Food-based approaches attempt to correct the underlying causes of micronutrient deficiencies. These strategies are usually considered the ideal long-term goal toward which society strives: provision or assurance of access to a nutritionally adequate diet achieved through diversity of food availability, wise consumer selection, proper preparation, and adequate feeding. Nevertheless, the conventional assumption that food-based approaches represent the best strategy for correcting micronutrient deficiency in all circumstances needs to be reviewed carefully.

If increases in homegrown foods can be effected, and this leads to increased intakes, then the dietary change minimizes the effect of lack of consumer access to markets and of fluctuations in market prices or food availability. Increased reliance on food-based approaches decreases reliance on the health care system as a means of nutrient supply (as with supplementation) and offers a source of nutrients through foodstuffs that may be available through foraging as well as homegrown or procured foods. In the case of iodine deficiency resulting from the low iodine content of water supplies and locally produced foods, however, dietary change based on local foods is not an option.
Food-based strategies also have only limited, short-term application for the prevention of iron deficiency where there are economic or religious constraints on increasing animal protein intake. In addition, efforts to change national consumption patterns of foods that interfere with iron uptake (e.g., tannin-containing teas, wheat-containing foods) have had limited success. Addition of enhancers of iron absorption, such as vitamin C, to the diet is also being tried, but experience is limited. Studies have also shown that improving iron status as part of the complementary feeding of infants and very young preschoolers is impractical without fortification. For improving vitamin A status, food-based approaches could be most effective where vitamin A-containing foods are widely available, varied, adequate, and acceptable among targeted populations. The cultivation of homegrown foods is not cost-free; in most cases the responsibility for cultivation rests with the women of the household.
TABLE 2-5d Preferred Initial Approaches to Prevention and Control of Iron, Vitamin A, and Iodine Deficiencies in Mild and Clustered Populations—LEVEL I

                                          Deficiency
Approach                             Iron    Vitamin A    Iodine
Supplementation
  Targeted to vulnerable groups      ++      +            +
  Universal                          —       —            —
Fortification
  Targeted                           —       —            —
  Universal                          +++     ++           ++++
Food-based approaches
  Food, nutrition education          ++++    ++++         ++
  Food production                    —       —            n.a.
  Food-to-food                       —       —            —
Public health control measures
  Immunization                       ++++    ++++         —
  Parasite control                   n.a.    n.a.         —
  HW/S                               n.a.    n.a.         —
  DD/ARI                             n.a.    n.a.         —
  Personal sanitation/hygiene        ++++    ++++         —

NOTE: ++++, very strong emphasis; +++, strong emphasis; ++, moderate emphasis; +, light emphasis; —, no emphasis; n.a., not applicable. Food-to-food fortification: mixing of staple foodstuffs (e.g., mango with gruel) at the household level to enrich nutrient content. HW/S, healthy water and public sanitation; DD/ARI, control of diarrheal diseases and acute respiratory infections.

Interventions for Level I Populations

Food-based approaches and food fortification are the approaches of choice for addressing micronutrient malnutrition in selected, usually deprived, populations of Level I countries. Programs directed toward iron and, to a lesser degree, vitamin A supplementation of at-risk groups should be continued as needed, as should universal public health control measures such as immunization and education on personal hygiene and sanitation.

Balancing Approaches to Country-Specific Circumstances

Countries with micronutrient deficiencies at a public health level are usually confronted with multiple problems of underdevelopment and limited resources with which to address them. Setting priorities is therefore a necessity, not a choice. A series of notable political events, beginning in 1990 with the World Summit for Children and the follow-up 1991 conference on Ending Hidden Hunger, focused world attention on micronutrient malnutrition. The preparatory process for the International Conference on Nutrition in 1992 and country-level follow-up actions have fostered national-level planning for micronutrient deficiency control that was virtually nonexistent in many countries before these high-profile political events. National planning is often done collaboratively with international and bilateral agencies because of reliance on their financial assistance for program follow-up. The caution is to ensure that internationally set, time-bound goals are driven by nationally determined considerations rather than by donor priorities.

Coordinating Interventions Across Micronutrients

Deficiency of a single micronutrient seldom occurs in isolation; it usually arises within a context of deprivation that includes multiple vitamin and mineral deficits. It is therefore attractive to conceive of dealing with all of these deficits concurrently. A careful analysis needs to be undertaken, however, to determine where program compatibility exists in the areas of awareness, assessment, analysis of causes, and resources available for solutions. Coordinated strategies are technically feasible but infrequently implemented. Except for iodine, food-based approaches are the most logical basis for integrating micronutrient control programs: they avoid the concentrated-dose incompatibilities that can arise among supplements, such as differences in solubility, susceptibility to oxidation, and competition for absorption. The situation with IDD control is different because the deficit is not correctable simply by growing more food, or a different variety of food, in the same iodine-depleted area.
Furthermore, there is a proven, cost-effective IDD control intervention—universal iodization of salt—that should receive continued support, with oral iodine supplements used to control the problem in limited, unyielding situations. Nonetheless, there are areas of opportunity for cost-saving, complementary activities in assessment, in program selection and design, and in delivery mechanisms to vulnerable groups where micronutrient deficiencies coexist.

Common Elements of Successful Micronutrient Interventions

This section briefly details the elements that workshop participants identified as common to all successful micronutrient interventions (see Table 2-6).

Political Will/Stability

Experience demonstrates that political will and stability are important factors in the control of micronutrient deficiencies. Political instability breeds failure, as demonstrated by the collapse of the initially successful Guatemalan salt fortification (iodine) and sugar fortification (vitamin A) programs following a period of political unrest. Working to ensure consistent signals from a broad spectrum of leadership affirming the importance of eliminating or reducing micronutrient disorders, however, can help catalyze both government and voluntary agency efforts. Key actors in this process are political and administrative leaders, those from the health sector, the business community, NGOs, and, when involved in such programs, international agencies.

Respected and visionary local champions of the intervention should be sought out and involved from the earliest stages of program development. These champions can be individuals, as in the Ecuador salt fortification program, or industries, as in the Nigerian salt fortification program. Political will can be further enhanced and maintained through the development of creative partnerships, such as that between the government of the Philippines and the private sector in the program to fortify margarine with vitamin A.

Strategic and Program Planning

A common element in the design and implementation of successful micronutrient programs is the development of effective strategies and planning processes. Strategic planning yields a clear set of impact objectives to be reached over a set time frame, together with the choice of interventions and the scale of operations needed to achieve them within available resources. Program planning involves the formulation of process objectives and work plans. Decisions include choices of scale, targeting to particular beneficiaries, phasing and sequencing of activities, and selection of technologies. The planning process also addresses the development of effective systems for training, supervision, management, and logistics; the framing of work routines; and the allocation of tasks and functions.
The successful effort to increase use of the underutilized, vitamin A-rich ivy gourd in Thailand, which incorporated a majority of these elements, is a good example (see page 117). The flexibility to adapt program content to changing circumstances, including the lessons of implementation experience, is also a characteristic of successful intervention programs.

Community Involvement, Participation, and Consumer Demand

Involvement of the community at the point where interventions and beneficiaries intersect is a feature of some successful micronutrient programs. An excellent example was the program promoting home gardens in Bangladesh, in which committees at the state, district, block, and village levels provided guidance, coordination, and implementation (see page 123). Community involvement also characterizes most of the programs that have had positive results against other forms of malnutrition. Opinion is divided as to when and how best to involve individual communities: before the basic program framework is prepared or after. Both strategies appear to have helped generate appropriate levels of consumer demand for interventions. Involving each community in the original design of its own interventions, through such techniques as Participatory Rural Appraisal, may fully invest program ownership in the community. The price, however, might be a diversity of designs that the program management system cannot easily absorb. Presenting a community with a design that has proved workable in similar circumstances may limit its involvement to adaptation, but this has not proved a major impediment to beneficiary participation in programs.

Physical and Administrative Infrastructure

Experience shows that when interventions have been "scaled up"—that is, increased in size and/or duration—results may be disappointing, in part because of failure to anticipate the management and institutional capacity needed for ongoing operation. To ensure larger-scale or sustained accomplishment, the physical and administrative infrastructure must be appropriate. Among the indications that these conditions exist are the following, which apply regardless of the intervention being considered:

Physical infrastructure. This includes adequate communications capability (e.g., postal mail, telephones, faxes, e-mail), roads or other means of reaching the populations at risk, and special storage conditions where required.

Strategy and program design capability. This includes the ability to identify candidate strategies and program designs, to test them, to choose the best alternatives, and to evaluate and adjust programs on the basis of appropriate operations and management research.
Selection of the most appropriate strategies and program designs also requires the capacity to adapt them to specific resource environments and constraints, along with the ability to measure program costs, efficiency, and effectiveness, as well as costs averted through intervention outcomes. Part of the strategy and program design process requires clear specification of roles for the organizations and institutions concerned, as well as administrative accountability at all levels of managerial and implementation responsibility.

Scaling-up skills. The initial success of many interventions is based primarily on the results of small-scale clinical trials. The ability to move from such small-scale endeavors to the national level needs to be validated through large-scale field demonstrations that include measures of effect.
TABLE 2-6 Common Elements of Successful Micronutrient Interventions

Political will/stability: Consistent signals from a broad spectrum of leadership. Key actors in this process are political and administrative leaders, those from the business community, nongovernment organizations, and, when involved in such programs, international agencies.

Strategic and program planning: Strategic planning results in a clear set of impact objectives to be reached over a specific time frame and the choice of interventions and the necessary scale of operations to achieve them within available resources; program planning involves formulation of process objectives and work plans.

Community involvement, participation, and consumer demand: Experience has demonstrated that involvement of the community in all program phases, from initial design to evaluation, helps to generate appropriate levels of consumer demand for interventions.

Physical and administrative infrastructure: Presence of a competent physical infrastructure, strategy and program design capability, scaling-up skills (i.e., the ability to move to national levels from smaller-scale, local endeavors), managerial capability, budgetary resources, and human resources.

Communications strategies: Ability to generate consumer demand for improved micronutrient status and to remove barriers to adoption of specific micronutrient-enhancing practices. Such strategies are critical to long-term program sustainability and effectiveness.

Use of appropriate food vehicle: Choice should take into account bioavailability, safety, side effects, and public acceptance; the technology should be consistent with best practice as determined by comparison with similar programs or well-documented research in pilot or clinical programs.

Sustainability: Three key factors are efficacy, appropriateness, and demonstrated feasibility.

Information systems, monitoring, and evaluation: Process and outcome indicators, including biological indicators, appropriate for monitoring intervention impact will vary in accordance with the intervention objective.
Managerial capability. Training to strengthen or develop a management ethic and skills, and to promote institutional development of management, including systems for administrative control, is an important but sometimes overlooked factor influencing program success. The establishment of appropriate process goals, together with a system and procedures for periodically assessing progress toward them, is an important measure of managerial capability.

Budgetary resources. Resources consistent with achieving the established impact objectives need to be made available. These include budgets adequate to develop, test, and choose among strategy options; to formulate and refine the program design; and to test and implement interventions at the agreed operational levels for the time specified to reach program objectives.

Human resources. The capacity to define tasks and workloads realistically, and to train, deploy, supervise, and retain both employees and, where appropriate, volunteers must also be considered. Task-oriented training needs to take place initially and on an in-service basis, particularly for workers and supervisors in service delivery programs that involve supplementation and communications. Food-based approaches involving dietary modification require appropriate training of formal and informal educators in the use of both interpersonal and mass media resources. Supervisory tasks and ratios need to be geared to service delivery tasks and work routines.

Communications Strategies

Communications can play an important role in successful micronutrient programs by inducing target groups to improve their micronutrient-related behaviors. Depending on the specific operational context, successful communications strategies seek to (1) generate consumer demand for improved micronutrient status and/or (2) remove barriers to adoption of specific micronutrient-enhancing practices. Such strategies are critical to long-term program sustainability and effectiveness.

Communications is an important supportive measure in supplementation and fortification; it can be both a supportive and a leading intervention in the area of dietary modification. In micronutrient supplementation regimes, motivating consumers to demand improved micronutrient status as a personal benefit can lead to higher coverage rates, better compliance, and more efficient implementation. In fortification, public demand for better micronutrient status plays a part both in consumption of the fortified product and in encouraging administrative bodies to adopt and enforce quality-control and other regulatory mechanisms. In the area of dietary practices, appropriate communications interventions can persuade consumers to prepare existing menus in micronutrient-favorable ways and/or to diversify their diets to include new sources of micronutrients.
Successful communications strategies include (1) market segmentation, that is, identification of the groups whose attitudes and behavior are to be affected; (2) definition of the specific changes sought for each group; (3) understanding of the barriers to such changes; (4) selection of suitable communications channels; and (5) development and testing of appropriate messages. In most cases, a comprehensive communications strategy will need to address specific segments of the general public, with attention both to target groups for a given intervention and to those who influence the micronutrient behavior of such groups. Health workers and managers, from the community to the tertiary care level, will often need to be included in the strategy and, in most cases, will require reorientation, training, and materials support to participate. The potential role of policymakers, particularly in health, agriculture, education, industry, and finance, also needs to be analyzed, particularly when such officials could affect resource flows, public perceptions, or other key aspects of the communications process. Two good examples of the successful use of communications strategies in building support for and implementing effective interventions are the joint iodine fortification/supplementation program in Ecuador (see page 180) and the application of social marketing methods to increase use of locally available vitamin A-rich foods in Thailand (see page 117).

Use of Appropriate Vehicle

The choice of an appropriate vehicle for the micronutrient and/or intervention strategy selected should take into account bioavailability, safety, side effects, and public acceptance. The vehicle should be consistent with best practice as determined by comparison with similar programs or with well-documented research in pilot or clinical programs. An example of an inappropriate choice of vehicle was the Indonesian experience of fortifying MSG with vitamin A (see page 133): although the vehicle was universally applicable, the resulting fortified product was yellow rather than white and proved unacceptable. The widely accepted use of iodinated oils in Ecuador (see page 180) and of vitamin A-fortified margarine in the Philippines (see page 134) indicates, by contrast, that selection of an appropriate food vehicle is an important determinant of program success. Genetically modified crops appear to offer opportunities to increase yield as well as micronutrient content or bioavailability; their acceptance by the public, however, needs to be addressed.

Sustainability

Sustainability, as used here, refers both to the continuity of a successful intervention and to the continuation of a significant, positive impact on the intended beneficiaries.
The first kind of sustainability thus relates to process, the second to outcomes. Three factors are essential for sustainability: efficacy, appropriateness, and demonstrated feasibility. Clearly, one would only want to sustain an intervention that has "worked": a policy or program that has been implemented and that addresses the micronutrient need of a particular population. Continued effective operation of an intervention requires an institutional structure that provides ongoing capacity for management. A common finding in public health interventions generally is that the approaches that succeed are often those designed and managed as part of research and/or pilot projects.

Cost is clearly a factor that influences sustainability. Programs based on permanent reliance on external funding are usually not viable in the long term. At the same time, precipitous withdrawal of external funding may also doom projects; a gradualist approach, agreed upon by all parties, may be optimal. There are now examples of effective transitions from total donor funding to total support from national-level financing. The Indonesia vitamin A program is an excellent example of an intervention that evolved over a 20-year period from 100 percent donor support to the current program, which is funded entirely by government monies. The time period is also critical: for most countries, it is unrealistic to expect this transition to occur within 3 to 5 years. Micronutrient interventions such as the Indonesia vitamin A program, in which the donors and the host country plan for this transition from the initial stages, have been the most successful.

Micronutrient interventions that continue to achieve a significant impact on the target individuals are those flexible enough to respond to the changing needs of their clients. Typically this involves a combination of approaches to address a particular micronutrient. For vitamin A, for example, a combination of strategies is most effective in reducing deficiency in a given area; each country must determine its most cost-effective mix of interventions.

Information Systems, Monitoring, and Evaluation

Monitoring and evaluation are essential program elements. They are vital for ensuring and improving the efficiency of program operations, that is, for reaching the target group in a cost-effective fashion. Monitoring may provide early warning that program operations are faltering or that the prevalence of micronutrient malnutrition is rising in one or more groups. Protocols for monitoring and evaluation must be developed as part of the overall program design and implemented as part of the program; programs that have not done so have inevitably failed. Projects that have incorporated strong monitoring and evaluation components, such as the two programs promoting home gardens in Bangladesh described in Chapter 4 (see page 123), have been successful and have been sustained.
Indicators appropriate for monitoring intervention impact will vary with the intervention objective. For example, program objectives may be to improve coverage of iron-supplement recipients; to ensure that a vitamin A-fortified food meets quality assurance standards or is selected for consumption by target groups; to change food consumption behaviors, such as the frequency of consumption of dark green, leafy vegetables (DGLV); or to increase the year-round availability of vitamin A-rich food in household or community gardens. The appropriate intervention-specific impact indicator(s) for each of these objectives will differ; in some cases process indicators will be appropriate, and in others biological indicators will be the most useful. If the desired outcome of the intervention is to document a change in the vitamin A status of the recipient population, biological indicators are ideal. The usefulness of impact indicators can be limited, however, where they are difficult to measure and a scaled-up intervention is unlikely to have the precision necessary to demonstrate impact. In such cases, process indicators should be substituted: demonstrating that the target population received the vitamin A supplement and ingested it is sufficient, because strong evidence is already available that vitamin A supplements reduce vitamin A deficiency and childhood mortality and morbidity, and it is not necessary to repeat these impact measurements. Resource availability can also limit the feasibility of direct biological evaluations, because biological indicators are usually more costly to obtain and evaluate than indirect indicator data. In such situations, outcomes from metabolic and/or controlled community studies lend credence to causative inferences drawn from similar outcomes of interventions implemented in less rigorously controlled community settings.
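Process indicators of the kind described above are straightforward ratios computed from program records. A minimal sketch, using hypothetical district figures (none of these numbers come from the report):

```python
def coverage_rate(received, target_population):
    """Share of the target group reached by the distribution system."""
    return received / target_population

def compliance_rate(ingested, received):
    """Share of those reached who are confirmed to have taken the dose."""
    return ingested / received

# Hypothetical district records: 9,000 children targeted,
# 7,200 received a vitamin A capsule, 6,480 confirmed ingesting it.
print(f"coverage:   {coverage_rate(7200, 9000):.0%}")    # coverage:   80%
print(f"compliance: {compliance_rate(6480, 7200):.0%}")  # compliance: 90%
```

Tracking both ratios over time distinguishes a delivery problem (falling coverage) from an acceptance problem (falling compliance), which is exactly the kind of early-warning signal monitoring is meant to provide.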
Inability to perform biological evaluations should not, by itself, prevent the initiation of, or stop, VAD control programs when and where such programs are needed.

Biological Indicators

Population monitoring of iron deficiency is difficult. The responsiveness of the left tail of a hemoglobin (Hb) distribution curve is probably the best and least expensive indicator of iron-deficiency anemia, but it is inadequate for measuring iron reserves. In spite of the limitations noted in the background paper on iron, serum ferritin is likely the best indicator for measuring iron status. In developing countries with high initial prevalence rates of anemia (and hence nearly 100 percent prevalence of subclinical iron deficiency), however, assessing hemoglobin levels is sufficient.

VAD, like iron deficiency, is difficult to monitor. In the view of the workshop participants, process indicators can monitor most programs as accurately as any single biological indicator and at less expense; which to use depends on the mix of program strategies. Clinical indicators require very large sample sizes because clinical signs are rare events, and night blindness is useful only in some populations and does not detect all subclinical VAD. The dynamic behavior of the left tail of serum retinol distribution curves among populations of young children is likely the best basis for biological monitoring.

Monitoring of progress against iodine deficiency will usually involve both process and biological indicators. In a highly endemic area, goiter prevalence might be an appropriate initial indicator, but as a control program progresses, the overall goiter rate ceases to be adequate, because adult goiters are often fibrotic and thus persist even after iodine deficiency is corrected. Goiter incidence in school-age children, however, could be appropriate until it becomes quite low. In contrast, median urinary iodine reflects the current intakes of a population. Coverage can usually be monitored adequately, and at least expense, by nonbiological process indicators such as the number of households in which iodized salt or another fortified food vehicle is consumed. To monitor quality control, however, quantitative laboratory measurement of iodine levels in batches of salt, or of median urinary iodine levels between cutoffs reflecting the desired level of iodine intake, is appropriate. Where the level of development favors institution-based deliveries, neonatal thyrotropin (TSH) measurement would be possible if a screening program for congenital hypothyroidism is already in place (as it is in much of Europe and other developed regions of the world), but it is more expensive than urinary iodine. Median urinary iodine in representative school-age populations is likely the best indicator for long-term monitoring of iodine status and for quality assurance of adequate salt iodization levels.
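The left-tail and median-based indicators just described are simple computations on survey data. The sketch below uses made-up survey values and illustrative cutoffs (an Hb cutoff of 110 g/L for young children and WHO-style urinary iodine bands); actual thresholds depend on the age group and on current guidance, and should be verified before any programmatic use.

```python
import statistics

# Illustrative cutoffs only; verify against current guidance before use.
HB_CUTOFF_G_PER_L = 110          # anemia cutoff, young children
UI_CLASSES = [                   # median urinary iodine bands, µg/L
    (20,  "severe deficiency"),
    (50,  "moderate deficiency"),
    (100, "mild deficiency"),
    (200, "adequate"),
]

def anemia_prevalence(hb_values):
    """Left tail of the Hb distribution: fraction below the cutoff."""
    return sum(hb < HB_CUTOFF_G_PER_L for hb in hb_values) / len(hb_values)

def classify_median_ui(ui_values):
    """Classify a population by its median urinary iodine."""
    median = statistics.median(ui_values)
    for upper, label in UI_CLASSES:
        if median < upper:
            return median, label
    return median, "more than adequate"

hb = [98, 105, 112, 120, 131, 109, 115, 126]   # g/L, made-up sample
ui = [45, 80, 120, 95, 60, 150, 110, 70]       # µg/L, made-up sample
print(anemia_prevalence(hb))    # 0.375 (3 of 8 children below cutoff)
print(classify_median_ui(ui))   # (87.5, 'mild deficiency')
```

The median, rather than the mean, is used for urinary iodine precisely because spot-urine values are highly skewed within a population; the classification is meaningful only for a representative sample, not for individuals.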