Infant Formula: Evaluating the Safety of New Ingredients
history of the development of infant formulas. Liebig’s food for infants was marketed in 1867 and consisted of wheat flour, cow milk, malt flour, and potassium bicarbonate (Fomon, 2001). In 1915 a formula called “synthetic milk adapted” was developed with nonfat cow milk, lactose, oleo oils, and vegetable oils. This was the basis for modern commercially prepared formulas (Fomon, 1993).
The limitations of using nonhuman-mammalian milks as substitutes became clear. As early as 1545, people were concerned with the feeding of animal milks to babies. The Boke of Chyldren stated that “If children be fed the milk of sheep, then their hair will be soft as that of a lamb, but if they be fed the milk of the goat, the hair will be coarse” (Phaire, 1955, p. 18). There are, of course, far greater concerns about feeding animal milk to infants, such as folate deficiency (goat milk) and early onset hypocalcemic seizures and azotemia (cow milk).
By the early twentieth century it was clear that cow milk was most likely the best animal-milk base to work from, but that certain modifications were needed to make it safe and palatable for human infants. These modifications included:
removing animal fat and substituting vegetable oils,
diluting the protein content for the newborn’s relatively immature renal tubular system, and
adding or balancing minerals and vitamins (e.g., adding iron, adjusting the calcium:phosphorus ratio).
The process of modifying cow milk for large-scale production in the 1920s represented the birth of the infant formula industry. Since then new ingredients have been added for a variety of reasons. For example, iron was added in 1959 to reduce the risk of iron deficiency in formula-fed infants (Fomon, 1993), and long-chain polyunsaturated fatty acids (LC-PUFAs) were recently added in an effort to improve infant visual and cognitive development.
The protein content of formulas was a consideration from about 1935 onward. Early estimates of the protein content of human milk were higher than the levels now known to be present, and cow-milk protein was believed to be far inferior to human-milk protein. Formulas thus included high levels of protein (3.3–4.0 g/100 kcal). In the 1960s renal solute load began to be considered in the design of infant formulas, although infant formula regulations permit higher loads than are currently recommended by expert panels (no greater than 30 mOsm/100 kcal) (Fomon, 2001).
Based on the recognition that whey proteins predominate in human milk whereas caseins predominate in cow milk, formulas with a whey:casein ratio similar to that of human milk were introduced in 1962. By 2000 whey-predominant formulas were the most widely used milk-based formulas. These changes were made primarily on the basis of composition rather than functional measures (Fomon, 2001).
In 1984 taurine was added to infant formulas, based on at least a decade of studies that included composition, provisional essentiality, safety, and function in mammals (MacLean and Benson, 1989). Nucleotides were added to formulas with both compositional and efficacy claims in the late 1990s. They may act as growth factors and may modulate immune defenses (Carver et al., 1991).
When considering new ingredients, manufacturers analyze every step in the production process, including raw materials (availability, source, and purity), processing methods, packaging, storage conditions and shelf life, methods of home preparation, and potential for misuse. Chapter 4 provides a discussion of these manufacturing considerations and their relevance to the regulatory process.