

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




1 Introduction

Contaminated drinking water, leachates and volatile substances from hazardous waste sites, and individual diet and lifestyle all involve exposure to thousands of chemicals that can affect the likelihood or course of disease. This report of the Committee on Methods for the In Vivo Toxicity Testing of Complex Mixtures reviews the methodologic requirements for testing the toxicity of these chemical mixtures to humans. It does not review testing of environmental toxicity.

Toxicity testing to predict effects on humans has traditionally studied chemical compounds one at a time, for various reasons: dealing with agents singly has been more convenient to investigators; physicochemical properties of single agents were often readily defined; dosage could be easily controlled; biologic fate could usually be monitored in a straightforward manner; concentrations in air, water, and tissue could be accurately measured; target-organ toxicity was predictable on the basis of experience with agents related to the one in question; and relevant data were often available from human occupational exposures. However, most occupational and environmental chemical exposures are to mixtures, not single agents. Almost all commercial products contain impurities. Some products, such as fuels, pesticides, and coal tar, have dozens of constituents. Tobacco smoke consists of a few thousand chemicals.

Human exposure to complex mixtures has been extensive, and some adverse effects are well documented, especially in terms of chronic disease associated with long exposures. But the extent of human exposure to even well-defined mixtures that are responsible for producing excess disease is generally not known in quantitative terms; and the relative roles of mixture components associated with the effects are rarely known. Furthermore, the relationships

between exposure and dose to the target organ are often not well defined in either humans or experimental animals. The quantitative predictive power of toxicity-test data from animal studies for human health effects is often limited by a lack of understanding of interspecies differences in biologically effective dose. Nonetheless, toxicity data on animals provide a key component of primary preventive medicine, in that they provide a rational basis on which to estimate, and hence prevent or reduce, human harm.

For the study of mixtures, testing strategies must consider similarities and differences between mixtures and must use methods that maximize the ability to extrapolate results between mixtures. Toxicologic evaluation of complex mixtures is complicated by the potential for component interactions. Interactions could influence physicochemical properties, dose to a target tissue, biotransformation and elimination, and toxicity. In the absence of adequate human data, data on the toxicity of a complex mixture (or its components) must be acquired with animal (surrogate) models. Use of these models is likely to be complicated by unpredictability of toxicity resulting from interactions, difficulties in extrapolating from high-dose to low-dose exposures, and the ability of several materials to cause the same toxic effect.

Questions asked about complex mixtures might be quite similar to those asked about single entities. Testing strategies designed to answer those questions, however, might be very different. For example, asking whether a specific component of a mixture causes cancer is different from asking whether a single chemical causes cancer. In the first case, a fractionation protocol might be needed, perhaps with short-term in vitro assays to identify the toxic fractions; in the second case, fractionation is not applicable.
Environmental mixtures are often bound to a matrix, such as soil, and extraction procedures must be developed to isolate a mixture or its components. However, health data based on separation of a mixture from its matrix might not be relevant to actual human exposures.

Consideration of human experience with complex mixtures is important, because many toxic effects of chemical mixtures were not, and in some cases could not have been, predicted on the basis of experiments with animal models. For example, the extent to which cigarette-smoking and alcohol consumption affect the risk of oral cancer would not have been predicted on the basis of animal experiments, because experimental models for detecting oral cancer due to either cigarette-smoking or alcohol consumption are inadequate. In some cases, further interpretations of available data or additional use of standard tests can provide a basis for anticipating effects. But these cases generally would involve chronic tests that are seldom performed, and the conventional protocols and rodent models are inadequate for eliciting many of the responses seen in epidemiologic studies.

Consideration of chemical-mixture toxicity must also take into account the potential for interaction between components of a mixture. Interaction between components can occur in several ways. Chemicals can interact with each other within a mixture to form one or more new compounds. On exposure of an organism, some components might affect the absorption, distribution, metabolism, or elimination of other components. Such interactions can lead to toxicity greater or less than would be expected on the basis of the individual component toxicities. Interactions that lead to unpredictably greater toxicity might pose public-health problems, particularly in the case of acute exposures to high doses. Most biologically significant interactions observed at the high doses used for chronic toxicity testing in the laboratory, however, are not detectable at the low doses to which the human population is exposed environmentally.

DEFINITIONS

Several terms used throughout this report in the context of interaction are difficult to define, because of their varied interpretations in different disciplines. "Interaction" is a general term that has been applied to toxicity-test results that deviate from the dose-additive behavior expected on the basis of dose-response curves obtained with individual agents.

Biologists tend to think of "synergism" as referring to any result that is greater than would be expected from addition of doses. Suppose that two chemicals are carcinogenic in the same organ by the same mechanism, for example, by forming adducts at critical sites on DNA. Exposure to both chemicals will result in a tumor rate that reflects simple addition of doses. If, instead, one acts by an indirect mechanism, such as that of a tumor promoter, exposure to both agents might result in a tumor rate that is greater than that expected on the basis of simple addition of doses. Biologists refer to this phenomenon as "synergism."
Statisticians describe the situation as multiplicative if the results could be predicted on the basis of multiplication of doses and as synergistic if the results are greater than would be predicted on the basis of the underlying dose-response model. Antagonism is a situation in which the response is less than would be predicted on the basis of simple addition of doses. Such classifications are model-dependent; the credibility of a dose-response model depends on the extent to which it is based on some understanding of the biologic mechanisms underlying the interaction.

"Exposure" refers to the contact of a substance with an object of interest. If the object of interest is a living organism, a distinction must be made between the exposure (e.g., ambient concentration) and the biologically effective dose (commonly known simply as "dose"). The latter is the quantity of material that interacts with the biologic receptor responsible for a particular effect. Because determination of a biologically effective dose requires intimate knowledge of the mechanisms of action, in many instances imperfect knowledge leads to the use of surrogates for this quantity. The less knowledge we have concerning the receptor and the steps leading to the substance-receptor interaction, the less
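These classifications can be made concrete with a minimal numerical sketch. It assumes, purely for illustration, linear low-dose dose-response curves for each agent, so that the dose-additive (no-interaction) expectation for a mixture is simply the sum of the component responses; the potencies, doses, and tolerance below are hypothetical, and, as noted above, any such labeling is model-dependent.

```python
def dose_additive_response(slopes, doses):
    """No-interaction prediction assuming linear curves R_i(d) = slope_i * d.

    Under simple dose addition, the expected mixture response is the
    sum of the responses to the individual components.
    """
    return sum(s * d for s, d in zip(slopes, doses))

def classify_interaction(observed, predicted, tol=0.05):
    """Label an observed mixture response relative to the additive prediction."""
    if observed > predicted * (1 + tol):
        return "synergism"       # greater than the dose-additive expectation
    if observed < predicted * (1 - tol):
        return "antagonism"      # less than the dose-additive expectation
    return "dose-additive"

# Hypothetical potencies (excess response per unit dose) and doses.
slopes = [0.02, 0.05]
doses = [1.0, 0.5]
predicted = dose_additive_response(slopes, doses)   # 0.02*1.0 + 0.05*0.5 = 0.045

print(classify_interaction(0.090, predicted))   # -> synergism
print(classify_interaction(0.045, predicted))   # -> dose-additive
print(classify_interaction(0.020, predicted))   # -> antagonism
```

The choice of the additive baseline is the whole argument: with a different underlying dose-response model (e.g., multiplicative), the same observed response could earn a different label.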

precise are our surrogates (i.e., estimates) for the biologically effective dose and the less predictive they are for estimating doses in other species (e.g., humans).

Current statistical models for describing and extrapolating toxicity data generally were developed for single substances, and their application to mixtures can lead to incorrect inferences when mixture components interact. Several of the most commonly used dose-response models suggest that the risk associated with exposure to a mixture at a "low" dose (i.e., with a response rate of about 0.1-1%) is equal to the sum of the risks associated with exposure to the mixture components separately. However, these results are not only dose-limited, but also model-dependent, and the credibility of the result is closely linked to the credibility of the model. Sound scientific judgment is the key ingredient in the application of any particular model to a specific situation.

STRUCTURE OF THE REPORT

Chapter 2 of this report discusses the availability of various types of qualitative and quantitative data that form the bases of currently used analyses of human exposures to complex mixtures (e.g., epidemiology). It presents several situations in which standard toxicity-testing protocols did not predict adverse health effects of such exposures. Concern about the nature of these situations has led investigators and regulators to question whether the use of standard, largely acute, single-chemical toxicity-testing methods at high exposures is appropriate for mixtures, especially when many of the documented effects of mixtures are attributable to chronic, small exposures.

Chapter 3 restates the fundamentals of toxicity testing of single chemicals in the context of chemical mixtures. Testing of mixtures differs only in extent, not in kind, from testing of single agents.
The two critical elements in evaluating complex mixtures are the careful definition of experimental objectives (including the uses to which the resulting data will be put) and the necessity of differentiating exposure from the biologically effective dose. The chapter examines testing strategies to determine optimal approaches to various experimental objectives.

Chapter 4 provides guidelines for the design of protocols for sampling and analyzing chemical mixtures. Analytic strategies should include consideration of the nature of exposure and bioavailability and should supplement toxicity testing. A proposed classification scheme considers degree of complexity, sample origin, physical state, and chemical properties. Chemical characterization of a complex mixture might not be necessary before testing, but predicting effects of chronic exposure and extrapolating test results to other mixtures are often improved by such characterization.

Chapter 5 explores the role that biostatistics and quantitative modeling can play in describing the toxicity of mixtures. This role includes both the development of testing strategies and experimental designs and the interpretation of results, including analysis of interaction. The importance of using mathematical models that are consistent with postulated biologic mechanisms of action is emphasized. The assumptions underlying successfully predictive models can in themselves stimulate additional biologic research into the mechanisms of toxicity. The conclusions derived in the chapter are expanded in appendixes that describe their mathematical bases.
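The kind of model-dependent, dose-limited result mentioned earlier (that low-dose mixture risk approximately equals the sum of component risks) can be sketched with a standard one-hit dose-response model, P(d) = 1 - exp(-beta*d), which is approximately beta*d at low dose. The potencies and doses below are hypothetical illustrations, not data from this report.

```python
import math

def one_hit_risk(beta, dose):
    """One-hit model: P(d) = 1 - exp(-beta * d), ~ beta * d at low dose."""
    return 1.0 - math.exp(-beta * dose)

# Hypothetical potencies and doses chosen so each component risk is ~0.1-1%.
betas = [0.002, 0.005, 0.001]
doses = [1.0, 1.0, 2.0]

component_risks = [one_hit_risk(b, d) for b, d in zip(betas, doses)]
mixture_risk = one_hit_risk(sum(b * d for b, d in zip(betas, doses)), 1.0)

print(sum(component_risks))  # ~0.00898: sum of separate component risks
print(mixture_risk)          # ~0.00896: approximately additive at low dose

# The result is dose-limited: at high doses the summed component risks
# overshoot (they can even exceed 1), while a probability cannot.
high_component_risks = [one_hit_risk(2.0, 1.0)] * 3
print(sum(high_component_risks))   # ~2.59, not a valid probability
print(one_hit_risk(6.0, 1.0))      # ~0.9975, the model's joint risk
```

This illustrates why the report stresses that the additivity result holds only in the low-dose region, only under a particular model, and only in the absence of interaction among components.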