8 Other Potential Applications of Toxicogenomic Technologies to Risk Assessment

The potential speed, lower cost, and information content of toxicogenomic technologies may offer distinct opportunities for enhancing the design and interpretation of standardized testing protocols and approaches used in risk assessment. A brief overview of risk assessment is provided in Appendix C; Chapters 4, 5, 6, and 7 describe the potential for toxicogenomics to improve exposure assessment, hazard screening, assessment of human variability, and mechanistic insight, respectively. This chapter continues the discussion of toxicogenomic applications, focusing on several topics that are important to risk assessment: understanding dose-response relationships, especially at low doses, and improving the selection of doses used in testing; relating animal model data to human risk; assessing effects of exposure during development; and assessing the relevance of coexposures and the impact of mixtures. The implications that the increased use of toxicogenomics may have on experimental animal use and on federal agency infrastructure needs are also discussed.

POTENTIAL APPLICATIONS TO RISK ASSESSMENT

Dose-Response Relationships

A critical step in assessing risk is determining how responses change depending on the magnitude and nature of exposure to the agent or agents in question. As discussed in Appendix C, the complex nature of the dose-response relationship has traditionally been considerably simplified for risk assessment (1) by using linear slopes at low doses (below the point of departure or below observed data points) for agents with direct mutagenic activity or for which exposures are
thought to be near levels associated with key precursor events in the carcinogenic process, and (2) by using nonlinear slopes at low doses when there are enough data to determine the mode of action and conclude the dose-response relationship is not linear or an agent is not mutagenic at low doses.

Because of their ability to detect more subtle changes at the molecular level than those detected by traditional high-dose animal studies, toxicogenomic studies, properly designed and interpreted, can provide greater insight into dose-response relationships with respect to low-dose effects and mode of action (which affects the assumed shape of the dose-response relationship). However, similar to traditional toxicology assays, most toxicogenomic investigations to date have used relatively high doses and conventional end points.

These studies assume that many of the differentially regulated genes are associated with the observed toxic effect. For example, Gant et al. (2003) identified genes induced during chronic liver injury, Amin et al. (2004) investigated renal toxicity, and Hamadeh et al. (2004) identified furan-mediated hepatotoxicity. Moggs et al. (2004) identified the genes and molecular networks associated with the uterotrophic response to estrogens. Such studies are critical for proof of concept, but, ultimately, toxicogenomic technologies will most benefit risk assessments when they provide insight into responses that occur at doses at or near anticipated population exposures; modeling at such low doses is always a challenge. Several illustrations of the potential implications of toxicogenomics for exploring dose-response issues are provided in the following sections.
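The two default low-dose treatments described at the start of this section can be made concrete with a minimal sketch. This is an illustration of the general logic only, not a regulatory algorithm; all doses, risks, and the 100-fold uncertainty factor are hypothetical values chosen for the example.

```python
# Illustrative sketch (not a regulatory algorithm) of the two default
# low-dose treatments: linear extrapolation below a point of departure,
# and a nonlinear (threshold) treatment. All numbers are hypothetical.

def linear_low_dose_risk(dose, pod_dose, pod_risk):
    """Linear default: risk is assumed proportional to dose below the
    point of departure (POD), with the slope anchored at the POD."""
    slope = pod_risk / pod_dose          # extra risk per unit dose
    return slope * dose

def threshold_low_dose_risk(dose, noael, uncertainty_factor=100.0):
    """Nonlinear (threshold) treatment: doses at or below a reference
    dose (here, a NOAEL divided by an uncertainty factor) are treated
    as posing negligible risk; higher doses need further modeling."""
    reference_dose = noael / uncertainty_factor
    return 0.0 if dose <= reference_dose else None

# A POD of 1.0 mg/kg-day associated with 10% extra risk implies that a
# 0.001 mg/kg-day exposure carries about 1e-4 extra risk under the
# linear default.
print(linear_low_dose_risk(0.001, pod_dose=1.0, pod_risk=0.1))  # ~1e-4
print(threshold_low_dose_risk(0.0005, noael=0.5))  # 0.0 (below reference dose)
```

The contrast between the two functions is the point: the linear default assigns some nonzero risk to every dose, whereas the threshold treatment assigns zero risk below the reference dose.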
Low-Dose Responses

When looking at gene expression over a range of doses, some altered genes may reflect homeostatic responses, others may be "early responders" inherent to the ultimate toxic response, and still others may represent perturbations in vital cell pathways resulting from adverse effects. To elucidate quantitative dose-response relationships, the observed changes in gene expression need to predict the toxic response and distinguish it from nontoxic responses. Although this is inherently difficult in complex biologic systems, several concepts have been proposed to help distinguish low-dose effects that predict toxicity from those that do not. They include emphasizing perturbations in critical cellular systems, such as stress responses, apoptosis, and energy production, as well as assessing the magnitude of gene expression changes and the number of genes affected as doses increase (Heinloth et al. 2004).

Often a dose increase does not simply increase the magnitude of expression change in the same set of genes but also influences which genes are affected; that is, the dose increase shifts the response profile. For example, Andrew et al. (2003) compared the effects of low noncytotoxic doses with higher cytotoxic doses of arsenic on human bronchial epithelial cells and reported the expression of almost completely nonoverlapping sets of genes. There appeared to be a threshold switch from a "survival-based biologic response at low doses
to a death response at high doses." Thukral et al. (2005) exposed animals to two doses of two nephrotoxicants, mercuric chloride and amphotericin. Low doses resulted in damage to and regeneration of tubular epithelium, whereas high doses caused necrosis of tubular epithelium. Gene expression profiles clustered on the basis of similarities in the severity and type of pathology, and necrosis was associated with more changes in gene expression. These studies illustrate the complexity of interpreting toxicogenomic data for the purposes of defining dose-response relationships. Lower doses may cause changes that involve tissue damage and repair and predict some degree of chronic toxicity, even though they do not lead to observable pathologic changes in a given experimental setting. (Note that interpretation of changes in gene expression over the range of the dose-response curve should be supported by appropriately robust statistical methods.)

Experiments conducted at two doses, although helpful, still do not allow for construction of a full dose-response curve useful for risk assessment. To interpret the impact of dose on gene expression, a range of doses must be investigated. Heinloth et al. (2004) measured changes in gene expression due to exposure to acetaminophen at doses ranging from those expected to have no hepatotoxic effect to those known to cause liver toxicity. The goal was to discover changes in low-dose-mediated gene expression that might indicate biologic responses predictive of the toxic effects observed with high doses. The results indicate that subtoxic doses of acetaminophen cause downregulation of genes involved in energy expenditure, coupled to the upregulation of genes involved in ATP production. Cellular systems affected included energy production and energy-dependent pathways as well as cellular stress responses.
The results suggest that changes in gene expression induced by exposure to a low dose of a potentially toxic agent may reveal signs of stress or subtle injury that signal potential overt toxicity at higher doses. Thus, in this system, changes in gene expression were more sensitive indicators of potential adverse effects than traditional measures of toxicity.

Sen et al. (2005) tested several concentrations of dimethylarsinic acid on rat urothelium and similarly found a progressive increase in differentially expressed genes associated with apoptosis, cell cycle regulation, adhesion, stress response, and others over much of the dose range. Specifically, the number of genes affected increased as the dose increased from 1 part per million (ppm) to 40 ppm, with increased expression of as many as 74 genes associated with cell cycle regulation and proliferation, apoptosis, and oxidative stress. Interestingly, however, many of these genes were not expressed in animals exposed to a 100-ppm treatment, despite increased toxicity. The authors speculated that toxicity from the high 100-ppm dose likely resulted in the degradation of cellular components, including RNA, which could result in a decrease in the ability to detect altered transcript numbers. The differences may also reflect a different time course of effects after treatment with dimethylarsinic acid at 100 ppm, a greater adaptive response at 40 ppm, or a U-shaped dose-response curve (as also observed in transcriptomic experiments with 2,3,7,8-tetrachlorodibenzo-p-dioxin [TCDD]; Ahn et al. 2005).

In another study assessing gene expression over a range of doses, Boverhof et al. (2004) investigated time- and dose-dependent changes in hepatic gene expression from immature ovariectomized mice given several doses of ethynyl estradiol, an orally active estrogen. Thirty-nine of the 79 genes identified as differentially regulated exhibited a dose-dependent response at 24 hours. This study also illustrated how the non-static nature of gene expression complicates the assessment of a relationship between dose and response. That is, this study showed that administering a single dose (100 µg/kg of body weight) resulted in time-dependent changes in the expression of genes associated with growth and proliferation, cytoskeletal and extracellular matrix responses, microtubule-based processes, oxidative metabolism and stress, and lipid metabolism and transport.

A number of papers have reported effects of chemical agents on gene expression profiles at concentrations below those associated with overt toxicity. For example, Boverhof et al. (2004) found that most of the 39 transcripts that exhibited a dose-dependent effect at 24 hours were changed at median effective doses (ED50) comparable to ED50 values reported in the literature to cause uterotrophic effects. However, a number of transcripts displayed ED50 values at doses that do not elicit a physiologic effect. Naciff et al. (2005a) similarly studied the effects of three estrogenically active compounds over a dose range spanning five orders of magnitude and found that the gene expression profiles provided more sensitive and less variable responses than traditional morphologic end points.
In a later study, they demonstrated that all these estrogenically active substances exhibited monotonic dose-response curves, providing critical mode-of-action insights into the biologic plausibility, or lack thereof, of hypothesized low-dose, nonmonotonic dose-response curves for endocrine-active substances (Naciff et al. 2005a).

No-Effect Threshold

Because conventional animal bioassays generally are conducted at high doses, risk assessors have particular interest in whether toxicogenomic technologies will provide empirical evidence of adverse effects at low doses. This includes not only the shape of the dose-response curve but also whether gene expression indicates a no-effect threshold in a more straightforward manner than conventional means.

A concentration at which no transcriptional effect is observed, referred to as a no-observed-transcriptional-effect level (NOTEL), has been demonstrated for estrogen in cells from a hormonally responsive breast cancer cell line (Lobenhofer et al. 2004). The investigators measured multiple transcriptional changes in response to four concentrations of estrogen and found that only physiologically relevant doses of estrogen induced a transcriptional response, which suggests that it is possible to estimate NOTELs through gene expression
microarray experiments. Similarly, Naciff et al. (2005a) found low doses of three estrogenic compounds that did not elicit transcriptional responses. Determining a NOTEL may be of greatest value with well-characterized toxicants because the most sensitive species, sex, organs, and cells can be identified and there has been extensive cross-referencing of in vitro and in vivo responses. However, this concept may be less useful in characterizing less-well-studied types of toxicity. In particular, great care must be taken not to assume that a NOTEL derived in one in vitro cellular system is representative of all potential whole-animal responses. It is essential that efforts to incorporate the concept of a NOTEL into regulatory decision making derive a rigorous definition of NOTEL that can be applied to many different mechanisms of toxicity as well as rigorous methods to ensure that minor changes in gene or protein expression associated with toxicity are detected against background variability.

Impact on Cross-Species Extrapolations

A critical challenge for risk assessment is to identify test species that display responses to toxicants similar to those of humans. Animals show species- and strain-specific differences in responses to toxicants, and these differences are often inconsistent from compound to compound, thus greatly increasing the complexity of interpreting standard bioassay results. The application of toxicogenomic technologies to standard test species and simpler model organisms offers new opportunities to better understand the range of species' responses to exposures and to identify species-specific mechanisms that affect toxic responses. For example, lung tumors in mice have long been studied as models for human adenocarcinoma (AC), the most frequently diagnosed human lung cancer. Stearman et al.
(2005) recently compared lung tissue from the A/J mouse urethane model and human AC using gene expression analysis to quantify the degree of molecular similarity between the murine model and its human counterpart. Gene expression changes between tumor and adjacent normal lung tissue in human AC were recapitulated in the mouse model with striking concordance. More than 85% of genes had similar expression levels between adjacent tissue and tumor samples in both species, suggesting common pathobiology in AC between humans and mice. The authors highlighted the numerous similarities in orthologous genes associated with well-documented hallmarks of cancer (Hanahan and Weinberg 2000).

Similarly, fish have been used as models in cancer research for almost a century because some fish and human tumors have similar histopathologic features. Still, little has been known about the correspondence of the molecular mechanisms that drive tumorigenesis in these two phylogenetically distant species. Lam et al. (2006) evaluated the molecular conservation between human and zebrafish liver tumors. They identified significant similarities between fish and human tumors in the expression profiles of orthologous genes
according to histopathologic tumor grade, thus building confidence in the zebrafish as a reliable model system for cancer research. In another study, liver tumors in zebrafish were found to recapitulate the molecular expression patterns of human liver cancers and to possess features that correlated with progressively higher grades of malignancy (Grabher and Look 2006). Similarly, Sweet-Cordero et al. (2005) and Lee et al. (2004) compared gene expression to assess the molecular relationship of mouse models of KRAS2-mediated liver cancer with similar human cancers. These studies demonstrate that toxicogenomic technologies could be valuable in identifying animal models that most closely correspond to human toxicity and disease.

Toxicogenomic technologies also offer the opportunity to explore the relevance to humans of toxicity findings that vary dramatically among model species. This scenario is exemplified by the dramatic multispecies and strain toxicity differences observed after treatment with dioxin-like substances (for example, TCDD, polychlorinated biphenyls), ubiquitous environmental contaminants that represent a significant concern for human health risks. Some species and strains are known to be extremely responsive to dioxin exposure, whereas others are relatively resistant. Okey and coworkers (2005) suggested that important events critical to expression of toxicity could be resolved by comparing common and dissimilar gene expression profiles in sensitive and insensitive animals.

TCDD elicits a broad spectrum of aryl hydrocarbon receptor (AhR)-mediated toxic biochemical effects, which are well correlated with its ability to bind to the AhR (Safe 1990). Studies demonstrating that mice with low-affinity AhR alleles are less susceptible to the effects of TCDD (Okey et al. 1989) and that mice lacking the AhR are resistant to prototypical toxicities elicited by TCDD and related substances (Mimura et al.
1997) support this contention. Sun et al. (2004) undertook a comparative computational scanning approach assessing gene expression in livers of human, mouse, and rat (species that differ in dioxin response). They used this approach to identify "dioxin response elements" across the three test species and thereby investigate possible explanations for the species differences in dioxin response. Results suggested that AhR-mediated gene expression may not have been well conserved across species, which could have significant implications in human risk assessment.

Despite important developments in cross-species comparisons, such as those described above, the comparison of gene expression data across species remains an arduous challenge. The DNA probes for analogous genes in two different species may have different hybridization characteristics, may represent different regions of the gene, or may probe different splice variants in each species; these and other factors may confound direct comparisons of expression across species, even for genes expressed at the same level. As a result, the data obtained are not always directly comparable and may, in fact, yield disparate results. Nonetheless, this limitation will be mitigated as genomic sequences are completed and gene annotation is improved and incorporated into microarrays. Investigators have begun to develop computational bioinformatic tools that allow for the identification of common genes across microarray platforms representing different species (Mattingly et al. 2003; Thorgeirsson et al. 2006).

Another possible limitation to the generation of informative toxicity data from conventional studies is that test animals and humans may not metabolize or activate the study compound in a similar fashion. This is a major drawback of the Ames bacterial genotoxicity test and necessitates using liver microsomes to represent mammalian metabolism. This issue has also limited the use of other genetically tractable organisms such as yeast. To mitigate this deficit, investigators have engineered human cell lines to constitutively express various combinations of xenobiotic-metabolizing enzymes (Crespi et al. 1997; Crespi and Miller 1999). In addition, several recent genomic studies have used genetically engineered yeast to express genes encoding xenobiotic-metabolizing enzymes such as human CYP1A1 and epoxide hydrolase (Keller-Seitz et al. 2004; Guo et al. 2005, 2006). These yeast strains were then used to study gene expression profiles induced by aflatoxin B1 (AFB1) under a variety of dosing regimens. One of these studies then exploited the power of yeast genetic manipulation to disrupt genes of pathways induced by AFB1 and ascertained DNA repair pathways critical to modulating AFB1-induced mutagenesis (Guo et al. 2005). However, whether the signatures detected in yeast or the pathways affected are also induced by AFB1 in human cells remains to be determined.

Similar approaches can also be used for genes that are not well conserved across species or to investigate the effect of human alleles of specific genes; consider, for example, work with the paraoxonase gene (PON1), which is involved in the metabolism of organophosphate insecticides and nerve gases such as sarin. To model human genetic variation in susceptibility to these agents, Furlong et al.
(2005a,b) generated PON1 knockout mice and used DNA microarrays to compare gene expression profiles induced in the presence and absence of the gene. They also generated knock-in mice carrying the wild-type human PON1 or allelic variants that show enhanced susceptibility. This insertion of a variety of human xenobiotic-metabolizing genes in mouse models and subsequent measurement of gene expression in response to chemical toxins should improve the ability to extrapolate from the laboratory to potential human exposures.

Identification of susceptibility genes such as PON1 can also be integrated into physiologically based pharmacokinetic (PBPK) models to further inform risk assessment. The toxicity of the organophosphate pesticide chlorpyrifos is impacted by PON1 activity. Using PBPK analyses, Timchalk and coworkers (2002) demonstrated that PON1 variation in humans would be expected to impact cholinesterase inhibition at high doses (for example, high-dose occupational scenarios) but would be unlikely to impact this toxicity at low environmental concentrations encountered by the general population. Thus, this approach can test whether genetic polymorphisms linked to a chemical's specific mode of action always confer a significant contribution to adverse health outcomes when environmental exposures are very low.
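The kind of ortholog-based cross-species comparison described in this section can be sketched in a few lines. Everything in this example is invented for illustration: the gene symbols, the fold-change values, and the ortholog map are hypothetical, and real analyses depend on curated orthology databases and platform-specific probe annotations.

```python
# Minimal sketch of a cross-species expression comparison. All gene
# symbols, fold changes, and the ortholog map are hypothetical.

# log2 fold changes (tumor vs. adjacent tissue) for each species
human_fc = {"EGFR": 1.8, "MKI67": 2.1, "CDKN2A": -1.5, "VEGFA": 1.2}
mouse_fc = {"Egfr": 1.6, "Mki67": 2.4, "Cdkn2a": -1.1, "Apoe": 0.9}

# human symbol -> mouse ortholog symbol (hypothetical mapping)
orthologs = {"EGFR": "Egfr", "MKI67": "Mki67",
             "CDKN2A": "Cdkn2a", "VEGFA": "Vegfa"}

def concordant_genes(human, mouse, ortho_map, threshold=1.0):
    """Return orthologous gene pairs whose fold changes exceed the
    threshold in magnitude and agree in direction in both species."""
    hits = []
    for h_gene, m_gene in ortho_map.items():
        if h_gene in human and m_gene in mouse:
            h, m = human[h_gene], mouse[m_gene]
            if abs(h) >= threshold and abs(m) >= threshold and h * m > 0:
                hits.append((h_gene, m_gene))
    return hits

print(concordant_genes(human_fc, mouse_fc, orthologs))
# [('EGFR', 'Egfr'), ('MKI67', 'Mki67'), ('CDKN2A', 'Cdkn2a')]
```

Note how the sketch silently drops genes measured in only one species (VEGFA here); as the text emphasizes, probe mismatches and annotation gaps of exactly this kind are a major source of disparity in real cross-species comparisons.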
Developmental Effects

In utero exposures have become widely recognized as causes of developmental disorders such as fetal alcohol syndrome, X-ray-induced leukemia, and cerebral palsy-like syndrome due to high levels of methylmercury (Harada 1978; IOM 1996; Doll and Wakeford 1997). In addition, prenatal exposures have long been associated with other chronic diseases such as cancer (Birnbaum and Fenton 2003) and, more recently, with neurodegenerative diseases, asthma, cardiovascular diseases, and immune dysfunction (Holladay and Smialowicz 2000; Osmond and Barker 2000; Peden 2000; Pinkerton and Joad 2000). Although developmental toxicity testing is required in the U.S. testing batteries for pesticides and drugs and in Europe under the REACH program, relatively little is known about the health impacts of many chemicals in current use. A number of epidemiologic and conventional toxicologic investigations of the impact of early life exposure on adult disease have been reported, but testing primarily has focused on gross morphologic changes of the offspring and reductions in reproductive capability.

One way that toxicogenomics may be useful in assessing risk from exposures that occur during development is that the near completion of mouse and human genome sequencing is advancing molecular level understanding of development by identifying all genes, including regulatory transcripts (such as microRNAs). Hence, even sequences from genes that are expressed only during development and that were historically underrepresented in cDNA libraries can now be identified with bioinformatic tools and included in gene expression microarrays.
As a result, comprehensive gene expression profiling can now be performed at any developmental stage, not only providing insight into the normal regulatory networks that control development but also allowing for the analysis of how exposures to toxicants can perturb these networks and cause abnormalities.

Nemeth et al. (2005), in work with early mouse embryos exposed to various teratogens, illustrated that toxicogenomics can advance molecular level understanding of development. They identified a benchmark panel of genes for normal development of the eye, an organ considered a definitive target for the study of teratogens. The authors identified 165 genes differentially expressed during rodent eye morphogenesis, including 58 genes common to both rats and mice. The biologic significance of some of the genes and pathways affected, such as glycolysis genes crucial in maintaining oxygen levels, may provide insights into the mechanisms of teratogenesis.

Toxicogenomic studies conducted to date have focused largely on iconic teratogens, exposing animals on a key gestational day and then attempting to identify gene changes that correlate with the teratogenic action observed. For example, Hard et al. (2005) investigated ethanol-induced alterations in gene expression in the fetal mouse brain in an effort to identify a genetic marker for fetal alcohol syndrome and discover the pathways involved in its origin.
Twenty-five genes were associated with ethanol exposure and teratogenic effects. In their transcriptional profiling work with valproic acid (VPA), a potent teratogen that induces neural tube defects, Kultima et al. (2004) defined a subset of VPA-responsive genes for evaluation as potential biomarkers of VPA teratogenicity. Their study also highlighted some potential challenges associated with analyses conducted in embryos, including limited amounts of tissue and complex mixtures of cell types. In addition, alterations in critical gene targets may be difficult to observe because removing cells from the fetus for analysis may ablate the genomic events of interest. Although it is theoretically possible to obtain sufficient starting material (for example, mRNA) from a single cell, it is often not practical, and sampling bias can easily arise because the "pooling" of multiple nonresponsive and noncritical target cells will diminish the signal-to-noise ratio (Kultima et al. 2004).

Working with three endocrine disruptors of different potencies, Naciff et al. (2002) sought to identify a gene expression signature to detect estrogenic effects on development. They identified a common set of genes whose expression was significantly and reproducibly modified by each of the chemicals tested. The products of these genes were plausible targets of endocrine disruptors, including, for example, genes encoding steroidogenesis products important to gonadal differentiation.

Toxicogenomics is also an attractive approach to uncover critical molecular events altered by developmental toxicants. This is because the disruption of signaling and gene regulatory networks that control embryonic development is likely to underlie many cases of birth defects.
For example, VPA has been subjected to a weight-of-evidence mode-of-action (MOA) analysis that illustrates how toxicogenomic and other conventional data inputs can be merged to improve confidence in decisions evaluating whether findings of VPA-induced teratogenicity present plausible concern for potential induction of spina bifida in exposed humans (Wiltse 2005). This analysis concluded that VPA-induced alteration in WNT1-dependent gene expression in both animal and human cells represented a critical MOA event causing in vivo developmental effects. This conclusion was further supported by integrating supplementary dose-response information; classic enzymatic, biochemical, and pharmacologic studies; and outcomes of cross-species studies. (Wnt proteins are a family of secreted signaling molecules that regulate cell interactions during embryogenesis; http://www.stanford.edu/~rnusse/wntwindow.html.)

Understanding Effects of Chemical Mixtures

Standardized animal testing protocols are largely intended to assess the toxicity and potential risks of single chemicals. Data from these types of studies alone provide little evidence of potential adverse interactions that may occur as a result of pharmaceutical or environmental chemical coexposures. Clinical experience has clearly demonstrated the potential for adverse drug interactions, which may be mediated via competition for common metabolic activation or detoxification pathways, pharmacologic interactions, unanticipated idiosyncratic responses, or other mechanisms.

Regardless of the nature of the interaction, gene expression data collected during preclinical drug development studies may afford an improved, early method for identifying such potential interactions occurring at intended doses. For example, toxicogenomic technologies have been used to explore the idiosyncratic hepatotoxic response associated with ranitidine, a therapeutic histamine-2 receptor antagonist used to treat ulcers and gastroesophageal reflux disease (Luyendyk et al. 2004). Using hierarchical clustering of gene expression data from rat microarrays, these investigators identified key transcriptional responses that appear to account for ranitidine-induced injury associated with a subclinical interaction with bacterial lipopolysaccharide (endotoxin).

However, the identification of potential adverse outcomes surrounding coexposures to environmental chemicals presents challenges, primarily because typical human exposures are well below doses used in conventional toxicity studies for many compounds (Teuschler et al. 2002). Furthermore, human chemical exposures are by no means limited to chemicals of human origin; natural products also may play a role in multiagent exposures. Thus, future toxicogenomic studies examining the potential health effects of exposures to mixtures must address the confounding overlay of toxicogenomic signals associated with the range of substances present in diets and other environmental exposures.
A critical element of this challenge is to determine how an exogenous exposure may truly be differentiated beyond the background pattern and variability of toxicogenomic expression associated with everyday "normal" activities such as dietary patterns, physical activities, disease status, and other lifestyle circumstances.

It is unlikely that toxicogenomic signatures will be able to decipher all interactions among complex mixtures of toxicants, diet, drugs, natural compounds, and other environmental exposures, but it should be possible to use available mechanism-of-action data to design informative toxicogenomic experiments. For example, knowledge of the mechanism of action for a chemical should make it possible to rapidly screen other chemicals for potential points of biologic convergence (overlap), such as shared activation and detoxification pathways, enhancing identification and exploration of potential interactions and moving beyond empirical experiments. Despite the lack of clear MOA understanding of how chemicals may interact under conditions of low-dose environmental exposures, the Environmental Protection Agency, acting under legislative mandates (Superfund Act 1980, FQPA 1996, Safe Drinking Water Amendments 1996), ruled that assessors must consider "how to add" the risk implications of complex environmental mixtures (Wilkinson et al. 2000). The primary methods used in regulatory practice are dose addition and response addition. Dose addition sums the exposures to the individual compounds and then estimates risk from the summed dose. Response addition calculates the risk for each compound in the
mixture and then sums the risks to produce a cumulative risk estimate (Wilkinson et al. 2000; Teuschler et al. 2002). These approaches rely on simple addition and do not incorporate synergism and antagonism, which are key considerations for interactions (Teuschler et al. 2002).

Valuable knowledge may come from using toxicogenomic technologies to test these approaches and other basic assumptions that underpin current risk assessment models for mixtures. For example, assumed health risks of environmental mixtures of dioxin-like compounds (for example, TCDD, tetrachlorodibenzofurans, polychlorinated biphenyls) are based on the central hypothesis that the individual toxicities of each of these compounds in a mixture can be cumulatively related to each other through a common mechanism of action involving activation of the AhR (EPA 1986; Safe 1990). This approach assumes that each compound in the mixture can be assigned a toxic equivalency factor (TEF) based on its comparative potency to the standard reference chemical TCDD. The total amount of dioxin-like toxic equivalents (TEQs) present in the mixture can then be calculated based on the amounts of individual chemicals present in the mixture. Toxicogenomic approaches afford promising opportunities to test the fundamental biologic assumptions underlying this TEF/TEQ methodology. If dioxin-like chemicals are truly toxicologically equivalent, then gene expression patterns would be expected to have critical commonalities accounting for AhR-driven toxicity responses. These types of informative investigations are in progress (Budinsky 2005).

Toxicogenomic data can also be used to study interactions among compounds that are not structurally related.
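The additivity arithmetic behind the TEF/TEQ methodology and response addition can be sketched briefly. In this illustration the congener names are real but the concentrations and individual risk values are hypothetical, and the TEF values are shown only for illustration (TCDD's TEF is 1.0 by definition); consult current consensus TEF tables for regulatory values.

```python
# Illustrative sketch of the TEF/TEQ calculation (a form of dose
# addition) and of response addition. Concentrations, risks, and
# non-TCDD TEF values are hypothetical/illustrative.

# TEF: potency of each congener relative to TCDD (TCDD = 1.0 by definition)
tefs = {"TCDD": 1.0, "2,3,7,8-TCDF": 0.1, "PCB 126": 0.1}

# measured concentrations in a mixture (pg/g, hypothetical)
mixture = {"TCDD": 2.0, "2,3,7,8-TCDF": 15.0, "PCB 126": 30.0}

def teq(mixture, tefs):
    """Scale each concentration by its TEF and sum into TCDD toxic
    equivalents (TEQ); risk is then assessed as if the mixture were
    this amount of TCDD alone."""
    return sum(conc * tefs[name] for name, conc in mixture.items())

def response_addition(risks):
    """Response addition as described in the text: compute the risk for
    each compound separately and sum into a cumulative risk estimate."""
    return sum(risks)

print(teq(mixture, tefs))  # 2.0 + 1.5 + 3.0 = 6.5 pg TEQ/g
print(response_addition([0.01, 0.02, 0.005]))  # ~0.035
```

As the text notes, both calculations are purely additive: neither captures synergism nor antagonism among the components.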
Studies could be used to determine whether the gene expression, proteomic, or metabolomic patterns induced by individual chemicals overlap, whether they increase in proportion to the sum of the compounds added, or whether they suggest new mechanisms of toxicity.

IMPLICATIONS FOR BETTER USE AND POTENTIAL REDUCTION IN THE USE OF EXPERIMENTAL ANIMALS

As societal views on animal welfare are increasingly concerned with ethical and humane treatment of animals, in particular vertebrates, governments have responded with legislation. Legislation affecting research and drug testing seeks to minimize the use of animals in research and eliminate statistically underpowered studies that cannot lead to valid conclusions. There is pressure to develop alternative techniques that replace the use of animals altogether, reduce the number of animals used, and refine study designs to minimize distress to the animals.

Over the past 20 years, companies, universities, and other research institutions have contributed substantial time and resources to developing alternative methods and reducing their reliance on animal studies. Although a large number of alternative in vitro tests and computational approaches are available and can provide useful information, most do not accurately model the complexity of in
vivo toxicologic or pharmacologic responses. Since its inception, toxicogenomics has been proposed as a possible way to reduce animal testing (Nuwaysir et al. 1999; Zarbl 2001). It is a challenging task, but if toxicogenomics can be used to predict long-term effects (such as reproductive toxicity, teratogenicity, and carcinogenicity) from short-term animal studies, or even from in vitro toxicogenomic studies alone, such approaches might significantly reduce the number of animals needed for conventional tests and possibly reduce the morbidity and mortality to which test animals are subjected. In addition, if a predictive toxicogenomic signature could be detected during early gastrointestinal damage in animals, extreme gastrointestinal ulceration and bleeding could be limited. Indeed, a genomics study is under way that analyzes fecal material to identify an early biomarker of gastrointestinal damage (Searfoss et al. 2003).

Another potential use of toxicogenomics is the possibility of mechanistically linked surrogate markers. For example, if an exposure known to target an organ elicits a toxicogenomic signature in a surrogate tissue or compartment such as skin or blood, then use of less invasive procedures for assessing toxicity in animals becomes a real possibility. Investigators are exploring the use of blood lymphocytes or proteins present in serum and urine as surrogates for monitoring damage in target tissue organs.

IMPLICATIONS OF TOXICOGENOMICS FOR RISK ASSESSMENT INFRASTRUCTURE NEEDS

A joint report developed by the Society of Toxicology and the Society of Environmental Toxicology and Chemistry pointed out that use of toxicogenomic information in risk assessment decision making will clearly be hindered if the parties ultimately responsible for decisions (regulatory agencies) do not have adequate resources and expertise to confidently analyze and interpret submitted data (Bus et al. 2007).
In addition, adequate research and associated expertise capabilities must also be present within agencies so they can better "own" critical elements of the technology being suggested for use in risk analysis. Regulatory agencies historically have dealt with other new technologies, such as physiologically based pharmacokinetic models (see Chapters 4 and 6), by developing internal capabilities supportive of the technology. However, implementation of toxicogenomic technology within the agencies will likely prove challenging because of the very large datasets and complex bioinformatic needs required for data analysis and interpretation. Practically, until such issues and hurdles are addressed, incorporation of toxicogenomic data into risk assessment evaluation may be slowed or even rejected.

As the flow of toxicogenomic data rapidly increases, there will be a need not only for corresponding technical expertise within the agencies but also for enhanced training of risk assessors and risk managers in the fundamentals of the technology. Because of the wide range of stakeholders in agency risk decisions, such individuals must be able to clearly understand and articulate the processes
by which toxicogenomic information is incorporated into and applied to risk decisions. Additional important challenges include the need to achieve consistency across regulatory agencies in how they review, interpret, and communicate toxicogenomic data, and to maintain transparency and shared learning experiences with the external research and regulated communities.

CONCLUSIONS

It is not yet appropriate to rely solely on toxicogenomic technologies (such as the applications described in earlier chapters) to support risk decisions. However, it is clear that these technologies will help address some of the most vexing challenges facing risk assessment. These challenges include (1) establishing the relevance of animal dose-response data to actual human exposures; (2) understanding the relevance to human risk of different responses in different animal models; (3) identifying and establishing the significance of key factors that may confer particular susceptibilities to chemical exposures, including sensitivity to developmental toxicity; (4) understanding the risk implications of exposures to complex, low-dose environmental mixtures; and (5) refining, reducing, or replacing the use of whole-animal studies in toxicology testing.

The collection of dose-response information, appropriately linked to time, will be essential to fully integrate toxicogenomics into risk assessment decision making. To effectively address risk questions associated with human exposures to environmental chemicals, which may be much lower than doses currently used in toxicology studies, special attention must focus on characterizing toxicogenomic responses at low doses. Conducting toxicogenomic studies over a range of low doses may consume considerable laboratory capacity, expertise, and financial resources.
These studies may be more valuable when incorporated into traditional toxicity testing programs, where toxicogenomic results can be tied to conventional toxicity responses.

RECOMMENDATIONS

1. Develop and expand research programs specifically dedicated to integrating toxicogenomics into challenging risk assessment problems. The development of partnerships among regulatory agencies and private-sector stakeholders to incorporate toxicogenomic approaches and data into risk assessments should be encouraged to ensure the most rapid development of useful examples of toxicogenomics in risk assessment. Examples of important research areas include those discussed in the conclusions section of this chapter and in Chapters 4, 5, 6, and 7.

2. Future toxicologic assessment should incorporate dose-response and time-course analyses appropriate to risk assessment. An analysis of well-characterized toxic compounds would provide an intellectual framework for future studies.
3. Continue to use toxicogenomics to study differences in toxicant responses between animal models. These results will afford valuable opportunities to extend knowledge of how to effectively translate animal model observations into credible estimates of potential human risk. If toxicogenomics can illustrate how currently known interspecies differences in toxicity can be more rapidly and clearly explained, it will offer the potential to significantly enhance confidence in the animal-to-human toxicity extrapolations that constitute a foundational element of risk evaluations.

4. Use toxicogenomics to investigate how exposure during early development conveys susceptibility to drug and chemical toxicities. Efforts to develop mode-of-action data to clarify the need to apply either specific or universal default uncertainty factors in addressing susceptibility concerns, and to supplant such defaults with data- and principle-driven alternatives, will be of great value.

5. Use toxicogenomic approaches to test the validity of methods for estimating potential risks associated with mixtures of environmental chemicals. Investigations examining responses in the range of relevant exposures and low doses will be particularly valuable.

6. Invest in research and expertise within the infrastructure of regulatory agencies, as well as active collaboration across agencies, to enable toxicogenomic approaches to be effectively and credibly integrated into risk assessment practice. Transparent and participatory mechanisms for educating and engaging the scientific, regulatory, and public communities will also be needed.

7. Support education and training programs at the doctoral and postdoctoral levels to train a new generation of risk assessors fluent in the science of toxicogenomics.
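The dose-response characterization called for in recommendation 2 can be illustrated with a minimal sketch: fitting a Hill model to hypothetical gene-expression fold-change data and deriving a benchmark-dose-style estimate. The data points, parameter grids, and benchmark response below are invented for illustration only; a real analysis would use replicate data, a proper optimizer, and confidence bounds.

```python
# Hypothetical dose (mg/kg) -> observed fold change for one transcript
data = [(0.1, 1.02), (0.5, 1.10), (1.0, 1.35), (5.0, 2.6), (10.0, 3.1)]

def hill(dose, top, kd, n=1.0):
    """Hill model: response rises from a baseline of 1.0 toward 1.0 + top,
    with half-maximal response at dose kd."""
    return 1.0 + top * dose**n / (kd**n + dose**n)

def fit_hill(data):
    """Coarse grid-search least squares over (top, kd)."""
    best = None
    for top in [t / 10 for t in range(5, 51)]:      # 0.5 .. 5.0
        for kd in [k / 10 for k in range(1, 101)]:  # 0.1 .. 10.0
            sse = sum((hill(d, top, kd) - y) ** 2 for d, y in data)
            if best is None or sse < best[0]:
                best = (sse, top, kd)
    return best[1], best[2]

def benchmark_dose(top, kd, bmr=0.1):
    """Dose at which the modeled response exceeds baseline by bmr,
    solving hill(d) = 1 + bmr for n = 1: d = bmr*kd/(top - bmr)."""
    return bmr * kd / (top - bmr)

top, kd = fit_hill(data)
bmd = benchmark_dose(top, kd)
print(top, kd, bmd)  # fitted parameters and a low-dose benchmark estimate
```

The point of the sketch is the shape of the analysis, not the numbers: once a toxicogenomic endpoint has a fitted dose-response curve, a point of departure can be read off at a chosen benchmark response well below the doses actually tested, which is where the low-dose questions discussed in this chapter live.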