7 Application to the Study of Mechanisms of Action

The study of biologic mechanisms is a priority for basic and clinical researchers interested in elucidating the cellular, biochemical, and molecular basis of chemical and drug toxicity. Mechanistic insight is required for in-depth understanding of the pathobiology that underlies the adverse response to chemical exposures as well as the development of pharmacologic and nonpharmacologic strategies that can control or contain adverse outcomes in chemical toxicity. In addition, toxic chemicals are increasingly being used as tools to unravel complex mechanisms of disease onset and progression and to define the role of environmental injury in acute and chronic pathogenesis. As such, the toxicology research community readily embraces the application of new technologies and approaches to study mechanisms of toxicity.

From a practical perspective, knowledge and insight gained from mechanistic toxicology investigations have proven useful in risk assessment and drug development (Petricoin et al. 2002c). The chemical industry is interested in applying toxicogenomic technologies to monitor the biologic response of humans to their products and to study chemical-biologic interactions in target species. Current risk assessment processes use mechanistic understanding to detect or identify hazards and to make decisions about dose-response relationships. However, despite general acceptance of the concept that mechanistic information is valuable for understanding the basis of the toxic response, data from mechanism-based investigations are seldom incorporated into risk assessment paradigms, because either understanding of the mechanism of toxicity is incomplete or competing hypotheses exist. The scarcity of comprehensive assessments of multiple exposures in real time or with adequate quantitative resolution has slowed progress.
Recent reports by the National Research Council (NRC 2006b) and the Environmental Protection Agency (EPA 2003) on the health risks from
dioxin and related chemicals as well as the report of the International Agency for Research on Cancer on the human carcinogenicity of benzo(a)pyrene (IARC 1987) exemplify the usefulness of mechanism-based information in risk assessment. In sharp contrast to the chemical industry, the pharmaceutical industry routinely uses mechanisms and molecular-level understanding to identify off-target biologic responses to potential new compounds and products already on the market. In fact, these data are used in new drug applications for regulatory approval by the Food and Drug Administration. In both settings, mechanistic studies have greatly facilitated informed decision making and more detailed understanding of critical issues.

Toxicogenomic studies offer the opportunity to evaluate molecular mechanisms of toxic action and the degree to which the biologic response pathways responsible for toxicity are conserved across species. When applied to the study of large classes of chemicals or drugs, toxicogenomic information can be used to globally define modes or mechanisms of toxic action. The application of toxicogenomics to the study of toxicity mechanisms rests on the premise that chemical or physical injury is mediated by, or reflected in, changes at the mRNA, protein, or metabolite level. Abundant evidence supports the concept that toxicity coincides with changes in mRNAs, proteins, and metabolites. As such, these changes under defined conditions of cellular location, time, and biologic context can provide meaningful information about biologic responses to toxic insult. Thus, toxicogenomic studies offer a new dimension in environmental exposure assessment, drug and chemical screening, and understanding of human and animal variability in response to drugs and chemicals.
This chapter provides examples of the application of toxicogenomic analyses for exploring toxicity mechanisms of environmental chemicals and pharmaceuticals. Because transcriptome profiling technologies are technically more mature than methods for proteomics or metabonomics, the discussion focuses primarily on transcriptomics, with complementary insights derived from proteomics and metabonomics. The limitations of the various technologies and a needs assessment are also presented.

STATE OF THE ART IN TRANSCRIPTOMIC ANALYSES

Investigators have exploited transcriptome profiles to understand mechanisms of environmental chemical or drug toxicity. Generally, the experimental designs used fall into three broad approaches (Box 7-1), which are discussed later in this chapter: the contextual approach, the exploratory approach, and the network building approach.

Transcriptome profiling experiments designed to evaluate toxicity provide a "snapshot" of global gene activity as measured by steady-state levels of mRNA at a precise point in time during the course of the toxic response. Genetic and epigenetic mechanisms that orchestrate the recruitment of RNA polymerase
and other components of the transcriptional machinery for the synthesis of RNA control the amount of gene activity. Once the DNA template has been copied, the cellular machinery must process the precursor RNA into one (or more) mature forms that act as the template for protein synthesis. Proteins, in turn, are responsible for the biologic processes that govern cellular functions, susceptibility to chemical toxicity, and generation of products of cellular metabolism. Combining transcriptional analysis with parallel or correlational analyses of both protein and metabolite profiles makes it possible to examine the multidimensionality of the toxic response. The study of mechanisms is complicated by the fact that toxicity often involves complex interactions of chemicals with genes, proteins, and cellular metabolic intermediates.

BOX 7-1 Types of Experimental Approaches in Toxicogenomic Studies to Evaluate Mechanisms of Toxicity

• The contextual approach places newly generated toxicogenomic data in the context of available biologic and toxicologic knowledge by comparing a set of elements (transcriptome profiles) with known response parameters. The reasoning process is deductive.

• The exploratory approach moves from a discovery to a hypothesis-driven approach that defines specific mechanisms of action. The information obtained generates novel insights into biologic and toxicologic mechanisms. The reasoning process is, by definition, more inductive.

• The network building approach uses patterns of transcriptional co-regulation or other means to construct interaction maps for affected genes and regulatory networks. The information provides opportunities to deduce relationships that place individual responses within the broader framework of complex interactions.

Genetic variability is an important determinant of the transcriptional response elicited by toxic insult.
Thus, genetic factors such as polymorphisms that contribute to variable susceptibility to chemical and drug toxicity need to be factored into toxicogenomic analyses. In addition to genetic influences on toxicogenomic response, recent studies have established the importance of epigenetic mechanisms (heritable changes in gene expression profiles that occur without changes in DNA sequence; see Chapter 6) in toxic injury. The epigenome regulates mRNA abundance and is a critical determinant of gene expression. Microarray technology is now capable of assessing epigenetic variability, and impacts of the epigenome on toxicogenomic and transcriptomic responses are being established (Stribinskis and Ramos 2006).

Also relevant to this discussion is the recent introduction of DNA microarray chips to study the role of microRNAs in biology (Monticelli et al. 2005; Shingara et al. 2005; Castoldi et al. 2006). MicroRNAs are a class of naturally occurring, small, noncoding RNAs involved in regulating the translation and
processing of mRNAs. MicroRNAs bind to the 3' untranslated region of target mRNAs and target the transcript for degradation. As such, microRNAs are believed to play important roles in regulating cell replication, differentiation, and apoptosis. Aberrant patterns of microRNA expression have now been implicated in human cancer (Iorio et al. 2005), suggesting that toxicity may involve alterations in RNA processing.

Contextual Approach to Study Mechanisms of Action

The contextual approach (Box 7-1) to studying mechanisms of action places newly discovered information within the context of the larger body of biologic and toxicologic knowledge, comparing newly generated toxicogenomic data from a test chemical with data from compounds whose mechanism of action is better understood. The experimenter may have some prior knowledge of the nature of molecular changes that would be consistent with an adverse response. Alternatively, the experimenter may not know the molecular nature of the adverse response but can use the newly acquired data to determine possible toxicities. The result of this analysis is evidence implicating a set of genes in toxicity and identification of the similarities and differences between the test chemical's toxicogenomic data and known toxicity mechanisms. Thus, applying a contextual approach facilitates the discovery process.

Contextual analysis of gene expression is best exemplified by studies in which the expression profiles of a chemical with unknown effects are compared with the profiles of gene expression of compounds with known effects. This is similar to the class prediction approach described in Chapter 3 and its applications to hazard screening as described in Chapter 5.
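The class prediction idea underlying the contextual approach can be sketched as a nearest-fingerprint comparison: the profile of a test chemical is correlated against reference profiles of compounds with known mechanisms, and the best match suggests a candidate class. The gene profiles, fold-change values, and compound labels below are invented for illustration; real studies use thousands of transcripts and formal classifiers.

```python
# Hedged sketch: assign a test chemical to a mechanism class by Pearson
# correlation against reference expression "fingerprints" (all data invented).

def pearson(x, y):
    """Pearson correlation between two equal-length expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Reference fingerprints: log2 fold-changes for a handful of transcripts
# (values are illustrative placeholders, not measured data).
references = {
    "TCDD (AhR agonist)":    [3.1, 2.4, 0.2, -0.1, 1.8],
    "B[a]P (AhR agonist)":   [2.7, 2.0, 0.4, 0.0, 1.5],
    "NaCl (osmotic stress)": [0.1, -0.2, 2.9, 2.2, -0.3],
}

def classify(query):
    """Return the reference compound whose fingerprint best matches the query."""
    return max(references, key=lambda name: pearson(query, references[name]))

test_profile = [2.9, 2.2, 0.3, 0.1, 1.6]  # resembles the AhR agonist profiles
best = classify(test_profile)
```

Because the comparison is purely correlative, it requires no annotation of the individual genes, which mirrors the "unbiased" character of the contextual approach described above.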
Although mainly descriptive, the approach is unbiased and does not require a detailed understanding of affected biologic pathways, and categorical classifications do not require annotations or other detailed understanding of affected genes. Compounds with similar toxicities often affect similar genes, as shown in experiments in which "fingerprints" of the genomic response to different chemical classes have been developed (Hamadeh et al. 2002a,b).

Several examples of this approach have been described in the literature and include studies of the responses initiated when 2,3,7,8-tetrachlorodibenzo-p-dioxin interacts with the aryl hydrocarbon receptor (AhR) (Puga et al. 2000; Frueh et al. 2001; Kurachi et al. 2002; Boverhof et al. 2005; Fletcher et al. 2005; Thackaberry et al. 2005). Vezina et al. (2004) used DNA microarrays to identify rat hepatic transcriptome profiles associated with subchronic exposure to three AhR ligands and one structurally related non-AhR ligand. The AhR ligands produced similar gene expression profiles that could be readily distinguished from the profile of the non-AhR ligand. Thus, the gene expression changes controlled by the AhR could be enumerated. This enumeration identified several genes not previously characterized as AhR targets, including carcinoembryonic cell adhesion molecule 4 (C-CAM4) and adenylate cyclase-associated protein 2 (CAP2).
The data provided mechanistic insight by implicating novel genes in the AhR signal transduction pathway and identifying putative targets of toxicity.

In another study of mechanisms of chemical carcinogenesis, Dickinson et al. (2004) compared the effects of the genotoxic drug cisplatin with nongenotoxic osmotic stress induced by sodium chloride. Transcriptome profiles of cisplatin-treated cells revealed significant increases in transcripts associated with DNA damage (for example, members of the GADD45 family) and adaptive cellular repair (for example, fos and an HSP40 homologue). In contrast, at equitoxic concentrations, the gene expression profile of sodium chloride-treated cells did not indicate changes in transcripts associated with DNA damage and repair. This suggests that DNA damage and cellular repair may be important in cisplatin-mediated toxicity. In a similar study, Ellinger-Ziegelbauer et al. (2004) identified transcripts that characterize the DNA damage response, induction of drug metabolism enzymes, and survival and proliferative pathways after a 2-week exposure to four hepatocarcinogens. Observed differences in gene expression profiles may represent the processes preferentially affected by the carcinogens and may be related to the mechanism of carcinogenesis.

Exploratory Approach to Study Mechanisms of Action

The second experimental approach, the exploratory approach to studying mechanisms, moves from generating contextual insights to testing specific hypotheses with lower throughput and more detailed molecular and biochemical analyses. The information derived from exploratory studies provides insight into biologic and toxicologic mechanisms by using a reasoning process that is more inductive. This hypothesis-driven approach is highly complementary to the contextual approach described above, so the two approaches are often used in concert to test specific hypotheses.
For example, knowledge of the identity of a specific gene discovered in the contextual approach can be used later in exploratory studies to reveal downstream events involved in a toxic response. More detailed information about mechanisms of toxicity is gained by perturbing the system to study biologic relationships. In this scenario, genes identified by using a contextual design are disrupted pharmacologically, genetically, or molecularly, and the genome-wide response is reevaluated to gain more detailed information and additional mechanistic insight. For example, pharmacologic agents or molecular interventions such as posttranscriptional gene silencing, also known as RNA interference, are used to selectively target genes of interest and test specific hypotheses involving gene-gene associations that emerged from transcriptional profiling experiments (Hannon and Rossi 2004). A well-designed series of experiments can provide a more in-depth understanding of how multiple elements within the affected pathways are connected and their roles in toxicity. This approach is biased toward known biologic mechanisms, especially those well represented in the published literature, and success depends on the
knowledge and experience of the investigator. Nonetheless, it can provide an effective means to uncover mechanisms of toxicity.

The exploratory approach frequently relies on pathway analyses to determine how discrete elements (transcripts, proteins, and metabolites) contribute to mechanisms of toxic responses. These analyses often use the Kyoto Encyclopedia of Genes and Genomes or the Gene Ontology database to define categories of cellular component, biologic process, or molecular function and software tools (for example, Expression Analysis Systematic Explorer, Ingenuity, GeneGo, or MicroArray Pathway Profiler) to reveal affected pathways.

Multiple examples of the exploratory approach can be found in the primary literature. Thomas et al. (2001) identified a set of genes useful for predicting toxicity using transcriptome profiles for 24 hepatotoxic compounds representing five toxicologic categories (peroxisome proliferators, AhR agonists, noncoplanar polychlorinated biphenyls, inflammatory agents, and hypoxia-inducing agents). After surveying 1,200 transcripts, either correlation-based or probabilistic analysis yielded a weak toxicologic classification accuracy (<70%) based on a known mechanism of action. However, with a forward parameter selection scheme, a diagnostic set of 12 transcripts was identified that provided 100% predictive accuracy based on a "leave-one-out cross-validation" approach. A forward parameter selection scheme is an iterative process in which transcripts are examined individually with a naïve Bayesian model and the transcripts with the best internal mechanistic classification rates and highest confidence (representing the sum of all probabilities for correctly classified treatments) are selected. This exemplifies how a classification approach coupled with statistical analysis can be used in exploratory experiments to study mechanisms of toxic action.

Moggs et al.
(2004) also used the exploratory approach, identifying gene expression changes that drive the response of the immature mouse uterus to 17-beta-estradiol. Gene expression changes were sorted into groups of selected cellular processes, such as protein synthesis and cell replication, and expression was analyzed relative to changes in uterine weight. Correlation of phenotypic (uterine weight in this case) and toxicogenomic responses, known as "phenotypic anchoring" (Tennant 2002), is useful in linking transcriptional perturbation events to toxicity outcomes. Ulrich and coworkers (Waring et al. 2002) used a classification approach together with discrete gene analysis to examine the biologic pathways affected by an inhibitor of the transcription factor nuclear factor kappa B as well as the genes associated with hepatic hypertrophy and toxicity.

Experiments by Anazawa et al. (2005) exemplify the use of molecular intervention combined with transcriptome profiling to study mechanisms of toxicity. This study examined gene expression profiles of prostate cancer cells isolated by laser capture microdissection to exclude signal contamination by other cells. A newly identified gene, prostate collagen triple helix (PCOTH), was found to be specific to prostate cancer cells, and the corresponding protein was expressed in prostate tumor tissue. Decreasing PCOTH expression with small interfering
RNA attenuated prostate cancer cell growth, whereas increasing PCOTH by DNA transfection enhanced growth. In cells expressing the increased exogenous PCOTH, phosphorylation of SET (a nuclear translocation protein expressed in acute undifferentiated leukemia) was detected. A reduction of endogenous SET levels also attenuated the viability of prostate cancer cells, suggesting that PCOTH mediates growth and survival of prostate cancer cells through the SET pathway. This investigation led the investigators to suggest PCOTH as a target for new therapeutic strategies for prostate cancers.

An example of how an iterative process can be useful once key elements have been uncovered is presented in a study by Guo et al. (2006), who used transcriptome profiling to study aflatoxin B1 (AFB1) injury in a genetically engineered yeast model. AFB1 is a potent human hepatotoxin and hepatocarcinogen produced by the mold Aspergillus flavus that affects the health of millions of people in developing countries worldwide. The authors engineered a yeast strain to express human cytochrome P450 1A2, which metabolizes AFB1 to its ultimate mutagen. They also engineered the genetic background of the yeast to allow for direct correlations between toxicity, gene expression, and mutagenicity. Genes activated by AFB1 treatment included mediators of the DNA damage response. The study also found rapid and coordinated repression of histone and M/G1 phase-specific transcripts, but these molecular changes were uncoupled from cell cycle arrest, suggesting that histone gene repression by genotoxic stress in yeast involves signaling pathways different from those involved in cell cycle control. This study exemplifies the power of transcriptional profiling combined with other molecular approaches to study mechanisms of toxicity.
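The forward parameter selection scheme with leave-one-out cross-validation described above for Thomas et al. (2001) can be illustrated in miniature. In the sketch below, a diagnostic transcript set is grown greedily by cross-validated accuracy; for brevity, a nearest-centroid classifier stands in for the naïve Bayesian model of the original study, and the toy profiles, class labels, and transcript count are invented.

```python
# Hedged sketch of greedy forward feature selection scored by leave-one-out
# cross-validation (LOOCV). All data are illustrative, not from the study.

def loo_accuracy(samples, labels, feats):
    """LOOCV accuracy using only the transcript indices in `feats`."""
    correct = 0
    for i in range(len(samples)):
        train = [(s, l) for j, (s, l) in enumerate(zip(samples, labels)) if j != i]
        # Per-class centroids over the selected transcripts.
        groups = {}
        for s, l in train:
            groups.setdefault(l, []).append([s[f] for f in feats])
        cents = {l: [sum(col) / len(col) for col in zip(*rows)]
                 for l, rows in groups.items()}
        q = [samples[i][f] for f in feats]
        pred = min(cents, key=lambda l: sum((a - b) ** 2
                                            for a, b in zip(q, cents[l])))
        correct += pred == labels[i]
    return correct / len(samples)

def forward_select(samples, labels, n_feats):
    """Greedily grow a diagnostic transcript set of size `n_feats`."""
    chosen = []
    for _ in range(n_feats):
        pool = [f for f in range(len(samples[0])) if f not in chosen]
        best = max(pool, key=lambda f: loo_accuracy(samples, labels, chosen + [f]))
        chosen.append(best)
    return chosen

# Toy data: 6 treatments x 4 transcripts; transcript 2 separates the classes.
profiles = [[0.1, 5.0, 9.0, 1.2], [0.3, 4.8, 8.7, 0.9], [0.2, 5.1, 9.2, 1.1],
            [0.2, 5.0, 1.0, 1.0], [0.1, 4.9, 1.3, 1.2], [0.3, 5.2, 0.8, 0.9]]
classes = ["peroxisome", "peroxisome", "peroxisome", "AhR", "AhR", "AhR"]
selected = forward_select(profiles, classes, 1)
```

Here the procedure correctly singles out the one informative transcript; in the actual study the same logic, scaled to 1,200 transcripts and a probabilistic classifier, yielded the 12-transcript diagnostic set.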
Finally, animal models that have been engineered with added (knock-in) or deleted (knock-out) genes or that have been engineered to carry human genes have been successfully used to evaluate the transcript profiles involved in chemical toxicity. These approaches allow investigators to construct experimental model systems with which to evaluate the interaction of added or deleted genes with the cellular machinery and its micro- or macroenvironment. The combination of genetically modified models with other toxicogenomic approaches provides a powerful tool to examine the molecular basis of the toxic response.

Network Building Approach to Study Mechanisms of Action

The third experimental approach uses patterns of transcriptional co-regulation and targeted genetic manipulations to identify biologic interaction networks involved in toxicity. This type of experiment uses relational biologic information to place the toxicologic response within the complex framework that mediates the biologic response. The ultimate goal is to move from traditional reductionist approaches to analyzing the behavior of the system as a whole, a "systems biology" approach. The application of systems biology approaches will require that toxicogenomic analysis include not only multigene dimensionality but also the dimensions of dose and time. These analyses are highly dependent on databases of gene and protein interaction networks (interactomes), which provide a framework for interpretation. Interactomes for yeast and other model organisms are available and are becoming increasingly useful in this context.

Studies by Samson and colleagues, looking at yeast (Saccharomyces cerevisiae) treated with model alkylating agents, were the first to reveal distinct transcriptome profile changes in response to toxicants (Jelinsky and Samson 1999; Jelinsky et al. 2000). This group elaborated on this approach to generate the first systems biology-level analysis of toxicant action. They produced deletion mutant strains of yeast (strains with different genes deleted) to evaluate the roles of individual genes, proteins, and interactome networks (Begley et al. 2002, 2004; Said et al. 2004). These studies revealed new, unanticipated stress signaling networks with novel interactions among genes involved in DNA repair, protein turnover, and metabolic regulation. The power of this yeast model derives from the integration of transcriptome profiling, an extensive protein-protein and protein-gene interaction database (Xenarios et al. 2002), and the availability of a near-comprehensive library of mutants with single genes deleted (Giaever et al. 2002).

Yao et al. (2004), who investigated signal transduction pathways evoked by AhR activation, reported another example of network building. These investigators used an ordered panel of yeast (S. cerevisiae) strains that harbored deletions in each of 4,507 genes.
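The network mapping step in screens of this kind can be sketched as projecting the genes that score as hits onto an interactome and grouping them into connected modules. The gene names and interaction edges below are illustrative placeholders, not the actual screen results; real analyses draw edges from curated interactome databases.

```python
# Hedged sketch: group screen "hit" genes into candidate functional modules
# by finding connected components in the interactome subgraph they induce.
# Hits and edges are invented placeholders for illustration.
from collections import defaultdict, deque

def modules(hits, interactions):
    """Return the connected components of the hit-gene interaction subgraph."""
    adj = defaultdict(set)
    for a, b in interactions:
        if a in hits and b in hits:   # keep only edges between screen hits
            adj[a].add(b)
            adj[b].add(a)
    seen, comps = set(), []
    for gene in sorted(hits):
        if gene in seen:
            continue
        comp, queue = set(), deque([gene])
        while queue:                   # breadth-first traversal of one module
            g = queue.popleft()
            if g in comp:
                continue
            comp.add(g)
            queue.extend(adj[g] - comp)
        seen |= comp
        comps.append(sorted(comp))
    return comps

# Hypothetical hit list and interactome edges (illustrative only).
hits = {"HSP82", "CDC37", "NMD5", "SRP1", "GAL11"}
edges = [("HSP82", "CDC37"),   # chaperone pair: a candidate folding module
         ("NMD5", "SRP1"),     # importins: a candidate translocation module
         ("SRP1", "KAP95")]    # partner outside the hit list is ignored
found = modules(hits, edges)
```

Grouping modifiers into interactome-connected modules in this way is the kind of operation that lets a screen's flat gene list be read as a small number of functional steps in a signaling cascade.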
Because this collection of deletion mutant strains had essentially every gene in the organism deleted, addition (via transfection) of the human AhR gene in combination with AhR agonists (α- or β-naphthoflavone) could be used to systematically analyze the biologic interaction between specific genes and the AhR activation cascade. A relatively small number of genes (54) exerted a significant influence on AhR signal transduction. The investigators then described the relationships among these modifying genes by using a network map based on the yeast interactome. This revealed that AhR signaling is defined by five distinct steps regulated by functional modules of interacting modifiers. They classified these modules as mediating receptor folding, nuclear translocation, transcriptional activation, receptor level, and a novel nuclear step related to the Per-Arnt-Sim domain. By coupling computer-assisted and experimental annotations, this study identified the minimum number of genetic loci and signaling events required for AhR signaling; it exemplifies the power of the network analysis approach to study mechanisms of toxicity.

Another example is a study by C. D. Johnson et al. (2003) that evaluated transcriptional reprogramming during the course of environmental atherogenesis in response to benzo(a)pyrene, a ligand for AhR and an inducer of oxidative stress. A combined oxidant-antioxidant treatment regimen was used to identify redox-sensitive targets early during the course of the atherogenic response. Supervised and nonsupervised analyses identified transcripts highly regulated by benzo(a)pyrene, unaffected by antioxidant alone, and neutralized by combined chemical treatments. Critical transcripts included lymphocyte antigen 6 complex, histocompatibility class I component factors, secreted phosphoprotein, and several interferon-inducible proteins. A predictor algorithm was then applied to define critical gene-gene interactions involved in the atherogenic response. Transcriptional gene networks predictive of the transcriptional activation of AhR were defined in a later study showing that the expression of AhR is most commonly predicted by lymphocyte antigen 6 complex, locus e, a frequent predictor among three-gene combinations that included insulin growth factor binding protein 3 and tumor necrosis factor receptor superfamily member 1b (Johnson et al. 2004). Linkage diagrams of significant predictors were then used to delineate how individual genes integrate into a complex biologic network of genes potentially involved in the toxic response to AhR ligands.

Workman and coworkers (2006) recently used a systems biology approach to map DNA damage response pathways in yeast. In these studies, genome binding sites for 30 DNA damage-related transcription factors were identified after exposure of yeast to methyl methanesulfonate. These binding sites were identified by integrating transcription factor binding profiles with genetic perturbations, mRNA expression, and protein interaction data. The product was a physical map of regulatory interactions and biologic connectivities that can help define the biologic response to DNA-damaging agents.

State of the Art: Proteomic Analyses

The principles and concepts described for transcriptome analyses and their applications to the study of mechanisms of action also apply to proteomic analyses. The integration of mass spectrometry (MS) with protein and peptide separation technologies has enabled the characterization of complex proteomes as well as mechanistic study of chemically induced protein modifications (Liebler 2002a; Aebersold and Mann 2003).
Because of the extraordinarily wide range of protein expression levels (>10^6), no proteomic approach is capable of analyzing all proteins in a cell, tissue, or organism. Global (as opposed to comprehensive) proteome analyses have been applied to derive insight into mechanisms of toxic action. Two-dimensional gel electrophoresis and MS-based protein identification have revealed a number of proteome changes that are characteristic of chemical toxicities (Fountoulakis et al. 2000; MacDonald et al. 2001; Ishimura et al. 2002; Ruepp et al. 2002; Bandara et al. 2003; Xu et al. 2004). In these cases, a contextual or exploratory approach identified proteins and peptides that are correlated with injury and the adaptive biologic responses that follow. Although many of the changes observed may encode mechanistic information about chemically induced injury, distinguishing proteomic changes that represent causes versus effects requires more sophisticated experimental approaches.

As described in Chapter 2, one such study recently described the application of quantitative shotgun proteomics using isotope-coded affinity tag labels to compare proteomes of mice that are susceptible or resistant to the hepatotoxic analgesic acetaminophen both before and after drug treatment (Welch et al.
2005). These analyses provided a global picture of proteome differences that may govern susceptibility as well as proteome changes that occur with injury.

Modeling protein interaction networks using time-series data is a powerful method for uncovering mechanisms in toxicogenomics, as revealed by a study by Allen et al. (2006). These investigators argued that existing analytical approaches in proteomics typically yield a large list of proteins modified under specific conditions, but these data alone may not yield a systems biology understanding of cell signaling or produce hypotheses for future experiments. To improve understanding of proteomic data, these investigators used Monte Carlo approximations to predict behavior and to define relationships in cellular signaling networks. Because signals are often transmitted via covalent modification of protein structure, the investigators examined protein carbonylation upon exposure of yeast to copper as an experimental model and identified protein connections that shut down glycolysis in a reverse, stepwise fashion in response to copper-induced oxidative stress.

The cellular response to toxic injury often involves direct modification of proteins. The types of modifications seen include carbonylation, phosphorylation, glycosylation, sumoylation, and ubiquitination. Proteomic approaches can go beyond traditional toxicologic approaches in identifying molecular targets of injury and delineating mechanisms of toxicity. Covalent binding has long been known to contribute to toxicity in many cases, but mechanistic insights have been limited by a lack of knowledge about protein targets of reactive intermediates (Cohen et al. 1997; Liebler and Guengerich 2005). In exploratory studies, the application of MS-based proteome analyses has revealed protein targets.
Two-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis (2D SDS-PAGE; see Chapter 2) analyses of proteins from cells and tissues treated with radiolabeled toxicants identified putative protein targets but did not allow mapping of the actual protein adducts (Qiu et al. 1998; Lame et al. 2000). Other work has applied the combination of liquid chromatography and tandem MS (LC-MS-MS) to map the sites of electrophile adduction on proteins. The major problem with the identification of protein adducts is the apparently low stoichiometry of modification, meaning that adducted proteins are relatively difficult to detect in complex protein mixtures. LC-MS-MS analysis of bile from rats treated with the prototypical hepatotoxin 1,1-dichloroethylene (DCE) revealed DCE-derived adducts on biliary proteins (Jones et al. 2003). Future studies with appropriate model compounds and affinity capture of adducts should help address some of these relationships.

State of the Art: Metabonomic Analyses

Metabonomic analyses use high-field nuclear magnetic resonance (NMR), gas chromatography-mass spectrometry (GC-MS), or LC-MS (see Chapter 2) to analyze complex mixtures of metabolites in biofluids and tissue samples (Nicholson et al. 2002). In this context, "metabolites" include metabolic products of xenobiotics and
products of endogenous metabolism, including small molecules generated by bacteria within an organism (for example, gut microflora). Assessment of toxic responses is based on the relative intensities of signals corresponding to multiple metabolites, many of which can be identified (for example, by NMR chemical shifts). This allows inferences about the metabolites and pathways involved in toxicities. Serial measurements enable metabolite status to be correlated with time-dependent changes in other parameters, which allows inferences about specific metabolic pathway changes during the course of toxicity and recovery. Most studies to date have applied NMR- and MS-based metabonomic analyses to animal models of tissue-specific toxicity. These studies have described metabolite profiles characteristic of toxicities for different types of chemicals (Nicholls et al. 2001; Waters et al. 2001; Bollard et al. 2002; Coen et al. 2003; Y. Wang et al. 2005b; Williams et al. 2005).

Toxicity mechanisms are commonly envisioned at the molecular-chemical level, yet toxicity is a systems- and organ-level phenomenon. Metabonomic analyses of biofluids, such as urine, can provide mechanistic insights at the level of metabolic pathways and system dysfunction. The advantage over conventional analysis of metabolites is that multiple metabolites in a biologic network can be examined simultaneously and integrated with the responses of related biologic systems. Because urinary metabolite profiles reflect toxicities in multiple organs or tissues, metabolite profiles must be deconvoluted to extract information specific to target tissues. Waters et al. (2005) provided an example of metabolic deconvolution of toxicity signatures in studies of thioacetamide, which is both hepatotoxic and nephrotoxic. They relied on knowledge of both renal and hepatic biochemical physiology to interpret metabolite profile changes accompanying tissue injury.
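The signal-identification step that underlies such analyses can be sketched as a tolerance lookup of observed peaks against a table of reference chemical shifts. The table below lists approximate textbook 1H NMR positions for a few common urinary metabolites; both the values and the tolerance are illustrative, and real assignment relies on curated spectral databases and multiplet patterns.

```python
# Hedged sketch: propose metabolite assignments for observed 1H NMR peaks by
# matching against approximate reference chemical shifts within a tolerance.
# Shift values are rough textbook positions, used here only for illustration.

REFERENCE_SHIFTS = {          # ppm, approximate
    "citrate":    [2.54, 2.66],
    "creatinine": [3.04, 4.05],
    "hippurate":  [3.96, 7.55, 7.64, 7.83],
    "lactate":    [1.33, 4.11],
}

def assign(observed_ppm, tol=0.03):
    """Return metabolites with at least one reference peak within `tol` ppm."""
    matches = {}
    for peak in observed_ppm:
        for met, refs in REFERENCE_SHIFTS.items():
            hit = [r for r in refs if abs(r - peak) <= tol]
            if hit:
                matches.setdefault(met, []).extend(hit)
    return matches

spectrum = [1.32, 2.55, 4.10]   # observed peaks from a hypothetical urine spectrum
candidates = assign(spectrum)
```

Serial spectra scored this way give the time courses of individual metabolites, which is what permits the pathway-level and deconvolution arguments described above.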
Many signals observed in NMR spectra or MS profiles can be confidently attributed to known compounds, and this linkage provides a systems context to interpret changes. Analysis of time-dependent changes also is critical to distinguishing primary effects from secondary adaptive responses (Keun et al. 2004; Waters et al. 2005).

Metabonomics can elucidate toxicity mechanisms even in the absence of a distinctive toxic phenotype. For example, Mortishire-Smith et al. used metabonomics to characterize mechanisms of a hepatotoxic drug. The compound increased medium-chain fatty acids and intermediates of the tricarboxylic acid cycle, an important metabolic pathway, thus implicating inhibition of fatty acid metabolism as a mechanism of toxicity (Mortishire-Smith et al. 2004).

In mechanistic studies, metabonomic analyses of tissues and biofluids are perhaps most powerful when integrated with other data types, including histopathology, clinical chemistry, and transcriptome profiles or proteomics data. A study of the hepatobiliary toxicant alpha-naphthylisothiocyanate combined liver, plasma, and urine metabonomics with histopathology and clinical chemistry measurements to establish a global profile of the development of injury (Waters et al. 2001). Integration of metabonomic and transcriptome profiles from mice treated with a hepatotoxic dose of acetaminophen enabled interpretation of gene expression changes in the context of metabolic status (Coen et al. 2004). Thus, a potentially
ambiguous set of gene expression changes was reconciled with a metabolic functional outcome (elevation of glycolysis) as toxicity progressed.

CONCLUSIONS

Toxicogenomic studies are improving our knowledge of the underlying biology and the regulatory networks that integrate the signaling cascades involved in toxicity. Thus, toxicogenomic data may advance the introduction of mechanistic insight into risk assessment and fulfill the promise of more accurate and expedited elucidation of class-related biologic effects or predictive toxicity. One must consider, however, that the data-rich nature of toxicogenomic technologies, coupled with the challenges of data interpretation, makes the application of toxicogenomics in risk assessment inherently complex and will require the implementation of educational programs for the toxicology and risk assessment communities.

A more immediate need in the field of toxicogenomics is for more accurate identification of orthologous genes or proteins across species. This effort will improve our understanding of the conservation of biologic responses to toxic injury and facilitate the use of surrogate species that predict the responses in humans. Although there are important differences in the genomes and proteomes, many responses to chemical and physical stressors are evolutionarily conserved, and the limitations posed by cross-species extrapolation can be mitigated by focusing analyses on processes conserved across species. However, many genes of rats, mice, and humans remain uncharacterized, and divergence can be a major factor in species differences in sensitivity or response.

A key goal of toxicogenomic research is to integrate data from multiple sources to produce a comprehensive understanding of the molecular basis of toxicologic responses.
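One simple instance of such cross-platform integration is to correlate a transcript time course with metabolite time courses so that an expression change can be read in its metabolic context. The sketch below is a hedged toy example; the gene label, metabolite names, and all values are hypothetical.

```python
# Minimal sketch (invented data) of one multi-omic integration strategy:
# correlate a gene-expression time course with metabolite time courses.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 4-point time courses after dosing (0, 4, 8, 24 h).
gene_expr = [1.0, 2.1, 3.0, 1.2]            # transcript level, arbitrary units
metabolites = {
    "lactate":    [1.0, 1.9, 3.2, 1.1],     # rises and falls with the transcript
    "citrate":    [3.0, 2.0, 1.0, 2.8],     # moves in the opposite direction
    "creatinine": [1.0, 1.0, 1.1, 1.0],     # essentially unchanged
}

# The metabolite whose trajectory best tracks the transcript is a
# candidate functional readout of the expression change.
best = max(metabolites, key=lambda m: pearson(gene_expr, metabolites[m]))
print(best)  # -> lactate
```

Real integration studies involve thousands of genes and many metabolites and rely on multivariate methods such as partial least squares, but the pairing logic, linking an expression change to a metabolic consequence, is the same.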
For this reason, there is a pressing need to develop algorithms that combine and interpret data of multiple types (for example, gene expression, proteomic, and metabonomic data). Integration of data from different technologies will lead to synergistic interpretations beyond what can be resolved when data are analyzed in isolation. Examples include the interplay between transcriptional analysis of protein factors and gene expression changes and between levels of metabolizing enzymes and the production or elimination of metabolites. The integration of data from different toxicogenomic technologies has been explored (Hogstrand et al. 2002; Ruepp et al. 2002; Coen et al. 2004) but has yet to be fully realized.

There is also a need to develop mechanisms to better probe the complexity of toxic responses. Toxicologic responses are typically defined by a linear sequence of events. In contrast, a network and system level of organization reflects nonlinear cellular states that depict the true complexity of biologic systems. The development of a knowledge base that accurately reflects network-level molecular expression and interpretation requires a new paradigm of data management, integration, and computational modeling. As the field of toxicogenomics advances, the development of approaches (such as the combined use of model systems and interactome analyses) to unravel the complexity inherent to biologic systems needs to be a research priority.

The application of toxicogenomic approaches to the study of toxicity has advanced our understanding of the biology that underlies the deleterious actions of chemical and pharmaceutical agents on living systems, the regulatory networks that integrate the signaling cascades involved in toxicity, and the pathogenesis of environmental or drug-induced disease. Indeed, mechanistic toxicology investigations have proven useful in risk assessment, drug development, environmental exposure assessment, and understanding of human and animal variability in response to drugs and chemicals. Progress to date has been limited by the scarcity of comprehensive time- and dose-related investigations and the lack of studies using exposure paradigms that reproduce the human condition with fidelity.

RECOMMENDATIONS

Given the status of current mechanistic toxicogenomic investigations, the following recommendations are made:

Immediate Actions

1. Develop richer knowledge bases and models that can integrate knowledge of mechanisms of toxicity with information on complex networks, encouraging the community to use these models to study toxicology as a global response. This requires a new paradigm of data management, integration, and computational modeling and will specifically require the development of algorithms that combine and interpret data across multiple platforms in different animal species.

2. Make resources available to advance detailed mechanistic research that is useful for classifying toxic chemicals and to assess the public health relevance of these toxicity classifications.

3. Make resources available to facilitate the identification of orthologous genes or proteins across laboratory species and thereby the identification of suitable surrogate species for predicting the responses of humans to toxic injury. In the near term, this would include the development of algorithms.

Intermediate Actions

4. Advance proteomic and metabonomic analyses by promoting the integration of peptide and metabolite separation technologies (NMR, GC-MS, LC-MS, MS with protein) into toxicologic investigations and by advancing proteomic and metabonomic databases. Advancement of proteomic and metabonomic
analysis is necessary to fully elucidate cellular responses to toxic injury, particularly those mediated by direct modification of proteins and/or disruption of metabolic networks involved in tissue homeostasis.

5. Examine the sensitivity of current toxicogenomic methodologies and analyses and their ability to distinguish between endogenous and exogenous effects at biologically relevant doses.

6. Implement educational programs to help the toxicology and risk assessment communities incorporate toxicogenomic approaches and data-rich mechanistic assessments into their professional practice.

Long-Term Actions

7. When appropriate, encourage a shift in the focus of mechanistic investigations from single genes to more integrated analyses that embrace the complexity of biologic systems as a whole as well as the multidimensionality of dose- and time-related effects of toxic agents.
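The ortholog-identification algorithms called for in recommendation 3 can take many forms; one classic and simple approach is reciprocal-best-hit mapping, sketched below. The gene symbols and similarity scores are invented stand-ins for BLAST-style alignment scores, and this is an illustrative sketch rather than a production method.

```python
# Hedged sketch of reciprocal-best-hit (RBH) ortholog calling: two genes
# are paired only when each is the other's top-scoring hit across species.
# Similarity scores below are invented placeholders.

def best_hit(scores, query):
    """Highest-scoring subject for a query in one search direction."""
    return max(scores[query], key=scores[query].get)

def reciprocal_best_hits(a_vs_b, b_vs_a):
    """Gene pairs that are each other's top hit in both directions."""
    pairs = []
    for gene_a in a_vs_b:
        gene_b = best_hit(a_vs_b, gene_a)
        if best_hit(b_vs_a, gene_b) == gene_a:
            pairs.append((gene_a, gene_b))
    return sorted(pairs)

# Hypothetical human-versus-rat similarity scores.
human_vs_rat = {
    "HMOX1": {"Hmox1": 950, "Hmox2": 610},
    "GSTP1": {"Gstp1": 880, "Gsta1": 300},
}
rat_vs_human = {
    "Hmox1": {"HMOX1": 945, "GSTP1": 120},
    "Hmox2": {"HMOX1": 700},
    "Gstp1": {"GSTP1": 870, "HMOX1": 100},
}

print(reciprocal_best_hits(human_vs_rat, rat_vs_human))
# -> [('GSTP1', 'Gstp1'), ('HMOX1', 'Hmox1')]
```

Handling gene families, many-to-many orthology, and uncharacterized genes, the complications the text notes for rat, mouse, and human genomes, requires richer methods, but RBH conveys the basic idea of cross-species mapping.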