Summary of the Workshop

INTRODUCTION

Data on adverse effects of chemicals on humans can be acquired through epidemiologic studies and from occupational, inadvertent, accident-related, or other exposures, such as the airborne exposure that followed the World Trade Center attacks of September 11, 2001. However, intentional human testing of environmental chemicals is limited, and the available human data are generally insufficient for making regulatory decisions, because they do not shed enough light on relevant health issues. Reliance on experimental animal data is a cornerstone of toxicology and risk assessment, and responses observed in experimental animals are commonly assumed to be predictive of responses in humans. Regulatory agencies and industry rely on animal data to make health and safety decisions about exposure to and intake of chemicals from food, drugs, and the environment.

Although experimental animal data often constitute the only available predictor of human health effects, their predictive ability is limited. There are numerous differences between experimental animal and human responses to chemicals, including differences in the types of adverse effects experienced and the dosages at which they occur. The differences may reflect variations in the underlying biochemical mechanisms or in the distribution of the chemicals. It can be expensive or detrimental to public health if experimental animal models are not good predictors of human health effects, so it is critical to select and validate animal models early in the regulatory process. Critical differences in how humans and experimental animals respond to chemicals may not be identified until after considerable testing has been conducted.

Toxicogenomics has been described as “an emerging discipline that combines expertise in toxicology, genetics, molecular biology, and environmental health to elucidate the response of living organisms to stressful environments” (Ramos 2003). Scientists in the field use new technologies to simultaneously assess the coordinated expression of genes in response to a particular chemical exposure (“transcriptomics”). They also look at how individual and species differences in the underlying DNA sequence itself can result in different responses to the environment (“genomics”). The “-omics”1 part of “toxicogenomics” also encompasses several other types of profiling technologies, including protein profiling (proteomics) and metabolite profiling in a cell or tissue (metabonomics). Toxicogenomics potentially can provide faster and less-expensive methods for predicting differences between experimental animal and human responses to chemicals.

The National Academies standing Committee on Emerging Issues and Data on Environmental Contaminants, sponsored by the National Institute of Environmental Health Sciences, provides a forum for communication among scientists and regulators in government, industry, environmental groups, the academic community, and the general public about topics at the forefront of toxicogenomics. The objective of the workshop reported here was to explore some of the scientific challenges and promises of applying toxicogenomic information to the extrapolation of animal data to humans.
A workshop planning committee was formed to organize the August 12, 2004, workshop on Applications of Toxicogenomics Technologies to Cross-Species Extrapolation at the National Academies in Washington, DC. This summary contains highlights of the workshop. Opinions expressed are those of the speakers, individual committee members, and other participants but do not represent the viewpoint of the National Academies or a consensus of any National Academies committee. PowerPoint presentations of the speakers are available at the standing committee’s Web site (http://dels.nas.edu/emergingissues/index.asp). In addition, recordings of speakers’ talks (except those of Donna Mendrick and Susan Sumner) and other discussions are available at http://dels.nas.edu/emergingissues/index.asp. The workshop agenda and biosketches of the speakers and workshop planning committee members are included as appendixes.

1 The term “-omics” is used in this report to refer to various types of global analytical approaches, such as genomics, transcriptomics, metabolomics, and proteomics.

David L. Eaton, of the University of Washington, chair of the standing committee and a member of the Application of Toxicogenomics to Cross-Species Extrapolation workshop planning committee, and N. Leigh Anderson, of the Plasma Proteome Institute, chair of the Application of Toxicogenomics to Cross-Species Extrapolation workshop planning committee, introduced the topic of cross-species extrapolation and explained the objectives of the workshop. The potential uses of toxicogenomic technologies in cross-species extrapolation were the topics of discussion. The objective of the workshop was to consider the promises and limitations of emerging data-rich approaches—such as genotyping (genomics), mRNA profiling (transcriptomics), protein profiling (proteomics), and metabolite profiling (metabolomics)—to inform cross-species extrapolation, particularly whether the effects of chemicals in test animals can be used to predict human responses.

A basic premise of toxicology and risk assessment is that experimental animals are generally appropriate models with which to identify potential chemical hazards to humans. However, the reliability of extrapolation for particular chemicals is often controversial. Despite the dependence on animal models for predicting human health effects in the regulatory arena, there can be important differences between how nonhuman animals and humans respond to chemicals.
Much of the workshop discussion focused on mammalian species, but it is acknowledged that other toxicologic test species, such as zebrafish and the nematode Caenorhabditis elegans, can also provide a large body of information on many environmental toxicants.

Anderson introduced the question, “How good can cross-species extrapolation ever be?” The question could be attacked with two broad strategies. The first is understanding the biologic mechanisms of action of individual compounds from two perspectives: (1) whether identifying the dominant mechanism in one species, such as rats, will be sufficient to extrapolate results to another species, such as humans, and (2) whether all the potential toxicity mechanisms in a species can be characterized in enough detail to build a predictive model for humans. Anderson acknowledged that the latter concept might stretch the limits even of systems biology. The second, more practical strategy is to use in vitro systems. That strategy, illustrated in Figure 1 and discussed further below, uses in vitro systems for comparison of test species with humans. To function in this way, in vitro systems need to replicate sufficiently the in vivo characteristics of interest in the test species and in humans.

FIGURE 1 Extrapolation through in vitro systems.

EMERGING MOLECULAR AND COMPUTATIONAL APPROACHES FOR CROSS-SPECIES EXTRAPOLATION

Richard Di Giulio, of Duke University, presented highlights of a 2002 report he and William H. Benson of the U.S. Environmental Protection Agency (EPA) edited, Interconnections Between Human Health and Ecological Integrity (Di Giulio and Benson 2002). The report drew several conclusions regarding the biologic bases of similarities and differences between humans and other animals: “1) Omic technologies will enhance understanding; 2) Extrapolations among levels of organization [are the] most critical and complete challenge; and 3) Advanced mathematics and modeling may provide a pivotal approach [to exploring the biologic bases of differences].”

A more recent workshop on the topic of cross-species extrapolation was organized by Di Giulio and Benson. The joint Society of Environmental Toxicology and Chemistry–Society of Toxicology (SETAC-SOT) Pellston workshop was held on July 19-22, 2004. The goals of the workshop were “to understand and enhance utility of -omics and computational biology in order to: 1) elucidate similarities and differences among species, 2) relate stressor-mediated responses to phenotype, 3) extend this science into innovative approaches for risk assessment and regulatory decision-making, and 4) develop the ‘interconnections between human health and ecological integrity’ paradigm.” The final workshop conclusions and recommendations on those topics will be released in the report of that workshop, expected in 2005.

Di Giulio highlighted several overall workshop themes that may be apparent in the report:

- -omics technologies are not a replacement for traditional toxicology approaches in the foreseeable future;
- proof-of-concept studies are needed;
- more standardized approaches for conducting -omics assays and analyzing data are needed;
- -omics databases are needed for selected surrogate species;
- studies are needed to link -omics responses to adverse effects seen in experimental animals (“phenotypic anchoring”); and
- there is a need for enhanced training to produce cross-disciplinary scientists.

In summary, SETAC-SOT workshop participants thought that genomic and computational approaches collectively will greatly enhance the ability to address many of the major issues in human and environmental toxicology. Specifically, the new technologies will provide unique approaches to address cross-species extrapolation in risk assessment in both human and environmental toxicology.
POTENTIAL IMPLICATIONS OF GENOMICS FOR RISK ASSESSMENT

Benson discussed potential implications of genomics for regulatory and risk assessment applications at EPA, focusing largely on a white paper on genomics and risk assessment that EPA released in March 2004 (Dearfield and Benson 2004; EPA 2004). The white paper describes how -omics may provide new approaches to old problems but notes that -omics data today are insufficient for risk assessment, although they may play a role in a weight-of-evidence analysis. It also explores different EPA perspectives on how the agency might use -omics data, providing

an overview of genomics for EPA risk assessors and managers on how -omics data will fit into their work. The document identifies anticipated regulatory and risk assessment applications and implications, provides an overview of current agency science activities in -omics that may support regulatory scenarios, and identifies scientific and research needs. -Omics data may be useful for screening and priority setting, especially if -omics responses are linked to adverse outcomes. Linking -omics data to exposure may also be helpful for biomonitoring—for example, tracking of pathogen sources. The white paper also discusses how -omics data could trigger reporting requirements.

Benson discussed how -omics might be useful in risk assessment in elucidating a chemical’s mode of action (MOA), identifying and assessing effects on susceptible populations and life stages, and assessing mixtures. -Omics might be helpful with MOAs, for example, by elucidating pathways and contributing to predictive models. -Omics data might also increase confidence in cross-species extrapolation if genes or patterns of gene expression are conserved between humans and test-animal species, or support a conclusion of nonrelevance to humans if there is little or no similarity. Benson also asked when it makes sense to use wildlife data to improve human risk assessment.

Benson highlighted challenges specific to cross-species extrapolation. For example, there are differential dosimetry issues—different dose-response relationships. Animal physiology and environment will affect exposures to chemicals, and metabolic pathways can differ substantially among species. There are also various degrees of homology among species in genes, proteins, biochemistry, and physiology.

Benson next described EPA’s Computational Toxicology Research Program2 in the Office of Research and Development.
The program is centered on what is referred to as the source-to-outcome continuum (Figure 2). The general objectives of the program are to improve the linkages in the source-to-outcome continuum, provide predictive models for screening and testing chemicals, and enhance quantitative risk assessment. The hope is that -omics can contribute to each of the boxes in the continuum and enhance quantitative risk assessment. The current effort is on endocrine-disrupting chemicals as a proof of concept for this approach to computational toxicology in general, including cross-species extrapolation.

2 Computational toxicology is the application of mathematical and computer models for prediction of effects and the understanding of MOAs.

FIGURE 2 The source-to-outcome continuum. PBPK = physiologically based pharmacokinetics; QSAR = quantitative structure-activity relationship; BBDR = biologically based dose-response. Source: EPA 2003.

James S. Bus, of the Dow Chemical Company, and Di Giulio discussed how there are long-term opportunities to improve risk assessment, perhaps through different paradigms. Di Giulio’s statement that -omics will not replace traditional toxicologic approaches sparked discussion. Participants seemed to agree that not all traditional toxicologic testing could be replaced with -omics technologies, but they were hesitant to say that none would be replaced as the technology progressed. Studies that might not be amenable to -omics approaches include metabolic enzyme kinetics, for example. Kinetics are unlikely to be captured by current genomic or proteomic approaches, and pharmacokinetic studies in general might be difficult to replace with new -omic approaches, according to Eaton. The discussion concluded with Benson, Bus, and Di Giulio noting that -omics studies do not have to replace traditional studies to be useful but instead should be considered “value-added” tests.

One participant pointed out that the use of toxicogenomic information would depend on whether this field of research is viewed from the perspective of the precautionary principle (that is, erring on the side of caution). Eaton agreed that that is important and could affect how information, such as a change in a transcription profile, might be interpreted by the regulated community and the regulators. That is, if the precautionary principle were invoked, a given change in a transcription profile might be more likely to be considered an adverse effect.

TECHNOLOGICAL CHALLENGES OF CROSS-SPECIES EXTRAPOLATION USING PROTEOMICS

Frank A. Witzmann, of Indiana University, discussed technological challenges of cross-species extrapolation using proteomics. He explained that the proteome for a given target tissue and species first needs to be characterized. Although proteomic toxicity testing needs to be done in the context of other -omics technologies, it is important to remember that most toxicologic studies include time-course and dose-response analyses, so sample numbers generally are high, and this places a burden on the proteomic technology. The types of proteomic responses to a toxicant that Witzmann discussed are quantitative changes in protein expression and differential posttranslational modification of proteins.

Different platforms, or technical approaches, are available for proteomic analysis. For example, two-dimensional gel electrophoresis (2DE) works well for several reasons. It has enormous resolving power: of the 5,000-10,000 proteins present in a cell, about 2,000 can be resolved in a single electrophoretic run. That allows detection of small changes in the concentrations or properties of proteins and the isolation of proteins in quantities sufficient for structural analysis. 2DE also works with the large number of samples that would be generated in a toxicology experiment with different exposure times and dosages.

Witzmann emphasized that proper experimental design is needed to compare protein expression and posttranslational modification between species (for example, to extrapolate results from rat to human).
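To make the kind of quantitative cross-species comparison Witzmann described concrete, the toy sketch below computes per-protein log2 fold changes from 2DE-style spot intensities in rat and human cultured cells and correlates the two profiles over matched homologous proteins. The intensities, protein symbols, and threshold are all invented for illustration; they are not data from the workshop.

```python
# Toy sketch (invented data): correlating protein fold-change profiles
# between a rat and a human in vitro system.
import math

def log2_fold_changes(control, treated):
    """Per-protein log2 ratio of treated vs control spot intensity."""
    return {p: math.log2(treated[p] / control[p]) for p in control}

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 2DE spot intensities for homologous proteins.
rat_control   = {"GSTP1": 100, "HSPA5": 80, "SOD1": 60, "ALB": 200}
rat_treated   = {"GSTP1": 400, "HSPA5": 160, "SOD1": 90, "ALB": 190}
human_control = {"GSTP1": 120, "HSPA5": 70, "SOD1": 50, "ALB": 210}
human_treated = {"GSTP1": 360, "HSPA5": 150, "SOD1": 95, "ALB": 200}

rat_fc = log2_fold_changes(rat_control, rat_treated)
hum_fc = log2_fold_changes(human_control, human_treated)

shared = sorted(rat_fc)  # homologous proteins quantified in both species
r = pearson([rat_fc[p] for p in shared], [hum_fc[p] for p in shared])
print(f"cross-species fold-change correlation over {len(shared)} proteins: r = {r:.2f}")
```

A high correlation over homologous proteins would support concordance between the two in vitro systems; in practice such comparisons would involve thousands of spots and replicate gels rather than four hand-picked values.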
Experiments could be designed to initially derive results from rat in vivo exposures, replicate them with cultured rat cells, and then attempt to replicate them in cultured human cells. Such an experimental design is frequently mentioned in this report (for example, see Figure 3). As mentioned above, 2DE enables quantitative assessment of protein expression. Protein expression under different conditions can be compared with computational models. For example, Witzmann described how the protein profiles generated by exposure to structurally

diverse peroxisome proliferators and halocarbons can be compared with such computational modeling approaches as integrated quantitative structure-activity relationship (I-QSAR) analysis. He thinks that these modeling concepts could be applied to cross-species comparisons when homologous protein systems exist. A good portion of the challenge in doing this is bioinformatic. Accurate characterization of toxicologically relevant proteins—for example, with respect to homologous pathways, receptors, metabolism, and bioactivation—and determination of common toxic end points (such as oxidative stress and glutathione depletion) are required. That is, these techniques will work best when similar pathways are involved in similar or common toxic responses.

FIGURE 3 Interspecies comparisons and human relevance—a parallelogram approach. Source: Nesnow 2004. Reprinted with permission of the author.

Witzmann was asked about the commonalities between the in vitro rat proteome and the human proteome. He said that “the 2D protein patterns seem very similar, but these expression profiles need to be analyzed more comprehensively and in a coherently designed way.”

MODELING GENE-EXPRESSION DATA TO PREDICT HUMAN HEPATOTOXICITY AFTER INCONSISTENT ANIMAL RESPONSES

Donna Mendrick, of Gene Logic, outlined how her company has used toxicogenomic approaches to analyze the relevance of some traditional toxicologic data to human health. Specifically, pathologic results were seen in one laboratory animal species but not another. Gene Logic is trying to use -omics to learn whether such pathologic observations are specific to the one laboratory animal species or are instead relevant to human health. She emphasized that she was presenting the -omics findings as a case study of what can be learned from -omics technologies and hoped to hear participants’ views on how much -omics data are needed to affect a conclusion about human health.

Mendrick reviewed a case study in which proprietary Compound X was being developed for clinical use by a pharmaceutical company. The compound did not suggest problems in traditional toxicity studies with rats. However, in dogs, liver fibrosis developed after 3 months of daily administration. Gene Logic’s goal was to discover a mechanistic explanation for the species-specific toxicity observed in dogs to determine whether the toxicity is likely to occur in humans. The researchers used Affymetrix microarrays to measure changes in gene transcription in rat and canine livers exposed to Compound X in vivo. It appeared that the dog genes dysregulated by exposure to Compound X were consistent with the fibrosis observed in dogs. However, that type of gene dysregulation was not observed in rat liver. For example, the fibrosis genes differentially dysregulated between dog and rat liver included INHBE (activin beta E). To determine whether these canine genes might be relevant to human fibrosis, Gene Logic’s BioExpress database of normal and diseased human tissue samples was used to compare genes dysregulated in the dog liver with genes dysregulated in humans who had liver fibrosis. Researchers found that some of the genes whose expression changed in the dog were changed in the diseased human samples as well.
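The cross-species triage described here reduces, at its simplest, to comparisons of gene sets. The sketch below is a toy illustration of that logic; the gene lists are invented (only INHBE comes from the discussion above) and are not Gene Logic's actual data.

```python
# Toy sketch (invented gene lists): triaging a species-specific finding by
# intersecting dysregulated-gene sets across species and a human disease set.

dog_dysregulated = {"INHBE", "COL1A1", "TIMP1", "TGFB1", "CYP2B6"}
rat_dysregulated = {"CYP2B6", "GSTA1"}                   # no fibrosis signature
human_fibrosis   = {"INHBE", "COL1A1", "TIMP1", "SPP1"}  # from a diseased-tissue database

# Changes seen in the dog but not the rat.
dog_only = dog_dysregulated - rat_dysregulated

# Of those, which are also changed in human fibrotic liver?
shared_with_human_disease = dog_only & human_fibrosis

print("dog-specific changes:", sorted(dog_only))
print("also changed in human fibrotic liver:", sorted(shared_with_human_disease))
```

A nonempty intersection with the human disease set would argue that the dog finding may be relevant to humans; an empty one would support species specificity. Real analyses also weigh direction and magnitude of change, not just set membership.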
For example, the expression of INHBE is relatively liver specific in tissues from healthy rats and humans. However, the gene is also detectable in the dog heart, so researchers asked whether variations in INHBE expression suggested differences in regulation between species.

Next, Compound X was compared with phenobarbital, because phenobarbital was identified by the Gene Logic predictive models as similar to Compound X in its effects on gene expression. Phenobarbital induces transcription of drug-metabolizing genes in rats and causes liver enlargement. Some reports indicate that phenobarbital may cause liver fibrosis in dogs, but it is generally considered nonhepatotoxic in humans and rats. (It does, however, raise flags because it has the potential to affect the metabolism of other compounds.) Mendrick compared the genes dysregulated in rat and dog

liver after phenobarbital treatment with those dysregulated after Compound X treatment. Her conclusion was that Compound X dysregulates many of the same genes as does phenobarbital in the rat and dog but that both affect few genes in common between species. In summary, looking at the whole of what was learned with the phenobarbital comparisons and the liver fibrosis difference in humans, Mendrick concluded that the data might suggest that the gene dysregulation in the dog did not predict human health liabilities and that Compound X might warrant further investigation. She pointed out that it would be useful to consider how much detail is needed to explain species differences; that is, what is sufficient to “make the case.”

USING METABOLOMICS TO EXPLORE SPECIES DIFFERENCES IN METABOLISM AND DISTRIBUTION

Susan Sumner, of Paradigm Genetics,3 discussed metabolism, metabolomics, and cross-species extrapolation. Metabonomics is defined as “the quantitative measurement of the dynamic multiparametric metabolic response of living systems to pathophysiological stimuli or genetic modification” (Nicholson et al. 1999)—in other words, assessing the changes in low-molecular-weight endogenous compounds as functions of the end point being examined, such as toxicity or, in the case of a drug, efficacy. The terms metabolomics, metabonomics, and metabolite profiling are used interchangeably by many. Investigators are using a number of analytical tools to assess endogenous metabolites, such as nuclear magnetic resonance spectroscopy and mass spectrometry coupled with chromatographic separation methods. Some methods look for particular classes of metabolites; others are less targeted. Methods looking at specific classes of metabolites, such as lipids, often focus the metabolomics effort toward specified pathways.
The less-targeted methods are particularly useful for identifying new metabolic pathways and biologic mechanisms, because they enable investigators to discover events outside the proposed or known interactions. Sumner began her presentation by noting that metabolism studies are advantageous for developing indicators of exposure, effect, or susceptibility.

3 Sumner is now at RTI International.

SUMMARY OF ROUNDTABLE DISCUSSION

After the individual presentations, the speakers and standing-committee members participated in a roundtable discussion. Anderson, chair of the workshop planning committee, led the discussion by asking about MOA and cross-species extrapolation. Other topics then arose as the presenters and audience engaged in a dialogue on issues surrounding the use of toxicogenomic data for improving cross-species extrapolations for toxicologic end points.

Mode of Action

Participants were asked to consider two questions: (1) When are data sufficient to conclude that an MOA established in one species, such as mice or rats, is relevant or not relevant to humans? and (2) Do the -omic technologies offer any advantages in defining the MOA of a chemical?

Kerry Dearfield, of EPA,6 indicated that EPA has developed an MOA framework as part of its 2003 revisions to the “Guidelines for Carcinogen Risk Assessment.” The MOA framework has a series of questions that should be asked about any set of data, including genomics information, to determine whether the hypothesized MOA and key events are appropriate for the end point of concern. Yvonne Dragan, of the Food and Drug Administration (FDA), emphasized that the identification of an MOA involved in a particular end point does not preclude the involvement of other MOAs—there can be more than one MOA for a given chemical, and experimental design influences the ability to examine target vs nontarget effects. But, according to Dearfield, even though regulatory agencies recognize that there may be multiple mechanisms, EPA typically regulates a chemical on the basis of a particular end point, such as cancer, and attempts to determine a plausible mechanism of action for that end point.
In its risk assessments, however, EPA also describes other possible MOAs, and these may add to the weight of evidence for the hazard assessment. In the past, EPA used just one effect as the end point for its hazard assessment, but it recognizes that with the advent of the -omic technologies there may be many cellular activities that affect a chemical’s activity and MOA. Risk managers need to know all this information to make informed regulatory decisions.

6 Dearfield is now with the U.S. Department of Agriculture.

In connection with whether -omics are advantageous in defining MOAs, Bing Ren, of the University of California, San Diego, explained how he thought that once a profile is understood in one species, it can be used to determine the MOA in another species more expeditiously than traditional toxicology methods. Addressing knowledge gaps might assist in turning hypotheses about transcriptional changes into knowledge about disruption of particular cellular pathways. Gaps might pertain to what pathways are involved in developmental states or are active in different tissues or to transcription factor binding in the genome. John Leighton, of FDA, cautioned that it is necessary to validate the use of toxicogenomic data in one species—to move beyond the exploratory phase—before using them for regulatory risk assessments. The issue of validation and cross-species extrapolation is being addressed by the FDA guidance document for industry on submission of pharmacogenomics data (FDA 2003).

Similarity of Pathways Between Species

The use of biologic pathways to think about cross-species comparison was a theme that came up in Sumner’s presentation and was highlighted by John Quackenbush,7 of the Institute for Genomic Research. He noted that participants seemed to be thinking about pathways and signaling networks rather than looking at individual genes and proteins. Although his own work has sometimes focused on identifying orthologous genes between species, scientists have to go beyond orthologous genes and proteins and look at orthologous pathways and networks.
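A minimal sketch of this pathway-level view follows. The ortholog map and pathway assignments are invented placeholders standing in for curated annotation resources; the gene symbols are used only for illustration.

```python
# Toy sketch (invented annotations): lift a rat dysregulated-gene list to
# human orthologs, then compare at the pathway level rather than gene by gene.

rat_dysregulated = {"Ugt1a1", "Gsta1", "Cyp1a1"}

# Hypothetical rat -> human ortholog map (many real genes lack 1:1 orthologs).
orthologs = {
    "Ugt1a1": "UGT1A1",
    "Gsta1": "GSTA1",
    "Cyp1a1": "CYP1A1",
    "Akr7a3": "AKR7A2",
}

# Hypothetical pathway membership for the human genes.
pathways = {
    "UGT1A1": "glucuronic acid conjugation",
    "GSTA1": "glutathione conjugation",
    "CYP1A1": "xenobiotic oxidation",
}

human_hits = {orthologs[g] for g in rat_dysregulated if g in orthologs}
dysregulated_pathways = {pathways[g] for g in human_hits if g in pathways}
print(sorted(dysregulated_pathways))
```

The question then becomes whether the same pathways exist and respond in humans, not whether the identical genes do; genes that drop out of the ortholog map (like the unused entry above) are exactly the cases where a gene-by-gene comparison would fail silently.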
For example, rather than seeing whether dysregulated genes are conserved across species, it is important to look at the pathway that the dysregulated genes suggest may be involved and ask whether similar pathways exist and are dysregulated in humans. Sumner discussed the importance of mapping metabolites of interest to metabolic pathways and comparing the pathways across species. For example, she found that ascorbic acid was increased in the urine of rats exposed to a toxicant, but the lack of rat-to-human pathway convergence makes the use of this marker questionable. However, being able to map to the glucuronic acid pathway from rat to human may indeed be relevant for the development of a human marker.

7 Quackenbush is now at the Dana-Farber Cancer Institute at the Harvard School of Public Health.

Along the same lines of comparing possibly orthologous genes and proteins, Anderson asked what types of molecular targets might be looked at with toxicogenomic technologies to help with cross-species extrapolation. He mentioned that binding to receptor targets, for example, would be good to look at but that new approaches to measuring binding of chemicals to receptors appear to be in their infancy. He and Quackenbush suggested that it might be helpful to think about the types of chemical targets that are likely to be conserved across species. Possible examples are the generation of reactive oxygen species and other mechanisms of DNA damage.

Toxicogenomics for Other Extrapolations (Low vs High Dose)

The roundtable participants moved on from the discussion of toxicogenomics for cross-species extrapolation to how it might be used for other extrapolations in risk assessment, such as high- to low-dose extrapolations. Bus described the current risk assessment paradigm as flawed because of its reliance on testing chemicals at high dosages in nonhuman animals even though humans are typically exposed to much lower dosages. High-dosage testing does not provide necessary information about how an organism will respond to low dosages of a chemical. He indicated that a critical question in risk assessment is how to determine the MOA of an organism’s response to low dosages of a chemical if no classical toxic effects are observed. That is, what biologic mechanisms are mounted to modulate expression of toxicity? He expressed hope that toxicogenomic approaches would offer a way out of the traditional risk assessment paradigm and provide a new tool for understanding how cells and organisms react to a new chemical.
The theme of extrapolation from high to low dosages was echoed by several participants. In response to the discussion about how toxicology can move beyond testing chemicals at high dosages to determining effects at actual, low dosages, Mendrick asked how scientists could determine which changes at a low dosage that do not result in overt traditional pathologic effects signal impending toxicity. She described Gene Logic's dose-escalation approach: rather than use the same dosage of each compound, comparable concentrations of different compounds are determined by using a marker of mitochondrial damage as a phenotypic benchmark, an amount that induces some mitochondrial toxicity but is less than what leads to a traditional pathologic end point.

Bus also emphasized the importance of examining the entirety of the comparative dose-response curves. If the animal dose-response curves being compared are not parallel, responses at high dosages may converge or even cross over at lower dosages, and that has different implications for risk assessment. Another participant echoed the value of focusing on chemicals that show qualitative species differences at the low end of the dose-response curves rather than devoting effort to chemicals that have the same effect in humans and other animals at low dosages. Eaton followed this point, questioning what to do about qualitative differences in how species respond, for example, when one species responds to a chemical and another does not. That is a bigger issue than whether similar responses occur at different dosages, particularly at unrealistically high dosages. Bus agreed that quantitative differences in response (similar effects at different dosages but with the same no-observed-effect level [NOEL]) do not have the same public-health implications as qualitative or more dramatic quantitative differences, for example, when different species receive environmentally realistic dosages of a chemical but the NOELs are dramatically different.

Dearfield thought that the biggest challenge would be sifting through the noise to figure out which of the expression changes in genes, proteins, or other markers would be precursors or biomarkers of adverse events and which would not. He agreed that the promise of -omics technologies lies in exploring toxic effects at lower dosages.

Use of Uncertainty Factors in Risk Assessment

The roundtable participants discussed how the uncertainty factors used to account for species differences might be affected by toxicogenomics.
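As background for that discussion, the conventional arithmetic is that a reference dose is derived by dividing a point of departure, such as a NOAEL from an animal study, by a product of uncertainty factors (commonly 10 for interspecies extrapolation and 10 for variability within the human population). A minimal sketch of that calculation, with all numeric values hypothetical:

```python
def reference_dose(point_of_departure, uncertainty_factors):
    """Divide a point of departure (e.g., a NOAEL in mg/kg-day)
    by the product of the applicable uncertainty factors."""
    product = 1.0
    for factor in uncertainty_factors:
        product *= factor
    return point_of_departure / product

# Hypothetical NOAEL of 50 mg/kg-day with the default 10x interspecies
# and 10x intraspecies factors gives a reference dose of 0.5 mg/kg-day.
rfd = reference_dose(50.0, [10, 10])
```

The question raised in the discussion below is whether genomic evidence of pathway conservation could justify shrinking the interspecies factor, which would raise the resulting reference dose proportionally.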
EPA approaches the use of uncertainty factors on a weight-of-evidence basis, according to Benson. Furthermore, some uncertainty factors depend on statutes, such as the Food Quality Protection Act, because the statutes define what is legally considered safe. Dearfield asked some pragmatic questions faced by risk assessors who extrapolate from animal data to human health. Are risk assessors going to start changing how they think about reference doses8 and uncertainty factors on the basis of toxicogenomic information? Should the typical interspecies uncertainty factor of 10 be applied if genomics data suggest that the test species is more like the human in some way? Should a different uncertainty factor be applied for interspecies extrapolation if genomics data reveal that some pathway or metabolite is well conserved in humans? Although EPA does not have answers to those questions, it is starting to explore the use of toxicogenomics to better characterize such variability and to refine the use of uncertainty factors.

8   A reference dose or concentration is an estimate (with uncertainty spanning perhaps an order of magnitude) of a daily exposure to the human population (including sensitive subgroups) that is likely to be without an appreciable risk of deleterious effects during a lifetime.

Using -Omics for Mode of Action vs Predictive Patterns

Quackenbush pointed out that it is important to distinguish between using -omics information to understand a mechanism and using it to make predictions about end points without necessarily understanding the underlying mechanism or biologic process responsible for the end point. He emphasized that mechanistic understanding is not necessary to gain useful information.

Ren weighed in on the issue of MOAs vs predictive patterns. He explained that predictive patterns might seem analogous to a criminal's fingerprint from a crime scene, but they are different. A detective who identifies a fingerprint at a crime scene can match it with a fingerprint in a database; analogously, if a scientist identifies a particular -omics pattern, it might match an -omics pattern associated with a specific disease. The difference is that organisms are dynamic, and it may be difficult to sample the correct cell or tissue at the right moment. Given that it is not practical to do experiments across an entire human life span for every cell type, he thinks that the best way to make predictions is to understand mechanisms. Quackenbush thought that the issues of dynamic expression and interindividual variability could be resolved by collecting more data so that signals can be separated from noise.

Thomas thought that although it might be possible to develop a training set for making predictions in some arenas, such as cancer research and the pharmaceutical industry, it will be necessary to understand the mechanisms of toxicity to make a predictive model valuable not just from a hazard-identification perspective but from a risk-assessment perspective. Quackenbush did not agree, saying that even if -omics data did not help to elucidate an MOA, they could have predictive value. He used the example of developing a training set of common toxicogenomic profiles of 50 hepatotoxic compounds: if one knows the profiles of those compounds, one can determine whether a 51st compound is hepatotoxic by comparing its profile with the other 50. Such data might not identify an MOA, but the information may be valuable nonetheless.

Eaton agreed that even without an understanding of mechanisms, toxicogenomic tools may be useful as screening tools for priority setting, either among potential pharmaceutical leads in the drug world or, at EPA, among new chemicals to test. That application of the technologies would not necessarily replace traditional toxicologic evaluation aimed at understanding mechanisms, but it is important not to sell the technologies short for their potential screening application. Eaton's comments were echoed by Leighton, who indicated that, from a pharmaceutical regulatory decision-making perspective, it is not necessary to understand an MOA; a predictive pattern is more valuable. He used the example of estradiol: even after 50 years of use and investigation, we are still unsure of how it works. However, MOA information is important because it allows extrapolation to untested populations. In response, Mendrick noted that analyzing the MOA of a chemical might require complex time-course studies to identify all the genes or proteins that are being affected and ultimately expressed as phenotypic changes. If one is using genomics just to determine whether a chemical might be toxic, however, one needs to identify only the genes or proteins that reliably predict an adverse effect, not every gene in the pathway that leads to the effect.
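Quackenbush's 50-compound illustration is, in effect, nearest-neighbor classification over expression profiles. A minimal sketch of that idea, assuming (hypothetically) that each training compound is represented by a vector of expression changes plus a hepatotoxicity label, and that profile similarity is measured by Pearson correlation:

```python
import math

def pearson(x, y):
    """Pearson correlation between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def predict_hepatotoxic(profile, training_set, k=5):
    """Classify a new compound by majority vote of the k training
    compounds whose profiles are most similar to it."""
    ranked = sorted(training_set,
                    key=lambda rec: pearson(profile, rec["profile"]),
                    reverse=True)
    votes = [rec["hepatotoxic"] for rec in ranked[:k]]
    return votes.count(True) > k // 2
```

Note that the classifier never names a mechanism; it only reports that the 51st profile resembles profiles already labeled hepatotoxic, which is exactly the distinction between predictive patterns and MOA drawn in this discussion.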
Tiffany Tummino, of the Occupational Safety and Health Administration, reminded the group that MOA is helpful in attributing effects to one chemical vs another, that is, in teasing out which effects are attributable to the chemical being regulated rather than to coexposures. John A. Moore reminded participants of the traditional importance of understanding MOA in the assessment of environmental chemicals: when a chemical is tied to an adverse effect and the association is believed to be a false positive, the only way to demonstrate that the effect is indeed a false positive is to explain the MOA. Eaton emphasized that without understanding the underlying mechanism, it is difficult to determine whether different transcriptional signatures observed in different species reflect biologic differences or noise (that is, are nonpredictive responses). Anderson pointed out that understanding the biologic basis of an adverse effect allows scientists to think about the key critical steps in a biologic pathway that distinguish between adverse and nonadverse effects. There is a continuum between information that provides mechanistic insight and information that is predictive without providing mechanistic insight. Casimir Kulikowski, of Rutgers University, thought that it might be useful to develop models for integrating different lines of evidence (mechanistic and nonmechanistic). Richard A. Canady, of the Executive Office of the President, and Eaton echoed that sentiment: scientists may need new ways of thinking about the data that are "mentally manageable."

Other Thoughts on Cross-Species Comparisons

Several other thoughts about cross-species comparisons were mentioned briefly in the roundtable discussion. Bus pointed out that -omics technologies may allow toxicologists to focus what they examine with traditional toxicologic methods, that is, to apply traditional toxicologic measures to end points or biologic processes that -omics data suggest may be important. Sumner supported using -omics data not only as a signal of pathologic outcome but also by integrating them with traditional toxicology approaches, such as the use of the area under the curve and physiologically based pharmacokinetic modeling.

The parallelogram presented by Nesnow illustrated the experimental paradigm for identifying species differences in response to new chemicals or chemicals that are not well tested. The concept, described earlier, is that by having the information in three corners, one can extrapolate to predict what happens in the fourth corner: in vivo responses in humans. Eaton thought that it would be useful for scientists to generate data so that this concept can be tested and the proof of principle demonstrated.
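The parallelogram logic can be written down directly. Under the (strong) assumption that the in vitro-to-in vivo relationship measured in the test species carries over to humans, the unmeasured human in vivo corner is estimated from the other three; all values in this sketch are hypothetical:

```python
def parallelogram_estimate(rat_in_vivo, rat_in_vitro, human_in_vitro):
    """Estimate the fourth corner (human in vivo response) by assuming
    the rat's in vitro -> in vivo scaling also applies to human cells."""
    scaling = rat_in_vivo / rat_in_vitro
    return human_in_vitro * scaling

# Hypothetical responses in arbitrary units: rat in vivo 8.0,
# rat in vitro 2.0, human in vitro 1.5 -> estimated human in vivo 6.0.
estimate = parallelogram_estimate(8.0, 2.0, 1.5)
```

Generating matched data for the three measured corners and then checking such a prediction against real human observations is the kind of proof-of-principle exercise Eaton called for.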
Of course, this parallelogram approach to using transcriptomics may not be appropriate if the toxicologic effects would not be picked up by transcriptomics, such as enzyme-activity changes caused by direct inhibition or other differences in pharmacokinetics or pharmacodynamics that may not be reflected in changes in the transcriptome.

CONCLUSION

This workshop examined the use of toxicogenomics for cross-species extrapolation. Such extrapolation is important in assessing the risks posed by exposure to chemicals because, although toxicity testing is conducted in laboratory animals, it is human health that risk assessors are trying to protect. The workshop did not attempt to reach a consensus that any particular approach to using toxicogenomic technologies will expedite or facilitate cross-species extrapolation, but it identified several approaches that might be useful and several needs for more proof-of-principle research to demonstrate the utility of -omics technologies in enhancing cross-species extrapolation. Insight may be gained as scientists try to use toxicogenomic data to hypothesize about possible MOAs of chemicals or to focus their chemical and pharmaceutical development programs. Insight may also be gained when more scientists who are using toxicogenomics to examine the mechanisms of particular chemicals begin to study additional species, including humans. Such research will highlight similarities or dissimilarities of biochemical pathways and mechanisms between test species and humans.

References

Bakhiya, A., A. Bahn, G. Burckhardt, and N. Wolff. 2003. Human organic anion transporter 3 (hOAT3) can operate as an exchanger and mediate secretory urate flux. Cell Physiol. Biochem. 13(5):249-256.

Buist, S.C., and C.D. Klaassen. 2004. Rat and mouse differences in gender-predominant expression of organic anion transporter (Oat1-3; Slc22a6-8) mRNA levels. Drug Metab. Dispos. 32(6):620-625.

Burris, J.M., J.K. Lundberg, G.W. Olsen, C. Simpson, and J.H. Mandel. 2002. Determination of Serum Half-lives of Several Fluorochemicals. Interim Report No. 2. EPA Docket AR 226-1086. 3M Company, St. Paul, MN.

Corbin, I.R., R. Buist, V. Volotovskyy, J. Peeling, M. Zhang, and G.Y. Minuk. 2002. Regenerative activity and liver function following partial hepatectomy in the rat using (31)P-MR spectroscopy. Hepatology 36(2):345-353.

Dearfield, K.L., and W.H. Benson. 2004. Genomics implications for EPA regulatory and risk assessment applications. Emerging Issues in Environmental Health Sciences 6:2-3 [online]. Available: http://dels.nas.edu/emergingissues/docs/EI_issue6.pdf [accessed May 13, 2005].

Di Giulio, R.T., and W.H. Benson, eds. 2002. Interconnections Between Human Health and Ecological Integrity. Pensacola, FL: SETAC Press.

EPA (U.S. Environmental Protection Agency). 2003. A Framework for a Computational Toxicology Research Program in ORD. EPA 600/R-03/065. Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC [online]. Available: http://www.epa.gov/comptox/publications/comptoxframework06_02_04.pdf [accessed May 13, 2005].

EPA (U.S. Environmental Protection Agency). 2004. Potential Implications of Genomics for Regulatory and Risk Assessment Applications at EPA. Final Report. EPA 100/B-04/002. Prepared for the U.S. Environmental Protection Agency by the Genomics Task Force Workgroup. Science Policy Council, U.S. Environmental Protection Agency, Washington, DC [online]. Available: http://www.epa.gov/osa/genomics.htm [accessed May 13, 2005].

FDA (Food and Drug Administration). 2003. Guidance for Industry: Pharmacogenomic Data Submissions. Center for Drug Evaluation and Research, Center for Biologics Evaluation and Research, Center for Devices and Radiological Health, Food and Drug Administration. November 2003 [online]. Available: http://www.fda.gov/cder/guidance/5900dft.pdf [accessed May 16, 2005].

Hanhijärvi, H., M. Ylinen, T. Haaranen, and T. Nevalainen. 1988. A proposed species difference in the renal excretion of perfluorooctanoic acid in the beagle dog and rat. Pp. 409-412 in New Developments in Biosciences: Their Implications for Laboratory Animal Science, A.C. Beynen, and H.A. Solleveld, eds. Dordrecht, the Netherlands: M. Nijhoff.

Kemper, R.A. 2003. Perfluorooctanoic Acid: Toxicokinetics in the Rat. Laboratory Project ID: DuPont-7473. EPA Docket AR-226-1350. DuPont Haskell Laboratories.

Kudo, N., and Y. Kawashima. 2003. Toxicity and toxicokinetics of perfluorooctanoic acid in humans and animals. J. Toxicol. Sci. 28(2):49-57.

Kudo, N., M. Katakura, Y. Sato, and Y. Kawashima. 2002. Sex hormone-regulated renal transport of perfluorooctanoic acid. Chem.-Biol. Interact. 139(3):301-316.

Lenz, E.M., J. Bright, I.D. Wilson, A. Hughes, J. Morrison, H. Lindberg, and A. Lockton. 2004. Metabolomics, dietary influences and cultural differences: A 1H NMR-based study of urine samples obtained from healthy British and Swedish subjects. J. Pharm. Biomed. Anal. 36(4):841-849.

Ljubojevic, M., C.M. Herak-Kramberger, Y. Hagos, A. Bahn, H. Endou, G. Burckhardt, and I. Sabolic. 2004. Rat renal cortical OAT1 and OAT3 exhibit gender differences determined by both androgen stimulation and estrogen inhibition. Am. J. Physiol. Renal Physiol. 287(1):F124-F138.

Nesnow, S. 2004. A Transcriptional Analysis Approach to Understanding the Basis of Species Differences in Conazole Toxicology. Presentation at the Seventh Meeting on Emerging Issues and Data on Environmental Contaminants: Applications of Toxicogenomics to Cross-Species Extrapolation: A Workshop, August 12, 2004, Washington, DC.

Nicholson, J.K., J.C. Lindon, and E. Holmes. 1999. 'Metabonomics': Understanding the metabolic responses of living systems to pathophysiological stimuli via multivariate statistical analysis of biologic NMR spectroscopic data. Xenobiotica 29(11):1181-1189.

Noker, P. 2003. A Pharmacokinetic Study of Potassium Perfluorooctanoate in the Cynomolgus Monkey. Southern Research Institute Study ID: 99214. EPA Docket AR 226-1362.

Ramos, K.S. 2003. EHP Toxicogenomics: A publication forum in the post-genome era [editorial]. EHP Toxicogenomics 111(1T):A13.