Summary

The Human Genome Project, which set the goal of determining the complete nucleotide sequence of the human genome, was among the most important biologic research projects of all time. First envisioned in the late 1980s and considered by many to be technologically impossible at the time, it was the combination of adequate resources and strong scientific leadership of the project that fostered development of the requisite rapid DNA sequencing technologies. These new technologies were so successful that the genomic sequence of the bacterium Haemophilus influenzae was obtained only a few years later in 1995. Since then, the genomes of dozens of organisms have been elucidated and made available to the research community, and most important, the reference human genome sequence was made available by the year 2000, several years ahead of schedule.

To capitalize on the enormous potential of having access to genome-wide sequence information, scientists, clinicians, engineers, and information scientists combined forces to develop a battery of new molecular and bioinformatic tools that now make it possible to obtain and analyze biologic datasets of unprecedented magnitude and detail. Generally referred to as genomic technologies, these approaches permit sequence analysis—as well as gene transcript, protein, and metabolite profiling—on a genome-wide scale. As a result, the Human Genome Project and the technologic innovations and computational tools that it spawned are having profound effects on biologic research and understanding.

The application of these technologies to toxicology has ushered in an era when genotypes and toxicant-induced genome expression, protein, and metabolite patterns can be used to screen compounds for hazard identification, to monitor individuals’ exposure to toxicants, to track cellular responses to different doses, to assess mechanisms of action, and to predict individual variability in sensitivity to toxicants.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




This potential has prompted a plethora of scientific reviews and commentaries about toxicogenomics written over the past several years that attest to the widely held expectation that toxicogenomics will enhance the ability of scientists to study and estimate the risks different chemicals pose to human health and the environment. However, there are limitations in the data that are currently available, and fully understanding what can be expected from the technologies will require a greater consolidation of useful data, tools, and analyses. Given the inherent complexity in generating, analyzing, and interpreting toxicogenomic data and the fact that toxicogenomics cannot address all aspects of toxicology testing, interested parties need to prepare in advance. This preparation will help them understand how best to use these new types of information for risk assessment and for implementing commensurate changes in regulations and public health, while preparing for the potential economic, ethical, legal, and social consequences.

COMMITTEE’S CHARGE

In anticipation of these questions, the National Institute of Environmental Health Sciences (NIEHS) of the U.S. Department of Health and Human Services asked the National Academies to direct its investigative arm, the National Research Council (NRC), to examine the potential impacts of toxicogenomic technologies on predictive toxicology. NIEHS has invested significant resources in toxicogenomic research through establishment of the National Center for Toxicogenomics, funding of the National Toxicogenomics Research Consortium, development of the Chemical Effects in Biological Systems database for toxicogenomic data, and other collaborative ventures.

In response to the NIEHS request, the NRC assembled a panel of 16 experts with perspectives from academia, industry, environmental advocacy groups, and the legal community. The charge to the committee was to provide a broad overview, for the public, government policy makers, and other interested and involved parties, of the benefits potentially arising from toxicogenomic technologies; to identify the challenges in achieving them; and to suggest approaches that might be used to address the challenges.

COMMITTEE’S RESPONSE TO ITS CHARGE

The committee clarified its task by defining the terms “toxicogenomics” and “predictive toxicology” as follows:

• Toxicogenomics is defined as the application of genomic technologies (for example, genetics, genome sequence analysis, gene expression profiling, proteomics, metabolomics, and related approaches) to study the adverse effects of environmental and pharmaceutical chemicals on human health and the environment. Toxicogenomics combines toxicology with information-dense genomic technologies to integrate toxicant-specific alterations in gene, protein, and metabolite expression patterns with phenotypic responses of cells, tissues, and organisms. Toxicogenomics can provide insight into gene-environment interactions and the response of biologic pathways and networks to perturbations. Toxicogenomics may lead to information that is more discriminating, predictive, and sensitive than that currently used to evaluate exposures to toxicants or to predict effects on human health.

• Predictive toxicology is used in this report to describe the study of how toxic effects observed in model systems or humans can be used to predict pathogenesis, assess risk, and prevent human disease.

Because of the belief that toxicogenomics has the potential to place toxicology on a more predictive footing, the committee describes the momentum channeling the field in this direction and some of the obstacles in its path. The committee approached its charge by identifying, defining, and describing several proposed applications of toxicogenomics to hazard-identification screening, mechanism-of-action studies, classification of compounds, exposure assessment, defining genetic susceptibility, and reducing the use of animal-based testing. Studies supporting each of these putative applications were then critically evaluated to define limitations, to enumerate remaining challenges, and to propose viable solutions whenever possible. Finally, the committee outlined realistic expectations of how these applications can be validated and how they can be used in risk assessment. The second part of this summary reviews these applications and what is needed for each of them.

In evaluating the putative applications, the committee recognized some overarching themes and steps necessary for the field to move forward.

OVERARCHING CONCLUSIONS AND RECOMMENDATIONS

Reproducibility, Data Analysis, Standards, and Validation

After evaluating the different applications, the committee concluded that, for the most part, the technologic hurdles that could have limited the reproducibility of data from toxicogenomic technologies have been resolved, representing an important step forward. To consolidate this advance, those who use these tools need to make a unified effort to establish objective standards for assessing quality and quality-control measures. Across the different applications, validation efforts are an important next step: actions should be taken to facilitate the technical and regulatory validation of toxicogenomics.

There is also a need for bioinformatic, statistical, and computational approaches and software to analyze data. Thus, the committee recommends the development of specialized bioinformatic, statistical, and computational tools and approaches to analyze toxicogenomic data.

Use of Toxicogenomics in Risk Assessment

Improving risk assessment is an essential aim of predictive toxicology, and toxicogenomic technologies present new opportunities to enhance it by potentially improving the understanding of dose-response relationships, cross-species extrapolations, exposure quantification, the underlying mechanisms of toxicity, and the basis of individual susceptibilities to particular compounds.

Although the applications of toxicogenomic technologies to risk assessment and the regulatory decision-making process have been exploratory to date, the potential to improve risk assessment has just begun to be tapped. Toxicogenomic technologies clearly have strong potential to affect decision making, but they are not currently ready to replace existing required testing regimes in risk assessment and regulatory toxicology. Toxicogenomic technologies are assuming an increasing role as adjuncts to and extensions of existing technologies for predictive toxicology. Toxicogenomics can provide additional molecular-level information and tests that add to the “weight of the evidence” for refining judgments about the risks posed by environmental toxicants and drugs. Ultimately, however, they are envisioned to be more sensitive and informative than existing technologies and may supplant some approaches currently used or at least be a component of batteries that will replace certain tests.

To move forward, the committee recommends that regulatory agencies enhance efforts to incorporate toxicogenomic data into risk assessment. The following actions are needed: (1) substantially enhance agencies’ capability to effectively integrate toxicogenomic approaches into risk assessment practice, focusing on the specific applications below; (2) invest in research and personnel within the infrastructure of regulatory agencies; and (3) develop and expand research programs dedicated to integrating toxicogenomics into challenging risk assessment problems, including the development of partnerships between the public and private sectors.

Need for a Human Toxicogenomics Initiative

Several themes emerged throughout evaluation of the different applications discussed below, including the need for more data, the need to broaden data collection, the need for a public database to facilitate sharing and use of the volumes of data, and the need for tools to mine this database to extract biologic knowledge. Concerted efforts are necessary to address these needs and propel the field forward.

Fully integrating toxicogenomic technologies into predictive toxicology will require a coordinated effort approaching the scale of the Human Genome Project. It will require funding and resources significantly greater than what is allocated to existing research programs and will benefit from public-private partnerships to achieve its goals. These types of investments and coordinated scientific leadership will be essential to develop toxicogenomic tools to the point where many of the expected benefits for predicting the toxicity of compounds and related decision making can be realized. To achieve this goal, NIEHS should cooperate with other stakeholders to explore the feasibility and objectives of a human toxicogenomics initiative (HTGI), as described in Box S-1. The HTGI would support the collection of toxicogenomic data and would coordinate the creation and management of a large-scale database that would use systems biology approaches and tools to integrate the results of toxicogenomic analyses with conventional toxicity-testing data.

The information generated from toxicogenomic experiments is on a scale vastly exceeding that of DNA sequencing efforts like the Human Genome Project. The heft of these outputs, consisting of multidimensional datasets that include genotype, gene expression, metabolite, and protein information; design factors such as dose, time, and species information; and information on toxicologic effects, warrants the creation of a public database. This database is needed to compile and analyze the information at a more complex level than the current “one disease is caused by one gene” approach. Curation, storage, and mining of these data will require developing and distributing specialized bioinformatic and computational tools. Current public databases are inadequate to manage the types or volumes of data to be generated by large-scale applications of toxicogenomic technologies and to facilitate the mining and interpretation of the data, which are just as important as their generation and storage.

Although the database and tools are important, the database itself is not sufficient. Data on a large number of compounds are needed so that comparisons can be made and data can be mined to identify important relationships. To collect and generate these toxicogenomic data, it will be important to leverage large publicly funded studies and facilitate the production and sharing of private-sector data.

In addition to data, work is needed in the collection of physical samples appropriate for toxicogenomic research. Specifically, a national biorepository for human clinical and epidemiologic samples is needed so that toxicogenomic data can eventually be extracted from them. In addition, when possible and appropriate, the collection of samples and data should be incorporated into major human studies. The collection of human samples and their corresponding data raises a number of ethical, legal, and social issues of the type described below, which need to be addressed.

Because the realization of the goals articulated here will require significantly higher levels of funding, leadership, and commitment than are currently allocated to toxicogenomics, planning and organizing research should begin immediately. Collaborations among government, academia, and the private sector not only will expedite discovery but will ensure optimal use of samples and data; prevent unnecessary duplication or fragmentation of datasets; enhance the ability to address key ethical, legal, and social effects; reduce costs; and promote intellectual synergy.
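The multidimensional datasets described above can be pictured concretely. The following Python sketch is purely illustrative: all field and function names are hypothetical, not drawn from the report or from any actual toxicogenomic database schema. It shows the kind of record that couples molecular profiles to design factors (dose, time, species) and a conventional toxicologic outcome, and one simple cross-cutting query such a database would need to support.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical record structure; every field name here is illustrative,
# not taken from the report or any real toxicogenomic database.
@dataclass
class ToxicogenomicRecord:
    compound: str                        # chemical tested
    species: str                         # test organism
    dose_mg_per_kg: float                # design factor: dose
    time_hr: float                       # design factor: sampling time
    gene_expression: Dict[str, float]    # gene ID -> fold change
    protein_levels: Dict[str, float]     # protein ID -> abundance
    metabolite_levels: Dict[str, float]  # metabolite ID -> abundance
    outcome: str                         # conventional toxicity finding

def dose_time_series(records: List[ToxicogenomicRecord],
                     compound: str, species: str) -> List[ToxicogenomicRecord]:
    """Collect one compound/species slice, ordered by dose then time,
    so molecular changes can be lined up against conventional outcomes."""
    hits = [r for r in records
            if r.compound == compound and r.species == species]
    return sorted(hits, key=lambda r: (r.dose_mg_per_kg, r.time_hr))
```

Even this toy structure hints at why generic databases fall short: each record ties several high-dimensional molecular profiles to experimental design factors and a phenotype, and useful queries typically cut across all of them at once.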

BOX S-1
Human Toxicogenomics Initiative

NIEHS should cooperate with other stakeholders in exploring the feasibility and objectives of implementing a human toxicogenomics initiative (HTGI) dedicated to advancing toxicogenomics. Elements of the HTGI should include the following:

1. Creation and management of a large, public database for storing and integrating the results of toxicogenomic analyses with conventional toxicity-testing data.
2. Assembly of toxicogenomic and conventional toxicologic data on a large number (hundreds) of compounds into the single database. This includes the generation of new toxicogenomic data from humans and animals for a number of compounds for which other types of data already exist as well as the consolidation of existing data. Every effort should be made to leverage existing research studies and infrastructure (such as those of the National Toxicology Program) to collect samples and data that can be used for toxicogenomic analyses.
3. Creation of a centralized national biorepository for human clinical and epidemiologic samples, building on existing efforts.
4. Further development of bioinformatic tools, such as software, analysis, and statistical tools.
5. Consideration of the ethical, legal, and social implications of collecting and using toxicogenomic data and samples.
6. Coordinated subinitiatives to evaluate the application of toxicogenomic technologies to the assessment of risks associated with chemical exposures.

The resulting publicly accessible HTGI data resource would strengthen the utility of toxicogenomic technologies in toxicity assessment and thus enable more accurate prediction of health risks associated with existing and newly developed compounds and formulations.

SPECIFIC APPLICATIONS OF TOXICOGENOMICS

To address the expectation that toxicogenomics will revolutionize predictive toxicology, the committee explored several proposed applications of toxicogenomics, including hazard screening, the study of toxicologic mechanisms of action, exposure assessment, and characterizing variability in susceptibility. These and the other applications can be used in conjunction with risk assessment, although they are also important in predictive toxicology, which is removed from the risk assessment process. In the following sections, the committee reports findings from the evaluation of these topics that were assimilated into the conclusions of the report.

Exposure Assessment

The application of toxicogenomics for defining biomarkers of exposure will require consensus on what constitutes an exposure biomarker. Standardized toxicogenomic platforms that are appropriate for identifying signatures of environmental or drug exposures in target and surrogate tissues and fluids will also be required. Additional technical challenges include the individual variation in response to an environmental exposure and the persistence of a toxicogenomic signature after exposure.

Toxicogenomic technologies should be adapted and applied for the study of exposure assessment by developing signatures of exposure to individual chemicals and perhaps to chemical mixtures. To facilitate the development of exposure-assessment tools based on toxicogenomics, large human population studies should include a collection of samples that can be used for transcriptomic, proteomic, metabolomic, or other toxicogenomic analyses in addition to traditional epidemiologic measures of exposure.

Hazard Screening

Toxicogenomic technologies provide new and potentially useful indicators for use in toxicity screening. Near-term applications include current uses in drug development and the validation of categories of compounds for screening chemicals found in the environment. In contrast to applications in evaluating new drug candidates, screening approaches for environmental chemicals will need to address a broader range of exposure levels and a more comprehensive set of adverse health effects.

Toxicogenomic screening methods should be integrated into relevant current and future chemical regulatory and safety programs upon validation and development of adequate databases. To move toward this goal, it is important to improve the quantity and quality of data available for deriving screening profiles and to develop a database to organize this information. The process of creating such a database could be accelerated by addressing proprietary and legal hurdles so at least some of the toxicogenomic data currently in private databases could be made available, and by integrating toxicogenomic assays into ongoing chemical screening and testing initiatives such as those conducted by the National Toxicology Program. In addition, regulatory agencies should continue to develop and refine guidance documents for their staff on interpreting toxicogenomic data.

Variability in Susceptibility

People vary in their susceptibility to toxic effects of chemical exposures. Toxicogenomic technologies (including the analysis of gene sequences and epigenetic modifications) offer the opportunity to use genetic information in a prospective fashion to identify susceptible subpopulations and assess the distribution of differences in susceptibility in larger populations. Toxicogenomic technologies could also reduce the uncertainty surrounding assumptions used in regulatory processes to address population variability.

Toxicogenomic information should be used to prospectively identify, understand the mechanisms of, and characterize the extent of genetic and epigenetic influences on variations in human susceptibility to the toxic effects of chemicals. Animal models and genome-wide human studies should be used to identify genetic variations that influence sensitivity to chemicals, using existing large human studies when possible to investigate the effect of genetic variations on responses to a wide array of chemical exposures and pharmaceutical therapies. More attention should be focused on modeling effects involving multiple genes and the study of context-dependent genetic effects (that is, gene-gene interactions as well as the impact of developmental age, sex, and life course).

Mechanistic Information

Toxicogenomic studies are improving knowledge of the molecular-level events that underlie toxicity and may thus advance the consideration of mechanistic information in risk assessment and decision making. Tools and approaches should continue to be developed to advance the ability of toxicogenomics to provide useful mechanistic information. Developing richer algorithms and models that can integrate complex and various types of toxicogenomic data (for example, metabolomic and proteomic data) may make it possible to shift the focus of mechanistic investigations from single genes to more integrated analyses. These will encompass more of the complexity of biologic systems as a whole as well as the multidimensionality of the dose- and time-related effects of toxicants.
Cross-Species Extrapolation

Toxicogenomic technologies offer the potential to significantly enhance confidence in the animal-to-human toxicity extrapolations that constitute the foundation of risk evaluations. Using toxicogenomics to analyze species differences in toxicity will help explain the molecular basis for the differences and improve the translation of animal observations into estimates of potential human risk. In addition, by providing molecular-level comparisons between humans and other species, toxicogenomics may assist in identifying the animal species and strains that are most relevant for specific assays.

Toxicogenomics should continue to be used to study differences in toxicant responses between animal models and humans, and genotyped and genetically altered animal model strains should continue to be used as experimental tools to better extrapolate results from animal tests to human health. Algorithms must be developed to facilitate accurate identification of genes and proteins that serve the same function in different organisms and species—called orthologous genes and proteins—used in toxicologic research.

Dose-Response Relationships

Toxicogenomics has the potential to improve the understanding of dose-response relationships, particularly at low doses. Future toxicologic assessment should incorporate dose-response and time-course analyses appropriate to risk assessment. Analyses of toxic compounds that are well characterized could provide an intellectual framework for future studies.

Developmental Exposures

Although recognized to be important in a number of disorders, relatively little is known about the health impacts of fetal and early-life exposures to many chemicals in current use. Because of their sensitivity, toxicogenomic technologies are expected to reveal more than previously was possible about the molecules involved in development and the critical molecular-level events that can be perturbed by toxicants. Toxicogenomics may also enable screening for chemicals that cause gene expression changes associated with adverse developmental effects. In short, toxicogenomic technologies should be used to investigate how exposure during early development conveys susceptibility to drug and chemical toxicities.

Mixtures

Although much toxicology focuses on the study of single chemicals, humans are frequently exposed to multiple chemicals. It is difficult to decipher how exposure to many chemicals will influence the effects of each one. It is unlikely that toxicogenomic signatures will be able to decipher all interactions among complex mixtures, but it should be possible to use mechanism-of-action data to design informative toxicogenomic experiments, including screening chemicals for potential points of biologic convergence (overlap), such as shared activation and detoxification pathways; enhancing identification and exploration of potential interactions; and moving beyond empirical experiments.
Toxicogenomic approaches should be used to test the validity of methods for the ongoing challenge of estimating potential risks associated with mixtures of environmental chemicals.

ETHICAL, LEGAL, AND SOCIAL ISSUES

The committee evaluated ethical, legal, and social implications of toxicogenomics. As toxicogenomic data linked to clinical and epidemiologic information are collected, it is critical to ensure adequate protections of the privacy, confidentiality, and security of toxicogenomic information in health records and information used in studies. Safeguarding this information will further advance important individual and societal interests. It will also prevent individuals from being dissuaded from participating in research or undergoing the genetic testing that is the first step in individualized risk assessment and risk reduction.

Toxicogenomics is also likely to play a role in occupational, environmental, and pharmaceutical regulation and litigation. Regulatory agencies and courts should give appropriate weight to the validation, replication, consistency, sensitivity, and specificity of methods when deciding whether to rely on toxicogenomic data.

Ethical, legal, and social issues that affect the use of toxicogenomic data and the collection of data and samples needed for toxicogenomic research should be addressed. This could occur through legislative improvements to enhance individual protection, exploration of how to facilitate large-scale biorepository and database research while protecting individuals, and consideration by courts and regulatory agencies of appropriate factors when deciding how to consider toxicogenomic data. Finally, special efforts should be made to address the impact of toxicogenomic research and findings on vulnerable populations.

EDUCATION AND TRAINING IN TOXICOGENOMICS

Given the complexity of toxicogenomics, the generation, analysis, and interpretation of toxicogenomic information represent a challenge to the scientific community and require the collaborative, cross-disciplinary efforts of scientific teams of specialists. It is therefore essential that education and training in toxicogenomics become a continuous, ongoing process that reflects the rapid developments in these new technologies. There is a need to develop education and training programs relevant to the application of toxicogenomics to predictive toxicology. Specifically, programs are needed to reach the general public, susceptible subgroups, health professionals, government regulators, attorneys and judges, the media, scientists in training, scientists on the periphery of toxicogenomics, and institutions that participate in toxicogenomic research.

CONCLUSIONS

In summary, toxicogenomic technologies present a set of powerful tools for transforming current observation-based approaches into predictive science, thereby enhancing risk assessment and public health decision making. To leverage this potential will require more concerted efforts to generate data, make multiple uses of existing data sources, and develop tools to study data in new ways. Beyond the technical challenges and opportunities, other challenges in the communication, education, ethics, and legal arenas will need to be addressed to ensure that the potential of the field can be realized.