BOX 1-1

Toxicogenomics Definition

Toxicogenomics: In this report, toxicogenomics is defined as the application of genomic technologies (for example, genetics, genome sequence analysis, gene expression profiling, proteomics, metabolomics, and related approaches) to study the adverse effects of environmental and pharmaceutical chemicals on human health and the environment. Toxicogenomics combines toxicology with information-dense¹ genomic technologies to integrate toxicant-specific alterations in gene, protein, and metabolite expression patterns with phenotypic² responses of cells, tissues, and organisms. Toxicogenomics can provide insight into gene-environment interactions and the response of biologic pathways and networks to perturbations. Toxicogenomics may lead to information that is more discriminating, predictive, and sensitive than that currently used to evaluate toxic exposure or predict effects on human health.

Toxicology studies generally use multiple doses that span the expected range from no observable effects to evident clinical or histopathologic changes. The highest dose at which no overt toxicity occurs in a 90-day study (the maximum tolerated dose) is generally used to establish animal dosing levels for chronic assays that provide insight into potential latent effects, including cancer, reproductive or developmental toxicity, and immunotoxicity. These studies constitute the mainstays of toxicologic practice.³

In addition to animal studies, efforts to identify and understand the effects of environmental chemicals, drugs, and other agents on human populations have used epidemiologic studies to examine dose-response relationships. In contrast to animal studies, in which exposures are experimentally controlled, epidemiologic studies describe exposure with an estimate of error and assess the relationship between exposure and disease distribution in human populations. These studies operate under the assumption that many years of chemical exposure, or the simple passage of time, may be required before disease expression can be detected.

¹ Toxicogenomic approaches are often referred to as “high-throughput,” a term that can refer to the density of information or the ability to analyze many subjects (or compounds) in a short time. Although the toxicogenomic techniques described here create information that is highly dense, the techniques do not all offer the ability to analyze many subjects or compounds at one time. Therefore, the term high-throughput is not used except in reference to gene sequencing technologies.

² Relating to “the observable properties of an organism that are produced by the interaction of the genotype and the environment” (Merriam-Webster’s Online Dictionary, 10th Edition).

³ For more information, see the National Research Council report Toxicity Testing for Assessment of Environmental Agents: Interim Report (NRC 2006a).

Copyright © National Academy of Sciences. All rights reserved.