Toxicogenomics Challenges

  • Gene annotation. Example: public genome projects
  • Cross-species extrapolation. Example: public genome projects
  • Technical standards for evolving platforms. Example: National Institute of Standards and Technology, MIAME and MAQC consortia
  • Standards for data sharing. Example: NCBI, MIAME and MAQC consortia
  • Signature/biomarker qualification. Example: Critical Path Institute, FDA, Environmental Protection Agency and European regulatory groups
  • Translation of assays for regulatory purposes. Example: FDA Critical Path Initiative, ICH
  • Ethical, legal and social issues. Example: National Institutes of Health, NHGRI

Toxicogenomics (TGx) can be defined as the application of "omics" techniques to toxicology and risk assessment. By identifying molecular changes associated with toxicity, TGx data can assist hazard identification and help investigate causes. Early technical challenges were evaluated and addressed by consortia (e.g. ILSI/HESI and the MicroArray Quality Control (MAQC) consortium), which demonstrated that TGx gives reliable and reproducible information. The MAQC also produced "best practice on signature generation" guidance after an extensive evaluation of different methods on common datasets. Two findings of note were the need for methods that control batch variability, and that the predictive ability of a signature varies in concert with the variability of the endpoint being predicted.

The key remaining challenge is data interpretation, because TGx can identify molecular changes that are causal, associated with, or merely incidental to toxicity. Applying the Bradford Hill criteria for causation, which are used to build mode of action (MOA) arguments, can produce reasonable hypotheses linking altered pathways to phenotypic changes. However, interpretive challenges remain: are all pathway changes equal, and which are most important and most plausibly linked to toxicity? The expert judgement of the toxicologist is therefore still needed. There are theoretical reasons why consistent alterations across a metabolic pathway are important, whereas similar changes in a signalling pathway may not alter information flow. At the molecular level, thresholds may arise from the inherent properties of the regulatory network, for example switch-like behaviour produced by certain network motifs (e.g. positive feedback) in the pathway whose perturbation leads to the toxicity. The application of systems biology methods to TGx data can generate hypotheses that explain why a threshold response exists. However, are we adequately trained to make these judgements?
There is a need for collaborative efforts between regulators, industry and academia to properly define how these technologies can be applied using appropriate case-studies.
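The switch-like, threshold behaviour attributed to positive-feedback motifs above can be illustrated with a minimal simulation. The model below is a hypothetical one-gene system with Hill-type positive autoregulation; all parameter values and names are illustrative assumptions, not measurements from any real toxicity pathway:

```python
def steady_state(stimulus, v=2.0, K=1.0, n=4, g=1.0, dt=0.01, t_max=200.0):
    """Steady state of dx/dt = stimulus + v*x^n/(K^n + x^n) - g*x,
    a toy gene with positive autoregulation (parameters illustrative),
    found by simple Euler integration starting from x = 0."""
    x = 0.0
    for _ in range(int(t_max / dt)):
        x += dt * (stimulus + v * x**n / (K**n + x**n) - g * x)
    return x

# Sweep the stimulus ("dose"): the response is roughly proportional at
# low doses, then jumps abruptly once the feedback loop engages.
for i in range(11):
    s = 0.1 * i
    print(f"stimulus {s:3.1f} -> steady state {steady_state(s):5.2f}")
```

Over part of the stimulus range this model is bistable, so the steady state reached from x = 0 jumps abruptly from a low branch to a high one at a threshold stimulus (in this sweep, between 0.3 and 0.4); without the feedback term the same sweep would give a graded, proportional response. This is a sketch of the motif-level argument only, not a model of any specific pathway.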

  • Challenges of conventional toxicology approaches
  • Bioinformatics and interpretive challenges in toxicogenomics
  • Detecting rare transcripts
  • Gene annotation
  • Cross-species extrapolation
  • Translation of assays for regulatory purposes
  • Signature/biomarker qualification
