EXS, 2009
Review
A genotoxin is a chemical or agent that can cause DNA or chromosomal damage. Such damage in a germ cell has the potential to cause a heritable altered trait (germline mutation). DNA damage in a somatic cell may result in a somatic mutation, which may lead to malignant transformation (cancer). Many in vitro and in vivo tests for genotoxicity have been developed that, with a range of endpoints, detect DNA damage or its biological consequences in prokaryotic (e.g. bacterial) or eukaryotic (e.g. mammalian, avian or yeast) cells. These assays are used to evaluate the safety of environmental chemicals and consumer products and to explore the mechanism of action of known or suspected carcinogens. Many chemical carcinogens/mutagens undergo metabolic activation to reactive species that bind covalently to DNA, and the DNA adducts thus formed can be detected in cells and in human tissues by a variety of sensitive techniques. The detection and characterisation of DNA adducts in human tissues provides clues to the aetiology of human cancer. Characterisation of gene mutations in human tumours, in combination with the known mutagenic profiles of genotoxins in experimental systems, may provide further insight into the role of environmental mutagens in human cancer.
Topics: Animals; Carcinogens; DNA Adducts; DNA Damage; Environmental Pollutants; Mutagenicity Tests; Mutagens; Toxicogenetics
PubMed: 19157059
DOI: 10.1007/978-3-7643-8336-7_4
International Journal of Molecular..., Jul 2022
Pollution is defined as the presence in or introduction of a substance into the environment that has harmful or poisonous effects [...].
Topics: Biomarkers; Risk Assessment; Toxicogenetics
PubMed: 35955413
DOI: 10.3390/ijms23158280
Archives of Toxicology, Aug 2013
Review
Recent advances in 2D and 3D in vitro systems using primary hepatocytes, alternative hepatocyte sources and non-parenchymal liver cells and their use in investigating mechanisms of hepatotoxicity, cell signaling and ADME.
This review encompasses the most important advances in liver functions and hepatotoxicity and analyzes which mechanisms can be studied in vitro. In a complex architecture of nested, zonated lobules, the liver consists of approximately 80% hepatocytes and 20% non-parenchymal cells, the latter being involved in a secondary phase that may dramatically aggravate the initial damage. Hepatotoxicity, as well as hepatic metabolism, is controlled by a set of nuclear receptors (including PXR, CAR, HNF-4α, FXR, LXR, SHP, VDR and PPAR) and signaling pathways. When liver cells are isolated, some pathways are activated, e.g., the RAS/MEK/ERK pathway, whereas others are silenced (e.g. HNF-4α), resulting in the up- and downregulation of hundreds of genes. An understanding of these changes is crucial for a correct interpretation of in vitro data. The possibilities and limitations of the most useful liver in vitro systems are summarized, including three-dimensional culture techniques, co-cultures with non-parenchymal cells, hepatospheres, precision-cut liver slices and the isolated perfused liver. Also discussed is how closely hepatoma, stem cell and iPS cell-derived hepatocyte-like cells resemble real hepatocytes. Finally, a summary is given of the state of the art of liver in vitro and mathematical modeling systems currently used in the pharmaceutical industry, with an emphasis on drug metabolism, prediction of clearance, drug interactions, transporter studies and hepatotoxicity. One key message is that despite our enthusiasm for in vitro systems, we must never lose sight of the in vivo situation. Although hepatocytes have been isolated for decades, the hunt for relevant alternative systems has only just begun.
Topics: Animals; Coculture Techniques; Culture Techniques; Gene Expression Regulation; Hepatocytes; High-Throughput Screening Assays; Humans; Inactivation, Metabolic; Liver; Organ Culture Techniques; Receptors, Cytoplasmic and Nuclear; Signal Transduction; Toxicity Tests; Toxicogenetics
PubMed: 23974980
DOI: 10.1007/s00204-013-1078-5
Biomedicine & Pharmacotherapy, Jul 2023
Review
More information about a person's genetic makeup, drug response, multi-omics profile, and genomic response is now available, leading to a gradual shift towards personalized treatment. Additionally, the promotion of non-animal testing has fueled computational toxicogenomics as a pivotal part of the next-generation risk assessment paradigm. Artificial Intelligence (AI) has the potential to provide new ways of analyzing patient data and making predictions about treatment outcomes or toxicity. As personalized medicine and toxicogenomics involve huge volumes of data, AI can expedite their analysis by providing powerful data processing, analysis, and interpretation algorithms. AI can integrate a multitude of data sources, including genomic data, patient records, and clinical data, and identify patterns from which to derive predictive models that anticipate clinical outcomes and assess the risk of any personalized medicine approach. In this article, we examine current trends and future perspectives in personalized medicine and toxicology, the role of toxicogenomics in connecting the two fields, and the impact of AI on both. We also examine the key challenges and limitations in personalized medicine, toxicogenomics, and AI that must be addressed to fully realize their potential.
Topics: Humans; Artificial Intelligence; Precision Medicine; Toxicogenetics; Algorithms; Technology
PubMed: 37121152
DOI: 10.1016/j.biopha.2023.114784
BMC Systems Biology, Aug 2014
BACKGROUND
Toxicogenomics studies often profile gene expression from assays involving multiple doses and time points. The dose- and time-dependent pattern is of great importance for assessing toxicity, but computational approaches that effectively exploit this characteristic are lacking. Topic modeling is a text-mining approach, but it can be applied analogously in toxicogenomics because text and gene-dysregulation data share a similar structure.
RESULTS
Topic modeling was applied to a very large toxicogenomics dataset containing microarray gene expression data from >15,000 samples associated with 131 drugs tested in three different assay platforms (an in vitro assay, an in vivo repeated-dose study and an in vivo single-dose experiment) with a design including multiple doses and time points. A set of "topics", each consisting of a set of genes, was determined, from which the varying sensitivity of the three assay systems could be observed. We found that the drug-dependent effect was more pronounced in the two in vivo systems than in the in vitro system, while the time-dependent effect was most strongly reflected in the in vitro system, followed by the single-dose study and lastly the repeated-dose experiment. The dose-dependent effect was similar across the three assay systems. Although the results indicated a challenge in extrapolating in vitro results to the in vivo situation, we noticed that for some drugs, though not all, similar gene expression patterns were observed across all three assay systems, suggesting the possibility of using carefully designed in vitro systems (e.g., with appropriate choices of dose and time point) to replace in vivo testing. Nonetheless, the potential to replace the repeated-dose study with the single-dose, short-term methodology was strongly implied.
CONCLUSIONS
The study demonstrated that text-mining methodologies such as topic modeling provide an alternative to traditional means of data reduction in toxicogenomics, enhancing researchers' ability to interpret biological information.
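The document-as-bag-of-words analogy behind this approach can be sketched in a few lines: each sample's list of dysregulated genes plays the role of a document, and the "topics" are recurring gene sets. The sketch below uses a minimal non-negative matrix factorization (NMF) as a stand-in for the full topic-modeling machinery; all gene lists are hypothetical, not data from the study.

```python
import numpy as np

# Toy "corpus": each sample is a bag of dysregulated gene IDs,
# analogous to a document as a bag of words (hypothetical data).
samples = [
    ["CYP1A1", "CYP1B1", "AHR"],      # sample 1
    ["CYP1A1", "CYP1B1", "NQO1"],     # sample 2
    ["TP53", "CDKN1A", "MDM2"],       # sample 3
    ["TP53", "CDKN1A", "GADD45A"],    # sample 4
]
genes = sorted({g for s in samples for g in s})
gene_idx = {g: i for i, g in enumerate(genes)}

# Sample-by-gene count matrix (the "document-term matrix" of the analogy).
X = np.zeros((len(samples), len(genes)))
for i, s in enumerate(samples):
    for g in s:
        X[i, gene_idx[g]] += 1

# Minimal NMF via Lee-Seung multiplicative updates: X ~ W @ H, where each
# row of H is a "topic" (a weighted gene set) and each row of W gives a
# sample's topic weights.
rng = np.random.default_rng(0)
k = 2
W = rng.random((X.shape[0], k)) + 0.1
H = rng.random((k, X.shape[1])) + 0.1
for _ in range(200):
    H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-9)

# Report each topic's highest-weighted genes.
for t in range(k):
    top = [genes[j] for j in np.argsort(H[t])[::-1][:3]]
    print(f"topic {t}: {top}")
```

With clearly block-structured input like this, the two topics tend to recover the two gene groups; real topic models (e.g., latent Dirichlet allocation) add a probabilistic treatment on top of the same matrix view.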
Topics: Computational Biology; Data Mining; Dose-Response Relationship, Drug; Humans; Oligonucleotide Array Sequence Analysis; Time Factors; Toxicogenetics
PubMed: 25115450
DOI: 10.1186/s12918-014-0093-3
Trends in Pharmacological Sciences, Feb 2019
Review
Toxicogenomics (TGx) has contributed significantly to toxicology and now has great potential to support moves towards animal-free approaches in regulatory decision making. Here, we discuss in vitro TGx systems and their potential impact on risk assessment. We raise awareness of the rapid advancement of genomics technologies, which generate novel genomics features essential for enhanced risk assessment. We specifically emphasize the importance of reproducibility in utilizing TGx in the regulatory setting, and highlight the role of machine learning (particularly deep learning) in developing TGx-based predictive models. We also touch on how TGx approaches could facilitate adverse outcome pathway (AOP) development and enhance read-across strategies for regulatory application. Finally, we summarize current efforts to develop TGx for risk assessment and set out the remaining challenges.
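As a rough illustration of what a TGx-based predictive model involves (not the specific models of this review), the sketch below fits a logistic regression from scratch on synthetic gene-expression features and a binary toxicity label; all data, dimensions and weights are invented.

```python
import numpy as np

# Synthetic toy data: n samples, d gene-expression features, and a binary
# toxicity label generated from a known (hypothetical) linear rule plus noise.
rng = np.random.default_rng(42)
n, d = 200, 5
X = rng.normal(size=(n, d))                       # expression features
true_w = np.array([2.0, -1.5, 0.0, 0.0, 1.0])     # hypothetical effect sizes
y = (X @ true_w + 0.3 * rng.normal(size=n) > 0).astype(float)

# Logistic regression by full-batch gradient descent.
w = np.zeros(d)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # predicted toxicity prob.
    w -= lr * (X.T @ (p - y)) / n                 # gradient of log-loss
    b -= lr * (p - y).mean()

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
acc = (pred == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

Deep-learning TGx models replace the single linear layer with many nonlinear ones, but the supervised framing (features in, toxicity call out) is the same.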
Topics: Animal Testing Alternatives; Animals; Humans; Machine Learning; Reproducibility of Results; Risk Assessment; Toxicogenetics
PubMed: 30594306
DOI: 10.1016/j.tips.2018.12.001
Molecular Omics, Aug 2018
Review
The toxicogenomics field aims to understand and predict toxicity by using 'omics' data in order to study systems-level responses to compound treatments. In recent years there has been a rapid increase in publicly available toxicological and 'omics' data, particularly gene expression data, and a corresponding development of methods for its analysis. In this review, we summarize recent progress relating to the analysis of RNA-Seq and microarray data, review relevant databases, and highlight recent applications of toxicogenomics data for understanding and predicting compound toxicity. These include the analysis of differentially expressed genes and their enrichment, signature matching, methods based on interaction networks, and the analysis of co-expression networks. In the future, these state-of-the-art methods will likely be combined with new technologies, such as whole human body models, to produce a comprehensive systems-level understanding of toxicity that reduces the necessity of in vivo toxicity assessment in animal models.
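Of the methods listed, signature matching is easy to sketch: compare the up- and down-regulated gene sets of a query signature against reference compound signatures. The score below is a simple overlap-based stand-in (not the full connectivity-map statistic), and all gene names and signatures are hypothetical.

```python
# Signature matching with a naive overlap score: concordant genes
# (up matching up, down matching down) add to the score, discordant
# genes subtract, normalised by the query signature size.
def match_score(query_up, query_down, ref_up, ref_down):
    concordant = len(query_up & ref_up) + len(query_down & ref_down)
    discordant = len(query_up & ref_down) + len(query_down & ref_up)
    return (concordant - discordant) / (len(query_up) + len(query_down))

# Hypothetical query signature and two reference compound signatures.
query_up, query_down = {"HMOX1", "NQO1"}, {"ALB", "CYP3A4"}
references = {
    "compound_A": ({"HMOX1", "NQO1", "GCLC"}, {"ALB"}),   # similar response
    "compound_B": ({"ALB", "CYP3A4"}, {"HMOX1"}),         # opposite response
}
for name, (up, down) in references.items():
    print(name, match_score(query_up, query_down, up, down))
```

Scores near +1 suggest a similar transcriptional response (and possibly a shared toxicity mechanism); scores near -1 suggest an opposing one.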
Topics: Animals; Databases, Genetic; Drug Discovery; Gene Expression Profiling; Gene Expression Regulation; Gene Regulatory Networks; Humans; Pharmacogenomic Testing; Systems Biology; Toxicity Tests; Toxicogenetics
PubMed: 29917034
DOI: 10.1039/c8mo00042e
BMC Bioinformatics, Oct 2009
Comparative Study
BACKGROUND
The Comparative Toxicogenomics Database (CTD) is a publicly available resource that promotes understanding about the etiology of environmental diseases. It provides manually curated chemical-gene/protein interactions and chemical- and gene-disease relationships from the peer-reviewed, published literature. The goals of the research reported here were to establish a baseline analysis of current CTD curation, develop a text-mining prototype from readily available open source components, and evaluate its potential value in augmenting curation efficiency and increasing data coverage.
RESULTS
Prototype text-mining applications were developed and evaluated using a CTD data set consisting of manually curated molecular interactions and relationships from 1,600 documents. Preliminary results indicated that the prototype found 80% of the gene, chemical, and disease terms appearing in curated interactions. These terms were used to re-rank documents for curation, resulting in increases in mean average precision (63% for the baseline vs. 73% for a rule-based re-ranking), and in the correlation coefficient of rank vs. number of curatable interactions per document (baseline 0.14 vs. 0.38 for the rule-based re-ranking).
CONCLUSION
This text-mining project is unique in its integration of existing tools into a single workflow with direct application to CTD. We performed a baseline assessment of the inter-curator consistency and coverage in CTD, which allowed us to measure the potential of these integrated tools to improve prioritization of journal articles for manual curation. Our study presents a feasible and cost-effective approach for developing a text mining solution to enhance manual curation throughput and efficiency.
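The re-ranking idea reported above can be made concrete: score each document by the number of curatable terms (genes, chemicals, diseases) a tagger found in it, sort by that score, and compare mean average precision before and after. The scoring rule and data below are hypothetical illustrations, not the paper's actual pipeline or results.

```python
# Average precision (AP) over one ranked list of 0/1 relevance labels:
# at each relevant document, take precision-so-far, then average.
def average_precision(ranked_relevance):
    hits, total = 0, 0.0
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            total += hits / rank
    return total / hits if hits else 0.0

# (term_hits, curatable?) per document, in the baseline journal order.
docs = [(1, 0), (7, 1), (0, 0), (5, 1), (2, 0), (9, 1)]

baseline_ap = average_precision([rel for _, rel in docs])
reranked = sorted(docs, key=lambda d: d[0], reverse=True)  # rule-based re-rank
reranked_ap = average_precision([rel for _, rel in reranked])

print(f"baseline AP:  {baseline_ap:.2f}")
print(f"re-ranked AP: {reranked_ap:.2f}")
```

Averaging AP over many such ranked lists gives the mean average precision figures quoted in the results.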
Topics: Computational Biology; Databases, Factual; Gene Regulatory Networks; Information Storage and Retrieval; Toxicogenetics
PubMed: 19814812
DOI: 10.1186/1471-2105-10-326
Journal of Environmental Science and..., 2014
Review
The aim of this review is to comprehensively summarize the recent achievements in the field of toxicogenomics and cancer research regarding genetic-environmental interactions in carcinogenesis and detection of genetic aberrations in cancer genomes by next-generation sequencing technology. Cancer is primarily a genetic disease in which genetic factors and environmental stimuli interact to cause genetic and epigenetic aberrations in human cells. Mutations in the germline act either as high-penetrance alleles that strongly increase the risk of cancer development, or as low-penetrance alleles that mildly change an individual's susceptibility to cancer. Somatic mutations, resulting either from DNA damage induced by exposure to environmental mutagens or from spontaneous errors in DNA replication or repair, are involved in the development or progression of the cancer. Induced or spontaneous changes in the epigenome may also drive carcinogenesis. Advances in next-generation sequencing technology provide opportunities to accurately, economically, and rapidly identify genetic variants, somatic mutations, gene expression profiles, and epigenetic alterations with single-base resolution. Whole genome sequencing, whole exome sequencing, and RNA sequencing of paired cancer and adjacent normal tissue present a comprehensive picture of the cancer genome. These new findings should benefit public health by providing insights into cancer biology and improving cancer diagnosis and therapy.
Topics: Disease Susceptibility; High-Throughput Nucleotide Sequencing; Humans; Neoplasms; Toxicogenetics
PubMed: 24875441
DOI: 10.1080/10590501.2014.907460
Protein & Cell, May 2018
Review
Inter-individual heterogeneity in drug response is a serious problem that affects the patient's wellbeing and poses enormous clinical and financial burdens on a societal level. Pharmacogenomics has been at the forefront of research into the impact of individual genetic background on drug response variability or drug toxicity, and recently the gut microbiome, which has also been called the second genome, has been recognized as an important player in this respect. Moreover, the microbiome is a very attractive target for improving drug efficacy and safety due to the opportunities to manipulate its composition. Pharmacomicrobiomics is an emerging field that investigates the interplay between microbiome variation and drug response and disposition (absorption, distribution, metabolism and excretion). In this review, we provide a historical overview and examine current state-of-the-art knowledge on the complex interactions between gut microbiome, host and drugs. We argue that combining pharmacogenomics and pharmacomicrobiomics will provide an important foundation for making major advances in personalized medicine.
Topics: Anti-Infective Agents; Biodiversity; Humans; Microbiota; Pharmacogenetics; Precision Medicine; Toxicogenetics
PubMed: 29705929
DOI: 10.1007/s13238-018-0547-2