CHAPTER 8: Profiling the Tox21 Chemical Library for Environmental Hazards: Applications in Prioritisation, Predictive Modelling, and Mechanism of Toxicity Characterisation
Published: 04 Dec 2019
Special Collection: 2019 ebook collection. Series: Issues in Toxicology
S. Sakamuru, H. Zhu, M. Xia, A. Simeonov, and R. Huang, in Big Data in Predictive Toxicology, ed. D. Neagu and A. Richarz, The Royal Society of Chemistry, 2019, pp. 242-263.
The Toxicology in the 21st Century (Tox21) program is an initiative between multiple U.S. federal agencies that aims to predict chemical toxicity from in vitro assay data, which would greatly reduce reliance on traditional whole-animal studies. The program has constructed a library of ∼10 000 environmental chemicals and drugs, representing a wide range of structural diversity, which is being tested in triplicate against a battery of cell-based assays in a quantitative high-throughput screening (qHTS) format. A standardised process has also been established to support assay development, automated robotic screening, large-scale data acquisition, new data analysis approaches needed to integrate and characterise the data, and data sharing. To date, the Tox21 program has generated over 120 million data points that have been made publicly available, thus contributing to the big data in toxicology. In this chapter, examples are given to show how to build in vivo toxicity prediction models based on the in vitro activity profiles of compounds and how to prioritise compounds for further in-depth toxicological studies. These data sets were also successfully used in a crowd-sourced challenge aimed at encouraging public participation in developing new methods and models for toxicity prediction.
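As a rough illustration of the idea of predicting in vivo toxicity from in vitro activity profiles, the sketch below uses a toy k-nearest-neighbour vote over binary assay-activity vectors. The data, assay count, and similarity measure are all hypothetical; they stand in for the much larger Tox21 profiles and the more sophisticated models discussed in the chapter.

```python
# Minimal sketch (hypothetical data): predict an in vivo toxicity label
# from an in vitro activity profile via k-nearest-neighbour matching.
# Each profile is a tuple of per-assay calls: 1 = active, 0 = inactive.

def jaccard(a, b):
    """Jaccard similarity between two binary activity profiles."""
    both = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    either = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return both / either if either else 0.0

def predict_toxicity(profile, training, k=3):
    """Majority vote over the k training compounds most similar to `profile`."""
    ranked = sorted(training, key=lambda rec: jaccard(profile, rec[0]), reverse=True)
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical training set: (activity profile over 5 assays, in vivo toxic?)
training = [
    ((1, 1, 0, 1, 0), True),
    ((1, 0, 0, 1, 0), True),
    ((0, 0, 1, 0, 0), False),
    ((0, 1, 0, 0, 0), False),
    ((1, 1, 1, 1, 0), True),
]

print(predict_toxicity((1, 1, 0, 0, 0), training))  # → True
```

In practice the chapter's models are built from thousands of qHTS concentration-response profiles, but the structure is the same: an activity signature per compound, a similarity or learning step, and a predicted in vivo outcome used for prioritisation.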