The Adverse Outcome Pathway Concept: A Basis for Developing Regulatory Decision-making Tools

Nathalie Delrue, Magdalini Sachana, Yuki Sakuratani, Anne Gourmelon, Eeva Leinala, Robert Diderich

The Adverse Outcome Pathway (AOP) concept is expected to help risk assessors use all existing information on the effects of chemicals on humans and wildlife, and to target the generation of additional information to the regulatory objective. AOPs will therefore be used in the Organisation for Economic Co-operation and Development (OECD) chemical safety programme as the underlying scientific rationale for the development of alternative methods for hazard assessment, such as read-across, in vitro test methods and integrated testing strategies that have the potential to replace animal tests. As a proof-of-concept, the OECD developed an AOP for skin sensitisation and, as a follow-up, has: a) implemented the AOP in the OECD QSAR Toolbox, so that information related to the Key Events (KEs) in the AOP can be used to group chemicals that are expected to act by the same mechanism and hence have the same skin sensitisation potential; and b) developed alternative test methods for the KEs, so that ultimately chemicals can be tested for skin sensitisation without the use of animal tests. The development of integrated testing strategies based on the AOP is ongoing. Building on this proof-of-concept, the OECD has launched an AOP development programme, with a first batch of AOPs published in 2016. A number of IT tools, which together form an AOP Knowledge Base, are at various stages of development and support both the construction of AOPs and their use in the development of integrated approaches for testing and assessment. Following the publication of the first batch of AOPs, OECD member countries will decide on priorities for their use in supporting the development of tools for regulatory use.

A Multi-faceted Approach to Achieving the Global Acceptance of Animal-free Research Methods

Jodie Melbourne, Patricia Bishop, Jeffrey Brown and Gilly Stoddart

In 2015, the PETA International Science Consortium Ltd. was awarded the Lush Training Prize for its broad approach to education and training on the effective use of human-relevant, non-animal research techniques. The prize was awarded for work that included hosting workshops and webinars, initiating in-person training sessions and developing educational resources. The Consortium works closely with industry and regulatory agencies to identify and overcome barriers to the validation and use of alternatives to animal testing, by using an approach that identifies, promotes and verifies the implementation of these methods. The Consortium's recent activities toward replacing animal tests for nanomaterials, pesticides and medical devices are described as examples of projects with broad applicability aimed at large-scale regulatory change.

Development of an In Silico Profiler for Respiratory Sensitisation

Steven J. Enoch, David W. Roberts, Judith C. Madden and Mark T.D. Cronin

In this article, we outline work that led the QSAR and Molecular Modelling Group at Liverpool John Moores University to be jointly awarded the 2013 Lush Science Prize. Our research focuses on the development of in silico profilers for category formation within the Adverse Outcome Pathway paradigm. The development of a well-defined chemical category allows toxicity to be predicted via read-across. This is the central approach used by the OECD QSAR Toolbox. The specific work for which we were awarded the Lush Prize was the development of such an in silico profiler for respiratory sensitisation. The profiler was developed by an analysis of the mechanistic chemistry associated with covalent bond formation in the lung. The data analysed were collated from clinical reports of occupational asthma in humans. The impact of the development of in silico profilers on the Three Rs is also discussed.
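The core idea of an alert-based profiler can be illustrated with a toy sketch: each alert pairs a mechanistic domain with a structural fragment, and a chemical is assigned to a category when its structure contains the fragment. Note that this is not the authors' published profiler; real profilers encode alerts as SMARTS patterns and match them with a cheminformatics toolkit, whereas the fragments and the naive SMILES substring matching below are illustrative assumptions only.

```python
# Toy alert-based profiler for respiratory sensitisation.
# Each alert maps a mechanistic domain (covalent bond formation in the
# lung) to a fragment written as a SMILES substring. Real profilers use
# SMARTS patterns and proper substructure matching; substring matching
# is deliberately naive and can over- or under-match.
ALERTS = {
    "isocyanate (carbamoylation)": "N=C=O",
    "acid anhydride (acylation)": "C(=O)OC(=O)",
}

def profile(smiles):
    """Return the mechanistic domains whose alert fragment appears
    as a substring of the (canonical) SMILES string."""
    return [name for name, frag in ALERTS.items() if frag in smiles]

print(profile("CC(=O)OC(=O)C"))  # acetic anhydride -> ['acid anhydride (acylation)']
print(profile("CCN=C=O"))        # ethyl isocyanate -> ['isocyanate (carbamoylation)']
```

Chemicals that fire the same alert form a candidate category, within which toxicity can then be read across from data-rich to data-poor members.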

A Quantitative Structure-toxicokinetic Relationship Model for Highly Metabolised Chemicals

Patrick Poulin and Kannan Krishnan

The aim of the present study was to develop a quantitative structure-toxicokinetic relationship (QSTkR) model for highly metabolised chemicals (HMCs). The proposed QSTkR model is essentially a physiologically based toxicokinetic (PBTK) model, in which the blood:air and tissue:blood partition coefficients (PCs) are predicted from the molecular structure of chemicals, and the liver blood flow rate (Ql) is used to describe hepatic clearance. Molecular structure-based prediction of the blood:air and tissue:blood PCs was performed from the n-octanol:water and water:air PCs of chemicals obtained with the conventional fragment constant methods. The validity of incorporating Ql instead of metabolic rate constants, as the hepatic clearance factor, in PBTK models for HMCs (extraction ratio > 0.7) was verified by comparing the simulations of venous blood concentration (Cv) profiles obtained with both the QSTkR and PBTK model approaches for 1,1-dichloroethylene, trichloroethylene and furan in the rat. Following the validation of this alternative approach for describing hepatic clearance of HMCs, a QSTkR model for dichloromethane was constructed. This model used molecular structure information as the sole input, and provided simulations of Cv for human exposure to low concentrations of dichloromethane. The QSTkR model simulations were similar to those obtained with the previously validated, conventional human PBTK model with experimentally determined PCs and metabolic rate constants (Vmax, Km and Kf) for dichloromethane. The present methodology is the first validated example of a mechanistically based prediction of the inhalation toxicokinetics of HMCs made solely from information on molecular structure.
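The rationale for replacing metabolic rate constants with the liver blood flow rate (Ql) can be sketched with the standard well-stirred liver model, under which hepatic clearance saturates at Ql as intrinsic clearance grows. The equation and the numerical values below are illustrative textbook assumptions, not parameters from the study itself.

```python
# Well-stirred liver model: CLh = Ql * CLint / (Ql + CLint).
# For highly metabolised chemicals the extraction ratio E = CLh / Ql
# exceeds 0.7, and CLh approaches Ql itself, which is why Ql alone can
# stand in for the metabolic rate constants in a PBTK/QSTkR description.

def hepatic_clearance(ql, cl_int):
    """Hepatic clearance (L/h) from liver blood flow and intrinsic clearance."""
    return ql * cl_int / (ql + cl_int)

ql = 0.9  # illustrative rat liver blood flow, L/h
for cl_int in (0.5, 5.0, 50.0):
    clh = hepatic_clearance(ql, cl_int)
    print(f"CLint={cl_int:5.1f} L/h  CLh={clh:.3f} L/h  E={clh / ql:.2f}")
```

As CLint increases tenfold from 5.0 to 50.0 L/h, CLh barely changes, because clearance has become perfusion-limited: exactly the regime (E &gt; 0.7) in which the QSTkR model substitutes Ql for chemical-specific metabolic constants.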

In Vitro Toxicity: Mechanisms, Alternatives and Validation — A Report from the 19th Annual Scientific Meeting of the Scandinavian Society for Cell Toxicology

Anna Forsby

The Scandinavian Society for Cell Toxicology (SSCT) has arranged annual scientific meetings since 1983. These workshops were the forum for the Multicentre Evaluation of In Vitro Cytotoxicity (MEIC) programme. Along with the MEIC programme, which was completed in 1998, a wide range of topics relating to cytotoxicity have been discussed. The meetings have also given graduate students and young scientists an opportunity to present their work to an international audience, while experts in the field of in vitro toxicity have been invited as speakers. The 19th SSCT scientific meeting, which was held in 2001 at Sørup Manor in Ringsted, Denmark, was no exception. The meeting consisted of four sessions: mechanisms of toxicity; environmental toxicological testing; alternatives to animal experiments; and validation of in vitro tests.

Intelligent Testing Strategies for Chemicals Testing — A Case of More Haste, Less Speed?

Robert Combes and Michael Balls

The prospects for using (Q)SAR modelling, read-across (chemical) and other non-animal approaches as part of integrated testing strategies for chemical risk assessment, within the framework of the EU REACH legislation, are considered. The potential advantages and limitations of (Q)SAR modelling and read-across methods for chemical regulatory risk assessment are reviewed. It is concluded that it would be premature to base a testing strategy on chemical-based computational modelling approaches until criteria for validating their reliability and relevance through independent and transparent procedures have been agreed. This is mainly because of inherent problems in validating and accepting (Q)SARs for regulatory use in ways analogous to those that have been developed and applied for in vitro tests. Until this issue has been resolved, it is recommended that testing strategies should be developed that comprise the integrated use of computational and read-across approaches. These should be applied in a cautious and judicious way, in association with available tissue culture methods, and in conjunction with metabolism and biokinetic studies. Such strategies should be intelligently applied, driven by exposure information (based on bioavailability, not merely on production volume) and hazard information needs, in preference to a tick-box approach. In the meantime, there should be increased efforts to develop improved (Q)SARs, expert systems and new in vitro methods, and, in particular, ways to expedite their validation and acceptance must be found and prospectively agreed with all major stakeholders.

QSAR Applicability Domain Estimation by Projection of the Training Set in Descriptor Space: A Review

Joanna Jaworska, Nina Nikolova-Jeliazkova and Tom Aldenberg

As the use of Quantitative Structure Activity Relationship (QSAR) models for chemical management increases, the reliability of the predictions from such models is a matter of growing concern. The OECD QSAR Validation Principles recommend that a model should be used within its applicability domain (AD). The Setubal Workshop report provided conceptual guidance on defining a (Q)SAR AD, but it is difficult to use directly. The practical application of the AD concept requires an operational definition that permits the design of an automatic (computerised), quantitative procedure to determine a model’s AD. An attempt is made to address this need, and methods and criteria for estimating AD through training set interpolation in descriptor space are reviewed. It is proposed that response space should be included in the training set representation. Thus, training set chemicals are points in n-dimensional descriptor space and m-dimensional model response space. Four major approaches for estimating interpolation regions in a multivariate space are reviewed and compared: range, distance, geometrical, and probability density distribution.
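Two of the four interpolation approaches named above (range and distance) are simple enough to sketch directly. The sketch below is a minimal illustration of the general idea, not the operational procedure proposed in the article; the 95% quantile cut-off and the Euclidean distance to the centroid are assumptions chosen for simplicity.

```python
import numpy as np

def in_range_domain(X_train, x_query):
    """Range-based AD: the query is in-domain if every descriptor value
    falls within the training set's min-max interval for that descriptor."""
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    return bool(np.all((x_query >= lo) & (x_query <= hi)))

def in_distance_domain(X_train, x_query, quantile=0.95):
    """Distance-based AD: in-domain if the Euclidean distance from the
    query to the training-set centroid does not exceed the chosen
    quantile of the training points' own distances to the centroid."""
    centroid = X_train.mean(axis=0)
    d_train = np.linalg.norm(X_train - centroid, axis=1)
    d_query = np.linalg.norm(x_query - centroid)
    return bool(d_query <= np.quantile(d_train, quantile))

# Four training chemicals described by two descriptors
X = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 0.5], [1.5, 1.5]])
print(in_range_domain(X, np.array([1.0, 1.0])))  # True: inside the bounding box
print(in_range_domain(X, np.array([3.0, 1.0])))  # False: first descriptor out of range
```

The geometrical (convex hull) and probability density approaches refine the same idea: they replace the axis-aligned box or the single centroid distance with a tighter description of where the training data actually lie.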

An Approach to Determining Applicability Domains for QSAR Group Contribution Models: An Analysis of SRC KOWWIN

Nina Nikolova-Jeliazkova and Joanna Jaworska

QSAR model predictions are most reliable if they come from within the model's applicability domain. The Setubal Workshop report provides conceptual guidance for defining a (Q)SAR applicability domain. However, an operational definition is necessary for applying this guidance in practice. It should also permit the design of an automatic (computerised) procedure for determining a model's applicability domain. This paper attempts to address this need for models that use a large number of descriptors (for example, group contribution-based models). The high dimensionality of these models imposes specific computational restrictions on estimating the interpolation region. The Syracuse Research Corporation KOWWIN model for prediction of the n-octanol/water partition coefficient is analysed as a case study. This is a linear regression model that uses 508 fragment counts and correction factors as descriptors, and is based on the group contribution approach. We conclude that applicability domain estimation by descriptor ranges, combined with Principal Component rotation as a data pre-processing step, is an acceptable compromise between estimation accuracy and the amount of data in the training set.
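The combination the authors recommend can be sketched in a few lines: rotate the descriptor space onto the training set's principal components, then apply simple min-max range checks in the rotated space. This is a minimal illustration of the general recipe under the assumption of a plain SVD-based rotation, not a reproduction of the article's operational procedure for the 508-descriptor KOWWIN model.

```python
import numpy as np

def pca_range_domain(X_train, X_query, n_components=None):
    """Range-based AD check after Principal Component rotation.

    Centres the training descriptors, obtains the principal axes via SVD,
    projects both sets onto those axes, and flags each query as in-domain
    if all its rotated coordinates fall within the training min-max ranges.
    """
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = PC axes
    if n_components is not None:
        Vt = Vt[:n_components]          # optionally truncate the rotation
    T_train = Xc @ Vt.T                 # training set in rotated space
    T_query = (X_query - mu) @ Vt.T     # queries in the same space
    lo, hi = T_train.min(axis=0), T_train.max(axis=0)
    return np.all((T_query >= lo) & (T_query <= hi), axis=1)

# Four training chemicals, two correlated descriptors
X = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3]])
print(pca_range_domain(X, X))                          # training points are in-domain
print(pca_range_domain(X, np.array([[100.0, 100.0]]))) # far-away query is not
```

Rotation matters here because plain axis-aligned ranges on correlated descriptors enclose large empty regions; after rotation, the ranges hug the data much more tightly while remaining cheap to compute in high dimensions.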