in silico


Barriers to the Uptake of Human-based Test Methods, and How to Overcome Them

Kathy Archibald, Tamara Drake and Robert Coleman

Although there is growing concern as to the questionable value of animal-based methods for determining the safety and efficacy of new medicines, which has in turn led many groups to develop innovative human-based methods, there are many barriers to the adoption of such methods for regulatory submissions. The reasons for this are various, and include a lack of confidence that the available human-based methods, be they in vivo, in silico or in vitro, can be sufficiently predictive of clinical outcomes. However, this is not the only problem: the issue of validation presents a serious impediment to progress, a particularly frustrating situation, given that the existing animal-based methods have never themselves been formally validated. Superimposed upon this is the issue of regulatory requirements: although regulators may be willing to accept non-animal approaches in place of particular animal tests, nowhere is this explicitly stated in their guidelines. Such problems are far from trivial, and represent major hurdles to be overcome. In addition, there is a range of other barriers, real or self-imposed, that hinder a more predictive approach to establishing a new drug’s clinical safety and efficacy profiles. Some of these barriers are identified, and ways forward are suggested.

Toward the Replacement of Animal Experiments through the Bioinformatics-driven Analysis of ‘Omics’ Data from Human Cell Cultures

Roland C. Grafström, Penny Nymark, Vesa Hongisto, Ola Spjuth, Rebecca Ceder, Egon Willighagen, Barry Hardy, Samuel Kaski and Pekka Kohonen

This paper outlines the work for which Roland Grafström and Pekka Kohonen were awarded the 2014 Lush Science Prize. The research activities of the Grafström laboratory have, for many years, covered cancer biology studies, as well as the development and application of toxicity-predictive in vitro models to determine chemical safety. Through the integration of in silico analyses of diverse types of genomics data (transcriptomic and proteomic), their efforts have proved to fit well into the recently-developed Adverse Outcome Pathway paradigm. Genomics analysis within state-of-the-art cancer biology research and Toxicology in the 21st Century concepts share many technological tools. A key category within the Three Rs paradigm is the Replacement of animals in toxicity testing with alternative methods, such as bioinformatics-driven analyses of data obtained from human cell cultures exposed to diverse toxicants. This work was recently expanded within the pan-European SEURAT-1 project (Safety Evaluation Ultimately Replacing Animal Testing), to replace repeat-dose toxicity testing with data-rich analyses of sophisticated cell culture models. The aims and objectives of the SEURAT-1 project have been to guide the application, analysis, interpretation and storage of ‘omics’ technology-derived data within the service-oriented sub-project, ToxBank. Particularly addressing the Lush Science Prize focus on the relevance of toxicity pathways, a ‘data warehouse’ that is under continuous expansion, coupled with the development of novel data storage and management methods for toxicology, serves to address data integration across multiple ‘omics’ technologies. The prize winners’ guiding principles and concepts for modern knowledge management of toxicological data are summarised. The translation of basic discovery results ranged from chemical-testing and material-testing data to information relevant to human health and environmental safety.
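
As a flavour of what such a bioinformatics-driven analysis might look like in code, the following minimal Python sketch scores transcriptomic profiles from toxicant-exposed cell cultures against a toxicity gene signature. The gene names, data and scoring rule are all invented for illustration; this is not the prize winners’ actual pipeline.

```python
# Minimal sketch (illustrative only): scoring transcriptomic profiles from
# toxicant-exposed human cell cultures against a toxicity gene signature.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
genes = [f"GENE{i}" for i in range(200)]  # synthetic gene identifiers

# Log2 fold-change profiles for three hypothetical exposures vs. control.
profiles = pd.DataFrame(rng.normal(0, 1, (200, 3)),
                        index=genes,
                        columns=["toxicant_A", "toxicant_B", "vehicle"])

# Hypothetical signature: genes expected to rise under the toxicity pathway.
signature_up = genes[:20]
profiles.loc[signature_up, "toxicant_A"] += 2.0  # simulate a responder

def signature_score(profile: pd.Series, up_genes: list[str]) -> float:
    """Mean fold-change of signature genes minus the background mean."""
    return profile[up_genes].mean() - profile.mean()

for name, col in profiles.items():
    print(f"{name}: score = {signature_score(col, signature_up):+.2f}")
```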

Human-based Systems in Drug and Chemical Safety Testing — Toward Replacement, the ‘Single R’

Robert A. Coleman

The Three Rs was a concept originally conceived as a means of reducing the suffering of laboratory animals that are used largely in identifying any potential safety issues with chemicals to which humans may be exposed. However, with growing evidence of the shortcomings of laboratory animal testing to reliably predict human responsiveness to such chemicals, questions are now being asked as to whether it is appropriate to use animals as human surrogates at all. This raises the question of whether, of the original Three Rs, two — Reduction and Refinement — are potentially redundant, and whether, instead, we should concentrate on the third R: Replacement. And if this is the best way forward, it is inevitable that this R should be based firmly on human biology. The present review outlines the current state of the art regarding our access to human biology through in vitro, in silico and in vivo technologies, identifying strengths, weaknesses and opportunities, and goes on to address the prospect of achieving a single R, with some suggestions as to how to progress toward this goal.

Development of an In Silico Profiler for Respiratory Sensitisation

Steven J. Enoch, David W. Roberts, Judith C. Madden and Mark T.D. Cronin

In this article, we outline work that led the QSAR and Molecular Modelling Group at Liverpool John Moores University to be jointly awarded the 2013 Lush Science Prize. Our research focuses on the development of in silico profilers for category formation within the Adverse Outcome Pathway paradigm. The development of a well-defined chemical category allows toxicity to be predicted via read-across. This is the central approach used by the OECD QSAR Toolbox. The specific work for which we were awarded the Lush Prize was the development of such an in silico profiler for respiratory sensitisation. The profiler was developed by an analysis of the mechanistic chemistry associated with covalent bond formation in the lung. The data analysed were collated from clinical reports of occupational asthma in humans. The impact of the development of in silico profilers on the Three Rs is also discussed.
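
To illustrate the general idea of an alert-based in silico profiler (not the published respiratory sensitisation profiler itself), the following Python sketch uses RDKit to flag a few generic electrophilic structural alerts. The SMARTS patterns are illustrative stand-ins chosen for this example.

```python
# Illustrative sketch of alert-based profiling for category formation.
# The SMARTS alerts below are generic electrophile patterns, not the
# published respiratory sensitisation profiler.
from rdkit import Chem

ALERTS = {
    "Michael acceptor (enone)": "C=CC=O",
    "Isocyanate": "N=C=O",
    "Acid anhydride": "C(=O)OC(=O)",
}

def profile(smiles: str) -> list[str]:
    """Return the names of structural alerts matched by a molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")
    return [name for name, smarts in ALERTS.items()
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts))]

# MDI, a known respiratory sensitiser, flags the isocyanate alert.
print(profile("O=C=Nc1ccc(Cc2ccc(N=C=O)cc2)cc1"))
```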

A Modular Approach to the ECVAM Principles on Test Validity

Thomas Hartung, Susanne Bremer, Silvia Casati, Sandra Coecke, Raffaella Corvi, Salvador Fortaner, Laura Gribaldo, Marlies Halder, Sebastian Hoffmann, Annett Janusch Roi, Pilar Prieto, Enrico Sabbioni, Laurie Scott, Andrew Worth and Valérie Zuang

The European Centre for the Validation of Alternative Methods (ECVAM) proposes to make the validation process more flexible, while maintaining its high standards. The various aspects of validation are broken down into independent modules, and the information necessary to complete each module is defined. Emphasis is thus placed on the data required to assess test validity in an independent peer review, rather than on the process itself. Once the information to satisfy all the modules is complete, the test can enter the peer-review process. In this way, the between-laboratory variability and predictive capacity of a test can be assessed independently. Thinking in terms of validity principles will broaden the applicability of the validation process to a variety of tests and procedures, including the generation of new tests, new technologies (for example, genomics, proteomics), computer-based models (for example, quantitative structure–activity relationship models), and expert systems. This proposal also aims to take into account existing information, defining this as retrospective validation, in contrast to a prospective validation study, which has been the predominant approach to date. This will permit the assessment of test validity by completing the missing information via the relevant validation procedure: prospective validation, retrospective validation, catch-up validation, or a combination of these procedures.
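
One way to picture the modular approach is as an explicit checklist in which each module is completed independently and peer review begins only once all modules are satisfied. The Python sketch below is purely illustrative, and its module names only loosely paraphrase those defined by ECVAM.

```python
# Hedged sketch of the modular idea: each validation module is tracked
# independently; a test enters peer review only when all are complete.
from dataclasses import dataclass, field

@dataclass
class TestValidation:
    name: str
    modules: dict[str, bool] = field(default_factory=lambda: {
        "test definition": False,
        "within-laboratory variability": False,
        "between-laboratory variability": False,
        "predictive capacity": False,
        "applicability domain": False,
    })

    def complete(self, module: str) -> None:
        self.modules[module] = True

    def ready_for_peer_review(self) -> bool:
        return all(self.modules.values())

tv = TestValidation("hypothetical in vitro test")
tv.complete("test definition")
print(tv.ready_for_peer_review())  # False until every module is satisfied
```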

The Integrated Acute Systemic Toxicity Project (ACuteTox) for the Optimisation and Validation of Alternative In Vitro Tests

Cecilia Clemedson, Ada Kolman and Anna Forsby

The ACuteTox project is designed to replace the animal testing for acute systemic toxicity that is widely used today for regulatory purposes with in vitro and in silico alternatives. Although earlier studies on acute systemic toxicity demonstrated a good correlation between in vitro basal cytotoxicity data (the 50% inhibitory concentration [IC50]) in human cell lines and rodent LD50 values, and an even better correlation between IC50 values and human lethal blood concentrations, very few non-animal tests have been accepted for general use. Therefore, the aim of the ACuteTox project is to adopt new testing strategies, for example, the implementation of new endpoints and new cell systems for toxicity screening, organ-specific models, metabolism-dependent toxicity, tissue absorption, distribution and excretion, and computer-based prediction models. A new database, AcuBase, containing descriptions and results of in vitro tests of the 97 reference chemicals, as well as the results of animal experimentation and human acute toxicity data, will be generated within the framework of ACuteTox. Scientists from 13 European countries are working together to find the most appropriate testing strategies for the prediction of human acute systemic toxicity, and to select a robust in vitro test battery for the cytotoxicity testing of chemicals.
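
The correlation analysis described above can be sketched in a few lines of Python. The IC50 and LD50 values below are synthetic, chosen only to illustrate the regression of rodent log LD50 against in vitro log IC50.

```python
# Minimal sketch of an in vitro/in vivo correlation analysis.
# All numbers are synthetic, for illustration only.
import numpy as np
from scipy import stats

log_ic50 = np.array([0.5, 1.2, 1.8, 2.4, 3.0, 3.7])   # log10 uM, in vitro
log_ld50 = np.array([1.1, 1.9, 2.2, 3.1, 3.4, 4.2])   # log10 mg/kg, rodent

fit = stats.linregress(log_ic50, log_ld50)
print(f"log LD50 = {fit.slope:.2f} * log IC50 + {fit.intercept:.2f}")
print(f"r = {fit.rvalue:.2f}")  # strength of the correlation
```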

Application of a Systems Biology Approach to Skin Allergy Risk Assessment

Gavin Maxwell and Cameron MacKay

We have developed an in silico model of the induction of skin sensitisation, in order to characterise and quantify the contribution of each pathway to the overall biological process. This analysis has been used to guide our research on skin sensitisation and in vitro test development programmes, and provides a theoretical rationale for the interpretation and integration of non-animal predictive data for risk assessment (RA) purposes. The in vivo mouse Local Lymph Node Assay (LLNA) is now in widespread use for the evaluation of skin sensitisation potential and potency. Recent changes in European Union (EU) legislation (i.e. the 7th Amendment to the EU Cosmetics Directive) have made the development of non-animal approaches to provide the data for skin sensitisation RA a key business need. Several in vitro predictive assays have already been developed for the prediction of skin sensitisation. However, these are based on the determination of a small number of pathways within the overall biological process, and our understanding of the relative contribution of these individual pathways to skin sensitisation induction is limited. To address this knowledge gap, a “systems biology” approach has been used to construct a computer-based mathematical model of the induction of skin sensitisation, in collaboration with Entelos, Inc. The biological mechanisms underlying the induction phase of skin sensitisation are represented by nonlinear ordinary differential equations and defined by using information from over 500 published papers. By using the model, we have identified knowledge gaps for future investigative research, and key factors that have a major influence on the induction of skin sensitisation (e.g. TNF-α production in the epidermis). The relative contribution of each of these key pathways has been assessed by determining their contributions to the overall process (e.g. sensitiser-specific T-cell proliferation in the draining lymph node). This information provides a biologically-relevant rationale for the interpretation and potential integration of diverse types of non-animal predictive data. Consequently, the Skin Sensitisation Physiolab® (SSP) platform represents one approach to integration that is likely to prove an invaluable tool for hazard evaluation in a new framework for consumer safety RA.
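
To give a concrete, if drastically simplified, sense of the modelling approach, the following Python sketch integrates a pair of coupled nonlinear ODEs standing in for epidermal TNF-α production and sensitiser-specific T-cell proliferation. The equations and parameters are invented for illustration; the actual SSP platform comprises far more pathways and equations.

```python
# Toy sketch of an ODE-based skin sensitisation model. The equations and
# parameters are invented; this is not the Entelos SSP platform.
from scipy.integrate import solve_ivp

def skin_sens(t, y, dose=1.0):
    tnf, tcells = y
    d_tnf = dose * 0.8 - 0.5 * tnf                 # production minus decay
    d_tcells = 0.3 * tnf * tcells * (1 - tcells)   # TNF-driven logistic growth
    return [d_tnf, d_tcells]

# Integrate from a near-naive state over 30 arbitrary time units.
sol = solve_ivp(skin_sens, (0, 30), [0.0, 0.01])
print(f"T-cell proliferation index at t=30: {sol.y[1, -1]:.3f}")
```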

Assuring Consumer Safety Without Animal Testing: A Feasibility Case Study for Skin Sensitisation

Gavin Maxwell, Maja Aleksic, Aynur Aptula, Paul Carmichael, Julia Fentem, Nicola Gilmour, Cameron MacKay, Camilla Pease, Ruth Pendlington, Fiona Reynolds, Daniel Scott, Guy Warner and Carl Westmoreland

Allergic Contact Dermatitis (ACD; chemical-induced skin sensitisation) represents a key consumer safety endpoint for the cosmetics industry. At present, animal tests (predominantly the mouse Local Lymph Node Assay) are used to generate skin sensitisation hazard data for use in consumer safety risk assessments. An animal testing ban on chemicals to be used in cosmetics will come into effect in the European Union (EU) from March 2009. This animal testing ban is also linked to an EU marketing ban on products containing any ingredients that have been subsequently tested in animals, from March 2009 or March 2013, depending on the toxicological endpoint of concern. Consequently, the testing of cosmetic ingredients in animals for their potential to induce skin sensitisation will be subject to an EU marketing ban, from March 2013 onwards. Our conceptual framework and strategy to deliver a non-animal approach to consumer safety risk assessment can be summarised as an evaluation of new technologies (e.g. ‘omics’, informatics), leading to the development of new non-animal (in silico and in vitro) predictive models for the generation and interpretation of new forms of hazard characterisation data, followed by the development of new risk assessment approaches to integrate these new forms of data and information in the context of human exposure. Following the principles of the conceptual framework, we have been investigating existing and developing new technologies, models and approaches, in order to explore the feasibility of delivering consumer safety risk assessment decisions in the absence of new animal data. We present here our progress in implementing this conceptual framework, with the skin sensitisation endpoint used as a case study.

The Use of Computer Models in Pharmaceutical Safety Evaluation

Scott Boyer

With the ever-increasing volume of data available to scientists in drug discovery and development, the opportunity to leverage these data in the assessment of drug safety is clear. The challenge in an environment of increasing data volume is in the structuring and the analysis of these data, such that decisions can be made without excluding information or overstating their meaning. Informatics and modelling play a crucial role in addressing this challenge in two basic ways: a) the data are structured and analysed in a transparent and objective way; and b) new experiments are designed with the model as part of the design process, much as in modern experimental physics. Enhancing the use and impact of informatics and modelling on drug discovery is not simply a matter of increasing processor speed and memory capacity. The transformation of raw data to usable, and useful, information is a scientific, technical and, perhaps most importantly, cultural challenge within drug discovery. This review will highlight some of the history, current approaches and promising future directions in this rapidly expanding area.

The In Chemico–In Silico Interface: Challenges for Integrating Experimental and Computational Chemistry to Identify Toxicity

Mark T.D. Cronin, Fania Bajot, Steven J. Enoch, Judith C. Madden, David W. Roberts and Johannes Schwöbel

A number of toxic effects are brought about by the covalent interaction between the toxicant and biological macromolecules. In chemico assays are available that attempt to identify reactive compounds. These approaches have been developed independently for pharmaceuticals and for non-pharmaceutical compounds. The assays vary widely in terms of the macromolecule (typically a peptide) and the analytical technique utilised. For both sets of methods, there are great opportunities to capture in chemico information by using in silico methods to provide computational tools for screening purposes. In order to use these in chemico and in silico methods, integrated testing strategies are required for individual toxicity endpoints. The potential for the use of these approaches is described, and a number of recommendations to improve these extremely useful techniques, in terms of implementing the Three Rs in toxicity testing, are presented.
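
A minimal sketch of the in chemico–in silico interface might look like the following: a linear model is fitted to measured peptide depletion (in chemico) against a computed reactivity descriptor (in silico), and then used to screen untested compounds. All values here are invented for illustration.

```python
# Illustrative sketch linking in chemico measurements to an in silico
# descriptor. The descriptor and depletion values are invented.
import numpy as np

descriptor = np.array([0.2, 0.5, 0.9, 1.3, 1.8])     # e.g. an electrophilicity index
depletion = np.array([5.0, 18.0, 35.0, 52.0, 74.0])  # % peptide depletion

# Fit a simple linear reactivity model: depletion = slope * descriptor + b.
slope, intercept = np.polyfit(descriptor, depletion, 1)

def predict_depletion(x: float) -> float:
    """Screen an untested compound from its computed descriptor alone."""
    return slope * x + intercept

print(f"Predicted depletion at descriptor 1.0: {predict_depletion(1.0):.1f}%")
```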