Integrated Testing


A Critical Assessment of the Scientific Basis, and Implementation, of Regulations for the Safety Assessment and Marketing of Innovative Tobacco-related Products

Robert D. Combes and Michael Balls

Our scientific, logistical, ethical and animal welfare-related concerns about the latest US Food and Drug Administration (FDA) regulations for existing and so-called ‘new’ tobacco products, aimed at reducing harmful exposures, are explained. Marketing claims for such products in the USA now have to be based on a wide range of information, a key part of which will increasingly be data on safety and risk. One of the pathways to achieve marketing authorisation is to demonstrate substantial equivalence (SE) with benchmark products, called predicates. However, the regulations are insufficiently transparent with regard to: a) a rationale for the cut-off date for ‘old’ and ‘new’ products, and for exempting the former from regulation; b) the scientific validity and operation of SE; c) options for product labelling to circumvent SE; d) the experimental data required to support, and criteria to judge, a claim; and e) a strategy for risk assessment/management. Scientific problems related to the traditional animal methods used in respiratory disease and inhalation toxicology, and the use of quantitative comparators of toxicity, such as the No Observed Adverse Effect Level, are discussed. We review the advantages of relevant in vitro, mechanism-based, target tissue-oriented technologies, which an advisory report of the Institute of Medicine of the US National Academy of Sciences largely overlooked. These benefits include: a) the availability, for every major site in the respiratory tract, of organotypic human cell-based tissue culture systems, many of which are already being used by the industry; b) the accurate determination of concentrations of test materials received by target cells; c) methods for exposure to particulate and vapour phases of smoke, separately or combined; d) the ability to study tissue-specific biotransformation; and e) the use of modern, human-focused methodologies, unaffected by species differences.
How data extrapolation from tissue culture to the whole animal could be addressed for risk assessment purposes is also discussed. A cost (to animal welfare)–benefit (to society, including industry and consumers) analysis was conducted, taking into account the above information; the potential for animal suffering; the extensive data already available; the existence of other, less hazardous forms of nicotine delivery; the fact that much data will be generated solely for benchmarking; and the fact that many smokers (especially those who are nicotine-dependent) ignore health warnings. It is concluded that, in common with the policies of several tobacco companies and countries, the use of laboratory animals for tobacco testing is very difficult, if not impossible, to justify. Instead, we propose and argue for an integrated testing scheme, starting with extensive chemical analysis of the ingredients and by-products associated with the use of tobacco products and their toxicity, followed by the use of in vitro systems and early clinical studies (involving specific biomarkers), with weight-of-evidence assessments at each stage. Appropriate adjustment factors could be developed so that concentration–response data obtained in vitro, together with the other information generated by the strategy, would enable the FDA to meet its objectives.
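The adjustment-factor idea at the end of this abstract can be sketched in code. The sketch below is illustrative only, with hypothetical factor names and values (the paper does not specify them): an in vitro benchmark concentration from a concentration–response curve is divided by assessment factors, by analogy with how NOAELs are adjusted in conventional risk assessment.

```python
# Hedged sketch of applying adjustment factors to an in vitro benchmark
# concentration (e.g. an IC10 from a concentration-response curve) to
# suggest a human exposure limit. All names and values are illustrative.

def in_vitro_derived_limit(benchmark_conc_ug_ml, factors):
    """Divide an in vitro benchmark concentration by each adjustment factor."""
    limit = benchmark_conc_ug_ml
    for name, f in factors.items():
        limit /= f
    return limit

factors = {
    "in_vitro_to_in_vivo_extrapolation": 10.0,   # assumed, illustrative
    "inter_individual_variability": 10.0,        # assumed, illustrative
}
limit = in_vitro_derived_limit(50.0, factors)    # e.g. an IC10 of 50 ug/mL
```

In a real scheme, each factor would itself be derived from data (for example, from biokinetic modelling of the in vitro exposure), rather than fixed at default values.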

Biokinetic and Toxicodynamic Modelling and its Role in Toxicological Research and Risk Assessment

Bas J. Blaauboer

Toxicological risk assessment for chemicals is still mainly based on highly standardised protocols for animal experimentation and exposure assessment. However, developments in our knowledge of general physiology, in chemico-biological interactions and in (computer-supported) modelling, have resulted in a tremendous change in our understanding of the molecular mechanisms underlying the toxicity of chemicals. This permits the development of biologically based models, in which the biokinetics as well as the toxicodynamics of compounds can be described. In this paper, the possibilities of developing systems in which the systemic (acute and chronic) toxicities of chemicals can be quantified, without heavy reliance on animal experiments, are discussed. By integrating data derived from different sources, predictions of toxicity can be made. Key elements in this integrated approach are the evaluation of chemical functionalities representing structural alerts for toxic actions, the construction of biokinetic models on the basis of non-animal data (for example, tissue–blood partition coefficients, in vitro biotransformation parameters), tests or batteries of tests for determining basal cytotoxicity, and more-specific tests for evaluating tissue or organ toxicity. It is concluded that this approach is a useful tool for various steps in toxicological hazard and risk assessment, especially for those forms of toxicity for which validated in vitro and other non-animal tests have already been developed.
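The biokinetic modelling described here can be illustrated with a minimal one-compartment sketch built from the kinds of non-animal parameters the abstract mentions: an in vitro intrinsic clearance scaled to the whole liver (here via the standard well-stirred liver model, assuming unbound fraction of 1), and a tissue:blood partition coefficient. All numerical values are assumed for illustration, not taken from the paper.

```python
# Illustrative one-compartment biokinetic sketch built from non-animal
# inputs: in vitro intrinsic clearance (uL/min/10^6 hepatocytes), scaling
# factors, hepatic blood flow, and a tissue:blood partition coefficient.

def hepatic_clearance(cl_int_ul_min_per_1e6, hepatocytes_per_g=120e6,
                      liver_g=1800.0, q_h_l_per_h=90.0):
    """Scale in vitro intrinsic clearance to whole-liver clearance (L/h)
    using the well-stirred liver model (unbound fraction assumed = 1)."""
    cl_int = cl_int_ul_min_per_1e6 * (hepatocytes_per_g / 1e6) * liver_g  # uL/min
    cl_int_l_h = cl_int * 60.0 / 1e6                                      # L/h
    return q_h_l_per_h * cl_int_l_h / (q_h_l_per_h + cl_int_l_h)

def steady_state_tissue_conc(dose_rate_mg_h, cl_l_h, kp_tissue_blood):
    """Css in blood = dose rate / clearance; tissue level via partition Kp."""
    css_blood = dose_rate_mg_h / cl_l_h
    return css_blood * kp_tissue_blood

cl = hepatic_clearance(10.0)                              # assumed CLint
c_tissue = steady_state_tissue_conc(1.0, cl, kp_tissue_blood=3.0)
```

A tissue concentration estimated this way can then be compared directly with concentrations producing effects in the in vitro toxicity tests, which is the link between biokinetics and toxicodynamics that the approach exploits.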

Intelligent Testing Strategies for Chemicals Testing — A Case of More Haste, Less Speed?

Robert Combes and Michael Balls

The prospects for using (Q)SAR modelling, read-across (chemical) and other non-animal approaches as part of integrated testing strategies for chemical risk assessment, within the framework of the EU REACH legislation, are considered. The potential advantages and limitations of (Q)SAR modelling and read-across methods for chemical regulatory risk assessment are reviewed. It is concluded that it would be premature to base a testing strategy on chemical-based computational modelling approaches, until criteria to validate them for reliability and relevance, by using independent and transparent procedures, have been agreed. This is mainly because of inherent problems in validating and accepting (Q)SARs for regulatory use in ways that are analogous to those that have been developed and applied for in vitro tests. Until this issue has been resolved, it is recommended that testing strategies should be developed which comprise the integrated use of computational and read-across approaches. These should be applied in a cautious and judicious way, in association with available tissue culture methods, and in conjunction with metabolism and biokinetic studies. Such strategies should be intelligently applied by being driven by exposure information (based on bioavailability, not merely on production volume) and hazard information needs, in preference to a tick-box approach. In the meantime, there should be increased efforts to develop improved (Q)SARs, expert systems and new in vitro methods, and, in particular, ways to expedite their validation and acceptance must be found and prospectively agreed with all major stakeholders.
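The read-across approach discussed above can be sketched numerically: a toxicity endpoint for a target chemical is estimated from its nearest analogues in a simple descriptor space. Everything below (chemical names, descriptor values, endpoint values) is hypothetical, and real regulatory read-across also demands mechanistic justification for the grouping, not just numerical similarity.

```python
# Illustrative read-across sketch with hypothetical data: distance-weighted
# prediction of an endpoint (e.g. an LC50, mg/L) from the k nearest
# analogues in a crude descriptor space (logP, molecular weight).
import math

analogues = {
    # name: ((logP, MW), endpoint_mg_L) -- all values illustrative
    "analogue_A": ((1.2, 150.0), 40.0),
    "analogue_B": ((1.5, 165.0), 32.0),
    "analogue_C": ((3.8, 290.0), 4.0),
}

def read_across(target_desc, analogues, k=2):
    """Distance-weighted mean of the k nearest analogues' endpoint values."""
    def dist(a, b):
        # Scale MW down so it does not dominate the distance
        return math.hypot(a[0] - b[0], (a[1] - b[1]) / 100.0)
    ranked = sorted(analogues.values(), key=lambda v: dist(target_desc, v[0]))
    nearest = ranked[:k]
    weights = [1.0 / (dist(target_desc, d) + 1e-9) for d, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

estimate = read_across((1.3, 158.0), analogues)  # target chemical's descriptors
```

The estimate here falls between the two close analogues and ignores the structurally distant one, which is the intended behaviour; the validation difficulty the abstract raises is precisely how to demonstrate that such similarity-based estimates are reliable enough for regulatory use.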

Integrated Testing Strategies for Toxicity Employing New and Existing Technologies

Robert D. Combes and Michael Balls

We have developed individual, integrated testing strategies (ITS) for predicting the toxicity of general chemicals, cosmetics, pharmaceuticals, inhaled chemicals, and nanoparticles. These ITS are based on published schemes developed previously for the risk assessment of chemicals to fulfil the requirements of REACH, which have been updated to take account of the latest developments in advanced in chemico modelling and in vitro technologies. In addition, we propose an ITS for neurotoxicity, based on the same principles, for incorporation in the other ITS. The technologies are deployed in a step-wise manner, as a basis for decision-tree approaches, incorporating weight-of-evidence stages. This means that testing can be stopped at the point where a risk assessment and/or classification can be performed, with labelling in accordance with the requirements of the regulatory authority concerned, rather than following a checklist approach to hazard identification. In addition, the strategies are intelligent, in that they are based on the fundamental premise that there is no hazard in the absence of exposure — which is why pharmacokinetic modelling plays a key role in each ITS. The new technologies include the use of complex, three-dimensional human cell tissue culture systems with in vivo-like structural, physiological and biochemical features, as well as dosing conditions. In this way, problems of inter-species extrapolation and in vitro/in vivo extrapolation are minimised. This is reflected in the ITS placing more emphasis on the use of volunteers at the whole organism testing stage, rather than on existing animal testing, which is the current situation.
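The step-wise, stop-early logic of such a decision-tree ITS can be sketched as follows. The stage names, scores and threshold are hypothetical: each stage contributes an evidence score, and testing halts as soon as the accumulated weight of evidence supports a decision, rather than running every test on a checklist.

```python
# Sketch of step-wise ITS logic with weight-of-evidence stopping.
# Stage names, scores and the threshold are illustrative assumptions.

def run_its(stages, classify_threshold=3.0):
    """stages: ordered (name, test_fn) pairs; each test_fn returns an
    evidence score (positive = evidence of hazard). Stops early when the
    accumulated evidence is decisive either way."""
    evidence, log = 0.0, []
    for name, test_fn in stages:
        score = test_fn()
        evidence += score
        log.append((name, score, evidence))
        if evidence >= classify_threshold:
            return "classify_as_hazardous", log       # stop: no further testing
        if evidence <= -classify_threshold:
            return "no_classification_required", log  # stop: clearly negative
    return "proceed_to_next_tier", log                # evidence inconclusive

stages = [
    ("in_chemico_structural_alerts", lambda: 1.5),
    ("in_vitro_cytotoxicity",        lambda: 2.0),
    ("volunteer_biomarker_study",    lambda: 0.5),    # never reached here
]
outcome, log = run_its(stages)
```

Here the second stage pushes the evidence past the threshold, so the third (most costly) stage is never run — the same economy the abstract claims for stopping once a classification can be made.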