Roman Ashauer*ab and Tjalling Jagerc
aEnvironment Department, University of York, Heslington, York YO10 5NG, UK. E-mail: roman.ashauer@york.ac.uk
bToxicodynamics Ltd, York YO10 4PE, UK
cDEBtox Research, De Bilt, The Netherlands
First published on 1st November 2017
As ecotoxicologists we strive for a better understanding of how chemicals affect our environment. Humanity needs tools to identify those combinations of man-made chemicals and organisms most likely to cause problems. In other words: which of the millions of species are at risk from pollution? And which of the tens of thousands of chemicals contribute most to the risk? We identify our poor knowledge of physiological modes of action (how a chemical affects the energy allocation in an organism), and of how they vary across species and toxicants, as a major knowledge gap. We also find that the key to predictive ecotoxicology is the systematic, rigorous characterization of physiological modes of action, because that will enable more powerful in vitro to in vivo toxicity extrapolation and in silico ecotoxicology. In the near future, we expect a step change in our ability to study physiological modes of action through improved, and partially automated, experimental methods. Once we have populated the matrix of species and toxicants with sufficient physiological mode of action data, we can look for patterns, and from those patterns infer general rules, theory and models.
Environmental significance
Humanity designs and produces ever more chemicals and urgently needs to identify those that pose the greatest risk to the millions of species living on earth. Predictive ecotoxicology would enable risk assessors to project the impact of untested chemicals on environmental organisms, yet we are currently not very proficient at this. We outline a research strategy that will deliver more effective theory and models for environmental risk assessment of chemicals. This strategy rests on mechanistic toxicokinetic-toxicodynamic modelling, complements efforts to develop quantitative adverse outcome pathways (qAOPs), and will also enable the design of more efficient quantitative structure-activity relationships (QSARs).
The first approach relies on high-throughput bioassays of cellular or molecular markers and promises to upscale the bioassay results to organisms and beyond.2,3 Those efforts rally around the idea of quantitative adverse outcome pathways (AOP, see also glossary in Table 1).4–6 In our view, this approach will help to solve many important problems, but continues to struggle with the central conundrum because of a poor link to the whole organism (life history traits), and because it does not have easy scalability in the species dimension. We use the term scalability as it is used in engineering: the ability of a process to accommodate a growing amount of work. Implementing a method consumes most of the cost and effort; applying it thousands or millions of times adds little extra cost. We can see that high-throughput testing will likely go a long way towards screening large numbers of chemicals, i.e. we have scalability in the many-chemicals dimension, and quantitative AOPs will aid interpretation and extrapolation, but the results will initially tell us something only about a limited number of biological species. To extrapolate to other, untested, species requires new knowledge that is not available yet. Simply put, we think these approaches, as currently put together, do not have scalability in the biological species dimension. Extrapolating quantitative AOPs across species requires the assumption that molecular pathways and functions are conserved across biological species. Moreover, it requires that they are conserved not only qualitatively, but also that the quantitative response is the same across biological species. For most species we do not have this information, and for those where we know something already, we can see that a notable fraction of receptors and target sites is not conserved between species.7,8 However, because a large fraction of receptors and pathways is conserved across species, the promise is that building blocks of quantitative AOPs can be reused – eventually making their development quick and cheap. However, we have not reached that point yet. The resources required to build quantitative AOPs for a new species are substantial, and more importantly, they are larger than simply doing apical toxicity tests with a range of model toxicants in the new species. To put it more bluntly: currently the most efficient way of establishing whether a previously untested species is vulnerable to chemical pollutants might be to simply perform traditional toxicity testing of life-history traits with a dozen or so carefully selected model toxicants – and not to embark on a large research programme aiming to build quantitative AOPs around that species. When will that situation change? The answer hinges on how much quantitative AOPs differ amongst species – something that we do not know very much about.
Term | Abbreviation | Meaning
---|---|---
Toxicokinetics | TK | What an organism does to a chemical, including uptake, biotransformation, distribution and elimination. Process that links the external concentration to a change in the concentration at the target site, often studied over time |
Toxicodynamics | TD | What a chemical does to an organism. Process that links the concentration at the target site to toxicity. Encompasses all kinds of effects (e.g. on growth, reproduction, behavior, survival, etc.) & often studied over time |
Adverse outcome pathway | AOP | A way to organise and structure toxicological knowledge. Diagrams with boxes and arrows connecting cellular, physiological and individual level variables. Sometimes viewed as a framework for (eco)toxicity |
Dynamic energy budget model with toxicity module | DEBtox | A family of models following from DEB theory. Used to simulate how organisms acquire and use energy to live, grow and reproduce, and how chemicals change those energy flows |
Physiological mode of action | pMoA | A distinct way in which a chemical interferes with the energy fluxes in an organism, and thereby affects life-history traits. Different pMoAs are defined within DEBtox |
The second approach to tackle the conundrum of large numbers views ecotoxicology as chemical stress ecology.9,10 A large body of literature, including some high-level reviews, calls for more ecological relevance and realism in the environmental risk assessment of chemicals.11 This school of thought recognizes that, in the environment, chemical stress is just one of many stress factors that influence an organism, population, community or ecosystem.12 Ecological impacts of toxicants are conditional on environmental variables as well as ecological factors. Also, there is clear evidence that the environment is degraded in many locations,13 and the question arises as to how much of that environmental degradation and biodiversity loss can be attributed to man-made, synthetic chemicals. Ecology provides tools to study stress at different biological scales (organisms, populations, communities, ecosystems) and there are concepts of what makes a biological species vulnerable to toxicants.14 However, these approaches are all limited by a lack of quantitative models for multiple stressors, in particular for how organisms respond to chemical stress in realistic environmental situations, which will inevitably include multiple stress factors. As individual organisms are the building blocks of populations, communities and ecosystems, and as they are well-defined systems subject to the laws of mass and energy conservation as well as evolution, we have good reasons to start building theory and models for the effects of toxicants at the organism level.15 If we had a quantitative model to predict effects of toxicant exposure on organisms' life history traits under multiple stress, then ecology would provide us with the theory and tools to extrapolate those effects to populations,16 communities17 and ecosystems.18
Here we take a closer look at the current limitations of ecotoxicological theory and models, and illuminate the interface between the first and second approach: effects of toxicants at the level of the whole organism's life history (Fig. 1). We explain how progress on both challenges, the development of quantitative adverse outcome pathways and ecologically relevant multiple-stress assessments, is limited by the same issue: a poor understanding of how the physiological mode of action19 (pMoA) varies across toxicants and biological species.
Fig. 2 Upscaling from molecular to macroscopic scale in physical chemistry and ecotoxicology. The equations that describe a macroscopic system (perfect gas, e.g. Boyle's law) derive from the equations of the molecular scale model (kinetic model of gases). In ecotoxicology we have models at the cellular and the organism level, but the connecting theory is missing. At the macroscopic scale DEBtox and its pMoAs provide a starting point (stress: degree of stress on a DEB model parameter, C_T: internal tolerance concentration, C_V: internal concentration of toxicant, C_0: threshold38).
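The symbols in the Fig. 2 caption correspond to the linear-with-threshold stress function commonly used in DEBtox (cf. Kooijman and Bedaux, cited below); a plausible reconstruction of that relationship, under the assumption that the figure shows this standard form, is:

```latex
% Linear-with-threshold stress function (reconstruction; notation as in Fig. 2)
s = \frac{\max\left(0,\; C_V - C_0\right)}{C_T}
```

The dimensionless stress s then modifies the targeted DEB parameter, for example multiplying assimilation by (1 − s) or maintenance costs by (1 + s), depending on the pMoA.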
The development of AOPs promises to fill parts of this gap, but there are very few quantitative AOPs to date4,25–28 and those that have been developed do not provide a general method for scaling cellular toxicity up to the level of the organism's life history. As a first step (necessary but not sufficient), we argue here that significant advances can only be made by replacing the traditional organism-level dose-response models with biologically based dose-response models, namely those based on energy-budget considerations. There are many shortcomings of the traditional descriptive dose-response models and the associated summary statistics like LD50, ECx or NOEC values,29,30 but it is these crude metrics that in vitro and in silico methods aim to predict. Is it really the best way forward to build quantitative models around AOPs or cellular bioassays with the aim of predicting adverse outcomes by proxy of LD50, ECx or NOEC values? We can do better by using energy-budget models instead, and predicting the parameters of those models. Why is this better? Simply put, organisms require resources to grow, develop and reproduce; it is these traits that we ultimately require to link AOPs to ecological theory, and upscale to the population level and higher. In particular for sub-lethal responses, like changes in growth and reproduction, it is the acquisition and use of resources (or in general: energy) that links the different life-history traits and determines how they develop over ontogeny. Changes in energy-demanding traits, such as growth and reproduction, as a consequence of toxicant or environmental stress, logically imply changes in the energy budget. Hence we can model an organism's life history using dynamic energy budget (DEB) theory,31 and we can describe toxic effects on life-history traits of organisms (growth, reproduction) as changes in their energy allocation.32,33 Energy-budget models can identify where energy allocation has changed due to a stressor – the physiological mode of action – and by how much. This type of physiological information offers far more opportunities to link to the microscopic level, as well as to the population level, than descriptive summary statistics like ECx values.15 We propose that predicting the energy-budget parameters from sub-organismal bioassays is likely far more robust and accurate than predicting ECx values because those model parameters have a clear biological interpretation.
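To make the energy-budget argument concrete, the following minimal sketch (a hypothetical toy model, not the full DEBtox formulation; all parameter values are arbitrary) simulates growth and cumulative reproduction and applies a stress factor to one energy flux at a time, illustrating how different pMoAs leave different fingerprints on life-history traits.

```python
import numpy as np

# Minimal toy energy-budget model (illustrative only; not the full DEBtox equations).
# Scaled length L grows in a von Bertalanffy fashion from assimilation minus maintenance;
# reproduction starts above a puberty length L_p.

def simulate(pmoa=None, stress=0.0, days=60, dt=0.1):
    """Simulate scaled length and cumulative reproduction under one pMoA.

    pmoa:   None (control), 'assimilation', 'maintenance', 'growth' or 'reproduction'
    stress: dimensionless stress level s, e.g. max(0, C_V - C_0) / C_T
    """
    r_B, L_m, L_p, R_m = 0.1, 1.0, 0.5, 10.0   # arbitrary baseline parameters

    # Apply the stress factor to the energy flux targeted by the pMoA
    assim  = 1.0 - stress if pmoa == 'assimilation' else 1.0   # lower energy input
    maint  = 1.0 + stress if pmoa == 'maintenance'  else 1.0   # higher upkeep costs
    growth = 1.0 + stress if pmoa == 'growth'       else 1.0   # higher cost per unit of growth
    repro  = 1.0 + stress if pmoa == 'reproduction' else 1.0   # higher cost per offspring

    L, R = 0.1, 0.0   # initial scaled length and cumulative reproduction
    for _ in np.arange(0.0, days, dt):
        dL = (r_B / growth) * (assim * L_m / maint - L)   # growth rate
        dR = (R_m * assim / repro) * max(0.0, L - L_p)    # reproduction rate
        L += max(dL, 0.0) * dt
        R += dR * dt
    return L, R

if __name__ == "__main__":
    print("control      :", simulate())
    for mode in ("assimilation", "maintenance", "growth", "reproduction"):
        print(f"{mode:13s}:", simulate(pmoa=mode, stress=0.5))
```

Running the script shows, for example, that stress on assimilation depresses both growth and reproduction, whereas stress on reproduction costs leaves growth untouched – the kind of pattern that allows a pMoA to be identified from growth and reproduction time series.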
Fig. 3 Upscaling from molecular to macroscopic scale with toxicokinetic-toxicodynamic (TK-TD) models at the organism level. Toxicodynamic model parameters cluster according to the biochemical mechanism of toxicity (adapted from Ashauer et al. 201634). In this example toxicokinetics and toxicodynamics were accounted for separately using a TK-TD model at the organism level. The toxicants were from five chemical classes (organophosphates, carbamates, baseline toxicants, uncouplers, reactive toxicants), each representing a distinct molecular initiating event and toxicity pathway at the cellular level (different symbols in plot). Finding those distinct cellular toxicity pathways reflected in organism level apical endpoint data (as clusters in the plot), even if it is just for the endpoint survival, demonstrates that biochemistry is reflected at the organism level in the values of TD model parameters.
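As an illustration of the type of analysis behind Fig. 3, the sketch below (using made-up toxicodynamic parameter values, not the data of Ashauer et al., and assuming scikit-learn is available) clusters hypothetical pairs of TD parameters and checks how the clusters map onto chemical-class labels.

```python
import numpy as np
from sklearn.cluster import KMeans   # assumes scikit-learn is installed

# Hypothetical toxicodynamic parameter pairs (log threshold, log killing rate)
# for three chemical classes; real values would come from fitted TK-TD models.
rng = np.random.default_rng(1)
centres = {"baseline": (1.5, -2.0), "organophosphate": (-0.5, 0.5), "uncoupler": (0.5, -0.5)}
X, labels = [], []
for name, centre in centres.items():
    X.append(rng.normal(centre, 0.2, size=(10, 2)))   # 10 made-up chemicals per class
    labels += [name] * 10
X = np.vstack(X)

# Unsupervised clustering; if TD parameters reflect the biochemical mechanism,
# the clusters should recover the chemical classes.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for name in centres:
    ids = {clusters[i] for i, lab in enumerate(labels) if lab == name}
    print(f"{name:16s} -> cluster id(s): {ids}")
```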
Fig. 4 Currently known physiological modes of action across toxicants and biological species. Physiological mode of action: M = maintenance, A = assimilation, G = growth costs, R = reproduction costs, H = hazard to embryo (list of studies in ESI†). First column: ECOSAR class (Ecological Structure Activity Relationships (ECOSAR) Predictive Model v1.11, US EPA); (1): insecticide, inhibits oxidative phosphorylation; (2): metals classified by us; n.a.: not applicable. Physiological modes of action are extracted from the scientific literature: A. nanus,19 C. elegans,23,48–54 D. octaedra,55 L. rubellus,56–58 C. teleta,59 F. candida,60–62 D. magna,32,57,63–67 M. micrura,68 M. californianus,69 M. galloprovincialis,57,69,70 M. edulis,57 C. gigas,57 L. stagnalis,71–73 D. rerio,57,74 S. droebachiensis.75
In Fig. 4 we list the chemical class from ECOSAR (Ecological Structure Activity Relationships Predictive Model v1.11, US EPA) in the first column. This classification is based on the molecular structure and existing toxicity data. It is important to realize that the mode of toxic action classification of chemicals itself is subject to variation depending on which method is used.40
We advocate populating the pMoA matrix until we can see patterns. From those emerging patterns we can then derive new classification schemes for chemicals, which will then correspond to the chemicals' physiological mode of action. Development of new quantitative structure activity relationships will follow. In addition to populating the pMoA matrix, we also need to map AOPs onto pMoAs. In other words: for each AOP we should carry out experiments with toxicants triggering the same molecular initiating event and calibrate an energy-budget model to test how widely the assumption of AOPs being chemical agnostic holds.
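A hypothetical sketch of what populating the matrix and looking for patterns could look like in practice, using placeholder species, toxicants, chemical classes and pMoA assignments (none of these entries are taken from Fig. 4):

```python
import pandas as pd

# Purely illustrative placeholder entries (not real assignments from Fig. 4).
# pMoA codes as in Fig. 4: M = maintenance, A = assimilation,
# G = growth costs, R = reproduction costs, H = hazard to embryo.
records = [
    ("species_1", "toxicant_A", "class_x", "M"),
    ("species_1", "toxicant_B", "class_y", "A"),
    ("species_2", "toxicant_A", "class_x", "M"),
    ("species_2", "toxicant_C", "class_z", "R"),
    ("species_3", "toxicant_B", "class_y", "A"),
]
df = pd.DataFrame(records, columns=["species", "toxicant", "chem_class", "pmoa"])

# Species x toxicant matrix of pMoAs, and a cross-tabulation against chemical class
# (the kind of pattern from which new, pMoA-based classification schemes could emerge)
print(df.pivot(index="species", columns="toxicant", values="pmoa"))
print(pd.crosstab(df["chem_class"], df["pmoa"]))
```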
AOPs and pMoAs are complementary, in our view. AOPs are detailed at the molecular/cellular level, but sketchy about the ‘adverse outcome’, i.e., the effects on the life-history traits of the organism. The pMoAs (and the associated energy-budget model) provide a direct link to growth, development and reproduction, over the entire life cycle of the organism, and thereby a direct connection to higher levels of biological organization. However, the pMoAs are extremely sketchy at the sub-individual level as they consider rather abstract, lumped, energy fluxes such as assimilation and maintenance. We propose to populate the species & chemicals matrix to test the hypothesis that similarly acting chemicals will result in the same pMoA. The outcome of this exercise is totally open. It is also conceivable that one molecular initiating event will influence several energetic fluxes, and the chemical will thus have a pMoA that is made up of multiple energy fluxes. The AOP framework has not yet resulted in quantitative models that are general enough to test the above hypothesis.
We view the proposed research programme into pMoAs as complementary to the development of AOPs and AOP-based quantitative models. We can simply view the pMoA in DEBtox terminology as the ‘adverse outcome’ in AOP terminology. What is missing in the AOP concept is the explicit use of a dynamic biological model to describe the organism and the physiological context within which an adverse outcome manifests itself. DEBtox models can fill this gap. Note, however, that different AOPs can map onto the same pMoA. For example, we can imagine many pathways by which a toxicant can affect the assimilation process, or each of the other pMoAs.
The first key challenge for in vitro to in vivo extrapolation (IVIVE) is that for most combinations of toxicant and biological species we do not know the pMoA yet (Fig. 4). The second is that it can be difficult to identify the pMoA from typically noisy experimental data. The third challenge is that DEBtox parameterization requires rather extensive animal testing, with growth and reproductive output measured over time for a good part of the life cycle, and with sufficiently strong chemical effects.
On the modelling side, there are several practical issues that need to be addressed. Firstly, DEBtox is not a single model but rather a family of closely related models.37 It is highly unlikely that different DEBtox models will identify a different pMoA, but the quantitative comparison of TD parameter values is best served by selecting a single model for all analyses. Further aspects that require more research are the identification of the relevant dose metric at the target site (e.g., the membrane concentration for baseline toxicants) and the quantitative link between the level of target occupation and the associated energy flux in the DEBtox model. Currently, most applications have relied on the linear-with-threshold relationships as presented by Kooijman and Bedaux.41 However, there are no strong theoretical reasons to dismiss other possible relationships. Further complications arise because we cannot observe the energy fluxes, and thus the pMoAs, themselves. They are derived from observations on growth and reproduction over time, and linked to the underlying fluxes with auxiliary assumptions. This may hamper the identification of patterns in pMoAs across species and chemicals, and constitutes one more reason why we need high-throughput testing with whole organisms (see the next section).
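To illustrate the point about the concentration-to-stress link, the sketch below contrasts the linear-with-threshold form of Kooijman and Bedaux with one hypothetical alternative, a saturating (Hill-type) relationship; the Hill-type parameters (c_50, s_max, n) are illustrative choices of ours, not established DEBtox notation.

```python
def stress_linear_with_threshold(c_v, c_0, c_t):
    """Linear-with-threshold stress (Kooijman & Bedaux): s = max(0, c_v - c_0) / c_t."""
    return max(0.0, c_v - c_0) / c_t

def stress_saturating(c_v, c_0, c_50, s_max=1.0, n=2.0):
    """One hypothetical alternative: a saturating, Hill-type stress function."""
    excess = max(0.0, c_v - c_0)
    return s_max * excess**n / (c_50**n + excess**n)

# Example: both functions are zero below the threshold c_0, but diverge at high
# internal concentrations (unbounded vs. saturating stress).
print(stress_linear_with_threshold(5.0, c_0=1.0, c_t=4.0))
print(stress_saturating(5.0, c_0=1.0, c_50=2.0))
```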
Many animals require some modification of the DEB model to fit their life cycles; biological reality is often more complex than can be captured by the simplicity of generic models. This requires more parameters to be fitted, and thus more extensive testing efforts. However, this only needs to be done once in detail for each species to build the DEB model. After that, the DEB part of the TD model remains the same, and only the toxicological part needs to be calibrated for each toxicant (a sketch of such a calibration follows below). In general, partial life-cycle testing suffices to identify the pMoA and to fit the toxicological parameters, following growth and reproduction over a substantial part of the life cycle (starting with juveniles, and continuing until a number of reproduction events has been observed). An often-overlooked aspect in such tests is that toxicants may affect the investment per offspring. Such changes are not only extremely relevant for identifying and quantifying the pMoA in the energy-budget context; they are also essential for an accurate prediction of population-level effects.16 From a practical perspective, it will be important to start with species that have very little variation between individuals (e.g., clones), and that do not require a sexual partner (e.g., parthenogenetic species). To prove the concept, and to allow high-throughput testing, we would suggest starting with a small animal species that has a fast life cycle, such as some daphnids, rotifers, nematodes or even protozoans. Once such a proof-of-concept is firmly established, and patterns in pMoAs and model parameters identified, subsequent testing with other animal species can likely be more focused.
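As a minimal sketch of calibrating the toxicological part once the DEB part is fixed, the toy example below (synthetic data, a deliberately simplified reproduction model, assuming SciPy is available) fits a threshold and tolerance concentration to cumulative reproduction observed over time at several exposure concentrations. A real analysis would use the full DEBtox model and appropriate error structures.

```python
import numpy as np
from scipy.optimize import curve_fit   # assumes SciPy is installed

# Toy reproduction-cost pMoA: the reproduction rate at exposure concentration c is
# R_m / (1 + s), with stress s = max(0, c - c_0) / c_T (linear with threshold).
# Cumulative reproduction after t days (constant exposure, adults only):
def cum_repro(X, c_0, c_T, R_m):
    t, c = X
    s = np.maximum(0.0, c - c_0) / c_T
    return R_m * t / (1.0 + s)

# Synthetic 'observations': 3 exposure concentrations x 5 observation times, with noise
rng = np.random.default_rng(0)
t = np.tile([7.0, 14.0, 21.0, 28.0, 35.0], 3)
c = np.repeat([0.0, 2.0, 5.0], 5)
true = dict(c_0=1.0, c_T=4.0, R_m=6.0)
obs = cum_repro((t, c), **true) + rng.normal(0.0, 3.0, size=t.size)

# Fit the toxicological parameters (and the control reproduction rate) to the time series
popt, pcov = curve_fit(cum_repro, (t, c), obs, p0=[0.5, 2.0, 5.0])
print("fitted c_0, c_T, R_m:", popt)
```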
Which aspect of ecotoxicology? | Traditional workflow | Proposed new workflow
---|---|---
Toxicity test design and data analysis | Fitting of sigmoidal dose-response curve and extraction of summary statistic (e.g. EC50, ECx or NOEC values) for one observation time point (usually at the end of the test) and each endpoint separately. Loss of information | Fitting of toxicokinetic-toxicodynamic models (e.g. DEBtox) to time-series of multiple, physiologically related endpoints (e.g. growth & reproduction) observed throughout the toxicity test. Extraction of biologically meaningful model parameter values |
Entries in ecotoxicology databases | Summary statistics such as EC50, ECx, NOEC, or similar values | Values of TK-TD model parameters and raw data |
Use in chemical risk assessment | Comparison with environmental concentrations | Simulate effects using environmental concentration time series as input |
Development of new quantitative structure activity relationships (QSARs) | Benchmarking against summary statistics (EC50, ECx or NOEC values, etc.) limits progress due to lost information and lumping of toxicokinetics (related to physical-chemical properties) with toxicodynamics (related to toxicophore and toxic mechanism) | Aiming to predict TK-TD model parameters is more likely to succeed because they have biological meaning and capture more information. Toxicokinetic and toxicodynamic parameters can be predicted separately |
Development of in vitro to in vivo toxicity extrapolation (IVIVE) and high throughput testing (HTT) methods | Benchmarking against summary statistics (EC50, ECx or NOEC values, etc.) limits progress due to lost information and lumping of toxicokinetics (related to physical-chemical properties) with toxicodynamics (related to toxicophore and toxic mechanism) | Aiming to predict TK-TD model parameters is more likely to succeed because they have biological meaning and capture more information. Toxicokinetics and toxicodynamics can be separated. In vitro assays can be developed to predict toxicodynamic parameters |
There is a large gap in our theory and modelling capability when it comes to linking across scales of biological organization. There must be a link between what happens at the cellular level and what happens to the energy budget; the tricky part is to find it. To fill this gap, the bottleneck might, surprisingly, not be a lack of detailed molecular understanding, but rather the effort required to generate suitable data on whole-organism life-history traits under chemical stress. Technological innovation is needed to achieve the large number of toxicity tests that we envision. There are already methods, often based on image analysis, to speed up ecotoxicity testing.42–45 As far as we know, none of those have led to the high-throughput organism-level data generation capability that we need, but it might be only one step away (e.g. C. elegans growth assay46).
Generally, we need to make greater efforts to lay open, discuss and scrutinize the validity of the assumptions behind models in ecotoxicology. Models are simply tools to deduce quantitative conclusions from a set of assumptions and data, nothing more. It is good to remember that most ecotoxicity models fall into the category of ‘phenomenological’ models,47 although DEB models at least consider some fundamental physics, such as the inclusion of the factor ‘time’ and adherence to the laws of energy and mass conservation. But perhaps it would be misleading to require that models in biology must be based on fundamental physics. Or, to use Jeremy Gunawardena's words:47 “Keep it simple. Including all the biochemical details may reassure biologists but it is a poor way to model.” It is more important that models are fit for purpose, and that means finding the right level of abstraction for the question asked. In ecotoxicology, the individual organism response is what connects the research programmes aiming at high-throughput testing and those efforts aimed at increasing ecological realism and relevance, including multiple stressors. Hence it appears that a useful level of abstraction for ecotoxicology models and theory is the individual organism and its energy budget – and that means we need to study physiological modes of action.
Footnote
† Electronic supplementary information (ESI) available: The data used in Fig. 4 is available as supporting information (MS EXCEL file), including links to the original studies. See DOI: 10.1039/c7em00328e |