Executive summary from 1st expert workshop – INERIS 12-13 April, 2012
The first expert meeting within the HEROIC project was organized at INERIS (Verneuil-en-Halatte, France) on 12-13 April 2012. Experts from public institutes, academia and the business sector were invited to work together to provide a consolidated overview of today's risk assessment procedures and data requirements, and then to stimulate a discussion on how existing strengths could be further exploited and weaknesses overcome. The workshop combined plenary sessions for information sharing (presentation of the working groups on the first day and report-back sessions on the afternoon of the second day) with breakout sessions in which experts could give input according to their respective areas of expertise. There were five working groups: risk assessment, hazard assessment for single compounds, hazard assessment for mixtures, exposure assessment, and socio-economic aspects.
Work Group I
Participants: J. Tarazona (ECHA); N. Carmichael (ECETOC); N. Roth (SCAHT); M. Wilks (SCAHT); L. Aicher (SCAHT); B. Richard (EDF); D. Barcelo (CSIC); P. Grasso (Università Cattolica del Sacro Cuore); T. Vermeire (RIVM); S. Andres (INERIS); D. Demortain (London School of Economics); M. Junghans (Ecotox Center, Dübendorf, Switzerland); R. Fautz (KPSS, Cosmetics Industry, Darmstadt, Germany); A. Charistou (Benaki Phytopathological Institute).
The potential benefits of integrated risk assessment (IRA) were recognized 10-15 years ago, and since then several projects have been initiated, including a joint WHO-IPCS/US-EPA project (2001) and a range of EU FP6 and FP7 projects.
The impact of those initiatives on the way we do risk assessment today has been limited. In the absence of explicit legal mandates, some IRA is happening, but there is no consensus among subject matter experts about the scale of integration achieved. This is partly because there is no harmonized definition and understanding of what IRA means in practice.
For the purpose of our work group we defined IRA in its broadest sense as a concept that evaluates all the known data in order to decide which of them are relevant to solving a particular problem.
We identified that a focus on the integration of human health and environmental risk assessment, including socio-economic analysis of human welfare and ecosystem services, would clearly differentiate HEROIC from previous projects on IRA.
We also assumed that access to the large new data sets generated under the REACH legislation will create a favourable environment to better illustrate the benefits of IRA.
There was consensus that the development of real-life case studies would be the most appropriate way to illustrate the added value of IRA. The group felt that the potential benefits might be greatest in situations where risk assessors are not able to demonstrate the safe use of a product or compound and are looking for a comparative risk assessment. This is independent of chemical classes and initial data situations. IRA might support first-tier screening when only limited data are available, but it might also solve specific questions that arise as part of a higher-tier risk assessment in situations initially considered to be data rich.
In any case, an initial constructive dialogue between risk assessors and risk managers to prioritize risks and define protection goals is indispensable to manage the complexity of today's risk assessment and to drive efficiency. An appropriate problem formulation is therefore an essential first step in IRA.
Work Group II
Participants: S. Tissot (SYNGENTA); D. Kroese (TNO); W. Peijnenburg (University of Leiden); R. Beaudouin (INERIS); E. Mombelli (INERIS); C. Brochot (INERIS); F. Bois (INERIS); G. Schurmann (UFZ); M. Farre (CSIC); A. Péry (INERIS); M. Cronin (University of Liverpool); F. Brion (INERIS); S. Ait Aissa (INERIS); M. Bisson (INERIS).
We first proposed an inventory of the endpoints considered in toxicological and ecotoxicological hazard assessment for the different classes of chemicals (industrial chemicals, biocides, plant protection products, pharmaceuticals, food additives, and cosmetics). We also tried to anticipate which endpoints would be considered in future risk assessment. It is expected that the increased focus on adverse outcome pathways (AOPs), with their biochemical mechanistic understanding, will make today's hazard assessment endpoints obsolete, since those endpoints can be seen as the convergence of many AOPs, except for classification purposes.
We then moved to an inventory of commonly used databases and addressed the questions of data acceptability and accessibility. First, data deviating from test guidelines could be accepted only for the chemical itself; such non-standard results could then be part of a weight-of-evidence approach. For extrapolation between substances, only standard data would be considered. Second, the issue of sharing data is not restricted to data from private companies. Researchers themselves wish to protect access to their data, especially if producing these data has been tedious and costly; an appropriate system of rewards should be developed. Moreover, the availability of data is not sufficient: some work may be necessary to organize the data and put them into a useful format.
Alternative methods to animal testing were also examined. Read-across, which is based on experimental data for similar compounds and on expert judgment to assess similarity, is accepted under REACH, but there is still a lack of protocols and concrete guidance both for industry producing dossiers and for the experts evaluating them. In general, in silico models are questioned more than experimental data; for instance, in vitro tests linked to a mode of action are not questioned, whereas QSAR models are. A committee could be constituted to make recommendations that increase mutual acceptance. Acceptance of alternative methods would improve when combinations of QSARs, experimental data and read-across are used (weight of evidence and integrated testing strategies).
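As a purely illustrative sketch of how one step of a read-across workflow can be organized computationally, the fragment below ranks candidate analogues of a target structure by Tanimoto similarity of Morgan fingerprints and carries over the endpoint value of the closest analogue. The SMILES strings, the NOAEL values and the choice of RDKit fingerprints are assumptions made for this example, not workshop recommendations; expert judgment on similarity remains essential.

    # Minimal read-across sketch (hypothetical data): rank analogues by
    # structural similarity and carry over the endpoint of the closest one.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    target_smiles = "c1ccccc1O"                 # hypothetical target (phenol)
    analogues = {                               # hypothetical analogues: SMILES -> NOAEL (mg/kg/d)
        "c1ccccc1": 200.0,
        "Cc1ccccc1O": 50.0,
        "Oc1ccc(Cl)cc1": 20.0,
    }

    def fingerprint(smiles):
        # Morgan fingerprint (radius 2, 2048 bits) of a structure.
        return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

    target_fp = fingerprint(target_smiles)
    scores = {smi: DataStructs.TanimotoSimilarity(target_fp, fingerprint(smi))
              for smi in analogues}

    best = max(scores, key=scores.get)
    print(f"Closest analogue: {best} (Tanimoto {scores[best]:.2f})")
    print(f"Read-across NOAEL estimate for target: {analogues[best]} mg/kg/d")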
Finally, the question of the statistical analysis of the data was tackled. NOAELs are still widely estimated, despite their inherent statistical problems, and benchmark doses are still rarely used. A reliable estimate of the uncertainty of a test or model output is crucial in hazard assessment. It must be noted that even results from standard reference tests carry uncertainty, and a further source of uncertainty arises when extrapolation is performed.
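To make the contrast with the NOAEL concrete, the following minimal sketch fits a dose-response model to a hypothetical quantal data set and derives a benchmark dose (BMD10, the dose giving 10% extra risk over background). The data, the log-logistic model and the use of scipy are illustrative assumptions, not part of the workshop material.

    # Minimal benchmark-dose (BMD) sketch for quantal data, assuming a
    # log-logistic dose-response model with background incidence.
    import numpy as np
    from scipy.optimize import minimize, brentq

    doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0])   # hypothetical doses (mg/kg/d)
    n = np.array([50, 50, 50, 50, 50])              # animals per dose group
    affected = np.array([1, 2, 5, 18, 40])          # hypothetical incidences

    def prob(d, bg, ed50, slope):
        # Log-logistic response probability on top of background incidence bg.
        d = np.maximum(np.asarray(d, dtype=float), 1e-12)
        return bg + (1.0 - bg) / (1.0 + (ed50 / d) ** slope)

    def neg_log_lik(theta):
        bg, ed50, slope = theta
        p = np.clip(prob(doses, bg, ed50, slope), 1e-9, 1 - 1e-9)
        return -np.sum(affected * np.log(p) + (n - affected) * np.log(1 - p))

    fit = minimize(neg_log_lik, x0=[0.02, 10.0, 1.5],
                   bounds=[(1e-6, 0.5), (1e-3, 1e3), (0.1, 10)])
    bg, ed50, slope = fit.x

    # BMD10: dose producing 10% extra risk over background.
    bmr = 0.10
    def extra_risk(d):
        return (prob(d, bg, ed50, slope) - bg) / (1.0 - bg) - bmr

    bmd10 = brentq(extra_risk, 1e-6, doses.max() * 10)
    print(f"BMD10 ~ {bmd10:.2f} mg/kg/d")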
Work Group socio-economics
Participants: J.M. Brignon (INERIS), L. Frewer (Newcastle University, UK), S. Georgiou (HSE, UK), K. Mattas (Aristotle University of Thessaloniki, Greece), J.P. Vergnaud (Paris School of Economics, France).
We acknowledged that "socio-economic assessment" (SEA) is important to complement RA and to improve transparency and public trust in the risk management of chemicals. However, despite various voices in the EU sharing this view, there are still important gaps, and only few and fragile bridges, between the two fields. By nature, SEA focuses on risks at the population level and on their impacts in terms of human well-being or ecosystem integrity, whereas RA assesses effects at the level of organs and individuals, in humans or in model species. For instance, when the RA community studies the cellular reprotoxic effects of a chemical, SEA is interested in the implications for fertility at the collective level.
Better communication between the two disciplines is therefore a first objective, and the group considered that work on connecting indicators, sharing terminologies, or building "intermediate indicators" between those used by RA (e.g. cancer risk indicators) and SEA (welfare measures) could be a way to progress towards this objective.
Further to this first step, other areas of collaboration between RA and SEA were foreseen
during the discussion:
- SEA can help RA to set priorities and allocate financial resources for its R&D, e.g. to decide on future priorities between cancer, reproductive toxicity, etc. SEA is a set of methodologies able to deal with prioritization issues, taking into account public/stakeholder participation and trust in the process. In this context, however, questions go beyond SEA as a strict discipline and would require the involvement of ethical considerations and specialists.
- RA currently works on a substance-by-substance basis, and SEA can promote moving to groupings that would be meaningful from a socio-economic viewpoint, by looking at groups that are consistent in terms of societal issues and responses. For instance, a more integrated approach to reprotoxic chemicals could be elaborated in this way.
- SEA could also work with RA on the improvement of models and methods in RA; at least two directions were noted by the group:
First, since SEA works on the use and release of chemicals in the economy (use pattern scenarios), social scientists and exposure assessors (e.g. hygienists) could collaborate on setting industrial or domestic use patterns.
Second, RA models such as toxicokinetic models could produce, as outputs, organ damage and a health impact description that can be translated into the need for, the success factors of, and the cost of medical treatment, which is useful for SEA.
A final statement was that R&D in (eco)toxicology and RA should, as far as possible, proceed at the same pace as, and be linked to, R&D in epidemiology, because epidemiological information (experimental or simulated) is key to bringing the risk assessment of chemicals to the socio-economic level.
Work Group III
Participants: T. Backhaus (F+B Env. Cons.); J.-L. Dorne (EFSA); M. Faust (F+B Env. Cons.);
T. Frische (UBA); L. Geoffroy (INERIS); A. Ginebreda (IDAEA-CSIC); M. Grote (EDF); A.
Kortenkamp (Brunel University London); T. Porsbring (JRC); L. Posthuma (RIVM); A. Ragas
(Radboud University Nijmegen); S. Ronga (EDF); C. Tebby (INERIS).
The discussions within work group III (WG 3) on mixture toxicity were organized along four major aims: (i) to get a comprehensive overview of current provisions and approaches for the regulatory risk assessment of chemical mixtures in both the EU and the US, (ii) to
compare different generic schemes for regulatory mixture risk assessments that have been
proposed by governmental organizations, (iii) to identify relevant developing methodologies
that have a potential for regulatory use in the near future, and (iv) to collect expert views on
the possible advantages of the concept of integrated human and environmental risk
assessment (IRA) for chemical mixtures.
In 2009, an overview of existing legal requirements to consider combined effects of chemicals was prepared for the European Commission as part of the State-of-the-Art Report on Mixture Toxicity. The WG 3 participants agreed that this overview needs revision and expansion. The reasons are (i) important changes in the EU legal framework and the corresponding implementation guidelines in recent years, and (ii) restrictions in the scope of the report, which focused on a selection of 21 EU directives and regulations on chemicals and chemical products. In particular, relevant environmental-media-oriented pieces of EU legislation, such as those addressing water quality and ambient air quality, were not covered. The HEROIC work on an updated and comprehensive inventory of existing provisions and approaches will take particular advantage of corresponding efforts at the European Food Safety Authority (EFSA), which is currently preparing a report on the issue that is expected to be published in the next few months.
In 2011, a working group established under the umbrella of the World Health Organization / International Programme on Chemical Safety (WHO/IPCS) published a generic framework for the Risk Assessment of Combined Exposure to Multiple Chemicals. The scheme suggests a tiered approach that starts with the default assumption of dose addition for toxicants co-occurring in an exposure scenario. The WG 3 participants discussed the strengths and weaknesses of this approach in comparison with a variety of other generic schemes developed by different governmental and inter-governmental organizations. It was generally acknowledged that considerable progress has been made in recent years towards the establishment of consistent, consensually acceptable, and practically feasible methodologies for regulatory mixture risk assessments. However, it was also agreed that none of the proposed generic frameworks yet fits all regulatory needs.
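As an illustration of the dose-addition default used in the lower tiers of such frameworks, the minimal sketch below computes a hazard index by summing exposure-to-reference-dose ratios for co-occurring chemicals. The chemical labels, exposure estimates and reference doses are hypothetical placeholders chosen for this sketch.

    # Minimal dose-addition sketch: hazard quotients and hazard index
    # for a hypothetical mixture of three co-occurring chemicals.
    exposures = {"chem_A": 0.02, "chem_B": 0.15, "chem_C": 0.01}        # mg/kg bw/day
    reference_doses = {"chem_A": 0.10, "chem_B": 0.50, "chem_C": 0.05}  # mg/kg bw/day

    hazard_quotients = {c: exposures[c] / reference_doses[c] for c in exposures}
    hazard_index = sum(hazard_quotients.values())

    for c, hq in hazard_quotients.items():
        print(f"{c}: HQ = {hq:.2f}")
    verdict = "potential concern" if hazard_index > 1 else "below level of concern"
    print(f"Hazard index = {hazard_index:.2f} ({verdict})")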
Regulatory approaches to the problem of mixture toxicity fall into three basic categories: experimental whole-mixture testing, component-based mixture toxicity modeling, and safeguarding against unwanted mixture effects by means of assessment factors. The WG 3 participants discussed a number of developing methodologies that may help to improve these "toolboxes". Application of the TTC approach (Threshold of Toxicological Concern) within a framework for mixture toxicity assessment has recently been suggested by the scientific advisory committees of the European Commission, and the approach was considered to have strong potential to become regulatory reality in the near future. The same may apply to the SSD concept (Species Sensitivity Distribution); however, it is limited to ecotoxicological mixture risk assessments and hence of no direct relevance to IRA. Other novel approaches appear very promising from a scientific perspective but were considered far from fit for regulatory purposes. In particular, this applies to the application of so-called omics approaches as well as DEB theory (Dynamic Energy Budget) in assessing the toxic effects of chemical mixtures.
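For readers less familiar with the SSD concept, the following minimal sketch fits a log-normal species sensitivity distribution to hypothetical single-species EC50 values and derives the HC5 (the concentration expected to affect 5% of species) together with a potentially affected fraction at a given concentration. The data and the use of scipy are assumptions made purely for illustration.

    # Minimal SSD sketch: log-normal fit to hypothetical EC50 values,
    # HC5 derivation and potentially affected fraction (PAF).
    import numpy as np
    from scipy import stats

    ec50s = np.array([0.8, 1.5, 2.2, 4.0, 7.5, 12.0, 30.0, 55.0])  # hypothetical EC50s (ug/L)

    log_ec50s = np.log10(ec50s)
    mu, sigma = log_ec50s.mean(), log_ec50s.std(ddof=1)

    # HC5: 5th percentile of the fitted log-normal distribution.
    hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)
    print(f"HC5 ~ {hc5:.2f} ug/L")

    # PAF: fraction of species expected to be affected at a given concentration.
    def paf(conc):
        return stats.norm.cdf(np.log10(conc), loc=mu, scale=sigma)

    print(f"PAF at 5 ug/L ~ {paf(5.0):.0%}")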
The 2001 WHO/UNEP/ILO IPCS report on Integrated Risk Assessment mentioned risks from the exposure of humans and the environment to multiple agents via multiple routes as an important aspect. However, the application of the IRA concept to chemical mixtures was not further worked out in that report. The WG 3 participants agreed that it is necessary to clarify in detail where human and ecological assessments of mixture toxicity can really benefit from each other, and where integration may only produce a counterproductive increase in complexity, time and costs. Starting points for a meaningful integration may be given either by similar exposure situations or by similar modes of combined action. The potential for integrated mixture risk assessments should be explored in detail by means of case studies. To this end, the participants made a range of suggestions. However, there was uncertainty as to whether this might be over-ambitious for a Coordination Action and might require dedicated research projects.
Work Group IV
Participants: R. Bonnard (INERIS), L. Castleleyn (Univ. Leuven), J. Caudeville (INERIS), P.
Ciffroy (EDF), R. Glass (FERA), C. Kairo (IRSN), C. Legind (DTU), R. Smolders (VITO), L.
Sparfel (INSERM), T. Tanaka (INERIS), F. Zeman (INERIS)
The main objective of the discussion was to review and analyse the main criteria that are explicitly or implicitly taken into account when evaluating the 'quality' of an exposure model with respect to the assessment context. To introduce the discussion, a preliminary structured list of criteria was proposed to the participants, distinguishing: (i) Reliability criteria (i.e. the inherent quality of a computed exposure result, relating to the modelling methodology or specification); (ii) Relevance criteria (i.e. the extent to which a modelling tool is appropriate for a particular risk assessment); (iii) Uncertainty criteria; (iv) Practical use of the method/model.
We first reviewed the endpoints considered in toxicological and ecotoxicological exposure assessment, and the associated tools able to provide such endpoints, i.e.: (i) internal exposure endpoints (i.e. concentrations in body matrices), requiring the development of PBPK models and the availability of physiological data, as well as of biomonitoring data for validation. The main gaps identified are the scarcity of data on target organs and the scarcity of data for some age classes. Besides, internal exposure assessment is of interest only if 'Equivalent Biomonitoring Reference Doses' are defined in parallel, providing concentration thresholds for contaminants in human tissues (instead of thresholds in environmental media only); (ii) ingestion/inhalation intakes, requiring the development of multimedia models and the availability of data on human behavior (a minimal intake calculation of this kind is sketched after this paragraph). The main gaps identified are the difficulty of capturing individual exposure, the difficulty of identifying the origin of food products, and the scarcity of information on some pathways (e.g. soil ingestion); (iii) concentrations in food products. Even if some convergence can be found between Human Exposure Assessment (HEA) and Environmental Risk Assessment (ERA), it is not obvious that the same multimedia models can be used for both, because the same wording can cover different realities (e.g. 'fish' means fish muscle only for HEA but the whole organism for biota in ERA); (iv) concentrations in water, air and/or soil, requiring dispersion models and monitoring databases; common databases and dispersion models could potentially be used for HEA and EEA; (v) spatial qualitative maps of contaminated vs non-contaminated areas.
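The sketch below, referenced in item (ii) above, shows the kind of aggregated ingestion/inhalation intake calculation that multimedia exposure models produce. The pathways, concentrations, intake rates and body weight are hypothetical placeholders, not values discussed at the workshop.

    # Minimal aggregated intake sketch: average daily dose summed over
    # hypothetical exposure pathways (concentration x intake rate / body weight).
    pathways = {
        # pathway: (concentration, intake rate)
        "drinking_water": (0.002, 2.0),     # mg/L, L/day
        "soil_ingestion": (5.0, 5e-5),      # mg/kg soil, kg/day
        "inhalation":     (1e-4, 15.0),     # mg/m3, m3/day
        "vegetables":     (0.01, 0.3),      # mg/kg food, kg/day
    }
    body_weight = 70.0  # kg

    daily_dose = {p: c * ir / body_weight for p, (c, ir) in pathways.items()}
    total_dose = sum(daily_dose.values())

    for p, d in sorted(daily_dose.items(), key=lambda kv: -kv[1]):
        print(f"{p:15s}: {d:.2e} mg/kg bw/day ({d / total_dose:.0%} of total)")
    print(f"Total average daily dose: {total_dose:.2e} mg/kg bw/day")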
We then discussed the standardisation/validation process of tools dedicated to exposure assessment for humans and the environment, distinguishing: (i) field validation (i.e. comparison between predicted and monitored data). Given the complexity of exposure pathways, validation is generally possible only for sub-models, not for the complete aggregated model; besides, it is difficult to evaluate the contribution of background and/or past contamination levels; (ii) benchmarking exercises (i.e. blind comparison of models on common scenarios). Strictly speaking, such exercises increase the plausibility of the tools rather than their validity; to our knowledge, however, no blind benchmarking exercises have been carried out so far for 'full-chain' modelling scenarios (only exercises on sub-models or specific cases); (iii) numerical validation (i.e. consistency between the user guide and the software, verification of numerical schemes); (iv) documentation and transparency of tools. Some participants noted that exposure models are sometimes misused by end-users because of poor information on their weaknesses and applicability domain (multimedia models being partly informed by other disciplines such as QSAR modelling); (v) relevance to the aim (e.g. required spatial and temporal resolution, steady-state vs dynamic scenario, analytical vs numerical solutions, mechanistic vs regression approach, deterministic vs probabilistic approach, etc.).
We then tried to classify the types of uncertainty associated with exposure assessment, distinguishing: (i) Scenario uncertainty, including uncertainty about emissions and target behaviour (e.g. population composition and behaviour, spatial and time scales, etc.); (ii) Model uncertainty, including whether pathways and processes are represented or not, and consensus-based vs alternative approaches; (iii) Parametric uncertainty. It is necessary here to distinguish knowledge uncertainty from natural variability; probabilistic approaches are available for accounting for parametric uncertainty, but they require parameter distributions that are not easily derived because of the scarcity and/or heterogeneity of data; (iv) Measurement uncertainty, including analytical imprecision and detection limits, spatial resolution, and representativeness of the population.
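As an illustration of such a probabilistic treatment of parametric uncertainty, the minimal sketch below propagates assumed parameter distributions through a simple drinking-water intake calculation by Monte Carlo simulation. The distributions and their parameters are illustrative assumptions only, not recommended values.

    # Minimal Monte Carlo sketch: propagate assumed parameter distributions
    # through an intake calculation and report percentiles of the dose.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    conc = rng.lognormal(mean=np.log(0.002), sigma=0.5, size=n)          # mg/L in water
    intake_rate = rng.normal(loc=2.0, scale=0.4, size=n).clip(min=0.1)   # L/day
    body_weight = rng.normal(loc=70.0, scale=12.0, size=n).clip(min=30.0)  # kg

    daily_dose = conc * intake_rate / body_weight  # mg/kg bw/day

    p5, p50, p95 = np.percentile(daily_dose, [5, 50, 95])
    print(f"Median dose: {p50:.2e} mg/kg bw/day (5th-95th percentile: {p5:.2e} - {p95:.2e})")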
A special focus was then placed on mixtures of chemicals in exposure assessment. Two alternatives were considered: (i) the top-down approach for retrospective studies, in which the observed effect(s) define the priority (classes of) pollutants whose exposure is to be investigated; (ii) the bottom-up approach for predictive studies, in which the mixture of pollutants to be investigated in exposure assessment is driven by further interactions in effect assessment. An acceptable assumption is that no interaction between (micro-)pollutants occurs with respect to environmental fate. Some substances can also be used as tracers of mixtures.
Finally, common data, models and transversal approaches for both HEA and EEA were discussed, including emission data, environmental concentration and fate data, dispersion and bioaccumulation models, a common uncertainty framework and, conceptually, top-down or bottom-up approaches.