
Models, Uncertainty and Sensitivity
Andrea Saltelli,
European Commission,
Joint Research Centre
andrea.saltelli@jrc.it
ECOINFORMATICS meeting
US Environmental Protection Agency,
Research Triangle Park, North Carolina,
April 2008
On uncertainty – 1
"That is what we meant by science. That both
question and answer are tied up with uncertainty,
and that they are painful. But that there is no way
around them. And that you hide nothing; instead,
everything is brought out into the open".
Borderliners,
Peter Høeg,
Delta publisher,
1995
On uncertainty – 2
Hazy reasoning
behind clean air
David Goldston,
Nature 452,
3 April 2008
‘Science alone
can’t determine
how regulations
are written’
[…] EPA’s science panel
found that “quantitative
evidence […] must … be
characterized as having
high uncertainties.” What
to do in the face of
uncertainty is a policy
question, not a scientific
question. […] The debate
is about […] what kinds
of uncertainty can be
tolerated as a basis for
decision-making.
On uncertainty – 3
How to play
uncertainties in
environmental
regulation …
Source: Scientific American, June 2005, Vol. 292, Issue 6
- Fabrication (and politicisation) of uncertainty
The example of the US Data Quality Act and of the OMB “Peer Review and Information Quality” bulletin, which “seemed designed to maximize the ability of corporate interests to manufacture and magnify scientific uncertainty”.
About the OFFICE OF MANAGEMENT
AND BUDGET (OMB) Proposed Risk
Assessment Bulletin (January 9, 2006)
http://www.whitehouse.gov/omb/inforeg/
‘OMB under attack by US legislators and
scientists’
“Main Man. John
Graham has led the
White House mission
to change agencies'
approach to risk”
ibidem in Nature
“The aim is to bog the process
down, in the name of
transparency” (Robert Shull).
Source: Colin Macilwain, Safe and
sound? Nature, 19 July 2006.
The critique of
models and what
sensitivity analysis
has to do
with it
Jared Diamond’s ‘Collapse’ versus
Michael Crichton’s ‘State of Fear’
Rising sea level will threaten
“… cities of the United Kingdom
(e.g. London), India, Japan and
the Philippines.”, p. 493.
Michael Crichton presents
‘adversarial’ opinion on
retreating glaciers and
thickness of Antarctic ice
cap – and contends that sea
levels are not rising.
“They talk as if simulation
were real-world data. They’re
not. That’s a problem that has
to be fixed. I favor a stamp:
WARNING: COMPUTER
SIMULATION – MAY BE
ERRONEOUS and
UNVERIFIABLE. Like on
cigarettes […]” p. 556
Modelling is today subject to an
unprecedented critique, which is no
longer limited to post-modern
philosophers but involves intellectuals
and scientists of different political
hues.
Have models fallen from grace?
Useless Arithmetic: Why
Environmental Scientists Can't
Predict the Future
by Orrin H. Pilkey and Linda
Pilkey-Jarvis
‘Quantitative mathematical
models used by policy makers
and government
administrators to form
environmental policies are
seriously flawed’
One of the examples discussed concerns the
Yucca Mountain repository for radioactive waste
disposal, where a very large model called TSPA
(for total system performance assessment) is
used to guarantee the safe containment of the
waste. TSPA is composed of 286 sub-models.
TSPA (like any other model) relies on
assumptions -- a crucial one being
the low permeability of the geological
formation and hence the long time
needed for the water to percolate
from the desert surface to the level of
the underground disposal.
Evidence was produced which could lead to an
upward revision of the water permeability by 4
orders of magnitude (the ³⁶Cl story).
 The narratives:
‘How bad is the modeling that
supports the Department of Energy's
assertions about the safety and
permanency of the Yucca Mountain
nuclear waste dump? Execrable,
according to legendary Duke
University geologist Orrin Pilkey and
his geologist daughter, Linda Pilkey-Jarvis,
who works for the Washington state ecology department.’
Ken Maize Power Blog
We just can’t predict, concludes
N. N. Taleb, and we are victims
of the ludic fallacy, of delusion
of uncertainty, and so on.
Modelling is just another attempt
to ‘Platonify’ reality …
Nassim Nicholas
Taleb, The
Black Swan,
Penguin,
London 2007
Many will disagree with
Pilkey and Taleb.
Yet, stakeholders and media
alike expect instrumental
use of models, amplification
or dampening of uncertainty
as a function of convenience
and so on.
The IFPRI had raised
about $460,000 for
the modeling, which
would have provided
insights to help
policymakers […]
[… ] But Greenpeace’s Haerlin and others
objected that the models were not
“transparent”.
Source: Dueling visions for a hungry world, Erik Stokstad
The critique of models
The nature of models, after Rosen:
[Diagram: the natural system N and the formal system F, each with its own internal entailment; an encoding arrow maps N into F and a decoding arrow maps F back onto N.]
The critique of models
After Robert Rosen, 1991, the “World” (the
natural system) and the “Model” (the formal
system) are each internally entailed, driven by a
causal structure [efficient, material and final for the
‘world’; formal for the ‘model’].
“World” and “Model” do not entail one another;
their association is hence the result of
craftsmanship.
The critique of models
George M. Hornberger, hydrogeologist, 1981
Naomi Oreskes, historian, 1994
Jean Baudrillard, philosopher, 1999
…
Just philosophy? Maybe not:
A headline during the RIVM media scandal (1999):
“RIVM over-exact prognoses based on virtual
reality of computer models”
Jeroen van der Sluijs
Other Newspaper headlines:
Environmental institute lies and deceits
Fuss in parliament after criticism on environmental
numbers
The bankruptcy of the environmental numbers
Society has a right to fair information; RIVM does not
provide it
Science for the post normal age is discussed
in Funtowicz and Ravetz (1990, 1993, 1999)
mostly in relation to Science for policy use.
Jerry
Ravetz
Silvio Funtowicz
Post Normal Science
Remark: in the Post Normal
Science diagram, increasing
stakes go hand in hand with
increasing uncertainty
Funtowicz and Ravetz, Science for the
Post Normal age, Futures, 1993
Jerry
Ravetz
GIGO (Garbage In,
Garbage Out)
Science - where
uncertainties in
inputs must be
suppressed lest
outputs become
indeterminate
The critique of models <-> Sensitivity
Peter Kennedy, A Guide to
Econometrics
One of the ten commandments of applied
econometrics according to Peter
Kennedy:
“Thou shalt confess in the presence of
sensitivity.
Corollary: Thou shalt anticipate criticism.”
When reporting a
sensitivity analysis,
researchers should
explain fully their
specification search so
that the readers can
judge for themselves
how the results may
have been affected.
Sensitivity
Definition. The study of how uncertainty in the
output of a model (numerical or otherwise) can be
apportioned to different sources of uncertainty in the
model input.
A related practice is ‘uncertainty analysis’, which
focuses rather on quantifying uncertainty in model
output.
The two should be run in tandem.
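As an illustrative sketch of running the two in tandem (the toy model, its coefficients and the sample sizes below are hypothetical choices, not any model discussed in this talk): uncertainty analysis propagates the input uncertainty into a distribution for the output Y, and a brute-force estimate of the first-order index S_i = Var(E[Y|X_i]) / Var(Y) apportions the output variance among the inputs.

```python
import random
import statistics

def model(x1, x2, x3):
    # Hypothetical linear toy model, so the true indices are known analytically
    return 4.0 * x1 + 2.0 * x2 + 1.0 * x3

random.seed(0)

# Uncertainty analysis: propagate U(0,1) uncertainty on each input
# into a distribution for the output Y
ys = [model(random.random(), random.random(), random.random())
      for _ in range(20000)]
var_y = statistics.pvariance(ys)

# Sensitivity analysis: apportion Var(Y) to each input with the
# first-order index S_i = Var(E[Y|X_i]) / Var(Y)
def first_order_index(i, n_outer=200, n_inner=200):
    cond_means = []
    for _ in range(n_outer):
        xi = random.random()                  # fix input i at a sampled value
        acc = 0.0
        for _ in range(n_inner):              # average over the other inputs
            x = [random.random() for _ in range(3)]
            x[i] = xi
            acc += model(*x)
        cond_means.append(acc / n_inner)
    return statistics.pvariance(cond_means) / var_y

S = [first_order_index(i) for i in range(3)]  # true values: 16/21, 4/21, 1/21
```

For an additive model like this one the first-order indices sum to about 1; for non-additive models they sum to less, the remainder being due to interactions.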
In sensitivity analysis:
Type I error: assessing as important a factor
that is not important
Type II error: assessing as not important a
factor that is important
Type III error: analysing the wrong problem
Type III in sensitivity analysis, an example:
• In the case of TSPA (Yucca Mountain) a range of 0.02
to 1 millimetre per year was used for the percolation
flux rate. Applying sensitivity analysis to TSPA might or
might not identify this as a crucial factor, but this
would be of little use if the value of the percolation
flux were later found to be of the order of 3,000
millimetres per year.
Prescriptions for sensitivity analysis
EPA’s 2004 guidelines on modelling
Models Guidance Draft - November 2003 Draft Guidance on the
Development, Evaluation, and Application of Regulatory
Environmental Models Prepared by: The Council for Regulatory
Environmental Modeling, http://cfpub.epa.gov/crem/cremlib.cfm
CREM Prescriptions for sensitivity analysis
“methods should preferably be able to
(a) deal with a model regardless of
assumptions about a model’s linearity and
additivity;
(b) consider interaction effects among input
uncertainties; and
(c) … and so on
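Point (b) can be illustrated with a minimal sketch (the two-input multiplicative model and all sample sizes below are hypothetical choices, not one of the CREM cases): for a purely multiplicative model the first-order indices do not sum to one, and the shortfall 1 − (S1 + S2) is the share of output variance due to interaction, which any method assuming additivity would miss.

```python
import random
import statistics

def model(x1, x2):
    # Purely multiplicative: the effect of x1 on the output depends on x2
    return x1 * x2

random.seed(1)

# Total output variance under independent U(0,1) inputs
ys = [model(random.random(), random.random()) for _ in range(50000)]
var_y = statistics.pvariance(ys)

# First-order index S_i = Var(E[Y|X_i]) / Var(Y), brute-force estimate
def first_order_index(i, n_outer=400, n_inner=400):
    cond_means = []
    for _ in range(n_outer):
        xi = random.random()              # fix input i at a sampled value
        acc = 0.0
        for _ in range(n_inner):          # average over the other input
            x = [random.random(), random.random()]
            x[i] = xi
            acc += model(*x)
        cond_means.append(acc / n_inner)
    return statistics.pvariance(cond_means) / var_y

s1 = first_order_index(0)
s2 = first_order_index(1)
interaction_share = 1.0 - (s1 + s2)      # true value 1/7 for this model
```

A sensitivity method restricted to first-order (additive) effects would attribute only s1 + s2 ≈ 6/7 of the variance here and silently drop the interacting remainder.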
CREM prescriptions are good.
We at the JRC work on practices
that take them into proper
account.
What these practices have in
common is the aspiration to
tackle the curse of
dimensionality.
Want to know more?
Buy our book!
GLOBAL SENSITIVITY
ANALYSIS. The primer
John Wiley & Sons,
2008
Sensitivity analysis and the White House
In the US, the Proposed Risk
Assessment Bulletin
mentioned before also puts
forward prescriptions for
sensitivity analysis.
4. Standard for Characterizing Uncertainty
Influential risk assessments should characterize
uncertainty with a sensitivity analysis and, where
feasible, through use of a numeric distribution
[…] Sensitivity analysis is particularly useful in
pinpointing which assumptions are appropriate
candidates for additional data collection to narrow the
degree of uncertainty in the results. Sensitivity analysis
is generally considered a minimum, necessary
component of a quality risk assessment report.
Source: OFFICE OF MANAGEMENT AND
BUDGET
Proposed Risk Assessment Bulletin
(January 9, 2006)
http://www.whitehouse.gov/omb/inforeg/
The OMB about
transparency
http://www.whitehouse.gov/omb/inforeg/
The primary benefit of public transparency is not necessarily that errors in
analytic results will be detected, although error correction is clearly valuable.
The more important benefit of transparency is that the public will be
able to assess how much an agency’s analytic result hinges on the
specific analytic choices made by the agency. Concreteness about
analytic choices allows, for example, the implications of alternative technical
choices to be readily assessed. This type of sensitivity analysis is widely
regarded as an essential feature of high-quality analysis, yet sensitivity
analysis cannot be undertaken by outside parties unless a high degree
of transparency is achieved. The OMB guidelines do not compel such
sensitivity analysis as a necessary dimension of quality, but the transparency
achieved by reproducibility will allow the public to undertake sensitivity
studies of interest.
Source: Federal Register, Friday, February 22, 2002, Part IX
Office of Management and Budget
Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by Federal Agencies; Notice; Republication
http://www.whitehouse.gov/omb/inforeg/
Conclusions: Role of sensitivity analysis
(in a post-normal science context)
• Good practice and due diligence (e.g. test models,
obtain parsimonious model representations …)
• Check if policy options are distinguishable given
the uncertainties
• Contribute to the pedigree of the assessment
• Falsify an analysis and/or make sure that you are
not falsified
• Falsify the analysis (Popperian demarcation):
* ‘Scientific mathematical modelling should
involve constant efforts to falsify the model’
(Pilkey and Pilkey-Jarvis, op. cit.)
** Fight ‘the white swan syndrome’ (Nassim
N. Taleb, 2007)