Sensitivity Analysis

Jake Blanchard
Fall 2010
Introduction
Sensitivity analysis = the study of how uncertainty in the output of a model can be apportioned to the different input parameters
Local sensitivity = focus on sensitivity at a particular set of input parameters, usually using gradients or partial derivatives
Global or domain-wide sensitivity = consider the entire range of inputs

Typical Approach

Consider a Point Reactor Kinetics
problem
$\frac{dP}{dt} = \frac{\rho_0 - \beta}{\Lambda}\,P(t) + \lambda\,C(t)$
$\frac{dC}{dt} = \frac{\beta}{\Lambda}\,P(t) - \lambda\,C(t)$
$P(0) = P_0 = 1, \qquad C(0) = \frac{\beta P_0}{\lambda \Lambda}, \qquad \lambda = 0.08$
[Figure: P(t) vs. time (s), 0 to 3 s, comparing the baseline solution with one input increased by 50%]
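A minimal numerical sketch of this problem in Python, using SciPy's `solve_ivp`. Only Λ = 0.001 s and P₀ = 1 come from the slides; the values of β and ρ₀, and the reading of 0.08 as the precursor decay constant λ, are assumptions for illustration.

```python
# Minimal sketch of the one-delayed-group point kinetics problem above.
# Lambda = 0.001 s and P0 = 1 come from the slides; beta, rho0, and reading
# 0.08 as the precursor decay constant lambda are illustrative assumptions.
from scipy.integrate import solve_ivp

beta = 0.0065     # delayed neutron fraction (assumed)
lam  = 0.08       # precursor decay constant (assumed interpretation of 0.08)
Lam  = 0.001      # mean neutron lifetime, s
rho0 = 0.001      # step reactivity (assumed)
P0   = 1.0

def rhs(t, y):
    P, C = y
    dP = (rho0 - beta) / Lam * P + lam * C
    dC = beta / Lam * P - lam * C
    return [dP, dC]

C0  = beta * P0 / (lam * Lam)              # equilibrium precursor level
sol = solve_ivp(rhs, (0.0, 3.0), [P0, C0], method="Radau", rtol=1e-8)
print(sol.y[0, -1])                        # P at t = 3 s
```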
Results
P(t) normalized to P0
 Mean lifetime normalized to baseline
value (0.001 s)
 t=3 s

[Figure: relative change in P(t) (scale ×10⁻³) vs. relative change in the mean lifetime, −0.1 to +0.15]
Results
P(t) normalized to P0
 Mean lifetime normalized to baseline
value (0.001 s)
 t=0.1 s

[Figure: relative change in P(t) vs. relative change in the mean lifetime, −0.1 to +0.15, for t = 0.1 s]
Putting all on one chart – t=0.1 s
[Figure: dimensionless variation in P(t) vs. dimensionless variation in input variable, one curve per input, for t = 0.1 s]
Putting all on one chart – t=3 s
[Figure: dimensionless variation in P(t) vs. dimensionless variation in input variable, one curve per input, for t = 3 s]
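These one-at-a-time curves can be generated by perturbing each input about its baseline and recording the relative change in P(t). A sketch of that sweep in Python; the baseline values other than Λ = 0.001 s, and the choice of parameter set, are illustrative assumptions.

```python
# One-at-a-time sweep behind these charts: perturb each input by a relative
# amount, re-solve the point kinetics model, and record the relative change
# in P(t). Baseline values other than Lambda = 0.001 s are assumptions.
from scipy.integrate import solve_ivp

def solve_P(rho0, beta, lam, Lam, t_end):
    """Return P(t_end) for the one-delayed-group point kinetics model."""
    def rhs(t, y):
        P, C = y
        return [(rho0 - beta) / Lam * P + lam * C,
                beta / Lam * P - lam * C]
    y0 = [1.0, beta / (lam * Lam)]           # P0 = 1, equilibrium precursors
    return solve_ivp(rhs, (0.0, t_end), y0, method="Radau", rtol=1e-8).y[0, -1]

baseline = dict(rho0=0.001, beta=0.0065, lam=0.08, Lam=0.001)   # assumed
P_base = solve_P(t_end=3.0, **baseline)

for name in baseline:                        # vary one input at a time
    for d in (-0.1, -0.05, 0.05, 0.1):       # dimensionless variation in input
        pars = dict(baseline, **{name: baseline[name] * (1 + d)})
        rel = solve_P(t_end=3.0, **pars) / P_base - 1.0
        print(f"{name} {d:+.2f}: relative change in P(3 s) = {rel:+.5f}")
```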
Quantifying Sensitivity
To first order, our measure of sensitivity is
the gradient of an output with respect to
some particular input variable.
 Suppose all variables are uncertain and

$Y = C_s P_s + C_t P_t + C_j P_j$
(so that, for particular input values, $y = C_s p_s + C_t p_t + C_j p_j$)
Then, if the inputs are independent,
$\sigma_y^2 = C_s^2 \sigma_s^2 + C_t^2 \sigma_t^2 + C_j^2 \sigma_j^2$
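A minimal Monte Carlo check of this propagation formula; the coefficients and input standard deviations below are illustrative assumptions, not values from the slides.

```python
# Monte Carlo check of sigma_y^2 = Cs^2*ss^2 + Ct^2*st^2 + Cj^2*sj^2 for the
# linear model Y = Cs*Ps + Ct*Pt + Cj*Pj with independent inputs.
# All numerical values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
C = np.array([2.0, -1.0, 0.5])        # Cs, Ct, Cj (assumed)
sigma = np.array([0.3, 0.5, 1.0])     # input standard deviations (assumed)

P = rng.normal(0.0, sigma, size=(100_000, 3))   # independent input samples
Y = P @ C

print(Y.var())                    # Monte Carlo estimate of sigma_y^2
print(np.sum(C**2 * sigma**2))    # analytic sigma_y^2 from the formula above
```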
Quantifying Sensitivity

The most obvious calculation of sensitivity is
$S_x = \frac{\partial Y}{\partial P_x}$
This is the slope of the curves we just looked at.
We can normalize about some point ($y^0$):
$y^0 = C_s p_s^0 + C_t p_t^0 + C_j p_j^0$
$S_x = \frac{p_x^0}{y^0}\,\frac{\partial Y}{\partial P_x}$
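As an illustration of this normalized slope, a central-difference sketch in Python for the linear model above; the coefficients and the nominal point are illustrative assumptions.

```python
# Central-difference estimate of the normalized sensitivity
# S_x = (p_x^0 / y^0) * dY/dP_x for the linear model Y = Cs*Ps + Ct*Pt + Cj*Pj.
# Coefficients and nominal point are illustrative assumptions.
import numpy as np

C  = np.array([2.0, -1.0, 0.5])      # Cs, Ct, Cj (assumed)
p0 = np.array([1.0, 2.0, 4.0])       # nominal inputs p_s^0, p_t^0, p_j^0 (assumed)

def Y(p):
    return float(C @ p)

y0 = Y(p0)
for x in range(3):
    h = 1e-6 * p0[x]                           # small relative step
    p_plus, p_minus = p0.copy(), p0.copy()
    p_plus[x] += h
    p_minus[x] -= h
    dYdP = (Y(p_plus) - Y(p_minus)) / (2 * h)  # central difference
    S = p0[x] / y0 * dYdP                      # normalized sensitivity
    print(S, C[x] * p0[x] / y0)                # numeric vs. analytic value
```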
Quantifying Sensitivity
This normalized sensitivity says nothing
about the expected variation in the inputs.
 If we are highly sensitive to a variable
which varies little, it may not matter in
the end
Normalize to the input variances:
$S_x = \frac{\sigma_x}{\sigma_y}\,\frac{\partial Y}{\partial P_x}$
Rewriting…
$S_s = \frac{\sigma_s}{\sigma_y}\,\frac{\partial Y}{\partial P_s} = C_s \frac{\sigma_s}{\sigma_y}, \qquad S_t = C_t \frac{\sigma_t}{\sigma_y}, \qquad S_j = C_j \frac{\sigma_j}{\sigma_y}$
$\sigma_y^2 = C_s^2 \sigma_s^2 + C_t^2 \sigma_t^2 + C_j^2 \sigma_j^2$
$1 = \frac{C_s^2 \sigma_s^2}{\sigma_y^2} + \frac{C_t^2 \sigma_t^2}{\sigma_y^2} + \frac{C_j^2 \sigma_j^2}{\sigma_y^2} = S_s^2 + S_t^2 + S_j^2$
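Pushing the same illustrative numbers from the earlier sketch through these variance-normalized sensitivities shows their squares summing to one.

```python
# Variance-normalized sensitivities S = C * sigma_input / sigma_y for the
# linear model; their squares sum to one. Values are illustrative assumptions.
import numpy as np

C = np.array([2.0, -1.0, 0.5])        # Cs, Ct, Cj (assumed)
sigma = np.array([0.3, 0.5, 1.0])     # input standard deviations (assumed)

sigma_y = np.sqrt(np.sum(C**2 * sigma**2))
S = C * sigma / sigma_y

print(S)                  # S_s, S_t, S_j
print(np.sum(S**2))       # equals 1.0
```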
A Different Approach
Question: If we could eliminate the
variation in a single input variable, how
much would we reduce output variation?
 Hold one input (Px) constant
 Find output variance – V(Y|Px=px)
 This will vary as we vary px
 So now do this for a variety of values of
px and find expected value E(V(Y|Px))
 Note: V(Y)=E(V(Y|Px))+V(E(Y|Px))

Now normalize
$S_x = \frac{V(E(Y \mid P_x))}{V(Y)}$

This is often called the:
◦ importance measure,
◦ sensitivity index,
◦ correlation ratio, or
◦ first order effect
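A brute-force, double-loop Monte Carlo sketch of this first-order effect for the linear model, with the same illustrative coefficients as before; for this linear case the analytic answer is $C_x^2\sigma_x^2 / \sigma_y^2$, which the estimate can be compared against.

```python
# Double-loop Monte Carlo sketch of S_x = V(E(Y|P_x)) / V(Y) for the linear
# model Y = Cs*Ps + Ct*Pt + Cj*Pj with independent normal inputs.
# All numerical values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
C = np.array([2.0, -1.0, 0.5])        # Cs, Ct, Cj (assumed)
sigma = np.array([0.3, 0.5, 1.0])     # input standard deviations (assumed)

def Y(P):                             # P has shape (n, 3)
    return P @ C

V_total = Y(rng.normal(0.0, sigma, size=(200_000, 3))).var()   # V(Y)

N_outer, N_inner = 1000, 1000
for x in range(3):
    cond_means = []
    for _ in range(N_outer):
        P = rng.normal(0.0, sigma, size=(N_inner, 3))
        P[:, x] = rng.normal(0.0, sigma[x])       # hold P_x fixed for this batch
        cond_means.append(Y(P).mean())            # estimate of E(Y | P_x = p_x)
    S_x = np.var(cond_means) / V_total            # V(E(Y|P_x)) / V(Y)
    print(x, S_x, C[x]**2 * sigma[x]**2 / np.sum(C**2 * sigma**2))  # MC vs. analytic
```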
Variance-Based Methods

Assume
$Y = f(x) = f_0 + \sum_{i=1}^{k} f_i(x_i) + \sum_{i}\sum_{j>i} f_{ij}(x_i, x_j) + \ldots + f_{1,2,\ldots,k}(x_1, x_2, \ldots, x_k)$
Choose each term such that it has a mean
of 0
 Hence, f0 is average of f(x)

$f_i(x_i) = E(Y \mid x_i) - f_0$
$f_{ij}(x_i, x_j) = E(Y \mid x_i, x_j) - f_i(x_i) - f_j(x_j) - f_0$
Variance Methods

Since terms are orthogonal, we can
square everything and integrate over our
domain
$V_i = V\big(E(Y \mid x_i)\big) = \int f_i^2(x_i)\,dx_i$
$V_f = \sum_{i=1}^{k} V_i + \sum_{i<j} V_{ij} + \sum_{i<j<k} V_{ijk} + \ldots + V_{1,2,\ldots,k}$
$S_i = \frac{V_i}{V_f}$
$1 = \sum_{i=1}^{k} S_i + \sum_{i<j} S_{ij} + \sum_{i<j<k} S_{ijk} + \ldots + S_{1,2,\ldots,k}$
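As a worked check (not on the slides): for the earlier linear model $Y = C_s P_s + C_t P_t + C_j P_j$ with independent inputs, only the first-order terms survive,

$f_i(P_i) = C_i\,(P_i - \bar{P}_i), \qquad V_i = C_i^2 \sigma_i^2, \qquad V_{ij} = V_{ijk} = \ldots = 0$
$S_i = \frac{C_i^2 \sigma_i^2}{C_s^2\sigma_s^2 + C_t^2\sigma_t^2 + C_j^2\sigma_j^2}$

which is the square of the variance-normalized sensitivity defined earlier.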
Variance Methods
 Si is first order (or main) effect of xi
Sij is a second-order index. It measures the effect of the pure interaction between any pair of input variables
 Other values of S are higher order indices
 “Typical” sensitivity analysis just addresses
first order effects
 An “exhaustive” sensitivity analysis would
address other indices as well
Suppose k=4
$1 = S_1+S_2+S_3+S_4+S_{12}+S_{13}+S_{14}+S_{23}+S_{24}+S_{34}+S_{123}+S_{124}+S_{134}+S_{234}+S_{1234}$
Total # of terms is $4+6+4+1 = 15 = 2^4 - 1$
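More generally, the count of indices follows from the binomial coefficients (a quick check, not from the slides):

$\sum_{m=1}^{k} \binom{k}{m} = 2^k - 1, \qquad k = 4:\ \binom{4}{1}+\binom{4}{2}+\binom{4}{3}+\binom{4}{4} = 4+6+4+1 = 15$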
