Stochastic Methods
The Power of Numbers
Presented by
Roger M. Hayne, PhD, FCAS, MAAA
CAS Spring Meeting
16-18 June 2008
Quebec City, Quebec
Why Bother with Stochastic Methods?
 We all know stochastic methods:
– Are complicated
– Are black boxes
– Leave no room for actuarial judgment
– Are impossible to describe
– Take way too long to implement
– Take far more data than we could ever imagine obtaining
– Don’t answer the interesting questions
– Bottom line, are inconvenient and too complex for their own good
 Well, some of you might know that; others (yours truly included)
need to be convinced
What is A Stochastic Actuarial Model?
 Many definitions – let’s use: a simplified statement about one
or more aspects of a loss process that explicitly includes the
potential for random effects
 Two main features
– Simplification
– Explicit incorporation of random effects
 Both important and both informative
 In effect it is a statement about all possible outcomes along with
their relative likelihood of occurring; that is, a statement of the
distribution of outcomes and not just a single “selection”
Why is This Important?
 Consider the following very simple triangle of age-to-age loss
development factors:

AY      12     24     36     48
2004    1.20   1.10   1.02   ??
2005    1.25   1.08   ??     ??
2006    1.15   ??     ??     ??
2007    ??     ??     ??     ??
 Simple chain ladder method:
– First pick a “typical” number for each column
– Square the triangle with those numbers
 Not a stochastic model, though it is a simplified statement of the
loss process
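For concreteness, here is a minimal sketch of that recipe applied to the factor triangle above. The simple column average and the tail factor of 1.00 are illustrative assumptions, not choices the presentation prescribes.

```python
# A minimal sketch of the chain ladder "recipe": pick a typical
# age-to-age factor per column, then square the factor triangle.
import numpy as np

# Observed age-to-age factors by accident year; np.nan marks "??" cells.
factors = np.array([
    [1.20, 1.10, 1.02, np.nan],      # AY 2004
    [1.25, 1.08, np.nan, np.nan],    # AY 2005
    [1.15, np.nan, np.nan, np.nan],  # AY 2006
    [np.nan] * 4,                    # AY 2007
])

# "Pick a typical number for each column": here a simple column average.
picks = np.nanmean(factors[:, :3], axis=0)
# The 48-to-ultimate column has no data at all; a tail factor of 1.00
# is a pure judgment call, assumed here only for illustration.
picks = np.append(picks, 1.00)

# "Square the triangle with those numbers": fill each ?? with the pick.
squared = np.where(np.isnan(factors), picks, factors)
print(np.round(squared, 3))
```

Note there is one number per cell and no statement anywhere about how far reality might wander from it.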
Traditional “Deterministic” Approaches
 Chain ladder – pick factors thought to be representative of each
column
 What happens “next year” when new information available?
– Often entire exercise is repeated “afresh”
– Sometimes we ask “what did we pick last year?”
 If “actual” varies “too much” from “expected” then we might
reevaluate the “expected”
 How much is “too much” is often dictated by experience with the
line of business or particular book being reviewed
 That indefinable quality – “actuarial judgment”
5
July 26, 2016
Let’s Parse The Traditional
 Start out with the chain ladder recipe, i.e. a “model”
 We pick “selections” that are somehow representative of a
particular age
 Experience and “actuarial judgment” often inform us as to what
we expect to see (e.g. auto physical damage = stable, umbrella
= volatile)
 Wait a minute – we have a simplified statement about the loss
process and an implicit statement about random fluctuation
 The traditional is almost stochastic already!
 Why not write down the recipe and the expectation of randomness
explicitly?
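One way to do exactly that is to restate the recipe as an explicit distributional statement per column. The lognormal form below is a hypothetical choice for illustration, not one the presentation prescribes.

```latex
% Hypothetical explicit form of the chain ladder recipe: the
% age-to-age factor f_{i,j} for accident year i and development
% age j is lognormal with column-specific parameters.
\[
  \ln f_{i,j} \sim N\!\left(\mu_j, \sigma_j^2\right)
\]
% The traditional "selection" becomes an estimate of \mu_j, and the
% informal sense of "stable" (auto physical damage) vs. "volatile"
% (umbrella) becomes the explicit parameter \sigma_j.
```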
More Info in a Stochastic Context
 Stochastic approaches gain significant advantages over
deterministic ones from many sources
– The practitioner is forced to state his/her assumptions explicitly
– Not only will a good model give projections, it will also give
estimates of intermediate data points along the way – we can
measure next year’s actual vs. expected
 Parametric models have some advantages too
– They allow for extrapolation beyond the observed data under the
assumptions of the model
– Good methods for estimating the model parameters also provide
estimates of how volatile those parameters themselves are
• Maximum likelihood
• Bayesian
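As a toy illustration of that last point: fitting the lognormal factor model sketched earlier to the 12-month column of the triangle by maximum likelihood gives both the parameter estimate and a measure of how volatile that estimate itself is.

```python
# Hedged sketch of "estimation with uncertainty" via maximum likelihood
# for the assumed lognormal factor model.
import numpy as np

# Log age-to-age factors from the 12-month column of the triangle.
log_f = np.log(np.array([1.20, 1.25, 1.15]))

mu_hat = log_f.mean()                    # MLE of mu_12
sigma_hat = log_f.std(ddof=1)            # (bias-corrected) estimate of sigma_12
se_mu = sigma_hat / np.sqrt(len(log_f))  # volatility of mu_hat itself

print(f"mu_12 = {mu_hat:.4f} +/- {se_mu:.4f}  (sigma_12 = {sigma_hat:.4f})")
```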
We May Never Pass This Way Again
 Two schools of statistical thought
– Frequentist
– Bayesian
 Two distinct approaches in dealing with uncertainty
– Frequentist makes the most sense with repeatable experiments
– Bayesian attempts to incorporate prior experience in a rational,
rigorous fashion
 Actuarial problems usually do not relate to repeatable
experiments, unless you use the dice example…
 Actuarial judgment is essentially a Bayesian “prior distribution”
 Bayesian prior is also a way to handle model uncertainty
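A minimal sketch of judgment-as-prior, assuming the lognormal factor model and a conjugate normal prior; all numbers below are invented for illustration.

```python
# Conjugate normal-normal updating of mu_12: "actuarial judgment"
# enters as the prior, the triangle enters as the data.
import numpy as np

# Judgment prior: "this book usually develops around 1.18 at 12
# months, and I am fairly confident of that."  (Invented numbers.)
prior_mu, prior_sd = np.log(1.18), 0.05

# Observed log factors; sigma is treated as known for simplicity.
x = np.log(np.array([1.20, 1.25, 1.15]))
sigma, n = 0.05, 3

# Standard conjugate update for a normal mean with known variance.
post_prec = 1 / prior_sd**2 + n / sigma**2
post_mu = (prior_mu / prior_sd**2 + x.sum() / sigma**2) / post_prec
post_sd = np.sqrt(1 / post_prec)

print(f"posterior factor ~ {np.exp(post_mu):.3f}, sd(log) = {post_sd:.4f}")
```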
All Models are Wrong …
 The banking sector has “sophisticated” risk models setting
capital to be adequate at very high (well above the 99th) percentiles
 All is fine … until something like the “subprime crisis” comes
along
 But the models were well founded and based on “considerable”
data
 Think about it – using 10 years of data to estimate a 1-in-1,000-
year, or even a 1-in-100-year, event really does not make a
whole lot of sense
 The only way to extrapolate from such data is to assume an
underlying parametric model and assume that you can
extrapolate with it
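A quick simulation makes the point concrete. Even assuming the "true" process is lognormal and fitting that same (correct!) parametric model to ten observations, the extrapolated 1-in-1,000 quantile swings wildly from sample to sample; the setup is illustrative, not from the presentation.

```python
# Fit a lognormal to 10 "years" of data, extrapolate the 1-in-1,000
# quantile, and repeat to see how unstable the answer is.
import numpy as np

rng = np.random.default_rng(2008)
Z_999 = 3.0902                       # standard normal 99.9th percentile
true_q = np.exp(0.0 + 1.0 * Z_999)   # true 99.9th pct of lognormal(0, 1)

estimates = []
for _ in range(1000):
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=10)
    logs = np.log(sample)
    # Parametric extrapolation: plug the fitted parameters into the
    # lognormal quantile formula.
    estimates.append(np.exp(logs.mean() + logs.std(ddof=1) * Z_999))

estimates = np.array(estimates)
print(f"true: {true_q:.1f}, 5th-95th pct of estimates: "
      f"{np.percentile(estimates, 5):.1f} - {np.percentile(estimates, 95):.1f}")
```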
Model Uncertainty
 As mentioned before, a good parameter estimation method also
gives an estimate of the uncertainty in the parameter estimates
within that model
 The subprime issue was not one of parameter estimation but
one of model misspecification
 Traditional methods have long recognized this problem and
addressed it by using several forecasting techniques
 At the end of the day an actuary “selected” his/her “estimate”
based on the projections of the various models – stochastically
speaking, he/she calculated an expected-value “forecast” using
weights (probabilities) determined by “actuarial judgment”
 Thus there was a Bayesian prior dealing with model uncertainty
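In code, that traditional weighting is a one-liner; the forecasts and weights below are made-up illustrations of the idea, not figures from the presentation.

```python
# Judgment-based weights over several methods' projections, read as a
# prior distribution over models.
import numpy as np

forecasts = np.array([10.2e6, 11.5e6, 9.8e6])  # e.g. chain ladder, BF, freq/sev
weights = np.array([0.5, 0.3, 0.2])            # "actuarial judgment" prior

selected = weights @ forecasts   # the traditional "selection" as an
print(f"{selected:,.0f}")        # expected value over models
```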
More is Better
 Stochastic methods can be thought of as extensions of
traditional approaches that can:
– Be based on the same recipes as traditional methods
– Give rigor in “making selections,” avoiding the ever-present
temptation to “throw out that point – it is an obvious outlier”
– Provide more information as to the distribution of outcomes within
the scope of the particular model
– Provide more information as to how well the model fits reality
– Be evolutionary and evolve as the data indicate
– Be adapted to recognize “actuarial judgment” as well as a
multiplicity of potential models
 All in all stochastic reserving models can give you everything
that traditional methods do and much, much more
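A closing illustration of "distribution, not selection": under the explicit lognormal recipe, squaring the triangle yields a distribution of 12-to-48 development rather than a single number. All parameters below are assumptions for the sketch.

```python
# Simulate the 12-to-48 cumulative development factor under the
# lognormal column model instead of picking one value.
import numpy as np

rng = np.random.default_rng(42)
mus = np.log([1.20, 1.09, 1.02])       # picks for the 12, 24, 36 columns
sigmas = np.array([0.04, 0.02, 0.01])  # explicit volatility by column

# Each simulated path multiplies one random factor per column.
paths = np.exp(rng.normal(mus, sigmas, size=(10_000, 3))).prod(axis=1)

print(f"mean {paths.mean():.3f}, 90% interval "
      f"[{np.percentile(paths, 5):.3f}, {np.percentile(paths, 95):.3f}]")
```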