© Crichton 2007. Draft paper on flood modelling. Review copy not for publication.
Flood modelling for insurance purposes in the UK
By David Crichton
Abstract
The European Solvency Directive [1] and its forthcoming successor, the European Solvency II
Directive, make it increasingly important that insurance companies take steps to accurately
measure their maximum probable loss for various types of catastrophe events. To date the
biggest loss events in the UK in terms of property damage have been storms and floods, and
climate change projections indicate that changes in the frequency, severity and location of
such events will produce bigger losses in the future.
This paper concentrates on the issues relating to flood losses, but many of the principles can
also be applied to other types of natural catastrophes.
Keywords: Insurance modelling, Reinsurance, Maximum Probable Loss, Catastrophe,
Climate change
Introduction
Insurers need to calculate the maximum probable loss in order to assess the appropriate level
of reinsurance needed and how much they should pay for it. In each case it is important to
analyse the three different components of risk, namely hazard, exposure and vulnerability.
There are basically two ways to model catastrophes: deterministic and probabilistic.
Deterministic modelling
This is the “traditional” method used by insurers for most of the twentieth century, mainly in
the context of fire insurance. It is prudent business practice for the insured to assess how
much cover is needed, usually in the form of a sum insured, while it is good practice for the
insurer to assess the maximum amount of a possible loss in order to ensure that their
solvency will not be threatened. For an individual policy the sum insured is usually the
maximum probable loss, but for a book of policies in a catastrophe scenario, such as an earthquake or major flood event where many properties may be damaged, insurers found it essential to measure their aggregate exposures. This was particularly important for fire
insurers, and from the earliest days of fire insurance, insurers would keep a “street book”
where they would record the location of all of the fire risks they insured. This was an early
form of geographical information system, with every street being listed and with careful
attention to corner sites where there could be adjoining properties with different street
addresses.
Careful recording in the street book could enable insurers to calculate the total accumulations
of exposure in any particular area, in case there was a major fire. There was a particular
problem if inflammable materials were involved, as in the major brandy warehouses clustered
in the centre of the city of Cognac in France, or the whisky warehouses and printing houses in
the centre of Glasgow and Dundee.
A “what if” scenario would be determined, such as “what if a fire breaks out in a particular
building, how far will it spread and what value of property will be damaged?” In such an
example, the underwriter would use the street book and seek to assess the likely fire spread
based on the vulnerability of the property, including proximity of adjoining property and fire
stations etc., and then the cost of the fire based on the value of the total exposure to the risk.
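As an illustrative sketch only (the streets, adjacencies and sums insured below are hypothetical, not from any historical street book), the accumulation and “what if” calculation might look like this:

```python
# A minimal sketch of a "street book" accumulation and a "what if" fire
# scenario. All streets, adjacencies and sums insured are hypothetical.

# Street book: street -> sums insured for the policies recorded there.
street_book = {
    "High Street": [120_000, 85_000, 200_000],
    "Mill Lane": [60_000, 95_000],
    "Quay Road": [150_000],
}

# Adjacency, capturing e.g. corner sites where adjoining properties
# have different street addresses.
adjacent = {
    "High Street": ["Mill Lane"],
    "Mill Lane": ["High Street", "Quay Road"],
    "Quay Road": ["Mill Lane"],
}

def aggregate_exposure(street):
    """Total accumulation of sums insured on one street."""
    return sum(street_book.get(street, []))

def what_if_fire(street, spread_factor=0.5):
    """Deterministic scenario: total loss on the origin street plus an
    assumed fraction of the exposure on adjoining streets."""
    loss = aggregate_exposure(street)
    for neighbour in adjacent.get(street, []):
        loss += spread_factor * aggregate_exposure(neighbour)
    return loss

print(f"Fire starting on Mill Lane: £{what_if_fire('Mill Lane'):,.0f}")
```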
Fire risk management placed great emphasis on measures to compartmentalise the risk, using fire doors, sprinklers and fire-resistant party walls, in order to contain the maximum probable loss.
In this way, it could be assumed that not all properties would be affected and that, for those which were, not all would be total losses. In other words, underwriters assumed that the residual risk of a total loss was so improbable that it could be ignored. To try to quantify such uncertainty in terms of probability in a pre-computer age would have required a substantial effort which could not be justified. They relied instead on experienced fire surveyors who would make subjective judgements based on site inspections.

[1] Directive 2002/13/EC of the European Parliament and of the Council amending Council Directive 73/239/EEC as regards the solvency margin requirements for non-life insurance undertakings.
Thus the “Maximum Probable Loss” is taken to mean the largest possible loss which it is
estimated may occur, given the worst combination of circumstances which could realistically be imagined.
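As an illustrative formalisation (the notation is illustrative, not from the original), for a set $S$ of realistically imaginable scenarios,

$$\mathrm{MPL} = \max_{s \in S} \sum_{i} d_i(s)\, V_i,$$

where $V_i$ is the sum insured of property $i$ and $d_i(s) \in [0,1]$ is the assumed damage ratio of property $i$ under scenario $s$. The deterministic approach fixes $S$ by expert judgement rather than by attaching probabilities to its members.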
Deterministic methods were not confined to insurers: civil engineers traditionally followed
deterministic construction practices and building code guidelines and dealt with uncertainty
not by quantifying it, but by incorporating explicit, often arbitrary, factors of safety in their
designs. These procedures are still followed today in current building codes. Indeed they
have been followed almost intuitively by competent builders and engineers for centuries, often resulting in older buildings being “over-engineered”, using a combination of guesswork and experience to design margins of safety. It is interesting that when English building codes were changed in 1971 to allow lightweight roofing construction, based on calculated safety margins rather than experience, insurers soon found that most buildings damaged in storms in England were those constructed after 1971. On the other hand, in the Shetland Islands in Scotland, where builders insisted on maintaining traditional methods, even the most severe storm on record in Europe, which remained over Shetland continuously for 22 days in 1993, caused negligible damage.
Case study
Sometimes, deterministic modelling is based on an actual historical event, such as “How
much would it cost flood insurers now if the 1953 coastal storm surge was repeated today?”
In 1994, the first attempt was made to design a model for this particular scenario at a high
resolution, down to unit postcode (average 15 dwellings) and the procedure used is outlined
below.
Assess Hazard
1. Assemble detailed maps of the extent and depth of the flooding during the 1953
event.
2. Consider the effects of coastal flood defences, using data from the latest sea defence condition survey, which included details of which defences would be overtopped or would fail in a 50-year return period storm surge event.
3. Consider the ponding effects due to un-breached sea defences retaining flood waters
from breached defences.
4. Produce high-resolution maps showing areas likely to be inundated, along with depth,
duration and velocity of inundation.
Assess Exposure
1. Extract data from the National Census and from credit referencing agencies to establish the type of property, its value, age and construction, and the socio-economic grouping of occupants in each flood hazard area at unit postcode level.
2. Use insurance company claims data to establish profiles of the nature of contents normally found in domestic residences (for example, carpets and electronic equipment), and produce tables of the average contents mix for each of the main socio-economic groupings.
Assess Vulnerability
Using insurance company data from all the insurers most involved with a recent major coastal flood event (Towyn in 1990), a set of mean loss tables was produced showing insurance costs for
 different types of property,
 different socio-economic groups, and
 different sums insured,
in each case assessing these for
 different depths,
 durations, and
 velocities of inundation.
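Taken together, the three assessments reduce to a lookup-and-sum calculation over the units of exposure. The sketch below illustrates the idea; every table value, field name and postcode is a hypothetical illustration, not the original 1994 data.

```python
# A minimal sketch of the deterministic loss calculation described above.
# All mean loss values, postcodes and profiles are hypothetical.
from dataclasses import dataclass

# Mean loss (GBP per dwelling) by (property type, socio-economic group,
# flood depth band), standing in for the Towyn-derived mean loss tables.
MEAN_LOSS = {
    ("detached", "AB", "0-0.5m"): 9_000,
    ("detached", "AB", "0.5-1m"): 18_000,
    ("terraced", "DE", "0-0.5m"): 5_000,
    ("terraced", "DE", "0.5-1m"): 11_000,
}

@dataclass
class PostcodeExposure:
    postcode: str        # unit postcode (about 15 dwellings on average)
    dwellings: int
    property_type: str
    socio_group: str

def scenario_loss(exposures, depth_by_postcode):
    """Sum mean losses over all dwellings inundated in the scenario."""
    total = 0.0
    for e in exposures:
        depth = depth_by_postcode.get(e.postcode)
        if depth is None:
            continue  # postcode lies outside the modelled flood outline
        total += e.dwellings * MEAN_LOSS[(e.property_type, e.socio_group, depth)]
    return total

exposures = [
    PostcodeExposure("AB1 2CD", 15, "detached", "AB"),
    PostcodeExposure("EF3 4GH", 12, "terraced", "DE"),
]
depths = {"AB1 2CD": "0.5-1m", "EF3 4GH": "0-0.5m"}  # hazard map output
print(f"Scenario loss: £{scenario_loss(exposures, depths):,.0f}")
```

A full model would also index the tables by duration and velocity of inundation and by sum insured, as the text describes.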
Having assessed the hazard, exposure and vulnerability, all of this information was programmed into a brand new large mainframe computer which occupied an entire floor of an insurance head office building. Before the computer was commissioned to carry out all the worldwide computing functions of this company, the entire machine was used to run the flood model. The model took 48 hours of continuous run time on this giant computer, and the results were used by the insurance industry as a whole to check whether it had sufficient capacity to survive such a flood event (it did, but only just). Nowadays, of course, the model could probably be run on a modern desktop PC or workstation in a few hours.
Deterministic modelling is still important for flood modelling because the areas affected by
flood are so dependent on the topography of the land. If insurers wanted to assess a
maximum probable loss for flood, it could be argued that they should treat any low lying area
of land as being a potential flood risk, but in a country such as the UK with its extensive
coastline, this could make the majority of property uninsurable at a price people would be
prepared to pay. It became clear that probabilities of loss would have to be taken into
account more explicitly.
Probabilistic modelling
Towards the end of the twentieth century, the increasing vulnerability of society to catastrophe scenarios, for example from damage to complex high value industrial installations, gave rise to greater sophistication by insurers in probabilistic modelling, enabled by a new generation of powerful computers. The legal profession and the regulators were happy with deterministic modelling, but during the 1960s, with the advent of nuclear power, it was realised that no nuclear facility, no matter how well designed on a deterministic basis, can be entirely safe, and that the danger can spread well beyond the site of the facility. The risk to the public had to be quantified in some way, not only for nuclear power but for other catastrophe risks such as earthquake. Indeed the arguments came to a head in 1984 over a case concerning the seismic hazard of a nuclear test reactor at Vallecitos, California [2].
Probability modelling was made possible by advances in computer power in the last decade of the 20th century. One example is the use of so-called “Monte Carlo” simulations, in which the computer would be programmed to run thousands of deterministic scenarios dependent on random variables, rather like spinning the ball in a roulette wheel over and over again. At the end of the run, the computer would calculate the probable occurrence of each scenario.
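A minimal sketch of the technique, assuming a purely hypothetical event frequency and severity distribution (none of these numbers come from the paper):

```python
import numpy as np

# Monte Carlo sketch: simulate many years of catastrophe experience from
# assumed (hypothetical) frequency and severity distributions.
rng = np.random.default_rng(0)

n_years = 50_000      # simulated years, i.e. "spins of the roulette wheel"
event_rate = 0.8      # assumed mean number of catastrophe events per year
mu, sigma = 2.0, 1.0  # assumed lognormal loss per event, in £m

annual_losses = np.zeros(n_years)
counts = rng.poisson(event_rate, size=n_years)  # events in each year
for year, n in enumerate(counts):
    if n:
        annual_losses[year] = rng.lognormal(mu, sigma, size=n).sum()

# A solvency-style 1-in-200-year annual loss estimate.
print(f"99.5th percentile annual loss: £{np.quantile(annual_losses, 0.995):,.1f}m")
```

Note that the Poisson draw assumes events arrive independently of one another, which is precisely the assumption that breaks down for weather perils, as discussed next.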
However, insurers soon realised that when it comes to weather catastrophes, Monte Carlo does not work, because weather events are not random (although many academic researchers mistakenly continued to use Monte Carlo for weather events even in the 1990s). A good example of non-random weather events occurred in December 1999, when three catastrophic storms affected France and the Benelux countries within one month. Another example occurred in the Gulf of Mexico in August to October 2005, when three record-breaking hurricanes caused widespread damage. Nevertheless, Monte Carlo is still a useful tool for non-weather events, especially earthquake.
Meanwhile for flood risks, the science of hydrology was developing quickly. As data on flood
events and from river flow gauges and tide gauges was building up, hydrologists were able to
assess flood probability return periods with increasing confidence. Methodologies for
assessing return periods in the UK became more sophisticated, first with the 1976 Flood
Studies Report [3], to be replaced in 1999 by the Flood Estimation Handbook [4]. With these
tools, hydrologists could advise civil engineers and property developers and provide flood risk
assessments for specific sites.
[2] Meehan, R.L., 1984. “The Atom and the Fault”. M.I.T. Press.
[3] NERC, 1976. “Flood Studies Report”, Natural Environment Research Council, London.
[4] Institute of Hydrology, 1999. “Flood Estimation Handbook”, 5 vols, Institute of Hydrology, Wallingford.
Deterministic modelling still has a role to play, especially for flood catastrophes, and
insurance companies in the UK have continued to cooperate in collecting mean loss data on
the costs of flood claims analysed by nature of building and nature of flood. Indeed the British
Flood Insurance Claims Database is now the biggest of its kind in the world, with 25 major
insurance companies contributing their own data and receiving analyses from the aggregate
data [5]. This is proving to be very valuable for premium setting, claims validation, early
assessment of claims costs, etc as well as catastrophe modelling. Flood maps have also
improved with a national airborne synthetic aperture radar survey to map topographical
details, combined with hydrological modelling. The flood maps for Scotland are particularly
significant as they combine fluvial, tidal and coastal flood maps to map the 200-year return period flood in detail. Low-resolution (1:50,000 scale) versions of these maps are now freely available on the internet [6], with high-resolution maps for different return periods available to all
local authority planning departments in Scotland.
Insurance role in adapting society
The threats from climate change mean that insurers can no longer adopt a reactive approach;
they need to use their expertise and their economic power to help society to adapt. There are
a number of ways in which insurers are already doing this in Scotland.
Probabilities
What level of probability is acceptable to insurers at a price the customer is prepared to pay?
Local authority planning officials need this sort of guidance so that they can apply differential
rules to property development on a consistent basis. The “Insurance Template” is now almost
universally used in Scotland as guidance for planning authorities, and an extract appears
below:
Insurance Template
Extract from the residential property section of the “insurance template”
© Copyright David Crichton 1998, but may be quoted if authorship is acknowledged.

Type of housing                                              Standard of protection
                                                             Return period   Annual probability
Sheltered housing and homes for the disabled and elderly     1,000 years     0.10%
Children's homes, boarding schools, hotels, hostels            750 years     0.15%
Basements used for accommodation                               750 years     0.15%
Bungalows without escape skylights                             500 years     0.20%
Ground floor flats                                             500 years     0.20%
“Flashy” catchments (little or no flood warning available)     500 years     0.20%
Bungalows with escape skylights                                300 years     0.33%
Caravans for seasonal occupancy only, provided adequate
  warning notices and evacuation systems are in place           50 years     2.00%
All other residential property                                 200 years     0.50%
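The two right-hand columns are related in the standard way: a return period of $T$ years corresponds to an annual exceedance probability of $1/T$, and over a horizon of $n$ years the chance of at least one exceedance is

$$P = 1 - \left(1 - \frac{1}{T}\right)^{n},$$

so, for example, property protected only to the 200-year standard still faces a $1 - 0.995^{30} \approx 14\%$ chance of at least one flood over a 30-year mortgage term.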
[5] Black, A., Werritty, A., and Paine, J., October 2006. “Financial costs of property damages due to flooding; the Halifax Dundee Flood Loss Tables 2005.” University of Dundee, sponsored by Halifax General Insurance Services.
[6] The SEPA flood map is available at http://www.sepa.org.uk/flooding/mapping/index.htm
Combined events
Insurers have considerable expertise in joint probability modelling and can help with questions
such as: “What is the effect on probable losses if there are combined events, such as a storm
surge combined with a spring tide and a high fluvial flow due to saturated ground and
snowmelt [7]?”
There are a number of examples of such combined effects:
Towyn, 1990. A storm washed away the sea defences. Before they could be repaired, a
second storm a week later inundated the town [8].
Perth, 1993. Heavy snowfall in the catchment of the River Tay was followed by a sudden
thaw and heavy rainfall resulting in the biggest river flow ever recorded in the UK (2,200
cumecs) with 1,200 properties flooded.
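A minimal sketch of why such dependence matters, using a simple Gaussian copula with purely illustrative marginals and dependence parameter (nothing here is calibrated to a real catchment):

```python
import numpy as np

# Joint-probability sketch: correlated surge and river flow via a
# Gaussian copula. All distributions and parameters are hypothetical.
rng = np.random.default_rng(42)

n_years = 100_000  # simulated years of annual maxima
rho = 0.6          # assumed dependence between surge and river flow

# Correlated standard normals (the Gaussian copula step).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_years)

# Map to hypothetical marginals: surge (m) and river flow (cumecs).
surge = np.exp(0.3 * z[:, 0])   # lognormal surge, median 1 m
flow = 800 + 300 * z[:, 1]      # normal flow, mean 800 cumecs

# Probability that both exceed their individual 1-in-50-year levels.
surge_50 = np.quantile(surge, 1 - 1 / 50)
flow_50 = np.quantile(flow, 1 - 1 / 50)
p_joint = np.mean((surge > surge_50) & (flow > flow_50))

print(f"If independent: {(1 / 50) ** 2:.6f} per year")
print(f"With rho = {rho}: {p_joint:.6f} per year")
```

Even modest dependence makes the combined event many times more likely than the naive product of the two individual annual probabilities.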
Climate change
Developments in computer modelling have taken place just in time for scientists to be able to
model the impacts of climate change and to make accurate projections for future scenarios.
The outlook is extremely serious and insurers are gearing up for a much more proactive role.
Insurance operates on the basis of pooling risks in different parts of the world on the
assumption that when there is a catastrophe in one place, capital can be transferred from
elsewhere. What if there is a catastrophe clash situation where many completely different
areas suffer from flood events around the same time, as might be envisaged with climate
change and rising sea levels? The result could be global economic meltdown.
Conclusion
The biggest question is the extent to which insurers should become involved in helping society to manage flood risks: for example, by using their expertise in catastrophe modelling, by using market forces to promote sensible land use planning and more resilient building codes, or by promoting sustainable flood management techniques.
The global insurance industry is the biggest industry in the world, and governments will have
to learn that they can no longer afford to ignore the warnings that insurers have been issuing
over the last 20 years.
[7] Defra and EA, June 2006. “Joint probability: dependence mapping and best practice”, Technical Summary FD2308. See http://sciencesearch.defra.gov.uk/Default.aspx?Menu=Menu&Module=FJPProjectVi
The variable pairs presented in the guide are:
• wave height & sea level, relevant to most coastal flood defence studies
• river flow & surge, relevant to most river flood defence studies
• hourly rainfall & sea level, of potential use in drainage studies in coastal towns
• wind-sea & swell, of potential use in coastal engineering studies.
[8] Jones, M., 1992. “When the sea came by… the story of a flooded coast.” Workers Educational Association, Bangor, North Wales.