Validation of multiple ocean shelf models against EO data using automated front detection
Peter Miller, Jason Holt (1) and Dave Storkey (2)
NCOF Science Workshop, Croyde Bay, 21-23 Oct. 2008
1. Proudman Oceanographic Laboratory
2. UK Meteorological Office
Model validation using fronts
• Rationale
• Models to validate
• Validation method
– Composite front maps
– Local regional comparison
– Model cloudiness
• Visual and quantitative results
• Applications and future work
Rationale for model validation using fronts
• There are a myriad of different ocean models:
– FOAM, ROMS, NEMO, OCCAM, POLCOMS, …
• Increasing usage of and reliance on ocean models:
– Coupled ocean-atmosphere; met-ocean forecasts and climate
predictions; coupled physics-ecosystem; pollution trajectories; water
quality and algal predictions.
– Realistic ecosystem modelling requires accurate physical forcing to
control the supply of nutrients to the surface mixed layer.
• Existing validation uses point comparisons:
– Average difference is of questionable value;
– Earth Observation (EO) surface fields give greater coverage but do not improve understanding.
• Need better validation methods to improve models.
Operational NW European shelf domains
• Atlantic Margin model (AMM): 12km (32 levels)
• Medium Resolution Continental Shelf (MRCS): 7km (18 levels), including sediments and ecosystem (ERSEM)
• Irish Sea model: 1 nm (18 levels)
[Map of operational domains: AMM (12km), Irish Sea (1nm), HRCS (2km), MRCS (7km), FOAM-NEMO (7km, to be transitioned to NEMO framework)]
www.ncof.gov.uk
Model validation using fronts
• Rationale
• Models to validate
• Validation method
– Composite front maps
– Local regional comparison
– Model cloudiness
• Visual and quantitative results
• Applications and future work
Conventional image composites
[SST scenes: 20 Sep. 1535 GMT; 21 Sep. 1343 GMT; 22 Sep. 1513 GMT]
Composite SST: weekly SST composite, 20-26 Sep.
• Mean SST at each location during the week (sketched below)
• Dynamic and transient features are blurred
• Spurious features introduced
Miller, P.I., (2004) Multispectral front maps for automatic
detection of ocean colour features from SeaWiFS,
International Journal of Remote Sensing, 25 (7-8), 1437-1442.
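A conventional composite is simply the per-pixel average over the week. A minimal sketch of that step, assuming `scenes` is a stack of co-registered SST arrays with NaN marking cloud-masked pixels:

```python
# Conventional weekly composite: mean SST at each pixel, ignoring cloud.
import numpy as np

def mean_composite(scenes: np.ndarray) -> np.ndarray:
    """Per-pixel mean over the time axis; pixels cloudy in every scene
    remain NaN (numpy warns on all-NaN slices)."""
    return np.nanmean(scenes, axis=0)
```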
Composite front maps
[Paired front and SST maps: 20 Sep. 1535 GMT; 21 Sep. 1343 GMT; 22 Sep. 1513 GMT]
Composite fronts: weekly front map, 20-26 Sep.
• Does not blur dynamic features.
• Highlights persistent or strong-gradient fronts.
Miller, P.I., (2004) Multispectral front maps for automatic
detection of ocean colour features from SeaWiFS,
International Journal of Remote Sensing, 25 (7-8), 1437-1442.
Model validation using fronts
• Rationale
• Models to validate
• Validation method
– Composite front maps
– Local regional comparison
– Model cloudiness
• Visual and quantitative results
• Applications and future work
Front detection on model SST
• POLCOMS 3D hydrodynamic model
• HRCS: 2 km resolution
• Horizontal: latitude-longitude Arakawa B-grid
• Vertical: S-coordinates
[Model sea-surface temperature, 01 Aug. 2001 02:00 GMT; model thermal fronts, 01-07 Aug. 2001; satellite thermal fronts, 01-31 Aug. 2001]
Local regional comparison
[Flow: model thermal fronts and EO thermal fronts → summarise local properties → compare regionally → maps of matches, statistics]
• Summarise by subsampling or filtering
• Properties of gradient magnitude, persistence, direction, etc.
• Compare by differencing maps or checking for matches
• Robust method, can be automated (see the sketch after the reference below).
Miller, P., J. Holt, and D. Storkey (in press) Validation of multiple
ocean shelf models against EO data using automated front
detection: initial results, EuroGOOS Conference, Exeter.
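A hedged sketch of the summarise-and-compare step, assuming front maps are binary arrays on a common grid; the window size and activity threshold are illustrative parameters, not values from the paper:

```python
# Summarise local front properties by block averaging, then compare regionally.
import numpy as np

def window_summary(fronts: np.ndarray, win: int = 8) -> np.ndarray:
    """Fraction of front pixels within each win x win window."""
    h, w = fronts.shape
    h, w = h - h % win, w - w % win          # trim to whole windows
    blocks = fronts[:h, :w].reshape(h // win, win, w // win, win)
    return blocks.mean(axis=(1, 3))

def regional_match(model_fronts, eo_fronts, win=8, thresh=0.05):
    """Map of windows where both model and EO show frontal activity."""
    m = window_summary(model_fronts, win) > thresh
    e = window_summary(eo_fronts, win) > thresh
    return m & e
```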
Model cloudiness
[Panels: model fronts, cloud-free; model fronts with EO cloudiness applied; satellite thermal fronts. Periods shown: 01-31 Aug. 2001 and 01-07 Aug. 2001]
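The point of applying EO cloudiness is that model fronts are then detected from a field sampled identically to the satellite's. A minimal sketch, assuming `eo_sst` uses NaN for cloud:

```python
# Apply the satellite cloud mask to the model SST before front detection.
import numpy as np

def mask_model_like_eo(model_sst: np.ndarray, eo_sst: np.ndarray) -> np.ndarray:
    """Blank model SST wherever the EO scene was cloudy, so model and EO
    fronts are detected from identically sampled fields."""
    masked = model_sst.copy()
    masked[np.isnan(eo_sst)] = np.nan
    return masked
```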
Regional comparison
Remapping and resampling (e.g. 8 x 8 window)
[Model thermal fronts, Aug. 2001; EO thermal fronts, Aug. 2001]
Regional comparison
Validation measures, treating EO as ‘truth’ (sketched below):
• ‘Hits’: EO fronts matched by model fronts
• ‘Misses’: EO fronts not matched by model fronts
• ‘False alarms’: fronts generated by the model with no EO counterpart
EO front min=4, model front min=4, win size=24x24, Aug. 2001
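From the per-window hits, misses and false alarms, the two ROC axes follow directly. A sketch assuming boolean window maps (True = frontal activity), with EO taken as ‘truth’:

```python
# Hit rate and false alarm rate (%) for one threshold setting.
import numpy as np

def roc_point(model_win: np.ndarray, eo_win: np.ndarray):
    hits = np.sum(model_win & eo_win)             # EO fronts matched by model
    misses = np.sum(~model_win & eo_win)          # EO fronts the model missed
    false_alarms = np.sum(model_win & ~eo_win)    # model fronts with no EO counterpart
    correct_rejects = np.sum(~model_win & ~eo_win)
    hit_rate = 100.0 * hits / (hits + misses)
    false_alarm_rate = 100.0 * false_alarms / (false_alarms + correct_rejects)
    return hit_rate, false_alarm_rate
```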
Fronts explain biological errors
[Map of Chl-a model ‘skill’, from low to high]
Model validation using fronts
• Rationale
• Models to validate
• Validation method
– Composite front maps
– Local regional comparison
– Model cloudiness
• Visual and quantitative results
• Applications and future work
HRCS model vs AVHRR SST fronts
[HRCS model SST, 09 May 2001 0200 UTC; HRCS model 2km SST fronts, May 2001; AVHRR HRPT 1km EO SST fronts, cloud-masked, May 2001]
[FOAM model 12km SST fronts, cloud-masked, Aug. 2005; AVHRR Pathfinder EO SST fronts, 4km regridded to 12km, Aug. 2005]
FOAM-NEMO fronts vs EO AVHRR
[FOAM-NEMO model 7km SST fronts, Aug. 2007; AVHRR HRPT 1km EO SST fronts, cloud-masked, Aug. 2007]
EO front min=4, model front min=2, win size=4x4
ROC validation of HRCS 2km fronts
[ROC curves: hit rate vs. false alarm rate (%), model front minimum value varied from 1 (lax threshold) to 20 (strict threshold), for window sizes 24x24 and 48x48]
EO front min=4, model min=1..20, mean Jan-Aug. 2001
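Each ROC curve is traced by sweeping the model front minimum from a lax to a strict threshold. A sketch under the same assumptions as above, with `model_counts` and `eo_counts` holding per-window front counts:

```python
# Sweep the model front minimum (1 = lax ... 20 = strict) to trace a ROC curve.
import numpy as np

def roc_curve(model_counts, eo_counts, eo_min=4, mod_mins=range(1, 21)):
    eo_win = eo_counts >= eo_min                  # EO 'truth' windows
    points = []
    for mod_min in mod_mins:
        model_win = model_counts >= mod_min
        hits = np.sum(model_win & eo_win)
        misses = np.sum(~model_win & eo_win)
        fas = np.sum(model_win & ~eo_win)
        rejects = np.sum(~model_win & ~eo_win)
        points.append((100.0 * hits / (hits + misses),
                       100.0 * fas / (fas + rejects)))
    return points  # one (hit rate, false alarm rate) point per threshold
```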
ROC comparison of model fronts
[ROC curves comparing models, model front minimum varied from 1 to 20 (top to bottom): FOAM 12km (win 4x4); FOAM-NEMO 7km vs 1km EO (win 48x48); POLCOMS-HRCS 2km vs 1km EO (win 24x24 and 48x48). Axes: hit rate vs. false alarm rate (%)]
EO front min=4, win size=48x48 km, model min=1..20 (top to bottom)
Initial results: MRCS 7km SST fronts
[MRCS model 7km SST fronts vs. AVHRR HRPT 1km EO SST fronts, cloud-masked; monthly maps for Jul, Aug and Sep 2007]
Initial results: MRCS 7km Chl-a fronts
[MRCS model 7km Chl fronts vs. Aqua-MODIS 1km EO Chl fronts, cloud-masked; monthly maps for Jul, Aug and Sep 2007]
Potential applications
• Analyse and improve models
– E.g. persistence of eddies at the sea surface, boundary effects.
– Assess improvement in ecosystem model.
– Data assimilation method?
• Compare alternative models or versions
– E.g. UK Met Office moving from FOAM to NEMO.
AlgaRisk: UK algal bloom risk
• Provide satellite and model information to the EA
• Help focus monitoring for bloom events
• Enable EA to advise local authorities
• Demonstrate potential to assist with EU directives
[Chlorophyll-a map, 18 July 2006]
Further work
• Further model comparisons
– POLCOMS-MRCS vs. FOAM-NEMO, both at 7km.
– Optimise EO/model front detections for validation.
• Detailed analysis over annual sequence
– Indicate consistently good and bad regions.
– Confirm genuine time-series changes, and interpret significant deviations of model from obs.
• Front contours by simplifying clusters
– Model location errors for particular fronts / overall.
Peter Miller: pim@pml.ac.uk
Model validation using fronts
• Rationale
• Models to validate
• Validation method
– Composite front maps
– Local regional comparison
– Model cloudiness
• Visual and quantitative results
• Applications and future work
Peter Miller: pim@pml.ac.uk
Front detection method
SST map → local window → histogram bimodality test and threshold → cohesion test → contour following → front map
(the bimodality step is sketched after the reference below)
Cayula, J.-F., and Cornillon, P., (1992), Edge detection algorithm for SST
images. Journal of Atmospheric and Oceanic Technology, 9, 67-80.
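A simplified sketch of the histogram step of the Cayula-Cornillon algorithm: within each window, test whether the SST histogram is bimodal and, if so, return the separating temperature. The cohesion test and contour following are omitted, and the 0.7 criterion is illustrative rather than quoted from the paper:

```python
import numpy as np

def bimodality_threshold(window: np.ndarray, criterion: float = 0.7):
    """Return the optimal separating temperature if the window's SST
    histogram is sufficiently bimodal, else None."""
    temps = window[~np.isnan(window)].ravel()
    if temps.size < 2 or temps.var() == 0:
        return None
    total_var = temps.var()
    best_tau, best_ratio = None, 0.0
    for tau in np.unique(temps)[:-1]:            # candidate thresholds
        cold, warm = temps[temps <= tau], temps[temps > tau]
        p1, p2 = cold.size / temps.size, warm.size / temps.size
        between = p1 * p2 * (warm.mean() - cold.mean()) ** 2
        ratio = between / total_var              # bimodality measure
        if ratio > best_ratio:
            best_ratio, best_tau = ratio, tau
    return best_tau if best_ratio >= criterion else None
```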
Composite weighting factors
• Mean gradient (F_mean)
• Persistence = P(front) (P_front)
• Advection = proximity (F_prox)
F_mean, P_front and F_prox are combined into the composite F_comp (one possible combination is sketched below).
Miller, P.I., (in press) Composite front maps for improved
visibility of dynamic oceanic fronts on cloudy AVHRR and
SeaWiFS data, Journal of Marine Systems.
[F_prox map and composite front map, 20-26 Sep.]
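The slide names the three weighting factors but not the combination rule, so a simple product is assumed here purely for illustration:

```python
# Hypothetical combination of the weighting factors into a composite front map.
import numpy as np

def composite_front(f_mean: np.ndarray, p_front: np.ndarray,
                    f_prox: np.ndarray) -> np.ndarray:
    """Combine mean gradient (F_mean), persistence (P_front) and
    proximity (F_prox) into F_comp; the product rule is an assumption."""
    return f_mean * p_front * f_prox
```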
Example thermal front maps
Eddies off NW Spain, 29-31 Mar. 1997
Faroe-Shetland current, 18-24 May 1999
Miller, P.I., (in press) Composite front maps for improved visibility of dynamic oceanic
fronts on cloudy AVHRR and SeaWiFS data, Journal of Marine Systems.
Capabilities of PML RSG and NEODAAS
[Data flow diagram: NASA and ESA global coverage (MODIS, MERIS, AVHRR, SeaWiFS) and the NERC-funded Dundee Satellite Receiving Station → raw data received in Plymouth → navigation and atmospheric correction → mapped products of ocean colour/temperature, atmosphere, terrestrial → near-real time delivery to researchers and students at NERC centres and universities, and to scientists at sea or in the field]
www.neodaas.ac.uk  info@neodaas.ac.uk
EO fronts without cloud?
AMSR-E Passive microwave SST thermal fronts, 25 km resolution
01-31 Aug. 2001
[FOAM model 12km SST fronts, cloud-masked, Dec. 2005; AVHRR Pathfinder EO SST fronts, 4km regridded to 12km, Dec. 2005]
FOAM-NEMO fronts vs EO AVHRR
[FOAM-NEMO model 7km SST fronts, Sep. 2007; AVHRR HRPT 1km EO SST fronts, cloud-masked, Sep. 2007]
EO front min=4, model front min=2, win size=4x4
ROC validation of FOAM 12km fronts
[ROC curves: hit rate vs. false alarm rate (%), model front minimum varied from 1 to 20, for window sizes 4x4 and 8x8]
EO front min=4, model min=1..20, mean Jan-Dec. 2005
ROC validation of NEMO 7km fronts
[ROC curves: hit rate vs. false alarm rate (%), model front minimum varied from 1 to 20, for window sizes 4x4, 8x8, 24x24 and 48x48]
EO front min=4, model min=1..20, mean Jul-Sep. 2007