Object-based Spatial Verification
for Multiple Purposes
www.cawcr.gov.au
Beth Ebert1, Lawrie Rikus1, Aurel Moise1, Jun Chen1,2, and Raghavendra Ashrit3
1 CAWCR, Melbourne, Australia
2 University of Melbourne, Australia
3 NCMRWF, India
The Centre for Australian Weather and Climate Research
A partnership between CSIRO and the Bureau of Meteorology
Object-based spatial verification
[Figure: FORECAST and OBSERVATIONS rain fields with identified objects]
Verifying attributes of objects
Other examples
• HIRLAM cloud
• Vertical cloud comparison
• AVHRR satellite
• Climate features (SPCZ)
• Convective initiation
• Jets in vertical plane
What does an object approach tell us?
• Errors in:
  – Location
  – Size
  – Intensity
  – Orientation
• Results can:
  – Characterize errors for individual forecasts
  – Show systematic errors
  – Give hints as to source(s) of errors
• I will discuss CRA, MODE, "Blob"
• Not SAL, Procrustes, Composite (Nachamkin), others
Contiguous Rain Area (CRA) verification
• Find Contiguous Rain Areas (CRAs) in the fields to be verified
  – Choose a threshold
  – Take the union of forecast and observations
  – Use a minimum number of points and/or total volume of the parameter to filter out insignificant CRAs
• Define a rectangular search box around the CRA to look for the best match between forecast and observations
• Displacement determined by shifting the forecast within the box until MSE is minimized or the correlation coefficient is maximized
• Error decomposition:
  MSEtotal = MSEdisplacement + MSEintensity + MSEpattern
Ebert & McBride, J. Hydrol., 2000
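The CRA matching and decomposition steps above can be sketched in a few lines of Python. This is a minimal illustration on small list-based grids, not the operational implementation; the function names, grid sizes, and search radius are illustrative choices.

```python
# Sketch of CRA verification: shift the forecast within a search box to
# minimize MSE, then decompose MSEtotal into displacement, intensity, and
# pattern components (after Ebert & McBride 2000). Illustrative only.

def mse(a, b):
    n = len(a) * len(a[0])
    return sum((a[i][j] - b[i][j]) ** 2
               for i in range(len(a)) for j in range(len(a[0]))) / n

def shift(field, di, dj, fill=0.0):
    """Shift a 2-D field by (di, dj) grid points, padding with fill."""
    ni, nj = len(field), len(field[0])
    out = [[fill] * nj for _ in range(ni)]
    for i in range(ni):
        for j in range(nj):
            si, sj = i - di, j - dj
            if 0 <= si < ni and 0 <= sj < nj:
                out[i][j] = field[si][sj]
    return out

def cra_decomposition(fcst, obs, max_shift=2):
    """Find the shift minimizing MSE, then decompose the total error."""
    total = mse(fcst, obs)
    best = min((mse(shift(fcst, di, dj), obs), di, dj)
               for di in range(-max_shift, max_shift + 1)
               for dj in range(-max_shift, max_shift + 1))
    mse_shifted, di, dj = best
    displacement = total - mse_shifted          # MSEdisplacement
    shifted = shift(fcst, di, dj)
    n = len(obs) * len(obs[0])
    fbar = sum(map(sum, shifted)) / n
    obar = sum(map(sum, obs)) / n
    intensity = (fbar - obar) ** 2              # MSEintensity (mean bias)
    pattern = mse_shifted - intensity           # MSEpattern (residual)
    return {"total": total, "displacement": displacement,
            "intensity": intensity, "pattern": pattern, "shift": (di, dj)}
```

For a forecast blob displaced one grid point from the observed blob, the search recovers the shift and attributes essentially all of the error to displacement.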
Heavy rain over India
Met Office global NWP model forecasts for monsoon rainfall, 2007-2012
Ashrit et al., WAF, in revision
Heavy rain over India
Errors in Day 1 rainfall forecasts
[Figure: CRA analyses at thresholds of 10, 20, and 40 mm/d]
Heavy rain over India
Error decomposition (%) of Day 1 rainfall forecasts
Climate model evaluation
Can global climate models reproduce features such as the South Pacific
Convergence Zone?
Delage and Moise (JGR, 2011) added a rotation component to the error decomposition.
Climate model evaluation
"Location error" = MSEdisplacement + MSErotation
"Shape error" = MSEvolume + MSEpattern
Applied to 26 CMIP3 models
Climate model evaluation
Correcting the position of ENSO EOF1 strengthens model agreement on projected
changes in spatial patterns of ENSO driven variability in temperature and precipitation
Power et al., Nature, 2013
Method for Object-based Diagnostic Evaluation (MODE) (Davis et al., MWR, 2006)
1. Identification – convolution-threshold process
2. Measure attributes
3. Merging – fuzzy logic approach; merge single objects into clusters
4. Matching – compute interest values*; identify matched pairs
5. Comparison – compare forecast and observed attributes
6. Summarize – accumulate and examine comparisons across many cases

*interest value = weighted combination of attribute matching
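The identification step (convolution then threshold, followed by labelling of contiguous regions) can be sketched as follows. This is a minimal stand-in for MODE's actual convolution-threshold procedure; the boxcar smoother, radius, and threshold are illustrative assumptions.

```python
# Sketch of MODE-style object identification: smooth the field with a
# simple boxcar "convolution", then label 4-connected regions above a
# threshold. Parameters are illustrative, not MODE's defaults.

from collections import deque

def boxcar_smooth(field, radius=1):
    """Average each point over a (2*radius+1)-square neighbourhood."""
    ni, nj = len(field), len(field[0])
    out = [[0.0] * nj for _ in range(ni)]
    for i in range(ni):
        for j in range(nj):
            vals = [field[si][sj]
                    for si in range(max(0, i - radius), min(ni, i + radius + 1))
                    for sj in range(max(0, j - radius), min(nj, j + radius + 1))]
            out[i][j] = sum(vals) / len(vals)
    return out

def label_objects(field, threshold):
    """Flood-fill labelling of 4-connected regions at/above the threshold."""
    ni, nj = len(field), len(field[0])
    labels = [[0] * nj for _ in range(ni)]
    count = 0
    for i in range(ni):
        for j in range(nj):
            if field[i][j] >= threshold and labels[i][j] == 0:
                count += 1
                labels[i][j] = count
                q = deque([(i, j)])
                while q:
                    ci, cj = q.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ti, tj = ci + di, cj + dj
                        if (0 <= ti < ni and 0 <= tj < nj
                                and field[ti][tj] >= threshold
                                and labels[ti][tj] == 0):
                            labels[ti][tj] = count
                            q.append((ti, tj))
    return labels, count
```

Each labelled region then has its attributes (centroid, area, orientation, and so on) measured for the later merging and matching steps.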
CRA & MODE – what's the difference?
Feature               CRA                          MODE
Convolution filter    N                            Y
Object definition     Rain threshold               Rain threshold
Object merging        N                            Y
Matching criterion    MSE or correlation coeff.    Total interest of weighted attributes
Location error        X- and Y-error               Centroid distance
Orientation error     Y                            Y
Rain area             Y                            Y, incl. intersection, union, symmetric area
Rain volume           Y                            Y
Error decomposition   Y                            N
Comparison for tropical cyclone rainfall
[Figure: CRA and MODE analyses of the same case]
Chen, Ebert, Brown (2014) – work in progress
Westerly jets
"Blob" defined by percentile of local maximum of zonal mean U in
reanalysis Y-Z plane
5th percentile
16
10th percentile
15th percentile
Rikus, Clim. Dyn., submitted
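A rough sketch of the blob idea: keep the points of a zonal-mean U(y, z) section whose values exceed a chosen fraction of the jet maximum. The exact percentile convention in Rikus (Clim. Dyn.) may well differ; the fraction, function names, and centroid diagnostic below are illustrative assumptions.

```python
# Illustrative "blob" jet definition on a zonal-mean U(y, z) section:
# points at or above a chosen fraction of the field maximum. Not the
# paper's exact percentile definition - an approximation for sketching.

def jet_blob(u, fraction=0.95):
    """Return the set of (iy, iz) points where U >= fraction * max(U)."""
    umax = max(max(row) for row in u)
    return {(i, j)
            for i, row in enumerate(u)
            for j, val in enumerate(row)
            if val >= fraction * umax}

def blob_centroid(blob, u):
    """U-weighted centroid of a blob: a simple jet-position diagnostic."""
    w = sum(u[i][j] for i, j in blob)
    return (sum(i * u[i][j] for i, j in blob) / w,
            sum(j * u[i][j] for i, j in blob) / w)
```

Tracking the blob centroid and area across reanalyses or model runs gives objective jet position and extent diagnostics.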
Westerly jets
Global reanalyses show consistent behaviour except 20CR. Can be used to evaluate global climate models.
Future of object-based verification
• Routinely applied in operational verification suite
• Other variables
• Climate applications
Future of object-based verification
Ensemble prediction – match individual ensemble members
[Figure: 8 ensemble members, Prob(object) = 7/8]
• Brier skill score
• Ensemble calibration approaches
Johnson & Wang, MWR, 2012, 2013
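The ensemble idea above (e.g. Prob(object) = 7/8 when 7 of 8 members place an object at a point) can be sketched as a gridpoint probability plus a Brier score against the observed object mask. This is a minimal illustration; the function names and mask representation are assumptions, not the Johnson & Wang method itself.

```python
# Sketch: object probability from per-member binary object masks, and the
# Brier score of that probability field against an observed object mask.
# Masks are 2-D lists of 0/1; all names here are illustrative.

def object_probability(member_masks):
    """Gridpoint probability = fraction of members with an object there."""
    n = len(member_masks)
    ni, nj = len(member_masks[0]), len(member_masks[0][0])
    return [[sum(m[i][j] for m in member_masks) / n for j in range(nj)]
            for i in range(ni)]

def brier_score(prob, obs_mask):
    """Mean squared difference between probability and observed 0/1 mask."""
    ni, nj = len(prob), len(prob[0])
    return sum((prob[i][j] - obs_mask[i][j]) ** 2
               for i in range(ni) for j in range(nj)) / (ni * nj)
```

A Brier skill score then compares this against a reference (e.g. climatological object frequency), and calibration adjusts the member objects before the probability is formed.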
Future of object-based verification
Weather hazards:
• Tropical cyclone structure
• Fire spread
• Pollution cloud, heat anomaly
• Flood inundation
• Blizzard extent and intensity
WWRP High Impact Weather Project
Thank you
Extra slides
Spatial Verification Intercomparison Project
• Phase 1 – understanding the methods
• Phase 2 – testing the methods
  – "MesoVICT" – precipitation and wind in complex terrain
  – Deterministic & ensemble forecasts
  – Point and gridded observations, including ensemble observations
  – MAP D-PHASE / COPS dataset
  – Tiers:
    Core: determ. precip + VERA anal + JDC obs
    Tier 1: ensemble wind + VERA anal + JDC obs
    Tier 2a/2b: other variables ensemble + VERA ensemble + JDC obs
    Tier 3: sensitivity tests to method parameters
MODE – total interest
Attributes:
• centroid distance separation
• minimum separation distance of object boundaries
• orientation angle difference
• area ratio
• intersection area

Ij = Σ_{i=1..M} ci,j wi,j Fi,j / Σ_{i=1..M} ci,j wi,j

M = number of attributes
Fi,j = value of object match (0-1)
ci,j = confidence, how well a given attribute describes the forecast error
wi,j = weight given to an attribute
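The total interest formula is a confidence- and weight-normalized average of the attribute match values, which a short function makes concrete. The function name and the example numbers are illustrative, not MODE's configuration.

```python
# Total interest for one object pair j:
#   I_j = sum_i(c_i * w_i * F_i) / sum_i(c_i * w_i)
# where F_i is the attribute match value in [0, 1], c_i the confidence,
# and w_i the weight of attribute i. Illustrative sketch of the formula.

def total_interest(F, c, w):
    """Weighted, confidence-scaled average of attribute match values."""
    num = sum(ci * wi * fi for fi, ci, wi in zip(F, c, w))
    den = sum(ci * wi for ci, wi in zip(c, w))
    return num / den
```

With equal confidences and weights this reduces to the plain mean of the match values; down-weighting an attribute's confidence removes its influence.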
Tropical cyclone rainfall
CRA:
• Displacement & rotation error
• Correlation coefficient
• Volume
• Median, extreme rain
• Rain area
• Error decomposition
MODE:
• Centroid distance & angle difference
• Total interest
• Volume
• Median, extreme rain
• Intersection / union / symmetric area