Forecast Skill Matrix

Model performance for seasonal prediction is verified with three skill scores: the mean squared skill score (MSSS), the anomaly correlation coefficient (ACC), and the root mean square error (RMSE). Brief descriptions of the scores are given in the Appendix. The MSSS, ACC, and RMSE of each model and each multi-model analysis method are listed in Tables 1, 2, and 3, respectively.
Table 1: Mean Squared Skill Score (MSSS) of each model and each multi-model analysis method (global mean, June-July-August average of precipitation at the surface and temperature at 850 hPa). An asterisk (*) marks the best-performing method in each column.
JJA Hindcast Global mean of MSSS

MODEL        Precipitation      T850
CWB              -0.477        -0.002
GCPS             -0.429        -0.011
GDAPS_F          -0.355         0.008
GDAPS_O          -0.057        -0.013
HMC              -1.558         0.115
IRI              -0.399         0.077
IRIF             -0.399         0.077
JMA              -0.693        -0.067
METRI            -0.158        -0.167
MGO              -0.406         0.116
NCC              -0.318        -0.036
NCEP             -0.332        -0.011

CPPM              0.209*        0.116
MME               0.180         0.148*
MR                0.199         0.113
SE                0.062        -0.013
Table 2: Anomaly Correlation Coefficient (ACC) of each model and each multi-model analysis method (global mean, June-July-August average of precipitation at the surface and temperature at 850 hPa). An asterisk (*) marks the best-performing method in each column.
JJA Hindcast (1983-2003) Global mean of ACC

MODEL        Precipitation      T850
CWB               0.231         0.301
GCPS              0.255         0.216
GDAPS_F           0.131         0.143
GDAPS_O          -0.007         0.041
HMC               0.119         0.357
IRI               0.374         0.313
IRIF              0.374         0.313
JMA               0.151         0.123
METRI             0.210         0.153
MGO               0.267         0.352
NCC              -0.029         0.068
NCEP              0.327         0.216

CPPM              0.448*        0.357
MME               0.404         0.388*
MR                0.408         0.324
SE                0.349         0.325
Table 3: Root Mean Square Error (RMSE) of each model and each multi-model analysis method (global mean, June-July-August average of precipitation at the surface and temperature at 850 hPa). An asterisk (*) marks the best-performing method in each column.
JJA Hindcast (1983-2003) Global mean of RMSE

MODEL        Precipitation      T850
CWB               0.973         0.848
GCPS              0.956         0.850
GDAPS_F           0.930         0.844
GDAPS_O           0.821         0.850
HMC               1.280         0.797
IRI               0.945         0.814
IRIF              0.945         0.814
JMA               1.222         1.042
METRI             0.863         0.913
MGO               0.942         0.796
NCC               0.918         0.862
NCEP              0.923         0.848

CPPM              0.710         0.726
MME               0.792         0.632*
MR                0.650*        0.782
SE                0.999         0.980
Appendix
A-1. Root Mean Square Error (RMSE)
RMSE is a measure of the accuracy of the forecast (f) compared with the observation (o). RMSE is defined as

    \mathrm{RMSE} = \sqrt{ \frac{1}{W} \sum_{i} w_i \left( f_i - o_i \right)^2 },

where w_i is the latitude weight, W is the sum of the weights, and the subscript i denotes the grid point.
RMSE measures the total difference between the forecast and observation maps. The score is always greater than or equal to 0.0; for a perfect forecast, RMSE equals 0.0. Sample results of RMSE are shown in Fig. 1 and Fig. 2 in the Appendix.
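As an illustration, the Python sketch below computes this latitude-weighted RMSE for a single forecast map; the function name, array shapes, and use of NumPy are assumptions made for the example, not part of the verification system itself.

import numpy as np

def rmse(forecast, obs, lat_deg):
    """Latitude-weighted RMSE of a forecast map against an observation map.

    forecast, obs : 2-D arrays of shape (nlat, nlon)
    lat_deg       : 1-D array of grid latitudes in degrees
    """
    w = np.cos(np.deg2rad(lat_deg))[:, None]        # latitude weight w_i = cos(lat)
    w = np.broadcast_to(w, forecast.shape)
    W = w.sum()                                     # W = sum of the weights
    return np.sqrt(np.sum(w * (forecast - obs) ** 2) / W)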
A-2. Anomaly Correlation Coefficient (ACC)
ACC is the pattern correlation between the predicted and analyzed anomalies, defined as

    \mathrm{ACC} = \frac{ \sum_{i} w_i \left( f_i - \bar{f}_i \right) \left( o_i - \bar{o}_i \right) }{ \sqrt{ \sum_{i} w_i \left( f_i - \bar{f}_i \right)^2 \sum_{i} w_i \left( o_i - \bar{o}_i \right)^2 } },

where the overbar denotes the time average and w_i is the latitude weight of A-1.
ACC measures the spatial similarity between the forecast and observation maps. The score ranges from -1.0 to 1.0; for a perfect forecast, ACC equals 1.0. Sample results of ACC are shown in Fig. 3 and Fig. 4 in the Appendix.
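A corresponding Python sketch for the ACC is given below; again, the names, array shapes, and cosine-latitude weighting are assumptions made for the example.

import numpy as np

def acc(forecast, obs, f_clim, o_clim, lat_deg):
    """Latitude-weighted pattern correlation of forecast and observed anomalies.

    forecast, obs  : 2-D fields of shape (nlat, nlon) for one season
    f_clim, o_clim : the corresponding time-mean (climatology) fields
    lat_deg        : 1-D array of grid latitudes in degrees
    """
    w = np.broadcast_to(np.cos(np.deg2rad(lat_deg))[:, None], forecast.shape)
    fa = forecast - f_clim                      # forecast anomaly (overbar = time average)
    oa = obs - o_clim                           # observed anomaly
    num = np.sum(w * fa * oa)
    den = np.sqrt(np.sum(w * fa ** 2) * np.sum(w * oa ** 2))
    return num / den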
A-3. Mean Squared Skill Score (MSSS)
A detailed description of the mean squared skill score (MSSS) is provided by WMO (2002), so only a brief description is presented here. Let o_{ij} and f_{ij} (i = 1, ..., n) denote the time series of observations and continuous deterministic forecasts, respectively, for a grid point or station j over the period of verification (POV). Their averages over the POV, \bar{o}_j and \bar{f}_j, and their sample variances, s_{oj}^2 and s_{fj}^2, are given by

    \bar{o}_j = \frac{1}{n} \sum_{i=1}^{n} o_{ij}, \qquad \bar{f}_j = \frac{1}{n} \sum_{i=1}^{n} f_{ij},

    s_{oj}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( o_{ij} - \bar{o}_j \right)^2, \qquad s_{fj}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( f_{ij} - \bar{f}_j \right)^2.
The mean squared error of the forecasts is

    \mathrm{MSE}_j = \frac{1}{n} \sum_{i=1}^{n} \left( f_{ij} - o_{ij} \right)^2.
For the case of cross-validated POV climatology forecasts, where forecast/observation pairs are reasonably temporally independent of each other (so that only one year at a time is withheld), the mean squared error of the ‘climatology’ forecasts (Murphy 1988) is

    \mathrm{MSE}_{cj} = \left( \frac{n}{n-1} \right)^2 s_{oj}^2.
The mean squared skill score (MSSS) for j is defined as one minus the ratio of the squared error of the forecasts to the squared error of the ‘climatology’ forecasts:

    \mathrm{MSSS}_j = 1 - \frac{ \mathrm{MSE}_j }{ \mathrm{MSE}_{cj} }.
An overall MSSS is computed as

    \mathrm{MSSS} = 1 - \frac{ \sum_{j} w_j \, \mathrm{MSE}_j }{ \sum_{j} w_j \, \mathrm{MSE}_{cj} },

where w_j is unity for verification at stations and equals \cos(\theta_j), with \theta_j the latitude of grid point j, on latitude-longitude grids. Sample results of MSSS for the JJA mean precipitation hindcast are shown in Fig. 5 in the Appendix.
For either MSSS_j or MSSS, a corresponding root mean squared skill score (RMSSS) can be obtained easily from

    \mathrm{RMSSS} = 1 - \left( 1 - \mathrm{MSSS} \right)^{1/2}.
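The sketch below computes MSSS_j, the overall MSSS, and the corresponding RMSSS from a gridded hindcast, following the definitions above; the function signature, array layout, and variable names are assumptions made for the example.

import numpy as np

def msss(forecast, obs, lat_deg):
    """MSSS per grid point, overall MSSS, and RMSSS for a hindcast set.

    forecast, obs : arrays of shape (n_years, nlat, nlon)
    lat_deg       : 1-D array of grid latitudes in degrees
    """
    n = forecast.shape[0]
    mse_f = np.mean((forecast - obs) ** 2, axis=0)          # MSE_j of the forecasts
    s_o2 = np.var(obs, axis=0)                              # s_oj^2 (1/n definition)
    mse_c = (n / (n - 1)) ** 2 * s_o2                       # cross-validated climatology MSE_cj
    msss_j = 1.0 - mse_f / mse_c                            # grid-point skill score MSSS_j
    w = np.broadcast_to(np.cos(np.deg2rad(lat_deg))[:, None], mse_f.shape)
    msss_all = 1.0 - np.sum(w * mse_f) / np.sum(w * mse_c)  # overall MSSS
    rmsss_all = 1.0 - np.sqrt(1.0 - msss_all)               # corresponding RMSSS
    return msss_j, msss_all, rmsss_all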
MSSS_j for fully cross-validated forecasts (with one year at a time withheld) can be expanded (Murphy 1988) as

    \mathrm{MSSS}_j = \frac{ 2 \frac{s_{fj}}{s_{oj}} r_{foj} - \left( \frac{s_{fj}}{s_{oj}} \right)^2 - \left( \frac{ \bar{f}_j - \bar{o}_j }{ s_{oj} } \right)^2 + \frac{2n-1}{(n-1)^2} }{ 1 + \frac{2n-1}{(n-1)^2} },

where r_{foj} is the product-moment correlation of the forecasts and observations at point or station j,

    r_{foj} = \frac{ \frac{1}{n} \sum_{i=1}^{n} \left( f_{ij} - \bar{f}_j \right) \left( o_{ij} - \bar{o}_j \right) }{ s_{fj} \, s_{oj} }.
The first three terms of the decomposition of MSSS_j are related to phase errors
(through the correlation), amplitude errors (through the ratio of the forecast to observed
variances) and overall bias error, respectively, of the forecasts. These terms provide the
opportunity for those wishing to use the forecasts for input into regional and local
forecasts to adjust or weight the forecasts as they deem appropriate. The last term takes
into account the fact that the ‘climatology’ forecasts are cross-validated as well. Spatial distributions of MSSS, phase error, amplitude error, and overall bias error are shown in Fig. 6 in the Appendix.
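As a quick numerical check of this decomposition, the sketch below evaluates MSSS_j at a single point both directly from MSE_j and MSE_cj and via the expanded form; the synthetic data and variable names are purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
n = 21                                        # e.g. a 1983-2003 hindcast period
o = rng.standard_normal(n)                    # observations o_ij at one point j
f = 0.6 * o + 0.8 * rng.standard_normal(n)    # synthetic forecasts f_ij

ob, fb = o.mean(), f.mean()
so, sf = o.std(), f.std()                     # s_oj, s_fj (1/n definition)
r = np.mean((f - fb) * (o - ob)) / (sf * so)  # product-moment correlation r_foj

mse_f = np.mean((f - o) ** 2)
mse_c = (n / (n - 1)) ** 2 * so ** 2
msss_direct = 1.0 - mse_f / mse_c

k = (2.0 * n - 1.0) / (n - 1.0) ** 2          # cross-validation term
msss_decomp = (2.0 * (sf / so) * r - (sf / so) ** 2
               - ((fb - ob) / so) ** 2 + k) / (1.0 + k)

print(msss_direct, msss_decomp)               # the two values agree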
References
Murphy, A. H., 1988: Skill scores based on the mean square error and their relationships to the correlation coefficient. Mon. Wea. Rev., 116, 2417-2424.
WMO, 2002: Standardised Verification System (SVS) for Long-Range Forecasts (LRF). Manual on the GDPS (WMO-No. 485), Vol. 1.