WORLD METEOROLOGICAL ORGANIZATION
_______________
COMMISSION FOR BASIC SYSTEMS
MEETING OF EXPERT TEAM ON
ENSEMBLE PREDICTION SYSTEMS
TOKYO, JAPAN, 15-19 OCTOBER 2001
CBS ET/EPS/Doc.9(5)
(4.X.2001)
______
ITEM: 9
Original: ENGLISH
EDUCATION AND TRAINING OF USERS OF ENSEMBLE PRODUCTS
(Submitted by the Secretariat)
____________________________________________________
Summary and purpose of document
The recommendations of CBS and EC for training on EPS products are recalled. Required
training seminars and guidance material are considered.
_____________________________________________________
Action proposed
The expert team is invited to take into account the information given in this document and to
make appropriate recommendations.
References:
- WMO/TD No. 1065, WWW Technical Report No. 21. Proceedings of the
Workshop on Use of Ensemble Prediction.
- CBS XII, Geneva, 29 November-8 December 2000, Abridged final report.
- EC LIII, Geneva, 5-15 June 2001, Abridged final report.
DISCUSSION
Background
1. The last session of the WMO Executive Council (EC LIII) particularly welcomed the emphasis placed by the Commission for Basic Systems (CBS), in collaboration with the regional associations, on training in severe weather forecasting, on the enhanced use of EPS products and on the definition of related regional requirements.
2. The CBS recommended four types of EPS training, as follows:
- Regional WMO workshops to explain the EPS approach, its usefulness and its limitations. These should concentrate on the products that are available and would be mainly useful for those who intend to use EPS end products;
- Technical cooperation training for those who intend to make their own products and/or who need more specific training on the products or on the forecast methodology. Such training could be organized on individual request or through WMO voluntary cooperation arrangements;
- Workshops or seminars developed by the Centres running EPS. These Centres are encouraged to open them to a wide range of participants; co-sponsorship with WMO should be considered;
- Encouragement by Members of universities engaged in training in meteorology to include topics related to EPS in their programmes.
3. To ensure coordinated implementation of the initial procedures for making EPS products available, the CBS endorsed the idea that the WMO Secretariat should plan regional workshops focusing on EPS training under the appropriate WMO programme.
Training and guidance material
4. In parallel with the project for dissemination of ensemble products, forecasters must be trained to make the best use of these new products. Forecasters will have to move beyond their habit of dealing only with deterministic forecast products and adapt to EPS products and to the terminology of ensemble systems, such as the attributes of probability forecasts (see Table 1). Training is therefore essential, and the need for it is worldwide. Forecasters have to understand that EPS is applicable to all forecast ranges and represents the future of forecasting. Computer-aided learning (CAL) modules should be built, and roving seminars and training workshops should be organized.
5. WMO already includes an introduction to ensemble prediction (half to one day) in its regular GDPS regional training seminars. In addition, five-day or eleven-day seminars entirely devoted to EPS would now have to be organized, with substantial practical sessions on case studies. Products tailored to the regions concerned by each seminar would have to be generated.
6. The production of guidance material on the use of ensemble prediction products by forecasters, which could form a new chapter in the Guide on the Global Data Processing System, is also required.
Table 1
Attributes of probability forecasts and related verification measures

1. Bias
   Definition: correspondence between the mean forecast and the mean observation.
   Related measures: bias (mean forecast probability minus sample observed frequency).

2. Association
   Definition: strength of the linear relationship between pairs of forecasts and observations.
   Related measures: covariance, correlation.

3. Accuracy
   Definition: average correspondence between individual pairs of forecasts and observations.
   Related measures: mean absolute error (MAE), mean squared error (MSE), root mean squared error, Brier score (BS).

4. Skill
   Definition: accuracy of the forecasts relative to the accuracy of forecasts produced by a standard method.
   Related measures: Brier skill score, others in the usual format.

5. Reliability
   Definition: correspondence of the conditional mean observation and the conditioning forecasts, averaged over all forecasts.
   Related measures: reliability component of the BS; MAE and MSE of binned data from the reliability table.

6. Resolution
   Definition: difference between the conditional mean observation and the unconditional mean observation, averaged over all forecasts.
   Related measures: resolution component of the BS.

7. Sharpness
   Definition: variability of the forecasts as described by the distribution of forecasts.
   Related measures: variance of the forecasts.

8. Discrimination
   Definition: difference between the conditional mean forecast and the unconditional mean forecast, averaged over all observations.
   Related measures: area under the ROC; measures of separation of the conditional distributions; MAE and MSE of the scatter plot, binned by observation value.

9. Uncertainty
   Definition: variability of the observations as described by the distribution of observations.
   Related measures: variance of the observations.
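
The following is a minimal illustrative sketch, not part of the original submission, assuming a Python/NumPy environment. It shows how several of the measures listed in Table 1 (the Brier score for accuracy, its reliability and resolution components, the uncertainty term and the Brier skill score for skill) could be computed for probability forecasts of a binary event; all function and variable names are hypothetical.

import numpy as np

def brier_decomposition(prob_forecasts, observations, n_bins=10):
    """Brier score and its reliability/resolution/uncertainty components.

    prob_forecasts : forecast probabilities in [0, 1]
    observations   : observed binary outcomes (0 or 1)
    """
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(observations, dtype=float)
    n = p.size

    # Accuracy: Brier score = mean squared error of the probability forecast
    brier = np.mean((p - o) ** 2)

    # Bin the forecasts to build a reliability table
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_idx = np.clip(np.digitize(p, edges) - 1, 0, n_bins - 1)

    o_bar = o.mean()                 # sample climatology (unconditional mean observation)
    reliability = 0.0
    resolution = 0.0
    for k in range(n_bins):
        mask = bin_idx == k
        n_k = mask.sum()
        if n_k == 0:
            continue
        p_k = p[mask].mean()         # conditioning forecast (bin mean)
        o_k = o[mask].mean()         # conditional observed frequency
        reliability += n_k * (p_k - o_k) ** 2    # reliability component of BS
        resolution += n_k * (o_k - o_bar) ** 2   # resolution component of BS
    reliability /= n
    resolution /= n
    uncertainty = o_bar * (1.0 - o_bar)          # variance of the binary observations

    # Skill: Brier skill score relative to the climatological forecast
    bss = 1.0 - brier / uncertainty if uncertainty > 0 else np.nan
    return brier, reliability, resolution, uncertainty, bss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probs = rng.uniform(0, 1, 1000)
    obs = (rng.uniform(0, 1, 1000) < probs).astype(int)   # perfectly reliable toy data
    print(brier_decomposition(probs, obs))

Applied to a set of forecast probabilities and matching binary outcomes, the sketch reproduces the standard decomposition BS = reliability - resolution + uncertainty (up to within-bin variance), which ties several of the attributes of Table 1 to a single score.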
________________________________