Ensemble Forecasting
Yuejian Zhu
Environmental Modeling Center
December 6th 2011
Acknowledgments:
Members of the Ensemble & Probabilistic Guidance Team
Outline
• Responsibilities of the ensemble team
– Includes all ensemble systems
• Currently available data and products
– Data access through all available resources
– Digital probabilistic products
– Web-based products
• Implementations over the past year
– Including all major and minor upgrades
– Updated information from other centers
• Ongoing implementations
– Including all major and minor upgrades
– Reforecasting for future ensemble post-processing
• Future plans
– Development of major systems
– Post-processing and calibration
– How to satisfy user requests?
Responsibilities of the Ensemble Team
- Assess, model, and communicate uncertainty in numerical forecasts
• Present uncertainty in numerical forecasting
– Tasks
• Design, implement, maintain, and continuously improve ensemble systems
– Topics
• Initial-value related uncertainty
• Model related forecast uncertainty
– Ensemble systems
• Global – GEFS / NAEFS / NUOPC
• Regional – SREF / HREF / NARRE-TL / HWRF ensemble
• Climate – contributions to the future coupled CFS configuration
• NAEFS/GEFS downscaled
• Ocean wave ensemble (MMA/EMC)
• Statistical correction of ensemble forecasts (see the sketch after this slide)
– Tasks
• Correct for systematic errors on the model grid
• Downscale information to the fine-resolution grid (NDFD)
• Combine all forecast information into a single ensemble/probabilistic guidance
• Probabilistic product generation / user applications
– Contribute to the design of probabilistic products
– Support use of ensembles by
• Internal users (NCEP Service Centers, WFOs, OHD/RFC forecasters, etc.)
• External users (research, development, and applications)
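Note: the statistical-correction task above is often described as a decaying-average estimate of the systematic error, subtracted from each new forecast. The sketch below is a minimal illustration of that idea only, not the operational NAEFS code; the 2% weight, grid size, and variable names are assumptions.

    import numpy as np

    def update_bias(prior_bias, forecast, analysis, w=0.02):
        """Decaying-average bias estimate: blend today's error (forecast - analysis)
        with the accumulated bias using a small weight w (assumed 2% here)."""
        return (1.0 - w) * prior_bias + w * (forecast - analysis)

    def correct(forecast, bias):
        """Remove the estimated systematic error from the raw forecast grid."""
        return forecast - bias

    # Toy example on a small grid: a forecast that is persistently 1.5 K too warm.
    bias = np.zeros((3, 3))
    for _ in range(200):                      # 200 daily updates
        analysis = np.full((3, 3), 280.0)     # proxy for the verifying analysis
        forecast = analysis + 1.5             # raw model forecast
        bias = update_bias(bias, forecast, analysis)
    print(correct(forecast, bias).round(2))   # approaches the analysis value 280.0
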
NAEFS Products Distribution
• Distribution systems: NCEP CCS, NCEP FTPPRD, TOC, NOMADS
• Configuration: 1.0 deg, 0-384 h, every 6 hours, 20 members (NCEP) and 20 members (CMC), ensemble controls (NCEP and CMC)
• Format: GRIB1 (and GRIB2; GIF images for web display)
• Currently available products
– NCEP: pgrba, pgrbb, pgrba_bc, pgrba_an, pgrba_wt, ensstat, ndgd
– CMC: pgrba, pgrba_bc, pgrba_an, pgrba_wt, ensstat
– NAEFS: ndgd, pgrba_an, pgrba_bc
• FTPPRD: ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gens/prod
– cd gefs.${yyyymmdd} for the NCEP ensemble:
1. pgrb2a (00, 06, 12 and 18 UTC) (1.0 degree, all lead times, 1 (c) + 20 (p))
2. pgrb2alr (00, 06, 12 and 18 UTC) (2.5 degree, all lead times, 1 (c) + 20 (p))
3. pgrb2b (00, 06, 12 and 18 UTC) (1.0 degree, all lead times, 1 (c) + 20 (p))
4. pgrb2blr (00 and 12 UTC) (2.5 degree, all lead times, 1 (c) + 20 (p))
5. ensstat (00 UTC) (prcp_bc, pqpf and pqpf_bc files)
6. wafs (00 and 12 UTC)
7. ndgd_gb2 (00, 06, 12, 18 UTC) (CONUS 5 km, all lead times and all probability forecasts)
– cd cmce.${yyyymmdd} for the CMC ensemble:
1. pgrba (00 and 12 UTC) (1.0 degree, all lead times, 1 control + 20 members)
– cd naefs.${yyyymmdd} for NAEFS products:
1. pgrb2a_an (00 and 12 UTC) (1.0 degree, all lead times, anomaly for the ensemble mean)
2. pgrb2a_bc (00 and 12 UTC) (1.0 degree, all lead times, probabilistic forecasts)
3. ndgd_gb2 (00 and 12 UTC) (CONUS 5 km, all lead times, probabilistic forecasts)
• TOC: ftp://tgftp.nws.noaa.gov/SL.us008001/ST.opnl/ cd MT.ensg_CY.${cyc}/RD.${yyyymmdd} (NCEP only)
1. PT.grid_DF.gr1_RE.high (00 and 12 UTC) (pgrba: 1.0 and 2.5 degree, 0-384 hrs, c + 10 (p))
2. PT.grid_DF.gr1_RE.low (00 and 12 UTC) (pgrbb: 1.0 degree 0-84 hrs, 2.5 degree 90-384 hrs, c + 10 (p))
3. PT_grid_DF.bb
• NOMADS
– http://nomad5.ncep.noaa.gov/ncep_data/ (ftp): combined pgrba and pgrbb at 1 degree resolution, for all ensemble members (c + 20 (p)) and all lead times (0-384 hours)
– http://nomad5.ncep.noaa.gov/pub/gens/archive/ (http): combined pgrba and pgrbb at 1 degree resolution
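As a quick illustration of pulling one of the FTPPRD files listed above, the sketch below retrieves a single GEFS GRIB2 file with Python's standard library. The subdirectory layout and file name under gefs.${yyyymmdd} are assumptions for illustration only; check the server listing for the real names.

    from urllib.request import urlretrieve

    base = "ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gens/prod"
    date, cyc = "20111206", "00"          # run date and cycle
    # Hypothetical file name for perturbed member 01 at the 24-h lead time;
    # verify the actual naming convention on the server before use.
    fname = f"gep01.t{cyc}z.pgrb2af24"
    url = f"{base}/gefs.{date}/{cyc}/pgrb2a/{fname}"

    urlretrieve(url, fname)               # download the GRIB2 file to the working directory
    print("saved", fname)
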
Web-based Probabilistic Products
• NCEP web site
– NCO-supported ensemble probabilistic products (main page)
– EMC-developed (experimental) probabilistic products
• PQPF for various thresholds (including precipitation types)
• RMOP for 500 hPa height
• Tropical storm track forecasts
• Extratropical storm track forecasts (global tracking)
• NAEFS web site
– CPC-developed extended probabilistic forecasts
• Temperature, 500 hPa height, and precipitation
– NCO-supported NAEFS products
• Spaghetti plots
– EMC (experimental) NAEFS products
• PQPF for various thresholds
• Anomaly forecasts
– CMC-supported NAEFS products
• Metagrams for major North American cities (example)
• NUOPC web site
– EMC (experimental) NUOPC products
• PQPF for various thresholds (including precipitation types)
• TS track forecasts for multi-model ensembles
– FNMOC’s NUE probabilistic products (example)
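The PQPF-style products listed above are, at their core, exceedance frequencies across the ensemble members. The sketch below shows that basic calculation; the member precipitation fields and threshold values are placeholders, not operational data.

    import numpy as np

    def exceedance_probability(members, threshold):
        """Fraction of ensemble members exceeding a threshold at each grid point.
        members: array of shape (n_members, ny, nx); returns probability in percent."""
        return 100.0 * np.mean(members > threshold, axis=0)

    # Placeholder data: 21 members of 24-h precipitation (mm) on a small grid.
    rng = np.random.default_rng(0)
    precip = rng.gamma(shape=0.8, scale=6.0, size=(21, 10, 10))

    for thr in (1.0, 5.0, 10.0, 25.0):     # several thresholds, as on the web products
        pqpf = exceedance_probability(precip, thr)
        print(f"P(24-h precip > {thr:4.1f} mm): max {pqpf.max():5.1f}%")
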
Implementations over the Past Year
• Alaska downscaling – Dec. 7th 2010
– 6-km probabilistic guidance for the Alaska region (improvements for max/min temperature, wind speed/direction)
• NUOPC – IOC Jan. 18 2011
– FNMOC ensemble data (both raw and bias corrected) made available at NCEP and on NOMADS for the public
– Ceremony for the NUOPC IOC
• NAEFS products upgrade – March 1st 2011
– Adding more variables for exchange and more variables for bias correction
• 5th Ensemble User Workshop – May 9-11 2011
– In Laurel, Maryland (highlight)
• NAEFS data upgrade – May 24th 2011
– Receive CMC’s GRIB2 bias-corrected forecasts directly
• CCPA upgrade – July 26th 2011
– Adding 3-hourly analyses for regional applications
• CMC’s GEFS upgrade – August 17th 2011 (highlight)
• FNMOC’s GEFS upgrade – September 14th 2011 (highlight, stats)
• NMME (August) and IMME (December) implementation at CPC
• CMC’s GEFS data on NOMADS – November 2011
Ongoing Implementations
• Major GEFS upgrade – Jan. 2012
– Highlights of the changes
– What do we expect from this change?
• Overall skill, TC tracks, bias correction
• More stats: http://www.emc.ncep.noaa.gov/gmb/yzhu/html/imp/201109_imp.html
• NAEFS product upgrade – April 2012
– Extend variables for CONUS downscaling
• Temperature, winds, humidity, and dew point
– Precipitation calibration (CONUS, RFCs)
– Anomaly forecasts from the CFSRR climatology
• 6th NAEFS workshop – May 1-3 2012
– At Monterey, CA – FNMOC will be the local host
• GEFS reforecast – August 2010 onward
– Led by Tom Hamill (ESRL)
– Bias correction for TS forecasts
• CCPA update
– Use more historical data for calibration
• Additional support
– CSTAR (Collaborative Science, Technology, and Applied Research) program – ensemble sensitivity analysis
Future Plans
• GEFS
– Plan for FY2013-2015 (EMC-ESRL collaboration)
• NAEFS
– Increasing product resolution and output frequency
– Improving calibration methods
– Applying the GEFS reforecast for calibration
– EMC-MDL collaboration on higher-moment adjustment
– EMC-OHD collaboration on improving temperature and precipitation uncertainty forecasts
• NUOPC – adding the FNMOC ensemble to NAEFS
– Project information and highlights
– Evaluation metrics
– Tentative schedule
• Extended-range forecasts
– Coupled ensemble system
– Extended to 45 days to cover week 3 and week 4
• Seamless forecast system
– Exchange extended forecast data with CMC
– Support IMME and NMME to improve MJO prediction
• User requests
– Mean sea level pressure – more useful for users
– Relative humidity or surface dew point – more useful for users
– Anomaly forecasts – extreme weather index
– Clustering (questionable for a single-model ensemble?)
[Screenshot] http://mag.ncep.noaa.gov/NCOMAGWEB/appcontroller – NCEP model guidance page with Global ensemble, NAEFS, and Regional ensemble sections.
[Screenshot] http://www.cpc.ncep.noaa.gov/products/predictions/short_range/NAEFS/Outlook_D264.00.php – example of the CPC extended temperature forecast (upper- and lower-tercile probabilities).
[Figure] Map of PQPF and precipitation types: every 6 hours, 4 different thresholds.
[Figure] Example of extratropical cyclone tracking: the tracker produces output four times a day for all cyclones (Northern Hemisphere).
[Figure] NAEFS products – metagram (examples).
[Figures] Application for the Alaska region and the HPC Alaska desk: verification of 10-m U, 10-m wind speed, and max temperature. Solid: RMS error and bias (absolute value); dash: spread; CRPS (smaller is better).
5th NCEP Ensemble User Workshop
• Logistics
– Workshop organized by EMC/NCEP and DTC/NCAR (co-organizer)
– May 10-12 2011, Laurel, MD, 90+ participants
• NWS Regions (6), Headquarters (17), NCEP (44)
• OAR (5), other government agencies (4), private (2), academic (5) & international (11)
– For further info, see: http://www.dtcenter.org/events/workshops11/det_11/
• Main theme
– How to support NWS in its transition from single-value to probabilistic forecasting
• Goal is to convey forecast uncertainty in a user-relevant form
• 46 presentations
– Covering all ensemble forecast systems
• SREF, GEFS/NAEFS, wave ensemble, CFS, and NMME
– Reports from NCEP Service Centers and Regions (WFOs)
• E.g., first numerical ensemble-based 2-day tornado, week 3-4, and monthly MJO outlooks
• Working groups
– Ensemble configurations
– Ensemble forecasting
– Statistical post-processing
– Reforecast/hindcast generation
– Probabilistic product generation
– Forecaster’s role and training
– Ensemble data depository / access
– Database interrogation / forecaster tools
• Outcome / recommendations
– Prepared a report for NWS roadmap reference
• Plan for immediate steps (interim solution to be implemented in 2-3 years)
• Outline of the long-term solution and resource requirements (5-10 years)
– All activities to be coordinated under the NWS Forecast Uncertainty Program (NFUSE)
CMC’s GEFS Implementation
• Modified EnKF analysis configuration
– Uses 192 ensemble members instead of 96
– Assimilates more satellite data
• Upgraded GEM version
– Uses version 4.2 (vertical staggering) instead of version 3.0
• Increased model top to 2 hPa from 10 hPa
• Resolutions
– Horizontal: 600x300 (66 km) from 400x200 (100 km)
– Vertical: 40 levels from 28 levels
FNMOC’s GEFS Implementation and Plan
• 9-latitude-band Ensemble Transform initialization instead of 5 bands
• T159L42 instead of T119L30 horizontal and vertical resolution
• Plan for T239L42 in operations for June 2012
• Implement bias correction for selected variables (NAEFS algorithm)
• Implement a forecast-versus-observation verification system
• NUOPC (FNMOC+NCEP+CMC) products
[Figure] Tropical cyclone track error, T119 / T159 / T239: homogeneous-sample average track error (km) versus forecast hour for CTL, G119, G159, and G239.
Homogeneous NH TC track forecast error (km) for the G119, G159, and G239 ensemble-mean tracks as denoted in the key. Also shown is the average forecast error of the T239L30 NOGAPS operational deterministic forecast (CTL). The number of verifying forecasts is shown below the x-axis. The differences between G119 and G159 are statistically significant at the 95% level out to 96 h. The differences between G159 and G239 are statistically significant at 48 and 72 h. All used global ET.
(Fig. 3 from “Impact of Resolution and Design on the U.S. Navy Global Ensemble Performance in the Tropics,” Reynolds et al., MWR, July 2011, pp. 2145-2155.)
Proposed Changes
• Model and initialization
– Using GFS V9.01 (current operational GFS) instead of GFS V8.00
– Improved Ensemble Transform with Rescaling (ETR) initialization
– Improved Stochastic Total Tendency Perturbation (STTP)
• Configuration
– T254 (55 km) horizontal resolution for 0-192 hours (from T190 – 70 km)
– T190 (70 km) horizontal resolution for 192-384 hours (same as current operations)
– L42 vertical levels for 0-384 hours (from L28)
• Add sunshine duration for the TIGGE data exchange
• Some products will be delayed by approximately 20 minutes
– Due to limited CCS resources
– 40-42 nodes for 70 minutes (start: +4:35, end: +5:45)
• Unchanged:
– 20+1 members per cycle, 4 cycles per day
– pgrb file output at 1x1 degree every 6 hours
– GEFS and NAEFS post-processed output data format
• Why this configuration?
– Considering the limited resources; resolution makes a difference
• What do we expect from this implementation?
– Improved probabilistic forecast skill overall
– Significant improvement of tropical storm tracks (especially for the Atlantic basin)
[Figures] Anomaly correlation, GFS V8.0 vs. V9.0, two winter months: NH and SH 500 hPa height and NH and SH 850 hPa temperature; skill horizons of 11.00 d and 10.25 d are marked against the skill line.
[Figure] Atlantic TC track error, AL01-19 (06/01-11/30/2011): track error (NM) versus forecast hour for GEFSo (GEFS T190, operational run), GEFSx (GEFS T254, parallel run), and GFS (GFS T574, operational run). GEFSx ran once per day before October. Improvements of 11%, 20%, 22%, and 12% are annotated on the figure, and the number of cases at each forecast hour is shown below the x-axis.
[Figures] CRPS for NH 850 hPa temperature, NH 2-m temperature, NH 10-m U-wind, and NH 10-m V-wind.
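For reference, the CRPS shown in these panels is the continuous ranked probability score; for a forecast CDF F and verifying value x_o it can be written (a standard definition, not taken from the slides) as

    \mathrm{CRPS} = \int_{-\infty}^{\infty} \left[ F(x) - \mathbf{1}\{x \ge x_o\} \right]^2 \, dx

Smaller values are better, consistent with the “CRPS – small is better” note elsewhere in these slides.
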
[Figures] CRPS for CONUS Tmax, Tmin, and temperature.
Latest evaluation of CONUS temperature forecasts obtained by applying:
1. Bias correction at 1x1 degree for NCEP GFS/GEFS and CMC GEFS
2. Hybrid of bias-corrected NCEP GFS and GEFS
3. Statistical downscaling of all bias-corrected forecasts
4. Combination of all forecasts on the 5x5 km (NDGD) grid with adjustment
[Figures] NAEFS CRPS for 10-m U and V wind, wind speed, and wind direction.
[Figures] RMS error, spread, and CRPS for dew-point temperature and relative humidity.
Precipitation calibration for the 2009-2010 winter season (CONUS only)
[Figures] Comparison of GFS and the ensemble control (raw and bias corrected): ETS and bias for all lead times and for the 0-6 h forecast (perfect bias = 1.0). Courtesy of Yan Luo.
The probabilistic scores (CRPS, not shown here) are much improved as well. We are still working on different weights, different RFC regions, and downscaling to 5 km; more results will come soon. Planned implementation: Q4FY11.
Significantly reduced bias for CONUS and each RFC
[Figures] Precipitation bias (perfect bias = 1.0) at 1x1 degree, before and after bias correction, for CONUS and the individual RFCs (e.g., CBRFC and OHRFC shown).
• Consistently effective across lead times
• More effective for lower precipitation amounts
Experimental maps to support the CSTAR program for the winter season.
GEFS Implementation Plan for FY13-15
• Hybrid data assimilation based GEFS initialization
– Using the 6-hr EnKF forecast combined with improved ETR (without cycling) (schematic diagram)
– Improving ETR (still under discussion and investigation); see the sketch after this slide
• Adaptive modification of initial and stochastic model perturbation variances
• Based on recursive-average monitoring of forecast errors and ensemble spread
• Avoids having to retune perturbation size after each analysis/model/ensemble change
• Improved performance and easier maintenance
• Real-time generation of hindcasts (pending resources)
– Make a control forecast once every ~5th day (6 runs for each cycle)
• T254L42 (0-192 h) and T190L42 (192-384 h), using the new reanalysis (~30 y)
– Increasing the sample of analysis-forecast pairs for statistical corrections
– Improving bias correction beyond 5 days
– Potential for regime/situation dependent bias correction
• Coupled ocean-land-atmosphere ensemble
– Couple MOM4/HYCOM with the land-atmosphere component using ESMF
• Depending on skill, extend integration to 35 days
• Merge forecasts with the CFS ensemble for a seamless weather-climate interface
• Land perturbation and surface perturbation (later)
– Explore predictability on the intra-seasonal time scale
– Potential skill beyond 15 days
• Hydro-meteorological (river flow) ensemble forecasting
– Pending operational LDAS/GLDAS and RFC applications
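The “adaptive modification of perturbation variances based on recursive-average monitoring of forecast errors and ensemble spread” above can be pictured as a simple feedback loop. The sketch below is one minimal interpretation; the 2% memory weight and the target spread-to-error ratio of 1 are assumptions, not the operational values.

    def update_ratio(prior_ratio, spread, rmse, w=0.02):
        """Recursive (decaying) average of the spread-to-error ratio."""
        return (1.0 - w) * prior_ratio + w * (spread / rmse)

    def rescale_factor(ratio, target=1.0):
        """Inflate perturbations when spread runs below the error, deflate when above."""
        return target / ratio

    # Example: monitored 24-h spread and RMSE pairs after a model upgrade.
    ratio = 1.0
    for spread, rmse in [(0.8, 1.0), (0.82, 1.05), (0.85, 1.1)]:
        ratio = update_ratio(ratio, spread, rmse)
    print(f"spread/error ratio: {ratio:.3f}, perturbation rescale: {rescale_factor(ratio):.3f}")
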
Development of Statistical Post-Processing for NAEFS
Future Configuration of the EMC Ensemble Post-Processor
• Opportunities for improving the post-processor
– Utilization of additional input information
• More ensembles, high-resolution control forecasts (hybrid?)
• Using reforecast information to improve week-2 and precipitation forecasts
• Analysis fields (such as RTMA)
– Improving calibration techniques
• Calibration of higher moments (especially spread)
• Use of objective weighting when combining input fields
• Processing of additional variables with non-Gaussian distributions
– Improved downscaling methods
Project Information and Highlights
• Evaluate the value added to the current NAEFS by inclusion of the FNMOC ensembles
– Current NAEFS products attached
• Period: December 1st 2011 – May 31st 2012
– Covers the winter and spring seasons
• Available data at each participating center
– NCEP
• Raw and bias-corrected NCEP, CMC, and FNMOC ensembles
– CMC
• Raw and bias-corrected NCEP, CMC, and FNMOC ensembles
– FNMOC
• Raw NCEP, CMC, and FNMOC ensemble data only
– AFWA
• Raw NCEP, CMC, and FNMOC ensemble data only
Evaluation Metrics
• NUOPC evaluation metrics
– RMSE (MAE) and spread of the ensemble mean, CRPS, Brier score for selected thresholds
– Targeted evaluation parameters:
• 2-m T, 10-m winds, 250 hPa winds, 700 hPa RH, 500 hPa Z (targeted for the next NCEP implementation)
• TS tracks, precipitation (potential for future consideration)
• Significant wave height, total cloud cover
– NCEP is expected to evaluate mostly against its own analyses (GSI and RTMA) and observations (to connect with NCEP users)
• NAEFS evaluation metrics
– RMSE and spread of the ensemble mean, CRPS (resolution and reliability), etc.
– Targeted evaluation parameters:
• 2-m T, 10-m winds, 250 and 850 hPa winds, 500 and 1000 hPa Z, 850 hPa T (targeted for the next NCEP implementation)
• Total precipitation (raw)
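A minimal sketch of two of the metrics named above (ensemble-mean RMSE with spread, and the Brier score for a selected threshold); the sample fields and the 290 K threshold are placeholders.

    import numpy as np

    def rmse_and_spread(members, analysis):
        """RMSE of the ensemble mean and the mean ensemble standard deviation."""
        mean = members.mean(axis=0)
        rmse = np.sqrt(np.mean((mean - analysis) ** 2))
        spread = members.std(axis=0, ddof=1).mean()
        return rmse, spread

    def brier_score(members, analysis, threshold):
        """Brier score of the ensemble exceedance probability against the analysis."""
        p = np.mean(members > threshold, axis=0)          # forecast probability
        o = (analysis > threshold).astype(float)          # observed occurrence (0/1)
        return np.mean((p - o) ** 2)

    rng = np.random.default_rng(1)
    analysis = rng.normal(288.0, 5.0, size=(50, 50))       # placeholder 2-m T analysis (K)
    members = analysis + rng.normal(0.0, 2.0, size=(21, 50, 50))

    print("RMSE, spread:", rmse_and_spread(members, analysis))
    print("Brier (T > 290 K):", brier_score(members, analysis, 290.0))
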
Scheduling
• Current – November 30: preparation
• December 1st 2011 – May 31st 2012: data collection and evaluation at all participating centers
– Mid-term performance review through NAEFS and UEO meetings
– Another performance review at the NAEFS workshop (May 1-3, Monterey, CA)
• Late June
– Full evaluations from all participating centers
– Possibly a one-day meeting at Silver Spring/Camp Springs or the NUOPC workshop in June (???)
• Mid-July
– A decision will be made on whether to recommend NCEP implementation
• July 1st – deadline for EMC RFCs for implementation
• August – September 2012: NCO testing and real-time parallel for NCEP user evaluations (see additional slide for details)
• September 25 2012: target for implementation
Future Seamless Forecast System
NCEP/GEFS plans for T254L42 (2011 GFS version) resolution with tuned ETR initial perturbations and an adjusted STTP scheme, 21 ensemble members, forecasts out to 16 days, 4 cycles per day; extended to 45 days at T126L28/42 resolution, 00 UTC only (coupling is still an issue?).
NAEFS will include the FNMOC ensemble in 2011, with improved post-processing including bias correction, dual resolution, and downscaling.
[Schematic] Weather/climate linkage – SEAMLESS:
• GEFS/NAEFS service (week 1, week 2) – main products:
1. Probabilistic forecasts every 6 hr out to 16 days, 4 times per day: 10%, 50%, 90%, ensemble mean, mode, and spread
2. D6-10 and week-2 temperature and precipitation probabilistic mean forecasts for above-normal, below-normal, and normal categories
3. MJO forecasts (weeks 3 & 4 …)
• CFS service (one month and beyond) – main events: MJO, ENSO predictions???, seasonal forecasts???
The operational CFS was implemented in Q2FY2011 with T126L64 atmospheric model resolution (CFSv2, 2010 version), fully coupled land-ocean-atmosphere (GFS+MOM4+NOAH), 4 members per day (using the CFS reanalysis as initial conditions, one day older?), integrated out to 9 months.
Future: initially perturbed CFS.
Flow Chart for the Hybrid Variational and Ensemble Data Assimilation System (HVEDAS) – Concept
[Schematic] A lower-resolution ensemble forecast (t = j-1 to j) feeds the EnKF assimilation at t = j and supplies the estimated background error covariance (from the 6-hour ensemble forecast) to the higher-resolution GSI/3DVAR analysis in a two-way hybrid. The ensemble mean may be replaced by the hybrid analysis before the ensemble initialization of the extended ensemble forecast (t = j out to 16 days) and the short ensemble forecast (t = j to j+1), after which the EnKF assimilation and GSI/3DVAR analysis repeat at t = j+1.
BACKGROUND
[Figure] Atlantic TC track error, AL01-17 (06/01-09/30/2011): track error (NM) versus forecast hour for GEFSo (GEFS T190, operational run), GEFSx (GEFS T254, parallel run), and GFS (GFS T574, operational run). Improvements of 17%, 24%, 26%, and 13% are annotated on the figure, and the number of cases at each forecast hour is shown below the x-axis.
NAEFS downscaling parameters and products (NDGD resolutions)
Last update: May 1st 2010

Variable              Domains         Resolutions    Total 4/8
Surface pressure      CONUS/Alaska    5 km/6 km      1/1
2-m temperature       CONUS/Alaska    5 km/6 km      1/1
10-m U component      CONUS/Alaska    5 km/6 km      1/1
10-m V component      CONUS/Alaska    5 km/6 km      1/1
2-m maximum T         Alaska          6 km           0/1
2-m minimum T         Alaska          6 km           0/1
10-m wind speed       Alaska          6 km           0/1
10-m wind direction   Alaska          6 km           0/1

All products at 1x1 degree (lat/lon) globally: ensemble mean, spread, 10%, 50%, 90%, and mode.
Note: the Alaska products are in real-time parallel; expected implementation: Q1 FY2011.
NEXT NAEFS pgrba_bc files (bias correction)

Variable     Levels in pgrba_bc file                                                        Count (new)
GHT          10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa                            10 (3)
TMP          2 m, 2-m max, 2-m min, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa     13 (3)
UGRD         10 m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa                      11 (3)
VGRD         10 m, 10, 50, 100, 200, 250, 500, 700, 850, 925, 1000 hPa                      11 (3)
VVEL         850 hPa                                                                         1 (1)
PRES         Surface, PRMSL                                                                  2 (0)
FLUX (top)   ULWRF (TOA – OLR)                                                               1 (1)
Total: 49 (14); 14 new variables.
2-m temperature 10%/90% probability forecast verification
Northern Hemisphere, period of Dec. 2007 – Feb. 2008 (3-month verification)
[Figure] Verified probabilities (percent) versus lead time (24-384 hours) for P10 and P90: expected (perfect), NCEP raw, and NAEFS.
Top: 2-m temperature probabilistic forecast (10% and 90%) verification; red: perfect, blue: raw, green: NAEFS.
Left: example of probabilistic forecasts (meteogram) for Washington DC, every 6 hr out to 16 days from 2008042300.
Decision Theory Example
Critical event: sfc winds > 50 kt
Cost (of protecting): $150K; Loss (if damage): $1M

Cost matrix (forecast vs. observed):
                   Observed YES       Observed NO
Forecast YES       Hit: $150K         False alarm: $150K
Forecast NO        Miss: $1000K       Correct rejection: $0K

Cost ($K) by case, for the deterministic forecast and for protective action taken at each probability threshold:

Case  Det. fcst (kt)  Obs (kt)  Det. cost ($K)  Prob. fcst   0%    20%   40%   60%   80%   100%
1     65              54        150             42%          150   150   150   1000  1000  1000
2     58              63        150             71%          150   150   150   150   1000  1000
3     73              57        150             95%          150   150   150   150   150   1000
4     55              37        150             13%          150   0     0     0     0     0
5     39              31        0               3%           150   0     0     0     0     0
6     31              55        1000            36%          150   150   1000  1000  1000  1000
7     62              71        150             85%          150   150   150   150   150   1000
8     53              42        150             22%          150   150   0     0     0     0
9     21              27        0               51%          150   150   150   0     0     0
10    52              39        150             77%          150   150   150   150   0     0

Total cost ($K):                2,050                        1,500 1,200 1,900 2,600 3,300 5,000

Optimal threshold = 15%
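The table can be reproduced with a few lines of code: for each probability threshold, protect whenever the forecast probability reaches the threshold, pay the protection cost if you protect, and pay the loss if you do not protect and the event (winds > 50 kt) occurs. The sketch below is an illustration using the ten cases above.

    COST, LOSS = 150, 1000                     # $K: protection cost and unprotected loss
    # (probabilistic forecast, observed wind in kt) for the ten cases in the table
    cases = [(0.42, 54), (0.71, 63), (0.95, 57), (0.13, 37), (0.03, 31),
             (0.36, 55), (0.85, 71), (0.22, 42), (0.51, 27), (0.77, 39)]

    def total_cost(threshold):
        """Total cost when the user protects whenever prob >= threshold."""
        total = 0
        for prob, wind in cases:
            if prob >= threshold:
                total += COST                  # protective action taken
            elif wind > 50:
                total += LOSS                  # event occurred unprotected
        return total

    for thr in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0):
        print(f"threshold {thr:3.0%}: total cost ${total_cost(thr)}K")
    # The minimum sits near the cost/loss ratio 150/1000 = 15%, the optimal threshold.
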
Overall temperature forecasts: average over the past 30 days (20080929-20081028)

Lead     MAE    Bias   >10 err   <3 err    Off. rank     Best G.         2nd G.          Worst G.
12-hr    2.44    0.7    0.1%     67.3%     1 out of 7    NAM40 65.4%     NAM12 60.1%     NGM80 44.4%
24-hr    2.84    1.0    0.3%     59.1%     2 out of 7    NAM40 60.3%     NAM12 56.9%     SREF 47.0%
36-hr    2.94    0.8    0.3%     57.8%     1 out of 7    NAM40 55.9%     NAM12 52.6%     NGM80 44.0%
48-hr    3.36    1.6    2.1%     52.8%     1 out of 7    MOSGd 48.9%     NAM40 48.3%     NGM80 12.9%
60-hr    3.26    1.0    1.7%     54.8%     1 out of 6    MOSGd 50.1%     NAM12 48.8%     NAM40 6.2%
72-hr    3.35    1.3    2.1%     53.1%     1 out of 5    MOSGd 49.9%     NAM12 49.5%     SREF 44.0%
84-hr    3.80    0.6    4.7%     49.0%     1 out of 5    NAEFS 48.6%     SREF 44.5%      NAM12 2.6%
96-hr    3.96    0.7    4.0%     44.4%     2 out of 4    NAEFS 46.2%     HPCGd 42.6%     MOSGd 40.6%
108-hr   4.43    0.9    5.5%     38.5%     2 out of 3    NAEFS 41.7%     MOSGd 37.7%     MOSGd 37.7%
120-hr   4.57    1.0    5.9%     36.6%     2 out of 4    NAEFS 40.9%     HPCGd 36.5%     MOSGd 36.3%
132-hr   4.83    0.7    7.8%     34.7%     1 out of 3    NAEFS 34.5%     MOSGd 34.4%     MOSGd 34.4%
144-hr   4.83    0.5    7.4%     34.7%     3 out of 4    HPCGd 36.4%     NAEFS 35.5%     MOSGd 33.3%
156-hr   5.43    0.1   11.9%     30.3%     3 out of 3    NAEFS 32.1%     MOSGd 30.8%     MOSGd 30.8%
168-hr   5.74    0.3   14.4%     27.7%     2 out of 4    HPCGd 27.7%     MOSGd 26.9%     NAEFS 26.1%
Minimum temperature forecasts: average over the past 30 days (20080929-20081028)

Lead     MAE    Bias   >10 err   <3 err    Off. rank     Best G.         2nd G.          Worst G.
12-hr    3.17   -1.2    1.0%     53.4%     3 out of 7    NAEFS 59.7%     SREF 57.1%      NGM80 21.8%
24-hr    3.03   -0.9    0.6%     55.5%     2 out of 7    SREF 57.2%      NAEFS 54.2%     NGM80 24.9%
36-hr    3.25   -0.8    0.9%     51.6%     3 out of 7    NAEFS 54.2%     SREF 53.9%      NGM80 23.2%
48-hr    3.94   -1.1    2.9%     43.2%     3 out of 7    NAEFS 51.9%     SREF 45.8%      NGM80 6.2%
60-hr    4.30   -0.4    4.4%     39.1%     4 out of 6    NAEFS 49.2%     SREF 43.0%      NAM40 8.9%
72-hr    4.76    0.1    6.4%     33.7%     5 out of 5    NAEFS 42.9%     SREF 40.1%      NAM12 35.2%
84-hr    4.85    0.3    7.5%     34.7%     2 out of 6    NAEFS 40.0%     MOSGd 33.4%     NAM12 8.9%
96-hr    5.24    0.4   13.0%     33.1%     1 out of 3    NAEFS 32.7%     MOSGd 29.9%     MOSGd 29.9%
108-hr   5.11    0.8   12.8%     35.4%     1 out of 4    HPCGd 34.5%     NAEFS 32.1%     MOSGd 30.5%
120-hr   5.31    0.7   12.0%     31.9%     1 out of 3    MOSGd 31.6%     NAEFS 24.8%     NAEFS 24.8%
132-hr   4.97    0.7    9.9%     35.1%     2 out of 4    HPCGd 38.0%     MOSGd 30.9%     NAEFS 27.2%
144-hr   5.42    0.6   15.0%     35.0%     1 out of 3    MOSGd 31.3%     NAEFS 29.0%     NAEFS 29.0%
156-hr   5.40    0.5   14.9%     35.7%     1 out of 4    HPCGd 32.9%     MOSGd 32.7%     NAEFS 23.4%
168-hr   5.46    1.1   17.7%     38.1%     1 out of 3    MOSGd 35.6%     NAEFS 28.4%     NAEFS 28.4%

Contributed by Richard Grumm (WFO)
Ocean Wave Ensemble System
- Hendrik Tolman
• Configuration of the ocean wave ensemble system (running since 2008)
– Runs on a 1°×1° wave model grid as the control
– 20 wave members generated through GEFS using the ETR method
– Cycling of initial conditions for individual members to introduce uncertainty in swell results
– 10-day forecasts using the GEFS bias-corrected 10-m wind (future operations)
• Improving forecast uncertainties through
– Introducing ensemble initial perturbations from the previous model cycle
– Introducing bias-corrected ensembles as external forcing
– Example of comparison (wave heights)
• Plans
– Work toward a combined NCEP-FNMOC ensemble
– Analyze the role swell plays in the wave ensemble
[Figure] Comparison of the ensemble systems: the old ensemble setup, the ensemble with cycling of initial conditions, and cycling plus wind bias correction (BC). Mean wave height (contours) and spread (shading), 2008/03/28 t06z 120-h forecast.
Alaska NAEFS Wind Speed MAE, July-October 2010
[Figure] MAE (m/s) versus forecast time (Day 4 through Day 8, by cycle) for HPC, NDFD, GEMODE, GEAVG, GE50PT, GMOS00, and GMOS12. The 50th percentile (median) and the mean are best. Courtesy of Dave Novak.
EMC-MDL COLLABORATION
• Compare quality of current operational / experimental products
– Gridded MOS vs. Downscaled NAEFS
• Ongoing
– Kathy Gilbert, Val Dragostano – Zoltan Toth, Bo Cui, Yuejian Zhu
• Proxy for truth issue unresolved
– Need observations independent of MOS
– MDL experimental ensemble guidance vs. Downscaled NAEFS
• 10/50/90 percentiles to be evaluated
– Matt Peroutka & Zoltan Toth
• Proxy for truth issue
• Proxy for truth?
– Agree on best proxy for truth
• Collaborate on
– Improving RTMA, including bias correction for FG
– Creating best CONUS precipitation analysis & archive
• Joint research into best downscaling methods?
– Climate, regime, case dependent methods
– Addition of fine temporal/spatial variability into ensemble
SREF Probability of STP
Ingredients: time trends – 24-h SREF forecast valid 21 UTC 7 April 2006
[Figure] Combined probability = Prob(MLCAPE > 1000 J kg-1) × Prob(6-km shear > 40 kt) × Prob(0-1 km SRH > 100 m2 s-2) × Prob(MLLCL < 1000 m) × Prob(3-h conv. pcpn > 0.01 in). Maximum about 50%; shaded area: prob > 5%.
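The slide combines the ingredient probabilities multiplicatively; the sketch below illustrates that combination. The example probability fields are placeholders, and treating the ingredients as independent is the simplification implied by the product.

    import numpy as np

    def combined_probability(*ingredient_probs):
        """Multiply per-grid-point ingredient probabilities (values in 0-1)."""
        combined = np.ones_like(ingredient_probs[0])
        for p in ingredient_probs:
            combined *= p
        return combined

    # Placeholder SREF ingredient probabilities on a small grid (fractions, not %).
    shape = (5, 5)
    p_cape  = np.full(shape, 0.6)   # P(MLCAPE > 1000 J/kg)
    p_shear = np.full(shape, 0.7)   # P(6-km shear > 40 kt)
    p_srh   = np.full(shape, 0.5)   # P(0-1 km SRH > 100 m2/s2)
    p_lcl   = np.full(shape, 0.8)   # P(MLLCL < 1000 m)
    p_pcpn  = np.full(shape, 0.9)   # P(3-h convective precip > 0.01 in)

    stp_prob = combined_probability(p_cape, p_shear, p_srh, p_lcl, p_pcpn)
    print(f"combined probability: {stp_prob.max():.2f}")   # 0.6*0.7*0.5*0.8*0.9 ~= 0.15
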
BACKGROUND
User Requirements - Simplified
• Applications affected by (extreme / high-impact) weather
– Must consider weather information to
• Minimize losses due to adverse weather
• Optimal user decision threshold equals
– The probability of adverse weather exceeding the
• Cost / loss ratio of the decision situation (simplified decision theory)
• Probability of weather events must be provided
– Only option in the past: based on error statistics of single-value forecasts
• E.g., MOS POP
• Now can be based on ensemble statistical information (e.g., RMOP)
– Users act when the forecast probability exceeds their cost/loss ratio (example)
• Advantages
– A set of products (e.g., 10 / 50 / 90 percentile forecasts, metagram, mean and mode)
• Problems???
– Proliferation of the number of products
• For different variables, probability / weather-element thresholds, joint probabilities
– Limited usage
• Downstream applications severely limited (e.g., wave, streamflow, etc., ensembles not possible)
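Stated as a formula, the simplified decision rule above (a standard cost/loss result, not additional material from the slide) is

    \text{take protective action if } \; p > C/L

where p is the forecast probability of the adverse event, C the cost of protecting, and L the loss if the event occurs unprotected; in the decision-theory example earlier, C/L = 150/1000 = 15%.
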
User Requirements - Advanced
• Advanced information
– Statistically reliable ensemble forecast products
– Ensemble statistical data – historical information
– 6D-cube – space (3D) + time + variables + ensemble
• Joint probabilities
– Many variables, different probabilities / critical-value decision thresholds
– Some (or many) forecast events involve related joint probabilities
– Must be easy to operate, with quick information access
• Application
– Simulating optimal weather-related operations
– Simulating different user procedures for multiple plausible weather scenarios
– Able to tell what the actions / costs / benefits are, assuming the weather is known
– Probability of significant convection
– Fire weather
• User application model (UAM) – see the sketch after this list
– Expanded NDFD – the future official NWS weather / climate / water forecast database
– Run the UAM n x n times with multiple weather scenarios from each ensemble member (n) and user procedures (n)
– Weather scenario from each ensemble member, generated from optimized user procedures
– Take the ensemble mean of the economic outcome (costs + losses) for each set of user procedures
– Choose the operating procedures that minimize costs and losses in the expected sense
– Make optimal weather-related decisions
• Challenge
– Requires storage / telecom bandwidth
– Requires smart sub-setting & interrogation tools that can derive any weather-related information, including joint probabilities
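A schematic of the UAM idea above: run a user application model for every combination of ensemble member and candidate operating procedure, average the economic outcome over the members, and pick the procedure with the lowest expected cost. Everything in the sketch below (the cost model, the member values, the procedures) is a made-up placeholder, not an operational UAM.

    import numpy as np

    def run_uam(weather_scenario, procedure):
        """Hypothetical user application model: returns cost + loss ($K) for one
        weather scenario (here, peak wind in kt) under one operating procedure."""
        protect_threshold, protect_cost = procedure
        if weather_scenario >= protect_threshold:
            return protect_cost                           # protective action taken
        return 1000.0 if weather_scenario > 50 else 0.0   # unprotected loss or nothing

    # n ensemble members (peak-wind scenarios) x candidate procedures.
    members = np.array([54, 63, 57, 37, 31, 55, 71, 42, 27, 39], dtype=float)
    procedures = {"protect if forecast >= 40 kt": (40, 150),
                  "protect if forecast >= 60 kt": (60, 150),
                  "never protect":                (999, 150)}

    for name, proc in procedures.items():
        expected = np.mean([run_uam(w, proc) for w in members])
        print(f"{name}: expected cost ${expected:.0f}K")
    # Choose the procedure with the smallest expected cost over the ensemble.
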
[Figure] NCEP/GEFS raw forecast vs. NAEFS final products: 4+ days of gain from NAEFS post-processing (bias correction for NCEP and CMC, dual resolution for NCEP only, downscaling for NCEP and CMC, and combination of NCEP and CMC).
[Figure] NCEP/GEFS raw forecast vs. NAEFS final products: 8+ days of gain (bias correction for NCEP and CMC, dual resolution for NCEP only, downscaling for NCEP and CMC, and combination of NCEP and CMC).
Data on NOMADS
• SREF grid221 (North America, 30km) for individual
members
• SREF bias corrected grid212 (CONUS, 40km) for
individual members
• GEFS raw forecast (1*1 degree globally) for individual
members
• GEFS bias corrected (1*1 degree globally) for individual
members and products
• NAEFS probabilistic forecast (1*1 degree globally)
• NAEFS downscaled probabilistic forecast (5*5 km
CONUS only)
• NAEFS downscaled probabilistic forecast (6*6 km
Alaska)
Post-Processing – Derived Variables (Plan)
– Objective
• Generate variables not carried in NWP models
• Or variables that cannot be easily calibrated
– E.g., relative humidity
– Input data
• Bias-corrected and downscaled ensemble data (NWP model output)
– Methods
• Model “post-processing” algorithms
– Applied after downscaling for variables affected by surface processes
• SMARTINIT for the global forecast – Geoff Manikin et al.
– NDFD weather element generator
• Other tools?
– Text generation, etc.?
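As a concrete example of a derived variable, relative humidity can be computed from downscaled temperature and dew point with a standard saturation-vapor-pressure formula. The sketch below uses the common Magnus approximation; the coefficients are the usual textbook values, not necessarily those in the operational post-processor.

    import numpy as np

    def saturation_vapor_pressure(t_c):
        """Magnus approximation, hPa, for temperature in degrees Celsius."""
        return 6.112 * np.exp(17.67 * t_c / (t_c + 243.5))

    def relative_humidity(t_c, td_c):
        """RH (%) from 2-m temperature and dew point (deg C)."""
        return 100.0 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

    # Example: downscaled member values of 2-m temperature and dew point.
    t2m = np.array([25.0, 10.0, 0.0])
    td2m = np.array([20.0, 8.0, -5.0])
    print(relative_humidity(t2m, td2m).round(1))   # roughly 74%, 87%, 69%
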
GEFS Operational Extratropical Cyclone Tracking
The system currently includes two components:
• A global cyclone tracking system
• A graphic display web site
Planned GEFS cyclone tracking and verification:
• A unified storm ID for all members will be created during the tracking process;
• This will allow us to obtain a mean track among the 21 members;
• Probabilistic errors will be computed using the mean track, the member tracks, and an analysis track (GFS);
• All tracks will be processed and stored in a MySQL database.
Example: a Nor’easter formed in the Gulf of Mexico. Three days before it impacted the mid-Atlantic states, all single-model forecasts predicted the storm would move out to sea; the GEFS had several members that showed significant impact (tracks close to the best track).
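The “mean track among the 21 members” above is straightforward once each member’s track points share a unified storm ID and common verification times. The sketch below averages member positions at matching lead times; the sample track data are placeholders, and simple lat/lon averaging is an approximation that ignores spherical geometry.

    from collections import defaultdict

    # member_tracks[member_id] = list of (lead_hour, lat, lon) for one unified storm ID
    member_tracks = {
        "p01": [(0, 28.0, -88.0), (24, 31.0, -80.0), (48, 35.0, -74.0)],
        "p02": [(0, 28.2, -88.3), (24, 31.5, -79.0), (48, 36.5, -72.0)],
        "c00": [(0, 27.9, -87.8), (24, 30.8, -81.0), (48, 34.0, -75.5)],
    }

    def mean_track(tracks):
        """Average member positions at each common lead time."""
        by_lead = defaultdict(list)
        for points in tracks.values():
            for lead, lat, lon in points:
                by_lead[lead].append((lat, lon))
        return {lead: (sum(p[0] for p in pos) / len(pos),
                       sum(p[1] for p in pos) / len(pos))
                for lead, pos in sorted(by_lead.items())}

    for lead, (lat, lon) in mean_track(member_tracks).items():
        print(f"{lead:3d} h: {lat:.1f}N {abs(lon):.1f}W")
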