
ASM Project Update:
Atmospheric Component
University of Colorado
Mark W. Seefeldt and John J. Cassano
University of Colorado at Boulder
Cooperative Institute for Research in Environmental Sciences
Department of Atmospheric and Oceanic Sciences
Goals for Year 1 of DOE Project
• Develop and evaluate Polar WRF
– University of Colorado
– Iowa State University
– Sub-contract to Bromwich / Hines - OSU
• Coupling of WRF to CCSM cpl7
– Led by Juanxiong He, UAF
Goals after the Boulder meeting in May 2008
• Create and establish the atmosphere/land
pan-arctic domain
• Run a one-month simulation for September
1998 over the pan-arctic domain
• Continue to develop and evaluate the WRF
physics parameterization over the Arctic
• Evaluate seasonal extremes and impact of fractional sea ice in WRF
– Iowa State University (Bill, Justin, Brandon)
• Continue with coupling of WRF to CCSM cpl7
– UAF (Juanxiong He)
Outline
• Pan-arctic atmosphere/land domain
• Pan-arctic simulation for September 1998
• WRF Pre-processing System (WPS)
• Evaluation of global forcing data for WRF
• WRF polar modifications update
• Introduction to Model Evaluation Tools (MET)
• Evaluation of radiation and microphysics
parameterizations during the SHEBA year
Pan-Arctic Atmosphere/Land Domain
• Requirements for the atmosphere / land domain:
– 50 km grid resolution
– Extend beyond the boundaries of the
ocean/sea ice domain
– Incorporate the entire Arctic watershed basin
Pan-Arctic Atmosphere/Land Domain
• Extend ~ 10 points beyond the Arctic watershed basin
Pan-Arctic Atmosphere/Land Domain
• Extend beyond the boundaries of ocean/sea ice domain
Pan-Arctic Atmosphere/Land Domain
• The combination of the two requirements:
Pan-Arctic Atmosphere/Land Domain
• The two domains with color shaded topography:
Pan-Arctic Atmosphere/Land Domain
• The domains with color shaded topography and bathymetry:
Pan-Arctic Atmosphere/Land Domain
• WPS (WRF Pre-Processing System) settings:
– s_we = 1, e_we = 276
– s_sn = 1, e_sn = 206
– dx = 50000
– dy = 50000
– map_proj = ‘polar’
– ref_lat = 80.5
– ref_lon = 160.0
– truelat1 = 80.5
– stand_lon = -114.0
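For reference, a minimal sketch of how these settings would sit in the &geogrid record of namelist.wps is given below. The authoritative file is namelist.wps.pan_arctic (next slide); the geog_data_res and geog_data_path entries here are placeholders that depend on the local WPS installation.

  &geogrid
   s_we           = 1,
   e_we           = 276,
   s_sn           = 1,
   e_sn           = 206,
   dx             = 50000,
   dy             = 50000,
   map_proj       = 'polar',
   ref_lat        = 80.5,
   ref_lon        = 160.0,
   truelat1       = 80.5,
   stand_lon      = -114.0,
   geog_data_res  = '10m',            ! placeholder: resolution of static data
   geog_data_path = '/path/to/geog'   ! placeholder: path to the WPS geog data
  /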
Pan-Arctic Atmosphere/Land Domain
• All files related to the pan-arctic atmosphere /
land domain can be retrieved from ARSC:
– namelist.wps.pan_arctic (WPS namelist)
– geo_em.d01.nc (WPS geogrid.exe output)
– pan_arctic_domain.nc
(a cleaned-up version of the NetCDF output)
– all domain images in png and ps format
• Located at:
/projects/ARSCASM/data/atmos_land-domain/
http://data.arsc.edu/asm/data/atmos_land-domain/
WRF Pan-Arctic Simulation
• At the May 2008 meeting it was agreed for all
groups to supply data for a one-month simulation
– month: September 1998
– data: 6-hourly intervals (preferably NetCDF)
WRF Pan-Arctic Simulation
• WRF Configuration:
– a single continuous run from 00 UTC 31 August
through 00 UTC 1 October 1998 (31 days)
– forcing data: ERA40 – soil / surface values
ECMWF-TOGA – atmosphere
– using the new pan-arctic domain (50 km res.)
– physics:
SW: Goddard
LW: RRTM
Microphysics: Morrison
Cumulus: Grell-Devenyi
PBL: YSU (sfc layer: MM5)
Land Surface: Noah LSM
– output: 6-hour history interval (wrfout NetCDF)
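As a rough sketch only (not the exact namelist used for the run), this configuration maps onto namelist.input entries along the following lines. The radiation and microphysics option numbers match the WRF-ARW numbering used later in this presentation (RRTM = 1, Goddard SW = 2, Morrison = 10); the surface-layer, land-surface, PBL, and cumulus numbers are the standard WRF values for those schemes and should be checked against the WRF version in use. The frames_per_outfile value is an assumption based on the daily wrfout files mentioned on the next slide.

  &time_control
   history_interval   = 360,    ! 6-hourly wrfout output (minutes)
   frames_per_outfile = 4,      ! assumption: 4 x 6-h frames = one file per day
  /
  &physics
   mp_physics         = 10,     ! Morrison microphysics
   ra_lw_physics      = 1,      ! RRTM longwave
   ra_sw_physics      = 2,      ! Goddard shortwave
   sf_sfclay_physics  = 1,      ! MM5 surface layer
   sf_surface_physics = 2,      ! Noah LSM
   bl_pbl_physics     = 1,      ! YSU PBL
   cu_physics         = 3,      ! Grell-Devenyi ensemble cumulus
  /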
WRF Pan-Arctic Simulation
• Initial test – June 2008:
– run locally on a dual-processor Linux system
– one day: ~ 6 hours 47 minutes processing time
– entire month: ~ 8 days
• Created daily wrfout files: 566 MB / day
• Sample wrfout NetCDF file provided to NPS
WRF Pan-Arctic Simulation
• Second test – December 2008:
– run on the midnight system at ARSC
– same configuration as the June 2008 test
– 8 CPUs: one day: ~ 28 minutes
entire month: estimated at 14-15 hours
– 16 CPUs: one day: ~ 19 minutes
entire month: estimated at 10 hours
– 32 CPUs: have not been able to get a test run
– the wrfout NetCDF files will be located at:
/projects/ARSCASM/data/wrf-test-199809/
WRF Pre-processing System (WPS)
• The WPS processes the static terrestrial and the gridded
model forcing data into a format to be used by WRF
WRF Pre-processing System (WPS)
• There are three steps in the WPS process:
– geogrid.exe – creates the domain and processes the
static terrestrial data
– ungrib.exe – extracts the gridded forcing data and
creates ‘intermediate files’
– metgrid.exe – combines the geogrid data with the
intermediate files to create the WRF input files
WRF Pre-processing System (WPS)
• The interpolation to the model horizontal grid is done in
metgrid.exe
• metgrid.exe checks that all mandatory fields are included
(pres, skintemp, T, u, v, ~ psfc)
• Data from non-GRIB sources can be included if it is in the
‘intermediate file’ format (ex: sea ice from NSIDC)
• metgrid.exe creates the met_em* files
WRF – real.exe
• real.exe creates the initial and boundary condition files to
be used by wrf.exe
• The interpolation to the model vertical levels is done in
real.exe
• If a non-mandatory field is not included in the met_em*
files, but it is used by wrf.exe, then real.exe creates the
data
Ex: If sea ice is not included in
the met_em* files, then real.exe
sets all grid points with a
SKINTEMP less than 271 K
(user specified) as sea ice
• real.exe has the ability to
create psfc, but it is best to
have it from the source data
WRF – Global Forcing Data
• WRF works best with the NCEP Final Analyses
• Features:
– 1 x 1 degree global grid every six hours
– daily 30 July 1999 – present
– NCAR – DSS: ds083.2
• It is not a re-analysis – when the model is
changed the dataset is not reprocessed
• The NCEP analyses have been shown to perform
relatively poorly in the Arctic
WRF – Global Forcing Data
• ERA40 has shown the best performance in the
arctic
• A consistent data set covering a range of years
• There are different versions of ERA40 at the NCAR
DSS (best to use pressure level):
– model native resolution:
surface / single levels – N80 (includes psfc) – ds117.0
upper air – T159 spectral resolution – ds117.1
Note: WPS ungrib.exe does not support these grid types
– 2.5 deg lat/lon global grid:
surface / single levels (does not include psfc) – ds118.0
upper air – ds118.1
WRF – Global Forcing Data
• ECMWF-TOGA has shown good performance
in the arctic
• It is not a re-analysis
• The supplemental fields have improved over
the last 10 years
• There are two general versions of ECMWF-TOGA:
– model native resolution:
surface / single levels – N80 (includes psfc) – ds111.0
upper air – T106 spectral resolution – ds111.1
Note: WPS ungrib.exe supports this N80 grid
– 2.5 deg lat/lon global grid:
global surface and upper level analysis – ds111.2
– does not include soil moisture and temperature
WRF – Global Forcing Data
• It has proven to be difficult to select a preferred
source of global forcing data
• ERA40 is the preferred choice
– has the consistency of a re-analysis
– does not have psfc in a format usable by WPS
• ECMWF-TOGA has the psfc values
– the analyses change over time
– older analyses are missing some supplemental fields
• The likely solution for pre-2003 is to use ERA40
and to develop a method to get psfc into the
met_em* files
WRF – Polar Modifications
• Modifications have been made to the native
WRF code for polar applications
– Keith Hines and David Bromwich
Byrd Polar Research Center (OSU)
• The modifications are currently distributed by
BPRC and are not included in WRF releases
– They may be included in the WRF 3.1 release –
Spring 2009
• The modifications fall into two divisions:
– fractional sea-ice
– Noah LSM
• ifdef compiler flags turn on the two divisions
of modifications at compile time
WRF – Polar Modifications
• Fractional sea ice:
– Surface boundary layer routine called separately for
ice and liquid
– Noah or other LSMs can be used
– Can specify sea ice albedo
• Noah LSM:
– Snow/ice emissivity set at 0.98
– Permanent ice albedo set at 0.8
– If snowpack depth > 0.05 m, treat snow within the
prognostic “soil” layers of Noah
– Direct summation of surface energy balance terms
in diagnostic calculation of surface temperature
– Penman Equation evapotranspiration calculations
updated for ice sublimation
*Information from Hines and Bromwich – 2008 AMOMFW presentation
Model Evaluation Tools (MET)
• What is it?
– a set of model evaluation tools for the WRF
community
– developed by scientists, statisticians, and
software engineers
– primary users include WRF developers,
university researchers, and operational centers
– developed and supported by the Development
Testbed Center (DTC)
– support provided by the Air Force Weather
Agency (AFWA) and NOAA
MET Components
• data reformatting modules (ex. ascii2nc)
• statistics modules:
– MODE – object based spatial verification
– Grid-Stat – verification of grids
– Point-Stat – verification of point data
• analysis modules (ex. VSDB analysis tool)
Figure and information courtesy of UCAR/DTC
MET Components – Grid-Stat
• Compares gridded forecasts to observations
(analysis / “truth”)
• Includes ‘neighborhood methods’
• Select multiple: variables, levels, thresholds,
masking regions, smoothing methods,
confidence interval methods (CIs), and alpha
values for CIs
• Output: contingency table counts and statistics
with CIs, continuous statistics with CIs,
neighborhood methods
MET Components – Point-Stat
• Compares observations against a user-specified
region of grid point(s)
– nearest point
– 2x2 box around observation point
– 3x3 box around observation point
• Select multiple: variables, levels, thresholds,
masking regions, matching methods,
confidence intervals (CI) methods
• Several metrics can be used to create the matched
forecast value (min., max., median,
unweighted mean, distance weighted mean)
• User-defined minimum number of valid data points
MET Statistics – Discrete Variables
(based on applying a threshold to a continuous variable)
• Number of observations
• FHO Statistics
• Contingency table counts
• Contingency table proportions
• Accuracy
• Bias
• Probability of Detection of Yes
• Probability of Detection of No
• False Alarm Ratio
• Critical Success Index
• Gilbert Skill Score
• Hanssen and Kuipers Discriminant
• Heidke Skill Score
• Odds Ratio
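For a 2x2 contingency table with hits a, false alarms b, misses c, and correct negatives d, several of these statistics reduce to the standard definitions sketched below (a reference sketch; the MET documentation gives the full set, including confidence intervals):

  \begin{aligned}
  \text{Accuracy} &= \frac{a + d}{a + b + c + d}, &
  \text{Bias} &= \frac{a + b}{a + c},\\
  \text{POD(yes)} &= \frac{a}{a + c}, &
  \text{FAR} &= \frac{b}{a + b},\\
  \text{CSI} &= \frac{a}{a + b + c}, &
  \text{Odds Ratio} &= \frac{ad}{bc}.
  \end{aligned}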
MET Statistics – Continuous Variables
• Mean (fcst./obs.)
• Standard Deviation (fcst./obs.)
• Correlation Coefficients (3 types)
• Mean Error
• Mean Absolute Error
• Mean Squared Error
• Bias-corrected Mean Squared Error
• Root-Mean Squared Error
• Error Percentiles (10th, 25th, 50th, 75th, 90th)
• Partial Sums (1st and 2nd moments of the forecasts,
observations, and errors)
– scalar, anomaly, vector, vector anomaly
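With forecast values f_i and observations o_i over n matched pairs, the core continuous statistics follow the familiar definitions below (shown for reference; BCMSE is the mean squared error with the squared mean error removed):

  \begin{aligned}
  \text{ME} &= \frac{1}{n}\sum_{i=1}^{n} (f_i - o_i), &
  \text{MAE} &= \frac{1}{n}\sum_{i=1}^{n} \lvert f_i - o_i \rvert,\\
  \text{MSE} &= \frac{1}{n}\sum_{i=1}^{n} (f_i - o_i)^2, &
  \text{RMSE} &= \sqrt{\text{MSE}},\\
  \text{BCMSE} &= \text{MSE} - \text{ME}^2.
  \end{aligned}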
WRF Evaluation of Physics
• In May a list of preferred physics parameterizations was
selected based on a quick and limited evaluation
– Land surface: Noah
– Longwave radiation: RRTM
– Shortwave radiation: Goddard
– Boundary layer: YSU or MYJ (MYJ)
– Microphysics: Morrison or WSM5
– Cumulus: Kain-Fritsch or Grell-Devenyi
• 50km ARCMIP domain over the SHEBA field campaign
• Limited number of combinations were evaluated
• Two months were evaluated: January 1998, June 1998
• Results were compared only against the
SHEBA surface observations
WRF Evaluation of Physics
• Past studies indicate that atmospheric models have
difficulty with cloud (microphysics) and radiation
processes in the polar regions
• The performance of a given parameterization depends
on the other parameterizations it is combined with
• Goal: to thoroughly evaluate the WRF microphysics and
radiation processes in the arctic
– test all probable microphysics and radiation
parameterizations
– multiple months through the year
– multiple observation points
– test with 10km and 50km resolutions
– hold all other physics parameterizations constant
WRF Clouds and Radiation Evaluation
• Domain to cover the SHEBA field campaign and the
Barrow ARM observation location
• 50km and 10km domains to cover both locations
WRF Clouds and Radiation Evaluation
• Selected five longwave and shortwave radiation
combinations:
– lw_1-sw_1: LW = RRTM, SW = Dudhia*
– lw_1-sw_2: LW = RRTM, SW = Goddard
– lw_1-sw_3: LW = RRTM, SW = CAM
– lw_3-sw_2: LW = CAM, SW = Goddard
– lw_3-sw_3: LW = CAM, SW = CAM
• Selected six realistic microphysics parameterizations:
– mp_2 – Lin
– mp_4 – WSM5
– mp_6 – WSM6
– mp_7 – Goddard
– mp_8 – Thompson
– mp_10 – Morrison
• All other physics parameterizations are held constant:
– PBL: MYJ
– Land Surface: Noah LSM
– Cumulus: Kain-Fritsch
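Using the option numbering above, each radiation/microphysics combination maps onto a namelist.input &physics record like the sketch below, shown here for lw_3-sw_2 with mp_8 (CAM longwave, Goddard shortwave, Thompson microphysics). The MYJ, Eta surface-layer, Noah, and Kain-Fritsch option numbers are the standard WRF values and should be confirmed for the WRF version used.

  &physics
   mp_physics         = 8,   ! Thompson microphysics (mp_8)
   ra_lw_physics      = 3,   ! CAM longwave (lw_3)
   ra_sw_physics      = 2,   ! Goddard shortwave (sw_2)
   sf_sfclay_physics  = 2,   ! Eta similarity surface layer (pairs with MYJ)
   sf_surface_physics = 2,   ! Noah LSM
   bl_pbl_physics     = 2,   ! MYJ PBL
   cu_physics         = 1,   ! Kain-Fritsch cumulus
  /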
WRF Clouds and Radiation Evaluation
• Selected four months during the SHEBA year:
– 199801 – January – winter
– 199803 – March – seasonal transition
– 199805 – May – transition to liquid clouds
– 199806 – June – summer
• Results in: 5 x 6 x 4 = 120 simulations
• The results will be evaluated on both the 10 km
domain and the 50 km domain (one-way nesting)
• The goal is to identify 2-3 preferred radiation and
microphysics combinations