Global Ocean Data Assimilation

GODAE OceanView
Inter-comparison and Validation
Task Team: Work Plan
Fabrice Hernandez (Mercator Océan, Fr) and Matt Martin (Met Office, UK)
Main objectives of the task team:


• Provide a demonstration of inter-comparison and validation in an operational framework
• Improve inter-comparison and validation methodology for operational oceanography
• Link with other groups and activities
GOV Technical Workshop OSE-IV - Santa Cruz, USA, 13-17 June 2011
IV-TT work plan
• Pursue activities developed during GODAE
• Develop a framework for comparing outputs of the various
operational ocean forecasting systems (OOFS)
– leads to improvements to the systems and to the quality of products
from those systems
– provides a framework for scientific discussions on forecasting system
performance assessment, on ocean analysis and validation from
numerical simulations
– offers a demonstration of the work in GODAE OceanView and has the
potential to increase the visibility to the external community
IV-TT work plan
• Coordinates and promotes the development of scientific validation
and intercomparison of operational oceanography systems
– Primarily for short-term forecast assessment, with benefits for and from reanalysis
assessment, and for medium-range predictions (e.g. seasonal) in
collaboration with CLIVAR/GSOP
– includes the definition of metrics to assess the quality of analyses and
forecasts (e.g. forecast skill), both for physical and biogeochemical
parameters, and the setting up of specific global and regional
intercomparison experiments
– metrics related to specific applications are also considered (e.g. Lagrangian
statistics)
– new approaches to ocean estimation, such as multi-model ensembles, should
be promoted
– links with the OSE-TT
– links with the JCOMM ET-OOFS team for operational implementation
Contribution to IV task team
• Australia: BlueLink
Development of validation packages and validation methodologies for both reanalysis
and ocean forecast systems. Development of automated inter-comparison with other
GODAE OceanView partner forecast systems.
• Brazil:
Start participation in 2011 by preparing routines to validate ocean forecasts and assess
the short-term predictability of the systems with focus on the METAREA V and regions
along the Brazilian shore. As soon as the results become available, they will be
presented on the REMO website. From 2012 on more active participation will occur,
including participation in the Coastal Ocean and Shelf Seas TT.
• Canada: DFO/CONCEPT
Development of validation packages and validation methodologies for both reanalysis
and ocean forecast systems. This includes presentation methodologies over the web for
demonstrating value of forecast systems.
Development of automated inter-comparison with other GODAE OceanView partner
forecast systems.
• China: SOA/NMEFC
• Europe: ECMWF
Development of validation packages and validation methodologies for both reanalysis
and ocean forecast systems. This includes presentation methodologies over the web for
demonstrating value of forecast systems, as well as published reports
• France: Mercator Ocean
Co-chair this TT.
Implementation of new diagnostics / metrics specifically designed for each OOFS or
application. This should take into account new parameters / processes such as meso-scale
dynamics, biological parameters, areas where nesting is done, downstream applications,
and data assimilation efficiency.
Inter-comparisons with other OOFSs when possible.
• India: INDOFOS
• Italy: INGV/MFS
Development of validation packages and validation methodologies for MFS, including
a large number of independent observations collected by different institutes in the
Mediterranean. The validation is presented on a web page that is linked with the web
page for the analyses and forecasts.
• Japan: MRI/JMA
The JMA groups have a validation plan. It is not yet clear whether JMA will participate in
intercomparison exercises with other GODAE OceanView partner forecast systems; this
depends on the experimental conditions of the work plan under the task team.
• Norway: NERSC/TOPAZ
Update of MERSEA metrics for the Arctic. Validation methodology for vector variables
(sea-ice drift) provided by NIERSC, Russia. Comparison and validation of systems
producing Arctic metrics. Routines provided on demand.
Automatic identification of eddies through circlelets (see Chauris et al., 2010).
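The vector-variable methodology mentioned for sea-ice drift is not detailed here; one common approach, sketched below with hypothetical names, is to work with the vector error itself rather than treating u and v as independent scalars, so that directional errors are not missed.

```python
import numpy as np

def drift_vector_errors(u_obs, v_obs, u_mod, v_mod):
    """Error statistics for a vector field such as sea-ice drift.

    Returns the mean vector difference (bias in u and v) and the RMS of
    the vector error magnitude.  Component-wise scalar metrics would
    understate errors when speed is right but direction is wrong.
    """
    du, dv = u_mod - u_obs, v_mod - v_obs
    bias = (du.mean(), dv.mean())
    vrms = np.sqrt(np.mean(du ** 2 + dv ** 2))
    return bias, vrms
```

A model drifting 3 cm/s too fast eastward and 4 cm/s too fast northward everywhere yields a vector RMS error of 5 cm/s, which the per-component RMS values alone would not convey.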
• UK: Met Office/FOAM
Co-chair this TT.
Develop software to perform routine Class 4 inter-comparisons, and share the relevant
code with the other GODAE OceanView groups.
Develop a system for plotting and summarizing the routine inter-comparison results.
Contribute to proposed inter-comparison aimed at monitoring climate indices.
Contribute to and help organize working group meetings.
• USA: NOAA
Development of validation packages and validation methodologies for ocean forecast
systems. This includes presentation methodologies over the web for demonstrating
value of forecast systems.
Development of automated inter-comparison with other GODAE OceanView partner
forecast systems.
• USA: US Navy GOFS
Validation testing will examine accuracy in hindcasts, nowcasts and forecasts of sea
surface height, the 3-dimensional temperature, salinity and current structure, mixed layer
depth, sonic layer depth and the deep sound channel. GOFS V3.0 will provide boundary
conditions to regional and nested coastal models and this will be examined along with
longer-range (14-day) forecast skill of the quantities listed above. Evaluations will be
global, and additional emphasis will be placed on the western Pacific (South China Sea,
Yellow Sea, and Japan/East Sea) and Indian (Arabian Sea) Oceans.
• USA: ECCO
ECCO will participate in the intercomparison effort that focuses on climate time scales.
Implementing new intercomparison activity
Assessing Performance of Ocean Forecasting
Systems, using “Class 4” metrics
Class 4 intercomparison implementation
Up to now there has been little inter-comparison of Class 4 metrics (forecast
performance in observation space).
Request from OSE-TT to provide metrics dedicated to performance
assessment
Would like to start performing some routine and ongoing inter-comparisons
between the GODAE OceanView systems.
A technical proposal, based on Class 4 metrics, has been defined for
routine inter-comparison:
• in common formats;
• for a core sub-set of the various data types which the systems assimilate (or
not);
• to enable common assessment of forecasts.
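Class 4 metrics compare model fields in observation space, which requires interpolating each system's gridded output to the observation locations. A minimal sketch of that matchup step follows (function and variable names are assumptions, not part of the proposal; bilinear interpolation on a regular lat/lon grid):

```python
import numpy as np

def interp_to_obs(field, lats, lons, obs_lat, obs_lon):
    """Bilinearly interpolate a regular lat/lon model field to
    observation locations (the basic Class 4 matchup step).

    field            : 2-D array (nlat, nlon) of model values
    lats, lons       : 1-D ascending grid coordinates
    obs_lat, obs_lon : 1-D arrays of observation positions
    """
    # Index of the grid cell containing each observation
    j = np.clip(np.searchsorted(lats, obs_lat) - 1, 0, len(lats) - 2)
    i = np.clip(np.searchsorted(lons, obs_lon) - 1, 0, len(lons) - 2)
    # Fractional position of the observation inside its cell
    ty = (obs_lat - lats[j]) / (lats[j + 1] - lats[j])
    tx = (obs_lon - lons[i]) / (lons[i + 1] - lons[i])
    # Bilinear combination of the four surrounding grid points
    return ((1 - ty) * (1 - tx) * field[j, i]
            + (1 - ty) * tx * field[j, i + 1]
            + ty * (1 - tx) * field[j + 1, i]
            + ty * tx * field[j + 1, i + 1])
```

Operational systems would also need land/ice masking and vertical interpolation for profile data; those are omitted here.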
Common set of reference data:
• Temperature and salinity profiles from the Argo GDAC (USGODAE or Coriolis);
• Sea level data from AVISO and tide gauges (Coriolis);
• Sea surface temperature data from in situ surface drifters (USGODAE) and
from L3 AATSR (MyOcean).
The MDT used by each group will be provided in a separate netCDF file.
OOFS outputs to be evaluated: best estimates and 1-5 day forecasts are
compared to the reference data and stored in standard files, whose format is
specified and will contain:
• Observation time, location (lat/lon), depth, value, type and ID;
• Best-estimate model field interpolated to the observation locations;
• Forecast values interpolated to the observation locations (from previous forecasts);
• Persistence values (from the previous nowcast);
• Climatological values;
• QC information.
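The agreed netCDF format itself is not reproduced here, but the per-observation record could be sketched in memory as a NumPy structured array whose fields mirror the list above (field names and dtypes are illustrative assumptions, not the specification):

```python
import numpy as np

N_LEADS = 5  # 1- to 5-day forecasts

# One record per matched observation; names are hypothetical.
matchup_dtype = np.dtype([
    ("obs_time",      "f8"),             # observation time (e.g. days since epoch)
    ("lat",           "f4"),
    ("lon",           "f4"),
    ("depth",         "f4"),
    ("obs_value",     "f4"),
    ("obs_type",      "S8"),             # e.g. b"argo_T", b"sla", b"sst"
    ("obs_id",        "S16"),            # platform / track identifier
    ("best_estimate", "f4"),             # analysis interpolated to the obs location
    ("forecast",      "f4", (N_LEADS,)), # 1..5 day forecasts, same location
    ("persistence",   "f4"),             # previous nowcast persisted
    ("climatology",   "f4"),
    ("qc_flag",       "i1"),
])

records = np.zeros(3, dtype=matchup_dtype)
records[0]["obs_type"] = b"argo_T"
records[0]["obs_value"] = 15.2
```

Such an array maps directly onto a netCDF file with an observation dimension and a forecast-lead dimension.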
Sharing data and calculating results:
• Files made available via ftp to the USGODAE server within 10 days of the
observation time.
• The files will be archived on the USGODAE server for future use.
• Any of the participants should be able to perform the inter-comparison as
they like.
• But a standard set of statistics calculated on a monthly basis is already
planned:
– RMS differences, bias and anomaly correlation against forecast time for various
regions (i.e. using the existing GODAE region definitions).
• The results of the monthly inter-comparison could be made available on the
GODAE OceanView or JCOMM ET-OOFS web-sites.
• Results produced by any of the participants will not be publicised outside
the group of participants without the agreement of all participants.
• Existing software to be shared (M. Martin).
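The planned monthly statistics can be sketched as follows for one region and one forecast lead time (a minimal illustration, not the shared software; operationally, anomalies would be taken about a climatology rather than the sample means used here):

```python
import numpy as np

def class4_stats(obs, fcst):
    """Monthly Class 4 summary statistics for one region and lead time.

    obs, fcst : 1-D arrays of matched observation and forecast values.
    Returns (rms_difference, bias, anomaly_correlation).
    """
    diff = fcst - obs
    rms = np.sqrt(np.mean(diff ** 2))   # RMS difference
    bias = np.mean(diff)                # mean (systematic) error
    # Anomaly correlation about each series' own mean (climatology
    # would be used in practice)
    oa = obs - obs.mean()
    fa = fcst - fcst.mean()
    acc = np.sum(oa * fa) / np.sqrt(np.sum(oa ** 2) * np.sum(fa ** 2))
    return rms, bias, acc
```

Computing these per region and per lead time, then plotting against forecast time, gives the skill curves used to compare systems against each other and against persistence.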
[Figures: operational Class 4 inter-comparison results for OND 2010 and JFM 2011.]
Link with OSE-OSSE TT activities:
Use the metrics already defined for operational systems,
and assess data impact in a comparable manner
Link with other intercomparison activities:
Intercomparison performed by the
“operational reanalysis” community
Y. Xue (NOAA), M. Balmaseda (ECMWF), A. Kumar (NOAA), N. Ferry (M-O), M. Rienecker (NASA), T.
Rosati (NOAA), S. Good (UK-Met), I. Ishikawa (JMA), T. Boyer (NOAA), O. Alves (BoM)
Link with “operational reanalysis” community
Objectives of ocean operational analysis intercomparison
– from OceanObs’09 White Paper: OCEAN STATE ESTIMATION FOR
GLOBAL OCEAN MONITORING: ENSO AND BEYOND ENSO by Yan
Xue et al.
– Then discussion with Yan Xue (NCEP) and M. Balmaseda (ECMWF)
– Raising interest in the CLIVAR-WGSIP (Working Group on Seasonal
Interannual Prediction)
• action item "...[explore] who is interested in participating in developing a strategy to
assess real-time ocean analyses..."
– Routine comparison of operational analyses, e.g. on a 3-month basis, in
collaboration with or as part of the IV-TT, is welcomed
– Possible benefit from GODAE ocean forecasting group:
• Share expertise on “climate signal” detection with ”ORA groups”
• Assess the reanalysis performed by GODAE ocean forecasting groups
• Measure the robustness of climate assessment by including high-resolution
analyses in the intercomparison
Link with “operational reanalysis” community
Possible objectives of ocean operational analysis intercomparison:
– “How to derive ocean climate indices from multiple operational ocean analyses”
– Produce ensemble ORA heat content indices, which will provide monitoring at
quasi real time of climate signals with uncertainty estimations.
– Verification of the different ORA products by comparison with observational
estimates (Altimeter, EN3, NODC, …) when possible. This raises awareness of
the weaknesses and strengths of the different data assimilation products, and
identifies the areas where consensus on the current state of the ocean is
possible and those where more effort is needed.
– Impact of data distribution on the quality of the ocean analysis, and seasonal
forecast errors
– Ongoing work: describe the heat content indices in the context of
understanding the global energy budget and of monitoring and forecasting climate
on seasonal to decadal time scales. A paper is in preparation whose goal is to
assess the quality of upper-300 m heat content analyses derived from multi-model
operational ocean reanalyses, and their capability to describe climate signals
associated with interannual variability (ENSO, IOD), decadal variability (PDO,
TAV and AMOC) and trends.
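As an illustration of the proposed ensemble index, the upper-300 m heat content of a temperature profile, and its ensemble mean and spread across several analyses, might be computed as below (a sketch under stated assumptions of constant density and heat capacity; the profiles and product count are hypothetical, not any group's operational code):

```python
import numpy as np

RHO = 1025.0  # assumed constant sea-water density, kg m^-3
CP = 3990.0   # assumed specific heat capacity, J kg^-1 K^-1

def upper_300m_heat_content(temp, z):
    """Heat content (J m^-2) of the upper 300 m from a temperature
    profile: rho * cp * integral of T dz, by trapezoidal quadrature.

    temp : temperatures (deg C) at depths z (m, positive down, ascending;
           300 m is assumed to be one of the levels)
    """
    mask = z <= 300.0
    t, d = temp[mask], z[mask]
    integral = np.sum(0.5 * (t[:-1] + t[1:]) * np.diff(d))
    return RHO * CP * integral

# Hypothetical indices from three ORA products at one analysis time:
z = np.array([0.0, 50.0, 150.0, 300.0])
oras = [np.array([20.0, 18.0, 14.0, 10.0]),
        np.array([20.5, 18.2, 13.8, 10.1]),
        np.array([19.8, 17.9, 14.3, 9.9])]
indices = np.array([upper_300m_heat_content(t, z) for t in oras])
# Ensemble mean as the index, spread as an uncertainty estimate
ens_mean, ens_spread = indices.mean(), indices.std(ddof=1)
```

Tracking the ensemble mean and spread through time gives the quasi-real-time climate-signal monitoring with uncertainty estimation described above.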