Operational and research numerical weather prediction applications in the Meteorological and Hydrological Service of Croatia

B. Ivančan-Picek*, K. Horvath*, S. Ivatek-Šahdan*, M. Tudor*, A. Bajić*, I. Stiperski* and A. Stanešić*
* Meteorological and Hydrological Service / Department for Research and Development, Zagreb, Croatia
e-mail: picek@cirus.dhz.hr

Abstract - The applications developed and used at the Meteorological and Hydrological Service of Croatia cover broad areas of meteorology, climatology, renewable energy, hydrology, air quality modelling and other fields. The applications used for the operational weather forecast include the numerical weather prediction model ALADIN, data pre-processing and model output post-processing tools, as well as visualization applications. Other numerical meteorological models (such as WRF, COAMPS, WAsP, MM5 and RegCM) are utilized for research and additional applicative purposes. The operational system is automatic and controlled by a set of scripts that coordinate the execution of the model and the related modules with the availability of the input data. Each final product is disseminated to the end-users as soon as the module responsible for its generation has finished.

I. INTRODUCTION

This article describes the applications developed, maintained and used in the Research Department of the Croatian Meteorological and Hydrological Service (CMHS). The operational applications in CMHS are time-critical jobs controlled by a cron daemon on the designated computer. If the computer is also used for other, non-operational purposes that are not time-critical, a job submission and queueing system controls the execution and priority of the jobs. The operational-class jobs are given priority, and the non-operational jobs are stopped or held until the operational job completes.

Figure 1. The operational model domains.

The operational forecast of the weather conditions is obtained by running the numerical weather prediction (NWP) model ALADIN (Aire Limitee Adaptation Dynamique Developpement InterNational) [1] from the initial time (analysis) up to 72 hours in advance for the two domains shown in Figure 1. The domains do not cover the whole Earth, so this is a limited area model (LAM) application. Each operational forecast run requires the initial data and the lateral boundary condition (LBC) forecast data. The LBC data are obtained from the operational output of a global NWP model. Either data from the ARPEGE (Action de Recherche Petite Echelle Grande Echelle) model run operationally at Meteo France (MF) or from the IFS (Integrated Forecast System) model run operationally at ECMWF (European Centre for Medium-Range Weather Forecasts) can be used for the initial and LBC data. The process of obtaining the best possible initial state of the atmosphere is called analysis. A complex analysis that takes into account the time distribution of the observations and the dynamical properties of the analysed system is called data assimilation [11]. The data assimilation procedures run at MF and ECMWF include the data measured in Croatia, so up to now the main operational forecast suite in CMHS has not utilized an additional data assimilation procedure. The operational forecast is run twice per day, starting from the 00 and 12 UTC analyses, but two other sets of analyses and LBCs are available each day, at 06 and 18 UTC.

The operational forecast is a soft time-critical application. The measured meteorological data are collected from the measuring stations and disseminated to the data center. The data assimilation procedure uses only the data collected up to a certain time. The data pass an automatic quality check procedure, followed by the assimilation procedure specific to the meteorological model, which is described in the next section. The final product of the assimilation procedure is the initial file for the model forecast run. The model forecast run is the most demanding task of the operational suite for the mainframe computer in CMHS. The forecast run should finish in time so that the complete set of forecast products can be created in a process called postprocessing and disseminated to the end-users. The forecast products are created during the model forecast run or upon its completion, depending on the input data set required by the particular postprocessing application.
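The overall suite logic can be illustrated with the following minimal sketch. All file names, paths and module commands are hypothetical stand-ins; the real CMHS suite is a set of shell scripts started by a cron daemon, but the coordination idea is the same: wait for the input data, run each module in turn, and disseminate a product as soon as the module that generates it has finished.

```python
# Minimal sketch of the suite driver logic (all names and paths are hypothetical;
# the real CMHS suite is a set of shell scripts started by a cron daemon).
import subprocess, time, pathlib

INPUT = pathlib.Path("/data/in/arpege_lbc_00.grb")   # hypothetical IC/LBC file
CUTOFF = 3600                                        # wait at most 1 h (assumption)

def wait_for(path, timeout, poll=60):
    """Poll until the input data are available or the cutoff is reached."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if path.exists():
            return True
        time.sleep(poll)
    return False

def run_module(command, product):
    """Run one suite module and ship its product as soon as it is finished."""
    subprocess.run(command, check=True)
    subprocess.run(["scp", product, "user@enduser-host:/incoming/"], check=True)

if wait_for(INPUT, CUTOFF):
    run_module(["forecast.sh"], "/data/out/forecast.grb")      # model forecast run
    run_module(["postprocess.sh"], "/data/out/products.tar")   # postprocessing
else:
    raise SystemExit("initial/boundary data not available before the cutoff")
```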
A. The computer system

The mainframe computer in CMHS (violet in Figure 3) is an SGI Altix LSB-3700 BX2 server with 48 Intel Itanium2 1.6 GHz/6 MB CPUs, 96 GB of system memory and two 146 GB/10 krpm SCSI disk drives. It runs the SUSE Linux Enterprise Server 9 for IPF operating system with the SGI package. The NWP model software is compiled with the Intel Fortran and C++ compilers, version 9.0.031. The execution of the operational suite scripts and of other, not time-critical jobs is controlled by a queueing system (PBS Pro). The individual tasks of the operational suite are submitted to the queueing system by a cron daemon.

Figure 2. Scheme of the access from a member service (MS) to ECMWF.

Figure 3. Scheme of the computers and connections involved in the ALADIN operational suite.

The operational forecast suite starts with the retrieval of the files containing the initial and boundary conditions. They are retrieved from MF via the internet. A copy of these files is delivered by MF to an ECMWF server for backup transfer via the internet or via the much slower dedicated RMDCN (Regional Meteorological Data Communications Network in Europe) line. Additional files produced at ECMWF are available as well.
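The retrieval step with its backup route can be summarized by the sketch below. The host names, paths and file-name pattern are hypothetical illustrations; the actual transfers go through the MF and ECMWF infrastructure described above.

```python
# Sketch of the initial/boundary file retrieval with a backup route
# (host names, paths and the LBC file-name pattern are hypothetical).
import subprocess

SOURCES = [
    "mf-server:/aladin/coupling/",      # primary: Meteo France via internet
    "ecmwf-server:/backup/coupling/",   # backup: copy delivered to ECMWF
]

def fetch(filename):
    """Try each source in turn; return True on the first successful transfer."""
    for src in SOURCES:
        result = subprocess.run(["scp", src + filename, "/data/lbc/"])
        if result.returncode == 0:
            return True
    return False

for hour in range(0, 73, 3):                 # 72 h range, 3-hourly files (assumption)
    if not fetch(f"lbc_file_{hour:03d}"):    # hypothetical file name
        raise SystemExit(f"LBC file for +{hour} h not available from any source")
```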
II. METEOROLOGICAL APPLICATIONS

A. Aladin model

The operational forecast is performed using the hydrostatic version of the ALADIN model [1] with 8 km horizontal resolution and 37 levels in the vertical. It is a spectral model that uses a double Fourier representation of the fields with elliptic truncation [7] and a hybrid pressure-type terrain-following vertical coordinate [9]. Operationally, the 8 km resolution run is initialized using digital filter initialization (DFI) [6]. The model version used operationally has changed from the one described in [4], as described in [12]. The primitive equation set for the wind components, temperature, specific humidity, cloud water and ice, rain and snow, as well as surface pressure, is solved using a two-time-level semi-implicit semi-Lagrangian integration scheme.

The 8 km resolution forecast is operationally further dynamically downscaled to 2 km horizontal resolution on a single domain of 450x450 points, using the same procedure as described in [4]. Instead of running the full model forecast at 2 km resolution for 72 hours, each output file of the 8 km resolution run is used as the initial and coupling file, and the forecast is run on only 15 levels using hydrostatic dynamics and the vertical turbulent diffusion parametrization. The model is run for 30 one-minute timesteps, which allows the wind to adapt dynamically to the high resolution terrain.
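The dynamical adaptation amounts to a short, cheap model run started from every 8 km output file. The sketch below illustrates this driver loop; the file names, the output frequency and the wrapper command are hypothetical, since the real suite drives the ALADIN executable through its configuration namelists.

```python
# Sketch of the 2 km dynamical adaptation driver (hypothetical names/paths):
# every 8 km output file serves as both initial and coupling file for a
# 30-step, 15-level, hydrostatic run with a 60 s timestep.
import subprocess

NSTEPS, TSTEP, NLEV = 30, 60, 15     # from the operational setup

for hour in range(0, 73):            # 72 h range, hourly outputs (assumption)
    infile = f"/data/out8km/history+{hour:04d}"   # hypothetical history file
    subprocess.run(
        ["run_adaptation",           # hypothetical wrapper around the model executable
         "--initial", infile,
         "--coupling", infile,       # the same file couples the lateral boundaries
         "--levels", str(NLEV),
         "--steps", str(NSTEPS),
         "--tstep", str(TSTEP)],
        check=True)
```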
B. Operational network

The computing and archive facilities of ECMWF (light blue in Figure 3) can be accessed via the ECaccess server through the internet or through the dedicated RMDCN line. Both require the use of the ECaccess software and an ECaccess gateway installed at CMHS (pluton in Figure 3). The computing and archive facilities of MF (green in Figure 3) can be accessed through a similar firewall system from a registered server (Figure 3). The operational model output is stored on a massive storage facility (zemlja in Figure 3) and disseminated to other servers internal to CMHS (orange in Figure 3), to be picked up by the users, or disseminated directly to the outside servers (red in Figure 3).

C. Aladin model code

The ALADIN NWP model software package is ported to the mainframe computer using a compiler in combination with the gmkpack compiling utility, which has to be ported first. Porting and compilation of the model source are done only once, separately from the model execution, and the model executable is stored for further use. The model is recompiled only when the research subject demands a modification of the model source code or when a new version of the model becomes available to be ported.

The ALADIN model code is developed by a number of scientists from 16 countries. It also includes the IFS software developed at ECMWF, the ARPEGE global model, as well as the physics part of the Meso-NH research model of MF. All the model developments done in different countries and on different computer platforms are collected in a process called phasing, and a new version of the model software is released.

During porting, the model performance is evaluated for different optimization levels and for various model configurations that utilize various physical parametrization schemes. One of the important issues is to keep the CPU time per model timestep small and constant in order to have the operational model forecast run finished on time. For some optimization levels this time can depend on the model configuration (Figure 4), which is not good for the operational forecast tasks. The same model code can give a slightly different forecast on different computer platforms (Figure 5). It is important to keep this in mind when testing the results of research performed on a different computer or when switching the operational suite from one computer to another.

Figure 4. The CPU time per timestep during backward and forward DFI and 66 timesteps of the forward model forecast (6 hours) for various model set-ups (different experiments).

Figure 5. The surface temperature difference obtained with data assimilation using the same model code and the same input data on two different mainframe computers at Meteo France.

D. The postprocessing system

The operational model run produces output in the form of model variables stored as spectral coefficients on the model levels, as well as specific meteorological variables interpolated to pressure surfaces or to 2 and 10 meters above ground. The output files are in a model-specific format. The users require the model output data in either GRIB or ASCII format and on specific sub-domains. These can be created and disseminated already during the model forecast run. The figures of the forecast meteorological fields are plotted on a separate server that also hosts the intranet pages of the operational ALADIN forecast products. Another set of model output data, the pseudo-TEMP messages and the model forecasts for specific points, requires the model output data for the entire 72-hour forecast period and can be created only after the model forecast run is finished.
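A minimal sketch of this two-speed postprocessing is given below, assuming a hypothetical converter tool and file names; it only illustrates the distinction drawn above between products made file-by-file during the run and products that need the complete 72-hour output.

```python
# Sketch of the postprocessing flow (converter, paths and names are hypothetical):
# per-file GRIB products are made and shipped as each output file appears;
# point products that need the full 72 h range are produced after the last file.
import subprocess, time, pathlib

OUTDIR = pathlib.Path("/data/out8km")        # hypothetical output directory

def to_grib(histfile, subdomain, gribfile):
    """Convert one model-format history file to GRIB on a sub-domain."""
    subprocess.run(["model_to_grib", str(histfile),   # hypothetical converter
                    "--subdomain", subdomain, "-o", gribfile], check=True)

for hour in range(0, 73):
    hist = OUTDIR / f"history+{hour:04d}"             # hypothetical file name
    while not hist.exists():                          # files appear during the run
        time.sleep(30)
    grib = f"/data/grib/prod_{hour:03d}.grb"
    to_grib(hist, "adriatic", grib)
    subprocess.run(["scp", grib, "user@enduser-host:/incoming/"], check=True)

# only now can the pseudo-TEMPs and point forecasts be extracted (hypothetical tool)
subprocess.run(["make_point_products", str(OUTDIR)], check=True)
```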
E. Aladin data assimilation system

This section describes the setup of the local assimilation system for the LAM ALADIN HR. To obtain better initial conditions, data assimilation can be used. The assimilation system at CMHS consists of two parts: the surface assimilation, which is used to change the state of the model soil variables, and the upper-air assimilation, which changes the upper-air model fields. The surface assimilation is done with the optimal interpolation (OI) technique, while the upper-air assimilation is done using the three-dimensional variational technique (3DVAR). To implement data assimilation, an assimilation cycle first needs to be set up. The assimilation cycle is a sequence of analyses and 6-hour forecasts that is run on a regular basis. In the assimilation cycle, the information coming from the observations (Table I) is accumulated in the model state. The assimilation cycle is even more important for the surface analysis, because the surface variables need more time to be updated. The scheme of the local setup of the assimilation cycle at CMHS is shown in Figure 6.

TABLE I. OBSERVATION TYPES AND VARIABLES ASSIMILATED AT CMHS

Observation type            Variable
SYNOP                       surface pressure, 2 m temperature and relative humidity
Aircraft                    wind components
Atmospheric Motion Winds    wind components
TEMP                        pressure, wind components, temperature and humidity
Wind profiler               wind components
Satellite radiances         radiance

Figure 6. Scheme of the assimilation cycle implemented at DHMZ.

F. Verification

CMHS is a member of the Regional Cooperation for Limited Area Modeling in Central Europe (RC LACE; http://www.rclace.eu/) that supports the LACE common observation preprocessing unit (OPLACE). The observation data are collected, preprocessed and disseminated to the LACE member services. The geographical selection of the data and the quality control can be done locally through the LACE observation monitoring tool.

The assimilation cycle and the production are run in a quasi-operational mode, i.e. the observational data are taken at the operational time, but the analysis and the model integration are done with some time delay (after the operational model run is finished). The data assimilation setup at DHMZ, as described in the previous section, has been running in the quasi-operational mode since the end of February 2010. At approximately the same time the storage capacities were enhanced, which enabled archiving of the 72-hour forecasts initialized from the assimilation cycle. These data were used to evaluate the quality of the forecast initialized with the assimilation system (ASSIM) against the operational forecast (OPER) using the verification package VERAL (http://old.chmi.cz/meteo/ov/aladin/docs/veral).

The verification is performed in a few steps. First, the quality control of the data is done using the surface optimal interpolation. The ARPEGE long cut-off analysis is taken as the background in order to obtain a "neutral" observation selection. Then, the same observations are used to compute the model departures from the observations for both OPER and ASSIM. The departures are used for calculating basic statistics such as the bias, the root mean square error (RMSE) and the standard deviation (STD). The model results were interpolated to the locations of the observations and compared with the synoptic and radiosounding observations. This was done over a time period of 10 months, from 02.03.2010 to 04.12.2010, with a 6-hour interval, and over the whole model domain. The statistics are computed for the screen-level parameters (2 m temperature and relative humidity, 10 m wind) and for the upper-air parameters (temperature, relative humidity, wind components and geopotential).

Figure 7. Seasonal verification scores for the screen-level parameters (temperature, humidity, wind direction and wind speed) versus the forecast hour. BIAS: dashed lines, RMSE: full lines, STD: dotted lines. Red: ASSIM, black: OPER.

The verification results for the whole period (Figure 7) show that, for the 2 m parameters, both the bias and the root mean square error of ASSIM are better for temperature and relative humidity. The results are mostly neutral for wind speed and direction (not shown).
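The departure-based scores can be written down in a few lines; the sketch below assumes a plain list of model-minus-observation departures for one parameter and one forecast hour (the hypothetical input), and uses the identity STD^2 = RMSE^2 - bias^2.

```python
# Sketch of the departure-based scores (bias, RMSE, STD) used in the verification;
# `departures` stands for the model-minus-observation values of one parameter at
# one forecast hour, gathered over the whole domain and period (hypothetical input).
import math

def scores(departures):
    """Return the bias, RMSE and standard deviation of the departures."""
    n = len(departures)
    bias = sum(departures) / n
    rmse = math.sqrt(sum(d * d for d in departures) / n)
    std = math.sqrt(max(rmse * rmse - bias * bias, 0.0))  # STD^2 = RMSE^2 - bias^2
    return bias, rmse, std

# usage with made-up 2 m temperature departures (K):
print(scores([0.4, -0.2, 0.9, -0.5, 0.1]))
```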
G. Regional climate model

The regional climate model RegCM3 [8] is run on the local machine viking. A typical simulation covers a European domain with 35 km grid spacing, a 142x93 mesh and 23 vertical levels. In recent experiments, two years of simulation per day were run on 16 CPUs. The installation of the new RegCM4.1 is planned in the near future.

H. European Monitoring and Evaluation Programme

The Unified EMEP (European Monitoring and Evaluation Programme) model is coupled to the ALADIN meteorological output and run at 10 km resolution. This model setup, called EMEP4HR, is used for air quality studies at DHMZ. Figure 8 shows the monthly maximum surface ozone fields (in ppbv) for May 2006. This is a part of a study made to determine the influence of industrial and traffic emissions on this important pollutant. The relative effects of a 15% increase and a 15% decrease of the traffic-based NOx and VOC emissions are also evaluated. In the first case the maximum ozone is increased, while in the other it is decreased, by approximately 1% in the areas with very high traffic emissions. This study was made for the purposes of the Ministry of Construction and Environment of Croatia.

Figure 8. Maximum ozone concentrations in ppb for May 2006.

I. Other NWP models

A number of meteorological models are used purely for research purposes, since they are too demanding in computer time and memory to be used in the time-critical operational forecast applications on the mainframe computer used in CMHS. The COAMPS model [3] has been used in the research of the sea surface temperature effects on the bura flow [5]. The PSU/NCAR MM5 model [2] has been used in a study of the impact of the model resolution on the simulated bura [10]. These models are nevertheless an important tool used in various case studies and phenomenological research.

III. CONCLUSIONS

The numerical weather prediction applications related to the operational forecast have been presented, together with a short description of other, non-operational models used in particular research studies.

REFERENCES

[1] ALADIN International Team, 1997: The ALADIN project: Mesoscale modelling seen as a basic tool for weather forecasting and atmospheric research. WMO Bull., 46, 317-324.
[2] Grell, G. A., J. Dudhia and D. R. Stauffer, 1995: A description of the fifth-generation Penn State/NCAR mesoscale model (MM5). NCAR Technical Note, NCAR/TN-398+STR, 122 pp.
[3] Hodur, R. M., 1997: The Naval Research Laboratory's coupled ocean/atmosphere mesoscale prediction system (COAMPS). Mon. Wea. Rev., 125, 1414-1430.
[4] Ivatek-Šahdan, S. and M. Tudor, 2004: Use of high-resolution dynamical adaptation in operational suite and research impact studies. Meteorol. Z., 13, 99-108.
[5] Kraljević, L. and B. Grisogono, 2005.
[6] Lynch, P. and X.-Y. Huang, 1994: Diabatic initialization using recursive filters. Tellus, 46A, 583-597.
[7] Machenhauer, B. and J. E. Haugen, 1987: Test of a spectral limited area shallow water model with time-dependent lateral boundary conditions and combined normal mode/semi-Lagrangian time integration schemes. In: Proceedings of the ECMWF Workshop on Techniques for Horizontal Discretization in Numerical Weather Prediction Models, 2-4 November 1987, ECMWF, 361-377.
[8] Pal, J. S. and 19 coauthors, 2007: Regional climate modeling for the developing world: The ICTP RegCM3 and RegCNET. Bull. Amer. Meteorol. Soc., 88, 1395-1409.
[9] Simmons, A. J. and D. M. Burridge, 1981: An energy and angular-momentum conserving vertical finite-difference scheme and hybrid vertical coordinates. Mon. Wea. Rev., 109, 758-766.
[10] Špoler Čanić, K. and L. Kraljević, 2005.
[11] Talagrand, O., 1997: Assimilation of observations, an introduction. J. Met. Soc. Japan, 75, 1B, 191-209.
[12] Tudor, M. and S. Ivatek-Šahdan, 2010: The case study of bura of 1st and 3rd February 2007. Meteorol. Z., 19, 453-466.