Operational weather forecasting in Norway

“The Norwegian Meteorological Institute is the meteorological service for both The Military and the
Civil Services in Norway, as well as the public. Our mission is to protect life, property and the
environment, and to provide the meteorological services required by society.”
Background thoughts/input for Norwegian operational numerical weather forecasting
computational needs; Roadmap 2015 onwards
Products from operational numerical weather forecasting are useful only if they reach forecasters and
public users in time. The main numerical codes should finish within 1 hour after the observation-gathering
"cut-off". The keywords are peak performance and the movement and storage of relatively large amounts
of real-time data.
Today (2010) the HPC system available for operational weather forecasting in Norway is 25% of the
NOTUR IBM computer "njord" in Trondheim, but almost all of the system can be allocated for shorter
time slots around the main forecasting periods at 00/06/12/18 UTC. This means that approximately 1,100
node hours per day are available for operational runs, development and testing. A typical
non-hydrostatic weather forecasting model at 4 km horizontal resolution, set up on a domain covering the
Norwegian mainland, uses ~20 node hours to finish a 66-hour forecast. A single run of such a model
typically produces ~4 GB of output.
If:
- future weather/ocean forecasting codes can be developed to utilize coming computer
systems approximately as efficiently as today's codes utilize today's systems
- the complexity of future weather/ocean forecasting models is set to approximately 150% of
today's models
Then:
An indication of the necessary computing power can be obtained by simple scaling, comparing today's
typical models with a scenario for future (2020) modelling systems:
Deterministic "best" forecasting model system:
- horizontal resolution 1 km, 100 vertical levels, 40-second time step
- domain covering Norway and adjacent areas (see figure)
- approximate relative computing cost per forecast hour = 130 node hours
- nowcasting mode: short (~6-hour) runs every hour, assimilating high-resolution satellite
and radar data
- 24-hour forecasts every third hour
- 72-hour forecasts twice a day
- a total of 384 forecast hours a day => ~50,000 node hours
- anticipated amount of output => 2 TB/day
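The memo does not break the 384 forecast hours down. One plausible accounting that reproduces the figure (an assumption on my part: hourly nowcasts are skipped in the eight three-hourly slots used by the longer runs, and two of those eight slots carry the 72-hour runs) is:

```python
# Hypothetical breakdown of the deterministic system's daily forecast hours.
NODE_HOURS_PER_FORECAST_HOUR = 130  # stated relative cost at 1 km resolution

nowcast_hours = (24 - 8) * 6   # 16 hourly nowcast runs of 6 h each
daily_24h_hours = 6 * 24       # six 24 h runs in the 3-hourly slots not used below
daily_72h_hours = 2 * 72       # two 72 h runs per day

total_forecast_hours = nowcast_hours + daily_24h_hours + daily_72h_hours
node_hours_per_day = total_forecast_hours * NODE_HOURS_PER_FORECAST_HOUR

print(total_forecast_hours)  # 384
print(node_hours_per_day)    # 49920, i.e. ~50,000 node hours
```

Whatever the exact split, 384 forecast hours at 130 node hours each gives 49,920, matching the ~50,000 node hours quoted.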
Ensemble system:
- half the horizontal resolution of the deterministic model, 70-second time step
- approximate relative computing cost per forecast hour = 19 node hours
- 50 members
- 48-hour forecasts twice a day
- a total of 4,800 forecast hours a day => 91,200 node hours
- anticipated amount of output => 7.5 TB/day
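The ensemble figures follow directly from the stated configuration:

```python
# Ensemble system daily cost, using the figures stated above.
MEMBERS = 50
RUNS_PER_DAY = 2
FORECAST_LENGTH_H = 48
NODE_HOURS_PER_FORECAST_HOUR = 19  # at half the deterministic resolution

forecast_hours = MEMBERS * RUNS_PER_DAY * FORECAST_LENGTH_H
node_hours = forecast_hours * NODE_HOURS_PER_FORECAST_HOUR

print(forecast_hours)  # 4800
print(node_hours)      # 91200
```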
This first estimate suggests an HPC resource with ~500 times the capacity of the 2010 version of "njord"
to cover met.no's basic NWP needs in, say, 2020. (If computing power doubles every 1.5 years, capacity
grows by a factor of 2^8 ≈ 256 in 12 years; a factor of ~500 is reached after roughly 13.5 years.)
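Checking the growth assumption: a doubling every 1.5 years compounds as 2^(t/1.5), so the factor reached in 12 years, and the time needed to reach a factor of 500, can be computed directly:

```python
import math

DOUBLING_PERIOD_YEARS = 1.5  # assumed Moore's-law-style doubling period

growth_in_12_years = 2 ** (12 / DOUBLING_PERIOD_YEARS)        # 2^8 = 256
years_to_500x = math.log2(500) * DOUBLING_PERIOD_YEARS        # ~13.4 years

print(growth_in_12_years)  # 256.0
```

So the ~500x target lies slightly beyond a strict 12-year doubling trajectory, at about 13.5 years.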
Some points of interest from the ECMWF strategy document (2006-2015):
Society's main expectation from meteorologists is undoubtedly a correct prediction of severe weather,
an area where failures are increasingly unacceptable. Severe weather encompasses windstorms,
extreme rainfall and floods, snowstorms, heat waves, cold outbreaks, droughts, tropical cyclones, freak
ocean waves, storm surges and all conditions conducive to bad visibility. Progress in these areas will
increase our capability to mitigate the adverse impacts of severe weather on society, including the high
toll on human lives. Progress will also lead to increases in productivity in various sectors of activity,
including transport, energy, construction, tourism, agriculture and health.
Probabilistic forecasting of severe weather using ensembles is already well established in the
medium-range, and its application will also grow progressively in the short-range. As a consequence,
the role of ECMWF as provider of boundary conditions for limited-area ensembles will grow.
An essential factor for progress has been the increase of resolution allowed by the development of high
performance computing. This trend will continue in the next ten years. A reasonable projection for
deterministic forecasting system resolutions by 2015 is 1 to 3 km for limited-area models in the
Member States and 10 km for the global ECMWF system. The ratio of resolutions between the global
and the limited area systems will therefore stay approximately constant. On the other hand, a
reasonable projection of the resolutions possible for ensemble forecasting over the period of the
strategy is roughly half that of the resolution of deterministic systems. Increasing the resolution of the
ensemble prediction system will allow it to benefit from the same progress already achieved for the
deterministic forecasting system and will further improve the value of ensemble prediction systems for
all applications.
-----------------------------------------
From the ECMWF HPC strategy:
There are many new players in the Linux Cluster area and it is felt that during the period covered by
this strategy such a system, based mainly on commodity components and open-source software, will
become a viable option for satisfying high-end operational production requirements such as ECMWF's.
However the roadmap for the support model of such systems is far from clear and may involve a partial
trade-off in investment between HPC resources and the requirement of additional Centre staff to
support these systems.
The trend towards a larger number of relatively less powerful processing cores will put additional
pressure on the parallel efficiency of the Centre's main application codes and will require efforts to be
focussed on ensuring that these programs continue to be scalable up to much higher numbers of
processors.
In the European arena, contracts were
recently awarded by the French Commissariat à l'Energie Atomique and the German Leibniz-
Rechenzentrum, both for systems in the 60-70 teraflops peak range. The European plans include the
Rechenzentrum, both for systems in the 60-70 teraflops peak range. The European plans include the
UK High-End Computing Terascale Resources (HECToR) project and efforts to establish either a
climate-oriented European Supercomputing Facility or several multidisciplinary European Facilities.
The European Facilities would be part of a computing pyramid comprising local resources at the
institute level, regional/national centres and European resources of the highest performance class.
---------------
ECMWF as an HPC hosting institution:
If ECMWF were to be offered to host some of the European HPC resources being discussed, both the
consortium funding such resources and ECMWF stand to benefit. ECMWF would bring into such a
project a strong track record in delivering well-managed and cost-effective computing resources to
researchers throughout Europe. Value for money would be maximised for both parties due to the
improved economies of scale. Therefore ECMWF should be ready to engage in any serious discussions
that could lead to it hosting European HPC resources.
Met Office
Strategies and initiatives at the UK Met Office ("UM" is the acronym for the Met Office Unified
Model, which is used for weather forecasting and climate prediction at all scales):
1)
The pdf-file "UM_new_architectures.pdf" discusses "The Strategies for improving the scalability of the
UM in response to changing computer architectures", in the short, medium and long term.
2)
http://www.nerc.ac.uk/research/programmes/ngwcp/events/workshop100520.asp
“... there is an urgent need to begin research and development of a new UK model for weather and
climate prediction, designed to exploit the new computer technology to provide the UK science
community with the tools needed to model processes at smaller scales.”