Collaboration meeting on clouds, radar and radiation: report

Rapporteurs: Robin Hogan, Jon Petch, Maike Ahlgrimm, Richard Forbes
1. Overview
On 17 and 18 November 2009, a collaboration meeting was held at the Department of Meteorology,
Reading, including attendees from the Department, the Met Office and ECMWF, to discuss collaboration
related to evaluation of the representation of clouds in models using radar, lidar and other observations.
This report summarizes the meeting, particularly the three breakout sessions. Proposed actions are given in
bold.
2. Talks
The first day of the meeting was devoted to talks summarizing a particular aspect of the work at the
University of Reading, the Met Office or ECMWF. The talks may be downloaded from
http://www.met.reading.ac.uk/clouds/workshops/2009_met_office_ecmwf_collaboration/
3. Breakout session summary: Radiation (Rapporteur: Robin Hogan)
Four main topics were discussed, as indicated below.
3.1 Methods of representing inhomogeneity and overlap: McICA/Tripleclouds intercomparison
We discussed the need to compare the performance of the McICA and Tripleclouds methods to represent
the interaction of cloud structure and radiation. There is a need to compare both the off-line behaviour on
global cloud fields and the on-line behaviour when the cloud fields can feedback in response to the
changed radiative forcing. Tripleclouds works in the off-line Edwards-Slingo code, and the final issues have
at last been ironed out in its implementation in the UM.
 ACTION J Manners/J Shonk: Ensure Reading are using sensible version of model/radiation
scheme so that the Met Office can understand their changes and do direct McICA comparison
with Tripleclouds etc. (Action from Jon Petch)
 ACTION Peter Hill: Create an updated off-line version of Edwards-Slingo including McICA
 ACTION Jon Shonk: Run this code on ERA-40 scenes and compare to Tripleclouds
 ACTION Hill/Manners: Consider 10-yr on-line McICA to compare with Jon’s Tripleclouds runs
(need to implement latitude dependence in McICA)
 ACTION J Shonk/J-J Morcrette: Consider running ECMWF off-line McICA code on ERA-40 scenes
and comparing to Met Office McICA and Tripleclouds
 ACTION J Manners/J Petch: Consider collaboration server access request (short term) and longer
term collaboration proposal (Action from Jon Petch)
3.2 Characterizing the sub-grid structure in models
We discussed the use of the new cloud schemes to provide the radiation scheme with the information it
needs to represent sub-grid cloud structure. At ECMWF, radiation runs on a grid 2x2 coarser than the dynamics grid, and Jean-Jacques is currently
working on putting a PDF of surface types into McICA. He is waiting for cloud scheme modifications before
this aspect of the physics can feed through to the radiation, so currently the variance information is from
the “standard” McICA cloud generator. However, he has introduced Jon Shonk’s latitudinally dependent
overlap decorrelation length, but with some slight smoothing at the pole and equator.
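As a rough sketch of the mechanism being discussed: exponential-random overlap blends the maximum- and random-overlap limits for a pair of layers with a weight exp(-dz/z0), and making the decorrelation length z0 decrease with latitude makes clouds overlap more randomly (larger combined cover) toward the poles. All coefficients below are illustrative placeholders, not the values used in either model:

```python
import math

def decorrelation_length(lat_deg, z0_equator_km=2.0, slope_km_per_deg=0.02):
    """Illustrative latitude-dependent overlap decorrelation length (km).
    Decreases linearly from equator to pole; the coefficients are
    placeholders, not those of the ECMWF or Met Office implementations."""
    return max(z0_equator_km - slope_km_per_deg * abs(lat_deg), 0.3)

def pair_cloud_cover(c1, c2, dz_km, lat_deg):
    """Combined cover of two layers under exponential-random overlap:
    alpha = exp(-dz/z0) weights maximum versus random overlap."""
    alpha = math.exp(-dz_km / decorrelation_length(lat_deg))
    c_max = max(c1, c2)            # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2     # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_rand
```

For two half-covered layers 1 km apart, this gives a larger combined cover at 80 degrees latitude than at the equator, i.e. more random overlap toward the poles.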
At the Met Office, Peter Hill (as part of his PhD) will be using information from PC2 to provide water
content variance information for the radiation. He will then compare this variance to CloudSat-Calipso
measurements, work that could conceivably test the variance in the Met Office and ECMWF models
simultaneously.
3.3 Representing the effect of 3D radiation in models
We discussed the newly started project on 3D radiation and the potential for collaboration. The ECMWF
code can represent some 3D effects very simply using the “Tilted ICA” (TICA)
approximation, from work by Adrian Tompkins. This work did not include the side-leakage effect, but it
might be possible to do so by adjusting the overlap assumption differently in the direct solar and diffuse
two-stream parts of the code. We also discussed whether a particular bias in the ECMWF model
(overestimated surface temperatures late in the day over mid-latitude continents) could be caused by 3D
effects associated with cumulus clouds.
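A toy geometric picture of why tilting the columns matters: a slanted direct beam passing through a field of finite clouds intercepts more cloud at low sun. The formula below, for idealized cuboidal clouds of given cover and aspect ratio, only illustrates the effect; it is not the ECMWF TICA implementation:

```python
import math

def effective_cover_direct(cover, aspect, sza_deg):
    """Effective cloud cover seen by the direct solar beam for a field of
    cuboidal clouds with fractional cover `cover` and aspect ratio
    depth/width `aspect`, at solar zenith angle sza_deg. The slanted
    beam also intercepts cloud sides, so the apparent cover grows with
    zenith angle (capped at 1). Purely illustrative geometry."""
    return min(1.0, cover * (1.0 + aspect * math.tan(math.radians(sza_deg))))
```

At overhead sun this reduces to the plane-parallel cover; late in the day (large zenith angle) cumulus fields shade much more of the surface, consistent with the suspected late-afternoon surface temperature bias discussed above.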
 ACTION J Shonk/J-J Morcrette: In due course consider whether it might be possible to modify the
TICA method to represent side leakage effects (and possibly longwave 3D effects) and compare
with the “two-stream with lateral transport” method currently being developed.
 ACTION (now done) Robin Hogan: Forward to Martin Koehler a simple formula for modifying the
cloud radiative forcing to roughly include 3D effects for cumulus, which he would use to estimate
the likely associated temperature error.
3.4 Representing gaseous absorption in models
James Manners reported that the Met Office has been receiving input from Zhien Sun and Wenyi Zhong on
the development of the representation of gaseous absorption in Edwards-Slingo, including new methods and
tricks for gaseous overlap and continuum dependence. Problems in the stratosphere were becoming
apparent through the recent raising of the Unified Model lid. The new solar spectrum (Lean) was reported
to be better, particularly in the UV. The Met Office does not really have a standard method for generating
new spectral files.
It was discussed how the full-spectrum correlated-k method that Robin Hogan had been working on might
be incorporated into Edwards-Slingo – this would need not only new spectral files, but also look-up tables
for the Planck functions. Changes to the Fortran code to provide this capability were believed to be part of
Zhien Sun’s modifications. Robin has little time and no funding to pursue this, however.
ECMWF are considering raising the model lid from 0.01 hPa to 0.001 hPa, to improve the interpretation of
satellite weighting functions that peak high in the atmosphere and to reduce misrepresentation of the QBO.
Non-LTE problems are likely to be encountered. It
was reported that stratopause errors of up to 15 K had been found at one point, which were partly
corrected with better distribution of trace gases and gravity wave drag.
The work to develop a new water-vapour continuum in Keith Shine’s CAVIAR project was highlighted. This
will feed into RRTM in due course, and thence into the ECMWF model. There is no formal route into
Edwards-Slingo.
 ACTION James Manners: Contact Keith Shine to discuss the information that is likely to be
forthcoming from CAVIAR and how it might be incorporated into Edwards-Slingo.
 ACTION James Manners: Ensure Keith Shine and Piers Forster are aware of recent Edwards-Slingo
updates that may impact their work (note that Marc Stringer at Reading ensures changes from
James are available to Reading users at least) and ensure they do any updates in the most
relevant version of the radiation scheme. (Action from Jon Petch)
4. Breakout session summary: Microphysics (Rapporteur: Richard Forbes)
The discussion fitted into three main topic areas and the main points are highlighted below.
4.1 Heterogeneity and Uncertainty
There can be considerable heterogeneity in humidity and cloud related fields at different spatial scales that
is often only partially represented in models or represented in a very simplified way. Cloud parametrization
schemes (e.g. Smith scheme, Tiedtke, PC2) imply an underlying distribution of total water (humidity and
condensate) but how do we improve this representation and what aspects matter the most?
o Firstly, we need observations of the PDFs of humidity, hydrometeors (liquid, ice/snow, rain) and vertical
velocity. We already have data from ground-based and space-based radar and lidar for
cloud/precipitation which should be utilised more. Humidity variability can be obtained from Raman
lidar (Chilbolton, Lindenberg and Cabauw) for non-cloudy air (outside cloud).
o We need further work on parametrization schemes to represent the sub-grid heterogeneity and its
sources and sinks in a physically consistent and realistic way.
o How do we include sub-grid heterogeneity effects in the radiation and microphysics parametrizations?
Once we have a reasonable representation of the sub-grid heterogeneity in the model, it is
straightforward to include the impact in the multi-column McICA approach to the radiation
parametrization without additional cost. A similar multi-column approach could be applied to the
microphysics calculations, but this could increase the computational cost considerably. Is there more
benefit from a multi-column approach with simple microphysics rather than a single-column approach
with more complex microphysics?
o The 1D testbed could be used to assess the sensitivity to some of the uncertainties associated with the
representation of sub-grid heterogeneity.
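As an illustration of the multi-column idea, a minimal stochastic subcolumn generator in the spirit of McICA might look like the following. Operational generators also sample the in-cloud condensate amount and use conditional resampling for exact maximum-random statistics, so this is a sketch only:

```python
import random

def generate_subcolumns(cloud_frac, n_sub=100, seed=0):
    """Minimal stochastic subcolumn generator in the spirit of McICA.
    Each subcolumn gets a binary cloud flag per layer, with maximum
    overlap within vertically contiguous cloudy blocks and random
    overlap between separate blocks. A sketch only."""
    rng = random.Random(seed)
    nlev = len(cloud_frac)
    cols = []
    for _ in range(n_sub):
        col = [False] * nlev
        x = rng.random()
        for k in range(nlev):
            # Start a new random rank at the top of each cloudy block;
            # reusing the same rank through a block gives maximum overlap.
            if k == 0 or cloud_frac[k - 1] <= 0 or cloud_frac[k] <= 0:
                x = rng.random()
            col[k] = x < cloud_frac[k]
        cols.append(col)
    return cols
```

Each subcolumn can then be passed to the radiation (or, at greater cost, microphysics) with one spectral interval per column, which is the trick that keeps McICA cheap.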
4.2 Microphysical Processes
There is potential for improving our understanding and parametrization of most, if not all, of the
microphysical processes in the model. However, a few were highlighted in the discussion:
o Mixed phase (supercooled liquid water): This can occur in thin layers (e.g. altocumulus) or in pockets
embedded in larger systems (e.g. fronts). Models can get the overall statistics of frequency of
occurrence reasonably well, but the representation of the details, such as thin layers, is often poor.
There has been work on the overall frequency of occurrence, but is there more we can extract from the
observations in terms of layer heights, thicknesses and the conditions in which they occur? What
turbulence level is needed to form supercooled liquid water (we should be able to determine this from
high-resolution models, but what about from the observations)? Can we get a handle on the degree of
riming using LWP from a radiometer when we know there is no liquid water cloud in the column (e.g.
using lidar)?
o Ice nucleation (glaciation): There are many complexities here, including the representation of different
types of ice nuclei and nucleation processes. NWP models represent ice nucleation with simple
schemes based on temperature/humidity. Do we need a more complex representation, or are the
uncertainties too great, so that a simple scheme will suffice?
o Mixing/entrainment in warm-phase cloud: How adiabatic are cloud profiles and how well do models
represent this? Can we infer how much mixing occurs by measuring the sub-adiabaticity in warm
clouds?
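As an example of the sub-adiabaticity diagnostic mentioned above: if liquid water content in an adiabatic cloud increases roughly linearly with height above cloud base at rate Gamma, the adiabatic LWP of a layer of depth h is 0.5*Gamma*h^2, and the ratio of an observed (e.g. radiometer) LWP to this gives a degree of adiabaticity. The default Gamma below is only a representative value for warm boundary-layer cloud; in reality it depends on temperature and pressure:

```python
def adiabatic_fraction(lwp_obs_gm2, cloud_depth_m, gamma_ad=2.0e-3):
    """Degree of adiabaticity of a warm cloud layer.
    Assumes adiabatic LWC increases linearly with height at rate
    gamma_ad (g m^-3 per metre of depth; representative value only),
    so the adiabatic LWP of a layer of depth h is 0.5*gamma_ad*h**2.
    Returns observed LWP divided by the adiabatic LWP."""
    lwp_adiabatic = 0.5 * gamma_ad * cloud_depth_m ** 2
    return lwp_obs_gm2 / lwp_adiabatic
```

A value well below 1 for a cloud of known depth (e.g. from radar/lidar boundaries) would indicate substantial entrainment-driven dilution.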
4.3 Level of Complexity of Microphysical Parametrizations
The ever-present question for microphysical parametrizations is what level of complexity to represent. We
want to capture the most important processes in a realistic way, but not add so much complexity that we
don’t understand it and cannot constrain it with observations. A number of issues were discussed:
o Aerosol-cloud interactions, particularly in relation to drizzle: Lee Hawkness-Smith has done some work
on stratocumulus, deriving LWP from radar reflectivity. We need additional information on aerosol to
characterise cloud-aerosol interactions. Could ground-based UV lidar be used to get cloud droplet
concentrations?
o Ice particle characteristics (density, fall speeds, shape, PSDs): Use dual-wavelength radar with Doppler
to get fall speeds and information on particle size distributions (Reading have a few months of such
data from Chilbolton).
o Separate ice and snow categories or a continuum? We need to show that there are two modes that
vary independently; observations so far suggest a continuum is more representative. We discussed
Chris Westbrook’s work to evaluate ice fall speeds by plotting Z versus Doppler velocity for ice clouds
and overplotting the assumption from the model (a single line). It was suggested that this method
could be used to test both double-moment schemes (for which a spread of values would be
expected, as in the observations) and two-component ice schemes (for which two “blobs” would be
expected, something not seen in the observations; see Fig. 15 of Marsham et al. 2006).
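The Z-versus-Doppler-velocity diagnostic can be illustrated with a toy forward model: for an exponential PSD with power-law mass and fall-speed relations, a one-moment scheme (varying only the PSD slope) traces a single Z-V curve, because the reflectivity-weighted velocity is independent of the intercept parameter. The power-law coefficients here are illustrative, not those of any particular model:

```python
import math

def z_and_doppler(n0, lam, a=0.0185, b=1.9, alpha=0.8, beta=0.25,
                  nbins=2000, dmax=0.02):
    """Toy forward model for an exponential ice PSD N(D) = n0*exp(-lam*D)
    (D in metres). Mass m = a*D**b (kg) and fall speed v = alpha*D**beta
    (m/s) are illustrative power laws. Rayleigh scattering is assumed, so
    reflectivity is proportional to the second moment of mass. Returns an
    un-normalized Z and the reflectivity-weighted Doppler velocity."""
    dD = dmax / nbins
    z = vz = 0.0
    for i in range(nbins):
        D = (i + 0.5) * dD                                # bin midpoint
        w = n0 * math.exp(-lam * D) * (a * D ** b) ** 2 * dD
        z += w                                            # Z integrand
        vz += alpha * D ** beta * w                       # v-weighted Z
    return z, vz / z
```

Doubling n0 moves a point along the Z axis without changing V, while changing the slope lam moves it along the one-moment curve; a spread of observed V at fixed Z is therefore evidence for extra degrees of freedom, as in the double-moment case discussed above.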
 ACTION Chris Westbrook: When the data from more sophisticated GCM ice schemes become
available, apply this method to test the validity of double-moment and two-component ice
schemes.
 ACTION Cyril Morcrette: Compare a Control/Field PSD run with the fall speed diagnostics against
the Cloudnet Doppler velocity. Use extra diagnostics not normally available to Reading to test
theories (it would be quite a bit of work getting these diagnostics to Reading, as they would have
to be added in the operational model and then archived etc.). (Action from Jon Petch)
 ACTION Jonathan Wilkinson: Capitalize on the Z-LWP drizzle work (Lee Hawkness-Smith) and
possibly extend it to include aerosol information. This will help inform the MURK-droplet number
parametrization being planned for the UKV. (Action from Jon Petch)
 ACTION J Wilkinson/C Morcrette: Possible collaboration to spin up later in the year: use
remotely sensed TKE (or dissipation rate) and the occurrence of embedded supercooled liquid water
to compare against high-resolution modelling being carried out in APP. (Action from Jon
Petch)
5. Breakout session summary: Model Evaluation (Rapporteur: Maike Ahlgrimm)
This group discussed primarily the issue of compositing in model evaluation. The idea behind compositing
by cloud type or dynamical regime is to identify and focus on a particular cloud type that the model is
having difficulty representing. In contrast to an individual test case, the composite takes advantage of the
wealth of data, and gives a distribution or range of typical cloud properties for a given regime, rather than
one value that may or may not be representative. If test cases are desired later on for SCM, we not only
have the mean of the distributions, but can also test cases from the more extreme ends of the
distributions.
How to define or constrain a regime involves some educated guessing, and depends on where the model
is having the greatest problems. A basis for homing in on problem areas could be the standard
diagnostics/scores that are currently used. If the dynamical regimes of interest are well defined, then the
various regimes should have clearly separable distributions of cloud properties.
If the goal of the evaluation is to compare the frequency of occurrence of a given cloud type in a dynamical
regime, then it follows that the regime should be defined using variables that are not directly tied to the
modelled clouds, or the cloud type will be pre-selected a priori.
Since the full combined CloudSat/Calipso product from Julien is very large, he has offered to downsize the
product on request by averaging the variables of interest onto a model grid (means and moments for each
grid box). He'd need the model grid (lat, lon location of centre points and grid box boundaries, plus vertical
levels) for this.
Representativity of the satellite along-track observations versus the model grid-box area is always an issue.
Some ideas for minimizing the representativity error are to throw out grid boxes that are not well sampled
(few footprints within the box), or to use MODIS 2D data or the standard deviation of the
observations falling within the model grid box to get a feel for the inhomogeneity of the cloud field in the
box. Olaf is working on a statistical method to estimate the representativity error from the satellite
observations alone. He is testing this method, but it is not yet fully fledged. This is probably more
important when model data and satellite tracks are matched directly for a threat-score-type evaluation,
and not quite as crucial when many samples (grid points) are composited and distributions are considered.
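A minimal sketch of the compositing step being discussed, assuming footprints arrive as (lat, lon, value) triples: average them onto a regular grid, discard poorly sampled boxes, and retain the within-box standard deviation as a crude representativity indicator. The grid spacing and sampling threshold are placeholders, not those of a specific model grid:

```python
from collections import defaultdict
from statistics import mean, pstdev

def composite_footprints(footprints, dlat=1.0, dlon=1.0, min_count=5):
    """Average along-track satellite footprints onto a regular lat-lon
    grid. `footprints` is an iterable of (lat, lon, value) tuples.
    Boxes with fewer than min_count footprints are discarded, and the
    within-box standard deviation is kept as a rough indicator of how
    representative the along-track sample is of the box."""
    boxes = defaultdict(list)
    for lat, lon, val in footprints:
        key = (int(lat // dlat), int(lon // dlon))   # grid-box index
        boxes[key].append(val)
    return {key: (mean(vals), pstdev(vals), len(vals))
            for key, vals in boxes.items() if len(vals) >= min_count}
```

The same box means could then be compared with grid-box means from the model, or with the MODIS-based inhomogeneity estimates mentioned above.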
 ACTION: Cyril Morcrette and Ewan O’Connor: Consider regime-dependent processing of the data.
6. Wrap-up session
The discussion at the end of the meeting focussed mainly on ways to ensure more rapid transfer of
single-site data from the Met Office to the Department of Meteorology at Reading, so that it can be
evaluated more routinely. Regarding model data under CloudSat underpasses, Reading report having
received only 3 weeks’ worth of data in a convenient format. However, a lot more has been extracted by
Malcolm Brooks from MASS (this is the time-consuming part), with the sub-satellite data then being
extracted using Alejandro’s script. Julien has the data in Reading, but the problem is that it has not been
fully explained how to read the data and what format it is in. It appears that the model columns are not
provided in the order in which they would be sampled by CloudSat during its orbits. Reading’s view on this
is “if you want your model tested, provide the data in a convenient format”.
A cloud evaluation mailing list was also proposed to facilitate communication.
 ACTION Cyril Morcrette: Work with colleagues to speed up the delivery of Met Office
data over the various ground-based sites. Actions written by Jon Petch:
o Decide what the priority is for processing back data. The global model is 3 years behind and
has stopped, and NAE/UK4 lag by 6 months. Consider what resources and
methodology are needed to ensure better and quicker release of data, and catch up with
the delays in the various models if this is easy.
o Ensure current output is available from NAE/Global so we can see differences
(presumably due to physics). Get global model data for the period of the recent NAE/UK4
comparison to Reading.
o Ensure we are getting some use from looking at operational performance.
o Get a system where we can test our trials against Cloudnet data (is CloudSat also
possible?)
o Get current model data for Cloudnet processing near real time and have some system
where we use this (auto generation of monthly means with mail sent to us?)
o Get UKV data tested as soon as we can
 ACTION: Ben Shipway: Get SCM framework up and running as soon as we can. (Action from Jon
Petch)
 ACTION Alejandro Bodas-Salcedo/Jon Petch/Julien Delanoe: Assist in reading model data under
CloudSat (or possibly provide in a more convenient format).
 ACTION Alejandro Bodas-Salcedo: He has a file containing the sub-satellite point as a function of
time (currently processed up to Jan 2009), so in principle we could use this to extract the correct
model columns from the global and NAE VAR trials (control and PC2 or other) and compare against
CloudSat. (Action from Jon Petch)
 ACTION Jon Petch: Talk to and work with those at the Met Office doing other work with CloudSat,
to ensure they benefit from the comparisons.
 ACTION Robin Hogan: Set up mailing list. This is now done: cloud-evaluation@lists.reading.ac.uk;
subscription to this list is via http://www.lists.rdg.ac.uk/mailman/listinfo/cloud-evaluation.