Arctic snow in a changing cryosphere: What have we learned from observations and CMIP5 simulations?
Chris Derksen and Ross Brown
Climate Research Division, Environment Canada

Thanks to our data providers: Rutgers Global Snow Lab ● National Snow and Ice Data Center ● World Climate Research Programme Working Group on Coupled Modelling ● University of East Anglia – Climatic Research Unit ● NASA Global Modeling and Assimilation Office ● European Centre for Medium-Range Weather Forecasts

Outline
1. Snow in the context of a changing cryosphere
2. Overview of 'observational' snow analyses
   • Validation approaches
   • Inter-dataset agreement
3. Observations versus CMIP5 simulations

Climate Change and the Cryosphere
Observational time series: spring snow cover, summer sea ice, sea level. IPCC AR5 Summary for Policymakers, Figure 3.
Trends in surface temperature, 1901–2012. IPCC AR5 WG1 Chapter 2, Figure 2.21.

Arctic Sea Ice Volume
Arctic sea ice volume anomalies from the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS), U. Washington Polar Science Center.

Canadian Arctic Sea Ice
Canadian Arctic sea ice trends from the Canadian Ice Service Digital Archive. Howell et al., 2013 (updated).

Greenland Ice Sheet Mass Balance
Monthly changes in the total mass (Gt) of the Greenland ice sheet estimated from GRACE measurements. Tedesco et al., 2013, NOAA Arctic Report Card.

Arctic Ice Caps and Glaciers
Mean annual (red) and cumulative (blue) mass balance, 1989–2011, from Arctic glaciers reported to the World Glacier Monitoring Service by January 2013. Sharp et al., 2013, NOAA Arctic Report Card.

Cryosphere Contribution to Sea Level Rise
Rate of ice sheet loss in sea level equivalent, averaged over 5-year periods. IPCC AR5 WG1 Figure 4.17.

Arctic Terrestrial Snow
Snow cover extent (SCE) anomaly time series, 1967–2013 (with respect to 1988–2007), from the NOAA snow chart CDR. Solid line denotes the 5-yr running mean.
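The anomaly time series above is built in two simple steps: subtract a 1988–2007 base-period mean, then smooth with a 5-yr running mean. A minimal sketch of that calculation, using an invented linear toy series rather than the actual NOAA CDR values:

```python
# Sketch of the SCE anomaly calculation: anomalies relative to a
# 1988-2007 base period, smoothed with a 5-yr running mean.
# The input series here is synthetic, not the real NOAA CDR data.

def sce_anomalies(years, sce, base_start=1988, base_end=2007):
    """Anomalies (same units as sce) relative to the base-period mean."""
    base = [v for y, v in zip(years, sce) if base_start <= y <= base_end]
    base_mean = sum(base) / len(base)
    return [v - base_mean for v in sce]

def running_mean(values, window=5):
    """Centered running mean; endpoints use the available shorter window."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

years = list(range(1967, 2014))
sce = [10.0 - 0.05 * (y - 1967) for y in years]   # toy declining series
anom = sce_anomalies(years, sce)
smooth = running_mean(anom)
```

By construction the anomalies average to zero over the base period, which is a useful sanity check on any implementation.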
• Over the 1979–2013 period, NH June snow extent decreased at a rate of -19.9% per decade (relative to the 1981–2010 mean).
• September sea ice extent decreased at -13.0% per decade.
Derksen, C. and Brown, R. (2012) Geophys. Res. Lett.

Active Layer Thickness
Active layer thickness from Siberian stations, 1950 to 2008. IPCC AR5 WG1 Figure 4.23d.

Snow – An Important Hydrological Resource
NASA Earth Observatory

Snow – Highly Variable in Space and Time
Focus on Arctic land areas during the spring season (AMJ): 100% snow cover at the beginning of April; nearly all snow gone by the end of June.

Part 2: Overview of 'observational' snow analyses
• Validation approaches
• Inter-dataset agreement

Hemispheric Snow Datasets
The challenge is not a lack of data…

Description | Period | Resolution | Data Source
NOAA weekly snow/no-snow | 1966–2013 | 190.5 km | Rutgers University, Robinson et al. [1993]
NOAA IMS daily snow/no-snow | 1997–2004 | 24 km | National Snow and Ice Data Center (NSIDC), Ramsay [1998]
NOAA IMS daily snow/no-snow | 2004–2013 | 4 km | NSIDC, Helfrich et al. [2007]
AVHRR Pathfinder daily snow/no-snow | 1982–2004 | 5 km | Canada Centre for Remote Sensing, Zhao and Fernandes [2009]
MODIS 0.05° snow cover fraction | 2000–2013 | ~5 km | NSIDC, Hall et al. [2006]
ERA-40 reconstructed snow cover duration (temperature-index snow model) | 1957–2002 | ~275 km (5 km elev. adjustment) | Environment Canada, Brown et al. [2010]
QuikSCAT-derived snow-off date | 2000–2010 | ~5 km | Environment Canada, Wang et al. [2008]
Daily snow depth analysis (in situ obs + snow model forced by GEM forecast temp/precip fields) | 1998–2013 | ~35 km | Canadian Meteorological Centre, Brasnett [1999]
Daily snow depth analysis (in situ obs + snow model forced by reanalysis temp/precip fields) | 1979–1998 | ~35 km | Environment Canada, Brown et al. [2003]
MERRA reanalysis snow water equivalent (CATCHMENT LSM) | 1979–2013 | 0.5° × 0.67° | NASA, Rienecker et al. [2011]
ERA-Interim reanalysis snow water equivalent (HTESSEL LSM) | 1979–2010 | ~80 km | ECMWF, Balsamo et al. [2013]
GLDAS reanalysis snow water equivalent (Noah LSM) | 1948–2000 / 1948–2010 | 1.0° × 1.0° / 0.25° × 0.25° | NASA, Rodell et al. [2004]
SnowModel (driven by MERRA atmospheric reanalysis) snow water equivalent | 1979–2009 | 10 km | Colorado State, Liston and Hiemstra [2011]
GlobSnow snow water equivalent (satellite passive microwave + climate station obs) | 1978–2013 | 25 km | Finnish Meteorological Institute, Takala et al. [2011]

Validating Snow Products with Ground Measurements
• Lack of in situ observations
• 'Snapshot' datasets
• Spatial representativeness?
• Measurement deficiencies
• Poor reporting practices (non-zero snow depth)

Challenges to Validating Gridded Snow Products with Ground Measurements
This is what product users want to see versus the reality:
[Figure: left, SWE (mm) time series by day of year (2006/07) for the former BERMS sites (Old Jack Pine; 1975, 1994, and 2002 harvest stands) together with GlobSnow SWE v0.9.2; right, relative frequency distribution of SWE (mm) from spatial sampling (n ≈ 5000) across one grid cell.]

'Validating' Gridded Snow Products via Multi-Dataset Comparisons
NH June SCE time series, 1981–2012: NOAA snow chart CDR (red); average of NOAA, MERRA, and ERA-Interim (blue).
• Tendency for NOAA to consistently map less spring snow (~0.5 to 1 million km²) than the multi-dataset average since 2007.
• Accounting for this difference reduces the June NH SCE trend from -1.27 × 10⁶ km² to -1.12 × 10⁶ km².

EUR Oct SCE: difference between the NOAA snow chart CDR and 4 independent datasets, 1982–2005.
• Evidence of an artificial trend (~+1.0 million km² per decade) in October snow cover.
Brown, R. and Derksen, C. (2013) Env. Res. Lett.

A New Multi-Dataset Arctic SCE Anomaly Time Series
Panels: April, May, June.

Part 3: Observations versus CMIP5 simulations

Simulated vs. Observed Arctic SCE
Panels: NA, EUR. Observations: 1. NOAA CDR.
Historical + projected (16 CMIP5 models; RCP8.5 scenario) and observed (NOAA snow chart CDR) snow cover extent for April, May, and June. SCE normalized by the maximum area simulated by each model. Updated from Derksen, C. and Brown, R. (2012) Geophys. Res. Lett.

Simulated vs. Observed Arctic SCE
Panels: NA, EUR. Observations: 1. NOAA CDR; 2. Liston & Hiemstra; 3. MERRA; 4. GLDAS-Noah; 5. ERA-int Recon.
Historical + projected (16 CMIP5 models; RCP8.5 scenario) and multi-observational snow cover extent for April, May, and June. SCE normalized by the maximum area simulated by each model.
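The multi-dataset comparison described above comes down to averaging several SCE series and comparing least-squares trends. A minimal sketch with placeholder series (the dataset names echo the slide, but the slopes and values are invented, not the actual NOAA/MERRA/ERA-Interim data):

```python
# Sketch of the multi-dataset trend comparison: average several SCE
# series and fit an ordinary least-squares trend to each.
# All values below are synthetic placeholders.

def ols_trend(years, values):
    """Least-squares slope (units of `values` per year)."""
    n = len(years)
    ym = sum(years) / n
    vm = sum(values) / n
    num = sum((y - ym) * (v - vm) for y, v in zip(years, values))
    den = sum((y - ym) ** 2 for y in years)
    return num / den

years = list(range(1981, 2013))
datasets = {                                   # toy June SCE, 10^6 km2
    "noaa":   [6.0 - 0.13 * (y - 1981) for y in years],
    "merra":  [6.2 - 0.11 * (y - 1981) for y in years],
    "eraint": [6.1 - 0.12 * (y - 1981) for y in years],
}
multi_mean = [sum(d[i] for d in datasets.values()) / len(datasets)
              for i in range(len(years))]

noaa_trend = ols_trend(years, datasets["noaa"])
multi_trend = ols_trend(years, multi_mean)     # less steep than NOAA alone
```

The same slope fit is what underlies the reported change from -1.27 × 10⁶ km² to -1.12 × 10⁶ km² when the NOAA bias is accounted for.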
Arctic SCE and Surface Temperature Trends: 1980–2009
Panels: SCE and Tsurf trends for NA and EUR. Temperature observations: 1. CRU; 2. GISS; 3. MERRA; 4. ERA-int.
• Simulations slightly underestimate observed spring SCA reductions.
• Similar range in observed versus simulated SCA trends.
• Observed Arctic temperature trends are captured by the CMIP5 ensemble range.

Why do CMIP5 models underestimate observed spring SCE reductions?
Model vs. observed temperature sensitivity (dSCE/dTs), 1981–2010, for North America and Eurasia.
• Models exhibit lower temperature sensitivity (change in SCE per °C of warming) than observations.
• The magnitude of observational dSCE/dTs depends on the choice of observations (both snow and temperature).

Understanding CMIP5 SCE Projections
• Projected changes in snow cover for individual models are predictable from the characteristics of their historical simulations.
• Consistent with a priori expectations, models project greater ΔSCE with:
   - greater standard deviation (σ)
   - greater dSCA/dTs
   - stronger historical trends

Future Work
• CMIP5 models do a fairly good job of replicating the mean seasonal cycle of SWE over the Arctic, but the simulated maximum is higher than observations and the models underestimate the rate of spring depletion.
• Shallow-snow albedo and excess precipitation frequency may together act to keep albedo higher – simulated snowmelt is not 'patchy'.

Conclusions
• The rate of June snow cover extent loss (-19.9% per decade since 1979) is greater than the rate of summer sea ice loss (-13.0% per decade).
• Arctic surface temperatures in spring are well simulated by CMIP5 models, but the models exhibit reduced snow cover extent sensitivity to temperature compared to observations.
• Interannual variability (σ), temperature sensitivity (dSCE/dTs), and historical trends are good predictors of ΔSCE projections to 2050.
• The spread between 5 observational datasets (mean; variability) is approximately the same as across 16 CMIP5 models. A climate modeling group would never run one model once and claim this is the best result.
Why do we gravitate towards this approach with observational analyses?

Questions?

Snow Cover Extent: Inter-Dataset Variability
June snow cover extent (2002); 2004–2008 averages (10⁶ km²):

Dataset | May Avg SCE | June Avg SCE
CMC | 9.0 | 3.0
IMS-24 | 11.0 | 5.1
IMS-4 | 10.6 | 4.7
NCEP | 9.6 | 2.3
NOAA | 10.2 | 2.8
PMW | 11.6 | 4.8
QSCAT | 10.2 | 3.1
MODIS | 10.2 | 3.4
Avg ± 1 std | 10.3 ± 0.80 | 3.7 ± 1.06

Brown et al., 2010, J. Geophys. Res.

Observed vs. Simulated SCE Variability
Panels: CMIP5 versus NOAA (1. NOAA CDR); Liston and Hiemstra (1. NOAA CDR; 2. Liston & Hiemstra); all observations (1. NOAA CDR; 2. Liston & Hiemstra; 3. MERRA; 4. GLDAS-Noah; 5. ERA-int Recon.).
• The NOAA CDR is an outlier with respect to interannual variability.

Understanding CMIP5 SCE Projections
Ratio of interannual variability relative to observations (NOAA) for fall and spring.
• The 3 models with <5% difference in variability for spring versus fall are the 3 top-ranked models at reproducing the spatial pattern of mean snow cover duration over Arctic land areas (vs. NOAA).
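The "Avg ± 1 std" entries in the inter-dataset table can be reproduced directly from the listed June values. This quick check assumes the quoted ±1.06 is a sample (n-1) standard deviation:

```python
# Reproduce the June "Avg ± 1 std" entry of the inter-dataset table
# (values in 10^6 km2, taken from the slide).
import statistics

june_sce = [3.0, 5.1, 4.7, 2.3, 2.8, 4.8, 3.1, 3.4]

avg = statistics.mean(june_sce)    # 3.65, reported as 3.7
std = statistics.stdev(june_sce)   # sample std, ~1.06 as in the table
```

That the sample (rather than population) standard deviation matches the quoted ±1.06 is itself informative when comparing this spread against the CMIP5 inter-model spread.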