The Worldwide LHC Computing Grid
Inauguration Ceremony of the GRID-CERN Laboratory
Kavala, Greece, Tuesday 9 June 2015
Frédéric Hemmer, CERN IT Department Head
CERN IT Department, CH-1211 Genève 23, Switzerland, www.cern.ch/it

The ATLAS experiment
• 7,000 tons, 150 million sensors generating data 40 million times per second, i.e. about a petabyte per second

A collision at LHC

The Data Acquisition

Tier 0 at CERN: Acquisition, First-Pass Reconstruction, Storage & Distribution
• 2011: 4-6 GB/sec
• 2011: 400-500 MB/sec
• 1.25 GB/sec (ions)

The LHC Data Challenge
• The accelerator will run for 20 years
• The experiments produce about 25 million gigabytes of data each year (about 3 million DVDs, or 850 years of movies!)
• LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
• Many cooperating computer centres are required, as CERN can provide only ~20% of the capacity

Solution: the Grid
• Use the Grid to unite the computing resources of particle physics institutes around the world
• The World Wide Web provides seamless access to information stored in many millions of different geographical locations
• The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe

WLCG – what and why?
• A distributed computing infrastructure providing the production and analysis environments for the LHC experiments
• Managed and operated by a worldwide collaboration between the experiments and the participating computer centres
• The resources are distributed, for funding and sociological reasons
• Our task was to make use of the resources available to us, no matter where they are located

Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (12 centres): permanent storage, re-processing, analysis
Tier-2 (~140 centres): simulation, end-user analysis

• ~160 sites in 35 countries
• 300,000 cores
• 200 PB of storage
• 2 million jobs/day
• 10 Gbps links

Data 2008-2013
[Chart: tape usage breakdown across CERN tape writes, CERN tape archive, and CERN tape verification: 27 PB, 23 PB, 15 PB]
• Data loss: ~65 GB over 69 tapes
• Duration: ~2.5 years

No stop for the Computing!

Processing on the Grid
[Chart: jobs per month by experiment (ALICE, ATLAS, CMS, LHCb), October 2004 to April 2013, peaking around 6×10^7 jobs/month, i.e. ~2 million jobs/day]
[Chart: HEPSPEC06 hours per month by experiment over the same period, reaching 1.4×10^9 HEPSPEC06 hours/month, equivalent to ~210,000 CPUs in continuous use; close to full capacity]

Broader Impact of the LHC Computing Grid
• WLCG has been leveraged on both sides of the Atlantic to benefit the wider scientific community
• Europe: Enabling Grids for E-sciencE (EGEE), 2004-2010; European Grid Infrastructure (EGI), 2010-
• USA: Open Science Grid (OSG), 2006-2012 (+ extension?)
• Many scientific applications: archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences

… LHC is restarting …

GR-12-TEIKAV Up and Running!

CERN School of Computing 2015
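As a rough sanity check, the headline figures quoted in these slides are mutually consistent. A minimal sketch in Python, assuming dual-layer 8.5 GB DVDs, an average movie length of ~2.5 hours, and a 30-day month (none of these assumptions appear in the slides themselves):

```python
# Back-of-the-envelope checks on figures from the slides:
# ~25 million GB/year, ~3 million DVDs, ~850 years of movies,
# a peak of ~6e7 jobs/month, quoted as "2 million jobs/day".
# Assumed (not in the slides): DVD capacity, movie length, month length.

data_per_year_gb = 25_000_000   # ~25 million gigabytes per year
dvd_capacity_gb = 8.5           # assumption: dual-layer DVD
movie_hours = 2.5               # assumption: average movie length

dvds = data_per_year_gb / dvd_capacity_gb
movie_years = dvds * movie_hours / (24 * 365)

jobs_per_month = 6.0e7              # peak of the jobs-per-month chart
jobs_per_day = jobs_per_month / 30  # assumption: 30-day month

print(f"~{dvds / 1e6:.1f} million DVDs")              # ~2.9 million
print(f"~{movie_years:.0f} years of movies")          # ~839 years
print(f"~{jobs_per_day / 1e6:.0f} million jobs/day")  # ~2 million
```

Under these assumptions the "3 million DVDs" and "850 years of movies" figures come out in the right ballpark, and the 2 million jobs/day quoted on the WLCG slide follows directly from the peak of the jobs-per-month chart.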