DC2

News from the ATLAS SW Week, March 2004
ATLAS SW CZ Seminar, April 2004
Jiří Chudoba
FzÚ AV ČR (Institute of Physics, Academy of Sciences of the Czech Republic)
ATLAS Software & Computing Week - 1 Mar. 2004
ATLAS Computing Timeline
2003
• POOL/SEAL release (done)
• ATLAS release 7 (with POOL persistency) (done)
• LCG-1 deployment (in progress)
2004
NOW
• ATLAS complete Geant4 validation (done)
• ATLAS release 8
• DC2 Phase 1: simulation production
2005
• DC2 Phase 2: intensive reconstruction (the real challenge!)
• Combined test beams (barrel wedge)
• Computing Model paper
2006
• Computing Memorandum of Understanding (moved to end 2004)
• ATLAS Computing TDR and LCG TDR
• DC3: produce data for PRR and test LCG-n
2007
• Physics Readiness Report
• Start commissioning run
• GO!
Dario Barberis: Introduction & News
Near-term Software Release Plan

7.5.0: 14th Jan 2004

7.6.0: 4th Feb

7.7.0: 25th Feb <- SPMB Decision 3rd Feb

8.0.0: 17th Mar <- DC2 & CTB Simulation Release

8.1.0: 7th Apr

8.2.0: 28th Apr

8.3.0: 19th May

9.0.0: 9th Jun <- DC2 & CTB Reconstruction Release

9.1.0: 30th Jun

9.2.0: 21st Jul

9.3.0: 11th Aug
⇐ 15th Feb: LAr Technical run starts
⇐ 1st May: Baseline DC-2 Simulation starts
⇐ 10th May: Testbeam starts
⇐ 1st Jul: Baseline DC-2 Reconstruction starts
⇐ 14th Jul: Complete Testbeam
⇐ 15th Jul: Baseline DC-2 Physics Analysis starts
Release 8
Expected Major Milestones
 GEANT4 Simulation
• DC2 (in validation)
• Test Beam (underway)
• Pile-Up, Digitization in Athena (debugging)
 GeoModel
• Inner Detector, Muon Spectrometer
 Conversion to CLHEP Units (mm, MeV, [-pi,pi])
 POOL/SEAL Persistency
 Bytestream Converters
 Preliminary Conditions Capabilities
Other Stepping Stones
 Move to InstallArea
 jobOption.txt --> jobOption.py
 Distribution Kits
ATLAS DC2
ATLAS Software Workshop
2 March 2004
Gilbert Poulard
CERN PH-ATC
DC2: goals
 At this stage the goals include:
 Full use of Geant4, POOL and the LCG applications
 Pile-up and digitization in Athena
 Deployment of the complete Event Data Model and the Detector Description
 Simulation of the full ATLAS detector and of the 2004 combined Testbeam
 Testing the calibration and alignment procedures
 Wide use of the GRID middleware and tools
 Large-scale physics analysis
 Computing model studies (document due end 2004)
 Running as much of the production as possible on LCG-2
DC2 operation
 Consider DC2 as a three-part operation:
o part I: production of simulated data (May-June 2004)
 needs Geant4, digitization and pile-up in Athena, POOL persistency
 “minimal” reconstruction, just to validate the simulation suite
 will run on any computing facilities we can get access to around the world
o part II: test of Tier-0 operation (July 2004)
 needs the full reconstruction software following the RTF report design, and the definition of AODs and TAGs
 (calibration/alignment and) reconstruction will run on the Tier-0 prototype as if data were coming from the online system (at 10% of the rate)
 output (ESD+AOD) will be distributed to Tier-1s in real time for analysis
o part III: test of distributed analysis on the Grid (August-October 2004)
 access to event and non-event data from anywhere in the world, both in organized and chaotic ways
o in parallel: run distributed reconstruction on simulated data
DC2: Scenario & Time scale
September 03: Release 7
 Put in place, understand & validate: Geant4; POOL; LCG applications; Event Data Model
 Digitization; pile-up; byte-stream conversion
 Conversion of DC1 data to POOL; large-scale persistency tests and reconstruction
 Testing and validation
March 17th 04: Release 8 (production)
 Run test production
 Start final validation
May 3rd 04: Start simulation
 Pile-up & digitization
 Event mixing
 Transfer data to CERN
July 1st 04: “DC2”
 Intensive reconstruction on “Tier-0”
 Distribution of ESD & AOD
 Calibration; alignment
August 1st 04:
 Start physics analysis
 Reprocessing
DC2 resources

Process                    No. of   Time      CPU power   Volume    At CERN   Off site
                           events   (months)  (kSI2k)     (TB)      (TB)      (TB)
Phase I (May-June)
  Simulation               10^7     2         600         25        5         20
  Pile-up (*)/Digitization 10^7     2         400         75        15        60
  Byte-stream              10^7     2         (small)     20        20        16
  Total Phase I            10^7     2         1000        120       40        96
Phase II (July)
  Reconstruction Tier-0    10^7     0.5       600         5         5         10
  Reconstruction Tier-1    10^7     2         600         5         0         5
Total                      10^7                           130       45        111
Atlas Production System schema
Task = [job]*    Dataset = [partition]*
 A Task corresponds to a Dataset; its description carries the transformation definition, a physics signature and a location hint. A Job corresponds to a Partition; its record carries the executable name, the release version and the transformation signature, plus run info and a location hint.
 Job descriptions live in a database, alongside AMI and the Data Management System; human intervention is possible at the task level.
 Four supervisors pull jobs and hand them to executers: a US Grid executer (Chimera), an LCG executer (via the Resource Broker), a NorduGrid (NG) executer, and an LSF executer for the local batch system.
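The Task = [job]* / Dataset = [partition]* bookkeeping above can be sketched with plain data classes (class and field names are illustrative, not the actual AMI / production-database schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Partition:
    """One output file; the job-level record carries the executable
    name, release version and transformation signature."""
    executable: str
    release: str
    signature: str

@dataclass
class Job:
    """A job produces exactly one partition; the location hint tells
    the supervisor where the job should preferably run."""
    partition: Partition
    location_hint: Optional[str] = None

@dataclass
class Task:
    """A task is a list of jobs; the corresponding dataset is the list
    of partitions those jobs produce."""
    transformation: str
    physics_signature: str
    location_hint: Optional[str] = None
    jobs: List[Job] = field(default_factory=list)

    @property
    def dataset(self) -> List[Partition]:
        return [job.partition for job in self.jobs]

# A supervisor would pull jobs from such a task and hand them to one of
# the executers (US Grid, LCG, NG, or local batch via LSF).
sim = Task("g4sim", "dc2.sample")  # illustrative names, not real task identifiers
sim.jobs.append(Job(Partition("athena", "8.0.0", "g4sim")))
print(len(sim.dataset))  # 1
```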
Tiers in DC2
 Tier-0
o 20% of the simulation will be done at CERN
o All data in ByteStream format (~16 TB) will be copied to CERN
o Reconstruction will be done at CERN (in ~10 days)
o Reconstruction output (ESD) will be exported from Tier-0 in 2 copies (2 × ~5 TB)
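For a feel of the numbers: reconstructing the 10^7-event sample within the ~10-day window implies a sustained rate of roughly a dozen events per second. A back-of-the-envelope estimate (not an official DC2 figure):

```python
# Rough Tier-0 throughput implied by the figures on this slide
events = 10_000_000           # Phase I sample size (10^7 events)
days = 10                     # reconstruction window quoted above
rate_hz = events / (days * 24 * 3600)
print(f"{rate_hz:.1f} events/s")  # ~11.6 events/s
```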
Tiers in DC2
 Tier-1s will have to
o Host simulated data produced by them or coming from Tier-2s, plus the ESD (& AOD) coming from Tier-0
o Run reconstruction in parallel to the Tier-0 exercise (~2 months)
 This will include links to MCTruth
 Produce and host ESD and AOD
o Provide access to the ATLAS V.O. members
 Tier-2s
o Run simulation (and other components if they wish to)
o Copy (replicate) their data to Tier-1
 ATLAS is committed to LCG
o All information should be entered into the relevant database and catalog
Core sites and commitments

Site      Immediate   Later
CERN      200         1200
CNAF      200         500
FNAL      10          ?
FZK       100         ?
Nikhef    124         180
PIC       100         300
RAL       70          250
Taipei    60          ?
Russia    30          50
Prague    17          40
Budapest  100         ?
Totals    864(+147)   >2600(+>90)

(The table covers the initial LCG-2 core sites plus other firm commitments; the figures in parentheses in the totals are the latter.)
Will bring in the other 20 LCG-1 sites as quickly as possible
Comments on schedule
 The change of the schedule has been “driven” by
o the ATLAS side:
 the readiness of the software
• the combined test beam has the highest priority
 the availability of the production tools
• the integration with the Grid is not always easy
o the Grid side:
 the readiness of LCG
• we would prefer to run Grid-only!
 priorities are not defined by ATLAS alone
 For the Tier-0 exercise
o It will be difficult to define the starting date before we have a better idea of how the “pile-up” and “event-mixing” processes work
Status of the
Software for
Data Challenge 2
Armin Nairz
Conclusions
 Generation and GEANT4 Simulation (as in Rel. 7.5.0+) are already in a production-like state
• stable and robust
• CPU times per event and event sizes are ‘within specifications’
 Digitisation (as in Rel. 7.6.0+) could not be tested for all sub-detectors (missing or not working)
• for the tested ones, digitisation is working and stable
• reason for confidence in a working digitisation procedure for the whole detector in/after Rel. 7.7.0
 Pile-up not yet fully functional
 Documentation on pre-production activities available from the DC webpage
• http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/DC/DC2/preprod
• also contains how-to’s (running event generation, simulation, digitisation)
ATLAS Software Workshop, CERN, March 1-5, 2004
Next meetings
 GRID
 Analysis Tools
 Distributed Analysis
 ...
 Atlantis Tutorial
 Athena Tutorial
ATLAS SW CZ Seminar
2 April 2004
chudoba@fzu.cz