Very quick ATLAS software tutorial
Iacopo Vivarelli (not an expert, just a user)
INFN Pisa
What you find here:
- How to use CMT and run Athena
- How to use ATLFAST
- Useful (at least, I hope) information about instructions, HowTos, tutorials
Conventions:
red – commands for the prompt
blue – hyperlinks
violet – the OS answer (terminal output)
Athena - where?
● If one has an account on lxplus, that is the right place to start
● The Athena kit can be downloaded and installed anywhere. Only executables and libraries are shipped, but the sources can be checked out from a repository (e.g. CERN)
● A lot of useful information is on the ATLAS computing page and in the ATLAS Wiki page
● For the computing:
http://atlas.web.cern.ch/Atlas/GROUPS/SOFTWARE/OO/Applications/
● For the Wiki:
https://uimon.cern.ch/twiki/bin/view/Atlas/AtlasComputing
● Most of the information you might need is there
ATLFAST
● In the following I will explain how to generate events and use ATLFAST (fast parametrized response of the detector)
● I won't go through the full ATLAS simulation and reconstruction; I will just give references for it
● The final strategy in ATLAS will be to do the analysis inside Athena on Analysis Object Data (AOD) (and Event Summary Data, ESD).
● Instead, I will show how to produce a ROOT (or PAW) ntuple. Refer to the computing page to learn more about AODs and the Analysis Tools for them.
● ATLFAST is a particle-level simulation. It is extremely useful if one wants to generate a huge number of events quickly (it takes about 1-2 hours to generate (PYTHIA) and simulate/reconstruct 100k tt events on my Pentium 4 2.5 GHz single-processor machine). One can easily (and quickly) get millions of events.
● It is not accurate: the response of the tracker, calorimeters and muon spectrometer is only parametrized (from the full simulation). An analysis is robust only if its key points are "certified" with the full simulation
How to run ATLFAST
● Everything I will say refers to lxplus at CERN. Minor changes are needed if you want to run your own Athena installation at home.
● A good reference (actually this is for interactive Athena, which I won't talk about; anyway, most of the commands and the jobOptions are the same):
http://stavrop.home.cern.ch/stavrop/Generators/tutorial/generators.html
● I will go step by step
Setting up a main CMT directory (more information)
● Create an empty directory (e.g. main_cmt)
● Go into it and edit a file named requirements. It should be as follows (a sketch of the directory setup is given right after the requirements file):
# Set the site where you are working. For CERN set:
set CMTSITE CERN
# For portables:
#set CMTSITE EXTSITE
# Other possible values are: LBNL, BNL
# Set the base location of all releases for your site:
macro ATLAS_DIST_AREA "/afs/cern.ch/atlas/software/dist"
# Set the release number; here it is 9.0.4 (which is a good release for Atlfast)
macro ATLAS_RELEASE "9.0.4"
# the AtlasSettings package centralizes the setting of basic
# environment variables
use AtlasSettings v* $(ATLAS_DIST_AREA)
# Set the location of your preferred development area, where packages
# will be checked out, (change "${HOME}/MyTest" to be any directory
# path or soft link under your home directory):
macro devarea "${HOME}/MyTest"
# This is the place where you are going to work with packages
# Add dev area to the front of your CMTPATH (but first remove any
# previously defined devarea which is under your home directory)
macro home_dir "${HOME}/scratch0/AtlasWork/Athena9.0.4"
path_remove CMTPATH "${home_dir}"
path_prepend CMTPATH "$(devarea)"
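Putting this slide together, the two directories referenced above can be created first (just a sketch; I am assuming that main_cmt, like MyTest, lives under your home directory):
mkdir ${HOME}/main_cmt   # the main CMT directory
mkdir ${HOME}/MyTest     # the devarea set in the requirements file
cd ${HOME}/main_cmt      # the requirements file shown above goes in here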
Setting up CMT
● Check here which version of CMT you have to use.
● For release 9.0.4 it is v1r16p20040901
● From the main_cmt directory type:
# only the first time:
source /afs/cern.ch/sw/contrib/CMT/v1r16p20040901/mgr/setup.sh
cmt config
# every time you log in and want to use Athena:
source setup.sh -tag=opt
● CMT should now be set up. To check, verify that the environment variable CMTPATH has your installation directory (MyTest in the previous slide) as its first entry and that CMTROOT points to the right CMT version (see the quick check below)
● The -tag=opt option means that you use the optimized build of the packages, i.e. you don't need to run a debugger on them (see later). Leave it there if you only want to use Athena, remove it if you plan to develop software
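A quick check (my own suggestion, not from the original instructions):
echo $CMTPATH   # should start with ${HOME}/MyTest
echo $CMTROOT   # should point to /afs/cern.ch/sw/contrib/CMT/v1r16p20040901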
Checking out packages
● Now you have to check out a package
● The software tree can be browsed here
● Use RecExCommon: fine for ATLFAST and ready for the full reconstruction
● Check which tag of the wanted package is in the release you want:
ls /afs/cern.ch/atlas/software/dist/9.0.4/Reconstruction/RecExample/RecExCommon/
RecExCommon-00-03-02-11
● So we want tag 00-03-02-11 of RecExCommon
● Go into the MyTest directory
● Then check it out:
cmt co -r RecExCommon-00-03-02-11 Reconstruction/RecExample/RecExCommon
● This produces a Reconstruction/RecExample/RecExCommon/RecExCommon-00-03-02-11 directory in your MyTest. Go there.
Set up and compile packages
● There are some important directories here:
cmt – directory to set up CMT for the package, to compile, ...
share – the jobOptions (see later) of the package are here
doc – some scripts and maybe documentation
● If instead of RecExCommon one checks out another package, e.g.
cmt co -r JetRec-00-03-33 Reconstruction/JetRec
then there are two more directories:
JetRec (the headers directory)
src (the sources directory)
● Go into the cmt directory. Then:
cmt config
source setup.sh    # this also has to be done each time you log in
gmake
Almost done.....
● If no error messages appear, then Athena should be ready to run. One can check that athena.py is in the $PATH:
which athena.py
/afs/cern.ch/atlas/software/dist/9.0.4/InstallArea/share/bin/athena.py
● You are in the cmt directory: go into the ../run directory and run RecExCommon_links.sh. This produces a number of files in the run directory
● Be sure that the file PDGTABLE.MeV is there
● To summarize, after setting everything up, each time you log in you have to do the following (a small script collecting these steps is sketched after this list):
In the main_cmt directory: source setup.sh -tag=opt
In the MyTest directory:
cd Reconstruction/RecExample/RecExCommon/RecExCommon-<Tag>/cmt
source setup.sh
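If you like, these per-login steps can be collected in a small script to be sourced (just a sketch; I assume main_cmt sits under your home directory and the RecExCommon tag is the one checked out earlier):
# athena_setup.sh - source it (". athena_setup.sh") so the environment changes survive in your shell
cd ${HOME}/main_cmt
source setup.sh -tag=opt
cd ${HOME}/MyTest/Reconstruction/RecExample/RecExCommon/RecExCommon-00-03-02-11/cmt
source setup.sh
cd ../run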
Setting up a jobOption
● In the following I assume we run Pythia (with the default settings) and use ATLFAST on the events produced.
● A jobOptions file is the input file for Athena: it specifies what you are going to do.
● It uses Python
● You can use the following skeleton (let's name it my_atl_job.py)
# This job options file runs Atlfast in the Athena framework
theApp.setup( MONTECARLO )
# theApp is the main application object of Athena
include( "PartPropSvc/PartPropSvc.py" )
# Load the relevant libraries: the following lines load Pythia, Atlfast, Gaudi (the core of Athena)
theApp.Dlls += [ "Pythia_i"]
theApp.Dlls += [ "GaudiAlg"]
theApp.Dlls += [ "AtlfastAlgs"]
theApp.Dlls += [ "RootHistCnv" ]
theApp.ExtSvc += ["AtRndmGenSvc"]
#--------------------------------------------------------------
# Persistency services
#--------------------------------------------------------------
# you should not need to change this
EventPersistencySvc.CnvServices = [ "McCnvSvc" ]
# Setting the Pythia seeds
AtRndmGenSvc = Service( "AtRndmGenSvc" )
AtRndmGenSvc.Seeds = [ "PYTHIA_INIT 237662 57728382", "PYTHIA 29311444 566058" ]  # random seeds
# This is the name of the file where your histograms will be created.
#
theApp.HistogramPersistency="ROOT"
HistogramPersistencySvc = Service( "HistogramPersistencySvc" )
HistogramPersistencySvc.OutputFile = "ArtemisAtlfastJob.rz"
NtupleSvc = Service( "NtupleSvc" )
NtupleSvc.Output = ["FILE1 DATAFILE='ttH.root' TYP='ROOT' OPT='NEW'"]
#--------------------------------------------------------------
# Event related parameters
#--------------------------------------------------------------
# Number of events to be processed (default is 10)
theApp.EvtMax = 50000  # maximum number of events
#--------------------------------------------------------------
# Private Application Configuration options
#--------------------------------------------------------------
# OUTPUT PRINTOUT LEVEL
# Set output level threshold (2=DEBUG, 3=INFO, 4=WARNING, 5=ERROR, 6=FATAL)
# you can override this for individual modules if necessary
MessageSvc.OutputLevel = 5
# Execution of algorithms given in sequencer
#
theApp.TopAlg = ["Sequencer/TopSequence"]
# the main application runs a sequence of algorithms
TopSequence = Algorithm( "TopSequence" )
TopSequence.Members = ["Sequencer/Generator","Sequencer/Atlfast"]  # the algorithms that belong to the sequence
#--------------------------------------------------------------
# GENERATORS SETUP (select by uncommenting/commenting)
#--------------------------------------------------------------
#Generator.Members = ["Isajet"]
Generator = Algorithm( "Generator" )
Generator.Members = ["Pythia"]
# uncomment this option if you want top
# contribution to cross sections
Pythia = Algorithm( "Pythia" )
include( "ttH.inc" )
# all the PYTHIA settings are in this file, see the next slide
include( "AtlfastAlgs/AtlfastStandardOptions.py" )
# all the Atlfast settings are in this file
Have a look at the ATLFAST webpage for more information
This is the content of the ttH.inc file. Of course it can be put directly into the jobOptions:

Pythia.PythiaCommand = [ "pyinit pylisti 12",\
"pysubs msel 0",\
"pysubs msub 121 1",\
"pysubs msub 122 1",\
"pypars mstp 82 4",\
"pydat1 mstj 22 2",\
"pydat1 mstj 11 3",\
"pydat1 parj 54 -0.07",\
"pydat1 parj 55 -0.006",\
"pypars parp 82 2.2",\
"pypars mstp 128 0"]

The syntax of each string is:
- name of the PYTHIA common block or routine (pysubs, pypars, pydat1, pyinit, ...)
- name of the PYTHIA variable (array)
- index of the component
- value
For example, "pysubs msub 121 1" sets MSUB(121) = 1 in common block PYSUBS, switching on subprocess 121.
Let's have a look at the Atlfast options. We need the AtlfastAlgs/AtlfastStandardOptions.py file.
To get a jobOptions file from the distribution, one can use the get_joboptions command:
get_joboptions AtlfastStandardOptions.py
The first lines set some algorithms that can be executed
#Atlfast.members +={"Atlfast::TrackMaker/TrackMaker"};
Atlfast = Algorithm( "Atlfast" )
Atlfast.members +=["Atlfast::Monitor/Monitor"]
#Atlfast.members +={"Atlfast::TrackNtupleMaker/TrackNtupleMaker"};
# Atlfast.members +={"Atlfast::StandardNtupleMaker/StandardNtupleMaker"};   # <-- uncomment this one
# Atlfast.members +={"Atlfast::ExampleAnalysis/ExampleAnalysis"};
# Atlfast.members +={"Atlfast::AtlfastProtoJetMaker/AtlfastProtoJetMaker"};
If one uncomments the marked line (and changes the syntax to Python, see below), one gets the standalone Atlfast ntuple. Then one changes the last line of the jobOptions. Change
include( "AtlfastAlgs/AtlfastStandardOptions.py" )
into
include( "AtlfastStandardOptions.py" )
to use your local AtlfastStandardOptions.py.
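For reference, the uncommented line in Python syntax reads (my transcription of the commented line above):
Atlfast.members += ["Atlfast::StandardNtupleMaker/StandardNtupleMaker"]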
Then run. This should produce a ROOT ntuple:
athena.py my_atl_job.py
If you don't get error messages and at the end you have a file called ttH.root in your directory, then your first job has run successfully.
Most of the parameters for the ATLFAST simulation/reconstruction are in AtlfastStandardOptions.py:
the minimum pT for electron, photon, and (isolated and non-isolated) muon reconstruction, the reconstruction cone for clusters and jets, the values of the b-tagging and tau-tagging efficiencies, and many other options are there.
The Atlfast manual is useful to understand the meaning of the options (it is the manual for the Fortran version, but there are no big differences).
Something about the options can also be found on
http://www.hep.ucl.ac.uk/atlas/atlfast/
What is in the ntuple
Look here to get more information about the Atlfast ntuple (a quick PyROOT look at the file is sketched at the end of this slide).
There are a number of blocks, with a similar structure.
PLEPTONS  Isolated electrons and muons. Available variables: nele, pxele, pyele, pzele, eeele, kfele (the PDG code of the reconstructed electron); the same for muons.
PPHOTONS  Isolated photons. Available variables: the same
PMUXS     Non-isolated muons. Available variables: the same
PPJETS    Jets and AtlfastB jets. Available variables: the same for jets, plus PTcalo, PTbjet, PTujet (pT of all the jets, of the true b jets, of the true u jets), then the same for AtlfastB jets.
PHISTORY  Parton level information
PMISSING  Missing ET information
Be careful: pxjetb IS NOT the px of b-jets. The b flag at the end of the
jet variable name means that those are the variables obtained running
AtlfastB.
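A minimal PyROOT sketch to have a first look at the file (the name of the ntuple inside ttH.root is an assumption; check it with f.ls() or a TBrowser):

import ROOT
f = ROOT.TFile.Open("ttH.root")   # the file produced by the Atlfast job above
f.ls()                            # list what is stored in the file
t = f.Get("h100")                 # hypothetical ntuple name, replace it with what f.ls() shows
if t:
    t.Draw("pxele")               # px of isolated electrons (PLEPTONS block)
    t.Draw("pxjetb")              # px of AtlfastB (tagged and calibrated) jets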
What is AtlfastB: it is a simple routine that provides
• b-jet and tau-jet identification according to the tagging and mistagging efficiencies set in the AtlfastStandardOptions.py file. With the default settings the b-tagging efficiency is 50%, with pT-dependent mistagging for c and u jets. For each jet a random number is thrown and compared with the tagging probability corresponding to the true identifier (KFJET).
• After the jet tagging, calibration constants for the out-of-cone energy are applied.
One should use the jetb variables (pxjetb, pyjetb, etc.) if one is interested in jets with the reconstructed identification and calibration applied.
Some further information
- Instructions for downloading and installing Athena somewhere: you need a machine with RH 7.3 or SLC3 installed (I don't know what happens with a different OS). It needs about 5 GB on the hard disk (maybe more during the installation). Then follow the instructions: InstallingAtlasSoftware < Atlas < TWiki
- Running Herwig / how to interface external events (generated with Alpgen, AcerMC, CompHEP, MC@NLO, etc.) with Pythia/ATLFAST or Herwig/ATLFAST: index
- Running the ATLAS reconstruction on fully simulated events: Reconstruction in Athena
- Trigger simulation (LVL1 and HLT): see Atlas UK Simulation - Level2
For any question you can contact me: iacopo.vivarelli@pi.infn.it
Either I can answer or I can redirect the question.