Scalable CaloTracker (SCT) proposal for universal particle
detector from zero till practically infinite energies
European Strategy for Particle Physics Open Symposium
Krakow 10-12 September 2012
A. Agocs and G. Vesztergombi
Wigner RCP, Budapest, Hungary and Roland Eotvos University, Budapest, Hungary
Abstract
After the discovery of the Higgs boson, the particle physics community should turn its attention
to new challenges, which could be directed toward ultra-high energies. The hope for new
accelerator technologies is greatly enhanced by the invention of laser-driven plasma wakefield
methods, which may at some future time produce beams of PeV (10^15 eV) energy.
For comparison on the expected time scale, one can mention that LHC was conceived in
1984 and will hopefully reach full energy in 2014. It is an interesting question what type
of detector system would be applicable in this energy range, for both fixed-target and collider
arrangements.
Here we should like to present a new concept which is radically different from the
present onion-shell design (vertex pixel detector, tracker, EM calorimeter, hadron calorimeter and
muon detector). The new system would have a completely homogeneous structure built
from standard elements in a scalable way, serving at the same time as a very fine resolution
TRACKER and a full-absorption CALORIMETER with full 4π coverage for both
charged and neutral particles, with the usual exception of penetrating neutrinos. Though
one does not expect accelerators with PeV beams before 2050, the SCT
(Scalable CaloTracker) detector principle can be tested at lower energies thanks to its modular,
scalable structure. The key element is a massively parallel information system which can process
the complete shower development on a track-by-track basis at adaptive granulation levels.
Introduction
Today "high energy" in accelerator physics means a few TeV, corresponding to the peak
energy of LHC, but energies beyond a Peta-electronVolt (10^15 eV) are rather common in
cosmic-ray physics. One can cite the two most prominent examples:
“The Pierre Auger Observatory is an international cosmic ray observatory designed to
detect ultra-high-energy cosmic rays: single sub-atomic particles (protons or atomic
nuclei) with energies beyond 10^20 eV (about the energy of a tennis ball traveling at
80 km/h). These high energy particles have an estimated arrival rate of just 1 per km2 per
century, therefore the Auger Observatory has created a detection area the size of Rhode
Island — over 3,000 km2 (1,200 sq mi) — in order to record a large number of these
events. It is located in western Argentina's Mendoza Province, near the Andes.” [1]
“The IceCube Neutrino Observatory (or simply IceCube) is a neutrino telescope
constructed at the Amundsen-Scott South Pole Station in Antarctica.
.... IceCube is sensitive mostly to high energy neutrinos, in the range of 10^11 to about
10^21 eV. Estimates predict a neutrino event about every 20 minutes in the fully
constructed IceCube detector.
However, there is a large background of muons created not by neutrinos from
astrophysical sources but by cosmic rays impacting the atmosphere above the detector.
There are about 10^6 times more cosmic ray muons than neutrino-induced muons observed
in IceCube. Most of these can be rejected using the fact that they are traveling
downwards. Most of the remaining (up-going) events are from neutrinos, but most of
these neutrinos are from cosmic rays hitting the far side of the Earth; some unknown
fraction may come from astronomical sources, and these neutrinos are the key to IceCube
point source searches. Estimates predict the detection of about 75 upgoing neutrinos per
day in the fully constructed IceCube detector. The arrival directions of these
astrophysical neutrinos are the points with which the IceCube telescope maps the sky. To
distinguish these two types of neutrinos statistically, the direction and energy of the
incoming neutrino is estimated from its collision by-products. Unexpected excesses in
energy or excesses from a given spatial direction indicate an extraterrestrial source.” [2]
These examples illustrate well the advantages and disadvantages of UHECR (Ultra High
Energy Cosmic Ray) detectors. Due to the GZK limit [3], they can reach the practically
"infinite" energy available in our Universe, around 10^21 eV (10^12 GeV), which is still 3-4 orders of
magnitude below the GUT energy. One has a cheap source of UHECR particles up to
"infinite" energy, but they are extremely rare and arrive from unknown directions; therefore one
requires astronomical-size detectors and extremely long measuring times.
In parallel, during the last 100 years accelerators developed gradually from Rutherford's
7 MeV alpha source in 1911 to the 7 TeV LHC collider in 2011, a progress corresponding
to exactly ONE MILLION times increase in beam energy during a century. In the hope of
technological breakthroughs from laser-driven plasma wakefield accelerating methods,
one can predict by extrapolation 7 EeV (Exa-electronVolt, 10^18 eV) as the peak energy by 2111,
which gives by interpolation 7 Peta-electronVolt by 2061; thus one can hope for nearly 1 PeV
around 2050. By this time we should like to have our universal particle detector ready.
After the Higgs discovery, the next challenge may be the search for composite quarks. Some
dreams along these lines are formulated in ref. [4].
CaloTracker principle
In experimental particle physics one makes a sharp distinction between tracker and
calorimeter detectors. In reality, however, a gradual transition from one type of detector
toward the other is possible. For example, there are laterally segmented calorimeters which
can be converted, in a gedanken experiment, into trackers by decreasing the material density.
A more direct way of conversion follows if one takes into account that any tracker contains
some material: taking a large enough number of tracker planes and placing them one behind
the other, the accumulated long detector reaches several radiation and interaction lengths in
total, which produces electromagnetic and hadronic showers in a useless manner, because
the information for charged particles is already contained in the trajectories before the
showering.
Continuing the gedanken experiment by increasing the particle momentum a thousand times
at fixed detector length and magnetic field, the situation changes dramatically. The relative
accuracy of the momentum measurement degrades a thousand times, because the absolute
error on the curvature measurement remains the same. If the accuracy at 1 TeV/c in the
tracker measuring mode (switching off the showering in the gedanken experiment) was 0.1 %,
then at 1 PeV/c one would get a 100% measuring error. Switching on the showering in the
gedanken experiment and assuming a 5-7 interaction length deep detector, the momenta of
the secondary particles drop below 1 TeV/c and can be measured as accurately as at low
energy for each secondary, adding up to an even more accurate sum. One gains from the fact
that the length of the calorimeter increases only logarithmically with energy. Of course, the
total track length remains practically the same, because one has to execute parallel tracking
in the same volume.
One can deduce the CaloTracker rule: it works only if one is able to track large particle
multiplicities simultaneously in the detector volume, as shown in Fig.1.
[Figure 1 sketch: a high-momentum parent, not measurable by tracking alone, produces 1st, 2nd
and 3rd generation secondaries of decreasing momentum; the labels contrast the tracker limit
(no interaction), the CaloTracker regime (momentum measurable in the higher generations) and
the calorimeter limit (no individual tracks).]
Figure 1. CaloTracker principle. Illustrative shower example with fixed interaction
length and fixed number of secondaries.
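To make the rule above concrete, here is a minimal toy sketch in Python (our own illustration, not part of the proposal): a 1 PeV parent is split into a fixed number of equal-energy secondaries per generation, and we simply count how many generations are needed before the secondaries become measurable with the assumed 0.1 % tracker accuracy at 1 TeV/c.

```python
# Toy illustration of the CaloTracker principle (assumption-laden sketch, not the
# authors' simulation): a 1 PeV parent splits into a fixed number of secondaries
# per interaction length, sharing the energy equally.

E_PARENT = 1.0e15          # parent energy in eV (1 PeV)
N_SECONDARIES = 10         # assumed multiplicity per hadronic interaction
TRACKABLE = 1.0e12         # 1 TeV: below this, tracking reaches ~0.1 % accuracy
REL_ERR_AT_1TEV = 1.0e-3   # assumed tracker resolution at 1 TeV/c

energy = E_PARENT
generation = 0
while energy > TRACKABLE:
    generation += 1
    energy /= N_SECONDARIES             # average energy per secondary in this generation

n_tracks = N_SECONDARIES ** generation  # tracks to follow in parallel at that depth
# The curvature error is absolute, so the relative error scales with momentum:
rel_err_per_track = REL_ERR_AT_1TEV * energy / 1.0e12
print(f"after {generation} generations: ~{energy/1e12:.1f} TeV per secondary,")
print(f"{n_tracks} tracks to follow, ~{100*rel_err_per_track:.2f} % error per track")
```

The required depth grows only logarithmically with the primary energy, while the number of tracks to be followed in parallel grows roughly in proportion to the energy itself, which is exactly the CaloTracker rule stated above.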
Besides the logarithmic gain there are other essential bonuses in the CaloTracker device: it
is equally good for charged and neutral particles (as was proposed for EXChALIBUR
[5]); additionally, if one chooses the right material composition (the ratio of Z0 and λ), one
can easily distinguish between different particle identities: electron, hadron and muon.
Just for illustration, one can take the following rather realistic gedanken experiment. At the
PS neutrino beam the BEBC neutrino detector was detecting particles in the 10 GeV/c
momentum range. How could one convert it for detecting particles in the Peta- to
Zeta-electronVolt energy range of IceCube, assuming a concentrated neutrino source of that
energy existed? Fig.2 shows our H-Tube detector, a tube filled with liquid hydrogen or
deuterium with a diameter of 3 m and a length of 100 m. We assume a dipole magnetic field
perpendicular to the beam axis. In liquid hydrogen Z0 = 866 cm and λ = 716 cm, which
means we have about 12-14 lengths for hadronic and electromagnetic showers in a well
contained manner, which also makes muon identification simple. In hydrogen Z0 and
λ are practically equal. The reconstruction process could be simplified if the
electromagnetic and hadronic parts of the shower developed on very different scales;
therefore in the following we shall concentrate on such devices.
[Figure 2 sketch: the BEBC bubble chamber with its solenoid field is shown next to the proposed
H-Tube, a tube of diameter D = 3 m and length L = 100 m in a dipole field.]
Figure 2. Schematic view of the H-Tube for Peta-Zeta eV neutrino beams.
Though the measuring accuracy and multi-particle capability of bubble chambers are
excellent, they are not practical for real use; one needs an electronic device.
Elementary detector cell
The success of informatics is based on integrated chip technology, which provides
sophisticated elements in large volumes at extremely low prices, built on extremely
fine-tuned standardization.
Fifty years ago electronic devices were made from vacuum diodes and triodes; now we are
using integrated circuits such as CPU and memory chips with millions of transistors per
cm^2. Pixel detectors at LHC have a typical pixel size of 100 microns. If one looks around
the market, one finds the following advertisement [7]:
“Sharp Corp. has developed a 1/2.5-inch CCD image sensor with 8.28-megapixel
effective resolution. The company achieved the resolution using the 1.75 μm x 1.75 μm
pixel cell, which is claimed as is the industry's smallest class. Sample shipment is slated
for the end of January 2007 and volume production and shipment are for April 2007.
Pricing for a sample is ¥4,000 (including tax). [~ 50$ today exchange rate in 2012]
Higher resolution is needed more than before for use in compact digital cameras and a
1/2.5-inch sensor is required to have 7-megapixel or higher resolution, according to
Sharp. The sensor package measures 10.0 x 10.0 x 3.5 mm. This is about a 60% (in
volume) compared to the company's previous model (12.0 x 13.8 x 3.9 mm).”
Let us take as the proposed elementary cell a chip of size 100 mm x 100 mm
(corresponding to a smallish wafer) with a 1 micron x 1 micron pixel size on
silicon. This is a modest improvement, within a factor of two, relative to the above CCD
image sensor. We are making a bold extrapolation on the read-out of this device, which
contains 10^10 pixels. We assume that each chip works autonomously and
continuously broadcasts, for each particle crossing, the (x,y) hit coordinates in microns, the
time T in picoseconds and the energy loss E; the z-coordinate is given by the chip's network
address. This triggerless detector element follows the DETNI-XYTER design principle
proposed also for the FAIR CBM experiment [6].
It is important to notice that even when 100 million particles cross this chip simultaneously
in the core of an electromagnetic shower, the occupancy will not exceed 1 %. The hits will
be well distributed laterally thanks to the magnetic field, which very effectively diverts the
low-momentum secondaries.
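A minimal sketch of what such an autonomously broadcast hit record and the occupancy estimate might look like; the field names and the container below are our own illustration rather than a defined SCT data format.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    """One broadcast record from an autonomous chip (illustrative field layout)."""
    x_um: int        # pixel x coordinate in microns
    y_um: int        # pixel y coordinate in microns
    t_ps: int        # crossing time in picoseconds
    e_loss: float    # energy loss in the pixel
    chip_addr: int   # network address of the chip, which encodes the z position

# Occupancy estimate quoted in the text: 10^8 simultaneous crossings on a chip
# of (100 mm / 1 um)^2 = 10^10 pixels stay at the 1 % level.
pixels_per_chip = (100_000 // 1) ** 2    # 100 mm chip side divided by 1 um pixel pitch
crossings = 100_000_000                  # particles in the core of an EM shower
print(f"occupancy ~ {100 * crossings / pixels_per_chip:.1f} %")   # -> 1.0 %
```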
If one assumes large-scale mass production, the price of one chip can go down to 10 $.
A reasonable detector would contain at least 1 million elementary chips, which adds up to
a price of 10 million $. This is comparable to the construction price of the CMS silicon tracker.
Schematic structure of the SCT
Here we should like to present the basic ideas concerning the SCT detector. In the
following it will be assumed that only the crossed pixel produces a signal (no clustering);
that multiple scattering, whose relative effect decreases with increasing momentum, can be
neglected; that, for simplicity, synchrotron radiation in the magnetic field of the detector is
neglected, though it will appear even for protons; and that radiation damage is neglected
as well.
The detector consists of identical blocks arranged along the z-axis; the total number of
blocks is Nb. The lateral sizes are Lx and Ly, the block width is db, which gives a total length
Ltot = Nb * db. The size of the elementary detector cell is cx * cy, thus in a detector plane
the number of cells is nx * ny = Lx/cx * Ly/cy. One assumes a pixel size of px * py. The whole
volume is filled with a homogeneous By magnetic field, which can be produced by
permanent magnets. In order to minimize the traversed material one can assume vacuum
between the detector and converter planes.
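The parameter bookkeeping just introduced can be captured in a few lines; in the Python sketch below the class is our own, the 4 cm block, the 80 m total length and the 10 cm elementary cell follow from numbers given in the text, while the 3 m x 3 m lateral size is an assumption made only for the example.

```python
from dataclasses import dataclass

@dataclass
class SCTGeometry:
    """Geometry bookkeeping for the block structure described in the text."""
    n_blocks: int      # Nb
    d_block_cm: float  # db, block width along z
    lx_cm: float       # lateral size Lx
    ly_cm: float       # lateral size Ly
    cx_cm: float       # elementary cell size cx
    cy_cm: float       # elementary cell size cy
    px_um: float       # pixel pitch px
    py_um: float       # pixel pitch py

    @property
    def total_length_cm(self) -> float:
        return self.n_blocks * self.d_block_cm          # Ltot = Nb * db

    @property
    def cells_per_plane(self) -> float:
        return (self.lx_cm / self.cx_cm) * (self.ly_cm / self.cy_cm)

    @property
    def pixels_per_cell(self) -> float:
        return (self.cx_cm * 1e4 / self.px_um) * (self.cy_cm * 1e4 / self.py_um)

# Example: 2000 blocks of 4 cm (80 m), 10 cm x 10 cm cells with 1 um pixels;
# the 3 m x 3 m lateral size is an assumption for illustration only.
geo = SCTGeometry(n_blocks=2000, d_block_cm=4.0, lx_cm=300.0, ly_cm=300.0,
                  cx_cm=10.0, cy_cm=10.0, px_um=1.0, py_um=1.0)
print(geo.total_length_cm / 100.0, "m total,",
      int(geo.cells_per_plane), "cells per plane,",
      f"{geo.pixels_per_cell:.0e} pixels per cell")
```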
[Figure 3 sketch: a basic block of length b = 4 cm containing, along z, a hadron converter
(Zhad, Whad), a silicon plane of elementary cells (Zsihad, Wsi), an EM converter (Zem, Wem)
and a second silicon plane (Zsiem, Wsi).]
Figure 3. Basic block schematics.
The basic block in Fig.3 contains one hadron converter plane and one EM converter plane,
so that the interaction and radiation lengths can be regulated independently, and after each
converter there is a silicon detector plane. The blocks are characterized by the following
parameter pairs:
(Zhad; Whad), (Zsihad; Wsi), (Zem; Wem), (Zsiem; Wsi)
corresponding to the z-coordinate and width of the given plane. For definiteness we
assume the following values for these parameters:
Zhad = 0 cm, Whad = 2 mm;
Zsihad = 1 cm, Wsi = 0.3 mm;
Zem = 2 cm, Wem = 0.1 mm;
Zsiem = 3 cm, Wsi = 0.3 mm.
Using the PDG values for radiation and interaction lengths one gets for a block:

Material       width      Z0         t_em = width/Z0    λ          t_had = width/λ
Silicon        0.03 cm    9.36 cm    0.3 %              45 cm      0.07 %
Polystyrene    0.2 cm     42.4 cm    0.5 %              80 cm      0.25 %
Tungsten       0.01 cm    0.35 cm    3.0 %              9.6 cm     0.10 %
For a total length Ltot = 80 meters, which is about 2 times longer than the LHC ATLAS
detector, one finds 82 radiation-length units and 10 interaction-length units, where the Si
planes are counted twice (there are two per block). Thus the average path length before
interaction will be about 8 meters for hadrons and about 1 meter for photons and electrons,
which still gives at least 50 measured points for high-momentum electrons.
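As a quick cross-check of these totals, the short sketch below sums the per-block material budget from the table, assuming 2000 blocks of 4 cm for the 80 m length and counting the silicon twice per block.

```python
# Per-block material budget from the table (widths in cm, Z0 and lambda in cm).
# Silicon appears twice per block (one plane after each converter).
materials = {
    "silicon (x2)": (2 * 0.03, 9.36, 45.0),
    "polystyrene":  (0.2,      42.4, 80.0),
    "tungsten":     (0.01,     0.35, 9.6),
}

t_em_block  = sum(w / z0  for w, z0, lam in materials.values())   # radiation lengths per block
t_had_block = sum(w / lam for w, z0, lam in materials.values())   # interaction lengths per block

n_blocks = 8000 // 4          # 80 m total length, 4 cm per block
print(f"per block: {t_em_block:.3f} X0, {t_had_block:.4f} lambda")
print(f"total:     {n_blocks * t_em_block:.0f} X0, {n_blocks * t_had_block:.0f} lambda")
# -> about 79 radiation lengths and 10 interaction lengths over 80 m; the 82 quoted in
#    the text follows from the rounded per-material percentages (0.3*2 + 0.5 + 3.0 = 4.1 % per block).
```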
According to the PDG [8] formula, the curvature resolution is
dk = (p_x / L^2) * sqrt(720/(N+4)) = (10^-6 / 6400) * sqrt(720/404) [1/m],
where p_x is the point resolution given by the 1 micron pixel size, L the lever arm and N
the number of measured points. This means that one gets about a 30% measuring error
when tracking 1 PeV/c momentum hadrons at B = 1 Tesla.
The factor-8 difference between the radiation and interaction lengths, and the fact that
electrons and positrons originate mainly from the EM-converter planes with characteristic
V-zero signatures, provide an excellent tool to distinguish electrons from charged hadrons
(Fig.4). The more localized nature of EM showers makes it relatively easy to connect the
photons from π0 decays to the corresponding hadronic vertices, which are well separated
from each other. Neutral hadrons (n, K0, Λ, etc.) are easily identified by the hadronic
nature of the showers they generate. Unfortunately, at ultra-high energies the decay lengths
of KS and Λ particles will be much longer than their interaction lengths, therefore it will be
hard to distinguish them from neutrons.
[Figure 4 sketch: muon, proton, pion, neutron and neutrino induced tracks traverse the SCT,
with compact π0 EM-showers attached to the hadronic vertices.]
Figure 4. EM-showers are embedded in the hadronic „network”.
Though this detector is not much larger than the ATLAS detector, due to its calorimetric
nature it can work even at Zeta-electronVolt (10^21 eV) energies: assuming at least
10 secondary particles per hadronic interaction, after 6 interaction lengths the average
energy of the particles in the shower goes down by a factor of 10^6, i.e. to the PeV level,
which can then be measured on the remaining 5 interaction lengths of the SCT.
Of course, detailed studies are required to determine the optimal Z0/λ ratio.
In the case of electromagnetic UHE ZeV showers there can be a problem, because the
number of final products of the shower, with energies around one MeV, would amount to
10^15 particles (e+, e-) in total, distributed over the whole volume. Fortunately, in a
relatively high magnetic field the low-momentum particles remain inside the block where
they were created. This containment energy cut-off can be raised by increasing the magnetic
field, or by increasing the distances between the converter plates, which requires additional
optimization. Another favorable circumstance is that the problem of energies above an EeV
is not expected to become actual before 2111.
Data processing
The SCT system can work only if the data reduction and processing can be realized on-line
in a massively parallel way.
As mentioned above, each elementary cell will broadcast the collected hit data continuously
and autonomously. There will be millions of independent nodes, as in a mobile telephone
system.
Due to the calorimetric tracking the information volume per event can be enormous, but
the overall load can be varied over a wide range by the intensity of the beam.
Mathematically, each shower can be represented by a tree graph whose branches are
weighted by the momenta of the particles, which can be calculated from the curvature of
the trajectories. The higher the momentum, the straighter the track, i.e. the smaller the
curvature. For the reconstruction, the physical information is contained in the straight
sections of the trajectories, and the main aim of the pattern recognition is to identify these
straight sections. One way is to search for straight triplets and their continuations [9]. A
more automatic way would be to create ad hoc networks which connect nodes that stored
"identical" (x,y,T) hit coordinates. The trick is that each point can belong to a number of
corridors; those corridors survive which contain the maximal number of successive nodes
in a given region. This Hough-transform-like procedure will produce the ad hoc networks,
and one point can belong to more than one Hough tube.
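A toy version of such a corridor search is sketched below; the two-dimensional straight-line Hough accumulator, the grid resolution, the corridor width and the survival threshold are all our own illustrative choices rather than parameters of the proposal.

```python
import math
from collections import defaultdict

def find_corridors(hits, n_theta=180, r_bin=0.5, min_hits=5):
    """Crude Hough transform: hits are (z, x) pairs in cm; a 'corridor' is an
    (angle, offset) cell of the accumulator, and it survives if enough hits fall
    into it.  One hit may feed several corridors."""
    acc = defaultdict(list)
    for z, x in hits:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            r = z * math.cos(theta) + x * math.sin(theta)   # normal form of a line
            acc[(i, round(r / r_bin))].append((z, x))
    # keep only corridors with at least min_hits associated points
    return {key: pts for key, pts in acc.items() if len(pts) >= min_hits}

# toy event: a straight high-momentum track plus a few random hits
track = [(4.0 * k, 0.1 * 4.0 * k) for k in range(10)]   # one hit every 4 cm block
noise = [(3.0, 7.0), (11.0, -2.0), (20.0, 5.0)]
corridors = find_corridors(track + noise)
print(len(corridors), "surviving corridors; best has",
      max(len(p) for p in corridors.values()), "hits")
```

In the SCT this kind of search would presumably run locally, on the ad hoc networks of neighbouring nodes and in (x,y,T) slices, rather than over one flat hit list as in this toy.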
One can apply a gradual, step-by-step selection process, filtering out the higher-momentum
tracks first. It follows from momentum and energy conservation that the higher-momentum
branches are nearer to the root of the tree, therefore this selection procedure automatically
builds up the exact event tree structure from the root, which is the essential part of the
physical process. One can call this procedure the Principal Component Selection of straight
sections.
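The subsequent tree building can be sketched in the same hedged spirit: straight sections are ranked by the momentum estimated from their curvature and attached from the root outward; the Segment container and the nearest-endpoint attachment rule are our own simplifications of the procedure described above.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    start: tuple            # (z, x) of the upstream end of the straight section
    end: tuple              # (z, x) of the downstream end
    momentum: float         # momentum estimate from the measured curvature, GeV/c
    children: list = field(default_factory=list)

def build_tree(segments, root, link_dist=2.0):
    """Attach segments in order of decreasing momentum: the stiffest (straightest)
    sections define the trunk near the root, softer ones hang off as branches."""
    accepted = [root]
    for seg in sorted(segments, key=lambda s: -s.momentum):
        # attach to the accepted segment whose end lies closest to this start point
        parent = min(accepted,
                     key=lambda a: (a.end[0] - seg.start[0]) ** 2
                                 + (a.end[1] - seg.start[1]) ** 2)
        if (parent.end[0] - seg.start[0]) ** 2 + (parent.end[1] - seg.start[1]) ** 2 \
                <= link_dist ** 2:
            parent.children.append(seg)
            accepted.append(seg)
    return root

# toy: a 1000 GeV parent splitting into two softer branches at z = 8
root = Segment((0, 0), (8, 0.1), 1000.0)
branches = [Segment((8, 0.1), (16, 1.0), 300.0), Segment((8, 0.1), (16, -0.8), 200.0)]
tree = build_tree(branches, root)
print([child.momentum for child in tree.children])    # -> [300.0, 200.0]
```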
Simulation
Scalability and translational symmetry help the simulation.
It is enough to produce 100000 events in the energy region (1 MeV, 100 GeV), another
100000 in (100 GeV, 10 TeV) and a third 100000 in (10 TeV, 1 PeV), where the lower
limit indicates that in the simulation one can stop the generation when a secondary falls
below this value and instead select, in a random way, a corresponding event from the
lower-energy set, transporting it to the correct position. In this way one can speed up the
production by huge factors. E.g. in the case of a complete simulation of a 1 PeV photon,
one would need to track about 10^9 particles, which in our shortcut is reduced to following
a few hundred particles, the rest being embedded using events generated for the lower
energy ranges.
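The splicing strategy can be sketched as follows; the band boundaries and the stop-and-substitute rule come from the paragraph above, while the library interface and the fixed ten-way split are stand-ins of our own.

```python
import random

# Energy bands of pre-generated shower libraries, in GeV (per the text).
BANDS = [(1e-3, 100.0), (100.0, 1e4), (1e4, 1e6)]

def band_of(energy_gev):
    for lo, hi in BANDS:
        if lo <= energy_gev < hi:
            return (lo, hi)
    return None

def simulate(energy_gev, library):
    """Follow secondaries explicitly only in the top band; once a secondary drops
    into a lower band, splice in a random pre-generated event from that band."""
    top_lo, _ = BANDS[-1]
    if energy_gev < top_lo:                   # below the explicitly simulated band:
        return [random.choice(library[band_of(energy_gev)])]   # reuse a library event
    # stand-in "interaction": split into 10 equal-energy secondaries
    pieces = []
    for _ in range(10):
        pieces += simulate(energy_gev / 10.0, library)
    return pieces

# toy libraries: each band just holds labels of pre-generated showers
library = {band: [f"shower@{band[0]:g}-{band[1]:g}GeV #{i}" for i in range(100)]
           for band in BANDS[:-1]}
pieces = simulate(1e6, library)               # a 1 PeV primary
print(len(pieces), "library showers spliced in instead of ~1e9 tracked particles")
```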
Dream colliders
So far we have followed the cosmic-ray way of thinking, assuming a fixed-target experiment:
the target is positioned in front of the SCT.
In the case of collider systems it is impossible to build detectors with exactly 4π acceptance,
because only a tiny fraction of the incoming beam particles will interact, and practically the
full beam intensity would hit the forward detector; it must therefore be dumped in a harmless
way. It is not practical to attempt head-on collisions, because the incoming beams would
disturb each other.
If the two beams collide under a small angle, the CM energy is only minimally decreased,
and one can measure the forward-going particles, at some inclination, with the same
accuracy as in the fixed-target arrangement (Fig. 5a). Fortunately, at these energies the
beams should be extremely well collimated, with radii well below the micron level, which
requires only tiny holes in the detector and converter planes. If one can achieve alignment
at such a 1 micron level of accuracy, the acceptance hole is reduced to only one or two
pixels. Since there is vacuum between the planes, there is no material to traverse for
particles emerging from the "beam-pipe". Adding TOTEM-like extensions (as in LHC CMS)
at the end of the detector, one can go as near to zero angle as one wishes.
[Figure 5 sketch: a) two CaloTrackers with beam holes for the incoming p or e beams; b) an
EM-CaloTracker combined with a hadron hodoscope for ep scattering; c) an ee collision region
surrounded by e-dumps.]
Figure 5. Collider combinations: a) pp or ee with forward-backward holes,
b) ep deep-inelastic scattering with hadron hodoscope, c) ee with e-dumps
for qq̄ and muon final states.
For special purposes one can envisage other scenarios.
As an illustration, let us consider ep scattering with PeV beams in order to measure the
quark radius in the classical way.
One possibility is an asymmetric beam with an asymmetric detector. Let us assume that we
are interested only in the scattered-off electrons, as in the classical SLAC-MIT deep-inelastic
experiment. Then it is enough to place a short EM-SCT along the forward electron direction
and a simple timing interaction hodoscope in the backward proton direction, which must
withstand an extremely high rate (Fig.5b).
The electron rate should be limited in such a way that the forward SCT can survive without
too fast radiation damage. Theoretically one can imagine a continuous electron beam of
10^11 e/sec intensity, i.e. with a 10 picosecond average time difference between electrons.
There would be no intensity limit on the proton beam, but it would also require a 100 GHz
bunch structure. In this case, measuring the hit timings with 1 picosecond accuracy, one
could reduce the beam-detector background considerably.
Another possibility for a 4π geometry would be a high-intensity ee collider for qq̄ or muon
final states, putting e-dumps near the collision point made from materials of extremely short
radiation length and relatively long interaction length, combined with strong magnetic
fields (Fig.5c).
Scaled-down model for LHC fixed-target 4π experiments at 7 TeV energy
The indicated 40-50 year time scale does not look very attractive to the present generation
of active physicists, and it would be even more difficult to recruit young students for such a
long time span, which would not produce new scientific results before their retirement age.
One can follow the example of the laser-accelerator community, which has already produced
beams of about 1 GeV energy and plans to demonstrate 100 GeV acceleration within a few
years [10]. The coming 7 TeV beam at LHC, which almost reaches 1% of a PeV, opens the
way for fixed-target detectors in this dream range of energy. One can dream of a parasitic
beam that does not disturb normal LHC operation. The LHC continuously loses protons, an
effect which limits the duration of the fills to the level of a day. Thus a number of
scattered-off protons with energies of a few TeV are flying around the beam tube. One
should look around the LHC ring and find some place where, by putting in a collimator, one
can select through-going particles with a spectrum reaching 6-7 TeV, and by magnetic
deflection select only particles above, let us say, 3 TeV. If at least 1 particle/sec remains,
one can start to develop a reduced-size SCT and do physics in a new domain. This reduced
SCT may, for example, be a tracker for hadronic and a calorimeter for electromagnetic
showers; thus the number of required plates will be much reduced. In this smaller-scale
experiment the reconstruction algorithm can be studied with much simpler elementary
detector cells, using e.g. the ones developed for FAIR CBM [6].
Summary
The Scalable CaloTracker, a new type of universal particle detector, has been proposed. The
CaloTracker principle makes it possible to apply accurate tracking to the high-accuracy
measurement of ultra-high-energy particles in a calorimetric way, which ensures that the
required detector length increases only logarithmically up to practically infinite energies.
Depending on the process to be studied, one can optimize the Z0/λ ratio, the low-energy
cutoff for separate tracking, the density of the detector and the local resolution given by the
pixel size.
The key element is the triggerless logic based on XYTER-type chip readout. The individual
hits can create ad hoc networks for the straight sections of the particle trajectories in the
framework of the multi-million nodes of a mobile-telephone-like system.
The Principal Component Selection method provides an effective and fast way to
reconstruct on-line the high-energy components of the tree graph representing the shower.
One can start the development using parasitic protons emerging from the LHC ring and
electronic chips developed for the FAIR CBM detectors.
REFERENCES
[1] http://en.wikipedia.org/wiki/Pierre_Auger_Observatory
[2] http://en.wikipedia.org/wiki/IceCube_Neutrino_Observatory
[3] GZK limit: K. Greisen, "End to the Cosmic-Ray Spectrum?", Phys. Rev. Lett. 16 (1966) 748-750;
G. T. Zatsepin and V. A. Kuz'min, "Upper Limit of the Spectrum of Cosmic Rays",
JETP Lett. 4 (1966) 78-80.
[4] G. Vesztergombi, "Particle physicist's dreams about PetaelectronVolt laser plasma accelerators",
Light at Extreme Intensities 2011, Szeged, Hungary, 14-18 November 2011,
AIP Conf. Proc. 1462 (2012) 155-158; doi:10.1063/1.4736781.
[5] G. Vesztergombi (Res. Inst. Particle & Nucl. Phys., Hung. Academy of Sciences),
"Reflections about EXChALIBUR, the exclusive 4pi detector", talk at
New Opportunities in the Physics Landscape at CERN, May 10-13, 2009;
http://indico.cern.ch/conferenceOtherViews.py?view=standard&confId=51128
[6] C. J. Schmidt, "n-XYTER Front-End Boards", CBM STS Workshop, Karelia, June 1-4, 2009;
sts-karelia09.jinr.ru/files/C.J.Schmidt-Xyterchip.ppt
[7] "Sharp's New 1/2.5-Inch CCD Sensor Boasts over 8-Megapixel Resolution",
http://techon.nikkeibp.co.jp/english/NEWS_EN/20070117/126544/
[8] PDG booklet, July 1998, Sec. 25.9, "Measurement of particle momenta in a uniform
magnetic field", p. 206.
[9] Á. Agócs and Á. Fülöp, "Jet reconstruction of individual orbits at many particles problems",
Proceedings of the 8th Joint Conference on Mathematics and Computer Science
(MACS 2010, July 14-17), J. Selye Univ., Komárno, published by NOVADAT Ltd.,
Hungary (August 2011) 123-138, ISBN 978-963-9056-38-1.
[10] IZEST 100 GeV Ascent, Bordeaux, May 31 - June 1, 2012: workshop on the design of a
100 GeV Laser Plasma Accelerator on the PETAL laser;
http://www.izest.polytechnique.edu/izest-home/izest-events/izest-100-gev-ascent/izest100-gev-ascent-bordeaux-france-95524.kjsp