NeSC
Lattice QCD and eScience
David Richards (Jefferson Laboratory)
Robert Edwards
Chip Watson
• Introduction - Jefferson Laboratory, a research tool for QCD.
• Lattice QCD and large-scale simulation
• SciDAC
• Future prospects
Lattice QCD. . .
-1-
September 2002
Jefferson Laboratory
Jefferson Laboratory is a nuclear physics facility housing the Continuous
Electron Beam Accelerator Facility.
“To understand the structure and interaction of hadrons in terms of quarks
and gluons”
The strong interaction is described by Quantum Chromodynamics - QCD,
whose fundamental particles are the quarks and gluons.
QCD is a gauge theory, like QED, but non-Abelian and non-linear.
What do the experiments tell us?
The distribution of the quarks and gluons within the hadron - structure functions and form factors of protons and neutrons.
[Figure: "The Neutron's Charge Distribution Provides Further Insights into Hadron Structure": neutron electric form factor data compared with pion-cloud, relativistic constituent quark model (RCQM), and total curves.]
• New neutron electric form factor data reveal the shape of the charge distribution
• And the importance of relativistic effects in nucleon structure
The masses of the radial and orbital excitations of nucleons
12 GeV Upgrade
• Explore the origin of confinement by searching for flux tubes - Hall D
• Calculation of Generalised Parton Distributions that describe the full
structure of hadrons
The only ab initio means we have of solving the theory is through large scale
numerical simulation - lattice QCD.
Replace continuum space-time by a lattice, or grid, of points:
• Quarks live on the sites x.
• Gluons live on the links, as link variables Uµ(x) joining x to x + µ̂.
[Figure: an elementary plaquette in the (µ, ν) plane, with links Uµ(x) from x to x + µ̂ and Uν(x) from x to x + ν̂.]
The requirements of large physical volume and small discretisation uncertainties demand terascale computational facilities for calculations commensurate with the experimental programme - SciDAC.
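As a toy illustration of the sites-and-links layout (not from the talk; the names and array layout here are invented for illustration, not the SciDAC libraries):

```python
import numpy as np

L, Nc = 4, 3               # lattice extent per dimension; Nc = 3 colours

# Quark field: one complex colour vector per site x (spin index omitted).
rng = np.random.default_rng(0)
quark = rng.standard_normal((L, L, L, L, Nc)) + 0j

# Gluon field: one Nc x Nc link matrix U_mu(x) per site and direction mu = 0..3.
links = np.zeros((4, L, L, L, L, Nc, Nc), dtype=complex)
links[:] = np.eye(Nc)      # start from the trivial ("free field") configuration

def shift(field, mu):
    """The field evaluated at the neighbouring site x + mu-hat (periodic)."""
    return np.roll(field, -1, axis=mu)

# Parallel transport of the quark field from x + mu-hat back to x:
# psi'(x) = U_mu(x) psi(x + mu-hat), the building block of lattice Dirac operators.
mu = 0
transported = np.einsum('xyztab,xyztb->xyzta', links[mu], shift(quark, mu))
```

With trivial links the transported field is just the shifted field; a real gauge configuration would rotate the colour vector at every site.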
[Figure: nucleon masses mN r0 for the N(939) and N(1535) versus the squared lattice spacing (a/r0)^2, comparing Jacobi and fuzzed sources; M. Göckeler et al, PLB 532, 63 (2002).]
Early Lattice Calculations Also Predict Flux Tubes
A flux tube forms between a static quark-antiquark pair.
[Figure: from G. Bali, quenched QCD with heavy quarks.]
National Computational Infrastructure for Lattice Gauge Theory
http://www.lqcd.org
Objective: create the software and hardware infrastructure needed for terascale simulations of Quantum Chromodynamics (QCD).
Scientific Goals
Non-perturbative study of QCD
• Calculation of weak decays of strongly interacting particles
* Determination of least well known parameters of the Standard Model
* Precision tests of the Standard Model
• Investigation of matter under extreme conditions
* Mapping the phase diagram of strongly interacting matter
* Determination of properties of the Quark Gluon Plasma
• Understanding the structure and interactions of hadrons
* Calculation of the spectrum of hadrons existing in nature and exploration of their interactions
* Determination of the quark and gluon structure of the nucleon and
other hadrons
Project Goals
Create a unified programming environment that will enable the US lattice community to achieve very high efficiency on diverse multi-terascale hardware
• Portable, scalable software
• High-performance optimization on two target architectures
• Exploitation and optimization of existing application base
• Infrastructure for national community
• Sharing of valuable lattice data and data management
Distributed Terascale Facility
The U.S. lattice community's five-year plan includes three machines of scale ten teraflops, sited at
• Brookhaven National Laboratory
• Fermi National Accelerator Laboratory
• Thomas Jefferson National Accelerator Facility
QCD API Design
• Level 3: Dirac operators, CG routines, etc.
• Level 2: data-parallel, QCD-lattice-wide operations (overlapping algebra and messaging)
  * Linear algebra, e.g. A = B * C
  * Data movement, e.g. SHIFT(A, mu)
• Level 1:
  * Single-node linear algebra, e.g. SU(3), Dirac
  * Message passing: map the lattice to the network
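The Level 2 style can be mimicked in a few lines of illustrative Python (this is not the actual QCD API; the class and function names only echo the slide's notation):

```python
import numpy as np

class LatticeField:
    """Toy lattice-wide field: one value per site of a periodic 4-d lattice."""
    def __init__(self, data):
        self.data = np.asarray(data)

    def __mul__(self, other):
        # Level 2 linear algebra: A = B * C operates on every site at once.
        return LatticeField(self.data * other.data)

def SHIFT(field, mu):
    """Level 2 data movement: fetch each site's neighbour at x + mu-hat."""
    return LatticeField(np.roll(field.data, -1, axis=mu))

# A nearest-neighbour product written with no per-site loops; a real
# implementation would overlap the messaging behind SHIFT with the algebra.
B = LatticeField(np.arange(16.0).reshape(2, 2, 2, 2))
C = LatticeField(np.ones((2, 2, 2, 2)))
A = B * SHIFT(C, 0)
```

The point of the layering is that application code is written entirely in this lattice-wide form, leaving the single-node kernels and message passing of Level 1 free to be optimised per machine.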
Data Grid
The Lattice Portal will give access to all QCD data, such as generated
lattice configurations
• Replicated data (multi-site), global tree structured name space (like a
local file system)
• Replica catalog, using SQL database, replicated to multiple sites for fault
tolerance
• Browse by attributes as well as by name
• Parallel file transfers (bbftp, gridftp, jparss,. . . )
• Drag-n-drop between sites (gui)
• Policy based replication (auto migrate between sites)
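The replica-catalog idea can be sketched with a small relational schema (illustrative only: the table and column names are assumptions, not the Lattice Portal's actual schema):

```python
import sqlite3

# In-memory stand-in for the replicated SQL catalog.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE logical_file (          -- global tree-structured name space
    lfn   TEXT PRIMARY KEY,          -- e.g. /qcd/wilson/beta6.0/cfg_0100
    beta  REAL, volume TEXT          -- physics attributes, for attribute browsing
);
CREATE TABLE replica (               -- one row per physical copy
    lfn   TEXT REFERENCES logical_file(lfn),
    site  TEXT,                      -- JLab, FNAL, BNL, ...
    url   TEXT                       -- handed to bbftp/gridftp for the transfer
);
""")
con.execute("INSERT INTO logical_file VALUES (?,?,?)",
            ("/qcd/wilson/beta6.0/cfg_0100", 6.0, "16^3x32"))
con.execute("INSERT INTO replica VALUES (?,?,?)",
            ("/qcd/wilson/beta6.0/cfg_0100", "JLab",
             "bbftp://lqcd.jlab.org/qcd/wilson/beta6.0/cfg_0100"))

# Browse by attribute rather than by name:
rows = con.execute("""
    SELECT r.site, r.url FROM replica r
    JOIN logical_file f ON f.lfn = r.lfn
    WHERE f.beta = 6.0
""").fetchall()
```

Replicating such a catalog to each site gives the fault tolerance noted above, while the lfn column carries the file-system-like global name space.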
QCD API Status and Schedule∗
• Level 1 Linear Algebra: 1st draft completed, Sept 2001.
• Level 1 Message Passing (MP-API): design completed, Feb 2002.
• Implementation of MP-API in MPI: completed.
• Myrinet optimization on GM: begun.
• C++ MP-API for QCDOC: begun.
• Application port to MP-API: begun.
• QDP++: begun.
• Demo of Level 2 “vertical slice”, Feb 2002.
• Optimization of Lin Alg for P4 by March 2002.
∗ http://physics.bu.edu/~brower/SciDAC
QCDOC Project
• Multi-teraflops QCD requirements scale as ∼ N^2.5 in the number of lattice sites N, requiring:
* Thousands of processors.
* High-bandwidth, low-latency network.
• QCDOC (= QCD on a chip) architecture exploits lattice QCD regularity
and supports ∼ 20K 1 Gflops processors.
• CPU/FPU and communications hardware integrated on a single chip
manufactured by IBM.
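To make the ∼ N^2.5 scaling concrete (a back-of-envelope sketch; the normalisation is arbitrary, only ratios are meaningful):

```python
def relative_cost(extent, reference_extent=16):
    """Simulation cost on an extent^4 lattice relative to reference_extent^4,
    using the ~ (number of lattice sites)^2.5 scaling quoted above."""
    n_sites = extent ** 4
    n_ref = reference_extent ** 4
    return (n_sites / n_ref) ** 2.5

# Doubling the linear extent multiplies the cost by (2^4)^2.5 = 2^10 = 1024.
cost_ratio = relative_cost(32)
```

A thousandfold cost growth from one doubling of the box is why thousands of processors on a high-bandwidth, low-latency network are unavoidable.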
QCDOC Architecture
• Single-chip processing node:
* 1 Gflops, 64 bit IEEE FPU, PowerPC.
* 4 Mbytes on-chip memory.
• 6-dim mesh network.
* 1.5 Gbyte/sec/node network bandwidth.
* ≤ 500 ns network latency.
• ≤ 2 Gbyte DIMM DDR SDRAM external memory per node.
• Commodity Fast Ethernet provides
independent host access to each node.
QCDOC Status and Schedule
• ASIC design nearly finished.
• Physics code running in simulator:
  * single-node FPU, SU(3)×2-spinor: 84% (L1 cache), 78% (eDRAM)
  * cache fill/flush to eDRAM: 3.2 GB/s sustained
  * bandwidth from/to eDRAM: 2.5/2.0 GB/s
  * DDR fill/flush: 1.2/1.7 GB/s
• 2-node Ethernet simulation.
• 4-node OS development system: PowerPC 405GP and Ethernet/JTAG
boards.
• Preliminary schedule:
∼ 1.5 TFlops sustained in 2002
∼ 10 TFlops sustained in 2003
Commodity Clusters
Flexible and powerful
• Exploit commodity processor board and computer network engineering.
• Exploit commodity software (Linux OS, open source software, MPI, . . . )
• Program, run legacy codes effortlessly.
Cost effective
• Price/performance on lattice codes down to $1/MF on single node Pentium 4s.
• Clone, upgrade continuously and cheaply.
• Steady-state: continual upgrades, several thousand nodes, replace oldest
third of system each year.
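As an illustrative sanity check of the quoted $1/MF price/performance figure (simple unit conversion, nothing more):

```python
def cluster_cost_usd(sustained_tflops, dollars_per_mflops=1.0):
    """Hardware cost at a given price/performance point.
    1 Tflops = 10^6 Mflops."""
    return sustained_tflops * 1.0e6 * dollars_per_mflops

fy03_cost = cluster_cost_usd(0.5)   # the FY2003 0.5 Tflop/sec sustained goal
```

At $1 per sustained Mflops, a 0.5 Tflop/sec sustained cluster corresponds to roughly half a million dollars of nodes.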
Cluster Status and Schedule
• 128-node Pentium IV Xeon cluster, with switched Myrinet interconnect
installed at JLAB Aug. 2002
• Augment in 2002/2003 with (probably) GigE grid interconnect.
• By the end of FY2003, a 0.5 Tflop/sec sustained cluster
• National plan envisages 8 Tflop/sec sustained in 2005/2006
• Comparable cluster deployed at FNAL
• JLAB Lattice Portal http://lqcd.jlab.org
Performance on Pentium IV
Lattice gauge calculations dominated by calculation of Dirac operator
[Figure: Wilson-Dirac operator performance (Mflops, 0-3000) with QMP/MPI-GM versus subgrid lattice size after checkerboarding, from 2^4 up to 16^4, for 0, 1, 2, and 3 active communication dimensions.]
Exploitation of SSE instructions allows 2 Gflop/sec for the single-precision
Dirac operator on a single processor, for problems residing in cache.
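A rough translation of those Mflops into site throughput (illustrative: the 1320 flops-per-site operation count commonly used for the Wilson-Dirac operator is an assumption here, not a number from the talk):

```python
# 1320 flops per lattice site is the operation count commonly quoted for
# the Wilson-Dirac operator; treat it as an assumption for this estimate.
FLOPS_PER_SITE = 1320

def sites_per_second(mflops_sustained):
    """Lattice sites processed per second at a sustained Mflops rate."""
    return mflops_sustained * 1.0e6 / FLOPS_PER_SITE

# At the 2 Gflop/sec in-cache single-precision figure quoted above:
rate = sites_per_second(2000.0)
```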
Weak Decays
Some fundamental parameters of particle physics can be experimentally extracted only with the aid of lattice gauge theory calculations.
The constraints on the ρ and η quark-mixing parameters require both experimental measurements and accurate lattice calculations; a non-zero η leads to the violation of CP symmetry observed in kaon decays, as needed to explain the matter-antimatter asymmetry of the universe. Any disagreement between the determinations would signal a breakdown of the Standard Model of particle physics. Nearly all the uncertainties will soon be dominated by lattice QCD uncertainties.
[Figure: CKMfitter constraints in the (ρ, η) plane from ∆md, ∆ms & ∆md, |εK|, |Vub/Vcb|, and the world-average sin 2β.]
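The geometry behind the sin 2β band in the figure is standard unitarity-triangle kinematics; as a hedged numerical aside (the input point is illustrative, not a fit result):

```python
import math

def sin_2beta(rho, eta):
    """sin 2*beta from the apex (rho, eta) of the unitarity triangle,
    with tan(beta) = eta / (1 - rho) in the Wolfenstein parametrisation."""
    beta = math.atan2(eta, 1.0 - rho)
    return math.sin(2.0 * beta)

# An illustrative point inside the allowed region of the figure (not a fit):
value = sin_2beta(0.2, 0.35)
```

Lattice QCD enters through quantities like fB and BK, which convert the measured ∆md and |εK| into constraints on (ρ, η).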
The Quark-Gluon Plasma
Last seen a few microseconds after the big bang, the quark-gluon plasma is the quarry of the RHIC facility, and can be explored from first principles using lattice gauge theory.
[Figure: the QCD phase diagram; http://www-aix.gsi.de/~alice/phase-diag.jpg]
Hadron Structure
High energy scattering experiments have measured the distribution of
quarks and gluons in the proton. Ten teraflops sustained facilities will
enable calculation of the moments of the distributions from first principles.
[Figure: x f(x, Q) at Q = 5 GeV for g/15, uv, dv, u-bar, d-bar, (d-bar - u-bar)*5, s, and c, plotted versus x from 0 to 1.]
The quark and gluon distributions in the proton.
Moments of Quark Distributions in Proton
[Figure: the moment <x> of the quark distributions in the proton versus mπ^2 (GeV^2), showing the value at the physical mass, the current experimental value, and a top axis giving the Tflop-years (1 to 10^3) needed to reach each pion mass.]
A Complementary Theory Effort is Essential to Unravel
Strong QCD (e.g. for the GlueX Project)
[Figure: milestones in unravelling strong QCD plotted against the Tflop-years of computing they require, on a timeline from 1974 to 2010: lattice gauge theory invented and hints of a confining potential (~10^-6 Tflop-yr, 1974); flux tubes between heavy quarks; the quenched hybrid spectrum, with the lattice spectrum agreeing with experiment for conventional mesons; hybrid decays and the full hybrid spectrum on the FY03 clusters (0.5 TFlop/sec) and FY06 clusters (8 Tflop/sec); markers show the exotic candidate at BNL and first data from CEBAF @ 12 GeV (~10^2 Tflop-yr).]
…and to Understand the Quark and Gluon Structure of
Hadrons
[Figure: milestones in hadron structure plotted against Tflop-years on the same 1974-2010 timeline: lattice gauge theory invented and first numerical simulations (~10^-6 Tflop-yr); first nucleon structure function calculations; quenched nucleon GPDs, with the nucleon valence quark momentum agreeing with experiment; the pion form factor in full QCD and nucleon GPDs in full QCD on the FY03 clusters (0.5 TFlop/sec) and FY06 clusters (8 Tflop/sec); markers show GPD measurements shown at JLAB and first data from CEBAF @ 12 GeV (~10^2 Tflop-yr).]
Summary
• Lattice QCD is our only means of making a quantitative ab initio study of QCD
• Precise calculations commensurate with experiment require terascale computing
• Efficient use of terascale facilities requires a suitable software infrastructure
• Sharing of valuable configuration data - data grids