Towards Massively Parallel Processing Grand Challenge simulations

[Overview diagram: the LUTH research workflow]
• Physics and astrophysics theory: gravity, plasma physics, galaxy formation, interstellar medium chemistry, solar wind MHD
• Numerical algorithm development: spectral methods, Poisson solver, radiative transfer, chemistry solver
• Simulations and analysis on supercomputers and grids: massively parallel runs, hybrid simulations, distributed computing
• Framework for the interpretation of observational data: HESS, ALMA, Herschel, Planck, COROT, GAIA
• Local computing resources: 222 cores and 33 TB of storage
• Heavy use of the SIO mesocenter
• Use of, and active participation in, EGEE grid development
• Many parallel codes written or developed locally
• Allocations at the three main supercomputing centers in France
(ranked 14th, 16th, and 48th in the Top500)
• A scientific-computing engineer, expert in code parallelization
ACTIVE PARTICIPATION IN AND USE OF THE A&A CLUSTER
• PHYSICS AND CHEMISTRY OF THE INTERSTELLAR MEDIUM
• Meudon PDR code (F. Le Petit, J. Le Bourlot, E. Roueff)
• UV radiative transfer – chemistry – thermal processes
• Detailed observations → strong constraints
• Explore the parameter space (density, CR flux, dust…)
• Hundreds of models in 3 days (instead of several months)
• VERY HIGH ENERGY γ-RAY EMISSION FROM AGN
• SSC code (K. Katarzynski, J-P. Lenain, H. Sol)
• Synchrotron Self-Compton emission
• HESS observations
• 25 parameters
• 30 000 jobs (60 000 single-CPU hours) in only three months (see the parameter-sweep sketch below)
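Both examples above are embarrassingly parallel parameter sweeps: many independent model runs over a grid of inputs. A minimal sketch of the pattern, assuming a hypothetical run_model function and illustrative parameter ranges (not the actual Meudon PDR or SSC inputs):

```python
# Minimal parameter-sweep sketch: build a grid of models and run them
# independently -- the pattern behind "hundreds of models in 3 days".
# Parameter names and ranges are illustrative assumptions.
import itertools
from concurrent.futures import ProcessPoolExecutor

import numpy as np

def run_model(params):
    """Stand-in for one model run (one cluster/grid job)."""
    density, cr_flux, dust = params
    # ... here the real code would be executed with these inputs ...
    return {"params": params, "score": density * cr_flux / dust}

if __name__ == "__main__":
    # Illustrative grids over three hypothetical parameters.
    densities = np.logspace(2, 6, 9)        # gas density, cm^-3
    cr_fluxes = np.logspace(-17, -15, 5)    # cosmic-ray ionization rate, s^-1
    dust_ratios = np.linspace(0.5, 2.0, 4)  # dust abundance scaling

    grid = list(itertools.product(densities, cr_fluxes, dust_ratios))
    print(f"{len(grid)} independent jobs")  # 180 here; thousands on EGEE

    # Every grid point is independent: on EGEE each becomes one grid job;
    # locally a process pool exploits the same embarrassing parallelism.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_model, grid))
```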
• SUPERNOVA REMNANTS AND JETS FROM YOUNG STARS
• HYDRO-MUSCL (C. Nguyen, C. Cavet, C. Michaut)
• Hydrodynamics and cooling
• Finite volume method: Riemann solver (sketched below)
• Parallelization with MPI
• Radiative transfer (under development)
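A one-dimensional sketch of what "finite volume method: Riemann solver" means in practice, using an HLL approximate Riemann solver on a Sod-like shock tube. This illustrates the technique only, not HYDRO-MUSCL itself (which adds MUSCL reconstruction, cooling, and MPI); the value of GAMMA and the initial states are assumptions.

```python
# First-order Godunov finite-volume scheme for the 1D Euler equations
# with an HLL approximate Riemann solver at each cell interface.
import numpy as np

GAMMA = 5.0 / 3.0  # illustrative; the real code's EOS may differ

def flux(U):
    """Euler flux from conserved variables U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u ** 2)
    return np.array([mom, mom * u + p, (E + p) * u])

def hll(UL, UR):
    """HLL approximate Riemann flux across one cell interface."""
    def u_c(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u ** 2)
        return u, np.sqrt(GAMMA * p / rho)
    uL, cL = u_c(UL)
    uR, cR = u_c(UR)
    sL = min(uL - cL, uR - cR)   # fastest left-going wave estimate
    sR = max(uL + cL, uR + cR)   # fastest right-going wave estimate
    if sL >= 0.0:
        return flux(UL)
    if sR <= 0.0:
        return flux(UR)
    return (sR * flux(UL) - sL * flux(UR) + sL * sR * (UR - UL)) / (sR - sL)

def step(U, dx, dt):
    """One conservative update (MUSCL would add a reconstruction step)."""
    Unew = U.copy()
    for i in range(1, U.shape[1] - 1):      # edge cells act as ghost cells
        FL = hll(U[:, i - 1], U[:, i])
        FR = hll(U[:, i], U[:, i + 1])
        Unew[:, i] = U[:, i] - dt / dx * (FR - FL)
    return Unew

# Sod-like shock tube on 100 cells.
N = 100
dx = 1.0 / N
U = np.zeros((3, N))
U[0, : N // 2] = 1.0                         # left:  rho = 1,   p = 1
U[2, : N // 2] = 1.0 / (GAMMA - 1.0)
U[0, N // 2 :] = 0.125                       # right: rho = 1/8, p = 0.1
U[2, N // 2 :] = 0.1 / (GAMMA - 1.0)
for _ in range(100):
    U = step(U, dx, dt=0.2 * dx)             # fixed dt well below CFL
```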
• BINARY BLACK HOLE ORBITS
• KADATH (P. Grandclément, E. Gourgoulhon, J. Novak)
• General relativity: spectral solver
• Decomposition onto Chebyshev polynomials
• Parallel computation of the Jacobian, column by column (sketched below)
• Inversion of the Jacobian matrix (200 000 × 200 000)
• Uses the MUMPS and SuperLU parallel libraries
• Parallel version under development: scaling is promising
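The "column per column" parallelization works because column j of the Jacobian only requires perturbing unknown j, so columns can be farmed out to different processes and assembled before the sparse solve. A minimal serial sketch with a toy residual (not KADATH's relativistic equations), using SciPy's splu, which wraps the SuperLU library named above:

```python
# Newton iteration with a finite-difference Jacobian built column by
# column, then factorized with SuperLU via scipy.sparse.linalg.splu.
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

def F(x):
    """Toy nonlinear residual; KADATH's comes from spectral equations."""
    return x ** 3 - np.roll(x, 1) - 1.0

def jacobian_columns(x, cols, eps=1e-7):
    """Finite-difference columns of dF/dx for the given indices.
    Each column is independent -> trivially distributed over processes."""
    F0 = F(x)
    out = {}
    for j in cols:
        xp = x.copy()
        xp[j] += eps
        out[j] = (F(xp) - F0) / eps
    return out

n = 2000
x = np.ones(n)
for _ in range(20):                        # Newton iterations
    cols = jacobian_columns(x, range(n))   # serial here; parallel in KADATH
    J = csc_matrix(np.column_stack([cols[j] for j in range(n)]))
    dx = splu(J).solve(-F(x))              # SuperLU sparse LU factorization
    x += dx
    if np.linalg.norm(dx) < 1e-10:
        break
```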
• GALAXY FORMATION
• COSMO3D (J-M Alimi, S. Courty, F. Roy, R. Teyssier, J-P Chièze, E. Audit)
• Poisson, hydrodynamics, and chemistry solvers
• Domain decomposition (sketched below)
• Runs on hundreds of processors
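"Domain decomposition" means each MPI process owns one piece of the global grid and exchanges only a thin layer of ghost cells with its neighbours. A minimal mpi4py sketch of the pattern, using a slab split along one axis (COSMO3D's actual scheme is not specified on this slide):

```python
# Slab domain decomposition with ghost-plane exchange.
# Run with, e.g.:  mpirun -n 8 python slab.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 256                       # global grid is N^3, split along axis 0
nloc = N // size              # assumes size divides N, for simplicity
# Local slab plus one ghost plane on each side.
u = np.zeros((nloc + 2, N, N))
u[1:-1] = rank                # fill interior with rank-dependent data

up, down = (rank + 1) % size, (rank - 1) % size   # periodic neighbours
# Send my last interior plane up / first interior plane down,
# receiving the matching ghost planes from my neighbours.
comm.Sendrecv(np.ascontiguousarray(u[-2]), dest=up,
              recvbuf=u[0], source=down)
comm.Sendrecv(np.ascontiguousarray(u[1]), dest=down,
              recvbuf=u[-1], source=up)

# A stencil (e.g. a Laplacian for the Poisson solver) can now be
# applied to u[1:-1] with no further communication this step.
```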
• MAGNETIC STELLAR ATMOSPHERES
• CARATSTRAT (G. Alecian, M.J. Stift)
• Polarized transfer, atomic diffusion, abundance stratification
• Radiative transfer and diffusion equation solvers
• Hybrid MPI/Ada version under development
• Up to 128 processes at CINES
AN EXAMPLE: THE DARK ENERGY UNIVERSE SIMULATION SERIES
J-M Alimi, Y. Rasera, F. Roy, J. Courtin, P-S Corasaniti, A. Füzfa, V. Boucher, F. Fraschetti, R. Teyssier
GOAL: Imprints of DARK ENERGY on COSMIC STRUCTURE FORMATION
• THREE DARK ENERGY COSMOLOGIES
• ΛCDM (standard model)
• Quintessence with the Ratra-Peebles potential (RPCDM)
• Quintessence with the SUGRA potential (SUCDM)
→ Calibrated on the latest WMAP CMB data and UNION SNIa data (potentials given below)
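For reference, the standard forms of the two quintessence potentials, as usually written in the literature (the slide does not spell them out; the scale M and slope α are fixed by the CMB/SNIa calibration):

```latex
% Ratra-Peebles (RPCDM) and SUGRA (SUCDM) quintessence potentials
V_{\mathrm{RP}}(\phi)    = \frac{M^{4+\alpha}}{\phi^{\alpha}}
\qquad
V_{\mathrm{SUGRA}}(\phi) = \frac{M^{4+\alpha}}{\phi^{\alpha}}
                           \exp\!\left(4\pi G\,\phi^{2}\right)
```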
• THREE BOX LENGTHS
• 3.6 Gpc: good statistics on clusters
• 900 Mpc: good statistics on Milky-Way-size halos – internal structure of clusters
• 225 Mpc: small halos – internal structure – redshift evolution
→ Probes from cosmological to subgalactic scales
NINE SIMULATIONS WITH 1 BILLION PARTICLES EACH
• Up to 7 billion resolution elements
• Resolve scales from 4 kpc to 4 Gpc
• Resolve halos from 3×10¹⁰ Msun to 8×10¹⁵ Msun (see the sanity check below)
• Up to 500 000 resolved halos per simulation
• Up to 3 000 000 particles per halo
THE LARGEST DARK ENERGY SIMULATION SERIES TO DATE
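A back-of-the-envelope check that the quoted halo-mass range is consistent with 1024³ particles per box, assuming illustrative cosmological parameters (Ωm ≈ 0.26, h ≈ 0.72; the exact WMAP/UNION calibration is not given on the slide):

```python
# Particle mass per box and the particle count of the smallest halos.
rho_crit = 2.775e11 * 0.72 ** 2   # critical density, Msun / Mpc^3 (h=0.72)
omega_m = 0.26                    # assumed matter density parameter
n_part = 1024 ** 3                # ~1 billion particles per run

for box_mpc in (225.0, 900.0, 3600.0):
    m_p = omega_m * rho_crit * box_mpc ** 3 / n_part   # particle mass
    print(f"L = {box_mpc:6.0f} Mpc : m_p ~ {m_p:.1e} Msun, "
          f"3e10 Msun halo = {3e10 / m_p:.0f} particles")

# Smallest box: m_p ~ 4e8 Msun, so a 3e10 Msun halo contains ~80
# particles -- consistent with resolving halos down to ~3x10^10 Msun.
```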
NEEDS A SUITE OF PARALLEL CODES WITH GOOD SCALABILITY
4096 processes – only 512 MB of memory per process
• Initial conditions: MPGRAFIC (S. Prunet, C. Pichon) + QUINT (Y. Rasera)
• N-body solver: RAMSES (R. Teyssier) + QUINT (Y. Rasera)
• Quick power spectrum for tests: POWERGRID (S. Prunet) + parallelization (Y. Rasera)
• Analysis: parallel friends-of-friends halo finder (F. Roy), developed for this run!
NEEDS A LOT OF CPU TIME
5 000 000 single-CPU hours (600 years) on Babel (IDRIS)
• Allocation made possible by the Horizon Project
• First to use Babel with up to 24 576 processors
• Found an I/O node problem and an MPI bug
• Many supercomputer crashes → efficient restart capability (see the sketch below)
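Surviving repeated machine crashes comes down to periodic checkpointing: write the full state atomically every few steps and resume from the last completed snapshot. RAMSES has its own native restart files; this is only a generic sketch of the pattern:

```python
# Generic checkpoint/restart pattern: after a crash the run resumes
# from the last completed checkpoint instead of step 0.
import os
import pickle

CHECKPOINT = "state.pkl"

def save(step, state):
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump((step, state), f)
    os.replace(tmp, CHECKPOINT)   # atomic: never leaves a half-written file

def load():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return 0, {"fields": None}    # fresh start

step, state = load()
while step < 1000:
    # ... advance the simulation by one coarse step ...
    step += 1
    if step % 50 == 0:            # checkpoint every 50 steps
        save(step, state)
```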
NEEDS A GOOD NETWORK AND BACKUP SYSTEM
216 snapshots + 6 lightcones + 3 samples = 40 TB
• LUTH computer room moved to a gigabit connection
• Recently purchased: backup system (10 TB) + Horizon 2 server (7 TB)
• 13 TB stored locally + 5 TB of additional copies
NEEDS TO ANALYSE AND ORGANIZE THE DATA
Creation of a parallel halo finder (F. Roy)
• Parallel version using domain decomposition
• Tested up to 2048³ particles and 4096 processes
• Sorts particles on a region or halo basis (see the linking sketch below)
• Subsequent analysis is therefore communication-free
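For reference, the serial core of a friends-of-friends finder is short: link every particle pair closer than a fraction b of the mean interparticle spacing, then extract connected components with union-find. A sketch using SciPy (the LUTH code adds the domain decomposition on top; b = 0.2 is the conventional choice, assumed here):

```python
# Serial friends-of-friends: KD-tree pair search + union-find labels.
import numpy as np
from scipy.spatial import cKDTree

def fof(positions, box, b=0.2):
    n = len(positions)
    link = b * box / n ** (1.0 / 3.0)          # linking length
    pairs = cKDTree(positions, boxsize=box).query_pairs(link)

    parent = np.arange(n)                      # union-find forest
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]      # path halving
            i = parent[i]
        return i
    for i, j in pairs:
        parent[find(i)] = find(j)              # union the two groups

    return np.array([find(i) for i in range(n)])  # halo label per particle

# Toy usage: 10^4 random particles in a periodic unit box.
pos = np.random.rand(10_000, 3)
labels = fof(pos, box=1.0)
halos, counts = np.unique(labels, return_counts=True)
print("largest group:", counts.max(), "particles")
```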
NEEDS TO DISTRIBUTE THE DATA
Dark Energy Universe Virtual Observatory
• Project: a "Dark Energy Virtual Observatory" website, with the Horizon collaboration
[Figure: simulation maps for the three cosmologies (ΛCDM, Sugra, Ratra-Peebles) at redshifts z = 0, z = 1, and z = 2.3]
• An unprecedented range of masses and scales for dark energy simulations
• Dark energy mass functions and power spectra with unprecedented accuracy
• Differences between cosmologies: help break degeneracies between dark energy models
• Differences with analytical predictions: help extend analytical models
• Intensive computing is a strong component of LUTH's activity
• LUTH is an active participant in the EGEE III grid for astrophysics
• A leading actor in the "A&A cluster"
• Grid usage of up to 30 000 jobs in 3 months
• LUTH is moving towards Massively Parallel Processing
• Several parallel applications running on up to 120 processes
• Development and scalability tests under way to move to higher numbers of processes
• LUTH has already performed one Grand Challenge simulation:
• up to 4096 processes
• 5 million single-CPU hours
• LUTH is preparing for petaflop computing