COSMOLOGICAL SIMULATIONS
Modelling the Formation and Evolution of Cosmic Structure
H.M.P. COUCHMAN
Department of Physics and Astronomy
University of Western Ontario
London, ON
N6A 3K7
Canada
Abstract. Studies of the formation of cosmic structure are entering a period of dramatic improvements in sophistication and verisimilitude. A wealth of observational data is becoming available, ranging from studies of galaxies, clusters and their cosmic distribution, to the high redshift universe and observations of microwave background fluctuations. These observations, together with enormous improvements in the scale and quality of the numerical computations that can be undertaken, promise very substantial progress in our understanding of the formation and evolution of cosmic structure.
Numerical simulations are the crucial theoretical probe of non-linear gravitational and hydrodynamic cosmic evolution. The talk briefly reviews the current state of numerical structure formation studies and discusses the developments in techniques which, when coupled with current supercomputers, are leading to studies of unprecedented scope and resolution. The talk concludes with comments on future directions for structure formation studies and on the challenges inherent in simulations of increasing size. Initial results from some of the first billion-particle simulations are presented.
1. Introduction
Numerical simulations of cosmic structure have a number of goals. The ultimate aim of many investigations is to explore and understand the formation, distribution and morphology of galaxies. Convincing modelling of the
astrophysical processes which contribute to the formation of galaxies is still
some way in the future. Nevertheless, the simulation of very many aspects of cosmic structure has matured to a point where it provides reliable tools for investigating processes occurring in the post-recombination universe. In particular, key aims are to distinguish world models, to constrain the initial fluctuation spectrum and to understand non-linear gravitational evolution and hydrodynamics in the cosmic context.
Traditionally, cosmological simulations have been concerned with the
modelling of Large-Scale Structure, the distribution of voids, walls, clusters etc. and have been used to generate low order spatial and velocity
statistics for a variety of cosmologies and initial fluctuation spectra. New
instruments and observations are now providing a wealth of data spanning
many different epochs and spatial scales. The talk will outline the role that
simulation plays in interpreting these observations in cosmology and review
the progress that has been made to date as well as providing a few pointers
to the future.
2. The Post-Recombination Universe
The post-recombination universe is shown schematically in Figure 1. Structure grows in the standard model from small fluctuations present at the time of recombination through the so-called "Dark Ages" to the formation of the first bound radiating objects. In standard hierarchical models structure grows through the aggregation of small objects into larger structures. Some of the key observational elements that must be described by a successful structure formation model are shown in Figure 1. Over the last decade our ability to probe to higher redshifts has increased dramatically, with detailed observations of the Lyman α forest becoming available as well as very deep surveys which promise the elusive goal of detecting primeval galaxies.
In the early 1980's, the initial spectrum was constrained only within
very broad limits and the distribution of galaxies and clusters was known
at low redshift. Numerical structure formation studies bridged this large
redshift gap with collisionless simulations to map out the general features
of gravitational clustering. The detailed measurement of the fluctuation
spectrum from studies of the microwave background early next decade,
together with precise mapping of the galaxy distribution on large scales
and the probing of structure to high redshifts will dramatically alter the
nature of the gap that must be bridged by numerical simulation. With well
constrained initial conditions, and a detailed understanding of the evolution
of the collisionless component, the focus of many numerical studies will
shift to careful investigation of the crucial hydrodynamic aspects of cosmic
structure formation.
Figure 1. Schematic comoving look-back cone. The redshift axis runs from recombination and the cosmic microwave background (z ~ 1000), through the "Dark Ages", to the first objects (Pop III stars?), quasars, Lyman α, galaxies, X-ray clusters and, at low redshift, large-scale structure, the Hubble Deep Field etc.
3. The Growth of Structure
Figure 2 illustrates the primary features of the growth of structure. The spectrum illustrated is typical of "CDM-like" initial conditions in which structure forms hierarchically, growing from small scales to large. For fluctuations on some scale with amplitude less than unity the growth is linear; once the amplitude of a fluctuation exceeds unity, the evolution is non-linear and numerical simulation is essential to describe the structure accurately. Numerical simulations thus have a fundamental role to play in connecting the initial linear spectrum with the non-linear regime. The further complication which hydrodynamic simulations are now beginning to address is the connection between the observed distribution of luminous matter and that of the gravitationally dominant dark matter. On scales of galaxies and below, hydrodynamic simulation is an essential component of investigations of cosmic structure and of our interpretation of observational data.
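For reference, the amplitude in question is conveniently quantified as the rms mass fluctuation σ(R) in spheres of radius R (the quantity sketched in Figure 2); for a linear power spectrum P(k) and a spherical top-hat window the standard definition is

\[
\sigma^2(R) \;=\; \frac{1}{2\pi^2}\int_0^\infty P(k)\,\tilde{W}^2(kR)\,k^2\,dk,
\qquad
\tilde{W}(x) \;=\; \frac{3\,(\sin x - x\cos x)}{x^3},
\]

with σ(R) well below unity marking the linear regime and σ(R) of order unity or greater the non-linear regime in which simulation is required.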
Figure 2. A schematic view of the post-recombination growth of the fluctuation spectrum, here represented as the mass variance in spheres of radius R. Whilst the mass variance is less than unity, growth is linear; for larger amplitudes reliable prediction requires numerical simulation. The sketch shows the dark matter spectrum growing by a factor of roughly 1000 between z = 1000 (COBE scales) and z = 0 (galaxies, clusters, large-scale structure), with the bias b(R) relating the observed galaxy distribution to the underlying dark matter.
4. Requirements of a Numerical Simulation
Two features of gravitational collapse in the cosmic context are the presence of irregular geometries and large density contrasts. A bound object virialises with a density contrast of roughly 200 and this overdensity grows as the object maintains a constant physical size within the expanding background. A dissipated baryonic component will have a larger contrast. Lagrangian particle methods are ideally suited to accurately modelling these features. Indeed, it is difficult to see how one might easily model a collisionless component without representing it by particles. A number of techniques have been used to model the hydrodynamic component. Smoothed Particle Hydrodynamics (Monaghan 1992) is a fully Lagrangian method which fits well with gravitational particle techniques. Eulerian methods are also becoming popular in the cosmic context, but, to be successful, adaptive mesh refinement (AMR) techniques must be employed in order to faithfully resolve hydrodynamic phenomena over a wide range of densities.
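To illustrate the Lagrangian character of SPH, the following minimal sketch (assuming NumPy, a fixed smoothing length h and a brute-force neighbour search; production codes use neighbour lists, variable smoothing lengths and optimised kernels) computes the basic SPH density estimate, a kernel-weighted sum of neighbouring particle masses:

import numpy as np

def w_cubic_spline(r, h):
    # Standard 3D cubic-spline kernel with support 2h (normalisation 1/(pi h^3)).
    q = r / h
    w = np.zeros_like(q)
    inner = q < 1.0
    outer = (q >= 1.0) & (q < 2.0)
    w[inner] = 1.0 - 1.5 * q[inner]**2 + 0.75 * q[inner]**3
    w[outer] = 0.25 * (2.0 - q[outer])**3
    return w / (np.pi * h**3)

def sph_density(pos, mass, h):
    # Brute-force density estimate rho_i = sum_j m_j W(|r_i - r_j|, h)
    # for particle positions pos (n,3) and masses mass (n,).
    n = len(pos)
    rho = np.zeros(n)
    for i in range(n):
        r = np.linalg.norm(pos - pos[i], axis=1)
        rho[i] = np.sum(mass * w_cubic_spline(r, h))
    return rho

Because the resolution elements follow the mass, the estimate automatically sharpens in collapsed regions, which is the property that makes the method fit naturally with gravitational particle techniques.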
The range of scales present in observed cosmic structures is very large; from sub-galactic scales to superclustering and beyond, far greater than can be modelled in a single simulation. The particle number that can be modelled sets the mass resolution of the simulation. A simple example will outline the scale of the problem. Suppose that we were to tackle the "Grand Challenge" of modelling the distribution of galaxies in a representative sample of the universe. A cubic region of the universe 200 Mpc on a side would contain of order 10^5 galaxies. Requiring a minimum of 100 mass elements (or particles) per galaxy and supposing that 10% of the total matter resides in galaxies (including their haloes) would suggest a simulation with 10^8 particles. Simulations of this size are only now becoming feasible. Useful hydrodynamic modelling of galaxies in the same context would require a resolution perhaps 10 times larger. This sort of resolution is considerably greater than that which can be achieved in hydrodynamic simulations at present. Whilst by no means all simulation programmes span this range of scales, the argument is suggestive of the mass resolution that is desirable and explains the continual push towards higher particle number by practitioners in the field.
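The arithmetic behind this estimate can be restated as a short sketch (using only the figures quoted above):

# Back-of-envelope particle count for the "Grand Challenge" example:
# ~1e5 galaxies in a (200 Mpc)^3 volume, at least 100 particles per galaxy,
# and 10% of the total matter residing in galaxies and their haloes.
n_galaxies = 1e5
particles_per_galaxy = 100
galactic_mass_fraction = 0.1

# If the 10% of the mass in galaxies is to carry 100 particles per galaxy,
# the remaining 90% of the mass requires the other nine-tenths of the particles.
n_particles = n_galaxies * particles_per_galaxy / galactic_mass_fraction
print(f"{n_particles:.0e} particles")   # ~1e+08, as quoted in the text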
5. Achievements of Cosmological Simulations
The argument of the previous section suggests that very high particle number is desirable in cosmological simulations. Figure 3 illustrates the remarkable progress that has been achieved in improving the number of particles that can be simulated. The figure shows a representative sample of leading-edge or near leading-edge simulations in the particle-number stakes at various epochs, as well as indicating the algorithmic method used to compute inter-particle forces and the general type of computer hardware used. Particle number is only one aspect of the problem of course; different algorithms have different memory requirements and compute the various types of problem with differing efficiency. Furthermore the hardware groupings are very coarse; no mention has been made of vector computers for example. Nonetheless, a dramatic improvement in attainable resolution over time is apparent, as is the necessity of using parallel supercomputers to achieve the highest resolution.
Numerical simulations of structure formation model simple physical laws; the key to progress is understanding the collective phenomena which are modelled. Table 1 lists several areas in cosmology which have been addressed by numerical simulation and assesses the degree of success or reliability with which the various problems have been modelled.
Figure 3. Sketch showing the particle number achieved in various state-of-the-art simulations for different algorithms as a function of epoch. The representative simulations illustrated were run either on serial computers, parallel computers (typically massively parallel systems) or using the special-purpose GRAPE hardware.
TABLE 1. Achievement and reliability of cosmological simulations.

Achievement                                          Reliability

Gravitational:
  Checks of perturbation theory                      Good (symbiotic relationship)
  Exploring validity of simple analytic models       Reasonable (models limited)
    (e.g., Press-Schechter)
  Non-linear gravitational evolution,                Reasonable (need greater dynamic range)
    stable clustering etc.
  Distinguishing different initial spectra,          Good
    cluster abundances etc.
  Comparison of predicted galaxy correlations        Rudimentary (need reliable galaxy
    with observed distributions; bias etc.             identification techniques; hydro)

Hydrodynamic:
  Cluster gas properties                             Good
  Lyman α                                            Fairly good (low order statistics used so far)
  Galaxy formation, first objects etc.               Rudimentary (need cooling and to model
                                                       star formation, feedback etc.)
6. The Current State of the Art
The continuing push for higher resolution has led to an increasing use of massively parallel supercomputers for leading-edge simulations. These computers offer both the processing speed, provided the algorithms can be efficiently parallelized, and the large total memory needed for storage of the particle data and for the computational overhead of the algorithm itself. Despite the difficulty of programming for distributed-memory architectures, parallel versions of several N-body algorithms exist; in particular, versions of Tree, P³M and AP³M algorithms have been implemented.
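Schematically, the P³M approach referred to here splits the force on each particle into a long-range contribution computed on a mesh with fast Fourier transforms and a short-range particle-particle correction applied only to neighbours within a cut-off radius r_c,

\[
\mathbf{F}_i \;=\; \mathbf{F}_i^{\mathrm{PM}} \;+\; \sum_{j\,:\,r_{ij} < r_c} \mathbf{f}^{\mathrm{short}}(\mathbf{r}_{ij}).
\]

AP³M adds refined sub-meshes in heavily clustered regions so that the short-range sum does not come to dominate the cost as clustering develops.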
Impressive performance is possible on current hardware. The Virgo (1996) collaboration uses an AP³M algorithm, parallelised using Cray's proprietary data-parallel language CRAFT, to run on the T3D supercomputer. We can achieve a performance of roughly 2,500 particle updates/second/processor with excellent scaling behaviour up to 256 processors for a 17 million particle run (for details see Pearce & Couchman 1997). The same code but without adaptive refinements (essentially P³M) has been implemented using Cray's "shmem" library, an MPI-2-like one-sided message-passing paradigm, for the Cray T3E (MacFarland et al. 1998). The performance achieved in this case is 8,000 particle updates/second/processor. A 256^3 (17 million) particle run of approximately 2000 steps takes 10 hours on 128 processors of a Cray T3E. This is impressive performance even considering the slow-down that occurs with the non-adaptive P³M implementation as clustering develops.
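These figures are mutually consistent, as the following quick check (a sketch using only the numbers quoted above) shows:

# Wallclock estimate for the 256^3-particle, ~2000-step T3E run at the
# quoted rate of 8,000 particle updates per second per processor.
n_particles = 256**3
n_steps = 2000
updates_per_sec_per_proc = 8000
n_procs = 128

wallclock_hours = n_particles * n_steps / (updates_per_sec_per_proc * n_procs) / 3600
print(f"~{wallclock_hours:.0f} hours")   # ~9 hours, consistent with the ~10 hours quoted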
Storage requirements become equally impressive as the particle number increases. For collisionless simulations 6 words are required per particle, or roughly 10 words for hydrodynamic simulations. Each output slice of a 256^3 particle collisionless simulation, for example, requires 500 MB for storage of positions and velocities in 32-bit precision. Typically 10 to 15 output times are stored per simulation.
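The quoted output size follows directly from the particle count; a quick check (any particle identifiers or file headers would add somewhat to this):

# One output of a 256^3-particle collisionless run: positions and velocities
# (6 words per particle) stored in 32-bit precision, as described above.
n_particles = 256**3
words_per_particle = 6      # 3 position + 3 velocity components
bytes_per_word = 4          # 32-bit precision

bytes_per_output = n_particles * words_per_particle * bytes_per_word
print(f"{bytes_per_output / 1e6:.0f} MB per output")   # ~400 MB, of the order of the 500 MB quoted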
Two broad approaches are being pursued to achieve higher resolution. Selective volume renormalization allows very high resolution to be achieved in the investigation of galaxies, clusters or patches of the universe at high redshift, for example. With proper care it is possible to model the large-scale cosmological field by using nested shells of variable-mass particles and/or the application of external fields. Where uniform resolution is required over the whole computational domain there is little alternative to the straightforward calculation with very large particle number. Such a simulation has been undertaken recently by the Virgo collaboration. A cube of side 4000 Mpc containing 10^9 particles was evolved from a start redshift of 29. A slice of the final output of the simulation at redshift zero together
with the physical parameters used is shown in Figure 4. The code used was the shmem P³M code described above running on 512 processors of the T3E in Garching, Germany. The full simulation of 500 steps took 70 hours. Each output is 25 GB, with a total dataset of 1/3 TB. With this size of dataset, analysis is a challenge, with much of the post-processing, of necessity, being done on the T3E or large machines local to it.
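The quoted particle mass and output size can be checked from these parameters; the sketch below assumes that the box side of 4000 Mpc corresponds to 2000 h^-1 Mpc for the h = 0.5 used in the run (see Figure 4), and that positions and velocities are stored as 6 words per particle in 32-bit precision as described above:

# Rough check of the billion-particle run's numbers.
rho_crit = 2.775e11      # critical density in h^2 Msun / Mpc^3
box = 2000.0             # box side in h^-1 Mpc
n_particles = 1000**3

m_particle = rho_crit * box**3 / n_particles     # h^-1 Msun, for Omega = 1
output_gb = n_particles * 6 * 4 / 1e9            # positions + velocities, GB

print(f"particle mass ~ {m_particle:.1e} h^-1 Msun")   # ~2e12, as in Figure 4
print(f"output size   ~ {output_gb:.0f} GB")           # ~24 GB, close to the 25 GB quoted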
7. Conclusions
There are two broad conclusions to this talk. First, the promise of secure knowledge of the initial fluctuation spectrum together with improved reliability of collisionless simulations suggests that in the next few years (barring upsets to the overall cosmological picture and current understanding of structure formation) we will have a good understanding of the evolution and structure of the dark matter distribution. This will provide the basis from which we can undertake detailed hydrodynamical simulations of the formation of galaxies and cosmic structure. Observations at high redshift are providing detailed information which will place significant constraints on the details of structure formation models as realised by numerical simulations.
The second conclusion is computational. As is evident from the data
presented above, the attainable resolution has increased by 4 orders of
magnitude in under two decades. With the advent, and increasing use,
of massively parallel computers the simulation of very large numbers of
particles is becoming feasible. This allows us to model a range of scales where studies, particularly hydrodynamic studies, of very many aspects of cosmic structure will begin to have detailed predictive power.
References
MacFarland, T., Couchman, H.M.P., Pearce, F.R. and Pichlmeier, J. (1998) A New Parallel P³M Code for Very Large-Scale Cosmological Simulations, New Astronomy, submitted (astro-ph/9805096)
Monaghan, J.J. (1992) Annual Review of Astronomy and Astrophysics, 30, 543
Pearce, F.R. and Couchman, H.M.P. (1997) Hydra: a parallel adaptive grid code, New Astronomy, 2, 411
Virgo: Colberg, J.M., Couchman, H.M.P., Efstathiou, G.P., Frenk, C.S., Jenkins, A., Nelson, A.H., Peacock, J.A., Pearce, F.R., Thomas, P.A. and White, S.D.M. (1996)
Figure 4. A slice through the 10^9 particle simulation recently completed by the Virgo consortium. The cosmology simulated is standard Cold Dark Matter but with a modified spectrum (shape parameter Γ = 0.21). Parameters: Ω = 1, Γ = 0.21, h = 0.5, σ_8 = 0.6; P³M with 1000^3 particles and a 1024^3 mesh; start redshift z = 29; softening 100 h^-1 kpc; particle mass 2 × 10^12 h^-1 M_⊙; run on 512 cpus of a Cray T3E. Main slice: 2000^2 × 20 h^-3 Mpc^3; enlargement: 450 × 240 × 20 h^-3 Mpc^3 (scale bars 200 and 50 Mpc/h).