From Climate Change to Quantum
Chemistry: Breaking Barriers using HPC
Dr. Philip Smith
(Senior Director, HPCC, TTU)
Dr. Per Andersen
(Texas Tech University)
Texas Tech University (TTU)
High Performance Assets
 The primary TTU High Performance Cluster is a 9,000+ core cluster called Hrothgar.
 Antaeus is a 480-core High Energy Physics cluster.
 Janus is a 176-core Windows HPC cluster.
 Weland is a 128-core experimental cluster.
 Lonestar is a 22,656-core TACC cluster in which TTU has invested.
TACC Lonestar Overview
 1,888 compute nodes, each with two 6-core processors, for a total of 22,656 cores. The theoretical peak compute performance is 302 TFLOPS.
 44 TB of total memory.
 276 TB of local disk space.
 Five large-memory nodes, six cores each, each with 1 TB of memory.
 Eight GPU nodes with six cores each.
 1 PB of global, parallel Lustre file storage.
 An interconnection network of InfiniBand technology in a fat-tree topology with 40 Gbit/sec point-to-point bandwidth.
 A 10 PB capacity archival system is available for long-term storage and backups.
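The quoted peak figure can be sanity-checked against the node counts. A minimal sketch (the per-core rate is derived here, not stated on the slide):

```python
# Sanity-check Lonestar's quoted theoretical peak from the node counts.
nodes = 1888
cores_per_node = 12          # two 6-core processors per node
cores = nodes * cores_per_node
assert cores == 22656        # matches the quoted total

peak_tflops = 302.0          # quoted theoretical peak
gflops_per_core = peak_tflops * 1e12 / cores / 1e9
print(f"{cores} cores, ~{gflops_per_core:.1f} GFLOPS per core")
```

The ~13.3 GFLOPS per core implied by the two quoted numbers is consistent with the double-precision peak of the Westmere-era 3.33 GHz processors of that period.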
TACC Lonestar Financing
 An initial $12 million was committed to deploy Lonestar.
 Financial commitments came from the following partners:
• The National Science Foundation (NSF)
• The University of Texas at Austin
• The University of Texas System
• The UT Institute for Computational Engineering and Sciences
• Texas A&M University
• Texas Tech University
• multiple technology partners
TACC Lonestar
 Lonestar delivers 198 million core-hours/year.
 Texas Tech's allocation is 9.2 million core-hours/year (~5%).
 TTU first-year utilization: ~8.2 million core-hours.
 TTU second-year utilization: ~9 million core-hours in 10 months.
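The core-hour figures above follow directly from the machine size; a quick check:

```python
# Check the quoted core-hour figures against the machine size.
cores = 22656
hours_per_year = 365 * 24                  # 8,760 hours
total_core_hours = cores * hours_per_year  # ideal ceiling at 100% utilization
print(f"Ceiling: {total_core_hours / 1e6:.0f} M core-hours/year")

ttu_share = 9.2e6 / 198e6                  # TTU allocation vs. delivered hours
print(f"TTU share: {ttu_share:.1%}")
```

22,656 cores running around the clock give roughly 198.5 million core-hours per year, matching the delivered figure, and 9.2 M of 198 M works out to about 4.6%, i.e. the "~5%" on the slide.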
Lonestar Usage by TTU
Texas Tech University Researchers With
Lonestar Projects
 William L. Hase (Chemistry)
 Rajesh Khare (Chemical Engineering)
 Jorge A. Morales (Chemistry)
 William (Bill) Poirier (Chemistry)
 Katharine Hayhoe (Political Science)
 Juyang Huang (Physics)
 Roger (Bryan) Sutton (Physiology)
 Juan (Guillermo) Araya (Mechanical Engineering)
 Kelvin Cheng (Physics)
Dr. William (Bill) Hase
 Dr. Hase has utilized TACC resources via Ranger and Lonestar for a number of years.
 Hase's research group simulates the dynamics of molecular motion and chemical reactions at an atomistic, microscopic level. Classical, semi-classical, and quantum mechanical methods are used for the simulations.
 Results are compared with experiment to validate the computations.
Collision of High-Energy Xenon on an Ice (Solid H2O) Surface at 140 K (Hase Group)
Three types of events have been found:
• Trapping of Xe inside ice surface
• Trapping of Xe followed by desorption
• Direct scattering of Xe from the ice surface
Identifying atoms: xenon, oxygen, hydrogen.
Incident collision energy: 6.5 eV
Surface temperature: ~140 K
Figure panels: trapping, trap-desorption, direct scattering.
Dr. Kelvin H. Cheng
Physics
 Biological systems spanning the molecular, cellular, tissue (device), and human levels.
 At the molecular and cellular level, his research group studies the molecular organization and dynamics of the self-assembling lipid bilayer, an important component of cell membranes, and the molecular mechanisms of its regulation of protein activities.
Disruption of a cholesterol-depleted
membrane
 The following slide shows the disruption of a cholesterol-depleted membrane in the presence of a disordered beta-amyloid surface-interacting protein.
 This mechanism is important in Alzheimer's patients.
Figure: the red curve is the prediction of phase space theory (PST).
Dr. Katharine Hayhoe
Political Science
 Katharine Hayhoe
• atmospheric scientist (climate change)
• regional climate impacts and the science-policy interface
 Dr. Hayhoe uses R to run large numbers of simulations in a grid format.
 She has generated over 40 terabytes of climate data.
Climate Projections for the US Southeast
 Based on 25 simulations from 7 global climate models and 4 emission scenarios.
 Projected changes per degree of global mean temperature (GMT) change are calculated for the first 20-year period during which the GMT change for each individual simulation, relative to 1990-2009, is equal to or greater than 1, 2, and 3 °C.
 The probability of reaching a given GMT threshold by a certain date is calculated based on the number of GCM simulations in which the GMT threshold is reached.
 For each value of GMT, the average of the 25 simulations is given, as well as the 10th percentile (low) and 90th percentile (high) values, to illustrate the variability in the mean value.
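The ensemble summary described above (mean plus 10th/90th percentile across 25 simulations) can be sketched with NumPy; the values below are illustrative placeholders, not Dr. Hayhoe's data:

```python
import numpy as np

# Hypothetical ensemble: 25 simulations of projected regional warming
# per degree of GMT change (values invented for illustration).
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=1.2, scale=0.3, size=25)

mean = ensemble.mean()
low, high = np.percentile(ensemble, [10, 90])  # spread across the ensemble
print(f"mean {mean:.2f}, 10th pct {low:.2f}, 90th pct {high:.2f}")
```

Reporting the 10th/90th percentile band alongside the ensemble mean, as the slide describes, conveys model spread without assuming a particular distribution.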
Annual and Seasonal Temperature Change
1999-2009
Dr. Raj Khare
Chemical Engineering
 Molecular simulation tools.
 Dynamics of materials and their thermodynamic and transport properties; condensed phase simulations.
 The specific research areas of current interest are the physics of complex fluids, polymer nanocomposites, and the efficient production of biofuels.
Condensed phase simulations
 Hydroboration of alkenes is useful in the synthesis of organic compounds.
 The QM/MM condensed phase simulations were performed using the recently developed version of the VENUS computer program.
 The slide shows animations of the formation of the Markovnikov (Movie 1) and anti-Markovnikov (Movie 2) products formed during the hydroboration of propene in tetrahydrofuran solution.
 Movie 1: formation of the Markovnikov product.
 Movie 2: formation of the anti-Markovnikov product.
Bio-Fuels
Short chain cello-oligosaccharides
 The goal is to break the insoluble cellulose chains and separate them for the efficient production of bioethanol.
 Ideally, the glucose produced from this biomass does not attach to the cellulose crystal surface and is soluble in water, so it can be easily fermented to produce ethanol.
Short chain cello-oligosaccharides
Dr. Juyang Huang
 Dr. Juyang Huang is a Professor of
Physics at Texas Tech University and
one of the earliest users of Lonestar.
 Dr. Huang's research interests are in the biophysics of lipid membranes and the application of liposome technology.
Simulation of the PKC(Alpha)-C1 domain docking with the lipid bilayer
 The following is a simulation of the PKC(Alpha)-C1 domain docking with the lipid bilayer, performed using GROMACS.
 The simulation was run on Lonestar. It took 8 days to simulate 123,711 atoms for 200 ns on 84 processors.
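The quoted run statistics imply the following throughput (a back-of-envelope check derived from the slide's figures, not output from the actual run):

```python
# Throughput implied by the quoted GROMACS run on Lonestar.
atoms = 123_711
total_ns = 200
days = 8
cores = 84

ns_per_day = total_ns / days        # simulated nanoseconds per wall-clock day
core_hours = cores * days * 24      # total core-hours consumed
print(f"{ns_per_day:.0f} ns/day, {core_hours:,} core-hours")
```

That works out to 25 ns/day and about 16,000 core-hours, a small fraction of TTU's 9.2 M core-hour annual allocation.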
Dr. William (Bill) Poirier
 Dr. Poirier was the first TTU researcher to take advantage of Lonestar.
 In the first several months of Lonestar operation, Dr. Poirier's group was able to run simulations on 1,024 cores for 24 hours with no wait times.
 Recently, Dr. Poirier has taken full advantage of the large-memory nodes on Lonestar.
ScalIT Research
 Poirier's group has successfully used its TACC Lonestar allocation to run the ScalIT suite of software, performing exact quantum dynamics calculations of rovibrational spectra for small molecular systems.
 From a computational standpoint, a very large Hamiltonian matrix, expanded in a basis set, is distributed across multiple cores (up to 1,200 currently) and diagonalized.
 This amounts to solving the time-independent Schrödinger equation.
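The workflow above, building a large sparse Hamiltonian in a basis expansion and extracting low-lying eigenvalues, can be illustrated on a toy 1D harmonic oscillator with SciPy. This is a sketch of the general technique, not the ScalIT code itself:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

# Finite-difference Hamiltonian for a 1D harmonic oscillator (hbar = m = omega = 1):
# a sparse tridiagonal matrix, loosely analogous to ScalIT's sparse Hamiltonians.
n, L = 2000, 20.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
kinetic = diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / (-2 * dx**2)
potential = diags(0.5 * x**2)
H = (kinetic + potential).tocsc()

# Lowest eigenvalues via shift-invert Lanczos; exact values are 0.5, 1.5, 2.5, 3.5.
E = np.sort(eigsh(H, k=4, sigma=0.0, return_eigenvectors=False))
print(E)  # approximately [0.5, 1.5, 2.5, 3.5]
```

As in ScalIT, only a handful of eigenvalues are needed from a matrix far too large to diagonalize densely, which is exactly the regime where iterative sparse eigensolvers pay off.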
Development of New Approaches to ScalIT
 Dr. Poirier's research has also focused on the development of new tools for solving the time-independent Schrödinger equation for molecular systems under the Born-Oppenheimer approximation, using a variational basis set expansion method.
 Unlike ScalIT's direct products of well-known basis functions, a new basis approach has been developed using discrete phase space Gaussians.
 In this case the Hamiltonian matrices are dense, unlike those of ScalIT, which are very sparse. This means the new code can use well-known dense matrix methods for diagonalization, e.g. LAPACK and ScaLAPACK, allowing relatively simple parallelization and rather efficient scale-up.
 To date, a dense matrix of size 350k × 350k has been diagonalized using 2,880 cores.
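The dense-matrix route described above maps onto standard LAPACK routines; a minimal single-node sketch using NumPy's `numpy.linalg.eigh` (which dispatches to LAPACK's symmetric eigensolvers; ScaLAPACK is the distributed analogue for the 350k-scale runs):

```python
import numpy as np

# Dense symmetric "Hamiltonian" (random, for illustration only --
# not an actual phase-space-Gaussian matrix).
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 500))
H = (A + A.T) / 2                       # symmetrize

eigvals, eigvecs = np.linalg.eigh(H)    # full dense diagonalization via LAPACK
# Verify H v = lambda v for the lowest eigenpair.
v, lam = eigvecs[:, 0], eigvals[0]
print(np.allclose(H @ v, lam * v))
```

Dense solvers compute the full spectrum at O(n³) cost but parallelize with well-understood block algorithms, which is the trade-off the slide describes relative to ScalIT's sparse approach.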
Dr. Guillermo Araya
 Dr. Araya is a new Lonestar user; he has recently been given an allocation on Lonestar and specially requested access to Hrothgar.
 Dr. Araya's research interests are in the computational fluid dynamics of turbulent incompressible and compressible flows, wind energy array modeling, DNS, LES, RANS, URANS, turbulence modeling, heat transfer, and flow control.
Dynamic method for prescribing realistic
inflow boundary conditions
 A dynamic method for prescribing realistic inflow boundary conditions is presented for simulations of spatially developing turbulent boundary layers. The rescaling process requires prior knowledge of how the velocity and length scales are related between the inlet and recycle stations.
 The dynamic method is tested in direct numerical simulations of zero, favorable, and adverse pressure gradient flows. The dynamically obtained scaling exponents for the downstream evolution of boundary layer parameters are found to fluctuate in time, but on average they agree with the expected values for zero, favorable, and adverse pressure gradient flows.
 The following slides show results generated by the simulations.
Direct Numerical Simulations of spatially-developing
turbulent boundary layers under Strong Adverse Pressure
Gradients
Source: Araya G., Castillo L., Meneveau C. and Jansen K., A dynamic multi-scale
approach for turbulent inflow boundary conditions in spatially evolving flows, J. of
Fluid Mechanics vol. 670, pp. 581–605, 2011.
Iso-surfaces of instantaneous streamwise velocity fluctuations, u′ = ±0.15.
Iso-surfaces of instantaneous thermal fluctuations, θ′ = ±0.15.
Direct Numerical Simulations of spatially-developing
turbulent boundary layers under Moderate Favorable
Pressure Gradients
Source: Araya G., Castillo L., Meneveau C. and Jansen K., A dynamic multi-scale
approach for turbulent inflow boundary conditions in spatially evolving flows, J. of
Fluid Mechanics vol. 670, pp. 581–605, 2011.
Iso-surfaces of instantaneous streamwise velocity fluctuations, u′ = ±0.15.
Iso-surfaces of instantaneous thermal fluctuations, θ′ = ±0.15.