AUS Presentation

Advanced User Support
Amit Majumdar
5/7/09
Outline
Three categories of AUS
Update on Operational Activities
AUS.ASTA
AUS.ASP
AUS.ASEOT
Three Categories of AUS
I. Advanced Support for TeraGrid Applications (AUS.ASTA)


AUS staff typically work on a particular user's code
Guided/initiated by the allocations process
II. Advanced Support for Projects (AUS.ASP)

Project initiated by AUS staff (jointly with users) – impacts many users
III. Advanced Support for EOT (AUS.ASEOT)

Advanced HPC/CI part of EOT
Update on Operational Activities
• Bi-weekly telecon among AUS POCs from every RP site
– Matching of AUS staff to ASTA projects; discussion of ASP and EOT activities
• Web/tele-conference among AUS technical staff
– Bi-weekly technical tele/web-conference on ASTA and other projects, using ReadyTalk
– 13 presentation sessions (including today), 22 technical presentations
ASTAs Started in October 2008 - TRAC
# | PI name | Field | AUS staff
1 | Jordan/USC | Geosciences | Cui (SDSC), Urbanic (PSC), Kim (TACC), Wong (NICS), Bock (NCSA)
2 | Mahesh/U. Minn | CFD | Tatineni (SDSC), Crosby (NICS), Cazes (TACC)
3 | Roitberg/U. Florida | Biochem | Walker (SDSC), Pamidighantam (NCSA), Halloy (NICS)
4 | Voth/U. Utah | Biochem | Blood (PSC), Crosby (NICS), Peterson (TACC)
5 | Buehler/MIT | Civil Engr | Walker (SDSC)
6 | Durisen/IU | Astro | Henschel, Berry (IU)
7 | Papavassiliou/OU | CFD | Reddy (PSC)
8 | Van Einde/UCSD | Str. Engr | Choi (SDSC), Halloy (NICS), Cazes (TACC)
Startup/Supplemental ASTAs
# | PI Name | Field | AUS Staff
1 | Schulten/UIUC | Biochem | Koesterke, Liu, Milfeld (TACC)
2 | Ferrante/U. Washington | CFD/DNS/turbulence | Taha (NCSA), Pekurovsky (SDSC), Bock (NCSA), Peterson (TACC), Crosby (NICS)
3 | Tabakin/U. Pitt | Quantum Computing | Gomez (PSC)
4 | Karimabadi/UCSD | Space Physics | Tatineni (SDSC)
5 | Mashayek/UIC | CFD | Taha (NCSA)
6 | Malamataris/GM | CFD | Brook (NICS)
7 | Hu/CSULB | Physics | Barth (TACC)
8 | Yeung/Gtech | CFD/DNS | Wong (NICS), Pekurovsky (SDSC)
9 | Finol/CMU | CFD | Jana (PSC)
10 | Geoffrey/CMU | CS | Jana (PSC)
11-13 | Roberts/RTI; Karniadakis/Brown; Coveney/UCL | CS/CFD/Chemistry | Karonis (NIU, ANL)
14 | Cobb/ORNL | Neutron Science | Lynch, Chen (ORNL)
15 | Kanamitsu/SIO-UCSD | Climate Science | Peterson (TACC)
16 | Burke/Pitt | Medicine | Brown (PSC)
ASTAs Started in January 2009 - TRAC
# | PI Name | Field | AUS staff
1 | Aidun/GT | CFD | Dillman (Purdue), Khurram (NCSA)
2 | Cheatham/U. Utah | Biochem | Pamidighantam (NCSA), Crosby (NICS), Milfeld (TACC), Madrid (PSC), Walker (SDSC)
3 | Gygi/UCD | Materials Science | Fahey (NICS), Kim (TACC)
4 | Jansen/RPI | CFD | Wong (NICS), Peterson (TACC), O'Neal (PSC)
5 | Van de Walle/UCSB | Materials Science | Liu (TACC), Van Moer (NCSA)
6 | Yip/MIT | Materials Science | Halloy (NICS)
7 | Scheraga/Cornell | Chemistry | Blood, Mahmoodi (PSC)
8 | Oleynik/U. South Florida | Materials Science | Barth (TACC), Crosby (NICS), Jundt (LONI)
ASTAs Started in April 2009 - TRAC
# | PI Name | Field | AUS staff
1 | Batista/LANL | Materials Science | Liu (TACC)
2 | Schnetter/LSU | Astrophysics | Mahmoodi (PSC), Kufrin, Liu (NCSA), Pfeiffer (SDSC)
3 | Helly/SIO-UCSD | Climate Science | Cheeseman (Purdue), Jundt (LONI)
4 | Liu/UKY | QCD | Reddy (PSC), Fahey (NICS)
5 | Smith/Gtech | CFD | TBD upon PI's response
6 | Wheeler/UT | CFD | Khamra (TACC)
7 | Cen/Princeton | Astrophysics | Brook (NICS), Khamra (TACC), Chourasia (SDSC)
8 | Latour/Clemson | BioChem | TBD upon PI's response
9 | Apte/Oregon St. | CFD | Rosales (TACC)
10 | Sandholm/CMU | CS | Welling (PSC)
Total number of active ASTAs: ~40
ASTA Update - PI: Durisen, IU, Astrophysics
AUS staff: Henschel, Berry (IU)
• Legacy code, OpenMP-parallel, benchmarked on three Altix systems (PSC, NCSA, ZIH) with different performance on each
• Optimized the subroutine that calculates the gravitational potential: 1.8x speedup
• Simulations generate several TB of data, which is then analyzed interactively using IDL
• Transferring this amount of data to IU via traditional methods (ftp, scp, etc.) is extremely time consuming and tedious
• By mounting the Data Capacitor at PSC on Pople, the user can write data directly to IU and then access it from servers in their department
• Extensive profiling and optimization of I/O performance on local scratch vs. the Data Capacitor, eventually yielding a ~30% I/O speedup
• Files appear locally as they are generated by the simulation
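The I/O comparison above can be sketched generically (hypothetical paths and file sizes, not the project's actual measurements): write identical data to two directories, e.g. local scratch vs. the mounted Data Capacitor, and compare throughput.

```python
import os
import tempfile
import time


def write_throughput(directory, n_files=4, size_mb=8):
    """Write n_files of size_mb each into directory; return throughput in MB/s."""
    data = os.urandom(size_mb * 1024 * 1024)
    start = time.perf_counter()
    for i in range(n_files):
        path = os.path.join(directory, f"chunk_{i}.dat")
        with open(path, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # force data to disk so the timing is honest
    elapsed = time.perf_counter() - start
    return n_files * size_mb / elapsed


# In practice the two paths would be local scratch and the mounted Data
# Capacitor; here both are temporary directories purely for illustration.
with tempfile.TemporaryDirectory() as scratch, tempfile.TemporaryDirectory() as dc:
    print(f"scratch: {write_throughput(scratch):.1f} MB/s")
    print(f"mounted: {write_throughput(dc):.1f} MB/s")
```

On a real system the two numbers diverge, and repeated runs with varying file counts and sizes are needed before drawing conclusions like the ~30% figure above.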
ASTA Update - PI: Scheraga, Cornell, Biophysics
AUS staff: Blood, Mahmoodi (PSC)
• Identified a serial bottleneck and a load-imbalance problem that limited parallel scaling
• Eliminated the serial bottleneck and restructured the code to remove the imbalance
• The resulting code performs 4x faster for large systems
• Optimization is never finished:
– The new code's computation/communication balance has changed; further profiling is ongoing using CrayPAT and TAU
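To illustrate the load-imbalance diagnosis in general terms (a generic sketch, not this project's code): a common metric, reported by profilers such as CrayPAT, is the ratio of the maximum per-rank time to the mean; restructuring the work assignment drives it toward 1.0. The task costs below are made up for illustration.

```python
def imbalance(loads):
    """Load-imbalance factor: max per-rank load / mean load; 1.0 is perfect."""
    return max(loads) / (sum(loads) / len(loads))


def block_assign(costs, n_ranks):
    """Static block decomposition: each rank gets a contiguous chunk of tasks."""
    chunk = len(costs) // n_ranks
    return [sum(costs[r * chunk:(r + 1) * chunk]) for r in range(n_ranks)]


def greedy_assign(costs, n_ranks):
    """Assign each task, heaviest first, to the currently least-loaded rank."""
    loads = [0.0] * n_ranks
    for c in sorted(costs, reverse=True):
        loads[loads.index(min(loads))] += c
    return loads


# Hypothetical task costs: one expensive task plus many cheap ones.
costs = [8, 1, 1, 1, 1, 1, 1, 1]
print(imbalance(block_assign(costs, 2)))   # ~1.47: one rank owns the heavy task's chunk
print(imbalance(greedy_assign(costs, 2)))  # ~1.07: work spread far more evenly
```

The same ratio computed from real per-rank timings is what tells you whether a restructuring like the one above actually removed the imbalance.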
ASTA Update - PI: Van de Walle, UCSB, Condensed
Matter Physics
AUS staff: Liu (TACC), Van Moer (NCSA)
• The main effort was identifying performance issues in VASP
• Identified the one routine that needed lower-level (-O1) compilation; compiling the others at -O3 resulted in a ~10% performance improvement
• MKL on Ranger had SMP enabled with a default OMP_NUM_THREADS of 4, which caused oversubscription overhead; fixed with proper wayness and thread settings
• Proper setting of NPAR (which determines process grouping for band diagonalization and FFT) showed a 3-4x speedup
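The MKL threading issue above is a case of oversubscription: MPI tasks per node times threads per task exceeding the physical cores. A small sketch of the arithmetic (Ranger's 16 cores per node is real; the helper function is ours):

```python
import os


def oversubscription(tasks_per_node, threads_per_task, cores_per_node=16):
    """Ratio of software threads to physical cores; > 1.0 means oversubscribed."""
    return tasks_per_node * threads_per_task / cores_per_node


# Default situation: 16-way MPI with MKL spawning 4 threads per task.
print(oversubscription(16, 4))  # 4.0 -> 64 threads competing for 16 cores

# Fix: either reduce wayness or pin MKL/OpenMP to one thread per MPI task,
# e.g. by setting OMP_NUM_THREADS=1 in the job script.
os.environ["OMP_NUM_THREADS"] = "1"
print(oversubscription(16, 1))  # 1.0 -> one thread per core
```

An alternative fix with the same product is lower wayness with more threads each (e.g. 4 tasks x 4 threads), which is the "proper wayness and threads" trade-off mentioned above.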
Advanced Support Projects
• Two projects ongoing
– Benchmarking Molecular Dynamics codes
– Benchmarking Materials Science codes
• Other potential projects we are looking into
– Multi-core performance analysis
– Usage-based performance/profiling tools
Comparison of MD Benchmark on TeraGrid Machines at Different Parallel Efficiencies
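The chart on that slide compares machines at fixed parallel efficiency. As a reminder (the standard definition, not taken from the slide), efficiency on p cores is E(p) = T(1) / (p * T(p)); the timings below are invented for illustration.

```python
def parallel_efficiency(t_serial, t_parallel, n_cores):
    """E(p) = T(1) / (p * T(p)); 1.0 is ideal linear scaling."""
    return t_serial / (n_cores * t_parallel)


# Hypothetical MD benchmark timings (seconds per step) on one machine.
t1 = 100.0
timings = {64: 1.8, 128: 1.0, 256: 0.6}
for p, tp in sorted(timings.items()):
    # e.g. 128 cores: 100 / (128 * 1.0) = 0.78
    print(p, round(parallel_efficiency(t1, tp, p), 2))
```

Comparing machines "at the same efficiency" then means finding, for each machine, the core count where E(p) crosses a chosen threshold (say 0.7) and comparing the absolute time to solution there.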
Advanced Support EOT
• Advanced HPC classes at various RP sites
• TG09
– AUS staff are participating in organizing TG09 and reviewing papers
– AUS staff will present papers and tutorials at TG09
– Joint US-AUS-XS working group meeting at TG09