Geoinformatics: A Petascale Cyberfacility for Physics-Based Seismic Hazard Analysis
(SCEC PetaSHA-3 Project) (NSF EAR-0949443)
Principal Investigator:
Thomas H. Jordan – University of Southern California – Earth Sciences
Co-Principal Investigators:
Jacobo Bielak – Carnegie Mellon University – Civil Engineering
Po Chen - University of Wyoming – Geophysics
Stephen Day – San Diego State University – Geological Sciences
Kim Olsen – San Diego State University – Geological Sciences
Yifeng Cui - San Diego Supercomputer Center - Computer Science
Robert Graves - U.S.G.S. - Geophysics
Greg Beroza - Stanford University - Geophysics
Ewa Deelman - USC Information Sciences Institute - Computer Science
Final Project Report - Performance Period 15 July 2010 – 30 August 2013
1. What are the major goals of the project?
PetaSHA-3 will enable us to extend our CME cyberfacility toward the petascale, to better
integrate modeling and observational data, and to apply these new geoinformatics
capabilities to basic research issues expressed in the computational pathways, including
fault zone constitutive laws, rupture-driven stress transfer, high-frequency wave propagation
simulations, and physics-based PSHA. The project poses basic science questions that can
improve current predictive SHA models and then develops the geoinformatics capabilities
needed by the earth science community to investigate those questions. The two-year
PetaSHA-3 project has three main goals: (1) perform basic seismic hazard research that
improves predictive models of earthquake processes through first principles of physics; (2)
develop high performance computing (HPC) computational platforms capable of running
petascale SHA computational research on current and future national open-science
facilities; and (3) provide geoinformatics capabilities needed by solid earth researchers,
building engineers, emergency management agencies, students, and the public. The
PetaSHA-3 geoinformatics capabilities will include improved community structural models,
scalable computational platforms, and original research results.
Earthquake simulations output vector and tensor fields, such as ground velocity fields.
Analysis of these data is crucial for understanding the underlying physical phenomena and
for validating simulation models. One powerful and intuitive method for analyzing these
datasets is visualization.
Vector data are often reduced to a scalar magnitude and plotted either as pseudocolor 2D
plots or as volume renderings. Vector data are also visualized by very sparse plotting of
2D/3D streamlines, stream tubes, stream surfaces, line integral convolution (LIC), and
particle advection. Although these methods are useful, they are insufficient to adequately
characterize the underlying detail, richness, and significance of the data. Moreover, particle
advection methods are suitable for flow data but are not adequate for vector data produced
by earthquake simulations, which resemble vibration in that the particles do not move far
from their original locations.
The major goal of the project was to develop a visualization technique that would be
simultaneously (1) easily perceived by non-specialists, and (2) amenable to quantitative
interpretation by domain experts.
Task 1. DynaShake Computational Platform (Objective O2 of proposal)
Further develop the DynaShake platform for scalable (to thousands of cores) dynamic
rupture simulations incorporating complex (non-Cartesian) fault geometry and advanced
thermomechanical models. We will evaluate candidate codes meeting the geometric
requirements (including SORD and MAFE), and integrate into the selected code new
modules for high-speed frictional weakening (in a rate- and state-dependent formulation),
shear heating, fault-normal heat conduction and thermally-driven pressurization and Darcy
flow of pore fluids.
Task 2. High-Frequency Dynamic Rupture Simulations (Objective O3 of proposal)
We will investigate high-frequency seismic energy generation through numerical
simulations using the enhanced DynaShake capability from Task 1. Simulations will
incorporate multiple physical models such as frictional breakdown, shear heating,
porothermoelastic flow (and the resulting effective normal-stress fluctuations), as well as
multiscale fault roughness. Our target will be a resolution level sufficient to capture
outer/inner spatial-scale ratios exceeding 10⁴. Analysis of the simulations will be directed
toward advancing our fundamental understanding of high-frequency seismic wave
excitation, and will be used to provide guidance for constructing appropriate
(pseudo-dynamic) source models for high-frequency ground motion simulations.
Task 3. Pushing Deterministic Earthquake Simulations Toward Higher Frequencies
(Objective O5 of proposal)
We will continue the effort to push the highest frequency at which deterministic simulations
of earthquakes can produce accurate synthetic seismograms. Complementing the Task 2
research to characterize high-frequency source dynamics, in Task 3 we will
assess the high-frequency accuracy of the Community Velocity Models (CVMs) of southern
California (e.g., CVM-S and CVM-H). This task entails in part comparing synthetic
seismograms, for increasingly finer sampling of the CVMs, to recorded strong motion data.
Small- to moderate-size events will be used to minimize finite-source effects. We will use the
TeraShake HPC codes, which scale to tens of thousands of cores (e.g., AWP-Olsen), to
generate synthetics up to 3 Hz and quantify the spatially varying fit to data using a
systematic goodness-of-fit algorithm. Finally, the amplification effects of the near-surface
material (Vs < 500 m/s) will be examined.
Task 4. Triggering of Cascading Ruptures (Objective O4 of proposal)
Task 4a. Dynamic Loading from Large Events. We will use the highly scalable HPC codes
(e.g. AWP-Olsen) on the TeraShake platform to calculate stress-wave loading of one fault (or
fault segment) by rupture on another. This is the first step (see also Task 4b) to examine the
triggering potential of stress waves (static and dynamic) from moderate and large
earthquake scenarios (for example, a large earthquake on the southern San Andreas fault) on
nearby faults in southern California. The variation in loading, and therefore in triggering
potential, will be tested for a suite of dynamic rupture models produced by the DynaShake
platform using various physical models. We will resolve the stresses onto the expected
rupture mechanism for the nearby faults, compare the stress levels to proposed triggering
thresholds from other studies, and introduce these time-dependent stresses into separate,
high-resolution dynamic triggering simulations.
Task 4b. Dynamic Triggering. The
enhanced DynaShake capability from Task 1 will be used in the dynamic triggering
investigations. As in Task 2, these simulations will also incorporate multiple physical
models (frictional breakdown, shear heating, porothermoelastic flow) and fault roughness.
The loading will be from stress waves calculated (Task 4a) for other, nearby events, with
the analysis directed toward establishing an improved physical basis for Earthquake
Rupture Forecasts.
2. What was accomplished under these goals (you must provide information for at
least one of the 4 categories below)? For this reporting period describe: 1) major
activities; 2) specific objectives; 3) significant results, including major findings,
developments, or conclusions (both positive and negative); and 4) key outcomes or
other achievements. Include a discussion of stated goals not met.
Task 1. We developed the SORD code as a tool for dynamic simulation of geometrically and
physically complex ruptures. To do so, we integrated high-speed frictional weakening (in a
rate- and state-dependent formulation) into the code. This integration was done using a
method that time-staggers the state and velocity variables at the split nodes, producing a
stable, accurate, and very efficient solution scheme. We also added
the Drucker-Prager formulation of pressure-dependent plastic yielding to SORD, with
viscoplastic terms to suppress strain localization. The resulting code was successfully
tested using SCEC rupture dynamics benchmarks. We also implemented and successfully
tested a scheme for the generation of SORD meshes for power-law rough faults. Figure 1
shows an example in which we meshed a rough fault that is statistically self-similar over 3
orders of magnitude in spatial scale. Finally, SORD was successfully ported to the Kraken
and Blue Waters supercomputers.
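As an illustration of the rough-fault meshing ingredient, the sketch below synthesizes a 1-D self-similar fault profile from a power-law spectrum. It is a minimal stand-in, not the SORD mesher: the random-phase construction is standard, but the function name and the RMS-height-to-length normalization (std(h) = α·L) are illustrative assumptions, and the project's fault surfaces are 2-D.

```python
import numpy as np

def rough_fault_profile(n=4096, dx=25.0, alpha=1e-2, seed=0):
    """Synthesize a 1-D self-similar rough-fault profile h(x).

    Self-similar roughness has a 1-D power spectral density ~ k**-3, so the
    amplitude spectrum falls off as k**-1.5 and the amplitude-to-wavelength
    ratio is scale independent. Phases are random; the normalization
    std(h) = alpha * L is an illustrative convention, not SORD's.
    """
    rng = np.random.default_rng(seed)
    L = n * dx
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)   # angular wavenumbers
    amp = np.zeros_like(k)
    band = k > 0.0                               # drop the mean (k = 0)
    amp[band] = k[band] ** -1.5                  # sqrt of a k**-3 PSD
    spec = amp * np.exp(2j * np.pi * rng.random(k.size))
    h = np.fft.irfft(spec, n=n)
    return np.arange(n) * dx, h * alpha * L / h.std()

x, h = rough_fault_profile()
print(f"rms height / profile length = {h.std() / (x[-1] + 25.0):.4f}")  # ~alpha
```

With n = 4096 and dx = 25 m, resolvable wavelengths span roughly 50 m to tens of kilometers, i.e., about three orders of magnitude, consistent with the meshing example in Figure 1.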
Task 2. We used SORD to perform three-dimensional numerical calculations of dynamic
rupture along non-planar faults to study the effects of fault roughness (self-similar over
three orders of magnitude in scale length) on rupture propagation and resultant ground
motion. The present simulations model seismic wave excitation up to ~10 Hz with rupture
lengths of ~100 km, permitting comparisons with empirical studies of ground-motion
intensity measures of engineering interest. Characteristics of site-averaged synthetic
response spectra, including the distance and period dependence of the median values, the
absolute level, and the intra-event standard deviation, are comparable to appropriate
empirical estimates throughout the period range 0.1-3.0 s (e.g., Figure 2).
Task 3. We used the 2008 Mw 5.4 Chino Hills, CA, earthquake for most of the analysis in this
section. The event lends itself well to the project, as it is well recorded and sufficiently
small to minimize finite-fault effects at higher frequencies.
We used the efficient and parallel finite-difference code AWP-ODC to simulate 200 s of
viscoelastic wave propagation in a 180 km × 125 km region covering the greater Los Angeles
area, with dx = 25 m and Qs = 50Vs (Vs in km/s). These parameters allowed us to model
frequencies up to 2 Hz for Vsmin = 500 m/s, and up to 1 Hz for Vsmin = 250 m/s. The
simulations were carried out on Kraken at NICS using 64k cores in about 2 wall-clock hours
per run.
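These frequency limits follow from the standard points-per-minimum-wavelength rule for finite-difference grids. A minimal check, assuming roughly 10 points per wavelength (the exact count used in the project runs is not stated here):

```python
def fmax(vs_min, dx, points_per_wavelength=10):
    """Highest frequency a uniform FD grid resolves reliably.

    The shortest wavelength present is vs_min / fmax, and resolving it
    takes ~points_per_wavelength grid points per wavelength, so
    fmax = vs_min / (points_per_wavelength * dx).
    """
    return vs_min / (points_per_wavelength * dx)

print(fmax(500.0, 25.0))  # 2.0 Hz for Vsmin = 500 m/s, as in the text
print(fmax(250.0, 25.0))  # 1.0 Hz for Vsmin = 250 m/s
```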
Figure 3 shows a comparison between data and synthetics up to 1 Hz for Vsmin = 500 m/s
and Vsmin = 250 m/s at 9 stations located in the greater Los Angeles area, including rock,
deep-sediment, and shallow-sediment sites. Generally, the synthetics reproduce the peak
velocities and duration and, in many cases, the shape of the P and S waveforms of the data.
In particular, it is striking that the simulations with the two different minimum Vs values
are very similar, indicating that, at least for frequencies up to 1 Hz, sediments with Vs
lower than 500 m/s can be ignored for seismic hazard analysis. While this result is likely
limited to the lower frequencies, it has important cost-saving implications for future
earthquake simulations.
Figure 4 compares 0.1-2 Hz Chino Hills data to synthetics for Vsmin = 500 m/s. While the
main arrivals, the general envelope, and the duration of data and synthetics match fairly well
at some stations, the mismatch in amplitude is larger and the overall fit is, as expected,
degraded compared to 0.1-1.0 Hz. This result is clarified further in Fig. 5 using an average
of the map-based goodness-of-fit metrics of Olsen and Mayhew (2010).
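For context, map-based methods of this kind score the agreement between synthetic and observed metrics (peak amplitudes, energy, duration, spectral response) at each site. The sketch below shows one widely used scoring function of this family, the 0-10 form introduced by Anderson (2004); it is illustrative only, not the exact Olsen-Mayhew implementation.

```python
import numpy as np

def gof_score(p_syn, p_obs):
    """Score one synthetic/observed metric pair on a 0-10 scale (10 = perfect).

    Scoring form of Anderson (2004); map-based GOF methods apply scores
    like this to many metrics (peak velocity, energy, duration, response
    spectra) at every site and average them. Illustrative only.
    """
    return 10.0 * np.exp(-((p_syn - p_obs) / min(p_syn, p_obs)) ** 2)

print(gof_score(1.0, 1.0))  # 10.0 for a perfect fit
print(gof_score(1.5, 1.0))  # ~7.8 for a 50% amplitude mismatch
```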
The comparisons of Chino Hills synthetics to data in this study have set the stage for driving
the frequencies higher in deterministic simulations, and have contributed to the important
ongoing SCEC project ‘High-F’ (http://scec.usc.edu/scecpedia/High-F). Guided by the
results reported here, deterministic simulations with frequencies as high as 5-10 Hz have
been carried out, leading to the inclusion of statistical descriptions of small-scale complexity
of the rupture as well as the surrounding medium. Moreover, the PetaSHA3 results suggest
that it may be feasible to replace the current practice of calculating broadband synthetic
seismograms (0-10 Hz), often done by hybrid deterministic-stochastic methods that lack a
fundamental physical basis, with fully deterministic simulations.
Co-sponsored with USGS/NEHRP, we have estimated 0-10 Hz ground motions in the Salt
Lake Basin (SLB) during M 7 earthquakes on the Salt Lake City (SLC) segment of the
Wasatch fault (WFSLC) (Roten et al., 2011; 2012). First we generate a suite of realistic
source representations by simulating the spontaneous rupture process on a planar fault,
followed by projection of the slip rate histories from the spontaneous rupture scenarios
onto a detailed 3D model geometry of the WFSLC. Next we simulate 0-1 Hz wave
propagation from six source models with AWP-ODC, using the most recent version of the
Wasatch Front Community Velocity Model. Horizontal spectral accelerations at two seconds
(2s-SAs) reveal strong along-strike rupture direction effects for unilateral ruptures, as well
as significant amplifications by the low-velocity sediments on the hanging wall side of the
fault. For ruptures nucleating near the southern end of the segment we obtain 2s-SAs of up
to 1.4 g near downtown SLC, caused by a combination of rupture direction and basin edge
effects. Average 3s-SAs and 2s-SAs from the six scenarios are generally consistent with
values predicted by four next-generation attenuation models. Figure 6 shows 2s-SA maps
for rupture models B (northern epicenter) and B’ (southern epicenter).
We then generate broadband (BB, 0-10 Hz) ground motions for the M 7 earthquakes on the
Salt Lake City segment of the Wasatch fault, Utah, by combining the 0-1 Hz finite-difference
synthetics with high-frequency (1-10 Hz) S-to-S back-scattering operators. Average
horizontal spectral accelerations at 5 and 10 Hz (0.2s-SAs and 0.1s-SAs, respectively)
calculated from the linear BB synthetics exceed estimates from four recent ground motion
prediction equations (GMPEs) at near-fault (< 5 km) locations on the sediment generally by
more than one standard deviation, but agree better with the GMPEs at larger rupture
distances. The overprediction of the near-fault GMPE values is largely eliminated after
accounting for near-surface soil nonlinearity with 1D simulations along E-W profiles in the
Salt Lake basin, reducing the SAs from the simulations by up to 70%. The nonlinear
simulations use a simple soil model based in part on published laboratory experiments on
Bonneville clay samples.
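The spectral accelerations used throughout these comparisons are the peak responses of a damped single-degree-of-freedom oscillator driven by the ground acceleration. Below is a minimal textbook recipe (Newmark average-acceleration integration, 5% damping), shown only to fix ideas; it is not the project's processing code.

```python
import numpy as np

def spectral_acceleration(acc, dt, period, damping=0.05):
    """Pseudo-spectral acceleration SA(T) of a damped SDOF oscillator.

    Integrates u'' + 2*z*w*u' + w**2*u = -acc(t) with the Newmark
    average-acceleration method (gamma = 1/2, beta = 1/4) and returns
    SA = w**2 * max|u|, in the same units as acc.
    """
    w = 2.0 * np.pi / period
    z = damping
    u = v = a = 0.0
    umax = 0.0
    denom = 1.0 + z * w * dt + (w * dt) ** 2 / 4.0
    for ag in acc:
        u_pred = u + dt * v + dt**2 / 4.0 * a   # Newmark predictors
        v_pred = v + dt / 2.0 * a
        a = (-ag - 2.0 * z * w * v_pred - w**2 * u_pred) / denom
        u = u_pred + dt**2 / 4.0 * a
        v = v_pred + dt / 2.0 * a
        umax = max(umax, abs(u))
    return w**2 * umax

# Sanity check: resonant input is amplified by ~1/(2*damping) = 10
dt, T = 0.01, 2.0
t = np.arange(0.0, 60.0, dt)
print(spectral_acceleration(np.sin(2 * np.pi * t / T), dt, period=T))  # ~10
```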
Task 4. We have used the Mw7.2 El Mayor-Cucapah earthquake and the AWP-ODC FD code
to test the efficacy of using static Coulomb Failure Stress (sCFS) change and peak dynamic
Coulomb Stress (peak dCFS(t)) change to predict aftershocks. The underlying source
models were those by Roten and Olsen (2010), constrained by fault geometry, rake,
hypocenter location, and reported surface displacement. The dynamic rupture models
produce 0-1.5Hz ground motions in general agreement with recorded strong motion data.
We find that both sCFS and peak dCFS(t) provide high correlation coefficients (R>0.4) with
seismicity rate change a month after the mainshock, with decreasing correlation for shorter
time periods. We find that the correlation coefficient is generally larger for peak dCFS(t)
compared to sCFS changes, suggesting that dynamic stress changes in general provide a
better predictor for the location of the aftershocks. Static and peak dynamic stress changes
show stronger correlation with near-fault and more distant aftershocks, respectively (see
Figure 7).
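Both measures resolve a stress-change tensor onto the receiver-fault geometry via the standard Coulomb relation ΔCFS = Δτ + μ′Δσₙ (Δσₙ positive for unclamping); for the peak dynamic measure, the same resolution is applied at every time step and the maximum over the record is kept. A minimal sketch, assuming a generic effective friction μ′ = 0.4 and the tension-positive sign convention; this is an illustration, not the project's analysis code.

```python
import numpy as np

def coulomb_stress_change(dstress, normal, rake, mu_eff=0.4):
    """Resolve a stress-change tensor onto a receiver fault plane.

    dCFS = d_tau + mu_eff * d_sigma_n, where d_tau is the shear-stress
    change along the slip (rake) direction and d_sigma_n is the
    normal-stress change (tension positive, so positive = unclamping).
    mu_eff = 0.4 is a common generic choice, not the project's value.
    """
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    t = np.asarray(dstress, dtype=float) @ n   # traction change on the plane
    d_sigma_n = t @ n                          # normal component
    d_tau = (t - d_sigma_n * n) @ rake         # shear component along rake
    return d_tau + mu_eff * d_sigma_n

# Example: a stress-change tensor (Pa) on a vertical plane with normal +y,
# receiver slip direction +x (right-lateral strike-slip).
dstress = np.array([[1.0e5, 5.0e4, 0.0],
                    [5.0e4, -2.0e4, 0.0],
                    [0.0,   0.0,   0.0]])
rake = np.array([1.0, 0.0, 0.0])
print(coulomb_stress_change(dstress, [0.0, 1.0, 0.0], rake))  # 4.2e4 Pa
```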
We calculated the correlation of the seismicity rate change with Coulomb stress changes
from the El Mayor-Cucapah mainshock simulated in two different 3D velocity models, the
SCEC CVM-H and CVM-S. The sCFS distributions obtained from the two different CVMs are
very similar, while the corresponding peak dCFS(t) patterns are noticeably different (see
Figure 5). In particular, CVM-H generates a lobe of (directivity-induced) large peak dCFS(t)
between the Elsinore and San Jacinto faults toward the Los Angeles basin that is not present
in the results using CVM-S. CVM-H tends to produce a higher correlation (on average) than
CVM-S at periods of 1 month or less.
Finally, we have found an indication of a threshold level of peak dCFS(t) (of 0.7 bars) for
the triggering of aftershocks in regions of positive static stress change. Such a threshold
value, which may be used as an indicator of the location of future earthquakes or
aftershocks, was not found for static stress changes. In summary, we suggest that both sCFS
and peak dCFS(t) be incorporated in studies of stress transfer and earthquake triggering, as
they appear to affect aftershock seismicity in complementary ways and show variable
correlation over time periods after the mainshock ranging from a day to 1 year.
In our GlyphSea visualization study we explored a new general approach that allows much
more flexibility: the use of glyphs. The method's applicability is very general and not
discipline-specific, and the software may be implemented on a wide variety of platforms,
from laptop computers with a single 2D screen to supercomputers driving a 4D
visualization “Cave”.
The outcome was extremely successful and was presented at numerous scientific meetings,
both domain-specific (such as AGU) and IT-oriented (IEEE, SPIE).
The most spectacular outcome was the award of an honorable mention (among four ex
aequo) by the National Science Foundation and the AAAS journal Science in the 2010
International Science & Engineering Visualization Challenge. The award was given to A.
Chourasia, E. McQuinn, J.-B. Minster, and J. Schulze (see attached reference).
The resulting package, GlyphSea, is easily obtained and has been applied to other fields,
such as early-universe cosmology.
One goal that has not been met to our satisfaction is the publication of the algorithm in
printed form. It proved difficult to convince reviewers that a print version was a useful
supplement to web-published videos. A new version of the paper was submitted recently.
• Technical support for the PetaSHA milestone simulations, including porting and
optimizing the SCEC high-performance application codes AWP-ODC, SORD, and Ma-FE for
efficient use of NSF computing systems.
• Effectively perform major SCEC capability simulations on the NSF TeraGrid, providing a
rule-based automatic end-to-end workflow for preparing initial inputs, heroic executions,
data archiving, and analysis. Develop selected applications with a self-describing,
machine-independent HDF5 binary file format that supports scalable parallel I/O
performance.
• Develop the capability for a collocated and concurrent visualization production pipeline
for simulations on current and future high-performance resources. Develop novel visual
representations and products for dissemination of research to a broad range of audiences.
• Act as liaison between SCEC and the national TeraGrid facilities, enabling our project to
work effectively with the TeraGrid organization and to leverage TeraGrid opportunities and
capabilities.
The Quake Group and Computational Seismology Lab at CMU, under the leadership of Co-PI
Jacobo Bielak, participated in different aspects of the PetaSHA-3 project. In particular, the
objectives related to CMU activities included the continued development of computational
tools (namely Hercules, CMU's octree-based finite-element parallel code for anelastic wave
propagation) for extending the upper frequency bound of ground-motion simulations up to
5.0 Hz. This included validations using seismic data from Southern California earthquakes
in order to improve the SCEC Community Velocity Models available for the region. We also
worked on adapting the code to make it available to other SCEC users outside our Quake
Group. To attain these objectives we completed the following activities, in association with
complementary activities from related projects.
• We improved the algorithm used in Hercules to incorporate anelastic attenuation and
obtained an additional 20-30 percent speed-up in its performance by eliminating
unnecessary computational steps in the case of infinite Q for dilatational deformation (Qκ).
• As part of the continued development of Hercules, we updated the scaling performance
of the code on different systems, including various NSF and DOE machines. Figure 1 shows
various Hercules strong-scalability curves.
• We ported Hercules to a new distributed version control system (Git) hosted in a more
easily accessible repository (GitHub). This has allowed us to share the code with other
SCEC researchers and provides the framework for sustainable software development in the
future.
• Together with SCEC/IT researchers, we created multiple etree models of the SCEC
Community Velocity Models available for Southern California (with and without
incorporating additional information about the geotechnical layers in the upper 350 m),
that is, CVM-S, CVM-H, and CVM-H+GTL. These models were used to represent the
material properties in large areas (180 km × 135 km) of southern California in order to
perform earthquake ground-motion simulations up to frequencies of engineering interest.
In particular, the etrees were used to generate unstructured octree finite-element meshes
ranging between 5 and 15 billion elements (a mesh-sizing sketch follows this list). Figure 2
shows a comparison of the three models built.
• We used the abovementioned finite-element models and an improved version of
Hercules to perform a set of simulations of the Mw 5.4 2008 Chino Hills, California,
earthquake using the various Southern California velocity models (CVM-S, CVM-H, and
CVM-H+GTL). The simulations were designed to produce a valid representation of the
ground motion up to a maximum frequency of 4 Hz. We compared the results of the Chino
Hills simulations with seismic records obtained from Southern California strong-motion
networks; in total, we compared simulation results against data at 336 stations. The quality
of the match between the actual records and the simulated synthetics was measured in
terms of a commonly used goodness-of-fit (GOF) criterion. Figure 3 shows a summary of the
spatial distribution of the GOF values obtained for the three models. Values closer to 10
indicate a better fit.
• We accomplished significant progress on incorporating free-surface topography into the
modeling and simulation scheme in Hercules. We developed a finite-element-based
methodology that uses special integration techniques to account for an arbitrary
free-surface boundary in the simulation domain while preserving the octree-based
structure of the code, and thus does not have a significant effect on performance. We call
the proposed approach virtual topography. A schematic representation of the modeling
approach is shown in Figure 4. Initial verification tests have been done by comparison with
previous simulations performed by independent groups using other analytical and
numerical methods. Figure 5 shows some of these comparisons, which indicate that the
method implemented in Hercules yields satisfactory results. Incorporating the effects of
topography is a key element for future high-frequency ground-motion simulations, where
small-scale features can have a significant effect on the ground motion. We will continue to
work in this area with the objective of using this methodology in realistic models for
Southern California and other active seismic zones.
• We also completed an initial study of the effect of the built environment on the ground
motion. We incorporated the presence of buildings in simulations using simplified models
that capture the principal characteristics of the dynamic response of multiple structures in
regular building clusters, the interactions of the buildings with the ground, and the
interactions between the buildings themselves. Figure 6, for instance, shows the scattered
field at the free surface for different configurations of building clusters due to the 1994
Northridge earthquake. We found that interaction effects increase with the number of
buildings and the density of the cluster (i.e., smaller separation between buildings).
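As a rough illustration of how octree element sizes are driven by the local wavelength, the sketch below computes the refinement level needed for a given minimum shear velocity and target frequency. The sizing rule (about 10 points per minimum wavelength) and the function itself are illustrative assumptions; Hercules' actual meshing logic is more involved.

```python
import math

def octree_level(vs_min, f_max, domain_size, points_per_wavelength=10):
    """Octree refinement level whose element edge resolves the local
    minimum wavelength vs_min / f_max with ~points_per_wavelength nodes.

    Returns (level, edge): elements at 'level' have edge = domain / 2**level.
    Illustrative only; not Hercules' actual meshing rules.
    """
    target_edge = vs_min / (f_max * points_per_wavelength)
    level = math.ceil(math.log2(domain_size / target_edge))
    return level, domain_size / 2 ** level

# e.g. a 180 km domain, 4 Hz target, softest material Vs = 500 m/s
level, edge = octree_level(500.0, 4.0, 180e3)
print(level, edge)  # level 14, ~11 m edges where Vs is low; coarser elsewhere
```

Because refinement is applied only where the material is soft, element counts in the billions arise from a comparatively small fraction of the domain, consistent with the 5-15 billion element meshes mentioned above.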
3. What opportunities for training and professional development has the project
provided?
Two post-doctoral researchers and three geoscience students were employed on the
project. The post-doctoral researchers and students obtained advanced training in
high-performance computing, rupture dynamics, and 3D ground-motion simulations during
the project. They furthermore presented their work at international conferences and
gained important technical communication skills.
Graduate student Emmett McQuinn worked with UCSD and SDSC for the bulk of this work,
under a SCEC Graduate Student ACCESS scholarship. After securing an MS in Computer
Science, Emmett had opportunities to pursue a PhD at SDSC, USC, or SIO, but preferred to
move to Stanford to pursue a more lucrative career in computer graphics applied to
biomedicine. [His work earned him the cover of Science in Feb 2012:
http://www.sciencemag.org/content/339/6119.cover-expansion ]
Amit Chourasia (SDSC), Jean-Bernard Minster (SIO), and Jürgen P. Schulze (Calit2) were
his mentors during the completion of this work.
UCSD Ph.D. students Jun Zhou, Shiyu Song, and Efecan Poyraz, as well as high school
student Lucy Chen, were trained through this project; Jun Zhou was additionally mentored
on his Ph.D. thesis project.
Postdoctoral mentoring was part of the activities to help prepare Dr. Ricardo Taborda for
an academic career. As a postdoctoral fellow in the Computational Seismology Laboratory at
Carnegie Mellon (CMU), Dr. Taborda assisted Co-PI Bielak in advising three graduate
students, writing journal publications, and preparing poster and oral presentations (see
publications). At CMU, Taborda has been particularly involved in co-advising the Ph.D.
thesis work of graduate student Yigit Isbiliroglu, whose research topic is a continuation of
Taborda's Ph.D. work on the effects of the built environment on the ground motion during
strong earthquakes and the coupled soil-structure interaction effects on the dynamic
response of building clusters. Taborda has also served as the primary liaison between the
SCEC/IT group and the Quake Group at CMU, and has been closely involved with the
development of the Unified California Velocity Model (UCVM). He has used datasets
generated using UCVM to conduct his own research on the validation of the various seismic
velocity models available for Southern California (CVM-S, CVM-H).
During the Spring of 2013, Taborda interviewed at several universities and secured a
position as a new Assistant Professor at the University of Memphis (U of M), starting August
2013. At the U of M, Taborda will join the faculty of the Civil Engineering Department in the
School of Engineering and will have a joint tenure-track appointment with the Center for
Earthquake Research and Information (CERI). CERI is a University of Memphis Center of
Excellence. We expect that Taborda will continue to collaborate with SCEC and CMU from
CERI.
4. How have the results been disseminated to communities of interest?
The SCEC PetaSHA3 research group frequently presents its research activities to groups
outside of seismology, including the computational science research community. In 2011,
Philip Maechling participated in an HPC panel on National Cyberinfrastructure at Harvard
Medical School as part of an Open Science Grid meeting. We
have hosted a series of computational workshops at SCEC on PetaSHA3 scientific issues
including UCVM (April 2012), CyberShake (April 2012), and SCEC Data Management (Feb
2014). Details from these meetings are posted on the public SCEC wiki at:
http://scec.usc.edu/scecpedia/Main_Page
PetaSHA3 visualizations from our M8 research received national recognition with an
honorable mention in scientific visualizations for the year. This award was widely covered
in the electronic press, including HPCWire, Futurity.org, and Scientific Computing, which
showed images from our visualizations to an international audience. The M8 visualizations
won a TeraGrid '11 Visualization Award and also received an Office of Advanced Scientific
Computing Research (OASCR) award announced at the SciDAC 2011 conference.
Images from the SCEC M8 simulation are used in the NSF Cyberinfrastructure for the 21st
Century Science and Engineering Advanced Computing Infrastructure Vision and Strategic
Plan (NSF Document nsf12051).
The SCEC PetaSHA3 researchers make use of wikis to communicate our research among
ourselves. By default, research communications are open. Many PetaSHA3 research efforts,
including the wave propagation simulation work, are described in some detail on the SCEC
wiki with the following home page: http://scec.usc.edu/scecpedia
The outcome of this work was extremely successful and was presented at numerous
scientific meetings, both domain-specific (such as AGU) and IT-oriented (IEEE, SPIE).
Emmett was also extremely generous with his time, and provided numerous demonstrations
to visitors at UCSD, including several cohorts of SCEC interns. Naturally, the implementation
in a 4D “cave” was the most popular of these demonstrations and tours.
• Invited Speaker, HPC China Workshop at SC'12, Nov 13, 2012; International Workshop
on CO-DESIGN, Beijing, Oct 23-25, 2012
• SIAM PP'12, Savannah, Feb 15-17, 2012; International Workshop on CO-DESIGN,
Beijing, Oct 25-26, 2011
• 7th Int'l APEC Cooperation for Earthquake Simulation (APES) Workshop, Otaru, Oct 3-8,
2010; Int'l Conference of Numerical Analysis and Applied Mathematics (ICNAAM'10),
Rhodes, September 19-25, 2010
Media coverage:
• 3-D Simulation Predicts LA Will Bear Brunt Of The "Big One" - new M8 movie released
during SCEC annual meeting. Discovery News, September 2010.
• TeraGrid Helps Make Possible Largest-Ever Earthquake Simulation - the scientific
results of this massive simulation have allowed us to observe things that we were not able
to see in the past. TeraGrid News, September 2010.
• Supercomputing Enables Largest-Ever Earthquake Simulation - for a magnitude 8.0
(M8) rupture of the entire southern San Andreas fault. Dr. Dobb's The World of Software
Development, August 2010. See also The Orange County Register and UCSD News.
• Earthquake Simulation Rocks Southern California - Jaguar raises the bar for modeling
the next big shakeup. HPC Wire, August 16, 2010. Also in ORNL News.
• Preparing for the big one - NCSA works with a group led by SCEC to run large
earthquake simulations on Blue Waters and characterize seismic hazard risk. NCSA News,
April 21, 2010.
• DOE INCITE Award 2010 - the Department of Energy awarded scientists from SDSC 27
million core hours for the project titled "Deterministic Simulations of Large Regional
Earthquakes at Frequencies up to 2 Hz". UCSD News, February 2, 2010. See also HPC Wire.
• SDSC Visualizations Win 'OASCR' Awards at SciDAC 2011 - M8 visualization is among
the recipients of the people's choice OASCR awards announced at the 2011 SciDAC
(Scientific Discovery through Advanced Computing Program) conference. Wired Science,
August 8, 2011.
• M8 Visualization Wins Best Visualization Display Award at TeraGrid '11 meeting -
TeraGrid News, July 24, 2011.
5. What do you plan to do during the next reporting period to accomplish the goals?
This is our final project report.
6. Products - What has the project produced?
Some of the results reported here, including the Chino Hills results, are posted on the
interactive data website WebSims: http://scec.usc.edu/websims/. Development of this
website continued during the PetaSHA3 funding cycle, and we expect widespread use of the
system for disseminating synthetic ground-motion results in the future.
Current graphics hardware is capable of displaying a very large number of glyphs at
interactive rates. Recently, it has become possible to draw glyph shapes procedurally rather
than using pre-computed lookup textures. In this work we offer a quiver of glyph
techniques incorporating several real-time interaction features, such as displacement,
halos, jitter, isosurfaces, and slicing, for the exploration of vector data. The key
contributions of our study are:
• A novel technique to encode and display vector data using procedural dipole texturing,
which is extendable to arbitrary glyph shapes. This method enables us to identify
macro-level features from afar and micro-level features when zoomed in closer.
• A novel method for distinguishing glyphs using a user-tunable wireframe lattice.
• A comprehensive demonstration and discussion of our new glyph techniques applied to
datasets from astrophysics and seismology.
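To make the glyph idea concrete, here is a minimal 2-D sketch of direction-encoded glyphs on a vibration-like synthetic field, using matplotlib's quiver with a cyclic colormap so that oppositely directed glyphs get contrasting colors (loosely analogous to the dipole encoding above). GlyphSea itself renders procedurally on the GPU; this is only a conceptual stand-in, and the synthetic field is invented for the demo.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import Normalize

# Synthetic "vibration-like" velocity snapshot: particles oscillate in place,
# so advection-style flow visualization would reveal little structure.
n = 24
x, y = np.meshgrid(np.linspace(0.0, 1.0, n), np.linspace(0.0, 1.0, n))
u = np.sin(6 * np.pi * x) * np.cos(4 * np.pi * y)
v = -np.cos(6 * np.pi * x) * np.sin(4 * np.pi * y)

# Encode direction as hue on a cyclic colormap so that oppositely directed
# glyphs remain distinguishable even when they are visually small.
angle = np.arctan2(v, u)
fig, ax = plt.subplots(figsize=(6, 6))
ax.quiver(x, y, u, v, angle, cmap="hsv",
          norm=Normalize(-np.pi, np.pi), pivot="middle", scale=30)
ax.set_title("Direction-encoded glyphs (conceptual sketch, not GlyphSea)")
plt.show()
```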
The datasets produced by this project are secondary datasets resulting from the processing
of either seismic data recorded by field instrumentation or numerical simulations. For
some purposes, the products generated by GlyphSea can be captured in the form of videos,
such as those published online:
Chourasia, Amit, E. McQuinn, B. Minster, and J. Schulze, GlyphSea: An Application to
Visualize Vector Data: http://visservices.sdsc.edu/projects/scec/vectorviz/glyphsea/
However, such a medium does not do justice to the interactive capabilities of GlyphSea,
and it is usually not 4D, so most users would prefer to generate ab initio visualizations for
specific applications.
Publications
1. Unat, D., Zhou, J., Cui, Y., Cai, X., and Baden, S. (2012). Accelerating an Earthquake
Simulation with a C-to-CUDA Translator, Computing in Science and Engineering, Vol. 14,
No. 3, 48-58, May/June 2012.
2. Christen, M., O. Schenk, and Y. Cui (2012). PATUS for Convenient High-Performance
Stencils: Evaluation in Earthquake Simulations, Technical Paper, SC12, Salt Lake City, Nov
10-16, 2012.
3. Chourasia, A., Zhou, J., Cui, Y., Choi, D.J., and Olsen, K. (2012). Role of visualization in
porting a seismic simulation from CPU to GPU architecture (Visualization Showcase),
XSEDE'12, Chicago, July 16-20, 2012.
4. Zhou, J., Choi, D.J., and Cui, Y. (2012). GPU acceleration of a 3D finite-difference
earthquake code on XSEDE Keeneland, XSEDE'12, Chicago, July 16-20, 2012.
5. Zhou, J., Unat, D., Choi, D., Guest, C., and Cui, Y. (2012). Hands-on Performance Tuning
of 3D Finite Difference Earthquake Simulation on GPU Fermi Chipset, Proceedings of the
International Conference on Computational Science (ICCS 2012), Vol. 9, 976-985, Elsevier,
Omaha, Nebraska, June 2012.
6. Cui, Y., Olsen, K., Jordan, T., Lee, K., Zhou, J., Small, P., Ely, G., Roten, D., Panda, D.K.,
Chourasia, A., Levesque, J., Day, S., and Maechling, P. (2010). Scalable Earthquake
Simulation on Petascale Supercomputers, Gordon Bell Finalist, Supercomputing'10, 1-20,
New Orleans, Nov 2010.
7. Cui, Y. (2010). Looking forward to Architecture Changes with Seismic Wave Propagation
Using a 3D Finite Difference Code, Int'l Conference of Numerical Analysis and Applied
Mathematics, p. 1781, edited by T. E. Simos, G. Psihoyios, and Ch. Tsitouras, Rhodes,
Greece, 19-25 September 2010.
8. Chavez, M., K. B. Olsen, E. Cabrera, and N. Perea (2011). Observations and Modeling of
Strong Ground Motions for the 9 October 1995 Mw 8 Colima-Jalisco, Mexico, Earthquake,
Bull. Seis. Soc. Am. 101, 1979-2000.
9. Day, S.M., D. Roten, and K.B. Olsen (2012). Adjoint analysis of the source and path
sensitivities of basin-guided waves, Geophys. J. Int. 189, 1103-1124,
doi:10.1111/j.1365-246X.2012.05416.x.
10. Olsen, K.B., and J. Mayhew (2010). Goodness-of-fit criteria for broadband synthetic
seismograms, with application to the 2008 Mw 5.4 Chino Hills, California, earthquake,
Seism. Res. Lett. 81(5), 715-723.
11. Olsen, K.B., and G. Ely (2009). WebSims: A Web-based System for Storage,
Visualization, and Dissemination of Earthquake Ground Motion Simulations, Seism. Res.
Lett. 80, 1002-1007, doi:10.1785/gssrl.80.6.1002.
12. Roten, D., K. B. Olsen, J. C. Pechmann, V. M. Cruz-Atienza, and H. Magistrale (2011). 3D
Simulations of M 7 Earthquakes on the Wasatch Fault, Utah, Part I: Long-Period Ground
Motion, Bull. Seis. Soc. Am. 101, 2045-2063.
13. Roten, D., K. B. Olsen, and J. C. Pechmann (2012). 3D Simulations of M 7 Earthquakes
on the Wasatch Fault, Utah, Part II: Broadband (0-10 Hz) Ground Motions and Nonlinear
Soil Behavior, Bull. Seis. Soc. Am. 102, 2008-2030.
14. Shi, Z., and S. M. Day (2013). Rupture dynamics and ground motion from 3-D
rough-fault simulations, Journal of Geophysical Research 118, 1-20,
doi:10.1002/jgrb.50094.
15. Cui, Y., K.B. Olsen, J. Zhou, P. Small, A. Chourasia, S.M. Day, P.J. Maechling, and T.H.
Jordan (2012). Development and optimizations of a SCEC community anelastic wave
propagation platform for multicore systems and GPU-based accelerators, Seism. Res. Lett.
83(2), 396.
16. Day, S.M., K.B. Olsen, and Y. Cui (2011). Large-scale earthquake simulations and the
prediction of strong ground motion (invited talk), SIAM Conference on Mathematical and
Computational Issues in the Geosciences, March 21-24, 2011, Long Beach.
17. Shi, Z., S.M. Day, and G. Ely (2012). Dynamic rupture along the San Gorgonio Pass
section of the San Andreas Fault, Seism. Res. Lett. 83(2), 423.
18. Olsen, K.B., J.E. Mayhew, and K. Withers (2011). Goodness-of-fit criteria for broadband
synthetic seismograms, with application to the 2008 Mw 5.4 Chino Hills, CA, earthquake,
Seism. Res. Lett.
19. Olsen, K.B., B.H. Jacobsen, and R. Takedatsu (2012). Validation of broadband synthetic
seismograms with earthquake engineering-relevant metrics, Seism. Res. Lett.
20. Roten, D., and K.B. Olsen (2010). Simulation of Long-Period Ground Motion in the
Imperial Valley Area during the Mw 7.2 El Mayor-Cucapah Earthquake, abstract
S51A-1920, poster presented at the 2010 Fall Meeting, AGU, San Francisco, CA.
21. Somerville, P. G., Callaghan, S., Maechling, P., Graves, R. W., Collins, N., Olsen, K. B.,
Imperatori, W., Jones, M., Archuleta, R., Schmedes, J., and Jordan, T.H. (2011). The SCEC
Broadband Ground Motion Simulation Platform, Seism. Res. Lett. 82(2), 275,
doi:10.1785/gssrl.82.2.273.
22. Withers, K., and K.B. Olsen (2012). Correlation of peak dynamic and static Coulomb
failure stress with seismicity rate change after the M7.2 El Mayor-Cucapah earthquake,
AGU Fall Meeting, San Francisco, Dec 2012, poster S43E-2517.
Software and Data Products
1. The CMU etree generator is available through SCEC’s UCVM software platform. The
etree library is available through the CMU Quake Group Web site.
2. Hercules is made available upon request via the GitHub code-hosting platform. Versions
of Hercules have been shared with researchers at USC and Argonne National Lab.
3. Data and results from the 2008 Chino Hills earthquake and simulation are made
available upon request via the CMU Quake Group Web site. The full dataset has already
been shared with researchers from USC and UC Irvine.
4. The electronic supplement to Roten et al. (2011) (supplementary figures of spectral
acceleration and an animation of wave propagation) can be found at
http://bssa.geoscienceworld.org/content/101/5/2045/suppl/DC1
5. The electronic supplement to Roten et al. (2012) (a table of coefficients and
amplitude-dependent correction functions for nonlinear soil effects, and figures showing
maps of SAs at various frequencies, PGA, and PGV, with and without correction for
nonlinear soil effects, results of 1D nonlinear simulations, and comparisons to ground
motion prediction equations) can be found at
http://www.seismosoc.org/publications/BSSA_html/bssa_102-5/2011286esupp/index.html
7. Participants & Other Collaborating Organizations - Who has been involved?
The following individuals have worked on the project.
• Yifeng Cui, yfcui@sdsc.edu
• Jun Zhou, UCSD graduate student, Chinese citizen, zhoujun84@gmail.com
• Dr. DongJu Choi, researcher, dchoi@sdsc.edu
• Kwangyoon Lee, UCSD GSR, kwl002@cs.ucsd.edu
• Efecan Poyraz, UCSD graduate student, Turkey, efexan@gmail.com
• Layton Chen, programmer analyst, Canada, layton@sdsc.edu
• Lucy Chen, high school intern, female, lychen.4869@gmail.com
• Joey Reed, UCSD GSR, US citizen
• Hieu Nguyen, UCSD GSR, Vietnam
• Sashka Davis, female, Mexico, UCSD GSR
• Bielak, Jacobo (faculty, Hispanic)
• Isbiliroglu, Yigit (graduate student, Caucasian)
• Karaoglu, Haydar (graduate student, Caucasian)
• Restrepo, Dorian (graduate student, Hispanic/South American)
• Taborda, Ricardo (postdoc, Hispanic/South American)
• Kim Olsen, M/E, professor of geophysics
• Steve Day, M/E, professor of geophysics
• Rumi Takedatsu, MS student, Asian, female
• Jinquan Zhong, post-doc, Asian, male
• Megan Jones (female, not Latino, white, US citizen), geoscience grad student
• Kyle Withers (male, not Latino, white, US citizen), geoscience grad student
• William Savran (male, not Latino, white, US citizen), geoscience grad student
• Zheqiang Shi, post-doc, geoscience, male, Asian
Other collaborators or contacts that have been involved:
• Dhabaleswar Panda, OSU CS Professor, panda@cse.ohio-state.edu
• John Levesque, Cray Inc., levesque@cray.com
• Scott Klasky, ORNL, klasky@ornl.gov
• Jeffrey Chen, CMU, CS Professor, zchen@mines.edu
• Dr. Daniel Roten (ETHZ, Zurich, Switzerland), a former post-doc at SDSU with Drs. Olsen
and Day, has been involved in the project. Dr. Roten has supported the project with rupture
and ground motion simulations from his position in Zurich. In return, the project has paid
a small amount of travel support ($1,232) for Dr. Roten to come to California to work with
the PIs.
The visualization work was a collaboration between:
• Dr. Amit Chourasia (San Diego Supercomputer Center, Senior Visualization Scientist),
• Professor Jean-Bernard Minster (Scripps Institution of Oceanography), and
• Professor Jürgen P. Schulze (Research Scientist, California Institute for
Telecommunications and Information Technology).
The three participants named above co-supervised graduate student and SCEC graduate
intern Emmett McQuinn.
The SCEC CME collaboration was critical in providing challenges, suggestions, and
simulation outputs. Many scientists contributed at one time or another. Of special note we
count:
• Prof. Steven Day, San Diego State University
• Prof. Kim Olsen, San Diego State University
• Dr. Yifeng Cui, Programmer Analyst, San Diego Supercomputer Center
• Philip Maechling, Southern California Earthquake Center
8. Impact - What is the impact of the project? How has it contributed?
The Chino Hills simulations, as well as the rough-fault dynamic rupture and wave
propagation simulations, have demonstrated that accurate deterministic ground motion
estimation for frequencies up to 5-10 Hz may be feasible. This result has important
implications for current procedures for such ground motion estimation, which primarily
consist of hybrid methods that combine low-frequency deterministic and high-frequency
stochastic components (e.g., Chavez et al., 2011; Roten et al., 2012). These hybrid methods
lack a physical basis, and the PetaSHA3 results show promise for fully deterministic
methods to replace the hybrid methods in the future.
SCEC's PetaSHA3 work optimizing our AWP-ODC wave propagation software now permits
SCEC software to run on Top500 HPC systems. Development of this highly capable
software is an important step towards our goal of real science runs at sustained petaflops
performance. Our PetaSHA2 work involving SCEC, XSEDE, and ISI researchers, in which
we developed a new method for running many short serial jobs using workflows on
Kraken, has significant potential for broad use within the computational sciences. This new
capability will help SCEC and other researchers use workflow technology on Kraken and
other newly developed supercomputers such as Blue Waters.
The primary impact of this work is the demonstration that 4D visualizations of vector and
tensor fields are a superb way for domain scientists to discover features in the enormous
volume of data produced by 3D dynamic simulations.
The easiest example to describe is the discovery that surface velocity fields produced by the
SCEC TeraShake series of earthquake simulations contain patterns that exhibit high
vorticity. It would be impossible to detect such patterns from data collected on even a fairly
dense seismic network, and very difficult to detect in a voluminous simulation output in the
absence of suitable visualization.
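The vorticity patterns referred to above can be computed directly from a gridded surface velocity snapshot. A minimal sketch using centered finite differences (generic numpy, not the project's visualization pipeline):

```python
import numpy as np

def vertical_vorticity(vx, vy, dx, dy):
    """Vertical vorticity w_z = dvy/dx - dvx/dy of a gridded surface
    velocity field, via centered finite differences.

    vx, vy: 2-D arrays of horizontal velocity components sampled on a
    regular grid with spacings dx (columns, x) and dy (rows, y).
    """
    dvy_dx = np.gradient(vy, dx, axis=1)
    dvx_dy = np.gradient(vx, dy, axis=0)
    return dvy_dx - dvx_dy

# Toy check: solid-body rotation (vx, vy) = (-y, x) has vorticity 2 everywhere.
y, x = np.mgrid[-5:5:101j, -5:5:101j]
wz = vertical_vorticity(-y, x, dx=0.1, dy=0.1)
print(wz.mean())  # ~2.0
```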
This has certainly caught the attention of earthquake engineering colleagues, and further
exploration of such occurrences will be warranted.
We anticipate that visual pattern recognition will become an ever more useful analysis tool
as the methods become more readily accessible. Preliminary applications to early-universe
dynamic cosmological models look promising.
Finally, from the point of view of outreach, it has been our experience that a good
visualization “speaks” to the minds of viewers, even persons with no technical background.
This has been the case from the very beginning of the SCEC TeraShake project and has
continued since. The present work is an additional step: the representation of vector data.
It is possible to use the technique to represent higher-order tensor data (e.g., dynamic
strain fields), but our experience so far indicates that considerable viewer training may be
required to develop a useful mental picture of the tensor field.
What is the impact on the development of the principal discipline(s) of the project?
What is the impact on other disciplines?
PetaSHA3 researchers serve on national and international HPC advisory groups, including
the NICS User Advisory Board, TeraGrid Science Advisory Board, XSEDE Advisory Board,
NEES Cyberinfrastructure Advisory Committee, and the Global Earthquake Model Scientific
Board, and we participate in national cyberinfrastructure development and planning
workshops, including the NSF EarthCube activity.
What is the impact on the development of human resources?
The computer-oriented research activities on the PetaSHA3 project are excellent
preparation for many types of work. Graduate students and research staff who have made
significant contributions to SCEC computational research have gone on to computational
science careers, including positions with Amazon.com, Microsoft, Intel, and Argonne
National Laboratory.
What is the impact on physical resources that form infrastructure?
This project's funding enabled the community code AWP-ODC.
The WebSims system, which was continually developed during the PetaSHA3 funding
cycle, was a pioneering effort that will likely guide future efforts to disseminate earthquake
simulations.
What is the impact on institutional resources that form infrastructure?
In some cases, SCEC simulation results are stored as persistent research artifacts. SCEC
currently maintains about 15 TB of long-term archives of wave propagation simulation
results.
What is the impact on technology transfer?
The user-friendly facilities of WebSims are likely to change the accessibility of earthquake
simulations and of other products generated by earthquake simulations, making them
useful to the public.
What is the impact on society beyond science and technology?
In the United States, the USGS provides seismic hazard information, including seismic
hazard forecasts, to regulatory agencies and the public. SCEC PetaSHA3 research includes
significant contributions from USGS personnel and our close connection to USGS seismic
hazards programs provides an opportunity for our results to impact national seismic hazard
estimates. We believe that the SCEC PetaSHA3 project has shown how physics-based
computational models, observation-based 3D earth structure models, and high-performance
computing can improve seismic hazard forecasts, and that the software and computational
improvements made by our PetaSHA3 research group are contributing to the development
of the USGS official seismic hazard information in the United States.