Solving Prediction Problems in Earthquake System Science on Blue Waters
Thomas H. Jordan (Principal Investigator) [1], Scott Callaghan [1], Robert Graves [2],
Kim Olsen [3], Yifeng Cui [4], Jun Zhou [4], Efecan Poyraz [4], Philip J. Maechling [1],
David Gill [1], Kevin Milner [1], Omar Padron, Jr. [5], Gregory H. Bauer [5], Timothy
Bouvet [5], William T. Kramer [5], Gideon Juve [6], Karan Vahi [6], Ewa Deelman [6],
and Feng Wang [7]
[1] Southern California Earthquake Center, [2] USGS, [3] San Diego State University,
[4] San Diego Supercomputer Center, [5] National Center for Supercomputing
Applications, [6] Information Sciences Institute, [7] AIR Worldwide
1. Executive Summary:
A major goal of earthquake system science is to predict the peak shaking at surface
sites over forecasting intervals that may range from a few days to many years. The
deep uncertainties in these predictions are expressed through two types of
probability: an aleatory variability that describes the randomness of the earthquake
system, and an epistemic uncertainty that characterizes our lack of knowledge
about the system. Standard models use empirical prediction equations that have a
high aleatory variability, primarily because they do not model crustal
heterogeneities. We show how this variance can be lowered by simulating seismic
wave propagation through 3D crustal models derived from waveform tomography.
SCEC has developed a software platform, CyberShake, that combines seismic
reciprocity with highly optimized anelastic wave propagation codes to reduce the
time of simulation-based hazard calculations to manageable levels. CyberShake
hazard models for the Los Angeles region, each comprising over 240 million
synthetic seismograms, have been computed on Blue Waters. A variance-decomposition analysis indicates that more accurate earthquake simulations have
the potential for reducing the aleatory variance of the strong-motion predictions by
at least a factor of two, which would lower exceedance probabilities at high hazard
levels by an order of magnitude. The practical ramifications of this probability gain
for the formulation of risk-reduction strategies are substantial.
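The probability gain described above can be made concrete with a toy lognormal ground-motion model. This sketch is not SCEC code, and the median, sigma, and threshold values are illustrative assumptions; it only shows how halving the aleatory variance (dividing sigma by sqrt(2)) lowers the exceedance probability at a high shaking level by roughly an order of magnitude.

```python
# Toy lognormal ground-motion model (not SCEC code): illustrates how
# halving the aleatory variance lowers exceedance probabilities at a
# high hazard level. All numerical values are illustrative.
import math

def exceedance_prob(ln_threshold, ln_median, sigma):
    """P(ln X > ln_threshold) for lognormally distributed shaking X."""
    z = (ln_threshold - ln_median) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

ln_median = 0.0                          # median shaking in log units
sigma_gmpe = 0.6                         # typical aleatory sigma of an empirical model
sigma_sim = sigma_gmpe / math.sqrt(2.0)  # variance halved => sigma / sqrt(2)

threshold = ln_median + 2.0 * sigma_gmpe  # a rare, high shaking level
p_old = exceedance_prob(threshold, ln_median, sigma_gmpe)
p_new = exceedance_prob(threshold, ln_median, sigma_sim)
print(p_old, p_new, p_old / p_new)  # the ratio is close to an order of magnitude
```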
2. Description of Research Activities and Results:
Key Challenges: description of the science/engineering problem being addressed
The Southern California Earthquake Center (SCEC) coordinates basic research in
earthquake science using Southern California as its principal natural laboratory. The
Center’s earthquake system science research program integrates information
gathered by sensor networks, fieldwork, and laboratory experiments, with
knowledge formulation through physics-based, system-level modeling leading to
improved understanding of seismic hazard, and actions that reduce earthquake risk
and promote community resilience. The key challenge for SCEC’s computational
research using Blue Waters is to improve existing seismic hazard data products
using advanced computational tools and methods, and to develop improved types of
seismic hazard information useful to scientists, engineers, and the public.
SCEC has used NCSA Blue Waters to perform computational research with
CyberShake, a physics-based computational method for improving Probabilistic
Seismic Hazard Analysis (PSHA). Using Blue Waters, we have calculated several new
PSHA seismic hazard models for Southern California, exploring the variability in
CyberShake seismic hazard estimates produced by alternative 3D earth structure
models and earthquake source models.
Why it Matters: description of the potential impact of solving this research problem
The CyberShake method produces site-specific probabilistic seismic hazard
curves, comparable to the PSHA hazard curves produced by the U.S. Geological
Survey (USGS) and used in national seismic hazard maps. If the CyberShake
method can be shown to improve on current PSHA methods, it may have broad
impact on PSHA users, including scientific, commercial, and governmental
organizations such as the USGS. The value of CyberShake seismic hazard
information varies by user
community. For seismologists, CyberShake provides new information about the
physics of earthquake ground motions, including the interaction of fault geometry,
3D earth structure, ground-motion attenuation, and rupture directivity. For
governmental agencies responsible for reporting seismic hazard information to the
public, CyberShake represents a new source of information that may improve their
understanding of seismic hazards and, in turn, the hazard information they report.
For building engineers who rely on PSHA information for design purposes,
CyberShake represents an extension of existing seismic hazard information that
may reduce some of the uncertainties in current methods based on empirical
ground-motion attenuation models.
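The hazard curves discussed above follow the standard PSHA arithmetic: each rupture contributes its annual rate times the conditional probability that shaking at the site exceeds a threshold, and the summed rate converts to a fixed-window probability under a Poisson assumption. The sketch below is not CyberShake code; the rupture rates and exceedance probabilities are invented for illustration.

```python
# Sketch of the standard PSHA hazard-curve arithmetic (not CyberShake
# code): rupture rates and conditional exceedance probabilities below
# are invented for illustration.
import math

ruptures = [
    # (annual rate of the rupture, P(shaking > threshold | rupture))
    (0.01, 0.50),
    (0.002, 0.90),
    (0.05, 0.05),
]

# Total annual rate of exceeding the shaking threshold at this site
annual_rate = sum(rate * p_exc for rate, p_exc in ruptures)

# Poisson probability of at least one exceedance in 50 years -- the
# quantity behind "2% in 50 years" style hazard maps
p_50yr = 1.0 - math.exp(-annual_rate * 50.0)
print(annual_rate, p_50yr)
```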
Why Blue Waters: explanation of why you need the unique scale and
attributes of Blue Waters to address these challenges
SCEC uses Blue Waters to perform large-scale, complex scientific computations
involving thousands of large CPU and GPU parallel jobs, hundreds of millions of
short-running serial CPU jobs, and hundreds of terabytes of temporary files. These
calculations are beyond the scale of available academic HPCC systems, and, in the
past, they required multiple months of time to complete using NSF Track 2 systems.
Using the well-balanced system capabilities of Blue Waters CPUs, GPUs, disks, and
system software, together with scientific workflow tools, SCEC’s research staff is
now able to complete CyberShake calculations within weeks rather than months.
This enables SCEC scientists to advance the methodological improvements much
more rapidly, as we work towards CyberShake calculations at the scale and
resolution required by engineering users of seismic hazard information.
Accomplishments: explanation of results you obtained
SCEC’s Blue Waters work has produced both scientific and technical
accomplishments. Our technical accomplishments led to the successful completion
of very large-scale CyberShake calculations in record time. Many activities of our
computer and computational scientists contributed to this accomplishment.
SCEC’s CyberShake workflow system provides a repeatable and reliable
method for performing very large-scale HPC research calculations. The tools we
use work within the shared-resource environment of open-science HPC providers,
including Blue Waters. These tools have helped our research team increase the
scientific scale of the calculations by two orders of magnitude over the last five
years, without increasing the number of personnel required to complete them.
Our scientific contributions to PSHA have the potential to change standard
practices within the field. Models used in PSHA comprise two types of uncertainty:
aleatory variability, describing the intrinsic randomness of the
earthquake-generating system, and epistemic uncertainty, characterizing our lack
of knowledge about the system. SCEC’s physics-based system science approach
can improve our understanding of earthquake processes, and so can reduce
epistemic uncertainties over time. As an example of this type of potential impact,
we applied the averaging-based factorization (ABF) technique to compare CyberShake
models and assess their consistency with the Next Generation Attenuation (NGA)
models. ABF uses a hierarchical averaging scheme to separate the shaking
intensities for large ensembles of earthquakes into relative (dimensionless)
excitation fields representing site, path, directivity, and source-complexity effects,
and it provides quantitative, map-based comparisons between models. The
CyberShake directivity effects are generally larger than predicted by the Spudich &
Chiou (2008) NGA directivity factor. The CyberShake basin effects are generally
larger than those from the three NGA models that provide basin-effect factors.
However, the basin excitation calculated from CVM-H is smaller than from CVM-S,
and shows stronger frequency dependence, primarily because the horizontal
dimensions of the basins are much larger in CVM-H. The NGA model of Abrahamson
& Silva (2008) is the most consistent with the CyberShake CVM-H calculations, with
a basin-effect correlation factor greater than 0.9 across the frequency band 0.1-0.3 Hz.
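For intuition, the hierarchical-averaging idea behind ABF can be illustrated with a two-level toy decomposition into site and source terms. This is a simplification of the Wang & Jordan scheme, which uses further levels for path, directivity, and source-complexity effects; the ln-intensity values below are invented.

```python
# Two-level toy version of averaging-based factorization (ABF): strip a
# grand mean, then per-site and per-source means, leaving a residual.
# The ln-intensity values for 3 sites x 4 sources are invented.
ln_g = [
    [0.2, 0.5, 0.1, 0.4],
    [0.8, 1.1, 0.7, 1.0],
    [-0.1, 0.2, -0.2, 0.1],
]
n_sites, n_sources = len(ln_g), len(ln_g[0])

grand_mean = sum(sum(row) for row in ln_g) / (n_sites * n_sources)
site_effect = [sum(row) / n_sources - grand_mean for row in ln_g]
source_effect = [
    sum(ln_g[i][j] for i in range(n_sites)) / n_sites - grand_mean
    for j in range(n_sources)
]
# Residual after removing the separable terms; the full ABF hierarchy
# further splits this into path, directivity, and source-complexity fields.
residual = [
    [ln_g[i][j] - grand_mean - site_effect[i] - source_effect[j]
     for j in range(n_sources)]
    for i in range(n_sites)
]
print(grand_mean, site_effect, source_effect)
```

Each level of averaging yields a dimensionless relative excitation field that can be mapped and compared between models, which is how the CyberShake-versus-NGA comparisons above were made.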
Related Papers, Presentations, Posters:
1. Boese, M., Graves, R. W., Callaghan, S., Maechling, P. J. (2012) Site-specific Ground-Motion Predictions for Earthquake Early Warning in the LA Basin using
CyberShake Simulations Abstract S53B-2498 Poster presented at 2012 Fall
Meeting, AGU, San Francisco, Calif., 3-7 Dec.
2. Callaghan, S., P. Maechling, K. Vahi, G. Juve, E. Deelman, Y. Cui, E. Poyraz, T. H.
Jordan (2013) Running A Seismic Workflow Application on Distributed
Resources, SC13, Denver, Nov 17-22, 2013 (Poster accepted for presentation)
3. Callaghan, S., Maechling, P., Gideon Juve, Karan Vahi, Robert W. Graves, Kim B.
Olsen, David Gill, Kevin Milner, John Yu and Thomas H. Jordan (2013) Running
CyberShake Seismic Hazard Workflows on Distributed HPC Resources, SCEC
Annual Meeting 2013, abstract 195, Sept 8 – 11, 2013, Palm Springs, CA
4. Callaghan, S., Maechling, P J., Milner, K., Graves, R W., Donovan, J., Wang, F.,
Jordan, T H (2012) CyberShake: Broadband Physics-Based Probabilistic Seismic
Hazard Analysis in Southern California Abstract S51A-2405 presented at 2012
Fall Meeting, AGU, San Francisco, Calif., 3-7 Dec.
5. Chourasia, A., Zhou, J., Cui, Y., Choi, DJ, Olsen, K. (2012) Role of visualization in
porting a seismic simulation from CPU to GPU architecture (Visualization
Showcase), XSEDE’12, Chicago, July 16-20, 2012.
6. Christen, M., O. Schenk, and Y. Cui (2012) PATUS for Convenient High-Performance Stencils: Evaluation in Earthquake Simulations, Technical Paper,
SC12, Salt Lake City, Nov 10-16, 2012.
7. Cui, Y., E. Poyraz, K. Olsen, J. Zhou, K. Withers, S. Callaghan, J. Larkin, C. Guest, D.
Choi, A. Chourasia, Z. Shi, S. Day, P. Maechling, and T. H. Jordan (2013), Physics-based seismic hazard analysis on petascale heterogeneous supercomputers,
SC13, Denver, Nov 17-22, 2013 (accepted for publication)
8. Cui, Y., E. Poyraz, J. Zhou, S. Callaghan, P. Maechling, T. H. Jordan, L. Shih, and P.
Chen (2013), Accelerating CyberShake Calculations on the XK7 Platform of Blue
Waters, Extreme Scaling Workshop, Denver, August 15-16, 2013
9. Cui, Y., K.B. Olsen, J. Zhou, P. Small, A. Chourasia, S.M. Day, P.J. Maechling, T.H.
Jordan. (2012). Development and optimizations of a SCEC community anelastic
wave propagation platform for multicore systems and GPU-based accelerators,
Seism. Res. Lett. 83:2, 396.
10. Donovan, J., T. H. Jordan, and J. N. Brune (2012). Testing CyberShake using
precariously balanced rocks, 2012 Annual Meeting of the Southern California
Earthquake Center, Palm Springs, Abstract 026, September, 2012.
11. Field, E., Dawson, T.E., et al. (2012) Uniform California Earthquake Rupture
Forecast, Version 3 (UCERF3) Framework, Working Group on California
Earthquake Probabilities (WGCEP) Technical Report #8 July 9, 2012
12. Gill, D., Maechling, P., Jordan, T., Taborda, R., Callaghan, S. and Small, P. (2013).
SCEC Unified Community Velocity Model: Mesh Generation and Visualization. In
Proc. CIG/QUEST/IRIS Joint Workshop on Seismic Imaging of Structure and
Source, Poster. Fairbanks, Alaska, July 14–17.
13. Jordan, T. H. (2013) Progress Of The Southern California Earthquake Center
Technical Activity Group On Ground Motion Simulation Validation, Seismological
Society of America Meeting, April 17-19 2013, Salt Lake City, SRL (Vol. 84, No. 2)
14. Jordan, T H., Callaghan, S., Maechling, P J., Juve, G., Deelman, E., Rynge, M., Vahi,
K., Silva, F. (2012) Workflow Management of the SCEC Computational Platforms
for Physics-Based Seismic Hazard Analysis Abstract IN54B-07 presented at 2012
Fall Meeting, AGU, San Francisco, Calif., 3-7 Dec.
15. Maechling, P. and Gill, D. and Small, P. and Ely, G. and Taborda, R. and Jordan, T.
(2013). SCEC Unified Community Velocity Model: Development Goals and
Current Status. In Proc. CIG/QUEST/IRIS Joint Workshop on Seismic Imaging of
Structure and Source. Fairbanks, Alaska, July 14–17.
16. Taborda, R. and Bielak, J. (2013a). Ground-Motion Simulation and Validation of
the 2008 Chino Hills, California, Earthquake. Bull. Seismol. Soc. Ame.
17. Taborda, R. and Bielak, J. (2013b). Comparative validation of a set of physics-based simulations of the 2008 Chino Hills earthquake using different velocity
models. In Proc. CIG/QUEST/IRIS Joint Workshop on Seismic Imaging of
Structure and Source, Poster. Fairbanks, Alaska, July 14–17.
18. Taborda, R. and Bielak, J. (2013c). Comparative Validation of a Set of High-Frequency Physics-Based Simulations Using Two Different Velocity Models. In
Abstr. SSA Annu. Meet. Salt Lake City, Utah, April 17–19.
19. Taborda, R. and Bielak, J. (2013d). Ground-Motion Simulation and Validation of
the 2008 Chino Hills, California, earthquake using different velocity models. Bull.
Seismol. Soc. Am., Submitted for publication.
20. Taborda, R. and Bielak, J. (2012). Validation of a 4-Hz physics-based simulation
of the 2008 Chino Hills earthquake. In Proc. SSA Annu. Meet. San Diego,
California, April 17–19.
21. Unat, D., Zhou, J., Cui, Y., Cai, X. and Baden, S. (2012) Accelerating an Earthquake
Simulation with a C-to-CUDA Translator, Journal of Computing in Science and
Engineering, Vol. 14, No. 3, 48-58, May/June, CiSESI-2011-09-0094, May, 2012.
22. Wang, F., Jordan, T.H. (2013) Comparison Of Physics-Based Models And Ground
Motion Prediction Equations In Seismic Hazard Analysis For Southern California,
Seismological Society of America Meeting, April 17-19 2013, Salt Lake City, SRL
(Vol. 84, No. 2)
23. Wang, F., Jordan, T H. (2012) Using Averaging-Based Factorization to Compare
Seismic Hazard Models Derived from 3D Earthquake Simulations with NGA
Ground Motion Prediction Equations Abstract S51A-2405 Poster presented at
2012 Fall Meeting, AGU, San Francisco, Calif., 3-7 Dec.
24. Wang, F., and Jordan, T. H. (2013), Comparison of probabilistic seismic hazard
models using averaging-based factorization, Bull. Seismol. Soc. Am., 84 pp.,
submitted 09/27/13.
25. Wang, F., T. H. Jordan, S. Callaghan, R. Graves, K. Olsen, and P. Maechling, Using
averaging-based factorization to assess Cybershake hazard models, submitted to
American Geophysical Union Annual Meeting, December, 2013.
26. Zhou, J., Y. Cui, E. Poyraz, D. Choi, and C. Guest, (2013) Multi-GPU implementation
of a 3D finite difference time domain earthquake code on heterogeneous
supercomputers, Proceedings of International Conference on Computational
Science, Vol. 18, 1255-1264, Elsevier, ICCS 2013, Barcelona, June 5-7, 2013.
27. Zhou, J., Choi, DJ, Cui, Y. (2012) GPU acceleration of a 3D finite difference
earthquake code on XSEDE Keeneland, XSEDE’12, Chicago, July 16-20, 2012.
28. Zhou, J., Unat, D., Choi, D., Guest, C. & Y. Cui (2012) Hands-on Performance
Tuning of 3D Finite Difference Earthquake Simulation on GPU Fermi Chipset,
Proceedings of International Conference on Computational Science, Vol. 9, 976-985, Elsevier, ICCS 2012, Omaha, Nebraska, June, 2012.
Figure 1: Comparison of two seismic hazard models for the Los Angeles region from CyberShake Study 14.2,
completed on Blue Waters in early March, 2014. The left panel is based on an average 1D model, and the right
panel is based on the F3DT-refined structure CVM-S4.26. The 3D model shows important amplitude
differences from the 1D model, several of which are annotated on the right panel: (1) lower near-fault
intensities due to 3D scattering; (2) much higher intensities in near-fault basins due to directivity-basin
coupling; (3) higher intensities in the Los Angeles basins; and (4) lower intensities in hard-rock areas. The
maps are computed for 3-s response spectra at an exceedance probability of 2% in 50 years. Both models
include all fault ruptures in the Uniform California Earthquake Rupture Forecast, version 2 (UCERF2), and
each comprises about 240 million seismograms.

SCEC_BWS_ExAbs_v2 - University of Southern California