PRAC_2014_Final_v3 - University of Southern California

Petascale Research in Earthquake System Science on Blue Waters (PressOn)
(OCI-0832698)
Project Report for Performance Period:
1 October 2013 – 30 September 2014
Principal Investigator:
Thomas H. Jordan – University of Southern California – Earth Sciences
Co-PI:
Jacobo Bielak – Carnegie Mellon University – Civil Engineering
1. What are the major goals of the project?
Our PressOn project objectives are defined in the original proposal as:
Develop methods, capability, and software for high fidelity and physics-based
simulations of entire urban regions to assess the engineering impacts of large
magnitude earthquakes on buildings, transportation systems, and underground civil
infrastructure. SCEC’s PRAC research helps to bridge the gaps between seismology
and civil engineering, applying the latest computational seismological capabilities to
study and assess the impact of strong ground motions on a built environment.
2. What was accomplished under these goals (you must provide information
for at least one of the 4 categories below)?
2.1) Major activities;
During this performance period, SCEC used the Blue Waters computer
system to perform our earthquake system science research program. We developed
physics-based seismological modeling codes to study the impact of large
earthquakes on the built environment. Then we used the newly developed modeling
tools to perform very large-scale research calculations. We collaborated with
building engineers to ensure that our PressOn research results meet the needs of
the earthquake engineering community. The tools and techniques we have
developed are applicable to California and other regions in the world.
Over the last year, the computational research areas for our SCEC PressOn
project included: (1) 3D velocity model development, (2) dynamic rupture and
Earthquake Rupture Forecast development, (3) high frequency earthquake
simulations, (4) ensemble probabilistic seismic hazard analysis (PSHA) earthquake
simulations, and (5) the impact of ground motion on the built environment. Our key results over
this performance period were in areas (4) and (5).
2.3) Significant results, including major findings, developments, or
conclusions;
Our most significant research calculation on Blue Waters in this performance
period was our CyberShake 14.2 hazard model calculation. The CyberShake 14.2
study produced four probabilistic seismic hazard models. Two of the CyberShake
14.2 models are shown in the associated figures file as Figure 1.
The CyberShake 14.2 calculation was made possible by our recent scientific
code improvements, including the GPU-based wave propagation codes developed as
part of our PRAC research. We also worked with the NCSA Blue Waters team to
integrate our workflow tools onto Blue Waters.
The CyberShake 14.2 calculation produced important scientific results. Our
primary goal is to reduce the epistemic uncertainty in physics-based PSHA by
extending the spatiotemporal scales of deterministic earthquake simulations. We
demonstrated progress towards this goal by an analysis of the CyberShake results
using the new technique of “averaging-based factorization” (ABF). By utilizing the
large number of ground-motion predictions generated by the simulation-based hazard
models (about 240 million per CyberShake run), ABF can factor any seismic
intensity functional into the aleatory effects of source complexity and directivity and
the deterministic effects of 3D wave propagation. This variance-decomposition
analysis demonstrates that improvements to simulation-based hazard models such
as CyberShake 14.2, can, in principle, reduce the residual variance by a factor of two
relative to current NSHMP standards. The consequent decrease in mean exceedance
probabilities could be up to an order of magnitude at high hazard levels.
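The relationship between residual variance and exceedance probability can be illustrated with a minimal lognormal hazard sketch; the residual standard deviation (0.6 in natural-log units) and the hazard level below are illustrative assumptions, not CyberShake values.

```python
import math

def exceedance_prob(z, sigma):
    """P(ln Y > z) for a zero-median lognormal ground-motion residual."""
    return 0.5 * math.erfc(z / (sigma * math.sqrt(2.0)))

sigma = 0.60                # assumed residual std. dev. (ln units)
z = 1.2                     # an illustrative high hazard level
p_before = exceedance_prob(z, sigma)
p_after = exceedance_prob(z, sigma / math.sqrt(2.0))  # variance halved
print(p_before / p_after)   # roughly an order of magnitude
```

Because the lognormal tail falls off rapidly, halving the residual variance shrinks the tail probability at high hazard levels by roughly a factor of ten in this toy example, consistent with the qualitative claim above.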
Development of Software and System Capabilities
The SCEC PressOn research has led to significant improvements in several high
performance scientific codes. These software improvements include both
integration of new physics and improved software efficiency.
We have developed dynamic rupture simulations that can incorporate multiple
physical models such as frictional breakdown, shear heating, thermoelastic flow
(and the resulting effective normal-stress fluctuations), as well as multiscale fault
roughness.
We continued Hercules development, incorporating free-surface topography
and the influence of the built environment into the modeling and simulation
scheme. The Hercules development team developed a finite-element based
methodology that uses special integration techniques to account for an arbitrary
free-surface boundary in the simulation domain but preserves the octree-based
structure of the code, and thus does not have a significant effect on performance. We
also developed a hybrid CPU/GPU version of Hercules that reduces the
computational cost of high frequency ground motion simulations.
We have developed GPU versions of AWP-ODC, Hercules, and the CyberShake SGT
code that give us 6.5 times greater computational efficiency running ground motion
simulation on Blue Waters. To measure computational efficiency, we compare the
costs of running equivalent simulations with CPU and GPU codes. If we run a given
wave propagation simulation using our CPU code, and we run an equivalent wave
propagation simulation using our GPU code, the CPU code will require 6.5 times
more Node Hours on Blue Waters than the same simulation using the GPU code.
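The node-hour accounting above can be sketched as simple arithmetic; the GPU node-hour figure below is a hypothetical example, and only the 6.5x factor comes from our measurements.

```python
gpu_speedup = 6.5          # measured CPU/GPU node-hour ratio (from the report)
gpu_node_hours = 1000.0    # hypothetical cost of one GPU simulation
cpu_node_hours = gpu_node_hours * gpu_speedup
saved = cpu_node_hours - gpu_node_hours
print(cpu_node_hours, saved)  # 6500.0 5500.0
```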
As mentioned briefly above, we developed scientific workflow tools for use
on Blue Waters to support our CyberShake hazard model calculations.
Since we have repeated the CyberShake calculation with different input
parameters several times, we are able to track our performance
improvements for this calculation over several years. The makespan (total wall-
clock time needed to complete the calculation) and the required computational
resources have continued to drop over the last several years. Table 2 shows the
makespan and the computational cost of four CyberShake hazard model
calculations.
As of 2009, the largest production CyberShake study had been completed
using TACC Ranger. In that year, after receiving our PRAC PressOn award, SCEC
began planning to migrate the CyberShake calculation to Blue Waters. Since Blue
Waters became operational in 2013, SCEC has completed two CyberShake
studies using Blue Waters: one in April 2013 (CyberShake 13.4) and a second study,
earlier this year, in February 2014 (CyberShake 14.2).
We migrated our CyberShake calculations to Blue Waters in two phases. Our
CyberShake 13.4 study used the Cray XE6 nodes on Blue Waters to perform the first part
of the CyberShake workflow including the large parallel SGT jobs, and we used TACC
Stampede nodes to perform the second, high-throughput part of the workflow.
For CyberShake 14.2, we executed both parts of the CyberShake workflow on
Blue Waters. This transition was made possible by the deployment of GRAM on Blue
Waters, enabling both phases of CyberShake processing to be executed on Blue
Waters.
As we migrated CyberShake onto Blue Waters, Blue Waters' large node
count and large scratch disk space enabled us to increase the size of a CyberShake
study from 1 hazard model (286 hazard curves) to 4 hazard models (1,144 hazard
curves). We took advantage of the Cray XK7 nodes on Blue Waters to accelerate the
floating-point intensive SGT calculations by using the recently developed CUDA GPU
implementation of our SGT software component.
Implementing the CyberShake computational system on Blue Waters
involved domain scientists, computer scientists, software developers, and
middleware developers. The Blue Waters user support model emphasizes
collaborative efforts between PRAC scientific teams and NCSA technical teams, so
our CyberShake team included both SCEC geoscience domain experts and NCSA HPC
resource providers. Development efforts required technical coordination between
groups, judicious introduction of new GPU software, improved scientific software
components, increased end-to-end workflow-based automation, and Blue Waters-specific workflow optimizations.
Data management and staging were handled automatically by the workflow
tools. A single CyberShake run for one site produces approximately 52 GB of output
data, of which 12 GB is staged back to SCEC storage at the Center for High
Performance Computing and Communications (HPCC) at the University of Southern
California (USC). In addition, approximately 730 GB of temporary storage is used
per site. For each CyberShake study, a total of over 850 TB of data is processed
using SCEC’s workflow tools on Blue Waters.
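These per-site figures are consistent with the per-study total, under the simplifying assumption of one CyberShake run per hazard curve (1,144 in Study 14.2):

```python
runs = 1144             # hazard curves in CyberShake Study 14.2
output_gb = 52          # output data per run
temp_gb = 730           # temporary storage per run
total_tb = runs * (output_gb + temp_gb) / 1000.0
print(round(total_tb))  # ~895 TB, consistent with "over 850 TB"
```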
SCEC’s 2009 study using TACC Ranger required a total makespan time of
1,207 hours (50.3 days). SCEC’s 2013 CyberShake Study 13.4 involved the
calculation of 1,144 hazard curves – over five times larger – and was completed in
1,467 hours (61.1 days) using Blue Waters and Stampede.
SCEC’s 2014 CyberShake Study 14.2 used both XE6 and XK7 nodes on Blue
Waters together with SCEC’s scientific workflow tools. CyberShake Study 14.2 was
able to produce 1,144 hazard curves in 342 hours (14.3 days), a 77% reduction in
makespan relative to the split (Blue Waters/Stampede) approach. We believe these
CyberShake application-level performance improvements highlight both the highly
productive computing environment of Blue Waters and the benefits of scientific
workflow tools.
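The makespan improvement follows directly from the two study figures reported above:

```python
split_hours = 1467      # Study 13.4: Blue Waters XE6 + TACC Stampede
bw_only_hours = 342     # Study 14.2: entirely on Blue Waters
reduction = 1 - bw_only_hours / split_hours
print(f"{reduction:.0%}")  # 77%
```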
2.4) Key outcomes or other achievements.
A key outcome from our Blue Waters research this year is the use of SCEC
Blue Waters research results, specifically our CyberShake 14.2 research results, by
the SCEC Committee on Utilization of Ground Motion Simulations (UGMS). Members
of this committee are practicing Civil Engineers responsible for the development of
long-period response spectral acceleration maps for the Los Angeles region for
inclusion in NEHRP and ASCE 7 Seismic Provisions and in Los Angeles City Building
Codes.
SCEC researchers met with this committee in Sept 2014 to develop the
procedures needed to extend the CyberShake 14.2 results into measures needed for
engineering purposes. Specifically, we have extended the CyberShake 14.2 results to
calculate Risk-Targeted Maximum Considered Earthquake Response Spectra
(MCER) at 14 existing CyberShake 14.2 sites in southern California. For 14 selected
sites, we have calculated both Probabilistic and Deterministic MCER curves.
Examples of these two types of curves, using CyberShake 14.2 results, for a site in
Los Angeles near Century City Plaza (CCP) are shown in Figure 2.
We view this work as a key outcome from our PRAC research because this is
the first use of physics-based hazard estimates by Building Code developers, and it
represents an opportunity to use computation-based methods to improve the seismic
hazard estimation methods used by the building engineering community, a result with
potentially transformative broad impact.
3. What opportunities for training and professional development has the
project provided?
By providing access to Blue Waters, and travel funds to participate in Blue
Waters meetings, our PRAC proposal has provided opportunities for training and
professional development to project members.
UCSD PhD student Efecan Poyraz made important contributions to this
project during his PhD work. He presented his work at conferences, including a
Blue Waters meeting in 2013, and he gained both technical and collaboration skills
during the project. Efecan accepted a position with Google and began work there in
September 2014.
At Carnegie Mellon University, postdoctoral mentoring was part of the
activities to help prepare Dr. Ricardo Taborda for an academic career. Taborda
secured a position as a new Assistant Professor at the University of Memphis (U of
M), starting August 2013. At the U of M, Taborda has joined the faculty of the Civil
Engineering Department in the School of Engineering and has a joint tenure-track
appointment with the Center for Earthquake Research and Information (CERI). CERI
is a University of Memphis Center of Excellence. We expect that Taborda will
continue to collaborate with SCEC and CMU from CERI. Two other graduate
students, Haydar Karaoglu and Doriam Restrepo participated in the research
activities at CMU. Doriam Restrepo has successfully defended his Ph.D. thesis and
joined the faculty at the University of EAFIT in Medellin, Colombia, and Haydar
Karaoglu completed his Ph.D. studies in May 2014.
4. How have the results been disseminated to communities of interest?
SCEC PressOn project members have presented and discussed our work in a
series of geoscientific and computational science meetings and workshops during the
project including the following:
1. Fall AGU, Dec 9-13, 2013 – San Francisco, CA
2. NSF HPC Advisory Meeting, 3 March 2014, Arlington, VA
3. IRIS Data Management Meeting, 20 March 2014, Los Angeles, CA
4. SSA, April 30-May 2, 2014, Anchorage, Alaska
5. Blue Waters Symposium, 14 May 2014, NCSA, IL
6. XSEDE/PRACE International HPC Summer School, June 4, 2014, Budapest,
Hungary
7. Oak Ridge Leadership Computing Facility Users Meeting 22 July 2014, Oak Ridge
Tenn.
8. National Conference on Earthquake Engineering July 21-25, 2014 Anchorage,
Alaska
9. SCEC Earthquake Ground Motion Simulation Meeting, 7 Sept 2014, Palm Springs
CA
10. SCEC Annual Meeting, Sept 8-10, 2014 Palm Springs, CA
11. SC14, Nov 17-20, 2014 New Orleans LA
12. SCEC Committee for Utilization of Ground Motion Simulations Meeting,
December 3, 2014 USC, Los Angeles CA
Our PRAC team has also worked with NCSA, USC, and other communications experts
to develop scientific summaries of our recent accomplishments and impact. Several
press articles, including NCSA articles, included descriptions of SCEC’s use of Blue
Waters. Links to these online press articles include the following:
http://www.ncsa.illinois.edu/news/story/do_the_wave
- NCSA Article on SCEC Seismic Hazard
http://www.ncsa.illinois.edu/news/story/streamlining_simulation
- NCSA Article on SCEC Workflows
http://www.cnn.com/2014/03/29/us/california-earthquake/
- Interview with Scientists, with SCEC Animations
http://losangeles.cbslocal.com/2014/03/30/7-5-quake-on-puente-hills-thrustfault-could-be-disastrous/
- Interview with SCEC Director
http://www.ncsa.illinois.edu/news/story/blue_waters_symposium_2014_an_all_aro
und_success
- Blue Waters Symposium A Success in March 2014
http://www.hpcwire.com/soundbite/taccs-stampede-supercomputer-first-yearreview/
- TACC Stampede Year 1 HPCWire Interview Includes SCEC
http://dornsife.usc.edu/news/stories/1603/strides-in-earthquake-science/
- USC Dornsife Earthquake System Science
http://m.laweekly.com/los-angeles/blogs/Post?basename=the-big-one-quake-willhit-la-harder-than-we-thought-scientistssay&day=27&id=informer&month=01&year=2014
- Beroza Ambient Noise Study
http://www.latimes.com/science/sciencenow/la-sci-sn-virtual-earthquake-oceanwaves20140124,0,4506840.story?track=rss&utm_source=feedburner&utm_medium=feed
&utm_campaign=Feed%3A+latimes%2Fmostviewed+%28L.A.+Times++Most+Viewed+Stories%29#axzz2rO3ClMsh
- Beroza Ambient Noise Study
http://universe.sdsu.edu/sdsu_newscenter/news.aspx?s=74691
- SDSU Basin Response Research for Vancouver
http://news.usc.edu/#!/article/58305/earthquake-science-in-the-era-of-big-data/
- USC Article On SCEC Research
http://s1018582977.t.en25.com/e/es.aspx?s=1018582977&e=679&elq=316f57a9
63f44872bc9f38127e314424
- Cray HPC Coverage of AWP-ODC-GPU
http://hypocenter.usc.edu/research/CME/SCEC_Media_Briefing_20_Years_Since_No
rthridge_01_09_2014.m4v
- SCEC Director Media Presentation
http://www.huffingtonpost.com/2014/01/10/northridge-earthquaketechnology_n_4572906.html
- Article about SCEC Contributions 20 Years after the Northridge Earthquake
SCEC has hosted a series of computational workshops on ground motion
simulation and modeling issues including a SCEC Ground Motion Simulation
Validation (GMSV) Technical Activity Working Group (Sept 2014) meeting, and
SCEC Committee for Utilization of Ground Motion Simulations (UGMS) (Dec 2014)
meeting. Details from these meetings are posted on the public SCECpedia wiki.
http://scec.usc.edu/scecpedia/SCEC_UGMS_Committee_Meeting
http://collaborate.scec.org/gmsv/Main_Page
SCEC researchers make use of wikis to communicate our research among
ourselves. By default, research communications are open. Many SCEC PressOn
research efforts, including the wave propagation simulation work are described in
some detail on the SCEC wiki with the following home page:
http://scec.usc.edu/scecpedia
Google analytics for this site from 1 October 2013 through 30 September
2014 show 7,574 visits, from 3,496 unique visitors, with 28,174 pageviews,
averaging 3.72 pages/visit with an average of 4:39 minutes per visit for a total of
35,219 minutes (586 hours) of visitor viewing time last year.
5. What do you plan to do during the next reporting period to accomplish the
goals?
This is our final project report.
6. Products - What has the project produced?
Publication List:
[1] Baker, J. W., Luco, N., Abrahamson, N. A., Graves, R. W., Maechling, P. J., Olsen, K.
B. (2014) Engineering Uses of Physics-based Ground Motion Simulations,
Proceedings of the 10th National Conference on Earthquake Engineering, July 21-25,
2014, Anchorage Alaska, Earthquake Engineering Research Institute, 11 p.
[2] Böse, M., R. Graves, D. Gill, S. Callaghan and P. Maechling (2014) CyberShake-derived ground-motion prediction models for the Los Angeles region with
application to earthquake early warning. Geophysical Journal International, 198 (3).
pp. 1438-1457. ISSN 0956-540X
[3] Lee, E.-J., P. Chen, T. H. Jordan, P. B. Maechling, M. A. M. Denolle, and G. C. Beroza
(2014), Full-3-D tomography for crustal structure in Southern California based on
the scattering-integral and the adjoint-wavefield methods, J. Geophys. Res. Solid
Earth, 119, 6421–6451, doi:10.1002/2014JB011346.
[4] Poyraz, E., H. Xu, and Y. Cui (2014). I/O Optimizations for High
Performance Scientific Applications, Procedia Computer Science Volume 29, 2014,
Pages 910–923 2014 International Conference on Computational Science
doi:10.1016/j.procs.2014.05.082
[5] Restrepo Doriam, and Bielak Jacobo (2014), Virtual topography: A fictitious
domain approach for analyzing free-surface irregularities in large-scale earthquake
ground motion simulation, Int. J. Numer. Meth. Engng, 100, 504–533,
doi:10.1002/nme.4756.
[6] Roten, D., K.B. Olsen, S.M. Day, Y. Cui, and D. Fah (2014), Expected seismic
shaking in Los Angeles reduced by San Andreas fault zone plasticity, Geophysical
Research Letters, 41, doi:10.1002/2014GL059411, 201
[7] Wang, F., and T. H. Jordan (2014), Comparison of probabilistic seismic-hazard
models using averaging-based factorization, Bulletin of the Seismological Society of
America June 2014 vol. 104 no. 3 1230-1257 doi: 10.1785/0120130263
[8] Proceedings of the 2014 Blue Waters Symposium (2014)
[9] Deelman, E., K. Vahi, G. Juve, M. Rynge, S. Callaghan, P. J. Maechling, R. Mayani,
W. Chen, R. Ferreira da Silva, M. Livny, and K. Wenger (2014) Pegasus, a workflow
management system for science automation, Future Generation Computer Systems,
Oct 2014, doi:10.1016/j.future.2014.10.008
Software and Data Products:
1. Simulation results, as seismograms, ground motion amplitudes, rupture
definitions, and maps from the CyberShake 14.2 Hazard Model generated using
Blue Waters are posted online and available for researchers.
2. Open-source software distribution of the Unified California Velocity Model
(UCVM) software released in March 2014, available on SCEC wiki.
3. Hercules is made available upon request via the GitHub code-hosting platform.
Versions of Hercules have been shared with researchers at USC and Argonne
National Lab.
4. AWP-ODC-GPU software has been provided to DOE researchers at the Oak Ridge
   Leadership Computing Facility, and to Cray Inc. to support their studies into
   efficient GPU programs.
7. Participants & Other Collaborating Organizations - Who has been involved?
The following individuals have worked on the project. In some cases, such as the
USGS researchers, no funding has been provided to these researchers although they
have played an active role in this research.
1. Shima Azizzadeh - szzdhrd@memphis.edu - Civil Engineering Grad Student
2. Jacobo Bielak - jbielak@cmu.edu - Professor Civil Engineering, Hispanic
3. Scott Callaghan - scottcal@usc.edu - Research Staff Computer Science
4. Po Chen - pchen@uwyo.edu - Geoscience Professor
5. Dong Ju Choi - dchoi@sdsc.edu - Computer Science Grad Student
6. Yifeng Cui - yfcui@sdsc.edu - Research Staff Computer Science
7. Steven Day - sday@mail.sdsu.edu - Professor Geophysics
8. Brittany Erickson - berickson@projects.sdsu.edu - Geosciences Grad Student
9. David Gill - davidgil@usc.edu - Research Staff Computer Science
10. Yigit Isbiliroglu - isbiliroglu@cmu.edu - Civil Engineering Grad student
11. Thomas Jordan - tjordan@usc.edu - Professor Geophysics
12. Gideon Juve - juve@isi.edu - Research Staff Computer Science
13. Haydar Karaoglu - hkaraogl@andrew.cmu.edu - PhD Student Civil Engineer
14. Naeem Khoshnevis - nkhshnvs@memphis.edu - Civil Engineering Grad Student
15. En-Jui Lee - enjuilee@usc.edu - Post-doctoral scholar - Geosciences
16. Philip Maechling - maechlin@usc.edu - Research Staff Computer Science
17. John McRaney - mcraney@usc.edu - Research Staff Geophysics
18. Kevin Milner - kmilner@usc.edu - Research staff computer science
19. Dawei Mu - dmu@uwyo.edu - Geoscience Grad student
20. Shiying Nie - nie@rohan.sdsu.edu - Geoscience Grad student
21. Kim Olsen - kbolsen@mail.sdsu.edu - Professor Geophysics
22. Efecan Poyraz - efecanpoyraz@gmail.com - Google, Inc.
23. Doriam Restrepo - drestrep@andrew.cmu.edu - PhD Student Geophysics
24. Daniel Roten - droten@sdsc.edu - Post-doctoral scholar
25. William Savran - wsavran@gmail.com - Geoscience Grad Student
26. Zheqiang Shi - zshi@projects.sdsu.edu - Post-doc, geoscience, male
27. Liwen Shi - shih@uhci.edu - Computer Science professor
28. Patrick Small - patrices@usc.edu - M.S. Graduate Student Computer Science
29. Xin Song - xinsong@usc.edu - PhD Student Geophysics
30. Ricardo Taborda - ricardo.taborda@memphis.edu - Professor of Civil Engineer
31. Karan Vahi - vahi@isi.edu - Computer science research staff
32. Kyle Withers - quantumkylew@aol.com - PhD Student Geophysics
33. Heming Xu - h1xu@sdsc.edu - computer science research staff
What other collaborators or contacts have been involved?
1. Gregory H. Bauer - NCSA
2. Timothy Bouvet - NCSA
3. Robert Graves - U.S. Geological Survey
4. William T. Kramer - NCSA
5. John Levesque - Cray Inc.
6. Omar Padron, Jr. - NCSA
7. Feng Wang - AIR Worldwide
8. Impact - What is the impact of the project? How has it contributed?
8.1 What is the impact on the development of the principal discipline(s) of the
project?
SCEC’s PRAC research has helped bridge the gap between geoscientists (who
perform ground motion simulations) and civil engineers (who use ground motion
simulations to evaluate building safety). This collaboration has led to quantitative
procedures for evaluating new ground motion simulation methods.
We believe that the SCEC PressOn PRAC project has shown how physics-based
computational models, observation-based 3D earth structure models, and
high performance computing can improve seismic hazard forecasts.
The project has supported the efforts of SCEC to translate basic research into
practical products for reducing risk and improving community resilience.
Specifically, this work helped reduce the total uncertainty in long-term hazard
models, which has important practical consequences for the seismic provisions in
building codes and especially for critical-facility operators. For example, PG&E faces
very high costs (potentially tens of billions of dollars) in seismic retrofit expenses
associated with its immoveable facilities, including many dams across California and
its Diablo Canyon Nuclear Power Plant on the central California coast. In addition,
realistic earthquake simulations can improve new systems now being developed by
the USGS for operational earthquake forecasting and earthquake early warning. As a
result of our PRAC research activities, both PG&E and the USGS are involved in
computation-based research, using NSF HPC resources to improve risk
reduction.
8.2 What is the impact on other disciplines?
Our SCEC PRAC research has produced advances in seismology, in
engineering, and in computer science. The SCEC PRAC project has made an impact in
the computer science field in several areas.
We worked with the Blue Waters staff as early users of remote job
submission to Blue Waters, a capability we needed to support our large-scale
workflows. We have developed and optimized workflow tools, specifically Pegasus
MPI-Cluster (PMC), that enable us to run large, many-task workflows on Blue
Waters.
We have developed computational techniques to optimize parallel CUDA
codes running on heterogeneous CPU/GPU systems. We have developed a new
computing technique, called workflow co-scheduling, that runs computations on the
GPU portion of a node and post-processing on the CPU portion, a potentially highly
efficient use of heterogeneous systems.
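A minimal sketch of the co-scheduling idea follows; the function names and batch contents are hypothetical stand-ins, and Python threads here merely model the overlap of GPU compute on one batch with CPU post-processing of the previous one.

```python
from concurrent.futures import ThreadPoolExecutor

def gpu_simulate(batch):
    # stand-in for a CUDA kernel launch on the node's GPU
    return [x * 2 for x in batch]

def cpu_postprocess(result):
    # stand-in for seismogram post-processing on the node's CPUs
    return sum(result)

batches = [[1, 2], [3, 4], [5, 6]]
totals = []
with ThreadPoolExecutor(max_workers=2) as pool:
    gpu_future = pool.submit(gpu_simulate, batches[0])
    for nxt in batches[1:]:
        done = gpu_future.result()
        gpu_future = pool.submit(gpu_simulate, nxt)  # GPU starts the next batch...
        totals.append(cpu_postprocess(done))         # ...while CPUs post-process
    totals.append(cpu_postprocess(gpu_future.result()))
print(totals)  # [6, 14, 22]
```

The design point is that neither resource sits idle: as soon as a GPU batch completes, the next one is launched before the CPU-side post-processing begins.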
SCEC PRAC researchers also have an impact by participating on national and
international HPC advisory groups including XSEDE Advisory Board, NEES
Cyberinfrastructure Advisory Committee, XSEDE allocation review committee, and
the Global Earthquake Model Scientific Board.
8.3 What is the impact on the development of human resources?
The interdisciplinary computational science research activities on the SCEC
PressOn PRAC project are excellent preparation for many types of work. Graduate
students and research staff that have made significant contributions to SCEC
computational research have gone on to computational science careers including
positions with Amazon.com, Microsoft, Intel, Argonne National Laboratory, Google,
and AIR Worldwide.
SCEC PressOn team members perform outreach to K-12 students and involve
under-represented students through the established programs at each of the
universities and through the NEES Education, Outreach, and Training
activities. SCEC PressOn team members describe our computing techniques at
International HPC summer schools sponsored by XSEDE and PRACE, including a
summer school in 2014 in Budapest, Hungary.
The project team has a consistent track record of disseminating research
results to wider technical communities, through, amongst others, many fruitful
collaborations with key organizations including SCEC, PEER, NEES, USGS, and FEMA.
Graduate students have the opportunity to create new knowledge through multidisciplinary research in civil engineering, computational engineering, and computer
science.
SCEC Intern programs continue to cross-train students in geoscience and
computer science. SCEC’s current software staff includes 2 developers who first
participated in SCEC as undergraduate interns.
8.4 What is the impact on physical resources that form infrastructure?
SCEC has established a Committee for Utilization of Ground Motion
Simulations (UGMS) to provide scientific and engineering guidance to SCEC
computational programs to ensure the computational results are useful for
engineering purposes. Members of the UGMS include members from USGS and
commercial companies, including members of the California Building Code
development organization. This group is interested in using SCEC CyberShake
results in future California Building Code development. If the California Building Code
developers use CyberShake results, our computational work will have a broad impact
on California physical infrastructure.
The methodology, applications, and the results of the SCEC PressOn project
are useful for disaster planning and management, since it is the detailed knowledge
of how an urban system performs in a large earthquake that is needed for improving
disaster planning and preparation, and for evaluating mitigation policies.
Because of the critical role that ground motion and structural response plays in
infrastructure design, the accelerated availability of simulation methodologies will
help improve public safety and welfare.
8.5 What is the impact on institutional resources that form infrastructure?
None expected
8.6 What is the impact on technology transfer?
As well-validated, open-source, scientific software, the SCEC community
codes developed and optimized on the SCEC PressOn PRAC project, including AWP-ODC, AWP-ODC-GPU, UCVM, and Hercules, have attracted interest from private
researchers and commercial companies for use in seismic hazard analysis research.
By releasing our software as open-source scientific tools, we help to transfer HPC
computing into seismic hazard research.
8.7 What is the impact on society beyond science and technology?
SCEC’s PressOn PRAC project has potential broad impact beyond science and
technology through improved public seismic safety, by advancing and improving the
practice of seismic hazard analysis, and by working to integrate, evaluate, and adopt a
much more computationally oriented approach than is currently used. Our research
provides engineers with accurate and complete information about earthquake-generated strong ground motions that is not currently available from observed ground
motion data.
The implementation of SCEC’s PRAC research for practical purposes depends
on effective interactions with engineering researchers and organizations and with
practicing engineers, building officials, insurers, utilities, emergency managers, and
other technical users of earthquake information. Over the course of our PRAC
project, we have engaged with these organizations to put our research results into
practical use.
The PRAC research and software tools have the potential for large societal
impact. In the United States, the USGS is responsible for providing official seismic
hazard information, including seismic hazard forecasts, to regulatory agencies and
the public. SCEC PressOn research includes significant contributions from USGS
personnel and our close connection to USGS seismic hazards programs provides an
opportunity for our results to impact national seismic hazard estimates.