SCEC Crustal Deformation Modeling Workshop, Caltech, June 10 – 12, 2002
The Crustal Deformation Modeling subgroup of the Fault Systems Working Group is
putting together a Community Finite Element Modeling (FEM) package for studies of
crustal deformation in Southern California. Mark Simons and Brad Hager ran a 2.5-day
workshop at Caltech, June 10 – 12, 2002, to kick off this process. The goals were to survey
the software currently available, to define the computational challenges, and to map
out a strategy for making rapid progress. Thirty-six scientists from 12 universities, NSF, the
USGS, JPL, Los Alamos National Laboratory, and the Geological Survey of Canada (GSC)
participated in the workshop. A summary of the most significant outcomes of the workshop
is given below. The agenda, participant list, group mission statement, timeline for
deliverables, and list of issues to be addressed are appended. Web site: http://bowie.mit.edu/fe
Keeping in mind the importance of modularity, it is useful to divide FEM
software into three parts: meshing, assembly of equations, and equation solving.
(Existing assemblers and solvers are relatively tightly coupled as packages, but such tight
coupling is neither required nor desired.) The Workshop was organized to investigate
existing meshers and assemblers/solvers in order to determine the relative strengths and
weaknesses of academic and commercial software packages. Before the Workshop, a
series of benchmark problems was designed to test the accuracy and efficiency of the
solvers; follow-on benchmarks were designed at the workshop to test the meshers.
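To make that three-way split concrete, here is a minimal sketch in Python of how the
pieces might be decoupled behind narrow interfaces; all class and function names are
hypothetical illustrations, not the API of any package evaluated at the Workshop.

from abc import ABC, abstractmethod

class Mesher(ABC):
    @abstractmethod
    def mesh(self, geometry):
        """Return (nodes, elements) discretizing the fault-block geometry."""

class Assembler(ABC):
    @abstractmethod
    def assemble(self, nodes, elements, rheology):
        """Return the stiffness matrix K and load vector f."""

class Solver(ABC):
    @abstractmethod
    def solve(self, K, f):
        """Return the displacement vector u satisfying K u = f."""

def run_model(mesher, assembler, solver, geometry, rheology):
    # Each stage sees only the previous stage's output, so any
    # conforming mesher/assembler/solver combination can be mixed in,
    # e.g. an existing serial assembler with a new parallel solver.
    nodes, elements = mesher.mesh(geometry)
    K, f = assembler.assemble(nodes, elements, rheology)
    return solver.solve(K, f)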
Evaluation of assemblers/solvers: No software package known to Workshop participants
has all of the components that will eventually be required: efficient meshing, realistic
rheologies, iterative solution of equations on distributed-memory computers, and an
open-source license. Every component is, however, implemented in at least one available
code, so all components could be evaluated. Results of the evaluation include:
1) ABAQUS, a commercial code with roots in geomechanics, has the most comprehensive
set of rheologies and basis functions, offers iterative solvers, and can run in parallel on
shared-memory machines. It is closed source, with no clear path to implementation on
distributed-memory computers.
2) Tecton 2.1 is open source and has a subset of the desired rheologies and an iterative
solver, but it does not run in parallel and has only linear hexahedral basis functions.
Combining its assembler with a parallel solver is a high priority.
3) GeoFest is open source, supports a subset of the desired rheologies, and has both
tetrahedral and hexahedral basis functions. A serial iterative solver will be implemented
soon, followed by a parallel solver.
4) GeoFEM is open source, has a comprehensive collection of basis functions, and runs in
parallel on distributed-memory machines, but the currently released version supports only
elastic rheology.
5) Benchmark comparisons were useful, demonstrating that parallel iterative solutions are
both fast and accurate, but also revealing differences in physics among the codes and
disagreements in calculated responses that are not yet fully understood. Benchmarks will
continue, both to validate codes and to assess cost vs. accuracy for various meshes and
basis functions.
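To keep the continuing benchmarks comparable across codes, a small harness along the
following lines could tabulate accuracy against a common reference solution together with
run time; this is a hedged sketch in Python, and the run labels, data layout, and reference
solution are assumptions for illustration, not part of any workshop benchmark.

import numpy as np

def relative_l2_error(u, u_ref):
    # Relative L2 misfit between a code's displacements, sampled at
    # shared probe points, and the agreed reference solution.
    return np.linalg.norm(u - u_ref) / np.linalg.norm(u_ref)

def report(runs, u_ref):
    # runs: dict mapping a label such as 'Tecton/hex-50k' to a pair
    # (displacement array, wall-clock seconds) for one code and mesh.
    for label, (u, seconds) in sorted(runs.items()):
        err = relative_l2_error(u, u_ref)
        print(f"{label:24s}  rel. L2 error = {err:.3e}  time = {seconds:8.1f} s")

Tabulating error against run time for a family of meshes makes the cost-vs.-accuracy
tradeoff mentioned above explicit.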
Meshing: John Shaw presented the approach that the SCEC USR group will use to
generate descriptions of fault geometries using triangular surfaces. A high-priority
requirement emerging from the Workshop is to convert the discontinuous fault segments
making up CFM-A into closed surfaces bounding blocks (the Community Block Model,
CBM). Carl Gable introduced the LAGriT script-oriented meshing package used by Los
Alamos; Andy Freed introduced the commercial Ideas software, which features a graphical
user interface. It seems clear that the most straightforward
interface between USR and FEM would be achieved using tetrahedral elements;
investigating the speed and accuracy of unstructured tetrahedral meshes is therefore a
high priority for future benchmarking studies.
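Because element quality controls both the speed and the accuracy being benchmarked, a
scalar per-element quality measure is a natural first diagnostic for candidate tetrahedral
meshes. The Python sketch below is illustrative; the particular normalization is our
choice, not a workshop standard. It scores a tetrahedron at 1.0 when regular and near 0.0
when it degenerates into a sliver.

import numpy as np

def tet_quality(p0, p1, p2, p3):
    # 6*sqrt(2)*volume / (rms edge length)**3: equals 1.0 for a
    # regular tetrahedron and approaches 0.0 for degenerate elements.
    volume = abs(np.dot(p1 - p0, np.cross(p2 - p0, p3 - p0))) / 6.0
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    l_rms = np.sqrt(np.mean([np.dot(e, e) for e in edges]))
    return 6.0 * np.sqrt(2.0) * volume / l_rms**3

# Example: a right-corner tet scores about 0.77, well short of regular.
# tet_quality(*[np.array(p, float) for p in [(0,0,0),(1,0,0),(0,1,0),(0,0,1)]])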
Perhaps the most important conclusion of the Workshop is that, given limited resources
and ongoing developments by other groups, the highest priority is to develop a realistic
mesh describing the fault system of southern California. We are aware of no other effort
to grid such a large region with such realism. The first step is to assure that the Gocad
output provided by the USR group is readily interpreted by the meshers. The figure
below, which shows a rendition via LAGriT of the Raymond Hill fault, demonstrates that
Gocad output is easily interpretable and the interface is straightforward.
Figure 1: Rendition of the Raymond Hill Fault surface of CFM-A. The fault surface
description was written out at Harvard by Gocad in an ASCII file. This file was read in
at Los Alamos and the surface rendered via LAGriT. Work is in progress to develop a
“microblock” model of the LA basin as a benchmark for mesh generation.
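As an indication of how light that interface can be, the following Python sketch parses
the vertex and triangle records of a Gocad TSurf ASCII file into arrays that a mesher
could consume. The VRTX/PVRTX and TRGL keywords are standard TSurf records, but real files
carry headers and property data that this sketch simply skips, and the file name below is
hypothetical.

import numpy as np

def read_tsurf(path):
    # Collect 'VRTX id x y z' vertices and 'TRGL i j k' triangles.
    verts, tris = {}, []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if not fields:
                continue
            if fields[0] in ("VRTX", "PVRTX"):
                verts[int(fields[1])] = [float(v) for v in fields[2:5]]
            elif fields[0] == "TRGL":
                tris.append([int(v) for v in fields[1:4]])
    # Re-index vertices densely so triangles reference array rows.
    order = {vid: i for i, vid in enumerate(sorted(verts))}
    xyz = np.array([verts[vid] for vid in sorted(verts)])
    triangles = np.array([[order[v] for v in t] for t in tris])
    return xyz, triangles

# Hypothetical usage with a CFM-A surface exported from Gocad:
# xyz, triangles = read_tsurf("raymond_hill.ts")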
Fault Systems Crustal Deformation Working Group: Mission Statement
1) Build tools to understand the response to single earthquakes, compare with geodetic
observations, infer rheology, and constrain structures
2) Build tools to simulate fault-system interaction and regional strain- and stress-field
evolution; produce results that would assist in the estimation or modeling of fault slip
and constrain the physics
3) Develop understanding of transient stress interaction among faults
4) Make realistic predictions of geologic features (e.g., topography, fault slip)
Fault Systems Crustal Deformation Working Group: Timeline for Deliverables
Complete by SCEC meeting 9/02:
1. Establish a continuously updated web site and listserv
2. Converge on current test cases (benchmarks)
3. Use idealized test cases to verify simulation of rheologies and time-dependent effects
a. Weak forms of the Maxwell, Kelvin-Voigt, standard linear solid, and Burgers-body
models, plus poroelasticity (Rick); see the sketch after this list
4. Reference solutions
5. Test cases for various mesh generators (quality/time/memory/processor)
6. CBM microblock model (smooth and actual)
7. Pursue IGPP hosting of a weeklong workshop in summer ’03 (Brad, Mark, Carl)
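For reference, a minimal sketch of what item 3a involves in the simplest setting, using
our own notation rather than anything circulated at the Workshop: the 1-D Maxwell element
splits the strain rate into elastic and viscous parts, and the weak form the FEM
assemblers build comes from testing quasi-static equilibrium against a virtual
displacement.

\begin{align}
  % 1-D Maxwell constitutive relation: elastic plus viscous strain rate
  \dot{\varepsilon} &= \frac{\dot{\sigma}}{E} + \frac{\sigma}{\eta}, \\
  % weak form of div(sigma) + f = 0 for all admissible test fields w
  \int_\Omega \boldsymbol{\sigma}(\mathbf{u}) : \nabla \mathbf{w}\, dV
  &= \int_\Omega \mathbf{f} \cdot \mathbf{w}\, dV
   + \int_{\partial\Omega_t} \mathbf{t} \cdot \mathbf{w}\, dS
  \qquad \forall\, \mathbf{w}.
\end{align}

The Kelvin-Voigt, standard-linear-solid, and Burgers-body cases change only the
constitutive relation in the first line; the weak form is unchanged.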
Complete by SCEC meeting 9/03:
1. Published, documented geometric model based on the USR, and a mesh to act as a
standard for future models.
2. 3-D physical model of the CBM (Carl)
3. Elastic solution for single EQ in full block model (JPL, Mark, Brad)
4. Define architecture ‘seeds’ and standards for a plug-and-play environment (JPL, Mark)
Complete by SCEC meeting 9/04:
1. Full running model with fine mesh for Northern L.A. and/or the Mojave
2. Constraints on rheologic structure for areas with geodetic data.
3. Coarse mesh publicly available (web-based), easily accessible, and designed for
compatibility with USR elements.
4. Ability for users to manipulate the year-1 mesh locally
5. Integration of multiple modules, including parallelism, multiple viscoelastic and fault
rheologies
Complete by SCEC meeting 9/06:
1. Similar to year 3, but including all of USR CFM-B. Run simulations that model
geodetic data and are consistent with past ruptures
Issues that need to be addressed:
Wish list:
1) Flexible GUI
2) Community-oriented infrastructure for plug and play
3) Facilitate comparison/inversion with real field data (partially through other
work being done at SCEC)
4) Blend public/commercial
Geometry Description:
- Translation of the USR group's output to a usable format (X, Y, Z)
- General block model developed from the fault model
Meshing approach:
- Faults that have not experienced recent activity do not need a fine mesh
- Size issues: numbers of nodes/elements, visualization
- Geometric complexities may slow processing (re-meshing abilities)
- Concentrate on tetrahedra (tets)
- Mesher-solver interaction
Solver:
- Iterative solver
Rheology requirements:
- Lower crust/upper mantle structure and properties
- General viscoelastic behavior
- Nonlinear, anisotropic, fault slip with rate-and-state friction, plasticity, poroelasticity
- Contacts vs. split/slippery nodes
- Ability to implement new rheologies (see the sketch after this list)
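One way to satisfy the last requirement is to hide each constitutive model behind a
single stress-update interface that the assembler calls element by element. The Python
sketch below uses invented names and a 1-D scalar form purely for illustration; it does
not describe how any of the evaluated codes is actually organized.

class Rheology:
    # Each constitutive model implements one stress-update method, so
    # adding a new rheology never requires touching the assembler.
    def stress(self, strain, strain_prev, stress_prev, dt):
        raise NotImplementedError

class Elastic(Rheology):
    def __init__(self, mu):
        self.mu = mu
    def stress(self, strain, strain_prev, stress_prev, dt):
        return 2.0 * self.mu * strain  # linear elasticity

class Maxwell(Rheology):
    def __init__(self, mu, eta):
        self.mu, self.eta = mu, eta
    def stress(self, strain, strain_prev, stress_prev, dt):
        # Backward-Euler update of the 1-D Maxwell element,
        # sigma_dot = 2*mu*strain_dot - (mu/eta)*sigma:
        relax = dt * self.mu / self.eta
        return (stress_prev + 2.0 * self.mu * (strain - strain_prev)) / (1.0 + relax)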
Benchmarks:
- Interacting faults
- Finish the benchmarks already started
- Realistic benchmarks from fault-surface models, gravitational loading, basic fault
slip, microblock model
- Short term vs. long term; commercial vs. existing codes
- Modularity between meshing/preprocessor and solver; developmental focus on the
preprocessor
Commercial/Existing/New Codes:
             Meshing    Solver/Postproc.
Commercial   Ideas      Abaqus, Ideas
Public       LAGriT     Tecton, GeoFEM, GeoFest
Pros and Cons for Commercial solver
P: Extremely powerful
C: Not open source
C: Unclear whether parallel processing is possible
Pros and Cons for Public solver
P: Open source
C: Important features may be missing
P: Possibility of parallel operation
P: Designed with geologic background
Pros and Cons Commercial Meshers
P: Ease of use
P: Mature Graphical Interface
C: Slow on larger problems
C: Not open source (can’t teach it new tricks), though certain functions can be scripted
C: Requires skilled/clever user
C: Limited geologic background
Pros and Cons Public Meshers
C: Steep learning curve
C: Poor graphical interface
P: Free
P: Partially open source
P: Strong geologic background/developments
WORKSHOP ON SOFTWARE FOR MODELING CRUSTAL
DEFORMATION
AGENDA
MONDAY, June 10, 2002
8:00 - 9:00    BREAKFAST PROVIDED
9:00 - 9:15    Greetings and logistics: Simons
               Goals of the workshop: Simons
9:15 - 9:45    Relevant rheologies (motivation and wishlist): O'Connell
Codes in use - current status
Academic:
9:45 - 10:00   Visco1d - Pollitz
10:00 - 10:15  GeoFest - Parker
10:15 - 10:30  BREAK
10:30 - 10:45  GAEA - Hearn
10:45 - 11:00  Tecton 2.1 - Williams (by proxy: Hager)
Earth Systems Simulator:
11:00 - 11:15 GeoFEM - Brad Hager
Commercial:
11:15 - 11:30 Adina - Hager
11:30 - 11:45 Abaqus - Kenner
11:45 - 12:00 Ideas - Freed
12:00 - 1:00   LUNCH (provided)
Benchmark Discussions
1:00 - 1:45    BM1 & BM2 & BM3 - Lyzenga
1:45 - 2:30    BM4 - Hearn
2:30 - 2:45    BREAK
2:45 - 3:30    BM5; BM6 - Kenner
3:30 - 3:45    BM7 - Brad Hager
3:45 - 5:00    Discussion
What have we learned?
What seem to be the tradeoffs?
Critical issues?
6:30           Dinner at the Athenaeum
TUESDAY, June 11, 2002
Models and meshes
8:00 - 9:00    BREAKFAST PROVIDED
9:00 - 10:00   SCEC/USR group report - Shaw
Their goals
Their plan of attack
Their software
Their product
10:00 - 10:15 BREAK
10:15 - 11:15 Issues in meshing - Gable
Reminder of common problems
Quick tutorial
The LAGRIT solution
Other solutions
11:15 - 11:30 Meshing with Ideas - Freed
11:30 - 11:45 Meshing in Geofest - Parker
11:45 - 12:00 Open discussion on meshing issues
12:00 - 1:00   LUNCH (provided)
1:00 - 2:00    Brainstorm on meshing and solver challenges
Define questions for breakout groups
2:00 - 4:00    Breakout into 4 groups to set priorities and pathways
               Take a BREAK around 3:00 at each group's discretion
4:00 - 5:00    Report from breakout groups
Dinner on your own
WEDNESDAY, June 12, 2002
7:30 - 8:00    BREAKFAST PROVIDED
8:00 - 9:00    New benchmark suggestions
9:00 - 10:00   Open discussion
10:00 - 11:00 Our plan for the future
Next meeting
Continued benchmarking
Web site
Code development
12:00          ADJOURN
WORKSHOP ON SOFTWARE FOR MODELING CRUSTAL
DEFORMATION
ATTENDEES
Michael Aivazis        aivazis@caltech.edu               Caltech
Wu-Lung Chang          wchang@mines.utah.edu             Utah
Rob Clayton            clay@gps.caltech.edu              Caltech
Andrea Donnellan       andrea@aig.jpl.nasa.gov           JPL
Chris DiCaprio         dicaprio@gps.caltech.edu          Caltech
Yuri Fialko            fialko@radar.ucsd.edu             SIO
Andy Freed             freed@seismo.berkeley.edu         Berkeley
Carl Gable             gable@vega.lanl.gov               LANL
Mike Gurnis            gurnis@gps.caltech.edu            Caltech
John (Jiangheng) He    he@pgc.nrcan.gc.ca                GSC
Brad Hager             brad@chandler.mit.edu             MIT
Liz Hearn              lizh@chandler.mit.edu             MIT
Shelley Kenner         skenner@uky.edu                   Kentucky
Rick O'Connell         oconnell@geophysics.harvard.edu   Harvard
Greg Lyzenga           lyzenga@thuban.ac.hmc.edu         Harvey Mudd
Erik Olson             erik@student.umass.edu            UMass
Jay Parker             jwp@cobra.jpl.nasa.gov            JPL
Fred Pollitz           fpollitz@usgs.gov                 USGS
Matthew Pritchard      matt@gps.caltech.edu              Caltech
Paul Segall            segall@pangea.stanford.edu        Stanford
John Shaw              shaw@eps.harvard.edu              Harvard
Mark Simons            simons@caltech.edu                Caltech
Jeroen Tromp           jtromp@gps.caltech.edu            Caltech
Ken Hudnut             hudnut@usgs.gov                   USGS
Brad Aagaard           baagaard@usgs.gov                 USGS
David Oglesby          david.oglesby@ucr.edu             UCR
Mark Legg              mark.legg@geology.sdsu.edu        SDSU
Bruce Julian           julian@usgs.gov                   USGS Menlo
William Savage         savage@usgs.gov                   USGS Golden
Ken Hurst              hurst@cobra.jpl.nasa.gov          JPL
Jim Whitcomb           jwhitcom@nsf.gov                  NSF
Tom Jordan             tjordan@usc.edu                   USC
Luc Lavier             luc@gps.caltech.edu               Caltech
John Lou               John.Z.Lou@jpl.nasa.gov           JPL
Teresa S. Baker        teresab@MIT.EDU                   MIT
Egill Hauksson         hauksson@gps.caltech.edu          Caltech