Article

The SCEC TeraShake Earthquake Simulation

Abstract

The southern portion of the San Andreas fault, between Cajon Creek and Bombay Beach, has not seen a major event since 1690, and has therefore accumulated a slip deficit of 5-6 m. The potential for this portion of the fault to rupture in a single M7.7 event is a major component of seismic hazard in southern California and northern Mexico. TeraShake is a large-scale fourth-order finite-difference simulation of such an event based on Olsen's Anelastic Wave Propagation Model (AWM) code, conducted in the context of the Southern California Earthquake Center (SCEC) Community Modeling Environment (CME). The fault geometry is taken from the 2002 USGS National Hazard Maps. The kinematic slip function is transported and scaled from published inversions for the 2002 Denali (M7.9) earthquake. The three-dimensional crustal structure is the SCEC Community Velocity Model. The 600 km x 300 km x 80 km simulation domain extends from the Ventura Basin and Tehachapi region in the north to Mexicali and Tijuana in the south. It includes all major population centers in southern California, and is modeled at 200 m resolution using a rectangular, 1.8-giganode, 3000 x 1500 x 400 mesh. The simulated duration is 200 seconds, with a temporal resolution of 0.01 seconds and a maximum frequency of 0.5 Hz, for a total of 20,000 time steps. The simulation is planned to run at the San Diego Supercomputer Center (SDSC) on 240 processors of the IBM Power4 DataStar machine. Validation runs conducted at one-sixteenth (4D) resolution have shown that this is the optimal configuration in the trade-off between computational and I/O demands. The full run will consume about 18,000 CPU-hours. Each time step produces a 21.6 GByte mesh snapshot of the ground-motion velocity vectors over the entire mesh. A 4D wavefield containing 2,000 time steps, amounting to 43 TBytes of data, will be stored at SDSC. Surface data will be archived for every time step for synthetic-seismogram engineering analysis, totaling 1 TByte. The data will be registered with the SCEC Digital Library supported by the SDSC Storage Resource Broker (SRB). Data collections will be annotated with simulation metadata, which will allow data discovery operations through metadata-based queries. The binary output will be described using HDF5 headers. Each file will be fingerprinted with an MD5 checksum to preserve and validate data integrity. Data access, management, and data-product derivation will be provided through a set of SRB APIs, including Java, C, web service, and data grid workflow interfaces. High-resolution visualizations of the wave propagation phenomena will be produced under diverse camera views. The surface data will be analyzed online by remote web clients plotting synthetic seismograms. Data mining operations, spectral analysis, and data subsetting are planned as future work. The TeraShake simulation project has provided some insights about the cyberinfrastructure needed to advance computational geoscience, which we discuss.
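
The storage figures quoted above can be checked directly from the stated mesh dimensions, assuming three single-precision (4-byte) velocity components per node, which is what the 21.6 GByte snapshot size implies. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope check of the TeraShake storage figures,
# assuming three 4-byte (single-precision) velocity components per node.
NX, NY, NZ = 3000, 1500, 400   # mesh dimensions at 200 m resolution
BYTES_PER_NODE = 3 * 4         # vx, vy, vz as 32-bit floats

nodes = NX * NY * NZ                                   # 1.8e9 ("1.8 giganode")
snapshot_gb = nodes * BYTES_PER_NODE / 1e9             # ~21.6 GB per time step
wavefield_tb = 2_000 * snapshot_gb / 1e3               # ~43 TB for 2,000 stored steps
surface_tb = NX * NY * BYTES_PER_NODE * 20_000 / 1e12  # ~1.08 TB surface archive

print(f"{nodes:.1e} nodes, {snapshot_gb:.1f} GB/snapshot, "
      f"{wavefield_tb:.1f} TB wavefield, {surface_tb:.2f} TB surface")
```

The numbers reproduce the abstract's 1.8 giganodes, 21.6 GByte snapshots, roughly 43 TBytes for the stored 4D wavefield, and about 1 TByte for the per-step surface archive.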
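
The abstract names HDF5 headers and MD5 fingerprints but not the exact file schema, so the following is an illustrative sketch only, using h5py and hashlib; the dataset and attribute names are assumptions, not the actual TeraShake layout:

```python
import hashlib
import h5py
import numpy as np

def write_snapshot(path, velocity, step, dt=0.01, resolution_m=200):
    """Write one ground-motion snapshot with descriptive metadata.

    The dataset and attribute names are illustrative, not the
    TeraShake schema.
    """
    with h5py.File(path, "w") as f:
        f.create_dataset("velocity", data=velocity)  # shape (nx, ny, nz, 3)
        f.attrs["time_step"] = step
        f.attrs["dt_seconds"] = dt
        f.attrs["grid_spacing_m"] = resolution_m
        f.attrs["model"] = "SCEC CVM / Olsen AWM"

def md5_fingerprint(path, chunk_size=1 << 20):
    """Checksum the file in 1 MB chunks to validate data integrity."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Tiny stand-in mesh; the real snapshots are 3000 x 1500 x 400.
snap = np.zeros((30, 15, 4, 3), dtype=np.float32)
write_snapshot("snapshot_00000.h5", snap, step=0)
print(md5_fingerprint("snapshot_00000.h5"))
```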

... Using the methodologies above, subsurface structure with various levels of 3D heterogeneity can be incorporated in simulations of the propagation of seismic energy. The downside of these methods, however, is the relatively high computational cost that limits their use, since large-scale computer systems need to be employed to carry out large-scale simulations: e.g. the TeraShake simulation (Minster et al., 2004), the Tangshan earthquake simulations by Fu et al. (2017), etc. Furthermore, each of these methods has an intrinsic disadvantage, such as numerical dispersion in the finite element method (Bonilla, 2002) or the limitation to models with weak heterogeneities in the boundary element method (Semblat, 2011). ...
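
The finite-difference wavefield modeling referred to here, and used at fourth order by the TeraShake code, can be illustrated in one dimension. Below is a minimal sketch, not the AWM implementation: a 1D scalar wave equation stepped with a fourth-order spatial stencil and second-order leapfrog time integration; the grid size, wavespeed, and source are arbitrary toy values.

```python
import numpy as np

# Minimal 1D illustration of fourth-order finite-difference wave
# propagation (not the AWM code): u_tt = c^2 u_xx.
nx, dx, c = 1000, 200.0, 3000.0  # grid points, spacing (m), wavespeed (m/s)
dt = 0.4 * dx / c                # time step well inside the stability limit
u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                 # impulsive initial condition

for _ in range(500):
    # Fourth-order central approximation of u_xx on interior points:
    # u_xx[i] ~ (-u[i-2] + 16 u[i-1] - 30 u[i] + 16 u[i+1] - u[i+2]) / (12 dx^2)
    uxx = np.zeros(nx)
    uxx[2:-2] = (-u[:-4] + 16 * u[1:-3] - 30 * u[2:-2]
                 + 16 * u[3:-1] - u[4:]) / (12 * dx**2)
    u_next = 2 * u - u_prev + (c * dt)**2 * uxx
    u_prev, u = u, u_next        # leapfrog update

print(f"peak amplitude after 500 steps: {np.abs(u).max():.3e}")
```

The higher-order stencil is what keeps the numerical dispersion mentioned above manageable at the roughly ten grid points per minimum wavelength used in simulations of this kind.
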
Article
An identification of the faults responsible for the destructive earthquakes of 1894 in the Atalanti region was carried out through a novel application of 3D finite-difference wavefield modeling. Several faults proposed in the literature were tested in detailed 3D simulations, utilizing a detailed local 3D velocity model as well as the local topography. The assessment of the most probable sources for these events was based on the correlation of reported damage with the distribution of the simulated peak ground acceleration. Furthermore, the distribution of spectral amplitudes at higher frequencies, related to the resonant frequencies of the local buildings of that period, was also used as an indicator. The general effect of the local 3D subsurface structure on the propagation of the wavefield and the spatial distribution of ground motion was also investigated. The Malessina fault was identified as a probable source for the main event of 20/4/1894 based on the results of the 3D modeling, and the 3D structure was found to be a major contributing factor in the distribution of the simulated ground motion.
... Deep sedimentary basins in southern California form significant velocity structures in the crust. These basins are generally filled with thick (up to 12 km) sequences of relatively low-velocity, low-density sediments that have been shown to amplify seismic waves and localize hazardous ground shaking during large earthquakes (e.g., Bonamassa and Vidale, 1991; Frankel and Vidale, 1992; Bouchon and Barker, 1996; Olsen, 2000; Graves et al., 1998; Bielak et al., 1999; Aagaard et al., 2001; Komatitsch et al., 2004; Minster et al., 2004). Thus, the first step in our development of the USR was to generate accurate descriptions of the 3D geometry and velocity structures of the major basins. ...
... The simulation was run at SDSC on 240 processors of the IBM Power4 DataStar machine [9,10]. Validation runs conducted at one-sixteenth (4D) resolution have shown that this is the optimal configuration in the trade-off between computational and I/O demands. ...
Article
The Southern California Earthquake Center digital library publishes scientific data generated by seismic wave propagation simulations. The output from a single simulation may be as large as 47 Terabytes of data and 400,000 files. The total size of the digital library is over 130 Terabytes with nearly three million files. We examine how this scale of simulation output can be registered into a digital library built on top of the Storage Resource Broker data grid. We also examine the multiple types of interactive services that have been developed for accessing and displaying the multi-terabyte collection.
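
Neither abstract spells out the SRB calls behind these discovery services, so the sketch below is only a schematic illustration of metadata-based data discovery: it scans HDF5 snapshot files (as written in the earlier sketch) and keeps those whose attributes satisfy a query. In the real system the query would go to the SRB metadata catalog rather than to the files themselves, and all names here are hypothetical.

```python
import glob
import h5py

def discover(pattern, **query):
    """Return paths of HDF5 snapshot files whose attributes match `query`.

    A schematic stand-in for metadata-based discovery against the
    SCEC digital library; the production system queries the SRB
    catalog instead of opening each file.
    """
    matches = []
    for path in glob.glob(pattern):
        with h5py.File(path, "r") as f:
            if all(f.attrs.get(k) == v for k, v in query.items()):
                matches.append(path)
    return matches

# e.g. find all snapshots written with a 200 m grid spacing
print(discover("snapshot_*.h5", grid_spacing_m=200))
```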