34 results for Earthquake Rupture
in University of Queensland eSpace - Australia
Abstract:
An appreciation of the physical mechanisms which cause observed seismicity complexity is fundamental to the understanding of the temporal behaviour of faults and single slip events. Numerical simulation of fault slip can provide insights into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics of those processes, which may lead towards an answer to those questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in computer power and the ability to exploit parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law, with no rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane with a constant velocity at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between the events. In some of the larger events highly complex slip patterns are observed.
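The slip criterion described above can be illustrated with a minimal, quasi-static spring-slider sketch, assuming a rate-independent Coulomb friction law and a single fault patch loaded elastically by constant-velocity driving edges; all parameter values and the complete stress drop at failure are illustrative assumptions, not the model's calibration.

```python
import numpy as np

# Minimal spring-slider sketch of the slip criterion: shear stress builds up
# under constant-velocity driving and a slip event occurs once it exceeds the
# frictional strength mu * sigma_n. Parameter values are illustrative only.

mu = 0.6          # intrinsic coefficient of friction (rate-independent)
sigma_n = 1.0e6   # normal stress on the fault [Pa]
k = 1.0e9         # effective loading stiffness [Pa/m]
v_drive = 1.0e-6  # constant driving velocity [m/s]
dt = 1.0          # time step [s]

tau = 0.0         # shear stress carried by the fault patch
events = []       # (time, stress drop) of each slip event

for step in range(200_000):
    tau += k * v_drive * dt          # elastic loading by the driving edges
    if tau > mu * sigma_n:           # frictional strength exceeded -> slip
        events.append((step * dt, tau))
        tau = 0.0                    # complete stress drop (simplest choice)

print(f"{len(events)} slip events")
```

A full particle-based model replaces the single patch with many interacting particles, which is what produces the variability in event size and slip pattern reported above.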
Abstract:
Statistical tests of Load-Unload Response Ratio (LURR) signals are carried out in order to verify the statistical robustness of previous studies using the Lattice Solid Model (MORA et al., 2002b). In each case 24 groups of samples with the same macroscopic parameters (tidal perturbation amplitude A, period T and tectonic loading rate k) but different particle arrangements are employed. Results of uni-axial compression experiments show that before the normalized time of catastrophic failure, the ensemble average LURR value rises significantly, in agreement with observations of high LURR prior to large earthquakes. In shearing tests, two parameters are found to control the correlation between earthquake occurrence and tidal stress. The first, A/(kT), controls the phase shift between the peak seismicity rate and the peak amplitude of the perturbation stress; as this parameter increases, the phase shift decreases. The second, AT/k, controls the height of the probability density function (PDF) of the modeled seismicity; as this parameter increases, the PDF becomes sharper and narrower, indicating strong triggering. Statistical studies of LURR signals in the shearing tests also suggest that, except in strong triggering cases where LURR cannot be calculated due to poor data in the unloading cycles, larger events are more likely than smaller ones to occur in periods of high LURR, supporting the LURR hypothesis.
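For reference, a hedged sketch of how a LURR value is commonly computed from a catalog, using the usual definition Y = X+/X- in which X+ and X- sum a power m of event energies during the loading and unloading phases of the perturbing stress; the synthetic catalog, the sinusoidal stressing function and the exponent m below are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

# Y = X+ / X-, where X+ (X-) sums E^m over events occurring while the
# perturbing (e.g. tidal) stressing rate is positive (negative).

def lurr(event_times, event_energies, stress_rate, m=0.5):
    e_m = np.asarray(event_energies) ** m
    loading = stress_rate(np.asarray(event_times)) > 0.0   # loading phase
    x_plus = e_m[loading].sum()
    x_minus = e_m[~loading].sum()
    # poor data in unloading cycles makes the ratio undefined (cf. abstract)
    return np.inf if x_minus == 0.0 else x_plus / x_minus

# Example: sinusoidal tidal perturbation of amplitude A and period T
A, T = 1.0e3, 12.4 * 3600.0
stress_rate = lambda t: A * (2 * np.pi / T) * np.cos(2 * np.pi * t / T)

rng = np.random.default_rng(0)
times = rng.uniform(0.0, 30 * 24 * 3600.0, size=500)      # 30 days of events
energies = 10.0 ** rng.uniform(3.0, 8.0, size=500)
print("LURR =", lurr(times, energies, stress_rate))
```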
Abstract:
The Accelerating Moment Release (AMR) preceding earthquakes with magnitude above 5 in Australia that occurred during the last 20 years was analyzed to test the Critical Point Hypothesis. Twelve earthquakes in the catalog were chosen based on a criterion for the number of nearby events. Results show that seven sequences with numerous events recorded leading up to the main earthquake exhibited accelerating moment release. Two occurred close in time and space to other earthquakes that were preceded by AMR. The remaining three sequences had very few events in the catalog, so the lack of AMR detected in the analysis may be related to catalog incompleteness. Spatio-temporal scanning of the AMR parameters shows that 80% of the areas in which AMR occurred experienced large events. In areas of similar background seismicity with no large events, 10 out of 12 cases exhibit no AMR, and the two others are false alarms in which AMR was observed but no large event followed. The relationship between AMR and the Load-Unload Response Ratio (LURR) was also studied. Both methods predict similar critical region sizes; however, the critical point time obtained from AMR is slightly earlier than the time of the critical point LURR anomaly.
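The AMR test referred to above is conventionally quantified by fitting the cumulative Benioff strain with the power law eps(t) = A + B (tf - t)^m and comparing its misfit with that of a straight line (a curvature ratio C < 1 indicating acceleration). The sketch below assumes this standard formulation with m and tf held fixed for simplicity (in practice both are optimized); the synthetic catalog and parameter choices are illustrative, not the paper's analysis.

```python
import numpy as np

def benioff_strain(magnitudes):
    # Gutenberg-Richter energy relation log10(E) = 1.5*M + 4.8 (E in joules)
    energy = 10.0 ** (1.5 * np.asarray(magnitudes) + 4.8)
    return np.cumsum(np.sqrt(energy))

def curvature_ratio(t, eps, tf, m=0.3):
    # linear least squares for A, B with m and tf fixed
    X = np.column_stack([np.ones_like(t), (tf - t) ** m])
    coef, *_ = np.linalg.lstsq(X, eps, rcond=None)
    rms_power = np.sqrt(np.mean((eps - X @ coef) ** 2))
    rms_linear = np.sqrt(np.mean((eps - np.polyval(np.polyfit(t, eps, 1), t)) ** 2))
    return rms_power / rms_linear          # < 1 suggests accelerating release

# synthetic sequence whose magnitudes accelerate towards tf = 10
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 9.9, 80))
mags = rng.uniform(3.0, 5.0, 80) + 0.5 * (t / 10.0) ** 4
print("C =", curvature_ratio(t, benioff_strain(mags), tf=10.0))
```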
Finite element analysis of fault bend influence on stick-slip instability along an intra-plate fault
Abstract:
Earthquakes have been recognized as resulting from stick-slip frictional instabilities along faults between deformable rocks. A three-dimensional finite-element code for modeling the nonlinear frictional contact behaviors between deformable bodies, based on a node-to-point contact element strategy, has been developed and is applied here to investigate the influence of fault geometry on the nucleation and development of stick-slip instability along an intra-plate fault. A typical fault bend model is used, in which a pre-cut fault is artificially bent by an angle of 5.6 degrees at the fault center. The numerical results demonstrate that the geometry of the fault significantly affects the nucleation, termination and restart of the stick-slip instability along the intra-plate fault, and that all of these instability phenomena can be well simulated using the current finite-element algorithm.
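A hedged sketch of the Coulomb stick-slip check that sits at the heart of frictional contact algorithms of this kind, written here with a simple penalty regularization: the trial tangential traction at a contact node is accepted if it lies inside the friction cone (stick) and otherwise projected back onto it (slip). The penalty formulation and the numbers are illustrative assumptions, not the node-to-point algorithm of the paper.

```python
import numpy as np

def contact_update(gap_n, slip_t_trial, k_n=1e12, k_t=1e12, mu=0.6):
    """Return (normal traction, tangential traction, state) for one contact node."""
    if gap_n >= 0.0:                      # open gap: no contact forces
        return 0.0, np.zeros(2), "open"
    t_n = -k_n * gap_n                    # penalty normal traction (compressive)
    t_t = k_t * slip_t_trial              # trial (stick) tangential traction
    if np.linalg.norm(t_t) <= mu * t_n:   # inside the friction cone -> stick
        return t_n, t_t, "stick"
    t_t = mu * t_n * t_t / np.linalg.norm(t_t)   # project onto the cone -> slip
    return t_n, t_t, "slip"

# a penetrating node with a tangential trial displacement large enough to slip
print(contact_update(gap_n=-1e-6, slip_t_trial=np.array([2e-6, 0.0])))
```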
Abstract:
We examine the event statistics obtained from two differing simplified models of earthquake faults. The first is a reproduction of the Block-Slider model of Carlson et al. (1991), a model often employed in seismicity studies. The second is an elastodynamic fault model based upon the Lattice Solid Model (LSM) of Mora and Place (1994). We performed simulations in which the fault length was varied in each model and generated synthetic catalogs of event sizes and times. From these catalogs, we constructed interval event-size distributions and inter-event time distributions. The larger, localised events in the Block-Slider model displayed the same scaling behaviour as events in the LSM; however, the distribution of inter-event times was markedly different. The analysis of both event-size and inter-event time statistics is an effective method for comparative studies of differing simplified models of earthquake faults.
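The two catalog statistics named above can be computed as sketched below from any catalog of event sizes and times; the catalog here is randomly generated for illustration and does not reproduce the output of either model.

```python
import numpy as np

# Interval (non-cumulative) event-size distribution and inter-event time
# distribution from a synthetic catalog. The power-law sizes and uniform
# event times are illustrative assumptions.

rng = np.random.default_rng(2)
sizes = rng.pareto(a=1.0, size=5_000) + 1.0        # power-law event sizes
times = np.sort(rng.uniform(0.0, 1.0e6, 5_000))    # event times [s]

# interval event-size distribution on logarithmic bins
bins = np.logspace(0.0, np.log10(sizes.max()), 30)
counts, edges = np.histogram(sizes, bins=bins)

# inter-event time distribution
dt = np.diff(times)
dt_counts, dt_edges = np.histogram(dt, bins=50)

for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    if n:
        print(f"size {lo:10.2f} - {hi:10.2f}: {n}")
print("mean inter-event time:", dt.mean(), "s")
```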
Abstract:
A statistical fractal automaton model is described which displays two modes of dynamical behaviour. The first mode, termed recurrent criticality, is characterised by quasi-periodic, characteristic events that are preceded by accelerating precursory activity. The second mode is more reminiscent of SOC automata in which large events are not preceded by an acceleration in activity. Extending upon previous studies of statistical fractal automata, a redistribution law is introduced which incorporates two model parameters: a dissipation factor and a stress transfer ratio. Results from a parameter space investigation indicate that a straight line through parameter space marks a transition from recurrent criticality to unpredictable dynamics. Recurrent criticality only occurs for models within one corner of the parameter space. The location of the transition displays a simple dependence upon the fractal correlation dimension of the cell strength distribution. Analysis of stress field evolution indicates that recurrent criticality occurs in models with significant long-range stress correlations. A constant rate of activity is associated with a decorrelated stress field.
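As an illustration of the kind of redistribution law described, the sketch below implements a toy automaton in which a failing cell dissipates a fraction of its stress and transfers the remainder between its nearest neighbours and the far field according to a stress transfer ratio. The grid size, the uniform (rather than fractal) strength distribution and the parameter values are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

N = 64
rng = np.random.default_rng(3)
stress = np.zeros((N, N))
strength = rng.uniform(0.5, 1.5, (N, N))   # heterogeneous cell strengths
                                           # (the paper uses a fractal distribution)
dissipation = 0.2     # fraction of released stress lost from the system
transfer_ratio = 0.8  # fraction of the surviving stress sent to neighbours

def redistribute(stress, i, j):
    released = stress[i, j]
    stress[i, j] = 0.0
    kept = (1.0 - dissipation) * released
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        stress[(i + di) % N, (j + dj) % N] += transfer_ratio * kept / 4.0
    # the remaining (1 - transfer_ratio) * kept goes uniformly to the far field
    stress += (1.0 - transfer_ratio) * kept / stress.size

# drive the system uniformly and relax all failures after each increment
for _ in range(2_000):
    stress += 1.0e-3
    while True:
        failing = np.argwhere(stress > strength)
        if failing.size == 0:
            break
        for i, j in failing:
            redistribute(stress, i, j)
```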
Abstract:
The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that, in order to take the next step towards more realistic experiments, it will be necessary to use models containing a significantly larger number of particles than current models, and such simulations will therefore require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly greater computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles show that the parallel implementation of the Lattice Solid Model achieves a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
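The quoted figure refers to the standard definition of parallel efficiency, E(p) = T(1) / (p T(p)), i.e. speedup divided by processor count; the timings in the sketch below are illustrative assumptions, not the published benchmark numbers.

```python
# Parallel efficiency: speedup S(p) = T(1)/T(p), efficiency E(p) = S(p)/p.

def efficiency(t_serial, t_parallel, n_procs):
    speedup = t_serial / t_parallel
    return speedup / n_procs

# e.g. a run taking 1000 s on 1 processor and 9.8 s on 128 processors
print(f"E = {efficiency(1000.0, 9.8, 128):.2f}")   # ~0.80, i.e. about 80%
```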
Abstract:
A scaling law is presented that provides a complete solution to the equations bounding the stability and rupture of thin films. The scaling law depends on the fundamental physicochemical properties of the film and interface to calculate bounds for the critical thickness and other key film thicknesses, the relevant waveforms associated with instability and rupture, and film lifetimes. Critical thicknesses calculated from the scaling law are shown to bound the values reported in the literature for numerous emulsion and foam films. The majority of critical thickness values are between 15 and 40% lower than the upper-bound critical thickness provided by the scaling law.
Abstract:
Despite decades of experimental and theoretical investigation of thin films, considerable uncertainty exists in the prediction of their critical rupture thickness. According to the spontaneous rupture mechanism, common thin films become unstable when capillary waves at the interfaces begin to grow. In a horizontal film with symmetry at the midplane, unstable waves from adjacent interfaces grow towards the center of the film. As the film drains and becomes thinner, the unstable waves osculate and cause the film to rupture. Uncertainty stems from a number of sources, including the theories used to predict film drainage and corrugation growth dynamics. In the early studies, the linear stability of small-amplitude waves was investigated in the context of the quasi-static approximation, in which the dynamics of wave growth and film thinning are separated. The zeroth-order wave growth equation of Vrij predicts faster wave growth rates than the first-order equation derived by Sharma and Ruckenstein. It has been demonstrated in an accompanying paper that the film drainage rates and times measured by numerous investigations are bounded by the predictions of the Reynolds equation and the more recent theory of Manev, Tsekov, and Radoev. Solutions to combinations of these equations yield simple scaling laws which should bound the critical rupture thickness of foam and emulsion films. In this paper, critical thickness measurements reported in the literature are compared to predictions from the bounding scaling equations, and it is shown that the retarded Hamaker constants derived from approximate Lifshitz theory underestimate the critical thickness of foam and emulsion films. The non-retarded Hamaker constant more adequately bounds the critical thickness measurements over the entire range of film radii reported in the literature. This result reinforces observations made by other independent researchers that interfacial interactions in flexible liquid films are not adequately represented by the retarded Hamaker constant obtained from Lifshitz theory and that the interactions become significant at much greater separations than previously thought.
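For context, the Reynolds (Stefan-Reynolds) drainage law referred to above, -dh/dt = 2 h^3 ΔP / (3 μ R^2), integrates to a simple drainage-time bound, t = 3 μ R^2 / (4 ΔP) (1/hc^2 - 1/h0^2). The sketch below evaluates this bound; the property values are illustrative assumptions, and the comparison with the Manev-Tsekov-Radoev theory and with measured critical thicknesses is not reproduced here.

```python
# Drainage time of a plane-parallel film from thickness h0 down to hc
# under the Stefan-Reynolds law. All inputs in SI units.

def reynolds_drainage_time(h0, hc, radius, delta_p, viscosity):
    return 3.0 * viscosity * radius**2 / (4.0 * delta_p) * (1.0 / hc**2 - 1.0 / h0**2)

# e.g. a 100-micron-radius aqueous film thinning from 1 micron to 30 nm
t = reynolds_drainage_time(h0=1e-6, hc=30e-9, radius=1e-4,
                           delta_p=50.0, viscosity=1e-3)
print(f"Reynolds drainage time ~ {t:.0f} s")
```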