604 results for Earthquakes.


Relevance:

10.00%

Publisher:

Abstract:

The mountain ranges and coastlines of Washington State have steep slopes that are susceptible to landslides triggered by intense rainstorms, rapid snowmelt, earthquakes, and rivers and waves undercutting slopes. Over a 30-year timespan (1984-2014, a period that includes the State Route (SR) 530 disaster), 28 deep-seated landslides caused 300 million dollars of damage and 45 deaths (DGER, 2015). During that same timeframe, ten storm events triggered shallow landslides and debris flows across the state, resulting in nine deaths (DGER, 2015). The loss of 43 people, due to the SR 530 complex reactivating and moving at a rate and over a distance unexpected by residents, highlighted the need for an inventory of the state's landslides. With only 13% of the state mapped (Lombardo et al., 2015), the intention of this statewide inventory is to communicate hazards to citizens and decision makers. To compile an accurate and consistent landslide inventory, Washington needs to adopt a geographic information system (GIS) based mapping protocol. A mapping protocol provides consistency for measuring and recording information about landslides, such as the type of landslide, the material involved, and the size of the movement. The state of Oregon faces landslide problems similar to Washington's, and it created a GIS-based mapping protocol designed to inform its residents while also saving money and reducing costly hours in the field (Burns and Madin, 2009). To determine whether the Oregon Department of Geology and Mineral Industries (DOGAMI) protocol, developed by Burns and Madin (2009), could serve as the basis for Washington's protocol, I used the office-based DOGAMI protocol to map landslides along a 40-50 km (25-30 mile) shoreline in Thurston County, Washington. I then compared my results to the field-based landslide inventory created in 2009 by the Washington Division of Geology and Earth Resources (DGER) along this same shoreline.
If the landslide area I mapped reasonably equaled the area of the DGER (2009) inventory, I would consider the DOGAMI protocol useful for Washington as well. Using 1 m resolution lidar flown for Thurston County in 2011 and a GIS platform, I mapped 36 landslide deposits and scarp flanks covering a total area of 879,530 m2 (9,467,160 ft2), and I found 48 recent events within these deposits. With the exception of two slides, all of the movements occurred within the last fifty years. Along this same coastline, the DGER (2009) recorded 159 individual landslides and complexes, for a total area of 3,256,570 m2 (35,053,400 ft2). At first glance it appears the DGER (2009) effort found a larger total number and total area of landslides. However, in addition to their field inventory, they digitized landslides previously mapped by other researchers without field-confirming them; these unconfirmed landslides cover a total area of 2,093,860 m2 (22,538,150 ft2) (DGER, 2009). With this questionable landslide area removed, and with toes and underwater landslides accounted for (I did not have a bathymetry dataset), my results are within 6,580 m2 (70,840 ft2) of the DGER's. This similarity shows that the DOGAMI protocol provides a consistent and accurate approach to creating a landslide inventory. With a few additional modifications, I recommend that Washington State adopt the DOGAMI protocol. Acquiring additional 1 m lidar and adopting a modified DOGAMI protocol would poise the DGER to map the remaining 87% of the state, with the ultimate goal of informing citizens and decision makers of the locations and frequencies of landslide hazards on a user-friendly GIS platform.


Thesis (Ph.D.)--University of Washington, 2016-06


High-resolution transmission electron microscopy (HRTEM) was used to study the olivine-to-spinel transformation. HRTEM structure images of Mg2GeO4 olivine deformed under a pressure of 6 GPa at 600 °C clearly show that a shear mechanism dominates the transformation; it does not proceed by nucleation and growth. It also differs in certain crucial aspects from the type of martensitic transformation proposed previously. During the transformation, a shear movement brings the oxygen anions to their positions in the spinel structure, and an edge dislocation following each shear then puts the cations in their spinel sites. The Burgers vector of each dislocation is perpendicular to the anion shear direction. (C) 2004 American Institute of Physics.


Statistical tests of Load-Unload Response Ratio (LURR) signals are carried out to verify the statistical robustness of previous studies using the Lattice Solid Model (MORA et al., 2002b). In each case, 24 groups of samples with the same macroscopic parameters (tidal perturbation amplitude A, period T and tectonic loading rate k) but different particle arrangements are employed. Results of uni-axial compression experiments show that before the normalized time of catastrophic failure, the ensemble average LURR value rises significantly, in agreement with observations of high LURR prior to large earthquakes. In shearing tests, two parameters are found to control the correlation between earthquake occurrence and tidal stress. One, A/(kT), controls the phase shift between the peak seismicity rate and the peak amplitude of the perturbation stress; as this parameter increases, the phase shift decreases. The other, AT/k, controls the height of the probability density function (PDF) of modeled seismicity; as this parameter increases, the PDF becomes sharper and narrower, indicating strong triggering. Statistical studies of LURR signals in the shearing tests also suggest that, except in strong-triggering cases where LURR cannot be calculated due to sparse data in unloading cycles, larger events are more likely than smaller ones to occur during high-LURR periods, supporting the LURR hypothesis.
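The LURR statistic itself is conventionally the ratio of a response quantity, often Benioff strain (the square root of released energy), summed over loading periods of the perturbing stress to the same quantity summed over unloading periods. A minimal sketch on a synthetic catalog, assuming a known stress history; the function name and choices here are illustrative, not the code used in the study:

```python
import numpy as np

def lurr(event_times, event_energies, stress_values, stress_times):
    """Load-Unload Response Ratio: summed Benioff strain during loading
    divided by that during unloading. Loading is defined as periods when
    the perturbing stress is increasing. Illustrative sketch only."""
    # stress rate at each sample time, then interpolated to event times
    dstress = np.gradient(stress_values, stress_times)
    rate_at_event = np.interp(event_times, stress_times, dstress)
    loading = rate_at_event > 0            # loading phase if stress rising
    benioff = np.sqrt(event_energies)
    y_plus = benioff[loading].sum()
    y_minus = benioff[~loading].sum()
    return y_plus / y_minus if y_minus > 0 else np.inf
```

For a catalog uncorrelated with the perturbation, LURR fluctuates around 1; values well above 1 flag the elevated loading-phase response described above.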


The Accelerating Moment Release (AMR) preceding earthquakes with magnitude above 5 in Australia that occurred during the last 20 years was analyzed to test the Critical Point Hypothesis. Twelve earthquakes in the catalog were chosen based on a criterion for the number of nearby events. Results show that seven sequences with numerous events recorded leading up to the main earthquake exhibited accelerating moment release. Two occurred close in time and space to other earthquakes that were preceded by AMR. The remaining three sequences had very few events in the catalog, so the lack of AMR detected in the analysis may be related to catalog incompleteness. Spatio-temporal scanning of AMR parameters shows that 80% of the areas in which AMR occurred experienced large events. In areas of similar background seismicity with no large events, 10 out of 12 cases exhibit no AMR; the other two are false alarms, where AMR was observed but no large event followed. The relationship between AMR and the Load-Unload Response Ratio (LURR) was also studied. Both methods predict similar critical region sizes; however, the critical point time using AMR is slightly earlier than the time of the critical-point LURR anomaly.
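AMR is commonly quantified by fitting the cumulative Benioff strain with a power law, Omega(t) = A + B(t_c - t)^m, and comparing its misfit with that of a straight line; the ratio C of the two RMS misfits falls well below 1 when moment release accelerates. A sketch under those standard definitions; the function name and the fixed exponent m = 0.3 are assumptions for illustration, not values from this analysis:

```python
import numpy as np

def c_value(times, mags, t_c, m=0.3):
    """Curvature parameter C for Accelerating Moment Release (sketch).

    times, mags: numpy arrays of event times (all < t_c) and magnitudes.
    C = (RMS misfit of power-law fit) / (RMS misfit of linear fit);
    C well below 1 indicates acceleration toward the failure time t_c."""
    benioff = np.cumsum(10 ** (0.75 * mags))     # cumulative sqrt(energy)
    # linear fit of cumulative Benioff strain versus time
    lin = np.polyval(np.polyfit(times, benioff, 1), times)
    rms_lin = np.sqrt(np.mean((benioff - lin) ** 2))
    # power-law fit: linear in the transformed variable x = (t_c - t)^m
    x = (t_c - times) ** m
    pow_fit = np.polyval(np.polyfit(x, benioff, 1), x)
    rms_pow = np.sqrt(np.mean((benioff - pow_fit) ** 2))
    return rms_pow / rms_lin
```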


Earthquakes have been recognized as resulting from stick-slip frictional instabilities along faults between deformable rocks. A three-dimensional finite-element code for modeling the nonlinear frictional contact behavior between deformable bodies, using the node-to-point contact element strategy, has been developed and applied here to investigate the influence of fault geometry on the nucleation and development of stick-slip instability along an intra-plate fault. A typical fault bend model is used, in which a pre-cut fault is artificially bent by an angle of 5.6° at the fault center. The numerical results demonstrate that the geometry of the fault significantly affects the nucleation, termination and restart of the stick-slip instability along the intra-plate fault, and that all these instability phenomena can be well simulated using the current finite-element algorithm.
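The stick-slip cycle simulated here in 3-D can be illustrated in one dimension with the classic spring-slider analogue: a block driven through a spring sticks until static friction is exceeded, slides under lower dynamic friction, and re-sticks when it stops. A minimal sketch with made-up parameter values; this is the textbook analogue, not the authors' finite-element formulation:

```python
def spring_slider(steps=20000, dt=1e-3, k=50.0, v_load=1.0,
                  mu_s=0.6, mu_d=0.4, normal=10.0, mass=1.0):
    """Count stick-slip events for a block pulled through a spring.

    The block sticks until the spring force exceeds static friction
    (mu_s * normal), slides against weaker dynamic friction
    (mu_d * normal), and re-sticks when its velocity drops to zero."""
    x = v = load = 0.0
    events = 0
    sliding = False
    for _ in range(steps):
        load += v_load * dt            # far-field loading point advances
        force = k * (load - x)         # spring force on the block
        if not sliding and force > mu_s * normal:
            sliding = True             # static friction overcome: an "event"
            events += 1
        if sliding:
            v += (force - mu_d * normal) / mass * dt
            x += v * dt
            if v <= 0.0:               # block decelerated to rest: re-stick
                v = 0.0
                sliding = False
    return events
```

With these parameters the block cycles repeatedly, giving a crude analogue of earthquake recurrence on the fault.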


The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that the next step towards more realistic experiments will require models containing a significantly larger number of particles than current models, and thus a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
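Parallel efficiency as quoted here is simply speedup divided by processor count: 80% efficiency means a model taking T seconds on one processor runs in about T/(0.8 N) seconds on N processors. A one-liner makes the relation concrete; the timing numbers below are illustrative, not the paper's benchmark data:

```python
def parallel_efficiency(t_one_proc, t_n_procs, n_procs):
    """Speedup S = T1/TN and parallel efficiency E = S/N."""
    speedup = t_one_proc / t_n_procs
    return speedup, speedup / n_procs

# e.g. a hypothetical run taking 1000 s serially and 12.5 s on 100 processors
speedup, eff = parallel_efficiency(1000.0, 12.5, 100)
```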


Based on the three-dimensional elastic inclusion model proposed by Dobrovolskii, we developed a rheological inclusion model to study earthquake preparation processes. Using the correspondence principle of viscoelasticity, we derived analytic expressions for the viscoelastic displacements U(r, t), V(r, t) and W(r, t), the normal strains epsilon_xx(r, t), epsilon_yy(r, t) and epsilon_zz(r, t), and the bulk strain theta(r, t) at an arbitrary point (x, y, z), in the X, Y and Z directions, produced by a three-dimensional inclusion in a semi-infinite rheological medium described by the standard linear rheological model. After computing the spatial-temporal variation of the bulk strain produced at the ground surface by such a spherical rheological inclusion, we obtained interesting results: the bulk strain produced by a hard inclusion changes with time through three stages (alpha, beta, gamma) with different characteristics, similar to geodetic deformation observations but different from the results for a soft inclusion. These theoretical results can be used to explain the characteristics of the spatial-temporal evolution, patterns and quadrant distribution of earthquake precursors, as well as the changeability, spontaneity and complexity of short-term and imminent-term precursors. They offer a theoretical basis for building physical models of earthquake precursors and for predicting earthquakes.
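The standard linear (Zener) rheology invoked above has a simple creep response: a spring E1 in series with a Kelvin element (spring E2 in parallel with a dashpot of viscosity eta) strains instantly by sigma0/E1 and relaxes toward sigma0(1/E1 + 1/E2) with retardation time tau = eta/E2. A sketch of that response with illustrative moduli, not values from the paper:

```python
import numpy as np

def sls_creep_strain(t, sigma0=1.0, e1=50e9, e2=20e9, eta=1e18):
    """Creep strain of a standard linear (Zener) solid under constant
    stress sigma0: spring E1 in series with a Kelvin element (E2 || eta).
    Instantaneous strain sigma0/E1, long-term strain sigma0(1/E1 + 1/E2)."""
    tau = eta / e2                               # retardation time
    return sigma0 * (1.0 / e1 + (1.0 - np.exp(-t / tau)) / e2)
```

The time-dependent term is what distinguishes the rheological inclusion from Dobrovolskii's purely elastic one: the surface bulk strain evolves through stages rather than appearing instantaneously.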


This article first summarizes available experimental results on the frictional behaviour of contact interfaces and briefly recalls typical frictional experiments and relationships applicable to rock mechanics; a unified description of the entire frictional behaviour is then formulated from these experimental results. Combined with a stick-slip decomposition algorithm, the description captures stick-slip instability phenomena and reproduces the effects observed in rock experiments without using the so-called state variable, thus avoiding the related numerical difficulties. It has been implemented in our finite-element code, which uses the node-to-point contact element strategy proposed by the authors to handle frictional contact between multiple finite-deformation bodies with stick and finite frictional slip, and is applied here to simulate the frictional behaviour of rocks to show its usefulness and efficiency.


The recurrence interval statistics of regional seismicity follow a universal distribution function, independent of the tectonic setting or average rate of activity (Corral, 2004). The universal function is a modified gamma distribution, with power-law scaling for recurrence intervals shorter than the average rate of activity and exponential decay for longer intervals. We employ the method of Corral (2004) to examine the recurrence statistics of a range of cellular automaton earthquake models. The majority of models have an exponential distribution of recurrence intervals, the same as a Poisson process. One model, the Olami-Feder-Christensen automaton, has recurrence statistics consistent with regional seismicity for a certain range of that model's conservation parameter; for conservation parameters in this range, the event size statistics are also consistent with regional seismicity. Models whose dynamics are dominated by characteristic earthquakes do not appear to display universality of recurrence statistics.
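A quick way to see the Poisson baseline mentioned above: exponential recurrence intervals have a coefficient of variation of exactly 1, whereas the clustered, modified-gamma form of regional seismicity produces an excess of short intervals and hence CV > 1. A sketch on synthetic Poisson event times; this is illustrative, not the analysis code of the study:

```python
import numpy as np

def interval_cv(event_times):
    """Coefficient of variation (std/mean) of the recurrence intervals.

    CV == 1 for a Poisson process (exponential intervals); clustered
    seismicity following a modified gamma distribution gives CV > 1."""
    intervals = np.diff(np.sort(event_times))
    return intervals.std() / intervals.mean()

# synthetic Poisson catalog: i.i.d. exponential waiting times
rng = np.random.default_rng(0)
poisson_times = np.cumsum(rng.exponential(scale=1.0, size=20000))
cv = interval_cv(poisson_times)
```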


The heightened threat of terrorism has caused governments worldwide to plan for responding to large-scale catastrophic incidents. In England, the New Dimension Programme supplies equipment, procedures and training to the Fire and Rescue Service to ensure the country's preparedness to respond to a range of major critical incidents. The Fire and Rescue Service is involved partly by virtue of being able to mobilize a large skilled workforce and specialist equipment very quickly. This paper discusses the use of discrete event simulation modeling to understand how a fire and rescue service might position its resources before an incident takes place, so as to best respond to a combination of different incidents at different locations should they happen. Two models are built for this purpose. The first deals with mass decontamination of a population following the release of a hazardous substance, and studies the resource requirements (vehicles, equipment and manpower) necessary to meet performance targets. The second deals with the allocation of resources across regions, and studies cover levels and response times under different allocations of resources, both centralized and decentralized. Contributions to theory and practice in other contexts (e.g., the aftermath of natural disasters such as earthquakes) are outlined.
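The kind of question the first model answers can be sketched with a few lines of discrete-event logic: people arriving for decontamination are served by a pool of lanes, and the makespan shows whether a given resource level meets a clearance-time target. A deliberately tiny stand-in; all figures and names are assumptions, not New Dimension parameters:

```python
import heapq
import random

def simulate_decon(n_people=200, n_lanes=4, mean_service=5.0,
                   arrival_gap=1.0, seed=0):
    """Makespan of a mass-decontamination queue: people arrive at a fixed
    gap and are served first-come-first-served by a pool of lanes with
    exponential service times. Returns when the last person is clear."""
    rnd = random.Random(seed)
    lanes = [0.0] * n_lanes      # heap of times at which each lane frees up
    heapq.heapify(lanes)
    finish = 0.0
    for i in range(n_people):
        arrival = i * arrival_gap
        free_at = heapq.heappop(lanes)           # earliest-available lane
        done = max(arrival, free_at) + rnd.expovariate(1.0 / mean_service)
        heapq.heappush(lanes, done)
        finish = max(finish, done)
    return finish
```

Sweeping n_lanes against a target makespan is the simplest version of the resource-requirement study described above.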


2000 Mathematics Subject Classification: 62G32, 62G05.


Hurricanes, earthquakes, floods, and other serious natural hazards have been credited with causing changes in regional economic growth, income, employment, and wealth. Natural disasters are said to cause: (1) an acceleration of existing economic trends; (2) an expansion of employment and income due to recovery operations (the so-called silver lining); and (3) an alteration in the structure of regional economic activity, due to changes in "intra-" and "inter-" regional trading patterns and to technological change. Theoretical and stylized disaster simulations (Cochrane 1975; Haas, Cochrane, and Kates 1977; Petak et al. 1982; Ellson et al. 1983, 1984; Boisvert 1992; Brookshire and McKee 1992) point towards a wide scope of possible negative and long-lasting impacts upon economic activity and structure. This work examines the consequences of Hurricane Andrew for Dade County's economy. Following the work of Ellson et al. (1984), Guimaraes et al. (1993), and West and Lenze (1993; 1994), a regional econometric forecasting model (DCEFM), run "with" and "without" the hurricane, is constructed and used to assess Hurricane Andrew's impact on the structure and level of economic activity in Dade County, Florida. The results of the simulation exercises show that the direct economic impact of Hurricane Andrew on Dade County was of short duration and of isolated sectoral impact, generally limited to the construction, TCP (transportation, communications, and public utilities), and agricultural sectors. Regional growth, and changes in income and employment, reacted directly to, and within the range and direction set by, national economic activity. The simulations also lead to the conclusion that areal extent, infrastructure, and sector-specific damages or impacts, as opposed to monetary losses, are the primary determinants of a disaster's effects upon employment, income, growth, and economic structure.


An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike responses to earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to the victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but without incorporating the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes, modeled as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. For the meta-model, however, the demand is an input determined using Hazards United States (HAZUS), a software package developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic was developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA) algorithm, in which the initial solution and the cooling rate were determined via a design of experiments.
The experiments showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions efficiently, and an illustrative example shows that the meta-model is practical.
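The intensity process (and, analogously, the track) can be sketched as a discrete Markov chain: a transition matrix gives the probability of the storm weakening, holding, or strengthening at each advisory step, and paths are sampled by repeated categorical draws. The matrix below is hypothetical, chosen for illustration; it is not the set of probabilities fitted in the meta-model:

```python
import numpy as np

# Hypothetical 3-state transition matrix for storm intensity per advisory
# step (0 = weaker, 1 = unchanged, 2 = stronger). Rows sum to 1.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])

def simulate_intensity(start_state, n_steps, rng):
    """Sample one intensity path from the discrete Markov chain P."""
    states = [start_state]
    for _ in range(n_steps):
        states.append(int(rng.choice(3, p=P[states[-1]])))
    return states

rng = np.random.default_rng(42)
path = simulate_intensity(2, 20, rng)    # 20 advisory steps from state 2
```

Sampling many such paths, and feeding each realization's HAZUS-style demand into the location-allocation problem, is the stochastic machinery the abstract describes.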


This study of the risk and disaster management capacities of four Caribbean countries, Barbados, the Dominican Republic, Jamaica, and Trinidad and Tobago, examines three main dimensions: 1) the impact of natural disasters from 1900 to 2010 (number of events, number of people killed, total number affected, and damage in US$); 2) institutional assessments of disaster risk management disparity; and 3) the 2010 Inter-American Development Bank (IADB) disaster risk and risk management indicators for the countries under study. The results show high consistency among the different sources examined, pointing to the need to extend the IADB measurements to the rest of the Caribbean countries. Indexes and indicators provide a comparison against existing benchmarks in order to anticipate a capacity to deal with adverse events and their consequences; however, they can only be tested against the occurrence of a real event. Therefore, a sustainable and comprehensive evaluation system is needed after important disasters to assess a country's performance, verify the indicators, and gain feedback on measurement systems and methodologies. There is diversity in emergency preparedness for disasters among the four countries under study. The nature of the events experienced (hurricanes, earthquakes, floods, and seismic activity), especially their frequency and the intensity of the damage, is related to how each country has designed its risk and disaster management policies and programs to face natural disasters. Vulnerabilities to disaster risks have been increasing because of, among other factors, uncontrolled urbanization, increasing demographic density and poverty, social and economic marginalization, and lack of building code enforcement. The four countries under study have shown improvements in risk management capabilities, yet they are far from being completely prepared.
Barbados' risk management performance is superior to that of the majority of countries in the region. However, it is still far from achieving high performance levels and sustainability in risk management, primarily because it has the largest gap between potential macroeconomic and financial losses and the ability to face them. The Dominican Republic showed steady risk performance up to 2008, but two remaining areas for improvement are hazard monitoring and early warning systems. Jamaica made uneven advances between 1990 and 2008 and requires significant improvements to achieve high performance levels and sustainability in risk management, as well as macroeconomic mitigation infrastructure. Trinidad and Tobago has the lowest risk management score of the 15 countries in the Latin American and Caribbean region assessed by the IADB study in 2010, yet it has experienced an important reduction in vulnerability. In sum, the results confirm the high disaster risk management disparity in the Caribbean region.