984 results for Earthquake magnitude


Relevance: 20.00%

Publisher:

Abstract:

Continuous variables are among the major data types collected by survey organizations. Such data can be incomplete, so that the data collectors need to fill in the missing values, or they can contain sensitive information that needs protection from re-identification. One approach to protecting continuous microdata is to sum the values within cells defined by combinations of features. In this thesis, I present novel multiple imputation (MI) methods that can be applied to impute missing values and to synthesize confidential values for continuous and magnitude data.

The first method is for limiting the disclosure risk of continuous microdata whose marginal sums are fixed. The motivation for developing such a method comes from the magnitude tables of non-negative integer values in economic surveys. I present approaches based on a mixture of Poisson distributions to describe the multivariate distribution, so that the marginals of the synthetic data are guaranteed to sum to the original totals. At the same time, I present methods for assessing the disclosure risks of releasing such synthetic magnitude microdata. An illustration on a survey of manufacturing establishments shows that the disclosure risks are low while the information loss is acceptable.
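As a minimal sketch of the conditioning property this approach exploits (a single Poisson component rather than the mixture used in the thesis, with invented cell counts): independent Poisson counts, conditioned on their sum, follow a multinomial distribution, so synthetic cells can be drawn whose marginal total matches the original exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthesize_with_fixed_total(rates, total):
    """Draw synthetic non-negative integer cell counts summing to `total`.

    If X_i ~ Poisson(rate_i) independently, then (X_1, ..., X_k) given
    sum(X_i) = n is Multinomial(n, rate_i / sum(rates)), so conditioning
    on the total preserves the released marginal sum exactly.
    """
    p = np.asarray(rates, dtype=float)
    return rng.multinomial(total, p / p.sum())

# Hypothetical confidential cell counts from an economic survey.
original = np.array([120, 45, 300, 35])
synthetic = synthesize_with_fixed_total(original, original.sum())
assert synthetic.sum() == original.sum()  # fixed marginal is guaranteed
```

In the thesis's setting the rates would come from a fitted Poisson mixture, which smooths the cells and lowers disclosure risk while the totals stay fixed.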

The second method is for releasing synthetic continuous microdata by a nonstandard MI method. Traditionally, MI fits a model on the confidential values and then generates multiple synthetic datasets from this model. Its disclosure risk tends to be high, especially when the original data contain extreme values. I present a nonstandard MI approach conditioned on protective intervals. Its basic idea is to estimate the model parameters from these intervals rather than from the confidential values. The encouraging results of simple simulation studies suggest the potential of this new approach for limiting the posterior disclosure risk.

The third method is for imputing missing values in continuous and categorical variables. It extends a hierarchically coupled mixture model with local dependence, but separates the variables into non-focused (e.g., almost fully observed) and focused (e.g., largely missing) ones. The sub-model structure for the focused variables is more complex than that for the non-focused ones. Their cluster indicators are linked by tensor factorization, and the focused continuous variables depend locally on non-focused values. The model properties suggest that moving strongly associated non-focused variables to the focused side can improve estimation accuracy, which is examined in several simulation studies. The method is then applied to data from the American Community Survey.


Although there are several studies looking at the effect of natural disasters on economic growth, less attention has been dedicated to their impact on educational outcomes, especially in more developed countries. We use the synthetic control method to examine how the L’Aquila earthquake affected subsequent enrolment at the local university. This issue has wide economic implications as the University of L’Aquila made a large contribution to the local economy before the earthquake. Our results indicate that the earthquake had no statistically significant effect on first-year enrolment at the University of L’Aquila in the three academic years after the disaster. This natural disaster, however, caused a compositional change in the first-year student population, with a substantial increase in the number of students aged 21 or above. This is likely to have been driven by post-disaster measures adopted in order to mitigate the expected negative effects on enrolment triggered by the earthquake.


Thesis (Master's)--University of Washington, 2016-06


Subduction of a narrow slab of oceanic lithosphere beneath a tightly curved orogenic arc requires the presence of at least one lithospheric-scale tear fault. While the Calabrian subduction beneath southern Italy is considered the type example of this geodynamic setting, the geometry, kinematics, and surface expression of the associated lateral slab tear fault offshore eastern Sicily remain controversial. Results from a new marine geophysical survey conducted in the Ionian Sea, using high-resolution bathymetry and seismic profiling, reveal active faulting at the seafloor within a 140 km long, two-branched fault system near Alfeo Seamount. The previously unidentified 60 km long NW-trending North Alfeo Fault system shows primarily strike-slip kinematics, as indicated by its morphology and by steeply dipping transpressional and transtensional faults. Available earthquake focal mechanisms indicate dextral strike-slip motion along this fault segment. The 80 km long SSE-trending South Alfeo Fault system is expressed as one or two steeply dipping normal faults bounding the western side of a 500+ m thick, 5 km wide, elongate, syntectonic Plio-Quaternary sedimentary basin. Both branches of the fault system are mechanically capable of generating magnitude 6-7 earthquakes like those that struck eastern Sicily in 1169, 1542, and 1693.


When a 7.8-magnitude earthquake struck Nepal in 2015, many monuments, temples, and houses were reduced to rubble, killing more than 8,000 people and injuring over 21,000. This tragic natural disaster brought international attention to Nepal. Yet in this time of despair and pain there was a sign of hope that deserves acknowledgement: the spirit of the community in facing the disaster. This paper is about the indigenous community of Kathmandu and how it organized an important traditional festival just four months after the disaster, when most people were still living in makeshift shelters and smaller aftershocks were continuing almost every day. In a country like Nepal, with numerous intangible heritages that are still living, such heritage is not taken seriously by the concerned authorities and has mostly been neglected. It is the indigenous community that has carried these heritages forward, as they are an inseparable aspect of social life. This paper examines community involvement and the intangible heritage of the Kathmandu Valley, as part of my PhD research thesis.


Toppling analysis of a precariously balanced rock (PBR) can provide insights into the nature of ground motion that has not occurred at that location in the past and, by extension, realistic constraints on peak ground motions for use in engineering design. Earlier approaches have targeted simplistic 2-D models of the rock or modeled the rock-pedestal contact using spring-damper assemblies that require re-calibration for each rock. These analyses also assume that the rock does not slide on the pedestal. Here, a method to model PBRs in three dimensions is presented. The 3-D model is created from a point cloud of the rock, the pedestal, and their interface, obtained using Terrestrial Laser Scanning (TLS). The dynamic response of the model under earthquake excitation is simulated using a rigid body dynamics algorithm. The veracity of this approach is demonstrated by comparisons against data from shake table experiments. Fragility maps for toppling probability of the Echo Cliff PBR and the Pacifico PBR as a function of various ground motion parameters, rock-pedestal interface friction coefficient, and excitation direction are presented. The seismic hazard at these PBR locations is estimated using these maps. Additionally, these maps are used to assess whether the synthetic ground motions at these locations resulting from scenario earthquakes on the San Andreas Fault are realistic (toppling would indicate that the ground motions are unrealistically high).
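For orientation, the quasi-static 2-D criterion that the "simplistic 2-D models" mentioned above build on fits in a few lines; the thesis's 3-D rigid-body simulations go well beyond it. The block dimensions here are illustrative.

```python
def rocking_initiation_pga(half_width_b, half_height_h, g=9.81):
    """Quasi-static threshold for a rigid rectangular block (2b wide,
    2h tall) to begin rocking about a base corner: horizontal ground
    acceleration exceeding g * tan(alpha), with slenderness angle
    alpha = atan(b / h). Whether rocking leads to actual toppling
    also depends on the motion's duration and frequency content.
    """
    return g * (half_width_b / half_height_h)

# A slender 0.5 m x 2 m block (b = 0.25 m, h = 1.0 m) starts
# rocking at roughly 0.25 g.
pga = rocking_initiation_pga(0.25, 1.0)  # -> 2.4525 m/s^2
```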


Investigation of large, destructive earthquakes is challenged by their infrequent occurrence and the remote nature of geophysical observations. This thesis sheds light on the source processes of large earthquakes from two perspectives: robust and quantitative observational constraints through Bayesian inference for earthquake source models, and physical insights on the interconnections of seismic and aseismic fault behavior from elastodynamic modeling of earthquake ruptures and aseismic processes.

To constrain the shallow deformation during megathrust events, we develop semi-analytical and numerical Bayesian approaches to explore the maximum resolution of the tsunami data, with a focus on incorporating the uncertainty in the forward modeling. These methodologies are then applied to invert for the coseismic seafloor displacement field in the 2011 Mw 9.0 Tohoku-Oki earthquake using near-field tsunami waveforms and for the coseismic fault slip models in the 2010 Mw 8.8 Maule earthquake with complementary tsunami and geodetic observations. From posterior estimates of model parameters and their uncertainties, we are able to quantitatively constrain the near-trench profiles of seafloor displacement and fault slip. Similar characteristic patterns emerge during both events, featuring the peak of uplift near the edge of the accretionary wedge with a decay toward the trench axis, with implications for fault failure and tsunamigenic mechanisms of megathrust earthquakes.
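A minimal linear-Gaussian sketch of how forward-modeling uncertainty can be folded into a Bayesian inversion of this kind (all matrices below are toy values, not the tsunami problem): adding a forward-model covariance to the data covariance widens the posterior where the physics is least trusted.

```python
import numpy as np

def gaussian_bayes_inversion(G, d, C_data, C_forward, C_prior):
    """Linear-Gaussian Bayesian inversion of d = G m + noise, with the
    forward-modeling covariance C_forward folded into the effective data
    covariance, and a zero-mean Gaussian prior with covariance C_prior.
    Returns the posterior mean and covariance of the model vector m."""
    C_inv = np.linalg.inv(C_data + C_forward)
    post_cov = np.linalg.inv(G.T @ C_inv @ G + np.linalg.inv(C_prior))
    post_mean = post_cov @ G.T @ C_inv @ d
    return post_mean, post_cov

# Toy example: identity forward operator, two observations.
G = np.eye(2)
d = np.array([1.0, 2.0])
m_hat, P = gaussian_bayes_inversion(G, d,
                                    C_data=0.01 * np.eye(2),
                                    C_forward=0.01 * np.eye(2),
                                    C_prior=100.0 * np.eye(2))
```

In the nonlinear tsunami setting the thesis samples the posterior numerically rather than using this closed form, but the role of the forward-model term is the same.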

To understand the behavior of earthquakes at the base of the seismogenic zone on continental strike-slip faults, we simulate the interactions of dynamic earthquake rupture, aseismic slip, and heterogeneity in rate-and-state fault models coupled with shear heating. Our study explains the long-standing enigma of seismic quiescence on major fault segments known to have hosted large earthquakes by deeper penetration of large earthquakes below the seismogenic zone, where mature faults have well-localized creeping extensions. This conclusion is supported by the simulated relationship between seismicity and large earthquakes as well as by observations from recent large events. We also use the modeling to connect the geodetic observables of fault locking with the behavior of seismicity in numerical models, investigating how a combination of interseismic geodetic and seismological estimates could constrain the locked-creeping transition of faults and potentially their co- and post-seismic behavior.


How can we calculate earthquake magnitudes when the signal is clipped and over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of a new event). The duration, and sometimes the amplitude, of an earthquake signal are necessary for determining event magnitudes; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first method entails modeling the shape of earthquake codas as a power law function and extrapolating duration from the decay of the function. The second method draws relations between clipped duration (i.e., the length of time a signal is clipped) and the full duration. These methods allow for magnitudes to be determined within 0.2 to 0.4 units of magnitude. This error is within the range of analyst hand-picks and is within the acceptable limits of uncertainty when quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. These methods were developed with data from the initial stages of the 2004-2008 eruption at Mount St. Helens. Mount St. Helens is a well-studied volcano with many instruments placed at varying distances from the vent. This fact makes the 2004-2008 eruption a good place to calibrate and refine methodologies that can be applied to volcanoes with limited networks.
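The first method's extrapolation step can be sketched as follows, on a synthetic coda with an assumed power-law decay and a hypothetical noise floor; the actual coda-shape model and any duration-magnitude coefficients are network-specific.

```python
import numpy as np

def extrapolate_duration(t, amp, noise_floor):
    """Fit the visible (unclipped) coda as a power law A(t) = A0 * t**(-p)
    and extrapolate the time at which it decays to the noise floor; that
    time serves as the full event duration."""
    # Linear fit in log-log space: log10 A = log10 A0 - p * log10 t
    slope, intercept = np.polyfit(np.log10(t), np.log10(amp), 1)
    p, log_a0 = -slope, intercept
    return 10.0 ** ((log_a0 - np.log10(noise_floor)) / p)

# Synthetic coda: A0 = 1000 counts, decay exponent p = 1.5, sampled 5-20 s.
t = np.linspace(5.0, 20.0, 50)
amp = 1000.0 * t ** -1.5
tau = extrapolate_duration(t, amp, noise_floor=2.0)
# A duration magnitude would then follow from a station-calibrated relation
# of the form Md = a * log10(tau) + b (coefficients omitted on purpose).
```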


This study computed trends in extreme precipitation events in Florida for 1950-2010. Hourly aggregated rainfall data from 24 stations of the National Climatic Data Center were analyzed to derive time series of extreme rainfall for 12 durations, ranging from 1 hour to 7 days. The non-parametric Mann-Kendall test and the Theil-Sen approach were applied to detect the significance of trends in annual maximum rainfall, the number of above-threshold events, and the average magnitude of above-threshold events for four common analysis periods. The Trend-Free Pre-Whitening (TFPW) approach was applied to remove serial correlation, and a bootstrap resampling approach was used to detect the field significance of trends. The results for annual maximum rainfall revealed dominant increasing trends at the 0.10 significance level, especially for hourly events in the longer period and daily events in the recent period. The number of above-threshold events exhibited strong decreasing trends for hourly durations in all time periods.
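Both trend statistics used in the study are compact enough to sketch directly; this minimal version uses the no-ties Mann-Kendall variance, whereas real rainfall series would need the tie-corrected form.

```python
import numpy as np
from itertools import combinations
from math import erf, sqrt

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and two-sided p-value from
    the normal approximation with continuity correction (no ties)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i, j in combinations(range(n), 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = 1.0 - erf(abs(z) / sqrt(2.0))  # = 2 * (1 - Phi(|z|))
    return s, p

def theil_sen(x):
    """Theil-Sen slope: the median of all pairwise slopes."""
    x = np.asarray(x, dtype=float)
    slopes = [(x[j] - x[i]) / (j - i)
              for i, j in combinations(range(len(x)), 2)]
    return float(np.median(slopes))

# A strictly increasing series gives the maximal S = n*(n-1)/2 and slope 1.
series = np.arange(10.0)
s, p = mann_kendall(series)   # s -> 45
slope = theil_sen(series)     # -> 1.0
```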