28 results for earthquakes


Relevance:

10.00%

Publisher:

Abstract:

In this study, new tomographic models of Colombia were calculated. I used the seismicity recorded by the Colombian seismic network during the period 2006-2009. In this time period, the improvement of the seismic network yields more stable hypocentral results than older data sets and allows the computation of new 3D Vp and Vp/Vs models. The final dataset consists of 10813 P- and 8614 S-arrival times associated with 1405 earthquakes. Tests with synthetic data and resolution analysis indicate that the velocity models are well constrained in central, western and southwestern Colombia down to a depth of 160 km; resolution is poor in northern Colombia and close to Venezuela owing to a lack of seismic stations and seismicity. The tomographic models and the relocated seismicity indicate the existence of E-SE subducting Nazca lithosphere beneath central and southern Colombia. North-south changes in the Wadati-Benioff zone, in the Vp and Vp/Vs patterns and in the volcanism show that the downgoing plate is segmented by E-W directed slab tears, suggesting the presence of three sectors. Earthquakes in the northernmost sector represent most of the Colombian seismicity and are concentrated in the 100-170 km depth interval, beneath the Eastern Cordillera. Here massive dehydration is inferred, resulting from a delay in the eclogitization of a thickened oceanic crust in a flat-subduction geometry. In this sector a cluster of intermediate-depth seismicity (the Bucaramanga Nest) is present beneath the elbow of the Eastern Cordillera, interpreted as the result of a massive and highly localized dehydration phenomenon caused by a hyper-hydrous oceanic crust. The central and southern sectors, although different in Vp pattern, show, conversely, a continuous, steep and more homogeneous Wadati-Benioff zone with overlying volcanic areas. Here a "normal"-thickness oceanic crust is inferred, allowing gradual and continuous metamorphic reactions to take place with depth and enabling fluid migration towards the mantle wedge.
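The Vp/Vs ratio that underpins models like these can be estimated from paired P- and S-arrival times with a classical Wadati diagram: the S-P delay grows linearly with P travel time, and the slope equals Vp/Vs - 1. A minimal sketch with synthetic arrival times (not data from the study):

```python
import numpy as np

# Hypothetical P and S arrival times (s) at several stations for one event,
# assuming a common origin time at t = 0.
t_p = np.array([3.2, 4.8, 6.1, 7.9, 9.4])       # P arrivals
t_s = np.array([5.5, 8.3, 10.6, 13.7, 16.3])    # S arrivals

# Wadati diagram: least-squares slope of (S - P) delay vs. P travel time.
sp_delay = t_s - t_p
slope, intercept = np.polyfit(t_p, sp_delay, 1)
vp_vs = 1.0 + slope
print(f"estimated Vp/Vs ~ {vp_vs:.2f}")
```

For the synthetic numbers above the estimate comes out near 1.74, close to the Poisson-solid value of about 1.73.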


The aim of this work was to show that refined analyses of background, low-magnitude seismicity allow the delineation of the main active faults and an accurate estimation of the directions of the regional tectonic stress that characterize the Southern Apennines (Italy), a structurally complex area with high seismic potential. Thanks to the presence in the area of an integrated, dense and wide-dynamic-range network, it was possible to analyze a high-quality microearthquake data set consisting of 1312 events that occurred from August 2005 to April 2011, by integrating the data recorded at 42 seismic stations of various networks. The refined seismicity locations and focal mechanisms clearly delineate a system of NW-SE striking normal faults along the Apenninic chain and an approximately E-W oriented strike-slip fault transversely cutting the belt. The seismicity along the chain does not occur on a single fault but in a volume, delimited by the faults activated during the 1980 Irpinia M 6.9 earthquake, on sub-parallel, predominantly normal faults. The results show that the recent low-magnitude earthquakes belong to the background seismicity and are likely generated along the major fault segments activated during the most recent earthquakes, suggesting that these segments are still active today, thirty years after the mainshock occurrences. In this sense, this study gives a new perspective to the use of high-quality records of low-magnitude background seismicity for the identification and characterization of active fault systems. The stress tensor inversion provides two equivalent models to explain the microearthquake generation along both the NW-SE striking normal faults and the E-W oriented fault with dominant dextral strike-slip motion, but with different geological interpretations. We suggest that the NW-SE-striking Africa-Eurasia convergence acts in the background of all these structures, playing a primary and unifying role in the seismotectonics of the whole region.
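Stress-tensor inversions like the one described here rest on the Wallace-Bott hypothesis: slip on a fault parallels the shear traction resolved on it by the regional stress tensor. The forward step of that reasoning is plain linear algebra; the sketch below uses an illustrative (hypothetical) stress tensor and fault normal, whereas a real inversion solves the reverse problem from many focal mechanisms:

```python
import numpy as np

# Illustrative principal stresses (MPa, compression negative) in their own frame.
sigma = np.diag([-30.0, -20.0, -10.0])
# Unit normal of a fault plane bisecting the sigma1/sigma2 axes.
n = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)

t = sigma @ n          # total traction vector on the plane (Cauchy's relation)
t_n = (t @ n) * n      # normal component
t_s = t - t_n          # shear component -> predicted slip direction (Wallace-Bott)
print("shear traction:", t_s, "magnitude (MPa):", np.linalg.norm(t_s))
```

On this plane the shear traction magnitude equals (sigma1 - sigma2)/2 = 5 MPa, a useful sanity check on the arithmetic.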


In this study a new, fully non-linear approach to Local Earthquake Tomography is presented. Local Earthquake Tomography (LET) is a non-linear inversion problem that allows the joint determination of earthquake parameters and velocity structure from the arrival times of waves generated by local sources. Since the early developments of seismic tomography, several inversion methods have been developed to solve this problem in a linearized way. In the framework of Monte Carlo sampling, we developed a new code based on the Reversible Jump Markov chain Monte Carlo sampling method (RJ-MCMC). It is a trans-dimensional approach in which the number of unknowns, and thus the model parameterization, is treated as one of the unknowns. I show that our new code overcomes major limitations of linearized tomography, opening a new perspective in seismic imaging. Synthetic tests demonstrate that our algorithm is able to produce a robust and reliable tomography without the need for subjective a-priori assumptions about starting models and parameterization. Moreover, it provides a more accurate estimate of the uncertainties in the model parameters. It is therefore very suitable for investigating the velocity structure in regions that lack accurate a-priori information. Synthetic tests also reveal that the absence of regularization constraints allows more information to be extracted from the observed data, and that the velocity structure can be detected even in regions where the ray density is low and standard linearized codes fail. I also present high-resolution Vp and Vp/Vs models for two widely investigated regions: the Parkfield segment of the San Andreas Fault (California, USA) and the area around the Alto Tiberina fault (Umbria-Marche, Italy). In both cases, the models obtained with our code show a substantial improvement in the data fit compared with the models obtained from the same data sets with linearized inversion codes.


During my PhD, starting from the original formulations proposed by Bertrand et al. (2000) and Emolo & Zollo (2005), I developed inversion methods and applied them to different earthquakes. In particular, large efforts were devoted to the study of the model resolution and to the estimation of the model parameter errors. To study the kinematic source characteristics of the Christchurch earthquake, we performed a joint inversion of strong-motion, GPS and InSAR data using a non-linear inversion method. Considering the complexity highlighted by the surface deformation data, we adopted a fault model consisting of two partially overlapping segments, with dimensions of 15x11 and 7x7 km², having different faulting styles. This two-fault model allows a better reconstruction of the complex shape of the surface deformation field. The total seismic moment resulting from the joint inversion is 3.0x10^25 dyne·cm (Mw = 6.2), with an average rupture velocity of 2.0 km/s. The errors associated with the kinematic model have been estimated at around 20-30%. The 2009 L'Aquila earthquake was followed by an intense aftershock sequence that lasted several months. In this study we applied an inversion method that takes as data the apparent Source Time Functions (aSTFs) to a Mw 4.0 aftershock of the L'Aquila sequence. The aSTFs were estimated using the deconvolution method proposed by Vallée et al. (2004). The inversion results show a heterogeneous slip distribution, characterized by two main slip patches located NW of the hypocenter, and a variable rupture velocity distribution (mean value of 2.5 km/s), with a rupture front acceleration between the two high-slip zones. Errors of about 20% characterize the final estimated parameters.
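The moment-to-magnitude conversion quoted above can be checked with the standard Hanks & Kanamori (1979) relation:

```python
import math

# Moment magnitude from seismic moment, with M0 in dyne*cm:
# Mw = (2/3) * (log10(M0) - 16.1)
def moment_magnitude(m0_dyne_cm):
    return (2.0 / 3.0) * (math.log10(m0_dyne_cm) - 16.1)

# The Christchurch joint-inversion moment, 3.0x10^25 dyne*cm:
print(f"Mw = {moment_magnitude(3.0e25):.2f}")   # -> Mw = 6.25
```

The result, about 6.25, is consistent with the Mw 6.2 reported for the joint inversion.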


The present study has been carried out with the following objectives: i) to investigate the attributes of the source parameters of local and regional earthquakes; ii) to estimate, as accurately as possible, M0, fc, Δσ and their standard errors, to infer their relationship with source size; iii) to quantify high-frequency earthquake ground motion and to study the source scaling. This work is based on observational data of micro-, small and moderate earthquakes for three selected seismic sequences, namely Parkfield (CA, USA), Maule (Chile) and Ferrara (Italy). For the Parkfield seismic sequence (CA), a data set of 757 repeating micro-earthquakes (0 ≤ MW ≤ 2) in 42 clusters, collected by the borehole High Resolution Seismic Network (HRSN), has been analyzed and interpreted. We used the coda methodology to compute spectral ratios and obtain accurate values of fc, Δσ and M0 for three target clusters (San Francisco, Los Angeles, and Hawaii) of our data. We also performed a general regression on peak ground velocities to obtain reliable seismic spectra of all the earthquakes. For the Maule seismic sequence, a data set of 172 aftershocks of the 2010 MW 8.8 earthquake (3.7 ≤ MW ≤ 6.2), recorded by more than 100 temporary broadband stations, has been analyzed and interpreted to quantify high-frequency earthquake ground motion in this subduction zone. We fully calibrated the excitation and attenuation of the ground motion in Central Chile. For the Ferrara sequence, we calculated moment tensor solutions for 20 events, from MW 5.63 (the largest main event, which occurred on May 20, 2012) down to MW 3.2, using a 1-D velocity model for the crust beneath the Pianura Padana and all the geophysical and geological information available for the area. The PADANIA model allowed a numerical study of the characteristics of the ground motion in the thick sediments of the flood plain.
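The link between the quantities listed above (M0, fc, Δσ) is usually expressed through the Brune (1970) circular-source model: the corner frequency gives a source radius, and the stress drop follows from radius and moment. A sketch with illustrative values (not results from the study):

```python
import math

# Brune (1970) source model, SI units:
#   r = 2.34 * beta / (2 * pi * fc)     (source radius)
#   delta_sigma = (7/16) * M0 / r**3    (stress drop)
def brune_stress_drop(m0_nm, fc_hz, beta_ms=3500.0):
    r = 2.34 * beta_ms / (2.0 * math.pi * fc_hz)   # source radius (m)
    return 7.0 / 16.0 * m0_nm / r**3               # stress drop (Pa)

# Hypothetical small event: M0 = 1e13 N*m (~Mw 2.6), corner frequency 10 Hz.
print(f"stress drop ~ {brune_stress_drop(1e13, 10.0) / 1e6:.1f} MPa")
```

For these inputs the stress drop is about 2 MPa, in the range typically reported for small tectonic earthquakes.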


This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast; then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We then present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree and show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
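The rate-to-probability step discussed here is a one-liner under the Poisson assumption, which is exactly why the Poissonian behaviour of exceedances matters to PSHA. A sketch (the 475-year return period is the conventional design example, not a value from the thesis):

```python
import math

# Under the Poisson assumption of the Cornell-McGuire framework, an annual
# exceedance rate lam becomes an exceedance probability over t years via
# P(at least one exceedance in t years) = 1 - exp(-lam * t).
def exceedance_probability(annual_rate, years):
    return 1.0 - math.exp(-annual_rate * years)

# The conventional "10% in 50 years" design level corresponds to a
# return period of roughly 475 years:
rate = 1.0 / 475.0
print(f"P(exceedance in 50 yr) = {exceedance_probability(rate, 50):.3f}")
```

The printed value is about 0.10, recovering the familiar 10%-in-50-years hazard level.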


Masonry building aggregates have grown over the years, giving rise to the interaction of different adjoining structural units under seismic action. The first part of this work focuses on the seismic vulnerability and fragility assessment of clay brick masonry buildings in Bologna (Italy), with reference, at first, to single isolated structural units, by means of the Response Surface statistical method, taking into account some of the variabilities and uncertainties involved in the problem. The seismic action was defined by means of a group of selected recorded accelerograms, in order to analyse the effect of the variability of the earthquakes. Identical and different structural units, chosen from the simulations generated by the Response Surface method, are then aggregated in a row, in order to compare the collapse PGA of the isolated structural unit with that of the aggregate structure. The second part focuses on the seismic vulnerability and fragility assessment of stone masonry structures in Seixal (Portugal), applying a methodology similar to that used for the buildings in Bologna. Given the available information, the analyses addressed the most prevalent structural typologies in the area, considering the variability of a set of structural and geometrical parameters. The results highlighted the importance of statistical procedures as methods able to account for the variabilities and uncertainties involved in assessing the fragility of unreinforced masonry structures in the absence of accurate investigations of the structural typologies, as in the Seixal case study.
Furthermore, it was shown that the structural units along unreinforced clay brick or stone masonry aggregates cannot be analysed as isolated, as they are affected by the aggregation with adjacent structural units, depending on the direction of the seismic action considered and on their position along the row aggregate.
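Fragility results of the kind derived here are commonly summarised as lognormal curves in the intensity measure (here PGA). A minimal sketch; the median and dispersion below are illustrative placeholders, not values from the study:

```python
import math

# Lognormal fragility model: P(collapse | PGA) = Phi(ln(pga/theta) / beta),
# with median theta (g) and logarithmic dispersion beta (both hypothetical).
def fragility(pga_g, theta=0.35, beta=0.5):
    return 0.5 * (1.0 + math.erf(math.log(pga_g / theta) / (beta * math.sqrt(2.0))))

for pga in (0.1, 0.35, 0.6):
    print(f"PGA = {pga:.2f} g -> P(collapse) = {fragility(pga):.2f}")
```

By construction the curve passes through 50% at the median PGA, which is a convenient check when fitting curves to simulation results.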


With the entry into force of the latest Italian Building Codes (NTC 2008, 2018), innovative criteria were introduced, especially concerning the seismic verification of large infrastructures. In particular, for strategic structures such as large dams, a seismotectonic study of the site was declared necessary, involving a re-assessment of the basic seismic hazard. This PhD project fits into this context, being part of the seismic re-evaluation process of large dams launched on a national scale following O.P.C.M. 3274/2003 and D.L. 79/2004. A full seismotectonic study of the region around two large earth dams in Southern Italy was carried out. We identified and characterized the structures that could generate earthquakes in our study area, together with the local seismic history. This information was used for the reassessment of the basic seismic hazard, using probabilistic seismic hazard assessment approaches. In recent years, fault-based models for seismic hazard assessment have been proposed all over the world as an emerging methodology; for this reason, we decided to test the innovative SHERIFS approach on our study area. The seismotectonic study also gave the opportunity to focus on the characteristics of the seismic stations that provided the data for the study itself. In this context, we focused on the 10 stations that had been active the longest and carried out a geophysical characterization, the data from which fed into a more general study of the soil-structure interaction at seismic stations and of the ways in which it can affect the SHA. Lastly, an additional experimental study on the two dams and their associated minor structures is presented, aimed at defining their main dynamic parameters, useful for subsequent dynamic structural and geotechnical studies.
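Fault-based hazard approaches such as SHERIFS ultimately rest on a moment budget: a fault's slip rate fixes the seismic moment it can release per year, which constrains earthquake rates. SHERIFS itself does considerably more (partitioning the budget across rupture combinations and a magnitude-frequency distribution); the sketch below shows only the single-magnitude moment balance, with all numbers illustrative:

```python
# Moment-balance sketch for one fault and one characteristic magnitude.
MU = 3.0e10            # crustal rigidity (Pa)
AREA = 30e3 * 15e3     # fault area: 30 km x 15 km (m^2)
SLIP_RATE = 1e-3       # slip rate: 1 mm/yr, in m/yr

moment_rate = MU * AREA * SLIP_RATE        # N*m accumulated per year
m_char = 6.5
m0_char = 10 ** (1.5 * m_char + 9.1)       # moment of one Mw 6.5 event (N*m)
annual_rate = moment_rate / m0_char        # events per year if all moment is
                                           # released in Mw 6.5 earthquakes
print(f"recurrence ~ {1.0 / annual_rate:.0f} yr for Mw {m_char}")
```

For these inputs the implied recurrence interval is roughly 500 years, illustrating how slowly slipping faults can still dominate long-term hazard.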


The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process can be complex to tackle and may sound discouraging to neophytes. In this context, there is an increasing need for simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied, to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising and in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of the losses of precast RC buildings damaged by the 2012 earthquake is created, and a statistical analysis is performed, allowing the derivation of several consequence functions. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at the large scale.
Furthermore, they may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.
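Loss metrics of the kind discussed above combine fragility with consequence functions: the probability of being in each damage state times the expected loss ratio for that state. A sketch of the bookkeeping; all medians, dispersions and loss ratios are illustrative placeholders, not values from the dissertation:

```python
import math

def lognorm_cdf(x, theta, beta):
    """Lognormal CDF used as a damage-state fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(x / theta) / (beta * math.sqrt(2.0))))

# Medians (g) of three sequential damage-state fragilities, common dispersion,
# and a consequence function (mean loss ratio per damage state).
MEDIANS = [0.15, 0.30, 0.55]
BETA = 0.5
LOSS_RATIOS = [0.05, 0.30, 0.80]

def expected_loss_ratio(pga):
    p_exceed = [lognorm_cdf(pga, m, BETA) for m in MEDIANS] + [0.0]
    # P(being exactly in state i) = P(exceed state i) - P(exceed state i+1)
    return sum((p_exceed[i] - p_exceed[i + 1]) * LOSS_RATIOS[i] for i in range(3))

print(f"expected loss ratio at PGA 0.3 g: {expected_loss_ratio(0.3):.2f}")
```

Summing these expected loss ratios over a building inventory and over a hazard curve is what turns fragility and consequence data into the large-scale loss forecasts mentioned above.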


This PhD dissertation presents an in-depth study of the vulnerability of buildings and non-structural elements stemming from the investigation of the Mw 5.2 Lorca 2011 earthquake, one of the most significant earthquakes in Spain. It caused nine fatalities due to falling debris from reinforced concrete buildings, 394 injuries and material damage valued at 800 million euros. Within this framework, the most relevant initiatives concerning the vulnerability of the buildings and the exposure of Lorca are studied. This work revealed two lines of research: the elaboration of a rational method to determine the adequacy of a specific fragility curve for the seismic risk study of a particular region, and the relevance of researching the seismic performance of non-structural elements. Consequently, a method to assess and select fragility curves for seismic risk studies from the catalogue of those available in the literature is first elaborated and calibrated by means of a case study. The methodology is based on a multidimensional index and provides a ranking that classifies the curves in terms of adequacy. Its results for the case of Lorca led to the elaboration of new fragility curves for unreinforced masonry buildings. Moreover, a simplified method to account for the unpredictable directionality of the seismic action in the creation of fragility curves is contributed. Secondly, the seismic capacity and demand of the non-structural elements that caused most of the human losses are characterised. Concerning the capacity, an analytical approach derived from theoretical considerations is provided to characterise the complete out-of-plane seismic response curve of unreinforced masonry cantilever walls, together with a simplified, more practical trilinear version of it. Concerning the demand, several methods for characterising the floor response spectra of reinforced concrete buildings are tested through case studies.
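The starting point of out-of-plane response curves for cantilever walls like those studied here is rigid-body rocking: motion initiates when the base acceleration exceeds the static overturning threshold set by the wall's slenderness. A sketch with illustrative (hypothetical) geometry:

```python
# Rigid-body rocking of a masonry cantilever wall about its base edge:
# a horizontal force lam*W at the centroid (height h/2) overturns against the
# restoring moment W*(t/2), so rocking triggers at lam0 = t / h.
G = 9.81     # m/s^2
t = 0.25     # wall thickness (m), illustrative
h = 3.0      # wall height (m), illustrative

lam0 = t / h  # static multiplier (fraction of g) that triggers rocking
print(f"rocking triggers at ~{lam0:.2f} g ({lam0 * G:.2f} m/s^2)")
```

This threshold is the anchor point of the full rocking curve; the trilinear idealisation mentioned above approximates the subsequent negative-stiffness branch down to overturning.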


In this thesis we focus on the analysis and interpretation of time-dependent deformations recorded by different geodetic methods. Firstly, we apply a variational Bayesian Independent Component Analysis (vbICA) technique to GPS daily displacement solutions, to separate the postseismic deformation that followed the mainshocks of the 2016-2017 Central Italy seismic sequence from the other, hydrological, deformation sources. By interpreting the signal associated with the postseismic relaxation, we model an afterslip distribution on the faults involved in the mainshocks that is consistent with the co-seismic models available in the literature. We find evidence of aseismic slip on the Paganica fault, responsible for the Mw 6.1 2009 L'Aquila earthquake, highlighting the importance of aseismic slip and static stress transfer for properly modelling the recurrence of earthquakes on nearby fault segments. We infer a possible viscoelastic relaxation of the lower crust as a contributing mechanism to the postseismic displacements. We highlight the importance of a proper separation of the hydrological signals for an accurate assessment of the tectonic processes, especially in cases of mm-scale deformations. In parallel, we provide a physical explanation of the ICs associated with the observed hydrological processes. In the second part of the thesis, we focus on strain data from Gladwin Tensor Strainmeters, working on the instruments deployed in Taiwan. We develop a novel, completely data-driven approach to calibrate these strainmeters. We carry out a joint analysis of geodetic (strainmeter, GPS and GRACE products) and hydrological (rain gauge and piezometer) data sets to characterize the hydrological signals in Southern Taiwan. Lastly, we apply the proposed calibration approach to the strainmeters recently installed in Central Italy.
As an example, we present the detection of a storm that hit the Umbria-Marche region (Italy), demonstrating the potential of strainmeters for following the dynamics of deformation processes with a limited spatio-temporal signature.
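The thesis separates postseismic and hydrological signals blindly with vbICA; as a much simpler parametric stand-in, a single GPS displacement series can be decomposed by least squares onto a logarithmic afterslip decay plus an annual seasonal term. The series, amplitudes and decay time below are synthetic, purely to illustrate the idea of isolating the two contributions:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 3.0, 1.0 / 365.0)   # 3 years of daily samples, t=0 = mainshock
tau = 0.05                              # afterslip decay time (yr), hypothetical

# Synthetic displacement (mm): log-decay afterslip + annual hydrological cycle.
true = 8.0 * np.log(1.0 + t / tau) + 3.0 * np.sin(2.0 * np.pi * t)
obs = true + rng.normal(0, 0.5, t.size)

# Design matrix: log decay, annual sine/cosine, constant offset.
G = np.column_stack([np.log(1.0 + t / tau), np.sin(2.0 * np.pi * t),
                     np.cos(2.0 * np.pi * t), np.ones_like(t)])
m, *_ = np.linalg.lstsq(G, obs, rcond=None)
print(f"afterslip amplitude ~ {m[0]:.1f} mm, seasonal ~ {m[1]:.1f} mm")
```

Unlike this parametric fit, ICA-type methods need no assumed functional forms, which is what makes them attractive when the hydrological sources are poorly known.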


Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, owing to the possible generation of cascading events resulting in severe technological accidents (Natech scenarios). Natural events may damage storage and process equipment containing hazardous substances, whose release may lead to major accident scenarios known as Natech events. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to allow the quantification of Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models providing an estimate of the damage probability of equipment or pipeline segments as a result of the impact of the natural event. The first aim of the PhD project was therefore to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes and wind. The project also aimed at the development of new vulnerability models to fill some gaps in the literature: in particular, vulnerability models for vertical equipment subject to wind and to flood were developed. Finally, in order to improve the calculation of Natech risk for linear sources, an original quantitative risk assessment methodology was developed for pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for the assessment of linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
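Equipment vulnerability models in the Natech literature are often cast as probit functions of the natural-event intensity. The sketch below shows the generic probit form; the coefficients and the intensity measure are illustrative placeholders, not the models developed in this thesis:

```python
import math

# Generic probit vulnerability model: Y = k1 + k2 * ln(x), and the damage
# probability is the standard normal CDF evaluated at (Y - 5).
def damage_probability(intensity, k1, k2):
    y = k1 + k2 * math.log(intensity)
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2.0)))

# e.g. flood water velocity (m/s) against a hypothetical vertical-tank model
# with assumed coefficients k1 = 5.0, k2 = 1.5:
for v in (0.5, 1.0, 2.0):
    print(f"v = {v} m/s -> P(damage) = {damage_probability(v, 5.0, 1.5):.2f}")
```

In a quantitative risk assessment, this damage probability is multiplied by the frequency of the natural event and propagated through the release and consequence models.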


In recent years, the seismic vulnerability of existing masonry buildings has been underscored by the destructive impacts of earthquakes. Fibre Reinforced Cementitious Matrix (FRCM) retrofitting systems have therefore gained prominence due to their high strength-to-weight ratio, compatibility with substrates and potential reversibility. However, concerns linger regarding the durability of these systems when subjected to long-term environmental conditions. This doctoral dissertation addressed these concerns by studying the effects of mild temperature variations on three FRCM systems, featuring basalt, glass and aramid fibre textiles with lime-based mortar matrices. The study subjected various specimens, including mortar triplets, bare textile specimens, FRCM coupons and single-lap direct shear wallets, to thermal exposure. A novel approach utilizing embedded thermocouple sensors facilitated efficient monitoring and active control of the conditioning process. A shift in the failure modes was observed in the single-lap direct shear tests, alongside a significant impact on the tensile capacity of both textiles and FRCM coupons. Subsequently, the bond test results were used to indirectly calibrate an analytical approach based on mode-II fracture mechanics. A comparison between Cohesive Material Law (CML) functions at various temperatures was conducted for each of the three systems, demonstrating good agreement between the analytical model and the experimental curves. Furthermore, the durability in an alkaline environment of two additional FRCM systems, characterized by basalt and glass fibre textiles with lime-based mortars, was studied through an extensive experimental campaign. Tests conducted on single-yarn and textile specimens after exposures of different durations and temperatures revealed a significant impact on tensile capacity.
Additionally, FRCM coupons manufactured with conditioned textiles were tested to understand the influence of the aged textile and of the curing environment on the final tensile behavior. These results contribute significantly to the existing knowledge of FRCM systems and could support the development of a standardized alkaline testing protocol, still lacking in the scientific literature.
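In the mode-II fracture-mechanics approach mentioned above, a key result for externally bonded reinforcements is that, for long bond lengths, the debonding capacity depends on the cohesive law only through its fracture energy. A sketch of that closed-form relation; the material values are hypothetical, not the calibrated CMLs from this dissertation:

```python
import math

# For a long bonded length, mode-II fracture mechanics gives the maximum
# transferable load as P_max = b * sqrt(2 * Gf * E * t), where Gf is the
# fracture energy (area under the cohesive material law), E the textile
# elastic modulus and t its equivalent thickness.
def debonding_load(width_mm, gf_n_mm, e_mpa, t_mm):
    """Maximum transferable load (N), consistent N-mm-MPa units."""
    return width_mm * math.sqrt(2.0 * gf_n_mm * e_mpa * t_mm)

# Hypothetical basalt textile strip: 50 mm wide, Gf = 0.4 N/mm, E = 80 GPa,
# equivalent thickness 0.04 mm.
print(f"P_max ~ {debonding_load(50.0, 0.4, 80000.0, 0.04) / 1000.0:.1f} kN")
```

Because only the area under the CML enters P_max, temperature-induced changes in bond capacity map directly onto changes in the calibrated fracture energy.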