34 results for Sumatra-Andaman earthquake
Abstract:
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterised. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space in which the projected data are uncorrelated. Independent Component Analysis (ICA) is a popular technique for approaching this problem. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, I use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, yielding a more reliable estimate of them. Here I present the application of the vbICA technique to GPS position time series. First, I use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and a volcanic source, and I study the ability of the algorithm to recover the original (known) sources of deformation. Second, I apply vbICA to different tectonically active scenarios, such as the 2009 L'Aquila (central Italy) earthquake, the 2012 Emilia (northern Italy) seismic sequence, and the 2006 Guerrero (Mexico) Slow Slip Event (SSE).
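The blind source separation setting described in this abstract can be illustrated with a minimal sketch: a few known temporal sources are mixed onto a synthetic station network and then recovered from the mixed observations. The sketch below uses scikit-learn's FastICA as a stand-in for illustration only; it is not the variational Bayesian ICA (vbICA) of the thesis, which models each source pdf with a mixture of Gaussians, and the station geometry and source shapes are hypothetical.

```python
# Minimal BSS sketch on synthetic GPS-like position time series.
# NOTE: FastICA is used here as a generic ICA stand-in, NOT the vbICA method
# described in the abstract; sources, mixing and noise are hypothetical.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.arange(0, 1000)  # days

# Synthetic temporal sources: a seasonal signal and a postseismic-like decay
seasonal = np.sin(2 * np.pi * t / 365.25)
postseismic = np.where(t > 400, np.log1p((t - 400) / 30.0), 0.0)
S = np.column_stack([seasonal, postseismic])          # (n_epochs, n_sources)

# Mix the sources onto a hypothetical network of 10 stations and add noise
A = rng.normal(size=(2, 10))                          # spatial responses (mixing matrix)
X = S @ A + 0.05 * rng.normal(size=(len(t), 10))      # observed position time series

# PCA recovers uncorrelated components; ICA aims at statistically independent ones
pca_components = PCA(n_components=2).fit_transform(X)
ica_components = FastICA(n_components=2, random_state=0).fit_transform(X)
```

Comparing the recovered components against the known sources (e.g. by correlation) is the kind of test the abstract describes for the synthetic seismic-cycle and volcanic signals.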
Abstract:
So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, such as a terrorist attack or a natural disaster, or of a "domino effect". During domino events the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters, such as intense flooding, hurricanes, earthquakes and lightning, have also been found capable of increasing the risk of an industrial area, triggering losses of containment of hazardous materials and major accidents. The scientific community usually refers to these accidents as "NaTechs": natural events triggering industrial accidents. In this document, a state of the art of the available approaches to the modelling, assessment, prevention and management of domino and NaTech events is described. On the other hand, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study aim to assist progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhancing the safety and sustainability of the chemical and process industry.
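Quantitative assessment of domino escalation is commonly based on probit models that relate the intensity of the escalation vector received by a target unit (e.g. heat radiation) to a damage probability. The sketch below shows that generic scheme only; the coefficients are hypothetical placeholders and it does not reproduce the specific methodologies developed in the thesis.

```python
# Generic probit-based escalation probability for a domino scenario.
# NOTE: illustrative only; the probit coefficients a, b are hypothetical
# placeholders, not values for any specific equipment or from the thesis.
from math import log
from statistics import NormalDist

def escalation_probability(intensity, a, b):
    """Probability that a primary event escalates to a target unit.

    intensity : escalation vector received by the target (e.g. heat radiation, kW/m2)
    a, b      : probit coefficients for the target equipment (hypothetical here)
    """
    y = a + b * log(intensity)          # probit value
    return NormalDist().cdf(y - 5.0)    # conventional probit-to-probability conversion

# Example: hypothetical coefficients and a 40 kW/m2 radiation load on a nearby tank
p = escalation_probability(40.0, a=2.0, b=1.3)
print(f"escalation probability: {p:.2f}")
```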
Abstract:
This thesis consists of a socio-anthropological analysis of the responses to the earthquake that struck the northern part of the Po-Emilia plain, in Italy, on 20 and 29 May. The specific research area was that between the municipalities of Mirandola, Cavezzo, Concordia sul Secchia and San Possidonio, in the province of Modena. The specific subject was Sisma.12, a non-partisan, cross-cutting committee of earthquake victims that advances specific demands, developing and putting into practice "bottom-up" policies, born from the diverse but shared experiences of its members, as alternatives to the choices implemented by the institutions.
Abstract:
How to evaluate the cost-effectiveness of repair/retrofit intervention vs. demolition/replacement, and what level of shaking intensity the chosen repair/retrofit technique can sustain, are open questions affecting the pre-earthquake prevention, post-earthquake emergency, and reconstruction phases. The (mis)conception that the cost of retrofit interventions increases linearly with the achieved seismic performance (%NBS) often discourages stakeholders from considering repair/retrofit options in a post-earthquake damage situation. Similarly, in a pre-earthquake phase, only the minimum (by-law) level of %NBS might be targeted, leading in some cases to no action. Furthermore, the performance measure compelling owners to take action, the %NBS, is generally evaluated deterministically. Because it does not directly reflect epistemic and aleatory uncertainties, the assessment can result in misleading confidence in the expected performance. The present study aims to contribute to the delicate decision-making process of repair/retrofit vs. demolition/replacement by developing a framework to assist stakeholders with the evaluation of the effects, in terms of long-term losses and benefits, of an increment in their initial investment (targeted retrofit level), and by highlighting the uncertainties hidden behind a deterministic approach. For a pre-1970 case-study building, different retrofit solutions are considered, targeting different levels of %NBS, and the actual probability of reaching Collapse under a suite of ground motions is evaluated, providing a correlation between %NBS and risk. Both simplified and probabilistic loss modelling are then undertaken to study the relationship between %NBS and expected direct and indirect losses.
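The collapse probability mentioned above is commonly obtained by fitting a lognormal fragility function to the collapse/no-collapse outcomes observed across the ground-motion suite. The sketch below illustrates that standard maximum-likelihood fitting step under stated assumptions; the intensity levels and collapse counts are hypothetical and it is not the specific framework or data of the thesis.

```python
# Fitting a lognormal collapse fragility curve by maximum likelihood to
# collapse/no-collapse outcomes from a suite of ground motions.
# NOTE: generic sketch; intensity levels and collapse counts are hypothetical,
# not results for the case-study building.
import numpy as np
from scipy import stats, optimize

im = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])   # intensity measure levels (e.g. Sa, g)
n_records = np.array([20, 20, 20, 20, 20, 20])  # ground motions analysed at each level
n_collapse = np.array([0, 1, 4, 9, 14, 18])     # collapses observed at each level

def neg_log_likelihood(params):
    """Binomial negative log-likelihood of a lognormal fragility (median, beta)."""
    median, beta = params
    p = stats.norm.cdf(np.log(im / median) / beta)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(n_collapse * np.log(p) + (n_records - n_collapse) * np.log(1 - p))

res = optimize.minimize(neg_log_likelihood, x0=[0.8, 0.4], method="Nelder-Mead")
median_hat, beta_hat = res.x
print(f"fitted fragility: median = {median_hat:.2f} g, dispersion = {beta_hat:.2f}")

# An annual collapse risk, and hence a %NBS-to-risk correlation, would then follow
# by integrating the fitted fragility with a site hazard curve (not shown here).
```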