13 results for Catastrophic Cognitions
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Gossip protocols have proved to be a viable solution for setting up and managing large-scale P2P services or applications in a fully decentralised scenario. The gossip or epidemic communication scheme is heavily based on stochastic behavior and is the fundamental idea behind many large-scale P2P protocols. It provides many remarkable features, such as scalability, robustness to failures, emergent load-balancing capabilities, fast spreading, and redundancy of information. In some sense, these services or protocols mimic natural system behaviors in order to achieve their goals. The key idea of this work is that the remarkable properties of gossip hold only when all the participants follow the rules dictated by the actual protocols. If one or more malicious nodes join the network and start cheating according to some strategy, the result can be catastrophic. In order to study how serious the threat posed by malicious nodes can be and what can be done to prevent attackers from cheating, we focused on a general attack model aimed at defeating a key service in gossip overlay networks (the Peer Sampling Service [JGKvS04]). We also focused on the problem of protecting against forged information exchanged in gossip services. We propose a solution technique for each problem; both techniques are general enough to be applied to distinct service implementations. Like gossip protocols, our solutions are based on stochastic behavior and are fully decentralized. In addition, each technique's behavior is abstracted by a general primitive function extending the basic gossip scheme; this approach allows the adoption of our solutions with minimal changes in different scenarios. We provide an extensive experimental evaluation to support the effectiveness of our techniques. These techniques are intended as building blocks and P2P architecture guidelines for building more resilient and more secure P2P services.
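The push-pull view exchange underlying a Peer Sampling Service can be illustrated with a minimal sketch. This is not the thesis' implementation: real protocols such as the one in [JGKvS04] add entry ages, view-selection policies, and failure handling; the class and parameter names below are illustrative assumptions.

```python
import random

VIEW_SIZE = 4  # size of each node's partial view (illustrative choice)

class Node:
    def __init__(self, node_id, view):
        self.id = node_id
        self.view = set(view)  # partial view: a small sample of peer ids

    def gossip_round(self, nodes):
        """Exchange views with one random neighbour (push-pull style)."""
        if not self.view:
            return
        peer = nodes[random.choice(sorted(self.view))]
        # Both nodes merge the union of their views and identities...
        merged = self.view | peer.view | {self.id, peer.id}
        # ...then each keeps a bounded random sample, excluding itself.
        mine, theirs = merged - {self.id}, merged - {peer.id}
        self.view = set(random.sample(sorted(mine), min(VIEW_SIZE, len(mine))))
        peer.view = set(random.sample(sorted(theirs), min(VIEW_SIZE, len(theirs))))

# Bootstrap a small overlay as a ring and let gossip randomize the views.
random.seed(0)
nodes = {i: Node(i, {(i + 1) % 10}) for i in range(10)}
for _ in range(20):
    for n in nodes.values():
        n.gossip_round(nodes)
```

After a few rounds each node holds a bounded, continuously reshuffled sample of the network, which is the property an attacker subverting the Peer Sampling Service tries to bias.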
Abstract:
In the last decade, interest in submarine instability has grown, driven by the increasing exploitation of natural resources (primarily hydrocarbons), the emplacement of bottom-lying structures (cables and pipelines), and the development of coastal areas, whose infrastructures increasingly protrude into the sea. The great interest in this topic has promoted a number of international projects, such as: STEAM (Sediment Transport on European Atlantic Margins, 93-96), ENAM II (European North Atlantic Margin, 96-99), GITEC (Genesis and Impact of Tsunamis on the European Coast, 92-95), STRATAFORM (STRATA FORmation on Margins, 95-01), Seabed Slope Process in Deep Water Continental Margin (Northwest Gulf of Mexico, 96-04), COSTA (Continental slope Stability, 00-05), EUROMARGINS (Slope Stability on Europe's Passive Continental Margin), SPACOMA (04-07), EUROSTRATAFORM (European Margin Strata Formation), NGI's internal project SIP-8 (Offshore Geohazards), IGCP-511: Submarine Mass Movements and Their Consequences (05-09), and projects indirectly related to instability processes, such as TRANSFER (Tsunami Risk ANd Strategies For the European region, 06-09) or NEAREST (integrated observations from NEAR shore sourcES of Tsunamis: towards an early warning system, 06-09). In Italy, apart from a national project carried out within the activities of the National Group of Volcanology during the 2000-2003 framework ("Conoscenza delle parti sommerse dei vulcani italiani e valutazione del potenziale rischio vulcanico"), the study of submarine mass movements was largely neglected until the landslide-tsunami events that affected Stromboli on December 30, 2002. This event made the Italian institutions and the scientific community more aware of the hazard related to submarine landslides, mainly in light of the growing anthropization of coastal sectors, which increases the vulnerability of these areas to the consequences of such processes.
In this regard, two important national projects have recently been funded in order to study coastal instabilities (PRIN 24, 06-08) and to map the main submarine hazard features on continental shelves and upper slopes around most of the Italian coast (MaGIC Project). The study carried out in this Thesis is addressed to the understanding of these processes, with particular reference to the submerged flanks of Stromboli. These represent a natural laboratory in this regard, as several kinds of instability phenomena are present on the submerged flanks, affecting about 90% of the entire submerged area and often (strongly) influencing the morphological evolution of subaerial slopes, as witnessed by the event that occurred on 30 December 2002. Furthermore, each phenomenon is characterized by different pre-failure, failure and post-failure mechanisms, ranging from rock-falls, to turbidity currents, up to catastrophic sector collapses. The Thesis is divided into three introductory chapters, comprising a brief review of submarine instability phenomena and related hazards (chap. 1), a bird's-eye view of the methodologies and available dataset (chap. 2), and a short introduction to the evolution and morpho-structural setting of the Stromboli edifice (chap. 3). The latter seems to play a major role in the development of large-scale sector collapses at Stromboli, as they occurred perpendicular to the orientation of the main volcanic rift axis (oriented in the NE-SW direction). The characterization of these events and their relationships with successive erosive-depositional processes is the main focus of chap. 4 (Offshore evidence of large-scale lateral collapses on the eastern flank of Stromboli, Italy, due to structurally-controlled, bilateral flank instability) and chap. 5 (Lateral collapses and active sedimentary processes on the North-western flank of Stromboli Volcano), consisting of articles accepted for publication in international journals (Marine Geology).
Moreover, these studies highlight the hazard related to these catastrophic events; several calamities (with more than 40,000 casualties in the last two centuries alone) have, in fact, been the direct or indirect result of landslides affecting volcanic flanks, as observed at Oshima-Oshima (1741) and Unzen Volcano (1792) in Japan (Satake & Kato, 2001; Brantley & Scott, 1993), Krakatau (1883) in Indonesia (Self & Rampino, 1981), Ritter Island (1888) and Sissano in Papua New Guinea (Ward & Day, 2003; Johnson, 1987; Tappin et al., 2001), and Mt St. Augustine (1883) in Alaska (Beget & Kienle, 1992). Flank landslides are also recognized as the most important and efficient mass-wasting process on volcanoes, contributing to the development of the edifices by widening their base and to the growth of a volcaniclastic apron at the foot of a volcano; a number of small- and medium-scale erosive processes are also responsible for the carving of Stromboli's submarine flanks and the transport of debris towards the deeper areas. The characterization of features associated with these processes is the main focus of chap. 6; it is also important to highlight that some small-scale events are able to damage coastal areas, as witnessed by the recent events of Gioia Tauro (1978), Nice (1979) and Stromboli (2002). The hazard potential related to these phenomena is, in fact, very high, as they commonly occur at higher frequency than large-scale collapses, and are therefore more significant in terms of human timescales. In the last chapter (chap. 7), a brief review and discussion of the instability processes identified on the submerged flanks of Stromboli is presented; they are also compared with analogous processes recognized in other submerged areas in order to shed light on the main factors involved in their development. Finally, some applications of multibeam data to assess the hazard related to these phenomena are also discussed.
Abstract:
The work undertaken in this PhD thesis is aimed at the development and testing of an innovative methodology for the assessment of the vulnerability of coastal areas to catastrophic marine inundation (tsunami). Different approaches are used at different spatial scales and are applied to three different study areas: 1. the entire western coast of Thailand; 2. two selected coastal suburbs of Sydney, Australia; 3. the Aeolian Islands, in the South Tyrrhenian Sea, Italy. Each of these case studies is discussed in at least one scientific paper: one paper about the Thailand case study (Dall'Osso et al., in review-b), three papers about the Sydney applications (Dall'Osso et al., 2009a; Dall'Osso et al., 2009b; Dall'Osso and Dominey-Howes, in review), and one last paper about the work at the Aeolian Islands (Dall'Osso et al., in review-a). These publications represent the core of the present PhD thesis. The main topics dealt with are outlined and discussed in a general introduction, while the overall conclusions are presented in the last section.
Abstract:
Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that floods, and waters in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river's behavior: this is only a means for reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since in the literature there is often confusion on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction's ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must be able to correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
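The last two requirements, flooding probability within a decision horizon and the distribution of the flooding time, can be sketched with a Monte Carlo estimate over an ensemble of forecast hydrographs. This is a generic illustration under assumed values, not the probabilistic model actually developed in the thesis; threshold, horizon, and the synthetic hydrograph generator are all assumptions.

```python
import random

random.seed(42)
THRESHOLD = 5.0   # flooding water level in metres (assumed)
HORIZON = 24      # hours available to implement the intervention (assumed)

# Synthetic ensemble: 1000 triangular hydrographs with uncertain peak
# level and uncertain peak timing, as a stand-in for model forecasts.
def synthetic_hydrograph():
    peak = random.gauss(5.0, 0.8)
    t_peak = random.randint(12, 36)
    return [max(0.0, peak - 0.15 * abs(t - t_peak)) for t in range(48)]

ensemble = [synthetic_hydrograph() for _ in range(1000)]

# Time of first threshold exceedance within the horizon (None = no flooding).
def first_exceedance(hydrograph):
    for t, level in enumerate(hydrograph[:HORIZON]):
        if level > THRESHOLD:
            return t
    return None

times = [first_exceedance(h) for h in ensemble]
flood_times = [t for t in times if t is not None]
p_flood = len(flood_times) / len(ensemble)  # flooding probability in horizon
```

The empirical histogram of `flood_times` approximates the distribution of the flooding time, which is what links the probabilistic forecast to the lead time an intervention strategy needs.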
Strategy as a matter of beliefs: the recorded music industry reinventing itself by rethinking itself
Abstract:
Managerial and organizational cognition studies the ways the cognitions of managers in groups, organizations and industries shape their strategies and actions. Cognitions are simplified representations of managers' internal and external environments, necessary to cope with the rich, ambiguous information requirements that characterize strategy making. Despite the important achievements in the field, many unresolved puzzles remain as to this process, particularly as to the cognitive factors that condition actors in framing a response to a discontinuity, how actors can change their models in the face of a discontinuity, and the reciprocal relation between cognition and action. I leverage the recent case of the recorded music industry's encounter with digital technology to study these issues, through a strategy-oriented study of the way the early response to the discontinuity was constructed and of the subsequent evolution of this response. Through a longitudinal historical and cognitive analysis of actions and cognitions at both the industry and firm level during the period in which the response took place (1999-2010), I gain important insights into the way historical beliefs in the industry shaped the early response to the digital disruption, into the role of outsiders in promoting change through a renewed vision of important issues, and into the reciprocal relationship between cognitive and strategic change.
Abstract:
Most basaltic volcanoes are affected by recurrent lateral instabilities during their evolution. Numerous factors have been shown to be involved in the process of flank destabilization, occurring over long periods of time or through instantaneous failures. However, the role of these factors in the mechanical behaviour and stability of volcanic edifices is poorly constrained, as lateral failure usually results from the combined effects of several parameters. Our study focuses on the morphological and structural comparison of two end-member basaltic systems, La Reunion (Indian Ocean, France) and Stromboli (southern Tyrrhenian Sea, Italy). We showed that, despite major differences in their volumes and geodynamic settings, both systems present some similarities, as they are characterized by intense intrusive activity along well-developed rift zones and by recurrent phenomena of flank collapse during their evolution. Among the factors of instability, the examples of La Reunion and Stromboli evidence the major contribution of intrusive complexes to volcano growth and destruction, as attested by field observations and the monitoring of these active volcanoes. Classical models consider the relationship between vertical intrusions of magma and flank movements along a preexisting sliding surface. A set of published and new field data from Piton des Neiges volcano (La Reunion) allowed us to recognize the role of subhorizontal intrusions in the process of flank instability and to characterize the geometry of both subvertical and subhorizontal intrusions within basaltic edifices. This study compares the results of numerical modelling of the displacements associated with high-angle and low-angle intrusions within basaltic volcanoes. We use a Mixed Boundary Element Method to investigate the mechanical response of an edifice to the injection of magmatic intrusions under different stress fields.
Our results indicate that the anisotropy of the stress field favours slip along the intrusions due to co-intrusive shear stress, generating flank-scale displacements of the edifice, especially in the case of subhorizontal intrusions, capable of triggering large-scale flank collapses on basaltic volcanoes. Applications of our theoretical results to real cases of flank displacement on basaltic volcanoes (such as the 2007 eruptive crises at La Reunion and Stromboli) revealed that the previous model of collapse related to subvertical intrusions is a likely mechanism affecting small-scale, steeply-sloping basaltic volcanoes like Stromboli. Furthermore, our field study, combined with the modelling results, confirms the importance of shallow-dipping intrusions in the morpho-structural evolution of large, gently-sloping basaltic volcanoes like Piton de la Fournaise, Etna and Kilauea, with particular regard to flank instability, which can cause catastrophic tsunamis.
Abstract:
In light of the recent flood events that have occurred in its Member States and the progressive increase in the economic damage they cause, the European Community recently issued a directive (European Directive 2007/60/EC, Flood Directive) for the assessment and preparation of fluvial flood risk management plans. In this context, the research activity focused on evaluating the potential of one- and two-dimensional numerical hydraulic modelling as a tool for implementing Directive 2007/60. Particular attention was paid to assessing the sources of uncertainty that characterize the application of numerical hydraulic models, examining the possible effects of this uncertainty on flood hazard mapping. In particular, the study focuses on several river reaches of the middle-lower course of the River Po and is organized in three parts: 1) analysis of the uncertainty associated with the definition of rating curves at a generic river cross-section, and evaluation of its effects on the calibration of quasi-two-dimensional (quasi-2D) numerical models; 2) definition of probabilistic inundation maps for embanked river reaches in the presence of three sources of uncertainty: uncertainty in the upstream boundary conditions, in the downstream conditions, and in the identification of possible levee breaches; 3) evaluation of the applicability of a quasi-2D model for defining, at a large spatial scale, strategies alternative to the traditional raising of levees for mitigating the flood risk associated with catastrophic flood events.
Besides defining and evaluating the potential of hydraulic methodologies and models of varying complexity, the analyses carried out highlighted the magnitude and impact of the most important sources of uncertainty, underlining that correct flood hazard mapping must always be accompanied by an assessment of its uncertainty.
Abstract:
Throughout the alpine domain, shallow landslides represent a serious geologic hazard, often causing severe damage to infrastructure, private property and natural resources and, in the most catastrophic events, threatening human lives. Landslides are a major factor in landscape evolution in mountainous and hilly regions and represent a critical issue for mountainous land management, since they cause the loss of pastoral lands. In several alpine contexts, the distribution of shallow landsliding is strictly connected to the presence and condition of vegetation on the slopes. With the aid of high-resolution satellite images, it is possible to automatically divide the mountainous territory into land cover classes, which contribute to slope stability to different degrees. The aim of this research is to combine EO (Earth Observation) land cover maps with ground-based measurements of land cover properties. In order to achieve this goal, a new procedure has been developed to automatically detect grass mantle degradation patterns from satellite images. Moreover, innovative surveying techniques and instruments are tested to measure in situ the shear strength of the grass mantle and the geomechanical and geotechnical properties of these alpine soils. The shallow landsliding distribution is assessed with the aid of physically based models, which use the EO-based map to distribute the resistance parameters across the landscape.
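Physically based shallow-landslide models of this kind commonly rest on the infinite-slope stability equation, in which vegetation enters as an additional root cohesion term. The sketch below is a generic illustration of that equation, not the thesis' specific model; all parameter values are assumed for the example.

```python
import math

def factor_of_safety(slope_deg, soil_depth, soil_cohesion, root_cohesion,
                     friction_angle_deg, unit_weight=18.0, pore_pressure=0.0):
    """Infinite-slope factor of safety; FS < 1 indicates potential failure.

    slope_deg          slope angle (degrees)
    soil_depth         depth of the potential sliding surface (m)
    soil_cohesion      effective soil cohesion (kPa)
    root_cohesion      additional cohesion from vegetation roots (kPa)
    friction_angle_deg effective friction angle (degrees)
    unit_weight        soil unit weight (kN/m^3)
    pore_pressure      pore water pressure on the slip surface (kPa)
    """
    beta = math.radians(slope_deg)
    phi = math.radians(friction_angle_deg)
    normal_stress = unit_weight * soil_depth * math.cos(beta) ** 2
    shear_stress = unit_weight * soil_depth * math.sin(beta) * math.cos(beta)
    resisting = (soil_cohesion + root_cohesion
                 + (normal_stress - pore_pressure) * math.tan(phi))
    return resisting / shear_stress

# A degraded grass mantle (lower root cohesion) lowers the factor of safety:
fs_intact = factor_of_safety(35, 1.0, 2.0, 5.0, 30)     # intact mantle
fs_degraded = factor_of_safety(35, 1.0, 2.0, 0.5, 30)   # degraded mantle
```

Distributing `root_cohesion` from an EO-derived land cover map over a digital terrain model is what lets such a model map the landslide susceptibility of each cell.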
Abstract:
Composite laminates present important advantages over conventional monolithic materials, mainly because for equal stiffness and strength they can weigh up to four times less. However, due to their ply-by-ply nature, they are susceptible to delamination, whose propagation can bring the structure to rapid catastrophic failure. In this thesis, in order to increase the service life of composite materials, two different approaches were explored: increasing the intrinsic resistance of the material, or conferring on it the capability of self-repair. Delamination was hindered by interleaving the composite laminates with polymeric nanofibers, which completed the hierarchical reinforcement scale of the composite. The manufacturing process for the integration of the nanofibrous mat in the laminate was optimized, resulting in an enhancement of mode I fracture toughness of up to 250%. The effect of the geometrical dimensions of the nano-reinforcement on the architecture of the micro-reinforcement (UD and woven laminates) was studied in mode I and mode II. Moreover, different polymeric materials were employed as the nanofibrous reinforcement (Nylon 66 and polyvinylidene fluoride). The nano-toughening mechanism was studied by micrograph analysis of the crack path and SEM analysis of the fracture surface. The fatigue behavior up to the onset of delamination and the crack growth rate for woven laminates interleaved with Nylon 66 nanofibers were investigated. Furthermore, the impact behavior of GLARE aluminum-glass epoxy laminates toughened with Nylon 66 nanofibers was investigated. Finally, the possibility of conferring on the composite material the capability of self-repair was explored. An extrinsic self-healing system, based on core-shell nanofibers filled with a two-component epoxy system, was developed by the co-electrospinning technique.
The healing potential of the nano-vascular system was proved by electron microscope observation of the healing agent release resulting from vessel rupture, and the crosslinking reaction was verified by thermal analysis.
Abstract:
Carbon Fiber Reinforced Polymers (CFRPs) display high specific mechanical properties, allowing the creation of lightweight components and products by replacing metals. To reach outstanding mechanical performance, the use of stiff thermoset matrices, like epoxy, is preferred. Laminated composites are commonly used for their ease of manipulation during object manufacturing. However, the naturally anisotropic structure of laminates makes them vulnerable to delamination. Moreover, epoxy-based CFRPs are very stiff materials and thus show low damping capacity, which results in unwanted vibrations and structure-borne noise that may contribute to triggering delamination. Hence, searching for systems able to limit these drawbacks is of primary importance for safety reasons, as well as for economic ones. In this experimental thesis, the production and integration of innovative rubbery nanofibrous mats into CFRP laminates are presented. A smart approach, based on single-needle electrospinning of rubber-containing blends, is proposed for producing dimensionally stable rubbery nanofibers without the need for rubber crosslinking. Nano-modified laminates aim at obtaining structural composites with improved delamination resistance and enhanced damping capacity, without significantly lowering other relevant mechanical properties. The possibility of producing nanofibers nano-reinforced with graphene to be applied for reinforcing composite laminates is also investigated. Moreover, the use of piezoelectric nanofibrous mats in hybrid composite laminates to achieve self-sensing capability is also presented, as a different approach to preventing the catastrophic consequences of possible structural laminate failure. Finally, an accurate, systematic, and critical study concerning the tensile testing of nonwovens, using electrospun Nylon 66 random nanofibrous mats as a case study, is proposed.
Nanofiber diameter and specimen geometry were investigated to thoroughly describe the tensile behaviour of the nanomats, also considering the polymer's thermal properties and the number of nanofiber crossings as a function of nanofiber diameter. Stress-strain data were also analysed using a phenomenological data-fitting model to better interpret the tensile behaviour.
Abstract:
In the last decade, manufacturing companies have been facing two significant challenges. First, digitalization imposes the adoption of Industry 4.0 technologies and allows the creation of smart, connected, self-aware, and self-predictive factories. Second, the attention to sustainability requires evaluating and reducing the impact of the implemented solutions from economic and social points of view. In manufacturing companies, the maintenance of physical assets assumes a critical role. Increasing the reliability and availability of production systems minimizes system downtime; in addition, proper system functioning avoids production waste and potentially catastrophic accidents. Digitalization and new ICT technologies have assumed a relevant role in maintenance strategies. They allow assessing the health condition of machinery at any point in time. Moreover, they allow predicting the future behavior of machinery, so that maintenance interventions can be planned and the useful life of components can be exploited until just before their fault. This dissertation provides insights into the goals and tools of Predictive Maintenance in Industry 4.0 and proposes a novel data acquisition, processing, sharing, and storage framework that addresses typical issues machine producers and users encounter. The research elaborates on two research questions that narrow down the potential approaches to data acquisition, processing, and analysis for fault diagnostics in evolving environments. The research activity is developed according to a research framework, where the research questions are addressed by research levers that are explored according to research topics.
Each topic requires a specific set of methods and approaches; however, the overarching methodological approach presented in this dissertation includes three fundamental aspects: the maximization of the quality level of input data, the use of Machine Learning methods for data analysis, and the use of case studies deriving from both controlled environments (laboratory) and real-world instances.
Abstract:
One of the most visionary goals of Artificial Intelligence is to create a system able to mimic and eventually surpass the intelligence observed in biological systems, including, ambitiously, that observed in humans. The main distinctive strength of humans is their ability to build a deep understanding of the world by learning continuously and drawing on their experiences. This ability, which is found in various degrees in all intelligent biological beings, allows them to adapt and react properly to changes by incrementally expanding and refining their knowledge. Arguably, achieving this ability is one of the main goals of Artificial Intelligence and a cornerstone towards the creation of intelligent artificial agents. Modern Deep Learning approaches have allowed researchers and industry to achieve great advances towards the resolution of many long-standing problems in areas like Computer Vision and Natural Language Processing. However, while the current age of renewed interest in AI has allowed for the creation of extremely useful applications, a concerningly limited effort is being directed towards the design of systems able to learn continuously. The biggest obstacle that hinders an AI system from learning incrementally is the catastrophic forgetting phenomenon. This phenomenon, which was discovered in the 1990s, naturally occurs in Deep Learning architectures when classic learning paradigms are applied to learning incrementally from a stream of experiences. This dissertation revolves around the field of Continual Learning, a sub-field of Machine Learning research that has recently made a comeback following the renewed interest in Deep Learning approaches. This work focuses on a comprehensive view of continual learning, considering algorithmic, benchmarking, and applicative aspects of the field.
This dissertation will also touch on community aspects such as the design and creation of research tools aimed at supporting Continual Learning research, and the theoretical and practical aspects concerning public competitions in this field.
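A standard baseline against catastrophic forgetting, common in the Continual Learning literature, is experience replay: a bounded memory of past examples is rehearsed alongside new data. The sketch below (not taken from the dissertation; names and sizes are illustrative) shows a reservoir-sampling replay buffer, which keeps a uniformly random sample of the whole stream within a fixed capacity.

```python
import random

class ReservoirBuffer:
    """Keep a bounded, uniformly random sample of everything seen so far."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def add(self, item):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(item)
        else:
            # Replace a stored item with probability capacity/seen
            # (classic reservoir sampling), so every stream element
            # has an equal chance of being retained.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = item

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

# A stream of three "tasks"; rehearsal mixes old and new examples so the
# learner keeps seeing data from earlier experiences.
random.seed(0)
buffer = ReservoirBuffer(capacity=50)
for task_id in range(3):
    for step in range(100):
        example = (task_id, step)    # stand-in for a (x, y) training pair
        replay = buffer.sample(8)    # examples replayed from past tasks
        batch = [example] + replay   # the model would train on this batch
        buffer.add(example)
```

Training on `batch` instead of `example` alone is what counteracts forgetting: the gradient signal never comes exclusively from the newest experience.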
Abstract:
The cultural heritage that has survived to the present day despite natural calamities and catastrophic events is now increasingly endangered: natural events, accelerated and made even more destructive by the effects of climate change, the continuous outbreak of new armed conflicts, and the careless way in which humans exploit the land lead to an increase in the risks and potential damage to a heritage that is nevertheless of vital importance for the growth of humanity. To prevent cultural heritage from being dispersed or destroyed, targeted prevention and protection measures must be applied, making efficient use of the available tools; the ultimate goal of prevention and protection must be resilience, which is built through knowledge and careful planning of heritage management. This research work therefore sets out to analyse the methods and strategies that can be used for risk assessment and management applied to cultural heritage, verifying the level of awareness reached both nationally and internationally, reviewing the technologies that make it possible to protect heritage by facilitating risk mitigation work, and applying a prototype of risk calculation and analysis to the case study of the Museum of Nonantola, in the province of Modena.