10 results for slope-scaling

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

20.00%

Publisher:

Abstract:

Natural hazards related to volcanic activity represent a potential risk factor, particularly in the vicinity of human settlements. Besides the risk related to explosive and effusive activity, the instability of volcanic edifices may develop into large landslides that are often catastrophically destructive, as shown by the collapse of the northern flank of Mount St. Helens in 1980. A combined approach was applied to analyse slope failures that occurred at Stromboli volcano. The stability of the Sciara del Fuoco (SdF) slope was evaluated by using high-resolution multi-temporal DTMs and performing limit equilibrium stability analyses. High-resolution topographical data collected with remote sensing techniques and three-dimensional slope stability analysis play a key role in understanding the instability mechanism and the related risks. Analyses carried out on the 2002–2003 and 2007 Stromboli eruptions, starting from high-resolution data acquired through airborne remote sensing surveys, permitted the estimation of the lava volumes emplaced on the SdF slope and contributed to the investigation of the link between magma emission and slope instabilities. Limit equilibrium analyses were performed on the 2001 and 2007 3D models in order to simulate the slope behavior before the 2002–2003 landslide event and after the 2007 eruption. Stability analyses were conducted to understand the mechanisms that controlled the slope deformations which occurred shortly after the 2007 eruption onset, involving the upper part of the slope. Limit equilibrium analyses applied to both cases yielded results which are congruent with observations and monitoring data.
The results presented in this work clearly indicate that hazard assessment for the island of Stromboli should take into account the fact that a new magma intrusion could lead to further destabilisation of the slope, which may be more significant than the one recently observed because it would affect an already disarranged deposit and a fractured and loosened crater area. The two-pronged approach, based on the analysis of 3D multi-temporal mapping datasets and on the application of LE methods, contributed to a better understanding of volcano flank behaviour and to preparedness for actions aimed at risk mitigation.
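The limit equilibrium principle behind these analyses can be illustrated with the simplest LE variant, the infinite-slope model — a minimal sketch only, not the 3D method applied in the thesis; the Mohr-Coulomb parameters and the pore-pressure ratio `m` in the example are illustrative assumptions.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, depth, slope_deg, gamma_w=9.81, m=0.0):
    """Factor of safety of an infinite slope with Mohr-Coulomb strength.
    c in kPa, unit weights in kN/m^3, depth in m; m is the fraction of
    the failure depth that is saturated (pore-pressure contribution)."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma * depth * math.cos(beta) ** 2          # normal stress on slip plane
    u = m * gamma_w * depth * math.cos(beta) ** 2          # pore water pressure
    tau = gamma * depth * math.sin(beta) * math.cos(beta)  # driving shear stress
    return (c + (sigma_n - u) * math.tan(phi)) / tau
```

For a cohesionless dry slope this reduces to the classical FS = tan(phi)/tan(beta), and raising the water table (larger `m`) lowers the factor of safety — the destabilising role that magma intrusion and pore-pressure changes play in the Stromboli case.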

Relevance:

20.00%

Publisher:

Abstract:

Graphene's excellent properties make it a promising candidate for building future nanoelectronic devices. Nevertheless, the absence of an energy gap is an open problem for transistor applications. In this thesis, graphene nanoribbons (GNRs) and pattern-hydrogenated graphene, two alternatives for inducing an energy gap in graphene, are investigated by means of numerical simulations. A tight-binding NEGF code is developed for the simulation of GNR-FETs. To speed up the simulations, a non-parabolic effective mass model and a mode-space tight-binding method are developed. The code is used for simulation studies of both conventional and tunneling FETs. The simulations show the great potential of conventional narrow GNR-FETs, but at the same time highlight the leakage problems in the off-state due to various tunneling mechanisms. The leakage problems become more severe as the width of the devices is made larger, and thus the band gap smaller, resulting in a poor on/off current ratio. The tunneling FET architecture can partially solve these problems thanks to its improved subthreshold slope; however, it is also shown that edge roughness, unless well controlled, can have a detrimental effect on the off-state performance. In the second part of this thesis, pattern-hydrogenated graphene is simulated by means of a tight-binding model. A realistic model of patterned hydrogenation, including disorder, is developed. The model is validated by direct comparison of the momentum-energy-resolved density of states with experimental angle-resolved photoemission spectroscopy. The scaling of the energy gap and of the localization length with the parameters defining the pattern geometry is also presented. The results suggest that a substantial transport gap is attainable with experimentally achievable hydrogen concentrations.
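The width dependence of the band gap that drives the on/off trade-off mentioned above can be sketched with the textbook nearest-neighbour tight-binding result for armchair GNRs (a hard-wall analytic sketch, not the NEGF code developed in the thesis; the hopping energy t = 2.7 eV is a commonly assumed value):

```python
import math

def agnr_gap(N, t=2.7):
    """Nearest-neighbour tight-binding band gap (eV) of an armchair
    graphene nanoribbon with N dimer lines across its width.
    Hard-wall boundaries quantize the transverse momentum as
    k_n = n*pi/(N+1); the gap is the smallest band-edge subband
    energy 2|t||1 + 2cos(k_n)|, which vanishes when N = 3p + 2."""
    return min(2 * abs(t) * abs(1 + 2 * math.cos(n * math.pi / (N + 1)))
               for n in range(1, N + 1))
```

Within this simple model the N = 3p + 2 family is metallic, while the other families have gaps that shrink as the ribbon widens — the reason wider (smaller-gap) devices in the thesis suffer worse off-state leakage.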

Relevance:

20.00%

Publisher:

Abstract:

The objective of this thesis work is the refined estimation of seismic source parameters. To this purpose we used two different approaches, one in the frequency domain and the other in the time domain. In the frequency domain, we analyzed the P- and S-wave displacement spectra to estimate the spectral parameters, i.e. corner frequencies and low-frequency spectral amplitudes. We used a parametric modeling approach, combined with a multi-step, non-linear inversion strategy that includes the correction for attenuation and site effects. The iterative multi-step procedure was applied to about 700 microearthquakes in the moment range 10^11–10^14 N·m, recorded at the dense, wide-dynamic-range seismic networks operating in the Southern Apennines (Italy). The analysis of source parameters is often complicated when the propagation cannot be modeled accurately. In this case the empirical Green function approach is a very useful tool to study the seismic source properties, since Empirical Green Functions (EGFs) make it possible to represent the contribution of propagation and site effects to the signal without using approximate velocity models. An EGF is a recorded three-component set of time histories of a small earthquake whose source mechanism and propagation path are similar to those of the master event. Thus, in the time domain, the deconvolution method of Vallée (2004) was applied to calculate the relative source time functions (RSTFs) and to accurately estimate source size and rupture velocity. This technique was applied to (1) a large event, the Mw 6.3 2009 L'Aquila mainshock (Central Italy); (2) moderate events, a cluster of earthquakes of the 2009 L'Aquila sequence with moment magnitudes between 3 and 5.6; and (3) a small event, the Mw 2.9 Laviano mainshock (Southern Italy).
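The spectral parameters estimated in the frequency-domain approach — the low-frequency plateau Ω0 and the corner frequency fc — are commonly obtained by fitting an omega-square source model, Ω(f) = Ω0 / (1 + (f/fc)^2). The grid-search fit below is a minimal sketch of that idea, not the multi-step non-linear inversion (with attenuation and site corrections) used in the thesis:

```python
def brune_spectrum(f, omega0, fc):
    """Omega-square (Brune-type) displacement source spectrum."""
    return omega0 / (1.0 + (f / fc) ** 2)

def fit_brune(freqs, amps, fc_grid):
    """Grid search over the corner frequency; for each trial fc the
    optimal flat level omega0 follows from linear least squares,
    since the model is linear in omega0."""
    best = None
    for fc in fc_grid:
        g = [1.0 / (1.0 + (f / fc) ** 2) for f in freqs]
        omega0 = sum(a * gi for a, gi in zip(amps, g)) / sum(gi * gi for gi in g)
        misfit = sum((a - omega0 * gi) ** 2 for a, gi in zip(amps, g))
        if best is None or misfit < best[0]:
            best = (misfit, omega0, fc)
    return best[1], best[2]
```

The recovered Ω0 scales with the seismic moment and fc with the inverse source dimension, which is why these two numbers summarize source size and stress release.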

Relevance:

20.00%

Publisher:

Abstract:

Constructing ontology networks typically occurs at design time, at the hands of knowledge engineers who assemble their components statically. There are, however, use cases where ontology networks need to be assembled upon request and processed at runtime, without altering the stored ontologies and without the networks tampering with one another. These are what we call "virtual [ontology] networks", and keeping track of how an ontology changes in each virtual network is called "multiplexing". Issues may arise from the connectivity of ontology networks: in many cases, simple flat import schemes will not work, because many ontology managers can cause property assertions to be erroneously interpreted as annotations and ignored by reasoners. Also, multiple virtual networks should optimize their cumulative memory footprint, and where they cannot, this should occur only for very limited periods of time. We claim that these problems should be handled by the software that serves these ontology networks, rather than by ontology engineering methodologies. We propose a method that spreads multiple virtual networks across a 3-tier structure and can reduce the number of erroneously interpreted axioms under certain raw statement distributions across the ontologies. We assumed OWL as the core language handled by semantic applications in the framework at hand, due to the greater availability of reasoners and rule engines. We also verified that, in common OWL ontology management software, OWL axiom interpretation occurs in the worst-case scenario of pre-order visit. To measure the effectiveness and space efficiency of our solution, a Java and RESTful implementation was produced within an Apache project. We verified that a 3-tier structure can accommodate reasonably complex ontology networks better, in terms of the expressivity of OWL axiom interpretation, than flat-tree import schemes can.
We measured both the memory overhead of the additional components we put on top of traditional ontology networks, and the framework's caching capabilities.
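The multiplexing idea — virtual networks that change an ontology's view without touching the store or each other — can be sketched with per-network overlays over shared, immutable stored ontologies. This is illustrative Python only, not the Java/RESTful Apache implementation described above; the axiom strings are placeholders:

```python
class VirtualNetwork:
    """Minimal multiplexing sketch: each virtual network references
    shared, immutable stored ontologies and records its own added and
    removed axioms, so no network alters the store or another network."""

    def __init__(self, store):
        self.store = store    # shared: name -> frozenset of axioms
        self.added = {}       # per-network overlay of additions
        self.removed = {}     # per-network overlay of removals

    def add_axiom(self, onto, axiom):
        self.added.setdefault(onto, set()).add(axiom)

    def remove_axiom(self, onto, axiom):
        self.removed.setdefault(onto, set()).add(axiom)

    def axioms(self, onto):
        """The view of an ontology inside this virtual network."""
        view = set(self.store[onto])
        view |= self.added.get(onto, set())
        view -= self.removed.get(onto, set())
        return view
```

Because the overlays are the only mutable state, the cumulative memory footprint of many virtual networks grows with their differences, not with the size of the shared ontologies — the space-efficiency property the framework's caching is measured against.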

Relevance:

20.00%

Publisher:

Abstract:

This study focuses on radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized; then, results for the characterization of the plasma source and for the investigation of the nanoparticle synthesis phenomenon are presented, aiming at highlighting fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, based on a calorimetric method, is presented, while results of three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy probe measurements to validate the temperature field predicted by the model, which is used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Indeed, precursor evaporation is one of the phases with the highest impact on the industrial feasibility of the process: by employing models to describe particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated.
Finally, a discussion of the role of thermo-fluid dynamic fields in nanoparticle formation is presented, as well as a study of the effect of the reaction chamber geometry on the produced nanoparticle characteristics and the process yield.
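The calorimetric energy balance of the torch mentioned above amounts to measuring the power carried away by the cooling water and attributing the remainder of the plate power to the plasma. A minimal sketch, assuming water as coolant and illustrative input values:

```python
def torch_energy_balance(p_plate_kw, flow_lpm, t_in_c, t_out_c,
                         cp=4186.0, rho=997.0):
    """Calorimetric balance of a plasma torch: the power removed by
    the cooling water is m_dot * cp * dT; the rest of the plate power
    is delivered to the plasma. Returns (coolant power [kW],
    torch thermal efficiency). cp in J/(kg K), rho in kg/m^3."""
    m_dot = rho * flow_lpm / 1000.0 / 60.0                 # kg/s from L/min
    p_coolant = m_dot * cp * (t_out_c - t_in_c) / 1000.0   # kW
    efficiency = (p_plate_kw - p_coolant) / p_plate_kw
    return p_coolant, efficiency
```

Repeating the same balance on the reaction chamber closes the overall energy budget of the system, which is how the model-predicted temperature fields are checked against measurements.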

Relevance:

20.00%

Publisher:

Abstract:

Semiconductor nanowires (NWs) are one-dimensional or quasi-one-dimensional systems whose physical properties are unique compared to bulk materials because of their nanoscale size: they bring together the quantum world and semiconductor devices. NW-based technologies may achieve an impact comparable to that of current microelectronic devices if new challenges are met. This thesis primarily focuses on two different, cutting-edge aspects of research on semiconductor NW arrays as pivotal components of NW-based devices. The first part deals with the characterization of electrically active defects in NWs. A general procedure was developed that enables Deep Level Transient Spectroscopy (DLTS) to probe the defects of NW arrays. This procedure was applied to characterize a specific system, i.e. Schottky barrier diodes based on Reactive Ion Etched (RIE) silicon NW arrays. This study shed light on whether and how growth conditions introduce defects in RIE-processed silicon NWs. The second part of this thesis concerns the bowing induced by an electron beam and the subsequent clustering of gallium arsenide NWs. After a justified rejection of the mechanisms previously reported in the literature, an original interpretation of the electron-beam-induced bending is illustrated. Moreover, this thesis successfully interprets the formation of NW clusters in the framework of the lateral collapse of fibrillar structures. The latter are both idealized models and actual artificial structures used to study and to mimic the adhesion properties of natural surfaces in lizards and insects (the gecko effect). Our conclusions are that the mechanical and surface properties of the NWs, together with the geometry of the NW arrays, play a key role in their post-growth alignment. The same parameters then open the welcome possibility of locally engineering NW arrays in micro- and macro-templates.
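DLTS identifies a deep level by how its thermal emission rate varies with temperature, e_n = sigma * gamma * T^2 * exp(-Ea/kT), so that an Arrhenius plot of ln(e_n/T^2) versus 1/kT yields the activation energy. The sketch below illustrates only this underlying analysis, not the NW-array measurement procedure of the thesis; the prefactor gamma and the cross-section values are illustrative assumptions:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def emission_rate(T, e_a, sigma, gamma=3.25e21):
    """Thermal emission rate e_n = sigma * gamma * T^2 * exp(-Ea/kT);
    gamma*T^2 lumps the thermal velocity and the effective density of
    states (value here is an assumed, Si-like constant)."""
    return sigma * gamma * T ** 2 * math.exp(-e_a / (K_B * T))

def activation_energy(T1, e1, T2, e2):
    """Activation energy from two Arrhenius-plot points:
    the slope of ln(e_n/T^2) versus 1/kT equals -Ea."""
    y1, y2 = math.log(e1 / T1 ** 2), math.log(e2 / T2 ** 2)
    return -(y2 - y1) / (1 / (K_B * T2) - 1 / (K_B * T1))
```

In a real DLTS scan the emission rates at different temperatures come from the capacitance-transient rate window, and many (T, e_n) pairs are fitted rather than two.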

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this thesis is to obtain a better understanding of the methods used to assess the stability of a slope. We illustrate the principal variants of the Limit Equilibrium (LE) method found in the literature, focusing our attention on the Minimum Lithostatic Deviation (MLD) method developed by Prof. Tinti and his collaborators (e.g. Tinti and Manucci, 2006, 2008). We had two main goals. The first was to test the MLD method on some real cases: we selected the Vajont landslide, with the objective of reconstructing the conditions that caused the destabilization of Mount Toc, and two sites on the Norwegian margin where failures have not occurred recently, with the aim of evaluating their present stability state and assessing under which conditions they might be mobilized. The second goal was to study the stability charts by Taylor and by Michalowski, and to use the MLD method to investigate the correctness and adequacy of this engineering tool.
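All the LE variants discussed here share the same notion of factor of safety as a ratio of resisting to driving actions. The ordinary method of slices (Fellenius), the simplest circular-surface variant, can be sketched as follows (not the MLD method itself; slice geometry and strength values in the example are illustrative):

```python
import math

def fellenius_fs(slices, c, phi_deg):
    """Ordinary method of slices (Fellenius): FS is the ratio of the
    summed resisting forces to the summed driving forces along a
    circular slip surface. Each slice is a tuple
    (weight W [kN/m], base inclination alpha [deg], base length l [m]);
    c is cohesion [kPa], phi the friction angle [deg]."""
    phi = math.radians(phi_deg)
    resist = drive = 0.0
    for W, alpha_deg, l in slices:
        a = math.radians(alpha_deg)
        resist += c * l + W * math.cos(a) * math.tan(phi)  # shear resistance
        drive += W * math.sin(a)                           # driving component
    return resist / drive
```

More refined variants (Bishop, Morgenstern-Price, and the MLD method studied in the thesis) differ mainly in how inter-slice forces and equilibrium conditions are treated, which is exactly what a comparison against stability charts probes.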

Relevance:

20.00%

Publisher:

Abstract:

The instability of river banks can result in considerable human and land losses. The Po River is the most important river in Italy, characterized by main banks of significant and constantly increasing height. This study presents multilayer perceptron artificial neural network (ANN) models for the stability analysis of river banks along the Po River under various river and groundwater boundary conditions. To this aim, a number of networks of threshold logic units are tested using different combinations of the input parameters. The factor of safety (FS), as an index of slope stability, is formulated in terms of several influencing geometrical and geotechnical parameters. In order to obtain a comprehensive geotechnical database, several cone penetration tests from the study site have been interpreted. The proposed models are developed upon stability analyses performed with a finite element code over different representative sections of river embankments. For validity verification, the ANN models are employed to predict the FS values of a part of the database beyond the calibration data domain. The results indicate that the proposed ANN models are effective tools for evaluating slope stability, and they notably outperform the derived multiple linear regression models.
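The forward pass of a multilayer perceptron of the kind used here for FS regression can be sketched as a single hidden tanh layer with a linear output; the weights in the example are illustrative, not trained values from the study:

```python
import math

def mlp_predict(x, w1, b1, w2, b2):
    """One-hidden-layer perceptron: x is the input vector of
    geometrical/geotechnical parameters, w1/b1 the hidden-layer
    weights and biases (one row per hidden unit, tanh activation),
    w2/b2 the linear output layer. Returns the predicted FS."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2
```

In practice the weights are fitted on the finite element stability analyses, and the model is then asked to extrapolate beyond the calibration domain, as the validation step above describes.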

Relevance:

20.00%

Publisher:

Abstract:

A method for the automatic scaling of oblique ionograms has been introduced. The method also provides a rejection procedure for ionograms that are considered to lack sufficient information, and it shows a very good success rate. Observing the Kp index of each autoscaled ionogram, it can be noticed that the behavior of the autoscaling program does not depend on geomagnetic conditions. The comparison between the MUF values provided by the presented software and those obtained by an experienced operator indicates that the procedure developed for detecting the nose of oblique ionogram traces is sufficiently efficient, and becomes much more efficient as the quality of the ionograms improves. These results demonstrate that the program allows the real-time evaluation of the MUF values associated with a particular radio link through an oblique radio sounding. The automatic recognition of a part of the trace allows determining, for certain frequencies, the time taken by the radio wave to travel the path between the transmitter and the receiver. The reconstruction of the ionogram traces suggests the possibility of estimating the electron density between the transmitter and the receiver from an oblique ionogram. The results shown have been obtained with a ray-tracing procedure based on the integration of the eikonal equation, using an analytical ionospheric model with free parameters. This indicates the possibility of applying an adaptive model and a ray-tracing algorithm to estimate the electron density in the ionosphere between the transmitter and the receiver. An additional study has been conducted on a data set of high-quality ionospheric soundings, and another algorithm has been designed for the conversion of an oblique ionogram into a vertical one using Martyn's theorem. This allows a further analysis of oblique soundings through the use of the INGV Autoscala program for the automatic scaling of vertical ionograms.
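Martyn's theorem, together with the secant law, converts one point of an oblique ionogram (frequency, group path) into its equivalent vertical-incidence point (frequency, virtual height). A flat-earth, no-magnetic-field sketch of that conversion, not the full algorithm of the thesis:

```python
import math

def oblique_to_vertical(f_oblique_mhz, group_path_km, distance_km):
    """Equivalent vertical-incidence point of an oblique-sounding point,
    under the flat-earth, no-field approximation:
      secant law:        f_v  = f_ob * cos(phi0)
      Martyn's theorem:  h'_v = (P'/2) * cos(phi0)
    where phi0 is the angle of incidence of the equivalent triangle
    with base D (great-circle distance) and slant sides P'/2 each,
    so sin(phi0) = D / P'."""
    half_path = group_path_km / 2.0
    cos_phi0 = math.sqrt(1.0 - (distance_km / group_path_km) ** 2)
    return f_oblique_mhz * cos_phi0, half_path * cos_phi0
```

Applying this point by point along a reconstructed oblique trace yields a synthetic vertical ionogram, which is what allows the INGV Autoscala program for vertical ionograms to be reused on oblique soundings.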

Relevance:

20.00%

Publisher:

Abstract:

Landslides of the lateral spreading type, involving brittle geological units overlying ductile terrains, are a common occurrence in the sandstone and limestone plateaux of the northern Apennines of Italy. These instability phenomena become particularly risky when historical towns and cultural heritage sites built on top of the plateaux are endangered. Nevertheless, the mechanisms controlling the development of the related instabilities, i.e. toppling and rock falls, at the edges of rock plateaux are not yet fully understood. In addition, the groundwater flow paths developing at the contact between the more permeable unit, i.e. the jointed rock slab, and the relatively impermeable clay-rich units have not yet been studied in detail, even though they may play a role in this kind of instability process, acting as possible predisposing and/or triggering factors. Field surveys, terrestrial laser scanning and close-range photogrammetry, laboratory tests on the involved materials, hydrogeological monitoring and modelling, displacement evaluation, and stability analyses with continuum and discontinuum numerical codes were performed on the San Leo case study, with the aim of bringing further insights into the understanding and assessment of the slope processes taking place in this geological context. The current research made it possible to relate the aquifer behaviour of the rocky slab to slope instability processes. The aquifer hosted in the fractured slab leads to the development of perennial and ephemeral springs at the contact between the two units. The related piping erosion phenomena, together with slope processes in the clay shales, lead to the progressive undermining of the slab. The cliff becomes progressively unstable due to the undermining and undergoes large-scale landslides by fall or toppling.