Abstract:
Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has increased dramatically, proving how appropriate these systems are for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the “Full waveform VTEM” dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, careful processing, inversion, post-processing, data integration and data calibration constitute the proper approach, capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of the integration of several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can further be used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, comparing it to having only a ground-based TEM dataset and/or only borehole data.
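The inversion and data-integration workflow described above can be illustrated, in a deliberately simplified linear setting, by a smoothness-regularized least-squares inversion in which two stacked data sets jointly constrain one model. The forward operators, data and regularization weight below are entirely hypothetical toy stand-ins; real AEM inversion is nonlinear and far more involved.

```python
import numpy as np

def tikhonov_inversion(G, d, alpha):
    """Solve min ||G m - d||^2 + alpha ||L m||^2, with L a first-difference
    (roughness) operator, as commonly used in smooth geophysical inversion."""
    n = G.shape[1]
    L = np.diff(np.eye(n), axis=0)       # first-difference roughness matrix
    A = G.T @ G + alpha * (L.T @ L)
    return np.linalg.solve(A, G.T @ d)

# Toy example: two "data sets" (e.g. airborne and ground measurements)
# stacked into one system constrain the model jointly.
rng = np.random.default_rng(0)
m_true = np.linspace(1.0, 2.0, 20)       # synthetic model (e.g. log-resistivity)
G1 = rng.normal(size=(25, 20))           # hypothetical airborne sensitivity kernel
G2 = rng.normal(size=(25, 20))           # hypothetical ground-data kernel
G = np.vstack([G1, G2])                  # data integration: stack both systems
d = G @ m_true                           # noise-free synthetic data
m_est = tikhonov_inversion(G, d, alpha=1e-3)
```

With noise-free data and a small regularization weight, the recovered model is close to the synthetic truth; the point of the sketch is only the structure of the joint system, not a realistic AEM workflow.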
Abstract:
The energy released during a seismic crisis in volcanic areas is strictly related to the physical processes within the volcanic structure. In particular, Long Period (LP) seismicity, which seems to be related to the oscillation of a fluid-filled crack (Chouet, 1996; Chouet, 2003; McNutt, 2005), can precede or accompany an eruption. The present doctoral thesis focuses on the study of the LP seismicity recorded at the Campi Flegrei volcano (Campania, Italy) during the October 2006 crisis. Campi Flegrei is an active caldera; the combination of an active magmatic system and a densely populated area makes Campi Flegrei a critical volcano. The source dynamics of LP seismicity are thought to be very different from those of other kinds of seismicity (Tectonic or Volcano-Tectonic, VT): LP events are characterized by a time-sustained source and low-frequency content. These features imply that the duration magnitude, commonly used for VT events and sometimes for LPs as well, is unsuitable for evaluating LP magnitude. The main goal of this doctoral work was to develop a method for determining the magnitude of LP seismicity, based on comparing the energy of a VT event with that of an LP event and linking the energy to the VT moment magnitude. The magnitude of an LP event is thus defined as the moment magnitude of a VT event with the same energy as the LP. We applied this method to the LP dataset recorded at the Campi Flegrei caldera in 2006, to an LP dataset recorded at Colima volcano in 2005-2006, and to an event recorded at Etna volcano. By testing the method on many waveforms recorded at different volcanoes, we verified its easy applicability and, consequently, its usefulness in the routine and quasi-real-time work of a volcanological observatory.
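The core idea above, assigning an LP event the magnitude of a VT event radiating the same seismic energy, can be sketched with a standard energy-magnitude link. Here we use the classical Gutenberg-Richter relation log10 E = 1.5 M + 4.8 (E in joules) purely for illustration; the actual energy estimate and calibration used in the thesis may differ.

```python
import math

def lp_magnitude_from_energy(energy_joules):
    """Magnitude of a hypothetical VT event with the same radiated seismic
    energy, via the Gutenberg-Richter relation log10 E = 1.5 M + 4.8."""
    return (math.log10(energy_joules) - 4.8) / 1.5

# Example: an LP event radiating 1e7 J maps to a magnitude of about 1.47.
m_lp = lp_magnitude_from_energy(1e7)
print(round(m_lp, 2))
```

The inversion of the relation makes the method routinely applicable: once the radiated energy of an LP waveform is estimated, the equivalent moment magnitude follows from one line of arithmetic.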
Abstract:
The surface properties of minerals have important implications in geology, the environment, industry and biotechnology, and for certain aspects of research on the origin of life. This research project aims to widen the knowledge of the nanoscale surface properties of chlorite and phlogopite by means of advanced methodologies, and also to investigate the interaction of fundamental biomolecules, such as nucleotides, RNA, DNA and the amino acid glycine, with the surface of the selected phyllosilicates. Multiple advanced experimental approaches based on scanning probe microscopy and spatially resolved spectroscopy were used and, in some cases, specifically developed. The results demonstrate that chlorite exposes at the surface atomically flat terraces with 0.5 nm steps, typically generated by the fragmentation of the octahedral (brucite-type) sheet of the interlayer. This fragmentation at the nanoscale generates high anisotropy and inhomogeneity, with surface type and isomorphous cationic substitutions determining variations of the effective surface potential difference, ranging between 50-100 mV and 400-500 mV when measured in air, between the TOT surface and the interlayer brucitic sheet. The surface potential was identified as the driving force of the observed high affinity of the surface for the fundamental biomolecules, such as single molecules of nucleotides, DNA, RNA and amino acids. Phlogopite was also observed to present an extended atomically flat surface, featuring negative surface potential values of some hundreds of millivolts and no significant local variations. The phlogopite surface was sometimes observed to present curvature features that may be ascribed to local substitutions of the interlayer cations, or to the presence of a crystal lattice mismatch or structural defects such as stacking faults or dislocation loops. The surface chemistry was found to be similar to that of the bulk.
The study of the interaction with nucleotides and glycine revealed a lower affinity than that of the brucite-like surface of chlorite.
Abstract:
This work studies a new satellite instrument called MIPAS2k: a Fourier-transform spectrometer capable of measuring the emission spectra of atmospheric gases with the limb-sounding technique. The purpose of MIPAS2k is to determine the spatial distribution of atmospheric quantities, including the VMR of the chemical species ozone, water, nitric acid and nitrous oxide. The need to design a successor arose after contact with the MIPAS instrument was lost; MIPAS2k, while preserving some of its characteristics, differs from it in fundamental respects: the observational parameters, the type of detector used to perform the limb scans, and the logic by which the distributions of the atmospheric parameters are retrieved. The algorithm used to invert the MIPAS2k data, called FULL2D, uses the same basic logic as the one used for MIPAS, called Geo-Fit. The fundamental difference between the two methods lies in how the parameters are represented: Geo-Fit reconstructs the atmospheric field of the quantities to be determined through vertical profiles, whereas FULL2D represents the values of the atmospheric field within the elements of a two-dimensional discretization of the atmosphere. Since no measurements from the new instrument are available, its performance had to be assessed by analysing simulated observations, created with the forward model of radiative transfer and a high-resolution reference atmosphere. The two-dimensional distributions of the atmospheric quantities of interest were then retrieved by applying the FULL2D inversion model to the simulated MIPAS2k observations. The retrieved parameter values were compared with those of the reference atmosphere and analysed using maps and quantifiers.
With the results of these analyses it was possible to determine the spatial resolution and the precision of the MIPAS2k products for two different spectral resolutions.
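The simulation-based performance assessment described above can be sketched as a loop over noise realizations: generate observations from a reference state with a forward model, retrieve, and measure the spread of the retrieval error. The linear forward model, grid size and noise level below are hypothetical stand-ins for the radiative-transfer simulation, intended only to show the structure of the assessment.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 25                               # cells of a (flattened) 2D atmospheric grid
x_ref = rng.uniform(1.0, 5.0, n_cells)     # reference atmospheric field (e.g. a VMR)
K = rng.normal(size=(60, n_cells))         # hypothetical linear forward model

errors = []
for _ in range(200):                       # many simulated-observation realizations
    y = K @ x_ref + rng.normal(scale=0.05, size=60)  # simulated noisy observation
    x_hat, *_ = np.linalg.lstsq(K, y, rcond=None)    # least-squares retrieval
    errors.append(x_hat - x_ref)

precision = np.std(errors, axis=0)         # per-cell precision of the retrieved field
```

Comparing the retrieved fields with the reference atmosphere over many realizations yields exactly the kind of per-cell precision quantifier that the abstract reports for the MIPAS2k products.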
Abstract:
This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast, then explain the consistency and comparison tests used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental to transform exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to take into account the epistemic uncertainty in PSHA. The most widely used is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
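The rate-to-probability step discussed in the second chapter is the standard Poisson conversion P = 1 - exp(-λt): under Poissonian occurrence of exceedances, an annual exceedance rate λ becomes a probability of at least one exceedance in a time window t. The numbers below are the textbook "10% in 50 years" hazard level, not values from the thesis.

```python
import math

def poisson_exceedance_probability(rate_per_year, t_years):
    """Probability of at least one exceedance in t_years, assuming
    Poissonian occurrence of exceedances: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# The classic hazard-map level: a rate of 1/475 per year (475-year return
# period) gives roughly a 10% exceedance probability in 50 years.
p = poisson_exceedance_probability(1.0 / 475.0, 50.0)
print(round(p, 3))
```

The thesis's point can be read against this formula: the conversion is valid whenever the exceedances behave Poissonianly, and Le Cam's theorem shows this behaviour can hold without declustering the catalog first.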
Abstract:
The object of this work has been the analysis of the natural processes controlling the geological evolution of the Montenegro and Northern Albania Continental Margin (MACM) during the Late Quaternary. These include the modern sediment dispersal system and oceanographic regime, and the building and shaping of the shelf margin at the 100 kyr scale and relative to the most recent transition between glacial and interglacial periods. The analysis of the new data shows that the MACM is a shelf-slope system formed by a suite of physiographic elements, including: an inner and an outer continental shelf, separated by two tectonically controlled morphological highs; a lobate, drowned mid-shelf paleodelta, formed during the last sea-level fall and lowstand; and an upper continental slope, affected by gravity-driven instability and by a system of extensional faults with surficial displacement, whose orientation is coherent with the regional tectonics. The stratigraphic study of the MACM shows a clear correspondence between the Late Pleistocene/Holocene mud wedge and the low-reflectivity sectors of the inner shelf. Conversely, most of the outer shelf and part of the continental slope expose deposits from the last sea-level lowstand, featuring a general sediment-starved condition or the presence of a thin cover of postglacial sediments. The MACM shows uplift in correspondence with the Kotor and Bar ridges, and subsidence in the outer shelf and upper slope sectors. In fact, seaward of these tectonic ridges, the sparker seismic profiles show the presence of four well-defined seismo-stratigraphic sequences, interpreted as forced-regression deposits formed during the last four main glacial phases. In this way, the MACM records the 100 kyr-scale sea-level fluctuations in its seismo-stratigraphic architecture over the last 350 kyr. Over this time range, through the identification of paleoshoreline deposits, we estimated an average subsidence rate of about 1.2 mm/yr.
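The subsidence estimate above follows from simple arithmetic: an average rate is the depth anomaly of a dated paleoshoreline (its observed depth minus the eustatic sea level at the time it formed) divided by its age. The depths in the example below are illustrative round numbers chosen to reproduce a 1.2 mm/yr rate, not measured values from the thesis.

```python
def subsidence_rate_mm_per_yr(observed_depth_m, eustatic_depth_m, age_yr):
    """Average subsidence rate from a dated paleoshoreline: the depth
    anomaly (m) converted to mm, divided by the age in years."""
    return (observed_depth_m - eustatic_depth_m) * 1000.0 / age_yr

# e.g. a 350 kyr-old shoreline now found at 540 m depth, formed when eustatic
# sea level stood 120 m below present: (540 - 120) m over 350 kyr = 1.2 mm/yr.
rate = subsidence_rate_mm_per_yr(540.0, 120.0, 350_000.0)
print(rate)
```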
Abstract:
One of the most widespread problems in logistics is transport cost. The management of goods flows, the supply of customers and the related planning of vehicle movements have a considerable impact on company operating costs, estimated on average at 45% of logistics costs. For this reason, more and more companies are setting up offices dedicated to delivery planning and to transport management in general. Although transport-related budget items reach significant figures, up to 4% of annual turnover, planning is often underestimated: solving planning and cost-monitoring problems is frequently left to manual procedures without software support. Hence the need for a software tool that supports the staff in charge of planning, by developing a system that covers trip planning, transport cost control and reporting, and real-time vehicle monitoring. The proposal of Gesp srl, Geographic Information Systems, an Italian company that has worked for years in the field of geospatial software applications, is named Nuovo Sistema Trasporti, or simply NST. This thesis takes shape in that context, with the aim of illustrating the conception, analysis, design and development phases of a general-purpose software for logistics support. The following will therefore be analysed: the issues faced during the definition and kick-off phases of the project; the routing problem, or Vehicle Routing Problem, and the Operations Research techniques applied to solve it; modern software management and development methodologies; and the architecture and technologies used to deploy the application.
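The Vehicle Routing Problem mentioned above is typically attacked with Operations Research heuristics. As a minimal illustration (not the algorithm actually used in NST), a nearest-neighbour construction heuristic builds a delivery route by always visiting the closest not-yet-served customer next.

```python
import math

def nearest_neighbour_route(depot, customers):
    """Greedy route construction: starting from the depot, repeatedly visit
    the nearest unvisited customer. Points are (x, y) coordinate tuples."""
    route, current = [], depot
    remaining = list(customers)
    while remaining:
        nxt = min(remaining, key=lambda c: math.dist(current, c))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

# Toy instance: three customers around a depot at the origin.
customers = [(2, 0), (1, 0), (0, 2)]
print(nearest_neighbour_route((0, 0), customers))
```

Greedy construction gives a feasible route quickly but not an optimal one; production planners typically refine such routes with local-search or metaheuristic methods, which is where the Operations Research techniques cited above come in.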