6 results for Synthetic aperture techniques

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

80.00%

Publisher:

Abstract:

Satellite SAR (Synthetic Aperture Radar) interferometry is a valid technique for generating digital elevation models (DEMs), providing metric accuracy even without good-quality ancillary data. Depending on the situation, the interferometric phase can be interpreted both as topography and as a displacement that may have occurred between the two acquisitions. Once these two components have been separated, it is possible to produce a DEM from the first or a displacement map from the second. InSAR DEM generation in the cryosphere is not a straightforward operation, because almost every interferometric pair also contains a displacement component which, even if small, can introduce large errors in the final product when interpreted as topography during the phase-to-height conversion step. For a glacier, assuming a linear velocity flux, it is therefore necessary to differentiate at least two pairs in order to isolate the topographic residual alone. In the case of an ice shelf, the displacement component in the interferometric phase is determined not only by the flux of the glacier but also by the different tide heights at the two acquisitions. In fact, even if the two scenes of the interferometric pair are acquired at the same time of day, only the main tidal terms disappear in the interferogram, while the smaller ones do not cancel completely and therefore produce displacement fringes. Given the availability of tide gauges (or, alternatively, of an accurate tidal model), a tidal correction can be computed and applied to the differential interferogram. It is important to be aware that the tidal correction is applicable only if the position of the grounding line is known, which is often a controversial matter. This thesis describes the methodology applied to generate the DEM of the Drygalski ice tongue in northern Victoria Land, Antarctica.
The displacement has been determined both interferometrically and from the coregistration offsets of the two scenes. Particular attention has been devoted to investigating the role of parameters such as timing annotations and orbit reliability. The results have been validated in a GIS environment by comparison with GPS displacement vectors (displacement map and InSAR DEM) and ICESat GLAS points (InSAR DEM).
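As a back-of-the-envelope check of the phase-to-height conversion step discussed above, the height of ambiguity (the topographic height corresponding to one 2π fringe) can be computed from the acquisition geometry. A minimal sketch in Python, where all numerical values (an ERS-like C-band repeat-pass geometry) are illustrative assumptions rather than parameters from the thesis:

```python
import math

# Illustrative repeat-pass InSAR geometry (assumed values, ERS-like)
wavelength = 0.056               # radar wavelength [m] (C-band, ~5.6 cm)
slant_range = 850e3              # sensor-to-target slant range [m]
incidence = math.radians(23.0)   # local incidence angle
b_perp = 100.0                   # perpendicular baseline [m]

# Topographic phase: phi = (4*pi/lambda) * (B_perp / (R*sin(theta))) * h,
# so one 2*pi fringe corresponds to the height of ambiguity below.
height_of_ambiguity = wavelength * slant_range * math.sin(incidence) / (2 * b_perp)
print(round(height_of_ambiguity, 1))  # roughly 93 m per fringe
```

A longer perpendicular baseline shrinks the height of ambiguity, making the pair more sensitive to topography; conversely, even a small unmodelled displacement then maps into a larger spurious height, which is the error source the abstract warns about.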

Relevance:

80.00%

Publisher:

Abstract:

Landslides are common features of the landscape of the north-central Apennine mountain range and cause frequent damage to human facilities and infrastructure. Most of these landslides move periodically at moderate velocities, and only after particular rainfall events do some accelerate abruptly. Synthetic aperture radar interferometry (InSAR) provides a particularly convenient method for studying deforming slopes. We use standard two-pass interferometry, taking advantage of the short revisit time of the Sentinel-1 satellites. In this paper we present the results of the InSAR analysis carried out on several study areas in the central and northern Italian Apennines. The aims of the work described in the articles contained in this paper are: i) assessing the potential of the standard two-pass interferometric technique for the recognition of active landslides; ii) exploring the potential of the displacement time series resulting from a two-pass, multiple-time-scale InSAR analysis; iii) evaluating the possibility of making comparisons with climate forcing for cognitive and risk-assessment purposes. Our analysis successfully identified more than 400 InSAR deformation signals (IDSs) in the different study areas corresponding to active slope movements. The comparison between IDSs and thematic maps allowed us to identify the main characteristics of the slopes most prone to landslides. The analysis of displacement time series derived from monthly interferometric stacks or single 6-day interferograms allowed landslide activity thresholds to be established. This information, combined with the displacement time series, allowed the relationship between ground deformation and climate forcing to be successfully investigated. The InSAR data also made it possible to validate geographical warning systems and to compare the activity state of landslides with triggering-probability thresholds.
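The displacement time series mentioned above derive from unwrapped interferometric phase, so the basic phase-to-displacement conversion is worth making explicit. A minimal sketch, where the C-band wavelength value (~5.55 cm, typical of Sentinel-1) is an assumption, not a figure from the paper:

```python
import math

WAVELENGTH_M = 0.0555  # assumed Sentinel-1 C-band wavelength [m]

def phase_to_los_mm(unwrapped_phase_rad):
    """Line-of-sight displacement in mm; one full 2*pi fringe maps to lambda/2."""
    return 1000.0 * WAVELENGTH_M * unwrapped_phase_rad / (4.0 * math.pi)

print(round(phase_to_los_mm(2 * math.pi), 2))  # one fringe = 27.75 mm along the LOS
```

The centimetre-per-fringe sensitivity is what makes short-revisit (6-day) pairs useful for detecting the moderate, rainfall-modulated velocities described above.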

Relevance:

30.00%

Publisher:

Abstract:

The thesis is focused on the development of a method for the synthesis of silicon nanocrystals with different sizes, narrow size distributions, good optical properties and stability in air. The resulting silicon nanocrystals have been covalently functionalized with different chromophores, with the aim of exploiting the new electronic and chemical properties that emerge from the interaction between the silicon nanocrystal surface and the ligands. The purpose is to use these chromophores as light-harvesting antennae, increasing the optical absorption of the silicon nanocrystals. The functionalized silicon nanocrystals have been characterized with different analytical techniques, leading to a good understanding of the optical properties of semiconductor quantum dots.

Relevance:

30.00%

Publisher:

Abstract:

A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterised. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Independent Component Analysis (ICA) is a popular technique adopted to approach this problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, I use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here I present the application of the vbICA technique to GPS position time series. First, I apply vbICA to synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and a volcanic source, and I study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, I apply vbICA to different tectonically active scenarios, such as the 2009 L'Aquila (central Italy) earthquake, the 2012 Emilia (northern Italy) seismic sequence, and the 2006 Guerrero (Mexico) Slow Slip Event (SSE).
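The limitation of PCA noted above (uncorrelatedness is a weaker condition than independence) can be illustrated numerically. A toy, stdlib-only sketch, not the thesis' vbICA algorithm: two known sources are linearly mixed, a principal-axis rotation exactly diagonalises the 2x2 covariance, and yet each principal component remains a mixture of both sources:

```python
import math

# Two known, non-Gaussian sources (illustrative choices)
n = 1000
s1 = [math.sin(6 * math.pi * i / n) for i in range(n)]   # sinusoid
s2 = [(i % 100) / 50.0 - 1.0 for i in range(n)]          # sawtooth

# Linear mixtures, standing in for a pair of observed position time series
x1 = [0.7 * a + 0.3 * b for a, b in zip(s1, s2)]
x2 = [0.4 * a + 0.6 * b for a, b in zip(s1, s2)]

def centered(v):
    m = sum(v) / len(v)
    return [u - m for u in v]

def cov(u, v):
    return sum(a * b for a, b in zip(u, v)) / len(u)

x1, x2 = centered(x1), centered(x2)
a, b, c = cov(x1, x1), cov(x1, x2), cov(x2, x2)

# PCA in 2D: rotate to the principal axes, which diagonalises the covariance
theta = 0.5 * math.atan2(2 * b, a - c)
p1 = [math.cos(theta) * u + math.sin(theta) * v for u, v in zip(x1, x2)]
p2 = [-math.sin(theta) * u + math.cos(theta) * v for u, v in zip(x1, x2)]

print(abs(cov(p1, p2)))  # ~0: the PCA components are uncorrelated...
# ...but each principal component is still a blend of s1 and s2, which is
# why ICA-type methods are needed to actually separate the sources.
```

This is exactly the gap vbICA addresses: it seeks statistically independent (not merely uncorrelated) sources, with a flexible Gaussian-mixture model for each source pdf.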

Relevance:

30.00%

Publisher:

Abstract:

RAD52 is a protein involved in various DNA repair mechanisms. In recent years, RAD52 has been proposed as a novel pharmacological target for cancer synthetic-lethality strategies. Hence, the purpose of this work is to investigate the RAD52 protein with biophysical and structural tools, to shed light on protein features and mechanistic details that are, up to now, poorly described, and to design novel strategies for its inhibition. My PhD work had two goals: the structural and functional characterization of RAD52, and the identification of novel RAD52 inhibitors. For the first part, RAD52 was characterized both for its DNA interaction and its oligomerization state, together with its propensity to form high-molecular-weight superstructures. Moreover, using EM and cryo-EM techniques, additional RAD52 structural hallmarks were obtained, valuable both for understanding the protein's mechanism of action and for drug-discovery purposes. The second part of my PhD project focused on the design and characterization of novel RAD52 inhibitors, to be potentially used in combination therapies with PARPi to achieve synthetic lethality in cancer cells while avoiding the emergence of resistance and side effects. With this aim, we selected and characterized promising RAD52 inhibitors through three different approaches: 19F NMR fragment-based screening, a virtual screening campaign, and computational aptamer design. The selected hits (fragments, molecules and aptamers) were investigated for their binding to RAD52 and for their mechanism of inhibition. The collected data highlighted hits worthy of being developed into more potent and selective RAD52 inhibitors. Finally, a side project carried out during my PhD is reported: the GSK-3β protein, an already validated pharmacological target, was investigated using biophysical and structural biology tools.
Here, an innovative and adaptable drug-discovery screening pipeline was developed, able to directly identify selective compounds with binding affinities no higher than that of a reference binder.

Relevance:

30.00%

Publisher:

Abstract:

Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques providing very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also being used to develop intelligent systems. Their success rests on complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the Explainable AI (XAI) field has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular: a pair of complementary stability indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and reliability of explanations. We subsequently put forward GLEAMS, a model-agnostic interpretable surrogate model which needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios, from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, being used more and more to train complex models in place of the original data. To be able to explain the outcomes of such models, we must guarantee that the synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end, we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
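The core of LIME, to which stability indices and an OptiLIME-style policy would apply, is a locally weighted linear surrogate fitted around a single instance. A minimal one-feature sketch under assumed settings (the black box, kernel width and sample count are illustrative, and this is not the LIME library itself):

```python
import math
import random

random.seed(0)

def black_box(x):
    return x ** 2            # stand-in for an opaque scoring model

x0 = 1.0                     # the instance to be explained
samples = [x0 + random.gauss(0.0, 0.3) for _ in range(2000)]
preds = [black_box(x) for x in samples]

# Exponential proximity kernel: nearby perturbations count more
width = 0.25
weights = [math.exp(-((x - x0) ** 2) / width ** 2) for x in samples]

# Closed-form weighted least squares for a single feature
wsum = sum(weights)
mx = sum(w * x for w, x in zip(weights, samples)) / wsum
my = sum(w * y for w, y in zip(weights, preds)) / wsum
slope = (sum(w * (x - mx) * (y - my) for w, x, y in zip(weights, samples, preds))
         / sum(w * (x - mx) ** 2 for w, x in zip(weights, samples)))

print(round(slope, 2))       # close to the true local gradient d/dx x^2 at x0, i.e. ~2
```

Because the perturbation step is random, repeated runs yield slightly different slopes; quantifying that run-to-run variability is the intuition behind stability indices for LIME, and tuning the kernel width is the lever an OptiLIME-style policy acts on.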