8 results for delineation
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The aim of this work was to show that refined analyses of background, low-magnitude seismicity make it possible to delineate the main active faults and to accurately estimate the directions of the regional tectonic stress that characterize the Southern Apennines (Italy), a structurally complex area with high seismic potential. Thanks to the presence in the area of an integrated, dense network with wide dynamic range, it was possible to analyze a high-quality microearthquake data set of 1312 events that occurred from August 2005 to April 2011, integrating the data recorded at 42 seismic stations of various networks. The refined seismicity locations and focal mechanisms clearly delineate a system of NW-SE-striking normal faults along the Apenninic chain and an approximately E-W-oriented strike-slip fault transversely cutting the belt. The seismicity along the chain does not occur on a single fault but within a volume, delimited by the faults activated during the 1980 M 6.9 Irpinia earthquake, on sub-parallel, predominantly normal faults. The results show that the recent low-magnitude earthquakes belong to the background seismicity and are likely generated along the major fault segments activated during the most recent earthquakes, suggesting that these segments are still active today, thirty years after the mainshock. In this sense, this study offers a new perspective on the use of high-quality records of low-magnitude background seismicity for the identification and characterization of active fault systems. The stress tensor inversion provides two equivalent models that explain microearthquake generation along both the NW-SE-striking normal faults and the E-W-oriented fault with dominant dextral strike-slip motion, but with different geological interpretations. We suggest that the NW-SE-striking Africa-Eurasia convergence acts in the background of all these structures, playing a primary and unifying role in the seismotectonics of the whole region.
Abstract:
In this thesis, two major topics relevant to medical ultrasound images are addressed: deconvolution and segmentation. In the first case, a deconvolution algorithm is described that restores statistically consistent maximum a posteriori estimates of the tissue reflectivity. These estimates are proven to provide a reliable source of information for achieving an accurate characterization of biological tissues through the ultrasound echo. The second topic involves the definition of a semi-automatic algorithm for myocardium segmentation in 2D echocardiographic images. The results show that the proposed method can reduce inter- and intra-observer variability in myocardial contour delineation and is feasible and accurate even on clinical data.
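As a rough illustration of the deconvolution idea (not the thesis algorithm): with a Gaussian prior on the reflectivity, the maximum a posteriori estimate reduces to Tikhonov-regularized least squares, sketched here on a toy 1D echo with an invented point-spread function and regularization weight:

```python
# Hedged sketch of MAP deconvolution with a Gaussian (Tikhonov) prior.
# The reflectivity, PSF, and lambda are invented for illustration.

def convolve(x, h):
    # full linear convolution of signal x with kernel h
    out = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            out[i + j] += xi * hj
    return out

def adjoint(r, h, n):
    # adjoint of the convolution operator: H^T r
    out = [0.0] * n
    for i in range(n):
        for j, hj in enumerate(h):
            out[i] += hj * r[i + j]
    return out

def map_deconvolve(y, h, lam=0.001, steps=2000, lr=0.05):
    # minimize ||H x - y||^2 + lam * ||x||^2 by gradient descent
    n = len(y) - len(h) + 1
    x = [0.0] * n
    for _ in range(steps):
        r = [a - b for a, b in zip(convolve(x, h), y)]
        g = adjoint(r, h, n)
        x = [xi - lr * (2 * gi + 2 * lam * xi) for xi, gi in zip(x, g)]
    return x

# toy reflectivity with two reflectors, blurred by a short PSF
truth = [0.0, 1.0, 0.0, 0.0, 0.6, 0.0]
psf = [0.2, 0.6, 0.2]
echo = convolve(truth, psf)
estimate = map_deconvolve(echo, psf)
```

With a small regularization weight and noiseless data, the estimate recovers the two reflectors almost exactly; larger weights trade fidelity for noise robustness.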
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest of the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of surface-ECG-derived rhythms at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the “continuous” time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to working-memory-engaging tasks. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC, a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing costs. In designing the methods of this thesis, an online signal processing approach was maintained, with the goal of contributing to real-world applicability. An algorithm for automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were also designed and validated.
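Beat-related HRV features of the kind Objective I builds on can be illustrated with two classic time-domain indices; the RR series below is invented, and these simple indices stand in for, rather than reproduce, the joint time-frequency and non-linear analysis of the thesis:

```python
# Illustrative sketch (not the thesis algorithms): two standard
# time-domain HRV indices computed from RR intervals in milliseconds.
from statistics import mean, pstdev

def sdnn(rr):
    # standard deviation of all RR (NN) intervals: overall variability
    return pstdev(rr)

def rmssd(rr):
    # root mean square of successive RR differences:
    # sensitive to short-term, beat-to-beat variability
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return mean(d * d for d in diffs) ** 0.5

# invented RR series from a resting recording
rr_ms = [812, 790, 845, 801, 830, 795, 820]
print(round(sdnn(rr_ms), 1), round(rmssd(rr_ms), 1))
```

Workload studies typically track how such indices (and their frequency-domain counterparts) shift as task difficulty increases.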
Abstract:
The Reverse Vaccinology (RV) approach allows genomic information to be used for the delineation of new protein-based vaccines, starting from an in silico analysis. The first powerful example of the application of the RV approach is the development of a protein-based vaccine against serogroup B Meningococcus. A similar approach was also used to identify new Staphylococcus aureus vaccine candidates, including the ferric hydroxamate-binding lipoprotein FhuD2. S. aureus is a widespread human pathogen that employs several strategies for iron uptake, including: (i) siderophore-mediated iron acquisition using the endogenous siderophores staphyloferrin A and B, (ii) siderophore-mediated iron acquisition using xeno-siderophores (the pathway exploited by FhuD2), and (iii) heme-mediated iron acquisition. In this work, the high-resolution crystal structure of FhuD2 in the iron(III)-siderophore-bound form was determined. FhuD2 belongs to class III of the Periplasmic Binding Protein (PBP) family and is principally formed by two globular domains, at the N- and C-termini of the protein, that make up a cleft where ferrichrome-iron(III) is bound. The N- and C-terminal domains, connected by a single long α-helix, present Rossmann-like folds, with a β-stranded core and an α-helical periphery, and do not undergo extensive structural rearrangement upon ligand binding, as is typical of class III PBP members. The structure shows that ferrichrome-bound iron does not come directly into contact with the protein; rather, the metal ion is fully coordinated by six oxygen donors of the hydroxamate groups of the three ornithine residues which, together with three glycine residues, make up the peptide backbone of ferrichrome. Furthermore, it was found that iron-free ferrichrome is able to extract iron from transferrin. This study shows for the first time the structure of FhuD2, demonstrates that the protein binds siderophores, and indicates that it plays an important role in the S. aureus colonization and infection phases.
Abstract:
Over the last decades, the impact of natural disasters on the global environment has become more and more severe. The number of disasters has dramatically increased, as has their cost to the global economy and the number of people affected. Among natural disasters, flood catastrophes are considered the most costly, devastating, widespread, and frequent, because of the tremendous fatalities, injuries, property damage, and economic and social disruption they cause to humankind. In the last thirty years, the world has suffered from severe flooding, and the huge impact of floods has caused hundreds of thousands of deaths, the destruction of infrastructure, the disruption of economic activity, and property losses worth billions of dollars. In this context, satellite remote sensing, along with Geographic Information Systems (GIS), has become a key tool in flood risk management analysis. The use of remote sensing to support various aspects of flood risk management was investigated in the present thesis. In particular, the research focused on the use of satellite images for flood mapping and monitoring, damage assessment, and risk assessment. The contribution of satellite remote sensing to the delineation of flood-prone zones, the identification of damaged areas, and the development of hazard maps was explored with reference to selected case studies.
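As a toy illustration of satellite-based flood delineation (not the procedure used in the thesis): over open water, radar backscatter drops sharply, so a simple threshold on a SAR backscatter grid yields a first-cut flood mask. The grid values and the -15 dB threshold below are invented; operational mapping uses calibrated imagery and more robust classifiers:

```python
# Invented example of threshold-based flood delineation on a small
# SAR backscatter grid (values in dB).

def delineate_flood(backscatter_db, threshold_db=-15.0):
    # open water returns little energy to the sensor, so pixels with
    # low backscatter are flagged as flooded (True)
    return [[v < threshold_db for v in row] for row in backscatter_db]

scene = [
    [-8.1, -9.0, -17.5],
    [-7.4, -16.9, -18.2],
    [-6.8, -8.8, -16.4],
]
mask = delineate_flood(scene)
flooded_pixels = sum(v for row in mask for v in row)
```

Overlaying such a mask on GIS layers (land use, infrastructure) is what turns a water map into damage and risk information.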
Abstract:
Recent scholarly works on the relationship between ‘fashion’ and ‘sustainability’ have identified a need for a systemic transition towards fashion media ‘for sustainability’. Nevertheless, academic research on the topic is still limited and largely circumscribed to the analysis of marketing practices; only recently have more systemic and critical analyses of the symbolic production of sustainability through fashion media been undertaken. Responding to this need for an in-depth investigation of ‘sustainability’-related media production, my research focuses on the ‘fashion sustainability’-related discursive formations in the context of one of the most influential fashion magazines today, Vogue Italia. In order to investigate the ways in which the ‘sustainability’ discourse was formed and has evolved, the study considered the entire Vogue Italia archive from 1965 to 2021. The data collection was carried out in two phases, and the identified relevant discursive units were then critically analysed in depth to allow for a grounded assessment of the media giant’s position. The Discourse-Historical Approach provided a methodological base for the analysis, which took into consideration the various levels of context: the immediate textual and intertextual, but also the broader socio-cultural context of the predominant, over-production-oriented and capital-led fashion system. The findings led to a delineation of the evolution of the ‘fashion sustainability’ discourse, unveiling how, despite Vogue Italia’s self-characterization as attentive to ‘sustainability’-related topics, the magazine systematically employs discursive strategies that significantly mitigate the meaning of the ‘sustainable commitment’ and thus of ‘fashion sustainability’.
Abstract:
The integration of distributed and ubiquitous intelligence has emerged over the last years as the mainspring of transformative advancements in mobile radio networks. As we approach the era of “mobile for intelligence”, next-generation wireless networks are poised to undergo significant and profound changes. Notably, the overarching challenge that lies ahead is the development and implementation of integrated communication and learning mechanisms that will enable the realization of autonomous mobile radio networks. The ultimate pursuit of eliminating the human-in-the-loop constitutes an ambitious challenge, necessitating a meticulous delineation of the fundamental characteristics that artificial intelligence (AI) should possess to effectively achieve this objective. This challenge represents a paradigm shift in the design, deployment, and operation of wireless networks, where conventional, static configurations give way to dynamic, adaptive, and AI-native systems capable of self-optimization, self-sustainment, and learning. This thesis aims to provide a comprehensive exploration of the fundamental principles and practical approaches required to create autonomous mobile radio networks that seamlessly integrate communication and learning components. The first chapter introduces the notion of Predictive Quality of Service (PQoS) and adaptive optimization, and expands upon the challenge of achieving adaptable, reliable, and robust network performance in dynamic and ever-changing environments. The subsequent chapter delves into the revolutionary role of generative AI in shaping next-generation autonomous networks, emphasizing trustworthy uncertainty-aware generation processes based on approximate Bayesian methods and showing how generative AI can improve generalization while reducing data communication costs. Finally, the thesis addresses distributed learning over wireless networks. Distributed learning and its variants, including multi-agent reinforcement learning and federated learning, have the potential to meet the scalability demands of modern data-driven applications, enabling efficient and collaborative model training across dynamic scenarios while ensuring data privacy and reducing communication overhead.
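The federated learning idea mentioned above can be sketched in a few lines of federated averaging: clients train on their private data, and only model weights (never the data) travel to the server. The one-parameter model, client data, and learning rate below are invented for illustration and are not the systems studied in the thesis:

```python
# Minimal federated averaging (FedAvg) sketch for a one-parameter
# linear model y = w * x; all quantities are invented for illustration.
import random

def local_update(w, data, lr=0.05, epochs=20):
    # each client refines the global weight on its private data only
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x   # d/dw of the squared error
            w -= lr * grad
    return w

def fedavg(clients, rounds=10):
    w = 0.0                              # global model weight
    for _ in range(rounds):
        local = [local_update(w, d) for d in clients]
        w = sum(local) / len(local)      # server averages client weights
    return w

random.seed(0)
true_w = 3.0
clients = [[(x, true_w * x) for x in (random.uniform(-1, 1) for _ in range(8))]
           for _ in range(4)]
w_hat = fedavg(clients)
```

Only the scalar weights cross the network here, which is the privacy- and bandwidth-saving property the paragraph refers to.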
Abstract:
The present dissertation shows how recent statistical analysis tools and open datasets can be exploited to improve modelling accuracy in two distinct yet interconnected domains of flood hazard (FH) assessment. In the first Part, unsupervised artificial neural networks are employed as regional models for sub-daily rainfall extremes. The models aim to learn a robust relationship for locally estimating the parameters of Gumbel distributions of extreme rainfall depths for any sub-daily duration (1-24 h). The predictions depend on twenty morphoclimatic descriptors. A large study area in north-central Italy is adopted, where 2238 annual maximum series are available; validation is performed over an independent set of 100 gauges. Our results show that multivariate ANNs may remarkably improve the estimation of percentiles relative to the benchmark approach from the literature, in which the Gumbel parameters depend on mean annual precipitation. Finally, we show that the very nature of the proposed ANN models makes them suitable for interpolating the predicted sub-daily rainfall quantiles across space and time-aggregation intervals. In the second Part, decision trees are used to combine a selected blend of input geomorphic descriptors for predicting FH. Relative to existing DEM-based approaches, this method is innovative in that it relies on the combination of three characteristics: (1) simple multivariate models, (2) a set of exclusively DEM-based descriptors as input, and (3) an existing FH map as reference information. The methods are applied first to northern Italy, represented with the MERIT DEM (∼90 m resolution), and then to the whole of Italy, represented with the EU-DEM (25 m resolution). The results show that multivariate approaches may (a) significantly enhance the delineation of flood-prone areas relative to a selected univariate one, (b) provide accurate predictions of expected inundation depths, (c) produce encouraging results in extrapolation, (d) complete the information of imperfect reference maps, and (e) conveniently convert binary maps into continuous representations of FH.
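The Gumbel-fitting step underlying the first Part can be illustrated with a hedged sketch: method-of-moments estimates of the location and scale parameters from an (invented) annual maximum series, followed by a quantile read-off. This is a textbook baseline, not the ANN models of the thesis:

```python
# Method-of-moments fit of a Gumbel distribution to an annual maximum
# series, plus the inverse CDF for return-level estimation.
# The sample series is invented for illustration.
from math import pi, sqrt, log
from statistics import mean, stdev

EULER_GAMMA = 0.5772156649

def fit_gumbel(maxima):
    # moment estimators: scale beta from the sample standard deviation,
    # location mu from the sample mean
    beta = stdev(maxima) * sqrt(6) / pi
    mu = mean(maxima) - EULER_GAMMA * beta
    return mu, beta

def gumbel_quantile(mu, beta, p):
    # inverse CDF: the depth x with non-exceedance probability p
    return mu - beta * log(-log(p))

# toy annual maximum 1-hour rainfall depths (mm)
ams = [22.0, 31.5, 27.3, 40.1, 25.6, 33.8, 29.9, 45.2, 24.4, 36.0]
mu, beta = fit_gumbel(ams)
x100 = gumbel_quantile(mu, beta, 0.99)   # 100-year return level
```

The regional models described above replace this at-site fit with parameters predicted from morphoclimatic descriptors, so that quantiles can be estimated at ungauged locations as well.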