19 results for case-method
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Porous materials are widely used in many fields of industrial application to meet noise-reduction requirements that nowadays derive from increasingly strict regulations. The modelling of porous materials for vibro-acoustic applications is still a problematic issue. Numerical simulations are often troublesome in the case of real, complex geometries, especially in terms of computational time and convergence. At the same time, analytical models, even if partly limited by restrictive simplifying hypotheses, represent a powerful instrument to quickly capture the physics of the problem and its general trends. In this context, a recently developed numerical method, the Cell Method, is described, implemented for Biot's theory of poroelasticity and applied to representative cases. The peculiarity of the Cell Method is that it allows a direct algebraic and geometrical discretization of the field equations, without any reduction to a weak integral form. The second part of the thesis then presents the case of the interaction between two poroelastic materials in contact, in the context of double porosity. The idea of using periodically repeated inclusions of a second porous material inside a layer made of the original material is described. In particular, the problem is addressed by considering the efficiency of the analytical method. An analytical procedure for the simulation of such heterogeneous layers is described and validated under both absorption and transmission conditions, and a comparison with the available numerical methods is performed.
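As a rough illustration of the kind of analytical layer calculation mentioned above, the sketch below computes the normal-incidence absorption coefficient of a rigid-backed equivalent-fluid porous layer with the empirical Delany-Bazley model; the flow resistivity, thickness and frequency range are placeholder values, and the thesis's heterogeneous-layer procedure is considerably richer.

```python
import numpy as np

# Normal-incidence absorption of a rigid-backed equivalent-fluid porous layer.
# Delany-Bazley empirical model; sigma (flow resistivity) and d (thickness)
# are illustrative values, not taken from the thesis.
rho0, c0 = 1.213, 343.0          # air density [kg/m^3] and speed of sound [m/s]
Z0 = rho0 * c0                   # characteristic impedance of air
sigma, d = 20000.0, 0.05         # flow resistivity [N*s/m^4], layer thickness [m]

f = np.linspace(100.0, 4000.0, 200)
X = rho0 * f / sigma             # dimensionless Delany-Bazley parameter

# Characteristic impedance and wavenumber of the equivalent fluid
Zc = Z0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
kc = (2 * np.pi * f / c0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)

# Surface impedance of the rigid-backed layer and absorption coefficient
Zs = -1j * Zc / np.tan(kc * d)
alpha = 1 - np.abs((Zs - Z0) / (Zs + Z0))**2

print(f"alpha at {f[50]:.0f} Hz: {alpha[50]:.2f}")
```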
Abstract:
This thesis deals with a novel control approach based on the extension of the well-known Internal Model Principle to the case of periodic switched linear exosystems. This extension, inspired by power electronics applications, aims to provide an effective design method to robustly achieve the asymptotic tracking of periodic references with an infinite number of harmonics. In the first part of the thesis the basic components of the novel control scheme are described and preliminary results on stabilization are provided. In the second part, advanced control methods for two applications from the world of high-energy physics are presented.
Abstract:
The arterial wall contains MSCs with mesengenic and angiogenic abilities. These multipotent precursors have been isolated from adult human vessel segments of various sizes, belying the notion that the vessel wall is a relatively quiescent tissue. Recently, our group identified a vasculogenic niche in normal human arteries and subsequently isolated and characterized resident MSCs (VW-MSCs) with angiogenic ability and multilineage potential. To prove that VW-MSCs are involved in normal and pathological vascular remodeling, we used a long-term organ culture system; this method was of critical importance to follow spontaneous 3-D vascular remodeling without any influence of blood cells. Next, we sought to identify and localize the VW-MSCs in situ and to understand their role in vascular remodeling in failed arterial homografts. Subsequently, we isolated this cell population and tested its multilineage differentiation potential in vitro through immunohistochemical, immunofluorescence, RT-PCR and ultrastructural analyses. From 25-30 cm² of each vascular wall homograft sample, we isolated a cell population with MSC properties; these cells expressed MSC lineage markers (CD90, CD44, CD105, CD29, CD73), stemness markers (Notch-1, Oct-4, Sca-1, Stro-1) and pericyte markers (NG2), while being negative for hematopoietic and endothelial markers (CD34, CD133, CD45, KDR, CD146, CD31 and vWF). MSCs derived from failed homografts (H-MSCs) exhibited adipogenic, osteogenic and chondrogenic potential but a scarce propensity for angiogenic and leiomyogenic differentiation. The present study demonstrates that failed homografts contain MSCs with the morphological, phenotypic and functional properties of MSCs; H-MSCs are long-lived in culture, highly proliferative and endowed with a prompt ability to differentiate into adipocytes, osteocytes and chondrocytes; compared with VW-MSCs from normal arteries, H-MSCs show a failure in angiogenic and leiomyogenic differentiation. A switch in MSC plasticity could be the basis of pathological remodeling and contribute to aneurysmal failure of arterial homografts. The study of VW-MSCs in a pathological setting indicates that additional mechanisms are involved in vascular diseases; knowledge of these mechanisms will be useful for opening new therapeutic options in cardiovascular diseases.
Abstract:
The main goals of this Ph.D. study are to investigate the regional and global geophysical components related to present polar ice melting and to provide independent cross-validation checks of GIA models using both geophysical data from satellite missions and geological observations from far-field sites, in order to determine lower and upper bounds on the uncertainty of the GIA effect. The subject of this thesis is sea-level change on timescales from decades to millennia. Within the ice2sea collaboration, we developed a Fortran numerical code to analyze the local short-term sea-level change and vertical deformation resulting from the loss of ice mass. This method is used to investigate the polar regions: Greenland and Antarctica. We have used a mass balance based on ICESat data for the Greenland ice sheet and a plausible mass balance for the Antarctic ice sheet. We have determined the regional and global fingerprints of sea-level variations, vertical deformations of the solid surface of the Earth and variations in the shape of the geoid for each ice source mentioned above. The coastal areas are affected by the long-wavelength component of the GIA process. Hence, understanding the response of the Earth to loading is crucial in various contexts. Based on the hypothesis that Earth mantle materials obey a linear rheology, and that the physical parameters of this rheology are characterized only by their depth dependence, we investigate the glacial isostatic adjustment effect at far-field sites of the Mediterranean area using an improved version of the SELEN program. We present new and revised observations for archaeological fish tanks located along the Tyrrhenian and Adriatic coasts of Italy and new relative sea-level (RSL) data for SE Tunisia. The spatial and temporal variations of the Holocene sea levels studied in central Italy and Tunisia provide important constraints on the melting history of the major ice sheets.
Abstract:
This thesis addresses the formulation of a referee assignment problem for the Italian Volleyball Serie A Championships. The problem has particular constraints, such as the requirement that a referee be assigned to different teams within a given period of time, while minimal/maximal workload levels for each referee are obtained by considering cost and profit terms in the objective function. The problem has been solved with an exact method, using an integer linear programming formulation and a clique-based decomposition to improve the computing time. Extensive computational experiments on real-world instances have been performed to assess the effectiveness of the proposed approach.
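A minimal sketch of the kind of integer linear program described above, written with the PuLP modelling library and purely hypothetical referees, matches, costs and workload bounds; the actual formulation and the clique-based decomposition used in the thesis are not reproduced.

```python
import pulp

# Hypothetical toy data: referees, matches, assignment cost, workload bounds.
referees = ["r1", "r2", "r3"]
matches = ["m1", "m2", "m3", "m4"]
cost = {(r, m): 1.0 for r in referees for m in matches}   # placeholder costs
min_load, max_load = 1, 2                                 # per-referee workload bounds

prob = pulp.LpProblem("referee_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (referees, matches), cat="Binary")

# Objective: total assignment cost (profit terms would enter with a negative sign).
prob += pulp.lpSum(cost[r, m] * x[r][m] for r in referees for m in matches)

# Each match gets exactly one referee.
for m in matches:
    prob += pulp.lpSum(x[r][m] for r in referees) == 1

# Workload bounds for each referee.
for r in referees:
    prob += pulp.lpSum(x[r][m] for m in matches) >= min_load
    prob += pulp.lpSum(x[r][m] for m in matches) <= max_load

prob.solve(pulp.PULP_CBC_CMD(msg=False))
assignment = {m: r for r in referees for m in matches if x[r][m].value() == 1}
print(assignment)
```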
Abstract:
In order to handle natural disasters, emergency areas are often identified across the territory, close to populated centres. In these areas, rescue services are located, which respond with resources and materials for the relief of the population. A method for the automatic positioning of these centres in the case of a flood or an earthquake is presented. The positioning procedure consists of two distinct parts, developed by the research group of Prof Michael G. H. Bell of Imperial College, London, then refined and applied to real cases at the University of Bologna under the coordination of Prof Ezio Todini. There are certain requirements that need to be observed, such as the maximum number of rescue points and the number of people involved. Initially, the candidate points are chosen from those proposed by the local civil protection services. We then calculate all possible routes from each candidate rescue point to all other points, generally using the concept of the "hyperpath", namely a set of paths each of which may be optimal. The attributes of the road network are of fundamental importance, both for the calculation of the ideal distance and for possible delays due to the event, measured in travel-time units. In a second phase, the distances are used to decide the optimum rescue point positions using heuristics. This second part functions by "elimination". In the beginning, all points are considered rescue centres. At each iteration we delete one point and calculate the impact its removal creates. In each case, we delete the point that creates the least impact, until we reach the number of rescue centres we wish to keep.
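The sketch below illustrates the "elimination" idea in simplified form, assuming a precomputed travel-time matrix between candidate rescue points and demand points; hyperpath computation and the actual impact measure used in the procedure are outside its scope.

```python
def eliminate_centres(dist, candidates, demand_points, k):
    """Greedy elimination: start with all candidates open, repeatedly close the
    candidate whose removal increases the total assignment cost the least, until
    only k centres remain. `dist[c][p]` is a precomputed travel time."""
    def total_cost(open_set):
        # Each demand point is served by its closest open centre.
        return sum(min(dist[c][p] for c in open_set) for p in demand_points)

    open_set = set(candidates)
    while len(open_set) > k:
        best_candidate, best_cost = None, float("inf")
        for c in open_set:
            cost = total_cost(open_set - {c})
            if cost < best_cost:
                best_candidate, best_cost = c, cost
        open_set.remove(best_candidate)   # close the least impactful centre
    return open_set


# Toy example with hypothetical travel times (minutes).
dist = {
    "A": {"p1": 5, "p2": 9, "p3": 12},
    "B": {"p1": 7, "p2": 4, "p3": 10},
    "C": {"p1": 11, "p2": 8, "p3": 3},
}
print(eliminate_centres(dist, ["A", "B", "C"], ["p1", "p2", "p3"], k=2))
```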
Abstract:
Throughout the world, pressures on water resources are increasing, mainly as a result of human activity. Because of their accessibility, groundwater and surface water are the most used reservoirs. The evaluation of water quality requires the identification of the interconnections among water reservoirs, natural landscape features, human activities and aquatic health. This study focuses on the estimation of water pollution linked to two different environmental issues: saltwater intrusion and acid mine drainage, both related to the exploitation of natural resources. The effects of saltwater intrusion occurring in the shallow aquifer north of Ravenna (Italy) were analysed through the study of the ion exchange occurring in the area and its variation throughout the year, applying a depth-specific sampling method. Ion exchange, calcite and dolomite precipitation, gypsum dissolution and sulphate reduction were identified as the main processes controlling the groundwater composition in the study area. High concentrations of arsenic detected only at specific depths indicate its connection with the organic matter. The effects of acid mine drainage related to tin extraction in the Bolivian Altiplano were studied in both the water and the sediment matrix. Water contamination proves strictly dependent on seasonal variation and on pH and redox conditions. During the dry season, the strong evaporation and scarce water flow lead to low pH values, high concentrations of heavy metals in surface waters and precipitation of secondary minerals along the river, which could be re-released under oxidizing conditions, as demonstrated by sequential extraction analysis. The increase in water flow during the wet season leads to an increase in pH values and a decrease in heavy-metal concentrations, due to dilution and, as in the case of iron, to precipitation.
Abstract:
This research has focused on the behavior and collapse of masonry arch bridges. Recent decades have seen increasing interest in this structural type, which is still present and in use despite the passage of time and the changes in means of transport. Several strategies have been developed over time to simulate the response of this type of structure, although even today there is no generally accepted standard one for the assessment of masonry arch bridges. The aim of this thesis is to compare the principal analytical and numerical methods existing in the literature on case studies, trying to highlight their strengths and weaknesses. Three methods are mainly examined: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of Plastic Analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Each method is applied to the case studies through computer-based implementations that allow a user-friendly application of the principles explained. A particular closed-form approach based on an elasto-plastic material model and developed by Belgian researchers is also studied. To compare the three methods, two different case studies have been analyzed: i) a generic single-span masonry arch bridge; ii) a real masonry arch bridge, the Clemente Bridge, built over the Savio River in Cesena. In the analyses performed, all the models are two-dimensional, in order to have results comparable across the different methods examined. The methods have been compared with each other in terms of collapse load and hinge positions.
Abstract:
Bioinformatics, in the last few decades, has played a fundamental role in making sense of the huge amount of data produced. Once the complete sequence of a genome is obtained, the major problem becomes knowing as much as possible about its coding regions. Protein sequence annotation is challenging and, due to the size of the problem, only computational approaches can provide a feasible solution. As has recently been pointed out by the Critical Assessment of Function Annotations (CAFA), the most accurate methods are those based on the transfer-by-homology approach, and the most incisive contribution is given by cross-genome comparisons. The present thesis describes a non-hierarchical sequence clustering method for automatic large-scale protein annotation, called "The Bologna Annotation Resource Plus" (BAR+). The method is based on an all-against-all alignment of more than 13 million protein sequences, characterized by a very stringent metric. BAR+ can safely transfer functional features (Gene Ontology and Pfam terms) inside clusters by means of a statistical validation, even in the case of multi-domain proteins. Within BAR+ clusters it is also possible to transfer the three-dimensional structure (when a template is available), by means of cluster-specific HMM profiles that can be used to calculate reliable template-to-target alignments even for distantly related proteins (sequence identity < 30%). Other BAR+-based applications developed during my doctorate include the prediction of magnesium-binding sites in human proteins, the classification of the ABC transporter superfamily and the functional prediction (GO terms) of the CAFA targets. Remarkably, in the CAFA assessment, BAR+ placed among the ten most accurate methods. At present, BAR+ is freely available as a web server for functional and structural protein sequence annotation at http://bar.biocomp.unibo.it/bar2.0.
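As an illustration of the clustering idea, and not of the actual BAR+ pipeline, the sketch below groups sequences into connected components defined by pairwise alignment hits that pass an identity/coverage filter; the thresholds and the toy hit list are assumptions.

```python
def cluster_by_stringent_hits(hits, min_identity, min_coverage):
    """Union-find over alignment pairs that pass the stringent filter.
    `hits` is an iterable of (seq_a, seq_b, identity, coverage) tuples."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b, identity, coverage in hits:
        if identity >= min_identity and coverage >= min_coverage:
            union(a, b)

    clusters = {}
    for seq in parent:
        clusters.setdefault(find(seq), set()).add(seq)
    return list(clusters.values())


# Illustrative hits (identity in %, coverage as a fraction of alignment length).
hits = [("P1", "P2", 52.0, 0.95), ("P2", "P3", 45.0, 0.93), ("P4", "P5", 28.0, 0.90)]
print(cluster_by_stringent_hits(hits, min_identity=40.0, min_coverage=0.90))
```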
Abstract:
Background: Clinical trials have demonstrated that selected secondary prevention medications for patients after acute myocardial infarction (AMI) reduce mortality. Yet, these medications are generally underprescribed in daily practice, and older people are often absent from drug trials. Objectives: To examine the relationship between adherence to evidence-based (EB) drugs and post-AMI mortality, focusing on the effects of single therapy and polytherapy in very old patients (≥80 years) compared with elderly and adult patients (<80 years). Methods: Patients hospitalised for AMI between 01/01/2008 and 30/06/2011 and resident in the Local Health Authority of Bologna were followed up until 31/12/2011. Medication adherence was calculated as the proportion of days covered for filled prescriptions of angiotensin-converting enzyme inhibitors (ACEIs)/angiotensin receptor blockers (ARBs), β-blockers, antiplatelet drugs, and statins. We adopted a risk set sampling method, and the adjusted relationship between medication adherence (PDC≥75%) and mortality was investigated using conditional multiple logistic regression. Results: The study population comprised 4861 patients. During a median follow-up of 2.8 years, 1116 deaths (23.0%) were observed. Adherence to all 4 EB drugs was 7.1%, while nonadherence to any of the drugs was 19.7%. For both patients aged ≥80 years and those aged <80 years, rate ratios of death decreased linearly as the number of EB drugs taken increased. There was a significant inverse relationship between adherence to each of the 4 medications and mortality, although its magnitude was higher for ACEIs/ARBs (adj. rate ratio=0.60, 95%CI=0.52–0.69) and statins (0.60, 0.50–0.72), and lower for β-blockers (0.75, 0.61–0.92) and antiplatelet drugs (0.73, 0.63–0.84). Conclusions: The beneficial effect of EB polytherapy on long-term mortality following AMI is evident also in nontrial older populations. Given that adherence to combination therapies is largely suboptimal, the implementation of strategies and initiatives to increase the use of post-AMI secondary preventive medications in old patients is crucial.
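A minimal sketch of a proportion-of-days-covered (PDC) computation of the kind described above, assuming a simplified prescription record; real claims-based calculations also handle overlapping fills, stockpiling and hospitalisation periods, which are omitted here.

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, start, end):
    """PDC = (distinct days covered by filled prescriptions) / (days in follow-up).
    `fills` is a list of (dispensation_date, days_supply) tuples."""
    follow_up_days = (end - start).days + 1
    covered = set()
    for dispensed, days_supply in fills:
        for offset in range(days_supply):
            day = dispensed + timedelta(days=offset)
            if start <= day <= end:
                covered.add(day)
    return len(covered) / follow_up_days


# Illustrative example: two 30-day fills over a 90-day follow-up window.
fills = [(date(2011, 1, 1), 30), (date(2011, 2, 15), 30)]
pdc = proportion_of_days_covered(fills, date(2011, 1, 1), date(2011, 3, 31))
print(f"PDC = {pdc:.2f}, adherent (>= 0.75): {pdc >= 0.75}")
```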
Abstract:
Schroeder's backward integration method is the most widely used method to extract the decay curve of an acoustic impulse response and to calculate the reverberation time from this curve. In the literature, the limits and the possible improvements of this method are widely discussed. In this work a new method is proposed for the evaluation of the energy decay curve. The new method has been implemented in a Matlab toolbox, and its performance has been tested against the most accredited method in the literature. The values of EDT and reverberation time extracted from the energy decay curves calculated with both methods have been compared, both in terms of the values themselves and in terms of their statistical representativeness. The main case study consists of nine Italian historical theatres in which acoustical measurements were performed. The comparison of the two extraction methods has also been applied to a critical case, i.e. the structural impulse responses of some building elements. The comparison underlines that both methods return comparable values of T30. As the evaluation range decreases, increasing differences emerge; in particular, the main differences appear in the first part of the decay, where the EDT is evaluated. This is a consequence of the fact that the new method returns a "locally" defined energy decay curve, whereas Schroeder's method accumulates energy from the tail to the beginning of the impulse response. Another characteristic of the new energy-decay extraction method is its independence from the background-noise estimation. Finally, a statistical analysis is performed on the T30 and EDT values calculated from the impulse response measurements in the Italian historical theatres. The aim of this evaluation is to determine whether a subset of measurements could be considered representative for a complete characterization of these opera houses.
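For reference, a minimal sketch of the classical Schroeder backward integration and of a T30 estimate obtained by linear regression over the -5 to -35 dB decay range; the impulse response below is synthetic, and the thesis's new "local" decay-curve method is not reproduced here.

```python
import numpy as np

def schroeder_decay_db(ir):
    """Schroeder backward integration: accumulate the squared IR from the tail."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]
    return 10 * np.log10(energy / energy[0])

def t30(ir, fs):
    """Reverberation time extrapolated from the -5 dB to -35 dB decay range."""
    decay = schroeder_decay_db(ir)
    t = np.arange(len(ir)) / fs
    mask = (decay <= -5) & (decay >= -35)
    slope = np.polyfit(t[mask], decay[mask], 1)[0]   # decay rate in dB per second
    return -60.0 / slope

# Synthetic exponentially decaying noise as a stand-in for a measured IR.
fs = 48000
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
ir = rng.standard_normal(t.size) * np.exp(-3.0 * t)
print(f"T30 = {t30(ir, fs):.2f} s")
```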
Abstract:
The main objective of this thesis is to obtain a better understanding of the methods used to assess the stability of a slope. We illustrate the principal variants of the Limit Equilibrium (LE) method found in the literature, focusing our attention on the Minimum Lithostatic Deviation (MLD) method developed by Prof. Tinti and his collaborators (e.g. Tinti and Manucci, 2006, 2008). We had two main goals. The first was to test the MLD method on some real cases: we selected the case of the Vajont landslide, with the objective of reconstructing the conditions that caused the destabilization of Mount Toc, and two sites on the Norwegian margin, where failures have not occurred recently, with the aim of evaluating the present stability state and assessing under which conditions they might be mobilized. The second goal was to study the stability charts by Taylor and by Michalowski, and to use the MLD method to investigate the correctness and adequacy of this engineering tool.
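For context on limit-equilibrium calculations, the sketch below implements the ordinary (Fellenius) method of slices for the factor of safety of a circular slip surface; the slice data are placeholders, and the MLD method itself follows a different, more refined formulation.

```python
import math

def fellenius_factor_of_safety(slices, cohesion, phi_deg):
    """Ordinary method of slices:
    FS = sum[c*l + (W*cos(a) - u*l) * tan(phi)] / sum[W * sin(a)]
    Each slice is (weight W [kN/m], base inclination a [deg],
    base length l [m], pore pressure u [kPa])."""
    tan_phi = math.tan(math.radians(phi_deg))
    resisting, driving = 0.0, 0.0
    for W, alpha_deg, l, u in slices:
        alpha = math.radians(alpha_deg)
        resisting += cohesion * l + (W * math.cos(alpha) - u * l) * tan_phi
        driving += W * math.sin(alpha)
    return resisting / driving

# Illustrative three-slice example (all values are placeholders).
slices = [(120.0, 10.0, 2.1, 15.0), (180.0, 25.0, 2.3, 20.0), (90.0, 40.0, 2.6, 10.0)]
print(f"FS = {fellenius_factor_of_safety(slices, cohesion=10.0, phi_deg=25.0):.2f}")
```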
Abstract:
In this dissertation, we focus on developing new green, bio-based gel systems and on evaluating both their cleaning efficiency and the release of residues on the treated surface. Different micro-destructive or non-destructive techniques, such as optical microscopy, TGA, FTIR spectroscopy, HS-SPME and micro-spatially offset Raman spectroscopy (micro-SORS), were tested, and advanced analytical protocols were proposed. In the first part, a ternary PHB-DMC/BD gel system composed of biodiesel, dimethyl carbonate and poly(3-hydroxybutyrate) was developed for the cleaning of wax-based coatings applied to indoor bronzes. The cleaning efficacy of the gel was evaluated on a standard bronze sample covered with a layer of beeswax by restorers of the Opificio delle Pietre Dure in Florence, and on a real case, the precious indoor bronze sculpture Pulpito della Passione attributed to Donatello. Results obtained by FTIR analysis showed an efficient removal of the wax coating. In the second part, two new kinds of combined gels, based on electrospun tissues (PVA and nylon) and the PHB-GVL gel, were developed for the removal of dammar varnish from paintings. The combined electrospun-tissue gels exhibited good mechanical properties and better cleaning efficiency than the plain gel. In the third part, a green deep eutectic solvent (DES) consisting of urea and choline chloride was proposed to produce a rigid gel with agar for the removal of proteinaceous coatings from oil paintings. Oil-painting mock-ups coated with rabbit glue and whole egg were selected to evaluate its cleaning efficiency; results obtained by ATR analysis showed that the DES-agar gel has good cleaning performance. Furthermore, we proposed micro-SORS as a valuable alternative non-destructive method to explore the diffusion of the DES into the painting mock-up. Micro-SORS was successfully applied to monitor the diffusion behaviour of the liquid in the painting sub-layers, providing a useful instrument for non-invasive residue detection in the conservation field.
Abstract:
The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process may be complex to tackle and may sound discouraging to neophytes. In this context, there is an increasing need to derive simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising and in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses of precast RC buildings damaged by the 2012 earthquake is created. A statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at large scale. Furthermore, they may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.
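As an illustration of the fragility-curve concept on which such frameworks typically build, the sketch below evaluates the standard lognormal form P(DS ≥ ds | IM = x) = Φ(ln(x/θ)/β); the median capacity and dispersion are placeholder values, not PRESSAFE-disp parameters.

```python
import math

def lognormal_fragility(im, theta, beta):
    """Probability of reaching a damage state given intensity measure `im`,
    with median capacity `theta` and lognormal dispersion `beta`."""
    z = math.log(im / theta) / beta
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF

# Placeholder parameters: median capacity 0.35 g, dispersion 0.45.
for pga in (0.1, 0.2, 0.35, 0.5):
    print(f"PGA = {pga:.2f} g -> P(collapse) = {lognormal_fragility(pga, 0.35, 0.45):.2f}")
```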
Abstract:
In this thesis I show a new triple connection we found between quantum integrability, N=2 supersymmetric gauge theories and black-hole perturbation theory. I use the approach of the ODE/IM correspondence between Ordinary Differential Equations (ODE) and Integrable Models (IM), first to connect basic integrability functions - Baxter's Q, T and Y functions - to the gauge theory periods. This fundamental identification allows several new results for both theories, for example: an exact nonlinear integral equation (Thermodynamic Bethe Ansatz, TBA) for the gauge periods; an interpretation of the integrability functional relations as new exact R-symmetry relations for the periods; and new formulas for the local integrals of motion in terms of gauge periods. I develop this in full detail at least for the SU(2) gauge theory with Nf = 0, 1, 2 matter flavours. Still through the ODE/IM correspondence, I connect the mathematically precise definition of quasinormal modes of black holes (which play an important role in gravitational-wave observations) with quantization conditions on the Q and Y functions. In this way I also give a mathematical explanation of the recently found connection between quasinormal modes and N=2 supersymmetric gauge theories. Moreover, a new, simple and effective method follows for the numerical computation of quasinormal modes - the TBA - which I compare with other standard methods. The spacetimes for which I show this in full detail are the D3 brane in the simplest Nf = 0 case and, in the Nf = 1, 2 cases, a generalization of extremal Reissner-Nordström (charged) black holes. I then also begin treating the Nf = 3, 4 theories and argue how our integrability-gauge-gravity correspondence can generalize to other types of black holes in either asymptotically flat (Nf = 3) or Anti-de-Sitter (Nf = 4) spacetime. Finally, I begin to show the extension to a four-fold correspondence that also includes Conformal Field Theory (CFT), through the renowned AdS/CFT correspondence.