44 results for Numerical Analysis and Scientific Computing
Abstract:
Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, due to the severe consequences that floods, and water in general, may cause in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of forecasting systems is not to reproduce the river's behavior: that is only a means of reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
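The kind of probabilistic output such a tool should deliver can be sketched with a toy ensemble; all numbers below (ensemble size, hydrograph shape, noise level, flooding threshold) are invented for illustration, not taken from the thesis. Given equally likely forecast hydrographs, the flooding probability within the horizon and the distribution of the flooding time follow by counting members.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: 500 equally likely forecast hydrographs
# (water level, m) over a 48-hour horizon with hourly steps.
n_members, horizon = 500, 48
base = 2.0 + 1.5 * np.sin(np.linspace(0.0, np.pi, horizon))
ensemble = base + rng.normal(0.0, 0.6, size=(n_members, horizon))

threshold = 3.5  # illustrative flooding level (m)

# Flooding probability within the horizon: fraction of members
# whose level exceeds the threshold at least once.
exceeds = (ensemble > threshold).any(axis=1)
p_flood = exceeds.mean()

# Distribution of the flooding time: first exceedance hour,
# restricted to the members that actually flood.
first_hour = np.argmax(ensemble > threshold, axis=1)[exceeds]

print(f"P(flood within {horizon} h) = {p_flood:.2f}")
print(f"median flooding time = {np.median(first_hour):.0f} h")
```

In an operational setting the ensemble would come from perturbed meteorological inputs run through the hydrologic and hydraulic models, and the horizon would be fixed by the lead time of the intervention strategy.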
Abstract:
This thesis addresses two major topics in medical ultrasound imaging: deconvolution and segmentation. For the first, a deconvolution algorithm is described that restores statistically consistent maximum a posteriori estimates of the tissue reflectivity. These estimates are shown to provide a reliable source of information for accurately characterizing biological tissues through the ultrasound echo. The second topic involves the definition of a semi-automatic algorithm for myocardium segmentation in 2D echocardiographic images. The results show that the proposed method can reduce inter- and intra-observer variability in the delineation of myocardial contours and is feasible and accurate even on clinical data.
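As a loose illustration of the idea (not the algorithm of the thesis), a MAP estimate of reflectivity under a Gaussian likelihood and a Gaussian white prior reduces to Wiener/Tikhonov-regularized deconvolution in the frequency domain. The pulse shape, reflector positions, and noise level below are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D radio-frequency line: sparse reflectivity convolved
# (circularly, for simplicity) with an assumed pulse, plus noise.
n = 256
reflectivity = np.zeros(n)
reflectivity[[40, 90, 150, 200]] = [1.0, -0.7, 0.5, 0.6]

t = np.arange(-16, 16)
pulse = np.exp(-(t / 4.0) ** 2) * np.cos(0.8 * t)  # hypothetical system PSF
H = np.fft.fft(np.fft.ifftshift(np.pad(pulse, (n // 2 - 16, n // 2 - 16))))

rf = np.real(np.fft.ifft(H * np.fft.fft(reflectivity)))
rf += rng.normal(0.0, 0.02, n)

# MAP estimate: argmin ||rf - h*x||^2 + lam*||x||^2,
# solved independently per frequency bin.
lam = 1e-2  # noise-to-prior variance ratio (assumed known)
X = np.conj(H) * np.fft.fft(rf) / (np.abs(H) ** 2 + lam)
estimate = np.real(np.fft.ifft(X))
```

The strongest reflector is recovered at its original position; richer priors (e.g. heavy-tailed ones modeling sparse scatterers) change the estimator but not the MAP framing.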
Abstract:
In this dissertation the pyrolytic conversion of biomass into chemicals and fuels was investigated from the analytical point of view. The study focused on the liquid (bio-oil) and solid (char) fractions obtainable from biomass pyrolysis. The drawbacks of Py-GC-MS described so far were partially solved by coupling different analytical configurations (Py-GC-MS, Py-GC-MIP-AED, and off-line Py-SPE and Py-SPME-GC-MS with derivatization procedures). The application of different techniques allowed a satisfactory comparative analysis of the pyrolysis products of different biomasses and a high-throughput screening of the effect of 33 catalysts on biomass pyrolysis. The screening showed that the most interesting catalysts were those containing copper (able to reduce the high-molecular-weight fraction of bio-oil without a large decrease in yield) and H-ZSM-5 (able to convert the bio-oil entirely into "gasoline-like" aromatic products). In order to establish the content of noxious compounds in the liquid product, a clean-up step was included in the Py-SPE procedure. This allowed the investigation of pollutant (PAH) generation from pyrolysis and catalytic pyrolysis of biomass. In fact, bio-oil from non-catalytic pyrolysis of biomass showed a moderate PAH content, while the use of the H-ZSM-5 catalyst for bio-oil upgrading produced an astonishingly high amount of PAHs (compared to what is observed in alkane cracking), indicating an important concern in substituting fossil fuels with biomass-derived bio-oil. Moreover, the analytical procedures developed in this thesis were directly applied to the detailed study of the most useful process schemes and upgrading routes to chemical intermediates (anhydrosugars), transportation fuels, or commodity chemicals (aromatic hydrocarbons). In this applied study, poplar and microalgae biomass were investigated, and an overall GHG balance of the pyrolysis of agricultural residues in the Ravenna province was performed.
Special attention was paid to comparing the effects of different uses of bio-char (as fuel or as soil conditioner) on soil health and GHG emissions.
Abstract:
The present work is devoted to the assessment of the physics of energy fluxes in the space of scales and in the physical space of wall-turbulent flows. The generalized Kolmogorov equation will be applied to DNS data of a turbulent channel flow in order to describe the paths of the energy fluxes from production to dissipation in the augmented space of wall-turbulent flows. This multidimensional description will be shown to be crucial to understanding the formation and sustainment of the turbulent fluctuations fed by the energy fluxes coming from the near-wall production region. An unexpected behavior of the energy fluxes emerges from this analysis, consisting of spiral-like paths in the combined physical/scale space in which the controversial reverse energy cascade plays a central role. The observed behavior conflicts with the classical notion of the Richardson/Kolmogorov energy cascade and may have strong repercussions on both theoretical and modeling approaches to wall turbulence. To this aim, a new relation stating the leading physical processes governing the energy transfer in wall turbulence is suggested and shown to be able to capture most of the rich dynamics of the shear-dominated region of the flow. Two dynamical processes are identified as driving mechanisms for the fluxes: one in the near-wall region and a second one further away from the wall. The former, stronger one is related to the dynamics involved in the near-wall turbulence regeneration cycle. The second suggests an outer self-sustaining mechanism, asymptotically expected to take place in the log layer, which could explain the debated mixed inner/outer scaling of the near-wall statistics. The same approach is applied for the first time to a filtered velocity field. A generalized Kolmogorov equation specialized to filtered velocity fields is derived and discussed.
The results will show what effects the subgrid scales have on the resolved motion in both physical and scale space, singling out the prominent role of the filter length compared with the cross-over scale between the production-dominated scales and the inertial range, lc, and with the reverse-energy-cascade region, lb. The systematic characterization of the resolved and subgrid physics as a function of the filter scale and of the wall distance will be shown to be instrumental for a correct use of LES models in the simulation of wall-turbulent flows. Taking inspiration from the new relation for the energy transfer in wall turbulence, a new class of LES models will also be proposed. Finally, the generalized Kolmogorov equation specialized to filtered velocity fields will be shown to be a helpful statistical tool for the assessment of LES models and for the development of new ones. As an example, some classical, purely dissipative eddy-viscosity models are analyzed via an a priori procedure.
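For orientation, the balance underlying this kind of analysis can be written schematically; the grouping of terms below follows a commonly used channel-flow form of the generalized Kolmogorov equation and may differ in detail from the one adopted in the thesis. With delta denoting the two-point increment, the asterisk the mid-point average, Y_c the mid-point wall distance, r the separation vector, and <du^2> the trace of the second-order structure function:

```latex
\nabla_{\mathbf{r}}\cdot\langle\delta u^{2}\,\delta\mathbf{u}\rangle
+ \frac{\partial\langle v^{*}\,\delta u^{2}\rangle}{\partial Y_{c}}
= -2\,\langle\delta u\,\delta v\rangle\left(\frac{\mathrm{d}U}{\mathrm{d}y}\right)^{\!*}
- \frac{2}{\rho}\,\frac{\partial\langle\delta p\,\delta v\rangle}{\partial Y_{c}}
+ 2\nu\,\nabla_{\mathbf{r}}^{2}\langle\delta u^{2}\rangle
+ \frac{\nu}{2}\,\frac{\partial^{2}\langle\delta u^{2}\rangle}{\partial Y_{c}^{2}}
- 4\langle\epsilon\rangle^{*}
```

The left-hand side gathers the transfer in the space of scales and the wall-normal spatial flux; the right-hand side gathers production against the mean shear, pressure transport, viscous diffusion in scale and in space, and dissipation. The reverse energy cascade discussed above corresponds to regions where the scale-space flux points toward larger scales.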
Abstract:
This thesis deals with the development of the upcoming aeronautical mobile airport communication system (AeroMACS). We analyzed the performance of AeroMACS and investigated potential solutions for enhancing it. Since the most critical results correspond to the channel scenario with the least diversity, we tackled this problem by investigating potential solutions for increasing the diversity of the system and thereby improving its performance. We considered different forms of diversity, such as space diversity and time diversity. More specifically, space (antenna and cooperative) diversity and time diversity are analyzed as countermeasures against the harsh fading conditions that are typical of airport environments. Among the analyzed techniques, two novel concepts are introduced, namely unequal diversity coding and flexible packet-level codes. The proposed techniques were analyzed on a novel airport channel model, derived from a measurement campaign at the airport of Munich (Germany). The introduced techniques largely improve the performance of the conventional AeroMACS link, thus representing appealing solutions for the long-term evolution of the system.
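Packet-level coding trades redundancy for erasure recovery. The minimal sketch below (a single XOR parity packet over a block of data packets, not the flexible scheme of the thesis) shows the underlying mechanism: any one packet lost within the block can be rebuilt from the survivors.

```python
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

data = [b"pkt0", b"pkt1", b"pkt2"]  # toy data packets
parity = reduce(xor, data)          # single parity packet appended to the block

# Simulate the erasure of packet 1 and recover it from the rest:
# XOR-ing the surviving data packets with the parity cancels them out.
received = [data[0], None, data[2]]
survivors = [p for p in received if p is not None]
recovered = reduce(xor, survivors + [parity])

assert recovered == b"pkt1"
```

Practical packet-level codes (e.g. Reed-Solomon or fountain codes) generalize this to recover several losses per block, at the cost of more parity packets.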
Abstract:
Neisseria meningitidis (Nm) is the major cause of septicemia and meningococcal meningitis. During the course of infection, it must adapt to different host environments, which is a crucial factor for survival. Despite the severity of meningococcal sepsis, little is known about how Nm adapts to permit survival and growth in human blood. A previous time-course transcriptome analysis, using an ex vivo model of human whole-blood infection, showed that Nm alters the expression of nearly 30% of the ORFs in its genome: major dynamic changes were observed in the expression of transcriptional regulators, transport and binding proteins, energy metabolism, and surface-exposed virulence factors. Starting from these data, mutagenesis studies of a subset of up-regulated genes were performed and the mutants were tested for the ability to survive in human whole blood; Nm mutant strains lacking the genes encoding NMB1483, NalP, Mip, NspA, Fur, TbpB, and LctP were sensitive to killing by human blood. The analysis was then extended to the whole Nm transcriptome in human blood, using a customized 60-mer oligonucleotide tiling microarray. The application of specifically developed software combined with this new tiling array allowed the identification of different types of regulated transcripts: small intergenic RNAs, antisense RNAs, 5' and 3' untranslated regions, and operons. The expression of these RNA molecules was confirmed by the 5'-3' RACE protocol and specific RT-PCR. Here we describe the complete transcriptome of Nm during incubation in human blood; we were able to identify new proteins important for survival in human blood and also to identify additional roles of previously known virulence factors in aiding survival in blood. In addition, the tiling-array analysis demonstrated that Nm expresses a set of new transcripts not previously identified, and suggests the presence of a circuit of regulatory RNA elements used by Nm to adapt to, and proliferate in, human blood.
Abstract:
Extrusion is a process used to form long products of constant cross section, in a wide variety of shapes, from simple billets. Aluminum alloys are the materials most processed in the extrusion industry, thanks to their deformability and a wide field of applications ranging from buildings to aerospace and from design to the automotive industry. These diverse applications imply different requirements that can be fulfilled by the wide range of available alloys and treatments, from critical structural applications to high-quality surfaces and aesthetic aspects. Whether one or the other is the critical aspect, both depend directly on microstructure. The extrusion process is moreover marked by large deformations and complex strain gradients, which make the control of microstructure evolution difficult; at present, such control has not yet been fully achieved. Nevertheless, Finite Element modeling has reached maturity and can therefore start to be used as a tool for the investigation and prediction of microstructure evolution. This thesis analyzes and models the evolution of microstructure throughout the entire extrusion process for 6XXX-series aluminum alloys. The core of the work was the development of specific tests to investigate microstructure evolution and to validate the model implemented in a commercial FE code. Alongside this, two essential activities were carried out for a correct calibration of the model, beyond a simple search for fitting parameters, thus leading to the understanding and control of both the code and the process. In this direction, activities were also conducted to build critical know-how on the interpretation of microstructure and extrusion phenomena. It is believed, in fact, that analyzing microstructure evolution regardless of its relevance to the technological aspects of the process would be of little use to industry, as well as ineffective for the interpretation of the results.
Abstract:
This study focuses on the radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized. Then, results on the characterization of the plasma source and on the investigation of the nanoparticle synthesis phenomenon are presented, aiming at highlighting fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, obtained with a calorimetric method, is presented, while results from three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy-probe measurements to validate the temperature field predicted by the model, which is used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles, and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Indeed, precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed; by employing models that describe particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated.
Finally, the role of the thermo-fluid-dynamic fields in nanoparticle formation is discussed, together with a study of the effect of the reaction-chamber geometry on the characteristics of the produced nanoparticles and on the process yield.
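The calorimetric energy balance mentioned above amounts to bookkeeping on the cooling circuits: the heat each circuit removes follows from its water flow rate and temperature rise, and the power coupled to the gas is what remains of the plate power. A minimal sketch, with invented flow rates, temperatures, and plate power:

```python
CP_WATER = 4186.0   # J/(kg K), specific heat of the cooling water

plate_power = 15e3  # W, assumed RF plate power delivered to the torch
cooling = {         # assumed circuits: (mass flow kg/s, T_in K, T_out K)
    "torch":   (0.25, 293.0, 301.0),
    "chamber": (0.40, 293.0, 297.5),
}

# Heat removed by each circuit: Q = m_dot * c_p * (T_out - T_in).
losses = {name: m_dot * CP_WATER * (t_out - t_in)
          for name, (m_dot, t_in, t_out) in cooling.items()}

# Power coupled to the gas = plate power minus torch cooling losses;
# the torch (thermal) efficiency follows.
power_to_gas = plate_power - losses["torch"]
efficiency = power_to_gas / plate_power
```

In practice, uncertainties in flow metering and in the small temperature differences dominate the error budget of such a balance.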
Abstract:
Climate-change-related impacts, notably coastal erosion, inundation, and flooding from sea-level rise and storms, will increase in the coming decades, enhancing the risks for coastal populations. Further recourse to coastal armoring and other engineered defenses to address risk reduction will exacerbate threats to coastal ecosystems. Alternatively, the protection services provided by healthy ecosystems are emerging as a key element in climate adaptation and disaster risk management. I examined two distinct approaches to coastal defense on the basis of their ecological and ecosystem-conservation values. First, I analyzed the role of coastal ecosystems in providing services for hazard risk reduction. The value of coral reefs in wave attenuation was quantitatively demonstrated using a meta-analysis approach. The results indicate that coral reefs can provide wave attenuation comparable to that of hard engineered artificial defenses, and at lower cost. Conservation and restoration of existing coral reefs are therefore cost-effective management options for disaster risk reduction. Second, I evaluated the possibility of enhancing the ecological value of artificial coastal defense structures (CDS) as habitats for marine communities. I documented the suitability of CDS to support native, ecologically relevant, habitat-forming canopy algae, exploring the feasibility of enhancing the ecological value of CDS by promoting the growth of desired species. Juveniles of Cystoseira barbata can be successfully transplanted to both natural and artificial habitats and are affected neither by the lack of surrounding adult algal individuals nor by substratum orientation. Transplantation success was limited by biotic disturbance from macrograzers on CDS compared with natural habitats. Future work should explore the reasons behind the different ecological functioning of artificial and natural habitats, unraveling the factors and mechanisms that cause it.
Understanding the functioning of the systems associated with artificial habitats is key to allowing environmental managers to identify proper mitigation options and to forecast the impact of alternative coastal development plans.
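The effect-size metric behind such wave-attenuation comparisons is simple: wave energy scales with the square of wave height, so each paired observation of incident and transmitted height yields a fraction of energy dissipated. The heights below are invented for illustration, not data from the meta-analysis.

```python
import numpy as np

# Hypothetical paired observations of incident and transmitted
# wave heights (m) across a reef crest.
h_incident    = np.array([1.2, 0.8, 2.0, 1.5, 0.9])
h_transmitted = np.array([0.4, 0.3, 0.7, 0.4, 0.3])

# Energy ~ H^2, so the fraction of wave energy dissipated is
# 1 - (Ht/Hi)^2 for each observation; the pooled mean summarizes it.
attenuation = 1.0 - (h_transmitted / h_incident) ** 2

print(f"mean wave-energy attenuation = {attenuation.mean():.0%}")
```

A real meta-analysis would additionally weight each study by its precision and account for between-study heterogeneity.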
Abstract:
The goal of much plant-science research is to explain natural phenotypic variation in terms of simple changes in DNA sequence. DNA-based molecular markers are extensively used for the construction of genome-wide molecular maps and for genetic analysis of simple and complex traits. This PhD thesis was divided into two main research lines, according to the different approaches adopted. The first research line analyzes the genetic diversity of an Italian apple germplasm collection to identify markers tightly linked to target genes by an association-genetics method. This made it possible to identify synonym and homonym accessions as well as triploids. The fruit red-skin color trait was used to test the reliability of the genetic approaches in this species. The second line concerns the development of molecular markers closely linked to the Rvi13 and Rvi5 scab resistance genes, previously mapped on apple chromosomes 10 and 17, respectively, using the traditional linkage-mapping method. Both regions have been fine-mapped with various types of markers that could be used for marker-assisted selection in future breeding programs and for isolating the two resistance genes.
Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the angular power spectrum (APS) extraction performance of the optimal QML method, implemented in the code BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method was then applied to the Planck data at large angular scales to extract the CMB APS. The same method was also applied to analyze the TT parity and low-variance anomalies in the Planck maps, which show a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code was instead applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here, a new technology called Photonic Crystals is exploited to develop a new polarization-splitter device, and its performance is compared with that of the devices used today.
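The statistical core of APS extraction can be sketched in a few lines; this is a toy full-sky estimator, not the QML or pseudo-Cl machinery of BolPol and Cromaster, and the input spectrum is invented. Drawing Gaussian harmonic coefficients from an assumed spectrum and estimating it back also exposes the cosmic variance, 2*C_l^2/(2l+1), that limits any estimator at low multipoles.

```python
import numpy as np

rng = np.random.default_rng(2)

lmax = 64
ells = np.arange(lmax + 1)
cl_true = 1.0 / (ells + 1.0) ** 2  # assumed input angular power spectrum

# For each multipole l, draw the 2l+1 independent (real-basis) a_lm
# and form the unbiased full-sky estimator C_l = sum_m |a_lm|^2 / (2l+1).
cl_hat = np.empty(lmax + 1)
for ell in ells:
    alm = rng.normal(0.0, np.sqrt(cl_true[ell]), size=2 * ell + 1)
    cl_hat[ell] = np.mean(alm ** 2)

# The scatter of cl_hat around cl_true shrinks with l: cosmic variance.
ratio = cl_hat[30:] / cl_true[30:]
```

On a cut sky the naive estimator becomes biased and mode-coupled, which is precisely what pseudo-Cl corrections and the QML approach address in their different ways.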
Abstract:
Nanotechnologies are rapidly expanding because of the opportunities that the new materials offer in many areas, such as the manufacturing industry; food production, processing, and preservation; and the pharmaceutical and cosmetic industries. The size distribution of nanoparticles determines their properties and is a fundamental parameter that needs to be monitored from small-scale synthesis up to bulk production and quality control of nanotech products on the market. A consequence of the increasing number of applications of nanomaterials is that EU regulatory authorities are introducing an obligation for companies that use nanomaterials to acquire analytical platforms for the assessment of the size parameters of those nanomaterials. In this work, Asymmetrical Flow Field-Flow Fractionation (AF4) and Hollow Fiber F4 (HF5), hyphenated with Multiangle Light Scattering (MALS), are presented as tools for a deep functional characterization of nanoparticles. In particular, the applicability of AF4-MALS to the characterization of liposomes in a wide series of media is demonstrated. The technique is then used to explore the functional features of a liposomal drug vector in terms of its biological and physical interactions with blood serum components: a comprehensive approach to understanding the behavior of lipid vesicles in terms of drug release and fusion/interaction with other biological species is described, together with the weaknesses and strengths of the method. Afterwards, the size characterization, size stability, and conjugation of azidothymidine drug molecules with a new generation of metastable drug vectors, the Metal Organic Frameworks, are discussed.
Lastly, the applicability of HF5-ICP-MS to the rapid screening of samples of relevant nanorisk is shown: rather than a deep and comprehensive characterization, a quick and smart methodology is presented that, within a few steps, provides qualitative information on the content of metallic nanoparticles in tattoo-ink samples.
Abstract:
Waste management is an important issue in our society, and Waste-to-Energy incineration plants have been playing a significant role in recent decades, with increasing importance in Europe. One of the main issues posed by waste combustion is the generation of air contaminants. Acid gases, mainly hydrogen chloride and sulfur oxides, are of particular concern due to their potential impact on the environment and on human health. Therefore, in the present study the main available technological options for flue-gas treatment were analyzed, focusing on dry treatment systems, which are increasingly applied in Municipal Solid Waste (MSW) incinerators. An operational model was proposed to describe and optimize the acid-gas removal process. It was applied to an existing MSW incineration plant, where acid gases are neutralized in a two-stage dry treatment system. This process is based on the injection of powdered calcium hydroxide and sodium bicarbonate into reactors followed by fabric filters. HCl and SO2 conversions were expressed as functions of the reactant flow rates, with the model parameters calculated from literature and plant data. Implementation in process-simulation software allowed the identification of optimal operating conditions, taking into account the reactant feed rates, the amount of solid products, and the recycling of the sorbent. Alternative configurations of the reference plant were also assessed. The applicability of the operational model was extended by also developing a fundamental approach to the issue: a predictive model was developed, describing the mass-transfer and kinetic phenomena governing acid-gas neutralization with solid sorbents. The rate-controlling steps were identified through the reproduction of literature data, allowing the description of acid-gas removal in the case study analyzed. A laboratory device was also designed and started up to determine the required model parameters.
Abstract:
The first part of this thesis focuses on the construction of a twelve-phase asynchronous machine for More Electric Aircraft (MEA) applications. The aerospace world has found in electrification the way to improve the efficiency, reliability, and maintainability of an aircraft; this idea leads to a new management and distribution of electrical services aboard the aircraft, making it possible to remove or reduce the hydraulic, mechanical, and pneumatic systems on board. The second part of this dissertation is dedicated to enhancing the control range of matrix converters (MCs) operating with non-unity input power factor and, at the same time, to reducing the switching power losses. The analysis leads to the determination, in closed form, of a modulation strategy whose control range, in terms of output voltage and input power factor, is greater than that of traditional strategies under the same operating conditions, and which also reduces the switching power losses.