10 results for methods of resolution enhancement
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In this work, new tools in atmospheric pollutant sampling and analysis were applied in order to deepen source apportionment studies. The project was developed mainly through the study of atmospheric emission sources in a suburban area influenced by a municipal solid waste incinerator (MSWI), a medium-sized coastal tourist town and a motorway. Two main research lines were followed. Regarding the first line, the potential of PM samplers coupled with a wind-select sensor was assessed. Results showed that they may be a valid support in source apportionment studies; however, meteorological and territorial conditions can strongly affect the results. Moreover, new markers were investigated, with particular focus on biomass burning processes. OC proved to be a good indicator of the biomass combustion process, as did all the organic compounds determined. Among metals, lead and aluminium correlated well with biomass combustion. Surprisingly, PM was not enriched in potassium during the bonfire event. The second research line consisted of the application of Positive Matrix Factorization (PMF), a new statistical tool for data analysis. This technique was applied to datasets with different time resolutions. PMF application to atmospheric deposition fluxes identified six main sources affecting the area; the incinerator's relative contribution appeared negligible. PMF analysis was then applied to PM2.5 collected with samplers coupled with a wind-select sensor. The larger number of environmental indicators determined made it possible to obtain more detailed results on the sources affecting the area. Vehicular traffic proved to be the source of greatest concern for the study area; in this case too, the incinerator's relative contribution appeared negligible. Finally, the application of PMF analysis to hourly aerosol data demonstrated that the higher the temporal resolution of the data, the closer the source profiles were to the real ones.
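As an illustration of the factorization underlying PMF, the sketch below applies plain non-negative matrix factorization to a synthetic samples × species matrix. It is not the EPA PMF tool (which additionally weights residuals by measurement uncertainties), and the two source profiles and four species here are invented for the example.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Hypothetical source profiles (2 sources x 4 chemical species) and
# time-varying contributions (50 samples x 2 sources); all values invented.
profiles = np.array([[0.6, 0.3, 0.05, 0.05],   # e.g. traffic-like
                     [0.1, 0.1, 0.5, 0.3]])    # e.g. biomass-burning-like
contributions = rng.uniform(0.5, 5.0, size=(50, 2))
X = contributions @ profiles + rng.uniform(0, 0.01, size=(50, 4))

# Non-negative factorization X ~ G F, the core idea behind PMF
model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)   # estimated source contributions per sample
F = model.components_        # estimated source profiles

# Normalize profiles to unit sum for comparison with the true ones
F_norm = F / F.sum(axis=1, keepdims=True)
print(np.round(F_norm, 2))
```

In real applications the number of factors is not known in advance and is chosen by inspecting residuals and the physical plausibility of the recovered profiles.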
Abstract:
The research performed during the PhD course aimed to assess innovative applications of near-infrared reflectance spectroscopy (NIR) in the beer production chain. The purpose was to measure by NIR the "malting quality" (MQ) parameter of barley, to monitor the malting process and to determine whether a given type of barley is suitable for the production of beer and spirits; NIR was also applied to monitor the brewing process. First of all, it was possible to check the quality of raw materials such as barley, maize and barley malt using a rapid, non-destructive and reliable method with a low error of prediction. The most interesting result at this level was that the repeatability of the NIR calibration models developed was comparable with that of the reference method. Moreover, for malt, new kinds of validation were used in order to estimate the real predictive power of the proposed calibration models and to understand long-term effects. Furthermore, the precision of all the calibration models developed for malt evaluation was estimated and statistically compared with the reference methods, with good results. New calibration models were then developed for monitoring the malting process, measuring the moisture content and other malt quality parameters during germination. It was also possible to obtain by NIR an estimate of the "malting quality" (MQ) of barley and to predict whether its germination will be rapid and uniform and whether a given type of barley is suitable for the production of beer and spirits. Finally, the NIR technique was applied to monitor the brewing process, using correlations between NIR spectra of beer and analytical parameters, and to assess beer quality.
These innovative results are potentially very useful for the actors involved in the beer production chain, especially the calibration models suitable for controlling the malting process and assessing the "malting quality" of barley, which should be explored further in future studies.
Abstract:
The consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector: ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields, investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique to improve the freezing process of potatoes, bringing forward the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenol and enzymatic profile of melon juice was carried out, and the statistical treatment of the data was performed through a response surface method. Next, flavour enrichment of apple sticks was performed applying different techniques, such as atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenolic compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the olive mill sector was approached with the aim of reducing the amount of waste while recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties.
An update on flavour development in fresh and cooked potatoes was compiled, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, was drafted.
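A response surface method, as mentioned for the pulsed-electric-field data, typically fits a second-order polynomial in the process factors by least squares. The sketch below does this for two hypothetical factors (field strength and treatment time) and an invented response; it is not the thesis's dataset or model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical PEF experiment: field strength E (kV/cm), treatment
# time t (us), response y (e.g. residual enzyme activity, %); all invented.
E = rng.uniform(1, 5, 30)
t = rng.uniform(10, 100, 30)
y_true = 90 - 5 * E - 0.2 * t + 0.3 * E**2 + 0.001 * t**2 + 0.05 * E * t
y = y_true + rng.normal(0, 0.5, 30)

# Second-order response surface:
# y = b0 + b1*E + b2*t + b3*E^2 + b4*t^2 + b5*E*t
A = np.column_stack([np.ones_like(E), E, t, E**2, t**2, E * t])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_fit = A @ coef
r2 = 1 - np.sum((y - y_fit) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3), "R^2 =", round(r2, 3))
```

The fitted surface is then used to locate factor settings that optimize the response, which is the point of the method.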
Abstract:
The thesis is focused on the development of a method for the synthesis of silicon nanocrystals with different sizes, narrow size distributions, good optical properties and stability in air. The resulting silicon nanocrystals were covalently functionalized with different chromophores with the aim of exploiting the new electronic and chemical properties that emerge from the interaction between the silicon nanocrystal surface and the ligands. The purpose is to use these chromophores as light-harvesting antennae, increasing the optical absorption of the silicon nanocrystals. The functionalized silicon nanocrystals were characterized with different analytical techniques, providing a thorough understanding of the optical properties of these semiconductor quantum dots.
Abstract:
Noise is a constant presence in measurements. Its origin is related to the microscopic properties of matter. Since the seminal work of Brown in 1828, the study of stochastic processes has attracted increasing interest, accompanied by the development of new mathematical and analytical tools. In recent decades, the central role that noise plays in chemical and physiological processes has come to be recognized. The dual role of noise as nuisance/resource pushes towards the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for the denoising of powder X-ray diffractograms, and I denoised two-dimensional images from electrochemiluminescence (ECL) imaging experiments on micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
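A minimal sketch of the SSA decomposition described above, assuming the standard embed / SVD / diagonal-averaging scheme (the thesis's iterative procedure is more elaborate):

```python
import numpy as np

def ssa_denoise(x, window, rank):
    """Basic SSA: embed the series in a trajectory (Hankel) matrix,
    keep the leading SVD components, reconstruct by anti-diagonal
    averaging. The truncated rank captures the deterministic part."""
    n = len(x)
    k = n - window + 1
    # Trajectory matrix: lagged copies of the signal as columns
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Diagonal (Hankel) averaging back to a 1-D series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += X_r[:, j]
        counts[j:j + window] += 1
    return rec / counts

rng = np.random.default_rng(3)
t = np.linspace(0, 4 * np.pi, 400)
clean = np.sin(t) + 0.5 * np.sin(3 * t)          # deterministic component
noisy = clean + rng.normal(0, 0.3, t.size)       # added stochastic component
denoised = ssa_denoise(noisy, window=60, rank=4)  # 2 sinusoids -> rank 4
print("noise RMS before:", round(np.std(noisy - clean), 3),
      "after:", round(np.std(denoised - clean), 3))
```

The window length and retained rank are the two tuning parameters; in practice the rank is chosen by inspecting the singular value spectrum for a gap between signal and noise components.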
Abstract:
This research activity aims at providing a reliable estimation of particular state variables and parameters concerning the dynamics and performance optimization of a MotoGP-class motorcycle, integrating the classical model-based approach with new methodologies involving artificial intelligence. The first topic of the research focuses on the estimation of the thermal behavior of the MotoGP carbon braking system. Numerical tools are developed to assess the instantaneous surface temperature distribution in the motorcycle's front brake discs. Within this application, other important brake parameters are identified using Kalman filters, such as the disc convection coefficient and the power distribution in the disc-pad contact region. Subsequently, a physical model of the brake is built to estimate the instantaneous braking torque. However, the results obtained with this approach are strongly limited by knowledge of the friction coefficient (μ) between the disc rotor and the pads. Since the value of μ is a highly nonlinear function of many variables (namely temperature, pressure and angular velocity of the disc), an analytical model for the friction coefficient is impractical to establish. To overcome this challenge, an innovative hybrid solution is implemented, combining the benefits of artificial intelligence (AI) with the classical model-based approach: the disc temperature estimated by the thermal model is processed by a machine learning algorithm that outputs the actual value of the friction coefficient, thus improving the braking torque computation performed by the physical model of the brake. Finally, the last topic of this research activity regards the development of an AI algorithm to estimate the current sideslip angle of the motorcycle's front tire.
While a single-track motorcycle kinematic model and IMU accelerometer signals theoretically enable sideslip calculation, the presence of accelerometer noise leads to significant drift over time. To address this issue, a long short-term memory (LSTM) network is implemented.
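As a toy illustration of the Kalman-filter estimation mentioned above, the sketch below fuses a lumped disc-cooling model with noisy temperature measurements. The cooling model, noise levels and coefficients are all invented for the example and are not the thesis's models.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical lumped cooling model for a brake disc:
# T[k+1] = T[k] - dt * h * (T[k] - T_amb), observed by a noisy sensor.
dt, h, T_amb = 0.1, 0.05, 25.0
n = 200
T_true = np.empty(n)
T_true[0] = 400.0
for k in range(n - 1):
    T_true[k + 1] = T_true[k] - dt * h * (T_true[k] - T_amb)
z = T_true + rng.normal(0, 5.0, n)   # noisy temperature measurements

# Scalar Kalman filter: state x = disc temperature
a = 1 - dt * h                        # transition coefficient
Q, R = 0.01, 25.0                     # process / measurement variance
x, P = z[0], R                        # initialize from first measurement
est = np.empty(n)
for k in range(n):
    # predict with the cooling model
    x = a * x + dt * h * T_amb
    P = a * a * P + Q
    # update with measurement z[k]
    K = P / (P + R)
    x = x + K * (z[k] - x)
    P = (1 - K) * P
    est[k] = x

print("RMS error raw:", round(np.std(z - T_true), 2),
      "filtered:", round(np.sqrt(np.mean((est - T_true) ** 2)), 2))
```

Estimating an unknown parameter such as the convection coefficient, as done in the thesis, would augment the state vector with h and use an extended or unscented filter, since the dynamics become nonlinear in the augmented state.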
Abstract:
The object of the present study is the process of gas transport in nano-sized materials, i.e. systems having structural elements on the order of nanometers. The aim of this work is to advance the understanding of the gas transport mechanism in such materials, for which traditional models are often not suitable, by providing a correct interpretation of the relationship between diffusive phenomena and structural features. This result would allow the development of new materials with permeation properties tailored to the specific application, especially in packaging systems. The methods used to achieve this goal were a detailed experimental characterization and different simulation methods. The experimental campaign concerned the determination of oxygen permeability and diffusivity in different sets of organic-inorganic hybrid coatings prepared via the sol-gel technique. The polymeric samples coated with these hybrid layers showed a remarkable enhancement of barrier properties, which was explained by the strong interconnection at the nano-scale between the organic moiety and the silica domains. An analogous characterization was performed on microfibrillated cellulose films, which presented a remarkable barrier effect toward oxygen when dry, while in the presence of water the performance dropped significantly. The very low value of water diffusivity at low activities is another interesting characteristic related to the films' structural properties. Two different simulation approaches were then considered: the diffusion of oxygen through polymer-layered silicates was modeled on a continuum scale with CFD software, while the properties of n-alkanethiolate self-assembled monolayers on gold were analyzed from a molecular point of view by means of a molecular dynamics algorithm.
Modeling transport properties in layered nanocomposites, resulting from the ordered dispersion of impermeable flakes in a 2-D matrix, allowed the calculation of the enhancement of the barrier effect in relation to the platelets' structural parameters, leading to the derivation of a new expression. On this basis, randomly distributed systems were simulated and the results were analyzed to evaluate the different contributions to the overall effect. The study of more realistic three-dimensional geometries revealed a perfect correspondence with the 2-D approximation. A completely different approach was applied to simulate the effect of temperature on oxygen transport through self-assembled monolayers: structural information obtained from equilibrium MD simulations showed that raising the temperature makes the monolayer less ordered and consequently less crystalline. This disorder produces a decrease in the barrier free energy and lowers the overall resistance to oxygen diffusion, making the monolayer more permeable to small molecules.
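The classical starting point for the flake-induced barrier enhancement discussed above is the Nielsen tortuosity model. The thesis derives a new expression, which is not reproduced here; the sketch below only illustrates the classical formula, with hypothetical volume fractions and aspect ratios.

```python
def nielsen_relative_permeability(phi, alpha):
    """Nielsen model for ordered impermeable flakes in a matrix:
    P/P0 = (1 - phi) / (1 + alpha * phi / 2), where phi is the flake
    volume fraction and alpha the flake aspect ratio. Tortuosity
    lengthens the diffusion path by the factor 1 + alpha * phi / 2."""
    return (1 - phi) / (1 + alpha * phi / 2)

# Barrier improvement factor P0/P for hypothetical loadings
for phi in (0.01, 0.05, 0.10):
    for alpha in (50, 200):
        bif = 1 / nielsen_relative_permeability(phi, alpha)
        print(f"phi={phi:.2f}, alpha={alpha:4d} -> barrier x{bif:5.1f}")
```

The quadratic dependence on aspect ratio at fixed loading is why high-aspect-ratio platelets are so effective in packaging films, and why random orientation and imperfect exfoliation, which reduce the effective aspect ratio, degrade the predicted enhancement.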
Abstract:
Monitoring foetal health is a very important task in clinical practice, needed to appropriately plan pregnancy management and delivery. In the third trimester of pregnancy, ultrasound cardiotocography is the most widely employed diagnostic technique: foetal heart rate and uterine contraction signals are simultaneously recorded and analysed in order to ascertain foetal health. Because the interpretation of ultrasound cardiotocography still lacks complete reliability, new parameters and methods of interpretation, or alternative methodologies, are necessary to further support physicians' decisions. To this aim, in this thesis, foetal phonocardiography and electrocardiography are considered as alternative techniques, and the variability of the foetal heart rate is thoroughly studied. Frequency components and their modifications can be analysed by applying a time-frequency approach, for a distinct understanding of the spectral components and their change over time related to foetal reactions to internal and external stimuli (such as uterine contractions). Such modifications of the power spectrum can be a sign of autonomic nervous system reactions and therefore represent additional, objective information about foetal reactivity and health. However, some limits of ultrasonic cardiotocography remain, for example in long-term foetal surveillance, which is often recommended mainly in at-risk pregnancies. In these cases, fully non-invasive acoustic recording through the maternal abdomen, i.e. foetal phonocardiography, represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded foetal heart sound signal is heavily corrupted by noise, so the determination of the foetal heart rate raises serious signal processing issues. A new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings is presented in this thesis.
Different filtering and enhancement techniques were applied to enhance the first foetal heart sounds; the resulting signal processing strategies were implemented, evaluated and compared, identifying the strategy characterized on average by the best results. In particular, phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by the developed algorithm and the one provided by the cardiotocographic device). The algorithm's performance was tested on phonocardiographic signals recorded on pregnant women, yielding reliable foetal heart rate signals very close to the ultrasound cardiotocographic recordings taken as reference. The algorithm was also tested using a foetal phonocardiographic recording simulator developed and presented in this thesis. The aim was to provide software for simulating recordings corresponding to different foetal conditions and recording situations, and to use it as a test tool for comparing and assessing different foetal heart rate extraction algorithms. Since there are few studies on the time characteristics and frequency content of foetal heart sounds, and the available literature in this area is poor and not rigorous, a data-collection pilot study was also conducted with the purpose of specifically characterising both foetal and maternal heart sounds. Finally, the use of foetal phonocardiographic and electrocardiographic methodologies, and their combination, is presented in order to detect foetal heart rate and other functional anomalies. The developed methodologies, suitable for longer-term assessment, were able to detect heartbeat events correctly, such as first and second heart sounds and QRS waves. The detection of such events provides reliable measures of foetal heart rate, and potentially information about systolic time intervals and foetal circulatory impedance.
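A band-pass / envelope / peak-picking pipeline is a common baseline for heart-sound-based heart rate estimation; the thesis's algorithm is more sophisticated. In the sketch below the synthetic signal, filter band and thresholds are all invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

rng = np.random.default_rng(5)

# Synthetic foetal PCG: S1-like bursts at 140 bpm buried in noise
# (real foetal phonocardiograms are far messier than this).
fs = 1000
t = np.arange(0, 10, 1 / fs)
fhr_bpm = 140
beat_times = np.arange(0.2, 10, 60 / fhr_bpm)
pcg = np.zeros_like(t)
for bt in beat_times:
    pcg += np.exp(-((t - bt) / 0.015) ** 2) * np.sin(2 * np.pi * 40 * (t - bt))
noisy = pcg + rng.normal(0, 0.3, t.size)

# 1) band-pass around the S1 energy band, 2) Hilbert envelope,
# 3) peak picking with a refractory period shorter than one beat
b, a = butter(4, [20, 60], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, noisy)
envelope = np.abs(hilbert(filtered))
peaks, _ = find_peaks(envelope, distance=int(0.35 * fs),
                      height=0.5 * envelope.max())
fhr_est = 60 / np.median(np.diff(peaks) / fs)
print(f"estimated FHR: {fhr_est:.1f} bpm")
```

Using the median of inter-peak intervals makes the estimate robust to occasional missed or spurious peaks, which is the main failure mode when the signal-to-noise ratio drops.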
Abstract:
A first phase of the research activity was devoted to the state of the art of cycling infrastructure, bicycle use and evaluation methods. In this part, the candidate studied the "bicycle system" in countries with high bicycle use, and in particular in the Netherlands. An evaluation was carried out of the questionnaires from the survey conducted within the European project BICY on mobility in general in 13 cities of the participating countries. The questionnaire was designed, tested and implemented, and was later validated by a test in Bologna. The results were corrected with information on the demographic situation and compared with official data. The cycling infrastructure analysis was conducted on the basis of information from the OpenStreetMap database. The activity consisted of programming algorithms in Python to extract infrastructure data for a region from the database, and to sort and filter cycling infrastructure while calculating attributes such as the length of the arcs of the paths. The results obtained were compared with official data where available. The structure of the thesis is as follows:
1. Introduction: description of the state of cycling in several advanced countries; description of analysis methods and their importance for implementing appropriate cycling policies; supply and demand of bicycle infrastructure.
2. Survey on mobility: details of the survey developed and the evaluation method; the results obtained are presented and compared with official data.
3. Analysis of cycling infrastructure based on information from the OpenStreetMap database: describes the methods and algorithms developed during the PhD; the results obtained by the algorithms are compared with official data.
4. Discussion: the above results are discussed and compared; in particular, cycling demand is compared with the length of the cycle network within a city.
5. Conclusions.
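The Python extraction described above can be sketched as filtering OpenStreetMap ways by their tags and summing the lengths of their arcs. The data structure and tag filter below are simplified illustrations, not the thesis's actual code; the coordinates are invented.

```python
import math

# Toy OSM-style data: ways with tags and node coordinates (lat, lon).
# The structure mimics what an OpenStreetMap extract provides.
ways = [
    {"id": 1, "tags": {"highway": "cycleway"},
     "nodes": [(44.4949, 11.3426), (44.4957, 11.3440), (44.4965, 11.3455)]},
    {"id": 2, "tags": {"highway": "residential"},
     "nodes": [(44.4900, 11.3400), (44.4910, 11.3410)]},
    {"id": 3, "tags": {"highway": "path", "bicycle": "designated"},
     "nodes": [(44.4970, 11.3460), (44.4980, 11.3475)]},
]

def haversine_m(p, q):
    """Great-circle distance between two (lat, lon) points in metres."""
    r = 6_371_000
    la1, lo1, la2, lo2 = map(math.radians, (*p, *q))
    a = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def is_cycling_infrastructure(tags):
    # Simplified filter: dedicated cycleways, ways signed for bicycles,
    # or any way carrying a cycleway=* tag
    return (tags.get("highway") == "cycleway"
            or tags.get("bicycle") == "designated"
            or "cycleway" in tags)

total = sum(
    sum(haversine_m(w["nodes"][i], w["nodes"][i + 1])
        for i in range(len(w["nodes"]) - 1))
    for w in ways if is_cycling_infrastructure(w["tags"])
)
print(f"cycling network length: {total:.0f} m")
```

Real OSM tagging of cycling infrastructure is considerably richer (cycle lanes, tracks, shared paths, one-way variants), which is exactly why the sorting and filtering step described in the abstract is non-trivial.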