14 results for model quality
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original one, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language, which basically adopts an XHTML syntax, able to capture a posteriori only the content of a digital document. It is compared with other languages and proposals, in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user advantages, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), from e-learning (IsaLearning) to professional printing (IsaPress).
Abstract:
The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small scale MOdelling) in the ARPA-SIM operational configuration is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D–VAR set-up comprising the two water vapour and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias correction procedures and correct radiative transfer simulations. The 1D–VAR retrieval quality is first quantified in relative terms, employing statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the retrieval profiles generated by the 1D–VAR are well correlated with the radiosonde measurements. Subsequently, the 1D–VAR technique is applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli–Venezia Giulia on 8 July 2004 and a heavy precipitation case that occurred in the Emilia–Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the integrated water vapour and saturation water vapour over the column, in the 2-metre temperature and specific humidity, and in the surface temperature.
To improve the 1D–VAR technique, a method to calculate flow-dependent model error covariance matrices is also assessed. The approach employs members from an ensemble forecast system generated by perturbing the physical parameterisation schemes inside the model. The improved set-up applied to the case of 8 July 2004 shows a substantially neutral impact.
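The variational retrieval sketched above minimises the standard 1D–VAR cost function; the notation below follows the common data assimilation literature rather than the thesis itself:

```latex
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathrm{T}}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

where x is the temperature and humidity profile being retrieved, x_b the COSMO background, B the background error covariance (made flow-dependent in the ensemble-based variant), y the observed SEVIRI radiances, H the radiative transfer observation operator and R the observation error covariance.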
Abstract:
The research performed during the PhD candidature was intended to evaluate the quality of white wines as a function of the reduction in SO2 use during the first steps of the winemaking process. In order to investigate the mechanism and intensity of the interactions occurring between lysozyme and the principal macro-components of musts and wines, a series of experiments on model wine solutions was undertaken, focusing attention on polyphenols, SO2, oenological tannins, pectins, ethanol, and sugar components. In the second part of this research program, a series of conventional sulphite-added vinifications was compared to vinifications in which sulphur dioxide was replaced by lysozyme, in order to define potential winemaking protocols suitable for the production of SO2-free wines. To reach the final goal, the technological performance of two selected yeast strains with a low aptitude to produce SO2 during fermentation was also evaluated. The data obtained suggested that the addition of lysozyme and oenological tannins during the alcoholic fermentation could represent a promising alternative to the use of sulphur dioxide and a reliable starting point for the production of SO2-free wines. The different vinification protocols studied influenced the composition of the volatile profile in wines at the end of the alcoholic fermentation, especially with regard to alcohols and ethyl esters, partly as a consequence of the yeast's response to the presence or absence of sulphites during fermentation, contributing in different ways to the sensory profiles of the wines. In fact, the amino acid analysis showed that lysozyme can affect the consumption of nitrogen as a function of the yeast strain used in fermentation.
During bottle storage, the evolution of volatile compounds is affected by the presence of SO2 and oenological tannins, confirming their positive role in scavenging oxygen and maintaining the amounts of esters above certain levels, avoiding a decline in the wine's quality. Even though a natural decrease in the phenolic profiles was found, due to oxidation caused by the oxygen dissolved in the medium during the storage period, the presence of SO2 together with tannins counteracted the decay of the phenolic content present at the end of fermentation. Tannins also played a central role in preserving the polyphenolic profile of the wines during the storage period, confirming their antioxidant property, acting as reductants. Our study of the fundamental chemistry relevant to the oxidative phenolic spoilage of white wines demonstrated the suitability of glutathione, at the concentrations at which it typically exists in wine, to inhibit the production of yellow xanthylium cation pigments generated from flavanols and glyoxylic acid. The ability of glutathione to bind glyoxylic acid rather than acetaldehyde may enable glutathione to be used as a ‘switch’ for glyoxylic acid-induced polymerisation mechanisms, as opposed to the equivalent acetaldehyde polymerisation, in processes such as micro-oxidation. Further research is required to assess the ability of glutathione to prevent xanthylium cation production during the in-situ production of glyoxylic acid and in the presence of sulphur dioxide.
Abstract:
Due to the growing attention of consumers towards their food, improvement of the quality of animal products has become one of the main focuses of research. To this aim, the application of modern molecular genetics approaches has proved extremely useful and effective. This innovative drive includes all livestock productions, including pork. The Italian pig breeding industry is unique because it relies on heavy pigs slaughtered at about 160 kg for the production of high-quality processed products. For this reason, it requires precise meat quality and carcass characteristics. Two aspects have been considered in this thesis: the application of transcriptome analysis in post mortem pig muscles as a possible method to evaluate meat quality parameters related to the pre mortem status of the animals, including health, nutrition and welfare, with potential applications for product traceability (chapters 3 and 4); and the study of candidate genes for obesity-related traits, in order to identify markers associated with fatness in pigs that could be applied to improve carcass quality (chapters 5, 6, and 7). Chapter three addresses the first issue from a methodological point of view. When we considered this issue, it was not obvious that post mortem skeletal muscle could be useful for transcriptomic analysis. We therefore demonstrated that the quality of RNA extracted from the skeletal muscle of pigs sampled at different post mortem intervals (20 minutes, 2 hours, 6 hours, and 24 hours) is good for downstream applications. Degradation occurred starting from 48 hours post mortem, even if at that time it is still possible to use some RNA products. In the fourth chapter, in order to demonstrate the potential use of RNA obtained up to 24 hours post mortem, we present the results of RNA analysis with the Affymetrix microarray platform, which made it possible to assess the expression level of more than 24,000 mRNAs.
We did not identify any significant differences between the different post mortem times, suggesting that this technique could be applied to retrieve information from the transcriptome of skeletal muscle samples not collected immediately after slaughtering. This study represents the first contribution of this kind applied to pork. In the fifth chapter, we investigated TBC1D1 [TBC1 (tre-2/USP6, BUB2, cdc16) domain family member 1] as a candidate gene for fat deposition. This gene is involved in mechanisms regulating energy homeostasis in skeletal muscle and is associated with predisposition to obesity in humans. By resequencing a fragment of the TBC1D1 gene we identified three synonymous mutations localized in exon 2 (g.40A>G, g.151C>T, and g.172T>C) and two polymorphisms localized in intron 2 (g.219G>A and g.252G>A). One of these polymorphisms (g.219G>A) was genotyped by high resolution melting (HRM) analysis and PCR-RFLP. Moreover, this gene sequence was mapped by radiation hybrid analysis on porcine chromosome 8. The association study was conducted in 756 performance-tested pigs of the Italian Large White and Italian Duroc breeds. Significant results were obtained for lean meat content, back fat thickness, visible intermuscular fat and ham weight. In chapter six, a second candidate gene (tribbles homolog 3, TRIB3) is analyzed in a study of association with carcass and meat quality traits. The TRIB3 gene is involved in the energy metabolism of skeletal muscle and plays a role as a suppressor of adipocyte differentiation. We identified two polymorphisms in the first coding exon of the porcine TRIB3 gene: one is a synonymous SNP (c.132T>C), the second is a missense mutation (c.146C>T, p.P49L). The two polymorphisms appear to be in complete linkage disequilibrium between and within breeds. The in silico analysis of the p.P49L substitution suggests that it might have a functional effect.
The association study in about 650 pigs indicates that this marker is associated with back fat thickness in the Italian Large White and Italian Duroc breeds in two different experimental designs. This polymorphism is also associated with the lactate content of muscle semimembranosus in Italian Large White pigs. Expression analysis indicated that this gene is transcribed in skeletal muscle and adipose tissue as well as in other tissues. In the seventh chapter, we report the genotyping results for 677 SNPs in divergent groups of pigs chosen according to extreme estimated breeding values for back fat thickness. SNPs were identified by resequencing, in silico database mining, and data reported in the literature for 60 candidate genes for obesity. Genotyping was carried out using the GoldenGate (Illumina) platform. Of the analyzed SNPs, more than 300 were polymorphic in the genotyped population and had a minor allele frequency (MAF) > 0.05. Of these SNPs, 65 were associated (P<0.10) with back fat thickness. One of the most significant markers was the same TBC1D1 SNP reported in chapter 5, confirming the role of this gene in fat deposition in pigs. These results could be important to better define the pig as a model for human obesity, as well as for marker-assisted selection to improve carcass characteristics.
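The MAF > 0.05 screening step described above can be sketched in a few lines; the genotype calls here are invented purely for illustration:

```python
from collections import Counter

def minor_allele_frequency(genotypes):
    """MAF from diploid genotype strings such as 'GA' (0.0 if monomorphic)."""
    alleles = Counter(base for g in genotypes for base in g)
    if len(alleles) < 2:          # monomorphic: no minor allele
        return 0.0
    return min(alleles.values()) / sum(alleles.values())

# Hypothetical genotype calls for two of the SNPs named in the text.
snps = {
    "g.219G>A": ["GG", "GA", "AA", "GG", "GA"],
    "c.146C>T": ["CC"] * 19 + ["CT"],          # rare allele: MAF = 0.025
}
informative = {name: g for name, g in snps.items()
               if minor_allele_frequency(g) > 0.05}
```

With these toy counts only g.219G>A survives the filter; in the thesis the same screen retained more than 300 of the 677 genotyped SNPs.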
Abstract:
Pig meat quality is determined by several parameters, such as lipid content, tenderness, water-holding capacity, pH, color and flavor, that affect consumers’ acceptance and the technological properties of meat. Carcass quality parameters are important for the production of fresh and dry-cured high-quality products, in particular fat deposition and lean cut yield. The identification of genes and markers associated with meat and carcass quality traits is of prime interest, for the possibility of improving the traits by marker-assisted selection (MAS) schemes. Therefore, the aim of this thesis was to investigate seven candidate genes for meat and carcass quality traits in pigs. In particular, we focused on genes belonging to the family of the lipid droplet coat proteins perilipins (PLIN1 and PLIN2) and to the calpain/calpastatin system (CAST, CAPN1, CAPN3, CAPNS1), and on the gene encoding PPARg-coactivator 1A (PPARGC1A). In general, the candidate gene investigation included protein localization, the detection of polymorphisms, association analysis with meat and carcass traits, and analysis of the expression level, in order to assess the involvement of each gene in pork quality. Some of the analyzed genes showed effects on various pork traits that are subject to selection in genetic improvement programs, suggesting a possible involvement of the genes in controlling trait variability. In particular, significant association results have been obtained for the PLIN2, CAST and PPARGC1A genes, which are worthy of further validation. The obtained results contribute to a better understanding of biological mechanisms important for pig production as well as to a possible use of the pig as an animal model for studies regarding obesity in humans.
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data-flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD Thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the life-time of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Due to both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and where necessary to correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria is included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
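As a hedged illustration of the aperture photometry step (not the actual pipeline, whose internals are not given here), the basic measurement is a sum of counts inside a circular aperture minus a sky level estimated in a surrounding annulus:

```python
import numpy as np

def aperture_flux(image, x0, y0, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture (minimal sketch)."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)             # pixel distance from the star
    in_aperture = r <= r_ap
    in_annulus = (r >= r_in) & (r <= r_out)
    sky = np.median(image[in_annulus])         # per-pixel sky estimate
    return image[in_aperture].sum() - sky * in_aperture.sum()

# Synthetic frame: flat sky of 10 counts with a 3x3 "star" adding 900 counts.
frame = np.full((50, 50), 10.0)
frame[24:27, 24:27] += 100.0
flux = aperture_flux(frame, 25, 25, r_ap=5, r_in=8, r_out=12)   # ~900.0
```

Repeating such a measurement frame after frame, with the quality-control cuts mentioned above, yields the short-term light curves used to check SPSS constancy.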
Abstract:
In the last few years the resolution of numerical weather prediction (NWP) models has become higher and higher with the progress of technology and knowledge. As a consequence, a great number of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and operational applications are becoming more frequent. The fact that many NWP centres have recently taken convection-permitting forecast models into operation, many of which assimilate radar data, emphasizes the need for an approach to providing quality information, which is needed in order to avoid radar errors degrading the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have a horizontal dimension of 1-20 km and belong to the meso-gamma subscale; this scale can be modeled only with the highest-resolution NWP models, such as the COSMO-2 model. One of the problems of modeling extreme convective events is related to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated into a high-resolution model is about 10 km, a value too high for a correct representation of the initial conditions of convection. Assimilation of radar data, with its resolution of about a kilometre every 5 or 10 minutes, can be a solution to this problem. In this contribution a pragmatic and empirical approach to deriving a radar data quality description is proposed, to be used in radar data assimilation and more specifically for the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work shows some preliminary experiments on the coupling of a high-resolution meteorological model with a hydrological one.
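In the latent heat nudging scheme mentioned above, the model's latent heating profile is scaled by the ratio of observed to modelled surface precipitation; in a common textbook formulation (symbols follow the general LHN literature, not necessarily the thesis):

```latex
\Delta T_{\mathrm{LHN}}(z) = \left(\frac{RR_{\mathrm{obs}}}{RR_{\mathrm{mod}}} - 1\right)\,\Delta T_{\mathrm{LH}}(z)
```

where RR_obs is the radar-derived rain rate, RR_mod the model rain rate and ΔT_LH(z) the model latent-heating temperature increment at height z. A radar data quality descriptor such as the one proposed here can then be used to mask or down-weight the increments wherever RR_obs is unreliable, preventing radar errors from degrading the analysis.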
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, and the availability of almost ubiquitous Internet connectivity, make it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services that have the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and we present Quasit, its prototype implementation, offering a scalable and extensible platform that can be used by researchers to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
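As an illustration of the trade-off discussed above between guaranteed quality and cost, the following toy sketch assigns extra replicas only to flows whose (hypothetical) QoS class demands fault tolerance, within a fixed resource budget. It is not the LAAR algorithm, whose details are not reproduced here:

```python
from dataclasses import dataclass

@dataclass
class Flow:
    name: str
    qos: str            # hypothetical classes: "guaranteed" or "best_effort"
    replicas: int = 1

def plan_replication(flows, budget):
    """Spend a replica budget on guaranteed flows first (partial fault tolerance)."""
    for flow in sorted(flows, key=lambda f: f.qos != "guaranteed"):
        if budget == 0:
            break
        if flow.qos == "guaranteed":
            flow.replicas += 1
            budget -= 1
    return flows

flows = [Flow("audio", "best_effort"), Flow("ecg", "guaranteed"),
         Flow("hvac", "guaranteed")]
plan_replication(flows, budget=1)   # only "ecg" gains a replica
```

The point of such weaker guarantees is exactly the one validated experimentally in the thesis: flows that can tolerate occasional loss are left unreplicated, cutting resource costs without violating the stricter flows' requirements.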
Abstract:
The soil carries out a wide range of functions, and it is important to study the effects of land use on soil quality in order to promote the most sustainable practices. Three field trials were considered to assess soil quality and functionality after human alteration, and to determine the power of soil enzymatic activities, biochemical indexes and a mathematical model in the evaluation of soil status. The first field was characterized by conventional and organic management, in which tillage effects were also tested. The second was characterized by conventional, organic and agro-ecological management. Finally, the third was a beech forest where the effects of N deposition on soil organic carbon sequestration were tested. Results highlight that both enzyme activities and biochemical indexes could be valid parameters for soil quality evaluation. Conventional management and plowing negatively affected soil quality and functionality, with intensive tillage leading to a downturn of microbial biomass and activity. Both organic and agro-ecological management proved to be good practices for the maintenance of soil functionality, with better microbial activity and metabolic efficiency. This also positively affected soil organic carbon content. At the eutrophic forest, enzyme activities and biochemical indexes responded positively to the treatments, but one year of experimentation proved not to be enough to observe variation in soil organic carbon content. Mathematical models and biochemical indicators proved to be valid tools for assessing soil quality; nonetheless, it would be better to include the microbial component in the mathematical model and to consider more than one index if the aim of the work is to evaluate the overall soil quality and functionality.
In conclusion, the forest site is the richest in terms of organic carbon, microbial biomass and activity, while the organic and the agro-ecological managements seem to be the more sustainable, although without taking the yield into consideration.
Abstract:
Air pollution is one of the greatest health risks in the world. At the same time, its strong correlation with climate change, as well as with the Urban Heat Island effect and heat waves, intensifies the effects of all these phenomena. Good air quality and high levels of thermal comfort are the main goals to be reached in urban areas in the coming years. Air quality forecasts help decision makers improve air quality and public health strategies, mitigating the occurrence of acute air pollution episodes. Air quality forecasting approaches combine an ensemble of models to provide forecasts from global to regional air pollution, with downscaling for selected countries and regions. The development of models dedicated to urban air quality issues requires a good set of data regarding the urban morphology and building material characteristics. Only a few examples of air quality forecast systems at the urban scale exist in the literature, and they are often limited to selected cities. This thesis sets up a methodology for the development of a forecasting tool. The forecasting tool can be adapted to any city and uses a new parametrization for vegetated areas. The parametrization method, based on aerodynamic parameters, produces the spatially varying urban roughness. At the core of the forecasting tool there is a dispersion model (urban scale) used in forecasting mode, together with the meteorological and background concentration forecasts provided by two regional numerical weather forecasting models. The tool produces the 1-day spatial forecast of NO2, PM10 and O3 concentrations, air temperature, air humidity and BLQ-Air index values. The tool is automatized to run every day, and the maps produced are displayed on the e-Globus platform, updated daily. The results obtained indicate that the forecast output was in good agreement with the observed measurements.
Abstract:
There are only a few insights concerning the influence that agronomic and management variability may have on superficial scald (SS) in pears. Abate Fétel pears were picked during three seasons (2018, 2019 and 2020) from thirty commercial orchards in the Emilia-Romagna region, Italy. Using a multivariate statistical approach, high heterogeneity between farms for SS development after cold storage in a regular atmosphere was demonstrated. Indeed, some factors seem to affect SS in all growing seasons: high yields, soil texture, improper irrigation and nitrogen management, use of plant growth regulators, late harvest, precipitation, calcium and cow manure, presence of nets, orchard age, training system and rootstock. Afterwards, we explored the spatio-temporal variability of fruit attributes in two pear orchards. Environmental and physiological spatial variables were recorded by a portable RTK GPS. High spatial variability of the SS index was observed. Through a geostatistical approach, some characteristics, including soil electrical conductivity and fruit size, were shown to be negatively correlated with SS. Moreover, regression tree analyses were applied, suggesting the presence of threshold values of antioxidant capacity, total phenolic content, and acidity against SS. High pulp firmness and IAD values before storage, denoting a more immature fruit, appeared to be correlated with low SS. Finally, a convolutional neural network (CNN) was tested to detect SS and the starch pattern index (SPI) in pears for portable device applications. Preliminary statistics showed that the model for SS had low accuracy but good precision, while the CNN for SPI showed good performance compared to the Ctifl and Laimburg scales. The major conclusion is that Abate Fétel pears can potentially be stored in different cold rooms, according to their origin and quality features, ensuring the best fruit quality for the final consumers.
These results might lead to a substantial improvement in the Italian pear industry.
Abstract:
Over the past 30 years, unhealthy diets and lifestyles have increased the incidence of noncommunicable diseases and have driven the spread across the world’s population of syndromes such as obesity and other metabolic disorders, reaching pandemic proportions. To respond to this scenario, the food industry has tackled these challenges with different approaches, such as the reformulation of foods, fortification of foods, substitution of ingredients and supplements with healthier ingredients, reduced animal protein, reduced fats and improved fibre applications. Although the technological quality of these emerging food products is known, the impact they have on the gut microbiota of consumers remains unclear. In the present PhD thesis, the work was conducted to study different foods in which industrial, market-standard components were substituted with novel, green-oriented and sustainable ingredients. This thesis includes eight representative case studies of the most common substitutions/additions/fortifications in dairy, meat, and vegetable products.
The products studied were: (i) a set of breads fortified with polyphenol-rich olive fibre, to replace synthetic antioxidants and preservatives; (ii) a set of gluten-free breads fortified with algae powder, to fortify the protein content of standard GF products; (iii) different formulations of salami in which nitrates were replaced by ascorbic acid, vegetal extract antioxidants and nitrate-reducing starter cultures; (iv) a chocolate fibre plus D-Limonene food supplement, as a novel prebiotic formula; (v) hemp seed bran and its Alcalase hydrolysate, to be introduced as a supplement; (vi) milk with and without lactose, to evaluate the different impact on the human colonic microbiota of healthy or lactose-intolerant subjects; (vii) lactose-free whey, fermented and/or with added probiotics, to be introduced as an alternative beverage, exploring its impact on the human colonic microbiota of healthy or lactose-intolerant subjects; and (viii) antibiotics, to assess whether maternal amoxicillin affects the colon microbiota of piglets.
Abstract:
Fabry disease (FD), an X-linked metabolic disorder caused by a deficiency in α-galactosidase A activity, leads to the accumulation of glycosphingolipids, mainly Gb3 and lyso-Gb3, in several organs. Gastrointestinal (GI) symptoms are among the earliest and most common, strongly impacting patients’ quality of life. However, the origin of these symptoms and the exact mechanisms of pathogenesis are still poorly understood, hence the pressing need to improve our knowledge of them. Here we aimed to evaluate whether a FD murine model (α-galactosidase A knock-out) captures the functional GI issues experienced by patients. In particular, the potential mechanisms involved in the development and maintenance of GI symptoms were explored by looking at the involvement of the microbiota-gut-brain axis. Moreover, we sought to examine the effects of lyso-Gb3 on colonic contractility, the intestinal epithelium and the enteric nervous system, which together play important roles in regulating intestinal ion transport and fluid and electrolyte homeostasis. Fabry mice revealed visceral hypersensitivity and a diarrhea-like phenotype accompanied by anxiety-like behavior and reduced locomotor activity. They also showed an imbalance of SCFAs and an early compositional and functional dysbiosis of the gut microbiota, which partly persisted with advancing age. Moreover, overexpression of TRPV1 was found in affected mice, with partial alteration of TRPV4 and TRPA1 as well, identifying them as possible therapeutic targets. The Ussing chamber results after treatment with lyso-Gb3 showed an increase in Isc (likely mediated by HCO3− ion movement) which affects neuron-mediated secretion, especially capsaicin- and partly veratridine-mediated secretion. This first characterization of gut-brain axis dysfunction in the FD mouse provides functional validation of the model, suggesting new targets and possible therapeutic approaches.
Furthermore, lyso-Gb3 is confirmed to be not only a marker for the diagnosis and follow-up of FD but also a possible player in the alteration of the FD colonic ion transport process.
Abstract:
The design process of any electric vehicle system has to be oriented towards the best energy efficiency, together with the constraint of maintaining comfort in the vehicle cabin. The main aim of this study is to find the best thermal management solution in terms of HVAC efficiency without compromising occupant comfort and internal air quality. An Arduino-controlled low-cost sensor system was developed, compared against reference instrumentation (average R-squared of 0.92), and then used to characterise the vehicle cabin in real parking and driving condition trials. Data on the energy use of the HVAC were retrieved from the car’s On-Board Diagnostic port. Energy savings using recirculation can reach 30 %, but pollutant concentrations in the cabin build up in this operating mode. Moreover, the temperature profile appeared strongly non-uniform, with air temperature differences of up to 10 °C. Optimisation methods often require a high number of runs to find the optimal configuration of the system. Fast models proved to be beneficial for this task, while CFD-1D models are usually slower despite the higher level of detail provided. In this work, the collected dataset was used to train a fast ML model of both cabin and HVAC using linear regression. The average scaled RMSE over all trials is 0.4 %, while the computation time is 0.0077 ms for each second of simulated time on a laptop computer. Finally, a reinforcement learning environment was built in OpenAI Gym and Stable-Baselines3, using the built-in Proximal Policy Optimisation algorithm to update the policy and seek the best compromise between the comfort, air quality and energy reward terms. The learning curves show an oscillating behaviour overall, with only 2 experiments behaving as expected, even if too slowly. This result leaves large room for improvement, ranging from reward function engineering to the expansion of the ML model.
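The fast linear regression model and the scaled RMSE metric can be sketched as follows on synthetic data; the feature names, value ranges and coefficients are assumptions for illustration, not the thesis dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for logged cabin signals (hypothetical features):
# ambient temperature [degC], HVAC power [kW], recirculation fraction [0-1].
X = rng.uniform([0.0, 0.0, 0.0], [35.0, 5.0, 1.0], size=(200, 3))
y = X @ np.array([0.6, -2.0, 1.5]) + 4.0 + rng.normal(0.0, 0.1, 200)

# Ordinary least squares with an intercept column.
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w

# Scaled RMSE: RMSE normalised by the observed range, as a percentage.
rmse = np.sqrt(np.mean((y - pred) ** 2))
scaled_rmse = 100.0 * rmse / (y.max() - y.min())
```

On this toy dataset the scaled RMSE comes out well below 1 %, in the same spirit as the 0.4 % figure reported above; the appeal of such a model for optimisation loops is that a single evaluation is a matrix-vector product.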