9 results for Addition techniques
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Lipolysis and oxidation of lipids in foods are the major biochemical and chemical processes that cause food quality deterioration, leading to the characteristic, unpalatable odour and flavour known as rancidity. In addition to unpalatability, rancidity may give rise to toxic levels of certain compounds such as aldehydes, hydroperoxides, epoxides and cholesterol oxidation products. In this PhD study, chromatographic and spectroscopic techniques were employed to determine the degree of rancidity in different animal products and its relationship with technological parameters such as dietary fat sources, packaging, processing and storage conditions. To achieve this goal, capillary gas chromatography (CGC) was employed not only to determine the fatty acid profile but also, after solid phase extraction, the amounts of free fatty acids (FFA), diglycerides (DG), sterols (cholesterol and phytosterols) and cholesterol oxidation products (COPs). UV/VIS absorbance spectroscopy was applied to determine hydroperoxides, the primary products of oxidation, and to quantify secondary products. Most of the foods analysed in this study were meat products. Indeed, lipid oxidation is a major deterioration reaction in meat and meat products and results in adverse changes in the colour, flavour and texture of meat. The development of rancidity has long been recognized as a serious problem during meat handling, storage and processing. For a dairy product, a vegetal cream, a study of the lipid fraction and of the development of rancidity during storage was carried out to evaluate its shelf-life and some nutritional features such as the saturated/unsaturated fatty acid ratio and the phytosterol content. Finally, in view of the growing interest in functional foods in recent years, a new electrophoretic method was optimized and compared with HPLC to check the quality of a beehive product, royal jelly.
This manuscript reports the main results obtained in the five activities briefly summarized as follows: 1) comparison between HPLC and a new electrophoretic method in the evaluation of authenticity of royal jelly; 2) study of the lipid fraction of a vegetal cream under different storage conditions; 3) study of lipid oxidation in minced beef during storage under a modified atmosphere packaging, before and after cooking; 4) evaluation of the influence of dietary fat and processing on the lipid fraction of chicken patties; 5) study of the lipid fraction of typical Italian and Spanish pork dry sausages and cured hams.
Abstract:
Gossip protocols have proved to be a viable solution for setting up and managing large-scale P2P services or applications in a fully decentralised scenario. The gossip, or epidemic, communication scheme is heavily based on stochastic behaviour and is the fundamental idea behind many large-scale P2P protocols. It provides many remarkable features, such as scalability, robustness to failures, emergent load-balancing capabilities, fast spreading, and redundancy of information. In some sense, these services or protocols mimic natural system behaviours in order to achieve their goals. The key idea of this work is that the remarkable properties of gossip hold only when all the participants follow the rules dictated by the actual protocols. If one or more malicious nodes join the network and start cheating according to some strategy, the result can be catastrophic. In order to study how serious the threat posed by malicious nodes can be, and what can be done to prevent attackers from cheating, we focused on a general attack model aimed at defeating a key service in gossip overlay networks (the Peer Sampling Service [JGKvS04]). We also focused on the problem of protecting against forged information exchanged in gossip services. We propose a solution technique for each problem; both techniques are general enough to be applied to distinct service implementations. Like gossip protocols, our solutions are based on stochastic behaviour and are fully decentralised. In addition, each technique's behaviour is abstracted by a general primitive function extending the basic gossip scheme; this approach allows the adoption of our solutions with minimal changes in different scenarios. We provide an extensive experimental evaluation to support the effectiveness of our techniques. Essentially, these techniques aim to serve as building blocks, or as architectural guidelines, for building more resilient and more secure P2P services.
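The basic gossip scheme underlying a peer sampling service can be illustrated with a minimal sketch. The code below is a generic push-pull view-shuffling round, not the thesis's actual protocols or its defence techniques; the `Node` class, the view size, and the merge rule are illustrative assumptions.

```python
import random

VIEW_SIZE = 4  # illustrative bound on each peer's partial view

class Node:
    """A peer holding a bounded partial view of other peers' ids."""
    def __init__(self, node_id, view):
        self.id = node_id
        self.view = list(view)

    def gossip_with(self, other):
        """Push-pull exchange: swap random view subsets, then merge."""
        sent = random.sample(self.view, min(2, len(self.view)))
        received = random.sample(other.view, min(2, len(other.view)))
        self._merge(received)
        other._merge(sent)

    def _merge(self, items):
        for peer in items:
            if peer != self.id and peer not in self.view:
                self.view.append(peer)
        random.shuffle(self.view)
        del self.view[VIEW_SIZE:]  # keep the view bounded

# A few gossip rounds over a tiny overlay of six peers
nodes = [Node(i, [(i + 1) % 6, (i + 2) % 6]) for i in range(6)]
for _ in range(20):
    a, b = random.sample(nodes, 2)
    a.gossip_with(b)
```

The attack model studied in the thesis targets exactly this kind of exchange: a cheating node can bias which entries it "sends", which is why the views must not be trusted blindly.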
Abstract:
The research activity described in this thesis is focused mainly on the study of finite-element techniques applied to thermo-fluid dynamic problems of plant components, and on the study of dynamic simulation techniques applied to integrated building design in order to enhance the energy performance of the building. The first part of this doctoral thesis is a broad dissertation on second-law analysis of thermodynamic processes, with the purpose of placing the issue of the energy efficiency of buildings within a wider cultural context which is usually not considered by professionals in the energy sector. In particular, the first chapter includes a rigorous scheme for the deduction of the expressions for the molar exergy and the molar flow exergy of pure chemical fuels. The study shows that molar exergy and molar flow exergy coincide when the temperature and pressure of the fuel are equal to those of the environment in which the combustion reaction takes place. A simple method to determine the Gibbs free energy for non-standard values of the temperature and pressure of the environment is then clarified. For hydrogen, carbon dioxide, and several hydrocarbons, the dependence of the molar exergy on the temperature and relative humidity of the environment is reported, together with an evaluation of molar exergy and molar flow exergy when the temperature and pressure of the fuel differ from those of the environment. As an application of second-law analysis, a comparison of the thermodynamic efficiency of a condensing boiler and of a heat pump is also reported. The second chapter presents a study of borehole heat exchangers, that is, polyethylene piping networks buried in the soil which allow a ground-coupled heat pump to exchange heat with the ground. After a brief overview of low-enthalpy geothermal plants, an apparatus designed and assembled by the author to carry out thermal response tests is presented.
Data obtained by means of in situ thermal response tests are reported and evaluated through a finite-element simulation method implemented in the software package COMSOL Multiphysics. The simulation method allows a precise determination of the effective thermal properties of the ground and of the grout, which are essential for the design of borehole heat exchangers. Moving beyond the study of a single plant component, namely the borehole heat exchanger, the third chapter presents a thorough process for the plant design of a zero-carbon building complex. The plant is composed of: 1) a ground-coupled heat pump system for space heating and cooling, with electricity supplied by photovoltaic solar collectors; 2) air dehumidifiers; 3) thermal solar collectors to match 70% of domestic hot water energy use, and a wood pellet boiler for the remaining domestic hot water energy use and for exceptional winter peaks. This chapter includes the design methodology adopted: 1) dynamic simulation of the building complex with the software package TRNSYS, to evaluate its energy requirements; 2) ground-coupled heat pumps modelled by means of TRNSYS; and 3) evaluation of the total length of the borehole heat exchangers by an iterative method developed by the author. An economic feasibility study and an exergy analysis of the proposed plant, compared with two other plants, are reported. The exergy analysis was performed by considering the embodied energy of the components of each plant and the exergy losses during the operation of the plants.
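As a sketch of the kind of molar exergy expression discussed in the first chapter (following the standard textbook treatment, not necessarily the thesis's exact notation), the chemical exergy of a hydrocarbon fuel C_aH_b at environment temperature T_0 and pressure p_0 can be written as:

```latex
\bar{e}^{\,\mathrm{ch}} = -\Delta_r \bar{g}(T_0, p_0)
  + \bar{R}\, T_0 \ln
    \frac{\left(y^{e}_{\mathrm{O_2}}\right)^{a + b/4}}
         {\left(y^{e}_{\mathrm{CO_2}}\right)^{a}
          \left(y^{e}_{\mathrm{H_2O}}\right)^{b/2}}
```

where \(\Delta_r \bar{g}\) is the Gibbs function change of the combustion reaction \(\mathrm{C}_a\mathrm{H}_b + (a + b/4)\,\mathrm{O_2} \rightarrow a\,\mathrm{CO_2} + (b/2)\,\mathrm{H_2O}\) evaluated at \((T_0, p_0)\), and \(y^{e}_i\) are the environmental mole fractions. The logarithmic term accounts for the reactants and products entering and leaving at their environmental partial pressures, which is also why the molar exergy depends on the relative humidity through \(y^{e}_{\mathrm{H_2O}}\), as reported in the abstract.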
Abstract:
Tumors involving bone and soft tissues present extremely challenging situations. With recent advances in multi-modal treatment, not only has surgery moved from amputation to limb-sparing procedures, but survivorship has also improved considerably, and reconstructive techniques now aim to allow a considerably higher quality of life. In bone reconstruction, tissue engineering strategies are the main area of research. Re-vascularization and re-vitalisation of a massive allograft would considerably improve the outcome of biological reconstructions. Using a rabbit animal model, in this study we showed that, by implanting a vascular pedicle inside a weight-bearing massive cortical allograft, bone regeneration inside the allograft was higher than in non-vascularized implants, provided the vascular pedicle remained patent. Refinements of the animal model and the addition of stem cells and growth factors should allow further improvement of the results. In soft tissue tumors, free and pedicled flaps have proven to be of great help as reconstruction strategies. In this study we analyzed the functional and overall outcome of 14 patients who received a re-innervated vascularized flap. We demonstrated that the innovative technique of motor re-innervated muscular flaps is effective when the resection involves important functional compartments of the upper or lower limb, with no increase in post-operative complications. Although there was no direct comparison between this type of reconstruction and standard non-innervated reconstruction, we underlined the remarkably high overall functional scores and patient satisfaction following this procedure.
Abstract:
This thesis presents the outcomes of my Ph.D. course in telecommunications engineering. The focus of my research has been on Global Navigation Satellite Systems (GNSS) and, in particular, on the design of aiding schemes operating at both the position and physical levels, and on the evaluation of their feasibility and advantages. Assistance techniques at the position level are considered to enhance receiver availability in challenging scenarios where satellite visibility is limited. Novel positioning techniques relying on peer-to-peer interaction and exchange of information are thus introduced. More specifically, two different techniques are proposed: the Pseudorange Sharing Algorithm (PSA), based on the exchange of GNSS data, which yields coarse positioning where the user has scarce satellite visibility, and the Hybrid approach, which also improves the accuracy of the positioning solution. At the physical level, aiding schemes are investigated to improve the receiver's ability to synchronize with satellite signals. An innovative code acquisition strategy for dual-band receivers, the Cross-Band Aiding (CBA) technique, is introduced to speed up initial synchronization by exploiting the exchange of time references between the two bands. In addition, vector configurations for code tracking are analyzed and their feedback generation process is thoroughly investigated.
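As background to the positioning techniques above, the single-receiver solution that pseudorange-based schemes extend can be sketched as an iterative least-squares fix. This is a generic textbook baseline under simplified assumptions (no atmospheric or measurement errors, arbitrary units), not the thesis's PSA or Hybrid algorithms; the satellite geometry below is invented for illustration.

```python
import numpy as np

def pseudorange_fix(sat_pos, pseudoranges, iters=10):
    """Gauss-Newton estimate of [x, y, z, clock_bias] from >= 4 pseudoranges."""
    state = np.zeros(4)  # start at the origin with zero clock bias
    for _ in range(iters):
        ranges = np.linalg.norm(sat_pos - state[:3], axis=1)
        residuals = pseudoranges - (ranges + state[3])
        # Jacobian: minus the unit line-of-sight vectors, plus a bias column
        J = np.hstack([-(sat_pos - state[:3]) / ranges[:, None],
                       np.ones((len(ranges), 1))])
        state += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return state

# Synthetic check: five satellites, known true position and clock bias
sats = np.array([[20000.0, 0, 10000], [0, 20000, 10000],
                 [-20000, 0, 10000], [0, -20000, 10000],
                 [10000, 10000, 20000]])
truth = np.array([100.0, 200.0, 50.0, 3.0])
rho = np.linalg.norm(sats - truth[:3], axis=1) + truth[3]
est = pseudorange_fix(sats, rho)
```

With fewer than four visible satellites this system is underdetermined, which is exactly the scarce-visibility situation in which sharing pseudoranges among peers becomes useful.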
Abstract:
The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations, and the inclusion of component flexibility, is developed: both are necessary if one wants to capture the dynamic effects which arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented in order to properly capture the dynamic effects of the main connections in the system: angular-contact ball bearings are modelled according to a five-DOF nonlinear scheme to capture the behaviour of the crankshaft main bearings, while an impedance-based hydrodynamic bearing model is implemented to provide enhanced prediction of operation at the conrod big-end locations. Concerning the second matter, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations suitable for the subsequent multibody analyses. A particular component mode selection procedure is implemented, based on the concept of Effective Interface Mass, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and modal domains. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques.
The advantages over the conventional frequency-based truncation approach are discussed.
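The Craig-Bampton reduction mentioned above can be sketched for a generic mass/stiffness pair. This is a minimal textbook implementation (static constraint modes plus fixed-interface normal modes), not the thesis's Effective Interface Mass selection or Modal Truncation Augmentation procedures; the spring-mass chain at the end is an illustrative assumption.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(M, K, boundary, n_modes):
    """Reduce (M, K) to boundary DOFs plus n_modes fixed-interface modes."""
    n = K.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    Kii, Kib, Mii = K[np.ix_(i, i)], K[np.ix_(i, b)], M[np.ix_(i, i)]
    # Static constraint modes: interior response to unit boundary displacements
    Psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes (boundary clamped), lowest frequencies first
    _, Phi = eigh(Kii, Mii)
    Phi = Phi[:, :n_modes]
    # Transformation from [boundary DOFs; modal coordinates] to full DOFs
    T = np.zeros((n, len(b) + n_modes))
    T[b, :len(b)] = np.eye(len(b))
    T[i, :len(b)] = Psi
    T[i, len(b):] = Phi
    return T.T @ M @ T, T.T @ K @ T, T

# Illustrative 6-DOF spring-mass chain with both ends grounded
n = 6
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)
Mr, Kr, T = craig_bampton(M, K, boundary=[0, 5], n_modes=2)
```

Retaining all interior modes makes the transformation a full-rank change of basis, so the reduced model then reproduces the full spectrum exactly; truncating modes is what introduces the error that selection and augmentation procedures aim to control.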
Abstract:
Automatically recognizing faces captured in uncontrolled environments has been a challenging topic for decades. In this work, we investigate cohort score normalization, which has been widely used in biometric verification, as a means to improve the robustness of face recognition in challenging environments. In particular, we introduce cohort score normalization into the undersampled face recognition problem. Further, we develop an effective cohort normalization method specifically for the unconstrained face pair matching problem. Extensive experiments conducted on several well-known face databases demonstrate the effectiveness of cohort normalization in these challenging scenarios. In addition, to give a proper understanding of cohort behavior, we study the impact of the number and quality of cohort samples on the normalization performance. The experimental results show that a larger cohort set gives more stable and often better results up to a point, after which performance saturates, and that cohort samples of different quality indeed produce different cohort normalization performance. Recognizing faces that have undergone alterations is another challenging problem for current face recognition algorithms. Face image alterations can be roughly classified into two categories: unintentional (e.g., geometric transformations introduced by the acquisition device) and intentional alterations (e.g., plastic surgery). We study the impact of these alterations on face recognition accuracy. Our results show that state-of-the-art algorithms are able to overcome limited digital alterations but are sensitive to more substantial modifications. Further, we develop two useful descriptors for detecting those alterations which can significantly affect recognition performance. Finally, we propose to use the Structural Similarity (SSIM) quality map to detect and model variations due to plastic surgery.
Extensive experiments conducted on a plastic surgery face database demonstrate the potential of SSIM map for matching face images after surgeries.
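Cohort score normalization can be illustrated by the classical T-norm, in which a raw match score is standardized against the scores the same probe obtains from a fixed cohort of non-matching samples. This is a generic sketch of T-norm, not the specific cohort method developed in the thesis.

```python
import statistics

def t_norm(raw_score, cohort_scores):
    """Standardize a raw match score against a cohort of impostor scores."""
    mu = statistics.mean(cohort_scores)
    sigma = statistics.stdev(cohort_scores)
    return (raw_score - mu) / sigma

# A genuine score that stands well above its cohort normalizes high,
# while a score near the cohort mean normalizes towards zero.
genuine = t_norm(0.9, [0.2, 0.3, 0.25, 0.35, 0.3])
impostor_like = t_norm(0.3, [0.2, 0.3, 0.25, 0.35, 0.3])
```

The abstract's findings about cohort size and quality map directly onto this formula: the more (and the more representative) the cohort scores, the more reliable the estimates of the cohort mean and spread become.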
Abstract:
This thesis collects the outcomes of a Ph.D. course in Telecommunications Engineering and is focused on the study and design of techniques able to counteract interfering signals in Global Navigation Satellite Systems (GNSS). The subject is the jamming threat in navigation systems, which has become an increasingly important topic in recent years due to the wide diffusion of GNSS-based civil applications. Detection and mitigation techniques are developed to counter jamming signals and are tested in different scenarios, including ones involving sophisticated signals. The thesis is organized in two main parts, which deal with the management of intentional counterfeit GNSS signals. The first part addresses interference management, focusing on intentional interfering signals. In particular, a technique for the detection and localization of interfering signals in the GNSS bands in the frequency domain is proposed. In addition, an effective mitigation technique is introduced which exploits the periodic characteristics of common jamming signals to reduce interfering effects at the receiver side. This technique has also been tested in a different and more complicated scenario, where it remained effective in the mitigation and cancellation of the interfering signal without high complexity. The second part also deals with interference management, but for more sophisticated signals. The attention is focused on the detection of spoofing signals, the most complex among the jamming signal types; because this kind of signal is so difficult to detect and mitigate, the spoofing threat is considered the most dangerous. In this work, a possible technique to detect such sophisticated signals is proposed, jointly observing and exploiting the outputs of several measurement blocks along the GNSS receiver operating chain.
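A frequency-domain detection step of the kind described above can be sketched as simple spectral thresholding: estimate the power spectrum, set a threshold from its median level, and flag bins that exceed it. This is a generic illustration on a synthetic continuous-wave jammer, not the thesis's technique or its parameters; the sample rate, threshold factor, and signal model are all assumptions.

```python
import numpy as np

def detect_jammer_bins(samples, fs, factor=20.0):
    """Return frequencies of FFT bins whose power exceeds factor x the median."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    threshold = factor * np.median(spectrum)  # median is robust to the peak
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    return freqs[spectrum > threshold]

# Synthetic scenario: white noise plus a strong CW jammer at 1 kHz
rng = np.random.default_rng(0)
fs, n = 8000.0, 4096
t = np.arange(n) / fs
samples = rng.normal(0, 1, n) + 5.0 * np.sin(2 * np.pi * 1000.0 * t)
hits = detect_jammer_bins(samples, fs)
```

The median-based threshold is a deliberate design choice: a strong narrowband jammer inflates the mean of the spectrum but barely moves its median, so the threshold stays anchored to the noise floor.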
Abstract:
Since its approval by the FDA in 2001, capsule endoscopy has revolutionized the study of the small bowel. One of the main limitations to its diffusion has been the high cost. More recently, a new videocapsule system (OMOM CE) has been developed in China and has obtained the CE mark. Its cost is approximately half that of other capsule systems. However, there are few studies regarding clinical experience with this new videocapsule system, and none of them has been performed in the Western world. Among the limitations of capsule endoscopy there is also one linked to the diagnostic yield: the rapid transit of the device through the proximal segments implies a high risk of false negatives; an indirect confirmation of this limit is offered by the poor ability to identify the papilla of Vater. In addition, recent studies show that in patients with obscure gastrointestinal bleeding, a negative outcome of capsule endoscopy is correlated with a significant risk of recurrence of anemia in the short term, as well as with the presence of small bowel lesions documented by a second capsule endoscopy. The use of a new device called "CapsoCam" (CapsoVision, Inc., Saratoga) was recently approved; it is characterized by four side cameras that offer a panoramic view of 360°, instead of the standard 160° frontal view. Two recent pilot studies showed safety profiles and diagnostic yields comparable with those of the more established capsules. Notably, side vision has made possible a clear visualization of the papilla in 70% of cases. The aim of our study is to evaluate the feasibility and diagnostic yield of these two new devices, the first of which may allow a reduction in costs. Moreover, their complementary use could lead to a diagnostic recovery in patients with false negative results at an initial investigation.