896 results for Vacuum extraction
Abstract:
Osmotic Dehydration and Vacuum Impregnation are interesting operations in the food industry, with applications in minimal fruit processing and/or freezing, allowing the development of new products with specific innovative characteristics. Osmotic dehydration is widely used for the partial removal of water from cellular tissue by immersion in a hypertonic (osmotic) solution. The driving force for the diffusion of water from the tissue is provided by the difference in water chemical potential between the external solution and the internal liquid phase of the cells. Vacuum Impregnation of porous products immersed in a liquid phase consists of a reduction of pressure in the solid-liquid system (vacuum step) followed by the restoration of atmospheric pressure (atmospheric step). During the vacuum step the internal gas in the product pores expands and partially flows out, while during the atmospheric step the residual gas is compressed and the external liquid flows into the pores (Fito, 1994). This process is also a very useful unit operation in food engineering, as it allows specific solutes to be introduced into the tissue, where they can play different functional roles (antioxidants, pH regulators, preservatives, cryoprotectants, etc.). The present study attempts to enhance our understanding and knowledge of fruit as a living organism, interacting dynamically with the environment, and to explore the metabolic, structural and physico-chemical changes occurring during fruit processing. Innovative approaches and technologies such as SAFES (Systematic Approach to Food Engineering Systems), LF-NMR (Low-Frequency Nuclear Magnetic Resonance) and GASMAS (Gas in Scattering Media Absorption Spectroscopy) are very promising for studying these phenomena in depth. The SAFES methodology was applied in order to study the irreversibility of the structural changes of kiwifruit during short osmotic treatment times.
The results showed that the deformed tissue can recover its initial state 300 min after osmotic dehydration at 25 °C. LF-NMR proved very useful for studying water status and compartmentalization, permitting the separate observation of three different water populations located in the vacuole, in the cytoplasm plus extracellular space, and in the cell wall. The GASMAS technique made it possible to study the pressure equilibration after Vacuum Impregnation, showing that after the restoration of atmospheric pressure in the solid-liquid system a residual low internal pressure remained in the apple tissue, which slowly increased until reaching atmospheric pressure on a time scale that depends on the vacuum applied during the vacuum step. The physiological response of apple tissue to the Vacuum Impregnation process was studied, indicating the possibility of vesicular transport within the cells. Finally, the possibility of extending the freezing tolerance of strawberry fruits impregnated with cryoprotectants was proven.
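The two-step pressure mechanics described above (Fito, 1994) can, under an ideal-gas, deformation-free assumption, be reduced to a one-line estimate of the liquid fraction drawn into the pores; the following sketch is illustrative (the function name and the sample numbers are not from the thesis):

```python
def impregnated_fraction(porosity, p_vacuum, p_atm=101.325):
    """Volumetric fraction of external liquid impregnated into the pores.

    Ideal-gas compression of the residual pore gas when atmospheric
    pressure is restored gives X = eps * (1 - p1/p2), where eps is the
    effective porosity, p1 the vacuum-step pressure and p2 atmospheric
    pressure (a simplified estimate that neglects tissue deformation).
    """
    if not 0.0 < porosity <= 1.0:
        raise ValueError("porosity must be in (0, 1]")
    return porosity * (1.0 - p_vacuum / p_atm)

# Example: tissue with ~20% effective porosity, 10 kPa vacuum step
x = impregnated_fraction(porosity=0.20, p_vacuum=10.0)
print(f"impregnated liquid fraction: {x:.3f}")   # ≈ 0.180
```

A deeper vacuum (smaller p1) leaves less residual gas to re-expand, so a larger share of the pore volume is filled on pressure restoration.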
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; from this need an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which was overtaken by the ``berry picking'' model; the latter explains that search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR.
The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its own context. In fact, the method was conceived to improve the quality of search by rewriting the query on the basis of contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing every IR system led to its development as a middleware of interaction between the user and the IR system. The system therefore has just two possible actions: rewriting the query and reordering the results. Actions equivalent to this approach are described in the PS literature, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided directly by the user. The thesis went further, devising a novel assessment procedure, in accordance with the "Cranfield paradigm", for evaluating this type of IR system. The results achieved are interesting considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
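A minimal sketch of such a two-action middleware, assuming a toy local knowledge base and a bag-of-words IR backend (all names, the expansion rule and the ranking heuristic here are illustrative, not the thesis implementation):

```python
from dataclasses import dataclass, field

@dataclass
class ContextualMiddleware:
    """Sits between the user and an arbitrary IR system; its only two
    actions are (1) rewriting the query with context terms taken from a
    local knowledge base and (2) reordering the returned results."""
    knowledge_base: dict = field(default_factory=dict)  # term -> context terms

    def rewrite(self, query):
        # Expand each query term with its context from the local KB.
        terms = query.split()
        expansion = [c for t in terms for c in self.knowledge_base.get(t, [])]
        return " ".join(terms + sorted(set(expansion) - set(terms)))

    def reorder(self, query, results):
        # Rank results by term overlap with the rewritten query.
        qterms = set(self.rewrite(query).split())
        return sorted(results, key=lambda doc: -len(qterms & set(doc.split())))

mw = ContextualMiddleware({"jaguar": ["car", "speed"]})
print(mw.rewrite("jaguar"))            # "jaguar car speed"
docs = ["jaguar habitat rainforest", "jaguar car speed record"]
print(mw.reorder("jaguar", docs)[0])   # the automotive document ranks first
```

Because the user-supplied knowledge base says "jaguar" means the car in this user's context, the ambiguous one-word query is disambiguated before it ever reaches the underlying IR system.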
Abstract:
The identification of people by measuring some traits of individual anatomy or physiology has led to a specific research area called biometric recognition. This thesis focuses on improving fingerprint recognition systems with respect to three important problems: fingerprint enhancement, fingerprint orientation extraction and automatic evaluation of fingerprint algorithms. An effective extraction of salient fingerprint features depends on the quality of the input fingerprint: if the fingerprint is very noisy, a reliable set of features cannot be detected. A new fingerprint enhancement method, which is both iterative and contextual, is proposed. This approach detects high-quality regions in fingerprints, selectively applies contextual filtering and iteratively expands, like wildfire, toward low-quality ones. A precise estimation of the orientation field greatly simplifies the estimation of other fingerprint features (singular points, minutiae) and improves the performance of a fingerprint recognition system. Fingerprint orientation extraction is improved along two directions. First, after the introduction of a new taxonomy of fingerprint orientation extraction methods, several variants of baseline methods are implemented and, by pointing out the role of pre- and post-processing, we show how to improve the extraction. Second, a new hybrid orientation extraction method, which follows an adaptive scheme, significantly improves the orientation extraction in noisy fingerprints. Scientific papers typically propose recognition systems that integrate many modules; an automatic evaluation of fingerprint algorithms is therefore needed to isolate the contributions that determine actual progress in the state of the art.
The lack of a publicly available framework to compare fingerprint orientation extraction algorithms motivated the introduction of a new benchmark area called FOE (including fingerprints and manually marked orientation ground truth), along with fingerprint matching benchmarks, in the FVC-onGoing framework. The success of this framework is attested by the relevant statistics: more than 1450 algorithms submitted and two international competitions.
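To illustrate what a baseline orientation extraction method looks like, here is a sketch of the classical averaged-squared-gradient block estimator (a generic textbook baseline, not the thesis code or its hybrid method):

```python
import numpy as np

def orientation_field(img, block=16):
    """Block-wise ridge orientation via averaged squared gradients:
    within each block the gradient products are summed and the dominant
    gradient direction is found; the ridge orientation is orthogonal
    to it (a simplified sketch without pre- or post-processing)."""
    gy, gx = np.gradient(img.astype(float))      # row (y) and column (x) gradients
    gxx, gyy, gxy = gx * gx, gy * gy, gx * gy
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            sxx = gxx[i:i+block, j:j+block].sum()
            syy = gyy[i:i+block, j:j+block].sum()
            sxy = gxy[i:i+block, j:j+block].sum()
            # ridges run orthogonal to the dominant gradient direction
            theta[i // block, j // block] = 0.5 * np.arctan2(2 * sxy, sxx - syy) + np.pi / 2
    return theta

# Synthetic vertical ridges (intensity varies only along x),
# so every block should report an orientation of ~pi/2 (vertical)
x = np.arange(64)
img = np.sin(2 * np.pi * x / 8)[None, :].repeat(64, axis=0)
print(round(float(orientation_field(img)[0, 0]), 2))   # 1.57
```

Pre-smoothing the gradient products and post-smoothing the resulting field (the pre- and post-processing the taxonomy above highlights) is what makes such baselines usable on noisy fingerprints.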
Abstract:
The work presented in this thesis focuses on the open-ended coaxial-probe frequency-domain reflectometry technique for the complex permittivity measurement, at microwave frequencies, of dispersive dielectric multilayer materials. An effective dielectric model is introduced and validated to extend the applicability of this technique to multilayer materials in an on-line system context. In addition, the thesis presents: 1) a numerical study of the imperfection of the contact at the probe-material interface, 2) a review of the available models and techniques, and 3) a new classification of the extraction schemes, with guidelines on how they can be used to improve the overall performance of the probe according to the problem requirements.
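For illustration, the simplest single-capacitance aperture model (an assumed textbook approximation, not the multilayer effective model of the thesis) already shows how a complex permittivity is extracted from a measured reflection coefficient:

```python
# Illustrative single-capacitance inversion for an open-ended coaxial probe:
# the aperture is modelled as Y = j*omega*(eps_r*C0 + Cf), so the complex
# relative permittivity follows directly from the reflection coefficient.
# C0 (fringing capacitance in the material) and Cf are probe constants
# obtained by calibration; the values below are hypothetical.
import cmath

def permittivity_from_gamma(gamma, freq, c0, cf=0.0, z0=50.0):
    omega = 2 * cmath.pi * freq
    y = (1 - gamma) / (z0 * (1 + gamma))      # aperture admittance from Gamma
    return (y / (1j * omega) - cf) / c0       # complex relative permittivity

# Round-trip check with eps_r = 10 - 2j at 3 GHz and an assumed C0
c0, f, eps = 20e-15, 3e9, 10 - 2j
y = 1j * 2 * cmath.pi * f * eps * c0
gamma = (1 - 50.0 * y) / (1 + 50.0 * y)
print(permittivity_from_gamma(gamma, f, c0))   # ≈ (10-2j)
```

Real probes need richer admittance models (radiation terms, layered media), which is precisely where the extraction-scheme classification above comes into play.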
Abstract:
The aim of this work is to explore, within the framework of the presumably asymptotically safe Quantum Einstein Gravity, quantum corrections to black hole spacetimes, in particular in the case of rotating black holes. We have analysed this problem by exploiting the scale-dependent Newton's constant implied by the renormalization group equation for the effective average action, and by introducing an appropriate "cutoff identification" which relates the renormalization scale to the geometry of the spacetime manifold. We used these two ingredients in order to "renormalization group improve" the classical Kerr metric that describes the spacetime generated by a rotating black hole. We have focused our investigation on four basic subjects of black hole physics. The main results related to these topics can be summarized as follows. Concerning the critical surfaces, i.e. horizons and static limit surfaces, the improvement leads to a smooth deformation of the classical critical surfaces; their number remains unchanged. In relation to the Penrose process for energy extraction from black holes, we have found that there exists a non-trivial correlation between regions of negative energy states in the phase space of rotating test particles and configurations of critical surfaces of the black hole. As for the vacuum energy-momentum tensor and the energy conditions, we have shown that no model with "normal" matter, in the sense of matter fulfilling the usual energy conditions, can simulate the quantum fluctuations described by the improved Kerr spacetime that we have derived. Finally, in the context of black hole thermodynamics, we have performed calculations of the mass and angular momentum of the improved Kerr black hole, applying the standard Komar integrals. The results reflect the antiscreening character of the quantum fluctuations of the gravitational field.
Furthermore, we calculated approximations to the entropy and the temperature of the improved Kerr black hole to leading order in the angular momentum. More generally, we have proven that the temperature can no longer be proportional to the surface gravity if an entropy-like state function is to exist.
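Schematically, the improvement scheme described above can be written as follows (shown here in the familiar Bonanno-Reuter form for the running Newton constant; the precise distance function d(P) appropriate to the Kerr geometry is the subject of the thesis):

```latex
% Running Newton constant from the RG equation for the effective average action
G(k) \;=\; \frac{G_0}{1 + \omega\, G_0\, k^{2}}\,,
\qquad
% cutoff identification: renormalization scale tied to the geometry
k(P) \;=\; \frac{\xi}{d(P)}
\quad\Longrightarrow\quad
G(P) \;=\; \frac{G_0\, d(P)^{2}}{d(P)^{2} + \tilde{\omega}\, G_0}\,,
\qquad \tilde{\omega} \equiv \omega\,\xi^{2}.
```

Replacing G_0 by G(P) in the classical Kerr metric yields the "renormalization group improved" spacetime whose critical surfaces, Penrose process and thermodynamics are analysed above.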
Abstract:
Apart from the article forming the main content, most HTML documents on the WWW contain additional content such as navigation menus, design elements or commercial banners. In the context of several applications it is necessary to distinguish automatically between main and additional content. Content extraction and template detection are the two approaches to this task. This thesis gives an extensive overview of existing algorithms from both areas. It contributes an objective way to measure and evaluate the performance of content extraction algorithms under different aspects. These evaluation measures allow the first objective comparison of existing extraction solutions. The newly introduced content code blurring algorithm overcomes several drawbacks of previous approaches and proves to be the best content extraction algorithm at the moment. An analysis of methods to cluster web documents according to their underlying templates is the third major contribution of this thesis. In combination with a localised crawling process, this clustering analysis can be used to automatically create sets of training documents for template detection algorithms. As the whole process can be automated, it makes it possible to perform template detection on a single document, thereby combining the advantages of single- and multi-document algorithms.
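A drastically simplified relative of the content code blurring idea can be sketched in a few lines: each token is flagged as text or markup, the flag sequence is blurred with a moving average, and only tokens in high-text-density regions are kept (window size, threshold and the tokenisation are illustrative, not the thesis algorithm):

```python
def extract_main_content(tokens, window=5, threshold=0.6):
    """Keep text tokens that sit in regions of high text density.
    flags: 1 for text tokens, 0 for markup; blurring the flag sequence
    suppresses isolated text (menu labels) inside markup-dense regions."""
    flags = [0 if t.startswith("<") else 1 for t in tokens]
    half = window // 2
    kept = []
    for i, tok in enumerate(tokens):
        lo, hi = max(0, i - half), min(len(flags), i + half + 1)
        density = sum(flags[lo:hi]) / (hi - lo)
        if flags[i] and density >= threshold:
            kept.append(tok)
    return " ".join(kept)

tokens = ["<div>", "<a>", "Home", "</a>", "</div>",
          "<p>", "The", "article", "text", "goes", "here", "</p>"]
print(extract_main_content(tokens))   # "The article text goes here"
```

The navigation label "Home" is dropped because its neighbourhood is markup-dense, while the article sentence survives: exactly the main-versus-additional-content distinction discussed above.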
Abstract:
This research work is aimed at the valorization, in the livestock sector, of two types of pomace deriving from the extra virgin olive oil mechanical extraction process, namely olive pomace and a new by-product named “paté”, as important sources of antioxidants and unsaturated fatty acids. In the first study the suitability of dried stoned olive pomace as a dietary supplement for dairy buffaloes was evaluated. The effectiveness of this supplementation in modifying the fatty acid composition and improving the oxidative stability of buffalo milk and mozzarella cheese was proven by means of the analysis of qualitative and quantitative parameters. In the second study the use of paté in the dietary feed supplementation of dairy ewes, already fed a source of unsaturated fatty acids such as extruded linseed, was investigated in order to assess the effect of this combination on the dairy products obtained. The characterization of paté as a new by-product was also carried out, while studying the optimal conditions for its stabilization and preservation. The main result, common to both studies, was the detection and characterization of hydrophilic phenols in the milk. The analytical detection of hydroxytyrosol and tyrosol in the milk of ewes fed with paté, and of hydroxytyrosol in the milk of buffaloes fed with pomace, showed for the first time the presence in milk of hydroxytyrosol, one of the most important bioactive compounds of olive oil industry products. The transfer of these antioxidants, together with the proven improvement of milk fat quality, could contribute positively to the prevention of some human cardiovascular diseases and some tumours, thereby increasing the quality of dairy products and also improving their shelf-life. These results also provide important information on the bioavailability of these phenolic compounds.
Abstract:
Within the framework of the AdS5/CFT4 correspondence, the GKP string living on an AdS5 x S5 background finds a counterpart in the anti-ferromagnetic vacuum state of the spin chain fruitfully employed to investigate the dual N=4 SYM superconformal gauge theory. The thesis mainly deals with the excitations over such a vacuum: dispersion relations and scattering matrices are computed, and a set of Asymptotic Bethe Ansatz equations is formulated. Furthermore, the survey of the GKP vacuum within the AdS4/CFT3 duality, between a string theory on AdS4 x CP3 and N=6 Chern-Simons theory, reveals intriguing connections relating the latter to N=4 SYM in a peculiar high-spin limit.
Abstract:
This work describes the realization of an external-cavity laser system and of an ultra-high-vacuum apparatus, to be employed in an experiment on ultracold atomic mixtures using two bosonic atomic species: 87Rb and 41K. Special attention is devoted to the characteristics of the adopted scheme and to the temperature-control system, which make this laser system particularly stable in frequency and insensitive to vibrations and temperature variations. The properties of the materials employed and the experimental procedures adopted for the realization of the new vacuum apparatus were then analysed, in order to guarantee better performance than the system currently in use.
Abstract:
This thesis investigates methods and software architectures for discovering the typical and frequently occurring structures used for organizing knowledge in the Web. We identify these structures as Knowledge Patterns (KPs). KP discovery needs to address two main research problems: the heterogeneity of sources, formats and semantics in the Web (i.e., the knowledge soup problem) and the difficulty of drawing a relevant boundary around data so as to capture the meaningful knowledge with respect to a certain context (i.e., the knowledge boundary problem). Hence, we introduce two methods that provide different solutions to these problems by tackling KP discovery from two different perspectives: (i) the transformation of KP-like artifacts into KPs formalized as OWL2 ontologies; (ii) the bottom-up extraction of KPs by analyzing how data are organized in Linked Data. The two methods address the knowledge soup and boundary problems in different ways. The first method is based on a purely syntactic transformation step from the original source to RDF, followed by a refactoring step whose aim is to add semantics to the RDF by selecting meaningful RDF triples. The second method draws boundaries around RDF data in Linked Data by analyzing type paths. A type path is a possible route through an RDF graph that takes into account the types associated with the nodes of the path. We then present K~ore, a software architecture conceived as the basis for developing KP discovery systems and designed according to two software architectural styles, namely Component-based and REST. Finally, we provide an example of KP reuse based on Aemoo, an exploratory search tool which exploits KPs for performing entity summarization.
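A miniature, hypothetical illustration of the type-path idea (single-step paths only; the data and property names are invented for the example): lifting concrete triples to the types of their endpoints reveals the recurring structure shared by many individual statements.

```python
# A single-step "type path": a route through an RDF graph recorded at the
# level of the types of the nodes it traverses (toy version, not K~ore).
def type_paths(triples, types):
    """triples: iterable of (subject, predicate, object);
    types: node -> type name (untyped nodes default to owl:Thing).
    Returns the set of (subject_type, predicate, object_type) steps."""
    return {(types.get(s, "owl:Thing"), p, types.get(o, "owl:Thing"))
            for s, p, o in triples}

triples = [("db:Bologna", "dbo:country", "db:Italy"),
           ("db:Rome",    "dbo:country", "db:Italy")]
types = {"db:Bologna": "dbo:City", "db:Rome": "dbo:City",
         "db:Italy": "dbo:Country"}
print(type_paths(triples, types))
# {('dbo:City', 'dbo:country', 'dbo:Country')}
```

Two distinct triples collapse into one type path, which is what makes type paths usable as a boundary-drawing device over large Linked Data sets.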
Abstract:
Over the past ten years, the cross-correlation of long time series of ambient seismic noise (ASN) has been widely adopted to extract the surface-wave part of the Green's Functions (GF). This stochastic procedure relies on the assumption that the ASN wavefield is diffuse and stationary. At frequencies <1 Hz, the ASN is mainly composed of surface waves, whose origin is attributed to the sea-wave climate. Consequently, marked directional properties may be observed, which call for an accurate investigation of the location and temporal evolution of the ASN sources before attempting any GF retrieval. Within this general context, this thesis is aimed at a thorough investigation of the feasibility and robustness of noise-based methods for the imaging of complex geological structures at the local (∼10-50 km) scale. The study focuses on the analysis of an extended (11-month) seismological data set collected at the Larderello-Travale geothermal field (Italy), an area for which the underground geological structures are well constrained thanks to decades of geothermal exploration. Focusing on the secondary microseism band (SM; f>0.1 Hz), I first investigate the spectral features and the kinematic properties of the noise wavefield using beamforming analysis, highlighting a marked variability with time and frequency. In the 0.1-0.3 Hz frequency band and during Spring-Summer time, the SM waves propagate with high apparent velocities and from well-defined directions, likely associated with ocean storms in the southern hemisphere. Conversely, at frequencies >0.3 Hz the distribution of back-azimuths is more scattered, thus indicating that this frequency band is the most appropriate for the application of stochastic techniques. For this latter frequency interval, I tested two correlation-based methods, acting in the time (NCF) and frequency (modified-SPAC) domains, respectively yielding estimates of the group- and phase-velocity dispersions.
Velocity data provided by the two methods are markedly discordant; a comparison with independent geological and geophysical constraints suggests that the NCF results are more robust and reliable.
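The core NCF operation can be sketched in a few lines: cross-correlating two records of the same noise recovers the inter-station travel time as the lag of the correlation peak (a synthetic toy setup, not the processing chain of the thesis):

```python
import numpy as np

def noise_cross_correlation(trace_a, trace_b):
    """Normalized cross-correlation of two noise records; for a diffuse,
    stationary wavefield its long-time stack converges toward the
    inter-station Green's function (surface-wave part)."""
    a = (trace_a - trace_a.mean()) / trace_a.std()
    b = (trace_b - trace_b.mean()) / trace_b.std()
    ncf = np.correlate(a, b, mode="full") / len(a)
    lags = np.arange(-len(a) + 1, len(a))
    return lags, ncf

# Synthetic test: station A records the same noise 7 samples after B,
# so the correlation must peak at lag +7.
rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)
delay = 7
a, b = noise[:-delay], noise[delay:]    # the wave reaches A after B
lags, ncf = noise_cross_correlation(a, b)
print(lags[np.argmax(ncf)])             # 7
```

On real data the same peak, stacked over months of records, yields the travel time from which the group- and phase-velocity dispersions discussed above are measured.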
Abstract:
The lattice formulation of Quantum Chromodynamics (QCD) has become a reliable tool providing ab initio calculations of low-energy quantities. Despite numerous successes, systematic uncertainties, such as discretisation effects, finite-size effects, and contaminations from excited states, are inherent in any lattice calculation. Simulations with controlled systematic uncertainties and close to the physical pion mass have become state-of-the-art. We present such a calculation for various hadronic matrix elements using non-perturbatively O(a)-improved Wilson fermions with two dynamical light quark flavours. The main topics covered in this thesis are the axial charge of the nucleon, the electro-magnetic form factors of the nucleon, and the leading hadronic contributions to the anomalous magnetic moment of the muon. Lattice simulations typically tend to underestimate the axial charge of the nucleon by 5-10%. We show that accounting for excited-state contaminations using the summed operator insertion method leads to agreement with the experimentally determined value. Further studies of systematic uncertainties reveal only small discretisation effects. For the electro-magnetic form factors of the nucleon, we see a similar contamination from excited states as for the axial charge. The electro-magnetic radii, extracted from a dipole fit to the momentum dependence of the form factors, show no indication of finite-size or cutoff effects. If we include excited states using the summed operator insertion method, we achieve better agreement with the radii from phenomenology. The anomalous magnetic moment of the muon can be measured and predicted to very high precision. The theoretical prediction of the anomalous magnetic moment receives contributions from the strong, weak, and electro-magnetic interactions, with the hadronic contributions dominating the uncertainties.
A persistent 3σ tension between the experimental determination and the theoretical calculation is found, which is considered a possible indication of physics beyond the Standard Model. We present a calculation of the connected part of the hadronic vacuum polarisation using lattice QCD. Partially twisted boundary conditions lead to a significant improvement of the vacuum polarisation in the region of small momentum transfer, which is crucial for the extraction of the hadronic vacuum polarisation.
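For illustration, the radius extraction from a dipole fit mentioned above for the electro-magnetic form factors can be sketched on synthetic data; the linearized fit below is an assumed textbook shortcut, not the analysis of the thesis:

```python
import numpy as np

def dipole_radius(q2, g):
    """Fit a dipole G(Q^2) = G0 / (1 + Q^2/M^2)^2 and return the radius
    sqrt(<r^2>) = sqrt(12)/M (natural units).  The fit is linearized:
    G^{-1/2} = (1 + Q^2/M^2) / sqrt(G0) is linear in Q^2."""
    slope, intercept = np.polyfit(q2, g ** -0.5, 1)
    m2 = intercept / slope        # slope/intercept ratio isolates 1/M^2
    return np.sqrt(12.0 / m2)

# Synthetic dipole data with M^2 = 0.71 GeV^2 (the classic proton value)
m2_true = 0.71
q2 = np.linspace(0.05, 1.0, 20)
g = 1.0 / (1.0 + q2 / m2_true) ** 2
r = dipole_radius(q2, g)
print(round(r, 3), "GeV^-1")      # sqrt(12/0.71) ≈ 4.111
```

The relation <r^2> = 12/M^2 follows from <r^2> = -6 G'(0)/G(0) applied to the dipole form; on lattice data the fit is performed on the momentum dependence of the computed form factors.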
Abstract:
In this thesis we discuss technologies that allow us to approach sentiment analysis on newspaper articles. The final goal of this work is to help social scholars carry out content analysis on large corpora of texts more quickly, thanks to the support of automatic text classification.
Abstract:
This work introduces the basic concepts of Natural Language Processing, focusing on Information Extraction and analysing its application domains, its main tasks and its difference from Information Retrieval. The Named Entity Recognition process is then analysed, focusing attention on the main issues in text annotation and on the methods for evaluating the quality of entity extraction. Finally, an overview of the open-source language-processing software platform GATE/ANNIE is provided, describing its architecture and main components, with an in-depth look at the tools GATE offers for the rule-based approach to Named Entity Recognition.
Abstract:
Some of the most interesting phenomena that arise from the developments of modern physics are surely vacuum fluctuations. They appear in different branches of physics, such as Quantum Field Theory, Cosmology, Condensed Matter Physics, Atomic and Molecular Physics, and also in Mathematical Physics. One of the most important of these vacuum fluctuation phenomena, sometimes called "zero-point energy", as well as one of the easiest quantum effects to detect, is the so-called Casimir effect. The purposes of this thesis are: - to propose a simple retarded approach to the dynamical Casimir effect, i.e., a description of this vacuum effect in the presence of moving boundaries; - to describe the behaviour of the force acting on a boundary, due to its self-interaction with the vacuum.
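For reference, the static parallel-plate result that the dynamical, retarded treatment generalizes (the standard formula for perfectly conducting plates at separation d):

```latex
% Zero-point energy per unit area between two perfectly conducting
% parallel plates at separation d (static Casimir effect):
\frac{E(d)}{A} \;=\; -\,\frac{\pi^{2}}{720}\,\frac{\hbar c}{d^{3}},
\qquad
\frac{F(d)}{A} \;=\; -\,\frac{\partial}{\partial d}\,\frac{E(d)}{A}
              \;=\; -\,\frac{\pi^{2}}{240}\,\frac{\hbar c}{d^{4}} .
```

The negative sign makes the force attractive; the dynamical case studied in the thesis asks how this vacuum force changes when the boundaries move and retardation is taken into account.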