16 results for Extraction and Processing Industry
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Traceability is often perceived by food industry executives as an additional cost of doing business, one to be avoided if possible. However, a traceability system can in fact help firms comply with regulatory requirements, increase food safety and recall performance, improve marketing performance and improve supply chain management. Thus, traceability affects the business performance of firms through the costs and benefits determined by traceability practices. These costs and benefits are in turn affected by factors such as firms’ characteristics, the level of traceability and, lastly, the costs and benefits perceived prior to traceability implementation. This thesis was undertaken to understand how these factors are linked and how they affect the resulting costs and benefits. Analysis of the results of a plant-level survey of the Italian ichthyic processing industry revealed that processors generally adopt various levels of traceability, and that government support appears to increase the level of traceability as well as the expected and actual costs and benefits. None of the firms’ characteristics, with the exception of government support, influences costs or the level of traceability. Only firm size and the level of QMS certification are linked with benefits, while the precision of traceability increases benefits without affecting costs. Finally, traceability practices appear to be driven by requests from “external” stakeholders such as government, authorities and customers rather than by “internal” factors (e.g. improving firm management), and the traceability system does not provide any added value from the market in terms of a price premium or an increase in market share.
Abstract:
The identification of people by measuring traits of individual anatomy or physiology has led to a specific research area called biometric recognition. This thesis focuses on improving fingerprint recognition systems by considering three important problems: fingerprint enhancement, fingerprint orientation extraction and automatic evaluation of fingerprint algorithms. An effective extraction of salient fingerprint features depends on the quality of the input fingerprint: if the fingerprint is very noisy, a reliable set of features cannot be detected. A new fingerprint enhancement method, which is both iterative and contextual, is proposed. This approach detects high-quality regions in fingerprints, selectively applies contextual filtering and iteratively expands, like wildfire, toward low-quality regions. A precise estimation of the orientation field greatly simplifies the estimation of other fingerprint features (singular points, minutiae) and improves the performance of a fingerprint recognition system. Fingerprint orientation extraction is improved along two directions. First, after introducing a new taxonomy of fingerprint orientation extraction methods, several variants of baseline methods are implemented and, by pointing out the role of pre- and post-processing, we show how to improve the extraction. Second, a new hybrid orientation extraction method, which follows an adaptive scheme, significantly improves orientation extraction in noisy fingerprints. Scientific papers typically propose recognition systems that integrate many modules, and therefore an automatic evaluation of fingerprint algorithms is needed to isolate the contributions that determine actual progress in the state of the art.
The lack of a publicly available framework to compare fingerprint orientation extraction algorithms motivates the introduction of a new benchmark area called FOE (including fingerprints and manually marked orientation ground truth), along with fingerprint matching benchmarks, in the FVC-onGoing framework. The success of this framework is attested by relevant statistics: more than 1450 submitted algorithms and two international competitions.
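As a point of reference for the baseline methods discussed above, the classic gradient-based ridge orientation estimation can be sketched in a few lines of Python (a minimal illustration, not one of the thesis's methods; `block` is an assumed window size):

```python
import numpy as np

def orientation_field(img, block=16):
    """Classic gradient-based ridge orientation estimation (baseline method).

    Returns one orientation per block, in radians within [0, pi)."""
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    H, W = h // block, w // block
    theta = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            # Averaged squared gradients of the block.
            gxx = np.sum(gx[sl] ** 2 - gy[sl] ** 2)
            gxy = np.sum(2 * gx[sl] * gy[sl])
            # Ridge orientation is orthogonal to the dominant gradient direction.
            theta[i, j] = (0.5 * np.arctan2(gxy, gxx) + np.pi / 2) % np.pi
    return theta
```

Averaging the doubled-angle gradient components before halving the angle is what makes the estimate stable despite the 180° ambiguity of ridge directions.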
Abstract:
The global meat market is currently facing several dramatic changes due to shifts in diet and lifestyle, consumer demands, and economic considerations. Firstly, there has been a tremendous increase in demand for poultry meat, and current forecast and projection studies indicate that the expansion of the poultry market will continue in the future. In response to this demand, the growth rate of meat-type chickens has been increased with great success over the last few decades in order to optimize the production of poultry meat. However, the increased growth rate has induced the appearance of several muscle abnormalities, such as the pale-soft-exudative (PSE) syndrome, deep pectoral myopathy (DPM) and, more recently, white striping and wooden breast. There is currently growing interest in the meat industry in understanding the magnitude of the effect of these abnormalities on the quality traits of raw and processed meat. Therefore, the major part of the research activities during the PhD project was dedicated to evaluating the implications of recent muscle abnormalities, such as white striping and wooden breast, for meat quality traits, and their incidence under commercial conditions. Overall, our results showed that the incidence of these muscle abnormalities was very high under commercial conditions and had a great adverse impact on meat quality traits. Secondly, there is a growing market share of convenient, healthy, and functional processed meat products. Accordingly, the remaining part of the research activities of the PhD project was dedicated to evaluating the possibility of formulating processed meat products with a healthier perceived profile, such as phosphate-free marinated chicken meat and low-sodium marinated rabbit meat products.
Overall, the findings showed that sodium bicarbonate can be considered a promising component to replace phosphates in meat products, while potassium chloride, under certain conditions, was successfully used to produce low-sodium marinated rabbit meat products.
Abstract:
Statistical modelling and statistical learning theory are two powerful analytical frameworks for analyzing signals and developing efficient processing and classification algorithms. In this thesis, these frameworks are applied to modelling and processing biomedical signals in two different contexts: ultrasound medical imaging systems and primate neural activity analysis and modelling. In the context of ultrasound medical imaging, two main applications are explored: deconvolution of signals measured from an ultrasonic transducer, and automatic image segmentation and classification of prostate ultrasound scans. In the former application, a stochastic model of the radio-frequency signal measured from an ultrasonic transducer is derived. This model is then employed to develop, in a statistical framework, a regularized deconvolution procedure for enhancing signal resolution. In the latter application, different statistical models are used to characterize images of prostate tissue, extracting different features. These features are then used to segment the images into regions of interest by means of an automatic procedure based on a statistical model of the extracted features. Finally, machine learning techniques are used for automatic classification of the different regions of interest. In the context of neural activity signals, a bio-inspired dynamical network was developed to support studies of motor-related processes in the brain of primate monkeys. The presented model aims to mimic the abstract functionality of a cell population in the parietal area 7a of primate monkeys during the execution of learned behavioural tasks.
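As a hedged illustration of the kind of regularized deconvolution described above (the thesis derives its own stochastic model; this is the generic Wiener-style frequency-domain variant, with `reg` standing in for a noise-to-signal ratio):

```python
import numpy as np

def wiener_deconvolve(measured, psf, reg=1e-2):
    """Regularized frequency-domain deconvolution (Wiener-style sketch).

    measured: observed 1-D RF signal
    psf:      system impulse response (pulse shape)
    reg:      regularization constant standing in for the noise-to-signal ratio."""
    n = len(measured)
    H = np.fft.fft(psf, n)
    Y = np.fft.fft(measured)
    # Regularization keeps the inverse filter bounded where |H| is small,
    # trading some resolution for noise robustness.
    X = Y * np.conj(H) / (np.abs(H) ** 2 + reg)
    return np.real(np.fft.ifft(X))
```

With `reg = 0` this degenerates to naive inverse filtering, which amplifies noise at frequencies where the pulse spectrum is weak; the regularizer is what makes the estimate statistically usable.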
Abstract:
In this thesis two major topics inherent in medical ultrasound imaging are addressed: deconvolution and segmentation. For the first, a deconvolution algorithm is described that restores statistically consistent maximum a posteriori estimates of the tissue reflectivity. These estimates are shown to provide a reliable source of information for an accurate characterization of biological tissues through the ultrasound echo. The second topic involves the definition of a semi-automatic algorithm for myocardium segmentation in 2D echocardiographic images. The results show that the proposed method can reduce inter- and intra-observer variability in the delineation of myocardial contours and is feasible and accurate even on clinical data.
Abstract:
A main objective of human movement analysis is the quantitative description of joint kinematics and kinetics. This information can help address clinical problems both in orthopaedics and in motor rehabilitation. Previous studies have shown that the assessment of kinematics and kinetics from stereophotogrammetric data requires a setup phase, special equipment and expertise to operate. Besides, this procedure may make subjects feel uneasy and may hinder their walking. The general aim of this thesis is the implementation and evaluation of new 2D markerless techniques, in order to contribute to the development of an alternative to traditional stereophotogrammetric techniques. At first, the focus of the study was the estimation of the kinematics of the ankle-foot complex during the stance phase of gait. Two particular cases were considered: subjects barefoot and subjects wearing ankle socks. The use of socks was investigated in view of the development of the hybrid method proposed in this work. Different algorithms were analyzed, evaluated and implemented in order to obtain a 2D markerless solution for estimating the kinematics in both cases. The proposed technique was validated against a traditional stereophotogrammetric system. Its implementation leads towards an easy-to-configure (and more comfortable for the subject) alternative to the traditional stereophotogrammetric system. The technique was then improved so that knee flexion/extension could also be measured with a 2D markerless approach. The main changes to the implementation concerned occlusion handling and background segmentation. With these additional constraints, the proposed technique was applied to the estimation of knee flexion/extension and compared with a traditional stereophotogrammetric system.
Results showed that the knee flexion/extension estimates from the traditional stereophotogrammetric system and the proposed markerless system were highly comparable, making the latter a potential alternative for clinical use. A contribution has also been made to the estimation of the lower limb kinematics of children with cerebral palsy (CP). For this purpose, a hybrid technique was proposed that uses high-cut underwear and ankle socks as “segmental markers” in combination with a markerless methodology. The proposed hybrid technique differs from the abovementioned markerless technique in the algorithm chosen. Results showed that the proposed hybrid technique can become a simple and low-cost alternative to traditional stereophotogrammetric systems.
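The final step of any such 2D pipeline, turning tracked landmarks into a joint angle, can be sketched as follows (a hypothetical minimal computation, not the thesis's algorithm; `hip`, `knee` and `ankle` are assumed image-plane coordinates):

```python
import numpy as np

def knee_flexion_deg(hip, knee, ankle):
    """Planar knee flexion/extension angle (degrees) from 2-D landmarks.

    0 deg corresponds to a fully extended knee (thigh and shank aligned);
    larger values correspond to increasing flexion."""
    thigh = np.asarray(knee, float) - np.asarray(hip, float)    # hip -> knee
    shank = np.asarray(ankle, float) - np.asarray(knee, float)  # knee -> ankle
    cosang = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    # Clip guards against tiny numerical overshoots outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

For example, collinear hip-knee-ankle points give 0°, while a right angle between thigh and shank gives 90°.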
Abstract:
The purpose of the PhD research was the identification of new farming and processing strategies, with the aim of improving the nutritional and technological characteristics of poultry meat. Part of the PhD research focused on the evaluation of alternative farming systems, with the aim of increasing animal welfare and improving the meat quality and sensorial characteristics of broiler chickens. The use of innovative ingredients for the marination of poultry meat (sodium bicarbonate and natural antioxidants) was also assessed. The research was developed by studying the following aspects:
- meat quality characteristics, oxidative stability and sensorial traits of chicken meat obtained from two different farming systems: free range vs conventional;
- meat quality traits of frozen chicken breast pre-salted using increasing concentrations of sodium chloride;
- use of sodium bicarbonate, in comparison with sodium tripolyphosphate, for the marination of broiler breast meat;
- marination with a mixture of thyme and orange essential oils to improve chicken meat quality traits, susceptibility to lipid oxidation and sensory traits.
The following meat quality analyses were performed: colour, pH, water-holding capacity by conventional (gravimetric methods, pressure application, centrifugation and cooking) and innovative methods (low-field NMR and DSC analysis), ability to absorb marinade solutions, texture (shear force using different probes and texture profile analysis), proximate analysis (moisture, proteins, lipids, ash content, collagen, fatty acids), susceptibility to lipid oxidation (determination of thiobarbituric acid reactive substances and peroxide value), and sensorial analysis (triangle test and consumer test).
Abstract:
Since the last century, rising interest in value-added and advanced functional materials has spurred ceaseless development in industrial processes and applications. Among the emerging technologies, thanks to their unique features and versatility in terms of supported processes, non-equilibrium plasma discharges appear as a key solvent-free, high-throughput and cost-efficient technique. Nevertheless, applied research studies are needed in order to address plasma potentialities, optimizing devices and processes for future industrial applications. In this framework, the aim of this dissertation is to report on the activities carried out and the results achieved in the development and optimization of plasma techniques for nanomaterial synthesis and processing to be applied in the biomedical field. In the first section, the design and investigation of a plasma-assisted process for the production of silver (Ag) nanostructured multilayer coatings exhibiting anti-biofilm and anti-clot properties is described. With the aim of enabling in-situ and on-demand deposition of Ag nanoparticles (NPs), the optimization of a continuous in-flight aerosol process for particle synthesis is reported. The stability and promising biological performance of the deposited coatings spurred further investigation through in-vitro and in-vivo tests, whose results are reported and discussed. With the aim of addressing the unanswered questions and tuning NP functionalities, the second section concerns the study of the conversion of silver-containing droplets in a flow-through plasma reactor. The presented results, obtained by combining different analysis techniques, support a formation mechanism based on droplet-to-particle conversion driven by plasma-induced precursor reduction.
Finally, the third section deals with the development of a combined simulative and experimental approach used to investigate in-situ droplet evaporation inside the plasma discharge, addressing the main contributions to liquid evaporation with a view to industrial scale-up of the process.
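Droplet evaporation of the kind investigated above is classically described by the quasi-steady d²-law, sketched below (an illustrative textbook model, not the simulative approach developed in the dissertation; the evaporation constant `K` is an assumed input lumping heat and mass transfer):

```python
def droplet_diameter(d0, K, t):
    """d²-law: the squared diameter of an evaporating droplet decays linearly.

    d0: initial diameter (m)
    K:  evaporation constant (m^2/s)
    t:  elapsed time (s)
    Returns the diameter at time t (0 once the droplet has fully evaporated)."""
    d_sq = d0 ** 2 - K * t
    return max(d_sq, 0.0) ** 0.5

def lifetime(d0, K):
    """Time for complete evaporation under the d²-law: t = d0^2 / K."""
    return d0 ** 2 / K
```

For a 20 µm droplet with K = 1e-9 m²/s, the predicted lifetime is 0.4 s, which is the kind of figure one compares against the droplet residence time in the discharge.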
Abstract:
Neural representations (NR) have emerged in the last few years as a powerful tool to represent signals from several domains, such as images, 3D shapes, or audio. Indeed, deep neural networks have been shown to be capable of approximating continuous functions that describe a given signal with theoretically infinite resolution. This finding allows representations whose memory footprint is fixed and decoupled from the resolution at which the underlying signal can be sampled, something that is not possible with traditional discrete representations, e.g., grids of pixels for images or voxels for 3D shapes. During the last two years, many techniques have been proposed to improve the capability of NR to approximate high-frequency details and to make the optimization procedures required to obtain NR less demanding in terms of both time and data, motivating many researchers to deploy NR as the main form of data representation for complex pipelines. Following this line of research, we first show that NR can precisely approximate Unsigned Distance Functions, providing an effective way to represent garments, which feature open 3D surfaces and unknown topology. Then, we present a pipeline to obtain, in a few minutes, a compact Neural Twin® for a given object by exploiting recent advances in modeling neural radiance fields. Furthermore, we move a step in the direction of adopting NR as a standalone representation, by considering the possibility of performing downstream tasks by directly processing the NR weights. We first show that deep neural networks can be compressed into compact latent codes. Then, we show how this technique can be exploited to perform deep learning on implicit neural representations (INR) of 3D shapes by looking only at the weights of the networks.
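To illustrate the fixed, resolution-decoupled memory footprint discussed above, here is a hypothetical miniature stand-in for a coordinate network: fixed random sinusoidal features with a least-squares linear readout (numpy only; not the architectures used in the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_neural_rep(xs, ys, hidden=64):
    """Fit a tiny continuous representation of a 1-D signal:
    fixed random sinusoidal features plus a closed-form linear readout.

    The stored parameters (W, b, w) have a fixed size, regardless of how
    densely the underlying signal is later sampled."""
    W = rng.normal(scale=4.0, size=(hidden,))       # random frequencies
    b = rng.uniform(0, 2 * np.pi, size=hidden)      # random phases
    feats = np.sin(np.outer(xs, W) + b)             # (n_samples, hidden)
    w, *_ = np.linalg.lstsq(feats, ys, rcond=None)  # least-squares readout
    return W, b, w

def query(model, xs):
    """Evaluate the continuous representation at arbitrary coordinates."""
    W, b, w = model
    return np.sin(np.outer(xs, W) + b) @ w
```

Once fitted on a coarse grid, the same fixed-size model can be queried at any resolution, which is the property that discrete pixel/voxel grids lack.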
Abstract:
Over the past years, the fruit and vegetable industry has become interested in the application of both osmotic dehydration and vacuum impregnation as mild technologies, because of their low temperature and energy requirements. Osmotic dehydration is a partial dewatering process based on the immersion of cellular tissue in a hypertonic solution. The diffusion of water from the vegetable tissue to the solution is usually accompanied by the simultaneous counter-diffusion of solutes into the tissue. Vacuum impregnation is a unit operation in which porous products are immersed in a solution and subjected to a two-step pressure change. In the first step (vacuum), the pressure in the solid-liquid system is reduced and the gas in the product pores expands, partially flowing out. When the atmospheric pressure is restored (second step), the residual gas in the pores is compressed and the external liquid flows into the pores. This unit operation allows specific solutes to be introduced into the tissue, e.g. antioxidants, pH regulators, preservatives and cryoprotectants. Fruit and vegetables interact dynamically with the environment, and the present study attempts to enhance our understanding of the structural, physico-chemical and metabolic changes of plant tissues upon the application of technological processes (osmotic dehydration and vacuum impregnation) by following a multianalytical approach. Macro (low-frequency nuclear magnetic resonance), micro (light microscopy) and ultrastructural (transmission electron microscopy) measurements, combined with textural and differential scanning calorimetry analyses, made it possible to evaluate the effects of individual osmotic dehydration or vacuum impregnation processes on (i) the interaction between air and liquid in real plant tissues, (ii) the water state of the plant tissue and (iii) the cell compartments.
Isothermal calorimetry, respiration and photosynthesis determinations were used to investigate the metabolic changes upon the application of osmotic dehydration or vacuum impregnation. The proposed multianalytical approach should enable both better design of processing technologies and better estimation of their effects on the tissue.
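The two-step pressure change described above lends itself to a simple ideal-gas estimate of the liquid volume fraction drawn into the pores (a hedged sketch of the classical hydrodynamic-mechanism balance, assuming rigid pores and no tissue deformation; not a model from this study):

```python
def impregnated_fraction(porosity, p_vacuum, p_atm=101.325):
    """Ideal-gas estimate of the sample volume fraction filled by external
    liquid after one vacuum impregnation cycle (rigid pores assumed).

    porosity:  effective gas porosity of the tissue (0-1)
    p_vacuum:  pressure during the vacuum step, same units as p_atm (kPa)
    p_atm:     restored pressure (default: atmospheric, kPa)."""
    # After restoring p_atm, the residual gas is compressed back to a
    # fraction p_vacuum / p_atm of the pore volume; liquid fills the rest.
    compression_ratio = p_vacuum / p_atm
    return porosity * (1.0 - compression_ratio)
```

For instance, a tissue with 20% gas porosity subjected to a 10 kPa vacuum step would, under these idealized assumptions, end up with roughly 18% of its volume impregnated by the external solution.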
Abstract:
Lipolysis and oxidation of lipids in foods are the major biochemical and chemical processes that cause food quality deterioration, leading to the characteristic, unpalatable odour and flavour called rancidity. In addition to unpalatability, rancidity may give rise to toxic levels of certain compounds, such as aldehydes, hydroperoxides, epoxides and cholesterol oxidation products. In this PhD study, chromatographic and spectroscopic techniques were employed to determine the degree of rancidity in different animal products and its relationship with technological parameters such as dietary fat sources, packaging, processing and storage conditions. To achieve this goal, capillary gas chromatography (CGC) was employed not only to determine the fatty acid profile but also, after solid-phase extraction, the amounts of free fatty acids (FFA), diglycerides (DG), sterols (cholesterol and phytosterols) and cholesterol oxidation products (COPs). UV/VIS absorbance spectroscopy was applied to determine hydroperoxides, the primary products of oxidation, and to quantify secondary products. Most of the foods analysed in this study were meat products. Indeed, lipid oxidation is a major deterioration reaction in meat and meat products and results in adverse changes in the colour, flavour and texture of meat. The development of rancidity has long been recognized as a serious problem during meat handling, storage and processing. On a dairy-style product, a vegetal cream, a study of the lipid fraction and of the development of rancidity during storage was carried out to evaluate its shelf-life and some nutritional features, such as the saturated/unsaturated fatty acid ratio and the phytosterol content. Then, in line with the growing interest in functional foods in recent years, a new electrophoretic method was optimized and compared with HPLC to check the quality of a beehive product, royal jelly.
This manuscript reports the main results obtained in five activities, briefly summarized as follows: 1) comparison between HPLC and a new electrophoretic method for the evaluation of the authenticity of royal jelly; 2) study of the lipid fraction of a vegetal cream under different storage conditions; 3) study of lipid oxidation in minced beef during storage under modified atmosphere packaging, before and after cooking; 4) evaluation of the influence of dietary fat and processing on the lipid fraction of chicken patties; 5) study of the lipid fraction of typical Italian and Spanish pork dry sausages and cured hams.
Abstract:
Nanotechnologies are rapidly expanding because of the opportunities that the new materials offer in many areas, such as the manufacturing industry, food production, processing and preservation, and the pharmaceutical and cosmetic industries. The size distribution of nanoparticles determines their properties and is a fundamental parameter that needs to be monitored from small-scale synthesis up to bulk production and quality control of nanotech products on the market. As a consequence of the increasing number of applications of nanomaterials, EU regulatory authorities are introducing an obligation for companies that make use of nanomaterials to acquire analytical platforms for the assessment of the size parameters of nanomaterials. In this work, Asymmetrical Flow Field-Flow Fractionation (AF4) and Hollow Fiber F4 (HF5), hyphenated with Multiangle Light Scattering (MALS), are presented as tools for a deep functional characterization of nanoparticles. In particular, the applicability of AF4-MALS to the characterization of liposomes in a wide series of media is demonstrated. The technique is then used to explore the functional features of a liposomal drug vector in terms of its biological and physical interaction with blood serum components: a comprehensive approach to understanding the behavior of lipid vesicles in terms of drug release and fusion/interaction with other biological species is described, together with the weaknesses and strengths of the method. Afterwards, the size characterization, size stability, and conjugation with azidothymidine drug molecules of a new generation of metastable drug vectors, the Metal Organic Frameworks, are discussed.
Lastly, the applicability of HF5-ICP-MS to the rapid screening of samples of nanorisk relevance is shown: rather than a deep and comprehensive characterization, what is demonstrated this time is a quick and smart methodology that, within a few steps, provides qualitative information on the content of metallic nanoparticles in tattoo ink samples.
Abstract:
The increasing demand for alternatives to meat products, driven by ethical and environmental reasons, highlights the necessity of using different protein sources. Plant proteins provide a valid option, thanks to their relatively low costs, high availability and wide supply sources. The current process used to produce plant protein concentrates and isolates is alkaline extraction followed by isoelectric precipitation. However, despite the high purity of the resulting proteins, it presents some drawbacks. Innovative protein extraction processes are emerging, with the aim of reducing the environmental impact and the costs, as well as improving the functional properties. In this study, the traditional wet protein extraction and a simplified wet process were used to obtain protein-rich extracts from different plant sources. The sources considered in the project were de-oiled sunflower and canola, chickpea, lentils, and camelina meal, an emerging oleaginous seed of interest for its high omega-3 content. The extracts obtained from the two processes were then analysed for their capacity to hold water and fat, to form gels and to produce stable foams. Results highlighted strong differences in protein content, yield and functionality. The extracts obtained with the alkaline process confirmed the literature data for the four plant sources (sunflower, canola, chickpea and lentils) and made it possible to obtain a camelina concentrate with a protein content of 63% and a protein recovery of 41%. The simplified process was not effective in obtaining a protein enrichment in the oleaginous sources, whereas enrichments of 10% and 15% were obtained in chickpea and lentils, respectively. The functional properties were also completely different: the simplified process produced protein ingredients completely water-soluble at pH 7, with a fair foaming capacity compared to the extracts obtained with the alkaline process.
These characteristics could make such extracts suitable for plant-based milk-analogue products.
Abstract:
The aim of this dissertation is to describe the methodologies required to design, operate, and validate the performance of ground stations dedicated to near and deep space tracking, as well as the models developed to process the acquired signals, from raw data to the output parameters of spacecraft orbit determination. This work is framed in the context of lunar and planetary exploration missions, addressing the challenges of receiving and processing radiometric data for radio science investigations and navigation purposes. These challenges include the design of an appropriate back-end to read, convert and store the antenna voltages; the definition of appropriate methodologies for the pre-processing, calibration, and estimation of radiometric data for the extraction of information on the spacecraft state; and the definition and integration of accurate models of the spacecraft dynamics to evaluate the quality of the recorded signals. Additionally, the experimental design of acquisition strategies to perform direct comparisons between ground stations is described and discussed. In particular, the evaluation of the differential performance between stations requires the design of a dedicated tracking campaign that maximizes the overlap of the datasets recorded at the receivers, making it possible to correlate the received signals and isolate the contribution of the ground segment to the noise in the single link. Finally, in support of the methodologies and models presented, results from the validation and design work performed on the Deep Space Network (DSN)-affiliated nodes DSS-69 and DSS-17 are also reported.
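To give a flavour of the radiometric observables involved, the first-order two-way Doppler relation can be sketched as follows (a textbook approximation, not the estimation models developed in this work; the transponder turnaround ratio and relativistic terms used in real deep-space processing are deliberately omitted):

```python
C = 299792458.0  # speed of light, m/s

def two_way_doppler_shift(f_transmit, range_rate):
    """First-order two-way Doppler shift for a coherent spacecraft link.

    f_transmit: uplink carrier frequency (Hz)
    range_rate: station-spacecraft range rate (m/s), positive when receding.
    The factor 2 accounts for the round trip (uplink + downlink)."""
    return -2.0 * f_transmit * range_rate / C
```

A spacecraft receding at 1 km/s on a 7.2 GHz uplink shifts the received carrier down by roughly 48 kHz, which illustrates why sub-Hz tracking of the carrier translates into very fine velocity information.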
Abstract:
The project aims to build an understanding of additive manufacturing (AM) and other Manufacturing 4.0 techniques with a view toward industrialization. First, the internal material anisotropy of elements created with the most economically feasible FDM technique was established. The main drivers of variability in AM were described, with a focus on achieving internal material isotropy. Subsequently, a technique for deposition parameter optimization was presented, and the procedure was further tested on other polymeric materials and composites. A replicability assessment by means of 4.0 technologies was proposed, and subsequent industry findings highlighted the need to develop a process that demonstrates how to re-engineer designs in order to obtain the best results with AM processing. The final study applies the Industrial Design and Structure Method (IDES), using all the knowledge previously gathered to fully re-engineer a product with a focus on tools from the 4.0 era, from product feasibility studies to CAE (FEM) analysis and CAM (DfAM). These results would help make AM and FDM processes a viable option to be combined with composite technologies, to achieve a reliable, cost-effective manufacturing method that could also be used for mass-market industry applications.