867 results for image processing and analysis
Abstract:
In order to simplify computer management, many system administrators are adopting advanced techniques to manage the software configuration of enterprise computer networks, but the tight coupling between hardware and software makes every PC an individually managed entity, lowering scalability and increasing the cost of managing hundreds or thousands of PCs. Virtualization is an established technology; however, its use has so far been focused on server consolidation and virtual desktop infrastructure rather than on managing computers distributed over a network. This paper discusses the feasibility of the Distributed Virtual Machine Environment, a new approach to enterprise computer management that combines virtualization and a distributed system architecture as the basis of the management architecture. © 2008 IEEE.
Abstract:
In this project, the main focus is to apply image processing techniques in computer vision, through an omnidirectional vision system, to agricultural mobile robots (AMR) for trajectory navigation and localization problems. To carry out this task, computational methods based on the JSEG algorithm were used to classify and characterize such problems, together with Artificial Neural Networks (ANN) for pattern recognition. It was thus possible to run simulations and analyze the performance of the JSEG image segmentation technique on Matlab/Octave platforms, along with a customized back-propagation algorithm and statistical methods as structured heuristics in a Simulink environment. With these procedures in place, it was possible to classify and characterize the HSV color-space segments and to recognize patterns, with reasonably accurate results. ©2010 IEEE.
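The JSEG and ANN components are beyond a short sketch, but the HSV color-space characterization step can be illustrated with a toy classifier. The hue thresholds and class names below are hypothetical choices for illustration, not values from the project:

```python
import colorsys

def hue_class(r, g, b):
    """Classify an RGB pixel (components as 0-1 floats) into a coarse
    hue bin: low saturation -> 'gray'; otherwise bin the hue circle
    into ranges loosely matching field content (invented thresholds)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if s < 0.15:
        return "gray"
    if 0.20 <= h <= 0.45:       # roughly green hues
        return "vegetation"
    if h < 0.12 or h > 0.9:     # roughly red/brown hues
        return "soil"
    return "other"

# Example pixels: grass-green, reddish soil, near-white sky
print(hue_class(0.2, 0.7, 0.2))   # vegetation
print(hue_class(0.6, 0.3, 0.2))   # soil
print(hue_class(0.9, 0.9, 0.95))  # gray
```

A real segmentation would of course operate on whole images and learn the class boundaries rather than hard-code them.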
Abstract:
The methodology for fracture analysis of polymeric composites with scanning electron microscopes (SEM) is still under discussion. Many authors prefer to use sputter coating with a conductive material instead of applying low-voltage (LV) or variable-pressure (VP) methods, which preserve the original surfaces. The present work examines the effects of sputter coating carbon-epoxy composite fracture surfaces with 25 nm of gold, measuring their topography with an atomic force microscope. The influence of SEM imaging parameters on fractal measurements is also evaluated for the VP-SEM and LV-SEM methods. Topographic measurements were not significantly affected by the gold coating at the tested scale. Moreover, changes in the SEM setup lead to nonlinear outcomes in texture parameters such as fractal dimension and entropy. For VP-SEM and LV-SEM, fractal dimension and entropy values showed no evident relation to image quality parameters, but the resolution must be optimized together with the imaging setup, accompanied by charge neutralization. © Wiley Periodicals, Inc.
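The fractal dimension mentioned above is commonly estimated by box counting; the sketch below is the textbook method, not the authors' texture pipeline, and assumes a set of points in the unit square:

```python
import math

def box_count_dimension(points, sizes):
    """Estimate fractal dimension of a point set by box counting:
    count occupied boxes N(s) at each box side length s, then fit
    log N(s) against log(1/s) by least squares; the slope is the
    estimated dimension."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
            sum((a - mx) ** 2 for a in xs)
    return slope

# A filled grid of points should give a dimension close to 2
grid = [(i / 64, j / 64) for i in range(64) for j in range(64)]
d = box_count_dimension(grid, [0.5, 0.25, 0.125, 0.0625])
print(round(d, 2))  # 2.0
```

For a rough fracture surface the same idea is applied to height data, and the fitted slope falls between the topological and embedding dimensions.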
Abstract:
Recently, there has been interest in technologies that favour the use of co-products in animal nutrition. The effect of adding two enzyme mixtures to diets for dogs formulated with wheat bran (WB) was evaluated. Two foods with similar compositions were formulated: a negative control (NC; without WB) and a test diet (25% WB). The test diet was divided into four treatments: without enzyme (positive control); enzyme mixture 1 (ENZ1; β-glucanase, xylanase, cellulase, glucoamylase and phytase added before extrusion); enzyme mixture 2 (ENZ2; ENZ1 plus α-amylase, added before extrusion); and enzyme mixture 2 added after extrusion (ENZ2ex). ENZ1 and ENZ2 were used to evaluate the enzyme effect in the extruder pre-conditioner (processing additive) and ENZ2ex to evaluate the effect of enzyme supplementation for the animal. Digestibility was measured through total collection of faeces and urine. The experiment followed a randomized block design with five treatments (diets) and six dogs per diet, totalling 30 dogs (7.0 ± 1.2 years old and 11.0 ± 2.2 kg of body weight). Data were subjected to analysis of variance and means compared by Tukey's test and orthogonal contrasts (p < 0.05). Reducing sugars showed an important reduction after extrusion, suggesting the formation of carbohydrate complexes. The apparent total tract digestibility (ATTD) of dry matter, organic matter, crude protein, acid-hydrolysed fat and energy was higher in NC than in the diets with WB (p < 0.001), with no effect of enzyme addition. WB diets resulted in higher faecal production and concentration of short-chain fatty acids (SCFA) and reduced pH and ammonia concentration (p < 0.01), with no effect of enzyme addition. Enzyme addition did not improve the digestibility of a diet high in non-starch polysaccharides; however, only ATTD was measured, and nutrient fermentation in the large intestine may have interfered with the results obtained. WB modified the formation of fermentation products in the colon of dogs.
© 2013 Blackwell Verlag GmbH.
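The ATTD values discussed above follow the standard definition, the fraction of nutrient intake not recovered in faeces; a minimal sketch with hypothetical numbers:

```python
def attd(intake_g, faecal_output_g):
    """Apparent total tract digestibility (%) of a nutrient:
    the fraction of intake not recovered in faeces."""
    return 100.0 * (intake_g - faecal_output_g) / intake_g

# Hypothetical example: a dog eating 50 g of crude protein/day
# and excreting 8 g/day of protein in faeces
print(round(attd(50.0, 8.0), 1))  # 84.0
```

Because it is "apparent", the measure cannot separate undigested nutrient from material fermented or added by the gut flora, which is exactly the caveat the authors raise.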
Abstract:
The objective of this work was to develop an extruded breakfast product from broken rice and split beans, and to assess the influence of the extrusion process on its physico-chemical, nutritional, technological and sensory characteristics. The final product showed a considerable protein content (9.9 g.100 g-1) and can be considered a good source of this nutrient for children and adolescents. Dietary fibre content was 3.71 g.100 g-1 in the ready-to-eat product; under Brazilian legislation, the rice-and-bean breakfast flake may therefore carry the claim of being a source of fibre. Regarding technological properties, the extrudate showed an expansion index of 8.89 and an apparent density of 0.25 g.cm-3. In the sensory analysis, the breakfast flake obtained mean acceptance scores between 6.8 and 7.7, corresponding to the categories "liked slightly" and "liked very much". As for purchase intention, 79% of the panellists stated that they would certainly or possibly buy the product. The use of broken rice and split beans is an interesting alternative for producing an extruded breakfast product with good nutritional, technological and sensory qualities.
Abstract:
Bio-molecular computing, 'computations performed by bio-molecules', is already challenging traditional approaches to computation both theoretically and technologically. Often placed within the wider context of 'bio-inspired', 'natural' or even 'unconventional' computing, the study of natural and artificial molecular computations is adding to our understanding of biology, the physical sciences and computer science well beyond the framework of existing design and implementation paradigms. In this introduction, we outline the current scope of the field and assemble some basic arguments that bio-molecular computation is of central importance to computer science, the physical sciences and biology, using HOL (Higher Order Logic). HOL is used as the computational tool in our R&D work. DNA was analyzed as a chemical computing engine in our effort to develop novel formalisms for understanding molecular-scale bio-chemical computing behavior using HOL. In our view, this is one of the pioneering efforts in the promising domain of nano-bio scale chemical information processing dynamics.
Abstract:
Background: Intrauterine insemination (IUI) is widely used to treat infertility, and its adequate indication is important to obtain good pregnancy rates. To assess which couples could benefit from IUI, this study evaluated whether sperm motility in normospermic individuals, after processing on a discontinuous gradient of different densities and incubation in CO2, is able to predict pregnancy.
Methods: A total of 175 couples underwent 175 IUI cycles. The inclusion criteria for women were as follows: 35 years old or younger (age range: 27 to 35 years) with normal fallopian tubes; endometriosis grades I-II; unexplained infertility; nonhyperandrogenic ovulatory dysfunction. Men with normal seminal parameters were also included. All patients underwent ovarian stimulation with clomiphene citrate and hMG or r-FSH. When one or (at most) three follicles measuring 18 to 20 mm were observed, hCG (5000 IU) or r-hCG (250 mcg) was administered and IUI was performed 36-40 h after hCG. Sperm processing was performed using a discontinuous concentration gradient. A 20-microliter aliquot was incubated for 24 h at 37 degrees C in 5% CO2, followed by analysis of total progressive motility. The Mann-Whitney and Chi-square tests, as well as a ROC curve, were used to determine the cutoff value for motility.
Results: Of the 175 couples, 52 (in 52 IUI cycles) achieved clinical pregnancies (CP rate per cycle: 29.7%). The analysis of age, duration and causes of infertility did not indicate any statistically significant difference between the pregnancy and no-pregnancy groups, similar to the results for total sperm count and morphology, excluding progressive motility (p < 0.0001). The comparison of progressive motility after processing and 24 h after incubation between these two groups indicated that progressive motility 24 h after incubation was higher in the pregnancy group. Within the pregnancy group, progressive motility did not differ between processing and 24 h after incubation; in couples who did not achieve pregnancy, there was a statistically significant decrease in progressive motility 24 h after incubation (p < 0.0001). The ROC curve analysis generated a cutoff value of 56.5% for progressive motility at 24 h after incubation; this cutoff produced 96.1% sensitivity, 92.7% specificity, 84.7% positive predictive value and 98.3% negative predictive value.
Conclusions: We conclude that the sperm motility of normospermic individuals 24 h after incubation at 37 degrees C in 5% CO2, with a cutoff value of 56.5%, is predictive of IUI success.
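The reported cutoff statistics can be illustrated with a small sketch of how sensitivity, specificity, PPV and NPV fall out of a cutoff rule; the toy motility data below are invented for illustration and are not the study's measurements:

```python
def diagnostic_stats(values, outcomes, cutoff):
    """Sensitivity, specificity, PPV and NPV of the rule
    'predict success when value >= cutoff'.
    values: test measurements; outcomes: True for pregnancy."""
    tp = sum(1 for v, o in zip(values, outcomes) if v >= cutoff and o)
    fp = sum(1 for v, o in zip(values, outcomes) if v >= cutoff and not o)
    fn = sum(1 for v, o in zip(values, outcomes) if v < cutoff and o)
    tn = sum(1 for v, o in zip(values, outcomes) if v < cutoff and not o)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical toy data (motility % at 24 h, pregnancy outcome)
motility = [70, 60, 58, 55, 40, 30, 65, 50]
pregnant = [True, True, True, False, False, False, True, False]
print(diagnostic_stats(motility, pregnant, 56.5))
# all four statistics are 1.0 on this toy data (perfect separation)
```

In practice the ROC curve is built by sweeping the cutoff over all observed values and choosing the point that best balances sensitivity and specificity.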
Abstract:
This paper presents a technique for performing analog design synthesis at circuit level, providing feedback to the designer through exploration of the Pareto frontier. A modified simulated annealing algorithm, able to perform crossover with past anchor points when a local minimum is found, is used as the optimization algorithm in the initial synthesis procedure. After all specifications are met, the algorithm searches for the extreme points of the Pareto frontier in order to obtain a non-exhaustive exploration of the Pareto front. Finally, multi-objective particle swarm optimization is used to spread the results and to find a more accurate frontier. Piecewise-linear functions are used as single-objective cost functions to produce a smooth and even convergence of all measurements to the desired specifications during the composition of the aggregate objective function. To verify the presented technique, two circuits were designed: a Miller amplifier with 96 dB voltage gain, 15.48 MHz unity-gain frequency and a slew rate of 19.2 V/µs with a current supply of 385.15 µA, and a complementary folded cascode with 104.25 dB voltage gain, 18.15 MHz unity-gain frequency and a slew rate of 13.370 MV/µs. These circuits were synthesized using a 0.35 µm technology. The results show that the method quickly reaches good solutions using the modified SA and achieves further Pareto front exploration through its connection to the particle swarm optimization algorithm.
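The crossover-with-anchor-points extension is specific to the paper, but the underlying combination of simulated annealing with piecewise-linear specification costs can be sketched in its plain textbook form; the toy "measurements" and spec targets below are hypothetical:

```python
import math, random

def spec_penalty(value, target):
    """Piecewise-linear cost: zero once a 'larger is better' spec
    target is met, linear penalty below it."""
    return max(0.0, target - value)

def anneal(cost, x0, step, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Plain simulated annealing on a real-valued vector (textbook
    Metropolis acceptance; the paper's anchor-point crossover is
    omitted)."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = cost(cand)
        # Accept improvements, or worsenings with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
        t *= cooling
    return x, fx

# Toy 'design': two hypothetical measurements m1 = 10*x0, m2 = 5*x1
# must meet targets 40 and 20; aggregate cost is the penalty sum.
cost = lambda x: spec_penalty(10 * x[0], 40) + spec_penalty(5 * x[1], 20)
best, fbest = anneal(cost, [0.0, 0.0], step=0.5)
print(round(fbest, 3))
```

Once the aggregate cost reaches zero, all specs are met and the Pareto-exploration phase described above would take over.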
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment. Particular care must be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction to aperture photometry to light-curve production and analysis.
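The core operation of the aperture photometry step can be sketched in a few lines: sum the pixel values inside a circular aperture and subtract an assumed sky level. The frame, aperture radius and sky value below are hypothetical, not from the pipeline:

```python
def aperture_flux(image, cx, cy, radius):
    """Sum pixel values within a circular aperture centred on
    (cx, cy); image is a 2-D list of pixel values (row-major)."""
    r2 = radius * radius
    total = 0.0
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                total += val
    return total

# Toy 5x5 frame: flat background of 1 plus a bright central pixel
img = [[1.0] * 5 for _ in range(5)]
img[2][2] = 101.0
raw = aperture_flux(img, 2, 2, 1.0)   # star + background counts
sky = 1.0                             # assumed per-pixel sky level
npix = 5                              # pixels inside radius 1
print(raw - npix * sky)               # background-subtracted flux: 100.0
```

A production pipeline would additionally estimate the sky from an annulus around the star, handle partial pixels at the aperture edge, and apply the quality-control checks mentioned above.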
Abstract:
The principal aim of this research project has been the evaluation of the specific role of yeasts in the ripening of dry-cured meat products, i.e. speck, and of salami produced with Lactobacillus starter cultures (L. sakei, L. casei, L. fermentum, L. rhamnosus, L. sakei + S. xylosus). In particular, the contribution of the predominant yeasts to the hydrolytic patterns of meat proteins has been studied both in model systems and in real products. Although several papers have been published on the microbial, enzymatic, aromatic and chemical characterization of dry-cured meat (e.g. ham) over ripening, the specific role of yeasts has often been underestimated. This research work has therefore focused on the following aspects:
1. characterization of the yeasts and lactic acid bacteria in samples of speck produced by different farms and analyzed during the various production and ripening phases;
2. characterization of the surface and internal yeast populations in salami produced with or without lactobacilli as starter cultures;
3. molecular characterization of different yeast strains and detection of the dominant biotypes able to survive environmental stress factors (such as smoke and salt);
4. study of the proteolytic profiles of speck and salami during ripening, and comparison with the proteolytic profiles produced in meat model systems by a large number of yeasts isolated from speck and salami;
5. study of the proteolytic profiles of Lactobacillus starter cultures in meat model systems;
6. comparative statistical analysis of the proteolytic profiles to find possible relationships between specific bands and peptides and specific microorganisms;
7. evaluation of the aromatic characteristics of speck and salami to assess relationships with the metabolites released by the starter cultures or the dominant microflora.
Abstract:
This thesis investigates two distinct research topics. The main topic (Part I) is the computational modelling of cardiomyocytes derived from human stem cells, both embryonic (hESC-CM) and induced-pluripotent (hiPSC-CM). The aim of this research line is to develop models of the electrophysiology of hESC-CMs and hiPSC-CMs that integrate the available experimental data, yielding in-silico models for studying, formulating new hypotheses about, and planning experiments on aspects not yet fully understood, such as the maturation process, the functionality of Ca2+ handling, or why hESC-CM/hiPSC-CM action potentials (APs) differ in some respects from the APs of adult cardiomyocytes. Chapter I.1 introduces the main concepts about hESC-CMs/hiPSC-CMs, the cardiac AP, and computational modelling. Chapter I.2 presents the hESC-CM AP model, able to simulate the maturation process through two developmental stages, Early and Late, based on experimental and literature data. Chapter I.3 describes the hiPSC-CM AP model, able to simulate the ventricular-like and atrial-like phenotypes. This model was used to assess which currents are responsible for the differences between the ventricular-like AP and the adult ventricular AP. The secondary topic (Part II) is the study of texture descriptors for biological image processing. Chapter II.1 provides an overview of important texture descriptors such as Local Binary Pattern and Local Phase Quantization; the non-binary coding and the multi-threshold approach are also introduced here. Chapter II.2 shows that the non-binary coding and the multi-threshold approach improve the classification performance on images of cellular/sub-cellular parts taken from six datasets. Chapter II.3 describes the case study of the classification of indirect immunofluorescence images of HEp-2 cells, used for the antinuclear antibody clinical test. Finally, the general conclusions are reported.
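The Local Binary Pattern descriptor mentioned for Chapter II.1 can be sketched for a single pixel; this is the basic 8-neighbour binary variant, not the thesis's non-binary or multi-threshold extensions:

```python
def lbp_code(patch):
    """Local Binary Pattern code of the centre pixel of a 3x3 patch:
    each of the 8 neighbours contributes one bit, set when that
    neighbour is >= the centre value."""
    c = patch[1][1]
    # Neighbours in clockwise order starting at top-left
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # 241
```

A texture descriptor for a whole image is then the histogram of these codes over all pixels; the multi-threshold variant replaces the single `>=` comparison with several offset thresholds.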
Abstract:
This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, used to reduce the data sent by the network nodes and thus prolong the network lifetime as long as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression on single nodes to CS for signal ensembles, exploiting the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered by real deployments, and the best trade-off between reconstruction quality and power consumption is investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for reconstructing signals from a highly sub-sampled version; the two frameworks are again compared on a real data set, and an analysis of the trade-off between reconstruction quality and lifetime is given.
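The idea behind Compressive Sensing, that a sparse signal can be recovered from far fewer random measurements than samples, can be sketched with a deliberately tiny case: a 1-sparse signal recovered by brute force over its possible supports (real CS solvers use greedy or convex methods instead):

```python
import random

def measure(phi, x):
    """y = Phi x : compressive measurements of signal x."""
    return [sum(p * xi for p, xi in zip(row, x)) for row in phi]

def recover_1sparse(phi, y, n):
    """Brute-force recovery of a 1-sparse signal: for each candidate
    support position, fit the best amplitude by least squares and
    keep the position with the smallest residual."""
    best = None
    for i in range(n):
        col = [row[i] for row in phi]
        denom = sum(c * c for c in col)
        a = sum(c * yi for c, yi in zip(col, y)) / denom
        resid = sum((yi - a * c) ** 2 for c, yi in zip(col, y))
        if best is None or resid < best[0]:
            best = (resid, i, a)
    x = [0.0] * n
    x[best[1]] = best[2]
    return x

rng = random.Random(0)
n, m = 16, 6                       # 16 samples compressed to 6
phi = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(m)]
x = [0.0] * n
x[5] = 3.0                         # 1-sparse test signal
x_hat = recover_1sparse(phi, measure(phi, x), n)
print(max(abs(a - b) for a, b in zip(x, x_hat)) < 1e-9)  # True
```

The node transmits only the 6 measurements instead of 16 samples; DCS and KCS generalize this by exploiting joint sparsity across many such nodes.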
Digital signal processing and digital system design using discrete cosine transform [student course]
Abstract:
The discrete cosine transform (DCT) is an important functional block for image processing applications. The implementation of a DCT has been viewed as a specialized research task. We apply a micro-architecture based methodology to the hardware implementation of an efficient DCT algorithm in a digital design course. Several circuit optimization and design space exploration techniques at the register-transfer and logic levels are introduced in class for generating the final design. The students not only learn how the algorithm can be implemented, but also receive insights about how other signal processing algorithms can be translated into a hardware implementation. Since signal processing has very broad applications, the study and implementation of an extensively used signal processing algorithm in a digital design course significantly enhances the learning experience in both digital signal processing and digital design areas for the students.
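The DCT-II at the heart of the design can be written directly from its definition; below is a naive O(N^2) software reference (the hardware implementation taught in the course uses optimized register-transfer structures, not this form):

```python
import math

def dct2(x):
    """Naive un-normalized DCT-II of a 1-D signal, the transform
    behind JPEG-style image compression blocks:
    X[k] = sum_i x[i] * cos(pi * (2i + 1) * k / (2N))."""
    n = len(x)
    return [sum(xi * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i, xi in enumerate(x))
            for k in range(n)]

# A constant signal concentrates all energy in coefficient k = 0
coeffs = dct2([1.0, 1.0, 1.0, 1.0])
print(abs(coeffs[0] - 4.0) < 1e-9 and
      all(abs(c) < 1e-9 for c in coeffs[1:]))  # True
```

This energy-compaction property is what makes the DCT so useful for image compression, and the regular cosine structure is what the hardware design space exploration in the course exploits.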
Abstract:
The fracture properties of high-strength spray-formed Al alloys were investigated, with consideration of the effects of elemental additions such as zinc, manganese, and chromium and the influence of the addition of SiC particulate. Fracture resistance values between 13.6 and 25.6 MPa·m^(1/2) were obtained for the monolithic alloys in the T6 and T7 conditions, respectively. The alloys with SiC particulate compared well, achieving fracture resistance values between 18.7 and 25.6 MPa·m^(1/2). The spray-formed materials exhibited a loss in fracture resistance (KI) compared to ingot metallurgy 7075 alloys but had improved performance compared to high-solute powder metallurgy alloys of similar composition. Characterization of the fracture surfaces indicated predominantly intergranular decohesion, possibly facilitated by the presence of incoherent particles at the grain boundary regions and by the large strength differential between the matrix and the precipitate zone. It is believed that at the slip band-grain boundary intersection, particularly in the presence of large dispersoids and/or inclusions, microvoid nucleation would be significantly enhanced. Differences in fracture surfaces between the alloys in the T6 and T7 conditions were observed and are attributed to inhomogeneous slip distribution, which results in strain localization at grain boundaries. The best overall combination of fracture resistance properties was obtained for alloys with minimum amounts of chromium and manganese additions.