911 results for Machine vision and image processing
Abstract:
Multispectral images are becoming more common in remote sensing, computer vision, and industrial applications. Owing to the high accuracy of multispectral information, it can serve as an important quality factor in the inspection of industrial products. Recently, the development of multispectral imaging systems and the computational analysis of multispectral images have attracted growing interest. In this thesis, three areas of multispectral image analysis are considered. First, a method for analyzing multispectral textured images was developed. The method is based on a spectral co-occurrence matrix, which contains information about the joint distribution of spectral classes in the spectral domain. Next, a procedure for estimating the illumination spectrum of color images was developed. The proposed method can be used, for example, in color constancy, color correction, and content-based search in color image databases. Finally, color filters for optical pattern recognition were designed, and a prototype of a spectral vision system was constructed. The spectral vision system can be used to acquire a low-dimensional component image set for two-dimensional spectral image reconstruction. The data obtained by the spectral vision system are small and therefore convenient for storing and transmitting a spectral image.
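The spectral co-occurrence matrix described above counts the joint occurrence of spectral-class pairs at a fixed spatial displacement. A minimal sketch of that idea (the function name and the non-negative offset convention are illustrative assumptions, not the thesis's implementation):

```python
import numpy as np

def spectral_cooccurrence(classes, n_classes, offset=(0, 1)):
    """Joint distribution of spectral-class pairs at a fixed spatial offset.

    classes: 2-D array of spectral class indices in [0, n_classes);
    offset: (dy, dx) displacement, assumed non-negative for simplicity.
    Returns an (n_classes, n_classes) matrix normalised to sum to 1.
    """
    dy, dx = offset
    a = classes[:classes.shape[0] - dy, :classes.shape[1] - dx]  # first pixel
    b = classes[dy:, dx:]                                        # its neighbour
    m = np.zeros((n_classes, n_classes))
    np.add.at(m, (a.ravel(), b.ravel()), 1)                      # count pairs
    return m / m.sum()
```

Texture statistics (e.g. contrast or homogeneity) can then be derived from this matrix, as with a conventional gray-level co-occurrence matrix.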
Abstract:
In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
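The split-and-merge criteria above rest on the mutual information of a region/intensity-bin channel. A minimal sketch of computing that quantity from a joint count matrix (the function name and matrix layout are ours, not the paper's):

```python
import numpy as np

def mutual_information(joint):
    """I(X; Y) in bits from a joint count matrix (regions x histogram bins)."""
    p = joint / joint.sum()                     # joint probabilities
    px = p.sum(axis=1, keepdims=True)           # region marginal
    py = p.sum(axis=0, keepdims=True)           # intensity-bin marginal
    nz = p > 0                                  # avoid log(0) terms
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())
```

A split is accepted when it increases this value (information gain); a merge is chosen to minimise the resulting loss.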
Abstract:
Blood flow in the human aorta is an unsteady and complex phenomenon. The complex flow patterns are related to geometrical features such as curvature, bends, and branching, and to the pulsatile nature of the flow from the left ventricle of the heart. The aim of this work was to understand the effect of aorta geometry on the flow dynamics. To achieve this, 3D realistic and idealized models of the descending aorta were reconstructed from Computed Tomography (CT) images of a female patient. The geometries were reconstructed using a medical image processing code. The blood flow in the aorta was assumed to be laminar and incompressible, and the blood was assumed to be a Newtonian fluid. A time-dependent pulsatile and parabolic boundary condition was applied at the inlet. Steady and unsteady blood flow simulations were performed in the real and idealized geometries of the descending aorta using a Finite Volume Method (FVM) code. Analyses of the Wall Shear Stress (WSS) distribution, pressure distribution, and axial velocity profiles were carried out in both geometries under steady and unsteady conditions. The results obtained in this thesis work reveal that the idealization of the geometry underestimates the WSS values, especially near regions with a sudden change of diameter. However, the resultant pressure and velocity in the idealized geometry are close to those in the real geometry.
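A pulsatile, parabolic inlet condition of the kind described can be sketched as a Poiseuille profile scaled by a time-varying mean velocity. The sinusoidal waveform and all parameter values below are illustrative placeholders; the thesis uses a physiological flow pulse from the left ventricle:

```python
import math

def inlet_velocity(r, t, radius=0.01, u_mean=0.2, period=0.8):
    """Axial inlet velocity [m/s] at radial position r [m] and time t [s].

    Parabolic (Poiseuille) profile, scaled by an illustrative sinusoidal
    pulse around the mean velocity u_mean; zero velocity at the wall.
    """
    u_t = u_mean * (1.0 + math.sin(2.0 * math.pi * t / period))  # pulse
    return 2.0 * u_t * (1.0 - (r / radius) ** 2)                 # parabola
```

The factor 2 makes the centerline velocity twice the instantaneous cross-sectional mean, as required for a parabolic profile.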
Abstract:
Diabetes is a rapidly increasing worldwide problem which is characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) and automatic or semi-automatic image analysis algorithms provide a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For training and ground truth estimation, the algorithm combines the manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is a benchmarking framework for eye fundus image analysis algorithms, needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations.
During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases, and the final algorithm are made public on the web to set baseline results for the automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented. Optic disc localisation is discussed because normal eye fundus structures are fundamental in the characterisation of DR.
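The one-class idea above — modelling only the lesion class with a Gaussian mixture density and thresholding it — can be sketched as follows. The mixture parameters would come from expert-annotated lesion pixels; the function names and the plain-numpy density evaluation are our illustration, not the thesis's code:

```python
import numpy as np

def gmm_density(x, weights, means, covs):
    """Probability density of colour samples under a Gaussian mixture.

    x: (n, d) colour samples; weights/means/covs: mixture parameters,
    assumed already estimated from annotated lesion pixels.
    """
    n, d = x.shape
    dens = np.zeros(n)
    for w, mu, cov in zip(weights, means, covs):
        diff = x - mu
        inv = np.linalg.inv(cov)
        norm = 1.0 / np.sqrt(((2.0 * np.pi) ** d) * np.linalg.det(cov))
        # quadratic form diff^T inv diff for every sample at once
        dens += w * norm * np.exp(-0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff))
    return dens

def is_lesion(x, weights, means, covs, threshold):
    """One-class decision: lesion wherever the density exceeds a threshold."""
    return gmm_density(x, weights, means, covs) >= threshold
```

Sweeping the threshold traces out the receiver operating characteristic curve used in the evaluation protocol.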
Abstract:
With the increasing use of digital media, methods for multimedia protection have become extremely important. The number of solutions to the problem, from encryption to watermarking, is large and grows every year. In this work digital image watermarking is considered, specifically a novel method for the digital watermarking of color and spectral images. An overview of existing methods for watermarking color and grayscale images is given in the paper. Methods using independent component analysis (ICA) for detection and those using the discrete wavelet transform (DWT) and discrete cosine transform (DCT) are considered in more detail. The novel watermarking method proposed in this paper allows the embedding of a color or spectral watermark image into a color or spectral image, respectively, and the successful extraction of the watermark from the resulting watermarked image. A number of experiments were performed on the quality of extraction depending on the parameters of the embedding procedure. Another set of experiments tested the robustness of the proposed algorithm. Three techniques were chosen for that purpose: the median filter, the low-pass filter (LPF), and the discrete cosine transform (DCT), which are part of the widely known StirMark Image Watermarking Robustness Test. The study shows that the proposed watermarking technique is fragile, i.e. the watermark is altered by simple image processing operations. Moreover, we found that the contents of the image to be watermarked do not affect the quality of the extraction. The mixing coefficients, which determine the amount of the key and watermark image in the result, should not exceed 1% of the original. The proposed algorithm has proven successful in the task of watermark embedding and extraction.
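The role of the mixing coefficient can be illustrated with a deliberately simplified additive embedding/extraction pair. This is a sketch of the mixing idea only, not the paper's ICA-based scheme; the function names and the non-blind extraction (requiring the original host) are our assumptions:

```python
import numpy as np

def embed(host, watermark, key, alpha=0.01):
    """Additively mix watermark and key images into the host image.

    alpha: mixing coefficient; the study found that it should stay
    around 1% of the original for successful extraction.
    """
    return host + alpha * (watermark + key)

def extract(marked, host, key, alpha=0.01):
    """Invert the embedding, given the original host and key images."""
    return (marked - host) / alpha - key
```

Because the mark is a small additive perturbation, simple operations such as median or low-pass filtering destroy it, which is the fragility the experiments report.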
Abstract:
This research proposes a methodology for assessing broiler breeder response to changes in the rearing thermal environment. Continuous video recording of the analyzed flock may offer compelling evidence of thermal comfort, as well as other indications of welfare. An algorithm for classifying specific broiler breeder behavior was developed. Videos were recorded over three boxes in which 30 breeders were reared. The boxes were mounted inside an environmental chamber where the ambient temperature varied from cold to hot. Digital images were processed based on the number of pixels, according to their light-intensity variation and binary contrast, allowing a sequence of welfare-related behaviors to be identified. The system used x, y coordinates, where x represents the horizontal distance from the top left of the work area to the point P, and y the vertical distance. The video images were observed, and a grid was developed for identifying the area where the birds stayed and the time they spent at that place. The sequence was analyzed frame by frame, comparing the data with the adopted thermoneutral rearing standards. The grid mask overlapped the real bird image. The resulting image allows the visualization of clusters, as birds in a flock behave in certain patterns. An algorithm indicating the breeder response to the thermal environment was developed.
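The pixel-counting and grid-mask steps above can be sketched as binarising a frame and counting bird pixels per grid cell. The threshold, grid size, and the assumption that birds appear brighter than the background are illustrative, not the study's calibrated values:

```python
import numpy as np

def grid_occupancy(gray, threshold, grid=(2, 2)):
    """Binarise a grayscale frame and count bird pixels in each grid cell.

    gray: 2-D array of pixel intensities; threshold: intensity above
    which a pixel is classed as 'bird'; grid: (rows, cols) of the mask.
    """
    binary = gray > threshold                  # True where a bird is assumed
    h, w = gray.shape
    gy, gx = grid
    counts = np.zeros(grid, dtype=int)
    for i in range(gy):
        for j in range(gx):
            cell = binary[i * h // gy:(i + 1) * h // gy,
                          j * w // gx:(j + 1) * w // gx]
            counts[i, j] = int(cell.sum())     # occupancy of this cell
    return counts
```

Tracking these per-cell counts frame by frame gives the place/time occupancy data that is compared against the thermoneutral rearing standards.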
Abstract:
Pathological gambling, a form of behavioral addiction, refers to maladaptive, compulsive gambling behavior that severely interferes with an individual's normal life. The prevalence of pathological gambling has been estimated at 1–2% in western societies. The reward deficiency hypothesis of addiction assumes that individuals who have, or are prone to, addictions have blunted mesolimbic dopamine reward signaling, which leads to compulsive reward seeking in an attempt to compensate for the malfunctioning brain reward network. In this research project, the effects of gambling were measured using brain [11C]raclopride PET during slot machine gambling, and possible brain structural changes associated with pathological gambling were investigated using MRI. The subjects included pathological gamblers and healthy volunteers. In addition, impulse control disorders associated with Parkinson's disease were investigated using brain [18F]fluorodopa PET and an epidemiological survey. The results demonstrate mesolimbic dopamine release during gambling in both pathological gamblers and healthy volunteers. Striatal dopamine was released irrespective of the gambling outcome, whether the subjects won or not. There was no difference in gambling-induced dopamine release between pathological gamblers and control subjects, although the magnitude of the dopamine release correlated with gambling-related symptom severity in pathological gamblers. The results also show that pathological gambling is associated with extensive abnormality of brain white matter integrity, as measured with diffusion tensor imaging, similar to substance addictions. In Parkinson's disease patients with impulse control disorders, enhanced brain [18F]fluorodopa uptake in the medial orbitofrontal cortex was observed, indicating increased presynaptic monoamine function in this region, which is known to influence signaling in the mesolimbic system and reward processing.
Finally, a large epidemiological survey of Finnish Parkinson's disease patients showed that compulsive behaviors are very common in Parkinson's disease and are strongly associated with depression. These findings demonstrate the role of dopamine in pathological gambling, without supporting the concept of a reward deficiency syndrome.
Abstract:
Knowledge of the behaviour of cellulose, hemicelluloses, and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of the monosaccharide composition gives information about the structural polysaccharide composition of wood material and helps in determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information about the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods developed for the analysis of monosaccharides and for the determination of aliphatic carboxylic acids in alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE), the fused silica capillary is filled with an electrolyte solution. An applied voltage generates a field across the capillary. The movement of the ions under the electric field is based on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only under strongly alkaline conditions. After ionisation, the structures are suitable for electrophoretic analysis and identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method relying on an in-capillary reaction and direct UV detection at a wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged and UV-absorbing compounds. The optimised method was applied to real samples. The methodology is fast, since no sample preparation other than dilution is required. A new method for aliphatic carboxylic acids in highly alkaline process liquids was developed.
The goal was to develop a method for the simultaneous analysis of the dicarboxylic acids, hydroxy acids, and volatile acids that are oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the level of carboxylic acids in the process solutions. In the second application, the degradation of spruce wood by alkaline and catalysed alkaline oxidation was compared by determining the carboxylic acids in the process solutions. In addition, the effectiveness of membrane filtration and preparative liquid chromatography in the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.
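The separation principle mentioned above — migration governed by charge and hydrodynamic radius — is commonly approximated by the Stokes-model electrophoretic mobility. A minimal sketch of that textbook relation (an illustration of the general principle, not a calculation from this work):

```python
import math

def electrophoretic_mobility(charge, radius, viscosity):
    """Stokes-model mobility mu = q / (6 * pi * eta * r), SI units.

    charge: net ion charge [C]; radius: hydrodynamic radius [m];
    viscosity: electrolyte viscosity [Pa s]. Returns mu in m^2/(V s).
    """
    return charge / (6.0 * math.pi * viscosity * radius)
```

The relation shows why CZE separates: more highly charged ions migrate faster, while larger ions (greater hydrodynamic radius) migrate more slowly.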
Abstract:
Robots working in complex and changing environments need the ability to manipulate and grasp objects. This work reviews previous research and the current state of robotic grasping and of machine learning of robotic grasp points. State-of-the-art methods are surveyed, and Le's machine-learning-based classifier is implemented, because it offers the best success rate among the studied methods and can be adapted to the available robot. The implemented method uses features based on the intensity image and the depth image to classify potential grasp points. The results of this implementation are presented.
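The feature-based classification step can be sketched as extracting simple statistics from the intensity and depth patches around a candidate grasp point and scoring them with a linear classifier. The features and the linear form are deliberately simplified illustrations; Le's classifier uses a richer learned feature set:

```python
import numpy as np

def grasp_features(intensity_patch, depth_patch):
    """Concatenate simple statistics of the intensity and depth patches
    around a candidate grasp point (illustrative features only)."""
    return np.array([intensity_patch.mean(), intensity_patch.std(),
                     depth_patch.mean(), depth_patch.std()])

def grasp_score(features, weights, bias=0.0):
    """Linear classifier score for a candidate grasp point; higher
    scores indicate a more promising grasp."""
    return float(features @ weights + bias)
```

At run time, candidate points across the image would be scored and the highest-scoring one sent to the robot.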
Abstract:
This study examines information security as a process (information securing) in terms of what it does, especially beyond its obvious role of protector. It investigates concepts related to an 'ontology of becoming', and examines what it is that information securing produces. The research is theory driven and draws upon three fields: sociology (especially actor-network theory), philosophy (especially Gilles Deleuze and Félix Guattari's concepts of 'machine', 'territory', and 'becoming', and Michel Serres's concept of 'parasite'), and information systems science (the subject of information security). Social engineering (used here in the sense of breaking into systems through non-technical means) and software cracker groups (groups which remove copy protection systems from software) are analysed as examples of breaches of information security. Firstly, the study finds that information securing is always interruptive: every entity (regardless of whether or not it is malicious) that becomes connected to information security is interrupted. Furthermore, every entity changes, becomes different, as it makes a connection with information security (ontology of becoming). Moreover, information security organizes entities into different territories. However, the territories – the insides and outsides of information systems – are ontologically similar; the only difference is in the order of the territories, not in the ontological status of the entities that inhabit them. In other words, malicious software is ontologically similar to benign software; both are users in terms of a system. The difference is based on the order of the system and its users: who uses the system and what the system is used for. Secondly, the research shows that information security is always external (in the terms of this study, it is a 'parasite') to the information system that it protects.
Information securing creates and maintains order while simultaneously disrupting the existing order of the system that it protects. For example, in terms of software itself, the implementation of a copy protection system is an entirely external addition. In fact, this parasitic addition makes software different. Thus, information security disrupts that which it is supposed to defend from disruption. Finally, it is asserted that, in its interruption, information security is a connector that creates passages; it connects users to systems while also creating its own threats. For example, copy protection systems invite crackers and information security policies entice social engineers to use and exploit information security techniques in a novel manner.
Abstract:
This thesis is concerned with state and parameter estimation in state space models. The estimation of states and parameters is an important task when mathematical modeling is applied to many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to the computation of the posterior probability density function. Except for a very restricted class of models, it is impossible to compute this density function in closed form. Hence, we need approximation methods. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on the available measurements. Among these filters, particle filters are numerical methods for approximating the filtering distributions of non-linear non-Gaussian state space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution. For instance, an inappropriate choice of importance distribution can lead to the failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ particle filter convergence with general importance distributions, where p ≥ 2 is an integer. A parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, the estimation of parameters can be done by Markov chain Monte Carlo (MCMC) methods. In its operation, the MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state space model can be computed by filtering-based methods, where the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use the hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends highly on the chosen proposal distribution. A commonly used proposal distribution is the Gaussian, in which the covariance matrix must be well tuned. To tune it, adaptive MCMC methods can be used. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
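The Kalman filter at the core of the filtering-based methods above alternates a prediction step under the dynamic model with an update step on each new measurement. A minimal sketch of one step (standard equations; the interface is ours):

```python
import numpy as np

def kalman_step(m, P, y, A, Q, H, R):
    """One predict/update step of the linear-Gaussian Kalman filter.

    m, P: previous posterior mean and covariance; y: new measurement;
    A, Q: dynamic model matrix and process noise covariance;
    H, R: measurement model matrix and measurement noise covariance.
    """
    m_pred = A @ m                        # predict mean
    P_pred = A @ P @ A.T + Q              # predict covariance
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred) # update mean with residual
    P_new = P_pred - K @ S @ K.T          # update covariance
    return m_new, P_new
```

Running these steps over a measurement sequence also yields the marginal likelihood terms needed to evaluate the parameter posterior with the states integrated out.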
Abstract:
A direct-driven permanent magnet synchronous machine for a small urban-use electric vehicle is presented. The measured performance of the machine on the test bench, as well as its performance over the modified New European Drive Cycle, is given. The effect of optimal current components, maximizing the efficiency while taking the iron loss into account, is compared with the simple id = 0 control. The machine currents and losses during the drive cycle are calculated and compared with each other.
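The comparison between optimal current components and id = 0 control rests on the standard PMSM torque equation, where the reluctance term vanishes when id = 0. A minimal sketch of that relation (the parameter values in the comment are illustrative):

```python
def pmsm_torque(psi_m, L_d, L_q, i_d, i_q, p):
    """Electromagnetic torque of a PMSM in the dq frame.

    T = 1.5 * p * (psi_m * i_q + (L_d - L_q) * i_d * i_q), where psi_m is
    the permanent magnet flux linkage, L_d/L_q the dq inductances,
    i_d/i_q the current components, and p the number of pole pairs.
    With the simple i_d = 0 control, the reluctance term vanishes.
    """
    return 1.5 * p * (psi_m * i_q + (L_d - L_q) * i_d * i_q)
```

For a machine with L_d < L_q, injecting a suitable negative i_d adds reluctance torque, which is one reason the optimal current components can beat id = 0 control in efficiency.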
Abstract:
The objective of the present study was to determine contrast sensitivity curves for concentric circular patterns with radial frequencies of 0.25, 0.5, 1.0, 2.0, and 4.0 cycles per degree in young and older adult volunteers. These measurements were also compared with contrast sensitivities for sine-wave gratings. All participants had normal visual acuity and were free of identifiable ocular illness. Contrast sensitivity was measured in 6 young adults aged 19 to 23 years and 6 older adults aged 60 to 69 years using the psychophysical forced-choice method. In this paradigm the volunteers had to decide which of two stimuli contained the above radial frequencies at low contrast levels; the other, neutral stimulus was gray with homogeneous luminance. We detected a decline in contrast sensitivity for older adults at all radial frequencies compared to young adults. Also, contrast sensitivity for sine-wave gratings at all measured frequencies was better, as predicted, for young adults. Maximum sensitivities in the radial frequency contrast sensitivity function and in the contrast sensitivity function occurred at 0.25 and 0.5 cycles per degree, respectively, for both young and older adults. These results suggest age-related changes in the contrast sensitivity function for concentric symmetrical stimuli.
Abstract:
To study the effect of age on the metrics of upper and lower eyelid saccades, the eyelid movements of two groups of 30 subjects each were measured using computed image analysis. The subjects were divided on the basis of age into a younger group (20-30 years) and an older group (60-91 years). Eyelid saccade functions were fitted by the damped harmonic oscillator model. Amplitude and peak velocity were used to compare the effect of age on the saccades of the upper and lower eyelids. There was no statistically significant difference in saccade amplitude between groups for the upper eyelid (mean ± SEM; upward, young = 9.18 ± 0.32 mm, older = 8.93 ± 0.31 mm, t = 0.56, P = 0.58; downward, young = 9.11 ± 0.27 mm, older = 8.86 ± 0.32 mm, t = 0.58, P = 0.56). However, there was a clear decline in the peak velocity of the upper eyelid saccades of older subjects (upward, young = 59.06 ± 2.34 mm/s, older = 50.12 ± 1.95 mm/s, t = 2.93, P = 0.005; downward, young = 71.78 ± 1.78 mm/s, older = 60.29 ± 2.62 mm/s, t = 3.63, P = 0.0006). In contrast, for the lower eyelid there was a clear increase in saccade amplitude in the elderly group (upward, young = 2.27 ± 0.09 mm, older = 2.98 ± 0.15 mm, t = 4.33, P < 0.0001; downward, young = 2.21 ± 0.10 mm, older = 2.96 ± 0.17 mm, t = 3.85, P < 0.001). These data suggest that the aging process affects the metrics of the lid saccades differently for each eyelid. In the upper eyelid, the lower tension exerted by a weakened aponeurosis is reflected only in the peak velocity of the saccades. In the lower eyelid, age is accompanied by an increase in saccade amplitude, which indicates that the force transmission to the lid is not affected in the elderly.
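Fitting saccades with a damped harmonic oscillator means modelling lid position as the step response of an underdamped second-order system, from which amplitude and peak velocity can be read off. A minimal sketch of that model (standard step-response formula; the parameter names are ours, not the study's fitted values):

```python
import math

def saccade_position(t, amplitude, omega_n, zeta):
    """Lid position at time t under the damped harmonic oscillator model.

    Step response of an underdamped (zeta < 1) second-order system:
    starts at 0 and settles at `amplitude`; omega_n is the natural
    frequency and zeta the damping ratio.
    """
    omega_d = omega_n * math.sqrt(1.0 - zeta ** 2)     # damped frequency
    phi = math.atan2(zeta, math.sqrt(1.0 - zeta ** 2)) # phase offset
    return amplitude * (1.0 - math.exp(-zeta * omega_n * t)
                        * math.cos(omega_d * t - phi)
                        / math.sqrt(1.0 - zeta ** 2))
```

Differentiating the fitted curve numerically gives the velocity profile whose maximum is the peak velocity compared between age groups.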
Abstract:
Strawberries were frozen after pre-treatments with a hydrocolloid and calcium salts (pectin and calcium chloride) at different concentrations, in an attempt to correlate the effects of these substances and their processing with the physical and microstructural characteristics of the fruit after thawing. Strawberry halves were impregnated under controlled vacuum pressures of 84.4, 50.5, and 16.6 kPa, with pectin at concentrations of 0, 1.5, and 3%, calcium chloride at concentrations of 0, 3, and 6%, and glucose at 20%, for 4 hours. Measurements were made of the total soluble solids content, cellular fluid loss, texture, and viscosity of the solution, before and after freezing/thawing. Images of tissue cuts during freezing, as a function of time, were taken with an optical microscope coupled to a cold stage and a controlled temperature system, and the reduction of the cellular area was quantified using image analysis software. The pectin concentration had an influence on, and demonstrated a potential for, the protection of the frozen tissue samples. The photomicrographs showed that the loss of cellular fluid occurs during the growth of ice formed in the intercellular spaces and that it is retarded by treatments with high pectin concentrations.
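The cellular-area quantification step above amounts to segmenting the tissue in successive micrographs and comparing the segmented areas over time. A minimal sketch, assuming boolean tissue masks have already been obtained (e.g. by thresholding; the segmentation details are not specified in the abstract):

```python
import numpy as np

def area_reduction(mask_before, mask_after):
    """Percentage reduction of segmented cellular area between two frames.

    mask_before, mask_after: boolean arrays (True = cell tissue),
    e.g. produced by thresholding the photomicrographs.
    """
    a0 = mask_before.sum()          # cellular area in the earlier frame
    a1 = mask_after.sum()           # cellular area in the later frame
    return 100.0 * (a0 - a1) / a0
```

Plotting this percentage against freezing time for each pectin concentration would reproduce the kind of comparison the study reports.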