17 results for Binary images
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The use of iris recognition for human authentication has been spreading in the past years. Daugman has proposed a method for iris recognition composed of four stages: segmentation, normalization, feature extraction, and matching. In this paper we propose some modifications and extensions to Daugman's method to cope with noisy images. These modifications are proposed after a study of images from the CASIA and UBIRIS databases. The major modification is to the computationally demanding segmentation stage, for which we propose a faster and equally accurate template matching approach. The extensions to the algorithm address the important issue of pre-processing, which depends on the image database and is mandatory when a non-infrared camera, such as a typical webcam, is used. For this scenario, we propose methods for reflection removal and pupil enhancement and isolation. The tests, carried out by our C# application on grayscale CASIA and UBIRIS images, show that the template matching segmentation method is more accurate and faster than the previous one for noisy images. The proposed algorithms are found to be efficient and necessary when dealing with non-infrared images and non-uniform illumination.
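To make the template matching idea concrete, here is a minimal, hypothetical sketch (not the paper's C# implementation; the image, radius, and scoring rule are illustrative assumptions): a dark circular disk template is slid over a grayscale image and the position whose pixels under the disk are darkest is taken as the pupil centre.

```python
import numpy as np

def disk_template(radius):
    """Binary disk template: 1 inside the circle, 0 outside."""
    size = 2 * radius + 1
    yy, xx = np.mgrid[:size, :size]
    return ((yy - radius) ** 2 + (xx - radius) ** 2 <= radius ** 2).astype(float)

def match_dark_disk(image, radius):
    """Brute-force template matching: return the centre of the window
    whose pixels under the disk are darkest (pupil candidate)."""
    t = disk_template(radius)
    th, tw = t.shape
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            # Darker pixels under the disk -> higher (less negative) score
            score = -np.sum(image[y:y + th, x:x + tw] * t)
            if score > best_score:
                best_score, best_pos = score, (y + radius, x + radius)
    return best_pos

# Synthetic eye image: bright background with a dark "pupil" at (30, 40)
img = np.full((64, 64), 200.0)
yy, xx = np.mgrid[:64, :64]
img[(yy - 30) ** 2 + (xx - 40) ** 2 <= 8 ** 2] = 10.0

centre = match_dark_disk(img, 8)  # -> (30, 40)
```

A real implementation would search over several radii and use integral images or FFT-based correlation instead of this O(HWk²) loop.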
Abstract:
The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to serve as their own controls, reducing interanimal variability. This makes it possible to perform longitudinal studies on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution, and image quantification issues could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, together with the detection of these quanta and particles in different materials, makes the Monte Carlo method an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation since they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement for a wide range of problems that could not be addressed by experimental or analytical approaches.
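As an illustration of the kind of random process a PET Monte Carlo code samples, the following toy sketch (not GATE or any published simulator; the geometry, source position, and detector count are invented) draws back-to-back photon pairs from a point source inside a detector ring and bins each photon into the ring detector its line of flight intersects.

```python
import math
import random

def simulate_coincidences(n_events, n_detectors=64, ring_radius=1.0,
                          source=(0.2, 0.0), seed=0):
    """Toy Monte Carlo of PET coincidences: each annihilation at `source`
    emits two back-to-back photons at a random angle; each photon is
    binned into the ring detector its line of flight intersects."""
    rng = random.Random(seed)
    sx, sy = source
    pairs = []
    for _ in range(n_events):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        hit_bins = []
        for direction in (theta, theta + math.pi):
            dx, dy = math.cos(direction), math.sin(direction)
            # Intersect the ray (sx, sy) + t*(dx, dy), t > 0, with the ring
            b = sx * dx + sy * dy
            c = sx * sx + sy * sy - ring_radius ** 2
            t = -b + math.sqrt(b * b - c)      # c < 0: source inside ring
            angle = math.atan2(sy + t * dy, sx + t * dx) % (2.0 * math.pi)
            hit_bins.append(int(angle / (2.0 * math.pi) * n_detectors)
                            % n_detectors)
        pairs.append(tuple(hit_bins))
    return pairs

coincidences = simulate_coincidences(1000)
```

A production simulator additionally models positron range, photon attenuation and scatter, detector energy response, and timing, which is exactly why Monte Carlo is the tool of choice.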
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise by simultaneously considering different spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be separately tuned in the space and time dimensions. Tests using synthetic and real data are described and presented to illustrate the application of the algorithm. A comparison with several state-of-the-art algorithms is also presented.
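The idea of a separably tuned anisotropic 3-D filter can be illustrated with a simple stand-in (this is plain separable Gaussian smoothing on synthetic Poisson data, not the paper's Bayesian algorithm): a (time, y, x) stack is smoothed with one Gaussian width along the two spatial axes and a different width along time.

```python
import numpy as np

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def anisotropic_smooth(stack, sigma_space, sigma_time):
    """Separable anisotropic 3-D smoothing of a (time, y, x) stack:
    one Gaussian width for the two spatial axes, another along time."""
    out = stack.astype(float)
    for axis, sigma in ((0, sigma_time), (1, sigma_space), (2, sigma_space)):
        k = gaussian_kernel(sigma)
        out = np.apply_along_axis(
            lambda v: np.convolve(v, k, mode="same"), axis, out)
    return out

# Synthetic FLIP-like sequence: constant intensity corrupted by Poisson noise
rng = np.random.default_rng(0)
noisy = rng.poisson(20.0, size=(16, 32, 32)).astype(float)
smoothed = anisotropic_smooth(noisy, sigma_space=1.5, sigma_time=2.0)
```

Tuning `sigma_time` independently of `sigma_space` is what lets such a filter trade temporal blur (dangerous when intensity decays over time) against spatial blur.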
Abstract:
We investigate the effect of distinct bonding energies on the onset of criticality of low functionality fluid mixtures. We focus on mixtures of particles with two and three patches as this includes the mixture where "empty" fluids were originally reported. In addition to the number of patches, the species differ in the type of patches or bonding sites. For simplicity, we consider that the patches on each species are identical: one species has three patches of type A and the other has two patches of type B. We have found a rich phase behavior with closed miscibility gaps, liquid-liquid demixing, and negative azeotropes. Liquid-liquid demixing was found to pre-empt the "empty" fluid regime of these mixtures when the AB bonds are weaker than the AA or BB bonds. By contrast, mixtures in this class exhibit "empty" fluid behavior when the AB bonds are stronger than at least one of the other two. Mixtures with bonding energies ε(BB) = ε(AB) and ε(AA) < ε(BB) were found to exhibit an unusual negative azeotrope. © 2011 American Institute of Physics. [doi:10.1063/1.3561396]
Abstract:
Master's dissertation in Radiation Applied to Health Technologies - Specialization: Digital X-Ray Imaging.
Abstract:
We investigate the phase behaviour of 2D mixtures of bi-functional and three-functional patchy particles and 3D mixtures of bi-functional and tetra-functional patchy particles by means of Monte Carlo simulations and Wertheim theory. We start by computing the critical points of the pure systems and then we investigate how the critical parameters change upon lowering the temperature. We extend the successive umbrella sampling method to mixtures to make it possible to extract information about the phase behaviour of the system at a fixed temperature for the whole range of densities and compositions of interest. (C) 2013 AIP Publishing LLC.
Abstract:
Fluorescence confocal microscopy images present a low signal-to-noise ratio and a time intensity decay due to the so-called photoblinking and photobleaching effects. These effects, together with the Poisson multiplicative noise that corrupts the images, make long-term biological observation very difficult.
Abstract:
Liver steatosis is mainly a textural abnormality of the hepatic parenchyma due to fat accumulation in the hepatic vesicles. Today, the assessment is performed subjectively by visual inspection. Here, a classifier based on features extracted from ultrasound (US) images is described for the automatic diagnosis of this pathology. The proposed algorithm estimates the original ultrasound radio-frequency (RF) envelope signal, from which the noiseless anatomic information and the textural information encoded in the speckle noise are extracted. The features characterizing the textural information are the coefficients of the first-order autoregressive model that describes the speckle field. A binary Bayesian classifier was implemented and the Bayes factor was calculated. The classification revealed an overall accuracy of 100%. The Bayes factor could be helpful in the graphical display of the quantitative results for diagnostic purposes.
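A minimal sketch of the two ingredients named above, with invented data and class parameters (the paper's actual estimation pipeline and class models are not reproduced): least-squares fitting of first-order autoregressive coefficients of a 2-D field, and a Gaussian Bayes factor between two classes.

```python
import numpy as np

def ar1_coeffs(field):
    """Least-squares estimate of first-order autoregressive coefficients:
    field[i, j] ~ a * field[i-1, j] + b * field[i, j-1]."""
    y = field[1:, 1:].ravel()
    X = np.column_stack([field[:-1, 1:].ravel(),   # neighbour above
                         field[1:, :-1].ravel()])  # neighbour to the left
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def bayes_factor(x, mean0, std0, mean1, std1):
    """Ratio of Gaussian likelihoods of feature vector x under two classes
    (features assumed independent): > 1 favours class 0."""
    def loglik(m, s):
        return np.sum(-0.5 * ((x - m) / s) ** 2 - np.log(s))
    return float(np.exp(loglik(mean0, std0) - loglik(mean1, std1)))

# Synthetic "speckle" field with horizontal correlation only
rng = np.random.default_rng(1)
field = rng.normal(size=(64, 64))
field[:, 1:] += 0.8 * field[:, :-1]   # adds the original previous columns

a, b = ar1_coeffs(field)   # a ~ 0 (no vertical correlation), b clearly > 0
```

The fitted coefficients (a, b) would then serve as the textural feature vector fed to the Bayes factor.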
Abstract:
In visual sensor networks, local feature descriptors can be computed at the sensing nodes, which work collaboratively on the acquired data to perform efficient visual analysis. In fact, with a minimal amount of computational effort, the detection and extraction of local features, such as binary descriptors, can provide a reliable and compact image representation. In this paper, it is proposed to extract and code binary descriptors to meet the energy and bandwidth constraints at each sensing node. The major contribution is a binary descriptor coding technique that exploits correlation using two different coding modes: Intra, which exploits the correlation between the elements that compose a descriptor; and Inter, which exploits the correlation between descriptors of the same image. The experimental results show bitrate savings of up to 35% without any impact on the performance of the image retrieval task. © 2014 EURASIP.
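The Intra/Inter mode decision can be sketched as follows, assuming a toy entropy cost model and invented descriptors (the paper's actual coding technique is not reproduced): Intra codes a descriptor's raw bits, Inter codes the XOR residual against the previous descriptor, and the cheaper mode wins.

```python
import numpy as np

def code_cost(bits):
    """Toy cost model: ideal entropy cost (in bits) of a Bernoulli source
    with the bit string's empirical probability of a set bit."""
    p = bits.mean()
    if p in (0.0, 1.0):
        return 0.0
    return len(bits) * (-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def choose_modes(descriptors):
    """Per-descriptor decision between Intra (code the raw bits) and
    Inter (code the XOR residual against the previous descriptor)."""
    modes, prev = [], None
    for d in descriptors:
        intra = code_cost(d)
        if prev is None:
            modes.append(("intra", intra))
        else:
            inter = code_cost(np.logical_xor(d, prev))
            modes.append(("inter", inter) if inter < intra
                         else ("intra", intra))
        prev = d
    return modes

rng = np.random.default_rng(2)
base = rng.random(256) < 0.5
similar = base.copy()
similar[:8] = ~similar[:8]     # differs from `base` in only 8 of 256 bits
modes = choose_modes([base, similar])
```

When descriptors of the same image are similar, the XOR residual is sparse, its empirical entropy is low, and Inter mode wins; this is the source of the bitrate savings.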
Abstract:
In this paper, a novel ROM-less RNS-to-binary converter is proposed, using a new balanced moduli set {2^(2n)-1, 2^(2n)+1, 2^n-3, 2^n+3} for n even. The proposed converter is implemented with a two-stage ROM-less approach, which computes the value of X based only on arithmetic operations, without using lookup tables. Experimental results for 24 to 120 bits of dynamic range show that the proposed converter structure allows a balanced system with 20% faster arithmetic channels compared with the related state of the art, while requiring similar area resources. This improvement in the channels' performance is enough to offset the higher conversion costs of the proposed converter. Furthermore, up to 20% better Power-Delay-Product efficiency can be achieved for the full RNS architecture using the proposed moduli set. © 2014 IEEE.
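For reference, the generic Chinese Remainder Theorem reconstruction over this moduli set can be sketched in a few lines (this is the textbook conversion, not the paper's two-stage hardware converter; n = 4 and X = 123456 are arbitrary example values):

```python
import math
from functools import reduce

def rns_moduli(n):
    """Moduli set {2^(2n) - 1, 2^(2n) + 1, 2^n - 3, 2^n + 3}, n even."""
    return [2 ** (2 * n) - 1, 2 ** (2 * n) + 1, 2 ** n - 3, 2 ** n + 3]

def to_rns(x, moduli):
    """Forward conversion: the residues of x modulo each channel."""
    return [x % m for m in moduli]

def rns_to_binary(residues, moduli):
    """Chinese Remainder Theorem reconstruction of X from its residues."""
    M = reduce(lambda p, m: p * m, moduli)          # dynamic range
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): inverse of Mi mod m
    return x % M

mods = rns_moduli(4)            # n = 4: [255, 257, 13, 19]
recovered = rns_to_binary(to_rns(123456, mods), mods)
```

The representation is unique over the dynamic range only because the four moduli are pairwise coprime (for n even, neither 2^n - 3 nor 2^n + 3 is divisible by 3); a hardware converter replaces the large modular inverses with fixed shift-and-add structures.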
Abstract:
In this paper an automatic classification algorithm is proposed for the diagnosis of liver steatosis, also known as fatty liver, from ultrasound images. The features used by the classifier, automatically extracted from the ultrasound images, are essentially the ones used by physicians in the visual diagnosis of the disease. The main novelty of the method is the use of the speckle noise that corrupts the ultrasound images to compute textural features of the liver parenchyma relevant for the diagnosis. The algorithm uses the Bayesian framework to compute a noiseless image containing the anatomic and echogenic information of the liver, and a second image containing only the speckle noise, which is used to compute the textural features. The classification results with the Bayes classifier, using manually classified data as ground truth, show that the automatic classifier reaches an accuracy of 95% and a sensitivity of 100%.
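A crude stand-in for the image decomposition described above (a moving-average split, not the paper's Bayesian estimation; the data and window size are invented):

```python
import numpy as np

def despeckle_split(image, win=5):
    """Split an image into a smooth 'anatomic' component (local mean) and
    a residual 'speckle' field from which textural features could be
    computed. A crude moving-average stand-in, not a Bayesian estimator."""
    pad = win // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    smooth = np.zeros(image.shape, dtype=float)
    for dy in range(win):                 # sum the win x win neighbourhood
        for dx in range(win):
            smooth += padded[dy:dy + h, dx:dx + w]
    smooth /= win * win
    return smooth, image - smooth         # image == smooth + residual

rng = np.random.default_rng(3)
img = 100.0 + rng.normal(0.0, 10.0, size=(64, 64))
smooth, speckle = despeckle_split(img)
```

The key property illustrated is the additive split: the `smooth` image carries the echogenic structure while `speckle` carries the texture that feeds the classifier's features.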
Abstract:
In the field of appearance-based robot localization, the mainstream approach uses a quantized representation of local image features. An alternative strategy is the exploitation of raw feature descriptors, thus avoiding approximations due to quantization. In this work, the quantized and non-quantized representations are compared with respect to their discriminativity, in the context of the robot global localization problem. Having demonstrated the advantages of the non-quantized representation, the paper proposes mechanisms to reduce the computational burden this approach would carry when applied in its simplest form. This reduction is achieved through a hierarchical strategy which gradually discards candidate locations, and by exploring two simplifying assumptions about the training data. The potential of the non-quantized representation is exploited by resorting to the entropy-discriminativity relation: the non-quantized representation facilitates the assessment of the distinctiveness of features through the entropy measure. Building on this finding, the robustness of the localization system is enhanced by modulating the importance of features according to their entropy. Experimental results support the effectiveness of this approach, as well as the validity of the proposed computation reduction methods.
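The entropy-based modulation of feature importance can be sketched as follows, with invented match distributions (the exact weighting used in the paper is not reproduced): a feature whose match probability over the candidate locations is peaked (low entropy) is distinctive and receives a high weight, while an ambiguous feature receives a low weight.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def feature_weights(match_probs):
    """Weight each feature by how peaked its match distribution over the
    candidate locations is: low entropy (distinctive) -> weight near 1,
    uniform (ambiguous) -> weight 0."""
    h_max = np.log2(match_probs.shape[1])     # entropy of the uniform case
    return np.array([1.0 - entropy_bits(row) / h_max for row in match_probs])

# Two hypothetical features over 4 candidate locations: the first matches
# one location strongly, the second is completely ambiguous
probs = np.array([[0.97, 0.01, 0.01, 0.01],
                  [0.25, 0.25, 0.25, 0.25]])
w = feature_weights(probs)
```

Down-weighting high-entropy features in this way keeps repetitive scene elements (doors, windows) from dominating the location estimate.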
Abstract:
We investigate the thermodynamics and percolation regimes of model binary mixtures of patchy colloidal particles. The particles of each species have three sites of two types, one of which promotes bonding of particles of the same species while the other promotes bonding of different species. We find up to four percolated structures at low temperatures and densities: two gels where only one species percolates, a mixed gel where particles of both species percolate but neither species percolates separately, and a bicontinuous gel where particles of both species percolate separately forming two interconnected networks. The competition between the entropy and the energy of bonding drives the stability of the different percolating structures. Appropriate mixtures exhibit one or more connectivity transitions between the mixed and bicontinuous gels, as the temperature and/or the composition changes.
Abstract:
Self-compacting concrete (SCC) can soon be expected to replace conventional concrete due to its many advantages. Its main characteristics in the fresh state are achieved essentially by a higher volume of mortar (more ultrafine material) and a decrease in coarse aggregates. The use of over-large volumes of additions such as fly ash (FA) and/or limestone filler (LF) can substantially affect the concrete's pore structure and consequently its durability. In this context, an experimental programme was conducted to evaluate the effect on the concrete's porosity and microstructure of incorporating FA and LF in binary and ternary mixes of SCC. For this, a total of 11 SCC mixes were produced: 1 with cement only (C); 3 with C + FA at 30%, 60% and 70% substitution (f_ad); 3 with C + LF at 30%, 60% and 70% f_ad; and 4 with C + FA + LF in combinations of 10-20%, 20-10%, 20-40% and 40-20% f_ad, respectively. The results enabled conclusions to be established regarding the SCC's durability, based on its permeability and the microstructure of its pore structure. The properties studied are strongly affected by the type and quantity of additions. The use of ternary mixes also proves to be extremely favourable, confirming the beneficial effect of the synergy between these additions. © 2015 Elsevier Ltd. All rights reserved.