865 results for Associative Classifiers
Abstract:
This work discusses the application of ensemble techniques to the development of multimodal recognition systems based on revocable (cancelable) biometrics. Biometric systems are the future of identification and user access control, as evidenced by their constant growth in today's society. However, much progress is still needed, mainly with regard to the accuracy, security and processing time of such systems. In the search for more efficient techniques, multimodal systems and the use of revocable biometrics are promising, and can address many of the problems involved in traditional biometric recognition. A multimodal system is characterized by combining different biometric security techniques and overcomes many limitations, such as failures in the extraction or processing of the dataset. Among the various possibilities for developing a multimodal system, the use of ensembles is particularly promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. With regard to security, one of the biggest problems is that a biometric trait is permanently linked to the user and cannot be changed if compromised. This problem has been addressed by techniques known as revocable biometrics, which consist of applying a transformation to the biometric data in order to protect the original characteristics, making cancellation and replacement possible. In order to contribute to this important subject, this work compares the performance of individual classifiers, as well as classifier ensembles, on the original data and on the biometric space transformed by different functions. Another highlight is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One of the motivations of this work is to evaluate the gain that ensembles optimized by different GAs can bring to data in the transformed space. Another relevant contribution is the generation of even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract similar discriminative information through different transformation functions. All of this makes clear the importance of revocable biometrics, ensembles and GAs in the development of more efficient biometric systems, something that is increasingly important today.
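The abstract describes GA-optimized classifier ensembles without implementation details; the following is a minimal Python sketch, under my own assumptions, of one common way to realize the idea: a simple genetic algorithm evolving the soft-vote weights of three base classifiers. The toy dataset, base models and fitness definition are illustrative and not the author's actual method.

```python
# Minimal sketch (assumption): evolving ensemble weights with a genetic algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Train base classifiers and collect their class-probability outputs on the validation set.
models = [LogisticRegression(max_iter=1000), DecisionTreeClassifier(max_depth=5), GaussianNB()]
probas = [m.fit(X_tr, y_tr).predict_proba(X_val) for m in models]

def fitness(w):
    """Validation accuracy of the weighted soft-vote ensemble."""
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)
    combined = sum(wi * p for wi, p in zip(w, probas))
    return (combined.argmax(axis=1) == y_val).mean()

# Very small GA: tournament selection, arithmetic crossover, Gaussian mutation.
pop = rng.random((30, len(models)))
for _ in range(50):
    scores = np.array([fitness(ind) for ind in pop])
    winners = [max(rng.choice(len(pop), 3), key=lambda i: scores[i]) for _ in range(len(pop))]
    parents = pop[winners]
    alpha = rng.random((len(pop), 1))
    children = alpha * parents + (1 - alpha) * parents[rng.permutation(len(pop))]
    pop = children + rng.normal(0, 0.1, children.shape)   # mutation

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best weights:", np.abs(best) / np.abs(best).sum(), "val acc:", fitness(best))
```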
Abstract:
The objective is to establish a methodology for monitoring oil spills on the sea surface in the Submerged Exploration Area of the Polo Region of Guamaré, in the State of Rio Grande do Norte, using orbital Synthetic Aperture Radar (SAR) images integrated with meteoceanographic products. The methodology was applied in the following stages: (1) creation of a base map of the Exploration Area; (2) processing of NOAA/AVHRR and ERS-2 images to generate meteoceanographic products; (3) processing of RADARSAT-1 images for oil spill monitoring; (4) integration of the RADARSAT-1 images with the NOAA/AVHRR and ERS-2 image products; and (5) structuring of a database. The integration of the RADARSAT-1 image of the Potiguar Basin of 21 May 1999 with the base map of the Exploration Area of the Polo Region of Guamaré, for the identification of the probable sources of the oil slicks, was used successfully in the detection of the probable oil slick found next to the outlet of the submarine outfall in the Exploration Area of the Polo Region of Guamaré. To support the integration of RADARSAT-1 images with NOAA/AVHRR and ERS-2 image products, a methodology was developed for the classification of the oil spills identified in the RADARSAT-1 images. For this, the following unsupervised classification algorithms were tested: K-means, Fuzzy k-means and Isodata. These algorithms are part of the PCI Geomatics software, which was also used for filtering the RADARSAT-1 images. To validate the results, the oil spills submitted to unsupervised classification were compared with the results of the Semivariogram Textural Classifier (STC). This classifier was developed especially for oil spill classification purposes and requires the PCI software for the whole processing chain of RADARSAT-1 images. Finally, the classification results were analyzed through visual analysis, calculation of magnitude proportionality, and statistical analysis. Among the three classification algorithms tested, no significant differences were observed in relation to the spills classified with the STC in any of the analyses considered. Therefore, considering all the procedures, it has been shown that the described methodology can be successfully applied using the unsupervised classifiers tested, reducing the time needed for the identification and classification of oil spills compared with the use of the STC classifier.
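The abstract names K-means, Fuzzy k-means and Isodata as implemented in PCI Geomatics; purely as a rough illustration of the unsupervised step, the sketch below clusters the pixels of a despeckled SAR backscatter image into dark (slick-like) and bright (sea) classes with scikit-learn's KMeans. The file name, number of clusters and smoothing choice are assumptions, not the study's actual PCI workflow.

```python
# Minimal sketch (assumption): K-means clustering of SAR pixel intensities
# to separate dark, slick-like regions from the surrounding sea clutter.
import numpy as np
from sklearn.cluster import KMeans
from scipy.ndimage import median_filter

sar = np.load("radarsat_scene.npy")          # hypothetical despeckled backscatter image (2-D array)
smoothed = median_filter(sar, size=5)        # crude speckle reduction before clustering

pixels = smoothed.reshape(-1, 1).astype(float)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
label_img = labels.reshape(sar.shape)

# Oil slicks appear as low-backscatter (dark) areas, so keep the cluster with the lower mean.
dark_cluster = int(np.argmin([smoothed[label_img == k].mean() for k in (0, 1)]))
slick_mask = label_img == dark_cluster
print("candidate slick pixels:", int(slick_mask.sum()))
```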
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Computer systems are used to support breast cancer diagnosis, with decisions taken from measurements carried out in regions of interest (ROIs). We show that decision support obtained from square or rectangular ROIs can include background regions whose behavior differs from that of healthy or diseased tissues. In this study, the background regions were identified as Partial Pixels (PP), obtained with a multilevel segmentation method based on maximum entropy. The behaviors of healthy, diseased and partial tissues were quantified by fractal dimension and multiscale lacunarity, calculated through texture signatures. The separability of the groups was assessed using a polynomial classifier. Polynomials have powerful approximation properties as classifiers for treating both linearly separable and non-separable characteristics. The proposed method allowed the investigated ROIs to be quantified and demonstrated that distinct behaviors are obtained, with distinctions of 90% for images acquired in the cranio-caudal (CC) and mediolateral oblique (MLO) views.
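The abstract quantifies tissue texture by fractal dimension without showing the computation; a minimal box-counting sketch is given below as one standard way to estimate the fractal dimension of a binarized ROI. The thresholded placeholder input and function name are illustrative assumptions, not the signatures-of-texture pipeline used in the study.

```python
# Minimal sketch (assumption): box-counting estimate of fractal dimension for a binary ROI.
import numpy as np

def box_counting_dimension(mask: np.ndarray) -> float:
    """Estimate the fractal dimension of a 2-D boolean mask by box counting."""
    size = 2 ** int(np.floor(np.log2(min(mask.shape))))
    mask = mask[:size, :size]                       # crop to a power-of-two square
    box_sizes = 2 ** np.arange(1, int(np.log2(size)))
    counts = []
    for b in box_sizes:
        # Count boxes of side b that contain at least one foreground pixel.
        view = mask.reshape(size // b, b, size // b, b)
        counts.append(np.count_nonzero(view.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / box_sizes), np.log(counts), 1)
    return slope

roi = np.random.rand(256, 256) > 0.5   # placeholder ROI; a real input would be a segmented mammogram patch
print("estimated fractal dimension:", round(box_counting_dimension(roi), 3))
```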
Abstract:
The goal of this work is to assess the efficacy of texture measures for estimating levels of crowd density in images. This estimation is crucial for the problem of crowd monitoring and control. The assessment is carried out on a set of nearly 300 real images captured at Liverpool Street Train Station, London, UK, using texture measures extracted from the images through four different methods: gray level dependence matrices, straight line segments, Fourier analysis, and fractal dimensions. The estimates of crowd density are given in terms of the classification of the input images into five density classes (very low, low, moderate, high and very high). Three types of classifiers are used: neural (implemented according to the Kohonen model), Bayesian, and an approach based on fitting functions. The results obtained by these three classifiers, using the four texture measures, support the conclusion that texture analysis is very effective for the problem of crowd density estimation.
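Gray level dependence (co-occurrence) matrices are one of the four texture measures listed; the sketch below, using scikit-image's graycomatrix and graycoprops, shows how such features could feed a simple classifier. The random placeholder patches and the k-NN stand-in for the paper's three classifiers are assumptions for illustration only.

```python
# Minimal sketch (assumption): gray-level co-occurrence texture features for crowd-density classes.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(gray_u8: np.ndarray) -> np.ndarray:
    """Contrast / homogeneity / energy / correlation at a 1-pixel offset and four angles."""
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

# Placeholder data: random 8-bit patches with labels 0..4 (very low .. very high density).
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(50, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 5, size=50)

X = np.array([glcm_features(img) for img in images])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print("predicted density class of first patch:", clf.predict(X[:1])[0])
```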
Abstract:
Petroleum well drilling monitoring has become an important tool for detecting and preventing problems during the well drilling process. In this paper, we propose to assist the drilling process by analyzing images of the cuttings at the vibrating shale shaker, where different concentrations of cuttings can indicate possible problems, such as the collapse of the well borehole walls. To this end, we present an innovative computer vision system composed of a real-time cutting volume estimator based on support vector regression. As far as we know, we are the first to propose petroleum well drilling monitoring by cutting image analysis. We also applied a collection of supervised classifiers for cutting volume classification. (C) 2010 Elsevier Ltd. All rights reserved.
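Support vector regression is named as the volume estimator; the sketch below maps simple image-derived features to a volume value with scikit-learn's SVR. The feature choice (mean intensity and dark-pixel fraction) and the synthetic targets are assumptions standing in for the paper's real cutting-image features.

```python
# Minimal sketch (assumption): support vector regression for cutting volume estimation.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def image_features(img: np.ndarray) -> np.ndarray:
    """Toy features per frame: mean intensity and fraction of dark (cutting-like) pixels."""
    return np.array([img.mean(), (img < 0.3).mean()])

# Synthetic frames and volumes, only to make the sketch runnable end to end.
frames = rng.random((200, 64, 64))
X = np.array([image_features(f) for f in frames])
volumes = 5.0 * X[:, 1] + 1.0 * X[:, 0] + rng.normal(0, 0.05, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01)).fit(X, volumes)
print("estimated volume for first frame:", float(model.predict(X[:1])[0]))
```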
Abstract:
Meteorological conditions are determining factors for agricultural production; precipitation, in particular, can be cited as the most influential because of its direct relationship with the water balance. In this context, agrometeorological models, which are based on crop responses to meteorological conditions, have been increasingly used to estimate agricultural yields. Because of the difficulty of obtaining data to feed such models, precipitation estimation methods based on images from the spectral channels of meteorological satellites have been employed for this purpose. The goal of the present work is to use the optimum-path forest pattern classifier to correlate information available in the infrared spectral channel of the GOES-12 meteorological satellite with the reflectivity obtained by the IPMET/UNESP radar located in the municipality of Bauru, aiming at the development of a model for detecting the occurrence of precipitation. In the experiments, four classification algorithms were compared: artificial neural networks (ANN), k-nearest neighbors (k-NN), support vector machines (SVM) and optimum-path forest (OPF). The latter obtained the best results, both in efficiency and in accuracy.
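A rough prototype of the comparison described above, for the three classifiers available in scikit-learn (OPF is omitted because it is not part of that library), could look like the sketch below; the synthetic rain / no-rain data standing in for infrared brightness statistics are an assumption for illustration only.

```python
# Minimal sketch (assumption): comparing classifiers on a binary rain / no-rain task.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Placeholder features standing in for infrared brightness-temperature statistics per pixel block.
X, y = make_classification(n_samples=1000, n_features=8, n_informative=5, random_state=0)

classifiers = {
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean accuracy = {acc:.3f}")
```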
Abstract:
Animal performance is the most direct measure for evaluating feed quality. However, performance data are insufficient to detect the possible interactions that may occur in the ruminal environment. The objective of the present work was to evaluate the possible associative effects on the concentrations of volatile fatty acids (VFA), ammonia nitrogen (NH3-N) and pH of the liquid fraction remaining from the dry matter (DM) digestion of single roughages (sugarcane = CN; elephantgrass with 60 days = CP60 and 180 days = CP180 of growth; and corn silage = SIL) and their combinations (sugarcane + corn silage = CNSIL; sugarcane + 60-day elephantgrass = CNCP60; sugarcane + 180-day elephantgrass = CNCP180; corn silage + 60-day elephantgrass = SILCP60; corn silage + 180-day elephantgrass = SILCP180) at a proportion of 50% of the DM, which lead to positive or negative performance results in cattle. The VFA, NH3-N and pH values of the treatments were: CN = 56.9 mmol L-1, 50.1 mg dL-1, 5.7; CNSIL = 61.4 mmol L-1, 50.7 mg dL-1, 5.8; CNCP60 = 54.7 mmol L-1, 47.6 mg dL-1, 5.8; CNCP180 = 45.4 mmol L-1, 49.4 mg dL-1, 6.0; SIL = 57.2 mmol L-1, 54.0 mg dL-1, 5.8; SILCP60 = 57.1 mmol L-1, 53.1 mg dL-1, 5.9; SILCP180 = 55.9 mmol L-1, 52.3 mg dL-1, 6.0; CP60 = 58.1 mmol L-1, 49.4 mg dL-1, 5.9; CP180 = 44.0 mmol L-1, 46.4 mg dL-1, 6.1. Non-structural carbohydrates and starch, together with fiber and protein, contributed to the positive associative effect in the 50:50 sugarcane/corn silage mixture. This may have led to the best performance results in cattle, owing to the high fermentative pattern.
Abstract:
The author analyzes electoral participation in 2002, relating it to forms of associative participation. The hypothesis tested is that voters with associative ties show greater electoral participation. In this sense, the results suggest that active voters have a profile associated with participation in strikes and union membership, but, with respect to party affiliation, they suggest that other factors intervene in the relationship. The article uses data from the ESEB 2002 survey.
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error. The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
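The closing sentence lists the components into which the error of a designed filter decomposes; a hedged rendering of that decomposition in LaTeX, with symbol names chosen here for illustration rather than taken from the paper, might read:

```latex
% Illustrative notation (assumption): error of a filter psi_n designed from n samples
% within a constrained class C, relative to the unconstrained optimum.
\begin{equation*}
\varepsilon[\psi_n] \;=\;
\underbrace{\varepsilon_{\mathrm{Bayes}}}_{\text{unconstrained optimal filter}}
\;+\; \underbrace{\Delta_{\mathcal{C}}}_{\text{cost of constraint}}
\;+\; \underbrace{\Delta_{\mathrm{compress}}}_{\text{complexity reduction}}
\;+\; \underbrace{\Delta_{n}}_{\text{design cost}}
\;-\; \underbrace{\Delta_{\mathrm{prior}}}_{\text{gain from prior knowledge}}
\end{equation*}
```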
Abstract:
The in vitro gas production of four single roughages and their paired combinations (1:1 on a dry matter basis) was evaluated. Two roughage samples (100 mg) per treatment were fermented with ruminal fluid during a 48 h incubation period. Total 48 h gas volumes from the fermentation of dry matter (DM), neutral detergent fiber (NDF) and soluble compounds in neutral detergent (NDS) were: sugarcane = 16.8, 11.2, 6.9 mL; sugarcane + corn silage = 20.1, 12.6, 9.1 mL; sugarcane + 60-day elephantgrass = 16.5, 17.6 mL; sugarcane + 180-day elephantgrass = 13.8, 8.2, 5.9 mL; corn silage = 18.8, 16.8, 4.7 mL; corn silage + 60-day elephantgrass = 16.3, 15.4, 2.4 mL; corn silage + 180-day elephantgrass = 16.1, 11.8, 4.2 mL; 60-day elephantgrass = 16.9, 19.0 mL; and 180-day elephantgrass = 10.7, 12.2 mL, respectively. It was not possible to estimate the NDS gas production for sugarcane + 60-day elephantgrass, 60-day elephantgrass and 180-day elephantgrass. The present data show that the curve subtraction method can be an option to evaluate the contribution of the soluble fractions in roughages to digestion kinetics. However, this method underestimates the NDS gas contribution when roughages are low in crude protein and soluble carbohydrates. It is advisable to apply the two-compartmental mathematical model directly to the digestion curves for roughage DM when determining the NDS gas volume and the digestion rate. This method is more straightforward and accurate than the curve subtraction method. Non-structural carbohydrates combined with fiber and protein promoted a positive associative effect in the sugarcane + corn silage (50:50) mixture. Therefore, it can be concluded that the soluble fraction of roughages greatly contributes to gas production. (C) 2004 Elsevier B.V. All rights reserved.
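The abstract recommends fitting a two-compartment model directly to the DM digestion curves; the specific equation used in the study is not stated here, so one commonly used dual-pool logistic form for cumulative gas production is shown below purely as an illustrative assumption:

```latex
% One commonly used dual-pool logistic form (illustrative; not necessarily the model fitted in the study).
% V(t): cumulative gas volume at time t; V_1, V_2: asymptotic volumes of the fast (soluble, NDS-like)
% and slow (fibrous, NDF-like) pools; k_1, k_2: specific fermentation rates; L: lag time.
\begin{equation*}
V(t) \;=\; \frac{V_1}{1 + e^{\,2 + 4 k_1 (L - t)}} \;+\; \frac{V_2}{1 + e^{\,2 + 4 k_2 (L - t)}}
\end{equation*}
```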
Abstract:
This paper addresses biometric identification using large databases, in particular iris databases. In such applications, it is critical to have a low response time while maintaining an acceptable recognition rate. Thus, the trade-off between speed and accuracy must be evaluated for the processing and recognition parts of an identification system. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a pre-developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition and its performance on iris databases of various scales. The existing Gauss-Laguerre wavelet based coding scheme is used for iris encoding. The performance of the OPF and of two other classifiers, Hamming and Bayesian, is compared using small, medium, and large-scale databases. The comparison shows that the OPF has a faster response for large-scale databases, thus performing better than the more accurate, but slower, classifiers.
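The Hamming classifier mentioned above compares binary iris codes; the fractional Hamming distance between two codes, restricted to bits that both noise masks mark as valid, can be sketched as below. The code length and mask convention are assumptions for illustration, not the Gauss-Laguerre scheme of the paper.

```python
# Minimal sketch (assumption): fractional Hamming distance between two binary iris codes,
# counting only bit positions that both noise masks mark as valid.
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    valid = mask_a & mask_b
    n_valid = int(valid.sum())
    if n_valid == 0:
        return 1.0                       # no usable bits: treat as maximally dissimilar
    disagree = (code_a ^ code_b) & valid
    return disagree.sum() / n_valid

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 2048, dtype=np.uint8).astype(bool)
b = a.copy(); b[:100] ^= True            # same iris code with a few flipped bits
mask = np.ones(2048, dtype=bool)
print("distance (genuine-like pair):", hamming_distance(a, b, mask, mask))
print("distance (impostor-like pair):", hamming_distance(a, rng.integers(0, 2, 2048).astype(bool), mask, mask))
```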
Abstract:
Characteristics of speech, especially figures of speech, are used by specific communities or domains and, in this way, reflect their identities through their choice of vocabulary. This topic should be an object of study in the context of knowledge representation, since it deals with different contexts of document production. This study aims to explore the dimensions of the concepts of euphemism, dysphemism, and orthophemism, focusing on the latter with the goal of extracting a concept which can be included in discussions about subject analysis and indexing. Euphemism is used as an alternative to a non-preferred expression or to an offensive attribution, in order to avoid potential offense taken by the listener or by other persons, for instance, pass away. Dysphemism, on the other hand, is used by speakers to talk about people and things that frustrate and annoy them; their choice of language indicates disapproval, and the topic is therefore denigrated, humiliated, or degraded, for instance, kick the bucket. While euphemism tries to make something sound better, dysphemism tries to make something sound worse. Orthophemism (Allan and Burridge 2006) is also used as an alternative to other expressions, but it is a preferred, formal, and direct mode of expression when representing an object or a situation, for instance, die. This paper suggests that the comprehension and use of such concepts could support the following issues: possible contributions from linguistics and terminology to subject analysis, as demonstrated by Talamo et al. (1992); reduction of the polysemy and ambiguity of terms used to represent certain topics of documents; and construction and evaluation of indexing languages. The concept of orthophemism can also serve to support associative relationships in the context of subject analysis, indexing, and even information retrieval related to more specific requests.
Abstract:
The majority of biometric researchers focus on matching accuracy using biometric databases, including iris databases, while scalability and speed issues have been neglected. In applications such as identification at airports and borders, it is critical for the identification system to have a low response time. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a pre-developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition and its performance on iris databases of various scales. This paper investigates several classifiers that are widely used in iris recognition papers, examining their response time along with their accuracy. The existing Gauss-Laguerre wavelet based iris coding scheme, which shows perfect discrimination with a rotary Hamming distance classifier, is used for iris coding. The performance of the classifiers is compared using small, medium, and large-scale databases. The comparison shows that OPF has a faster response for large-scale databases, thus performing better than the more accurate but slower Bayesian classifier.
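The "rotary" Hamming distance referred to above is typically realized by taking the minimum masked Hamming distance over circular bit shifts of one code, which compensates for in-plane eye rotation; the sketch below extends the plain masked distance shown earlier in that direction. The shift range and code layout are assumptions for illustration.

```python
# Minimal sketch (assumption): rotation-compensated ("rotary") Hamming distance,
# taking the minimum over circular bit shifts of one iris code to absorb eye rotation.
import numpy as np

def rotary_hamming(code_a, code_b, mask_a, mask_b, max_shift=8):
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        b, mb = np.roll(code_b, s), np.roll(mask_b, s)
        valid = mask_a & mb
        if valid.sum() == 0:
            continue
        d = ((code_a ^ b) & valid).sum() / valid.sum()
        best = min(best, d)
    return best

rng = np.random.default_rng(1)
a = rng.integers(0, 2, 2048, dtype=np.uint8).astype(bool)
rotated = np.roll(a, 5)                  # same code, circularly shifted (simulated eye rotation)
mask = np.ones(2048, dtype=bool)
print("rotary distance to rotated copy:", rotary_hamming(a, rotated, mask, mask))
```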
Abstract:
Research on Blindsight, Neglect/Extinction and Phantom limb syndromes, as well as electrical measurements of mammalian brain activity, have suggested the dependence of vivid perception on both incoming sensory information at primary sensory cortex and reentrant information from associative cortex. Coherence between incoming and reentrant signals seems to be a necessary condition for (conscious) perception. The general reticular activating system and local electrical synchronization are some of the tools used by the brain to establish coarse coherence at the sensory cortex, upon which biochemical processes are coordinated. Besides electrical synchrony and chemical modulation at the synapse, a central mechanism supporting such coherence is the N-methyl-D-aspartate channel, working as a 'coincidence detector' for an incoming signal causing the depolarization necessary to remove Mg2+, and reentrant information releasing the glutamate that finally prompts Ca2+ entry. We propose that a signal transduction pathway activated by Ca2+ entry into cortical neurons is in charge of triggering a quantum computational process that accelerates inter-neuronal communication, thus solving systemic conflict and supporting the unity of consciousness. © 2001 Elsevier Science Ltd.