876 results for WAVELET TRANSFORM
Abstract:
With the widespread proliferation of computers, many human activities entail the use of automatic image analysis. The basic features used for image analysis include color, texture, and shape. In this paper, we propose a new shape description method, called Hough Transform Statistics (HTS), which uses statistics from the Hough space to characterize the shape of objects or regions in digital images. A modified version of this method, called Hough Transform Statistics neighborhood (HTSn), is also presented. Experiments carried out on three popular public image databases showed that the HTS and HTSn descriptors are robust, since they presented precision-recall results much better than several other well-known shape description methods. When compared to the Beam Angle Statistics (BAS) method, a shape description method that inspired their development, both HTS and HTSn presented inferior results regarding the precision-recall criterion, but superior results in the processing-time and multiscale-separability criteria. The linear complexity of the HTS and HTSn algorithms, in contrast to BAS, makes them more appropriate for shape analysis in high-resolution image retrieval tasks on the very large databases that are common nowadays. (C) 2014 Elsevier Inc. All rights reserved.
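A minimal sketch of the general idea of describing shape through statistics of the Hough space (an illustrative approximation, not the authors' exact HTS algorithm): vote the contour pixels of a binary shape into a straight-line Hough accumulator and summarize it with per-angle statistics.

    # Illustrative sketch only (assumed helper, not the HTS implementation):
    # vote contour pixels into a (rho, theta) accumulator and summarize it with
    # per-angle mean and standard deviation as a shape feature vector.
    import numpy as np

    def hough_statistics(binary_edges, n_theta=180, n_rho=64):
        ys, xs = np.nonzero(binary_edges)                    # contour pixel coordinates
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        rho_max = np.hypot(*binary_edges.shape)              # largest possible |rho|
        acc = np.zeros((n_rho, n_theta))
        for x, y in zip(xs, ys):                             # one vote per (pixel, angle)
            rhos = x * np.cos(thetas) + y * np.sin(thetas)
            bins = ((rhos + rho_max) / (2.0 * rho_max) * (n_rho - 1)).astype(int)
            acc[bins, np.arange(n_theta)] += 1.0
        acc /= acc.sum() + 1e-12                             # normalize for scale tolerance
        return np.concatenate([acc.mean(axis=0), acc.std(axis=0)])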
Abstract:
Research on image processing has shown that combining segmentation methods may lead to a solid approach for extracting semantic information from different sorts of images. Within this context, the Normalized Cut (NCut) is usually used as a final partitioning tool for graphs modeled by some chosen method. This work explores the Watershed Transform as a modeling tool, using different criteria of the hierarchical Watershed to convert an image into an adjacency graph. The Watershed is combined with an unsupervised distance learning step that redistributes the graph weights and redefines the similarity matrix before the final segmentation step using NCut. Adopting the Berkeley Segmentation Data Set and Benchmark as a reference, our goal is to compare the results obtained with this method against previous work to validate its performance.
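A rough sketch of the watershed-to-graph-to-NCut pipeline described above (illustrative only: it omits the hierarchical criteria and the unsupervised distance learning step, and it assumes a recent scikit-image in which the region-adjacency-graph utilities live under skimage.graph; older releases expose them under skimage.future.graph).

    # Oversegment with the watershed, build a region adjacency graph weighted by
    # color similarity, then partition it with a normalized cut. The sample image
    # and parameter values are placeholders.
    import numpy as np
    from skimage import data, filters, segmentation, graph
    from skimage.color import rgb2gray

    img = data.coffee()                                          # example RGB image
    gradient = filters.sobel(rgb2gray(img))                      # relief for the watershed
    labels = segmentation.watershed(gradient, markers=400, compactness=0.001)
    rag = graph.rag_mean_color(img, labels, mode='similarity')   # similarity-weighted graph
    segments = graph.cut_normalized(labels, rag)                 # final NCut partition
    print(len(np.unique(segments)), 'regions after the normalized cut')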
Abstract:
Despite the efficacy of minutia-based fingerprint matching techniques on good-quality images captured by optical sensors, minutia-based techniques often do not perform well on poor-quality images or on fingerprint images captured by small solid-state sensors. Solid-state fingerprint sensors are being increasingly deployed in a wide range of applications for user authentication purposes. It is therefore necessary to develop new fingerprint matching techniques that exploit other features to deal with fingerprint images captured by solid-state sensors. This paper presents a new fingerprint matching technique based on fingerprint ridge features. The technique was assessed on the MSU-VERIDICOM database, which consists of fingerprint impressions obtained from 160 users (4 impressions per finger) using a solid-state sensor. Combining the ridge-based matching scores computed by the proposed technique with minutia-based matching scores reduces the false non-match rate by approximately 1.7% at a false match rate of 0.1%. © 2005 IEEE.
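A minimal sketch of score-level fusion in the spirit of the combination reported above (the normalization and weighting below are illustrative assumptions, not the scheme or values used in the paper):

    # Min-max normalize each matcher's scores and combine them with a weighted
    # sum; the fused score can then be thresholded to trade off FMR against FNMR.
    import numpy as np

    def fuse_scores(minutia_scores, ridge_scores, w_ridge=0.5):
        def minmax(s):
            s = np.asarray(s, dtype=float)
            return (s - s.min()) / (s.max() - s.min() + 1e-12)
        return (1.0 - w_ridge) * minmax(minutia_scores) + w_ridge * minmax(ridge_scores)

    print(fuse_scores([0.2, 0.8, 0.5], [10.0, 42.0, 30.0]))   # illustrative scores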
Abstract:
One of the key issues that makes the wavelet-Galerkin method unsuitable for solving general electromagnetic problems is the lack of exact representations of the connection coefficients. This paper presents the mathematical formulae and computer procedures for computing some common connection coefficients. The characteristic of the present formulae and procedures is that the connection coefficients can be determined at arbitrary points, rather than only at dyadic points. A numerical example is also given to demonstrate the feasibility of using the wavelet-Galerkin method to solve engineering field problems. © 2000 IEEE.
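For reference, a commonly used form of the two-term connection coefficients for a scaling function \phi (a standard textbook definition, not necessarily the exact quantity treated in the paper) is

    \Lambda^{d_1, d_2}(y) \;=\; \int_{-\infty}^{\infty} \phi^{(d_1)}(x)\, \phi^{(d_2)}(x - y)\, dx ,

where d_1 and d_2 are derivative orders; standard recursive algorithms recover such integrals only at dyadic values of the shift y, whereas the contribution highlighted in the abstract is their evaluation at arbitrary points.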
Abstract:
Vortex-induced motion (VIM) is a highly nonlinear dynamic phenomenon. Usual spectral analysis methods, based on the Fourier transform, rely on the hypotheses of linear and stationary dynamics. A method to treat nonstationary signals arising from nonlinear systems is the Hilbert-Huang transform (HHT). The development of an analysis methodology based on the HHT to study the VIM of a monocolumn production, storage, and offloading system is presented. The purpose of the methodology is to improve the statistical analysis of VIM. The results proved comparable to those obtained from a traditional analysis (mean of the 10% highest peaks), particularly for the motions in the transverse direction, whereas the in-line motions differed from the traditional analysis by around 25%. The results from the HHT analysis are more reliable than the traditional ones, owing to the larger number of points used to compute the statistical characteristics. These results may be used to design risers and mooring lines, as well as to obtain VIM parameters to calibrate numerical predictions. [DOI: 10.1115/1.4003493]
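A minimal sketch of the Hilbert step of an HHT-style analysis (illustrative only: the empirical mode decomposition is omitted, the signal below is synthetic, and the sampling rate is an assumed placeholder; in practice the Hilbert transform is applied to each intrinsic mode function extracted from the measured motion, e.g. with a package such as PyEMD):

    # Compute the analytic signal of one (synthetic) intrinsic mode function and
    # take statistics over every sample of its instantaneous amplitude, rather
    # than over only the 10% highest peaks.
    import numpy as np
    from scipy.signal import hilbert

    fs = 10.0                                        # sampling rate [Hz] (assumed)
    t = np.arange(0.0, 600.0, 1.0 / fs)
    imf = (1.0 + 0.3 * np.sin(2 * np.pi * 0.01 * t)) * np.sin(2 * np.pi * 0.2 * t)

    analytic = hilbert(imf)                          # analytic signal of the IMF
    amplitude = np.abs(analytic)                     # instantaneous amplitude envelope
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)    # instantaneous frequency [Hz]

    print('mean amplitude:', amplitude.mean(), 'std:', amplitude.std())
    print('mean frequency [Hz]:', inst_freq.mean())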
Abstract:
In this paper, a definition of the Hilbert transform operating on Colombeau's tempered generalized functions is given. Results analogous to theorems that hold in the classical theory, or in certain subspaces of Schwartz distributions, are obtained in this framework.
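For context, the classical Hilbert transform that such a definition extends is the principal-value integral (standard form; the Colombeau-setting version introduced in the paper is not reproduced here):

    (Hf)(x) \;=\; \frac{1}{\pi}\, \mathrm{p.v.} \int_{-\infty}^{\infty} \frac{f(t)}{x - t}\, dt .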
Abstract:
We present a method for generating exact and explicit forms of one-sided, heavy-tailed Lévy stable probability distributions $g_\alpha(x)$, $0 \le x < \infty$, $0 < \alpha < 1$. We demonstrate that knowledge of one such distribution $g_\alpha(x)$ suffices to obtain exactly $g_{\alpha^p}(x)$, $p = 2, 3, \ldots$. Similarly, from known $g_\alpha(x)$ and $g_\beta(x)$, $0 < \alpha, \beta < 1$, we obtain $g_{\alpha\beta}(x)$. The method is based on the construction of an integral operator, called the Lévy transform, which implements the above operations. For rational $\alpha = l/k$ with $l < k$, we reproduce in this manner many of the recently obtained exact results for $g_{l/k}(x)$. This approach can also be recast as an application of the Efros theorem for generalized Laplace convolutions. It relies solely on efficient definite integration. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4709443]
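In the standard Laplace-transform convention $\int_0^\infty e^{-sx} g_\alpha(x)\,dx = e^{-s^\alpha}$, the composition property described above corresponds to the classical subordination identity (a standard result, not necessarily the paper's exact operator form):

    g_{\alpha\beta}(x) \;=\; \int_0^\infty t^{-1/\alpha}\, g_\alpha\!\left(x\, t^{-1/\alpha}\right) g_\beta(t)\, dt , \qquad 0 < \alpha, \beta < 1 .

Setting $\beta = \alpha$ and iterating this identity yields $g_{\alpha^p}(x)$ for $p = 2, 3, \ldots$.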