876 results for Wavelet Transform
Abstract:
Computerized tomography is an imaging technique that produces cross-sectional maps of an object from its line integrals. Image reconstruction algorithms require a collection of line integrals covering the whole measurement range. In many practical situations, however, part of the projection data is measured inaccurately or not measured at all. In such incomplete-data situations, conventional image reconstruction algorithms such as convolution back projection (CBP) and the Fourier reconstruction algorithm, which assume the projection data to be complete, produce degraded images. In this paper, multiresolution, multiscale modeling of the wavelet transform coefficients of the projections is proposed for projection completion. The missing coefficients are predicted from these models at each scale, followed by an inverse wavelet transform to obtain the estimated projection data.
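A minimal sketch of the coefficient-domain completion idea, using NumPy and PyWavelets; the paper's multiresolution prediction models are replaced here by simple linear interpolation from reliably measured neighbouring coefficients at each scale, and all function and parameter names are illustrative:

```python
# Illustrative sketch: projection completion in the wavelet domain.
# The paper's scale-wise prediction models are replaced by simple
# interpolation from reliably measured coefficients (an assumption).
import numpy as np
import pywt

def complete_projection(proj, missing, wavelet="db4", level=3):
    """proj: 1-D projection; missing: boolean mask of unreliable samples."""
    coeffs = pywt.wavedec(proj, wavelet, level=level)
    fixed = []
    for c in coeffs:
        # Resample the missing-sample mask onto this scale's coefficient grid.
        m = np.interp(np.linspace(0, 1, c.size),
                      np.linspace(0, 1, missing.size),
                      missing.astype(float)) > 0.5
        c = c.copy()
        good = np.flatnonzero(~m)
        # "Predict" missing coefficients from reliable neighbours at this scale.
        c[m] = np.interp(np.flatnonzero(m), good, c[good])
        fixed.append(c)
    return pywt.waverec(fixed, wavelet)[: proj.size]
```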
Abstract:
This paper presents the design of a full-fledged OCR system for printed Kannada text. Machine recognition of Kannada characters is difficult owing to the similarity in the shapes of different characters, the complexity of the script, and non-uniqueness in the representation of diacritics. The document image is subjected to line segmentation, word segmentation and zone detection. From the zonal information, base characters, vowel modifiers and consonant conjuncts are separated. A knowledge-based approach is employed for recognizing the base characters. Various features are employed for recognizing the characters, including the coefficients of the Discrete Cosine Transform, the Discrete Wavelet Transform and the Karhunen-Loève Transform. These features are fed to different classifiers. Structural features are used at subsequent levels to discriminate confused characters; their use increases the recognition rate from 93% to 98%. Apart from the classical pattern classification technique of nearest neighbour, Artificial Neural Network (ANN) based classifiers such as Back Propagation and Radial Basis Function (RBF) networks have also been studied. The ANN classifiers are trained in supervised mode using the transform features. The highest recognition rate of 99% is obtained with the RBF network, using second-level approximation coefficients of Haar wavelets as features on pre-segmented base characters.
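A minimal sketch of the best-performing feature extraction (second-level Haar approximation coefficients of a pre-segmented base character), paired here with the paper's classical nearest-neighbour baseline rather than an RBF network; glyphs are assumed size-normalized, and all names are illustrative:

```python
# Sketch: second-level Haar approximation coefficients as features for
# pre-segmented base characters, with the nearest-neighbour baseline
# standing in for the RBF network (an assumption of this sketch).
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def haar_features(glyph):
    """glyph: size-normalized 2-D image of a base character."""
    cA2, *_ = pywt.wavedec2(glyph, "haar", level=2)
    return cA2.ravel()

def train_classifier(glyphs, labels):
    X = np.array([haar_features(g) for g in glyphs])
    return KNeighborsClassifier(n_neighbors=1).fit(X, labels)
```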
Abstract:
Feature extraction in bilingual OCR is handicapped by the increase in the number of classes, or characters, to be handled. This is evident in the case of Indian languages, whose alphabet sets are large. The complexity of the feature extraction process is expected to increase with the number of classes. Although the best set of features cannot be determined through any quantitative measure, the characteristics of the scripts can help decide on the feature extraction procedure. This paper describes a hierarchical feature extraction scheme for recognition of printed bilingual (Tamil and Roman) text. The scheme divides the combined alphabet set of the two scripts into subsets by the extraction of certain spatial and structural features. Three features, viz. geometric moments, DCT-based features and wavelet-transform-based features, are extracted from the grouped symbols, and a linear transformation is performed on them for efficient representation in the feature space. The transformation is obtained by maximizing certain criterion functions. Three techniques have been employed to estimate the transformation matrix: principal component analysis, maximization of Fisher's ratio, and maximization of a divergence measure. It has been observed that the proposed hierarchical scheme allows easier handling of the alphabets, and there is an appreciable rise in recognition accuracy as a result of the transformations.
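A hedged sketch of one branch of the scheme: geometric moments, DCT coefficients and wavelet coefficients are concatenated, then linearly transformed by maximizing Fisher's ratio (scikit-learn's LDA used as a stand-in); the feature sizes and the db2 wavelet are assumptions, not the paper's choices:

```python
# Sketch: concatenated moment / DCT / wavelet features, then a linear
# transformation maximizing Fisher's ratio (sklearn's LDA as a stand-in).
import numpy as np
import pywt
from scipy.fft import dctn
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def symbol_features(img, n_dct=8):
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    moments = [(img * x**p * y**q).sum() for p in range(3) for q in range(3)]
    dct = dctn(img, norm="ortho")[:n_dct, :n_dct].ravel()
    cA, *_ = pywt.wavedec2(img, "db2", level=2)     # wavelet features
    return np.concatenate([moments, dct, cA.ravel()])

def fisher_transform(images, labels):
    X = np.array([symbol_features(g) for g in images])
    lda = LinearDiscriminantAnalysis()
    return lda.fit_transform(X, labels), lda
```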
Abstract:
Thanks to advances in sensor technology, many applications today (space-borne imaging, medical imaging, etc.) generate images of large sizes. Straightforward application of wavelet techniques to such images involves certain difficulties. Embedded coders such as EZW and SPIHT require that the wavelet transform of the full image be buffered for coding. Since the transform coefficients must also be stored in high precision, the buffering requirements for large images become prohibitively high. In this paper, we first devise a technique for embedded coding of large images using zerotrees with reduced memory requirements. A 'strip buffer' capable of holding a few lines of wavelet coefficients from all the subbands belonging to the same spatial location is employed. A pipeline architecture for a line-based implementation of the above technique is then proposed. Further, an efficient algorithm to extract an encoded bitstream corresponding to a region of interest in the image has been developed. Finally, the paper describes a strip-based non-embedded coder which uses a single-pass algorithm, to handle high input data rates. (C) 2002 Elsevier Science B.V. All rights reserved.
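A sketch of the strip-buffer idea only: rows are streamed in, and each full strip is wavelet-transformed and handed off, so just a few lines of coefficients are resident at a time. The zerotree coder, subband-aligned buffering and tail-strip handling are omitted, and all names are illustrative:

```python
# Sketch of the strip-buffer idea: rows are streamed in, and each full
# strip is wavelet-transformed and handed to a coder, so only a few lines
# of coefficients are resident.  Zerotree coding and tail handling omitted.
import numpy as np
import pywt

def code_in_strips(row_source, strip_height=16, wavelet="haar"):
    """row_source: iterator yielding image rows as equal-length 1-D arrays."""
    buf = []
    for row in row_source:
        buf.append(row)
        if len(buf) == strip_height:
            strip = np.vstack(buf)
            yield pywt.dwt2(strip, wavelet)   # (cA, (cH, cV, cD)) -> coder
            buf.clear()
```

A real strip coder would also carry the filter-overlap lines between consecutive strips so that the subband coefficients match those of a whole-image transform.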
Abstract:
Pixel-based image fusion combines the geometric detail of a high-resolution panchromatic (PAN) image with the spectral information of a low-resolution multispectral (MS) image to produce images of the highest spatial content while preserving the spectral information. This work reviews and implements six fusion techniques: the à trous algorithm based wavelet transform (ATW), Multiresolution Analysis based Intensity Modulation, Gram-Schmidt fusion, CN Spectral, Luminance-Chrominance, and High Pass Fusion (HPF), on IKONOS imagery having a 1 m PAN channel and 4 m MS channels. A comparative performance analysis by various methods reveals that ATW, followed by HPF, performs best among all the techniques.
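A minimal sketch of ATW fusion under common assumptions the abstract does not spell out (undecimated B3-spline smoothing, additive injection of the PAN detail planes into co-registered MS bands already resampled to the PAN grid):

```python
# Sketch of ATW fusion: PAN detail planes from an undecimated "a trous"
# decomposition (B3-spline kernel assumed) are added to co-registered MS
# bands already resampled to the PAN grid.
import numpy as np
from scipy.ndimage import convolve

_B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def _smooth(img, step):
    k = np.zeros((len(_B3) - 1) * step + 1)
    k[::step] = _B3                          # "a trous": kernel with holes
    img = convolve(img, k[None, :], mode="mirror")
    return convolve(img, k[:, None], mode="mirror")

def atw_fuse(pan, ms_bands, levels=2):
    cur = pan.astype(float)
    details = np.zeros(pan.shape)
    for j in range(levels):
        smoothed = _smooth(cur, 2 ** j)
        details += cur - smoothed            # wavelet plane at scale j
        cur = smoothed
    return [band + details for band in ms_bands]
```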
Abstract:
Lamb-wave-type guided wave propagation in foam core sandwich structures and the detectability of damage using spectral analysis are reported in this paper. An experimental study, supported by theoretical evaluation of the guided wave characteristics, shows the applicability of Lamb-wave-type guided ultrasonic waves for detecting damage in foam core sandwich structures. Sandwich beam specimens were fabricated with a 10 mm thick foam core and 0.3 mm thick aluminum face sheets. Thin piezoelectric patch actuators and sensors are used to excite and sense the guided wave. Group velocity dispersion curves and the frequency response of the sensed signal are obtained experimentally. The nature of the damping present in the sandwich panel is monitored by measuring the sensor signal amplitude at different distances from the center of the linear phased array. Delaminations of increasing width are created and detected experimentally by pitch-catch interrogation with guided waves and wavelet transform of the sensed signal. Signal amplitudes are analyzed for damages of various sizes to differentiate damage size/severity. A sandwich panel with planar dimensions of 600 mm x 400 mm is also fabricated, with a release-film delamination introduced during fabrication. A non-contact Laser Doppler Vibrometer (LDV) is used to scan the panel while exciting it with a surface-bonded piezoelectric actuator. The presence of damage is confirmed by the reflected-wave fringe pattern obtained from the LDV scan. With this approach it is possible to locate and monitor damage by tracking the wave packets scattered from it.
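An illustrative sketch of the wavelet step in pitch-catch interrogation: the sensed signal's CWT envelope at the excitation frequency is used to time the wave packets; the Morlet wavelet and the names are assumptions of this sketch:

```python
# Sketch: envelope of the sensed signal's CWT at the excitation frequency,
# used to time wave packets in pitch-catch mode.  Morlet wavelet assumed.
import numpy as np
import pywt

def packet_envelope(signal, fs, f_excite, wavelet="morl"):
    # scale whose pseudo-frequency matches the tone-burst centre frequency
    scale = pywt.central_frequency(wavelet) * fs / f_excite
    coef, _ = pywt.cwt(signal, [scale], wavelet)
    env = np.abs(coef[0])
    t_direct = np.argmax(env) / fs           # strongest (direct) packet
    return env, t_direct
# A damage-scattered packet appears as a later local maximum of env; with
# group velocity cg its extra path length is roughly cg * (t - t_direct).
```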
Abstract:
We report on Lamb-wave-type guided wave propagation in honeycomb core sandwich structures. An experimental study, supported by theoretical evaluation of the guided wave characteristics, demonstrates the potential of Lamb-wave-type guided waves for detecting damage in sandwich structures. A sandwich panel is fabricated with planar dimensions of 600 mm x 600 mm, a core thickness of 7 mm, a cell size of 5 mm and 0.1 mm thick aluminum face sheets. Thin piezoelectric patch actuators and sensors are used to excite and sense a frequency-band-limited guided wave with a central frequency. A linear phased array of piezoelectric patch actuators is used to achieve higher signal strength and directivity. Group velocity dispersion curves and the corresponding frequency response of the sensed signal are obtained experimentally. Linearity between the excitation signal amplitude and the corresponding sensed signal amplitude is found over a certain range of parameters. The nature of the damping present in the sandwich panel is monitored by measuring the sensor signal amplitude at different distances from the center of the linear phased array. Indentation and low-velocity impact damages of increasing diameter, covering several honeycomb cells, are created; crushing of the honeycomb core with rupture of the face sheet is observed while introducing the damage. The damages are then detected experimentally by pitch-catch interrogation with guided waves and wavelet transform of the sensed signal. Signal amplitudes are analyzed for damages of various sizes to differentiate damage size/severity. A monotonic change in the sensor signal amplitude with increasing damage size has been established. With this approach it is possible to locate and monitor damage with the help of the phased array, by tracking the wave packets scattered from the damage. (C) 2012 Elsevier Ltd. All rights reserved.
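A small sketch of one experimental step, estimating group velocity from the envelope-peak arrival times at two sensor distances; this is a generic procedure consistent with, but not taken from, the paper:

```python
# Sketch: group velocity from envelope-peak arrival times at two
# actuator-sensor distances d1 < d2 (a generic experimental estimate).
import numpy as np
from scipy.signal import hilbert

def group_velocity(sig1, sig2, fs, d1, d2):
    t1 = np.argmax(np.abs(hilbert(sig1))) / fs
    t2 = np.argmax(np.abs(hilbert(sig2))) / fs
    return (d2 - d1) / (t2 - t1)             # m/s
```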
Abstract:
Compressive Sensing (CS) is a sensing paradigm that permits sampling of a signal at its intrinsic information rate, which can be much lower than the Nyquist rate, while guaranteeing good-quality reconstruction for signals sparse in a linear transform domain. We explore the application of the CS formulation to music signals. Since music signals comprise both tonal and transient components, we examine several transforms, such as the discrete cosine transform (DCT), the discrete wavelet transform (DWT), the Fourier basis, and non-orthogonal warped transforms, to test the effectiveness of CS theory and the reconstruction algorithms. We show that for a given sparsity level, the DCT, overcomplete, and warped Fourier dictionaries result in better reconstruction, with the warped Fourier dictionary giving perceptually better reconstruction. MUSHRA test results show that a moderate-quality reconstruction is possible with about half the Nyquist sampling rate.
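A hedged sketch of the CS experiment for one audio frame with a DCT dictionary; recovery here uses orthogonal matching pursuit from scikit-learn, a solver the abstract does not name, so treat the solver, sizes and names as assumptions:

```python
# Sketch: CS of one audio frame with a DCT synthesis dictionary; recovery
# via orthogonal matching pursuit (the solver is this sketch's choice,
# not the paper's).  m measurements, k assumed nonzero coefficients.
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

def cs_reconstruct(x, m, k, seed=0):
    n = x.size
    rng = np.random.default_rng(seed)
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # sensing matrix
    Psi = idct(np.eye(n), norm="ortho", axis=0)      # DCT atoms as columns
    y = Phi @ x                                      # compressive samples
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k,
                                    fit_intercept=False).fit(Phi @ Psi, y)
    return Psi @ omp.coef_                           # reconstructed frame
```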
Abstract:
Automated security is one of the major concerns of modern times, and secure, reliable authentication systems are in great demand. A biometric trait such as the finger knuckle print (FKP) of a person is unique and secure. The finger knuckle print is a novel biometric trait that has not been explored much for real-time implementation. In this paper, three different algorithms based on this trait are proposed. The first approach uses the Radon transform for feature extraction; two levels of security are provided, based on eigenvalues and on the peak points of the Radon graph. In the second approach, the Gabor wavelet transform is used to extract the features; again, two levels of security are provided, based on the Gabor wavelet magnitude values and on the peak points of the Gabor wavelet graph. The third approach is intended to authenticate a person even if the finger knuckle is damaged by injury: the FKP image is divided into modules, and module-wise feature matching is performed for authentication. The performance of these algorithms was found to be much better than that of the few existing works. Moreover, the algorithms are designed for implementation in real-time systems with minimal changes.
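A speculative sketch of the first approach's two cues (eigenvalues and peak points of the Radon transform); the abstract does not define these constructions precisely, so the reading below is only one plausible interpretation:

```python
# Speculative sketch of the Radon-based cues: leading eigenvalues of the
# sinogram covariance (level 1) and peak locations of the Radon profile
# (level 2).  The exact constructions are not given in the abstract.
import numpy as np
from skimage.transform import radon

def fkp_features(img, n_eig=10, n_peaks=5):
    sino = radon(img, theta=np.arange(180.0))
    eigvals = np.linalg.eigvalsh(np.cov(sino))[::-1][:n_eig]
    profile = sino.max(axis=0)                 # "Radon graph" over angle
    peaks = np.sort(np.argsort(profile)[::-1][:n_peaks])
    return eigvals, peaks
```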
Abstract:
We present new data on the strength of oceanic lithosphere along the Ninetyeast Ridge (NER) from two independent methods: spectral analysis (Bouguer coherence) using the fan wavelet transform technique, and spatial analysis (flexure inversion) with the convolution method. The two methods provide effective elastic thickness (Te) patterns that broadly complement each other and correlate well with known surface structures and regional-scale features. Furthermore, our study presents a new high-resolution database on the Moho configuration, which obeys flexural isostasy and exhibits regional correlations with the Te variations. A continuous ridge structure with a much lower Te than that of normal oceanic lithosphere provides strong support for the hotspot theory. The derived Te values vary over the northern (higher Te, ~10-20 km), central (anomalously low Te, ~0-5 km), and southern (low Te, ~5 km) segments of the NER. The lack of correlation of Te with the progressive aging of the lithosphere implies differences in the thermo-mechanical setting of the crust and underlying mantle in different parts of the NER, again indicating diversity in their evolution. The anomalously low Te and deeper Moho (~22 km) of the central NER (between 0.5°N and 17°S) are attributed to the interaction of a hotspot with the Wharton spreading ridge, which caused significant thermal rejuvenation and hence weakening of the lithosphere. The higher mechanical strength in the northern NER (north of 0.5°N) may support the idea of off-ridge emplacement and a relatively large plate motion at the time of volcanism. The low Te and deeper Moho (~22 km) in the southern part (south of 17°S) suggest that the lithosphere was weak, and therefore young, at the time of volcanism, supporting the idea that the southern NER was emplaced on the edge of the Indian plate. (C) 2013 Elsevier B.V. All rights reserved.
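Only as a 1-D illustrative stand-in (the paper uses a 2-D fan wavelet transform): Welch-type coherence between co-located topography and Bouguer gravity profiles; the flexural inversion that maps the coherence rolloff to Te is not shown, and all names are illustrative:

```python
# 1-D illustrative stand-in only: Welch coherence between co-located
# topography and Bouguer anomaly profiles.  The paper's 2-D fan wavelet
# coherence and the flexural inversion for Te are not reproduced.
import numpy as np
from scipy.signal import coherence

def bouguer_coherence(topo, bouguer, dx):
    """topo, bouguer: profiles on the same grid; dx: spacing in metres."""
    f, coh = coherence(topo, bouguer, fs=1.0 / dx, nperseg=256)
    return f, coh      # the coherence rolloff wavelength constrains Te
```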
Abstract:
The effects of evaporation and of the presence of agglomerating nanoparticles on the oscillation characteristics of pendant droplets are studied experimentally using ethanol and an aqueous nanoalumina suspension, respectively. Axisymmetric oscillations induced by a round air jet are considered. Wavelet transform of the time evolution of the 2nd modal coefficient revealed that, while the natural frequency of the ethanol droplet increases continuously with time owing to the diameter regression induced by vaporization, no such change in resonant frequency occurs for the agglomerating droplet. However, a gradual reduction in oscillation amplitude ensues as the agglomeration becomes dominant. (C) 2014 Elsevier Ltd. All rights reserved.
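A minimal sketch of the analysis step: the instantaneous resonant frequency is read off as the ridge of the CWT of the 2nd modal coefficient's time series; the Morlet wavelet and the scale range are assumptions of this sketch:

```python
# Sketch: instantaneous resonant frequency as the CWT ridge of the 2nd
# modal coefficient's time series.  Morlet wavelet and scales assumed.
import numpy as np
import pywt

def frequency_ridge(a2, fs, wavelet="morl"):
    scales = np.arange(2, 128)
    coef, freqs = pywt.cwt(a2, scales, wavelet, sampling_period=1.0 / fs)
    ridge = np.argmax(np.abs(coef), axis=0)   # dominant scale per instant
    return freqs[ridge]                       # Hz, one value per sample
```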
Abstract:
Selection of relevant features is an open problem in brain-computer interface (BCI) research. Features extracted from brain signals are often high-dimensional, which in turn affects the accuracy of the classifier. Selecting the most relevant features improves classifier performance and reduces the computational cost of the system. In this study, we use a combination of Bacterial Foraging Optimization and Learning Automata to determine the best subset of features from a given motor imagery electroencephalography (EEG) based BCI dataset. We employ the Discrete Wavelet Transform to obtain a high-dimensional feature set and classify it with the Distance Likelihood Ratio Test. Our proposed feature selector produced an accuracy of 80.291% in 216 seconds.
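A sketch of the feature-extraction stage only (per-channel DWT sub-band statistics); the Bacterial Foraging Optimization + Learning Automata selector and the Distance Likelihood Ratio Test are not reproduced, and the wavelet, level and choice of statistics are assumptions:

```python
# Sketch of the feature-extraction stage: per-channel DWT sub-band
# statistics.  Wavelet, level and statistics are assumed; the selector
# and classifier from the study are not reproduced here.
import numpy as np
import pywt

def dwt_features(epoch, wavelet="db4", level=4):
    """epoch: (channels, samples) array for one motor-imagery trial."""
    feats = []
    for ch in epoch:
        for c in pywt.wavedec(ch, wavelet, level=level):
            feats += [c.mean(), c.std(), np.sum(c ** 2)]
    return np.array(feats)
```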
Abstract:
Signals recorded from the brain often show rhythmic patterns at different frequencies, which are tightly coupled to the external stimuli as well as the internal state of the subject. In addition, these signals have very transient structures, related to spiking or the sudden onset of a stimulus, with durations not exceeding tens of milliseconds. Further, brain signals are highly nonstationary, because both the behavioral state and the external stimuli can change on short time scales. It is therefore essential to study brain signals using techniques that can represent both rhythmic and transient components of the signal, something not always possible with standard signal processing techniques such as the short-time Fourier transform, multitaper method, wavelet transform, or Hilbert transform. In this review, we describe a multiscale decomposition technique based on an over-complete dictionary, called matching pursuit (MP), and show that it is able to capture both a sharp stimulus-onset transient and a sustained gamma rhythm in the local field potential recorded from the primary visual cortex. We compare the performance of MP with other techniques and discuss its advantages and limitations. Data and code for generating all time-frequency power spectra are provided.
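The greedy MP decomposition loop itself, shown over a generic unit-norm dictionary; MP analyses of LFP data typically use a Gabor dictionary, which is not constructed in this sketch:

```python
# The greedy matching pursuit loop over a generic dictionary D whose
# columns are unit-norm atoms (a Gabor dictionary in typical LFP work,
# not constructed here).
import numpy as np

def matching_pursuit(x, D, n_iter=50):
    """x: signal (n,); D: dictionary (n, n_atoms), unit-norm columns."""
    residual = x.astype(float).copy()
    atoms, gains = [], []
    for _ in range(n_iter):
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))     # best-matching atom
        atoms.append(k)
        gains.append(corr[k])
        residual -= corr[k] * D[:, k]        # peel off its contribution
    return atoms, gains, residual
```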
Abstract:
In a previous paper, a class of nonlinear systems was mapped to a so-called skeleton linear model (SLM) based on the joint time-frequency analysis method; the behavior of the nonlinear system may be indicated quantitatively by the variation of the coefficients of the SLM with its response. Using this model, we propose in this paper an identification method for nonlinear systems based on nonstationary vibration data. The key technique in the identification procedure is a time-frequency filtering method by which the solution of the SLM is extracted from the response data of the corresponding nonlinear system. Two time-frequency filtering methods are discussed: one is based on the quadratic time-frequency distribution and its inverse transform, the other on the quadratic time-frequency distribution and the wavelet transform. Both numerical examples and an experimental application are given to illustrate the validity of the technique.
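An illustrative stand-in for the time-frequency filtering step: the paper filters via quadratic time-frequency distributions, while the sketch below masks an STFT around the response ridge to the same end, so it shows the idea rather than the paper's method; all parameters are assumptions:

```python
# Illustrative stand-in: the paper filters with quadratic time-frequency
# distributions; here an STFT mask around the response ridge extracts the
# dominant (SLM) component from nonstationary vibration data.
import numpy as np
from scipy.signal import stft, istft

def ridge_filter(y, fs, half_band=5.0, nperseg=256):
    f, t, Z = stft(y, fs=fs, nperseg=nperseg)
    ridge = f[np.argmax(np.abs(Z), axis=0)]            # Hz, per frame
    mask = np.abs(f[:, None] - ridge[None, :]) <= half_band
    _, y_f = istft(Z * mask, fs=fs, nperseg=nperseg)
    return y_f[: y.size]
```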
Abstract:
The Wavelet Variable Interval Time Average (WVITA) is introduced as a method incorporating burst-event detection in wall turbulence. A wavelet transform is applied to the longitudinal fluctuating-velocity time series measured with a hot-film anemometer in the near-wall region of a turbulent boundary layer, unfolding the series simultaneously in time and in scale. The kinetic energy of the series, split among the different scales, is obtained by integrating the squared modulus of the wavelet coefficients over time. The time scale related to burst events passing the fixed probe is ascertained by a maximum criterion on the evolution of kinetic energy across scales. The wavelet-transformed localized variance of the fluctuating-velocity series at the maximum-energy scale is then put forward in place of the localized short-time-average variance of the Variable Interval Time Average (VITA) scheme. Burst-event detection results show that the WVITA scheme can avoid erroneous detections and deal more effectively with the grouping problem, which is caused by the VITA scheme itself and cannot be avoided by adjusting the threshold level or changing the short-time averaging interval.
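A hedged sketch of the WVITA detector as described: pick the scale of maximum wavelet kinetic energy, then threshold the localized variance of the coefficients at that scale; the wavelet, scale range and threshold form are assumptions of this sketch:

```python
# Sketch of the WVITA detector: find the scale of maximum wavelet kinetic
# energy, then threshold the localized variance of the coefficients at
# that scale.  Wavelet, scales and threshold form are assumptions.
import numpy as np
import pywt

def wvita_detect(u, k=1.0, wavelet="morl"):
    """u: longitudinal fluctuating-velocity series; returns burst mask."""
    scales = np.arange(2, 256)
    coef, _ = pywt.cwt(u, scales, wavelet)
    energy = np.sum(np.abs(coef) ** 2, axis=1)   # kinetic energy per scale
    w = coef[np.argmax(energy)]                  # coefficients, max scale
    local_var = np.abs(w) ** 2                   # localized variance proxy
    return local_var > k * np.var(u)             # detected burst events
```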