917 results for wavelet texture analysis


Relevance:

30.00%

Publisher:

Abstract:

Laser-plasma interferograms are currently analyzed by extracting the phase-shift map with fast Fourier transform (FFT) techniques [Appl. Opt. 18, 3101 (1985)]. This methodology works well when interferograms are only marginally affected by noise and reduced fringe visibility, but it can fail to produce accurate phase-shift maps when dealing with low-quality images. We present a novel procedure for phase-shift map computation that makes extensive use of ridge extraction in the continuous wavelet transform (CWT) framework. The CWT tool is flexible because of the wide adaptability of the analyzing basis, and it can be accurate because of the intrinsic noise reduction in the ridge extraction. A comparative analysis of the accuracy of the new tool and the FFT-based one shows that the CWT-based tool produces considerably less noisy phase maps and that it can better resolve local inhomogeneities. (C) 2001 Optical Society of America.
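The ridge-extraction idea can be illustrated with a minimal NumPy sketch. Everything below (fringe frequency, phase law, scale grid, the naive Morlet implementation) is invented for the demonstration, not the authors' code: a 1D fringe pattern is analysed with a complex Morlet CWT, the ridge is taken as the scale of maximum modulus at each pixel, and the phase along the ridge recovers the phase-shift map after carrier removal.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Naive complex-Morlet CWT, one convolution per scale (illustration only).
    For the analytic Morlet, convolving with psi equals correlating with its
    conjugate, so np.angle of the result tracks the local signal phase."""
    out = np.empty((len(scales), len(signal)), dtype=complex)
    for i, s in enumerate(scales):
        m = int(4 * s)                       # truncate the envelope at +/- 4 sigma
        t = np.arange(-m, m + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        out[i] = np.convolve(signal, psi, mode="same")
    return out

# synthetic fringe pattern with a known quadratic phase perturbation
x = np.arange(512)
true_phase = 1e-4 * (x - 256.0) ** 2
carrier = 2 * np.pi * 0.05 * x               # carrier fringes at 0.05 cycles/pixel
fringes = 1.0 + 0.8 * np.cos(carrier + true_phase)

scales = np.arange(8, 40)
W = morlet_cwt(fringes - fringes.mean(), scales)
ridge = np.abs(W).argmax(axis=0)             # ridge: scale of maximum modulus per pixel
wrapped = np.angle(W[ridge, np.arange(x.size)])
recovered = np.unwrap(wrapped) - carrier     # subtract the carrier
recovered -= recovered[256]                  # remove the arbitrary phase constant
```

On this synthetic pattern the recovered phase matches the known quadratic perturbation away from the borders; a 2D interferogram would be processed row by row.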

Relevance:

30.00%

Publisher:

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis involves methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models, used to estimate regression coefficients from synthetic but realistic data sets, including some which violate standard regression assumptions. We assess the performance of each method using two measures and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and so did not show the improvements in performance under model selection seen with the above methods. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be performed cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
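The core comparison, that generalized least squares with the right covariance outperforms ordinary least squares on spatially autocorrelated errors, can be sketched with NumPy. The covariance model, range and sample sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# n sampling locations with exponentially decaying spatial error covariance
n = 100
coords = rng.uniform(0, 10, size=(n, 2))
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-dist / 2.0) + 1e-8 * np.eye(n)   # correlation range of 2 units
L = np.linalg.cholesky(Sigma)

# the covariate is itself a spatial gradient, the hard case for OLS
X = np.column_stack([np.ones(n), coords[:, 0]])
beta_true = np.array([1.0, 0.5])

def fit(y, X, Vinv=None):
    """OLS when Vinv is None, otherwise GLS with a known inverse covariance."""
    Vinv = np.eye(len(y)) if Vinv is None else Vinv
    return np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

Sigma_inv = np.linalg.inv(Sigma)

# exact sampling variances of the slope estimator for this fixed design
XtX_inv = np.linalg.inv(X.T @ X)
var_ols = (XtX_inv @ X.T @ Sigma @ X @ XtX_inv)[1, 1]
var_gls = np.linalg.inv(X.T @ Sigma_inv @ X)[1, 1]

# small Monte Carlo check of the same comparison
ols_err, gls_err = [], []
for _ in range(200):
    y = X @ beta_true + L @ rng.normal(size=n)
    ols_err.append(fit(y, X)[1] - beta_true[1])
    gls_err.append(fit(y, X, Sigma_inv)[1] - beta_true[1])
```

Here `var_ols` and `var_gls` are the exact slope-estimator variances for this fixed design, and the Monte Carlo loop confirms the same ordering empirically.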

Relevance:

30.00%

Publisher:

Abstract:

A methodology has been developed which allows a non-specialist to rapidly design silicon wavelet transform cores. This methodology is based on a generic architecture utilizing time-interleaved coefficients for the wavelet transform filters. The architecture is scalable and has been parameterized in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is designed in such a way that the cores can also be cascaded without any interface glue logic for any desired level of decomposition. This parameterization allows the use of any orthonormal wavelet family, thereby extending the design space for improved transformation from algorithm to silicon. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage analysis respectively, are reported. The typical design time to produce the silicon layout of a wavelet-based system has been reduced by an order of magnitude. The cores are comparable in area and performance to hand-crafted designs. The designs have been captured in VHDL, so they are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
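The cascadable analysis core described here is a hardware architecture; its signal-flow behaviour can be sketched in software. The sketch below (NumPy, not the authors' VHDL) cascades one generic filter-and-decimate stage, parameterised only by its coefficient set, for a multi-level orthonormal decomposition:

```python
import numpy as np

# Daubechies-4 (db2) analysis pair -- one "core" parameterised by its coefficients
H0 = np.array([1 + np.sqrt(3), 3 + np.sqrt(3),
               3 - np.sqrt(3), 1 - np.sqrt(3)]) / (4 * np.sqrt(2))
H1 = H0[::-1] * np.array([1.0, -1.0, 1.0, -1.0])  # quadrature-mirror highpass

def analysis_stage(x, h0, h1):
    """One analysis core: filter, then decimate by 2 (periodic extension)."""
    xp = np.concatenate([x, x[: h0.size - 1]])    # circular boundary handling
    lo = np.convolve(xp, h0, mode="valid")[::2]
    hi = np.convolve(xp, h1, mode="valid")[::2]
    return lo, hi

def cascade(x, levels):
    """Cascade identical cores: the lowpass output of one stage feeds the next."""
    details, approx = [], x
    for _ in range(levels):
        approx, d = analysis_stage(approx, H0, H1)
        details.append(d)
    return approx, details

rng = np.random.default_rng(1)
x = rng.normal(size=64)
approx, details = cascade(x, 3)
```

Because the db2 pair is orthonormal, the total coefficient energy equals the input energy at any decomposition depth, which is a convenient check when cascading stages.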

Relevance:

30.00%

Publisher:

Abstract:

Biosignal measurement and processing is increasingly being deployed in ambulatory situations, particularly in connected health applications. Such an environment dramatically increases the likelihood of artifacts, which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, there is currently a considerable range of methods which can suppress or, in some cases, remove the distorting effect of such artifacts. There are, however, considerably fewer techniques available if only a single-channel measurement is available, and yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique, known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal. The CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques using both electroencephalography and functional near-infrared spectroscopy data and is shown to produce significantly improved results. © 1964-2012 IEEE.
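A full EEMD stage is too long to sketch here, but the CCA step, which separates components by lag-1 autocorrelation using only second-order statistics, can be shown compactly. The two-channel mixture below is an invented stand-in for the multidimensional signal that EEMD would produce:

```python
import numpy as np

def cca_lag1(X):
    """CCA between a multichannel signal and its one-sample delay: returned
    components are ordered by lag-1 autocorrelation (second-order statistics)."""
    X = X - X.mean(axis=1, keepdims=True)
    A, B = X[:, :-1], X[:, 1:]                     # signal and its delayed copy
    Caa, Cbb, Cab = A @ A.T, B @ B.T, A @ B.T
    # generalized eigenproblem: Caa^-1 Cab Cbb^-1 Cab^T w = rho^2 w
    M = np.linalg.solve(Caa, Cab) @ np.linalg.solve(Cbb, Cab.T)
    evals, evecs = np.linalg.eig(M)
    order = np.argsort(-evals.real)
    W = evecs[:, order].real                       # unmixing directions
    rho = np.sqrt(np.clip(evals.real[order], 0.0, 1.0))
    return W.T @ X, rho, W

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 1000)
clean = np.sin(2 * np.pi * 5 * t)                  # structured, autocorrelated source
artifact = rng.normal(size=t.size)                 # noise-like artifact
mixing = np.array([[1.0, 0.4], [0.6, 1.0]])
X = mixing @ np.vstack([clean, artifact])          # two observed channels

S, rho, W = cca_lag1(X)                            # S[0]: most autocorrelated source
```

Noise-like components end up last; zeroing them in `S` and mapping back through `np.linalg.inv(W.T)` yields the cleaned channels.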

Relevance:

30.00%

Publisher:

Abstract:

Wavelet transforms provide basis functions for time-frequency analysis and have properties that are particularly useful for compression of analogue point-on-wave transient and disturbance power-system signals. This paper evaluates the data-reduction properties of the wavelet transform using real power system data and discusses the application of the reduction method for information transfer in network communications.
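The reduction idea, keeping only the wavelet coefficients that carry the transient, can be sketched with a Haar transform in NumPy (the signal, threshold and level count are invented for illustration):

```python
import numpy as np

def haar_analysis(x):
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_synthesis(lo, hi):
    x = np.empty(2 * lo.size)
    x[0::2] = (lo + hi) / np.sqrt(2)
    x[1::2] = (lo - hi) / np.sqrt(2)
    return x

def compress(x, levels=3, thr=0.05):
    """Multi-level Haar DWT, hard-threshold the detail coefficients,
    reconstruct, and report the fraction of coefficients kept."""
    details, approx = [], x
    for _ in range(levels):
        approx, d = haar_analysis(approx)
        details.append(np.where(np.abs(d) >= thr, d, 0.0))
    kept = approx.size + sum(int(np.count_nonzero(d)) for d in details)
    for d in reversed(details):
        approx = haar_synthesis(approx, d)
    return approx, kept / x.size

# simulated point-on-wave record: power-frequency sine plus a short transient
t = np.arange(1024) / 1024.0
x = np.sin(2 * np.pi * 5 * t)
x[500:520] += 0.8 * np.exp(-np.arange(20) / 4.0)   # impulsive disturbance

xr, kept = compress(x)
rel_err = np.linalg.norm(x - xr) / np.linalg.norm(x)
```

On this record roughly a third of the coefficients survive the threshold while the reconstruction error stays under a few percent; with `thr=0.0` the reconstruction is exact, since the Haar transform is orthonormal.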

Relevance:

30.00%

Publisher:

Abstract:

Periodic monitoring of structures such as bridges is necessary, as their condition can deteriorate due to environmental conditions and ageing, causing the bridge to become unsafe. This monitoring, known as Structural Health Monitoring (SHM), can give an early warning if a bridge becomes unsafe. This paper investigates an alternative wavelet-based approach to the monitoring of bridge structures, which consists of the use of a vehicle fitted with accelerometers on its axles. A simplified vehicle-bridge interaction model is used in theoretical simulations to examine the effectiveness of the approach in detecting damage in the bridge. The accelerations of the vehicle are processed using a continuous wavelet transform, allowing a time-frequency analysis to be performed. This enables the identification of both the existence and location of damage from the vehicle response. Based on this analysis, a damage index is established. A parametric study is carried out to investigate the effect of parameters such as the bridge span length, vehicle speed, vehicle mass, damage level, signal noise level and road surface roughness on the accuracy of results. In addition, a laboratory experiment is carried out to validate the results of the theoretical analysis and assess the ability of the approach to detect changes in the bridge response.
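The localisation step can be sketched as follows: a local discontinuity in the axle acceleration produces large fine-scale wavelet coefficients at the corresponding position. The acceleration model below is a crude invention (two sinusoids plus a pulse), not the vehicle-bridge interaction model of the paper:

```python
import numpy as np

def ricker(length, a):
    """Mexican-hat (Ricker) wavelet, width parameter a, sampled symmetrically."""
    t = np.arange(length) - (length - 1) / 2.0
    return (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def fine_scale_index(signal, widths):
    """Sum of |CWT| over a few fine scales: spikes where the signal has a
    local discontinuity."""
    rows = [np.abs(np.convolve(signal, ricker(10 * int(a) + 1, a), mode="same"))
            for a in widths]
    return np.sum(rows, axis=0)

# simulated axle acceleration: smooth vehicle-bridge vibration plus a short
# pulse where the axle crosses the damaged section (sample 600 of 1000)
n, damage_at = 1000, 600
t = np.linspace(0.0, 1.0, n)
acc = 0.2 * np.sin(2 * np.pi * 12 * t) + 0.1 * np.sin(2 * np.pi * 3 * t)
acc[damage_at - 2 : damage_at + 3] += 0.5      # local discontinuity from damage

index = fine_scale_index(acc, widths=[2, 3, 4])
located = int(np.argmax(index))
```

Summing |CWT| over a few fine scales gives a simple position-wise damage index whose peak locates the discontinuity; such an index can then be thresholded or compared across bridge crossings.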

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates a wavelet-based damage detection approach for bridge structures. By analysing the continuous wavelet transform of the vehicle response, the approach aims to identify changes in the bridge response which may indicate the existence of damage. A numerical vehicle-bridge interaction model is used in simulations as part of a sensitivity study. Furthermore, a laboratory experiment is carried out to investigate the effects of varying vehicle configuration, speed and bridge damping on the ability of the vehicle to detect changes in the bridge response. The accelerations of the vehicle and bridge are processed using a continuous wavelet transform, allowing time-frequency analysis to be carried out on the responses of the laboratory vehicle-bridge interaction system. Results indicate the most favourable conditions for successful implementation of the approach.

Relevance:

30.00%

Publisher:

Abstract:

A rapid design methodology for orthonormal wavelet transform cores has been developed. This methodology is based on a generic, scalable architecture utilising time-interleaved coefficients for the wavelet transform filters. The architecture has been captured in VHDL and parameterised in terms of wavelet family, wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. Case studies for stand-alone and cascaded silicon cores, for single- and multi-stage wavelet analysis respectively, are reported. The design time to produce the silicon layout of a wavelet-based system has been reduced to typically less than a day. The cores are comparable in area and performance to handcrafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been a move towards the development of indirect structural health monitoring (SHM) techniques for bridges; the low-cost vibration-based method presented in this paper is such an approach. It consists of the use of a moving vehicle fitted with accelerometers on its axles and incorporates wavelet analysis and statistical pattern recognition. The aim of the approach is to both detect and locate damage in bridges while reducing the need for direct instrumentation of the bridge. In theoretical simulations, a simplified vehicle-bridge interaction model is used to investigate the effectiveness of the approach in detecting damage in a bridge from vehicle accelerations. For this purpose, the accelerations are processed using a continuous wavelet transform, as any discontinuity in the signal when the axle passes over a damaged section would affect the wavelet coefficients. Based on these coefficients, a damage indicator is formulated which can distinguish between different damage levels. However, it is found to be difficult to quantify damage of varying levels when the vehicle's transverse position is varied between bridge crossings. In a real bridge field experiment, damage was applied artificially to a steel truss bridge to test the effectiveness of the indirect approach in practice; for this purpose, a two-axle van was driven across the bridge at constant speed. Both bridge and vehicle acceleration measurements were recorded. The dynamic properties of the test vehicle were identified initially via free vibration tests. It was found that the resulting damage indicators for the bridge and vehicle showed similar patterns; however, it was difficult to distinguish between different artificial damage scenarios.

Relevance:

30.00%

Publisher:

Abstract:

We present the results of exploratory experiments using lexical valence extracted from brain activity using electroencephalography (EEG) for sentiment analysis. We selected 78 English words (36 for training and 42 for testing), presented as stimuli to three native English speakers. EEG signals were recorded from the subjects while they performed a mental imaging task for each word stimulus. Wavelet decomposition was employed to extract EEG features from the time-frequency domain. The extracted features were used as inputs to a sparse multinomial logistic regression (SMLR) classifier for valence classification, after univariate ANOVA feature selection. After mapping EEG signals to sentiment valences, we exploited the lexical polarity extracted from brain data to predict the valence of 12 sentences taken from the SemEval-2007 shared task, and compared it against existing lexical resources.
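SMLR is a specific Bayesian sparse classifier; as a simplified stand-in, an L1-penalised logistic regression trained by proximal gradient shows how sparsity selects a few informative features. The synthetic "wavelet features" below are random, purely for illustration:

```python
import numpy as np

def soft_threshold(w, lam):
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def sparse_logistic(X, y, lam=0.05, lr=0.1, iters=2000):
    """L1-penalised logistic regression via proximal gradient (ISTA)."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(iters):
        pr = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # predicted probabilities
        g = X.T @ (pr - y) / n                      # gradient of the log-loss
        w = soft_threshold(w - lr * g, lr * lam)    # gradient step + shrinkage
        b -= lr * float(np.mean(pr - y))
    return w, b

rng = np.random.default_rng(0)
n, p = 200, 40
X = rng.normal(size=(n, p))                  # stand-in for wavelet features
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.5, 1.0]                # only 3 features are informative
y = (X @ w_true + 0.3 * rng.normal(size=n) > 0).astype(float)

w, b = sparse_logistic(X, y)
pred = (X @ w + b > 0).astype(float)
acc = float((pred == y).mean())
```

Only a handful of weights remain non-zero, which is the property that makes sparse classifiers attractive when a wavelet decomposition produces many candidate features.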

Relevance:

30.00%

Publisher:

Abstract:

Wavelet entropy assesses the degree of order or disorder in signals and presents this complex information in a simple metric. Relative wavelet entropy assesses the similarity between the spectral distributions of two signals, again in a simple metric. Wavelet entropy is therefore potentially a very attractive tool for waveform analysis. The ability of this method to track the effects of pharmacologic modulation of vascular function on Doppler blood velocity waveforms was assessed. Waveforms were captured from the ophthalmic arteries of 10 healthy subjects at baseline, after the administration of glyceryl trinitrate (GTN) and after two doses of N(G)-nitro-L-arginine methyl ester (L-NAME), to produce vasodilation and vasoconstriction, respectively. Wavelet entropy tended to decrease from baseline in response to GTN, but significantly increased after the administration of L-NAME (mean: 1.60 ± 0.07 after 0.25 mg/kg and 1.72 ± 0.13 after 0.5 mg/kg vs. 1.50 ± 0.10 at baseline, p < 0.05). Relative wavelet entropy showed that the spectral distributions after increasing doses of L-NAME were comparable to baseline (0.07 ± 0.04 and 0.08 ± 0.03, respectively), whereas GTN produced the most dissimilar spectral distribution compared with baseline (0.17 ± 0.08, p = 0.002). Wavelet entropy can detect subtle changes in Doppler blood velocity waveform structure in response to nitric-oxide-mediated changes in arteriolar smooth muscle tone.
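The two metrics have simple definitions: wavelet entropy is the Shannon entropy of the normalised energy distribution across decomposition levels, and relative wavelet entropy is the Kullback-Leibler divergence between two such distributions. A NumPy sketch using a Haar decomposition (synthetic signals, not Doppler waveforms):

```python
import numpy as np

def haar_level_energies(x, levels=5):
    """Energy of each Haar detail level plus the final approximation."""
    energies, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        lo = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        hi = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        energies.append(float(hi @ hi))
        approx = lo
    energies.append(float(approx @ approx))
    return np.array(energies)

def wavelet_entropy(x, levels=5):
    p = haar_level_energies(x, levels)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def relative_wavelet_entropy(x, y, levels=5, eps=1e-12):
    p = haar_level_energies(x, levels); p = p / p.sum()
    q = haar_level_energies(y, levels); q = q / q.sum()
    return float((p * np.log((p + eps) / (q + eps))).sum())

# an orderly signal concentrates energy in few levels; a disordered one spreads it
ordered = np.where((np.arange(1024) // 32) % 2 == 0, 1.0, -1.0)
rng = np.random.default_rng(0)
disordered = rng.normal(size=1024)
```

An orderly signal yields low entropy, a disordered one high entropy, and the relative entropy is zero only when the two level-energy distributions coincide.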

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates a low-cost wavelet-based approach for the preliminary monitoring of bridge structures, consisting of the use of a vehicle fitted with accelerometers on its axles. The approach aims to reduce the need for direct instrumentation of the bridge. A time-frequency analysis is carried out in order to identify the existence and location of damage from vehicle accelerations. Firstly, in theoretical simulations, a simplified vehicle-bridge interaction model is used to investigate the effectiveness of the approach. A number of damage indicators are evaluated and compared. A range of parameters such as the bridge span, vehicle speed, damage level and location, signal noise and road roughness are varied in simulations. Secondly, a scaled laboratory experiment is carried out to validate the results of the theoretical analysis and assess the ability of the selected damage indicators to detect changes in the bridge response from vehicle accelerations. 

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2015

Relevance:

30.00%

Publisher:

Abstract:

Texture is a key element in the interpretation of fine-spatial-resolution remote sensing images. Textural information is usually integrated into automated image classification via texture images, often created by computing grey-level co-occurrence matrices (GLCMs). A GLCM is a histogram of the frequencies of occurrence of the pairs of pixel values present in the local windows associated with every pixel of the image, a pixel pair being defined by a given displacement and orientation. GLCMs allow the computation of more than a dozen parameters describing, in various ways, the frequency distribution, thereby creating as many distinct texture images. The GLCM approach to texture measurement has been applied mainly to monochrome remote sensing images (e.g. panchromatic images, single-frequency single-polarisation radar images). In multispectral imagery, a single spectral band, among those available, is usually chosen to generate the texture images. The question posed in this research concerns precisely this restricted use of textural information in the case of multispectral images. In fact, the visual effect of a texture is created not only by the particular arrangement of objects/pixels of different brightness, but also of different colour. Several ways of introducing this idea of multidimensional texture have been proposed in the literature. Among these, two in particular were of interest in this research. The first uses GLCMs computed band by band, and the second uses generalised GLCMs involving two spectral bands at a time. In the latter case, the procedure consists of computing the frequencies of occurrence of pairs of values taken in two different spectral bands.
This allows, in a single processing step, the "colour" of the texture elements to be taken into account to a large extent. These two approaches belong to the so-called integrative techniques. To distinguish them, we have called them here "grey textures" and "colour textures", respectively. Our research is thus a comparative analysis of the possibilities offered by these two types of textural signatures in the specific case of automated land-cover mapping from a multispectral image. A textural signature of an object or a class of objects, by analogy with spectral signatures, consists of a series of texture parameters measured on one spectral band at a time (grey textures) or on one pair of spectral bands at a time (colour textures). This research aimed not only to compare the two integrative approaches, but also to identify the composition of the textural signatures of the land-cover classes that favours their differentiation: type of texture parameters / size of the computation window / spectral bands or combinations of spectral bands. To this end, we chose a site within the territory of the Communauté Métropolitaine de Montréal (Longueuil) composed of a mosaic of land covers characteristic of a semi-urban zone (residential, industrial/commercial, woodland, agriculture, water bodies...). A SPOT-5 satellite image (4 spectral bands) with 10 m spatial resolution was used in this research.
Since an infinite number of texture images can be created by varying the GLCM computation parameters, and in order to better delimit our problem, we decided, taking into account the studies published in this field: (a) to vary the computation window from 3x3 to 21x21 pixels while fixing the displacement and orientation forming the pixel pairs at (1,1), i.e. a displacement of one pixel and an orientation of 135°; (b) to limit the GLCM analyses to eight texture parameters (contrast, correlation, standard deviation, energy, entropy, homogeneity, mean, maximum probability), all of which can be computed by Unser's fast method, an approximation of the co-occurrence matrices; (c) to form the two textural signatures from the same number of elements, chosen from an analysis of the separability (Bhattacharyya distance) of the land-cover classes; and (d) to analyse the maximum-likelihood classification results (confusion matrices, accuracies, Kappa coefficients) in order to draw conclusions on the potential of the two integrative approaches. The land-cover classes to be recognised were: low- and high-density residential, commercial/industrial, agricultural, woodland, grassed surfaces (including golf courses) and water bodies.
Our main conclusions are as follows: (a) with the exception of maximum probability, all the other texture parameters are useful in forming the textural signatures; mean and standard deviation are the most useful for grey textures, while contrast and correlation are the most useful for colour textures; (b) the overall classification accuracy reaches an acceptable score (85%) only with the colour textural signatures; this is an important improvement over classifications based solely on the spectral signatures of the land-cover classes, whose score is often around 75%; this score is reached with computation windows of about 11x11 to 15x15 pixels; (c) the colour textural signatures offer scores 5% to 10% higher than those obtained with the grey signatures, and do so with small computation windows (5x5, 7x7 and occasionally 9x9); (d) for several land-cover classes taken individually, the accuracy exceeds 90% for both types of textural signatures; (e) only one class, agriculture, is better separated from the rest by the grey textures; (f) the classes creating much of the confusion, which largely explains the overall classification score of 85%, are the two residential classes (high and low density). In conclusion, we can say that the integrative colour-texture approach applied to a 10 m resolution multispectral image offers greater potential for land-cover mapping than the integrative grey-texture approach. For several land-cover classes, an appreciable saving in the computation time of the texture parameters can be obtained by using small processing windows.
Important improvements, reaching classification accuracies of 90% and above, are expected from the use of computation windows of variable size adapted to each land-cover type. A hierarchical classification method could then be used to separate the sought classes one at a time from the rest, instead of a global classification in which integrating parameters computed with variable-size windows would inevitably lead to confusion between classes.
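The generalised two-band co-occurrence matrix underlying the "colour textures" can be sketched in NumPy, together with a few of the texture parameters listed above (the bands, the quantisation to 8 levels and the (1,1) offset are illustrative):

```python
import numpy as np

def cross_band_glcm(band_a, band_b, levels=8, offset=(1, 1)):
    """Generalised co-occurrence matrix between two quantised bands: counts
    pairs (band_a[r, c], band_b[r + dr, c + dc]), normalised to frequencies."""
    dr, dc = offset
    a = band_a[: band_a.shape[0] - dr, : band_a.shape[1] - dc]
    b = band_b[dr:, dc:]
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a.ravel(), b.ravel()), 1.0)
    return glcm / glcm.sum()

def texture_params(glcm):
    """Four of the texture parameters mentioned above."""
    i, j = np.indices(glcm.shape)
    nz = glcm[glcm > 0]
    return {
        "contrast": float(((i - j) ** 2 * glcm).sum()),
        "energy": float((glcm ** 2).sum()),
        "entropy": float(-(nz * np.log(nz)).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
    }

rng = np.random.default_rng(0)
# a smooth, banded "texture" versus a pair of uncorrelated random bands
smooth = (np.add.outer(np.arange(64), np.arange(64)) // 16) % 8
rand_a = rng.integers(0, 8, size=(64, 64))
rand_b = rng.integers(0, 8, size=(64, 64))

p_smooth = texture_params(cross_band_glcm(smooth, smooth))
p_rand = texture_params(cross_band_glcm(rand_a, rand_b))
```

Computing the matrix between a band and itself gives the classical grey-texture case; passing two different bands gives the colour-texture case with no change to the code.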

Relevance:

30.00%

Publisher:

Abstract:

The present study aims to understand the sedimentological properties of the Kodungallur and Chellanam coasts of central Kerala, to bring out the relationship between the textural, mineralogical and geochemical characteristics and the respective environment. The grain size of the beach ridge sediments from different pits has been investigated at close intervals, which makes it possible to follow grain-size variations with depth. The sediment samples from the various pits of the beach ridges indicate that the sediments range primarily from medium to very fine sand, well to moderately sorted, fine to coarse skewed and leptokurtic to platykurtic. The study area is considered a prograding coast. Variations in grain size down the pits reveal three phases of beach-building activity: a coarsening-upward sequence in the bottom layers, a fining-upward sequence in the middle and a coarsening-upward sequence at the top. The beach ridges are formed by swash-built sediments with cross bedding and settling-lag-type sediments with seaward-dipping/horizontal units. Geochemical signatures in the study area have been brought out through the analysis of major and trace elements. Iron is significantly enriched, and its control over many trace elements is evident. Copper, chromium, cobalt, lithium, lead and zinc show a decreasing trend with depth, while sodium, potassium, strontium, nickel and organic carbon increase. The association of many trace elements with organic carbon has also been established. Dissolution of trace elements in the anoxic environment at depth and their reprecipitation in the oxic layers at or near the subsurface are the major mechanisms behind the variation of certain environmentally sensitive elements.
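The graphic grain-size statistics referred to here (sorting, skewness, kurtosis) are conventionally computed with the Folk and Ward percentile formulas; a NumPy sketch on an invented, well-sorted fine-sand sample in phi units:

```python
import numpy as np

def folk_ward(phi):
    """Folk & Ward (1957) graphic grain-size statistics from a sample of
    grain sizes expressed in phi units."""
    p5, p16, p25, p50, p75, p84, p95 = np.percentile(
        phi, [5, 16, 25, 50, 75, 84, 95])
    mean = (p16 + p50 + p84) / 3.0                       # graphic mean
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6       # inclusive graphic sd
    skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))    # inclusive skewness
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurt = (p95 - p5) / (2.44 * (p75 - p25))             # graphic kurtosis
    return mean, sorting, skew, kurt

# synthetic well-sorted fine sand: phi values ~ N(2.5, 0.35)
rng = np.random.default_rng(0)
fine_sand = rng.normal(loc=2.5, scale=0.35, size=5000)
mean, sorting, skew, kurt = folk_ward(fine_sand)
```

For this synthetic sample the mean falls in the fine-sand range (2-3 phi), the sorting is below 0.5 phi (well sorted) and the kurtosis is near 1 (mesokurtic), matching the descriptive vocabulary used in the abstract.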