995 results for Wavelet transformation


Relevance: 100.00%

Abstract:

This paper presents a multi-stage algorithm for the dynamic condition monitoring of a gear. The algorithm provides information on the gear status (faulty or normal condition) and estimates the mesh stiffness per shaft revolution whenever an abnormality is detected. In the first stage, the analysis of coefficients generated by the discrete wavelet transformation (DWT) is proposed as a fault detection and localization tool. The second stage establishes the mesh stiffness reduction associated with local failures by applying supervised learning coupled with analytical models. To do this, a multi-layer perceptron neural network is configured whose input features are statistical parameters sensitive to torsional stiffness decrease and derived from wavelet transforms of the response signal. The proposed method is applied to gear condition monitoring, and the results show that it can update the mesh dynamic properties of the gear online.
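The first stage can be illustrated with a minimal sketch (not the paper's implementation): a one-level Haar DWT whose detail coefficients flag and localize an impulsive fault. The signal, fault position and threshold below are illustrative assumptions.

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: returns (trends, details)."""
    s = 0.5 ** 0.5  # 1/sqrt(2) normalisation
    trends = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return trends, details

def detect_fault(signal, threshold):
    """Indices (in detail-coefficient time) whose magnitude exceeds the threshold."""
    _, details = haar_dwt(signal)
    return [i for i, d in enumerate(details) if abs(d) > threshold]

# Smooth meshing vibration with one impulsive defect added at sample 40.
signal = [math.sin(2 * math.pi * k / 16) for k in range(64)]
signal[40] += 5.0  # simulated local tooth fault
faults = detect_fault(signal, threshold=1.0)  # detail index 20 covers samples 40-41
```

Because each detail coefficient covers a known pair of samples, a flagged coefficient both detects and localizes the abnormality, which is the role the paper assigns to the first stage.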

Relevance: 100.00%

Abstract:

Objective: To examine the relationship between the auditory brain-stem response (ABR) and its reconstructed waveforms following discrete wavelet transformation (DWT), and to comment on the resulting implications for ABR DWT time-frequency analysis. Methods: ABR waveforms were recorded from 120 normal-hearing subjects at 90, 70, 50, 30, 10 and 0 dBnHL, decomposed using a 6-level discrete wavelet transformation (DWT), and reconstructed at individual wavelet scales (frequency ranges) A6, D6, D5 and D4. These waveforms were then compared for general correlations, and for patterns of change due to stimulus level and subject age, gender and test ear. Results: The reconstructed ABR DWT waveforms showed 3 primary components: a large-amplitude waveform in the low-frequency A6 scale (0-266.6 Hz) with its single peak corresponding in latency with ABR waves III and V; a mid-amplitude waveform in the mid-frequency D6 scale (266.6-533.3 Hz) with its first 5 waves corresponding in latency to ABR waves I, III, V, VI and VII; and a small-amplitude, multiple-peaked waveform in the high-frequency D5 scale (533.3-1066.6 Hz) with its first 7 waves corresponding in latency to ABR waves I, II, III, IV, V, VI and VII. Comparisons between ABR waves I, III and V and their corresponding reconstructed ABR DWT waves showed strong correlations and similar, reliable, and statistically robust changes due to stimulus level and subject age, gender and test ear groupings. Limiting these findings, however, was the unexplained absence of a small number (2%, or 117/6720) of reconstructed ABR DWT waves, despite their corresponding ABR waves being present. Conclusions: Reconstructed ABR DWT waveforms can be used as valid time-frequency representations of the normal ABR, but with some limitations. In particular, the unexplained absence of a small number of reconstructed ABR DWT waves in some subjects, probably resulting from the lack of shift invariance inherent in the DWT process, needs to be addressed.
Significance: This is the first report of the relationship between the ABR and its reconstructed ABR DWT waveforms in a large normative sample. (C) 2004 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
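The decompose-then-reconstruct-per-scale scheme can be sketched with a Haar wavelet standing in for the study's mother wavelet; the signal and the number of levels are illustrative. Because the transform is linear, the per-scale reconstructions sum back to the original waveform:

```python
def haar_step(x):
    """One forward Haar step: approximation and detail halves."""
    s = 0.5 ** 0.5
    a = [(x[i] + x[i + 1]) * s for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) * s for i in range(0, len(x), 2)]
    return a, d

def ihaar_step(a, d):
    """Inverse of haar_step."""
    s = 0.5 ** 0.5
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) * s, (ai - di) * s]
    return x

def wavedec(x, levels):
    """Return [A_L, D_L, ..., D_1], as in the paper's multilevel decomposition."""
    coeffs = []
    for _ in range(levels):
        x, d = haar_step(x)
        coeffs.insert(0, d)
    coeffs.insert(0, x)
    return coeffs

def reconstruct_scale(coeffs, keep):
    """Rebuild the signal from one scale only, with all other scales zeroed."""
    zeroed = [c if i == keep else [0.0] * len(c) for i, c in enumerate(coeffs)]
    a = zeroed[0]
    for d in zeroed[1:]:
        a = ihaar_step(a, d)
    return a

x = [float(i % 7) for i in range(64)]          # illustrative waveform
coeffs = wavedec(x, 3)                         # [A3, D3, D2, D1]
parts = [reconstruct_scale(coeffs, k) for k in range(len(coeffs))]
recon = [sum(vals) for vals in zip(*parts)]    # per-scale parts sum to x
```

Each `parts[k]` is a single-scale waveform analogous to the A6/D6/D5/D4 reconstructions examined in the study.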

Relevance: 100.00%

Abstract:

The article describes a two-stage method for the preliminary segmentation of a speech signal using the wavelet transformation. In the first stage, sibilants and pauses are located; in the second, the remaining parts of the signal are further segmented.
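A minimal sketch of the first stage, under the assumption (not stated in the abstract) that sibilants show detail-dominated energy, pauses show low total energy, and voiced segments show trend-dominated energy in a one-level Haar decomposition; the frames and threshold are illustrative:

```python
import math

def haar_energies(frame):
    """Trend and detail energies of a one-level Haar decomposition."""
    s = 0.5 ** 0.5
    trend = [(frame[i] + frame[i + 1]) * s for i in range(0, len(frame), 2)]
    detail = [(frame[i] - frame[i + 1]) * s for i in range(0, len(frame), 2)]
    return sum(t * t for t in trend), sum(d * d for d in detail)

def label_frame(frame, e_min=0.5):
    """Crude frame labelling: pause, sibilant, or voiced."""
    e_t, e_d = haar_energies(frame)
    if e_t + e_d < e_min:
        return "pause"
    return "sibilant" if e_d > e_t else "voiced"

# Illustrative frames: silence, a high-frequency burst, a low-frequency tone.
pause = [0.0] * 32
sibilant = [(-1.0) ** k for k in range(32)]            # alternating = pure detail
voiced = [math.sin(2 * math.pi * k / 32) for k in range(32)]
labels = [label_frame(f) for f in (pause, sibilant, voiced)]
```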

Relevance: 100.00%

Abstract:

Communication has become an essential function in our civilization. With the increasing demand for communication channels, it is now necessary to find ways to optimize the use of their bandwidth. One way to achieve this is by transforming the information before it is transmitted. This transformation can be performed by several techniques, one of the newest being the use of wavelets. Wavelet transformation refers to breaking a signal down into components called details and trends by using small waveforms that have a zero average in the time domain. After this transformation, the data can be compressed by discarding the details and transmitting only the trends; at the receiving end, the trends are used to reconstruct the image. In this work, the wavelet used for the transformation of an image is selected from a library of available bases. The accuracy of the reconstruction, after the details are discarded, depends on the wavelets chosen from the wavelet basis library. The system developed in this thesis takes a 2-D image and decomposes it using a wavelet bank. A digital signal processor is used to achieve near real-time performance in this transformation task. A contribution of this thesis project is the development of a DSP-based test bed for the future development of new real-time wavelet transformation algorithms.
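The discard-the-details compression scheme can be sketched with a one-level 2-D Haar transform; the 4×4 image is an illustrative stand-in, and the real system would select its wavelet from a basis library rather than fix Haar:

```python
def haar_pairs(v):
    """Haar trends and details of a 1-D vector."""
    s = 0.5 ** 0.5
    return ([(v[i] + v[i + 1]) * s for i in range(0, len(v), 2)],
            [(v[i] - v[i + 1]) * s for i in range(0, len(v), 2)])

def compress_2d(image):
    """One-level 2-D Haar: keep only the trend (LL) quarter, discard details."""
    rows = [haar_pairs(r)[0] for r in image]            # row trends
    cols = list(zip(*rows))
    ll = [haar_pairs(list(c))[0] for c in cols]         # column trends
    return [list(r) for r in zip(*ll)]                  # quarter-size image

def reconstruct_2d(ll):
    """Rebuild a full-size approximation from the trends alone (details = 0)."""
    return [[ll[i // 2][j // 2] * 0.5 for j in range(2 * len(ll[0]))]
            for i in range(2 * len(ll))]

image = [[float(i + j) for j in range(4)] for i in range(4)]
approx = reconstruct_2d(compress_2d(image))  # each 2x2 block becomes its mean
```

Transmitting only `compress_2d(image)` quarters the payload; the receiver's `reconstruct_2d` recovers a block-averaged approximation, which is exactly the accuracy/bandwidth trade-off the thesis studies.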

Relevance: 70.00%

Abstract:

We propose a method to encode 3D magnetic resonance image data, together with a decoder, in such a way that fast access to any 2D image is possible by decoding only the corresponding information from each subband image, thus minimizing decoding time. This is of immense use to the medical community, because most PET and MRI data are volumetric. Preprocessing is carried out at every level before the wavelet transformation to enable easier identification of the coefficients belonging to each subband image. The inclusion of special characters in the bit stream facilitates access to the corresponding information in the encoded data. Results are obtained by applying Daub4 along the x (row) and y (column) directions and Haar along the z (slice) direction. Comparable results are achieved with respect to the existing technique, while decoding time is reduced by a factor of 1.98. Arithmetic coding is used to encode the corresponding information independently.
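The marker-based random access idea can be sketched as follows; the text serialization and the `MARKER` character are illustrative stand-ins for the paper's bit stream and special characters, and the wavelet and arithmetic coding steps are omitted:

```python
MARKER = "|"  # stands in for the paper's special separator characters

def encode_volume(slices):
    """Serialise each z-slice independently, separated by a marker."""
    return MARKER.join(",".join(repr(v) for v in s) for s in slices)

def decode_slice(stream, k):
    """Decode only slice k: locate its payload by markers, skip all others."""
    payload = stream.split(MARKER)[k]
    return [float(v) for v in payload.split(",")]

# Illustrative 3-slice volume, each slice flattened to 4 values.
volume = [[float(z * 10 + i) for i in range(4)] for z in range(3)]
stream = encode_volume(volume)
slice1 = decode_slice(stream, 1)  # only slice 1's payload is parsed
```

Because each slice's payload sits between known markers, a 2D image can be recovered without decoding the rest of the volume, which is the source of the reported decoding-time reduction.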

Relevance: 70.00%

Abstract:

This paper introduces a method to classify EEG signals using features extracted by an integration of the wavelet transform and the nonparametric Wilcoxon test. Orthogonal Haar wavelet coefficients are ranked based on the Wilcoxon test statistic, and the most discriminative wavelet coefficients are assembled into a feature set that serves as input to a naïve Bayes classifier. Two benchmark datasets, Ia and Ib, downloaded from the brain–computer interface (BCI) competition II, are employed for the experiments. Classification performance is evaluated using accuracy, mutual information, the Gini coefficient and the F-measure. Widely used classifiers, including the feedforward neural network, support vector machine, k-nearest neighbours, Adaboost ensemble learning and the adaptive neuro-fuzzy inference system, are also implemented for comparison. The proposed combination of Haar wavelet features and the naïve Bayes classifier considerably outperforms the competing classification approaches and surpasses the best performance on the Ia and Ib datasets reported in BCI competition II. The naïve Bayes classifier also offers low computational cost, which promotes the implementation of a potential real-time BCI system.
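A minimal sketch of the ranking-and-classification pipeline, with a hand-rolled Wilcoxon rank-sum statistic and a 1-D Gaussian naïve Bayes standing in for the full implementation; the toy feature matrices are illustrative:

```python
import math
from statistics import mean, pstdev

def ranksum(a, b):
    """Wilcoxon rank-sum statistic of sample a within the pooled ranking."""
    pooled = sorted((v, g) for g, s in ((0, a), (1, b)) for v in s)
    return sum(r + 1 for r, (_, g) in enumerate(pooled) if g == 0)

def select_feature(class0, class1):
    """Pick the column whose rank-sum deviates most from its null expectation."""
    n0, n1 = len(class0), len(class1)
    expected = n0 * (n0 + n1 + 1) / 2
    scores = [abs(ranksum([x[j] for x in class0], [x[j] for x in class1]) - expected)
              for j in range(len(class0[0]))]
    return scores.index(max(scores))

def gaussian_nb_predict(x, class_values):
    """1-D Gaussian naive Bayes with equal priors."""
    def loglik(v, data):
        m, s = mean(data), pstdev(data) or 1e-9
        return -((v - m) ** 2) / (2 * s * s) - math.log(s)
    return max(range(len(class_values)), key=lambda c: loglik(x, class_values[c]))

# Toy "wavelet coefficients": column 1 separates the classes, column 0 overlaps.
c0 = [[0.1, 5.0], [0.2, 6.0], [0.15, 5.5]]
c1 = [[0.12, 1.0], [0.18, 0.5], [0.16, 1.5]]
j = select_feature(c0, c1)                 # Wilcoxon picks the separating column
pred = gaussian_nb_predict(5.8, [[x[j] for x in c0], [x[j] for x in c1]])
```

In the paper's setting, the columns would be Haar wavelet coefficients of EEG trials and several top-ranked columns would feed the classifier instead of one.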

Relevance: 70.00%

Abstract:

The wavelet transform and the Lipschitz exponent perform well in detecting signal singularities. With the bridge crack damage modeled as rotational springs based on fracture mechanics, the deflection time history of the beam under a moving load is determined with a numerical method. The continuous wavelet transformation (CWT) is applied to the deflection of the beam to identify the location of the damage, and the Lipschitz exponent is used to evaluate the damage degree. The influence of different damage degrees, multiple damage sites, different sensor locations, load velocity and load magnitude is studied. The feasibility of the method is also verified by a model experiment.
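Singularity localization by the CWT can be sketched as follows, using a Mexican-hat wavelet and a tent-shaped deflection with a slope discontinuity standing in for the cracked beam; all signal details are illustrative assumptions:

```python
import math

def mexican_hat(t):
    """Mexican-hat (Ricker) wavelet, unnormalised."""
    return (1.0 - t * t) * math.exp(-t * t / 2.0)

def cwt_row(signal, scale):
    """One scale of the CWT: correlate the signal with the dilated wavelet."""
    n, half = len(signal), 4 * scale
    row = []
    for b in range(n):
        acc = sum(signal[k] * mexican_hat((k - b) / scale)
                  for k in range(max(0, b - half), min(n, b + half + 1)))
        row.append(acc / math.sqrt(scale))
    return row

# Tent-shaped deflection with a slope discontinuity (the "crack") at sample 50.
signal = [min(k, 100 - k) / 50.0 for k in range(101)]
row = cwt_row(signal, scale=2)
# Ignore boundary-distorted coefficients when locating the singularity.
damage_location = max(range(4, 97), key=lambda b: abs(row[b]))
```

The zero-mean wavelet responds only weakly to the smooth (here, piecewise linear) deflection, so the coefficient magnitude peaks at the slope break; tracking how this peak decays across scales is what the Lipschitz-exponent stage would add.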

Relevance: 60.00%

Abstract:

This paper proposes an abundance-based image retrieval method. On the basis of the wavelet transform, an average edge repetition degree is defined to describe abundance, and a feature space for similarity measurement is constructed from it. Based on this result, a seabed resource image database can be retrieved rapidly in the sense of abundance and a resource distribution map can be formed. Finally, experimental results of retrieval on a seabed resource image database are given.

Relevance: 60.00%

Abstract:

At least since Harry Markowitz formulated modern portfolio theory (1952), active portfolio management strategies have received particular attention in both research and investment practice. This thesis is situated at the interface between neoclassical capital market theory and technical analysis. It examines the extent to which a passive buy-and-hold strategy, the only one consistent with Fama's (1970) efficient market hypothesis, can be beaten by active strategies. The author presents a wavelet-based approach to the analysis of financial time series. The wavelet transformation is used as a mathematical data-preparation tool: it provides a multiscale representation of a time series by splitting it into an approximation series and detail series without any loss of information. The thesis restricts itself to the Daubechies wavelets. The multiscale representation serves as the basis for the development of two technical indicators. The wavelet stochastic indicator takes up the idea of the well-known stochastic indicator but uses the approximation series rather than the price series as its input. An investment strategy based on this indicator is subjected to an extensive sensitivity analysis, which shows that a buy-and-hold strategy can indeed be outperformed. The idea of the momentum indicator is taken up by the wavelet momentum indicator, which uses the detail series as input; in the sensitivity analysis of a wavelet momentum strategy, however, the buy-and-hold strategy is not always beaten. A wavelet-based forecasting model uses the multiscale representation in a similar way to the technical indicators: the approximation series are extrapolated with a second-degree polynomial and the detail series with sinusoidal regression. Aggregating the extrapolated series yields forecast security prices. Combined trading strategies show how the wavelet stochastic indicator, the wavelet momentum indicator and the wavelet-based forecasting model can be linked; by combining individual strategies, the buy-and-hold strategy can be beaten. The final part of the thesis deals with the modeling of portfolios of trading systems, aiming at simultaneous diversification across assets and strategies under continuous optimization. This procedure is understood as a systematic investment process bound to specific optimization criteria, with which a passive buy-and-hold strategy can be outperformed. The thesis establishes a systematic link between the discrete wavelet transformation and technical quantitative investment strategies, and also examines the problem areas of the otherwise promising use of the wavelet transformation in technical analysis.
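The wavelet stochastic indicator idea, a stochastic oscillator computed on the wavelet approximation series instead of the raw prices, can be sketched as follows; the Haar averaging, price series, window and buy threshold are illustrative stand-ins, not the thesis's Daubechies-based implementation:

```python
def haar_approximation(prices, levels=1):
    """Haar approximation (trend) series, upsampled back to price length."""
    a = list(prices)
    for _ in range(levels):
        a = [(a[i] + a[i + 1]) / 2.0 for i in range(0, len(a) - 1, 2)]
    return [a[min(i >> levels, len(a) - 1)] for i in range(len(prices))]

def stochastic_k(series, window=5):
    """%K oscillator: position of the last value within the window's range."""
    out = []
    for i in range(len(series)):
        w = series[max(0, i - window + 1):i + 1]
        lo, hi = min(w), max(w)
        out.append(50.0 if hi == lo else 100.0 * (series[i] - lo) / (hi - lo))
    return out

# Illustrative uptrending price series.
prices = [100, 101, 103, 102, 105, 107, 106, 109, 111, 110, 112, 115]
k_values = stochastic_k(haar_approximation([float(p) for p in prices]))
signal = "buy" if k_values[-1] > 80 else "hold"
```

Feeding the oscillator the smoothed approximation series rather than the noisy prices is the indicator's core design idea; the detail series, analogously, would drive the wavelet momentum indicator.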

Relevance: 60.00%

Abstract:

This paper presents a human daily activity classification approach based on sensory data collected from a single tri-axial accelerometer worn on a waist belt. The classification algorithm distinguishes six activities (standing, jumping, sitting down, walking, running and falling) through three major steps: wavelet transformation, Principal Component Analysis (PCA)-based dimensionality reduction, and classification with a radial basis function (RBF) kernel Support Vector Machine (SVM). Two trials were conducted to evaluate different aspects of the classification scheme. In the first trial, the classifier was trained and evaluated on a dataset of 420 samples collected from seven subjects using k-fold cross-validation. The parameters σ and c of the RBF kernel were optimized by automatic search to yield the highest recognition accuracy and robustness. In the second trial, the generalization capability of the classifier was validated on a dataset collected from six new subjects. Average classification rates of 95% and 93% were obtained in trials 1 and 2, respectively; the results of trial 2 show that the system also classifies activity signals of new subjects well. It can be concluded that the combined effects of single-accelerometer sensing, the chosen accelerometer placement and an efficient classifier make this wearable sensing system more realistic and comfortable for long-term human activity monitoring and classification in ambulatory environments, and therefore more acceptable to users.
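The PCA-based dimensionality reduction step can be sketched via power iteration on the covariance matrix; the toy 2-D feature data are illustrative, and the wavelet and RBF-SVM stages are omitted:

```python
def pca_first_component(data, iters=200):
    """Leading principal component via power iteration on the covariance."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - means[j] for j in range(d)] for row in data]
    v = [1.0] * d
    for _ in range(iters):
        # Apply C = X^T X / n to v without forming C explicitly.
        proj = [sum(x[j] * v[j] for j in range(d)) for x in centred]
        w = [sum(proj[i] * centred[i][j] for i in range(n)) / n for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return means, v

def project(row, means, v):
    """1-D PCA score of one sample."""
    return sum((row[j] - means[j]) * v[j] for j in range(len(v)))

# Toy features: column 0 carries nearly all variance, column 1 is almost constant.
data = [[0.0, 1.0], [2.0, 1.1], [4.0, 0.9], [6.0, 1.0]]
means, v = pca_first_component(data)
scores = [project(row, means, v) for row in data]  # reduced 1-D representation
```

In the paper's pipeline, the rows would be wavelet features of accelerometer windows and several leading components would be kept before the SVM stage.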

Relevance: 60.00%

Abstract:

Healthcare plays an important role in promoting the general health and well-being of people around the world. The difficulty in healthcare data classification arises from the uncertainty and the high-dimensional nature of the medical data collected. This paper proposes an integration of the fuzzy standard additive model (SAM) with a genetic algorithm (GA), called GSAM, to deal with uncertainty and computational challenges. The GSAM learning process comprises three consecutive steps: rule initialization by unsupervised learning using adaptive vector quantization clustering, evolutionary rule optimization by the GA, and parameter tuning by gradient-descent supervised learning. Wavelet transformation is employed to extract discriminative features from high-dimensional datasets. GSAM becomes highly capable when deployed with a small number of wavelet features, as its computational burden is remarkably reduced. The proposed method is evaluated using two frequently used medical datasets: the Wisconsin breast cancer and Cleveland heart disease datasets from the UCI machine learning repository. Experiments are organized with five-fold cross-validation, and the performance of the classification techniques is measured by a number of important metrics: accuracy, F-measure, mutual information and area under the receiver operating characteristic curve. Results demonstrate the superiority of GSAM compared to other machine learning methods including the probabilistic neural network, support vector machine, fuzzy ARTMAP, and adaptive neuro-fuzzy inference system. The proposed approach is thus helpful as a decision support system for medical practitioners in healthcare practice.
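The evolutionary rule-optimization step can be sketched with a minimal genetic algorithm; the bit-string encoding and one-max fitness below are illustrative stand-ins for GSAM's fuzzy-rule chromosomes and rule-quality measure:

```python
import random

def evolve(fitness, length, pop_size=20, generations=40, seed=7):
    """Minimal GA: tournament selection, one-point crossover, bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)          # tournament of size 2
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:             # occasional bit-flip mutation
                child[rng.randrange(length)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness standing in for rule quality: count of set bits (one-max).
best = evolve(lambda bits: sum(bits), length=12)
```

GSAM would encode fuzzy rule parameters in each chromosome and score them by classification quality, but the select-recombine-mutate loop is the same.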

Relevance: 60.00%

Abstract:

This paper proposes a combination of the fuzzy standard additive model (SAM) with wavelet features for medical diagnosis. Wavelet transformation is used to reduce the dimension of high-dimensional datasets. This helps to improve the convergence speed of the supervised learning process of the fuzzy SAM, which carries a heavy computational burden on high-dimensional data. The fuzzy SAM becomes highly capable when deployed with wavelet features, a combination that remarkably reduces its computational training burden. The performance of the proposed methodology is examined on two frequently used medical datasets covering breast cancer and heart disease. Experiments are deployed with five-fold cross-validation. Results demonstrate the superiority of the proposed method compared to other machine learning methods including the probabilistic neural network, support vector machine, fuzzy ARTMAP, and adaptive neuro-fuzzy inference system. Faster convergence combined with higher accuracy makes the proposed approach a win-win solution.

Relevance: 60.00%

Abstract:

Understanding neural functions requires knowledge gained from analysing electrophysiological data. The process of assigning the spikes of a multichannel signal to clusters, called spike sorting, is one of the important problems in such analysis. Various automated spike sorting techniques exist, each with advantages and disadvantages in accuracy and computational cost. Developing spike sorting methods that are both highly accurate and computationally inexpensive therefore remains a challenge in biomedical engineering practice.

Relevance: 60.00%

Abstract:

Understanding neural functions requires observing the activities of single neurons, which are represented via electrophysiological data. Processing and understanding these data are challenging problems in biomedical engineering. A microelectrode commonly records the activity of multiple neurons, and spike sorting is the process of assigning every single action potential (spike) to a particular neuron. This paper proposes a combination of diffusion maps (DM) and the mean shift clustering method for spike sorting. DM is utilized to extract spike features, which are highly capable of discriminating different spike shapes. Mean shift clustering provides automatic unsupervised clustering, taking the features extracted by DM as inputs. Experimental results show that the features extracted by DM noticeably outperform those selected by wavelet transformation (WT). Accordingly, the proposed integrated method is significantly superior in spike sorting accuracy to the popular existing combination of WT and superparamagnetic clustering.
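Mean shift clustering can be sketched as follows; the Gaussian kernel, bandwidth and toy 2-D "spike feature" blobs are illustrative, and the diffusion-map feature extraction step is omitted:

```python
import math

def mean_shift(points, bandwidth=1.0, iters=50):
    """Gaussian-kernel mean shift: each point climbs to its density mode."""
    modes = [list(p) for p in points]
    for _ in range(iters):
        for i, m in enumerate(modes):
            weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(m, p))
                                / (2 * bandwidth ** 2)) for p in points]
            total = sum(weights)
            modes[i] = [sum(w * p[j] for w, p in zip(weights, points)) / total
                        for j in range(len(m))]
    # Merge modes closer than the bandwidth into shared cluster labels.
    labels, centres = [], []
    for m in modes:
        for c, ctr in enumerate(centres):
            if sum((a - b) ** 2 for a, b in zip(m, ctr)) ** 0.5 < bandwidth:
                labels.append(c)
                break
        else:
            centres.append(m)
            labels.append(len(centres) - 1)
    return labels

# Two well-separated 2-D "spike feature" blobs, one per putative neuron.
points = [(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2), (5.0, 5.0), (5.2, 4.9), (4.9, 5.1)]
labels = mean_shift(points, bandwidth=1.0)
```

Notably, the number of clusters (neurons) is not specified in advance; it emerges from the density modes, which is what makes the combination with DM features fully unsupervised.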

Relevance: 60.00%

Abstract:

Spike sorting plays an important role in analysing electrophysiological data and understanding neural functions. Developing spike sorting methods that are highly accurate and computationally inexpensive remains a challenge in biomedical engineering practice. This paper proposes an automatic unsupervised spike sorting method using landmark-based spectral clustering (LSC) in connection with features extracted by the locality preserving projection (LPP) technique. The gap statistic is employed to estimate the number of clusters before the LSC is performed. Experimental results show that LPP spike features are more discriminative than those of the popular wavelet transformation (WT). Accordingly, the proposed LPP-LSC method demonstrates a significant advantage over the existing combination of WT feature extraction and superparamagnetic clustering. LPP and LSC are both linear algorithms that help reduce the computational burden, so their combination can be applied to real-time spike analysis.