996 results for multiresolution analysis (MRA)


Relevance:

100.00%

Publisher:

Abstract:

Water temperature measurements from Wivenhoe Dam offer a unique opportunity for studying fluctuations of temperature in a subtropical dam as a function of time and depth. Cursory examination of the data indicates a complicated structure across both time and depth. We propose simplifying the task of describing these data by breaking the time series at each depth into physically meaningful components that individually capture daily, subannual, and annual (DSA) variations. Precise definitions for each component are formulated in terms of a wavelet-based multiresolution analysis. The DSA components are approximately pairwise uncorrelated within a given depth and between different depths. They also satisfy an additive property in that their sum is exactly equal to the original time series. Each component is based upon a set of coefficients that decomposes the sample variance of each time series exactly across time and that can be used to study both time-varying variances of water temperature at each depth and time-varying correlations between temperatures at different depths. Each DSA component is amenable to studying a certain aspect of the relationship between the series at different depths. The daily component is in general weakly correlated between depths, including those that are adjacent to one another. The subannual component quantifies seasonal effects and in particular isolates phenomena associated with the thermocline, thus simplifying its study across time. The annual component can be used for a trend analysis. The descriptive analysis provided by the DSA decomposition is a useful precursor to a more formal statistical analysis.
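
As a rough illustration of the additive wavelet decomposition described above, the sketch below builds MRA components with PyWavelets by zeroing all but one coefficient set and reconstructing. The hourly sampling, the db4 wavelet, and the grouping of detail levels into daily, subannual, and annual bands are illustrative assumptions, not the authors' DSA definitions.

```python
# Minimal additive-MRA sketch (assumptions: hourly sampling, db4, 10 levels).
import numpy as np
import pywt

def mra_components(x, wavelet="db4", level=10):
    """Additive MRA: returns [smooth, D_level, ..., D_1]; the pieces sum to x."""
    coeffs = pywt.wavedec(x, wavelet, mode="periodization", level=level)
    comps = []
    for k in range(len(coeffs)):
        kept = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet, mode="periodization")[: len(x)])
    return comps

# Toy hourly series; a real DSA analysis would tie the level boundaries to the
# actual sampling interval and record length.
rng = np.random.default_rng(0)
x = rng.normal(size=2**14)
comps = mra_components(x)
smooth, details = comps[0], comps[1:]        # details[0] coarsest ... details[-1] finest
daily = np.sum(details[-4:], axis=0)         # finest scales, up to ~1 day (assumed)
subannual = np.sum(details[:-4], axis=0)     # intermediate scales
annual = smooth                              # coarsest smooth stands in for the annual part
assert np.allclose(daily + subannual + annual, x)   # additivity holds by construction
```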

Relevance:

100.00%

Publisher:

Abstract:

Multiresolution histograms have been used for indexing and retrieval of images. Traditionally, multiresolution histograms are 2D histograms that encode pixel intensities. We previously proposed a method for decomposing images by connectivity. In this paper, we propose to encode the centroidal distances of an image in multiresolution histograms, with the image decomposed a priori by connectivity. The multiresolution histograms thus obtained are 3D histograms that encode connectivity and centroidal distances. The statistical technique of Principal Component Analysis is applied to the multiresolution 3D histograms, and the resulting data are used to index images. The distance between two images is computed as the L2 difference of their principal components. Experiments are performed on Item S8 of the MPEG-7 image dataset. We also analyse the effect of pixel-intensity thresholding on multiresolution images.
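
A hedged sketch of the indexing idea follows. It is not the paper's pipeline (no decomposition by connectivity and no centroidal distances): plain intensity histograms over a 2x2-averaging pyramid stand in for the 3D histograms, PCA is computed with a plain SVD, and images are retrieved by the L2 distance of their projected vectors.

```python
# Illustrative multiresolution-histogram indexing sketch (simplified stand-in).
import numpy as np

def multires_histogram(img, levels=4, bins=16):
    feats = []
    x = img.astype(float)
    for _ in range(levels):
        h, _ = np.histogram(x, bins=bins, range=(0, 256), density=True)
        feats.append(h)
        # downsample by averaging 2x2 blocks (crop to even size first)
        x = x[: x.shape[0] // 2 * 2, : x.shape[1] // 2 * 2]
        x = x.reshape(x.shape[0] // 2, 2, x.shape[1] // 2, 2).mean(axis=(1, 3))
    return np.concatenate(feats)

def pca_project(features, n_components=8):
    mean = features.mean(axis=0)
    centred = features - mean
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    basis = vt[:n_components]
    return centred @ basis.T, mean, basis

# Index a toy database and query by L2 distance in the PCA space.
rng = np.random.default_rng(1)
db_images = [rng.normal(loc=40 + 10 * i, scale=20, size=(64, 64)).clip(0, 255)
             for i in range(20)]
feats = np.stack([multires_histogram(im) for im in db_images])
proj, mean, basis = pca_project(feats)
query = (db_images[3] + rng.normal(scale=3, size=(64, 64))).clip(0, 255)
q = (multires_histogram(query) - mean) @ basis.T
best = int(np.argmin(np.linalg.norm(proj - q, axis=1)))   # expected to return 3
print(best)
```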

Relevance:

100.00%

Publisher:

Abstract:

A multiresolution technique based on a multiwavelet scale-space representation for stereo correspondence estimation is presented. The technique uses the well-known coarse-to-fine strategy, calculating stereo correspondences at the coarsest resolution level and then refining them up to the finest level. Vector coefficients of the multiwavelet transform modulus are used as correspondence features: the modulus maxima define shift-invariant high-level features (multiscale edges), with the phase pointing along the normal of the feature surface. The technique addresses the estimation of optimal corresponding points and the corresponding 2D disparity maps. Illumination variation that can exist between the perspective views of the same scene is controlled by scale normalization at each decomposition level, dividing the detail-space coefficients by the approximation-space coefficients. The problems of ambiguity (explicitly) and occlusion (implicitly) are addressed by a geometric-topological refinement procedure. The geometric refinement is based on a symbolic tagging procedure introduced to keep only the most consistent matches under consideration. Symbolic tagging is performed based on probability of occurrence and multiple thresholds. The whole procedure is constrained by the uniqueness and continuity of the corresponding stereo features. A comparison with eight well-known algorithms from the literature validates the claims of promising performance of the proposed algorithm.
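
The coarse-to-fine strategy itself can be sketched with ordinary block matching on an intensity pyramid, as below. This is a simplified stand-in: the paper matches multiwavelet modulus-maxima features and adds symbolic tagging and topological refinement, none of which is reproduced here; patch size, search radius, and the test pixel are arbitrary choices.

```python
# Simplified coarse-to-fine stereo sketch: block matching on a 2x2-average pyramid.
import numpy as np

def downsample(img):
    img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2]
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def block_match_row(left, right, y, x, d0, radius=2, patch=3):
    """Refine the disparity of pixel (y, x) by searching around the initial guess d0."""
    best, best_cost = d0, np.inf
    for d in range(d0 - radius, d0 + radius + 1):
        xs = x - d
        if xs - patch < 0 or xs + patch >= right.shape[1]:
            continue
        cost = np.sum((left[y, x - patch:x + patch + 1] -
                       right[y, xs - patch:xs + patch + 1]) ** 2)
        if cost < best_cost:
            best, best_cost = d, cost
    return best

def coarse_to_fine_disparity(left, right, levels=3, y=32, x=40):
    pyr = [(left, right)]
    for _ in range(levels - 1):
        pyr.append((downsample(pyr[-1][0]), downsample(pyr[-1][1])))
    d = 0
    for lev in reversed(range(levels)):           # coarsest level first
        L, R = pyr[lev]
        d = block_match_row(L, R, y >> lev, x >> lev, d, radius=2)
        if lev > 0:
            d *= 2                                # propagate the estimate to the finer level
    return d

# Toy usage: the right image is the left shifted by 6 pixels, so the expected disparity is 6.
rng = np.random.default_rng(0)
left = rng.normal(size=(64, 96))
right = np.roll(left, -6, axis=1)
print(coarse_to_fine_disparity(left, right))
```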

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a procedure for detecting and tracking a vehicle in real time from an on-road image sequence taken by a monocular video capture device. The main objective of such a visual tracking system is to closely follow objects in each frame of a video stream, so that the object position and other geometric information are always known. In the tracking system described, the video capture device is itself moving; detecting and tracking a moving vehicle in a constantly changing environment, coupled with real-time video processing, is a challenge. The suggested system is robust under different illumination conditions using the monocular video capture device. The vehicle tracking algorithm is one of the most important modules in an autonomous vehicle system: not only must it be very accurate, it must also ensure the safety of other vehicles, pedestrians, and the moving vehicle itself. To achieve this, a multiresolution technique based on Haar basis functions was used for the wavelet transform, and classification was carried out with a multilayer feed-forward neural network. The classification is done in a reduced-dimensional space obtained with the principal component analysis (PCA) dimensionality-reduction technique, making the classification process much more efficient. The results show the effectiveness of the proposed methodology.
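
A minimal sketch of the classification chain named above (Haar wavelet features, PCA reduction, multilayer feed-forward network) is given below, using PyWavelets and scikit-learn on synthetic patches. The patch size, number of principal components, and network size are assumptions, and the detection and tracking logic is omitted.

```python
# Hedged sketch: Haar wavelet features + PCA + feed-forward classifier on toy patches.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def haar_features(patch):
    cA, (cH, cV, cD) = pywt.dwt2(patch, "haar")
    return np.concatenate([c.ravel() for c in (cA, cH, cV, cD)])

# Toy data: 200 32x32 patches; label 1 = "vehicle-like" (brighter centre), 0 = background.
rng = np.random.default_rng(0)
patches, labels = [], []
for i in range(200):
    p = rng.normal(size=(32, 32))
    label = i % 2
    if label == 1:
        p[8:24, 8:24] += 2.0
    patches.append(p)
    labels.append(label)
X = np.stack([haar_features(p) for p in patches])
y = np.array(labels)

pca = PCA(n_components=20).fit(X)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(pca.transform(X), y)
print("training accuracy:", clf.score(pca.transform(X), y))
```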

Relevance:

100.00%

Publisher:

Abstract:

We propose a study of the mathematical properties of voice as an audio signal. This work includes signals in which the channel conditions are not ideal for emotion recognition. Multiresolution analysis (the discrete wavelet transform) was performed using the Daubechies wavelet family (Db1/Haar, Db6, Db8, Db10), decomposing the initial audio signal into sets of coefficients from which a set of features was extracted and analyzed statistically in order to differentiate emotional states. Artificial neural networks (ANNs) proved to be a system that allows an appropriate classification of such states. This study shows that the features extracted using wavelet decomposition are enough to analyze and extract emotional content from audio signals, yielding a high accuracy rate in the classification of emotional states without the need for other classical time-frequency features. Accordingly, this paper seeks to characterize mathematically the six basic human emotions (boredom, disgust, happiness, anxiety, anger, and sadness), plus neutrality, for a total of seven states to identify.
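
The feature-extraction step can be sketched as below: decompose a frame with one of the listed Daubechies wavelets and compute simple statistics per coefficient set. The specific statistics, frame length, and decomposition depth are illustrative assumptions; the ANN classifier stage is omitted.

```python
# Hedged sketch of wavelet-based feature extraction for an audio frame (db6 assumed).
import numpy as np
import pywt

def wavelet_features(signal, wavelet="db6", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:                      # approximation set + detail sets
        feats += [np.mean(c), np.std(c),
                  np.sum(c ** 2),         # subband energy
                  np.mean(np.abs(c))]
    return np.array(feats)

# Example on a synthetic "voiced" frame (two tones plus noise).
fs = 16000
t = np.arange(0, 0.5, 1 / fs)
frame = np.sin(2 * np.pi * 220 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)
frame += 0.05 * np.random.default_rng(0).normal(size=t.size)
print(wavelet_features(frame).shape)      # (level + 1) coefficient sets x 4 statistics
```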

Relevance:

100.00%

Publisher:

Abstract:

The ionospheric effect is one of the major errors in GPS data processing over long baselines. Because the ionosphere is a dispersive medium, its influence on the GPS signal can be computed with the ionosphere-free linear combination of the L1 and L2 observables, which requires dual-frequency receivers. With single-frequency receivers, ionospheric effects are either neglected or reduced by using a model. In this paper, an alternative for single-frequency users is proposed. It applies multiresolution analysis (MRA), using a wavelet analysis of the double-difference observations, to remove the short- and medium-scale ionospheric variations and disturbances, as well as some minor tropospheric effects. Experiments were carried out over three baseline lengths from 50 to 450 km, and the results provided by the proposed method were better than those from dual-frequency receivers. The horizontal root mean square error was about 0.28 m (1 sigma).
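
A hedged sketch of the underlying operation, removing the fine-scale detail levels of a double-difference series by wavelet MRA and reconstructing, is shown below. The wavelet (sym8), the number of removed levels, and the synthetic signal are assumptions; the actual GNSS processing chain is not reproduced.

```python
# Sketch: low-pass a double-difference (DD) series by zeroing fine detail levels.
import numpy as np
import pywt

def remove_fine_scales(dd_series, wavelet="sym8", level=6, n_remove=4):
    """Zero the n_remove finest detail levels and reconstruct (a low-pass MRA filter)."""
    coeffs = pywt.wavedec(dd_series, wavelet, level=level)
    # coeffs = [cA_L, cD_L, ..., cD_1]; the finest details sit at the end of the list
    for i in range(1, n_remove + 1):
        coeffs[-i] = np.zeros_like(coeffs[-i])
    return pywt.waverec(coeffs, wavelet)[: len(dd_series)]

# Example: a slowly varying DD signal corrupted by short-scale disturbances.
rng = np.random.default_rng(0)
n = 2048
slow = np.cumsum(rng.normal(scale=1e-3, size=n))                    # long-scale behaviour
disturbance = 0.05 * np.sin(2 * np.pi * np.arange(n) / 12) + 0.02 * rng.normal(size=n)
cleaned = remove_fine_scales(slow + disturbance)
# the residual should be well below the disturbance level
print(round(float(np.std(cleaned - slow)), 4), "vs", round(float(np.std(disturbance)), 4))
```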

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

This dissertation aims to analyze the influence of wavelet families and their orders on the performance of a fault-location algorithm based on traveling waves measured at the two terminals of an overhead transmission line. A secondary objective was to model an electric power system in order to obtain a set of faults with which to validate the locator. To this end, part of a 500/230 kV Eletrobrás-Eletronorte power system was modeled in the Alternative Transient Program (ATP) using real parameters. The wavelet transform, via multiresolution analysis (MRA), is employed for its temporal-localization property, which allows precise characterization of the instants of the electromagnetic transients caused by faults; these faults generate waves that propagate toward the line terminals, carrying the propagation times from the fault location to those terminals, which can conveniently be extracted by the transform. In the methodology adopted by the algorithm, the difference between these times determines the fault location on the line with good accuracy. However, one of the factors affecting the error of this estimate is the choice of the wavelet used in the MRA of the signals; evaluating the effect of this choice on the error is therefore the main objective of this work, justified by the still-missing scientific basis for choosing an optimal wavelet for a given application. Among a range of discrete wavelets, adequate results were obtained for 16 of them, with maximum errors below the 250 meters stipulated for the accuracy. Two wavelets, Db15 and Sym17, stood out, with errors respectively 3.5 and 1.1 times smaller than the others. The methodology consists of: exporting the fault data from ATP to MATLAB®; applying the Clarke modal transformation; decomposing the alpha modes and synthesizing the level-1 details via MRA; computing their maximum magnitudes and determining the corresponding time indices; and, finally, using traveling-wave theory to estimate the fault location on the transmission line. All of this was programmed in MATLAB, and the location errors were analyzed statistically in Microsoft Excel®. Finally, a GUI (graphical user interface) was also developed as the human-machine interface of the locator, which also serves for graphical analysis of any of the contingencies applied to the power system. The results obtained demonstrate a performance improvement due to choosing the wavelet best suited to the algorithm and point toward a practical application of the locator.
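
The two-terminal traveling-wave step can be sketched as follows: level-1 wavelet details locate the wavefront arrival instants at each terminal, and the classical relation d = (L + v(t1 - t2)) / 2 gives the distance from terminal 1. The sampling rate, wavelet (db4), propagation speed, line length, and synthetic step-like signals are illustrative assumptions; the Clarke transformation and the ATP simulation are not reproduced.

```python
# Hedged sketch of two-terminal travelling-wave fault location from level-1 details.
import numpy as np
import pywt

def arrival_index(signal, wavelet="db4"):
    _, d1 = pywt.dwt(signal, wavelet)          # level-1 detail coefficients
    return int(np.argmax(np.abs(d1))) * 2      # map back to an original-sample index

fs = 1_000_000            # 1 MHz sampling (assumption)
v = 2.95e8                # propagation speed in m/s (assumption)
L = 200e3                 # 200 km line (assumption)

# Synthetic alpha-mode signals: a step-like transient arriving at each terminal.
n = 4096
t1_true, t2_true = 1200, 1600                  # wavefront sample indices
s1, s2 = np.zeros(n), np.zeros(n)
s1[t1_true:] = 1.0
s2[t2_true:] = 1.0

t1 = arrival_index(s1) / fs
t2 = arrival_index(s2) / fs
d = (L + v * (t1 - t2)) / 2                    # distance from terminal 1, in metres
print(round(d / 1e3, 1), "km from terminal 1")  # roughly 41 km for this synthetic case
```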

Relevance:

90.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean-square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural feature-extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge-extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge-extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
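
The thesis estimates the generalized Gaussian model parameters with a least-squares formulation; as a hedged stand-in, the sketch below uses the common moment-matching estimator, solving E|X|/std(X) = Γ(2/β)/√(Γ(1/β)Γ(3/β)) for the shape parameter β by bisection over subband coefficients.

```python
# Moment-matching estimate of the generalised Gaussian shape parameter
# (a stand-in for the thesis' least-squares formulation, not its exact method).
import numpy as np
from scipy.special import gamma

def ggd_ratio(b):
    # E|X| / std(X) for a generalised Gaussian with shape parameter b
    return gamma(2.0 / b) / np.sqrt(gamma(1.0 / b) * gamma(3.0 / b))

def estimate_ggd_shape(coeffs, lo=0.05, hi=5.0, iters=60):
    target = np.mean(np.abs(coeffs)) / np.std(coeffs)
    for _ in range(iters):                  # ggd_ratio is monotone increasing in b
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity check: a Laplacian subband (shape 1) and a Gaussian one (shape 2).
rng = np.random.default_rng(0)
print(round(estimate_ggd_shape(rng.laplace(size=100_000)), 2))   # expected near 1
print(round(estimate_ggd_shape(rng.normal(size=100_000)), 2))    # expected near 2
```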

Relevance:

90.00%

Publisher:

Abstract:

Meta-regression analysis (MRA) provides an empirical framework through which to integrate disparate economics research results, filter out likely publication selection bias, and explain their wide variation using socio-economic and econometric explanatory variables. In dozens of applications, MRA has found excess variation among reported research findings, some of which is explained by socio-economic variables (e.g., researchers' gender). MRA can empirically model and test socio-economic theories about economics research. Here, we make two strong claims: socio-economic MRAs, broadly conceived, explain much of the excess variation routinely found in empirical economics research, whereas any other type of literature review (or summary) is biased.
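
An illustrative, entirely synthetic meta-regression in the spirit described above: reported effects are regressed on their standard errors (a standard publication-bias term) plus a socio-economic moderator, using precision-weighted least squares. The variable names and data are invented for the example, not the authors' specification.

```python
# Synthetic meta-regression sketch with precision-weighted least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 120
se = rng.uniform(0.05, 0.5, size=n)          # standard errors of the primary estimates
gender = rng.integers(0, 2, size=n)          # example socio-economic moderator
true_effect = 0.2
effect = true_effect + 0.8 * se + 0.1 * gender + rng.normal(scale=se)  # selection bias built in

X = np.column_stack([np.ones(n), se, gender])
w = np.sqrt(1.0 / se**2)                     # square roots of the precision weights
beta, *_ = np.linalg.lstsq(w[:, None] * X, w * effect, rcond=None)
print("intercept (bias-corrected effect):", round(beta[0], 3))
print("SE coefficient (publication-bias term):", round(beta[1], 3))
print("moderator coefficient:", round(beta[2], 3))
```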

Relevance:

90.00%

Publisher:

Abstract:

This paper presents the first comprehensive synthesis of economic valuations of wetlands in developing countries. Meta-regression analysis (MRA) is applied to 1432 estimates of the economic value of 379 distinct wetlands from 50 countries. We find that wetlands are a normal good, wetland size has a negative effect on wetland values, and urban wetlands and marine wetlands are more valuable than other wetlands. Wetland values estimated by stated preferences are lower than those estimated by market price methods. The MRA benefit transfer function has a median transfer error of 17%. Overall, MRA appears to be useful for deriving the economic value of wetlands at policy sites in developing nations.
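
The transfer-error figure quoted above is typically computed by comparing values predicted by the transfer function at each policy site with the observed study values; a small synthetic sketch of that calculation follows (the data are invented, not the paper's 1432 estimates).

```python
# Sketch of a median benefit-transfer error calculation on synthetic values.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.lognormal(mean=5.0, sigma=1.0, size=200)          # observed wetland values
predicted = observed * np.exp(rng.normal(scale=0.2, size=200))   # stand-in transfer predictions

transfer_error = np.abs(predicted - observed) / observed
print("median transfer error:", round(100 * float(np.median(transfer_error)), 1), "%")
```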

Relevance:

90.00%

Publisher:

Abstract:

The scheme is based on Ami Harten's ideas (Harten, 1994), the main tools coming from wavelet theory, in the framework of multiresolution analysis for cell averages. But instead of evolving cell averages on the finest uniform level, we propose to evolve just the cell averages on the grid determined by the significant wavelet coefficients. Typically, there are few cells in each time step: big cells in smooth regions, and smaller ones close to irregularities of the solution. For the numerical flux, we use a simple uniform central finite difference scheme, adapted to the size of each cell. If any of the required neighboring cell averages is not present, it is interpolated from coarser scales. But we switch to an ENO scheme in the finest part of the grids. To show the feasibility and efficiency of the method, it is applied to a system arising in polymer flooding of an oil reservoir. In terms of CPU time and memory requirements, it outperforms Harten's multiresolution algorithm.

The proposed method applies to systems of conservation laws in 1D,

$$\partial_t u(x,t) + \partial_x f(u(x,t)) = 0, \qquad u(x,t) \in \mathbb{R}^m. \qquad (1)$$

In the spirit of finite volume methods, we shall consider the explicit scheme

$$v_\mu^{n+1} = v_\mu^n - \frac{\Delta t}{h_\mu}\left(\bar{f}_\mu - \bar{f}_{\mu^-}\right) = [D v^n]_\mu, \qquad (2)$$

where $\mu$ is a point of an irregular grid $\Gamma$, $\mu^-$ is the left neighbor of $\mu$ in $\Gamma$, $v_\mu^n \approx \frac{1}{\mu-\mu^-}\int_{\mu^-}^{\mu} u(x,t_n)\,dx$ are approximated cell averages of the solution, $\bar{f}_\mu = \bar{f}_\mu(v^n)$ are the numerical fluxes, and $D$ is the numerical evolution operator of the scheme.

According to the definition of $\bar{f}_\mu$, several schemes of this type have been proposed and successfully applied (LeVeque, 1990); Godunov, Lax-Wendroff, and ENO are some of the popular names. The Godunov scheme resolves shocks well, but its accuracy (of first order) is poor in smooth regions. Lax-Wendroff is of second order, but produces dangerous oscillations close to shocks. ENO schemes are good alternatives, with high order and without serious oscillations, but the price is a high computational cost.

Ami Harten proposed in (Harten, 1994) a simple strategy to save expensive ENO flux calculations. The basic tools come from multiresolution analysis for cell averages on uniform grids, and the principle is that wavelet coefficients can be used for the characterization of local smoothness. Typically, only a few wavelet coefficients are significant. At the finest level, they indicate discontinuity points, where ENO numerical fluxes are computed exactly. Elsewhere, cheaper fluxes can be safely used, or just interpolated from coarser scales. Different applications of this principle have been explored by several authors; see for example (G-Muller and Muller, 1998).

Our scheme also uses Ami Harten's ideas. But instead of evolving the cell averages on the finest uniform level, we propose to evolve the cell averages on sparse grids associated with the significant wavelet coefficients. This means that the total number of cells is small, with big cells in smooth regions and smaller ones close to irregularities. This task requires improved new tools, which are described next.
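
A minimal uniform-grid sketch of the explicit update (2) follows, using a Lax-Friedrichs flux with a global wave-speed bound on Burgers' equation, f(u) = u^2/2. The paper's key ingredients, namely evolving only the cells flagged by significant wavelet coefficients and switching to ENO fluxes near irregularities, are deliberately omitted; grid size and time step are arbitrary choices.

```python
# Uniform-grid finite-volume update in the form of (2), with a simple LF flux.
import numpy as np

def lax_friedrichs_flux(uL, uR, f, a):
    return 0.5 * (f(uL) + f(uR)) - 0.5 * a * (uR - uL)

def step(u, dt, h, f=lambda u: 0.5 * u**2):
    a = np.max(np.abs(u))                       # global wave-speed bound for Burgers
    flux = lax_friedrichs_flux(u, np.roll(u, -1), f, a)   # flux at each cell's right face
    return u - dt / h * (flux - np.roll(flux, 1))         # periodic boundary

n, h = 400, 1.0 / 400
x = (np.arange(n) + 0.5) * h
u = np.where(x < 0.5, 1.0, 0.0)                 # Riemann initial data (shock speed 0.5)
dt = 0.4 * h
for _ in range(200):                            # integrate to t = 0.2
    u = step(u, dt, h)
print("approximate shock position:", x[np.argmax(np.abs(np.diff(u)))])   # near 0.6
```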

Relevance:

90.00%

Publisher:

Abstract:

Introduction: Nocturnal frontal lobe epilepsy (NFLE) is a distinct syndrome of partial epilepsy whose clinical features comprise a spectrum of paroxysmal motor manifestations of variable duration and complexity, arising from sleep. Cardiovascular changes during NFLE seizures have previously been observed; however, the extent of these modifications and their relationship with seizure onset has not been analyzed in detail. Objective: The aim of the present study is to evaluate NFLE seizure-related changes in heart rate (HR) and in sympathetic/parasympathetic balance through wavelet analysis of HR variability (HRV). Methods: We evaluated whole-night, digitally recorded video-polysomnography (VPSG) of 9 patients diagnosed with NFLE, with no history of cardiac disorders and normal cardiac examinations. Events with features of NFLE seizures were selected independently by three examiners and included in the study only if a consensus was reached. Heart rate was evaluated by measuring the interval between two consecutive R-waves of the QRS complexes (RRi). RRi series were digitally calculated for a period of 20 minutes including the seizures, and resampled at 10 Hz using cubic-spline interpolation. A multiresolution analysis was performed (Daubechies-16 form), and the squared level-specific amplitude coefficients were summed across appropriate decomposition levels in order to compute total band powers in the bands of interest (LF: 0.039062-0.156248 Hz, HF: 0.156248-0.624992 Hz). A general linear model was then applied to estimate changes in RRi and in LF and HF powers during three different periods: a basal period (30 sec, at least 30 sec before seizure onset, during which no movements occurred and autonomic conditions were stationary); a pre-seizure period (preSP; the 10 sec preceding seizure onset); and the seizure period (SP), corresponding to the clinical manifestations. For one of the patients (patient 9), three seizures associated with ictal asystole (IA) were recorded; hence he was treated separately. Results: Group analysis performed on 8 patients (41 seizures) showed that RRi remained unchanged during the preSP, while a significant tachycardia was observed in the SP. A significant increase in the LF component was instead observed during both the preSP and the SP (p<0.001), while the HF component decreased only in the SP (p<0.001). For patient 9, during the preSP and the first part of the SP a significant tachycardia was observed, associated with increased sympathetic activity (increased LF absolute values and LF%). In the second part of the SP, a progressive decrease in HR that gradually dropped below basal values occurred before IA. Bradycardia was associated with an increase in parasympathetic activity (increased HF absolute values and HF%) contrasted by a further increase in LF until the occurrence of IA. Conclusions: These data suggest that changes in autonomic balance toward a sympathetic prevalence always preceded clinical seizure onset in NFLE, even when HR changes were not yet evident, confirming that wavelet analysis is a sensitive technique to detect sudden variations of autonomic balance occurring during transient phenomena. Finally, we demonstrated that epileptic asystole is associated with a parasympathetic hypertonus counteracted by a marked sympathetic activation.
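
The band-power computation can be sketched as below: RR intervals are resampled at 10 Hz by cubic-spline interpolation, decomposed with a Daubechies wavelet, and the squared detail coefficients are summed over the levels whose pass-bands match the stated LF (levels 6-7) and HF (levels 4-5) ranges at fs = 10 Hz. The choice of db8 (16 filter taps) as a stand-in for the "Daubechies-16 form", and the synthetic tachogram, are assumptions.

```python
# Hedged sketch of wavelet band-power computation for HRV (LF and HF bands).
import numpy as np
import pywt
from scipy.interpolate import CubicSpline

def band_powers(rr_times, rr_values, fs=10.0, wavelet="db8", level=7):
    t_uniform = np.arange(rr_times[0], rr_times[-1], 1.0 / fs)
    rri = CubicSpline(rr_times, rr_values)(t_uniform)
    coeffs = pywt.wavedec(rri - rri.mean(), wavelet, level=level)
    details = coeffs[1:]            # [D7, D6, ..., D1]; Dj spans fs/2**(j+1) .. fs/2**j
    power = {f"D{level - i}": float(np.sum(c**2)) for i, c in enumerate(details)}
    lf = power["D6"] + power["D7"]  # ~0.039 - 0.156 Hz at fs = 10 Hz
    hf = power["D4"] + power["D5"]  # ~0.156 - 0.625 Hz at fs = 10 Hz
    return lf, hf

# Example on a synthetic tachogram with respiratory (0.25 Hz) and LF (0.1 Hz) rhythms.
rng = np.random.default_rng(0)
beat_times = np.cumsum(0.8 + 0.02 * rng.normal(size=1500))
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.25 * beat_times) + 0.02 * np.sin(2 * np.pi * 0.1 * beat_times)
lf, hf = band_powers(beat_times, rr)
print("LF power:", round(lf, 4), " HF power:", round(hf, 4))
```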