954 results for complex wavelet transform
Abstract:
Binary systems are key environments for studying the fundamental properties of stars. In this work, we analyze 99 binary systems identified by the CoRoT space mission. Based on the phase diagrams of these systems, our sample is divided into three groups: systems whose variability is dominated by the binary eclipses; systems presenting strong modulations, probably due to stellar spots on the surface of the star; and systems whose variability is associated with the expansion and contraction of the surface layers. For eclipsing binaries, the phase diagrams are used to estimate the morphological classification, based on the study of equipotential surfaces. In this context, we apply the wavelet procedure to determine the rotation period, identify the presence of active regions, investigate whether the star exhibits differential rotation, and study stellar pulsation. The wavelet transform has proved a powerful tool for a large number of problems in astrophysics: it provides a time-frequency analysis of light curves rich in detail, contributing significantly to the study of phenomena associated with rotation, magnetic activity, and stellar pulsation. In this work, we apply the Morlet wavelet (6th order), which offers good resolution in both time and frequency, and obtain local (energy distribution of the signal) and global (time integration of the local map) wavelet power spectra. Using the wavelet analysis, we identify thirteen systems with periodicities related to rotational modulation, as well as the beating-pattern signature in the local wavelet map of five pulsating stars over the entire time span.
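A minimal sketch of the local and global wavelet power spectra described above, assuming Python with NumPy and PyWavelets (the thesis does not specify an implementation); the synthetic light curve, the scale grid, and the 'cmor1.5-1.0' complex Morlet (centre frequency of about one cycle, roughly a 6th-order Morlet) are illustrative choices:

```python
# Hedged sketch: local and global wavelet power spectra of a light curve.
# The sampling, light curve and wavelet choice below are assumptions, not the thesis code.
import numpy as np
import pywt

dt = 512.0 / 86400.0                                  # assumed CoRoT long-cadence sampling, in days
t = np.arange(0.0, 30.0, dt)                          # 30-day synthetic light curve
flux = 1.0 + 0.01 * np.sin(2 * np.pi * t / 3.2)       # fake 3.2-day rotational modulation
flux += 0.002 * np.random.randn(t.size)               # photometric noise

scales = np.arange(8, 1600, 8)
coeffs, freqs = pywt.cwt(flux - flux.mean(), scales, 'cmor1.5-1.0', sampling_period=dt)

local_power = np.abs(coeffs) ** 2                     # local map: energy in the time-frequency plane
global_power = local_power.mean(axis=1)               # global spectrum: time integration of the local map

periods = 1.0 / freqs                                 # periods in days
print("Dominant period: %.2f d" % periods[np.argmax(global_power)])
```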
Abstract:
The increasing demand for electricity and forecasts of steadily shrinking fossil fuel reserves, together with growing environmental concern about their use, have raised concerns about the quality of electricity generation, making new investments in generation from alternative, clean, and renewable sources very welcome. Distributed generation is one of the main solutions for independent and self-sufficient generating systems, such as the sugarcane industry. This sector has grown considerably, contributing significantly to the electricity fed into distribution networks. In this context, one of the main objectives of this study is to propose the implementation of an algorithm to detect islanding disturbances in the electrical system, characterized by under- or overvoltage conditions. The algorithm should also quantify the time during which the system operated under these conditions, in order to assess the possible consequences for the electric power system. To achieve this, the wavelet multiresolution analysis (MRA) technique was used to detect the generated disturbances. The resulting data can be processed and used for predictive maintenance of the protection equipment in the electrical network, since such equipment is prone to damage under prolonged operation at abnormal frequency and voltage.
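A minimal sketch, assuming Python with NumPy and PyWavelets, of how a multiresolution decomposition can flag the edges of a voltage disturbance and quantify how long the system stayed outside limits; the synthetic voltage sag, the db4 wavelet, the thresholds, and the sampling rate are illustrative and do not reproduce the dissertation's algorithm:

```python
# Hedged sketch of MRA-based detection of an undervoltage event and of the time spent
# outside limits; the signal, thresholds and sampling rate are assumed for illustration.
import numpy as np
import pywt

fs = 3840.0                                  # assumed sampling rate: 64 samples per 60 Hz cycle
t = np.arange(0.0, 1.0, 1.0 / fs)
v = np.sin(2 * np.pi * 60.0 * t)             # nominal voltage, 1 p.u. amplitude
sag = (t >= 0.412) & (t < 0.693)
v[sag] *= 0.6                                # 40% sag between ~0.41 s and ~0.69 s

# Multiresolution decomposition: the finest detail band highlights abrupt changes
coeffs = pywt.wavedec(v, 'db4', level=4)
d1 = coeffs[-1]
edges = np.where(np.abs(d1) > 0.1 * np.abs(d1).max())[0] * 2.0 / fs
print("Disturbance edges near t =", np.round(edges, 3), "s")

# Quantify how long the RMS voltage stayed below 0.9 p.u. (undervoltage condition)
cycle = int(fs / 60.0)
rms = np.sqrt(np.convolve(v ** 2, np.ones(cycle) / cycle, mode='same'))
undervoltage_time = np.sum(rms < 0.9 / np.sqrt(2.0)) / fs
print("Time below 0.9 p.u.: %.3f s" % undervoltage_time)
```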
Abstract:
The focus of this work is the automatic analysis of disturbance records from electrical power generating units. The main proposition is a method based on the wavelet transform applied to short-term disturbance records (waveform records). The goal of the method is to detect the time instants of the recorded disturbances and to extract meaningful information that characterizes the faults. The result is a set of information representative of the monitored signals in power generators. This information can then be classified by an expert system (or another classification method) in order to identify the faults and other abnormal operating conditions. The large amount of data produced by digital fault recorders during faults justifies research into methods that assist analysts in the task of analysing disturbances. The literature review establishes the state of the art and the possible applications of oscillography records. The review of the COMTRADE standard and of the wavelet transform supports the choice of method for solving the problem. The tests conducted led to the determination of the best mother wavelet for the segmentation process. The application of the proposed method to five case studies with real oscillographic records confirmed the accuracy and efficiency of the proposed scheme. With this research, the post-operation analysis of occurrences is improved, and a direct result is a reduction in the time that generators remain offline.
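A hedged sketch of the segmentation idea: the time instants of a disturbance are located where the first-level detail energy of the waveform record jumps. The synthetic fault record and the db4 mother wavelet are assumptions; the thesis determines its own best mother wavelet:

```python
# Hedged sketch of wavelet-based segmentation of a short-term waveform record:
# the record is split wherever the level-1 detail energy jumps.
import numpy as np
import pywt

fs = 1920.0
t = np.arange(0.0, 2.0, 1.0 / fs)
i = np.sin(2 * np.pi * 60.0 * t)
i[t >= 0.83] *= 4.0                                       # fault inception: current jumps to 4 p.u.
i[t >= 1.37] = np.sin(2 * np.pi * 60.0 * t[t >= 1.37])    # fault cleared, back to nominal

cA, cD = pywt.dwt(i, 'db4')                               # single-level DWT
energy = cD ** 2
change_idx = np.where(energy > 0.1 * energy.max())[0]

# Map detail-coefficient indices back to approximate time instants
instants = np.unique(np.round(change_idx * 2.0 / fs, 2))
print("Segmentation instants (s):", instants)
```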
Abstract:
Doctoral thesis, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.
Abstract:
In hydrocarbon exploration, the great enigma is the location of the deposits. Great efforts are undertaken to identify and locate them better while improving the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram is a representation of the Earth's interior and its structures through a conveniently arranged set of data obtained by seismic reflection. A major problem with this representation is the intensity and variety of noise present in the seismogram, such as surface-borne noise that contaminates the relevant signals and may mask the desired information carried by waves scattered in deeper regions of the geological layers. A tool was developed to suppress this noise, based on the 1D and 2D wavelet transforms. The program, written in Java, separates seismic images by direction (horizontal, vertical, mixed, or local) and by the wavelength bands that compose these images, using Daubechies wavelets, auto-resolution, and tensor products of wavelet bases. In addition, an option was developed to process a single image using the tensor product of two one-dimensional wavelets, or the tensor product of a one-dimensional wavelet with the identity. In the latter case, the two-dimensional signal is decomposed by wavelets along a single direction. This decomposition makes it possible to stretch the two-dimensional wavelets along a chosen direction, correcting scale effects by applying successive auto-resolutions. In other words, the treatment of a seismic image is improved by using the 1D and 2D wavelets at different stages of auto-resolution. Improvements were also implemented in the display of the images associated with the sub-bands of each auto-resolution level, making it easier to choose the images containing the signals of interest for noise-free reconstruction. The program was tested with real data and the results were good.
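A hedged sketch of the directional separation idea using the 2D wavelet transform in Python with PyWavelets (the tool described above is written in Java); the synthetic section, the db4 wavelet, and the choice of sub-band to suppress are illustrative:

```python
# Hedged sketch of directional filtering with the 2D wavelet transform; all choices
# (synthetic section, db4 at 3 levels, which sub-band is zeroed) are illustrative.
import numpy as np
import pywt

# Synthetic section: gently dipping "reflectors" plus near-vertical coherent noise
ny, nx = 256, 256
y, x = np.mgrid[0:ny, 0:nx]
section = np.sin(0.15 * (y + 0.3 * x)) + 0.5 * np.sin(1.2 * x)

coeffs = pywt.wavedec2(section, 'db4', level=3)   # [cA3, (cH3, cV3, cD3), ..., (cH1, cV1, cD1)]

# Keep the approximation and zero one detail orientation at every level;
# which of cH / cV / cD carries the unwanted energy depends on its orientation.
filtered = [coeffs[0]]
for cH, cV, cD in coeffs[1:]:
    filtered.append((cH, np.zeros_like(cV), cD))

reconstructed = pywt.waverec2(filtered, 'db4')[:ny, :nx]
removed = np.sum((section - reconstructed) ** 2) / np.sum(section ** 2)
print("Fraction of image energy removed with the zeroed sub-bands: %.1f%%" % (100.0 * removed))
```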
Abstract:
This work discusses the importance of image compression for industry: processing and storing images is a constant challenge at Petrobras, where the goal is to optimize storage time and store as many images and as much data as possible. We present an interactive system for processing and storing images in the wavelet domain, together with an interface for digital image processing. The proposal is based on the Peano function and the 1D wavelet transform. The storage system aims to optimize computational space, both for storage and for transmission of images. The Peano function is applied to linearize the images, and the 1D wavelet transform is applied to decompose the resulting signal. These operations extract the information relevant for storing an image at a lower computational cost and with a very small margin of error when comparing the original and processed images, i.e., there is little loss of quality when the proposed processing system is applied. The results obtained from the information extracted from the images are displayed in a graphical interface. Through this graphical interface, the user can view and analyze the results of the programs directly on the screen without having to deal with the source code. The graphical interface and the programs for image processing via the Peano function and the 1D wavelet transform were developed in Java, allowing a direct exchange of information between them and the user.
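A hedged sketch of the linearize-then-transform idea in Python with PyWavelets: a simple boustrophedon (snake) scan stands in for the Peano curve, and hard thresholding of small 1D wavelet coefficients stands in for the storage step; none of this reproduces the Java system described above:

```python
# Hedged sketch: linearize a 2D image into a 1D signal, decompose it with a 1D wavelet,
# discard small coefficients, and measure the reconstruction error. The snake scan is a
# stand-in for the Peano curve; the image, wavelet and threshold are illustrative.
import numpy as np
import pywt

img = np.outer(np.hanning(128), np.hanning(128))        # smooth synthetic 128x128 image

# Linearize: reverse every other row so neighbouring pixels stay neighbours
snake = img.copy()
snake[1::2] = snake[1::2, ::-1]
signal = snake.ravel()

# 1D wavelet decomposition followed by hard thresholding of small coefficients
coeffs = pywt.wavedec(signal, 'db2', level=6)
thr = 0.01 * max(np.abs(c).max() for c in coeffs)
coeffs = [np.where(np.abs(c) > thr, c, 0.0) for c in coeffs]
kept = sum(int(np.count_nonzero(c)) for c in coeffs) / signal.size
rec = pywt.waverec(coeffs, 'db2')[: signal.size]

# Undo the snake scan and measure the reconstruction error
rec_img = rec.reshape(img.shape)
rec_img[1::2] = rec_img[1::2, ::-1]
rmse = np.sqrt(np.mean((img - rec_img) ** 2))
print("Kept %.1f%% of coefficients, RMSE = %.5f" % (100.0 * kept, rmse))
```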
Abstract:
Plasma edge turbulence in the Tokamak Chauffage Alfven Bresilien (TCABR) [R. M. O. Galvao et al., Plasma Phys. Contr. Fusion 43, 1181 (2001)] is investigated for multifractal properties of the fluctuating floating electrostatic potential measured by Langmuir probes. The multifractality in this signal is characterized by the full multifractal spectra determined by applying the wavelet transform modulus maxima method. In this work, the dependence of the multifractal spectrum on the radial position is presented. The degree of multifractality inside the plasma increases with radial position, reaching a maximum near the plasma edge and becoming almost constant in the scrape-off layer. Comparisons of these results with those obtained for random test time series with the same Hurst exponents and data length statistically confirm the reported multifractal behavior. Moreover, the persistence of these signals, characterized by their Hurst exponent, presents a radial profile similar to the deterministic component estimated from an analysis based on dynamical recurrences. (C) 2008 American Institute of Physics.
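A much-simplified sketch of the WTMM partition-function calculation, assuming Python with NumPy and PyWavelets: local modulus maxima are pooled at each scale (the full method chains maxima lines across scales and also handles negative moments), and the curvature of the resulting tau(q) indicates multifractality. The synthetic signal and the 'gaus2' analysing wavelet are illustrative:

```python
# Hedged, much-simplified WTMM sketch: no chaining of maxima lines, nonnegative q only.
import numpy as np
import pywt

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))          # stand-in for a fluctuating potential signal

scales = np.arange(2, 128)
coeffs, _ = pywt.cwt(x, scales, 'gaus2')          # 2nd-derivative-of-Gaussian analysing wavelet
mod = np.abs(coeffs)

qs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
Z = np.zeros((qs.size, scales.size))
for j in range(scales.size):
    row = mod[j]
    interior = row[1:-1]
    maxima = interior[(interior > row[:-2]) & (interior > row[2:])]   # local maxima of |W(a, b)|
    maxima = maxima[maxima > 1e-12]
    for i, q in enumerate(qs):
        Z[i, j] = np.sum(maxima ** q)             # partition function Z(q, a)

# tau(q): slope of log Z(q, a) versus log a; for a monofractal, tau(q) is linear in q
tau = [np.polyfit(np.log(scales), np.log(Z[i]), 1)[0] for i in range(qs.size)]
print({float(q): round(t, 2) for q, t in zip(qs, tau)})
```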
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines applied to the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy values (estimated via cross-validation) delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
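A minimal sketch of the DWT-feature plus kernel-machine pipeline, assuming Python with PyWavelets and scikit-learn; synthetic segments and an RBF-kernel SVC stand in for the clinical EEG data and the full suite of kernel machines studied in the paper:

```python
# Hedged sketch: statistical values of discrete-wavelet sub-bands feed an RBF-kernel SVM.
# Synthetic "EEG" segments and scikit-learn's SVC are stand-ins, not the paper's setup.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_segment(epileptic):
    t = np.arange(0.0, 4.0, 1.0 / 173.6)              # illustrative sampling rate
    seg = rng.normal(0.0, 1.0, t.size)
    if epileptic:
        seg += 3.0 * np.sin(2 * np.pi * 3.0 * t)      # crude stand-in for spike-wave activity
    return seg

def dwt_features(x):
    coeffs = pywt.wavedec(x, 'db4', level=4)
    feats = []
    for c in coeffs:                                  # per sub-band statistics
        feats += [np.mean(np.abs(c)), np.std(c), np.max(np.abs(c))]
    return feats

X = np.array([dwt_features(make_segment(k % 2 == 1)) for k in range(200)])
y = np.array([k % 2 for k in range(200)])

clf = SVC(kernel='rbf', gamma='scale', C=1.0)
print("CV accuracy: %.2f" % cross_val_score(clf, X, y, cv=5).mean())
```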
Abstract:
The search for more realistic modeling of financial time series reveals several stylized facts of real markets. In this work we focus on the multifractal properties found in price and index signals. Although the usual minority game (MG) models do not exhibit multifractality, we study here one of its variants that does. We show that the nonsynchronous MG model in the nonergodic phase is multifractal and, in this sense, together with other stylized facts, constitutes a better modeling tool. Using the structure function (SF) approach we detected the stationarity and the scaling range of the time series generated by the MG model and, from the linear (non-linear) behavior of the SF, we identified the fractal (multifractal) regimes. Finally, using the wavelet transform modulus maxima (WTMM) technique, we obtained the multifractal spectrum width for different dynamical regimes. (C) 2009 Elsevier Ltd. All rights reserved.
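A minimal sketch of the structure-function test, assuming Python with NumPy: the scaling exponent zeta(q) is estimated from log-log slopes, and a linear (nonlinear) dependence on q indicates a fractal (multifractal) regime; a plain random walk stands in for the MG-generated series:

```python
# Hedged sketch of the structure-function (SF) test: zeta(q) linear in q -> monofractal,
# nonlinear -> multifractal. A random walk (H = 0.5) stands in for the minority-game output.
import numpy as np

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(20000))             # stand-in price series

lags = np.unique(np.logspace(0.5, 3, 20).astype(int))
for q in (1.0, 2.0, 3.0, 4.0):
    S = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
    zeta_q, _ = np.polyfit(np.log(lags), np.log(S), 1)
    print("q = %.0f  zeta(q) = %.2f   (q*H = %.2f for H = 0.5)" % (q, zeta_q, 0.5 * q))
```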
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of parameter values. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different information that can be analysed in various ways. The skeleton carries motion information about human joints, and the silhouette carries information about boundary motion of the human body. Moreover, binary and gray-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our proposed method considers the set of the segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), a scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Gray-Wavelet model (SGW) captures motion based on grey level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Binary model (SEW) captures motion based on edge information and the Silhouette Skeleton Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
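A hedged sketch of the preprocessing chain named above (GMM background subtraction, wavelet scale reduction, PCA), assuming Python with OpenCV, PyWavelets, and scikit-learn; the synthetic frames and all parameter values are illustrative, not the paper's models:

```python
# Hedged sketch: GMM background subtraction (OpenCV MOG2), scale reduction via the
# 2D DWT approximation band, and PCA. Synthetic frames stand in for real gait video.
import numpy as np
import cv2
import pywt
from sklearn.decomposition import PCA

subtractor = cv2.createBackgroundSubtractorMOG2(history=50, detectShadows=False)

features = []
for k in range(60):
    frame = np.full((128, 128), 40, dtype=np.uint8)        # static background
    x0 = 10 + 2 * k % 100
    frame[60:90, x0:x0 + 20] = 220                         # "walking" bright blob
    mask = subtractor.apply(frame)                         # GMM foreground mask
    cA, _ = pywt.dwt2(mask.astype(float), 'haar')          # scale reduction to 64x64
    features.append(cA.ravel())

X = np.array(features[10:])                                # skip burn-in frames
pca = PCA(n_components=8)
Z = pca.fit_transform(X)
print("Reduced to", Z.shape,
      "explaining %.0f%% of variance" % (100 * pca.explained_variance_ratio_.sum()))
```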
Abstract:
We present a review of perceptual image quality metrics and their application to still image compression. The review describes how image quality metrics can be used to guide an image compression scheme and outlines the advantages, disadvantages and limitations of a number of quality metrics. We examine a broad range of metrics ranging from simple mathematical measures to those which incorporate full perceptual models. We highlight some variation in the models for luminance adaptation and the contrast sensitivity function and discuss what appears to be a lack of a general consensus regarding the models which best describe contrast masking and error summation. We identify how the various perceptual components have been incorporated in quality metrics, and identify a number of psychophysical testing techniques that can be used to validate the metrics. We conclude by illustrating some of the issues discussed throughout the paper with a simple demonstration. (C) 1998 Elsevier Science B.V. All rights reserved.
Abstract:
In this paper, a hybrid intelligent approach is proposed for short-term electricity prices forecasting in a competitive market. The proposed approach is based on the wavelet transform and a hybrid of neural networks and fuzzy logic. Results from a case study based on the electricity market of mainland Spain are presented. A thorough comparison is carried out, taking into account the results of previous publications. Conclusions are duly drawn. (C) 2010 Elsevier Ltd. All rights reserved.
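A hedged sketch of the hybrid idea, assuming Python with PyWavelets and scikit-learn: the price series is decomposed by the wavelet transform, each reconstructed sub-series is forecast separately, and the forecasts are summed; a small MLP stands in for the paper's neuro-fuzzy stage, and the synthetic hourly prices are illustrative:

```python
# Hedged sketch of wavelet-plus-learner price forecasting: decompose, forecast each
# sub-series with a small MLP (a stand-in for the neuro-fuzzy stage), then recombine.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
t = np.arange(24 * 60)                                     # 60 days of hourly prices (synthetic)
price = 50 + 10 * np.sin(2 * np.pi * t / 24) + 3 * rng.standard_normal(t.size)

def subseries(x, wavelet='db4', level=3):
    """Reconstruct approximation and detail sub-series that sum back to x."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    parts = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        parts.append(pywt.waverec(keep, wavelet)[: len(x)])
    return parts

def lagged(x, n_lags=24):
    X = np.column_stack([x[i: len(x) - n_lags + i] for i in range(n_lags)])
    return X, x[n_lags:]

horizon = 24
forecast = np.zeros(horizon)
for part in subseries(price[:-horizon]):
    X, y = lagged(part)
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
    window = list(part[-24:])
    for h in range(horizon):                               # recursive 24-hour-ahead forecast
        nxt = model.predict(np.array(window[-24:]).reshape(1, -1))[0]
        forecast[h] += nxt
        window.append(nxt)

mape = np.mean(np.abs(forecast - price[-horizon:]) / np.abs(price[-horizon:]))
print("24-hour-ahead MAPE: %.1f%%" % (100 * mape))
```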
Abstract:
Master's degree in Electrical and Computer Engineering (Mestrado em Engenharia Electrotécnica e de Computadores)
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Since expert sonographers are required to accurately interpret liver ultrasound images, a lack of such expertise results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors were able to achieve a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, added to the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
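A hedged sketch of the feature-plus-classifier idea, assuming Python with PyWavelets and scikit-learn: simple texture statistics and 2D wavelet sub-band energies feed a decision tree; the synthetic speckle patches are illustrative and the higher-order-spectra features used by Symtosis are omitted:

```python
# Hedged sketch of a wavelet/texture CAD pipeline with a decision-tree classifier.
# Synthetic gamma-distributed patches stand in for liver ultrasound; the class model
# (fatty livers brighter and smoother) is purely illustrative.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

def make_patch(fatty):
    return rng.gamma(shape=4 if fatty else 2, scale=12, size=(64, 64))

def features(patch):
    cA, (cH, cV, cD) = pywt.dwt2(patch, 'db2')
    energies = [np.mean(c ** 2) for c in (cH, cV, cD)]     # wavelet sub-band energies
    return [patch.mean(), patch.std(), *energies]          # plus simple texture statistics

X = np.array([features(make_patch(k % 2 == 1)) for k in range(100)])   # balanced synthetic set
y = np.array([k % 2 for k in range(100)])

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
print("CV accuracy: %.2f" % cross_val_score(clf, X, y, cv=5).mean())
```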