8 results for wavelet transforms
Abstract:
In a recent paper, Leong and Huang (2010, Journal of Applied Statistics 37, 215–233) proposed a wavelet-correlation-based approach to test for cointegration between two time series. However, correlation and cointegration are two different concepts, even when wavelet analysis is used. It is known that statistics based on nonstationary integrated variables have non-standard asymptotic distributions. However, wavelet analysis offsets the integration order of nonstationary series, so that traditional asymptotics for stationary variables suffice to ascertain the statistical properties of wavelet-based statistics. On this basis, this note shows that wavelet correlations cannot be used as a test of cointegration.
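The note's argument can be illustrated numerically. The following is a minimal sketch, not the authors' procedure: the data are simulated, and level-1 Haar detail coefficients stand in for a full wavelet decomposition. Two independent random walks can show sizable spurious correlation in levels, while their wavelet details, which difference away the unit root, are stationary and essentially uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Two independent I(1) series: cumulative sums of white noise.
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

def haar_details(z):
    """Level-1 Haar wavelet detail coefficients: scaled pairwise differences."""
    z = z[: len(z) // 2 * 2]          # truncate to an even length
    return (z[0::2] - z[1::2]) / np.sqrt(2.0)

# Correlation of the nonstationary levels is often large in magnitude
# (spurious regression), while the detail coefficients are stationary
# and, for independent series, close to uncorrelated.
level_corr = np.corrcoef(x, y)[0, 1]
detail_corr = np.corrcoef(haar_details(x), haar_details(y))[0, 1]
print(f"levels: {level_corr:.3f}, details: {detail_corr:.3f}")
```

This is exactly the offsetting effect the note relies on: the differencing built into the wavelet filter restores standard stationary asymptotics, so a small wavelet correlation says nothing about cointegration.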
Abstract:
208 p.
Abstract:
Hyper-spectral data allows the construction of more robust statistical models of material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data, with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation, such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods, require complex and subjective training procedures; in addition, the compressed spectral information is not directly related to the physical (spectral) characteristics of the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
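As a point of reference for the PCA baseline named in the abstract, here is a minimal sketch (the toy cube, its dimensions, and the latent-source model are assumptions for illustration, not data from the article): SVD-based PCA compresses highly correlated bands into a few decorrelated descriptors per pixel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "hyper-spectral" cube: 32x32 pixels, 60 highly correlated bands
# generated from 3 latent spectral sources plus noise (hypothetical data).
h, w, bands, sources = 32, 32, 60, 3
latent = rng.standard_normal((h * w, sources))
mixing = rng.standard_normal((sources, bands))
cube = latent @ mixing + 0.05 * rng.standard_normal((h * w, bands))

def pca_decorrelate(pixels, n_components):
    """Decorrelate bands with PCA: center, then project onto the leading
    right singular vectors of the (pixels x bands) matrix."""
    centered = pixels - pixels.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s**2)[:n_components].sum() / (s**2).sum()
    return centered @ vt[:n_components].T, explained

descriptors, explained = pca_decorrelate(cube, n_components=3)
print(descriptors.shape, f"explained variance: {explained:.3f}")
```

The resulting descriptor columns are mutually uncorrelated by construction, which is the property band-decorrelation methods exploit; the abstract's criticism is that such components, while compact, lack a direct physical (spectral) interpretation.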
Abstract:
Background: Over many years, it has been assumed that enzymes work either in an isolated way, or organized in small catalytic groups. Several studies performed using "metabolic network models" are helping to understand the degree of functional complexity that characterizes enzymatic dynamic systems. In a previous work, we used "dissipative metabolic networks" (DMNs) to show that enzymes can present a self-organized global functional structure, in which several sets of enzymes are always in an active state, whereas the rest of the molecular catalytic sets exhibit dynamics of on-off changing states. We suggested that this kind of global metabolic dynamics might be a genuine and universal functional configuration of the cellular metabolic structure, common to all living cells. Later, a different group showed experimentally that this kind of functional structure does, indeed, exist in several microorganisms. Methodology/Principal Findings: Here we have analyzed around 2,500,000 different DMNs in order to investigate the underlying mechanism of this dynamic global configuration. The numerical analyses that we have performed show that this global configuration is an emergent property inherent to the cellular metabolic dynamics. Concretely, we have found that the existence of a high number of enzymatic subsystems belonging to the DMNs is the fundamental element for the spontaneous emergence of a functional reactive structure characterized by a metabolic core formed by several sets of enzymes always in an active state. Conclusions/Significance: This self-organized dynamic structure seems to be an intrinsic characteristic of metabolism, common to all living cellular organisms. To better understand cellular functionality, it will be crucial to structurally characterize these enzymatic self-organized global structures.
Abstract:
The learning of probability distributions from data is a ubiquitous problem in the fields of Statistics and Artificial Intelligence. During the last decades, several learning algorithms have been proposed to learn probability distributions based on decomposable models, due to their advantageous theoretical properties. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k, which controls the complexity of the model. Unfortunately, the problem of learning a maximum likelihood decomposable model given a maximum clique size is NP-hard for k > 2. In this work, we propose a family of algorithms which approximates this problem with a computational complexity of O(k · n^2 log n) in the worst case, where n is the number of involved random variables. The structures of the decomposable models that solve the maximum likelihood problem are called maximal k-order decomposable graphs. Our proposals, called fractal trees, construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy based on the particular features of this type of structures. Additionally, we propose a prune-and-graft procedure which transforms a maximal k-order decomposable graph into another one, increasing its likelihood. We have implemented two particular fractal tree algorithms, called parallel fractal tree and sequential fractal tree. These algorithms can be considered a natural extension of Chow and Liu’s algorithm, from k = 2 to arbitrary values of k. Both algorithms have been compared against other efficient approaches in artificial and real domains, and they have shown competitive behavior on the maximum likelihood problem. Due to their low computational complexity, they are especially recommended for high dimensional domains.
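For the k = 2 base case, which the abstract identifies with Chow and Liu's algorithm, a minimal sketch may help (the chain-structured toy data is an assumption for illustration): the maximum likelihood tree is the maximum-weight spanning tree under pairwise mutual information.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)

# Hypothetical binary data with chain dependence x0 -> x1 -> x2 -> x3:
# each variable copies its predecessor and flips with probability 0.1.
n, d = 5000, 4
data = np.empty((n, d), dtype=int)
data[:, 0] = rng.integers(0, 2, n)
for j in range(1, d):
    flip = rng.random(n) < 0.1
    data[:, j] = data[:, j - 1] ^ flip

def mutual_information(a, b):
    """Plug-in estimate of the mutual information of two discrete samples (nats)."""
    mi = 0.0
    for va in np.unique(a):
        for vb in np.unique(b):
            pab = np.mean((a == va) & (b == vb))
            if pab > 0:
                mi += pab * np.log(pab / (np.mean(a == va) * np.mean(b == vb)))
    return mi

def chow_liu_edges(x):
    """Maximum likelihood tree (k = 2): maximum-weight spanning tree over
    pairwise mutual information, grown Kruskal-style with union-find."""
    weights = sorted(
        ((mutual_information(x[:, i], x[:, j]), i, j)
         for i, j in combinations(range(x.shape[1]), 2)),
        reverse=True)
    parent = list(range(x.shape[1]))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]   # path halving
            u = parent[u]
        return u
    edges = []
    for _, i, j in weights:
        ri, rj = find(i), find(j)
        if ri != rj:                        # keep the edge unless it closes a cycle
            parent[ri] = rj
            edges.append((i, j))
    return edges

print(sorted(chow_liu_edges(data)))  # → [(0, 1), (1, 2), (2, 3)]
```

The fractal tree algorithms described above generalize this construction: each subsequent step enlarges the cliques from size i to i + 1 while preserving decomposability, which is what keeps the overall cost at O(k · n^2 log n).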
Abstract:
[ES] This paper defines syntactic change, analyzes the factors that cause or facilitate it, and examines its main types in Ancient Greek.
Abstract:
18 p.
Abstract:
One of the key systems of a Wave Energy Converter for the extraction of wave energy is the Power Take-Off (PTO) device, which transforms the mechanical energy of a moving body into electrical energy. This paper describes the model of an innovative PTO based on an array of independently activated double-acting hydraulic cylinders. The model has been developed using a simulation tool based on a port-based approach to modeling hydraulic systems. The components and subsystems used in the model have been parameterized as real components, with their values experimentally obtained from an existing prototype; in fact, the model takes into account most of the hydraulic losses of each component. The simulations show the flexibility to apply different restraining torques to the input movement depending on the geometrical configuration and on the hydraulic cylinders on duty, both easily modified by a control law. The combination of these two actions provides suitable flexibility to adapt the device to different sea states while optimizing the energy extraction. The model has been validated on a real test bench, showing good correlation between simulations and experimental tests.