900 results for ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA)
Abstract:
Modelling video sequences by subspaces has recently shown promise for recognising human actions. Subspaces are able to accommodate the effects of various image variations and can capture the dynamic properties of actions. Subspaces form a non-Euclidean, curved Riemannian manifold known as a Grassmann manifold. Inference on such manifolds is usually achieved by embedding them in higher-dimensional Euclidean spaces. In this paper, we instead propose to embed the Grassmann manifolds into reproducing kernel Hilbert spaces and then tackle the problem of discriminant analysis on such manifolds. To achieve efficient machinery, we propose graph-based local discriminant analysis that utilises within-class and between-class similarity graphs to characterise intra-class compactness and inter-class separability, respectively. Experiments on the KTH, UCF Sports, and Ballet datasets show that the proposed approach obtains marked improvements in discrimination accuracy in comparison to several state-of-the-art methods, such as the kernel version of the affine hull image-set distance, tensor canonical correlation analysis, spatio-temporal words, and a hierarchy of discriminative space-time neighbourhood features.
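As a rough illustration of the subspace representation and kernel embedding steps, the sketch below (Python/NumPy; the kernel choice, subspace dimension and toy data are illustrative assumptions, not details given in the abstract) builds a Grassmann point from an image set and evaluates a projection kernel between two such points.

```python
import numpy as np

def subspace_basis(image_set, dim=10):
    """Orthonormal basis of the subspace spanned by a set of vectorised frames.

    image_set: (n_pixels, n_frames) matrix; each column is one vectorised frame.
    The leading left singular vectors give a point on the Grassmann manifold.
    """
    u, _, _ = np.linalg.svd(image_set, full_matrices=False)
    return u[:, :dim]

def projection_kernel(y1, y2):
    """Projection (Frobenius) kernel between two Grassmann points.

    This is one common positive-definite kernel on Grassmann manifolds; the
    abstract does not state which kernel the authors actually use.
    """
    return np.linalg.norm(y1.T @ y2, 'fro') ** 2

# Toy usage with random data standing in for two vectorised video sequences.
rng = np.random.default_rng(0)
a = subspace_basis(rng.normal(size=(1024, 40)))
b = subspace_basis(rng.normal(size=(1024, 40)))
print(projection_kernel(a, b))
```

A full pipeline would evaluate such a kernel over all training image sets and feed the resulting Gram matrix to the graph-based local discriminant analysis.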
Abstract:
Spectroscopic studies of complex clinical fluids have led to a more holistic approach to their chemical analysis becoming more popular and widely employed. The efficient and effective interpretation of multidimensional spectroscopic data relies on many chemometric techniques, one group of which comprises the so-called correlation analysis methods. Typical of these techniques are two-dimensional correlation analysis and statistical total correlation spectroscopy (STOCSY). Whilst the former has largely been applied to optical spectroscopic analysis, STOCSY was developed for, and has been applied almost exclusively to, NMR metabonomic studies. Using a 1H NMR study of human blood plasma from subjects recovering from exhaustive exercise trials, the basic concepts and applications of these techniques are examined. Typical information from their application to NMR-based metabonomics is presented, and their value in aiding interpretation of NMR data obtained from biological systems is illustrated. Major energy metabolites are identified in the NMR spectra, and the dynamics of their appearance and removal from plasma during exercise recovery are illustrated and discussed. The complementary nature of two-dimensional correlation analysis and statistical total correlation spectroscopy is highlighted.
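For readers unfamiliar with STOCSY, the sketch below (plain NumPy; array shapes and the driver index are invented for illustration) shows its core operation: correlating the intensity at a chosen driver chemical-shift point against every other point across a set of aligned 1H NMR spectra.

```python
import numpy as np

def stocsy(spectra, driver_idx):
    """One-dimensional STOCSY trace.

    spectra: (n_samples, n_points) matrix of aligned 1H NMR spectra.
    driver_idx: index of the chemical-shift point used as the driver peak.
    Returns the Pearson correlation of the driver intensity with every point.
    """
    x = spectra - spectra.mean(axis=0)              # centre each spectral point
    d = x[:, driver_idx]
    cov = x.T @ d / (len(d) - 1)
    return cov / (x.std(axis=0, ddof=1) * d.std(ddof=1))

# Toy usage: 20 spectra of 500 points, driver peak at point 100; points whose
# intensities co-vary with the driver (e.g. peaks of the same metabolite)
# show up as high correlations in the returned trace.
rng = np.random.default_rng(1)
trace = stocsy(rng.normal(size=(20, 500)), driver_idx=100)
```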
Abstract:
The diagnostics of mechanical components operating in transient conditions is still an open issue, in both the research and industrial fields. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics in any operating condition and at any noise level, is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD), Minimum Entropy Deconvolution (MED) and the analytical approach of the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now represented the benchmark for bearing diagnostics.
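To make the comparison with the benchmark concrete, here is a minimal envelope-spectrum routine of the kind EEMED is compared against (NumPy/SciPy; the toy fault signal and its parameters are invented for illustration, and this is not the EEMED tool itself).

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(x, fs):
    """Amplitude spectrum of the signal envelope (classical bearing diagnostics)."""
    envelope = np.abs(hilbert(x))                   # instantaneous amplitude
    envelope -= envelope.mean()                     # drop the DC component
    spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum

# Toy fault signal: repetitive impacts roughly every 1/87 s, each exciting a
# decaying 3 kHz resonance, plus background noise.
fs = 10_000
t = np.arange(0, 1, 1 / fs)
impulses = np.zeros_like(t)
impulses[::fs // 87] = 1.0
resonance = np.exp(-2000 * t[:200]) * np.sin(2 * np.pi * 3000 * t[:200])
x = np.convolve(impulses, resonance, mode="same")
x += 0.1 * np.random.default_rng(2).normal(size=len(t))
freqs, spec = envelope_spectrum(x, fs)              # expect a peak near 87 Hz
```

A bearing fault shows up as a peak at its characteristic frequency (here roughly 87 Hz) in the envelope spectrum; EEMED is reported to retain this capability under non-stationary operation, where plain envelope analysis degrades.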
Abstract:
The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or suffer a loss of effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition, Minimum Entropy Deconvolution and the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings of the traction systems of high-speed trains, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on envelope analysis or spectral kurtosis, which have until now represented the benchmark for bearing diagnostics.
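Since Minimum Entropy Deconvolution is one of the named building blocks, a compact Wiggins-style MED iteration is sketched below (NumPy/SciPy; the filter length and iteration count are illustrative, and this is not the authors' exact formulation).

```python
import numpy as np
from scipy.linalg import toeplitz

def med(x, filter_len=30, n_iter=30):
    """Wiggins-style minimum entropy deconvolution.

    Iteratively designs an FIR filter whose output is as impulsive
    (high-kurtosis) as possible, which tends to recover fault impacts
    smeared by the transmission path.
    """
    n = len(x)
    # Toeplitz autocorrelation matrix of the input signal.
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(filter_len)])
    big_r = toeplitz(r)
    f = np.zeros(filter_len)
    f[filter_len // 2] = 1.0                        # start from a delayed delta
    for _ in range(n_iter):
        y = np.convolve(x, f)[:n]                   # current filter output
        g = (np.sum(y ** 2) / np.sum(y ** 4)) * y ** 3
        # Cross-correlation between g and delayed copies of the input.
        b = np.array([np.dot(g[k:], x[:n - k]) for k in range(filter_len)])
        f = np.linalg.solve(big_r, b)
        f /= np.linalg.norm(f)                      # keep the filter normalised
    return np.convolve(x, f)[:n], f

# Toy usage on a noise record; on real vibration data the MED output would
# then be passed to an envelope/Hilbert analysis stage.
deconvolved, fir = med(np.random.default_rng(3).normal(size=4096))
```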
Abstract:
We investigate the utility, for computational Bayesian analyses, of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
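To show what such a recursive estimator looks like in practice, below is a minimal self-consistent "biased sampling" update over a set of unnormalised bridging densities (NumPy/SciPy; the bridging densities, sample sizes and convergence settings are placeholders rather than those used in the paper).

```python
import numpy as np
from scipy.special import logsumexp

def recursive_log_z(log_q, n_draws, n_iter=500, tol=1e-10):
    """Self-consistent 'biased sampling' / reverse-logistic-regression update.

    log_q[j, i]: log unnormalised density of bridging distribution j at the
                 i-th draw of the pooled sample.
    n_draws[j]:  number of pooled draws originating from distribution j.
    Returns log normalising constants relative to distribution 0.
    """
    log_n = np.log(np.asarray(n_draws, dtype=float))
    log_z = np.zeros(log_q.shape[0])
    for _ in range(n_iter):
        # Pseudo-mixture denominator sum_j n_j q_j(x_i) / z_j at every draw.
        log_denom = logsumexp(log_n[:, None] + log_q - log_z[:, None], axis=0)
        new = logsumexp(log_q - log_denom[None, :], axis=1)
        new -= new[0]                               # fix z_0 as the reference
        if np.max(np.abs(new - log_z)) < tol:
            break
        log_z = new
    return log_z

# Toy check with two unnormalised Gaussians: the log-ratio of their
# normalising constants should recover log(sigma).
rng = np.random.default_rng(4)
sigma = 3.0
x = np.concatenate([rng.normal(0, 1, 2000), rng.normal(0, sigma, 2000)])
log_q = np.vstack([-0.5 * x ** 2, -0.5 * (x / sigma) ** 2])
print(recursive_log_z(log_q, [2000, 2000])[1], np.log(sigma))
```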
Abstract:
An escape mechanism in a bistable system driven by colored noise of large but finite correlation time (tau) is analyzed. It is shown that the fluctuating potential theory [Phys. Rev. A 38, 3749 (1988)] becomes invalid in a region around the inflection points of the bistable potential, so that this theory underestimates the mean first passage time at finite tau. It is shown that transitions at large but finite tau are caused by noise spikes, with edges rising and falling exponentially on a time scale of O(tau). Simulation of the dynamics of the bistable system driven by noise spikes of the above-mentioned nature clearly reveals the physical mechanism behind the transition.
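A minimal simulation of the setting described above, assuming a standard double-well potential and Ornstein-Uhlenbeck (exponentially correlated) driving noise; all parameter values are illustrative and not taken from the paper.

```python
import numpy as np

def simulate_bistable_ou(tau=5.0, intensity=0.5, dt=0.01, total_time=1000.0, seed=5):
    """Euler integration of x' = x - x**3 + eta(t), where eta is an
    Ornstein-Uhlenbeck (exponentially correlated) noise of correlation time
    tau and intensity D, so that <eta(t)eta(s)> = (D/tau) exp(-|t-s|/tau)."""
    rng = np.random.default_rng(seed)
    n = int(total_time / dt)
    x = np.empty(n)
    x[0], eta = -1.0, 0.0                           # start in the left well
    for i in range(1, n):
        # Euler-Maruyama step for the coloured noise ...
        eta += (-eta / tau) * dt + (np.sqrt(2 * intensity) / tau) * np.sqrt(dt) * rng.normal()
        # ... and an explicit Euler step for the bistable coordinate.
        x[i] = x[i - 1] + (x[i - 1] - x[i - 1] ** 3 + eta) * dt
    return x

traj = simulate_bistable_ou()
transitions = np.sum(np.diff(np.sign(traj)) != 0)   # rough count of well crossings
```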
Abstract:
A modeling framework is presented in this paper, integrating hydrologic scenarios projected from a General Circulation Model (GCM) with a water quality simulation model to quantify the expected future risk. Statistical downscaling with Canonical Correlation Analysis (CCA) is carried out to develop future scenarios of hydro-climate variables, starting from simulations provided by a GCM. A Multiple Logistic Regression (MLR) is used to quantify the risk of Low Water Quality (LWQ) corresponding to a threshold quality level, considering streamflow and water temperature as explanatory variables. An Imprecise Fuzzy Waste Load Allocation Model (IFWLAM) presented in an earlier study is then used to develop adaptive policies to address the projected water quality risks. Application of the proposed methodology is demonstrated with a case study of the Tunga-Bhadra river in India. The results show that the projected changes in the hydro-climate variables tend to diminish DO levels, thus increasing the future risk of LWQ.
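As a toy version of the MLR risk step, the sketch below (Python with scikit-learn; the synthetic record, coefficients and scenario shifts are invented for illustration) fits a logistic model of low-water-quality occurrence on streamflow and water temperature and converts a projected hydro-climate scenario into an average risk.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)

# Synthetic historical record: streamflow (m3/s), water temperature (degC) and
# an indicator of water quality falling below the threshold level.
flow = rng.gamma(shape=3.0, scale=40.0, size=500)
temp = rng.normal(loc=26.0, scale=3.0, size=500)
p_lwq = 1.0 / (1.0 + np.exp(0.03 * flow - 0.4 * (temp - 26.0) - 1.0))
low_quality = (rng.random(500) < p_lwq).astype(int)

X = np.column_stack([flow, temp])
mlr = LogisticRegression().fit(X, low_quality)      # the MLR risk model

# Hypothetical downscaled scenario: lower flows and warmer water.
future = np.column_stack([flow * 0.85, temp + 1.5])
print("mean LWQ risk, baseline:", mlr.predict_proba(X)[:, 1].mean())
print("mean LWQ risk, projected:", mlr.predict_proba(future)[:, 1].mean())
```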
Abstract:
In the present study, variable-temperature FT-IR spectroscopic investigations were used to characterize the spectral changes in oleic acid during heating in the temperature range from -30 °C to 22 °C. In order to extract more information about the spectral variations taking place during the phase transition process, 2D correlation spectroscopy (2DCOS) was employed for the stretching (C=O) and rocking (CH2) bands of oleic acid. However, the interpretation of these spectral variations in the FT-IR spectra is not straightforward, because the absorption bands are heavily overlapped and change due to two processes: recrystallization of the γ-phase and melting of the oleic acid. Furthermore, the solid phase transition from the γ- to the α-phase was also observed between -4 °C and -2 °C. Thus, for a more detailed 2DCOS analysis, we split the spectral data set into subsets recorded between -30 °C and -16 °C, -16 °C and 10 °C, and 10 °C and 22 °C. In the corresponding synchronous and asynchronous 2D correlation plots, absorption bands that are characteristic of the crystalline and amorphous regions of oleic acid were separated.
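For reference, the synchronous and asynchronous maps of generalised 2D correlation spectroscopy can be computed with a few lines of linear algebra (NumPy; a generic sketch of Noda's formalism, not the authors' processing pipeline).

```python
import numpy as np

def two_d_cos(spectra):
    """Synchronous and asynchronous maps of generalised 2D correlation
    spectroscopy (Noda's formalism).

    spectra: (n_perturbations, n_wavenumbers) spectra measured along the
             perturbation (here temperature) axis.
    """
    m = spectra.shape[0]
    dyn = spectra - spectra.mean(axis=0)            # dynamic (mean-centred) spectra
    sync = dyn.T @ dyn / (m - 1)
    # Hilbert-Noda transformation matrix: 0 on the diagonal, 1/(pi*(k-j)) elsewhere.
    idx = np.arange(m)
    diff = idx[None, :] - idx[:, None]
    noda = np.zeros((m, m))
    noda[diff != 0] = 1.0 / (np.pi * diff[diff != 0])
    asyn = dyn.T @ noda @ dyn / (m - 1)
    return sync, asyn

# Toy usage: 15 spectra of 400 points recorded along a temperature ramp.
sync, asyn = two_d_cos(np.random.default_rng(7).normal(size=(15, 400)))
```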
Abstract:
This paper presents an approach to modeling the expected impacts of climate change on irrigation water demand in a reservoir command area. A statistical downscaling model and an evapotranspiration model are used with general circulation model (GCM) output to predict the anticipated change in the monthly irrigation water requirement of a crop. Specifically, we quantify the likely changes in irrigation water demands at a location in the command area as a response to the projected changes in precipitation and evapotranspiration at that location. Statistical downscaling with a canonical correlation analysis is carried out to develop future scenarios of meteorological variables (rainfall, relative humidity (RH), wind speed (U-2), radiation, and maximum (Tmax) and minimum (Tmin) temperatures), starting from simulations provided by a GCM for a specified emission scenario. The medium-resolution Model for Interdisciplinary Research on Climate GCM is used with the A1B scenario to assess the likely changes in irrigation demands for paddy, sugarcane, permanent garden and semidry crops over the command area of the Bhadra reservoir, India. Results from the downscaling model suggest that monthly rainfall is likely to increase in the reservoir command area. RH, Tmax and Tmin are also projected to increase, with small changes in U-2. Consequently, the reference evapotranspiration, modeled by the Penman-Monteith equation, is predicted to increase. The irrigation requirements are assessed on a monthly scale at nine selected locations encompassing the Bhadra reservoir command area. The irrigation requirements are projected to increase in most cases, suggesting that the effect of the projected increase in rainfall on the irrigation demands is offset by the effect of the projected changes in the other meteorological variables (viz., Tmax and Tmin, solar radiation, RH and U-2). The irrigation demand assessment carried out at the river basin scale will be useful for future irrigation management systems.
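Since the reference evapotranspiration is modeled with the Penman-Monteith equation, a minimal FAO-56-style implementation is sketched below (Python/NumPy; the daily form with an approximate actual-vapour-pressure term and illustrative input values is assumed, not the study's exact configuration).

```python
import numpy as np

def fao56_reference_et(tmax, tmin, rh_mean, u2, rn, g=0.0, pressure=101.3):
    """Daily reference evapotranspiration (mm/day) from the FAO-56
    Penman-Monteith equation.

    tmax, tmin: daily max/min air temperature (degC); rh_mean: mean relative
    humidity (%); u2: wind speed at 2 m (m/s); rn, g: net radiation and soil
    heat flux (MJ m-2 day-1); pressure: atmospheric pressure (kPa).
    """
    def svp(t):                                     # saturation vapour pressure, kPa
        return 0.6108 * np.exp(17.27 * t / (t + 237.3))

    tmean = (tmax + tmin) / 2.0
    es = (svp(tmax) + svp(tmin)) / 2.0
    ea = es * rh_mean / 100.0                       # approximate actual vapour pressure
    delta = 4098.0 * svp(tmean) / (tmean + 237.3) ** 2
    gamma = 0.000665 * pressure                     # psychrometric constant, kPa/degC
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (tmean + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# Toy usage with plausible values for a tropical command area.
print(fao56_reference_et(tmax=33.0, tmin=22.0, rh_mean=70.0, u2=2.0, rn=14.0))
```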
Abstract:
Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet- and Empirical Mode Decomposition (EMD)-based time series models. Methods have been developed in the past to decompose a time series into components; forecasting these components and combining them with a random component can yield predictions. Using this idea, wavelet and EMD analyses are incorporated separately, each of which decomposes a time series into independent orthogonal components with both time and frequency localization. The component series are fitted with specific auto-regressive models to obtain forecasts, which are later combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. The predictability is checked for six- and twelve-month-ahead forecasts across both methodologies. Based on performance measures, it is observed that the wavelet-based method has better prediction capabilities than the EMD-based method, despite some of the limitations of time series methods and the manner in which the decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Some modifications that could extend the scope of applicability to other areas in the field of hydrology are also discussed.
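A minimal sketch of the wavelet-plus-autoregression idea described above (assuming Python with PyWavelets and statsmodels; the wavelet, decomposition level, AR order and synthetic data are illustrative choices, not the study's settings).

```python
import numpy as np
import pywt                                         # PyWavelets (assumed available)
from statsmodels.tsa.ar_model import AutoReg        # statsmodels (assumed available)

def wavelet_ar_forecast(series, wavelet="db4", level=3, lags=12, steps=6):
    """Decompose a series into wavelet components, fit an AR model to each
    component and sum the component forecasts."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    forecast = np.zeros(steps)
    for i in range(len(coeffs)):
        # Reconstruct the i-th component by zeroing all other coefficient bands.
        masked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        component = pywt.waverec(masked, wavelet)[: len(series)]
        fit = AutoReg(component, lags=lags).fit()
        forecast += fit.forecast(steps=steps)
    return forecast

# Toy usage: 20 years of synthetic monthly streamflow with seasonality and noise.
t = np.arange(240)
flow = 100 + 30 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(8).normal(0, 10, 240)
print(wavelet_ar_forecast(flow))
```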
Abstract:
Guidance laws based on a conventional sliding mode ensure only asymptotic convergence. However, convergence to the desired impact angle within a finite time is important in most practical guidance applications, and existing finite-time convergent guidance laws suffer from a singularity that leads to control saturation. In this paper, guidance laws to intercept targets at a desired impact angle, from any initial heading angle and without exhibiting any singularity, are presented. The desired impact angle, which is defined in terms of a desired line-of-sight angle, is achieved in finite time by selecting the interceptor's lateral acceleration to enforce a nonsingular terminal sliding mode on a switching surface designed using nonlinear engagement dynamics. Numerical simulation results are presented to validate the proposed guidance laws for different initial engagement geometries and impact angles. Although the guidance laws are designed for constant-speed interceptors, their robustness against time-varying interceptor speed is also evaluated through extensive simulation results.
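To give a flavour of the nonlinear engagement dynamics on which the switching surface is designed, the sketch below integrates planar interceptor-target kinematics against a stationary target (Python/NumPy; all values are illustrative, and a proportional navigation command is used purely as a stand-in for the paper's nonsingular terminal sliding-mode law).

```python
import numpy as np

def simulate_engagement(guidance, r0=10_000.0, los0=np.deg2rad(30.0),
                        gamma0=np.deg2rad(60.0), vm=300.0, dt=0.01):
    """Planar engagement kinematics against a stationary target.

    r: interceptor-target range, los: line-of-sight angle, gamma: interceptor
    flight-path angle; 'guidance' maps the state to a lateral acceleration.
    """
    r, los, gamma = r0, los0, gamma0
    while r > 1.0:
        a_m = guidance(r, los, gamma, vm)
        r_dot = -vm * np.cos(gamma - los)
        los_dot = -vm * np.sin(gamma - los) / r
        r += r_dot * dt
        los += los_dot * dt
        gamma += (a_m / vm) * dt
        if r_dot >= 0.0:                            # closing velocity lost: abort
            break
    return r, los

# Proportional navigation (N = 4) used purely as a stand-in command; the
# paper's nonsingular terminal sliding-mode law would replace this callable.
pn = lambda r, los, gamma, vm: 4.0 * vm * (-vm * np.sin(gamma - los) / r)
miss_distance, final_los = simulate_engagement(pn)
print(miss_distance, np.rad2deg(final_los))
```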