5 results for SPECTRAL-ANALYSIS


Relevance:

60.00%

Publisher:

Abstract:

In a recent paper, Leong and Huang (2010) [Journal of Applied Statistics 37, 215–233] proposed a wavelet-correlation-based approach to test for cointegration between two time series. However, correlation and cointegration are two different concepts, even when wavelet analysis is used. It is known that statistics based on nonstationary integrated variables have non-standard asymptotic distributions. Wavelet analysis, however, offsets the integrating order of nonstationary series, so that traditional asymptotics for stationary variables suffice to ascertain the statistical properties of wavelet-based statistics. On this basis, this note shows that wavelet correlations cannot be used as a test of cointegration.
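The stationarity argument can be illustrated with a minimal sketch (a hypothetical example, not the authors' code), assuming level-1 Haar detail coefficients as the wavelet filter: the Haar detail of an I(1) series is a scaled first difference and hence stationary, so the wavelet correlation of two independent random walks behaves like an ordinary correlation between stationary series, while the correlation in levels can be spuriously large.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Two independent random walks (I(1) processes).
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

def haar_detail(z):
    """Level-1 Haar detail coefficients: (z[t] - z[t-1]) / 2.

    Differencing removes the unit root, so the details are stationary."""
    return (z[1:] - z[:-1]) / 2.0

dx, dy = haar_detail(x), haar_detail(y)

level_corr = np.corrcoef(x, y)[0, 1]      # can be large despite independence
wavelet_corr = np.corrcoef(dx, dy)[0, 1]  # near zero for independent series

print(f"correlation in levels: {level_corr:+.3f}")
print(f"wavelet correlation:   {wavelet_corr:+.3f}")
```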

Relevance:

60.00%

Publisher:

Abstract:

The hydrological response of a catchment to rainfall on different timescales is the result of a complex system involving a range of physical processes that may operate simultaneously and have different spatial and temporal influences. This paper presents an analysis of the streamflow response of a small humid-temperate catchment (Aixola, 4.8 km²) in the Basque Country on different timescales and discusses the role of the controlling factors. First, daily time series analysis was used to establish a hypothesis on the general functioning of the catchment through the relationship between precipitation and discharge on annual and multiannual scales (2003–2008). Second, rainfall-runoff relationships and relationships among several hydrological variables, including catchment antecedent conditions, were explored at the event scale (222 events) to check and refine the hypothesis. Finally, the evolution of electrical conductivity (EC) during some of the monitored storm events (28 events) was examined to identify the time origin of the waters. The correlation and spectral analyses indicated a quick response of the catchment to almost all rainfall events as well as a considerable regulation capacity. These results agree with the event-scale runoff data; however, the event analysis revealed the non-linearity of the system, as antecedent conditions play a significant role in this catchment. Furthermore, analysis at the event scale made it possible to identify the factors (precipitation, precipitation intensity, and initial discharge) controlling different aspects of the runoff response (runoff coefficient and discharge increase) for this catchment. Finally, the evolution of the EC of the waters enabled the time origin (event or pre-event waters) of the quickflow to be established; specifically, the conductivity showed that pre-event waters usually represent a high percentage of the total discharge during runoff peaks. The importance of soil waters in the catchment is being studied in more depth.
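The correlation analysis behind the response-time estimate can be sketched as follows (a hypothetical numpy example on synthetic data, not the study's series): rainfall drives a discharge series shifted by a known lag, and the lag of the rainfall-discharge cross-correlation peak recovers the catchment response time.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 2000
lag_days = 2  # assumed true response time of the synthetic catchment

# Synthetic daily rainfall: mostly dry days with occasional events.
rain = rng.exponential(5.0, n_days) * (rng.random(n_days) < 0.3)

# Discharge responds to rainfall shifted by `lag_days`, plus noise.
discharge = np.zeros(n_days)
discharge[lag_days:] = 0.6 * rain[:-lag_days]
discharge += rng.normal(0, 0.1, n_days)

def best_lag(p, q, max_lag=10):
    """Lag (in days) maximizing the rainfall-discharge cross-correlation."""
    p = p - p.mean()
    q = q - q.mean()
    corrs = [np.corrcoef(p[:len(p) - k], q[k:])[0, 1]
             for k in range(max_lag + 1)]
    return int(np.argmax(corrs))

print("estimated response time:", best_lag(rain, discharge), "days")  # 2 days
```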

Relevance:

60.00%

Publisher:

Abstract:

Background: Quality of cardiopulmonary resuscitation (CPR) is key to increasing survival from cardiac arrest. Providing chest compressions with adequate rate and depth is difficult even for well-trained rescuers. Real-time feedback devices are intended to help enhance chest compression quality. These devices are typically based on double integration of the acceleration to obtain the chest displacement during compressions. The integration process is inherently unstable and leads to large errors unless boundary conditions are applied for each compression cycle. Commercial solutions use additional reference signals to establish these conditions, which requires additional sensors. Our aim was to study the accuracy of three methods based solely on the acceleration signal in providing feedback on compression rate and depth.

Materials and Methods: We simulated a CPR scenario with several volunteers, grouped in couples, providing chest compressions on a resuscitation manikin. Different target rates (80, 100, 120, and 140 compressions per minute) and a target depth of at least 50 mm were indicated. The manikin was equipped with a displacement sensor, and the accelerometer was placed between the rescuer's hands and the manikin's chest. We designed three alternatives to direct integration, based on different principles (linear filtering, analysis of velocity, and spectral analysis of acceleration), and evaluated their accuracy by comparing the estimated depth and rate with the values obtained from the reference displacement sensor.

Results: The median (IQR) percent error was 5.9% (2.8–10.3), 6.3% (2.9–11.3), and 2.5% (1.2–4.4) for depth and 1.7% (0.0–2.3), 0.0% (0.0–2.0), and 0.9% (0.4–1.6) for rate, respectively. Depth accuracy depended on the target rate (p < 0.001) and on the rescuer couple (p < 0.001) within each method.

Conclusions: Accurate feedback on chest compression depth and rate during CPR is possible using exclusively the chest acceleration signal. The algorithm based on spectral analysis showed the best performance. Despite these encouraging results, further research should be conducted to assess the performance of these algorithms with clinical data.
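As an illustration of the spectral principle (a simplified sketch assuming a purely sinusoidal compression waveform, not the authors' algorithm), the compression rate can be read off the dominant FFT peak of the acceleration, and the depth recovered by dividing the spectral amplitude by (2πf)², which inverts the double time-derivative in the frequency domain.

```python
import numpy as np

fs = 250.0        # sampling rate (Hz), assumed
f_true = 2.0      # 2 Hz = 120 compressions per minute
depth_true = 0.05 # 50 mm compression amplitude (meters)

t = np.arange(0, 30, 1 / fs)
# For d(t) = A*sin(2*pi*f*t), acceleration is -A*(2*pi*f)^2 * sin(2*pi*f*t).
accel = -depth_true * (2 * np.pi * f_true) ** 2 * np.sin(2 * np.pi * f_true * t)

# Spectral analysis: locate the dominant peak of the acceleration spectrum.
spectrum = np.fft.rfft(accel)
freqs = np.fft.rfftfreq(len(accel), 1 / fs)
peak = np.argmax(np.abs(spectrum))

f_est = freqs[peak]
rate_cpm = 60 * f_est
# A = |a| / (2*pi*f)^2 undoes the two time derivatives.
accel_amp = 2 * np.abs(spectrum[peak]) / len(accel)
depth_est = accel_amp / (2 * np.pi * f_est) ** 2

print(f"rate:  {rate_cpm:.0f} cpm")        # 120 cpm
print(f"depth: {depth_est * 1000:.1f} mm") # 50.0 mm
```

In practice the acceleration is noisy and non-sinusoidal, which is why the measured errors above are nonzero; this sketch only shows why the spectral route avoids the instability of direct double integration.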

Relevance:

30.00%

Publisher:

Abstract:

Hyper-spectral data allow the construction of more robust statistical models of material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial task. To facilitate efficient feature extraction, decorrelation techniques are therefore commonly applied to reduce the dimensionality of the hyper-spectral data, with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation, such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods, require complex and subjective training procedures; in addition, the compressed spectral information is not directly related to the physical (spectral) characteristics of the analyzed materials. The main objective of this article is to introduce and evaluate a new data decorrelation methodology that closely emulates human vision. The proposed data decorrelation scheme has been employed to minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
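The PCA baseline mentioned above can be sketched with plain numpy (a hypothetical example on synthetic data, not the article's method): pixels built as mixtures of a few material spectra yield highly correlated bands, which a handful of principal components compress into compact descriptors while retaining most of the variance.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pixels, n_bands, n_components = 5000, 64, 3

# Synthetic hyper-spectral pixels: mixtures of 3 material spectra plus noise,
# so the 64 bands are highly correlated and intrinsically low-dimensional.
endmembers = rng.random((3, n_bands))
abundances = rng.dirichlet(np.ones(3), size=n_pixels)
pixels = abundances @ endmembers + 0.01 * rng.standard_normal((n_pixels, n_bands))

# PCA via SVD of the mean-centered data matrix.
centered = pixels - pixels.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:n_components].T  # compact image descriptors

explained = (S[:n_components] ** 2).sum() / (S ** 2).sum()
print(f"{n_components} components explain {100 * explained:.1f}% of the variance")
```

As the article notes, such components are statistically compact but not directly tied to the physical spectra of the materials, which motivates the perceptually inspired alternative it proposes.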