972 results for fuzzy sample entropy
Abstract:
In this paper, a time series complexity analysis of dense-array electroencephalogram (EEG) signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity of signals recorded from systems that can range from the purely deterministic to the purely stochastic realm. The analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics for EEG recorded in three cases: a passive, eyes-closed condition; a mental arithmetic task; and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited to short physiological signals such as the EEG, and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought about by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in brain state and broadens its scope of application in EEG-based brain studies.
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves, consisting of the differences between the qSampEn of the original and surrogate series, were calculated. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff_max) for q ≠ 1. The values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only qSDiff_max values were capable of distinguishing the HRV groups (p-values 5.10 × 10⁻³, 1.11 × 10⁻⁷, and 5.50 × 10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, and this suggests a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
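For reference, the Tsallis generalized entropy that qSampEn builds on can be sketched as follows. This is a minimal illustration of S_q = (1 − Σ pᵢ^q)/(q − 1) only, not the authors' qSampEn code; the probability-vector input is an assumption made for the example:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis generalized entropy S_q = (1 - sum p_i^q) / (q - 1).
    Recovers the Shannon entropy in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability states contribute nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

For a uniform distribution over n states this reduces to (1 − n^(1−q))/(q − 1), which is how the q-dependence of measures like qSampEn arises.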
Abstract:
In financial decision-making processes, the adopted weights of the objective functions have significant impacts on the final decision outcome. However, conventional rating and weighting methods have difficulty deriving appropriate weights for complex decision-making problems with imprecise information. Entropy is a quantitative measure of uncertainty and has been useful in exploring the weights of attributes in decision making. A fuzzy and entropy-based mathematical approach is employed to solve the weighting problem of the objective functions in an overall cash-flow model. A multiproject being undertaken by a medium-size construction firm in Hong Kong was used as a real case study to demonstrate the application of entropy in multiproject cash-flow situations. The results indicate that the overall before-tax profit was HK$0.11 million lower after the introduction of appropriate weights. In addition, the best time to invest in new projects arising from positive cash flow was identified to be two working months earlier than under the non-weighted system.
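The entropy-based weighting idea used here can be illustrated with the standard Shannon entropy weight method: criteria whose values vary more across alternatives carry more information and receive larger weights. A minimal sketch; the decision matrix and column-sum normalization are generic assumptions, not details taken from the case study:

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Entropy weight method: normalise each criterion column, compute its
    Shannon entropy, and weight criteria by their degree of diversification."""
    X = np.asarray(decision_matrix, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)  # normalise each criterion column to a distribution
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(m)  # entropy per criterion, in [0, 1]
    d = 1.0 - E                              # degree of diversification
    return d / d.sum()                       # weights sum to 1
```

A criterion that takes the same value for every alternative has entropy 1 and therefore weight 0: it cannot discriminate between the alternatives.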
Abstract:
The concept of entropy rate is well defined in dynamical systems theory, but it is impossible to apply directly to finite real-world data sets. With this in mind, Pincus developed Approximate Entropy (ApEn), which uses ideas from Eckmann and Ruelle to create a regularity measure based on entropy rate that can be used to determine the influence of chaotic behaviour on a real-world signal. However, this measure was found not to be robust, and so an improved formulation known as Sample Entropy (SampEn) was created by Richman and Moorman to address these issues. We have developed a new, related regularity measure which is not based on the theory provided by Eckmann and Ruelle and proves to be a better-behaved measure of complexity than the previous measures, whilst still retaining a low computational cost.
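As a concrete reference for the SampEn measure that recurs throughout these abstracts, here is a minimal sketch of the standard Richman–Moorman formulation. It is illustrative only; the defaults m = 2 and r = 0.2·SD are common conventions, not values taken from any of the papers listed:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B), where B counts pairs of length-m templates within
    tolerance r (Chebyshev distance) and A does the same for length m+1.
    Self-matches are excluded, which is what distinguishes SampEn from ApEn."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)  # tolerance scaled to the signal's standard deviation
    n = len(x)

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b = match_count(m)
    a = match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal (e.g. a sinusoid) yields a low SampEn, while uncorrelated noise yields a high one, which is exactly the behaviour the second abstract above cautions about when interpreting high "complexity" values.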
Abstract:
The use of human brain electroencephalography (EEG) signals for automatic person identification has been investigated for a decade. It has been found that the performance of an EEG-based person identification system depends strongly on which features are extracted from multi-channel EEG signals. Linear methods such as Power Spectral Density and Autoregressive Model have been used to extract EEG features; however, these methods assume that EEG signals are stationary. In fact, EEG signals are complex, non-linear, non-stationary, and random in nature. In addition, other factors such as brain condition or human characteristics may have impacts on the performance, but these factors have not been investigated and evaluated in previous studies. It has been found in the literature that entropy is used to measure the randomness of non-linear time series data. Entropy is also used to measure the level of chaos of brain–computer interface systems. Therefore, this thesis proposes to study the role of entropy in non-linear analysis of EEG signals to discover new features for EEG-based person identification. Five different entropy methods, including Shannon Entropy, Approximate Entropy, Sample Entropy, Spectral Entropy, and Conditional Entropy, are proposed to extract entropy features that are used to evaluate the performance of EEG-based person identification systems and the impacts of epilepsy, alcohol, age and gender characteristics on these systems. Experiments were performed on the Australian EEG and Alcoholism datasets. Experimental results have shown that, in most cases, the proposed entropy features yield very fast person identification, yet with comparable accuracy, because the feature dimension is low. In real-life security operations, timely response is critical. The experimental results have also shown that epilepsy, alcohol, age and gender characteristics have impacts on EEG-based person identification systems.
Abstract:
This study investigated movement synchronization of players within and between teams during competitive association football performance. Cluster phase analysis was introduced as a method to assess synchronies between whole teams and between individual players with their team as a function of time, ball possession and field direction. Measures of dispersion (SD) and regularity (sample entropy – SampEn – and cross sample entropy – Cross-SampEn) were used to quantify the magnitude and structure of synchrony. Large synergistic relations within each professional team sport collective were observed, particularly in the longitudinal direction of the field (0.89 ± 0.12) compared to the lateral direction (0.73 ± 0.16, p < .01). The coupling between the group measures of the two teams also revealed that changes in the synchrony of each team were intimately related (Cross-SampEn values of 0.02 ± 0.01). Interestingly, ball possession did not influence team synchronization levels. In player–team synchronization, individuals tended to be coordinated under near in-phase modes with team behavior (mean ranges between −7 and 5° of relative phase). The magnitudes of variations were low, but more irregular in time, for the longitudinal (SD: 18 ± 3°; SampEn: 0.07 ± 0.01), compared to the lateral direction (SD: 28 ± 5°; SampEn: 0.06 ± 0.01, p < .05) on-field. Increases in regularity were also observed between the first (SampEn: 0.07 ± 0.01) and second half (SampEn: 0.06 ± 0.01, p < .05) of the observed competitive game. Findings suggest that the method of analysis introduced in the current study may offer a suitable tool for examining team’s synchronization behaviors and the mutual influence of each team’s cohesiveness in competing social collectives.
Abstract:
A fuzzy identification method for motor rotor faults based on weighted fuzzy relative entropy is proposed. The method introduces weighting into the fuzzy relative entropy in order to identify the severity of motor rotor faults. The weighting increases the share that the fuzzy relative entropy of information-rich symbol intervals contributes to the fuzzy relative entropy over all intervals, so that the fault information in those intervals can be exploited more fully and reasonably for fault identification. Simulation experiments on broken rotor bar fault diagnosis show that the proposed method effectively achieves quantitative analysis of motor faults, accurately identifies the severity of rotor faults, improves the robustness of the algorithm, and increases the reliability and accuracy of fault classification.
Abstract:
The method of entropy has been useful in evaluating inconsistency in human judgments. This paper illustrates an entropy-based decision support system called e-FDSS for the solution of multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). It is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrated the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. The results showed that, when the amount of uncertainty embedded in the evaluation process is not taken into account, all decision vectors are indeed biased; the deviations of the decisions are then quantified, providing a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.
Abstract:
The Poincaré plot for heart rate variability analysis is a geometrical, non-linear technique that can be used to assess the dynamics of heart rate variability by representing each pair of successive R-R intervals in a simplified phase space that describes the system's evolution. The aim of the present study was to verify whether there is a correlation between SD1, SD2 and the SD1/SD2 ratio and nonlinear heart rate variability indexes, in both disease and healthy conditions. 114 patients with coronary artery disease and 65 healthy subjects underwent 30-minute heart rate registration in the supine position, and the analyzed indexes were as follows: SD1, SD2, SD1/SD2, Sample Entropy, Lyapunov Exponent, Hurst Exponent, Correlation Dimension, Detrended Fluctuation Analysis, SDNN, RMSSD, LF, HF and the LF/HF ratio. Correlation coefficients between the SD1, SD2 and SD1/SD2 indexes and the other variables were tested by the Spearman rank correlation test and a regression analysis. We verified a high correlation between the SD1/SD2 index and the Hurst Exponent and DFA (α1) in both groups, suggesting that this ratio can be used as a surrogate variable. © 2013 Elsevier B.V.
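The SD1/SD2 descriptors follow directly from rotating the cloud of successive R-R pairs by 45°: SD1 is the dispersion perpendicular to the identity line (short-term variability) and SD2 the dispersion along it (long-term variability). A minimal sketch of this standard construction, not the study's analysis code; the sample R-R series used below is invented for illustration:

```python
import numpy as np

def poincare_sd(rr):
    """SD1, SD2 and their ratio from the Poincaré plot of an R-R series.
    Each point of the plot is the pair (RR_n, RR_{n+1})."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-1], rr[1:]                      # successive R-R pairs
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)  # perpendicular to identity line
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)  # along the identity line
    return sd1, sd2, sd1 / sd2
```

For example, `poincare_sd([800, 810, 790, 820, 805, 815, 795, 800])` (R-R intervals in ms) returns positive SD1 and SD2 values and the ratio the study correlates with the Hurst Exponent and DFA α1.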
Abstract:
The relationship between sleep apnoea–hypopnoea syndrome (SAHS) severity and the regularity of nocturnal oxygen saturation (SaO2) recordings was analysed. Three different methods were proposed to quantify regularity: approximate entropy (AEn), sample entropy (SEn) and kernel entropy (KEn). A total of 240 subjects suspected of suffering from SAHS took part in the study. They were randomly divided into a training set (96 subjects) and a test set (144 subjects) for the adjustment and assessment of the proposed methods, respectively. According to the measurements provided by AEn, SEn and KEn, higher irregularity of oximetry signals is associated with SAHS-positive patients. Receiver operating characteristic (ROC) and Pearson correlation analyses showed that KEn was the most reliable predictor of SAHS. It provided an area under the ROC curve of 0.91 in two-class classification of subjects as SAHS-negative or SAHS-positive. Moreover, KEn measurements from oximetry data exhibited a linear dependence on the apnoea–hypopnoea index, as shown by a correlation coefficient of 0.87. Therefore, these measurements could be used for the development of simplified diagnostic techniques in order to reduce the demand for polysomnographies. Furthermore, KEn represents a convincing alternative to AEn and SEn for the diagnostic analysis of noisy biomedical signals.
Abstract:
In this work we present an optimized fuzzy visual servoing system for obstacle avoidance using an unmanned aerial vehicle. Cross-entropy theory is used to optimize the gains of our controllers. The optimization process was carried out using the ROS-Gazebo 3D simulation with purpose-built extensions developed for our experiments. Visual servoing is achieved through an image-processing front-end that uses the CamShift algorithm to detect and track objects in the scene. Experimental flight trials using a small quadrotor were performed to validate the parameters estimated from simulation. The integration of cross-entropy methods is a straightforward way to estimate optimal gains, achieving excellent results when tested in real flights.
Abstract:
The Cross-Entropy (CE) method is an efficient method for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. In order to evaluate the optimization, a large number of tests was carried out with a real quadcopter.
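The CE optimization loop that the two UAV abstracts above rely on can be sketched generically: sample candidate parameter vectors from a Gaussian, keep the elite fraction with the lowest cost, and refit the Gaussian to the elites. This is an illustrative sketch only; the population size, elite fraction and cost function are assumptions, not values from either paper:

```python
import numpy as np

def cross_entropy_optimize(cost, dim, iters=50, pop=100, elite_frac=0.1, seed=0):
    """Cross-Entropy method for continuous minimisation: iteratively refit a
    Gaussian sampling distribution to the best-performing candidates."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.full(dim, 2.0)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(pop, dim))
        costs = np.apply_along_axis(cost, 1, samples)
        elites = samples[np.argsort(costs)[:n_elite]]  # lowest-cost candidates
        mu = elites.mean(axis=0)
        sigma = elites.std(axis=0) + 1e-3  # small floor avoids total collapse
    return mu
```

In the papers' setting, `cost` would be a simulated flight metric evaluated in ROS-Gazebo and `dim` the number of controller scaling factors being tuned.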
Abstract:
An adaptive learning scheme, based on a fuzzy approximation to the gradient descent method for training a pattern classifier using unlabeled samples, is described. The objective function defined for the fuzzy ISODATA clustering procedure is used as the loss function for computing the gradient. Learning is based on simultaneous fuzzy decision-making and estimation. It uses conditional fuzzy measures on unlabeled samples. An exponential membership function is assumed for each class, and the parameters constituting these membership functions are estimated, using the gradient, in a recursive fashion. The induced possibility of occurrence of each class is useful for estimation and is computed using 1) the membership of the new sample in that class and 2) the previously computed average possibility of occurrence of the same class. An inductive entropy measure is defined in terms of the induced possibility distribution to measure the extent of learning. The method is illustrated with relevant examples.
Abstract:
Using generalized bosons, we construct the fuzzy sphere S²_F and monopoles on S²_F in a reducible representation of SU(2). The corresponding quantum states are naturally obtained using the GNS construction. We show that there is an emergent nonabelian unitary gauge symmetry which is in the commutant of the algebra of observables. The quantum states are necessarily mixed and have non-vanishing von Neumann entropy, which increases monotonically under a bistochastic Markov map. The maximum value of the entropy has a simple relation to the degeneracy of the irreps that constitute the reducible representation that underlies the fuzzy sphere.