954 results for non-stationary signals
Abstract:
This thesis aimed to address some of the issues that currently prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on a set of thresholds introduced into the classifier; these thresholds were assessed from the distributions of score values for target stimuli, non-target stimuli, and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, making long-term use of BCI possible requires tracking changes in ongoing EEG activity and adapting the BCI model parameters accordingly. To this aim, the asynchronous classifier was subsequently improved by introducing a self-calibration algorithm for the continuous, unsupervised recalibration of the subjective control parameters. Finally, an index for online monitoring of EEG quality was defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following the user-centered design approach, the design, development, and validation phases of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e. different degrees of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, headtracker), up to a P300-based BCI.
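The threshold logic described in this abstract can be sketched as follows; the function name, threshold values, and decision labels are hypothetical illustrations, not the thesis's actual parameters:

```python
# Hypothetical sketch of threshold-based asynchronous control.
# Two thresholds partition the classifier score axis into three regimes.

def classify_epoch(score, t_control, t_target):
    """Map a classifier score for one stimulation epoch to a decision.

    score <  t_control             -> user not attending: suspend control
    t_control <= score < t_target  -> uncertain: keep accumulating evidence
    score >= t_target              -> confident detection: issue command
    """
    if score < t_control:
        return "no-control"
    if score < t_target:
        return "accumulate"
    return "target"

# Example scores from three successive epochs (illustrative values)
decisions = [classify_epoch(s, t_control=-1.0, t_target=2.0)
             for s in [-2.3, 0.5, 2.4]]
```

Because the middle regime simply defers the decision, the system's effective speed adapts to how quickly the accumulated scores cross a threshold.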
Abstract:
The slope of the two-interval, forced-choice psychometric function (e.g. the Weibull parameter, β) provides valuable information about the relationship between contrast sensitivity and signal strength. However, little is known about how, or whether, β varies with stimulus parameters such as spatiotemporal frequency and stimulus size and shape. A second unresolved issue concerns the best way to estimate the slope of the psychometric function. For example, if an observer is non-stationary (e.g. their threshold drifts between experimental sessions), β will be underestimated if curve fitting is performed after collapsing the data across experimental sessions. We measured psychometric functions for two experienced observers for 14 different spatiotemporal configurations of pulsed or flickering grating patches and bars on each of 8 days. We found β ≈ 3 to be fairly constant across almost all conditions, consistent with a fixed nonlinear contrast transducer and/or a constant level of intrinsic stimulus uncertainty (e.g. a square-law transducer and a low level of intrinsic uncertainty). Our analysis showed that estimating a single β from results averaged over several experimental sessions was slightly more accurate than averaging multiple estimates from several experimental sessions. However, the small level of non-stationarity (SD ≈ 0.8 dB) meant that the difference between the estimates was, in practice, negligible.
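For reference, the two-interval forced-choice Weibull function discussed in this abstract can be written and fitted as below; the contrast values and the fitting routine are illustrative assumptions, not the study's actual procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_2ifc(c, alpha, beta):
    """Two-interval forced-choice Weibull: 50% guess rate, no lapse term.
    alpha sets the threshold, beta the slope."""
    return 0.5 + 0.5 * (1.0 - np.exp(-(c / alpha) ** beta))

# Noiseless proportions correct generated with alpha = 0.1, beta = 3
contrasts = np.array([0.03, 0.06, 0.10, 0.15, 0.25])
p_correct = weibull_2ifc(contrasts, 0.1, 3.0)

# Fit recovers the generating parameters
(alpha_hat, beta_hat), _ = curve_fit(weibull_2ifc, contrasts, p_correct,
                                     p0=[0.08, 2.0])
```

With real data the proportions are binomial, so β estimates are noisy, which is why pooling versus averaging across sessions matters.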
Abstract:
2000 Mathematics Subject Classification: Primary 60J80, Secondary 60G99.
Abstract:
Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous sub-cortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may utilize non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and middle ear muscles (MEMs), are capable of modulating the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Through indirect mechanical coupling of the OHCs and MEMs to the eardrum, motion of these mechanisms can be recorded as acoustic signals in the ear canal. Here, we utilize this recording technique to describe three different experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum. 1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of the eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response or OSEAR, depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds. 2) For the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change to pressure levels in the ear when a sound is expected versus when one is not. Specifically, we observe a drop in frequency power and variability from 0.1 to 4 kHz around the time when the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies.
3) For the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually influenced changes may contribute to the altered percept of the speech sound. Collectively, we presume that these findings represent the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds from initial sensory transduction all the way to perception and behavior. Moreover, we demonstrate that the entire auditory system is, fundamentally, a multisensory system.
Abstract:
The evolution of wireless communication systems leads to Dynamic Spectrum Allocation for Cognitive Radio, which requires reliable spectrum sensing techniques. Among the spectrum sensing methods proposed in the literature, those that exploit cyclostationary characteristics of radio signals are particularly suitable for communication environments with low signal-to-noise ratios or non-stationary noise. However, such methods have high computational complexity, which directly raises the power consumption of devices that often have very stringent low-power requirements. We propose a strategy for cyclostationary spectrum sensing with reduced energy consumption. It is based on the principle that p processors working at slower clock frequencies consume less power than a single processor over the same execution time. We derive a strict relation between the energy savings and common parallel system metrics. Simulation results show that our strategy promises very significant savings in actual devices.
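A minimal sketch of the power-model intuition behind this strategy, assuming dynamic power scales cubically with clock frequency (supply voltage scaling with frequency) and perfect parallel speedup; the thesis's actual relation also involves parallel-efficiency metrics not modeled here:

```python
def energy_ratio(p, alpha=3.0):
    """Energy of p processors at frequency f/p, relative to one processor
    at f, for the same execution time.

    Assumes dynamic power ~ f**alpha (alpha ~ 3 when voltage scales with
    frequency) and ideal parallelization, so the ratio is p**(1 - alpha).
    """
    return p * (1.0 / p) ** alpha

# Under these assumptions, four processors at f/4 use 1/16 of the energy
# of one processor at f for the same sensing workload.
savings_4 = 1.0 - energy_ratio(4)
```

Real savings are smaller once parallel overheads and static power are accounted for, which is where the parallel system metrics enter.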
Abstract:
Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, yet allow perfect reconstruction of the spectra of wide-sense stationary signals. However, theoretical guarantees for these samplers assume ideal conditions, such as synchronous sampling and the ability to compute statistical expectations perfectly. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher information matrix for perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that, even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variable and reduce the bi-affine problem to a linear underdetermined (sparse) problem in the source powers. The thesis also studies the robustness of coprime sampling to a finite number of samples and to sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of the WSS signals, sharp bounds on the estimation error are established, which indicate that the error decays exponentially with the number of samples.
The theoretical claims are supported by extensive numerical experiments.
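A toy illustration of the co-array idea underlying coprime sampling, using the standard extended coprime geometry; this is a generic textbook construction under ideal (unperturbed, synchronous) conditions, not the thesis's perturbed setting:

```python
from itertools import product

def difference_coarray(M, N):
    """Nonnegative lags reachable by an extended coprime sampler:
    one branch samples at multiples of N (m = 0..2M-1), the other at
    multiples of M (n = 0..N-1). For coprime M, N the pairwise
    differences cover every lag 0..M*N."""
    positions = sorted({N * m for m in range(2 * M)} |
                       {M * n for n in range(N)})
    return {abs(a - b) for a, b in product(positions, positions)}

# Coprime pair (3, 5): all lags 0..15 appear in the difference set, so
# the autocorrelation can be estimated on a dense grid of lags even
# though each branch samples far below the Nyquist rate.
lags = difference_coarray(3, 5)
```

It is this lag coverage that the thesis shows to degrade gracefully under jitter and finite-sample effects.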
Abstract:
United States federal agencies assess flood risk using Bulletin 17B procedures, which assume annual maximum flood series are stationary. This is a significant limitation of current flood frequency models, as the flood distribution is thereby assumed to be unaffected by trends or periodicity in atmospheric/climatic variables and/or anthropogenic activities. The validity of this assumption is at the core of this thesis, which aims to improve understanding of the forms and potential causes of non-stationarity in flood series for moderately impaired watersheds in the Upper Midwest and Northeastern US. Prior studies investigated non-stationarity in flood series for unimpaired watersheds; however, as the majority of streams are located in areas of increasing human activity, the relative and coupled impacts of natural and anthropogenic factors need to be considered so that non-stationary flood frequency models can be developed for flood-risk forecasting over planning horizons relevant to large-scale water resources planning and management.
Abstract:
An extreme precipitation event occurred in the first week of the year 2000, from 1 to 5 January, in the Vale do Paraíba, in the eastern part of the State of São Paulo, Brazil, causing enormous socioeconomic impact, with deaths and destruction. This work studied the event at 10 selected meteorological stations, considered to have more homogeneous data than other stations in the region. A generalized Pareto distribution (GPD) model for extreme 5-day precipitation totals was developed individually for each of these stations. In the GPD modelling, a non-stationary approach was adopted, with the annual cycle and the long-term trend as covariates. One conclusion of this investigation is that the precipitation totals accumulated during the 5 days of the studied event can be classified as extremely rare for the region, with a probability of occurrence below 1% for most stations, and below 0.1% at three stations.
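A minimal peaks-over-threshold sketch of a stationary GPD fit (the study itself used a non-stationary GPD with the annual cycle and long-term trend as covariates); the synthetic rainfall data and threshold choice are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic 5-day precipitation totals in mm; purely illustrative data.
rainfall = rng.gamma(shape=2.0, scale=20.0, size=5000)

# Peaks-over-threshold: keep excesses above the 95th percentile
threshold = np.quantile(rainfall, 0.95)
excesses = rainfall[rainfall > threshold] - threshold

# Stationary GPD fit to the excesses (location fixed at 0)
shape, loc, scale = genpareto.fit(excesses, floc=0)

# Estimated probability that a 5-day total exceeds an extreme level x
x = 200.0
p_exceed = (excesses.size / rainfall.size) * genpareto.sf(x - threshold,
                                                          shape, loc, scale)
```

In the non-stationary version used in the study, the GPD parameters are functions of the covariates rather than constants.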
Abstract:
This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the open interval (-1, 1). The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lower mean squared error, depends on the stationary/non-stationary and persistent/anti-persistent conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; the weighted periodogram-based estimator was shown to be superior for non-invertible anti-persistent processes.
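As a concrete example of the semiparametric family evaluated in such comparisons, a minimal Geweke–Porter-Hudak (GPH) log-periodogram sketch of the fractional integration order d is given below; the bandwidth choice and the white-noise test signal are illustrative assumptions, not the article's design:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Geweke–Porter-Hudak log-periodogram estimate of the fractional
    integration order d of a series x."""
    n = x.size
    if m is None:
        m = int(np.sqrt(n))                       # common bandwidth choice
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the first m Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = np.abs(dft) ** 2 / (2.0 * np.pi * n)
    # Regress log I(freq) on log(4 sin^2(freq/2)); d is minus the slope
    regressor = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

rng = np.random.default_rng(1)
d_hat = gph_estimate(rng.standard_normal(4096))   # white noise: true d = 0
```

A Monte Carlo comparison of the kind described would repeat such estimates across many simulated series and integration orders and compare mean squared errors.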
Abstract:
Preliminary version
Abstract:
Doctoral thesis, Mathematics (Operational Research), 23 September 2006, Universidade dos Açores.
Abstract:
The objective of this article is to contribute to the discussion of long-term memory by examining the behavior of the main Portuguese stock index. The first four moments are calculated using time windows of increasing size and sliding time windows of fixed size (50 days); they suggest that daily returns are non-ergodic and non-stationary. Since the series appears best described by a fractional Brownian motion, we use rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA). The findings indicate evidence of long-term memory in the form of persistence. This evidence of fractal structure suggests that the market is subject to greater predictability and contradicts the efficient market hypothesis in its weak form, raising issues for the theoretical modeling of asset pricing. In addition, we carried out a more localized (in time) study to identify the evolution of the degree of long-term dependence over time, using 200-day and 400-day windows. The results show a switching feature in the index, from persistent to anti-persistent, quite evident from 2010 onwards.
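A minimal DFA sketch, assuming first-order detrending and illustrative scale choices (not the article's exact procedure); the exponent α is roughly 0.5 for i.i.d. returns, above 0.5 for persistence, and below 0.5 for anti-persistence:

```python
import numpy as np

def dfa_exponent(returns, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(s) versus log s,
    with linear detrending inside each window of size s."""
    profile = np.cumsum(returns - np.mean(returns))
    log_s, log_f = [], []
    for s in scales:
        n_seg = profile.size // s
        sq_fluct = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq_fluct.append(np.mean((seg - trend) ** 2))
        log_s.append(np.log(s))
        log_f.append(0.5 * np.log(np.mean(sq_fluct)))  # log F(s)
    return np.polyfit(log_s, log_f, 1)[0]

rng = np.random.default_rng(2)
alpha = dfa_exponent(rng.standard_normal(4096))  # ~0.5 for white noise
```

The windowed version of the analysis simply applies this estimator to successive 200-day or 400-day segments and tracks α over time.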
Abstract:
This article aims to contribute to the discussion of long-term dependence, focusing on the behavior of the main Belgian stock index. Non-parametric analyses of the general characteristics of the temporal frequency show that daily returns are non-ergodic and non-stationary. We therefore use rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA), under the fractional Brownian motion approach, and find slight evidence of long-term dependence. These results refute the random walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form, and call into question some theoretical modeling of asset pricing. A further, more localized study of the evolution of the degree of dependence over time windows showed that the index has become less persistent since 2010. This may reflect a maturing of the market brought about by the prolonged effects of the current financial crisis.
Abstract:
Prepared for presentation at the Portuguese Finance Network International Conference 2014, Vilamoura, Portugal, June 18-20