9 results for Single event upset (SEUs)

in CentAUR: Central Archive University of Reading - UK


Relevance: 80.00%

Abstract:

Disequilibria between Pb-210 and Ra-226 can be used to trace magma degassing, because the intermediate nuclides, particularly Rn-222, are volatile. Products of the 1980-1986 eruptions of Mount St. Helens have been analysed for (Pb-210/Ra-226). Both excesses and deficits of Pb-210 are encountered, suggesting rapid gas transfer. The time scale of diffuse, non-eruptive gas escape prior to 1980, as documented by Pb-210 deficits, is on the order of a decade using the model developed by Gauthier and Condomines (Earth Planet. Sci. Lett. 172 (1999) 111-126) for a non-renewed magma chamber and efficient Rn removal. The time required to build up Pb-210 excess is much shorter (months), as can be observed from steady increases of (Pb-210/Ra-226) with time during 1980-1982. The formation of Pb-210 excess requires both rapid gas transport through the magma and periodic blocking of gas escape routes. Superposed on this time trend is the natural variability of (Pb-210/Ra-226) in a single eruption caused by tapping magma from various depths. The two time scales of gas transport, to create both Pb-210 deficits and Pb-210 excesses, cannot be reconciled in a single event. Rather, Pb-210 deficits are associated with pre-eruptive diffuse degassing, while Pb-210 excesses document the more vigorous degassing associated with eruption and recharge of the system.
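
The decade-scale estimate for the pre-eruptive deficits can be illustrated with the limiting case of complete, continuous Rn-222 removal, under which Pb-210 becomes unsupported and simply decays with its 22.3-year half-life. This is a hedged sketch of the underlying decay relation, not the full Gauthier and Condomines model, which also treats partial Rn loss:

% Unsupported Pb-210 decay under assumed complete, continuous Rn-222 removal (illustrative only)
\left(\frac{^{210}\mathrm{Pb}}{^{226}\mathrm{Ra}}\right)_{t}
  = \left(\frac{^{210}\mathrm{Pb}}{^{226}\mathrm{Ra}}\right)_{0} e^{-\lambda_{210} t},
\qquad
\lambda_{210} = \frac{\ln 2}{22.3~\mathrm{yr}}

Starting from secular equilibrium (an activity ratio of 1), roughly a decade of sustained gas loss lowers the ratio to about 0.73, consistent with the decade-scale estimate quoted above.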

Relevance: 80.00%

Abstract:

Space applications are challenged by the reliability of the parallel computing systems (FPGAs) employed in spacecraft, owing to Single-Event Upsets. The work reported in this paper aims to achieve self-managing systems that are reliable for space applications by applying autonomic computing constructs to parallel computing systems. A novel technique, 'Swarm-Array Computing', inspired by swarm robotics and built on the foundations of autonomic and parallel computing, is proposed as a path to achieve autonomy. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Three approaches that bind these constituents together are proposed. The feasibility of one of the three proposed approaches is validated on the SeSAm multi-agent simulator, with landscapes representing the computing space and problem generated using MATLAB.
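
As a rough illustration of the swarm-array idea (not the SeSAm/MATLAB validation reported in the paper), the following hypothetical Python sketch shows sub-tasks carried by agents across a grid of cores, with an agent hopping away from a core assumed to have been hit by a single-event upset:

# Hypothetical sketch of the swarm-array idea: sub-tasks are carried by agents that
# roam a grid of cores and hop away from cores flagged as hit by an SEU.
# Illustration only; grid size, failure set and task are invented for this example.
import random

GRID = 4                      # 4x4 array of processing cores
failed = {(0, 2)}             # core assumed knocked out by a single-event upset

class TaskAgent:
    def __init__(self, sub_task, pos):
        self.sub_task = sub_task          # e.g. a slice of data to reduce
        self.pos = pos                    # (row, col) of the core hosting the agent

    def step(self):
        # If the hosting core has failed, migrate to a random healthy neighbour.
        if self.pos in failed:
            r, c = self.pos
            neighbours = [(r + dr, c + dc)
                          for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0))
                          if 0 <= r + dr < GRID and 0 <= c + dc < GRID
                          and (r + dr, c + dc) not in failed]
            self.pos = random.choice(neighbours)
        return sum(self.sub_task)         # execute the sub-task (here: partial summation)

data = list(range(16))
agents = [TaskAgent(data[i:i + 4], (0, i // 4)) for i in range(0, 16, 4)]
print(sum(agent.step() for agent in agents))   # -> 120 even though core (0, 2) is down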

Relevance: 40.00%

Abstract:

Different types of mental activity are used as input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data were possible. The method of Independent Component Analysis (ICA) can be used to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity, and other components of non-cerebral origin. Choosing specific components and using them to reconstruct "denoised" single-trial data could improve the signal quality, thus allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA to the raw ERP data. Signal improvement is measured using contrast-to-noise measures. It was found that such analysis improves the signal quality in all single trials.
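
A minimal sketch of this kind of ICA-based reconstruction, using scikit-learn's FastICA on synthetic data; the component-selection rule (correlation with an assumed ERP template) is an illustrative assumption, not the criterion used in the paper:

# Hedged sketch of ICA-based single-trial ERP denoising with scikit-learn's FastICA.
# The synthetic data and the component-selection rule are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 500
t = np.linspace(0, 1, n_samples)
erp = np.exp(-((t - 0.3) / 0.05) ** 2)               # idealised ERP-like deflection
trial = 0.5 * np.outer(rng.normal(size=n_channels), erp) \
        + rng.normal(size=(n_channels, n_samples))   # single trial = ERP + background EEG

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(trial.T)                 # (samples, components)

# Keep only components correlating with the assumed ERP time course
# (a stand-in for whatever selection criterion is actually applied).
keep = [i for i in range(n_channels)
        if abs(np.corrcoef(sources[:, i], erp)[0, 1]) > 0.3]
cleaned_sources = np.zeros_like(sources)
cleaned_sources[:, keep] = sources[:, keep]

denoised_trial = ica.inverse_transform(cleaned_sources).T   # back to channel space
print("components kept:", keep, "denoised shape:", denoised_trial.shape)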

Relevance: 40.00%

Abstract:

This article investigates the relation between stimulus-evoked neural activity and cerebral hemodynamics. Specifically, the hypothesis is tested that hemodynamic responses can be modeled as a linear convolution of experimentally obtained measures of neural activity with a suitable hemodynamic impulse response function. To obtain a range of neural and hemodynamic responses, rat whisker pad was stimulated using brief (≤ 2 seconds) electrical stimuli consisting of single pulses (0.3 millisecond, 1.2 mA) combined both at different frequencies and in a paired-pulse design. Hemodynamic responses were measured using concurrent optical imaging spectroscopy and laser Doppler flowmetry, whereas neural responses were assessed through current source density analysis of multielectrode recordings from a single barrel. General linear modeling was used to deconvolve the hemodynamic impulse response to a single "neural event" from the hemodynamic and neural responses to stimulation. The model provided an excellent fit to the empirical data. The implications of these results for modeling schemes and for physiologic systems coupling neural and hemodynamic activity are discussed.
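
The deconvolution step can be sketched as an FIR regression in which the hemodynamic signal is regressed on lagged copies of the neural time series; the data below are synthetic and the kernel length is an assumed parameter:

# Hedged sketch of least-squares deconvolution of a hemodynamic impulse response
# from a neural-event time series (a linear convolution / FIR model). Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.1, 600                                  # 10 Hz sampling, 60 s record
t = np.arange(n) * dt
neural = np.zeros(n)
neural[[50, 150, 300, 320, 450]] = 1.0            # assumed "neural events"

true_hrf = (t[:80] / 1.5) ** 2 * np.exp(-t[:80] / 1.5)     # assumed gamma-like kernel
hemo = np.convolve(neural, true_hrf)[:n] + 0.01 * rng.normal(size=n)

# FIR design matrix: column k holds the neural series delayed by k samples.
n_lags = 80
X = np.column_stack([np.concatenate([np.zeros(k), neural[:n - k]]) for k in range(n_lags)])
hrf_est, *_ = np.linalg.lstsq(X, hemo, rcond=None)          # least-squares deconvolution

print("estimated HRF peaks at", t[np.argmax(hrf_est)], "s")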

Relevance: 30.00%

Abstract:

We use proper orthogonal decomposition (POD) to study a transient teleconnection event at the onset of the 2001 planet-encircling dust storm on Mars, in terms of empirical orthogonal functions (EOFs). There are several differences between this and previous studies of atmospheric events using EOFs. First, instead of using a single variable such as surface pressure or geopotential height on a given pressure surface, we use a dataset describing the evolution in time of global and fully three-dimensional atmospheric fields such as horizontal velocity and temperature. These fields are produced by assimilating Thermal Emission Spectrometer observations from NASA's Mars Global Surveyor spacecraft into a Mars general circulation model. We use total atmospheric energy (TE) as a physically meaningful quantity with which to weight the state variables. Second, instead of adopting the EOFs to define teleconnection patterns as planetary-scale correlations that explain a large portion of long time-scale variability, we use EOFs to understand transient processes due to localised heating perturbations that have implications for the atmospheric circulation over distant regions. The localised perturbation is given by anomalous heating due to the enhanced presence of dust around the northern edge of the Hellas Planitia basin on Mars. We show that the localised disturbance is seemingly restricted to a small number (a few tens) of EOFs. These can be classified as low-order, transitional, or high-order EOFs according to the amount of TE they explain throughout the event. Despite their global character, the EOFs are able to capture the localised effects of the perturbation via the presence of specific centres of action. Finally, we discuss possible applications to the study of terrestrial phenomena with similar characteristics.
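
A minimal sketch of an energy-weighted EOF computation via singular value decomposition; the fields and weights below are random placeholders for the assimilated Mars GCM output, and the weighting is a simplified stand-in for a full TE norm:

# Hedged sketch of POD/EOF analysis: with snapshots scaled so the squared norm
# approximates total energy, the EOFs are the left singular vectors of the anomaly matrix.
import numpy as np

rng = np.random.default_rng(2)
n_state, n_times = 2000, 120                       # flattened (u, v, T) state, time snapshots
snapshots = rng.normal(size=(n_state, n_times))    # placeholder for assimilated fields
energy_weights = np.abs(rng.normal(size=n_state))  # placeholder energy weight per element

anomalies = snapshots - snapshots.mean(axis=1, keepdims=True)
weighted = np.sqrt(energy_weights)[:, None] * anomalies

# EOFs = spatial patterns (columns of U); PCs = their time-varying amplitudes.
U, s, Vt = np.linalg.svd(weighted, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
pcs = np.diag(s) @ Vt

print("fraction of TE explained by the first 10 EOFs:", explained[:10].sum())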

Relevance: 30.00%

Abstract:

The work reported in this paper is motivated by the need to handle single-node failures in parallel summation algorithms on computer clusters. An agent-based approach is proposed in which a task to be executed is decomposed into sub-tasks and mapped onto agents that traverse the computing nodes. The agents communicate across computing nodes to share information in the event of a predicted node failure. Two single-node failure scenarios are considered. The Message Passing Interface is employed for implementing the proposed approach. Quantitative results obtained from experiments reveal that the agent-based approach can handle failures more efficiently than traditional failure-handling approaches.
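
A hypothetical, single-process sketch of the agent idea for parallel summation (the paper's implementation uses the Message Passing Interface; this stand-in only illustrates the hand-over of a sub-task and partial sum when a node failure is predicted):

# Hedged, single-process stand-in for the agent-based approach: each agent owns a
# slice to sum and, when its node is predicted to fail, ships its remaining data
# and partial sum to an agent on a healthy node.
class SummationAgent:
    def __init__(self, node_id, chunk):
        self.node_id = node_id
        self.chunk = list(chunk)
        self.partial = 0

    def work(self, steps):
        for _ in range(min(steps, len(self.chunk))):
            self.partial += self.chunk.pop(0)

    def hand_over(self, other):
        # Migrate remaining data and the partial sum to another agent.
        other.chunk.extend(self.chunk)
        other.partial += self.partial
        self.chunk, self.partial = [], 0

data = list(range(1, 101))                          # correct total is 5050
agents = [SummationAgent(i, data[i::4]) for i in range(4)]

for agent in agents:
    agent.work(steps=10)                            # partial progress on every node
agents[2].hand_over(agents[3])                      # node 2 predicted to fail: migrate
for agent in agents:
    agent.work(steps=100)                           # finish the remaining work

print(sum(a.partial for a in agents))               # -> 5050 despite the failed node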

Relevance: 30.00%

Abstract:

Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical Mode Decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP. EMD-denoised single trials also estimated the shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (conditions A and B) more effectively than trials low-pass filtered at 40 Hz. EMD also revealed event-related differences in both conditions that were clearer and of longer duration than those revealed by low-pass filtering at 40 Hz. Thus, EMD-based denoising is a promising data-driven, non-stationary method for estimating ERPs and should be investigated further.
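
A hedged sketch of EMD-based partial reconstruction using the PyEMD (EMD-signal) package, which is assumed to be installed; discarding the two highest-frequency IMFs is an illustrative choice rather than the paper's selection rule:

# Hedged sketch of EMD denoising of a single trial: decompose into intrinsic mode
# functions (IMFs) and rebuild the trial from the lower-frequency modes.
# Assumes the PyEMD (EMD-signal) package; the IMF-selection rule is illustrative.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 512)
erp = np.exp(-((t - 0.35) / 0.06) ** 2)                 # idealised ERP component
trial = erp + 0.8 * rng.normal(size=t.size)             # single trial = ERP + EEG "noise"

imfs = EMD().emd(trial, t)                              # intrinsic mode functions
denoised = imfs[2:].sum(axis=0)                         # drop the two fastest IMFs

print("IMFs extracted:", imfs.shape[0],
      "correlation with true ERP:", np.corrcoef(denoised, erp)[0, 1].round(2))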

Relevance: 30.00%

Abstract:

One of the primary features of modern government-to-citizen (G2C) service provision is the ability to offer a citizen-centric view of the e-government portal. The life-event approach is one of the most widely adopted paradigms supporting the idea of solving a complex event in a citizen's life through a single service provision. Several studies have used this approach to design e-government portals, but they have been limited in terms of use and scalability: there were no mechanisms showing how to specify a life-event for structuring public e-services, or how to systematically match life-events with these services while taking citizen needs into consideration. We introduce the NOrm-Based Life-Event (NoBLE) framework for G2C e-service provision, with a set of mechanisms that serve as a guide for designing active life-event-oriented e-government portals.
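
A hypothetical, much-simplified sketch of how a life-event could be resolved into the e-services a citizen needs, filtered by the citizen's situation; the event, services and conditions are invented for illustration and are not taken from the NoBLE framework itself:

# Invented example of the life-event idea: one life-event maps to several public
# e-services, each gated by a condition on the citizen's circumstances.
LIFE_EVENTS = {
    "moving house": [
        {"service": "update address register", "condition": lambda c: True},
        {"service": "transfer school enrolment", "condition": lambda c: c.get("children", 0) > 0},
        {"service": "re-register vehicle", "condition": lambda c: c.get("owns_car", False)},
    ],
}

def resolve(life_event, citizen):
    """Return the e-services a citizen needs for one life-event."""
    return [s["service"] for s in LIFE_EVENTS[life_event] if s["condition"](citizen)]

print(resolve("moving house", {"children": 2, "owns_car": False}))
# -> ['update address register', 'transfer school enrolment']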

Relevance: 30.00%

Abstract:

Background: The validity of ensemble averaging of event-related potential (ERP) data has been questioned, because it assumes that the ERP is identical across trials. There is therefore a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments: a P300 speller paradigm on a single subject and a language processing study on 25 subjects. The results revealed evidence for the existence of six clusters in one experimental condition from the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
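
A hedged sketch of two stages of such a pipeline: a bootstrap-based agreement score over candidate cluster numbers (a stand-in for the paper's Stability Index), followed by k-means and a PCA projection. The EMD denoising and GA-based centroid initialisation are omitted here, with k-means++ used in their place:

# Hedged sketch: bootstrap cluster-number selection, k-means, and a PCA view.
# The agreement score is a generic adjusted-Rand measure, not necessarily the paper's SI.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(4)
trials = np.vstack([rng.normal(loc=m, scale=1.0, size=(60, 30))
                    for m in (0.0, 2.5, 5.0)])          # synthetic single-trial features

def stability_index(X, k, n_boot=20):
    """Mean agreement between a reference clustering and clusterings of bootstrap resamples."""
    ref = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores = []
    for b in range(n_boot):
        idx = rng.choice(len(X), size=len(X), replace=True)
        labels = KMeans(n_clusters=k, n_init=10, random_state=b).fit_predict(X[idx])
        scores.append(adjusted_rand_score(ref[idx], labels))
    return float(np.mean(scores))

best_k = max(range(2, 7), key=lambda k: stability_index(trials, k))
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(trials)
projected = PCA(n_components=2).fit_transform(trials)   # 2-D view for visualising clusters
print("chosen k:", best_k, "cluster sizes:", np.bincount(labels))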