76 results for Artificial Neural Networks


Relevance: 100.00%

Abstract:

This paper considers variations of a neuron pool selection method known as the Affordable Neural Network (AfNN). A saliency measure, based on the second derivative of the objective function, is proposed to assess the ability of a trained AfNN to provide neuronal redundancy. The discrepancies between the various affordability variants are explained by correlating unique subgroup selections with the relevant saliency variations. Overall, this study shows that how salient individual neurons are depends more on the method by which neurons are selected from the pool than on how often a particular neuron is used during training. The findings are relevant not only in providing an analogy to brain function but also in optimizing the way a neural network using the affordability method is trained.
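
As a loose illustration of a second-derivative saliency measure (a minimal Python sketch with synthetic data and a toy network, not the AfNN code from the paper), the diagonal second derivative of the objective with respect to each hidden unit's outgoing weight can be estimated by central finite differences:

```python
# Minimal sketch (not the paper's code): a diagonal second-derivative saliency
# for hidden units of a tiny MLP, estimated by central finite differences.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data and a fixed, "trained" single-hidden-layer network.
X = rng.normal(size=(200, 4))
y = np.sin(X).sum(axis=1, keepdims=True)
W1 = rng.normal(scale=0.5, size=(4, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def loss(w2_vec):
    """Mean squared error as a function of the hidden-to-output weights."""
    h = np.tanh(X @ W1 + b1)
    pred = h @ w2_vec.reshape(16, 1) + b2
    return float(np.mean((pred - y) ** 2))

def unit_saliency(j, eps=1e-3):
    """Approximate d^2 E / d w_j^2 for the outgoing weight of hidden unit j."""
    w = W2[:, 0].copy()
    e = np.zeros_like(w); e[j] = eps
    return (loss(w + e) - 2.0 * loss(w) + loss(w - e)) / eps ** 2

saliencies = np.array([unit_saliency(j) for j in range(16)])
print("least salient hidden units:", np.argsort(saliencies)[:4])
```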

Relevance: 100.00%

Abstract:

Deep brain stimulator devices are becoming widely used for therapeutic benefit in movement disorders such as Parkinson's disease. Prolonging the battery life span of such devices could dramatically reduce the risks and cumulative costs associated with surgical replacement. This paper demonstrates how an artificial neural network can be trained, using frequency-analysis pre-processing of deep brain electrode recordings, to detect the onset of tremor in Parkinsonian patients. Implementing this solution in an 'intelligent' neurostimulator device would remove the need for the continuous stimulation currently used and open up the possibility of demand-driven stimulation. Such a methodology could potentially decrease the power consumption of a deep brain pulse generator.
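
A minimal sketch of the general idea, assuming synthetic recordings and made-up frequency bands rather than real deep brain electrode data: band-power features computed from windowed signals feed a small feed-forward classifier that flags tremor-like windows.

```python
# Minimal sketch on synthetic data: FFT band-power features from 1 s windows
# of a simulated recording, classified by a small feed-forward network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
fs, win = 200, 200                                # 200 Hz sampling, 1 s windows

def make_window(tremor):
    t = np.arange(win) / fs
    x = rng.normal(scale=1.0, size=win)           # background activity
    if tremor:
        x += 2.0 * np.sin(2 * np.pi * 5.0 * t)    # ~5 Hz tremor-band rhythm
    return x

def band_powers(x):
    """Average spectral power in a few coarse frequency bands."""
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1 / fs)
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    return [p[(f >= lo) & (f < hi)].mean() for lo, hi in bands]

labels = rng.integers(0, 2, size=400)
feats = np.array([band_powers(make_window(bool(l))) for l in labels])

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                  random_state=0))
clf.fit(feats[:300], labels[:300])
print("held-out accuracy:", clf.score(feats[300:], labels[300:]))
```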

Relevance: 100.00%

Abstract:

The human electroencephalogram (EEG) is globally characterized by a 1/f power spectrum superimposed with certain peaks, whereby the "alpha peak" in the frequency range of 8-14 Hz is the most prominent one for relaxed states of wakefulness. We present simulations of a minimal dynamical network model of leaky integrator neurons attached to the nodes of an evolving directed and weighted random graph (an Erdős-Rényi graph). We derive a model of the dendritic field potential (DFP) for the neurons, leading to a simulated EEG that describes the global activity of the network. Depending on the network size, we find an oscillatory transition of the simulated EEG when the network reaches a critical connectivity. This transition, indicated by a suitably defined order parameter, is reflected by a sudden change of the network's topology when super-cycles are formed from merging isolated loops. After the oscillatory transition, the power spectra of simulated EEG time series exhibit a 1/f continuum superimposed with certain peaks.
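
A rough sketch of this style of simulation (not the paper's model; the parameters, the field-potential proxy, and the graph statistics are all assumptions made here for illustration): leaky integrator units coupled through a directed Erdős-Rényi weight matrix, with the summed activity standing in for a simulated EEG whose power spectrum is then inspected.

```python
# Minimal sketch: leaky integrator units on a directed, weighted Erdos-Renyi
# graph; the summed activity is a crude stand-in for a simulated EEG.
import numpy as np

rng = np.random.default_rng(2)
N, p, steps, dt, tau = 200, 0.05, 4000, 1.0, 10.0

# Directed, weighted Erdos-Renyi connectivity.
A = (rng.random((N, N)) < p) * rng.normal(scale=1.0, size=(N, N))
np.fill_diagonal(A, 0.0)

x = rng.normal(scale=0.1, size=N)
eeg = np.empty(steps)
for t in range(steps):
    drive = np.tanh(A @ x) + rng.normal(scale=0.05, size=N)  # recurrent input + noise
    x = x + dt / tau * (-x + drive)                          # leaky integration
    eeg[t] = x.sum()                                         # field-potential proxy

# Power spectrum of the simulated EEG (arbitrary units, frequency per time step).
power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
freqs = np.fft.rfftfreq(steps, d=dt)
print("frequency of the strongest nonzero peak:", freqs[1:][np.argmax(power[1:])])
```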

Relevance: 100.00%

Abstract:

More than thirty years ago, Amari and colleagues proposed a statistical framework for identifying structurally stable macrostates of neural networks from observations of their microstates. We compare their stochastic stability criterion with a deterministic stability criterion based on the ergodic theory of dynamical systems, recently proposed for the scheme of contextual emergence and applied to particular inter-level relations in neuroscience. Both the stochastic and the deterministic stability criteria for macrostates rely on macro-level contexts, which makes them sensitive to differences between macro-levels.

Relevance: 100.00%

Abstract:

Dynamic neural networks (DNNs), also known as recurrent neural networks, are often used for nonlinear system identification. The main contribution of this letter is the introduction of an efficient parameterization of a class of DNNs. Having fewer parameters to adjust simplifies the training problem and leads to more parsimonious models. The parameterization is based on approximation theory dealing with the ability of a class of DNNs to approximate finite trajectories of nonautonomous systems. The use of the proposed parameterization is illustrated through a numerical example, using data from a nonlinear model of a magnetic levitation system.
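
To make the task concrete, here is a minimal sketch of nonlinear system identification from input-output data. The letter parameterizes recurrent networks; this illustration instead substitutes a feed-forward network on lagged inputs and outputs (a NARX-style model) fitted to a toy plant rather than the magnetic levitation data.

```python
# Minimal sketch: identify a toy nonlinear plant from input-output data with a
# feed-forward network on lagged regressors (a NARX-style substitute for a DNN).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Toy nonlinear plant (not the magnetic levitation model from the letter).
u = rng.uniform(-1, 1, size=2000)
y = np.zeros_like(u)
for k in range(2, len(u)):
    y[k] = 0.6 * np.sin(y[k - 1]) + 0.3 * y[k - 2] + 0.5 * u[k - 1]

# Regressors: two past outputs and one past input per sample.
X = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
target = y[2:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
model.fit(X[:1500], target[:1500])
print("one-step-ahead R^2 on held-out data:", model.score(X[1500:], target[1500:]))
```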

Relevance: 100.00%

Abstract:

This special section contains papers addressing various aspects of cultured neural networks. These are networks that are formed through the monitored growth of biological neural tissue. In keeping with the aims of the International Journal of Adaptive Control and Signal Processing, the key focus of these papers is to look at particular aspects of signal processing, both in stimulating such a network and in assigning intent to signals collected as network outputs.

Relevance: 100.00%

Abstract:

In this paper, we propose to study a class of neural networks with recent-history distributed delays. A sufficient condition is derived for the global exponential periodicity of the proposed neural networks, which has the advantage that it assumes neither differentiability nor monotonicity of each neuron's activation function, nor symmetry of the feedback or delayed-feedback matrices. Our criterion is shown to be valid by applying it to an illustrative system.
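
For concreteness, a generic recent-history distributed-delay network of the kind such criteria typically address (the exact system studied in the paper may differ) can be written as

\[
\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} \int_{t-\tau}^{t} k_{ij}(t-s)\, f_j(x_j(s))\, \mathrm{d}s + I_i(t), \qquad i = 1, \dots, n,
\]

where $c_i > 0$ are decay rates, $A = (a_{ij})$ and $B = (b_{ij})$ are the feedback and delayed-feedback matrices, $k_{ij}$ are delay kernels supported on the recent history $[0, \tau]$, and $I_i(t)$ are periodic external inputs. Global exponential periodicity then means that the system admits a periodic solution to which every trajectory converges exponentially.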

Relevance: 100.00%

Abstract:

The possibility of using a radial basis function neural network (RBFNN) to accurately recognise and predict the onset of Parkinson's disease tremors in human subjects is discussed in this paper. The data for training the RBFNN are obtained by means of deep brain electrodes implanted in a Parkinson's disease patient's brain. The effectiveness of an RBFNN is initially demonstrated by a real case study.
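
A minimal RBF-network sketch on synthetic features (not patient recordings): Gaussian basis functions are placed around k-means centres and a linear readout is fitted by least squares; the class labels here merely stand in for "tremor onset" versus "no tremor".

```python
# Minimal sketch of a radial basis function network: Gaussian features around
# k-means centres with a least-squares linear readout, on synthetic data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Two synthetic classes standing in for "tremor onset" vs "no tremor" features.
X0 = rng.normal(loc=0.0, scale=1.0, size=(150, 3))
X1 = rng.normal(loc=2.0, scale=1.0, size=(150, 3))
X = np.vstack([X0, X1]); y = np.r_[np.zeros(150), np.ones(150)]

centres = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
width = 1.5

def rbf_features(data):
    d2 = ((data[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

Phi = rbf_features(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear readout weights
pred = (rbf_features(X) @ w) > 0.5
print("training accuracy:", (pred == y.astype(bool)).mean())
```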

Relevance: 100.00%

Abstract:

The problem of identification of a nonlinear dynamic system is considered. A two-layer neural network is used for the solution of the problem. Systems disturbed by unmeasurable noise are considered, where it is known that the disturbance is a random piecewise-polynomial process. Absorption polynomials and nonquadratic loss functions are used to reduce the effect of this disturbance on the estimates of the optimal memory of the neural-network model.
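
A small sketch of the flavour of robust fitting involved (synthetic data; the Huber loss stands in for an unspecified nonquadratic loss, and the paper's absorption polynomials are not reproduced): a tiny two-layer tanh network is fitted with a robust least-squares solver, which down-weights large disturbance spikes relative to a quadratic loss.

```python
# Minimal sketch, not the paper's estimator: a two-layer tanh network fitted to
# noisy data with a nonquadratic (Huber) loss via scipy's robust least squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
x = np.linspace(-2, 2, 200)
clean = np.sin(1.5 * x)
noise = rng.normal(scale=0.05, size=x.size)
noise[::20] += 2.0                      # sparse, large disturbance spikes
y = clean + noise

H = 6  # hidden units

def unpack(theta):
    w1 = theta[:H]; b1 = theta[H:2 * H]; w2 = theta[2 * H:3 * H]; b2 = theta[-1]
    return w1, b1, w2, b2

def predict(theta, xv):
    w1, b1, w2, b2 = unpack(theta)
    return np.tanh(np.outer(xv, w1) + b1) @ w2 + b2

def residuals(theta):
    return predict(theta, x) - y

theta0 = rng.normal(scale=0.5, size=3 * H + 1)
fit = least_squares(residuals, theta0, loss="huber", f_scale=0.1)
print("mean abs error vs clean signal:", np.mean(np.abs(predict(fit.x, x) - clean)))
```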