944 results for neural computing


Abstract:

In this paper, a new model-based proportional–integral–derivative (PID) tuning and control approach is introduced for Hammerstein systems identified from observational input/output data. The nonlinear static function in the Hammerstein system is modelled using a B-spline neural network. The control signal is composed of a PID controller together with a correction term. Both the parameters of the PID controller and the correction term are optimized by minimizing the multistep-ahead prediction errors. In order to update the control signal, the multistep-ahead predictions of the Hammerstein system based on B-spline neural networks, and the associated Jacobian matrix, are calculated using the De Boor algorithm, including both the functional and derivative recursions. Numerical examples demonstrate the efficacy of the proposed approach.
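The De Boor recursion at the core of these predictions is standard; for illustration, a minimal Python sketch of the functional recursion (generic textbook form, not the authors' implementation, which additionally uses the derivative recursion):

```python
def de_boor(k, x, t, c, p):
    """Evaluate a degree-p B-spline at x via De Boor's recursion.

    k: knot interval index with t[k] <= x < t[k+1]
    t: knot vector; c: B-spline coefficients (control points)
    """
    d = [c[j + k - p] for j in range(p + 1)]        # local coefficients
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]  # convex blend
    return d[p]

# Quadratic example on a clamped knot vector; x = 1.5 lies in [t[3], t[4]).
t = [0, 0, 0, 1, 2, 3, 3, 3]
c = [0.0, 1.0, 2.0, 3.0, 2.0]
print(de_boor(k=3, x=1.5, t=t, c=c, p=2))
```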

Abstract:

Spiking neural networks are usually limited in their applications due to their complex mathematical models and the lack of intuitive learning algorithms. In this paper, a simpler, novel neural network derived from a leaky integrate-and-fire neuron model, the 'cavalcade' neuron, is presented. A simulation environment for the neural network has been developed, and two basic learning algorithms have been implemented within it. These algorithms successfully learn some basic temporal and instantaneous problems. Neural network structures inspired by these experiments are then applied to process sensor information and successfully control a mobile robot.
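The 'cavalcade' model itself is not specified in the abstract; for orientation, here is a minimal sketch of the leaky integrate-and-fire dynamics it derives from (parameter names and values are illustrative, not taken from the paper):

```python
import numpy as np

def lif_sim(I, dt=1e-3, tau=0.02, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron driven by an input current sequence I.

    Returns the membrane potential trace and the spike times."""
    v, trace, spikes = v_rest, [], []
    for n, i_n in enumerate(I):
        v += dt * (-(v - v_rest) + i_n) / tau   # leak toward rest + input drive
        if v >= v_th:                           # threshold crossing: emit spike
            spikes.append(n * dt)
            v = v_reset                         # reset membrane potential
        trace.append(v)
    return np.array(trace), spikes

# Constant suprathreshold drive yields regular spiking.
_, spike_times = lif_sim(np.full(1000, 2.0))
print(len(spike_times))
```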

Abstract:

We study inverse problems in neural field theory, i.e., the construction of synaptic weight kernels yielding prescribed neural field dynamics. We address the issues of existence, uniqueness, and stability of solutions to the inverse problem for the Amari neural field equation as a special case, and prove that these problems are generally ill-posed. In order to construct solutions to the inverse problem, we first recast the Amari equation as a linear perceptron equation in an infinite-dimensional Banach or Hilbert space. In a second step, we construct sets of biorthogonal function systems that allow the approximation of synaptic weight kernels by a generalized Hebbian learning rule. Numerically, this construction is implemented by the Moore–Penrose pseudoinverse method. We demonstrate the instability of these solutions and use Tikhonov regularization for stabilization and to prevent numerical overfitting. We illustrate the stable construction of kernels by means of three instructive examples.
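After discretization, the regularized construction amounts to a standard linear inverse problem; a schematic numpy sketch under that reading (the data matrix here is random and purely illustrative):

```python
import numpy as np

# Discretized linear perceptron equation A w = b, where (schematically) A
# collects field snapshots and b the prescribed dynamics.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))   # illustrative, possibly ill-conditioned
b = rng.standard_normal(200)

# Naive Moore-Penrose solution: unstable for ill-conditioned A.
w_pinv = np.linalg.pinv(A) @ b

# Tikhonov regularization: solve (A^T A + alpha I) w = A^T b.
alpha = 1e-2
w_tik = np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ b)
```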

Abstract:

Multisensory integration involves bottom-up as well as top-down processes. We investigated the influence of top-down control on the neural responses to multisensory stimulation using EEG recording and time-frequency analyses. Participants were stimulated at the index finger or thumb of the left hand, using tactile vibrators mounted on a foam cube. Simultaneously they received a visual distractor from a light-emitting diode adjacent to the active vibrator (spatially congruent trial) or adjacent to the inactive vibrator (spatially incongruent trial). The task was to respond to the elevation of the tactile stimulus (upper or lower) while ignoring the simultaneous visual distractor. To manipulate top-down control over this multisensory stimulation, the proportion of spatially congruent (vs. incongruent) trials was changed across blocks. Our results reveal that the behavioral cost of responding to incongruent relative to congruent trials (i.e., the crossmodal congruency effect) was modulated by the proportion of congruent trials. Most importantly, the EEG gamma band response and the gamma-theta coupling were also affected by this modulation of top-down control, whereas the late theta band response related to the congruency effect was not. These findings suggest that the gamma band response is more than a marker of multisensory binding, being also sensitive to the correspondence between expected and actual multisensory stimulation. By contrast, the theta band response was affected by congruency but appears to be largely immune to stimulation expectancy.
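Gamma-theta coupling is typically quantified by a phase-amplitude coupling index; a generic sketch of one such measure (mean vector length; the bands and implementation are illustrative, not the paper's exact pipeline):

```python
import numpy as np
from scipy.signal import butter, hilbert, sosfiltfilt

def bandpass(x, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def gamma_theta_coupling(x, fs):
    """Mean-vector-length index of theta-phase/gamma-amplitude coupling."""
    theta_phase = np.angle(hilbert(bandpass(x, 4, 8, fs)))
    gamma_amp = np.abs(hilbert(bandpass(x, 30, 80, fs)))
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase))) / np.mean(gamma_amp)

# Synthetic check: gamma amplitude tied to theta phase gives a high index.
fs = 500.0
t = np.arange(0, 4, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
x = theta + (1 + theta) * 0.3 * np.sin(2 * np.pi * 50 * t)
print(gamma_theta_coupling(x, fs))
```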

Abstract:

The goal of this research was to investigate the changes in neural processing in mild cognitive impairment. We measured phase synchrony, amplitudes, and event-related potentials in veridical and false memory to determine whether these differed in participants with mild cognitive impairment compared with typical, age-matched controls. Empirical mode decomposition phase locking analysis was used to assess synchrony; this is the first time the technique has been applied to a complex cognitive task such as memory processing. The technique allowed assessment of changes in frontal and parietal cortex connectivity over time during a memory task, without a priori selection of frequency ranges, which has been shown previously to influence synchrony detection. Phase synchrony differed significantly in its timing and degree between participant groups in the theta and alpha frequency ranges. Timing differences suggested greater dependence on gist memory in the presence of mild cognitive impairment. The group with mild cognitive impairment had significantly more frontal theta phase locking than the controls in the absence of a significant behavioural difference in the task, providing new evidence for compensatory processing in the former group. Both groups showed greater frontal phase locking during false than true memory, suggesting increased searching when no actual memory trace was found. Significant inter-group differences in frontal alpha phase locking provided support for a role for lower and upper alpha oscillations in memory processing. Finally, fronto-parietal interaction was significantly reduced in the group with mild cognitive impairment, supporting the notion that mild cognitive impairment could represent an early stage in Alzheimer's disease, which has been described as a 'disconnection syndrome'.
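Phase locking between two narrowband components (e.g., intrinsic mode functions from empirical mode decomposition) is commonly summarized by the phase-locking value; a minimal sketch (generic formulation, not the paper's exact pipeline):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(a, b):
    """PLV = |<exp(i(phi_a - phi_b))>|, in [0, 1]; inputs are narrowband
    signals such as IMFs obtained from empirical mode decomposition."""
    phi_a = np.angle(hilbert(a))
    phi_b = np.angle(hilbert(b))
    return np.abs(np.mean(np.exp(1j * (phi_a - phi_b))))

# Two 6 Hz signals with a constant phase lag are near-perfectly locked.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
print(phase_locking_value(np.sin(2 * np.pi * 6 * t),
                          np.sin(2 * np.pi * 6 * t + 0.3)))
```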

Abstract:

Bimanual actions impose intermanual coordination demands not present during unimanual actions. We investigated the functional neuroanatomical correlates of these coordination demands in motor imagery (MI) of everyday actions using functional magnetic resonance imaging (fMRI). For this, 17 participants imagined unimanual actions with the left and right hand as well as bimanual actions while undergoing fMRI. A univariate fMRI analysis showed no reliable cortical activations specific to bimanual MI, indicating that intermanual coordination demands in MI are not associated with increased neural processing. A functional connectivity analysis based on psychophysiological interactions (PPI), however, revealed marked increases in connectivity between parietal and premotor areas within and between hemispheres. We conclude that in MI of everyday actions, intermanual coordination demands are primarily met by changes in connectivity between areas and only moderately, if at all, by changes in the amount of neural activity. These results are the first characterization of the neuroanatomical correlates of bimanual coordination demands in MI. Our findings support the assumed equivalence of overt and imagined actions and highlight the differences between uni- and bimanual actions. The findings extend our understanding of the motor system and may aid the development of clinical neurorehabilitation approaches based on mental practice.
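A PPI analysis augments a regression of each target region's signal with a seed-by-task interaction term, whose weight indexes task-dependent connectivity; a schematic numpy sketch (real pipelines also deconvolve the hemodynamic response, which is omitted here):

```python
import numpy as np

def ppi_effect(bold, seed, task):
    """Return the interaction beta from a simple PPI regression:
    bold ~ 1 + seed + task + seed*task."""
    X = np.column_stack([np.ones_like(seed), seed, task, seed * task])
    betas, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return betas[3]

# Synthetic check: connectivity present only during the task blocks.
rng = np.random.default_rng(1)
n = 200
seed = rng.standard_normal(n)
task = np.repeat([0.0, 1.0], n // 2)
bold = 0.5 * seed * task + 0.1 * rng.standard_normal(n)
print(ppi_effect(bold, seed, task))   # close to 0.5
```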

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are felt mainly through changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will enable investigation of the horizontal and vertical resolution required in atmospheric and ocean models for more confident predictions at the regional and local level; current limitations in computing power have severely constrained such investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.

Abstract:

Pocket Data Mining (PDM) is our new term for collaborative mining of streaming data in mobile and distributed computing environments. With sheer amounts of data streams now available for subscription on our smart mobile phones, using this data for decision making through data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide to collaborative mining among mobile devices within the same range that run data mining techniques targeting the same application. This paper proposes a new architecture, which we have prototyped, for realizing significant applications in this area. We propose using mobile software agents in this application for several reasons. Most importantly, the autonomous, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
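As a rough illustration of the mobile-agent pattern (a toy sketch only; the paper's actual PDM agent framework and APIs are not shown here), an agent can carry a lightweight model from device to device, mine each local stream in place, and combine the per-device results:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Device:
    """A handheld device exposing its local data stream (hypothetical)."""
    stream: list

    def read_stream(self):
        return self.stream

@dataclass
class MiningAgent:
    """Visits devices, classifies each local stream, and majority-votes."""
    classify: Callable[[list], int]
    votes: List[int] = field(default_factory=list)

    def visit(self, device):
        # Mine in place rather than shipping raw stream data around.
        self.votes.append(self.classify(device.read_stream()))

    def decide(self):
        return max(set(self.votes), key=self.votes.count)

# Hypothetical classifier: flag a stream whose mean exceeds a threshold.
agent = MiningAgent(classify=lambda s: int(sum(s) / len(s) > 0.5))
for dev in [Device([0.2, 0.9, 0.8]), Device([0.1, 0.2]), Device([0.9, 0.7])]:
    agent.visit(dev)
print(agent.decide())
```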

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. The latter is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.

Abstract:

In recent years, the area of data mining has experienced considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aiming to develop new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies whose classification, prediction, and pattern recognition capabilities have been utilized successfully in many areas, including science, engineering, medicine, business, banking, telecommunication, and many other fields. This paper highlights, from a data mining perspective, the implementation of NNs, using supervised and unsupervised learning, for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
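For concreteness, a minimal scikit-learn illustration of the two learning modes discussed (generic example data, not from the paper):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Supervised: a small NN learns the class labels (classification/prediction).
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X, y)
print("train accuracy:", clf.score(X, y))

# Unsupervised: cluster analysis on the same features, without labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```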

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory, which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified using a B-spline neural network model and the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both the B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inversion can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
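The Newton-Raphson inversion step can be sketched generically as follows (a toy saturating characteristic stands in for the B-spline model; the paper itself uses the inverse De Boor algorithm):

```python
import numpy as np

def invert_nonlinearity(f, f_prime, y, x0=0.0, tol=1e-10, max_iter=50):
    """Find x with f(x) = y for a monotonic static nonlinearity f,
    via Newton-Raphson: x <- x - (f(x) - y) / f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = (f(x) - y) / f_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: invert the saturating curve f(x) = tanh(x) at y = 0.5.
x = invert_nonlinearity(np.tanh, lambda x: 1.0 / np.cosh(x) ** 2, 0.5, x0=0.5)
print(x)   # approx. atanh(0.5) = 0.5493
```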

Abstract:

Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
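The Izhikevich model that NeMo implements is compact; a single-neuron reference sketch (forward-Euler with the standard regular-spiking parameters; NeMo's actual GPU kernels are of course far more involved):

```python
import numpy as np

def izhikevich(I, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Izhikevich neuron: v' = 0.04 v^2 + 5 v + 140 - u + I,
    u' = a (b v - u); on v >= 30 mV, reset v <- c and u <- u + d."""
    v, u = c, b * c
    spikes = []
    for n, i_n in enumerate(I):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_n)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike: record and reset
            spikes.append(n * dt)
            v, u = c, u + d
    return spikes

# Constant drive produces tonic spiking; 1 ms steps over one second.
print(len(izhikevich(np.full(1000, 10.0))))
```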