229 results for NEURAL PRECURSORS
Abstract:
This article discusses the identification of nonlinear dynamic systems using multi-layer perceptrons (MLPs). It addresses both structure uncertainty and parameter uncertainty, which have been widely explored in the nonlinear system identification literature. The main contribution is an integrated analytic framework for automated neural network structure selection, parameter identification, and hysteresis-based network switching with guaranteed neural identification performance. First, an automated network structure selection procedure is proposed that operates within a fixed time interval for a given network construction criterion. Then, a network parameter updating algorithm is proposed with a guaranteed bound on the identification error. To cope with structure uncertainty, a hysteresis strategy is proposed that enables switching between neural identifiers while preserving network performance throughout the switching process. Both theoretical analysis and a simulation example demonstrate the efficacy of the proposed method.
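The hysteresis-based switching idea can be illustrated with a brief sketch. This is not the article's algorithm: the toy plant, the one-hidden-layer MLP identifiers trained by plain gradient descent, the forgetting-factor cost, and the hysteresis margin `h_margin` are all illustrative assumptions; the sketch only shows how the active identifier is replaced when another candidate's running identification cost undercuts it by more than the margin.

```python
import numpy as np

rng = np.random.default_rng(0)

def plant(y, u):
    """Unknown nonlinear system to be identified (toy example)."""
    return 0.6 * np.sin(y) + 0.3 * u + 0.1 * u * y

class MLPIdentifier:
    """One-hidden-layer MLP identifier trained online by gradient descent."""
    def __init__(self, hidden, lr=0.05):
        self.W1 = rng.normal(scale=0.5, size=(hidden, 2))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.5, size=hidden)
        self.lr = lr
        self.h = np.zeros(hidden)

    def predict(self, x):
        self.h = np.tanh(self.W1 @ x + self.b1)
        return float(self.W2 @ self.h)

    def update(self, x, err):
        # gradient step on 0.5 * err**2, with err = prediction - target
        grad_h = err * self.W2 * (1.0 - self.h ** 2)
        self.W2 -= self.lr * err * self.h
        self.W1 -= self.lr * np.outer(grad_h, x)
        self.b1 -= self.lr * grad_h

sizes = (2, 4, 8)                                   # candidate network structures
candidates = [MLPIdentifier(n) for n in sizes]
cost = np.zeros(len(candidates))                    # running identification cost
active, h_margin = 0, 0.05                          # hysteresis margin (assumed)

y = 0.0
for k in range(2000):
    u = np.sin(0.05 * k) + 0.1 * rng.normal()
    x = np.array([y, u])
    y_next = plant(y, u)
    for i, m in enumerate(candidates):
        e = m.predict(x) - y_next
        cost[i] = 0.99 * cost[i] + e ** 2           # forgetting-factor cost
        m.update(x, e)
    # hysteresis switching: adopt another identifier only if its cost
    # undercuts the active one by more than the margin
    best = int(np.argmin(cost))
    if cost[best] + h_margin < cost[active]:
        active = best
    y = y_next

print("selected structure (hidden units):", sizes[active])
```

The margin prevents chattering between identifiers whose costs are nearly equal, which is the point of the hysteresis strategy; the article's guarantees on identification error are not reproduced here.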
Abstract:
In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points at which a synapse is modified. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects more remote synapses of the pre- and postsynaptic neurons. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful in training neural networks by learning parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well, even in the presence of noise. Importantly, the mean learning time increases polynomially with the number of patterns to be learned, indicating efficient learning.
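As a rough illustration of the kind of rule described, and not the authors' exact formulation, the sketch below applies a stochastic, reward-modulated Hebb-like update with a simple heterosynaptic decay term to the XOR mapping in a small feed-forward network of stochastic binary units. The learning rate, the per-synapse update probability, the heterosynaptic factor, and the exact form of the heterosynaptic term are illustrative assumptions, and whether or how fast the sketch converges depends on those choices.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])                      # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 3))             # hidden layer (2 units + bias)
W2 = rng.normal(scale=0.5, size=(1, 3))             # output unit (+ bias)
lr, p_update, hetero = 0.5, 0.5, 0.01               # illustrative constants

for trial in range(50000):
    k = rng.integers(4)
    x = np.append(X[k], 1.0)                        # input pattern with bias term
    p_h = sigmoid(W1 @ x)
    s_h = (rng.random(2) < p_h).astype(float)       # stochastic hidden spikes
    h = np.append(s_h, 1.0)
    p_y = sigmoid(W2 @ h)
    s_y = float(rng.random() < p_y[0])              # stochastic output spike
    r = 1.0 if s_y == T[k] else -1.0                # reinforcement signal

    # three-factor Hebb-like term: reward * (post - expected post) * pre
    d1 = r * np.outer(s_h - p_h, x)
    d2 = r * np.outer(np.array([s_y]) - p_y, h)

    # stochastic selection of the synapses modified on this trial
    m1 = (rng.random(W1.shape) < p_update).astype(float)
    m2 = (rng.random(W2.shape) < p_update).astype(float)
    W1 += lr * m1 * d1
    W2 += lr * m2 * d2

    # heterosynaptic term: unselected synapses of a neuron that was just
    # modified decay slightly toward zero
    W1 -= hetero * (1.0 - m1) * m1.any(axis=1, keepdims=True) * W1
    W2 -= hetero * (1.0 - m2) * m2.any(axis=1, keepdims=True) * W2

def readout(x):
    # deterministic mean-field readout after training
    p_h = sigmoid(W1 @ np.append(x, 1.0))
    return float(sigmoid(W2 @ np.append(p_h, 1.0))[0]) > 0.5

correct = sum(int(readout(x) == bool(t)) for x, t in zip(X, T))
print(f"{correct}/4 XOR patterns reproduced")
```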
Abstract:
Reaction of trans-[Pt(NC5H4CHBun2)2Cl2] 1 with an excess of HC≡CR (R = Ph, C6H4Me, C6H4NO2) affords the monomeric complexes trans-[Pt(NC5H4CHBun2)2(C≡CR)2] (R = Ph 2a, C6H4Me 2b, C6H4NO2 2c), the trans arrangement of the alkynyl ligands being confirmed by spectroscopic data and by an X-ray analysis of 2c; when 1 is treated with 1 equiv. of HC≡CC6H2Me2C≡CH, the polymer [Pt(NC5H4CHBun2)2C≡CC6H2Me2C≡C]n is formed, which is soluble in a range of organic solvents.