760 results for Dynamic artificial neural network
Abstract:
A neural network is a highly interconnected set of simple processors. The many connections allow information to travel rapidly through the network, and because the processors are simple, many of them are feasible in a single network. Together these properties imply that we can build efficient massively parallel machines using neural networks. The primary problem is how to specify the interconnections in a neural network. The approaches developed so far, such as outer products, learning algorithms, or energy functions, suffer from the following deficiencies: long training/specification times, no guarantee of working on all inputs, and a requirement of full connectivity.
Alternatively, we discuss methods of using the topology and constraints of the problems themselves to design the topology and connections of the neural solution. We define several useful circuits, generalizations of the Winner-Take-All circuit, that allow us to incorporate constraints using feedback in a controlled manner. These circuits are proven to be stable and to converge only on valid states. We use the Hopfield electronic model since this is close to an actual implementation. We also discuss methods for incorporating these circuits into larger systems, neural and non-neural. By exploiting regularities in our definition, we can construct efficient networks. To demonstrate the methods, we look to three problems from communications. We first discuss two applications to problems from circuit switching: finding routes in large multistage switches, and the call rearrangement problem. These show both how we can use many neurons to build massively parallel machines, and how the Winner-Take-All circuits can simplify our designs.
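As a rough illustration of the Winner-Take-All idea this abstract builds on, a discrete-time mutual-inhibition loop can be sketched as follows. This is not the thesis's analog circuit; the dynamics, gain, and step size here are assumptions chosen only to show the mechanism of feedback-driven competition.

```python
import numpy as np

def winner_take_all(inputs, inhibition=1.0, steps=100, dt=0.1):
    """Iterate simple mutual-inhibition dynamics until one unit dominates.

    Each unit is excited by its own external input and inhibited by the
    summed activity of the other units, so only the strongest survives.
    (Illustrative parameters, not the thesis's circuit values.)
    """
    x = np.array(inputs, dtype=float)  # initial activations
    drive = np.asarray(inputs, dtype=float)
    for _ in range(steps):
        total = x.sum()
        # excitation from own input, inhibition from everyone else, plus decay
        dx = drive - inhibition * (total - x) - x
        x = np.maximum(x + dt * dx, 0.0)  # rectify: activations stay non-negative
    return int(np.argmax(x))

# the unit with the largest input wins the competition
print(winner_take_all([0.2, 0.9, 0.4]))  # → 1
```

The feedback term `inhibition * (total - x)` plays the role of the controlled constraint feedback described above: it drives every unit except the strongest toward zero, leaving a single valid winner.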
Next we develop a solution to the contention arbitration problem of high-speed packet switches. We define a useful class of switching networks and then design a neural network to solve the contention arbitration problem for this class. Various aspects of the neural network/switch system are analyzed to measure the queueing performance of this method. Using the basic design, a feasible architecture for a large (1024-input) ATM packet switch is presented. Using the massive parallelism of neural networks, we can consider algorithms that were previously computationally unattainable. These now viable algorithms lead us to new perspectives on switch design.
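The contention problem the abstract refers to can be stated in a few lines of code: several input ports request the same output port in a time slot, and at most one may be granted. The sketch below is an illustrative software stand-in for what a per-output Winner-Take-All circuit settles to; the greedy first-requester policy is an assumption, not the thesis's design.

```python
def arbitrate(requests):
    """Grant at most one input per output for one crossbar time slot.

    requests: list of (input_port, output_port) cell requests.
    Returns the granted subset; losing requests stay queued for the
    next slot. (Illustrative policy, not the thesis's neural arbiter.)
    """
    grants = {}
    for inp, out in requests:
        # first requester for each output wins this slot
        grants.setdefault(out, inp)
    return [(inp, out) for out, inp in grants.items()]

# inputs 0 and 1 both want output 3; only one is granted
print(arbitrate([(0, 3), (1, 3), (2, 5)]))  # → [(0, 3), (2, 5)]
```

A hardware arbiter must make all of these grant decisions in parallel within one cell time, which is why the massive parallelism of a neural implementation matters at high line rates.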
Learning new articulator trajectories for a speech production model using artificial neural networks
Abstract:
A pilot study was conducted to assess the ability of an artificial neural network to predict the biomass of Peruvian anchoveta Engraulis ringens, given time series of earlier biomasses and of environmental parameters (oceanographic data and predator abundances). Acceptable predictions of three months or more appear feasible after thorough scrutiny of the input data set.
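The supervised setup behind this kind of forecasting, lagged past values as inputs and a value several months ahead as the target, can be sketched as below. The lag and horizon values are assumptions for illustration, not parameters reported by the study.

```python
import numpy as np

def make_windows(series, lag=6, horizon=3):
    """Turn a monthly series into (lagged inputs, target) training pairs.

    Each input row holds the previous `lag` observations; the target is
    the value `horizon` months ahead, matching the three-month forecast
    goal. (Hypothetical window sizes, chosen only for illustration.)
    """
    X, y = [], []
    for t in range(lag, len(series) - horizon + 1):
        X.append(series[t - lag:t])          # the 6 most recent values
        y.append(series[t + horizon - 1])    # value 3 steps ahead
    return np.array(X), np.array(y)

X, y = make_windows(list(range(20)))
print(X.shape)  # → (12, 6)
```

In the study, rows built this way from biomass and environmental series would be fed to the network, with predictor abundances and oceanographic data appended as extra input columns.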
Abstract:
The liquid-crystal light valve (LCLV) is a useful component for performing integration, thresholding, and gain functions in optical neural networks. Integration of the neural activation channels is implemented by pixelation of the LCLV, with use of a structured metallic layer between the photoconductor and the liquid-crystal layer. Measurements are presented for this type of valve, examples of which were prepared for two specific neural network implementations. The valve fabrication and measurement were carried out at the State Optical Institute, St. Petersburg, Russia, and the modeling and system applications were investigated at the Institute of Microtechnology, Neuchâtel, Switzerland.