Neural approach for solving several types of optimization problems


Author(s): da Silva, I. N.; Amaral, W. C.; Arruda, L. V. R.
Contributor(s)

Universidade Estadual Paulista (UNESP)

Date(s)

20/05/2014

01/03/2006

Abstract

Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
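For illustration only: the abstract describes driving a Hopfield-type network to an equilibrium point of an energy function, where the equilibrium encodes a solution of the optimization problem. The sketch below is a minimal, generic discrete Hopfield-style energy minimization in Python; the function name `hopfield_minimize`, the toy matrices `W` and `b`, and the bipolar state encoding are assumptions made for this example and do not reproduce the paper's valid-subspace parameter computation.

```python
import numpy as np

def hopfield_minimize(W, b, s0, max_sweeps=100):
    """Asynchronously update a bipolar (+1/-1) state until it stops changing.

    For symmetric W with zero diagonal, the energy
    E(s) = -0.5 * s @ W @ s - b @ s is non-increasing under these updates,
    so the state settles at an equilibrium point (a local minimum of E).
    This mirrors the general idea of Hopfield-type optimization, not the
    paper's specific modified network.
    """
    s = np.array(s0, dtype=float)
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            h = W[i] @ s + b[i]            # local field at unit i
            new_state = 1.0 if h >= 0 else -1.0
            if new_state != s[i]:
                s[i] = new_state
                changed = True
        if not changed:                    # equilibrium point reached
            break
    return s

# Toy symmetric weights and bias (illustrative only, not from the paper).
W = np.array([[ 0.0, 1.0, -2.0],
              [ 1.0, 0.0,  1.0],
              [-2.0, 1.0,  0.0]])
b = np.array([0.5, -0.5, 0.0])
print(hopfield_minimize(W, b, [-1, 1, -1]))
```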

Format

563-580

Identifier

http://dx.doi.org/10.1007/s10957-006-9032-9

Journal of Optimization Theory and Applications. New York: Springer/Plenum Publishers, v. 128, n. 3, p. 563-580, 2006.

0022-3239

http://hdl.handle.net/11449/38376

10.1007/s10957-006-9032-9

WOS:000241554100005

Language(s)

eng

Publisher

Springer

Relation

Journal of Optimization Theory and Applications

Rights

closedAccess

Keywords #recurrent neural networks #nonlinear optimization #dynamic programming #combinatorial optimization #Hopfield network
Type

info:eu-repo/semantics/article