28 results for subspace


Relevance:

10.00%

Publisher:

Abstract:

Economic Dispatch (ED) problems have recently been solved by artificial neural network approaches. In most of these dispatch models, the cost function must be linear or quadratic, so cost functions with several minimum points pose a problem for the simulation, since these approaches do not accept nonlinear cost functions. Another drawback pointed out in the literature is that some of these neural approaches fail to converge efficiently towards feasible equilibrium points. This paper discusses the application of a modified Hopfield architecture for solving ED problems defined by nonlinear cost functions. The internal parameters of the neural network adopted here are computed using the valid-subspace technique, which guarantees convergence to equilibrium points that represent a solution of the ED problem. Simulation results and a comparative analysis involving a 3-bus test system are presented to illustrate the efficiency of the proposed approach.
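
The valid-subspace idea can be sketched outside the full neural formulation: the power-balance equality constraint defines an affine subspace onto which the state is repeatedly mapped while a gradient term reduces the (possibly nonlinear) cost. Below is a minimal sketch, assuming a hypothetical 3-generator system, illustrative cost coefficients, and the standard projection T = I - Aᵀ(AAᵀ)⁻¹A, s = Aᵀ(AAᵀ)⁻¹b; it is not the paper's exact network dynamics.

```python
import numpy as np

# Illustrative economic-dispatch data (hypothetical 3-generator system).
demand = 300.0                           # MW to be supplied
p_min = np.array([50.0, 40.0, 30.0])     # generator lower limits
p_max = np.array([200.0, 150.0, 120.0])  # generator upper limits

def cost_grad(p):
    # Gradient of an illustrative nonconvex cost: quadratic term plus a
    # sinusoidal term standing in for the nonlinearities ED models may include.
    a = np.array([0.010, 0.012, 0.015])
    b = np.array([2.0, 1.8, 2.2])
    return 2.0 * a * p + b + 0.5 * np.cos(0.05 * p)

# Valid-subspace parameters for the power-balance constraint A p = demand, A = [1 1 1].
A = np.ones((1, 3))
T = np.eye(3) - A.T @ np.linalg.inv(A @ A.T) @ A   # projector onto the null space of A
s = A.T @ np.linalg.inv(A @ A.T) @ np.array([demand])

p = np.full(3, demand / 3)               # start on the valid subspace
for _ in range(2000):
    p = p - 0.05 * cost_grad(p)          # descend the cost
    p = T @ p + s                        # map back onto the power-balance subspace
    p = np.clip(p, p_min, p_max)         # enforce generator limits ("activation" step)

print(p, p.sum())                        # dispatch and total generation (close to demand)
```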

Relevance:

10.00%

Publisher:

Abstract:

Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements. Neural networks with feedback connections provide a computing model capable of solving a rich class of optimization problems. In this paper, a modified Hopfield network is developed for solving constrained nonlinear optimization problems. The internal parameters of the network are obtained using the valid-subspace technique. Simulated examples are presented as an illustration of the proposed approach.
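
A minimal sketch of how the internal parameters of such a network can be derived from linear equality constraints A x = b, assuming the usual valid-subspace construction (a projection matrix T and an offset s); the actual network dynamics of the paper are not reproduced here.

```python
import numpy as np

def valid_subspace(A, b):
    """Return (T, s) such that x -> T @ x + s maps any point onto the
    affine subspace {x : A x = b}, assuming A has full row rank."""
    AAT_inv = np.linalg.inv(A @ A.T)
    T = np.eye(A.shape[1]) - A.T @ AAT_inv @ A   # projector onto null(A)
    s = A.T @ AAT_inv @ b                        # particular solution of A x = b
    return T, s

# Example: two equality constraints in R^4.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])
T, s = valid_subspace(A, b)

x_valid = T @ np.random.rand(4) + s
print(np.allclose(A @ x_valid, b))   # True: the mapped point is feasible
```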

Relevance:

10.00%

Publisher:

Abstract:

Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
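
A hedged sketch of the generic iteration such an architecture suggests: the state is repeatedly confined to the valid subspace (v -> T v + s), stepped against the gradient of a problem-dependent energy term, and passed through a piecewise-linear activation. The function names, step size, and the [0, 1] box are illustrative choices, not taken from the paper.

```python
import numpy as np

def run_modified_hopfield(T, s, energy_grad, v0, lr=0.01, steps=5000):
    """Generic two-step dynamics: confine the state to the valid subspace,
    then take a small step against the energy gradient and clip to [0, 1]."""
    v = v0.copy()
    for _ in range(steps):
        v = T @ v + s                                   # valid-subspace confinement
        v = np.clip(v - lr * energy_grad(v), 0.0, 1.0)  # energy descent + activation
    return v
```

With T and s built from the equality constraints of the problem at hand and energy_grad encoding its objective, the same loop structure covers the three optimization classes listed above.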

Relevance:

10.00%

Publisher:

Abstract:

We present the critical exponents ν_{L2}, η_{L2} and γ_L for an m-axial Lifshitz point at second order in an ε_L expansion. We introduce a constraint involving the loop momenta along the m-dimensional subspace in order to perform two- and three-loop integrals. The results are valid in the range 0 ≤ m ≤ d. The case m = 0 corresponds to the usual Ising-like critical behaviour.
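
For orientation, the setting behind these exponents can be stated compactly. The block below records the standard quadratic part of the Landau-Ginzburg-Wilson action for an m-axial Lifshitz point, with momenta split between an m-dimensional "parallel" subspace and the remaining (d - m) directions, together with the conventional expansion parameter defined from the upper critical dimension d_c = 4 + m/2; the actual two- and three-loop values of the exponents are the subject of the paper itself and are not reproduced here.

```latex
% Quadratic (free) part of the LGW action for an m-axial Lifshitz point;
% at the Lifshitz point the coefficient \delta_0 of (\nabla_\parallel \phi)^2 vanishes.
\mathcal{S}_0 = \frac{1}{2}\int d^{d}x\,
  \Big[(\nabla_{\perp}\phi)^{2} + \sigma_0\,(\nabla_{\parallel}^{2}\phi)^{2}
       + \delta_0\,(\nabla_{\parallel}\phi)^{2} + \tau_0\,\phi^{2}\Big],
\qquad
\epsilon_L = 4 + \tfrac{m}{2} - d .
```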

Relevance:

10.00%

Publisher:

Abstract:

The ability of neural networks to realize complex nonlinear functions makes them attractive for system identification. This paper describes a novel barrier method using artificial neural networks to solve robust parameter estimation problems for nonlinear models with unknown-but-bounded errors and uncertainties. This problem can be represented as a typical constrained optimization problem. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to its equilibrium points. A solution of the robust estimation problem with unknown-but-bounded errors corresponds to an equilibrium point of the network. Simulation results are presented as an illustration of the proposed approach.
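
A minimal sketch of the kind of barrier formulation described: residuals of an illustrative nonlinear model are kept inside a known error bound by log-barrier terms, and the resulting unconstrained objective is minimized (here with plain scipy optimizers for brevity, rather than the paper's Hopfield dynamics). The model, the bound delta, and the data are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares, minimize

# Hypothetical data: y = theta0 * (1 - exp(-theta1 * x)) + e, with |e| <= delta known a priori.
rng = np.random.default_rng(0)
x = np.linspace(0.1, 5.0, 40)
delta = 0.1
y = 2.0 * (1 - np.exp(-0.8 * x)) + rng.uniform(-delta, delta, x.size)

def residuals(theta):
    return y - theta[0] * (1 - np.exp(-theta[1] * x))

# Feasible warm start: an ordinary least-squares fit usually leaves residuals inside the bound.
theta0 = least_squares(residuals, x0=np.array([1.0, 1.0])).x

def barrier_objective(theta, mu=1e-4):
    r = residuals(theta)
    slack = delta**2 - r**2                 # positive while every |r_i| < delta
    if np.any(slack <= 0):
        return np.inf                       # infeasible: outside the bounded-error set
    return np.sum(r**2) - mu * np.sum(np.log(slack))

est = minimize(barrier_objective, x0=theta0, method="Nelder-Mead").x
print(est)   # parameter estimate consistent with the unknown-but-bounded errors
```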

Relevance:

10.00%

Publisher:

Abstract:

Systems based on artificial neural networks have high computational rates due to the use of a massive number of simple processing elements and the high degree of connectivity between these elements. Neural networks with feedback connections provide a computing model capable of solving a large class of optimization problems. This paper presents a novel approach for solving dynamic programming problems using artificial neural networks. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the convergence of the network to equilibrium points which represent solutions (not necessarily optimal) of the dynamic programming problem. Simulated examples are presented and compared with other neural networks. The results demonstrate that the proposed method gives a significant improvement.
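
For reference, the problem class targeted here can be pinned down with a few lines of conventional dynamic programming; the sketch below solves a minimal multistage shortest-path problem with hypothetical stage costs by the Bellman recursion, which is the kind of solution the network's equilibrium points are meant to encode.

```python
# Minimal multistage shortest-path DP (hypothetical costs): stages 0..3, two
# states per stage; cost[t][i][j] is the cost of moving from state i at stage t
# to state j at stage t + 1.  The Bellman recursion is evaluated backwards.
cost = [
    [[1, 4], [2, 1]],
    [[3, 2], [1, 5]],
    [[2, 1], [4, 2]],
]

n_states = 2
value = [0.0] * n_states                    # terminal cost-to-go
policy = []
for stage_cost in reversed(cost):
    new_value, decision = [], []
    for i in range(n_states):
        candidates = [stage_cost[i][j] + value[j] for j in range(n_states)]
        best = min(range(n_states), key=lambda j: candidates[j])
        new_value.append(candidates[best])
        decision.append(best)
    value, policy = new_value, [decision] + policy

print(value)    # optimal cost-to-go from each initial state
print(policy)   # optimal decision at each stage and state
```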

Relevance:

10.00%

Publisher:

Abstract:

In Colombeau's theory, given an open subset Ω of ℝ^n, there is a differential algebra G(Ω) of generalized functions which contains the space D′(Ω) of distributions as a vector subspace in a natural way. There is also a simpler version of the algebra, G_s(Ω). Although this subalgebra does not contain the space D′(Ω) in a canonical way, it is enough for most applications. This work is developed in the simplified generalized functions framework. In several applications it is necessary to compute higher intrinsic derivatives of generalized functions, and since these derivatives are multilinear maps, it is necessary to define the space of generalized functions in Banach spaces. In this article we introduce the composite function for a special class of generalized mappings (defined on open subsets of Banach spaces with values in Banach spaces) and we compute the higher intrinsic derivative of this composite function.
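
The computation referred to extends the higher-order chain rule to generalized mappings; for orientation, the classical second-order case for smooth maps between Banach spaces, evaluated at a point x on a pair of directions (h, k), is recorded below (the paper's contribution is the analogous formula within the simplified Colombeau framework).

```latex
% Second intrinsic derivative of the composite g \circ f of smooth maps between Banach spaces:
D^{2}(g \circ f)(x)(h,k)
  = D^{2}g\big(f(x)\big)\big(Df(x)h,\ Df(x)k\big)
  + Dg\big(f(x)\big)\,D^{2}f(x)(h,k).
```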

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

10.00%

Publisher:

Abstract:

Neural networks are dynamic systems consisting of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks for solving the N-Queens problem. More specifically, a modified Hopfield network is developed and its internal parameters are explicitly computed using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the considered problem. The network is shown to be completely stable and globally convergent to the solutions of the N-Queens problem. A fuzzy logic controller is also incorporated in the network to minimize convergence time. Simulation results are presented to validate the proposed approach.
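
A sketch of how the N-Queens board can be cast in the form the valid-subspace construction needs: an N x N array of neuron outputs with one-queen-per-row and one-queen-per-column equality constraints (diagonal conflicts would be handled by the optimization term, and the fuzzy controller mentioned above is not modeled here). The helper name and the use of a pseudo-inverse are illustrative choices.

```python
import numpy as np

def queens_constraints(n):
    """Equality constraints A v = b for an n x n board flattened row-major:
    every row and every column must contain exactly one queen."""
    A = np.zeros((2 * n, n * n))
    for i in range(n):
        A[i, i * n:(i + 1) * n] = 1.0      # row i sums to 1
        A[n + i, i::n] = 1.0               # column i sums to 1
    return A, np.ones(2 * n)

A, b = queens_constraints(8)
# These row/column constraints are linearly dependent (rank 2n - 1), so a
# pseudo-inverse is the safer way to form the valid-subspace parameters.
T = np.eye(A.shape[1]) - np.linalg.pinv(A) @ A
s = np.linalg.pinv(A) @ b
print(np.allclose(A @ (T @ np.random.rand(64) + s), b))   # True
```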

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)