10 results for LEAST-SQUARES METHODS
in SAPIENTIA - Universidade do Algarve - Portugal
Abstract:
Least-squares problems are very important and appear in a broad range of disciplines (for instance, control systems, statistics, and signal processing). Our interest in this kind of problem lies in its use for training neural network controllers.
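As a minimal sketch of the linear least-squares problem this abstract refers to (the data and model below are invented for illustration; the abstract itself gives no implementation details):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem: y = X @ w_true + noise
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

# Least-squares solution minimizing ||X w - y||_2
w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 2))
```

With low noise the recovered `w` closely matches `w_true`; in neural-network training the same solve appears with the regressor matrix built from network activations.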
Abstract:
A non-linear least-squares methodology for simultaneously estimating the parameters of selectivity curves with a pre-defined functional form, across size classes and mesh sizes, using catch size-frequency distributions, was developed based on the models of Kirkwood and Walker [Kirkwood, G.P., Walker, T.I., 1986. Gill net selectivities for gummy shark, Mustelus antarcticus Gunther, taken in south-eastern Australian waters. Aust. J. Mar. Freshw. Res. 37, 689-697] and Wulff [Wulff, A., 1986. Mathematical model for selectivity of gill nets. Arch. Fish Wiss. 37, 101-106]. Observed catches of fish of size class l in mesh m are modeled as a function of the estimated numbers of fish of that size class in the population and the corresponding selectivities. A comparison was made with the maximum-likelihood methodology of the same two studies, using simulated catch data with known selectivity-curve parameters and two published data sets. The estimated parameters and selectivity curves were generally consistent between the two methods, with smaller standard errors for the parameters estimated by non-linear least squares. The proposed methodology is a useful and accessible alternative that can be used to model selectivity in situations where the parameters of a pre-defined model can be assumed to be functions of gear size, facilitating statistical evaluation of different models and of goodness of fit. (C) 1998 Elsevier Science B.V.
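A rough sketch of the non-linear least-squares idea behind this abstract, assuming a normal selectivity curve s(l, m) = exp(-(l - k1·m)² / (2(k2·m)²)) whose mean and spread scale with mesh size, fitted by Gauss-Newton. The parameter values, data, and two-parameter model are invented; the actual methodology also estimates population numbers per size class:

```python
import numpy as np

def selectivity(theta, l, m):
    # Hypothetical selectivity curve: Gaussian in length l, centred at
    # k1*m with spread k2*m, both proportional to mesh size m.
    k1, k2 = theta
    return np.exp(-(l - k1 * m) ** 2 / (2.0 * (k2 * m) ** 2))

def gauss_newton(theta, l, m, s_obs, iters=30, eps=1e-6):
    theta = np.asarray(theta, dtype=float)
    for _ in range(iters):
        r = selectivity(theta, l, m) - s_obs        # residual vector
        # Forward-difference Jacobian of the residuals w.r.t. theta
        J = np.empty((r.size, theta.size))
        for j in range(theta.size):
            step = np.zeros_like(theta)
            step[j] = eps
            J[:, j] = (selectivity(theta + step, l, m) - selectivity(theta, l, m)) / eps
        # Gauss-Newton update: solve J @ delta = -r in the least-squares sense
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        theta = theta + delta
    return theta

# Synthetic selectivity "observations" over size classes l and two mesh sizes m
l = np.tile(np.linspace(20.0, 80.0, 31), 2)
m = np.repeat([6.0, 8.0], 31)
theta_true = np.array([7.0, 1.2])
s_obs = selectivity(theta_true, l, m)

theta_hat = gauss_newton(np.array([6.0, 1.0]), l, m, s_obs)
print(np.round(theta_hat, 2))
```

Fitting one parameter vector jointly across all mesh sizes is what allows the curve parameters to be treated as functions of gear size, as the abstract describes.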
Abstract:
In this paper a parallel implementation of an Adaptive Generalized Predictive Control (AGPC) algorithm is presented. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and of a GPC predictor is discussed here.
Abstract:
The Adaptive Generalized Predictive Control (AGPC) algorithm can be sped up using parallel processing. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and of a GPC predictor is discussed here.
Abstract:
The Adaptive Generalized Predictive Control (AGPC) algorithm can be sped up using parallel processing. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and of a GPC predictor is discussed here.
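A minimal serial sketch of the standard RLS estimator these abstracts refer to, identifying a plant model for the predictor. The first-order ARX plant, its coefficients, and the forgetting factor are invented for the illustration:

```python
import numpy as np

class RLS:
    """Recursive least-squares estimator with exponential forgetting."""
    def __init__(self, n_params, lam=0.99, delta=1e3):
        self.theta = np.zeros(n_params)       # parameter estimate
        self.P = delta * np.eye(n_params)     # inverse-correlation matrix
        self.lam = lam                        # forgetting factor

    def update(self, phi, y):
        # phi: regressor vector, y: new plant output measurement
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)    # gain vector
        err = y - phi @ self.theta            # a priori prediction error
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Identify a hypothetical plant y[t] = 0.8*y[t-1] + 0.5*u[t-1]
rng = np.random.default_rng(1)
est = RLS(2)
y_prev, u_prev = 0.0, 0.0
for _ in range(500):
    y = 0.8 * y_prev + 0.5 * u_prev
    est.update(np.array([y_prev, u_prev]), y)
    y_prev, u_prev = y, rng.normal()          # random input excites the plant
print(np.round(est.theta, 2))                 # ≈ [0.8, 0.5]
```

The matrix-vector products in `update` are the natural targets for the parallelization the abstracts discuss.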
Abstract:
In this paper the parallelization of a new learning algorithm for multilayer perceptrons, specifically targeted at nonlinear function approximation, is discussed. Each major step of the algorithm is parallelized, with special emphasis on the most computationally intensive task: the least-squares solution of linear systems of equations.
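One common place a least-squares linear solve appears in perceptron training is fitting the output-layer weights while the hidden layer is held fixed. The sketch below assumes that setting; the network sizes, weight scales, and target function are invented, and the abstract's actual algorithm is not specified:

```python
import numpy as np

rng = np.random.default_rng(2)

x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)   # inputs
t = np.sin(np.pi * x).ravel()                     # targets to approximate

# Fixed (e.g. previously updated) hidden layer of tanh units
W_h = 3.0 * rng.normal(size=(1, 20))
b_h = rng.normal(size=20)
H = np.tanh(x @ W_h + b_h)                        # hidden activations
H = np.hstack([H, np.ones((H.shape[0], 1))])      # bias column

# Linear least-squares solve for the output weights: min ||H w - t||_2
w, *_ = np.linalg.lstsq(H, t, rcond=None)
pred = H @ w
print("MSE:", float(np.mean((pred - t) ** 2)))
```

Because the hidden activations are fixed during this step, the output weights reduce to exactly the kind of linear least-squares system whose solution dominates the computation.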
Abstract:
Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2013