Derivative-free optimization and neural networks for robust regression

Author(s): Beliakov, Gleb; Kelarev, Andrei; Yearwood, John
Date(s)

01/01/2012

Abstract

Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares (LTS) criterion, in which half of the data are treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods are well established in linear regression, but have only recently begun to be applied to nonlinear regression. In this work, we examine the problem of fitting artificial neural networks (ANNs) to contaminated data using the LTS criterion. We introduce a penalized LTS criterion which prevents the unnecessary removal of valid data. Training ANNs under this criterion leads to a challenging non-smooth global optimization problem. We compare the efficiency of several derivative-free optimization methods in solving it, and show that our approach identifies the outliers correctly when ANNs are used for nonlinear regression.
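
To make the trimmed criterion concrete, the following is a minimal sketch of the plain (unpenalized) LTS objective minimized with a derivative-free method. The one-neuron tanh model, the function name lts_loss, the synthetic contaminated data, and the choice of SciPy's Nelder-Mead solver are illustrative assumptions; they are not the paper's penalized criterion or its set of compared solvers.

```python
import numpy as np
from scipy.optimize import minimize

def lts_loss(params, x, y, h):
    """Least trimmed squares: sum of the h smallest squared residuals.

    With h close to n/2 the criterion tolerates up to n - h gross
    outliers, which is what makes it a high-breakdown alternative
    to ordinary least squares.
    """
    a, b, c = params                          # one-neuron tanh "network"
    residuals = y - a * np.tanh(b * x + c)
    r2 = np.sort(residuals ** 2)
    return r2[:h].sum()                       # trim the largest residuals

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 40)
y = 1.5 * np.tanh(2.0 * x - 0.5) + 0.05 * rng.standard_normal(40)
y[:8] += 5.0                                  # contaminate 20% of the data

# The trimmed set changes discontinuously with the parameters, so the
# objective is non-smooth; a derivative-free solver such as Nelder-Mead
# is a natural (if basic) local choice for this sketch.
fit = minimize(lts_loss, x0=np.ones(3), args=(x, y, 30),
               method="Nelder-Mead")
print(fit.x)  # fitted parameters, largely unaffected by the outliers
```

Because the set of trimmed points switches abruptly as the parameters move, the objective has kinks that rule out standard gradient-based ANN training, which is why the paper compares derivative-free global optimization methods on this problem.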

Identifier

http://hdl.handle.net/10536/DRO/DU:30051386

Language(s)

eng

Publisher

Taylor & Francis

Relation

http://dro.deakin.edu.au/eserv/DU:30051386/beliakov-derivativefreeoptimization-2012.pdf

http://dx.doi.org/10.1080/02331934.2012.674946

Rights

2012, Taylor & Francis

Keywords #global optimization #least-trimmed squares #neural networks #non-smooth optimization #robust regression
Type

Journal Article