Building Support Vector Machines in the Context of Regularised Least Squares


Author(s): Peng, Jian; Rafferty, Karen; Ferguson, Stuart
Date(s)

08/06/2016


Abstract

This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables of the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels whilst avoiding the need to make use of Lagrange multipliers and duality theory. A fast iterative algorithm based on Cholesky decomposition with permutation of the support vectors is proposed as a solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived entirely in the primal context of RLS) are demonstrated using a set of public benchmarking problems for both linear and nonlinear SVMs.
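The abstract describes the algorithm only at a high level. As a point of reference, the following minimal Python sketch shows one common way such a primal iteration can be realised for a linear kernel with an L2 (squared-hinge) loss: solve a regularized least-squares system restricted to the current support vectors, recompute the error indicators, and repeat until the support set is stable. All names are illustrative rather than taken from the paper, and for brevity each pass refactorises the system from scratch instead of updating a permuted Cholesky factor as the paper proposes.

    import numpy as np

    def rls_svm_fit(X, y, C=1.0, max_iter=50):
        # Illustrative iterative RLS solve for a linear L2-loss SVM.
        # X: (n, d) inputs; y: (n,) labels in {-1, +1}. Hypothetical sketch.
        n, d = X.shape
        Z = np.hstack([X, np.ones((n, 1))])   # absorb the bias b into Z
        D = np.eye(d + 1)
        D[d, d] = 1e-8                        # tiny ridge keeps A positive definite
        sv = np.ones(n, dtype=bool)           # error indicators: start all active
        w_b = np.zeros(d + 1)
        for _ in range(max_iter):
            Zs, ys = Z[sv], y[sv]
            A = D + C * (Zs.T @ Zs)           # RLS normal equations on the support set
            rhs = C * (Zs.T @ ys)
            L = np.linalg.cholesky(A)         # Cholesky factor of the SPD system
            w_b = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
            new_sv = y * (Z @ w_b) < 1.0      # points violating the margin
            if np.array_equal(new_sv, sv):    # support set stable: converged
                break
            sv = new_sv
        return w_b[:d], w_b[d]

Starting from the full training set and letting the margin test shrink the active set mirrors the abstract's observation that the weights and bias are ultimately determined by the support vectors alone.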

Identifier

http://pure.qub.ac.uk/portal/en/publications/building-support-vector-machines-in-the-context-of-regularised-least-squares(e4655c2f-5f31-4084-a4b9-9754d0b77abb).html

http://dx.doi.org/10.1016/j.neucom.2016.03.087

Language(s)

eng

Rights

info:eu-repo/semantics/embargoedAccess

Source

Peng, J, Rafferty, K & Ferguson, S 2016, 'Building Support Vector Machines in the Context of Regularised Least Squares', Neurocomputing, pp. 1. DOI: 10.1016/j.neucom.2016.03.087

Type

article