Learning with symmetric label noise: The importance of being unhinged


Author(s): van Rooyen, Brendan; Menon, Aditya Krishna; Williamson, Robert C.
Date(s)

2015

Abstract

Convex potential minimisation is the de facto approach to binary classification. However, Long and Servedio [2008] proved that under symmetric label noise (SLN), minimisation of any convex potential over a linear function class can result in classification performance equivalent to random guessing. This ostensibly shows that convex losses are not SLN-robust. In this paper, we propose a convex, classification-calibrated loss and prove that it is SLN-robust. The loss avoids the Long and Servedio [2008] result by virtue of being negatively unbounded. The loss is a modification of the hinge loss, where one does not clamp at zero; hence, we call it the unhinged loss. We show that the optimal unhinged solution is equivalent to that of a strongly regularised SVM, and is the limiting solution for any convex potential; this implies that strong l2 regularisation makes most standard learners SLN-robust. Experiments confirm the unhinged loss's SLN-robustness.
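The abstract describes the unhinged loss as the hinge loss without the clamp at zero, i.e. linear in the margin and negatively unbounded. A minimal illustrative sketch (not the authors' code; function names are my own) contrasting the two losses:

```python
import numpy as np

def hinge_loss(y, score):
    """Standard hinge loss: max(0, 1 - y * score), clamped at zero."""
    return np.maximum(0.0, 1.0 - y * score)

def unhinged_loss(y, score):
    """Unhinged loss: 1 - y * score, with no clamp, so it can go negative."""
    return 1.0 - y * score

# Labels in {-1, +1} and real-valued classifier scores.
y = np.array([+1, +1, -1])
score = np.array([0.5, 3.0, -2.0])

print(hinge_loss(y, score))     # [0.5 0.  0. ]
print(unhinged_loss(y, score))  # [ 0.5 -2.  -1. ]
```

For a correctly classified point well inside the margin, the hinge loss flattens out at zero, whereas the unhinged loss keeps decreasing without bound; this negative unboundedness is what lets the loss escape the Long and Servedio [2008] impossibility result.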

Identifier

http://eprints.qut.edu.au/91573/

Publisher

Neural Information Processing Systems Foundation, Inc.

Relation

https://papers.nips.cc/paper/5941-learning-with-symmetric-label-noise-the-importance-of-being-unhinged

van Rooyen, Brendan, Menon, Aditya Krishna, & Williamson, Robert C. (2015) Learning with symmetric label noise: The importance of being unhinged. In Advances in Neural Information Processing Systems 28 (NIPS 2015), Neural Information Processing Systems Foundation, Inc., Palais des Congrès de Montréal, Montréal, Canada, pp. 10-18.

Rights

Copyright 2015 Neural Information Processing Systems Foundation, Inc.

Source

ARC Centre of Excellence for Mathematical & Statistical Frontiers (ACEMS); School of Mathematical Sciences; Science & Engineering Faculty

Type

Conference Paper