Classification Performance of Multilayer Perceptrons with Different Risk Functionals
Date(s) |
08/01/2015
2014
Abstract |
In this paper we assess the performance of information-theoretic risk functionals in multilayer perceptrons, with reference to the two most popular ones, mean square error (MSE) and cross-entropy. The recently proposed information-theoretic risks are: HS and HR2, respectively the Shannon and quadratic Rényi entropies of the error; ZED, a risk reflecting the error density at zero error; and EXP, a generalized exponential risk able to mimic a wide variety of risk functionals, including the information-theoretic ones. The experiments were carried out with multilayer perceptrons on 35 public real-world datasets, all following the same protocol. Statistical tests applied to the experimental results showed that the ubiquitous mean square error was the least interesting risk functional for multilayer perceptrons: it never achieved a significantly better classification performance than competing risks. Cross-entropy and EXP were the risks found by several tests to be significantly better than their competitors. Counts of significantly better and worse risks also showed the usefulness of HS and HR2 for some datasets. |
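For illustration only, the risk functionals compared in the abstract can be sketched as empirical estimates over a sample of network errors. This is an assumed reading of the standard estimators (in particular, HR2 as the quadratic Rényi entropy of the error estimated with a Gaussian Parzen window), not the paper's actual implementation; the kernel width `sigma` is a free smoothing parameter chosen here arbitrarily:

```python
import numpy as np

def mse(y, t):
    """Mean square error between network outputs y and targets t."""
    return np.mean((y - t) ** 2)

def cross_entropy(y, t, eps=1e-12):
    """Binary cross-entropy risk; y in (0, 1), t in {0, 1}."""
    y = np.clip(y, eps, 1 - eps)
    return -np.mean(t * np.log(y) + (1 - t) * np.log(1 - y))

def renyi2_entropy(errors, sigma=0.5):
    """Quadratic Rényi entropy of the error sample (an HR2-style estimate):
    -log of the information potential (1/N^2) * sum_ij G(e_i - e_j; 2*sigma^2),
    where G is a Gaussian kernel of variance 2*sigma^2 (Parzen window)."""
    e = np.asarray(errors, dtype=float)
    d = e[:, None] - e[None, :]                       # all pairwise error differences
    g = np.exp(-d**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
    return -np.log(np.mean(g))                        # lower entropy = more concentrated errors
```

Minimizing `renyi2_entropy` drives the error distribution toward a concentrated (low-entropy) shape, whereas `mse` and `cross_entropy` penalize errors pointwise; this difference is what the paper's comparison probes.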
Identifier |
http://hdl.handle.net/10400.22/5357; DOI: 10.1142/S021800141450013X
Language(s) |
eng |
Publisher |
World Scientific Publishing Company |
Relation |
International Journal of Pattern Recognition and Artificial Intelligence, Vol. 28, Issue 6; http://www.worldscientific.com/doi/abs/10.1142/S021800141450013X?journalCode=ijprai
Rights |
restrictedAccess |
Keywords | Neural Networks; Risk Functionals; Classification; Multilayer Perceptrons
Type |
article |