Random Prism: a noise-tolerant alternative to Random Forests


Author(s)

Stahl, Frederic; Bramer, Max

Date(s)

01/11/2014

Abstract

Ensemble learning can increase the overall classification accuracy of a classifier by generating multiple base classifiers and combining their classification results. Decision trees are a frequently used family of base classifiers for ensemble learning, but alternative approaches can be used, such as the Prism family of algorithms, which also induces classification rules. Unlike decision trees, Prism algorithms generate modular classification rules that cannot necessarily be represented in the form of a decision tree. Prism algorithms produce a classification accuracy similar to that of decision trees, and in some cases, for example when there is noise in the training and test data, they can outperform decision trees. Nevertheless, Prism still tends to overfit on noisy data; hence, ensemble learners have been adopted in this work to reduce the overfitting. This paper describes the development of an ensemble learner that uses a member of the Prism family as the base classifier in order to reduce the overfitting of Prism algorithms on noisy datasets. The developed ensemble classifier is compared with a stand-alone Prism classifier in terms of classification accuracy and resistance to noise.
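The ensemble idea the abstract describes can be sketched as bagging with majority voting: train several base classifiers on bootstrap samples of the training data and combine their predictions by vote. The sketch below is a minimal, hypothetical illustration only; it substitutes a toy one-rule learner for the actual Prism base classifier used in the paper, and all names and data are invented for the example.

```python
import random
from collections import Counter

def one_rule_learner(sample):
    """Toy stand-in for a Prism-style base classifier: it learns one
    modular rule of the form 'IF x >= threshold THEN class_above'.
    (Hypothetical; the paper uses a member of the Prism family.)"""
    threshold = sum(x for x, _ in sample) / len(sample)
    above = [y for x, y in sample if x >= threshold]
    below = [y for x, y in sample if x < threshold]
    cls_above = Counter(above).most_common(1)[0][0] if above else 0
    cls_below = Counter(below).most_common(1)[0][0] if below else cls_above
    return lambda x: cls_above if x >= threshold else cls_below

def bagged_ensemble(data, n_learners=11, seed=42):
    """Train base classifiers on bootstrap samples (bagging)."""
    rng = random.Random(seed)
    learners = []
    for _ in range(n_learners):
        sample = [rng.choice(data) for _ in data]  # sample with replacement
        learners.append(one_rule_learner(sample))
    return learners

def predict(learners, x):
    """Combine the base classifiers' outputs by majority vote."""
    votes = Counter(clf(x) for clf in learners)
    return votes.most_common(1)[0][0]

# Toy 1-D dataset with one noisy label, (0.4, 1), to illustrate
# why voting over bootstrap samples dampens the effect of noise.
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.4, 1),
        (0.7, 1), (0.8, 1), (0.9, 1), (1.0, 1)]
ensemble = bagged_ensemble(data)
pred_low = predict(ensemble, 0.15)   # point deep in the class-0 region
pred_high = predict(ensemble, 0.95)  # point deep in the class-1 region
```

Because each base classifier sees a different bootstrap sample, the noisy example influences only some of them, and the majority vote tends to overrule those that overfit to it; this is the same mechanism the paper exploits to reduce Prism's overfitting on noisy data.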

Format

text

Identifier

http://centaur.reading.ac.uk/32914/1/AI2011%20%281%29.pdf

Stahl, F. <http://centaur.reading.ac.uk/view/creators/90005065.html> and Bramer, M. (2014) Random Prism: a noise-tolerant alternative to Random Forests. Expert Systems, 31 (5). pp. 411-420. ISSN 1468-0394 doi: 10.1111/exsy.12032 <http://dx.doi.org/10.1111/exsy.12032> (special issue on innovative techniques and applications of artificial intelligence)

Language(s)

en

Publisher

Wiley-Blackwell

Relation

http://centaur.reading.ac.uk/32914/

10.1111/exsy.12032

Type

Article

PeerReviewed