Using upper bounds on attainable discrimination to select discrete valued features


Author(s): Lovell, D. R.; Dance, C. R.; Niranjan, M.; Prager, R. W.; Dalton, K. J.
Date(s)

1996

Abstract

Selection of features that will permit accurate pattern classification is a difficult task. However, if a particular data set is represented by discrete valued features, it becomes possible to determine empirically the contribution that each feature makes to the discrimination between classes. This paper extends the discrimination bound method so that both the maximum and average discrimination expected on unseen test data can be estimated. These estimation techniques are the basis of a backwards elimination algorithm that can be used to rank features in order of their discriminative power. Two problems are used to demonstrate this feature selection process: classification of the Mushroom Database, and a real-world, pregnancy-related medical risk prediction task: assessment of the risk of perinatal death.
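For orientation only, the following is a minimal Python sketch of the kind of backwards elimination ranking described in the abstract. It computes a simple resubstitution discrimination bound (the majority class within each combination of discrete feature values), not the paper's estimators of the maximum and average discrimination expected on unseen test data; the function names and structure are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption, not the paper's estimator): for discrete
# features, no classifier can exceed the accuracy obtained by predicting
# the majority class within each distinct combination of feature values.
# Backwards elimination repeatedly drops the feature whose removal costs
# the least of this empirical bound, yielding a discriminative-power ranking.
from collections import Counter, defaultdict

def discrimination_bound(rows, labels, feature_idx):
    """Upper bound on attainable accuracy using only the given features."""
    cells = defaultdict(Counter)
    for row, label in zip(rows, labels):
        key = tuple(row[i] for i in feature_idx)
        cells[key][label] += 1
    # Best possible: assign the majority class in every cell.
    correct = sum(counts.most_common(1)[0][1] for counts in cells.values())
    return correct / len(labels)

def backwards_elimination(rows, labels, n_features):
    """Return features in removal order; last removed = most discriminative."""
    remaining = list(range(n_features))
    removal_order = []
    while len(remaining) > 1:
        # Drop the feature whose absence reduces the bound the least.
        least_useful = max(
            remaining,
            key=lambda f: discrimination_bound(
                rows, labels, [g for g in remaining if g != f]))
        remaining.remove(least_useful)
        removal_order.append(least_useful)
    removal_order.extend(remaining)
    return removal_order
```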

Identifier

http://eprints.qut.edu.au/79895/

Publisher

IEEE

Relation

http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=548353

DOI: 10.1109/NNSP.1996.548353

Lovell, D. R., Dance, C. R., Niranjan, M., Prager, R. W., & Dalton, K. J. (1996) Using upper bounds on attainable discrimination to select discrete valued features. In Neural Networks for Signal Processing VI: Proceedings of the 1996 IEEE Signal Processing Society Workshop, IEEE, Kyoto, pp. 233-242.

Rights

IEEE

Source

School of Electrical Engineering & Computer Science; Science & Engineering Faculty

Keywords #Algorithms #Calculations #Data reduction #Errors #Estimation #Pattern recognition #Testing #Discrete valued features #Discrimination bound method #Feature selection process #Neural networks
Type

Conference Paper