An objective function based on Bayesian likelihoods of necessity and sufficiency for concept learning in the absence of labeled counter-examples
Contributor(s) |
Arabnia, Hamid; Mun, Youngson |
Date(s) |
01/01/2004
|
Abstract |
Supervised machine learning techniques generally require that the training set on which learning is based contain sufficient examples representative of the target concept, as well as known counter-examples of the concept; however, in many application domains it is not possible to supply a set of labeled counter-examples. This paper proposes an objective function based on Bayesian likelihoods of necessity and sufficiency. This function can be used to guide search towards the discovery of a concept description given only a set of labeled positive examples of the target concept and a corpus of unlabeled examples. Results of experiments performed on several datasets from the UCI repository show that the technique achieves accuracy comparable to conventional supervised learning techniques, despite the fact that the latter require a set of labeled counter-examples to be supplied. The technique can be applied in many domains in which the provision of labeled counter-examples is problematic. |
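The abstract describes scoring candidate concept descriptions by Bayesian likelihoods of necessity and sufficiency, using only labeled positives and an unlabeled corpus. The sketch below is an illustrative assumption, not the paper's exact formulation: it estimates "necessity" as smoothed coverage of the positives, and "sufficiency" by contrasting coverage on positives against coverage on the unlabeled corpus (which stands in for the instance space), then combines the two as a product. The `covers`, `objective` names and the Laplace smoothing are hypothetical choices for this sketch.

```python
# Illustrative sketch (assumed formulation, not the paper's objective):
# score a candidate concept description from labeled positives plus an
# unlabeled corpus, with no labeled counter-examples.

def covers(description, example):
    """A description is a tuple of (attribute, value) pairs; it covers
    an example (a dict) if every pair matches."""
    return all(example.get(attr) == val for attr, val in description)

def objective(description, positives, unlabeled):
    # "Necessity"-style term: how much of the positive class the
    # description covers, i.e. an estimate of P(description | positive),
    # with Laplace smoothing.
    pos_covered = sum(covers(description, e) for e in positives)
    necessity = (pos_covered + 1) / (len(positives) + 2)

    # "Sufficiency"-style term: how specific the description is to the
    # positives, approximated by comparing positive coverage against
    # coverage on the unlabeled corpus.
    unl_covered = sum(covers(description, e) for e in unlabeled)
    sufficiency = (pos_covered + 1) / (pos_covered + unl_covered + 2)

    # Combining by product is one simple choice; a search procedure
    # (e.g. a genetic algorithm, per the keywords) would maximize this.
    return necessity * sufficiency

positives = [{"shape": "round", "color": "red"},
             {"shape": "round", "color": "green"}]
unlabeled = [{"shape": "round", "color": "red"},
             {"shape": "square", "color": "red"},
             {"shape": "square", "color": "blue"}]

good = (("shape", "round"),)   # covers both positives
bad = (("color", "blue"),)     # covers no positives
assert objective(good, positives, unlabeled) > objective(bad, positives, unlabeled)
```

Under this toy data, the description covering both positives scores higher than one covering only unlabeled examples, which is the behavior the objective is meant to induce during search.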
Identifier | |
Language(s) |
eng |
Publisher |
CSREA Press |
Relation |
http://dro.deakin.edu.au/eserv/DU:30005285/skabar-anobjectivefunction-2004.pdf |
Rights |
2004, CSREA Press |
Keywords | #concept learning #genetic algorithms #semi-supervised learning |
Type |
Conference Paper |