Supervised Gaussian Process Latent Variable Model for Dimensionality Reduction


Author(s): Gao, Xinbo; Wang, Xiumei; Tao, Dacheng; Li, Xuelong
Date(s)

01/04/2011

Abstract

The Gaussian process latent variable model (GP-LVM) has proven to be an effective probabilistic approach to dimensionality reduction because it can learn a low-dimensional manifold of a data set in an unsupervised fashion. However, the GP-LVM is insufficient for supervised learning tasks (e.g., classification and regression) because it ignores class label information during dimensionality reduction. In this paper, a supervised GP-LVM is developed for supervised learning tasks, and a maximum a posteriori algorithm is introduced to estimate the positions of all samples in the latent variable space. We present experimental evidence suggesting that the supervised GP-LVM uses class label information effectively and thus consistently outperforms both the GP-LVM and its discriminative extension. A comparison with supervised classification methods, such as Gaussian process classification and support vector machines, is also given to illustrate the advantage of the proposed method.
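The abstract's core idea, estimating latent positions by maximizing a GP likelihood, can be illustrated with a minimal unsupervised GP-LVM sketch. This is not the authors' implementation: the RBF kernel, fixed noise level, PCA initialization, and L-BFGS optimizer are all assumptions chosen for brevity, and the supervised prior over latent positions described in the paper is omitted.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel over latent points X (N x Q).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * np.maximum(d2, 0.0) / lengthscale ** 2)

def neg_log_likelihood(x_flat, Y, Q, noise=1e-2):
    # GP-LVM objective: negative log marginal likelihood of the
    # observations Y given latent positions X (constant terms dropped).
    N, D = Y.shape
    X = x_flat.reshape(N, Q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * D * logdet + 0.5 * np.sum(Y * alpha)

def gplvm_fit(Y, Q=2, iters=100):
    # Initialize latent positions with PCA, then refine them by
    # minimizing the negative log marginal likelihood.
    Yc = Y - Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    X0 = Yc @ Vt[:Q].T
    res = minimize(neg_log_likelihood, X0.ravel(), args=(Y, Q),
                   method="L-BFGS-B", options={"maxiter": iters})
    return res.x.reshape(Y.shape[0], Q)
```

The supervised variant in the paper would add a label-dependent term to this objective, so that the MAP estimate of the latent positions reflects class structure as well as the data manifold.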

Identifier

http://ir.opt.ac.cn/handle/181661/8628

http://www.irgrid.ac.cn/handle/1471x/146759

Language(s)

English

Keywords #Electronics and telecommunications::Signal and pattern recognition #Electronics and telecommunications::Other computer applications (including image processing) #Dimensionality reduction #Gaussian process latent variable model (GP-LVM) #generalized discriminant analysis (GDA) #probabilistic principal component analysis (probabilistic PCA) #supervised learning
Type

Journal article