Regularized nonnegative shared subspace learning


Author(s): Gupta, Sunil Kumar; Phung, Dinh; Adams, Brett; Venkatesh, Svetha
Date(s)

01/01/2013

Abstract

Joint modeling of related data sources has the potential to improve various data mining tasks such as transfer learning, multi-task clustering, and information retrieval. However, diversity among the data sources might outweigh the advantages of joint modeling and thus may result in performance degradation. To this end, we propose a regularized shared subspace learning framework, which can exploit the mutual strengths of related data sources while remaining robust to the variability of each source. This is achieved by imposing a mutual orthogonality constraint on the constituent subspaces, which segregates the common patterns from the source-specific patterns and thus avoids performance degradation. Our approach is rooted in nonnegative matrix factorization and extends it to enable joint analysis of related data sources. Experiments performed on three real-world data sets for both retrieval and clustering applications demonstrate the benefits of regularization and validate the effectiveness of the model. Our proposed solution provides a formal framework for jointly analyzing related data sources and is therefore applicable to a wider context in data mining.
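The abstract describes factoring each data source into a shared subspace plus a source-specific subspace, with a mutual-orthogonality regularizer separating the two. The following is a minimal illustrative sketch of that idea, not the paper's actual algorithm or update rules: it jointly factors two toy nonnegative matrices with a shared basis `W`, source-specific bases `V1`, `V2`, and a penalty `lam * ||W.T @ V||_F^2`, minimized here by simple projected gradient steps (all sizes, step sizes, and factor names are assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Two related nonnegative data sources (features x samples); toy sizes.
X1 = rng.random((20, 30))
X2 = rng.random((20, 25))

k_shared, k1, k2 = 3, 2, 2            # shared / source-specific subspace dims (illustrative)
W  = rng.random((20, k_shared))       # shared basis
V1 = rng.random((20, k1))             # source-specific bases
V2 = rng.random((20, k2))
H1 = rng.random((k_shared + k1, 30))  # per-source encodings
H2 = rng.random((k_shared + k2, 25))

lam = 0.1   # weight of the mutual-orthogonality penalty (illustrative)
eta = 1e-3  # gradient step size (illustrative)

def loss():
    # Reconstruction error of both sources plus the orthogonality penalty.
    B1, B2 = np.hstack([W, V1]), np.hstack([W, V2])
    rec = np.linalg.norm(X1 - B1 @ H1) ** 2 + np.linalg.norm(X2 - B2 @ H2) ** 2
    orth = np.linalg.norm(W.T @ V1) ** 2 + np.linalg.norm(W.T @ V2) ** 2
    return rec + lam * orth

initial_loss = loss()
for _ in range(500):
    # One projected-gradient step per factor; clip to keep factors nonnegative.
    B1, B2 = np.hstack([W, V1]), np.hstack([W, V2])
    R1, R2 = B1 @ H1 - X1, B2 @ H2 - X2
    gW  = (R1 @ H1[:k_shared].T + R2 @ H2[:k_shared].T
           + lam * (V1 @ V1.T + V2 @ V2.T) @ W)
    gV1 = R1 @ H1[k_shared:].T + lam * W @ (W.T @ V1)
    gV2 = R2 @ H2[k_shared:].T + lam * W @ (W.T @ V2)
    gH1, gH2 = B1.T @ R1, B2.T @ R2
    W  = np.clip(W  - eta * gW,  0, None)
    V1 = np.clip(V1 - eta * gV1, 0, None)
    V2 = np.clip(V2 - eta * gV2, 0, None)
    H1 = np.clip(H1 - eta * gH1, 0, None)
    H2 = np.clip(H2 - eta * gH2, 0, None)
final_loss = loss()
```

After fitting, columns of `W` capture patterns common to both sources, while `V1` and `V2` absorb source-specific variation; the penalty discourages overlap between the two sets of basis vectors.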

Identifier

http://hdl.handle.net/10536/DRO/DU:30044260

Language(s)

eng

Publisher

Springer

Relation

http://dro.deakin.edu.au/eserv/DU:30044260/gupta-regularizednonnegative-2013.pdf

http://dx.doi.org/10.1007/s10618-011-0244-8

Rights

2011, The Author(s)

Keywords #auxiliary sources #multi-task clustering #nonnegative shared subspace learning #transfer learning
Type

Journal Article