A fast non-smooth nonnegative matrix factorization for learning sparse representation
Date(s) |
01/01/2016
|
Abstract |
Nonnegative matrix factorization (NMF) is a hot topic in machine learning and data processing. Recently, a constrained variant, non-smooth NMF (NsNMF), has shown great potential for learning meaningful sparse representations of observed data. However, it suffers from a slow, linear convergence rate, which discourages its application to large-scale data representation. In this paper, a fast NsNMF (FNsNMF) algorithm is proposed to speed up NsNMF. We first show that the cost function of the derived sub-problem is convex and that its gradient is Lipschitz continuous. The optimization of this function is then replaced by solving a proximal function, which is designed around the Lipschitz constant and can be minimized by means of a specially constructed, fast-convergent sequence. Owing to the proximal function and its efficient optimization, the proposed method achieves a nonlinear convergence rate, much faster than NsNMF. Simulations on both synthetic and real-world data demonstrate the advantages of our algorithm over the compared methods. |
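The acceleration idea the abstract describes — a proximal step sized by the Lipschitz constant of the sub-problem's gradient, combined with a fast-convergent extrapolation sequence — can be sketched for the nonnegative least-squares sub-problem min_H ||V − WH||_F² s.t. H ≥ 0. The sketch below uses a Nesterov/FISTA-style momentum sequence as an illustration of the general technique; the function name `accelerated_nnls` and all parameter choices are assumptions for this example, not the paper's exact FNsNMF update.

```python
import numpy as np

def accelerated_nnls(V, W, n_iter=200):
    """Illustrative sketch: solve min_H ||V - W H||_F^2 s.t. H >= 0 with a
    Nesterov-accelerated projected gradient (FISTA-style) iteration.
    This hypothetical helper demonstrates the acceleration idea only;
    it is not the paper's FNsNMF algorithm."""
    # Lipschitz constant of the gradient W^T (W H - V) is the largest
    # eigenvalue of W^T W (its spectral norm), so 1/L is a safe step size.
    L = np.linalg.norm(W.T @ W, 2)
    # Warm start: least-squares solution projected onto the nonnegative orthant.
    H = np.maximum(np.linalg.pinv(W) @ V, 0)
    Y, t = H.copy(), 1.0
    for _ in range(n_iter):
        grad = W.T @ (W @ Y - V)
        # Proximal step for the nonnegativity constraint = projection onto H >= 0.
        H_new = np.maximum(Y - grad / L, 0)
        # Fast-convergent momentum sequence (Nesterov extrapolation).
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Y = H_new + ((t - 1) / t_new) * (H_new - H)
        H, t = H_new, t_new
    return H

# Toy usage on synthetic data: recover a nonnegative H from V = W @ H_true.
rng = np.random.default_rng(0)
W = rng.random((20, 5))
H_true = rng.random((5, 30))
V = W @ H_true
H_est = accelerated_nnls(V, W)
```

This style of update converges at an O(1/k²) rate in the objective, versus O(1/k) for plain projected gradient descent, which mirrors the speed-up the paper claims over standard NsNMF iterations.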
Identifier | |
Language(s) |
eng |
Publisher |
IEEE |
Relation |
http://dro.deakin.edu.au/eserv/DU:30087587/xiang-fastnonsmooth-2016.pdf http://www.dx.doi.org/10.1109/ACCESS.2016.2605704 |
Rights |
2016, IEEE |
Keywords | #Nonnegative matrix factorization #sparse representation #nonlinear convergence rate |
Type |
Journal Article |