Joint multiple dictionary learning for tensor sparse coding


Author(s): Fu, Yifan; Gao, Junbin; Sun, Yanfeng; Hong, Xia
Date(s)

2014

Abstract

Traditional dictionary learning algorithms find a sparse representation of high-dimensional data by transforming each sample into a one-dimensional (1D) vector. This 1D model loses the inherent spatial structure of the data. An alternative is to apply tensor decomposition for dictionary learning on data in their original structural form, a tensor, by learning one dictionary along each mode and the corresponding sparse representation with respect to the Kronecker product of these dictionaries. To learn the mode dictionaries, all existing methods update each dictionary iteratively in an alternating manner. Because atoms from the different mode dictionaries jointly contribute to the sparsity of the tensor, treating each mode dictionary independently ignores the correlations between atoms across modes. In this paper, we propose a joint multiple dictionary learning method for tensor sparse coding that exploits atom correlations in the sparse representation and updates multiple atoms of each mode dictionary simultaneously. The Frequent-Pattern Tree (FP-tree) mining algorithm is employed to discover frequent atom patterns in the sparse representation. Inspired by K-SVD, we develop a new dictionary update rule that jointly updates the atoms in each pattern. Experimental results demonstrate that our method outperforms other tensor-based dictionary learning algorithms.
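The Kronecker-product representation mentioned above can be made concrete for the two-mode case: if a sample X is represented as D1 S D2^T with mode dictionaries D1, D2 and sparse core S, then vec(X) = (D2 ⊗ D1) vec(S). The following minimal NumPy sketch verifies this identity; the sizes and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a two-mode sample X (8x6), mode dictionaries
# D1 (8x12) and D2 (6x10), and a sparse core S (12x10).
D1 = rng.standard_normal((8, 12))
D2 = rng.standard_normal((6, 10))

# Make S sparse: only three nonzero entries, i.e. few active atom pairs.
S = np.zeros((12, 10))
S[[1, 4, 7], [2, 5, 8]] = rng.standard_normal(3)

# Structured (tensor) form of the representation: X = D1 S D2^T.
X = D1 @ S @ D2.T

# Equivalent vectorized form: vec(X) = (D2 kron D1) vec(S),
# using column-major (Fortran-order) vectorization.
lhs = X.flatten(order="F")
rhs = np.kron(D2, D1) @ S.flatten(order="F")
assert np.allclose(lhs, rhs)
```

Note that each nonzero entry of S activates one atom from every mode dictionary at once, which is why atoms across modes are correlated and, as the abstract argues, should be updated jointly rather than independently.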

Format

text

Identifier

http://centaur.reading.ac.uk/39732/1/IJCNN2014_Yifan2.pdf

Fu, Y., Gao, J., Sun, Y. and Hong, X. (2014) Joint multiple dictionary learning for tensor sparse coding. In: 2014 International Joint Conference on Neural Networks (IJCNN), July 6-11, 2014, Beijing, China.

Language(s)

en

Relation

http://centaur.reading.ac.uk/39732/

http://dx.doi.org/10.1109/IJCNN.2014.6889490

Type

Conference or Workshop Item

PeerReviewed