Publication | Open Access
A Fast Non-Smooth Nonnegative Matrix Factorization for Learning Sparse Representation
Citations: 32 · References: 27 · Year: 2016
Keywords: Sparse Representation · Engineering · Machine Learning · Data Science · Learning Sparse Representation · Pattern Recognition · Sparse Neural Network · Matrix Factorization · Nonnegative Matrix Factorization · Multilinear Subspace Learning · Atomic Decomposition · Computer Science · Fast NsNMF · Deep Learning · Low-rank Approximation · Non-smooth NMF
Nonnegative matrix factorization (NMF) is an active topic in machine learning and data processing. Recently, a constrained variant, non-smooth NMF (NsNMF), has shown great potential for learning meaningful sparse representations of observed data. However, it suffers from a slow linear convergence rate, which discourages its application to large-scale data representation. In this paper, a fast NsNMF (FNsNMF) algorithm is proposed to speed up NsNMF. We first show that the cost function of the derived subproblem is convex and that its gradient is Lipschitz continuous. The optimization of this function is then replaced by minimizing a proximal function, which is designed from the Lipschitz constant and solved by constructing a fast convergent sequence. Owing to the proximal function and its efficient optimization, our method achieves a nonlinear convergence rate, much faster than that of NsNMF. Simulations on both synthetic and real-world data show the advantages of our algorithm over the compared methods.
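The acceleration strategy the abstract describes — exploiting the Lipschitz-continuous gradient of a convex subproblem through a proximal step driven by a fast convergent (Nesterov-type) sequence — can be sketched for a plain NMF subproblem. This is a simplified illustration, not the paper's exact method: the true FNsNMF subproblem also involves the NsNMF smoothing matrix, which is omitted here, and the function name and parameters are assumptions for this sketch.

```python
import numpy as np

def accel_nmf_subproblem(V, W, H0, iters=100):
    """Accelerated projected-gradient solve of min_{H >= 0} 0.5*||V - W H||_F^2.

    A hedged sketch of the Nesterov-style proximal scheme the abstract
    describes; the paper's subproblem additionally involves the NsNMF
    smoothing matrix, which is left out for clarity.
    """
    WtW = W.T @ W
    WtV = W.T @ V
    L = np.linalg.norm(WtW, 2)                   # Lipschitz constant of the gradient
    H = H0.copy()
    Y = H0.copy()                                # extrapolated point
    t = 1.0                                      # fast convergent sequence t_k
    for _ in range(iters):
        grad = WtW @ Y - WtV                     # gradient of 0.5*||V - W Y||_F^2
        H_new = np.maximum(Y - grad / L, 0.0)    # proximal step = projection onto H >= 0
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = H_new + ((t - 1.0) / t_new) * (H_new - H)  # Nesterov extrapolation
        H, t = H_new, t_new
    return H
```

With this update, the objective decreases at an O(1/k^2) rate rather than the O(1/k) rate of plain projected gradient, which is the kind of speedup over linear-rate NsNMF that the abstract claims.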