Abstract
Recently, sparse subspace clustering, as a subspace learning technique, has been successfully applied to several computer vision applications, e.g. face clustering and motion segmentation. The main idea of sparse subspace clustering is to learn an effective sparse representation, which is then used to construct an affinity matrix for spectral clustering. While most existing sparse subspace clustering algorithms and their extensions rely on convex relaxations, the non-convex and non-smooth l_q (0 < q < 1) norm has demonstrated better recovery performance. In this paper we propose an l_q norm based Sparse Subspace Clustering method (lqSSC), motivated by recent work showing that the l_q norm enhances sparsity and approximates the l_0 norm better than the l_1 norm does. However, optimizing the l_q norm under multiple constraints is challenging. To solve this non-convex problem, we employ the Alternating Direction Method of Multipliers (ADMM), updating the variables in an alternating minimization fashion. ADMM splits the optimization into multiple subproblems, so that the l_q norm term can be solved via Smooth Iterative Reweighted Least Squares (SIRLS), which converges with a guarantee. Unlike traditional IRLS algorithms, the proposed algorithm is based on gradient descent with adaptive weights, making it well suited to the general sparse subspace clustering problem. Experiments on computer vision tasks (synthetic data, face clustering and motion segmentation) demonstrate that the proposed approach achieves considerably better clustering accuracy than convex-relaxation based subspace clustering methods.
Acknowledgments
This work is partially supported by the NSF of China under Grants 61672548 and 61173081, and by the Guangzhou Science and Technology Program, China, under Grant 201510010165.
Appendix: Proof of Theorem 1
To prove Theorem 1, let \(u = z^{(j)} - \frac {1}{\rho } w^{(j)}\), where u is treated as a constant. First, for j = 1, …, d,
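The equation displays are lost in this excerpt. In a standard scaled-ADMM treatment (an assumption here, since the paper's problem (18) is not shown in full), the subproblem behind this substitution would take the form

```latex
\min_{x}\; \lambda \,\|x\|_{q}^{q} \;+\; \frac{\rho}{2}\,\|x - u\|_{2}^{2},
\qquad u = z^{(j)} - \frac{1}{\rho}\, w^{(j)},
```

which separates into d scalar problems \(\min _{x_{i}} \lambda |x_{i}|^{q} + \frac {\rho }{2}(x_{i} - u_{i})^{2}\), one per coordinate.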
Then
Note that \(f(x) = x^{q/2}\) (0 < q < 1) is a concave function on \([0, \infty )\); then for any \(y, z > 0\),
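The displayed inequality is missing from this excerpt; the concavity property being invoked is the standard first-order bound (a reconstruction, under the assumption that (30) is this bound):

```latex
f(y) \;\le\; f(z) + f'(z)\,(y - z)
\quad\Longrightarrow\quad
y^{q/2} \;\le\; z^{q/2} + \frac{q}{2}\, z^{q/2 - 1}\,(y - z),
\qquad y, z > 0 .
```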
Using (30), we have
Applying (30) again, the left-hand side of (31) can be rewritten as
We now consider (18); multiplying both sides by \((x^{(k)} - x^{(k+1)})'\),
Simplifying (32) yields (33), which can be converted as follows:
Note that \(f(x) = \|x\|_{2}^{2}\) is a convex function; then for all \(y, z \in \mathbb {R}^{d}\),
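The corresponding display is also missing here; the first-order bound for \(f(x) = \|x\|_{2}^{2}\) (which is convex, so the inequality runs in the opposite direction to (30)) reads, as a reconstruction:

```latex
\|y\|_{2}^{2} \;\ge\; \|z\|_{2}^{2} + 2\, z'(y - z),
\qquad y, z \in \mathbb{R}^{d}.
```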
By employing this inequality, we have
This proves that \(J(x^{(k)}, \epsilon )\) is a decreasing sequence. Note that
Thus the sequence \(\{x^{(k)}\}\) is bounded. Furthermore, if \(\epsilon > 0\), the boundedness of \(\{x^{(k)}\}\) implies that there exists a subsequence \(\{x^{(k_{j})}\}\) converging to some point \(x_{\epsilon }^{*}\). Note that \(\|x^{(k+1)} - x^{(k)}\|_{2} \rightarrow 0\); thus the subsequence \(\{x^{(k_{j}+1)}\}\) also converges to \(x_{\epsilon }^{*}\). Consider this subsequence in (18):
Letting \(k_{j} \rightarrow \infty \), we get
Therefore, \(x_{\epsilon }^{*}\) is a critical point of (18).
Cite this article
Kuang, S., Chao, H. & Yang, J. Efficient l q norm based sparse subspace clustering via smooth IRLS and ADMM. Multimed Tools Appl 76, 23163–23185 (2017). https://doi.org/10.1007/s11042-016-4091-x