Approximated Function Based Spectral Gradient Algorithm for Sparse Signal Recovery
Abstract
Numerical algorithms for ℓ0-norm regularized non-smooth, non-convex minimization problems have recently become a topic of great interest in signal processing, compressive sensing, statistics, and machine learning. The ℓ0-norm, however, makes the problem combinatorial and, in general, computationally intractable. In this paper, we construct a new surrogate function to approximate the ℓ0-norm regularizer, thereby turning the discrete optimization problem into a continuous and smooth one. We then apply the well-known spectral gradient algorithm to the resulting smooth optimization problem. Numerical experiments illustrate that the method is very promising.
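The abstract does not specify the paper's surrogate function or its exact iteration, so the sketch below is only illustrative of the general approach: it pairs a commonly used smooth ℓ0 surrogate, x_i^2/(x_i^2 + ε), with plain Barzilai-Borwein (spectral gradient) steps and a simple continuation loop that shrinks ε. The surrogate, the parameter values, the continuation schedule, and the absence of a nonmonotone line search (cf. Raydan, 1997; Xiao, Wu, and Qi) are all assumptions, not the authors' algorithm.

```python
import numpy as np

def surrogate(x, eps):
    """One common smooth approximation of ||x||_0: sum_i x_i^2 / (x_i^2 + eps)."""
    return np.sum(x ** 2 / (x ** 2 + eps))

def surrogate_grad(x, eps):
    """Gradient of the surrogate above: 2*eps*x / (x^2 + eps)^2, componentwise."""
    return 2.0 * eps * x / (x ** 2 + eps) ** 2

def spectral_gradient_recover(A, b, lam=1e-2, eps_list=(1.0, 1e-1, 1e-2),
                              max_iter=300, tol=1e-8):
    """Barzilai-Borwein (spectral) gradient iteration on the smoothed problem
        min_x 0.5*||Ax - b||^2 + lam * surrogate(x, eps),
    with a continuation loop that shrinks eps, warm-starting each stage."""
    x = A.T @ b                                   # crude starting point
    alpha0 = 1.0 / np.linalg.norm(A, 2) ** 2      # safe fallback step length
    for eps in eps_list:
        def grad(z, e=eps):
            return A.T @ (A @ z - b) + lam * surrogate_grad(z, e)
        g, alpha = grad(x), alpha0
        for _ in range(max_iter):
            x_new = x - alpha * g
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            x, g = x_new, g_new
            if np.linalg.norm(g) < tol:
                break
            sy = s @ y
            # BB1 step length; fall back to the safe step if curvature is not positive
            alpha = (s @ s) / sy if sy > 1e-12 else alpha0
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 64, 256, 8                          # k-sparse signal, m measurements
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true
    x_hat = spectral_gradient_recover(A, b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In this setup the spectral step length is computed only from the previous iterate and gradient, so each iteration costs one matrix-vector product with A and one with A^T.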
References

J. Barzilai and J.M. Borwein, Two-point step size gradient methods, IMA J. Numer. Anal., 8 (1988), 141-148.
E. Candès, J. Romberg, and T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Appl. Math., 59 (2005), 1207-1233.
E. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Inform. Theory, 52 (2006), 489-509.
X. Chen, D. Ge, Z. Wang, and Y. Ye, Complexity of unconstrained L2-Lp minimization, Math. Program., DOI 10.1007/s10107-012-0613-0.
X. Chen, F. Xu, and Y. Ye, Lower bound theory of nonzero entries in solutions of ℓ2-ℓp minimization, SIAM J. Sci. Comput., 32 (2010), 2832-2852.
D.L. Donoho, For most large underdetermined systems of linear equations, the minimal l1-norm solution is also the sparsest solution, Comm. Pure Appl. Math., 59 (2006), 907-934.
D.L. Donoho, Compressed sensing, IEEE Trans. Inform. Theory, 52 (2006), 1289-1306.
Y.H. Dai, W.W. Hager, K. Schittkowski, H. Zhang, The cyclic Barzilai-Borwein method for unconstrained optimization, IMA J. Numer. Anal., 26 (2006), 604-627.
D. Ge, X. Jiang, and Y. Ye, A note on the complexity of Lp minimization, Math. Program., 129 (2011), 285-299.
R. Gribonval and M. Nielsen, Sparse representations in unions of bases, IEEE Trans. Inform. Theory, 49 (2003), 3320-3325.
E.T. Hale, W. Yin, and Y. Zhang, Fixed-point continuation for ℓ1-minimization: Methodology and convergence, SIAM J. Optim., 19 (2008), 1107-1130.
M. Raydan, On the Barzilai and Borwein choice of steplength for the gradient method, IMA J. Numer. Anal., 13 (1993), 321-326.
M. Raydan, The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem, SIAM J. Optim., 7 (1997), 26-33.
S.J. Wright, R.D. Nowak, and M.A.T. Figueiredo, Sparse reconstruction by separable approximation, in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, 2008, 3373-3376.
A.S. Weigend, D.E. Rumelhart, and B.A. Huberman, Generalization by weight-elimination with application to forecasting, in Advances in Neural Information Processing Systems 3, R.P. Lippmann, J.E. Moody, and D.S. Touretzky, eds., Morgan Kaufmann, San Mateo, CA, 1991, 875-882.
Z. Wen, W. Yin, D. Goldfarb, and Y. Zhang, A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation, SIAM J. Sci. Comput., 32 (2010), 1832-1857.
Z. Wen, W. Yin, H. Zhang, and D. Goldfarb, On the convergence of an active-set method for ℓ1-minimization, Optimization Methods and Software, 27 (2012), 1127-1146.
Y. Xiao, S.-Y. Wu, and L. Qi, Nonmonotone Barzilai-Borwein gradient algorithm for ℓ1-regularized nonsmooth minimization in compressive sensing, available at http://arxiv.org/abs/1207.4538.