A first-order smoothed penalty method for compressed sensing. (English) Zbl 1219.90123

Summary: We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem \(\min\{\|x\|_1:Ax=b\}\). SPA is efficient as long as the matrix-vector products \(Ax\) and \(A^{T}y\) can be computed efficiently; in particular, \(A\) does not need to have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Y. Nesterov’s optimal algorithm for simple sets [Introductory lectures on convex optimization. A basic course. Boston: Kluwer Academic Publishers (2004; Zbl 1086.90045); Math. Program. 103, No. 1 (A), 127–152 (2005; Zbl 1079.90102)]. We show that the SPA iterates \(x_k\) are \(\varepsilon\)-feasible, i.e., \(\|Ax_k-b\|_2\leq\varepsilon\), and \(\varepsilon\)-optimal, i.e., \(|\,\|x_k\|_1-\|x^\ast\|_1|\leq\varepsilon\), after \(\widetilde{O}(\varepsilon^{-\frac{3}{2}})\) iterations. SPA works with an \(\ell_1\), \(\ell_2\), or \(\ell_\infty\) penalty on the infeasibility, and it extends easily to the relaxed recovery problem \(\min\{\|x\|_1:\|Ax-b\|_2\leq\delta\}\).
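The overall scheme, solving a sequence of penalized subproblems with an accelerated first-order method and tightening the penalty and smoothing parameters between stages, can be sketched as follows. This is a minimal illustration only, assuming a quadratic (\(\ell_2^2\)) penalty on the infeasibility and a Huber smoothing of \(\|x\|_1\); the function names (`spa_sketch`, `huber_grad`) and the continuation schedule are hypothetical choices, not the paper's exact SPA parameters.

```python
import numpy as np

def huber_grad(x, mu):
    """Gradient of the Huber smoothing of |x_i|: x/mu clipped to [-1, 1]."""
    return np.clip(x / mu, -1.0, 1.0)

def spa_sketch(A, b, rhos=(10.0, 100.0, 1000.0), mus=(0.1, 0.01, 0.001), inner=500):
    """Illustrative smoothed penalty scheme for min ||x||_1 s.t. Ax = b.

    Each stage minimizes huber_mu(x) + (rho/2)||Ax - b||^2 with Nesterov's
    accelerated gradient method, then increases rho (heavier penalty on
    infeasibility) and decreases mu (sharper l1 smoothing). The schedule
    and iteration counts are hand-picked for this toy example.
    """
    n = A.shape[1]
    L_A = np.linalg.norm(A, 2) ** 2              # spectral norm squared, ||A||_2^2
    x = np.zeros(n)
    for rho, mu in zip(rhos, mus):
        L = 1.0 / mu + rho * L_A                 # Lipschitz constant of the gradient
        y, x_prev, t = x.copy(), x.copy(), 1.0
        for _ in range(inner):
            grad = huber_grad(y, mu) + rho * (A.T @ (A @ y - b))
            x_new = y - grad / L                 # gradient step
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x_prev)  # momentum step
            x_prev, t = x_new, t_new
        x = x_prev                               # warm start for the next stage
    return x

# Small synthetic sparse-recovery instance: recover a 4-sparse signal
# from 30 random Gaussian measurements in dimension 80.
rng = np.random.default_rng(0)
m, n, k = 30, 80, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
b = A @ x_true
x_hat = spa_sketch(A, b)
```

Note that the sketch only touches \(A\) through the products \(Ay\) and \(A^{T}r\), which is the property the summary highlights: no orthogonality of the rows of \(A\) is needed, only fast matrix-vector multiplication.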

MSC:

90C25 Convex programming
90C06 Large-scale problems in mathematical programming
49M29 Numerical methods involving duality
65D10 Numerical smoothing, curve fitting
90C90 Applications of mathematical programming

Software:

FPC_AS; NESTA