Many classical sparse recovery algorithms can be considered instances of the proximal-gradient method. This page collects recent research efforts in this line.

The general proximal gradient algorithm is designed to solve composite problems of the form $\min_x f(x) + P(x)$, where $f$ is a smooth loss and $P(\cdot)$ is the nonsmooth penalty. Its central building block is the proximal operator

$$\operatorname{prox}_P(v) = \arg\min_x \tfrac{1}{2}\|x - v\|_2^2 + P(x),$$

where $v$ is any given vector; evaluating this map corresponds to a single proximal gradient descent step. Representative variants include proximal gradient descent and proximal coordinate gradient descent, and one line of work studies a forward-backward splitting algorithm in a joint sparse recovery scenario.

The formulation of basis pursuit denoising relaxes the linear constraint $Ax = b$ of (P1), making it better suited to noisy measurements. In this direction, one recent paper studies $\ell_q$ minimization and its associated iterative reweighted algorithm for recovering sparse vectors; unlike most existing work, it focuses on unconstrained $\ell_q$ minimization, which is shown to have advantages on noisy measurements and/or approximately sparse vectors. A further survey of nonconvex regularizers for sparse recovery can be found in [25].

Second-order algorithms with theoretical guarantees are still largely missing for high-dimensional nonconvex regularized sparse modeling, but this has not suppressed the enthusiasm for applying heuristic second-order algorithms to real-world problems. General nonconvex optimization is undoubtedly hard, in sharp contrast to convex optimization, where there is a clean separation of problem structure, input data, and optimization algorithms. Yet many nonconvex problems of interest become amenable to simple and practical algorithms, and to rigorous analyses, once the artificial separation is removed.

Keywords: sparse regression, structured sparsity, smoothing, proximal gradient, optimization.
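To make the proximal gradient step above concrete, here is a minimal sketch of ISTA for the $\ell_1$-penalized least-squares problem $\min_x \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1$, where the proximal operator reduces to elementwise soft-thresholding. This is a generic illustration, not the method of any particular paper collected here; the function names, fixed step size, and iteration count are all illustrative choices.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, n_iters=500):
    """ISTA: minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a
    gradient step on the smooth part with a prox step on the penalty."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)       # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - step * grad, step * lam)  # prox step
    return x
```

In practice one would usually add an accelerated momentum step (FISTA) or a backtracking line search rather than the fixed step size used here.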
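The page mentions an iterative reweighted algorithm for $\ell_q$ minimization without giving details. One standard scheme is iteratively reweighted least squares applied to the smoothed penalty $(x_i^2 + \epsilon)^{q/2}$; the sketch below assumes the unconstrained formulation $\min_x \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \sum_i |x_i|^q$, and all parameter defaults are illustrative rather than taken from the cited work.

```python
import numpy as np

def irls_lq(A, b, lam, q=0.5, eps=1e-6, n_iters=50):
    """Iteratively reweighted least squares for the unconstrained problem
    0.5*||Ax - b||^2 + lam * sum_i |x_i|^q, via the smoothed penalty
    (x_i^2 + eps)^(q/2): each step solves a weighted ridge regression."""
    AtA, Atb = A.T @ A, A.T @ b
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares warm start
    for _ in range(n_iters):
        # Quadratic majorizer of the smoothed penalty at the current iterate:
        # (x^2 + eps)^(q/2) <= const + (q/2)*(x_k^2 + eps)^(q/2 - 1) * x^2
        w = (q / 2.0) * (x ** 2 + eps) ** (q / 2.0 - 1.0)
        x = np.linalg.solve(AtA + 2.0 * lam * np.diag(w), Atb)
    return x
```

Smaller $q$ promotes sparsity more aggressively but makes the surrogate problems worse conditioned, which is one reason the smoothing constant $\epsilon$ is kept strictly positive.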