The $l_1$-norm regularized minimization problem is non-differentiable and has a wide range of applications in compressive sensing. Many approaches have been proposed in the literature; among them, smoothing the $l_1$-norm is one of the effective approaches. This paper follows this path: we adopt six smoothing functions to approximate the $l_1$-norm. We then recast the signal recovery problem as a smoothed penalized least squares optimization problem and apply the nonlinear conjugate gradient method to solve the smoothed model. The algorithm is shown to be globally convergent. In addition, the simulation results not only suggest some good smoothing functions, but also show that the proposed algorithm is competitive in terms of relative error.

The $l_q$-quasi-norm sparse recovery problem for under-determined linear systems is nonconvex when $q<1$ and non-differentiable, and has been well studied, particularly in compressed sensing. In this paper, we first construct an elastic $l_q-l_1$ unconstrained minimization model for sparse recovery. We then transform this problem into a smooth nonsingular system of linear equations via the first-order optimality condition and a smoothing approximation technique, and propose an algorithm to solve the smoothed linear system. The boundedness and asymptotic regularity of the sequence generated by our method are proved, and the algorithm is shown to be globally convergent. Moreover, we investigate the error bound between the limit point of the sequence obtained by our algorithm and the sparse solution. Numerical experiments show that our algorithm achieves better performance.
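The first abstract's pipeline (smooth the $l_1$-norm, then minimize a penalized least squares objective with nonlinear conjugate gradient) can be sketched as follows. This is only an illustrative sketch: the six smoothing functions studied in the project are not named here, so it assumes one common choice, $\phi_\mu(t)=\sqrt{t^2+\mu^2}$, and a Polak-Ribiere (PR+) update with Armijo backtracking; none of these specific choices is claimed to be the project's own.

```python
import numpy as np

def smooth_abs(t, mu):
    # One common smoothing of |t|: sqrt(t^2 + mu^2).
    # (Assumed choice; the project adopts six such functions, unspecified here.)
    return np.sqrt(t**2 + mu**2)

def smooth_abs_grad(t, mu):
    # Derivative of the smoothed absolute value.
    return t / np.sqrt(t**2 + mu**2)

def recover(A, b, lam=1e-2, mu=1e-3, iters=500):
    """Minimize 0.5*||Ax-b||^2 + lam * sum_i phi_mu(x_i) by nonlinear
    conjugate gradient (PR+ formula) with Armijo backtracking."""
    n = A.shape[1]
    obj = lambda x: 0.5*np.linalg.norm(A@x - b)**2 + lam*smooth_abs(x, mu).sum()
    grad = lambda x: A.T @ (A@x - b) + lam*smooth_abs_grad(x, mu)
    x = np.zeros(n)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ d >= 0:          # safeguard: restart with steepest descent
            d = -g
        gd, f0, t = g @ d, obj(x), 1.0
        while True:             # backtracking Armijo line search
            xn = x + t*d
            fn = obj(xn)
            if fn <= f0 + 1e-4*t*gd or t < 1e-12:
                break
            t *= 0.5
        gn = grad(xn)
        beta = max(0.0, gn @ (gn - g) / (g @ g))   # PR+ restart rule
        d = -gn + beta*d
        x, g = xn, gn
        if np.linalg.norm(g) < 1e-8:
            break
    return x
```

For a small under-determined system (e.g. a 64x128 Gaussian matrix and a 5-sparse signal), `recover(A, b)` drives the smoothed penalized objective down monotonically, and the smoothing parameter `mu` trades off approximation accuracy against conditioning near zero.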
Effective start/end date: 2017/08/01 → 2019/07/31
- Compressive sensing
- Sparse solution
- Conjugate gradient algorithm
- Elastic $l_q-l_1$ minimization