Gaussian mixtures based IRLS for sparse recovery with quadratic convergence
Chiara Ravazzi, Enrico Magli
IEEE Transactions on Signal Processing, vol. 63, no. 13, pp. 3474-3489, July 2015.
Abstract
In this paper we propose a new class of iteratively re-weighted least squares (IRLS) algorithms for sparse recovery problems. The proposed methods are inspired by constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution assumption. In the noise-free setting, we provide sufficient conditions ensuring the convergence of the sequences generated by these algorithms to the set of fixed points of the maps that rule their dynamics, and we derive conditions, verifiable a posteriori, for convergence to a sparse solution. We further prove that these algorithms converge quadratically in a neighborhood of a sparse solution. We show through numerical experiments that the proposed methods outperform classical IRLS for ℓτ-minimization with τ ∈ (0, 1] in terms of speed and of sparsity-undersampling tradeoff, and that they are robust even in the presence of noise. The simplicity and the theoretical guarantees provided in this paper make this class of algorithms an attractive solution for sparse recovery problems.
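For context, the sketch below illustrates the classical IRLS baseline for ℓτ-minimization that the abstract compares against, not the GSM-based re-weighting proposed in the paper. It assumes the standard noise-free formulation min ‖x‖τ^τ subject to Ax = y, with smoothed weights w_i = (x_i² + ε²)^(τ/2 − 1); the geometric annealing of ε and all parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def irls_ltau(A, y, tau=1.0, n_iter=50, eps=1.0, eps_decay=0.5):
    """Minimal sketch of classical IRLS for  min ||x||_tau^tau  s.t.  A x = y.

    This is the l_tau baseline referenced in the abstract, not the paper's
    GSM-based method. `eps` smooths the weights and is shrunk geometrically
    (one common heuristic).
    """
    # Start from the minimum-norm solution of the underdetermined system.
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(n_iter):
        # Smoothed l_tau weights: w_i = (x_i^2 + eps^2)^(tau/2 - 1).
        w = (x**2 + eps**2) ** (tau / 2.0 - 1.0)
        D = np.diag(1.0 / w)  # D = diag(w_i^{-1})
        # Weighted least-squares step under the constraint A x = y:
        #   x <- D A^T (A D A^T)^{-1} y
        x = D @ A.T @ np.linalg.solve(A @ D @ A.T, y)
        eps *= eps_decay  # anneal the smoothing parameter
    return x

# Example: recover a 10-sparse vector from 80 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, s = 200, 80, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ x_true
x_hat = irls_ltau(A, y, tau=0.8)
```

In this baseline, the weighted least-squares subproblem has the closed form x = D Aᵀ (A D Aᵀ)⁻¹ y, so each iteration costs one m×m solve; the proposed GSM-based methods replace the weighting rule while keeping the same overall IRLS structure.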