Publication
ICML 2009
Conference paper

Gradient descent with sparsification: An iterative algorithm for sparse recovery with restricted isometry property


Abstract

We present an algorithm for finding an s-sparse vector x that minimizes the squared error ||y − Φx||², where Φ satisfies the restricted isometry property (RIP) with isometry constant δ_2s < 1/3. Our algorithm, called GraDeS (Gradient Descent with Sparsification), iteratively updates x as x ← H_s(x + (1/γ) · Φᵀ(y − Φx)), where γ > 1 and H_s sets all but the s largest-magnitude coordinates to zero. GraDeS converges to the correct solution in a constant number of iterations. The condition δ_2s < 1/3 is the most general under which a near-linear-time algorithm is known; in comparison, the best condition under which a polynomial-time algorithm is known is δ_2s < √2 − 1. Our Matlab implementation of GraDeS outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
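
As a rough illustration of the update rule stated in the abstract, the sketch below implements the iteration x ← H_s(x + (1/γ) · Φᵀ(y − Φx)) in NumPy. It is not the authors' MATLAB implementation; the step size γ = 4/3, the fixed iteration count, and the random test problem are illustrative assumptions.

    import numpy as np

    def hard_threshold(x, s):
        # H_s: keep the s largest-magnitude coordinates of x, zero out the rest.
        out = np.zeros_like(x)
        idx = np.argsort(np.abs(x))[-s:]
        out[idx] = x[idx]
        return out

    def grades(y, Phi, s, gamma=4.0/3.0, n_iter=100):
        # GraDeS-style iteration: a gradient step on ||y - Phi x||^2 scaled by 1/gamma,
        # followed by hard thresholding H_s. gamma and n_iter are illustrative choices.
        x = np.zeros(Phi.shape[1])
        for _ in range(n_iter):
            x = hard_threshold(x + (1.0 / gamma) * Phi.T @ (y - Phi @ x), s)
        return x

    # Illustrative use: recover a 5-sparse vector from 80 random measurements.
    rng = np.random.default_rng(0)
    m, n, s = 80, 256, 5
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian matrices satisfy RIP with high probability
    x_true = np.zeros(n)
    support = rng.choice(n, size=s, replace=False)
    x_true[support] = rng.standard_normal(s)
    x_hat = grades(Phi @ x_true, Phi, s)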

Date

15 Sep 2009
