Publication
ICML 2009
Conference paper
Gradient descent with sparsification: An iterative algorithm for sparse recovery with restricted isometry property
Abstract
We present an algorithm for finding an s-sparse vector x that minimizes the squared error ||y − Φx||² where Φ satisfies the restricted isometry property (RIP) with isometric constant δ_2s < 1/3. Our algorithm, called GraDeS (Gradient Descent with Sparsification), iteratively updates x as x ← H_s(x + (1/γ)·Φ^T(y − Φx)), where γ > 1 is a constant and H_s sets all but the s largest-magnitude coordinates to zero. GraDeS converges to the correct solution in a constant number of iterations. The condition δ_2s < 1/3 is the most general under which a near-linear-time algorithm is known. In comparison, the best condition under which a polynomial-time algorithm is known is δ_2s < √2 − 1. Our Matlab implementation of GraDeS outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
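For readers who want to see the update rule concretely, here is a minimal NumPy sketch of the iteration stated in the abstract. Since this page gives only the abstract, the step size γ = 4/3, the iteration cap, the stopping tolerance, and the function names below are illustrative assumptions (the abstract requires only γ > 1); the authors' Matlab implementation is the authoritative version.

import numpy as np

def hard_threshold(x, s):
    """H_s: zero out all but the s largest-magnitude coordinates of x."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]   # indices of the s largest entries
    out[keep] = x[keep]
    return out

def grades(y, Phi, s, gamma=4.0/3.0, n_iter=100, tol=1e-10):
    """Iterate x <- H_s(x + (1/gamma) * Phi^T (y - Phi x)).

    gamma, n_iter, and tol are illustrative choices, not values from the paper.
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        r = y - Phi @ x                      # current residual
        x_new = hard_threshold(x + (1.0 / gamma) * (Phi.T @ r), s)
        if np.linalg.norm(x_new - x) < tol:  # stop once the iterate stabilizes
            return x_new
        x = x_new
    return x

Each step costs one multiplication by Φ and one by Φ^T plus a partial sort, which is what makes the per-iteration work near-linear in the size of Φ.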