Pavel Kisilev, Daniel Freedman, et al.
ICPR 2012
We present an algorithm for finding an s-sparse vector x that minimizes the square error ||y − Φx||² where Φ satisfies the restricted isometry property (RIP) with isometric constant δ_2s < 1/3. Our algorithm, called GraDeS (Gradient Descent with Sparsification), iteratively updates x as x ← H_s(x + (1/γ)·Φᵀ(y − Φx)), where γ > 1 and H_s sets all but the s largest-magnitude coordinates to zero. GraDeS converges to the correct solution in a constant number of iterations. The condition δ_2s < 1/3 is the most general under which a near-linear-time algorithm is known. In comparison, the best condition under which a polynomial-time algorithm is known is δ_2s < √2 − 1. Our Matlab implementation of GraDeS outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
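The update rule above is simple enough to sketch directly. The following is a minimal NumPy illustration, not the authors' Matlab implementation; the function names `hard_threshold` and `grades`, the default γ = 4/3 (an illustrative choice consistent with γ > 1), and the fixed iteration count are assumptions for the sketch.

```python
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude coordinates of x, zero out the rest."""
    out = np.zeros_like(x)
    if s > 0:
        idx = np.argsort(np.abs(x))[-s:]  # indices of the s largest |x_i|
        out[idx] = x[idx]
    return out

def grades(Phi, y, s, gamma=4/3, n_iter=50):
    """Sketch of the GraDeS iteration x <- H_s(x + (1/gamma) * Phi^T (y - Phi x)).

    Phi: m x n measurement matrix, y: length-m observation vector,
    s: target sparsity, gamma > 1: step-size parameter (assumed default 4/3).
    """
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        residual = y - Phi @ x
        x = hard_threshold(x + (1.0 / gamma) * (Phi.T @ residual), s)
    return x
```

The per-iteration cost is dominated by the two matrix-vector products with Φ and Φᵀ, which is what makes the overall running time near-linear when the number of iterations is constant.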
Michelle X. Zhou, Fei Wang, et al.
ICMEW 2013
Sudeep Sarkar, Kim L. Boyer
Computer Vision and Image Understanding
James E. Gentile, Nalini Ratha, et al.
BTAS 2009