We present an algorithm for finding an $s$-sparse vector $x$ that minimizes the square error $\|y - \Phi x\|^2$, where $\Phi$ satisfies the restricted isometry property (RIP) with isometric constant $\delta_{2s} < 1/3$. Our algorithm, called GraDeS (Gradient Descent with Sparsification), iteratively updates $x$ as $x \leftarrow H_s\!\left(x + \tfrac{1}{\gamma}\,\Phi^{\top}(y - \Phi x)\right)$, where $\gamma > 1$ and $H_s$ sets all but the $s$ largest-magnitude coordinates to zero. GraDeS converges to the correct solution in a constant number of iterations. The condition $\delta_{2s} < 1/3$ is the most general under which a near-linear-time algorithm is known; in comparison, the best condition under which a polynomial-time algorithm is known is $\delta_{2s} < \sqrt{2} - 1$. Our Matlab implementation of GraDeS outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered cases where $\ell_1$-regularized regression (Lasso) fails but GraDeS finds the correct solution.
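To make the update rule concrete, the following is a minimal NumPy sketch of the iteration described above. The function name grades, the zero initialization, the step parameter gamma=1.5, and the fixed iteration count are illustrative assumptions, not values taken from the paper.

import numpy as np

def grades(y, Phi, s, gamma=1.5, n_iter=100):
    """Sketch of the iteration x <- H_s(x + (1/gamma) * Phi^T (y - Phi x)),
    where H_s keeps only the s largest-magnitude coordinates."""
    x = np.zeros(Phi.shape[1])  # the zero vector is trivially s-sparse
    for _ in range(n_iter):
        # Step along the negative gradient direction of ||y - Phi x||^2
        # (the gradient is -2 Phi^T (y - Phi x); constant factors fold into gamma).
        x = x + (1.0 / gamma) * (Phi.T @ (y - Phi @ x))
        # Hard thresholding H_s: zero out all but the s largest-magnitude entries.
        small = np.argsort(np.abs(x))[:-s]
        x[small] = 0.0
    return x

# Toy usage on a synthetic sparse-recovery instance (illustrative only).
rng = np.random.default_rng(0)
n, d, s = 80, 200, 5
Phi = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, size=s, replace=False)] = rng.standard_normal(s)
y = Phi @ x_true
x_hat = grades(y, Phi, s)

Each iteration costs one multiplication by $\Phi$ and one by $\Phi^{\top}$ plus a partial sort, which is what makes the per-iteration work near linear in the size of $\Phi$.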