Efficient and Robust Compressed Sensing Using Optimized Expander Graphs

Expander graphs have recently been proposed for constructing efficient compressed sensing algorithms. In particular, it has been shown that any n-dimensional vector that is k-sparse can be fully recovered using O(k log n) measurements and only O(k log n) simple recovery iterations. In this paper, we improve upon this result by considering expander graphs with expansion coefficient beyond 3/4 and show that, with the same number of measurements, only O(k) recovery iterations are required, which is a significant improvement when n is large. In fact, full recovery can be accomplished by at most 2k very simple iterations. The number of iterations can be reduced arbitrarily close to k, and the recovery algorithm can be implemented very efficiently using a simple priority queue, with total recovery time O(n log(n/k)). We also show that, by tolerating a small penalty on the number of measurements but not on the number of recovery iterations, one can use the efficient construction of a family of expander graphs to obtain explicit measurement matrices for this method. We compare our result with other recently developed expander-graph-based methods and argue that it compares favorably both in terms of the number of required measurements and in terms of the time complexity and the simplicity of recovery. Finally, we show how our analysis extends to give a robust algorithm that finds the position and sign of the k significant elements of an almost k-sparse signal and then, using very simple optimization techniques, finds a k-sparse signal that is close to the best k-term approximation of the original signal.
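
To make the flavor of the recovery procedure concrete, below is a minimal, illustrative Python sketch, not the paper's exact algorithm or constants. It assumes a binary, left-d-regular adjacency matrix A of a bipartite expander and measurements y = A x; each iteration finds a coordinate whose neighboring measurement gaps agree on a common nonzero value and applies that correction. The function name expander_recover, the toy parameters, and the naive linear scan (used here in place of the priority queue that yields the O(n log(n/k)) running time) are assumptions for illustration only.

```python
import numpy as np
from collections import Counter

def expander_recover(A, y, max_iter=None):
    """Illustrative sketch: iterative recovery over a binary measurement
    matrix A (adjacency of a bipartite expander) with measurements y = A @ x.
    Each step finds a coordinate whose neighboring measurement gaps agree on
    a common nonzero value and applies that correction. A naive O(n) scan is
    used here; the paper's O(n log(n/k)) time relies on a priority queue."""
    m, n = A.shape
    x_hat = np.zeros(n)
    gaps = y.astype(float).copy()                 # residual y - A @ x_hat
    neighbors = [np.flatnonzero(A[:, j]) for j in range(n)]
    for _ in range(max_iter or 10 * n):
        if np.max(np.abs(gaps)) < 1e-9:           # y == A @ x_hat: done
            break
        best_j, best_g, best_count = None, 0.0, 0
        for j in range(n):
            vals = np.round(gaps[neighbors[j]], 9)
            vals = vals[vals != 0]
            if vals.size == 0:
                continue
            g, count = Counter(vals.tolist()).most_common(1)[0]
            if count > best_count:
                best_j, best_g, best_count = j, g, count
        # Stop unless a majority of some coordinate's neighbors agree on a gap
        if best_j is None or best_count <= len(neighbors[best_j]) // 2:
            break
        x_hat[best_j] += best_g
        gaps[neighbors[best_j]] -= best_g         # update the residual locally
    return x_hat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, d, k = 200, 60, 6, 5                    # toy sizes, not the paper's
    A = np.zeros((m, n))
    for j in range(n):                            # random left-d-regular graph
        A[rng.choice(m, size=d, replace=False), j] = 1.0
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    x_hat = expander_recover(A, A @ x)
    print("exact recovery:", np.allclose(x, x_hat))
```

Because each correction touches only the d measurements adjacent to the updated coordinate, keeping candidates ordered by agreement count in a priority queue (as in the paper) avoids rescanning all n coordinates per iteration.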
