Compressed sensing - probabilistic analysis of a null-space characterization

It is well known that compressed sensing problems reduce to solving large under-determined systems of linear equations. To ensure that the problem is well defined, i.e., that the solution is unique, the vector of unknowns is of course assumed to be sparse. Nonetheless, even when the solution is unique, finding it may in general be computationally difficult. However, starting with the seminal work of Candès and Tao [2005], it has been shown that linear programming techniques, obtained from an l1-norm relaxation of the original non-convex problem, can provably find the unknown vector in certain instances. In particular, using a certain restricted isometry property, Candès and Tao [2005] show that for measurement matrices drawn from a random Gaussian ensemble, l1 optimization finds the correct solution with overwhelming probability even when the number of non-zero entries of the unknown vector is proportional to the number of measurements (and to the total number of unknowns). The subsequent paper [Donoho and Tanner, 2005] uses results on neighborly polytopes from [Vershik and Sporyshev, 1992] to give a "sharp" bound on this proportionality constant in the Gaussian case. In the current paper, we observe that what matters is not so much the distribution from which the entries of the measurement matrix A are drawn, but rather the statistics of the null-space of A. Using this observation, we provide an alternative proof of the main result of Candès and Tao [2005] by analyzing matrices whose null-space is isotropically distributed (i.i.d. Gaussian ensembles being a special case).
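The l1-norm relaxation described above can be sketched in a few lines: min ||x||_1 subject to Ax = y is an ordinary linear program after the standard variable-splitting reformulation. The dimensions, sparsity level, and solver below are illustrative assumptions, not values from the paper.

```python
# Sketch of sparse recovery via l1 minimization, min ||x||_1 s.t. Ax = y,
# recast as a linear program. Problem sizes are illustrative only.
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, y):
    """Solve min ||x||_1 subject to A x = y as a linear program.

    Standard reformulation: introduce t >= |x| elementwise and minimize
    sum(t) over z = [x; t], with x - t <= 0 and -x - t <= 0.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # |x_i| <= t_i
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # A x = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n,
                  method="highs")
    return res.x[:n]

rng = np.random.default_rng(0)
n, m, k = 40, 20, 3                       # unknowns, measurements, sparsity
x0 = np.zeros(n)
support = rng.choice(n, k, replace=False)
x0[support] = rng.standard_normal(k)      # sparse ground truth
A = rng.standard_normal((m, n))           # i.i.d. Gaussian measurement matrix
y = A @ x0
x_hat = l1_recover(A, y)
```

With the sparsity well below the recovery threshold, as here, the LP solution typically coincides with the sparse ground truth; the theory surveyed above makes this precise for Gaussian (and, per the present paper, any isotropically null-spaced) ensembles.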

[1] A. Wyner, Random packings and coverings of the unit n-sphere, 1967.

[2] K. Böröczky et al., Covering the Sphere by Equal Spherical Balls, 2003.

[3] A. Nemirovski et al., On sparse representation in pairs of bases, IEEE Trans. Inf. Theory, 2003.

[4] I. Bárány et al., A note on the size of the largest ball inside a convex polytope, Period. Math. Hung., 2005.

[5] E. J. Candès et al., Decoding by linear programming, IEEE Trans. Inf. Theory, 2005.

[6] D. Donoho et al., Neighborliness of randomly projected simplices in high dimensions, Proc. Natl. Acad. Sci. USA, 2005.

[7] M. Rudelson et al., Geometric approach to error-correcting codes and reconstruction of signals, arXiv:math/0502299, 2005.

[8] R. D. Nowak et al., Signal Reconstruction From Noisy Random Projections, IEEE Trans. Inf. Theory, 2006.

[9] N. Linial et al., How Neighborly Can a Centrally Symmetric Polytope Be?, Discrete Comput. Geom., 2006.

[10] E. J. Candès, Compressive Sampling, Proc. Int. Congress of Mathematicians, 2006.

[11] Y. Zhang, When is missing data recoverable?, CAAM Technical Report, Rice University, 2006.

[12] M. J. Wainwright, Sharp thresholds for high-dimensional and noisy recovery of sparsity, arXiv, 2006.

[13] R. G. Baraniuk et al., Random Filters for Compressive Sampling and Reconstruction, Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), 2006.

[14] R. DeVore et al., A Simple Proof of the Restricted Isometry Property for Random Matrices, 2008.

[15] M. J. Wainwright et al., Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using l1-Constrained Quadratic Programming (Lasso), IEEE Trans. Inf. Theory, 2009.