New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property

Consider an $m \times N$ matrix $\Phi$ with the Restricted Isometry Property of order $k$ and level $\delta$; that is, the norm of any $k$-sparse vector in $\mathbb{R}^N$ is preserved to within a multiplicative factor of $1 \pm \delta$ under application of $\Phi$. We show that randomizing the column signs of such a matrix $\Phi$ yields a map that, with high probability, embeds any fixed set of $p = O(e^k)$ points in $\mathbb{R}^N$ into $\mathbb{R}^m$ without distorting the norm of any point in the set by more than a factor of $1 \pm \delta$. Consequently, matrices with the Restricted Isometry Property and randomized column signs provide optimal Johnson-Lindenstrauss embeddings up to logarithmic factors in $N$. In particular, our results improve the best known bounds on the necessary embedding dimension $m$ for a wide class of structured random matrices; for partial Fourier and partial Hadamard matrices, we improve the recent bound $m = O(\delta^{-4} \log(p) \log^4(N))$ of Ailon and Liberty to $m = O(\delta^{-2} \log(p) \log^4(N))$, which is optimal up to the logarithmic factors in $N$. Our results also have a direct application in the area of compressed sensing for redundant dictionaries.
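The construction described above is easy to simulate. The following Python sketch (the helper names, the choice of a subsampled DFT matrix, and all parameter values are illustrative assumptions, not taken from the paper) builds a partial Fourier matrix, randomizes its column signs with an i.i.d. Rademacher diagonal, and empirically measures the norm distortion over a fixed set of points. It only illustrates the map $x \mapsto \Phi D_\varepsilon x$; it does not certify that $\Phi$ satisfies the RIP at these dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def partial_fourier(m, N, rng):
    """m randomly chosen rows of the unitary N x N DFT, rescaled so that
    the row sampling preserves squared norms in expectation."""
    rows = rng.choice(N, size=m, replace=False)
    n = np.arange(N)
    F = np.exp(-2j * np.pi * np.outer(rows, n) / N) / np.sqrt(N)
    return np.sqrt(N / m) * F

def randomize_column_signs(Phi, rng):
    """Right-multiply by D_eps, a diagonal matrix of i.i.d. +-1 signs."""
    eps = rng.choice([-1.0, 1.0], size=Phi.shape[1])
    return Phi * eps  # broadcasting flips the sign of each column

# Illustrative dimensions, not the paper's parameter regime.
N, m, p = 2048, 512, 100
X = rng.standard_normal((p, N))  # fixed point set in R^N

Phi = randomize_column_signs(partial_fourier(m, N, rng), rng)
Y = X @ Phi.T                    # embedded points in C^m

ratios = np.linalg.norm(Y, axis=1) / np.linalg.norm(X, axis=1)
print(f"norm ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")
```

For generic points the printed ratios concentrate near 1; the theorem makes this quantitative, tying the worst-case deviation over the point set to the RIP level $\delta$ of $\Phi$.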

References

[1] W. Hoeffding, Probability Inequalities for Sums of Bounded Random Variables, 1963.

[2] F. T. Wright et al., A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables, 1971.

[3] W. B. Johnson et al., Extensions of Lipschitz mappings into Hilbert space, 1984.

[4] Peter Frankl et al., The Johnson-Lindenstrauss lemma and the sphericity of some graphs, J. Comb. Theory, Ser. B, 1987.

[5] M. Talagrand, New concentration inequalities in product spaces, 1996.

[6] P. Massart and M. Ledoux, Concentration Inequalities Using the Entropy Method, 2002.

[7] Noga Alon et al., Problems and results in extremal combinatorics--I, Discret. Math., 2003.

[8] Sanjoy Dasgupta et al., An elementary proof of a theorem of Johnson and Lindenstrauss, Random Struct. Algorithms, 2003.

[9] E. Candès et al., Stable signal recovery from incomplete and inaccurate measurements, arXiv:math/0503066, 2005.

[10] Tamás Sarlós et al., Improved Approximation Algorithms for Large Matrices via Random Projections, 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS '06), 2006.

[11] D. Donoho, For most large underdetermined systems of equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution, 2006.

[12] Emmanuel J. Candès et al., Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, 2004.

[13] Emmanuel J. Candès et al., Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?, IEEE Transactions on Information Theory, 2004.

[14] Bernard Chazelle et al., Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform, STOC '06, 2006.

[15] Per-Gunnar Martinsson et al., Randomized algorithms for the low-rank approximation of matrices, Proceedings of the National Academy of Sciences, 2007.

[16] Ronald A. DeVore, Deterministic constructions of compressed sensing matrices, J. Complex., 2007.

[17] M. Rudelson et al., On sparse reconstruction from Fourier and Gaussian measurements, 2008.

[18] Nir Ailon et al., Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes, SODA '08, 2008.

[19] R. DeVore et al., A Simple Proof of the Restricted Isometry Property for Random Matrices, 2008.

[20] Amit Singer et al., Dense Fast Random Projections and Lean Walsh Transforms, APPROX-RANDOM, 2008.

[21] Justin K. Romberg et al., Compressive Sensing by Random Convolution, SIAM J. Imaging Sci., 2009.

[22] Richard G. Baraniuk et al., Random Projections of Smooth Manifolds, Found. Comput. Math., 2009.

[23] M. A. Iwen, Simple deterministically constructible RIP matrices with sublinear Fourier sampling requirements, 43rd Annual Conference on Information Sciences and Systems, 2009.

[24] Holger Rauhut et al., Circulant and Toeplitz matrices in compressed sensing, arXiv, 2009.

[25] Trac D. Tran et al., Fast and efficient dimensionality reduction using Structurally Random Matrices, IEEE International Conference on Acoustics, Speech and Signal Processing, 2009.

[26] Rachel Ward et al., Compressed Sensing With Cross Validation, IEEE Transactions on Information Theory, 2008.

[27] Holger Rauhut et al., The Gelfand widths of ℓp-balls for 0 < p ≤ 1, J. Complex., 2010.

[28] S. Foucart, A note on guaranteed sparse recovery via ℓ1-minimization, 2010.

[29] Anirban Dasgupta et al., A sparse Johnson-Lindenstrauss transform, STOC '10, 2010.

[30] Massimo Fornasier et al., Compressive Sensing and Structured Random Matrices, 2010.

[31] Daniel M. Kane et al., A Derandomized Sparse Johnson-Lindenstrauss Transform, Electron. Colloquium Comput. Complex., 2010.

[32] Jan Vybíral, A variant of the Johnson-Lindenstrauss lemma for circulant matrices, arXiv:1002.2847, 2010.

[33] H. Rauhut et al., Sparse Legendre expansions via ℓ1-minimization, arXiv:1003.0251, 2010.

[34] Justin K. Romberg et al., Beyond Nyquist: Efficient Sampling of Sparse Bandlimited Signals, IEEE Transactions on Information Theory, 2009.

[35] Stephen J. Dilworth et al., Explicit constructions of RIP matrices and related problems, arXiv, 2010.

[36] Yonina C. Eldar et al., Compressed Sensing with Coherent and Redundant Dictionaries, arXiv, 2010.

[37] Nir Ailon et al., An almost optimal unrestricted fast Johnson-Lindenstrauss transform, SODA '11, 2010.

[38] Nathan Halko et al., Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions, SIAM Rev., 2009.

[39] Holger Rauhut et al., Sparse Legendre expansions via ℓ1-minimization, J. Approx. Theory, 2012.

[40] Holger Rauhut et al., Compressive sensing with structured random matrices, 2012.

[41] J. Romberg, Restricted Isometries for Partial Random Circulant Matrices, arXiv, 2010.