A case for orthogonal measurements in linear inverse problems

We investigate random matrices with orthonormal rows and compare them to matrices with independent Gaussian entries. We find that orthonormality provides an inherent advantage in conditioning. In particular, for any given subset S of ℝⁿ, we show that orthonormal matrices have better restricted eigenvalues than Gaussian matrices. We consider the implications of this result for linear inverse problems; in particular, we investigate the noisy sparse estimation setup and applications to the restricted isometry property. We relate our findings to known results on Gaussian processes and precise undersampling theorems. We then discuss and illustrate the universality of this noise-robustness behavior for partial unitary matrices, including the Hadamard and Discrete Cosine Transform matrices.
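
The following minimal numerical sketch is not part of the paper; the dimensions (n = 256, m = 128), the sparsity level k = 20, and the random-support model are illustrative assumptions. It compares the smallest restricted singular value over random k-column supports for a matrix with orthonormal rows against an i.i.d. Gaussian matrix of the same size, which is one simple way to observe the restricted-eigenvalue comparison described above; rows of a partial DCT or Hadamard matrix could be substituted for the orthonormalized Gaussian to probe the universality claim.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 128, 20   # ambient dimension, number of measurements, sparsity (illustrative)

# i.i.d. Gaussian measurement matrix with entries ~ N(0, 1/m),
# so each column has unit expected squared norm.
G = rng.standard_normal((m, n)) / np.sqrt(m)

# Matrix with orthonormal rows: orthonormalize the columns of an n x m Gaussian
# matrix and transpose. Rescale by sqrt(n/m) so columns also have unit expected
# squared norm, making the two ensembles directly comparable.
Q, _ = np.linalg.qr(rng.standard_normal((n, m)))   # Q: n x m with orthonormal columns
A_orth = np.sqrt(n / m) * Q.T                      # m x n with orthogonal rows

def min_restricted_singular_value(A, k, trials=200):
    """Smallest singular value of A restricted to random k-column supports."""
    vals = []
    for _ in range(trials):
        S = rng.choice(A.shape[1], size=k, replace=False)
        vals.append(np.linalg.svd(A[:, S], compute_uv=False).min())
    return min(vals)

print("Gaussian        :", min_restricted_singular_value(G, k))
print("Orthonormal rows:", min_restricted_singular_value(A_orth, k))
```

Under this normalization, a larger minimum restricted singular value indicates better restricted conditioning, which is the quantity the abstract's comparison concerns.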
