Simple error bounds for regularized noisy linear inverse problems

Consider estimating a structured signal x₀ from linear, underdetermined, and noisy measurements y = Ax₀ + z by solving a variant of the lasso: x̂ = arg min<sub>x</sub> {∥y − Ax∥₂ + λf(x)}. Here, f is a convex function that promotes the structure of x₀, e.g., the ℓ₁-norm to promote sparsity or the nuclear norm to promote low-rankness. We assume that the entries of A are independent and normally distributed, and we make no assumptions on the noise vector z other than that it is independent of A. Under this generic setup, we derive a general, non-asymptotic, and rather tight upper bound on the ℓ₂-norm of the estimation error, ∥x̂ − x₀∥₂. Our bound is geometric in nature and obeys a simple formula: the roles of λ, f, and x₀ are all captured by a single summary parameter δ(λ∂f(x₀)), termed the Gaussian squared distance to the scaled subdifferential. We connect our result to the literature and verify its validity through simulations.
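As a quick illustration of the quantities involved, the sketch below (not from the paper's own code; the dimensions, sparsity level, and the choice λ = 2 are illustrative assumptions) estimates δ(λ∂f(x₀)) = E[dist²(g, λ∂f(x₀))], g ~ N(0, I), by Monte Carlo for f = ℓ₁, and solves one instance of the estimator with cvxpy. For the ℓ₁-norm, the distance to the scaled subdifferential has a simple closed form: a residual on the support of x₀ and a soft threshold off it.

```python
# Minimal sketch, assuming numpy and cvxpy are installed. All parameter
# values below are illustrative choices, not values from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

n, m, k = 200, 120, 10        # ambient dimension, measurements, sparsity
lam = 2.0                     # regularization weight lambda

# Structured signal: k-sparse x0 with Gaussian entries on a random support.
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.standard_normal(k)

def sq_dist_to_scaled_subdiff(g, x0, lam):
    """Squared distance from g to lam * (subdifferential of ||.||_1 at x0).

    On the support, the subdifferential is the singleton {sign(x0_i)};
    off the support it is the interval [-1, 1], so the nearest point is
    a soft-thresholded coordinate.
    """
    on = x0 != 0
    d_on = g[on] - lam * np.sign(x0[on])
    d_off = np.maximum(np.abs(g[~on]) - lam, 0.0)
    return np.sum(d_on**2) + np.sum(d_off**2)

# delta(lam * subdiff f(x0)) = E[dist^2(g, lam * subdiff f(x0))], g ~ N(0, I_n).
trials = 2000
delta = np.mean([sq_dist_to_scaled_subdiff(rng.standard_normal(n), x0, lam)
                 for _ in range(trials)])
print(f"estimated delta(lam*subdiff f(x0)) = {delta:.1f}  (m = {m})")

# One realization of the estimator in the abstract: min ||y - Ax||_2 + lam*f(x).
A = rng.standard_normal((m, n))
z = 0.1 * rng.standard_normal(m)   # arbitrary noise, independent of A
y = A @ x0 + z

x = cp.Variable(n)
cp.Problem(cp.Minimize(cp.norm(y - A @ x, 2) + lam * cp.norm(x, 1))).solve()
print(f"estimation error ||xhat - x0||_2 = {np.linalg.norm(x.value - x0):.4f}")
```

With these settings the estimated δ(λ∂f(x₀)) comes out well below the number of measurements m, the regime in which the paper's error bound is informative.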
