Recovery threshold for optimal weight ℓ1 minimization

We consider the problem of recovering a sparse signal from underdetermined measurements when prior information about the sparsity structure of the signal is available. In particular, we assume that the entries of the signal can be partitioned into two known sets S1 and S2 whose relative sparsities differ. In this situation it is advantageous to replace classical ℓ1 minimization with weighted ℓ1 minimization, in which the sparser set is given the larger weight. In this paper we give a simple closed-form expression for the minimum number of measurements required for successful recovery when the optimal weights are chosen. The formula shows that this number is upper bounded by the sum of the minimum numbers of measurements that would be needed had the S1 and S2 components of the signal been measured separately; in fact, our results indicate that this bound is tight, so equality holds. Our proof technique uses the "escape through a mesh" framework and connects the recovery threshold to the minimax MSE of an associated basis pursuit denoising problem.
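
For concreteness, the weighted ℓ1 program in question is: minimize w1·||x_S1||_1 + w2·||x_S2||_1 subject to Ax = y, with the larger weight on the sparser set. The sketch below solves this program with CVXPY on a random Gaussian instance; it is a minimal illustration, not the paper's experimental setup, and the dimensions, sparsities, and the weight value are arbitrary assumptions rather than the optimal weights derived in the paper.

```python
# Minimal sketch of weighted l1 recovery with two known index sets.
# All dimensions, sparsities, and weights below are illustrative
# assumptions; they are not values taken from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

n, m = 200, 120                      # signal length, number of measurements
S1 = np.arange(0, 100)               # denser set in this example
S2 = np.arange(100, 200)             # sparser set in this example

# Ground-truth signal with different relative sparsities on S1 and S2.
x_true = np.zeros(n)
x_true[rng.choice(S1, size=30, replace=False)] = rng.standard_normal(30)
x_true[rng.choice(S2, size=5, replace=False)] = rng.standard_normal(5)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_true

# Weighted l1: larger weight on the sparser set S2.
# The weight 3.0 is an arbitrary illustration, not the optimal weight.
w = np.ones(n)
w[S2] = 3.0

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, x))),
                  [A @ x == y])
prob.solve()

print("relative recovery error:",
      np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```

Sweeping the number of measurements m and the weight on S2 in such an experiment traces out empirically the recovery threshold that the paper characterizes in closed form.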
