Two Conditions for Equivalence of 0-Norm Solution and 1-Norm Solution in Sparse Representation

In sparse representation, two important sparse solutions, the 0-norm solution and the 1-norm solution, have received much attention. The 0-norm solution is the sparsest, but it is difficult to obtain. Although the 1-norm solution may not be the sparsest, it can be obtained easily by linear programming. In many cases, the 0-norm solution can be found by computing the 1-norm solution, and the equivalence of the two sparse solutions has been discussed extensively. This paper analyzes two conditions for this equivalence. The first condition is necessary and sufficient but difficult to verify. The second condition is necessary but not sufficient, yet it is easy to verify. We analyze the second condition within a stochastic framework and propose a variant of it. We then prove that the equivalence of the two sparse solutions holds with high probability under this variant. Furthermore, in the limiting case where the 0-norm solution is extremely sparse, the second condition is also sufficient with probability 1.
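The linear programming route mentioned above can be sketched concretely. The following is a minimal illustration (not taken from the paper) of obtaining the 1-norm solution of an underdetermined system Ax = b by basis pursuit: split x = u − v with u, v ≥ 0, so that ‖x‖₁ = Σ(u + v), and solve the resulting LP. The matrix dimensions, sparsity level, and use of SciPy's `linprog` are illustrative assumptions.

```python
# Sketch: 1-norm minimization (basis pursuit) as a linear program.
# Assumed setup: random Gaussian A (20 x 50) and a 3-sparse ground truth x0;
# under such conditions the 1-norm solution typically coincides with the
# sparsest (0-norm) solution, as the paper's equivalence analysis concerns.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 50, 3                      # underdetermined system, k-sparse signal
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.standard_normal(k)
b = A @ x0

# LP variables z = [u; v], both nonnegative; objective 1^T z = ||x||_1;
# equality constraint A u - A v = b reproduces A x = b.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_l1 = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_l1 - x0))
```

For a sufficiently sparse x0, the recovered `x_l1` matches the sparse ground truth to numerical precision, illustrating the equivalence of the two sparse solutions in the favorable regime.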
