Null space conditions and thresholds for rank minimization

Minimizing the rank of a matrix subject to constraints is a challenging problem that arises in many applications in machine learning, control theory, and discrete geometry. This class of optimization problems, known as rank minimization, is NP-hard, and for most practical problems there are no efficient algorithms that yield exact solutions. A popular heuristic replaces the rank function with the nuclear norm of the decision variable, equal to the sum of its singular values, and has been shown to provide the optimal low-rank solution in a variety of scenarios. In this paper, we assess the practical performance of this heuristic for finding the minimum-rank matrix subject to linear equality constraints. We characterize properties of the null space of the linear operator defining the constraint set that are necessary and sufficient for the heuristic to succeed. We then analyze linear constraints sampled uniformly at random and obtain dimension-free bounds under which our null space properties hold almost surely as the matrix dimensions tend to infinity. Finally, we provide empirical evidence that these probabilistic bounds accurately predict the heuristic's performance in non-asymptotic scenarios.
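
As a concrete illustration of the heuristic assessed above, the sketch below solves the nuclear norm relaxation, minimizing the sum of the singular values of X subject to linear equality constraints <A_i, X> = b_i, for a randomly planted low-rank matrix. This is a minimal sketch under stated assumptions, not the authors' experimental setup: the CVXPY modeling layer, the SCS solver, the Gaussian measurement matrices, and the problem sizes are all choices made here for illustration.

```python
# Minimal sketch of the nuclear norm heuristic for rank minimization.
# Illustrative only: the measurement model, problem sizes, and the
# CVXPY/SCS toolchain are assumptions, not the paper's experiments.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r, m = 15, 2, 150  # matrix size, planted rank, number of linear measurements

# Planted rank-r matrix X0 and random Gaussian measurement matrices A_i.
X0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
As = [rng.standard_normal((n, n)) for _ in range(m)]
b = np.array([np.sum(Ai * X0) for Ai in As])  # b_i = <A_i, X0> (Frobenius inner product)

# Nuclear norm heuristic: minimize the sum of singular values of X
# subject to the same linear equality constraints that X0 satisfies.
X = cp.Variable((n, n))
constraints = [cp.sum(cp.multiply(Ai, X)) == b[i] for i, Ai in enumerate(As)]
prob = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
prob.solve(solver=cp.SCS)

err = np.linalg.norm(X.value - X0, "fro") / np.linalg.norm(X0, "fro")
print(f"relative recovery error: {err:.2e}")
```

When the number of measurements m comfortably exceeds the r(2n - r) degrees of freedom of a rank-r matrix, the printed relative error is typically small; shrinking m pushes the instance below the kind of recovery threshold studied in the paper, and the heuristic begins to fail.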
