An Interior-Point Method for Large-Scale $\ell_1$-Regularized Least Squares

Recently, considerable attention has been devoted to regularization-based methods for sparse signal reconstruction (e.g., basis pursuit denoising and compressed sensing) and feature selection (e.g., the Lasso algorithm) in signal processing, statistics, and related fields. These problems can be cast as $\ell_1$-regularized least-squares programs (LSPs), which can be reformulated as convex quadratic programs and then solved by standard methods such as interior-point methods, at least for small and medium-sized problems. In this paper, we describe a specialized interior-point method for solving large-scale $\ell_1$-regularized LSPs that uses the preconditioned conjugate gradients algorithm to compute the search direction. The interior-point method can solve large sparse problems, with a million variables and observations, in a few tens of minutes on a PC. It can also efficiently solve large dense problems, such as those arising in sparse signal recovery with orthogonal transforms, by exploiting fast algorithms for these transforms. The method is illustrated on a magnetic resonance imaging data set.
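To make the abstract's claim concrete, the $\ell_1$-regularized LSP and its standard reformulation as a convex quadratic program can be written as follows (the symbols $A \in \mathbf{R}^{m \times n}$, $y \in \mathbf{R}^m$, $x$, $u$, and $\lambda > 0$ are our notation, not taken from the abstract):

\[
  \operatorname*{minimize}_{x \in \mathbf{R}^{n}} \; \|Ax - y\|_2^2 + \lambda \|x\|_1 .
\]

Bounding each $|x_i|$ by a new variable $u_i$ gives an equivalent convex quadratic program in the $2n$ variables $(x, u)$:

\[
  \begin{aligned}
    \operatorname*{minimize}_{x,\,u \in \mathbf{R}^{n}} \quad & \|Ax - y\|_2^2 + \lambda \textstyle\sum_{i=1}^{n} u_i \\
    \text{subject to} \quad & -u_i \le x_i \le u_i, \quad i = 1, \ldots, n,
  \end{aligned}
\]

which standard interior-point solvers can handle for small and medium-sized instances, as the abstract notes.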

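The specialized method the abstract describes is a truncated Newton interior-point method: the bound constraints are handled by a log barrier, and each Newton system is solved only approximately by preconditioned conjugate gradients (PCG). The sketch below illustrates one such search-direction computation in Python; it is a minimal illustration under our own assumptions (a dense NumPy $A$, a simple diagonal preconditioner, a fixed CG iteration cap, and the hypothetical function name newton_direction), not the authors' implementation.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def newton_direction(A, y, lam, x, u, t, cg_iters=50):
    """Truncated-Newton search direction for the log-barrier subproblem

        minimize  t*(||A x - y||_2^2 + lam*sum(u))
                  - sum(log(u + x)) - sum(log(u - x))

    of the l1-regularized LSP, with the du block eliminated and the
    reduced n-by-n system solved by diagonally preconditioned CG."""
    n = x.size
    f1, f2 = u + x, u - x              # slacks of -u <= x <= u (must stay > 0)
    r = A @ x - y

    # Gradient of the barrier objective with respect to x and u.
    g_x = 2.0 * t * (A.T @ r) + 1.0 / f2 - 1.0 / f1
    g_u = t * lam - 1.0 / f1 - 1.0 / f2

    # Diagonal blocks of the barrier Hessian.
    d1 = 1.0 / f1**2 + 1.0 / f2**2     # d^2/dx^2 and d^2/du^2
    d2 = 1.0 / f1**2 - 1.0 / f2**2     # d^2/(dx du)

    # Eliminate du: (2t A^T A + diag(d1 - d2^2/d1)) dx = -(g_x - (d2/d1) g_u).
    dschur = d1 - d2**2 / d1
    rhs = -(g_x - (d2 / d1) * g_u)
    H = LinearOperator((n, n),
                       matvec=lambda v: 2.0 * t * (A.T @ (A @ v)) + dschur * v)

    # Diagonal preconditioner: 2t * diag(A^T A) + Schur-complement diagonal.
    # (Assumes a dense ndarray A; with an implicit operator, use an estimate.)
    pdiag = 2.0 * t * np.einsum("ij,ij->j", A, A) + dschur
    M = LinearOperator((n, n), matvec=lambda v: v / pdiag)

    dx, _ = cg(H, rhs, M=M, maxiter=cg_iters)   # truncated: few PCG iterations
    du = -(g_u + d2 * dx) / d1                  # recover the eliminated block
    return dx, du

# Tiny smoke test on random data, starting from the strictly feasible
# point x = 0, u = 1 (so u + x > 0 and u - x > 0).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500))
y = rng.standard_normal(200)
dx, du = newton_direction(A, y, lam=0.1, x=np.zeros(500), u=np.ones(500), t=1.0)
```

A complete solver would wrap this step in a backtracking line search that keeps the slacks positive and an outer loop that increases the barrier parameter t; the paper's point is that PCG, together with problem structure (sparsity or fast transforms), makes each such step cheap enough to scale to a million variables.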