1. Computational considerations. When the Lasso [11] was proposed, it was a computational challenge to solve the associated quadratic program

$$\min_{\beta}\ \tfrac12\|y - X\beta\|^2 \quad \text{s.t.}\quad \|\beta\|_1 \le t \qquad \text{Lasso}(t)$$

given just a single parameter $t$, where $X$ is $n \times p$. Two active-set methods were described in [11], with some concern about efficiency if $p$ were large. Later, when basis pursuit de-noising (BPDN) was introduced [2], the intention was to deal with very large $p$ and to allow $X$ to be a sparse matrix or a fast operator. A primal–dual interior method was used to solve the associated quadratic program, but it remained a challenge to deal with even a single parameter. The authors' new Dantzig Selector (DS) also assumes a specific parameter. It is helpful to state the BPDN and DS models together:

$$\min_{\beta,r}\ \lambda\|\beta\|_1 + \tfrac12\|r\|^2 \quad \text{s.t.}\quad r = y - X\beta, \qquad \text{BPDN}(\lambda)$$

$$\min_{\beta,r}\ \|\beta\|_1 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda,\ \ r = y - X\beta. \qquad \text{DS}(\lambda)$$

For reference purposes we also state the corresponding dual problems:

$$\min_{r}\ -y^T r + \tfrac12\|r\|^2 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda, \qquad \text{BPdual}(\lambda)$$

$$\min_{r,z}\ -y^T r + \lambda\|z\|_1 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda,\ \ r = Xz. \qquad \text{DSdual}(\lambda)$$
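To make the relationship between these formulations concrete, here is a minimal numpy sketch (not the active-set or interior-point solvers discussed above; the function name `bpdn_cd` is ours) that solves the BPDN($\lambda$) objective by cyclic coordinate descent. At optimality the residual $r = y - X\beta$ satisfies $\|X^T r\|_\infty \le \lambda$, which is exactly the constraint appearing in DS($\lambda$) and BPdual($\lambda$).

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator: sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bpdn_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for
        min_beta  0.5 * ||y - X beta||^2 + lam * ||beta||_1,
    i.e., the BPDN(lambda) objective with the constraint r = y - X beta
    substituted in."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)   # ||X_j||^2 for each column j
    r = y.copy()                    # residual y - X beta (beta = 0 initially)
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0:
                continue
            r += X[:, j] * beta[j]  # partial residual: drop column j's term
            beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * beta[j]  # restore residual with updated beta_j
    return beta
```

A quick check of the optimality condition: after running `bpdn_cd`, `np.max(np.abs(X.T @ (y - X @ beta)))` should be at most `lam` (up to numerical tolerance), illustrating how the DS constraint arises as the stationarity condition of BPDN.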
[1] Y. Tsaig et al., "Fast Solution of $\ell_1$-Norm Minimization Problems When the Solution May Be Sparse," IEEE Transactions on Information Theory, 2008.
[3] D. Madigan et al., "[Least Angle Regression]: Discussion," 2004.
[4] M. R. Osborne et al., "A New Approach to Variable Selection in Least Squares Problems," 2000.
[5] M. R. Osborne et al., "On the LASSO and Its Dual," 2000.
[6] M. A. Saunders et al., "Atomic Decomposition by Basis Pursuit," SIAM J. Sci. Comput., 1998.
[7] R. Tibshirani, "Regression Shrinkage and Selection via the Lasso," 1996.