Discussion: The Dantzig selector: Statistical estimation when p is much larger than n

1. Computational considerations. When the Lasso [11] was proposed, it was a computational challenge to solve the associated quadratic program
\[
\min_{\beta}\ \tfrac{1}{2}\|y - X\beta\|_2^2 \quad \text{s.t.}\quad \|\beta\|_1 \le t \tag{Lasso($t$)}
\]
given just a single parameter $t$. Two active-set methods were described in [11], with some concern about efficiency if $p$ were large, where $X$ is $n \times p$. Later, when basis pursuit de-noising (BPDN) was introduced [2], the intention was to deal with very large $p$ and to allow $X$ to be a sparse matrix or a fast operator. A primal–dual interior method was used to solve the associated quadratic program, but it remained a challenge to deal with a single parameter. The authors' new Dantzig selector (DS) also assumes a specific parameter. It is helpful to state the BPDN and DS models together:
\[
\min_{\beta,r}\ \lambda\|\beta\|_1 + \tfrac{1}{2}\|r\|_2^2 \quad \text{s.t.}\quad r = y - X\beta, \tag{BPDN($\lambda$)}
\]
\[
\min_{\beta,r}\ \|\beta\|_1 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda,\ \ r = y - X\beta. \tag{DS($\lambda$)}
\]
For reference purposes we also state the corresponding dual problems:
\[
\min_{r}\ -y^T r + \tfrac{1}{2}\|r\|_2^2 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda, \tag{BPdual($\lambda$)}
\]
\[
\min_{r,z}\ -y^T r + \lambda\|z\|_1 \quad \text{s.t.}\quad \|X^T r\|_\infty \le \lambda,\ \ r = Xz. \tag{DSdual($\lambda$)}
\]
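To make the DS($\lambda$) formulation concrete, the following is a minimal sketch (not part of the original discussion) that recasts it as a linear program via the standard variable split $\beta = \beta^+ - \beta^-$, $\beta^\pm \ge 0$, and hands it to SciPy's generic LP solver. The function name dantzig_selector and the synthetic data are illustrative assumptions, not the authors' code or solver.

\begin{verbatim}
# Sketch: Dantzig selector DS(lambda) as a linear program.
# min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= lam
# Splitting beta = beta_plus - beta_minus (both >= 0) makes the
# objective linear and the constraint a pair of one-sided inequalities.
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    n, p = X.shape
    A = X.T @ X                      # p x p Gram matrix
    b = X.T @ y                      # p-vector of correlations
    # A beta <= b + lam*1   and   -A beta <= lam*1 - b,
    # written in the split variables [beta_plus; beta_minus].
    A_ub = np.block([[ A, -A],
                     [-A,  A]])
    b_ub = np.concatenate([b + lam, lam - b])
    c = np.ones(2 * p)               # sum(beta_plus + beta_minus) = ||beta||_1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    assert res.success, res.message
    return res.x[:p] - res.x[p:]

# Illustrative check on a sparse signal with n < p.
rng = np.random.default_rng(0)
n, p = 40, 100
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.05 * rng.standard_normal(n)
print(np.round(dantzig_selector(X, y, lam=1.0)[:5], 2))
\end{verbatim}

The same splitting device applies to Lasso($t$) and BPDN($\lambda$), which become quadratic rather than linear programs; the choice among active-set, interior-point, or such generic reformulations is precisely the computational question this section raises.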