On Extensions of LARS by Information Geometry: Convex Objectives and ℓp-Norm

This paper addresses extensions of the Least Angle Regression (LARS) algorithm in two respects: (i) from quadratic to more general convex objectives, and (ii) from the ℓ1-norm to the ℓp-norm for p < 1. The equiangular vector, which is the key ingredient of LARS, is reproduced in connection with the Riemannian metric induced by the objective function, thereby making both extensions feasible. It is shown, in the case of p < 1, that two types of trajectory, the c-trajectory and the -trajectory, need to be distinguished, by revealing the discontinuity of the latter.
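Since the equiangular vector is central to the construction, a minimal sketch of its classical computation (Efron, Hastie, Johnstone, and Tibshirani, 2004) may help fix ideas. The optional metric argument H is an assumption added here for illustration: it stands in for the Hessian-induced Riemannian metric the abstract alludes to, with H = I recovering ordinary Euclidean LARS. Sign handling of the active columns is omitted, and this is a sketch, not the authors' implementation.

    import numpy as np

    def equiangular_direction(X, active, H=None):
        """Equiangular vector of LARS (Efron et al., 2004) for the active columns.

        X      : (n, d) design matrix (signs of active columns assumed absorbed)
        active : list of indices of the active columns
        H      : (n, n) symmetric positive-definite metric; H=None means H=I,
                 i.e., the standard Euclidean case (the metric generalization
                 is a hypothetical stand-in for the paper's construction)
        """
        Xa = X[:, active]
        if H is None:
            H = np.eye(X.shape[0])
        G = Xa.T @ H @ Xa                # Gram matrix of active columns in metric H
        ones = np.ones(len(active))
        w = np.linalg.solve(G, ones)     # G^{-1} 1
        A = 1.0 / np.sqrt(ones @ w)      # normalization (1^T G^{-1} 1)^{-1/2}
        return Xa @ (A * w)              # unit-norm u with Xa.T @ H @ u = A * 1

    # Example: every active column makes the same angle with u.
    X = np.random.default_rng(0).standard_normal((20, 5))
    u = equiangular_direction(X, [0, 2, 4])
    print(X[:, [0, 2, 4]].T @ u)         # three equal values (up to rounding)

By construction u has unit norm in the metric H and equal inner products with all active columns, which is exactly the equiangularity property that the LARS step direction exploits.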
