The Lasso revisited: entropy bounds and dual certificates
The Lasso is least-squares estimation with an ℓ1-penalty on the coefficients. In this talk we consider the case where the design matrix is structured and its columns are highly correlated. For this situation we consider the theory for non-adaptive (minimax) rates and adaptive (oracle) rates.
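For reference, the Lasso estimator described above can be written in standard notation (the symbols Y, X, β, and λ below are conventional and not taken from the abstract):

```latex
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^p}
\left\{ \, \|Y - X\beta\|_2^2 \;+\; 2\lambda \|\beta\|_1 \, \right\},
```

where Y is the response vector, X the (structured, possibly highly correlated) design matrix, and λ > 0 the tuning parameter.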
The ℓ1-penalty leads to studying the entropy of the signed convex hull of the columns of the design matrix. We will present a bound for this entropy based on projection theory. This generalizes a result of Ball and Pajor, which bounds the entropy of a convex hull in terms of the covering number of its set of extreme points. The entropy bound is applied to obtain non-adaptive rates.
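In standard notation (again, the symbols are conventional rather than from the abstract), the signed (absolute) convex hull of the columns X_1, …, X_p is the image of the ℓ1-unit ball under the design:

```latex
\mathrm{absconv}(X_1, \dots, X_p)
\;:=\; \Bigl\{ \textstyle\sum_{j=1}^{p} \beta_j X_j \;:\; \|\beta\|_1 \le 1 \Bigr\}
\;=\; \mathrm{conv}\bigl\{ \pm X_1, \dots, \pm X_p \bigr\}.
```

The entropy (log-covering number) of this set governs the non-adaptive rates mentioned above.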
For the adaptive rates, we combine the approach of Dalalyan et al. with dual certificates as given in Candès and Plan. This will be applied to the trend filtering problem, to higher-dimensional extensions, and to total variation on graphs.
Joint work with Francesco Ortelli.
K. Ball and A. Pajor. The entropy of convex bodies with "few" extreme points. London Math. Soc. Lecture Note Series, 158:25-32, 1990.
Emmanuel J. Candès and Yaniv Plan. A probabilistic and RIPless theory of compressed sensing. IEEE Transactions on Information Theory, 57(11):7235-7254, 2011.
Arnak Dalalyan, Mohamed Hebiri, and Johannes Lederer. On the prediction performance of the Lasso. Bernoulli, 23(1):552-581, 2017.
Joint event of SFB 1294 'Data Assimilation' and the UFS Data-centric Sciences.
***This year's Kálmán Lecture will be held online due to the coronavirus pandemic. We will send out an invitation for a Zoom meeting via our email list. If you are not on the mailing list already, please send an email in advance to Hendrik.Gessner[at]uni-potsdam.de.***