Alois Kneip

The paper considers linear regression problems where the number of predictor variables is possibly larger than the sample size. The basic motivation of the study is to combine the points of view of model selection and functional regression by using a factor approach: it is assumed that the predictor vector can be decomposed into a sum of two independent random components reflecting common factors and specific variabilities of the explanatory variables. It is first shown that model selection procedures such as the Lasso or the Dantzig selector provide efficient estimators of a sparse vector of parameters. However, sparsity of coefficients is a restrictive assumption in this context: common factors may exert a significant influence on the response variable which cannot be captured by the specific effects of a small number of individual variables. We therefore propose to include principal components as additional explanatory variables in an augmented regression model. We give finite-sample inequalities for estimates of these components. It is then shown that model selection procedures can be used to estimate the parameters of the augmented model, and we state theoretical properties of the estimators.
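The augmented-model idea can be illustrated with a small numerical sketch. The following is a minimal NumPy example under an assumed toy data-generating process: predictors are simulated from a factor model, the leading principal components are estimated by SVD and appended to the design matrix, and a simple coordinate-descent Lasso is fit to the augmented regression. All parameter values, and the coordinate-descent routine itself, are illustrative choices, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 200, 2  # sample size, number of predictors (p > n), number of common factors

# Factor model for the predictors: X = F @ Lambda + Z
F = rng.normal(size=(n, k))          # common factors
Lambda = rng.normal(size=(k, p))     # factor loadings
Z = 0.5 * rng.normal(size=(n, p))    # specific variabilities of the explanatory variables
X = F @ Lambda + Z

# Response driven by the factors plus a sparse set of specific effects (toy choice)
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = F @ np.array([1.0, -2.0]) + Z @ beta + 0.1 * rng.normal(size=n)

# Estimate the leading principal components of X as proxies for the common factors
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
PCs = U[:, :k] * s[:k]               # estimated factor scores

# Augmented design matrix: principal components alongside the original predictors
A = np.hstack([PCs, Xc])

def lasso_cd(A, y, lam, n_iter=200):
    """Lasso via plain coordinate descent on 0.5*||y - A b||^2 + lam*||b||_1
    (illustrative only, not a production solver)."""
    n_obs, m = A.shape
    b = np.zeros(m)
    col_sq = (A ** 2).sum(axis=0)
    r = y - A @ b                    # current residual
    for _ in range(n_iter):
        for j in range(m):
            r += A[:, j] * b[j]      # partial residual excluding coordinate j
            rho = A[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= A[:, j] * b[j]
    return b

coef = lasso_cd(A, y - y.mean(), lam=5.0)
print("nonzero coefficients:", np.count_nonzero(np.abs(coef) > 1e-6))
```

With predictors of this kind, a Lasso fit on `X` alone must spread the factor effect over many correlated columns, whereas in the augmented design the first `k` coordinates of `coef` can absorb it, leaving a sparse specific part.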

**Keywords:** Principal Components; Functional Regression; Model Selection; Lasso

**Biography:** PhD in Mathematics, 1988. Professor of Statistics at the Université Catholique de Louvain, 1994-2000. Professor of Statistics at the Department of Economics, University of Mainz, 2000-2005. Professor of Statistics at the Department of Economics, University of Bonn, since 2005.