Shrinkage Estimation of a Univariate Normal Mean
Hannes Leeb¹, Adityanand Guntuboyina²
¹Department of Statistics, University of Vienna, Vienna, Austria; ²Department of Statistics, Yale University, New Haven, CT, United States

It is well known that there is no Stein phenomenon in dimensions one and two; cf. Stein (1956). In a linear regression setting, we study estimation of the mean of a new response given the corresponding new explanatory variables and a training sample. When the new explanatory variables are held fixed, an estimator based on the James-Stein estimator has poor worst-case risk compared to the maximum likelihood estimator. But on average with respect to the new explanatory variables, this univariate James-Stein-based estimator dominates the maximum likelihood estimator, irrespective of the unknown parameters. We give an explicit finite-sample analysis of this phenomenon and find, in particular, that shrinkage estimation has attractive properties even when the goal is estimation of a univariate normal mean.
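The construction can be illustrated with a small simulation. The following Python sketch is not the authors' exact analysis: it assumes a known error variance, applies the positive-part James-Stein rule to the least-squares coefficients in the design metric, shrinks toward zero, and draws the new explanatory variables by resampling the training rows; the dimensions and coefficient values are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative settings (hypothetical, not the authors' exact construction).
    n, p, sigma = 50, 5, 1.0                      # training size, regressors, known noise s.d.
    beta = np.array([0.3, -0.2, 0.1, 0.0, 0.15])  # unknown coefficient vector
    n_rep, n_new = 5000, 200                      # Monte Carlo repetitions, new covariate draws

    X = rng.normal(size=(n, p))    # training design, held fixed across repetitions
    A = X.T @ X
    L = np.linalg.cholesky(A)      # A = L L', so z = L' beta_ols has covariance sigma^2 I

    risk_mle, risk_js = 0.0, 0.0
    for _ in range(n_rep):
        y = X @ beta + sigma * rng.normal(size=n)
        beta_ols = np.linalg.solve(A, X.T @ y)    # maximum likelihood / least squares

        # Positive-part James-Stein shrinkage toward zero, applied in the design metric.
        z = L.T @ beta_ols
        shrink = max(0.0, 1.0 - (p - 2) * sigma**2 / np.sum(z**2))
        beta_js = np.linalg.solve(L.T, shrink * z)

        # New explanatory variables resampled from the training rows, so that averaging
        # over them reproduces the design-metric loss; the target is x0' beta.
        x_new = X[rng.integers(0, n, size=n_new)]
        risk_mle += np.mean((x_new @ (beta_ols - beta)) ** 2)
        risk_js += np.mean((x_new @ (beta_js - beta)) ** 2)

    print("average prediction risk, MLE-based predictor:", risk_mle / n_rep)
    print("average prediction risk, JS-based predictor :", risk_js / n_rep)

Under these assumptions the James-Stein-based predictor has smaller average risk for every value of the coefficient vector, with the gain shrinking as the coefficients grow; exhibiting the poor worst-case behaviour at a fixed value of the new explanatory variables would require varying that fixed value and the parameters, which this sketch omits.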

Keywords: Linear inference, regression; Ridge regression, shrinkage estimators; Parametric inference

Biography: Hannes Leeb obtained his Ph.D. at the University of Salzburg in 1997. After a postdoctoral position at the University of Vienna, he moved to Yale University, first as a visitor, then as assistant professor, and later as associate professor of statistics. Last year, he returned to the University of Vienna as full professor of statistics.