Asymptotic Optimality for Sliced Inverse Regression
François Portier, Bernard Delyon
IRMAR, University of Rennes 1, Rennes, France

Regression analysis deals with the conditional distribution of the response given the predictors. When the number of predictors is large, convergence rates of estimators deteriorate significantly. A solution, proposed by the sufficient dimension reduction (SDR) framework introduced in [3], consists in replacing the vector of predictors by its orthogonal projection onto a subspace of the covariate space. This reduction is legitimate when the projection contains all the information available about the response. When it exists, the central subspace (CS) is the smallest such subspace. In this context, research in recent years has addressed new challenges: for example, recent works target an exhaustive estimation of the CS [2], or a better rate of convergence [1].

In this paper, we offer a complete framework for SDR called the test function methodology (TFM). Concerning CS estimation, most existing methods investigate the dependence between the predictors and the response through inverse regression, i.e. the study of the conditional distribution of the covariates given the response. Several methods have been proposed along these lines; TFM extends them by introducing a nonlinear transformation of the response. In particular, we generalize the sliced inverse regression estimator [3] by considering different families of functions, not just indicators. We also provide conditions that guarantee an exhaustive estimation of the CS. Moreover, within the indicator family, the best transformation is computed via the minimization of the asymptotic mean squared error derived from the distance between the CS and its estimate. This leads to a plug-in method which is evaluated through several simulations.
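To fix ideas, the classical sliced inverse regression estimator of [3], which corresponds to the indicator family above, can be sketched as follows. This is a minimal illustration of the standard SIR algorithm, not of the TFM itself; the function name, slice count, and target dimension `d` are illustrative choices.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, d=1):
    """Sketch of sliced inverse regression [3]: slice the response,
    average the standardized predictors within each slice, and take
    the leading eigenvectors of the between-slice covariance of the
    slice means, mapped back to the original predictor scale."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice the response into roughly equal-count slices
    # (the indicator-function family in the text)
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted between-slice covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading d eigenvectors of M span the estimated (standardized) CS;
    # multiply by Sigma^{-1/2} to return to the original coordinates
    w, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)
```

Replacing the slice indicators by other families of test functions yields the generalization discussed in the paper.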


[1] Cook, R. Dennis; Ni, Liqiang. Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J. Amer. Statist. Assoc. 100 (2005), no. 470, 410–428.

[2] Li, Bing; Wang, Shaoli. On directional regression for dimension reduction. J. Amer. Statist. Assoc. 102 (2007), no. 479, 997–1008.

[3] Li, Ker-Chau. Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86 (1991), no. 414, 316–342.

Keywords: sufficient dimension reduction; central subspace; inverse regression; slicing estimation

Biography: François Portier is a Ph.D. student at the University of Rennes 1. A member of the statistics department, his research interests are nonparametric and semiparametric statistics, and more precisely dimension reduction in regression.