

Statistica Sinica 21 (2011), 707-730





ASYMPTOTIC PROPERTIES OF SUFFICIENT DIMENSION REDUCTION WITH A DIVERGING NUMBER OF PREDICTORS


Yichao Wu and Lexin Li


North Carolina State University


Abstract: We investigate asymptotic properties of a family of sufficient dimension reduction estimators when the number of predictors $p$ diverges to infinity with the sample size. We adopt a general formulation of dimension reduction estimation through least squares regression of a set of transformations of the response. This formulation allows us to establish the consistency of the estimated reduction projection. We then introduce the SCAD max penalty, along with a difference convex optimization algorithm, to achieve variable selection. We show that the penalized estimator selects all truly relevant predictors and excludes all irrelevant ones with probability approaching one, while maintaining consistent estimation of the reduction basis for the relevant predictors. Our work differs from most model-based selection methods in that it does not require a traditional model, and it extends existing sufficient dimension reduction and model-free variable selection approaches from the fixed-$p$ setting to a diverging $p$.
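
To fix ideas, the display below is a minimal sketch of how a SCAD-type max penalty typically enters such a least squares formulation of dimension reduction; the notation ($f_1,\dots,f_H$ for the response transformations, $B=(b_{jk})\in\mathbb{R}^{p\times d}$ for the reduction basis, and coefficients $\theta_{h0},\theta_h$) is introduced here only for illustration and need not match the paper's exact objective:
$$
\min_{B,\,\{\theta_{h0},\theta_h\}}\;
\sum_{h=1}^{H}\sum_{i=1}^{n}
\bigl\{f_h(y_i)-\theta_{h0}-\theta_h^{\top}B^{\top}x_i\bigr\}^2
\;+\;n\sum_{j=1}^{p}p_{\lambda}\Bigl(\max_{1\le k\le d}|b_{jk}|\Bigr),
$$
where $p_{\lambda}$ is the SCAD penalty of Fan and Li (2001), defined through $p_{\lambda}(0)=0$ and the derivative
$$
p_{\lambda}'(t)=\lambda\Bigl\{I(t\le\lambda)+\frac{(a\lambda-t)_{+}}{(a-1)\lambda}\,I(t>\lambda)\Bigr\},
\qquad t\ge 0,\; a>2.
$$
Penalizing the row-wise maximum $\max_{k}|b_{jk}|$ shrinks entire rows of $B$ to zero, so an irrelevant predictor $x_j$ is removed from all reduction directions simultaneously.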



Key words and phrases: Central subspace, diverging parameters, SCAD, sliced inverse regression.
