

Statistica Sinica 19 (2009), 869-883





DIMENSION REDUCTION FOR CONDITIONAL

VARIANCE IN REGRESSIONS


Li-Ping Zhu and Li-Xing Zhu


East China Normal University and Hong Kong Baptist University


Abstract: Both the conditional mean and the conditional variance in regressions with high-dimensional predictors are important in modeling. In this paper, we investigate estimation of the conditional variance. To attack the curse of dimensionality, we introduce the notion of a central variance subspace (CVS) to capture the information contained in the conditional variance. To estimate the CVS, the effect of the conditional mean must be fully removed. To this end, a three-step procedure is proposed: estimate the central mean subspace (CMS) exhaustively by an outer product gradient (OPG) method; estimate the structural dimension of the CMS consistently by a modified Bayesian information criterion (BIC); and estimate the conditional mean by a kernel smoother. After removing the conditional mean from the response, we suggest a squared-residuals-based OPG method to identify the CVS. The asymptotic normality of the candidate matrices, and hence of the corresponding eigenvalues and eigenvectors, is obtained. Illustrative examples from simulation studies and a real dataset are presented to assess the finite-sample performance of the theoretical results.
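The three-step procedure described in the abstract can be illustrated with a minimal numerical sketch. The code below is an assumption-laden simplification, not the paper's exact estimator: it uses a product Gaussian kernel, a rule-of-thumb bandwidth, and treats the structural dimension as known (the modified-BIC selection step is omitted). The model generated here has a mean depending on the first predictor and a variance depending on the second.

```python
import numpy as np

def opg_directions(x, y, d, h=None):
    """Outer product gradient (OPG): average the outer products of local
    linear gradient estimates of E[y | x] and return d leading eigenvectors."""
    n, p = x.shape
    if h is None:
        h = 1.5 * n ** (-1.0 / (p + 4))  # rule-of-thumb bandwidth (assumption)
    M = np.zeros((p, p))
    for i in range(n):
        diff = x - x[i]                                   # local deviations (n, p)
        w = np.exp(-0.5 * (diff / h) ** 2).prod(axis=1)   # product Gaussian kernel
        # weighted local linear fit: y ~ a + b'(x - x_i); b estimates the gradient
        Z = np.hstack([np.ones((n, 1)), diff])
        W = Z * w[:, None]
        beta = np.linalg.lstsq(W.T @ Z, W.T @ y, rcond=None)[0]
        b = beta[1:]
        M += np.outer(b, b) / n
    _, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :d]                           # leading eigenvectors

rng = np.random.default_rng(0)
n, p = 400, 5
x = rng.standard_normal((n, p))
# mean depends on x1, conditional variance on x2:
y = x[:, 0] + 0.5 * np.abs(x[:, 1]) * rng.standard_normal(n)

# Step 1: CMS directions via OPG (Step 2, BIC dimension selection, is skipped;
# the structural dimension d = 1 is assumed known here).
B_mean = opg_directions(x, y, d=1)

# Step 3: Nadaraya-Watson kernel smoother of the mean on the reduced predictor,
# then squared-residual OPG to recover the CVS direction.
t = x @ B_mean
h = 1.06 * t.std() * n ** (-0.2)                          # Silverman-type bandwidth
K = np.exp(-0.5 * ((t - t.T) / h) ** 2)
m_hat = (K @ y) / K.sum(axis=1)                           # fitted conditional mean
r2 = (y - m_hat) ** 2                                     # squared residuals
B_var = opg_directions(x, r2, d=1)                        # CVS direction estimate
```

In this toy design the leading CMS direction should load mainly on the first coordinate and the CVS direction on the second, though the residual-based step is noticeably noisier at this sample size.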



Key words and phrases: Asymptotic normality, central variance subspace, dimension reduction, heteroscedasticity, outer product gradient.
