

Statistica Sinica 21 (2011), 1473-1513
doi:10.5705/ss.2010.081





A STEPWISE REGRESSION METHOD AND CONSISTENT MODEL SELECTION
FOR HIGH-DIMENSIONAL SPARSE LINEAR MODELS


Ching-Kang Ing and Tze Leung Lai


Academia Sinica and Stanford University


Abstract: We introduce a fast stepwise regression method, called the orthogonal greedy algorithm (OGA), that sequentially selects input variables to enter a $p$-dimensional linear regression model (with $p\gg n$, the sample size) so that the variable selected at each step minimizes the residual sum of squares. We derive the convergence rate of OGA and develop a consistent model selection procedure along the OGA path that can adjust for potential spuriousness of the greedily chosen regressors among a large number of candidate variables. The resultant regression estimate is shown to have the oracle property of being equivalent to least squares regression on an asymptotically minimal set of relevant regressors under a strong sparsity condition.
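The greedy selection step described in the abstract can be sketched in a few lines of NumPy. The sketch below is an illustration only, not the authors' implementation: at each step it picks the candidate column whose (normalized) inner product with the current residual is largest, which is the column that most reduces the residual sum of squares, and then refits least squares on all variables selected so far (the "orthogonal" refit that distinguishes OGA from a pure greedy update). The function name `oga` and the stopping rule (a fixed number of steps `m`) are assumptions for the example; the paper's model selection along the path uses a high-dimensional information criterion not shown here.

```python
import numpy as np

def oga(X, y, m):
    """Illustrative orthogonal greedy algorithm (not the paper's code).

    At each of m steps, select the column of X most correlated with the
    current residual, then refit least squares on all selected columns.
    Returns the selected column indices and the final residual vector.
    """
    selected = []
    residual = y.copy()
    for _ in range(m):
        norms = np.linalg.norm(X, axis=0)
        # |X_j' r| / ||X_j|| is maximized by the column whose inclusion
        # minimizes the new residual sum of squares.
        scores = np.abs(X.T @ residual) / np.where(norms > 0, norms, 1.0)
        scores[selected] = -np.inf  # never re-select a variable
        j = int(np.argmax(scores))
        selected.append(j)
        # "Orthogonal" step: full least-squares refit on the selected set.
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
    return selected, residual
```

With an orthonormal design the behavior is easy to check by hand: if $y = 3e_1 + e_2$, the first step picks column 1, the refit leaves residual $e_2$, and the second step picks column 2, driving the residual to zero.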



Key words and phrases: Componentwise linear regression, greedy algorithm, high-dimensional information criterion, Lasso, oracle property and inequalities, sparsity.
