

Statistica Sinica 18(2008), 1603-1618





ADAPTIVE LASSO FOR SPARSE HIGH-DIMENSIONAL REGRESSION MODELS


Jian Huang, Shuangge Ma and Cun-Hui Zhang


University of Iowa, Yale University and Rutgers University


Abstract: We study the asymptotic properties of adaptive Lasso estimators in sparse, high-dimensional, linear regression models when the number of covariates may increase with the sample size. We consider variable selection using the adaptive Lasso, in which the $L_1$ norms in the penalty are re-weighted by data-dependent weights. We show that, if a reasonable initial estimator is available, then under appropriate conditions the adaptive Lasso correctly selects the covariates with nonzero coefficients with probability converging to one, and the estimators of the nonzero coefficients have the same asymptotic distribution they would have if the zero coefficients were known in advance. Thus, the adaptive Lasso has an oracle property in the sense of Fan and Li (2001) and Fan and Peng (2004). In addition, under a partial orthogonality condition, in which the covariates with zero coefficients are only weakly correlated with the covariates with nonzero coefficients, marginal regression can be used to obtain the initial estimator. With this initial estimator, the adaptive Lasso has the oracle property even when the number of covariates is much larger than the sample size.
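The two-stage procedure described in the abstract can be illustrated with a minimal sketch: marginal (univariate) regressions supply the initial estimates, their inverted magnitudes become the penalty weights, and the weighted Lasso is solved via the standard column-rescaling trick. This is an illustrative implementation under assumed defaults (the weight exponent `gamma`, the penalty level `alpha`, and the small constant guarding against division by zero are choices of this sketch, not of the paper), not the authors' code.

```python
import numpy as np
from sklearn.linear_model import Lasso


def adaptive_lasso(X, y, alpha=0.1, gamma=1.0):
    """Two-stage adaptive Lasso sketch.

    Stage 1: marginal regression of y on each column of X gives an
    initial estimator (valid under the partial orthogonality condition).
    Stage 2: a Lasso with data-dependent weights w_j = 1/|beta_init_j|^gamma,
    implemented by rescaling each column X_j by 1/w_j and un-scaling
    the fitted coefficients afterwards.
    """
    # Stage 1: univariate least-squares slope for each covariate
    beta_init = (X * y[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
    # Data-dependent weights; 1e-8 avoids division by zero (a choice of
    # this sketch, not from the paper)
    w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)
    # Stage 2: weighted Lasso via the column-rescaling trick
    model = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    model.fit(X / w, y)
    return model.coef_ / w
```

With a strong sparse signal, the weights shrink the penalty on the true covariates and inflate it on the noise covariates, which is what drives the selection-consistency and oracle results stated in the abstract.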



Key words and phrases: Asymptotic normality, high-dimensional data, penalized regression, variable selection, oracle property, zero-consistency.
