
Statistica Sinica 27 (2017), 1017-1036

RISK CONSISTENCY OF CROSS-VALIDATION
WITH LASSO-TYPE PROCEDURES
Darren Homrighausen and Daniel J. McDonald
Colorado State University and Indiana University

Abstract: The lasso and related sparsity-inducing algorithms have been the target of substantial theoretical and applied research. Correspondingly, many results are known about their behavior for a fixed or optimally chosen tuning parameter specified up to unknown constants. In practice, however, this oracle tuning parameter is inaccessible, so one must select the tuning parameter from the data. Common statistical practice is to use a variant of cross-validation for this task. However, little is known about the theoretical properties of predictions made with such data-dependent tuning methods. We consider the high-dimensional setting with random design wherein the number of predictors p grows with the number of observations n. Under typical assumptions on the data-generating process, similar to those in the literature, we recover oracle rates up to a log factor when choosing the tuning parameter with cross-validation. Under weaker conditions, when the true model is not necessarily linear, we show that the lasso remains risk consistent relative to its linear oracle. We also generalize these results to the group lasso and square-root lasso and investigate the predictive and model selection performance of cross-validation via simulation.
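The practice analyzed in the abstract can be illustrated with a minimal sketch: fit the lasso over a grid of tuning parameters and pick the one minimizing k-fold cross-validated prediction error. This is a generic textbook implementation (coordinate descent plus manual CV), not the authors' procedure; all function names, the data-generating setup, and the lambda grid below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator, the building block of lasso coordinate descent.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1.
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).mean(axis=0)   # (1/n) x_j' x_j for each column j
    r = y - X @ b                    # current residual
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual correlation for coordinate j.
            rho = X[:, j] @ r / n + col_sq[j] * b[j]
            b_new = soft_threshold(rho, lam) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)  # update residual incrementally
            b[j] = b_new
    return b

def cv_lasso(X, y, lams, k=5, seed=0):
    # Select lambda by minimizing k-fold cross-validated prediction error.
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    cv_err = []
    for lam in lams:
        err = 0.0
        for fold in folds:
            train = np.setdiff1d(idx, fold)
            b = lasso_cd(X[train], y[train], lam)
            err += np.mean((y[fold] - X[fold] @ b) ** 2)
        cv_err.append(err / k)
    return lams[int(np.argmin(cv_err))]

# High-dimensional example (p > n) with a sparse linear truth.
rng = np.random.default_rng(1)
n, p, s = 60, 100, 4
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + 0.5 * rng.standard_normal(n)

lams = np.array([0.01, 0.05, 0.1, 0.2, 0.5])
lam_cv = cv_lasso(X, y, lams)        # data-dependent tuning parameter
b_hat = lasso_cd(X, y, lam_cv)       # lasso fit at the CV-chosen lambda
```

The paper's question is precisely whether `lam_cv`, chosen this way rather than by an oracle, still yields risk-consistent predictions; the simulation studies it reports use far more elaborate designs than this toy setup.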

Key words and phrases: Linear oracle, model selection, persistence, regularization.
