

Statistica Sinica 16 (2006), 589-615





THE DOUBLY REGULARIZED SUPPORT VECTOR MACHINE


Li Wang, Ji Zhu and Hui Zou


University of Michigan and University of Minnesota


Abstract: The standard $L_2$-norm support vector machine (SVM) is a widely used tool for classification problems. The $L_1$-norm SVM is a variant of the standard $L_2$-norm SVM that constrains the $L_1$-norm of the fitted coefficients. Due to the nature of the $L_1$-norm, the $L_1$-norm SVM automatically selects variables, a property not shared by the standard $L_2$-norm SVM. It has been argued that the $L_1$-norm SVM may have an advantage over the $L_2$-norm SVM, especially in high-dimensional problems and when there are redundant noise variables. On the other hand, the $L_1$-norm SVM has two drawbacks: (1) when there are several highly correlated variables, it tends to pick only a few of them and remove the rest; (2) the number of selected variables is bounded above by the size of the training data. Gene microarray analysis is a typical setting where both drawbacks arise. In this paper, we propose a doubly regularized support vector machine (DrSVM). The DrSVM uses the elastic-net penalty, a mixture of the $L_2$-norm and $L_1$-norm penalties. By doing so, the DrSVM performs automatic variable selection in a way similar to the $L_1$-norm SVM, while also encouraging highly correlated variables to be selected (or removed) together. We illustrate how the DrSVM can be particularly useful when the number of variables is much larger than the size of the training data ($p \gg n$). We also develop efficient algorithms to compute the whole solution paths of the DrSVM.
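The abstract describes the DrSVM as a hinge-loss classifier with an elastic-net penalty. The display below is a sketch consistent with that description; the tuning-parameter names $\lambda_1$, $\lambda_2$ and the exact scaling of the $L_2$ term are notational assumptions, not quoted from the paper:
$$
\min_{\beta_0,\,\beta}\ \sum_{i=1}^{n}\bigl[1 - y_i(\beta_0 + \mathbf{x}_i^{T}\beta)\bigr]_{+}
\;+\; \frac{\lambda_2}{2}\,\|\beta\|_2^2 \;+\; \lambda_1\,\|\beta\|_1 .
$$
As a rough practical illustration (not the solution-path algorithm developed in the paper), an elastic-net-penalized hinge loss can be fitted with scikit-learn's SGDClassifier; the simulated data and the alpha/l1_ratio values below are illustrative assumptions:

    # Minimal sketch: elastic-net-penalized linear SVM via stochastic
    # gradient descent. This approximates the DrSVM objective; it is not
    # the path algorithm from the paper.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier

    # Simulate a p >> n setting: 50 observations, 500 variables,
    # of which only a handful are informative.
    X, y = make_classification(n_samples=50, n_features=500,
                               n_informative=5, n_redundant=10,
                               random_state=0)

    # loss="hinge" gives a linear SVM; penalty="elasticnet" mixes the
    # L1 and L2 norms, with l1_ratio controlling the mixture
    # (l1_ratio=1.0 would be a pure L1 penalty).
    clf = SGDClassifier(loss="hinge", penalty="elasticnet",
                        alpha=0.01, l1_ratio=0.5,
                        max_iter=5000, random_state=0)
    clf.fit(X, y)

    # The L1 component zeroes out coefficients (variable selection);
    # the L2 component encourages correlated variables to enter together.
    print("selected variables:", int(np.sum(clf.coef_ != 0)))

Here l1_ratio plays the role of the mixing proportion between the two penalties; sweeping it from 0 to 1 interpolates between ridge-style shrinkage and pure $L_1$ selection.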



Key words and phrases: Grouping effect, $p \gg n$, quadratic programming, SVM, variable selection.
