Statistica Sinica 35 (2025), 1605-1625

LEVERAGE CLASSIFIER: ANOTHER LOOK AT
SUPPORT VECTOR MACHINE

Yixin Han1, Jun Yu2, Nan Zhang3, Cheng Meng4, Ping Ma*5,
Wenxuan Zhong5 and Changliang Zou1

1Nankai University, 2Beijing Institute of Technology,
3Fudan University, 4Renmin University and 5University of Georgia

Abstract: The support vector machine (SVM) is a popular classifier known for its accuracy, flexibility, and robustness. However, its intensive computation has hindered its application to large-scale datasets. In this paper, we propose a new optimal leverage classifier based on the linear SVM under a nonseparable setting. Our classifier selects an informative subset of the training sample to reduce the data size, enabling efficient computation while maintaining high accuracy. We take a novel view of the SVM under a general subsampling framework and rigorously investigate its statistical properties. We propose a two-step subsampling procedure consisting of a pilot estimation of the optimal subsampling probabilities and a subsampling step that constructs the classifier. We develop a new Bahadur representation of the SVM coefficients and derive the unconditional asymptotic distribution and the optimal subsampling probabilities without conditioning on the full sample. Numerical results demonstrate that our classifiers outperform existing methods in terms of estimation, computation, and prediction.

Key words and phrases: Classification, large-scale dataset, martingale, optimal subsampling, support vector machine.
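The two-step procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the optimal subsampling probabilities derived in the paper are replaced here by a heuristic placeholder score that upweights points near the pilot decision boundary, and scikit-learn's `LinearSVC` stands in for the linear SVM solver.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Stand-in for a large-scale training set.
X, y = make_classification(n_samples=20000, n_features=10, random_state=0)

# Step 1 (pilot): fit a linear SVM on a small uniform subsample.
n_pilot = 500
pilot_idx = rng.choice(len(X), size=n_pilot, replace=False)
pilot = LinearSVC(dual=False).fit(X[pilot_idx], y[pilot_idx])

# Heuristic importance scores: points near the pilot margin are treated
# as more informative.  (Placeholder for the paper's derived optimal
# subsampling probabilities.)
margin = np.abs(pilot.decision_function(X))
scores = 1.0 / (1.0 + margin)
probs = scores / scores.sum()

# Step 2 (subsampling): draw an informative subsample and refit.
n_sub = 2000
sub_idx = rng.choice(len(X), size=n_sub, replace=True, p=probs)
clf = LinearSVC(dual=False).fit(X[sub_idx], y[sub_idx])

full_acc = LinearSVC(dual=False).fit(X, y).score(X, y)
sub_acc = clf.score(X, y)
print(f"full-sample accuracy: {full_acc:.3f}, subsample accuracy: {sub_acc:.3f}")
```

The subsample fit uses only a fraction of the data, which is the source of the computational savings; the paper's contribution is choosing the probabilities so that the resulting estimator loses as little statistical efficiency as possible.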
