Statistica Sinica 32 (2022), 1701-1721

METRIC LEARNING VIA CROSS-VALIDATION

Linlin Dai, Kani Chen, Gang Li and Yuanyuan Lin

Southwestern University of Finance and Economics, Hong Kong University of Science and Technology,
University of California, Los Angeles and The Chinese University of Hong Kong

Abstract: In this paper, we propose a cross-validation metric learning approach to learn a distance metric for dimension reduction in the multiple-index model. We minimize a leave-one-out cross-validation-type loss function, where the unknown link function is approximated by a metric-based kernel-smoothing function. To the best of our knowledge, we are the first to reduce the dimensionality of multiple-index models in a framework of metric learning. The resulting metric contains crucial information on both the central mean subspace and the optimal kernel-smoothing bandwidth. Under weak assumptions on the design of the predictors, we establish asymptotic theories for the consistency and convergence rate of the estimated directions, as well as the optimal rate of the bandwidth. Furthermore, we develop a novel estimation procedure to determine the structural dimension of the central mean subspace. The proposed approach is relatively easy to implement numerically by employing fast gradient-based algorithms. Various empirical studies illustrate its advantages over other existing methods.
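
The following Python sketch is only an illustration of the idea stated in the abstract, not the authors' algorithm: it learns a metric M = A'A by minimizing a leave-one-out cross-validation loss in which the unknown link function is replaced by a metric-based kernel smoother. The Gaussian kernel, the parametrization M = A'A, the toy two-index model, and the L-BFGS-B optimizer are all assumptions made for the example.

```python
# Minimal sketch (not the paper's implementation) of cross-validation metric learning:
# minimize a leave-one-out CV loss where the link function is approximated by a
# metric-based kernel smoother. Kernel, parametrization, and optimizer are assumptions.
import numpy as np
from scipy.optimize import minimize

def loocv_loss(a_flat, X, y, d):
    """Leave-one-out CV loss for the metric M = A'A, with A of size d x p."""
    n, p = X.shape
    A = a_flat.reshape(d, p)
    Z = X @ A.T                                               # projected predictors, n x d
    D2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)   # squared metric distances
    W = np.exp(-0.5 * D2)                                     # Gaussian kernel weights
    np.fill_diagonal(W, 0.0)                                  # leave-one-out: drop self-weight
    y_hat = (W @ y) / np.maximum(W.sum(axis=1), 1e-12)        # kernel-smoothing prediction
    return np.mean((y - y_hat) ** 2)

# Toy two-index model: y = (b1'x)^2 + sin(b2'x) + noise (hypothetical example data).
rng = np.random.default_rng(0)
n, p, d = 200, 5, 2
X = rng.standard_normal((n, p))
b1 = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
b2 = np.array([0.0, 0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
y = (X @ b1) ** 2 + np.sin(X @ b2) + 0.1 * rng.standard_normal(n)

res = minimize(loocv_loss, rng.standard_normal(d * p), args=(X, y, d), method="L-BFGS-B")
A_hat = res.x.reshape(d, p)
# The rows of A_hat span an estimate of the central mean subspace; their overall
# scale plays the role of the kernel-smoothing bandwidth.
print(A_hat / np.linalg.norm(A_hat, axis=1, keepdims=True))
```

In this sketch the structural dimension d is fixed in advance; the paper additionally proposes a procedure to estimate it, which is not reproduced here.
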

Key words and phrases: Multiple-index model, nonparametric regression, sufficient dimension reduction.
