
Statistica Sinica 33 (2023), 1249-1270

AN EFFICIENT CONVEX FORMULATION FOR
REDUCED-RANK LINEAR DISCRIMINANT
ANALYSIS IN HIGH DIMENSIONS

Jing Zeng1, Xin Zhang2 and Qing Mai2

1University of Science and Technology of China and 2Florida State University

Abstract: In this paper, we propose a parsimonious reduced-rank linear discriminant analysis model for high-dimensional sparse multi-class discriminant analysis. We construct a sparse dimension reduction subspace to contain all the information necessary for a linear discriminant analysis. We show explicitly the connections between our model and two well-studied models in the literature: the principal fitted component model in sufficient dimension reduction, and the multivariate reduced-rank regression model. The likelihood-inspired efficient estimator is then recast from a convex optimization perspective. A doubly penalized convex optimization is proposed to unite sparsity and low-rankness in high dimensions, and is then solved efficiently using a three-operator splitting algorithm. We establish the rank selection consistency and classification error consistency of the proposed method when the number of variables grows very fast with the sample size. The effectiveness of the proposed method is demonstrated by means of extensive simulation studies and an application to facial recognition data sets.
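As a reading aid, the doubly penalized convex program described in the abstract can be sketched in a generic form; the particular loss and row-sparsity penalty below are illustrative assumptions, not the paper's exact formulation:

\hat{B} = \arg\min_{B \in \mathbb{R}^{p \times q}} \; \ell(B) + \lambda_1 \sum_{j=1}^{p} \| B_{j \cdot} \|_2 + \lambda_2 \| B \|_*,

where \ell is a convex, likelihood-inspired loss, the group-type penalty on the rows of B encourages variable selection (sparsity), and the nuclear norm \| B \|_* encourages low rank. A three-operator splitting scheme is well suited to such objectives because it handles the smooth loss and the two non-smooth penalties through separate gradient and proximal steps.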

Key words and phrases: Dimension reduction, discriminant analysis, linear discriminant analysis, nuclear norm penalty, variable selection.
