Abstract: When making important decisions, it is crucial to be able to quantify the uncertainty and control the error of any classifier. We propose a selective classification framework that provides an "indecision" option for observations that cannot be classified with confidence. The false selection rate (FSR), defined as the expected fraction of erroneous classifications among all definitive classifications, provides a useful error rate notion that trades a fraction of indecisions for fewer classification errors. We develop a new class of locally adaptive shrinkage and selection (LASS) rules for FSR control in the context of high-dimensional linear discriminant analysis (LDA). LASS is easy to analyze, exhibits robust performance across sparse and dense regimes, and controls the FSR under weaker conditions than those of existing methods. Lastly, we demonstrate the empirical performance of LASS using both simulated and real data.
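One way to express the FSR described above in symbols (a sketch; the notation $\delta_i$, $Y_i$, and the indecision label $\mathcal{I}$ are ours for illustration, not necessarily the paper's):
\[
\mathrm{FSR} \;=\; \mathbb{E}\!\left[\frac{\sum_{i=1}^{n} \mathbb{I}\{\delta_i \neq \mathcal{I},\; \delta_i \neq Y_i\}}{\max\!\left\{\sum_{i=1}^{n} \mathbb{I}\{\delta_i \neq \mathcal{I}\},\, 1\right\}}\right],
\]
where $Y_i$ is the true class of observation $i$, $\delta_i$ is the decision (a class label or the indecision option $\mathcal{I}$), the numerator counts erroneous definitive classifications, and the denominator counts all definitive classifications, truncated at one to avoid division by zero.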
Key words and phrases: Classification with confidence, false discovery rate, linear discriminant analysis, risk control, shrinkage estimation.