

Statistica Sinica 21 (2011), 449-474





EMPIRICAL BAYESIAN THRESHOLDING FOR SPARSE SIGNALS USING MIXTURE LOSS FUNCTIONS


Vikas C. Raykar and Linda H. Zhao


Siemens Healthcare and University of Pennsylvania


Abstract: We develop an empirical Bayesian thresholding rule for the normal mean problem that adapts well to the sparsity of the signal. A key element is the use of a mixture loss function that combines the $L_p$ loss and the $0$-$1$ loss. The Bayes procedures under this loss are explicitly given as thresholding rules and are easy to compute. The prior on each mean is a mixture of an atom of probability at zero and a Laplace or normal density for the nonzero part. The mixing probability and the spread of the nonzero part are hyperparameters estimated by the empirical Bayes procedure. Our simulation experiments demonstrate that the proposed method performs better than competing methods over a wide range of scenarios. We also apply the proposed method to feature selection on four data sets.
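To illustrate the general empirical Bayes recipe the abstract describes, the following is a minimal sketch assuming a spike-and-normal prior, unit observation noise, marginal maximum likelihood for the hyperparameters, and a simple posterior-probability thresholding rule; it is not the authors' exact mixture-loss procedure, and the function names and the 1/2 threshold are illustrative choices only.

```python
# Sketch: empirical Bayes thresholding for the normal means problem
# under the assumed prior  theta_i ~ (1 - w) * delta_0 + w * N(0, tau^2),
# with  x_i | theta_i ~ N(theta_i, 1).
# Step 1: estimate (w, tau) by marginal maximum likelihood.
# Step 2: zero out coordinates whose posterior probability of being
#         nonzero falls below 1/2, shrink the rest.

import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize


def neg_marginal_loglik(params, x):
    """Negative log marginal likelihood of x under the spike-and-normal prior."""
    w, log_tau = params
    tau2 = np.exp(2.0 * log_tau)
    dens = (1.0 - w) * norm.pdf(x, 0.0, 1.0) + w * norm.pdf(x, 0.0, np.sqrt(1.0 + tau2))
    return -np.sum(np.log(dens + 1e-300))


def eb_threshold(x):
    """Empirical Bayes thresholding estimate of the sparse mean vector."""
    # Hyperparameters by marginal maximum likelihood.
    res = minimize(neg_marginal_loglik, x0=[0.2, 0.0], args=(x,),
                   bounds=[(1e-4, 1.0 - 1e-4), (-5.0, 5.0)], method="L-BFGS-B")
    w, log_tau = res.x
    tau2 = np.exp(2.0 * log_tau)
    # Posterior probability that each theta_i is nonzero.
    f0 = norm.pdf(x, 0.0, 1.0)
    f1 = norm.pdf(x, 0.0, np.sqrt(1.0 + tau2))
    p_nonzero = w * f1 / ((1.0 - w) * f0 + w * f1)
    # Hard threshold at 1/2 (as a 0-1 loss component would suggest),
    # normal-normal shrinkage on the surviving coordinates.
    shrink = tau2 / (1.0 + tau2)
    return np.where(p_nonzero > 0.5, shrink * x, 0.0), (w, np.sqrt(tau2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = np.concatenate([rng.normal(0.0, 3.0, 20), np.zeros(180)])  # sparse signal
    x = theta + rng.normal(size=theta.size)
    theta_hat, (w_hat, tau_hat) = eb_threshold(x)
    print("estimated w:", round(w_hat, 3), " tau:", round(tau_hat, 3))
    print("nonzero estimates kept:", int(np.sum(theta_hat != 0)))
```

The estimated mixing probability adapts to the sparsity of the signal: the sparser the true mean vector, the smaller the fitted w and the more coordinates are set exactly to zero by the thresholding step.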



Key words and phrases: Empirical Bayes, mixture loss, mixture prior, normal means, sparsity, thresholding.
