Abstract: We develop an empirical Bayes thresholding rule for the normal means problem that adapts well to the sparsity of the signal. A key element is the use of a mixture loss function that combines two loss functions. The Bayes procedures under this loss are given explicitly as thresholding rules and are easy to compute. The prior on each mean is a mixture of an atom of probability at zero and a Laplace or normal density for the nonzero part. The mixing probability and the spread of the nonzero part are hyperparameters estimated by the empirical Bayes procedure. Our simulation experiments demonstrate that the proposed method outperforms competing methods across a wide range of scenarios. We also apply the proposed method to feature selection on four data sets.
Key words and phrases: Empirical Bayes, mixture loss, mixture prior, normal means, sparsity, thresholding.
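To make the general idea concrete, the following is a minimal illustrative sketch, not the paper's actual procedure: it assumes unit-variance observations, a normal (rather than Laplace) slab, a crude grid search for the hyperparameters in place of the paper's estimator, and a simple posterior-probability-of-one-half threshold as a stand-in for the Bayes rule under the mixture loss. All function names and grid choices here are illustrative.

```python
import math

def norm_pdf(x, var):
    # density of N(0, var) at x
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def marginal_loglik(data, pi0, tau2):
    # marginal of each observation under the spike-and-slab prior:
    # pi0 * N(0, 1) (spike at zero) + (1 - pi0) * N(0, 1 + tau2) (normal slab)
    return sum(
        math.log(pi0 * norm_pdf(x, 1.0) + (1.0 - pi0) * norm_pdf(x, 1.0 + tau2))
        for x in data
    )

def fit_hyperparams(data):
    # crude grid search maximizing the marginal likelihood over the
    # mixing probability pi0 and the slab variance tau2 (illustrative only)
    best = None
    for pi0 in (i / 20.0 for i in range(1, 20)):
        for tau2 in (0.25 * j for j in range(1, 41)):
            ll = marginal_loglik(data, pi0, tau2)
            if best is None or ll > best[0]:
                best = (ll, pi0, tau2)
    return best[1], best[2]

def threshold_estimate(x, pi0, tau2):
    # posterior probability that the underlying mean is nonzero
    slab = (1.0 - pi0) * norm_pdf(x, 1.0 + tau2)
    w = slab / (pi0 * norm_pdf(x, 1.0) + slab)
    # thresholding rule: report zero when the nonzero probability is
    # below 1/2, otherwise shrink by the slab's posterior mean factor
    return 0.0 if w < 0.5 else (tau2 / (1.0 + tau2)) * x
```

For example, with a sparse prior (pi0 = 0.9, tau2 = 4.0) a small observation such as 0.1 is set exactly to zero, while a large observation such as 5.0 is kept but shrunk toward zero, which is the qualitative behavior the abstract describes.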