

Statistica Sinica 13(2003), 625-639





A NONITERATIVE SAMPLING METHOD FOR

COMPUTING POSTERIORS IN THE STRUCTURE OF

EM-TYPE ALGORITHMS


Ming Tan$^*$, Guo-Liang Tian$^*$ and Kai Wang Ng$^{\dag}$


$^*$University of Maryland at Baltimore and $^\dag$The University of Hong Kong


Abstract: We propose a noniterative sampling approach that combines the inverse Bayes formulae (IBF), sampling/importance resampling, and posterior mode estimates from the Expectation/Maximization (EM) algorithm to obtain an i.i.d. sample approximately from the posterior distribution for problems where EM-type algorithms apply. The IBF shows that the posterior is proportional to the ratio of two conditional distributions, whose numerator provides a natural class of built-in importance sampling functions (ISFs) directly from the model specification. Since the posterior mode from an EM-type algorithm is relatively easy to obtain, a good ISF can be identified using that mode, yielding a large overlap between the target density and the ISF. We show theoretically why this procedure works. The proposed method thus provides a novel alternative to perfect sampling and eliminates the convergence problems of Markov chain Monte Carlo methods. We first illustrate the method with a proof-of-principle example and then apply it to hierarchical (or mixed-effects) models for longitudinal data. We conclude with a discussion.
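The procedure sketched in the abstract can be illustrated on a small data-augmentation problem. The sketch below uses the classic genetic-linkage multinomial data, a standard EM illustration that is assumed here as a stand-in for the paper's proof-of-principle example; the data `y`, sample sizes, and variable names are illustrative choices, not taken from the paper. With a flat prior, the complete-data posterior is theta | y, z ~ Beta(z + y4 + 1, y2 + y3 + 1) and z | y, theta ~ Binomial(y1, (theta/4)/(1/2 + theta/4)), so the ISF f(z | y, theta_hat) is built in, and the IBF gives importance weights proportional to 1/f(theta_hat | y, z):

```python
import numpy as np
from scipy.stats import beta

# Genetic-linkage multinomial cell counts (illustrative data).
y = np.array([125, 18, 20, 34])
rng = np.random.default_rng(0)

# Step 1: EM algorithm to locate the posterior mode theta_hat (flat prior).
theta = 0.5
for _ in range(100):
    ez = y[0] * (theta / 4) / (0.5 + theta / 4)          # E-step: E[z | y, theta]
    theta = (ez + y[3]) / (ez + y[1] + y[2] + y[3])      # M-step
theta_hat = theta

# Step 2: built-in ISF f(z | y, theta_hat) = Binomial(y1, p_hat),
# taken directly from the model specification.
p_hat = (theta_hat / 4) / (0.5 + theta_hat / 4)
J = 20000
z = rng.binomial(y[0], p_hat, size=J)

# Step 3: IBF importance weights w ∝ 1 / f(theta_hat | y, z),
# where theta | y, z ~ Beta(z + y4 + 1, y2 + y3 + 1).
w = 1.0 / beta.pdf(theta_hat, z + y[3] + 1, y[1] + y[2] + 1)
w /= w.sum()

# Step 4: sampling/importance resampling of the latent z's.
idx = rng.choice(J, size=5000, replace=True, p=w)
z_star = z[idx]

# Step 5: draw theta ~ f(theta | y, z*) to obtain an approximately
# i.i.d. sample from the posterior f(theta | y) -- no Markov chain,
# hence no convergence diagnostics needed.
theta_post = rng.beta(z_star + y[3] + 1, y[1] + y[2] + 1)
print(round(float(theta_post.mean()), 3))
```

Because the draws are independent, standard importance-sampling diagnostics (e.g., effective sample size of the weights) directly measure the quality of the ISF, rather than the mixing of a chain.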



Key words and phrases: Bayesian computation, data augmentation, EM algorithm, Gibbs sampler, inverse Bayes formulae, MCMC, sampling/importance resampling.


