
Statistica Sinica 34 (2024), 27-46

SELECTION OF PROPOSAL DISTRIBUTIONS FOR
MULTIPLE IMPORTANCE SAMPLING
Vivekananda Roy* and Evangelos Evangelou
Iowa State University and University of Bath

Abstract: In general, the naive importance sampling (IS) estimator does not work well in examples involving simultaneous inference on several targets, because the importance weights can take arbitrarily large values, making the estimator highly unstable. In such situations, researchers prefer alternative multiple IS estimators that involve samples from multiple proposal distributions. As with naive IS, the success of these multiple IS estimators depends crucially on the choice of the proposal distributions, which is the focus of this study. We propose three methods: (i) a geometric space-filling approach, (ii) a minimax variance approach, and (iii) a maximum entropy approach. The first two methods apply to any IS estimator, whereas the third is described in the context of a two-stage IS estimator. For the first method, we propose a suitable measure of "closeness" based on the symmetric Kullback-Leibler divergence; the second and third approaches use estimates of the asymptotic variances of an IS estimator and of the reverse logistic regression estimator, respectively. Accordingly, when samples from the proposal distributions are obtained by running Markov chains, we provide consistent spectral variance estimators for these asymptotic variances. Lastly, we demonstrate the proposed methods for selecting proposal densities in several detailed examples.
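To illustrate the contrast drawn in the abstract between naive IS and multiple IS, the sketch below is a minimal, hypothetical Python example (not the authors' estimators): it estimates a moment of a heavy-tailed Student-t target, first with a single light-tailed normal proposal (whose importance weights can be very unstable) and then by pooling draws from several normal proposals with mixture weights, a standard multiple IS construction. It also includes the closed-form symmetric Kullback-Leibler divergence between two normals, as one possible "closeness" measure of the kind mentioned for the space-filling approach; the specific target, proposals, and functions here are chosen purely for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical illustration only; the target, proposals, and weighting scheme
# are assumptions for the sketch, not the estimators studied in the paper.

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log density of a t_3 target (heavy tails).
    return stats.t.logpdf(x, df=3)

def naive_is(proposal, n, h):
    """Self-normalized naive IS estimate of E_target[h(X)] with one proposal."""
    x = proposal.rvs(size=n, random_state=rng)
    logw = log_target(x) - proposal.logpdf(x)
    w = np.exp(logw - logw.max())
    return np.sum(w * h(x)) / np.sum(w)

def multiple_is(proposals, n_each, h):
    """Multiple IS: pool draws from several proposals, weight by their mixture."""
    xs, logws = [], []
    for q in proposals:
        x = q.rvs(size=n_each, random_state=rng)
        # Mixture denominator: average of all proposal densities at x.
        logq_mix = np.log(np.mean([qi.pdf(x) for qi in proposals], axis=0))
        xs.append(x)
        logws.append(log_target(x) - logq_mix)
    x = np.concatenate(xs)
    logw = np.concatenate(logws)
    w = np.exp(logw - logw.max())
    return np.sum(w * h(x)) / np.sum(w)

def sym_kl_normal(m1, s1, m2, s2):
    """Symmetric KL divergence between two univariate normals (closed form)."""
    kl12 = np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5
    kl21 = np.log(s1 / s2) + (s2**2 + (m2 - m1) ** 2) / (2 * s1**2) - 0.5
    return kl12 + kl21

h = lambda x: x**2  # second moment of the t_3 target (true value is 3)

proposals = [stats.norm(loc=m, scale=1.5) for m in (-2.0, 0.0, 2.0)]
print("naive IS   :", naive_is(stats.norm(0, 1), 3000, h))
print("multiple IS:", multiple_is(proposals, 1000, h))
print("sym-KL between outer proposals:", sym_kl_normal(-2.0, 1.5, 2.0, 1.5))
```

In this sketch the mixture weights keep the importance weights bounded whenever at least one proposal covers the region where the target has mass, which is the basic stabilizing effect that motivates multiple IS; how to choose the set of proposals well is the question the paper addresses.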

Key words and phrases: Bayes factor, central limit theorem, marginal likelihood, Markov chain, polynomial ergodicity, reverse logistic regression.
