Statistica Sinica 11(2001), 981-1003



ON COMPUTATION USING GIBBS SAMPLING

FOR MULTILEVEL MODELS


Alan E. Gelfand, Bradley P. Carlin and Matilde Trevisani


University of Connecticut, University of Minnesota and University of Padua


Abstract: Multilevel models incorporating random effects at the various levels are enjoying increased popularity. An implicit problem with such models is identifiability. From a Bayesian perspective, formal identifiability is not an issue. Rather, when implementing iterative simulation-based model fitting, a poorly behaved Gibbs sampler frequently arises. The objective of this paper is to shed light on two computational issues in this regard. The first concerns autocorrelation in the sequence of iterates of the Markov chain. For estimable functions we clarify when, after convergence, autocorrelation will drop off to zero rapidly, enabling high effective sample size. The second concerns immediate convergence, i.e., when, at an arbitrary iteration, the simulated value of a variable is in fact an observation from the posterior distribution of the variable. Again, for estimable functions, we clarify when the chain will produce at each iteration a sample drawn essentially from the true posterior of the function. We provide both analytical and computational support for our conclusions, including exemplification for three multilevel models having normal, Poisson, and binary responses, respectively.
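The contrast the abstract draws, between slowly mixing individual parameters and rapidly mixing estimable functions, can be illustrated with a minimal sketch (not taken from the paper). Consider a deliberately overparameterized normal model y_j ~ N(mu + alpha, 1) with vague independent N(0, v) priors on mu and alpha: neither parameter is well identified on its own, but the sum mu + alpha is estimable. A two-block Gibbs sampler then exhibits near-unit lag-one autocorrelation for mu, while draws of mu + alpha are essentially independent samples from its posterior at every iteration. All model choices here (sample size, prior variance, seed) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y_j ~ N(mu + alpha, 1), true mu + alpha = 3
n = 50
y = rng.normal(3.0, 1.0, size=n)

v = 1e6              # large prior variance: mu, alpha individually weakly identified
prec = n + 1.0 / v   # conditional posterior precision for each of mu, alpha

T = 5000
mu, alpha = 0.0, 0.0
mus = np.empty(T)    # trace of the (non-estimable) parameter mu
etas = np.empty(T)   # trace of the estimable function eta = mu + alpha

for t in range(T):
    # Full conditional: mu | alpha, y ~ N( sum(y - alpha)/prec, 1/prec )
    mu = rng.normal((y - alpha).sum() / prec, np.sqrt(1.0 / prec))
    # Full conditional: alpha | mu, y ~ N( sum(y - mu)/prec, 1/prec )
    alpha = rng.normal((y - mu).sum() / prec, np.sqrt(1.0 / prec))
    mus[t] = mu
    etas[t] = mu + alpha

def lag1_acf(x):
    """Sample lag-one autocorrelation of a chain."""
    x = x - x.mean()
    return (x[:-1] * x[1:]).sum() / (x * x).sum()

# mu behaves like a random walk (acf near 1); eta mixes essentially
# immediately (acf near 0), as the paper's results predict for
# estimable functions.
print("lag-1 acf of mu:      ", lag1_acf(mus))
print("lag-1 acf of mu+alpha:", lag1_acf(etas))
```

In this sketch the full conditional of mu given alpha pins down mu + alpha to roughly ybar plus noise of scale 1/sqrt(n), so the estimable function is regenerated afresh at each sweep ("immediate convergence" in the abstract's sense), while mu and alpha drift in compensating directions.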



Key words and phrases: Autocorrelation, estimable function, exact sampling, identifiability.
