Statistica Sinica 27 (2017), 1205-1223
Abstract: We consider prediction when the distributions of current and future observations may differ. We derive the asymptotic Kullback-Leibler risks of Bayesian predictive distributions as the numbers of both current and future observations grow to infinity. Based on these results, we construct model selection criteria for the case in which the true distributions of current and future observations differ. Through numerical experiments, we show that Bayesian predictive distributions based on the proposed criteria perform well.
Key words and phrases: Bayesian prediction, curve extrapolation, Kullback-Leibler divergence, model selection, bootstrap.
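As a concrete illustration of the setting described in the abstract, the sketch below estimates the expected Kullback-Leibler risk of a Bayesian predictive distribution in a simple conjugate normal model with known unit variance, where current observations follow N(mu_current, 1) but future observations follow N(mu_future, 1). This is a minimal toy example assuming a normal mean model with a N(0, tau2) prior; the model, prior, and parameter names are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_normal(mu1, var1, mu2, var2):
    """Closed-form KL divergence KL(N(mu1,var1) || N(mu2,var2))."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def expected_kl_risk(n, mu_current, mu_future, tau2=100.0, reps=2000):
    """Monte Carlo estimate of the expected KL risk of the Bayesian
    predictive distribution: current data are drawn from N(mu_current, 1),
    while the future observation is distributed as N(mu_future, 1).
    Toy conjugate setup with prior theta ~ N(0, tau2), not the paper's model."""
    risks = []
    for _ in range(reps):
        x = rng.normal(mu_current, 1.0, size=n)
        post_var = 1.0 / (n + 1.0 / tau2)   # posterior variance of theta
        post_mean = post_var * x.sum()      # posterior mean (prior mean 0)
        pred_var = 1.0 + post_var           # predictive variance for a new y
        risks.append(kl_normal(mu_future, 1.0, post_mean, pred_var))
    return float(np.mean(risks))

# The risk is larger when current and future distributions differ.
r_same = expected_kl_risk(50, mu_current=1.0, mu_future=1.0)
r_shift = expected_kl_risk(50, mu_current=1.0, mu_future=1.5)
```

In this toy setup, the mean shift between current and future observations adds roughly (mu_future - mu_current)^2 / (2 * pred_var) to the risk, which is the kind of discrepancy the proposed criteria are designed to account for.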