

Statistica Sinica 13 (2003), 1165-1178





PREDICTIVE DENSITIES: GENERAL ASYMPTOTIC RESULTS AND ADMISSIBILITY


Gang Wei and Rahul Mukerjee


Hong Kong Baptist University and Indian Institute of Management


Abstract: For general regular parametric models, we compare predictive densities under the criterion of average Kullback-Leibler divergence. Asymptotic results are given via a Bayesian route without any assumption on curved exponentiality. We also address the issue of asymptotic admissibility of predictive densities and give a complete characterization when the underlying parameter is scalar-valued. Bayes predictive densities are considered in particular and the status of probability matching priors in this regard is examined. Finally, we indicate the consequences of working under more general $\alpha$-divergences.
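As background for the criterion named in the abstract, here is a minimal sketch of the standard definitions (our notation; the paper's precise setup, e.g., how the divergence is averaged over the data, may differ):

\[
  D\bigl\{p(\cdot;\theta)\,\big\|\,\hat{p}(\cdot\mid X)\bigr\}
    = \int p(y;\theta)\,\log\frac{p(y;\theta)}{\hat{p}(y\mid X)}\,dy,
\]
the Kullback-Leibler divergence from the true density of a future observation $Y$ to a candidate predictive density $\hat{p}(\cdot\mid X)$ based on data $X$. The Bayes predictive density under a prior $\pi$ averages the sampling density over the posterior, whereas the estimative density plugs in an estimator $\hat{\theta}(X)$:
\[
  \hat{p}_{\pi}(y\mid X) = \int p(y;\theta)\,\pi(\theta\mid X)\,d\theta,
  \qquad
  \hat{p}_{\mathrm{est}}(y\mid X) = p\bigl(y;\hat{\theta}(X)\bigr).
\]
The $\alpha$-divergence family mentioned in the closing sentence is, in Amari's parametrization,
\[
  D_{\alpha}\{p\,\|\,q\}
    = \frac{4}{1-\alpha^{2}}
      \Bigl(1-\int p(y)^{(1-\alpha)/2}\,q(y)^{(1+\alpha)/2}\,dy\Bigr),
\]
with the Kullback-Leibler divergences recovered as the limiting cases $\alpha \to \pm 1$.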



Key words and phrases: $\alpha$-divergence, Bayes predictive density, estimative density, Jeffreys' prior, Kullback-Leibler divergence, probability matching prior.


