Abstract: For general regular parametric models, we compare predictive densities under the criterion of average Kullback-Leibler divergence. Asymptotic results are given via a Bayesian route without any assumption of curved exponentiality. We also address the issue of asymptotic admissibility of predictive densities and give a complete characterization when the underlying parameter is scalar-valued. Bayes predictive densities are considered in particular and the status of probability matching priors in this regard is examined. Finally, we indicate the consequences of working under more general α-divergences.
Key words and phrases: α-divergence, Bayes predictive density, estimative density, Jeffreys' prior, Kullback-Leibler divergence, probability matching prior.