Statistica Sinica

Ritei Shibata

Abstract: Estimation of Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure which, like AIC, is based on the likelihood principle. To discriminate between nested models, we have to estimate Kullback-Leibler information up to the order of a constant, while Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example of how to fulfill this requirement; however, the correction is a simple-minded bias correction to the log maximum likelihood, and there is no assurance that such a bias correction yields a good estimate of Kullback-Leibler information. In this paper we investigate a bootstrap type estimate of Kullback-Leibler information as an alternative. We first show that the bootstrap estimates proposed by Efron (1983, 1986) and by Cavanaugh and Shumway (1997) are at least asymptotically equivalent, and that there exist many other equivalent bootstrap estimates. We also show that all such methods are asymptotically equivalent to a non-bootstrap method known as TIC (Takeuchi (1976)), which is a generalization of AIC, when the re-sampling method is non-parametric. Otherwise, for example, if the re-sampling method is parametric, they are asymptotically equivalent to AIC. Therefore, the use of a bootstrap type estimate is not advantageous if enough observations are available and the simple calculation of a non-bootstrap estimate, AIC or TIC, is not a burden. At the same time, it is also true that the use of a bootstrap estimate in place of a non-bootstrap estimate is reasonable and advantageous if the non-bootstrap estimate is too complicated to evaluate analytically.
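To make the contrast concrete, the following is a minimal sketch (not from the paper) of the two quantities the abstract compares for a simple Gaussian model: AIC's fixed bias correction, which equals the number of parameters k, and an Efron-style non-parametric bootstrap estimate of the same bias, computed by averaging the optimism of the maximized log-likelihood over bootstrap samples. All function names and the model choice here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_loglik(x, mu, sigma2):
    # Gaussian log-likelihood of sample x at parameters (mu, sigma2).
    n = len(x)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * np.sum((x - mu) ** 2) / sigma2

def mle(x):
    # Gaussian MLEs: sample mean and the (biased) sample variance.
    return x.mean(), x.var()

# Original sample and its maximized log-likelihood.
x = rng.normal(loc=1.0, scale=2.0, size=200)
mu_hat, s2_hat = mle(x)
loglik_hat = gauss_loglik(x, mu_hat, s2_hat)

# Non-parametric bootstrap estimate of the bias (optimism) of loglik_hat:
# average over bootstrap samples of
#   [log-lik of the bootstrap sample at its own MLE]
#   minus [log-lik of the original sample at that bootstrap MLE].
B = 2000
bias = 0.0
for _ in range(B):
    xb = rng.choice(x, size=len(x), replace=True)
    mu_b, s2_b = mle(xb)
    bias += gauss_loglik(xb, mu_b, s2_b) - gauss_loglik(x, mu_b, s2_b)
bias /= B

# AIC replaces this bootstrap average by the constant k = 2 (two parameters);
# asymptotically the two corrections agree for a well-specified model.
print(bias)
```

For this well-specified model the bootstrap bias estimate comes out close to k = 2, illustrating the asymptotic equivalence the abstract describes; the bootstrap version only pays off when such a correction cannot be derived analytically.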

Key words and phrases: Bias estimation, bootstrap, information criterion, Kullback-Leibler information.