

Statistica Sinica 13(2003), 97-109



EFFECTS OF BAGGING AND BIAS CORRECTION ON

ESTIMATORS DEFINED BY ESTIMATING EQUATIONS


Song Xi Chen and Peter Hall


National University of Singapore and Australian National University


Abstract: Bagging an estimator approximately doubles its bias through the impact of bagging on quadratic terms in expansions of the estimator. However, this difficulty can be alleviated by bagging a suitably bias-corrected estimator. In these and other circumstances, what is the overall impact of bagging and/or bias correction, and how can it be characterised? We answer these questions in the case of general estimators defined by estimating equations, including, for example, maximum likelihood and method of moments estimators. It is shown that, despite the considerable variety of estimators that can be constructed by bagging and bias correction, the number of modes of behaviour is very small. In particular, bagging a bias-corrected estimator produces a new estimator that is second-order equivalent to the original, unadjusted estimator. Furthermore, the conventional bagged estimator and the standard bias-corrected estimator represent virtually equal but opposite adjustments of the conventional estimator; in effect, bagging adds back the adjustment provided by bias correction. If we bag a doubly bias-corrected estimator, constructed so as to counteract the tendency of bagging to exacerbate bias, then the result is an estimator that is second-order equivalent to the standard bias-corrected estimator. These results do not depend on the manner of bias correction; that procedure may be implemented using the jackknife, the parametric bootstrap or the nonparametric bootstrap. They show that, when bagging is applied to relatively conventional statistical problems, it cannot reliably be expected to improve performance. Its domain is, in effect, restricted to problems such as regression trees, where variability is so high that it cannot plausibly be modelled using the approach taken here.
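The following small Python/NumPy simulation sketch (an illustration added here, not part of the paper) makes the claims above concrete under simple assumptions. It uses, as its estimating-equation estimator, the maximum likelihood estimator of an exponential rate, which solves the estimating equation sum_i (1/lambda - X_i) = 0 and has bias of order 1/n; bagging is implemented by averaging over bootstrap resamples, and the jackknife stands in for the generic bias-correction step. The sample size, number of resamples and number of Monte Carlo replications are arbitrary choices for illustration only.

import numpy as np

rng = np.random.default_rng(0)

def mle_rate(x):
    # MLE of an exponential rate parameter: root of the estimating equation
    # sum_i (1/lam - x_i) = 0, i.e. lam_hat = 1 / mean(x).
    return 1.0 / np.mean(x)

def bag(estimator, x, B=100):
    # Bagging: average the estimator over B bootstrap resamples of x.
    n = len(x)
    return np.mean([estimator(rng.choice(x, size=n, replace=True))
                    for _ in range(B)])

def jackknife_correct(estimator, x):
    # Standard jackknife bias correction:
    # n * theta_hat - (n - 1) * mean(leave-one-out estimates).
    n = len(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    return n * estimator(x) - (n - 1) * loo.mean()

# Monte Carlo comparison of the four estimators discussed in the abstract.
lam, n, n_sim = 2.0, 20, 500
results = {"plain MLE": [], "bagged": [], "bias-corrected": [],
           "bagged bias-corrected": []}
for _ in range(n_sim):
    x = rng.exponential(scale=1.0 / lam, size=n)
    results["plain MLE"].append(mle_rate(x))
    results["bagged"].append(bag(mle_rate, x))
    results["bias-corrected"].append(jackknife_correct(mle_rate, x))
    results["bagged bias-corrected"].append(
        bag(lambda y: jackknife_correct(mle_rate, y), x))

for name, vals in results.items():
    print(f"{name:22s} bias = {np.mean(vals) - lam:+.4f}")

According to the results summarised in the abstract, a run of this kind should show the bagged estimator with roughly twice the bias of the plain MLE, the jackknife-corrected estimator with bias close to zero, and the bagged bias-corrected estimator with bias back near that of the plain MLE, reflecting its second-order equivalence to the original, unadjusted estimator.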



Key words and phrases: Bootstrap, estimating function, jackknife, maximum likelihood, mean square error, parametric bootstrap.


