Statistica Sinica 12(2002), 429-447



ERROR-DEPENDENT SMOOTHING RULES IN LOCAL LINEAR REGRESSION


Ming-Yen Cheng$^{1,2}$ and Peter Hall$^1$


$^1$Australian National University and $^2$National Taiwan University


Abstract: We suggest an adaptive, error-dependent smoothing method for reducing the variance of local-linear curve estimators. It involves weighting the bandwidth used at the $i$th datum in proportion to a power of the absolute value of the $i$th residual, and we show that the optimal power is 2/3. With this choice, we prove that asymptotic variance can be reduced by 24% in the case of Normal errors, and by 35% for double-exponential errors. These results might appear to violate Jianqing Fan's bounds on the performance of local-linear methods, but our approach to smoothing produces nonlinear estimators, to which those bounds do not apply. In the case of Normal errors, our estimator has slightly better mean squared error performance than that suggested by Fan's minimax bound, which he calculated over all estimators, not just linear ones. However, these improvements are available only for individual functions, not uniformly over Fan's function class. Even greater improvements in performance are achievable for error distributions with heavier tails. For symmetric error distributions the method has no first-order effect on bias, and existing bias-reduction techniques may be used in conjunction with error-dependent smoothing. For asymmetric error distributions an overall reduction in mean squared error is achievable, involving a trade-off between bias and variance contributions; in this setting, however, the technique is relatively complex and probably not practically feasible.
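To make the weighting scheme concrete, the following is a minimal Python sketch of the two-stage procedure the abstract describes: a pilot local-linear fit with a global bandwidth, residuals from that fit, and a refit in which the bandwidth at the $i$th datum is proportional to $|\hat{r}_i|^{2/3}$. The Gaussian kernel, the normalisation of the per-datum bandwidths, and the names `local_linear` and `error_dependent_fit` are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def local_linear(x0, X, Y, h):
    """Local-linear estimate at x0; h may be a scalar or a per-datum array."""
    h = np.broadcast_to(h, X.shape)
    u = (X - x0) / h
    w = np.exp(-0.5 * u**2) / h  # Gaussian kernel weight for each datum
    # Weighted least squares of Y on (X - x0); the intercept is the estimate.
    s0, s1, s2 = w.sum(), (w * (X - x0)).sum(), (w * (X - x0)**2).sum()
    t0, t1 = (w * Y).sum(), (w * (X - x0) * Y).sum()
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1**2)

def error_dependent_fit(x0, X, Y, h):
    # Step 1: pilot fit with the global bandwidth h.
    pilot = np.array([local_linear(xi, X, Y, h) for xi in X])
    # Step 2: residuals from the pilot fit.
    r = Y - pilot
    # Step 3: per-datum bandwidths proportional to |r_i|^(2/3) (the optimal
    # power), floored to avoid zero bandwidths and normalised so their average
    # stays near h -- a simplifying assumption, not the paper's precise rule.
    scale = np.maximum(np.abs(r) ** (2.0 / 3.0), 1e-8)
    h_i = h * scale / scale.mean()
    # Step 4: refit with datum-specific bandwidths.
    return local_linear(x0, X, Y, h_i)

# Example on a noisy sine curve.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 200))
Y = np.sin(2 * np.pi * X) + 0.3 * rng.standard_normal(200)
print(error_dependent_fit(0.5, X, Y, h=0.1))
```

Heuristically, a datum with a large residual receives a larger bandwidth $h_i$, so its kernel weight $K((x - X_i)/h_i)/h_i$ near $X_i$ is smaller; downweighting the noisiest observations in this way is the source of the variance reduction, and it is also what makes the resulting estimator nonlinear in the data.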



Key words and phrases: Bandwidth, kernel method, nonparametric regression, tail weight, variance reduction.


