
Statistica Sinica 28 (2018), 2733-2748

DATA SHARPENING IN LOCAL REGRESSION
GUIDED BY GLOBAL CONSTRAINT
W. John Braun, X. Joan Hu and Xiuli Kang
University of British Columbia, Simon Fraser University
and University of Kansas

Abstract: Data sharpening for kernel regression and density estimation was introduced by the late Peter Hall. We briefly review his enormous contribution to the literature in this area and then propose a data sharpening procedure arising from the imposition of a soft global functional constraint in local regression analysis. Instead of enforcing the constraint everywhere, the procedure guides the data in directions that enable satisfaction or near-satisfaction of the given property globally, through the use of a penalty. It results in a modified local regression estimator that possesses a closed functional form and includes the conventional local regression estimator as a special case. The approach can accommodate various constraints, most of which in practice are motivated by expert prior knowledge. We demonstrate theoretically and numerically that the proposed estimator is an improved variant of the corresponding local regression estimator: it achieves a reduction in variance while maintaining the bias at the same level. Although the focus of this paper is on local polynomial regression, the technique can be applied, in principle, to any linear nonparametric estimator, including regression splines, smoothing and penalized splines, and other recently proposed kernel estimators. We exhibit the usefulness of the proposed approach with an analysis of a collection of temperatures recorded at the Vancouver airport. The analysis reveals a possible monotonic trend underlying the conventional supposition of a periodic (seasonal) temporal structure.

Key words and phrases: Bias-variance trade-off, functional constraint, kernel smoothing, quadratic penalty.
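The abstract describes perturbing ("sharpening") the responses under a quadratic penalty on a global functional of the fitted curve, with the resulting estimator available in closed form. The sketch below is an illustrative reconstruction of that general idea, not the paper's exact procedure: it uses a Nadaraya-Watson smoother matrix, takes the constrained functional to be the mean level of the fitted curve, and picks an arbitrary penalty weight. The linear smoother, the constraint functional `a`, the target `c`, and `lam` are all assumptions made for the example.

```python
import numpy as np

def nw_smoother_matrix(x, h):
    """Gaussian-kernel Nadaraya-Watson smoother matrix S, so that fit = S @ y."""
    d = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * d**2)
    return K / K.sum(axis=1, keepdims=True)

def sharpen(y, S, a, c, lam):
    """Sharpened responses t minimizing ||t - y||^2 + lam * (a @ S @ t - c)^2.

    Setting the gradient to zero gives the closed form
        (I + lam * v v^T) t = y + lam * c * v,   with v = S^T a,
    so the constrained fit is still linear in y (lam = 0 recovers t = y,
    i.e., the conventional local regression estimator).
    """
    v = S.T @ a
    A = np.eye(len(y)) + lam * np.outer(v, v)
    return np.linalg.solve(A, y + lam * c * v)

# Toy data: periodic signal plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

S = nw_smoother_matrix(x, h=0.08)
a = np.full(x.size, 1.0 / x.size)  # functional: average of the fitted curve
c = 0.0                            # soft prior knowledge: curve averages to zero
y_sharp = sharpen(y, S, a, c, lam=50.0)
fit = S @ y_sharp                  # sharpened (constraint-guided) local fit
```

Because the penalty is quadratic, the sharpened fit can only move the functional `a @ fit` toward the target `c` relative to the unconstrained fit `S @ y`; larger `lam` enforces the global property more strongly, while `lam = 0` leaves the data untouched.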
