

Statistica Sinica 10(2000), 457-473



INFORMATION BOUND FOR BANDWIDTH SELECTION

IN KERNEL ESTIMATION OF DENSITY DERIVATIVES


Tiee-Jian Wu and Yue Lin


National Cheng-Kung University and University of Houston


Abstract: Based on a random sample of size n from an unknown density f on the real line, several data-driven methods for selecting the bandwidth in kernel estimation of f^(ν), ν ≥ 0, have recently been proposed which have a very fast asymptotic rate of convergence to the optimal bandwidth, where f^(ν) denotes the ν-th derivative of f. In particular, for all ν and sufficiently smooth f, the best possible relative rate of convergence is n^{-1/2}. For ν = 0, Fan and Marron (1992) employed semiparametric arguments to obtain the best possible constant coefficient, that is, an analog of the usual Fisher information bound, in this convergence. The purpose of this paper is to show that their arguments can be extended to establish information bounds for all ν. The extension from the special case ν = 0 to the case of general ν requires some nontrivial work and gives a significant benchmark as to how well a bandwidth selector can hope to perform in kernel estimation of f^(ν).
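To make the estimand concrete: the standard kernel estimator of the ν-th density derivative with bandwidth h is f̂^(ν)_h(x) = (1/(n h^(ν+1))) Σ_i K^(ν)((x − X_i)/h). The sketch below is a minimal illustration of this generic estimator with a Gaussian kernel, not the specific bandwidth selectors studied in the paper; the function name and the choice of h are assumptions for demonstration only.

```python
import numpy as np

def kde_derivative(x, data, h, nu=0):
    """Kernel estimate of the nu-th derivative of the density at x,
    using the Gaussian kernel K(u) = phi(u):
        f_hat^(nu)(x) = (1/(n h^(nu+1))) * sum_i K^(nu)((x - X_i)/h).
    Only nu = 0, 1, 2 are implemented here for brevity."""
    u = (x - data) / h
    n = len(data)
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # K(u)
    if nu == 0:
        k = phi                  # K(u)
    elif nu == 1:
        k = -u * phi             # K'(u)
    elif nu == 2:
        k = (u**2 - 1.0) * phi   # K''(u)
    else:
        raise NotImplementedError("nu > 2 not implemented in this sketch")
    return k.sum() / (n * h**(nu + 1))

# Illustrative use on a standard normal sample: the true first derivative
# at x = 1 is f'(1) = -1 * phi(1), approximately -0.242.
rng = np.random.default_rng(0)
sample = rng.standard_normal(200_000)
print(kde_derivative(1.0, sample, h=0.2, nu=1))
```

The n^{-1/2} rate in the abstract refers to how fast a data-driven choice of h can approach the MISE-optimal bandwidth for this kind of estimator; the fixed h = 0.2 above is only a placeholder for such a selector.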



Key words and phrases: Bandwidth selection, density derivatives, kernel estimates, nonparametric information bounds, semiparametric methods.


