

Statistica Sinica 16(2006), 425-439





CONVERGENCE RATES OF COMPACTLY SUPPORTED RADIAL BASIS FUNCTION REGULARIZATION


Yi Lin and Ming Yuan


University of Wisconsin-Madison and Georgia Institute of Technology


Abstract: Regularization with radial basis functions is an effective method in many machine learning applications. In recent years, classes of radial basis functions with compact support have been proposed in the approximation theory literature and have become increasingly popular because of their computational advantages. In this paper we study the statistical properties of the method of regularization with compactly supported radial basis functions. We consider three popular classes of compactly supported radial basis functions. In the setting of estimating a periodic function in a white noise problem, we show that regularization with (periodized) compactly supported radial basis functions is rate optimal and adapts to unknown smoothness up to an order related to the radial basis function used. Because of results on the asymptotic equivalence of the white noise model with many important models, including regression and density estimation, our results are expected to give insight into the performance of such methods in settings more general than the white noise model.
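For illustration, the following is a minimal sketch of the method of regularization (kernel ridge regression) with a compactly supported radial basis function. It uses Wendland's C^2 function on [0,1] with a hypothetical support radius rho and regularization parameter lam; it is not the periodized construction analyzed in the paper, only a generic example of how compact support yields a sparse Gram matrix and a regularized estimate.

import numpy as np

def wendland_c2(r, rho=0.3):
    """Wendland's C^2 compactly supported RBF:
    phi(r) = (1 - r/rho)_+^4 (4 r/rho + 1); identically zero for r >= rho,
    so the resulting Gram matrix is sparse (banded after sorting the inputs)."""
    s = r / rho
    return np.where(s < 1.0, (1.0 - s) ** 4 * (4.0 * s + 1.0), 0.0)

def fit_rbf_regularization(x, y, lam=1e-3, rho=0.3):
    """Method of regularization with a compactly supported kernel:
    solve (K + n*lam*I) c = y and return the estimator
    f_hat(t) = sum_i c_i * phi(|t - x_i|)."""
    n = len(x)
    K = wendland_c2(np.abs(x[:, None] - x[None, :]), rho)
    c = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda t: wendland_c2(np.abs(t[:, None] - x[None, :]), rho) @ c

# Toy example: noisy observations of a smooth periodic function on [0, 1].
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(100)
f_hat = fit_rbf_regularization(x, y)
print(np.round(f_hat(np.linspace(0.0, 1.0, 5)), 2))

In practice, lam and rho would be chosen by cross-validation or a related data-driven rule; the choice of the support radius governs the trade-off between sparsity of the Gram matrix and approximation quality.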



Key words and phrases: method of regularization, nonparametric estimation, radial basis functions, rate of convergence, reproducing kernel.
