
Statistica Sinica 24 (2014), 555-575

ASYMPTOTIC LAWS FOR CHANGE POINT ESTIMATION
IN INVERSE REGRESSION
Sophie Frick, Thorsten Hohage and Axel Munk
Universität Göttingen

Abstract: We derive rates of convergence and asymptotic normality of the least squares estimator for a large class of parametric inverse regression models Y = (Φf)(X) + ε. Our theory provides a unified asymptotic treatment for the estimation of f with discontinuities of a certain order, including piecewise polynomials and piecewise kink functions. Our results cover several classical and new examples, including splines with free knots and the estimation of piecewise linear functions from indirect observations under a nonlinear Hammerstein integral operator. Furthermore, we show that ℓ⁰-penalisation leads to consistent model selection, using techniques from empirical process theory. The asymptotic normality is used to provide confidence bands for f. Simulation studies and a data example from rheology illustrate the results.
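For orientation, a minimal sketch of the estimation problem described above; the function class F, the penalty weight λ_n, and the change point count J(f) are illustrative placeholders, not the paper's exact definitions:

\[
Y_i = (\Phi f)(X_i) + \varepsilon_i, \qquad i = 1,\dots,n,
\]
\[
\hat f_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}} \; \sum_{i=1}^n \bigl( Y_i - (\Phi f)(X_i) \bigr)^2 \; + \; \lambda_n \, J(f),
\]

where J(f) counts the change points of f, so the ℓ⁰-type penalty enforces sparsity in the number of segments and underlies the consistent model selection stated in the abstract.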

Key words and phrases: Asymptotic normality, change point analysis, confidence bands, dynamic stress moduli, entropy bounds, Hammerstein integral equations, jump detection, penalized least squares estimator, reproducing kernel Hilbert spaces, sparsity, statistical inverse problems.
