Abstract: We consider a regression model with errors-in-variables. Let (Y_i, Z_i), i = 1, …, n, be i.i.d. copies of (Y, Z) satisfying Z = f(X) + ξ, Y = X + σε, involving independent and unobserved random variables X, ξ, ε. The density of ε and the constant noise level σ are known, while the densities of X and ξ are unknown. Using the observations (Y_i, Z_i), i = 1, …, n, we propose an estimator of the regression function f, defined as the ratio of two adaptive estimators: an estimator of ℓ = fg divided by an estimator of g, the density of X. Both estimators are obtained by minimizing penalized contrast functions. We prove that the MISE of the resulting estimator of f on a compact set is bounded by the sum of the two MISEs of the estimators of ℓ and g. Rates of convergence are given when ℓ and g belong to various smoothness classes and when the error ε is either ordinary smooth or super smooth. The rate of the estimator of f is optimal in a minimax sense in all cases where lower bounds are available.
Key words and phrases: Adaptive estimation, density deconvolution, errors-in-variables, minimax estimation, nonparametric regression, projection estimators.
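As a rough numerical illustration of the ratio idea f̂ = ℓ̂/ĝ, the sketch below uses plain deconvolution kernel estimators (sinc kernel, Fourier inversion) with a hand-picked cutoff rather than the paper's adaptive penalized projection estimators. The model instance — f(x) = sin(x), standard Gaussian X and ε, Gaussian ξ, σ = 0.3 — is invented for the example; only the structure Z = f(X) + ξ, Y = X + σε and the ratio construction come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (assumptions for illustration): Z = f(X) + xi, Y = X + sigma*eps,
# with f(x) = sin(x), X ~ N(0,1), eps ~ N(0,1) known, sigma = 0.3.
n, sigma = 5000, 0.3
f = np.sin
X = rng.normal(0.0, 1.0, n)
Y = X + sigma * rng.normal(0.0, 1.0, n)   # observed covariate with measurement error
Z = f(X) + 0.2 * rng.normal(0.0, 1.0, n)  # observed response

def phi_eps(t):
    # Characteristic function of the known noise sigma*eps (Gaussian case).
    return np.exp(-0.5 * (sigma * t) ** 2)

def deconv_estimate(x, weights, h):
    # Deconvolution estimator with sinc kernel (Fourier transform = 1 on [-1/h, 1/h]):
    # (2*pi*n)^{-1} * sum_j w_j * int_{|t|<=1/h} exp(-i t x) exp(i t Y_j) / phi_eps(t) dt,
    # approximated by a Riemann sum over a grid in t.
    t = np.linspace(-1.0 / h, 1.0 / h, 401)
    dt = t[1] - t[0]
    # Weighted empirical characteristic function of Y, shape (len(t),).
    emp = (weights[None, :] * np.exp(1j * t[:, None] * Y[None, :])).mean(axis=1)
    integrand = np.exp(-1j * np.outer(x, t)) * (emp / phi_eps(t))[None, :]
    return np.real(integrand.sum(axis=1) * dt) / (2 * np.pi)

x = np.linspace(-1.0, 1.0, 21)
h = 0.4  # bandwidth fixed by hand here; the paper selects the cutoff adaptively
g_hat = deconv_estimate(x, np.ones(n), h)  # estimates g, the density of X
l_hat = deconv_estimate(x, Z, h)           # estimates l = f*g
f_hat = l_hat / g_hat                      # ratio estimator of the regression function
err = np.max(np.abs(f_hat - f(x)))
```

On this compact set g is bounded away from zero, so the ratio is stable; near the tails of g the ratio would need to be truncated, which is why the MISE bound in the abstract is stated on a compact set.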