
Statistica Sinica 26 (2016), 527-546. doi: http://dx.doi.org/10.5705/ss.202014.0151

KERNEL ADDITIVE SLICED INVERSE REGRESSION
Heng Lian and Qin Wang
University of New South Wales and Virginia Commonwealth University

Abstract: In recent years, nonlinear sufficient dimension reduction (SDR) methods have gained increasing popularity. While there is a large literature on semiparametric models in regression, parsimonious structured nonlinear SDR has attracted little attention so far. In this paper, extending kernel sliced inverse regression, we study additive models in the context of SDR and demonstrate their potential usefulness owing to their flexibility and parsimony. We show that the additive structure yields an improved convergence rate, which follows from the faster decay of the eigenvalues of the additive kernel. The additive structure also opens the possibility of nonparametric variable selection. This sparsification of the kernel, in contrast with sparse regression, does not introduce additional tuning parameters. Simulation studies and real data examples are presented to illustrate the benefits and limitations of the approach.

Key words and phrases: Kernel method, nonlinear dimension reduction, sliced inverse regression, variable selection.
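To make the setup concrete, the following is a minimal Python sketch of sliced inverse regression carried out with an additive (coordinate-wise) Gaussian kernel, in the spirit of the kernel SIR extension described in the abstract. It is an illustration under stated assumptions, not the authors' implementation: the bandwidth `gamma`, the ridge regularizer, the equal-count slicing scheme, and all function names are hypothetical choices made for the example.

```python
import numpy as np

def additive_gaussian_gram(X, gamma=1.0):
    """Additive kernel: K(x, x') = sum_j exp(-gamma * (x_j - x'_j)^2).

    Summing one-dimensional kernels over coordinates encodes the
    additive structure directly in the Gram matrix.
    """
    n, p = X.shape
    K = np.zeros((n, n))
    for j in range(p):
        d = X[:, j][:, None] - X[:, j][None, :]
        K += np.exp(-gamma * d ** 2)
    return K

def kernel_sir(X, y, n_slices=5, n_components=2, gamma=1.0, ridge=1e-3):
    """Kernel SIR with an additive Gaussian kernel (illustrative sketch)."""
    n = len(y)
    K = additive_gaussian_gram(X, gamma)
    # center the Gram matrix (centering in feature space)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # slice the response into equal-count slices and build the
    # within-slice averaging matrix W
    order = np.argsort(y)
    W = np.zeros((n, n))
    for s in np.array_split(order, n_slices):
        W[np.ix_(s, s)] = 1.0 / len(s)
    # generalized eigenproblem for the between-slice operator:
    #   Kc W Kc a = lambda (Kc Kc + ridge * n * I) a
    # the ridge term regularizes the ill-posed inverse, an assumption here
    A = Kc @ W @ Kc
    B = Kc @ Kc + ridge * n * np.eye(n)
    vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
    idx = np.argsort(-vals.real)[:n_components]
    alphas = vecs[:, idx].real
    # reduced predictors: projections of the centered feature maps
    return Kc @ alphas

# toy usage: y depends on x1 and x2 through additive nonlinear effects
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
Z = kernel_sir(X, y, n_slices=8, n_components=1)  # 200 x 1 reduced predictor
```

Note that the sketch does not include the nonparametric variable selection step mentioned in the abstract; sparsifying the additive kernel would amount to dropping coordinate-wise terms from the sum in `additive_gaussian_gram`.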
