Statistica Sinica 28 (2018), 1009-1029
Abstract: Sliced inverse regression is a valuable tool for dimension reduction: one can replace the predictor vector with a few linear combinations of its components without loss of information on the regression. This paper develops a richer, nonlinear form of dimension reduction. Each sliced inverse regression direction is simply the slope vector of a multiple linear regression applied to an optimally transformed response. Using this connection, we propose a nonlinear version of sliced inverse regression by replacing the linear function of the predictors with an additive function. Our procedure has a clear interpretation as sliced inverse regression on a set of adaptively chosen transformations of the predictors. The flexibility of our method is illustrated through a simulation study and a data application.
Key words and phrases: Canonical correlation, optimal scoring, sufficient dimension reduction.
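To make the stated connection concrete, the following is a minimal sketch of the standard optimal-scoring characterization of sliced inverse regression, written in generic notation (the symbols $Y$, $X$, $\theta_k$, and $\beta_k$ are illustrative and need not match the paper's own notation), assuming $\theta(Y)$ and $X$ are centered:
\[
(\hat{\theta}_k, \hat{\beta}_k) \;=\; \operatorname*{arg\,min}_{\theta,\,\beta}\;
\mathbb{E}\bigl\{\theta(Y) - \beta^{\top} X\bigr\}^{2}
\quad \text{subject to} \quad \operatorname{Var}\{\theta(Y)\} = 1,
\]
where $\theta$ ranges over transformations of the response that are constant within slices, and the $k$-th pair is required to be uncorrelated with the earlier pairs. The minimizing $\beta_k$ is then, up to scale, the $k$-th sliced inverse regression direction, and the proposal described in the abstract can be read as replacing the linear term $\beta^{\top} X$ in this criterion with an additive function of the predictors.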