Abstract: In this article, we develop a sufficient dimension reduction theory for time series. This does not require specification of a model but seeks to find a p × d matrix Φ_d with the smallest possible number d (1 ≤ d ≤ p) such that the conditional distribution of X_t given X_{t-1}^(p) is the same as that of X_t given Φ_d′ X_{t-1}^(p), where X_{t-1}^(p) = (X_{t-1}, ..., X_{t-p})′, resulting in no loss of information about the conditional distribution of the series given its past p values. We define the subspace spanned by the columns of Φ_d as the time series central subspace and estimate it by maximizing the Kullback-Leibler distance. We show that the estimator is consistent when p and d are known. In addition, for unknown p and d, we propose a consistent estimator of d and a graphical method to determine p. Finally, we present examples and a data analysis to illustrate a theory that may open new research avenues in time series analysis.
Key words and phrases: Density estimator, Kullback-Leibler distance, nonlinear time series, threshold, time series central subspace.