

Statistica Sinica 20 (2010), 747-770


DIMENSION REDUCTION IN TIME SERIES


Jin-Hong Park$^1$, T. N. Sriram$^2$ and Xiangrong Yin$^2$


$^1$College of Charleston and $^2$University of Georgia


Abstract: In this article, we develop a sufficient dimension reduction theory for time series. The theory requires no model specification; instead, it seeks a $p\times d$ matrix $\Phi_d$ with the smallest possible number $d$ ($\le p$) of columns such that the conditional distribution of $x_t\mid\mathbf{X}_{t-1}$ is the same as that of $x_t\mid\Phi_d^T\mathbf{X}_{t-1}$, where $\mathbf{X}_{t-1}=(x_{t-1},\ldots,x_{t-p})^T$, so that no information about the conditional distribution of the series given its past $p$ values is lost. We define the subspace spanned by the columns of $\Phi_d$ as the time series central subspace and estimate it by maximizing a Kullback-Leibler distance. We show that the estimator is consistent when $p$ and $d$ are known. In addition, for unknown $d$ and $p$, we propose a consistent estimator of $d$ and a graphical method to determine $p$. Finally, we present examples and a data analysis to illustrate the theory, which may open new research avenues in time series analysis.
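
As an informal illustration of the idea (not the estimator developed in the paper), the following Python sketch simulates a nonlinear autoregressive series whose dynamics depend on a single linear combination $\beta^T\mathbf{X}_{t-1}$ of the past $p=2$ values, and then recovers the direction by scanning candidate unit vectors $u$ and scoring each with a binned variance-of-conditional-means criterion. The simulation model, the scoring rule (a crude stand-in for the paper's Kullback-Leibler objective), and the function name score are assumptions made for this sketch.

    import numpy as np

    rng = np.random.default_rng(0)
    p, n = 2, 2000
    beta = np.array([1.0, -0.5])
    beta /= np.linalg.norm(beta)   # true direction spanning the (d = 1) central subspace

    # Simulate x_t = sin(beta^T X_{t-1}) + noise, with X_{t-1} = (x_{t-1}, x_{t-2})^T.
    x = np.zeros(n)
    for t in range(p, n):
        x[t] = np.sin(beta @ x[t-p:t][::-1]) + 0.3 * rng.standard_normal()

    Y = x[p:]                                        # responses x_t
    X = np.column_stack([x[p-1:-1], x[p-2:-2]])      # lagged predictors X_{t-1}

    def score(u, bins=20):
        # Variance of the binned conditional means of Y given u^T X; larger values
        # indicate that the reduced predictor u^T X carries more of the dynamics.
        z = X @ u
        edges = np.quantile(z, np.linspace(0.0, 1.0, bins + 1))
        idx = np.clip(np.searchsorted(edges, z, side="right") - 1, 0, bins - 1)
        return np.var([Y[idx == b].mean() for b in range(bins)])

    # Scan directions on the half-circle (directions are identified only up to sign).
    angles = np.linspace(0.0, np.pi, 181)
    scores = [score(np.array([np.cos(a), np.sin(a)])) for a in angles]
    a_hat = angles[int(np.argmax(scores))]
    print("true:", beta, " estimated:", np.array([np.cos(a_hat), np.sin(a_hat)]))

In the paper itself, this binned proxy would be replaced by a density-estimation-based Kullback-Leibler objective, and the scan over angles by optimization over $p\times d$ matrices $\Phi_d$.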



Key words and phrases: Density estimator, Kullback-Leibler distance, nonlinear time series, threshold, time series central subspace.
