

Statistica Sinica 17(2007), 199-220





SELF-NORMALIZATION FOR HEAVY-TAILED TIME SERIES

WITH LONG MEMORY


Tucker McElroy$^{1,2}$ and Dimitris Politis$^2$


$^1$U.S. Census Bureau and $^2$University of California, San Diego


Abstract: Many time series data sets exhibit heavy tails and/or long memory, both of which are well known to greatly influence the rate of convergence of the sample mean. Time series analysts typically consider models with either heavy tails or long memory; we consider both. The paper is essentially a theoretical case study that explores the growth rate of the sample mean for a particular heavy-tailed, long memory time series model. Our main theorem establishes an exact rate of convergence, which displays the competition between memory and tail thickness in fostering the growth of the sample mean. An appropriate self-normalization is then used to produce a studentized sample mean statistic that is computable without prior knowledge of the tail and memory parameters. In this way the paper presents a novel heavy-tailed time series model that also has long memory in the sense of sums of well-defined autocovariances, explicitly identifies the roles that memory and tail thickness play in determining the sample mean's rate of growth, and constructs an appropriate studentization. Our model is a natural extension of long memory Gaussian models to data with infinite variance, and therefore pertains to a wide range of applications, including finance, insurance, and hydrology.
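To illustrate the general idea of self-normalization referred to in the abstract, the sketch below computes the classical self-normalized sample-mean statistic T_n = S_n / V_n, where S_n is the partial sum and V_n is the square root of the sum of squares. This is only a minimal, generic sketch: the studentization constructed in the paper for heavy-tailed, long memory data is more involved, and the function name and the choice of a Pareto example are the author's illustrative assumptions, not the paper's method.

```python
import numpy as np

def self_normalized_mean(x):
    """Classical self-normalized sum: S_n / V_n with
    S_n = sum(x) and V_n = sqrt(sum(x**2)).

    Illustrative sketch only; the paper's studentization for
    long-memory heavy-tailed series differs in its normalizer.
    """
    x = np.asarray(x, dtype=float)
    s_n = x.sum()                  # partial sum S_n
    v_n = np.sqrt(np.sum(x ** 2))  # self-normalizer V_n
    return s_n / v_n

# Heavy-tailed i.i.d. draws with tail index 1.5 (infinite variance):
# the raw sample mean converges slowly, but the self-normalized
# statistic requires no knowledge of the tail index to compute.
rng = np.random.default_rng(0)
x = rng.pareto(1.5, size=10_000)
t = self_normalized_mean(x)
```

The appeal of such statistics is exactly what the abstract emphasizes: the unknown rate of growth of S_n cancels against that of V_n, so no tail or memory parameter needs to be estimated before forming the ratio.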



Key words and phrases: Heavy-tailed data, infinite variance, long-range dependence, studentization.
