
Statistica Sinica 29 (2019), 165-184

HYPER MARKOV LAWS FOR CORRELATION MATRICES
Jeremy Gaskins
University of Louisville

Abstract: Parsimoniously modeling dependence in multivariate data is a challenging task, particularly when modeling assumptions or identifiability constraints require the dependence parameter to be a correlation matrix. In this work, we connect the techniques of graphical models and the hyper inverse Wishart distribution to introduce hyper Markov priors for correlation matrices. The priors are formed by taking a Markov combination of non-sparse correlation matrix distributions, obtained by marginalizing the diagonal elements out of an inverse Wishart or Wishart prior. These priors produce a sparse correlation matrix whose inverse contains zero elements wherever variables are conditionally independent. An MCMC scheme for posterior inference is introduced, and its performance is assessed in the context of the Gaussian copula model through a simulation study and a financial data example.
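As a minimal sketch of the non-sparse building block described in the abstract (not the paper's full construction, which forms a Markov combination of such laws over the cliques of a graph), one can draw a covariance matrix from an inverse Wishart distribution and rescale it to a correlation matrix, which marginalizes the diagonal (variance) elements out of the draw. The function name `sample_marginal_correlation` and the degrees-of-freedom and scale settings below are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np
from scipy.stats import invwishart

def sample_marginal_correlation(df, scale, rng=None):
    """Draw a correlation matrix induced by an inverse Wishart prior on the covariance."""
    sigma = invwishart.rvs(df=df, scale=scale, random_state=rng)
    d = np.sqrt(np.diag(sigma))
    # Rescaling by the standard deviations discards the diagonal elements,
    # leaving only the induced correlation matrix (all diagonal entries equal 1).
    return sigma / np.outer(d, d)

rng = np.random.default_rng(0)
p = 4  # illustrative dimension
R = sample_marginal_correlation(df=p + 2, scale=np.eye(p), rng=rng)
print(np.round(R, 3))
```

Under this construction, zeros in the inverse of the combined correlation matrix correspond to conditional independencies encoded by the underlying graph.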

Key words and phrases: Copula model, dependence modeling, Gaussian graphical model, hyper inverse Wishart, reversible jump MCMC, sparsity.
