

Statistica Sinica 18(2008), 535-558





INFORMATION IDENTITIES AND TESTING HYPOTHESES:

POWER ANALYSIS FOR CONTINGENCY TABLES


Philip E. Cheng$^1$, Michelle Liou$^1$, John A. D. Aston$^{1,2}$ and Arthur C. Tsai$^1$


$^1$Academia Sinica and $^2$University of Warwick


Abstract: An information theoretic approach to the evaluation of $2\times2$ contingency tables is proposed. By investigating the relationship between the Kullback-Leibler divergence and the maximum likelihood estimator, information identities are established for testing hypotheses, in particular, for testing independence. These identities not only validate the calibration of $p$ values, but also yield a unified power analysis for the likelihood ratio test, Fisher's exact test and the Pearson-Yates chi-square test. It is shown that a widely discussed exact unconditional test for the equality of binomial parameters is ill-posed for testing independence, and that using this test to criticize Fisher's exact test as being conservative is logically flawed.
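The central information identity referred to in the abstract is that the likelihood ratio statistic for independence in a contingency table equals $2N$ times the empirical mutual information between the row and column variables. A minimal sketch in plain Python, using illustrative (made-up) cell counts for a $2\times2$ table:

```python
import math

# Hypothetical 2x2 contingency table (counts are illustrative only)
table = [[20, 10],
         [15, 25]]

N = sum(sum(row) for row in table)
row_tot = [sum(row) for row in table]
col_tot = [sum(table[i][j] for i in range(2)) for j in range(2)]

# Likelihood ratio statistic:
#   G^2 = 2 * sum_ij n_ij * log( n_ij * N / (r_i * c_j) )
G2 = 2 * sum(
    table[i][j] * math.log(table[i][j] * N / (row_tot[i] * col_tot[j]))
    for i in range(2) for j in range(2)
)

# Empirical mutual information I(X;Y) in nats, i.e., the Kullback-Leibler
# divergence between the joint cell proportions and the product of marginals
I_xy = sum(
    (table[i][j] / N) * math.log(
        (table[i][j] / N) / ((row_tot[i] / N) * (col_tot[j] / N))
    )
    for i in range(2) for j in range(2)
)

# Information identity: G^2 = 2 * N * I(X;Y)
print(G2, 2 * N * I_xy)
```

Under independence, $G^2$ is asymptotically chi-square distributed with one degree of freedom for a $2\times2$ table, which is how the identity connects the likelihood ratio test to the Pearson-Yates chi-square calibration discussed in the paper.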



Key words and phrases: Chi-square test, contingency table, exact test, Kullback-Leibler divergence, likelihood ratio test, mutual information.
