Abstract: An information-theoretic approach to the evaluation of contingency tables is proposed. By investigating the relationship between the Kullback-Leibler divergence and the maximum likelihood estimator, information identities are established for testing hypotheses, in particular, for testing independence. These identities not only validate the calibration of p-values, but also yield a unified power analysis for the likelihood ratio test, Fisher's exact test, and the Pearson-Yates chi-square test. It is shown that a widely discussed exact unconditional test for the equality of binomial parameters is ill-posed for testing independence, and that using this test to criticize Fisher's exact test as being conservative is logically flawed.
Key words and phrases: Chi-square test, contingency table, exact test, Kullback-Leibler divergence, likelihood ratio test, mutual information.
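The central identity behind the abstract's unified treatment is that the likelihood ratio statistic for independence, G-squared, equals 2n times the plug-in mutual information of the empirical joint distribution (a plug-in Kullback-Leibler divergence from independence). A minimal sketch in Python, using illustrative counts for a hypothetical 2x2 table:

```python
import math

# Hypothetical 2x2 contingency table (illustrative counts only).
table = [[12, 5], [7, 16]]

n = sum(sum(row) for row in table)
row_tot = [sum(row) for row in table]
col_tot = [sum(table[i][j] for i in range(2)) for j in range(2)]

# G^2: likelihood ratio statistic for independence,
# mi: mutual information of the empirical joint distribution (in nats).
G2 = 0.0
mi = 0.0
for i in range(2):
    for j in range(2):
        obs = table[i][j]
        exp = row_tot[i] * col_tot[j] / n  # expected count under independence
        G2 += 2 * obs * math.log(obs / exp)
        p_ij = obs / n
        mi += p_ij * math.log(p_ij / ((row_tot[i] / n) * (col_tot[j] / n)))

print(G2, 2 * n * mi)  # the two quantities agree up to rounding
```

The agreement is algebraic, not numerical coincidence: each term 2·O·log(O/E) rewrites exactly as 2n·p_ij·log(p_ij / (p_i·p_j)), so the identity holds for any table with positive cell counts.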