Abstract: This paper studies the general problem of making inferences about functions of two sets of parameters where, given the first set, there exists a statistic with a known distribution. We study the distribution of this statistic when the first set of parameters is unknown and is replaced by an estimator. We show that, under mild conditions, the variance of the statistic is inflated when the unconstrained maximum likelihood estimator (MLE) is used, but deflated when the constrained MLE is used. These results are useful in hypothesis testing and confidence-interval construction, providing inference methods that are simpler than, and improve upon, those from standard large-sample likelihood theory. We present three applications of our theory, namely Box-Cox regression, dynamic regression, and spatial regression, to illustrate the generality and versatility of our results.
Key words and phrases: Asymptotic distribution, finite sample performance, index parameter, variance deflation, variance inflation.