Abstract: Factor analysis is a standard tool in educational testing contexts, and its models can be fitted using the EM algorithm (Dempster, Laird and Rubin (1977)). An extension of EM, the ECME algorithm (Liu and Rubin (1994)), can be used to obtain ML estimates more efficiently in factor analysis models. ECME has an E-step identical to that of EM, but in place of EM's M-step it has a sequence of CM (conditional maximization) steps, each of which maximizes either the constrained expected complete-data log-likelihood, as with the ECM algorithm (Meng and Rubin (1993)), or the constrained actual log-likelihood. For factor analysis, we use two CM steps: the first maximizes the expected complete-data log-likelihood over the factor loadings given fixed uniquenesses, and the second maximizes the actual likelihood over the uniquenesses given fixed factor loadings. We also describe EM and ECME for ML estimation of factor analysis from incomplete data, which arise in educational testing applications. ECME shares with EM its monotone increase in likelihood and stable convergence to an ML estimate, but converges more quickly than EM. This more rapid convergence not only can shorten CPU time but, at least as important, allows a substantially easier assessment of convergence, as shown by examples. We believe that the application of ECME to factor analysis illustrates the role that extended EM-type algorithms, such as the even more general AECM algorithm (Meng and van Dyk (1997)) and the PX-EM algorithm (Liu, Rubin and Wu (1997)), can play in fitting the complex models that arise in educational testing contexts.
Key words and phrases: EM, ECM, incomplete data, missing data.
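As a rough illustration only (not the authors' implementation), the two CM steps described above can be sketched in Python on the sample covariance matrix S of centered data. CM-step 1 is the standard closed-form EM update of the loadings given the uniquenesses; CM-step 2, which has no closed form, is handled here by a generic numerical optimizer over the log-uniquenesses as a stand-in for the update used in the paper. All function and variable names (`ecme_fa`, `Lam`, `psi`) are illustrative choices, not from the source.

```python
import numpy as np
from scipy.optimize import minimize

def loglik(S, n, Lam, psi):
    """Actual log-likelihood of N(0, Lam Lam' + diag(psi)) given sample covariance S."""
    Sigma = Lam @ Lam.T + np.diag(psi)
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * n * (logdet + np.trace(np.linalg.solve(Sigma, S)))

def ecme_fa(S, n, q, iters=30, seed=0):
    """Hypothetical ECME sketch for a q-factor model fitted to sample covariance S."""
    p = S.shape[0]
    rng = np.random.default_rng(seed)
    Lam = rng.normal(scale=0.1, size=(p, q))   # initial loadings
    psi = np.diag(S).copy()                    # initial uniquenesses
    ll = [loglik(S, n, Lam, psi)]
    for _ in range(iters):
        # E-step: conditional moments of the latent factors given the data
        Sigma = Lam @ Lam.T + np.diag(psi)
        beta = Lam.T @ np.linalg.inv(Sigma)            # E[z|x] = beta x
        Ezz = np.eye(q) - beta @ Lam + beta @ S @ beta.T
        # CM-step 1: maximize expected complete-data log-likelihood over Lam (closed form)
        Lam = S @ beta.T @ np.linalg.inv(Ezz)
        # CM-step 2: maximize the ACTUAL log-likelihood over psi given Lam
        # (no closed form; a generic optimizer replaces the paper's update)
        res = minimize(lambda lp: -loglik(S, n, Lam, np.exp(lp)),
                       np.log(psi), method="L-BFGS-B")
        psi = np.exp(res.x)
        ll.append(loglik(S, n, Lam, psi))
    return Lam, psi, ll
```

Because each CM step conditionally increases the actual likelihood, the recorded sequence `ll` should be monotone nondecreasing (up to numerical tolerance), mirroring the stability property claimed in the abstract.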