Article ID: iaor20072124
Country: United States
Volume: 29
Issue: 1
Start Page Number: 27
End Page Number: 44
Publication Date: Feb 2004
Journal: Mathematics of Operations Research
Authors: Tseng Paul
Keywords: entropy
The EM algorithm is a popular method for maximum likelihood estimation from incomplete data. This method may be viewed as a proximal point method for maximizing the log-likelihood function, using an integral form of the Kullback–Leibler distance function. Motivated by this interpretation, we consider a proximal point method using an integral form of an entropy-like distance function. We give a convergence analysis of the resulting proximal point method in the case where the cluster points lie in the interior of the domain of the objective function. This result is applied to a normal/independent example and a Gaussian mixture example to establish convergence of the EM algorithm on these examples. A further convergence analysis of the method for maximization over an orthant is given in low dimensions. Sublinear convergence and schemes for accelerating convergence are also discussed.
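The abstract's Gaussian mixture example can be illustrated with a minimal EM iteration. The sketch below is not the paper's proximal point machinery; it is a standard EM update for a two-component one-dimensional Gaussian mixture, with made-up synthetic data and initialization. The monotone increase of the log-likelihood it exhibits is the property that the proximal point interpretation explains.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data from two Gaussians (illustrative values only).
x = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.5, 100)])

def component_densities(x, w, mu, sigma):
    # Weighted Gaussian density of each component at each data point.
    z = (x[:, None] - mu) / sigma
    return w * np.exp(-0.5 * z**2) / (sigma * np.sqrt(2.0 * np.pi))

def log_likelihood(x, w, mu, sigma):
    return np.sum(np.log(component_densities(x, w, mu, sigma).sum(axis=1)))

def em_step(x, w, mu, sigma):
    # E-step: posterior responsibility of each component for each point.
    dens = component_densities(x, w, mu, sigma)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form weighted updates of mixture parameters.
    n_k = r.sum(axis=0)
    w_new = n_k / len(x)
    mu_new = (r * x[:, None]).sum(axis=0) / n_k
    sigma_new = np.sqrt((r * (x[:, None] - mu_new) ** 2).sum(axis=0) / n_k)
    return w_new, mu_new, sigma_new

# Arbitrary starting point; EM never decreases the log-likelihood.
w, mu, sigma = np.array([0.5, 0.5]), np.array([0.0, 1.0]), np.array([1.0, 1.0])
lls = [log_likelihood(x, w, mu, sigma)]
for _ in range(50):
    w, mu, sigma = em_step(x, w, mu, sigma)
    lls.append(log_likelihood(x, w, mu, sigma))
```

Each `em_step` can be read as one proximal step on the log-likelihood: the E-step builds the Kullback–Leibler proximal term around the current iterate, and the M-step maximizes the resulting regularized surrogate in closed form.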