On Convergence Properties of the EM Algorithm for Gaussian Mixtures


Author(s): Jordan, Michael; Xu, Lei
Date(s)

20/10/2004

21/04/1995

Abstract

We relate the "Expectation-Maximization" (EM) algorithm to gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models.
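To make the setting concrete, here is a minimal sketch of EM for a one-dimensional Gaussian mixture. The function names, initialization scheme, and parameter choices are illustrative assumptions, not the paper's implementation; the paper's projection-matrix result concerns exactly the closed-form M-step update below, which can be written as the likelihood gradient premultiplied by a matrix $P$.

```python
import numpy as np

def em_gmm_1d(x, n_components=2, n_iter=50, seed=0):
    """Illustrative EM for a 1-D Gaussian mixture (hypothetical helper).

    Returns final parameters plus the per-iteration log-likelihoods, so
    the monotone-increase property of EM can be observed directly.
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    # Simple initialization (an assumption, not the paper's scheme):
    # uniform mixing weights, random data points as means, pooled variance.
    pi = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, n_components, replace=False)
    var = np.full(n_components, x.var())
    lls = []
    for _ in range(n_iter):
        # E-step: posterior responsibilities P(component j | x_i).
        dens = (np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2.0 * np.pi * var))        # shape (n, k)
        weighted = dens * pi
        total = weighted.sum(axis=1, keepdims=True)
        resp = weighted / total
        lls.append(np.log(total).sum())
        # M-step: closed-form reestimation. This step equals the
        # likelihood gradient transformed by the projection matrix P
        # analyzed in the paper.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var, np.array(lls)
```

Running this on data drawn from a well-separated two-component mixture shows the log-likelihood sequence is nondecreasing, the convergence behavior the abstract analyzes.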

Format

9 p.

291671 bytes

476864 bytes

application/postscript

application/pdf

Identifier

AIM-1520

CBCL-111

http://hdl.handle.net/1721.1/7195

Language(s)

en_US

Relation

AIM-1520

CBCL-111

Keywords #learning #neural networks #EM algorithm #clustering #mixture models #statistics