Hierarchical Mixtures of Experts and the EM Algorithm


Author(s): Jordan, Michael I.; Jacobs, Robert A.
Date(s)

20/10/2004

01/08/1993

Abstract

We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIMs). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
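The abstract names the two computational pieces: an E-step that computes posterior responsibilities for each expert, and an M-step that refits the gating and expert GLIMs against those responsibilities. As a rough illustration only (not the authors' code), below is a minimal NumPy sketch of EM for a one-level mixture of linear experts, the flat special case of the hierarchical architecture. The function name fit_moe, the hyperparameters, and the single gradient step on the gating weights (the paper fits its GLIMs by IRLS) are all illustrative assumptions.

import numpy as np

def fit_moe(X, y, n_experts=4, n_iters=50, seed=0):
    # EM for a one-level mixture of linear experts: a flat special case
    # of the hierarchical architecture described in the abstract.
    rng = np.random.default_rng(seed)
    N, d = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])                 # inputs with bias column
    V = rng.normal(scale=0.1, size=(n_experts, d + 1))   # gating-network weights
    W = rng.normal(scale=0.1, size=(n_experts, d + 1))   # expert (regression) weights
    sigma2 = np.full(n_experts, np.var(y))               # per-expert noise variances

    for _ in range(n_iters):
        # E-step: posterior responsibility h[n, i] of expert i for case n.
        logits = Xb @ V.T
        g = np.exp(logits - logits.max(axis=1, keepdims=True))
        g /= g.sum(axis=1, keepdims=True)                # softmax gating probabilities
        mu = Xb @ W.T                                    # each expert's predicted mean
        lik = np.exp(-(y[:, None] - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
        h = g * lik
        h /= h.sum(axis=1, keepdims=True) + 1e-12

        # M-step, experts: responsibility-weighted least squares per expert.
        for i in range(n_experts):
            R = h[:, i]
            A = Xb.T @ (R[:, None] * Xb) + 1e-6 * np.eye(d + 1)
            W[i] = np.linalg.solve(A, Xb.T @ (R * y))
            resid = y - Xb @ W[i]
            sigma2[i] = (R @ resid ** 2) / (R.sum() + 1e-12) + 1e-8

        # M-step, gate: one gradient step pushing g toward h
        # (a crude stand-in for the IRLS fit used in the paper).
        V += 0.5 / N * (h - g).T @ Xb
    return V, W, sigma2

Illustrative use on a piecewise-linear toy problem (hypothetical data):

# X = np.random.default_rng(1).uniform(-1.0, 1.0, size=(500, 2))
# y = np.where(X[:, 0] > 0, X @ [1.0, -1.0], X @ [-2.0, 0.5])
# V, W, sigma2 = fit_moe(X, y)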

Format

29 p.

190144 bytes

678911 bytes

application/octet-stream

application/pdf

Identifier

AIM-1440

CBCL-083

http://hdl.handle.net/1721.1/7206

Language(s)

en_US

Keywords #supervised learning #statistics #decision trees #neural networks