Probabilistic Independence Networks for Hidden Markov Probability Models


Author(s): Smyth, Padhraic; Heckerman, David; Jordan, Michael
Date(s)

20/10/2004

20/10/2004

13/03/1996

Abstract

Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
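To make the abstract's reference to the forward-backward (F-B) algorithm concrete, here is a minimal sketch of F-B inference for a discrete HMM. This is an illustration only, not the paper's code; all names (pi, A, B, obs) and the NumPy formulation are assumptions.

```python
# Minimal sketch of the forward-backward algorithm for a discrete HMM,
# the classical special case of PIN inference discussed in the abstract.
import numpy as np

def forward_backward(pi, A, B, obs):
    """Compute smoothed state posteriors p(state_t | obs_1..T).

    pi:  (K,)   initial state distribution (hypothetical name)
    A:   (K, K) transition matrix, A[i, j] = p(j | i)
    B:   (K, M) emission matrix,   B[i, o] = p(o | state i)
    obs: (T,)   observed symbol indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))  # forward messages
    beta = np.zeros((T, K))   # backward messages

    # Forward pass: alpha[t, j] proportional to p(obs_1..t, state_t = j)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] proportional to p(obs_{t+1}..T | state_t = i)
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # Combine and normalize to obtain the smoothed posteriors
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Example: 2 hidden states, 2 output symbols, a short observation sequence.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward_backward(pi, A, B, obs=np.array([0, 1, 0])))
```

In the PIN view developed in the paper, these forward and backward recursions correspond to message passing on the chain-structured graph of the HMM; the same propagation scheme generalizes to richer graph structures.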

Format

31 p.

664995 bytes

687871 bytes

application/postscript

application/pdf

Identifier

AIM-1565

CBCL-132

http://hdl.handle.net/1721.1/7185

Language(s)

en_US

Relation

AIM-1565

CBCL-132

Keywords #AI #MIT #Artificial Intelligence #graphical models #Hidden Markov models #HMMs #learning #probabilistic models #speech recognition #Bayesian networks #belief networks #Markov networks #probabilistic propagation #inference #coarticulation