Krishna S. Nathan, Jayashree Subrahmonia, et al.
ICPR 1996
This paper addresses the problem of estimating the optimal Hidden Markov Model (HMM) topology, defined as the topology that yields the smallest error rate with the minimal number of parameters. The paper introduces a Bayesian model selection criterion suitable for continuous HMM topology optimization. The criterion is derived from the Laplace approximation of the posterior of a model structure, and shares the algorithmic simplicity of conventional Bayesian selection criteria such as Schwarz's Bayesian Information Criterion (BIC). Unlike BIC, which assumes a multivariate normal distribution as the prior over all parameters of the model, the proposed HMM-oriented Bayesian Information Criterion (HBIC) models each parameter with a different distribution, one more appropriate for that parameter. Results on a handwriting recognition task show that HBIC yields a much smaller and more efficient system than one generated with BIC.
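As an illustration of the BIC-style selection the abstract describes, the sketch below scores candidate HMM topologies with Schwarz's BIC (log-likelihood minus a complexity penalty) and picks the best one. The parameter-count formula, candidate state counts, and log-likelihood values are hypothetical, chosen only to show the mechanics; the paper's HBIC replaces the uniform Gaussian-prior penalty with a per-parameter one.

```python
import math

def bic_score(log_likelihood, num_params, num_samples):
    # Schwarz's BIC: reward fit, penalize complexity by (k/2) * log(N).
    return log_likelihood - 0.5 * num_params * math.log(num_samples)

def num_hmm_params(num_states, dim):
    # Hypothetical count for a left-to-right HMM with diagonal-Gaussian
    # emissions: one free transition per state, plus mean + variance per
    # dimension per state.
    transitions = num_states
    emissions = num_states * 2 * dim
    return transitions + emissions

# Illustrative candidates: number of states -> training log-likelihood.
candidates = {3: -1250.0, 5: -1100.0, 8: -1060.0}
N, D = 500, 10  # assumed sample count and feature dimension

best = max(candidates,
           key=lambda s: bic_score(candidates[s], num_hmm_params(s, D), N))
```

Here the 8-state model fits best in raw likelihood, but the penalty term makes the 5-state model the BIC winner, which is exactly the parsimony/accuracy trade-off the optimal-topology definition targets.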
Alain Biem, Bruce Elmegreen, et al.
ICASSP 2010
Alain Biem, Eric Bouillet, et al.
SIGMOD 2010
Michael A. Bauer, Alain Biem, et al.
JoPCS