Publication
ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Paper
A Bayesian model selection criterion for HMM topology optimization
Abstract
This paper addresses the problem of estimating the optimal Hidden Markov Model (HMM) topology. The optimal topology is defined as the one that gives the smallest error rate with the minimal number of parameters. The paper introduces a Bayesian model selection criterion suitable for continuous HMM topology optimization. The criterion is derived from the Laplace approximation of the posterior of a model structure, and shares the algorithmic simplicity of conventional Bayesian selection criteria, such as Schwarz's Bayesian Information Criterion (BIC). Unlike BIC, which assumes a multivariate normal distribution as the prior for all parameters of the model, the proposed HMM-oriented Bayesian Information Criterion (HBIC) models each parameter with a different distribution, one more appropriate for that parameter. Results on a handwriting recognition task show that HBIC yields a much smaller and more efficient system than one generated through BIC.
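For context, the standard form of Schwarz's BIC (textbook background, not reproduced from this paper) scores a candidate model M with k_M free parameters, fitted by maximum likelihood to N observations, as

\mathrm{BIC}(M) = \log p(X \mid \hat{\theta}_M, M) - \frac{k_M}{2} \log N,

where \hat{\theta}_M is the maximum-likelihood parameter estimate; the topology maximizing this score is selected. The second term penalizes every parameter identically, which follows from the multivariate normal prior assumption mentioned above. HBIC, as described in the abstract, replaces this uniform penalty with per-parameter terms reflecting distributions appropriate to each HMM parameter type; the exact HBIC penalty derived in the paper is not reproduced here.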