Publication
ICASSP 2008
Conference paper
Variational Bhattacharyya divergence for hidden Markov models
Abstract
Many applications require the use of divergence measures between probability distributions. Several of these, such as the Kullback-Leibler (KL) divergence and the Bhattacharyya divergence, are tractable for simple distributions such as Gaussians, but are intractable for more complex distributions such as the hidden Markov models (HMMs) used in speech recognizers. For tasks related to classification error, the Bhattacharyya divergence is of special importance because of its relationship with the Bayes error. Here we derive novel variational approximations to the Bhattacharyya divergence for HMMs. Remarkably, the variational Bhattacharyya divergence can be computed in a simple closed-form expression for a given sequence length. One of the approximations can even be integrated over all possible sequence lengths in closed form. We apply the variational Bhattacharyya divergence for HMMs to word confusability, the problem of estimating the probability of mistaking one spoken word for another. ©2008 IEEE.
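As background to the abstract's claim that the Bhattacharyya divergence is tractable for Gaussians and bounds the Bayes error, the sketch below computes the standard closed-form Bhattacharyya divergence between two multivariate Gaussians. This is illustrative only and is not code from the paper; the function name and example parameters are hypothetical.

```python
# Closed-form Bhattacharyya divergence between two multivariate Gaussians.
# Illustrative sketch (not from the paper); it shows the "tractable for
# simple distributions such as Gaussians" case mentioned in the abstract.
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Return D_B = -ln ∫ sqrt(N(x; mu1, cov1) * N(x; mu2, cov2)) dx."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = 0.5 * (cov1 + cov2)                    # averaged covariance
    diff = mu1 - mu2
    # Mahalanobis-like term between the means under the averaged covariance.
    term_mean = 0.125 * diff @ np.linalg.solve(cov, diff)
    _, logdet = np.linalg.slogdet(cov)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    # Term penalizing mismatch between the two covariances.
    term_cov = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return term_mean + term_cov

# Example with two 2-D Gaussians. The Bhattacharyya coefficient
# BC = exp(-D_B) upper-bounds the two-class Bayes error (equal priors)
# by 0.5 * BC, which is the connection to classification error noted above.
d_b = bhattacharyya_gaussian([0.0, 0.0], np.eye(2), [1.0, 0.0], 2 * np.eye(2))
print("D_B =", d_b, "Bayes error bound =", 0.5 * np.exp(-d_b))
```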