Publication
INTERSPEECH 2007
Conference paper

Bhattacharyya error and divergence using variational importance sampling

Abstract

Many applications require the use of divergence measures between probability distributions. Several of these, such as the Kullback-Leibler (KL) divergence and the Bhattacharyya divergence, are tractable for single Gaussians, but intractable for complex distributions such as Gaussian mixture models (GMMs) used in speech recognizers. For tasks related to classification error, the Bhattacharyya divergence is of special importance. Here we derive efficient approximations to the Bhattacharyya divergence for GMMs, using novel variational methods and importance sampling. We introduce a combination of the two, variational importance sampling (VISa), which performs importance sampling using a proposal distribution derived from the variational approximation. VISa achieves the same accuracy as naive importance sampling at a fraction of the computation. Finally we apply the Bhattacharyya divergence to compute word confusability and compare the corresponding estimates using the KL divergence.
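To make the quantity being approximated concrete, the sketch below estimates the Bhattacharyya divergence D_B = -ln ∫ sqrt(f(x) g(x)) dx between two GMMs by plain importance sampling. It is a minimal illustration, not the paper's VISa algorithm: the proposal here is simply the 50/50 mixture of f and g, whereas the paper derives the proposal from a variational approximation. All function and variable names are hypothetical.

```python
# Minimal sketch: importance-sampling estimate of the Bhattacharyya
# divergence between two GMMs, D_B = -ln BC, BC = E_h[ sqrt(f g) / h ].
# Proposal h is the equal-weight mixture of f and g (an assumption made
# for illustration; the paper's VISa uses a variational proposal instead).
import numpy as np
from scipy.stats import multivariate_normal


def gmm_logpdf(x, weights, means, covs):
    """Log-density of a GMM evaluated at points x (shape [n, d])."""
    comp = np.array([
        np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
        for w, m, c in zip(weights, means, covs)
    ])  # shape [k, n]
    return np.logaddexp.reduce(comp, axis=0)


def bhattacharyya_divergence_is(f, g, n_samples=10000, seed=None):
    """Estimate -ln ∫ sqrt(f g) dx for GMMs f and g.

    f and g are (weights, means, covs) triples; half the samples are
    drawn from each GMM, i.e. the proposal is the 50/50 mixture of the two.
    """
    rng = np.random.default_rng(seed)
    samples = []
    for weights, means, covs in (f, g):
        counts = rng.multinomial(n_samples // 2, weights)
        for k, n_k in enumerate(counts):
            if n_k > 0:
                samples.append(rng.multivariate_normal(means[k], covs[k], size=n_k))
    x = np.vstack(samples)

    log_f = gmm_logpdf(x, *f)
    log_g = gmm_logpdf(x, *g)
    log_h = np.logaddexp(log_f, log_g) - np.log(2.0)   # proposal density
    log_w = 0.5 * (log_f + log_g) - log_h               # sqrt(f g) / h
    log_bc = np.logaddexp.reduce(log_w) - np.log(len(x))
    return -log_bc


if __name__ == "__main__":
    # Two toy GMMs in 2 dimensions.
    f = ([0.5, 0.5], [np.zeros(2), np.ones(2)], [np.eye(2), np.eye(2)])
    g = ([1.0], [2.0 * np.ones(2)], [np.eye(2)])
    print(bhattacharyya_divergence_is(f, g, seed=0))
```

The naive estimator above needs many samples when f and g overlap little; replacing the proposal with one fitted via the variational approximation is what lets VISa reach the same accuracy at a fraction of the computation.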
