Publication
ICASSP 2008
Conference paper

Accelerated Monte Carlo for Kullback-Leibler divergence between Gaussian mixture models

Abstract

Kullback-Leibler (KL) divergence is widely used as a measure of dissimilarity between two probability distributions; however, the required integral is not tractable for Gaussian mixture models (GMMs), and naive Monte Carlo sampling methods can be expensive. Our work aims to improve sampling-based estimation of the KL divergence between GMMs. We show how to accelerate Monte Carlo sampling using variational approximations of the KL divergence. To this end we employ two different methodologies: control variates and importance sampling. With control variates, we use sampling to estimate the difference between the variational approximation and the unknown KL divergence. With importance sampling, we estimate the KL divergence directly, using a sampling distribution derived from the variational approximation. We show that with these techniques we can achieve improvements in accuracy equivalent to using 30 times more samples. ©2008 IEEE.
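
The paper itself provides no code; the following is a minimal 1-D sketch of the two estimators the abstract describes, assuming NumPy. Moment-matched single Gaussians stand in for the paper's variational approximation as the tractable surrogate in the control-variate estimator, and a crude mixture of both GMMs' components stands in for the variationally derived importance-sampling distribution; all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_logpdf(x, m, s):
    # log N(x; m, s^2)
    return -0.5 * np.log(2 * np.pi * s**2) - 0.5 * ((x - m) / s) ** 2

def gmm_logpdf(x, w, m, s):
    # log density of a 1-D GMM at points x (log-sum-exp over components)
    return np.logaddexp.reduce(np.log(w) + gauss_logpdf(x[:, None], m, s), axis=1)

def gmm_sample(n, w, m, s):
    # draw n samples from a 1-D GMM
    k = rng.choice(len(w), size=n, p=w)
    return rng.normal(m[k], s[k])

def kl_naive_mc(n, f, g):
    # naive Monte Carlo: x ~ f, average log f(x) - log g(x)
    x = gmm_sample(n, *f)
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))

def moment_match(f):
    # single Gaussian with the same mean and variance as the GMM f
    w, mu, sd = f
    m = np.sum(w * mu)
    return m, np.sqrt(np.sum(w * (sd**2 + mu**2)) - m**2)

def expected_log_gauss(f, m, s):
    # closed-form E_f[log N(x; m, s^2)] when f is a 1-D GMM
    w, mu, sd = f
    return np.sum(w * (-0.5 * np.log(2 * np.pi * s**2)
                       - 0.5 * (sd**2 + (mu - m) ** 2) / s**2))

def kl_control_variate(n, f, g):
    # subtract tractable surrogates log q_f, log q_g from the integrand and
    # add their exact expectations under f back in; unbiased, lower variance
    (mf, sf), (mg, sg) = moment_match(f), moment_match(g)
    x = gmm_sample(n, *f)
    resid = ((gmm_logpdf(x, *f) - gauss_logpdf(x, mf, sf))
             - (gmm_logpdf(x, *g) - gauss_logpdf(x, mg, sg)))
    return expected_log_gauss(f, mf, sf) - expected_log_gauss(f, mg, sg) + np.mean(resid)

def kl_importance_sampling(n, f, g, q):
    # x ~ q, weight each sample by f(x)/q(x) to estimate E_f[log f - log g]
    x = gmm_sample(n, *q)
    w = np.exp(gmm_logpdf(x, *f) - gmm_logpdf(x, *q))
    return np.mean(w * (gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

if __name__ == "__main__":
    f = (np.array([0.6, 0.4]), np.array([-1.0, 2.0]), np.array([0.8, 1.2]))
    g = (np.array([0.5, 0.5]), np.array([0.0, 2.5]), np.array([1.0, 1.0]))
    q = (np.concatenate([0.5 * f[0], 0.5 * g[0]]),   # crude proposal mixing
         np.concatenate([f[1], g[1]]),               # f's and g's components
         np.concatenate([f[2], g[2]]))
    print("naive MC:           ", kl_naive_mc(10_000, f, g))
    print("control variates:   ", kl_control_variate(10_000, f, g))
    print("importance sampling:", kl_importance_sampling(10_000, f, g, q))
```

All three estimators are unbiased; the variance reduction in the last two depends on how well the surrogate (respectively the proposal) tracks the mixtures. In the paper that role is played by the variational approximation to the KL divergence, which is what yields the reported 30x sample-efficiency gain.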
