Publication: ICASSP 2003 (conference paper)

EM mixture model probability table compression

Abstract

This paper presents a new probability table compression method based on mixture models, applied to N-tuple recognizers. Joint probability tables are modeled by mixtures of lower-dimensional probability distributions and their mixture coefficients. The maximum-likelihood parameters of the mixture models are trained with the Expectation-Maximization (EM) algorithm and quantized to one-byte integers. The probability elements that the mixture models do not estimate reliably are stored separately. Experimental results on online handwritten UNIPEN digits show that the total memory size of an N-tuple recognizer is reduced from 11.8 MB to 0.55 MB, while the recognition rate drops only from 97.7% to 97.5%.
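The core idea in the abstract, approximating a joint probability table by a mixture of lower-dimensional distributions fitted with EM, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it factorizes a 2-D table P(x, y) as a sum of K rank-one products w_k · a_k(x) · b_k(y) using PLSA-style EM updates, then quantizes the parameters to one-byte integers. All function names (`em_factorize`, `quantize_u8`) and the choice of K are hypothetical.

```python
import numpy as np

def em_factorize(P, K=4, iters=200, seed=0):
    """Approximate a joint table P(x, y) by sum_k w_k a_k(x) b_k(y) via EM.

    Illustrative PLSA-style update; not the paper's exact procedure.
    """
    rng = np.random.default_rng(seed)
    X, Y = P.shape
    w = np.full(K, 1.0 / K)
    a = rng.random((K, X)); a /= a.sum(axis=1, keepdims=True)
    b = rng.random((K, Y)); b /= b.sum(axis=1, keepdims=True)
    for _ in range(iters):
        # E-step: per-component model joint and posterior responsibilities
        q = np.einsum('k,kx,ky->kxy', w, a, b)
        r = q / (q.sum(axis=0) + 1e-12)          # r[k, x, y]
        # M-step: reweight responsibilities by the target table P
        s = r * P                                 # P(x, y) * r_k(x, y)
        w = s.sum(axis=(1, 2))
        a = s.sum(axis=2) / (w[:, None] + 1e-12)
        b = s.sum(axis=1) / (w[:, None] + 1e-12)
        w /= w.sum()
    return w, a, b

def quantize_u8(v):
    """Quantize nonnegative parameters to one-byte integers plus a float scale."""
    scale = v.max() / 255.0 if v.max() > 0 else 1.0
    return np.round(v / scale).astype(np.uint8), scale

# Toy example: compress a 16x16 joint table into K=4 components.
rng = np.random.default_rng(1)
P = rng.random((16, 16)); P /= P.sum()
w, a, b = em_factorize(P, K=4)
approx = np.einsum('k,kx,ky->xy', w, a, b)       # reconstructed joint table
```

The memory saving comes from storing K·(X + Y) one-byte mixture parameters instead of X·Y table entries; cells the mixture fits poorly would, as in the abstract, be kept in a separate exception list.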
