Publication
ICASSP 2010
Conference paper
An improved consensus-like method for minimum Bayes risk decoding and lattice combination
Abstract
In this paper we describe a method for Minimum Bayes Risk (MBR) decoding for speech recognition. The technique is similar to Consensus, also known as Confusion Network decoding, in that we attempt to find the hypothesis that minimizes the Bayes risk with respect to word error rate, based on a lattice of alternative outputs. Our method is an EM-like technique that makes approximations we believe are less severe than those made in Consensus, and our experimental results show an improvement in WER both for lattice rescoring and for lattice-based system combination, compared with baselines such as Consensus, Confusion Network Combination and ROVER. ©2010 IEEE.
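To make the underlying idea concrete, the sketch below shows plain MBR decoding over a hypothetical N-best list rather than a lattice: each hypothesis is scored by its expected word-level edit distance against all competing hypotheses, weighted by their posterior probabilities, and the hypothesis with the lowest expected loss is selected. This is only an illustration of the general MBR criterion, not the paper's consensus-like, EM-like lattice method; the N-best list, posteriors, and function names are assumptions for the example.

```python
# Minimal sketch of Minimum Bayes Risk (MBR) decoding over an N-best list.
# Illustrative only: the paper operates on lattices with an EM-like procedure,
# whereas this example enumerates an assumed N-best list directly.

from typing import List, Tuple


def edit_distance(ref: List[str], hyp: List[str]) -> int:
    """Word-level Levenshtein distance via dynamic programming."""
    m, n = len(ref), len(hyp)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[m][n]


def mbr_decode(nbest: List[Tuple[List[str], float]]) -> List[str]:
    """Return the hypothesis with the lowest expected word error.

    `nbest` is a list of (word sequence, posterior probability) pairs;
    posteriors are assumed to sum to 1.
    """
    best_hyp, best_risk = [], float("inf")
    for hyp, _ in nbest:
        # Expected loss of `hyp` against all competing hypotheses.
        risk = sum(p * edit_distance(ref, hyp) for ref, p in nbest)
        if risk < best_risk:
            best_hyp, best_risk = hyp, risk
    return best_hyp


if __name__ == "__main__":
    # Hypothetical N-best list with made-up posterior probabilities.
    nbest = [
        ("the cat sat on the mat".split(), 0.40),
        ("a cat sat on the mat".split(), 0.35),
        ("the cat sat on a mat".split(), 0.25),
    ]
    print(" ".join(mbr_decode(nbest)))
```

Note that the MBR hypothesis can differ from the MAP (highest-posterior) hypothesis, since it trades off agreement with all likely outputs rather than betting on a single one; Consensus and the method described in the paper pursue the same objective more efficiently over the full lattice.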