Using random forest language models in the IBM RT-04 CTS system
Peng Xu, Lidia Mangu
INTERSPEECH - Eurospeech 2005
ASR decoders are often required to produce a word lattice in addition to the best-scoring path. While exact lattice generation is expensive (in the context of a Viterbi decoder), approximate algorithms can produce high-quality lattices at much lower cost. Ideally, we would like an algorithm that requires no additional resources (neither memory nor CPU) compared to a best-path-only decoder. We present a lattice generation technique that uses a relatively strong approximation compared to other published techniques but requires little memory overhead relative to a decoder optimized for best-path output only. We show that the technique is suitable for tasks where a grammar is used as the language model, with little impact on lattice quality (evaluated as n-best coverage).
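The abstract evaluates lattice quality as n-best coverage: the fraction of the n-best hypotheses that appear as complete paths in the generated lattice. The sketch below is only an illustration of that metric, not the paper's implementation; it assumes a toy lattice represented as an adjacency map from state to outgoing (word, next-state) arcs, and the function names `lattice_accepts` and `nbest_coverage` are hypothetical.

```python
def lattice_accepts(lattice, start, finals, words):
    """Return True if the word sequence labels some start-to-final path.

    lattice: dict mapping state -> list of (word, next_state) arcs.
    Tracks the set of reachable states after consuming each word.
    """
    frontier = {start}
    for w in words:
        frontier = {dst
                    for state in frontier
                    for word, dst in lattice.get(state, [])
                    if word == w}
        if not frontier:          # no path can emit this word here
            return False
    return bool(frontier & finals)


def nbest_coverage(lattice, start, finals, nbest):
    """Fraction of n-best hypotheses contained in the lattice."""
    hits = sum(lattice_accepts(lattice, start, finals, hyp) for hyp in nbest)
    return hits / len(nbest)


# Toy example: two-word lattice covering {the,a} x {cat,hat}.
lattice = {0: [("the", 1), ("a", 1)],
           1: [("cat", 2), ("hat", 2)]}
nbest = [["the", "cat"], ["a", "hat"], ["the", "dog"]]
print(nbest_coverage(lattice, 0, {2}, nbest))  # 2 of 3 hypotheses covered
```

Under this representation, a lattice with higher n-best coverage retains more of the competing hypotheses needed by downstream rescoring.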